Recurrent Brain Graph Mapper for Predicting Time-Dependent Brain Graph Evaluation Trajectory


Tekin A., Nebli A., Rekik I.

3rd MICCAI Workshop on Domain Adaptation and Representation Transfer (DART), Strasbourg, France, 27 September - 01 October 2021, vol.12968, pp.180-190

  • Publication Type: Conference Paper / Full Text
  • Volume: 12968
  • Doi Number: 10.1007/978-3-030-87722-4_17
  • City: Strasbourg
  • Country: France
  • Page Numbers: pp.180-190
  • Keywords: Recurrent graph convolution, Transformation layer, Topological loss, Time-dependent graph evolution prediction

Abstract

Several brain disorders can be detected by observing alterations in the brain's structural and functional connectivities. Neurological findings suggest that early diagnosis of brain disorders such as mild cognitive impairment (MCI) can prevent, and even reverse, their progression into Alzheimer's disease (AD). In this context, recent studies aimed to predict the evolution of brain connectivities over time by proposing machine learning models that work on brain images. However, such an approach is costly and time-consuming. Here, we propose to use brain connectivities as a more efficient alternative for time-dependent brain disorder diagnosis by instead regarding the brain as a large interconnected graph characterizing the interconnectivity scheme between several brain regions. We term our proposed method Recurrent Brain Graph Mapper (RBGM), a novel, efficient edge-based recurrent graph neural network that predicts the time-dependent evolution trajectory of a brain graph from a single baseline. Our RBGM contains a set of recurrent neural network-inspired mappers, one for each time point, where each mapper aims to project the ground-truth brain graph onto its next time point. We leverage the teacher forcing method to boost training and improve the quality of the evolved brain graphs. To maintain the topological consistency between the predicted brain graphs and their corresponding ground-truth brain graphs at each time point, we further integrate a topological loss. We also use an l1 loss to capture time dependency and, for regularization, minimize the distance between brain graphs at consecutive time points. Benchmarks against several variants of RBGM and state-of-the-art methods show that we can achieve the same accuracy in predicting brain graph evolution more efficiently, paving the way for novel graph neural network architectures and a highly efficient training scheme. Our RBGM code is available at https://github.com/basiralab/RBGM.
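The combined objective described above (a topological loss for consistency plus an l1 loss between consecutive time points) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes brain graphs are represented as weighted adjacency matrices and uses node strength (weighted degree) as the topological measure, which may differ from the exact centrality and weighting choices in the published RBGM code.

```python
import numpy as np

def topological_loss(pred_adj, true_adj):
    """Mean absolute difference between node-strength vectors.

    Node strength (row sum of the weighted adjacency matrix) is used
    here as an illustrative topological measure; the paper's loss may
    rely on a different centrality.
    """
    s_pred = pred_adj.sum(axis=1)
    s_true = true_adj.sum(axis=1)
    return float(np.abs(s_pred - s_true).mean())

def l1_loss(a, b):
    """Element-wise mean absolute error between two brain graphs."""
    return float(np.abs(a - b).mean())

def total_loss(preds, truths, lam_topo=1.0, lam_l1=1.0):
    """Sum the per-time-point losses over a predicted trajectory.

    preds, truths: lists of adjacency matrices for time points t = 1..T.
    lam_topo, lam_l1: hypothetical weighting hyperparameters.
    """
    loss = 0.0
    for t, (p, g) in enumerate(zip(preds, truths)):
        # Fidelity to the ground truth at time t, plus topological consistency.
        loss += l1_loss(p, g) + lam_topo * topological_loss(p, g)
        if t > 0:
            # Regularize the distance between consecutive predicted graphs.
            loss += lam_l1 * l1_loss(p, preds[t - 1])
    return loss
```

In a full model, each predicted graph would come from a time-point-specific mapper, with teacher forcing substituting the ground-truth graph of the previous time point as the mapper's input during training.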