Brain changes associated with Alzheimer's Disease (AD) occur many years before the onset of clinical symptoms, and improving clinicians' ability to detect these changes would widen the window for early intervention.
The entorhinal cortex (EC), a key network hub for memory and spatial navigation that is closely integrated with the hippocampus, shows volume loss that is among the earliest measurable signs of AD. Spatial navigation, especially allocentric strategies, also depends heavily on the hippocampus. As the hippocampus degenerates, individuals struggle with allocentric navigation and fall back on egocentric strategies. This shift is a clear and measurable sign of cognitive impairment. Therefore, although EEG cannot reliably capture hippocampal activity, it can capture brain states associated with the loss of spatial navigation.
Our project combines VR-based navigation tasks with EEG analysis to detect early neuronal changes associated with AD. The navigation tasks consist of two components: an allocentric task, where participants navigate using geometric cues (e.g., Y-intersections and sharp turns), and an egocentric task, where participants rely on landmarks. Patients with AD may still manage allocentric navigation, but with reduced accuracy. As the disease progresses, their dependence on landmarks will increase, revealing a clear shift in navigation strategy. By comparing performance across these tasks, we aim to identify the subtle cognitive shifts that can serve as early indicators of AD.
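One way to quantify the comparison described above is to score each trial by path efficiency and compute the gap between egocentric and allocentric performance. The sketch below is a hypothetical illustration, not our actual scoring pipeline; the function names, the 2D path format, and the idea of a single "strategy-shift index" are all assumptions made for this example.

```python
import numpy as np

def path_efficiency(path, start, goal):
    """Ratio of straight-line distance to actual path length (1.0 = optimal).

    path: sequence of (x, y) positions sampled along the participant's route.
    """
    path = np.asarray(path, dtype=float)
    travelled = np.sum(np.linalg.norm(np.diff(path, axis=0), axis=1))
    optimal = np.linalg.norm(np.asarray(goal, float) - np.asarray(start, float))
    return optimal / travelled if travelled > 0 else 0.0

def strategy_shift_index(allo_scores, ego_scores):
    """Mean egocentric minus mean allocentric efficiency.

    Positive values suggest relatively preserved landmark-based (egocentric)
    navigation alongside degraded allocentric performance -- the shift we
    expect to see as AD progresses.
    """
    return float(np.mean(ego_scores) - np.mean(allo_scores))

# Hypothetical recorded path: two legs of 3 and 4 units (7 travelled),
# versus a 5-unit straight line, giving an efficiency of 5/7.
path = [(0, 0), (3, 0), (3, 4)]
eff = path_efficiency(path, start=(0, 0), goal=(3, 4))
```

Per-trial efficiencies from each task could then feed into `strategy_shift_index` to track a participant's drift toward landmark reliance across sessions.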
Additionally, EEG data will be recorded during the task from key brain regions, including the frontopolar area (executive function), IFG/Broca's area (motor action planning), dlPFC (working memory), LTC (memory encoding), mPFC (categorization), and RTC (spatial navigation).
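A common first step for this kind of region-wise EEG analysis is to estimate spectral power (e.g., in the theta band, which is often linked to spatial navigation) over channel groups. The sketch below, built on NumPy and SciPy's Welch PSD estimate, is a minimal illustration only: the `REGIONS` channel-index mapping, the sampling rate, and the theta band limits are assumptions, not our montage or final analysis.

```python
import numpy as np
from scipy.signal import welch

# Hypothetical mapping from monitored regions to EEG channel indices;
# a real montage would map 10-20 electrode labels to these regions.
REGIONS = {"frontopolar": [0, 1], "dlPFC": [2, 3], "RTC": [4, 5]}

def band_power(eeg, fs, band=(4.0, 8.0)):
    """Integrated power per channel in a frequency band (default: theta).

    eeg: array of shape (n_channels, n_samples); fs: sampling rate in Hz.
    """
    f, psd = welch(eeg, fs=fs, nperseg=min(eeg.shape[-1], 2 * int(fs)))
    mask = (f >= band[0]) & (f <= band[1])
    return np.trapz(psd[..., mask], f[mask], axis=-1)

def region_theta(eeg, fs):
    """Average theta power for each monitored region."""
    power = band_power(eeg, fs)
    return {name: float(power[idx].mean()) for name, idx in REGIONS.items()}
```

Region-level summaries like these could then be compared between the allocentric and egocentric task blocks to look for task-dependent spectral differences.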