International Mind and Brain Society

Ibrahim Dahlstrom-Hakki and Zac Alstad presented at the International Mind and Brain Society 2024 Conference in Leuven, Belgium, from July 10-12, 2024.

Ibrahim Dahlstrom-Hakki – NeuroVivid: A BCI Maker Experience for Neurodivergent Youth

Brain-computer interfaces (BCIs) are a rapidly growing field that combines advances in AI, brain-sensing technologies (such as EEG and fNIRS), neuroscience, and engineering. Dr. Dahlstrom-Hakki presented work that provides opportunities for ethnically diverse neurodivergent youth (ages 11-14) to assemble, program, and use simple EEG headsets to create BCI devices in a makerspace setting. This NSF-funded project explores how best to make EEG sensor technologies accessible and usable by middle-school-aged students, with specific attention to the intersectionality of learners’ identities. Each maker activity is being developed through a codesign process in which the project’s research and development team works closely with neurodivergent codesigners who are involved in all aspects of the project.

Dr. Dahlstrom-Hakki reported on the initial year of codesign as well as the results of the pilot implementation of the maker program at the New York Hall of Science (NYSCI). The discussion included guidelines for engaging neurodivergent youth in codesign, designing and implementing a program that teaches young learners to use simple sensor technologies, and meeting the cognitive and social needs of neurodivergent youth with intersectional identities.

Zachary Alstad – Exploring Executive Function Supports Using Augmented Reality

Given the abundance of technological distractions available to students and the increasingly compelling incentives to remain off task, it is evident that many learners could benefit from tools that support executive function (EF) skills and help them maintain focus. This need may be even more pronounced in neurodiverse student communities. Traditional interventions aimed at supporting engagement generally provide structure to regulate time for on-task and off-task behavior (e.g., the pomodoro technique). However, such approaches provide no feedback about the student's actual attention state.

Dr. Alstad reported on an NSF-funded project developing an Augmented Reality (AR) “smart pomodoro” that uses headset sensor data to detect when learners are off task, enabling more meaningful and robust interventions. The AR headset will customize the type, timing, duration, and intensity of the prompting students receive based on factors including head position, gaze direction, and time on task. This customized response aims to account for some of the disparities in sensory needs typically observed within neurodiverse populations. Dr. Alstad shared a range of prompts developed through a codesign process with neurodivergent college students and tested empirically at a campus that exclusively serves neurodivergent learners. While this work is ongoing, initial results suggest that this system could be an asset for students who struggle to engage during independent work.
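As a rough illustration of the kind of logic such a system might use, the minimal Python sketch below maps hypothetical headset readings (head position, gaze direction, and time on task) to a prompt's type, duration, and intensity. All names, thresholds, and prompt categories are illustrative assumptions for this sketch, not details of the project's actual implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorSample:
    """One hypothetical reading from the AR headset."""
    head_pitch_deg: float    # how far the head is tilted away from the work surface
    gaze_on_target: bool     # whether eye tracking places gaze on the task area
    seconds_on_task: float   # continuous time spent on task so far

def choose_prompt(sample: SensorSample) -> Optional[dict]:
    """Return a prompt description, or None if the learner appears engaged.

    Thresholds and prompt categories are placeholders, not values
    reported by the project.
    """
    off_task = (not sample.gaze_on_target) or abs(sample.head_pitch_deg) > 30
    if not off_task:
        return None

    # Scale intensity with how long the learner has already sustained focus:
    # a gentler, shorter nudge after a long stretch of work, a stronger one early on.
    if sample.seconds_on_task > 20 * 60:
        return {"type": "visual", "duration_s": 2, "intensity": "low"}
    return {"type": "visual_and_audio", "duration_s": 5, "intensity": "medium"}

# Example: a learner looking away ten minutes into a work block
print(choose_prompt(SensorSample(head_pitch_deg=45.0, gaze_on_target=False, seconds_on_task=600)))
```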