Creating Age-Inclusive VR
ECE Associate Professor Sarah Ostadabbas, in collaboration with the University of Rhode Island, was awarded a $600,000 NSF grant for “Graph-Centric Exploration of Nonlinear Neural Dynamics in Visuospatial-Motor Functions During Immersive Human-Computer Interactions.” She is investigating how aging impacts the ability to use emerging human-computer interaction (HCI) technologies, such as virtual reality (VR).
The research will develop an Immersive Multimodal HCI (Immersive mHCI) framework to explore the neural dynamics connecting age-related changes in visuospatial-motor (VSM) functions to the digital competence required for adapting to immersive 3D HCI environments. The project will focus on designing a novel VR-based interface, developing advanced pattern recognition techniques, and testing statistical methods to predict age-related changes in VSM functions. The outcomes are expected to enhance our understanding of VSM functions and contribute to designing adaptive, inclusive HCI systems for diverse user needs.
Abstract Source: NSF
The proposed project aims to understand how aging affects people’s ability to use emerging technologies for human-computer interaction, such as virtual reality (VR). As people age, their visual, spatial, and motor control abilities decline; this decline in visuospatial-motor (VSM) functions likely affects how well they can use VR and related technologies that involve immersive 3D environments. This decline could in turn reduce older adults’ ability to reap the educational, social, health, and general well-being benefits that VR and related technologies can provide. The goal of this project is to link neuroscientific measures of brain activity with the use of VR-based 3D environments, building models that relate VSM abilities to the successful use of features of VR designs. These models will advance scientific understanding of brain function in virtual spaces and are intended to guide the design of future VR interfaces so that they are better able to adapt to variations in VSM ability associated with aging. The project will also support education and diversity by involving a multidisciplinary team from neuroscience, engineering, and computer science. The insights gained could inform the design of more accessible and inclusive HCI systems, benefiting a broader range of users across various demographics.
The project proposes an Immersive Multimodal HCI (Immersive mHCI) framework to explore the underlying neural dynamics that connect age-related changes in visuospatial-motor (VSM) functions to the digital competence required for adapting to immersive 3D HCI environments. The research is structured around three key thrusts.

Thrust 1 involves the design of a novel dual visuospatial-motor virtual reality-based interface (VSM-VRI) as an immersive 3D task environment. This interface will facilitate the multimodal characterization of the complex nonlinear dynamics underlying visuospatial and motor interactions, providing a realistic and challenging context for studying VSM functions.

Thrust 2 focuses on developing novel nonlinear pattern recognition techniques and a graph-based learning framework. These tools will characterize and fuse the nonlinear dynamics of VSM neural interrelations as reflected in the electrical and vascular-hemodynamic neural activities captured while experimental participants use the proposed 3D task environment. The goal is a comprehensive model that captures the intricate spatiotemporal neural patterns associated with VSM functions.

Thrust 3 aims to develop and test statistical methods to evaluate the proposed VSM-VRI and graph-based computational frameworks. These methods will predict age-related changes in VSM functionality and their effect on adaptation to emerging 3D HCI environments, compared with traditional 2D screen-based interactions.

The project’s outcomes will enhance understanding of VSM functions and inform the design of adaptive, inclusive HCI systems that cater to diverse user needs.
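To make the graph-based fusion idea in Thrust 2 concrete, the sketch below builds a simple connectivity graph for each of two simulated signal modalities (standing in for electrical and vascular-hemodynamic recordings) and averages them into one fused adjacency matrix. The signal names, the correlation-based connectivity measure, and the averaging fusion rule are all illustrative assumptions for exposition; they are not the project's actual methods.

```python
import numpy as np

# Toy sketch: per-modality connectivity graphs from simulated neural
# time series, fused into one adjacency matrix. Names and the fusion
# rule are illustrative assumptions, not the project's actual pipeline.

rng = np.random.default_rng(0)
n_channels, n_samples = 8, 500

# Simulated recordings over the same channels: an "electrical"
# (EEG-like) modality and a "hemodynamic" (fNIRS-like) modality.
electrical = rng.standard_normal((n_channels, n_samples))
hemodynamic = rng.standard_normal((n_channels, n_samples))

def connectivity(signals):
    """Absolute Pearson correlation between channels, zero diagonal."""
    adj = np.abs(np.corrcoef(signals))
    np.fill_diagonal(adj, 0.0)
    return adj

# One graph per modality, then a simple averaged fusion.
g_elec = connectivity(electrical)
g_hemo = connectivity(hemodynamic)
g_fused = 0.5 * (g_elec + g_hemo)

# Graph degree as a crude per-channel "hub" score on the fused graph.
hub_score = g_fused.sum(axis=1)
print(hub_score.shape)  # (8,)
```

A real analysis would replace the correlation measure with a nonlinear dependence measure and learn the fusion weights rather than fixing them; this sketch only shows the graph-construction-then-fusion shape of the approach.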