Developing a New Generation of Intelligent Tutoring Systems for Advanced Manufacturing
MIE Assistant Professor Mohsen Moghaddam is leading an $850K NSF grant for “Accelerating Skill Acquisition in Complex Psychomotor Tasks via an Intelligent Extended Reality Tutoring System.” Project collaborators include Northeastern University co-PIs Kemi Jona, assistant vice chancellor for digital innovation and enterprise learning; Casper Harteveld, associate professor of game design and associate dean of CAMD; and Mehmet Kosa, postdoctoral research associate working with Harteveld. This project builds upon the PI’s ongoing research at the intersection of AI and augmented, virtual, and mixed reality, sponsored by NSF, DARPA, and the Navy. This interdisciplinary collaboration between COE and CAMD aims to foster learning and adaptability in educational and workplace settings across a range of industries, including manufacturing, healthcare, construction, and defense.
Abstract (source: NSF)
Manufacturing, medical laboratory, construction, and many other jobs require workers to learn complex physical ‘psychomotor’ tasks that combine both perceptual and motor skills. These are often taught using an apprenticeship model on real job sites, which poses both productivity and safety risks for workers. Further, relatively little is known about how to assess trainees’ skill levels in these tasks and how to adapt training practices based on those assessments. This project tackles these problems by developing a new generation of intelligent tutoring systems that combine extended reality (XR), artificial intelligence (AI), and Internet-of-Things (IoT) technologies to support training and assessment of complex skills required by modern, highly automated manufacturing facilities. The high-level idea is that new sources of data captured by XR headsets, wearable devices, cameras, and IoT sensors can be used to build models of psychomotor skill development and new methods for providing personalized, just-in-time coaching guidance. Through partnerships with manufacturing consulting firms, local community colleges, and K-12 schools, the project will enhance the skill development of a diverse population of learners and professionals and expand interest in advanced manufacturing careers.
The project team brings together expertise in engineering, cognitive psychology, learning sciences, game design, and XR to make fundamental contributions to both learning science and learning technologies around just-in-time, personalized, context-aware provision of learning scaffolds for manufacturing workers learning new skills. On the learning side, the project team will examine the stages of expertise development for specific psychomotor tasks and the effectiveness of adaptive interventions on learners’ engagement, performance gains, and accuracy. A virtual reality (VR) game set in an advanced manufacturing scenario will be used to collect ecologically valid baseline data and prepare novice learners for real-world task performance. On the technology side, the project team will build and validate an intelligent XR tutoring system to accelerate the learning of psychomotor tasks whose high complexity arises from task structures and human information processing requirements. The innovative aspects of the technology include data-driven activity understanding (e.g., task step identification and error detection) and user modeling (e.g., cognitive load detection), through novel multimodal AI architectures designed to process and fuse data captured from augmented reality (AR) headsets, wearables that capture physiological data, cameras, IoT sensors, and manufacturing machines. Both learning and technology innovations will be validated through extensive laboratory studies. Together, the work will lead to an intelligent feedback algorithm that dynamically adapts the nature, frequency, and depth of feedback to the learner’s expertise, facilitating optimal learning and speed-to-competence.
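To make the adaptive-feedback idea concrete, the sketch below shows how a system of this kind might map a fused learner state (activity-recognition confidence, detected errors, estimated cognitive load, and a running expertise estimate) to the nature, frequency, and depth of feedback. This is a minimal illustrative rule-based policy, not the project’s actual algorithm; all names, fields, and thresholds here are hypothetical assumptions.

```python
from dataclasses import dataclass


@dataclass
class LearnerState:
    """Hypothetical learner snapshot fused from multimodal signals
    (AR headset, wearables, cameras, IoT sensors). All fields assumed."""
    step_confidence: float   # activity-recognition confidence, 0..1
    error_detected: bool     # output of a vision/sensor error detector
    cognitive_load: float    # estimate from physiological data, 0..1
    expertise: float         # running skill estimate, 0..1


def feedback_policy(state: LearnerState) -> dict:
    """Illustrative adaptive rule: errors trigger immediate correction;
    novices or overloaded learners get frequent, detailed guidance;
    experts get sparse, shallow confirmation."""
    if state.error_detected:
        # Correct errors right away regardless of expertise level.
        return {"nature": "corrective", "frequency": "immediate", "depth": "detailed"}
    if state.expertise < 0.3 or state.cognitive_load > 0.7:
        # Novice or high cognitive load: scaffold heavily.
        return {"nature": "instructive", "frequency": "frequent", "depth": "detailed"}
    if state.expertise < 0.7:
        # Intermediate learner: occasional brief hints.
        return {"nature": "hint", "frequency": "occasional", "depth": "brief"}
    # Expert under low load: minimal, confirmatory feedback.
    return {"nature": "confirmatory", "frequency": "rare", "depth": "minimal"}
```

A real system would replace these hand-set thresholds with learned models (e.g., the multimodal architectures described above feeding a policy trained on laboratory-study data), but the structure, fusing state into a feedback decision, stays the same.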