MIE’s Moghaddam Awarded $2M for Intelligent Extended Reality Research

Mohsen Moghaddam

Mohsen Moghaddam, assistant professor, mechanical and industrial engineering (MIE), has been awarded a $2 million grant from the Future of Work at the Human-Technology Frontier Program, one of the 10 Big Ideas of the National Science Foundation (NSF). With this funding, he will study how extended reality (XR) coupled with artificial intelligence (AI) can enhance the learning and adaptability of manufacturing workers on the shop floor.

“The U.S. manufacturing industry is experiencing a perfect storm of challenges as the baby boomer generation begins to retire, hiring practices shift due to the introduction of advanced technologies, and the new generation remains unenthusiastic about the manufacturing industry,” says Moghaddam. “This research addresses that growing skills mismatch in the manufacturing workforce.”

Moghaddam’s project uses both augmented reality—in which digital elements are overlaid onto a device’s live view—and mixed reality—in which real-world and digital objects can interact—on digital devices to train manufacturing employees on the job.

“Instead of taking workers to classrooms and giving them presentations, the idea here is to integrate the information the worker needs on the job into their experience as they learn,” explains Moghaddam. “Our technology would provide the necessary information on demand in an adaptive and personalized fashion. The information would also be presented in a variety of forms, including 3D visualizations superimposed on physical objects; notifications (visual, auditory, or haptic); action recommendations; spatial and causal reasoning animations; and remote assistance.”

Besides the NSF project, Moghaddam is also currently collaborating with Casper Harteveld of CAMD on a project funded by the U.S. Navy and led by the Kostas Research Institute (KRI) to develop virtual and mixed reality systems that train personnel to use specific robotic devices. Workers would first complete online courses, then learn in the virtual environment, and then work on-site using a mixed reality headset to complete their on-the-job training. Rather than giving step-by-step instructions, the technology assesses each user’s skill level and understanding and caters to their individual needs.

“We want to develop a system that functions as a teammate for workers to undertake complex tasks that are within their area of specialization but that they haven’t done before,” says Moghaddam. “The goal is to get the workers to learn the technology and then be able to perform without the headset with full independence and confidence.”

Prior studies by manufacturers such as Boeing and Mercedes have reported that the use of augmented reality has been a boon to their production, showing up to a 50 percent improvement in production time and a more than 80 percent reduction in error rates.

Moghaddam’s work complements Executive Order 14005, Ensuring the Future Is Made in All of America by All of America’s Workers, a White House initiative to revitalize American manufacturing.

This project grows out of Moghaddam’s interest in exploring and understanding how the role of humans in future workplaces will evolve as automation and AI drive the rapid spread of highly automated technologies and intelligent workplace tools.

“A 100 percent replacement of humans in factories won’t happen—hardware, software, and AI are not as intelligent or as adaptive as humans, and they can’t deal with complexities the way people do,” says Moghaddam. “The latest wave of automation technologies is aimed more at increasing precision, safety, and quality, and at transitioning human workers away from the dirtier, more dangerous jobs.”

Moghaddam has many collaborators on his work:

  • Robert Roy, a consultant from GE Aviation, on designing use cases around interactive production and inspection work in precision manufacturing, where intricate parts must be produced with extreme precision and recurrently inspected by workers using complex gauges
  • Stacy Marsella, professor of Computer Sciences and Psychology at Northeastern, on devising new AI methods for data-driven modeling of human behavior and task performance to enable adaptive and personalized learning experiences in XR
  • Alicia Sasser Modestino, associate professor at Northeastern’s School of Public Policy and Urban Affairs, to better understand how the use of XR technologies in the industry interacts with other factors, such as level of education and socioeconomic background
  • Kemi Jona, Assistant Vice Chancellor, Digital Innovation and Enterprise Learning, and Nicholas Wilson, Associate Director, Center for Advancing Teaching and Learning Through Research (CATLR), to investigate the effects of adaptive scaffolding on conceptual development during initial learning and on the novice-to-expert progression of skill acquisition through case-based reasoning
  • Chitra Javdekar, Dean of STEM at MassBay Community College, to validate the intelligent XR (IXR) technology through extensive laboratory experiments on precision machining and inspection

The project’s diverse advisory committee—including members from PTC Inc., GE Aviation, MassMEP, MassBay Community College, Springfield Technical Community College, Northeastern College of Professional Studies, Aalborg University (Denmark), NIST, Festo, and Burning Glass Technologies—will illuminate the potential, adoption barriers, and risks of XR for workplace-based learning in manufacturing.

With these deep partnerships, Moghaddam hopes to have a functioning prototype in use within the next three to four years, on its path to becoming an at-scale product available for commercial use.

