Yong Receives ONR YIP Award to Help Create Autonomous Navy Ships
MIE Associate Professor Sze Zheng Yong was awarded a $750K Office of Naval Research Young Investigator Program Award for “Distributed Coordination of Autonomous Swarms with Limited or Absent Communication and Intermittent Data.” Yong is one of only twenty-five recipients who will share nearly $17.5 million in funding to conduct innovative scientific research that benefits science and technology for the U.S. Navy and Marine Corps.
You never know where you’ll find the seed that helps to establish your career. For Sze Zheng Yong, an associate professor of mechanical and industrial engineering, the seed was planted during his bicycle commute while in graduate school at MIT.
While pedaling through Cambridge, Yong became increasingly aware of the constant non-verbal communication needed to avoid accidents with bikes, automobiles, pedestrians, and people getting out of parked cars. His fascination blossomed into a PhD thesis on autonomous systems with intent awareness—which, in turn, led to a series of grants for work related to multi-agent intent estimation (or more broadly, model estimation) in self-driving cars and lunar/planetary rovers.
Those accomplishments led to Yong receiving a prestigious Young Investigator Program Award from the Office of Naval Research—a highly competitive early-career award program in which prior academic achievement and potential for significant scientific breakthroughs are key elements of the evaluation criteria. Yong plans to use the three-year, $750,000 grant, titled “Distributed Coordination of Autonomous Swarms with Limited or Absent Communication and Intermittent Data,” to develop algorithms for non-verbal communication between unmanned, autonomous naval ships. The ultimate goal is to make it possible for unmanned vessels to carry out dangerous operations in conditions where GPS and other modes of communication are limited or unavailable.
“My work is inspired by the way human teams use body language to communicate nonverbally,” says Yong. “If humans can do it, we should be able to create a robotic system that can do something similar. My challenge is to make it possible for robots to express their intentions through their movements and motions, and for other robots in the team to infer this intent from those movements.”
Yong’s task is to create the complex algorithms that autonomous robots and naval ships need to design “information-bearing motions” for enhanced non-verbal communication. His work will also make it possible for other robot teammates to decode the information carried by these motions with limited or no explicit communication. The research aims to enable unmanned ships to function autonomously as teams and adapt to the unexpected without a central command—particularly important in situations where GPS and other forms of communication are spotty or non-existent. It is also important to make these information-bearing movements subtle, so they are not easily intercepted and decoded by adversaries.
Yong’s innovation will help make it possible for unmanned ships to protect seaports, monitor enemy activity, transport cargo, conduct dangerous rescue operations, and disrupt enemy activity—all without endangering the lives of military personnel.
“I want to conduct research not just for the fun of it, but because I’m creating something useful,” he says.
Research Abstract: Coordinated swarms of autonomous platforms can very efficiently and affordably conduct a range of naval operations, including environmental sensing and inspection, persistent monitoring and patrol, rescue operations, cargo delivery, disruption and tracking of potential adversaries, and ensuring access to maritime domains. Although the technology to develop autonomous multi-agent systems has been progressing, it remains a challenge to deploy autonomous platforms that can perform tasks in realistic, GPS-denied environments where radio communications are limited or unreliable, or when inter-robot communication needs to be minimized to conserve power during long-duration missions or to reduce the possibility of detection by hostile entities. To address some of these challenges, this project aims to develop fundamental theory and efficient algorithms for distributed coordination of autonomous swarms with limited or absent communication and with realistic assumptions on sensing/perception under field conditions such as occlusion, glare, and limited sensing range. In particular, motivated by non-verbal (non-explicit) communication in the form of body language or gestures in human collaboration, we hypothesize that the motion/trajectory taken by a given agent can be information-bearing, and hence we propose to develop novel intent-expressive/legible motion planning and intent estimation algorithms for improving distributed control and coordination of swarms. The proposed cooperative autonomous swarm technology will enable autonomous naval/maritime teams to provide and interpret “non-verbal” cues or motions of communication-less or communication-limited but cooperative agents.