Restuccia Receives AFOSR Young Investigator Award to Advance Future Data-Driven Wireless Systems
Wireless networks such as cellular, WiFi, and Bluetooth surround us. We depend on them almost continuously, in everything from consumer and business uses to more specialized industrial and military applications. As this technology becomes more central to our lives, and is used in more sophisticated ways, it’s crucial to ensure that it functions accurately and reliably, even if networks become congested or compromised.
This is the complex task before Francesco Restuccia, assistant professor of electrical and computer engineering and member of Northeastern’s Institute for the Wireless Internet of Things and the Roux Institute. In December 2022, he received an Air Force Office of Scientific Research Young Investigator Program Award to examine the algorithmic foundations and theoretical performance bounds of the dynamic, data-driven wireless systems of the future.
To illustrate the challenge, Restuccia offers the example of a drone or other autonomous device that’s required to perform object recognition. The artificial intelligence and machine learning algorithms used for object recognition, he explains, are so computationally complex that they cannot be effectively run on the device itself.
“We offload those AI/ML tasks through the wireless network to the edge and receive the response in real time,” says Restuccia. “To do that, you need to do two things: one, adjust the wireless network to make sure these tasks are executed accurately; and two, control the resolution of the images that you’re sending to the edge. You must do both at the same time to really optimize. When you do that, while also trying to minimize resource utilization, the complexity of what you’re handling increases exponentially.”
Optimizing the Cyber-Physical Wireless Network
Restuccia is exploring new ways to enable these kinds of complex real-time wireless transmissions within a cyber-physical system. The “cyber” portion in this case is represented by the artificial intelligence and machine learning algorithms required by an application using the network, and the “physical” component is the sensors and actuators connected to the network. These “cyber” and “physical” sides must communicate reliably and in real time across the network, requiring completely new levels of optimization, both in the network itself and in the algorithms using it.
Networks, Restuccia explains, can be optimized mathematically through network slicing, the creation of multiple logical networks on shared physical infrastructure. This approach can help increase access capacity, reduce latency, and improve reliability.
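As a minimal sketch of the slicing idea (not Restuccia’s actual optimization, and with purely illustrative numbers), a shared link’s capacity can be partitioned among logical slices according to declared weights:

```python
# Toy network-slicing sketch: divide a shared link's capacity among
# logical slices in proportion to per-slice weights. The slice names
# and weights below are hypothetical, for illustration only.

def allocate_slices(capacity_mbps: float, weights: dict) -> dict:
    """Return each slice's share of the link, proportional to its weight."""
    total = sum(weights.values())
    return {name: capacity_mbps * w / total for name, w in weights.items()}

shares = allocate_slices(100.0, {"AR/VR": 5, "telemetry": 1, "best-effort": 4})
print(shares)  # -> {'AR/VR': 50.0, 'telemetry': 10.0, 'best-effort': 40.0}
```

Real slicing operates across radio, compute, and transport resources and must respect per-slice latency and reliability targets, not just bandwidth shares, but the same proportional-partitioning intuition applies.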
On the algorithmic side, Restuccia is applying application-level semantic optimization to the problem. In semantic communication, not all data need be transmitted, only the attributes required for an application to perform its function, which reduces bandwidth demand. Returning to the example of object recognition, he explains, “You don’t need to send the entire image—you can just send the foreground, where the object will most likely be.”
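In the simplest form, this kind of semantic reduction could amount to cropping each frame to a detected region of interest before transmission. The sketch below assumes a hypothetical lightweight on-device detector has already produced the bounding box; it is an illustration of the idea, not Restuccia’s pipeline:

```python
import numpy as np

def semantic_compress(frame: np.ndarray, roi: tuple) -> np.ndarray:
    """Keep only the foreground region of a frame before sending it.

    `roi` is (top, left, height, width), assumed to come from a
    lightweight on-device detector (hypothetical here).
    """
    top, left, h, w = roi
    return frame[top:top + h, left:left + w]

# A simulated 1080p RGB frame and a hypothetical detected foreground box.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
roi = (400, 800, 300, 300)

cropped = semantic_compress(frame, roi)
print(f"full frame: {frame.nbytes} B, foreground only: {cropped.nbytes} B")
```

Here the payload shrinks from the full frame to just the foreground pixels; in practice the semantic attributes could be even more compact, such as feature vectors rather than raw pixels.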
Using network slicing and semantic approaches, and taking into account changing spectrum conditions, Restuccia aims to dynamically compress sensed data to minimize demand on the network while ensuring that application demands are satisfied.
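The joint nature of this optimization can be sketched as a toy selection problem: pick the lightest data resolution whose (assumed) recognition accuracy meets the application’s target and whose payload fits the currently available channel capacity. All numbers below are illustrative, not measured:

```python
# Toy joint selection under network and application constraints.
# Each candidate is (resolution_px, payload_kB, assumed_accuracy);
# the values are hypothetical, for illustration only.

CANDIDATES = [
    (224, 50, 0.88),
    (320, 100, 0.92),
    (480, 220, 0.95),
]

def pick_resolution(capacity_kB: float, target_acc: float):
    """Lightest candidate meeting both the channel and accuracy constraints."""
    feasible = [c for c in CANDIDATES
                if c[1] <= capacity_kB and c[2] >= target_acc]
    # Prefer the smallest feasible payload to minimize network load.
    return min(feasible, key=lambda c: c[1]) if feasible else None

print(pick_resolution(capacity_kB=150, target_acc=0.90))  # -> (320, 100, 0.92)
```

In the actual research problem the search space is continuous and time-varying, and the accuracy of each compression level is itself learned from data, which is what makes the joint optimization so hard.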
Another vital aspect of Restuccia’s work is making sure that AI/ML algorithms can reliably comply with the constantly shifting limitations of a network and the similarly changeable demands of an application.
“You cannot do that offline,” says Restuccia. “Because demands change, constraints change. So you need to adapt the AI/ML models in real time, almost continuously, and you need to certify the performance.”
Certification refers to ensuring that an AI/ML model will perform at a given level for a given amount of time in given circumstances. When those circumstances arise again, the model can be reused.
“All of these efforts are in pursuit of guaranteeing wireless operations,” says Restuccia. “It’s connected to the certifiable AI, it’s connected to optimizing the sensing, and it’s connected to optimizing the network. Basically, every component of the system has to be optimized in real time.”
Training and Testing
While his research challenge is formidable, Restuccia has ready access to several unique assets in his effort to optimize cyber-physical systems. One is Arena, a wireless radio testbed located on Northeastern’s Boston campus. An open-access platform equipped with a grid of 64 antennas, 12 compute servers, and 24 software-defined radios, Arena provides computational power and scale that Restuccia can use to validate his theoretical findings.
In addition, he plans to make use of Colosseum, the world’s largest radiofrequency channel emulator, located at Northeastern’s Innovation Campus in Burlington, Massachusetts. With massive computing power that can process more information in a single second than is estimated to be held in the entire print collection of the Library of Congress, Colosseum can create customized virtual environments containing hundreds of wireless signals.
Restuccia plans to use these facilities to generate data that will help refine AI/ML algorithms.
“These are data-driven algorithms,” he says. “To get the data, you need this experimental component. We’ll set up different wireless networks and based on that setup we’ll collect data in different channel conditions while considering different AI/ML applications. We can use that to train and test the AI.”
Wireless networks have already transformed our lives radically over just a few years. The work that Restuccia is doing now will help drive this transformation further, in areas like augmented and virtual reality, telepresence, remote surgery, and the metaverse—all applications that will depend on massive exchanges of data happening swiftly and reliably.
Project Abstract
We will research the underlying algorithmic foundations and theoretical performance bounds of future dynamic, data-driven wireless systems. The key target of this project is to achieve strategic mission-critical application-level objectives by guaranteeing assured wireless communications in congested, contested, concealed, and contaminated environments. First, we mathematically optimize network operations through dynamic allocation of network, computation, and memory resources, incorporating the semantics of the sensed data and the current spectrum conditions into the problem formulation. Importantly, our optimization dynamically compresses sensed data according to application-level semantics, so that network load can be minimized while still guaranteeing key application performance indicators. Next, we investigate novel techniques to guarantee the performance of the proposed data-driven optimization and classification techniques in terms of latency, accuracy, and resource occupation in challenged and constrained scenarios. Our key theoretical findings will be validated through extensive data collection campaigns leveraging several wireless testbeds available at the Institute for the Wireless Internet of Things.