Ensuring that Self-Driving Cars are Accessible to Those with Disabilities


ECE Assistant Professor Xue “Shelley” Lin is developing algorithms for self-driving vehicles to ensure that they will be accessible to those with disabilities.

Self-driving cars could revolutionize transportation for people with disabilities. These researchers are trying to make it happen.

Main photo: Hyundai’s autonomous fuel cell electric vehicle Nexo is driven through a tunnel near the Pyeongchang Olympic Stadium during the Pyeongchang Winter Olympic Games in South Korea. AP Photo by Jae C. Hong

Forget about 2020. It’s the future now, and you’re riding in a self-driving car.

It’s full of touchscreen displays and shiny dashboards throughout the cabin. You’re hanging out with other passengers, your seats leisurely facing one another. It’s as if you were in a limousine, except, of course, that the chauffeur is an extremely smart computer powered by artificial intelligence.

Why not use that technology, which is set to disrupt the automotive industry, to also revolutionize transportation for people who are blind or visually impaired?

That’s the challenge Shelley Lin, an assistant professor of electrical and computer engineering at Northeastern, is trying to tackle with her algorithms, working with a team of researchers from the Virtual Environment & Multimodal Interaction Laboratory at the University of Maine who are studying the way humans interact with modern technology.

Currently, prototypes of self-driving cars are being designed with the assumption that passengers will have full sight. Those designs overlook people who can’t see well, such as the elderly, the visually impaired, or the blind.

For them, it would be difficult to find buttons on a touch screen, Lin says.

“With the interface in autonomous vehicles, you may have a tablet that has useful information only through that tablet,” Lin says. “With our techniques, we can provide another more convenient interface within the autonomous vehicle.”

Another important problem is that, because autonomous vehicles won’t need a human driver, the seating may not always have passengers facing forward. With an increasing focus on the passenger experience, the cabins of the future are being reimagined as settings that enable more organic interaction between passengers.

The issue, Lin says, is that when the vehicle’s computers need to convey crucial information such as navigation and orientation instructions, things can get complicated. Depending on where and how someone is sitting, the computer will struggle to tell the passenger the direction of their destination and which way to exit the vehicle.

And if you are a blind passenger, and the machine tells you to exit on the left, it really makes a difference to know which left. You don’t want to step out into traffic or other dangerous situations.

To begin tackling that challenge, Lin’s team recently received seed funding from a new joint initiative by the Roux Institute at Northeastern University and the University of Maine to kickstart collaborations in areas of research such as artificial intelligence, health and life sciences, and manufacturing—challenges that are in critical need of solutions for Maine and beyond.

Shelley Lin, an assistant professor of electrical and computer engineering at Northeastern. Photo by Matthew Modoono/Northeastern University

Over the next year, Lin will develop and test her algorithms so that researchers in Maine can put them to the test in the VEMI lab and in self-driving cars.

The idea is to lay the groundwork so engineers can integrate deep learning, a branch of artificial intelligence that can rapidly analyze huge amounts of high-dimensional information, into the computing systems of self-driving cars.

Lin plans to start with publicly available data depicting scenarios similar to those in the cabins of self-driving cars, along with representative images captured by the UMaine team, to study footage of passengers as they sit, move, and interact within the cabin. Those images will then be used to train deep learning algorithms.

Eventually, the algorithm should be able to analyze data in real time and even refine itself, getting increasingly better at recognizing passengers and their orientation. That’s important, Lin says, because you wouldn’t want the machine to think it needs to give directions to a pet or different objects in the cabin.
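The train-then-refine loop described above can be illustrated, in highly simplified form, with a toy classifier. Everything below is invented for illustration and bears no relation to the team’s actual data or models: each “image” is reduced to two made-up features, and a simple logistic-regression model learns, by gradient descent, to separate two synthetic passenger orientations.

```python
import math
import random

# Toy stand-in for the real pipeline: each "image" is reduced to two
# invented features, labeled 0 for a forward-facing passenger and
# 1 for a rear-facing one. The numbers are synthetic.
random.seed(0)
data = ([([random.gauss(-1.0, 0.3), random.gauss(-1.0, 0.3)], 0) for _ in range(50)] +
        [([random.gauss(1.0, 0.3), random.gauss(1.0, 0.3)], 1) for _ in range(50)])

w = [0.0, 0.0]  # model weights, adjusted during training
b = 0.0         # bias term
lr = 0.5        # learning rate

def predict(x):
    """Return the model's probability that passenger x is rear-facing."""
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes z into (0, 1)

# Gradient descent: repeatedly nudge the weights to shrink the prediction
# error, mirroring (very loosely) how a deep network refines itself as
# more labeled examples arrive.
for epoch in range(200):
    for x, y in data:
        err = predict(x) - y
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

accuracy = sum((predict(x) > 0.5) == bool(y) for x, y in data) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

A real system would use a deep convolutional network over raw camera frames rather than two hand-picked features, but the core loop of predicting, measuring error, and adjusting weights is the same.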

Lin wants her algorithms to help test ways in which smart vehicles can assist passengers. She envisions recognition capabilities that respond to voice and hand-gesture commands, and that even monitor basic passenger behavior to tell the computer whether it should summon immediate medical help.

Such complex interactions between the passengers and the vehicle will require the system to be ultrasmart and fast, in order to give feedback in real time. Currently, Lin says, that kind of technology needs large machines to support the computations.

“The shortcoming of deep learning is that it is expensive in terms of the resources, because the model size is very large,” Lin says. “We want the model to be implemented into a fast computing system in the autonomous vehicle.”
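One common way to shrink a large model so it fits on in-vehicle hardware, though not necessarily the technique Lin’s team will use, is quantization: storing each weight as an 8-bit integer instead of a 32-bit float, cutting storage to a quarter at a small cost in precision. A minimal sketch of the idea, with made-up weight values:

```python
# Minimal sketch of post-training weight quantization. The weights below
# are made-up floats standing in for one layer of a trained network.
weights = [0.82, -1.37, 0.05, 2.10, -0.64]

# Symmetric linear quantization: choose a scale so the largest-magnitude
# weight maps to 127, the top of the signed 8-bit range.
scale = max(abs(w) for w in weights) / 127

quantized = [round(w / scale) for w in quantized_src] if False else [round(w / scale) for w in weights]
dequantized = [q * scale for q in quantized]  # approximate reconstruction

# The rounding error per weight is at most half a quantization step.
max_error = max(abs(w - d) for w, d in zip(weights, dequantized))
print(quantized)
print(f"worst-case rounding error: {max_error:.4f}")
```

The trade-off Lin describes is visible here: the integer form is four times smaller and cheaper to compute with, but every weight is slightly perturbed, so compression must be balanced against accuracy.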

To someone who doesn’t often think about human-computer interaction, the idea of spending computing and engineering power on algorithms that track passengers’ positions might seem impractical.

It all comes down to what people want, and what the rapidly growing autonomous vehicle industry is providing, says Richard Corey, a professor of spatial informatics at the University of Maine who is working with Lin on the project.

“These automotive industry folks have got all these magical ideas of how we’re going to be interacting in these cars,” Corey says. “You see people in pictures with tables, and they’re playing cards, or they’re laying down to take a nap.”

Corey directs the Virtual Environment & Multimodal Interaction Laboratory at the University of Maine, a facility that will be crucial in testing Lin’s deep learning models. After studying self-driving vehicles in recent years, Corey says the human-machine interaction is a matter of two-way communication.

“We realized that there’s going to be times when the vehicle is going to need stuff from us, or we’re going to need stuff from the vehicle,” he says. “And we just know that there’s going to be many different ways that people then will need to interact [with the vehicle].”

Nicholas Giudice, a professor of spatial informatics at the University of Maine who co-founded the VEMI lab with Corey, says autonomous vehicles that are designed to help people with disabilities will benefit every passenger.

“In an autonomous vehicle, sighted people are going to be highly distracted,” Giudice says. “They aren’t going to be aware of their environment, much like a blind person will be, so in that instance, the types of things that we’re talking about could support them as well, which is a huge amount of people.”

Lin’s project is one of five that have been selected following a rigorous review process involving peer faculty reviewers and research leaders at each university. Selected from a pool of 21 applications, the five winners were awarded $50,000 each. The other four winners include:

  • a proposal to look at artificial intelligence’s role in examining the interplay between newborns’ pacifier use and Sudden Infant Death Syndrome, the leading cause of infant mortality in the first 30 days after birth. The project is led by Sarah Ostadabbas, assistant professor of electrical and computer engineering at Northeastern, and Rebecca Schwartz-Mette, assistant professor of psychology at UMaine.
  • a proposal to develop a new biofluid analysis instrument that would have unprecedented sensitivity and selectivity, and could have broad applications for health care and medical diagnoses. The project is led by Srinivas Tadigadapa, professor and chair of the Department of Electrical and Computer Engineering at Northeastern, and Rosemary Smith, professor of electrical and computer engineering at UMaine.
  • a proposal to develop a new, cost-effective, easily deployable ingredient to help create a stronger immune response in fish, led by Jiahe Li, assistant professor of bioengineering at Northeastern, Xin Sun, doctoral candidate at Northeastern, and research collaborators at UMaine.
  • a proposal to better understand the immune system’s response to Influenza A virus infection and develop an automated AI-based network modeling approach to find new antiviral therapeutic targets. The team is led by Mingyang Lu, assistant professor of bioengineering and Ataur Katebi, associate research scientist in the Department of Bioengineering, both at Northeastern, and research collaborators at UMaine.

“I guarantee that some of these collaborations will yield proposals going to the federal government,” said David Luzzi, senior vice provost for research at Northeastern and vice president of the Innovation Campus in Burlington, MA. “When you bring together human minds where the ability to make intuitive leaps can feed off of each other, that’s where the power of collaboration really happens.”

by Roberto Molar Candanosa, News @ Northeastern

Related Faculty: Xue "Shelley" Lin, Sarah Ostadabbas, Srinivas Tadigadapa, Mingyang Lu, David Luzzi

Related Departments: Electrical & Computer Engineering