The Magic of Engineering

Student working on a robotic hand that can do sign language

Blog by Samantha Johnson, E’21 and ME’21, bioengineering

Have you ever seen a magic trick? Not an elaborate show where a man with a rabbit in his hat saws a beautiful woman in half, but a close-up magic trick, one where you can see the magician's hands and fully conceptualize all the variables at play: you know how many cards are in the deck, you know how his hands can move, and you know the card you chose. And yet, despite your seemingly complete understanding of the situation, you cannot wrap your head around how the magician determined your card.

To me, tactile American Sign Language (ASL) is like close-up magic.

Robot hand doing sign language

When I was a second-year bioengineering student, before knowing anything about circuits or anthropomorphic relationships, I took a sign language class with Professor Laurie Achin. As part of the class, I was required to go out into deaf and deafblind communities to learn about communication patterns and to gain a better understanding of the life of a deaf person in Boston. On one outing, I met a deafblind woman, Lucy. By Lucy’s side was an interpreter. When I spoke to Lucy, she would place her right hand atop the interpreter’s hand, and he would begin signing ASL. Solely through feeling the top of her interpreter’s hand as he signed, Lucy was able to understand what I was saying. She was able to, somehow, feel a visual language.

I immediately became fascinated with Lucy and the deafblind community. With two fewer senses than you and me, Lucy was able to travel, hold a job, and maintain relationships, needing only an interpreter to provide communication assistance. And though I was impressed by the independence Lucy could attain, I began to wonder whether complete self-sufficiency might be possible. I began thinking about the autonomy she must be striving for, comparing it to my own growing sense of independence as a college student. All she needed was a way to communicate efficiently with the outside world without the use of an interpreter.

Master’s thesis focused on aiding the deafblind community

Samantha Johnson

With the start of my master’s thesis, I began to design a device to aid Lucy and the deafblind community. With classes like bioelectricity and musculoskeletal biomechanics under my belt, I reached out to Bioengineering Assistant Professor Chiara Bellini and told her about my idea: to design a robotic arm capable of signing the complex motions of ASL and serving as a communication tool for the deafblind. The project quickly took off as we found collaboration partners, both down the street at the Deafblind Contact Center in Allston, Massachusetts, and across the world at the New Dexterity Lab in Auckland, New Zealand. Only a few months into the project, we already have a functional robotic hand, ready to be fine-tuned through feedback sessions with deafblind participants.
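For readers curious what the control side of a device like this might look like, here is a minimal, purely illustrative sketch, not the actual thesis code. It assumes a hypothetical table of finger joint angles for a few fingerspelled letters and a made-up send_to_servos function standing in for whatever motor interface the real hand uses.

```python
# Illustrative sketch only: maps a few ASL fingerspelled letters to
# hypothetical finger joint angles (degrees) and "sends" them to servos.
# The angle values and the servo interface are placeholders, not the
# actual parameters of the thesis project's robotic hand.

import time

# Hypothetical joint-angle presets: (thumb, index, middle, ring, pinky),
# where 0 = fully extended and 90 = fully curled.
HANDSHAPES = {
    "A": (45, 90, 90, 90, 90),   # fist with thumb alongside
    "B": (80, 0, 0, 0, 0),       # flat hand, thumb tucked
    "L": (0, 0, 90, 90, 90),     # index and thumb extended
}

def send_to_servos(angles):
    """Placeholder for a real motor interface (e.g., serial commands)."""
    print(f"setting joints to {angles}")

def fingerspell(word, hold_seconds=1.0):
    """Step through a word one letter at a time, holding each handshape."""
    for letter in word.upper():
        pose = HANDSHAPES.get(letter)
        if pose is None:
            print(f"no handshape defined for '{letter}', skipping")
            continue
        send_to_servos(pose)
        time.sleep(hold_seconds)

if __name__ == "__main__":
    fingerspell("LAB")
```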

As I continued to work on my thesis during the COVID-19 pandemic, social distancing increased the need for an independent interpreting device for the deafblind community. As the availability of interpreter services decreased, deafblind individuals experienced limited access to family, public safety announcements, and their communities. With one more semester of school, I plan to make the ASL-interpreting robotic arm an open-source device, capable of expanding community involvement for any deafblind individual looking for a new way of communicating.
