
ECE PhD Dissertation Defense: Mo Han

March 26, 2021 @ 12:00 pm - 1:00 pm

PhD Dissertation Defense: Human Grasp Intent Inference and Multimodal Control in Prosthetic Hands

Mo Han

Location: Zoom Link

Abstract: Upper-limb and hand functionality is critical to many activities of daily living, and amputation can lead to significant loss of function for individuals. From this perspective, advanced prosthetic hands of the future are anticipated to benefit from improved shared control between a robotic hand and its human user, and more importantly from an improved capability to infer human intent from multimodal sensor data, giving the robotic hand perceptual awareness of its operational context. Such multimodal data may be collected from environment sensors, such as a camera providing visual information, as well as from easily accessible physiological sensors, including electromyographic (EMG) sensors. A fusion methodology for estimating the environmental state and human intent can combine these sources of evidence to support prosthetic hand motion planning and control.
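As a rough illustration of the kind of evidence fusion described above, the following is a minimal sketch, not the author's actual method: it assumes each modality (EMG and vision) already produces a posterior over a small set of hypothetical grasp classes, and combines them with a log-linear, product-of-experts-style weighting before renormalizing.

```python
import numpy as np

GRASPS = ["power", "precision", "lateral", "tripod"]  # hypothetical grasp classes


def fuse_grasp_posteriors(p_emg, p_vision, prior=None, w_emg=0.5, w_vision=0.5):
    """Fuse EMG-based and vision-based grasp-class posteriors.

    Log-linear fusion: each modality's posterior is raised to a confidence
    weight, combined with a prior over grasp classes, then renormalized.
    """
    p_emg = np.asarray(p_emg, dtype=float)
    p_vision = np.asarray(p_vision, dtype=float)
    prior = np.ones_like(p_emg) / len(p_emg) if prior is None else np.asarray(prior)

    log_post = (w_emg * np.log(p_emg + 1e-12)
                + w_vision * np.log(p_vision + 1e-12)
                + np.log(prior + 1e-12))
    post = np.exp(log_post - log_post.max())  # subtract max for numerical stability
    return post / post.sum()


# Example: EMG weakly favors "power", vision strongly favors "precision"
fused = fuse_grasp_posteriors([0.40, 0.30, 0.20, 0.10], [0.10, 0.70, 0.10, 0.10])
print(dict(zip(GRASPS, fused.round(3))))
```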

As part of a multidisciplinary effort, the HANDS project, which aims to design a robotic hand as an upper-limb prosthetic device, we developed two independent prosthetic control systems (HANDS V1 and HANDS V2) that integrate multimodal EMG and visual evidence into the control loop. Both systems were developed as lighter, lower-cost, semi-autonomous devices capable of performing the multiple grasps required for activities of daily living. The HANDS V1 system was developed first to provide an easy and convenient prosthesis with a portable EMG armband and a built-in palm camera; HANDS V2 was then constructed as an upgraded solution to handle more difficult tasks, with more grasp types to identify, more EMG channels, and more complex visual information. Both systems depend on multimodal signals from EMG and vision: EMG reflects the physiological features related to user intent, while visual information, which relies more on the surrounding environment, preserves robustness and adaptability across different users. We collected a dataset to initialize each system and developed EMG-control, visual-control, and joint-control algorithms for both. We exploited efficient computer vision and physiological signal processing methodologies to reduce system complexity and improve user comfort, with the goal of providing smarter and more affordable prosthetic hands. Online experiments were conducted and evaluated on both the HANDS V1 and HANDS V2 systems, implemented with the Robot Operating System (ROS). A rough sketch of what such a ROS control loop might look like follows.
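The abstract states only that the systems were implemented with ROS; the sketch below is a hypothetical rospy node, with assumed topic names, message types, and grasp classes, showing one plausible way EMG and vision evidence could be fused and turned into a grasp command inside a ROS control loop.

```python
#!/usr/bin/env python
# Hypothetical rospy node: subscribes to assumed EMG and vision posterior
# topics, fuses the evidence, and publishes a grasp command.
import numpy as np
import rospy
from std_msgs.msg import Float32MultiArray, String

GRASPS = ["power", "precision", "lateral", "tripod"]  # hypothetical grasp classes


class GraspIntentFuser(object):
    def __init__(self):
        self.p_emg = None
        self.p_vision = None
        self.pub = rospy.Publisher("/hand/grasp_command", String, queue_size=1)
        rospy.Subscriber("/emg/grasp_posterior", Float32MultiArray, self.on_emg)
        rospy.Subscriber("/vision/grasp_posterior", Float32MultiArray, self.on_vision)

    def on_emg(self, msg):
        self.p_emg = np.array(msg.data)
        self.try_publish()

    def on_vision(self, msg):
        self.p_vision = np.array(msg.data)
        self.try_publish()

    def try_publish(self):
        # Wait until both modalities have reported at least once.
        if self.p_emg is None or self.p_vision is None:
            return
        fused = self.p_emg * self.p_vision  # naive product of evidence
        fused /= fused.sum()
        self.pub.publish(String(data=GRASPS[int(np.argmax(fused))]))


if __name__ == "__main__":
    rospy.init_node("grasp_intent_fuser")
    GraspIntentFuser()
    rospy.spin()
```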

Details

Date:
March 26, 2021
Time:
12:00 pm - 1:00 pm
Website:
https://northeastern.zoom.us/j/95550946164#success

Other

Department
Electrical and Computer Engineering
Topics
MS/PhD Thesis Defense