
Cheng Gongye’s PhD Proposal Review

June 2, 2023 @ 11:00 am - 12:00 pm

“Hardware Security Vulnerabilities in Deep Neural Networks and Mitigations”

Committee Members:
Prof. Yunsi Fei (Advisor)
Prof. Xue Lin
Prof. Xiaolin Xu

Abstract:
Over the past decade, Deep Neural Networks (DNNs) have revolutionized numerous fields. With the increasing deployment of DNN models in security-sensitive and mission-critical applications, such as autonomous driving, ensuring the security and privacy of DNN inference is of paramount importance.

This Ph.D. dissertation investigates two primary hardware security attack vectors: fault attacks and side-channel attacks. Fault attacks compromise the integrity of a targeted application by intentionally disrupting its computation or injecting faults into its parameters. Side-channel attacks exploit information leakage from the application's execution, through physical quantities such as power consumption, electromagnetic emanations, and timing, to retrieve secrets, thereby breaching confidentiality.

For fault attacks, we demonstrate a power-glitching fault injection attack on FPGA-based DNN accelerators in cloud environments. The attack exploits vulnerabilities in the shared power distribution network and leverages time-to-digital converter (TDC) sensors to time the fault injection precisely, causing model misclassification and thus compromising the integrity of the targeted application. On the defense side, we propose a lightweight framework for detecting and mitigating adversarial bit-flip attacks induced by RowHammer on DNNs. The framework combines a dynamic channel-shuffling obfuscation scheme with a logits-based model integrity monitor, incurs negligible performance loss, and protects various DNN models from RowHammer attacks without any retraining or model structure modifications.
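As a rough illustration of the logits-based monitoring idea, the sketch below flags outputs whose largest logit magnitude exceeds a bound calibrated on clean runs. The function names, threshold rule, and synthetic data are illustrative assumptions, not the dissertation's actual design.

import numpy as np

def calibrate_logit_bound(reference_logits, k=6.0):
    # Derive an alarm threshold from logits collected on clean runs
    # (k is an illustrative margin, not a tuned parameter).
    mags = np.abs(reference_logits).max(axis=1)
    return mags.mean() + k * mags.std()

def logits_look_tampered(logits, threshold):
    # Flag an output whose largest logit magnitude exceeds the calibrated bound.
    return bool(np.abs(logits).max() > threshold)

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 3.0, size=(1000, 10))     # synthetic stand-in for healthy model logits
threshold = calibrate_logit_bound(clean)
suspect = clean[0].copy()
suspect[3] += 500.0                               # bit-flips in weights often inflate a logit
print(logits_look_tampered(clean[0], threshold))  # expected: False
print(logits_look_tampered(suspect, threshold))   # expected: True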

For side-channel attacks, we present a floating-point timing side-channel attack that reverse-engineers multi-layer perceptron (MLP) model parameters from software implementations. The attack successfully recovers DNN parameters, including weights and biases.
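One well-known source of data-dependent floating-point timing is the slow handling of subnormal operands; the minimal measurement below illustrates that kind of gap with NumPy. Whether this matches the attack's exact leakage mechanism is an assumption, and the observed gap depends on the CPU and its flush-to-zero settings.

import time
import numpy as np

def time_multiply(operand, trials=200, n=100_000):
    x = np.full(n, operand, dtype=np.float64)
    start = time.perf_counter()
    for _ in range(trials):
        _ = x * 0.5          # same operation; only the operand class differs
    return time.perf_counter() - start

t_normal = time_multiply(1.0)        # normal operands
t_subnormal = time_multiply(5e-324)  # smallest positive subnormal double
print(f"normal: {t_normal:.4f}s  subnormal: {t_subnormal:.4f}s")
# A consistent gap between the two suggests that operand values inside an MLP
# could be inferred from inference latency, which is the premise of the attack.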

Regarding ongoing research, we observe that previous studies often target academic prototypes, which limits their practical applicability. To bridge this gap, we select the AMD-Xilinx DPU, one of the most advanced DNN accelerators to date, as the target of our analysis. We propose a side-channel attack that exploits electromagnetic emissions to extract model parameters. Furthermore, we propose a comprehensive fault analysis of quantized DNN models through simulation and discuss potential mitigation strategies.
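The sketch below shows the flavor of such a bit-flip fault simulation on a quantized (int8) weight tensor; the fault model, chosen indices, and helper function are hypothetical stand-ins for the proposed analysis, not the actual campaign.

import numpy as np

def flip_bit(weights_int8, index, bit):
    # Return a copy of an int8 weight tensor with one bit flipped at a flat index.
    faulty = weights_int8.copy()
    faulty.flat[index] = np.int8(faulty.flat[index] ^ np.int8(1 << bit))
    return faulty

rng = np.random.default_rng(1)
w = rng.integers(-128, 128, size=(8, 8), dtype=np.int8)  # stand-in for a quantized layer
w_faulty = flip_bit(w, index=10, bit=6)                  # flip a high-order magnitude bit
print("original:", int(w.flat[10]), "faulty:", int(w_faulty.flat[10]))
# Sweeping (index, bit) pairs and re-running inference would measure how model
# accuracy degrades under single-bit faults, per the fault analysis described above.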

Details

Date: June 2, 2023
Time: 11:00 am - 12:00 pm
Website: https://northeastern.zoom.us/j/99188817074?pwd=TncxekdkaEVtMldYOFBvaGxIWFFrUT09#success

Other

Department: Electrical and Computer Engineering
Topics: MS/PhD Thesis Defense
Audience: Faculty, Staff