
Cheng Gongye PhD Dissertation Defense

December 4, 2023 @ 10:30 am - 11:30 am

Title:
Hardware Security Vulnerabilities in Deep Neural Networks and Mitigations

Date:
12/4/2023

Time:
10:30 AM

Committee Members:
Prof. Yunsi Fei (Advisor)
Prof. Aidong Ding
Prof. Xue Lin
Prof. Xiaolin Xu

Abstract:
In the past decade, Deep Neural Networks (DNNs) have become pivotal in numerous fields, including security-sensitive autonomous driving and privacy-critical medical diagnosis. This Ph.D. dissertation delves into the hardware security of DNNs, discovering their vulnerabilities to fault and side-channel attacks and exploring novel countermeasures essential for their safe deployment in critical applications.

Fault attacks disrupt computation or inject faults into model parameters, compromising the integrity of targeted applications. This dissertation demonstrates a power-glitching fault injection attack on FPGA-based DNN accelerators, common in cloud environments, which exploits vulnerabilities in the shared power distribution network and results in model misclassification. In response to these threats, we introduce a novel, lightweight defense mechanism to protect DNN parameters from adversarial bit-flip attacks. The proposed framework incorporates a dynamic channel-shuffling obfuscation scheme coupled with a logits-based model integrity monitor, and it effectively safeguards various DNN models against bit-flip attacks without requiring retraining or structural changes to the models. Furthermore, our research expands the scope of fault analysis beyond the parameters of DNN models: we thoroughly examine the full implementations of commercial products, challenging the prevailing assumption that quantized DNNs are inherently resistant to bit-flip attacks.
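To make the channel-shuffling idea concrete, here is a minimal PyTorch sketch (an illustrative reconstruction under our own assumptions, not the dissertation's implementation): permuting one convolution's output channels together with the next convolution's matching input channels leaves the network's function unchanged while randomizing the physical layout of the weights, so bit-flip locations profiled against one layout no longer hit the attacker's intended channels.

```python
# Minimal sketch of dynamic channel-shuffling obfuscation.
# Illustrative only: function and variable names are our assumptions,
# not the dissertation's actual implementation.
import torch
import torch.nn as nn

def shuffle_channels(conv_a: nn.Conv2d, conv_b: nn.Conv2d, perm: torch.Tensor) -> None:
    """Permute conv_a's output channels and conv_b's matching input channels."""
    with torch.no_grad():
        conv_a.weight.copy_(conv_a.weight[perm])     # weight shape: (out, in, kH, kW)
        if conv_a.bias is not None:
            conv_a.bias.copy_(conv_a.bias[perm])
        conv_b.weight.copy_(conv_b.weight[:, perm])  # realign the consumer's inputs

model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1),
                      nn.ReLU(),
                      nn.Conv2d(8, 4, 3, padding=1))
x = torch.randn(1, 3, 16, 16)
reference = model(x)

perm = torch.randperm(8)            # fresh secret permutation per deployment
shuffle_channels(model[0], model[2], perm)

# The shuffled model is functionally equivalent, but a bit flip aimed at a
# profiled weight address now lands in an unpredictable channel.
assert torch.allclose(model(x), reference, atol=1e-5)
```

The logits-based integrity monitor described above (not sketched here) would complement this by flagging inference outputs whose logits deviate from an expected profile, catching flips that obfuscation alone does not neutralize.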

Side-channel attacks exploit information leaked by system implementations, such as power consumption and electromagnetic (EM) emanations, to reveal system secrets and thereby compromise confidentiality. This dissertation makes significant contributions to side-channel-assisted model extraction of DNNs. We present a floating-point timing side-channel attack on x86 CPUs that reverse-engineers DNN model parameters in software implementations. For hardware accelerators, we target the state-of-the-art AMD-Xilinx deep-learning processor unit (DPU), a reconfigurable engine dedicated to convolutional neural networks (CNNs) and the most complex commercial FPGA accelerator with encrypted IPs. Our work demonstrates that EM analysis can be leveraged to recover the data flow and scheduling of DNN accelerators, facilitating follow-on architecture and parameter extraction attacks. To mitigate EM side-channel model extraction attacks, we introduce a novel defense mechanism that applies a random, importance-aware activation mask to input pixels to disrupt the operation alignment on EM traces, with minimal performance and efficiency impact.
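The floating-point timing channel builds on a well-documented x86 effect: arithmetic on subnormal (denormal) operands falls back to a microcoded slow path, so operation latency reveals when values cross the subnormal range. The toy measurement below demonstrates only that underlying timing difference (our own sketch; it does not recover any model parameters):

```python
# Toy demonstration of the subnormal slow path on x86 CPUs.
# Our own illustrative sketch; the dissertation's attack correlates such
# timing differences with DNN computations to reverse-engineer parameters.
import time
import numpy as np

def time_multiply(value: float, n: int = 5_000_000) -> float:
    """Time n elementwise double-precision multiplications by 1.5."""
    x = np.full(n, value, dtype=np.float64)
    start = time.perf_counter()
    _ = x * 1.5  # subnormal operands trigger a microcode assist on many x86 CPUs
    return time.perf_counter() - start

print(f"normal inputs:    {time_multiply(1.0):.3f} s")
print(f"subnormal inputs: {time_multiply(5e-324):.3f} s")  # markedly slower
```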

Overall, this dissertation significantly deepens the understanding of the hardware security of DNN models. It makes important contributions by discovering novel, critical vulnerabilities in DNN inference that stem from system implementations, and by proposing effective, practical solutions for securing DNNs in mission-critical environments. The research marks a substantial step forward in the development of resilient and secure AI systems.

Details

Website:
https://northeastern.zoom.us/j/93474737782?pwd=TWJjSGxtQmtzaFpjZGhsSHNBMTFSUT09

Other

Department
Electrical and Computer Engineering
Topics
MS/PhD Thesis Defense
Audience
MS, PhD, Faculty, Staff