BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Northeastern University College of Engineering - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:Northeastern University College of Engineering
X-ORIGINAL-URL:https://coe.northeastern.edu
X-WR-CALDESC:Events for Northeastern University College of Engineering
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20210314T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20211107T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20220313T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20221106T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20230312T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20231105T060000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20220804T140000
DTEND;TZID=America/New_York:20220804T150000
DTSTAMP:20260516T184845Z
CREATED:20221103T142334Z
LAST-MODIFIED:20221103T142334Z
UID:34117-1659621600-1659625200@coe.northeastern.edu
SUMMARY:Tong Jian's PhD Dissertation Defense
DESCRIPTION:“Robust Sparsified Deep Learning” \nAbstract: \nThis dissertation studies robustness issues around DNN deployments on resource-constrained systems\, under both environmental and adversarial input adaptation. We propose a means of compressing a Radio Frequency deep neural network architecture through weight pruning\, and provide a systems-level analysis of implementing such a pruned architecture on resource-constrained edge devices. In particular\, we jointly train and sparsify neural networks tailored to edge hardware implementations.\nNext\, we propose a new learn-prune-share (LPS) algorithm for achieving robustness to environment adaptation in the field of lifelong learning. Our method maintains a parsimonious neural network model and achieves exactly no forgetting by splitting the network into task-specific partitions via a weight pruning strategy optimized by the Alternating Direction Method of Multipliers (ADMM). Moreover\, a novel selective knowledge sharing scheme is integrated seamlessly into the ADMM optimization framework to address knowledge reuse.\nFurthermore\, we investigate the Hilbert-Schmidt Information Bottleneck as a regularizer (HBaR) as a means to enhance adversarial robustness. We show that the Hilbert-Schmidt Information Bottleneck enhances robustness to adversarial attacks both theoretically and experimentally. In particular\, we prove that the HSIC bottleneck regularizer reduces the sensitivity of the classifier to adversarial examples.\nFinally\, we propose a novel framework\, Pruning without Adversarial training (PwoA)\, for achieving adversarial robustness on resource-constrained systems. PwoA can efficiently prune a previously trained robust neural network while maintaining adversarial robustness\, without generating further adversarial examples. We leverage concurrent self-distillation and pruning to preserve knowledge in the original model while regularizing the pruned model via HBaR.
 \nCommittee:\nProf. Stratis Ioannidis (Advisor)\nProf. Jennifer Dy\nProf. Kaushik Chowdhury\nProf. Yanzhi Wang
URL:https://coe.northeastern.edu/event/tong-jians-phd-dissertation-defense/
END:VEVENT
END:VCALENDAR