BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Northeastern University College of Engineering - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://coe.northeastern.edu
X-WR-CALDESC:Events for Northeastern University College of Engineering
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20200308T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20201101T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20210314T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20211107T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20220313T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20221106T060000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20210707T140000
DTEND;TZID=America/New_York:20210707T150000
DTSTAMP:20260513T100641Z
CREATED:20210706T135010Z
LAST-MODIFIED:20210706T135010Z
UID:26505-1625666400-1625670000@coe.northeastern.edu
SUMMARY:ECE PhD Proposal Review: Xiaolong Ma
DESCRIPTION:PhD Proposal Review: Towards Efficient Deep Neural Network Execution with Model Compression and Platform-specific Optimization \nXiaolong Ma \nLocation: Zoom \nAbstract: Deep learning or deep neural networks (DNNs) have become the fundamental element and core enabler of ubiquitous artificial intelligence. Recently\, with the emergence of a spectrum of high-end mobile devices\, many deep learning applications that formerly required desktop-level computation capability are being transferred to these devices. However\, executing DNN inference remains challenging given the high computation and storage demands\, especially when real-time performance with high accuracy is needed. Weight pruning of DNNs has been proposed\, but existing schemes represent two extremes in the design space: non-structured pruning is fine-grained and accurate but not hardware-friendly; structured pruning is coarse-grained and hardware-efficient but incurs higher accuracy loss. To solve this problem\, we propose a compression-compilation co-optimization framework\, which includes 1) a new dimension\, fine-grained pruning patterns inside the coarse-grained structures\, that achieves accuracy enhancement and preserves the structural regularity that can be leveraged for hardware acceleration\, 2) a pattern-aware pruning framework that performs pattern library extraction\, pattern selection\, pattern and connectivity pruning\, and weight training simultaneously\, and 3) a set of thorough architecture-aware compiler/code generation-based optimizations\, i.e.\, filter kernel reordering\, compressed weight storage\, register load redundancy elimination\, and parameter auto-tuning\, for real-time execution of mainstream DNN applications on mobile platforms.
 Evaluation results demonstrate that our framework outperforms three state-of-the-art end-to-end DNN frameworks\, TensorFlow Lite\, TVM\, and Alibaba Mobile Neural Network\, with speedups of up to 44.5x\, 11.4x\, and 7.1x\, respectively\, with no accuracy compromise. Real-time inference of representative large-scale DNNs (e.g.\, VGG-16\, ResNet-50) can be achieved on mobile devices.
URL:https://coe.northeastern.edu/event/ece-phd-proposal-review-xiaolong-ma/
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20210707T170000
DTEND;TZID=America/New_York:20210707T180000
DTSTAMP:20260513T100641Z
CREATED:20210706T135131Z
LAST-MODIFIED:20210706T135131Z
UID:26484-1625677200-1625680800@coe.northeastern.edu
SUMMARY:ECE PhD Dissertation Defense: Kaidi Xu
DESCRIPTION:PhD Dissertation Defense: Can We Trust AI? Towards Practical Implementation and Theoretical Analysis in Trustworthy Machine Learning \nKaidi Xu \nLocation: Zoom Link \nAbstract: Deep learning has recently achieved extraordinary performance in many application domains. It is well accepted that DNNs are vulnerable to adversarial attacks\, which raises concerns about DNNs in security-critical applications and may result in disastrous consequences. Adversarial attacks are usually implemented by generating adversarial examples\, i.e.\, adding sophisticated perturbations onto benign examples\, such that the adversarial examples are classified by the DNN as target (wrong) labels instead of the correct labels of the benign examples. Adversarial machine learning aims to study this phenomenon and leverage it to build robust machine learning systems and explain DNNs.\nIn this talk\, I will present the mechanisms of adversarial machine learning in both empirical and theoretical ways. First\, a uniform adversarial attack generation framework\, the structured attack (StrAttack)\, is introduced\, which explores group sparsity in adversarial perturbations by sliding a mask through images to extract key spatial structures. Second\, we discuss the feasibility of adversarial attacks in the physical world and introduce a convincing framework\, Expectation over Transformation (EoT). Utilizing EoT with a Thin Plate Spline (TPS) transformation\, we can generate Adversarial T-shirts\, a powerful physical adversarial patch for evading person detectors even when it undergoes non-rigid deformation due to a moving person’s pose changes. Third\, we stand on the defense side and design the first adversarial training method based on Graph Neural Networks. Finally\, we introduce Linear relaxation-based perturbation analysis (LiRPA) for neural networks\, which computes provable linear bounds on output neurons given a certain amount of input perturbation.
 LiRPA studies adversarial examples in a theoretical way and can guarantee the test accuracy of a model under given perturbation constraints. The generality\, flexibility\, efficiency\, and ease of use of our proposed framework facilitate the adoption of LiRPA-based provable methods for other machine learning problems beyond robustness verification.
URL:https://coe.northeastern.edu/event/ece-phd-dissertation-defense-kaidi-xu/
END:VEVENT
END:VCALENDAR