BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Northeastern University College of Engineering - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://coe.northeastern.edu
X-WR-CALDESC:Events for Northeastern University College of Engineering
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20230312T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20231105T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20240310T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20241103T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20250309T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20251102T060000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20240812T083000
DTEND;TZID=America/New_York:20240812T090000
DTSTAMP:20260424T130203Z
CREATED:20240805T170809Z
LAST-MODIFIED:20240805T170809Z
UID:44828-1723451400-1723453200@coe.northeastern.edu
SUMMARY:Galante Program Virtual Info Sessions
DESCRIPTION:Learn how the Galante Engineering Business Program and Engineering Business Certificate can complement your technical engineering education with essential business skills. \nJoin us for an informational session to learn more about the Galante Engineering Business Program on one of the following dates: \n\nThursday\, August 8 at 4:30 p.m. EDT – Virtual\nMonday\, August 12 at 8:30 a.m. EDT – Virtual\nTuesday\, August 20 at 12:00 p.m. EDT – Virtual\nThursday\, August 22 at 9:00 a.m. EDT – Virtual\n\nRSVP Here \nDuring the session\, we will cover the details of the Galante Program and Engineering Business Certificate\, including the application process and eligibility requirements.
URL:https://coe.northeastern.edu/event/galante-program-virtual-info-sessions/2024-08-12/
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20240812T100000
DTEND;TZID=America/New_York:20240812T110000
DTSTAMP:20260424T130203Z
CREATED:20240820T180016Z
LAST-MODIFIED:20240820T180016Z
UID:45121-1723456800-1723460400@coe.northeastern.edu
SUMMARY:Gözde Özcan PhD Dissertation Defense
DESCRIPTION:Name:\nGözde Özcan \nTitle:\nLearning and Optimizing Set Functions \nDate:\n8/12/2024 \nTime:\n10:00:00 AM \nLocation:\nEXP 601\nCommittee Members:\nProf. Stratis Ioannidis (Advisor)\nProf. Jennifer Dy\nProf. Evimaria Terzi \nAbstract:\nLearning and optimizing set functions play a crucial role in artificial intelligence research\, as various problems of interest can be characterized with set inputs and/or outputs. Submodular functions\, i.e.\, set functions with a diminishing returns property\, are an important subcategory of such functions. They arise naturally in applications such as sensor placement\, data summarization\, feature selection\, influence maximization\, hyper-parameter optimization\, and facility location\, to name a few. In many of these compelling problems\, the objective is to maximize a submodular function subject to matroid constraints\, which is known to be NP-hard. For problems of this nature\, the continuous greedy algorithm provides a (1 − 1/e)-approximation guarantee in polynomial time. It does so by estimating the gradient of the so-called multilinear relaxation of the objective function via sampling. However\, for the general class of submodular functions\, the number of samples required to achieve this theoretical guarantee can be computationally prohibitive. \nIn this dissertation\, we address deterministic submodular maximization problems with matroid constraints\, specifically those with objectives expressed through compositions of analytic and multilinear functions. We introduce a novel polynomial series estimator to approximate the multilinear relaxation of such functions and demonstrate that the sub-optimality introduced by our polynomial expansion can be minimized by increasing the polynomial order. By utilizing this estimator\, a variant of the continuous greedy algorithm achieves an approximation ratio close to (1 − 1/e) ≈ 0.63 through deterministic gradient estimation. In numerical experiments\, our polynomial estimator outperforms the sampling estimator\, offering reduced errors in less time. \nWe extend our study to the stochastic submodular maximization setting with general matroid constraints\, where objectives are defined as expectations over submodular functions with an unknown distribution. Adapting polynomial estimators to this context reduces the variance of the gradient estimation while introducing a controlled bias term. For several notable stochastic submodular maximization problems\, we demonstrate that this bias decays exponentially with the degree of our polynomial approximators. Furthermore\, for monotone functions\, a stochastic variant of the continuous greedy algorithm attains an approximation ratio (in expectation) close to (1 − 1/e) ≈ 0.63 using these polynomial estimators. Our experimental results validate the advantages of our approach across synthetic and real-life datasets. \nFinally\, we turn our attention to learning set functions under a so-called optimal subset oracle setting. A recent approach approximates the underlying utility function with an energy-based model. Approximating this energy-based model yields iterations of fixed-point update steps during mean-field variational inference. However\, these fixed-point iterations are not guaranteed to converge\, and as the number of iterations increases\, automatic differentiation quickly becomes computationally prohibitive due to the size of the Jacobians stacked during backpropagation. We address these challenges by examining the convergence conditions of the fixed-point iterations and by using implicit differentiation in place of automatic differentiation. We empirically demonstrate the efficiency of our method on synthetic and real-world subset selection applications.
URL:https://coe.northeastern.edu/event/gozde-ozcan-phd-dissertation-defense/
END:VEVENT
END:VCALENDAR