BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Northeastern University College of Engineering - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:Northeastern University College of Engineering
X-ORIGINAL-URL:https://coe.northeastern.edu
X-WR-CALDESC:Events for Northeastern University College of Engineering
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20210314T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20211107T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20220313T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20221106T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20230312T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20231105T060000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20220825T140000
DTEND;TZID=America/New_York:20220825T150000
DTSTAMP:20260512T044506Z
CREATED:20221103T141955Z
LAST-MODIFIED:20221103T141955Z
UID:34131-1661436000-1661439600@coe.northeastern.edu
SUMMARY:Tarik Kelestemur's PhD Dissertation Defense
DESCRIPTION:Location: ISEC 532 \n“Combining Classical and Learning-based Methods for Visual and Tactile Manipulation” \nAbstract: \nRobots that operate in dynamic and ever-changing environments need to make sense of their surroundings and act in them safely and efficiently. This requires the integration of multiple sensory modalities\, such as vision and touch. Humans can naturally fuse different forms of feedback from the environment or substitute one for another to perform everyday tasks. For example\, to use a computer mouse\, we first locate the mouse using vision and then use touch feedback from our fingers to precisely localize the buttons. Ideally\, we would like robots to have human-level perception and control of the environment to achieve various tasks. This dissertation addresses two significant problems toward this overarching goal. \nThe first problem we consider in this dissertation is how to use tactile information in conjunction with visual feedback. Robotic manipulators that interact with objects and environments are often equipped with visual sensors such as RGB and depth cameras. They estimate the state of their environment using these sensors and act upon the estimated state. While a large body of previous work has shown that we can achieve impressive results with visual sensors alone\, more precise and delicate tasks require touch information\, which gives direct feedback from the environment. To this end\, we propose methods for efficiently combining tactile and visual information to leverage the advantages of both modalities.\nThe second problem we investigate is how to build visual and tactile manipulation methods that can generalize across novel environments and objects. The rise of deep learning has enabled robots to solve challenging perception and control problems using visual and tactile observations while generalizing to novel objects and environments. 
However\, a common issue among deep learning-based methods is that they usually work only within the distribution of the training data and do not perform well when presented with unseen examples. Furthermore\, they cannot distinguish whether they are dealing with in- or out-of-distribution data. We propose to address this issue by combining well-established and principled algorithmic priors with the generalization capabilities of deep learning. \nIn the first part of this dissertation\, we investigate the problem of pose estimation of robotic grippers with respect to the environment and objects. The proposed framework introduces a learnable Bayes filter that can estimate the position of a gripper in a single image of the environment. We learn the observation and motion models of the Bayes filter using modern neural network architectures and use recursive belief updates to track the position of the gripper over time. The belief estimate is then used as an input to policies whose aim is to solve manipulation tasks using tactile feedback. In the second part\, we look at the problem of estimating shapes from partial observations. We propose a method called DeepGPIS that combines a powerful deep learning-based implicit shape representation with a non-parametric inference approach for implicit surfaces (GPIS)\, which allows us to generate complete shapes of novel objects and estimate their predictive uncertainties. \nCommittee: \nProf. Taskin Padir (Advisor) \nProf. Robert Platt (Advisor) \nProf. David Rosen (Advisor)
URL:https://coe.northeastern.edu/event/tarik-kelestemurs-phd-dissertation-defense/
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20220825T150000
DTEND;TZID=America/New_York:20220825T153000
DTSTAMP:20260512T044506Z
CREATED:20220722T172244Z
LAST-MODIFIED:20220722T172255Z
UID:31979-1661439600-1661441400@coe.northeastern.edu
SUMMARY:Library Bite-Sized Webinar: How do I add a Creative Commons license to my work?
DESCRIPTION:You’ve probably heard about copyright\, but are you familiar with Creative Commons (CC)? CC licenses allow you to retain the copyright to your work\, even while granting advance permissions to others who may want to use or build upon it. \nIn this session\, you’ll learn the basics of the six Creative Commons licenses and how to actually apply a license to your work. We will also outline some considerations and questions to help you decide whether a CC license is right for you – and which one. Presented by Arts\, Humanities\, and Experiential Learning Librarian Regina Pagani. \nRegistration is required. Please register at the library events calendar. \nThis live webinar will be held in Eastern Time. The webinar will be recorded\, captioned\, and sent out to registrants. To receive a copy\, please register at the library events calendar.
URL:https://coe.northeastern.edu/event/library-bite-sized-webinar-how-do-i-add-a-creative-commons-license-to-my-work/
END:VEVENT
END:VCALENDAR