Mark Zolotas received the M.Eng. and Ph.D. degrees from the Department of Electrical and Electronic Engineering at Imperial College London, UK. He is currently a postdoctoral research associate with the Institute for Experiential Robotics at Northeastern University. His research interests span mixed reality, shared control, and representation learning for human-robot interaction. These topics all serve his overarching ambition to advance assistive robotics through transparent and seamless user experiences.
- Zolotas, M., Wonsick, M., Long, P. and Padır, T., 2021. Motion Polytopes in Virtual Reality for Shared Control in Remote Manipulation Applications. Frontiers in Robotics and AI, p. 286.
- Zolotas, M. and Demiris, Y., 2021. Disentangled Sequence Clustering for Human Intention Inference. arXiv preprint arXiv:2101.09500.
- Zolotas, M. and Demiris, Y., 2019, November. Towards Explainable Shared Control Using Augmented Reality. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 3020-3026). IEEE.
- Zolotas, M., Elsdon, J. and Demiris, Y., 2018, October. Head-Mounted Augmented Reality for Explainable Robotic Wheelchair Assistance. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 1823-1829). IEEE.
Sep 28, 2021
Northeastern University won second place and best technical paper at NASA’s 2020-21 RASC-AL Moon to Mars Ice and Prospecting Challenge for their project “Percussive And Rotary Surveying & Extracting Carousel (PARSEC).”