Time: April 12, 15:00-16:00
Venue: Room GC 222, Graduate Centre (new building 18, see campus map), QMUL Mile End campus, E1 4NS
Abstract: Mobile robots need dedicated sensing and processing for localisation and mapping as well as scene understanding. Recent years have brought tremendous advances in vision sensors (e.g. RGB-D cameras) and processing power (e.g. GPUs) that have enabled us to design new algorithms to empower the next generation of mobile robots. With the arrival of deep learning, we are now in a position to link its unprecedented performance in scene understanding with 3D mapping. In this talk, I will go through some recent algorithms and software we have developed, as well as their application to mobile robots, including drones.
Bio: I am a Lecturer in the Dyson Robotics Lab, Imperial College London, which I co-lead with Andrew Davison. My research is centred around autonomous robot navigation: robots need dedicated sensing capabilities as well as algorithms for localisation inside a potentially unknown environment. I received a BSc and MSc in Mechanical Engineering from ETH Zurich in 2006 and 2008, respectively, and a PhD in 2014, working at the Autonomous Systems Lab of ETH Zurich on Unmanned Solar Airplanes: Design and Algorithms for Efficient and Robust Autonomous Operation.
More details at the ARQ website.