Mar 29, 2011
Recent advances in probabilistic Simultaneous Localisation and Mapping (SLAM) algorithms, together with modern computing power, have made it possible to create practical systems able to perform real-time estimation of the motion of a single camera in 3D purely from the image stream it acquires. This is of interest not only in robotics but also in other fields such as wearable computing and augmented reality. I will review my work on visual SLAM over the past few years, and present recent work which now turns not only to estimating camera motion through very rapid or large-scale movement, but also to recovering dense scene models and mosaics in real time.
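As a rough illustration of the probabilistic estimation at the heart of such systems, the sketch below shows a generic extended Kalman filter predict/update cycle of the kind used in many single-camera SLAM formulations. It is not the speaker's implementation; the function names, state layout, and noise parameters are all illustrative assumptions.

```python
# Minimal sketch of an EKF predict/update loop as used in many probabilistic
# SLAM systems (illustrative only; not a specific published implementation).
import numpy as np

def ekf_predict(x, P, f, F_jac, Q):
    """Propagate the state (camera pose + map features) through the motion model f."""
    x_pred = f(x)                     # predicted state
    F = F_jac(x)                      # Jacobian of the motion model at x
    P_pred = F @ P @ F.T + Q          # predicted covariance with process noise Q
    return x_pred, P_pred

def ekf_update(x, P, z, h, H_jac, R):
    """Correct the prediction with an image measurement z of a mapped feature."""
    H = H_jac(x)                      # Jacobian of the measurement model at x
    y = z - h(x)                      # innovation: observed minus predicted measurement
    S = H @ P @ H.T + R               # innovation covariance with measurement noise R
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

if __name__ == "__main__":
    # Toy example: a 2D state observed directly, with an identity motion model.
    x0, P0 = np.zeros(2), np.eye(2)
    f = lambda x: x
    F_jac = lambda x: np.eye(2)
    h = lambda x: x
    H_jac = lambda x: np.eye(2)
    Q, R = 0.01 * np.eye(2), 0.1 * np.eye(2)
    x1, P1 = ekf_predict(x0, P0, f, F_jac, Q)
    x1, P1 = ekf_update(x1, P1, np.array([0.2, -0.1]), h, H_jac, R)
    print(x1, P1)
```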
Dr. Andrew Davison
Imperial College London