
[27/1/2015] Invited talk at Qualcomm Vienna

posted 30 Jan 2015, 03:56 by Hanme Kim   [ updated 25 Aug 2015, 14:02 ]
It was a great honour to give an invited talk in the prestigious Qualcomm Augmented Reality Lecture Series, organised by the Qualcomm Austria Research Center.

Visual SLAM with an event-based camera
Date: 27 January 2015
Speaker: Hanme Kim, Imperial College London

Abstract: An event camera is a silicon retina which outputs not a sequence of video frames like a standard camera, but a stream of asynchronous spikes, each with pixel location, sign and precise timing, indicating when an individual pixel records a threshold log intensity change. By encoding only image change, it offers the potential to transmit the information in a standard video at a vastly reduced bitrate, with the huge added advantages of very high dynamic range and temporal resolution. However, event data calls for new algorithms, and in particular we believe that algorithms which incrementally estimate global scene models are best placed to take full advantage of its properties. We recently showed for the first time that an event stream, with no additional sensing, can be used to track accurate camera rotation while building a persistent, high-quality mosaic of a scene which is super-resolution accurate and has high dynamic range. The method involves parallel camera rotation tracking and template reconstruction from estimated gradients, both operating on an event-by-event basis and based on probabilistic filtering. This talk will give an overview of our BMVC 2014 Best Industry Paper, "Simultaneous Mosaicing and Tracking with an Event Camera", the key ideas and techniques behind it, and some current and future extensions we are working on.
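
To make the event-by-event structure concrete, below is a minimal Python sketch of the kind of per-event loop described in the abstract. Only the Event fields (pixel location, sign, precise timestamp) come from the abstract; the class name, the contrast threshold, and the tracking and reconstruction updates are simplified, hypothetical stand-ins for illustration, not the filters used in the actual paper.

# Illustrative sketch of event-by-event processing: a crude rotation state and a
# per-pixel gradient template, both updated one event at a time. The update rules
# are simplified stand-ins, not the probabilistic filters from the BMVC 2014 paper.
from dataclasses import dataclass

import numpy as np


@dataclass
class Event:
    x: int          # pixel column
    y: int          # pixel row
    polarity: int   # +1 or -1: sign of the threshold log-intensity change
    t: float        # timestamp in seconds (microsecond precision in practice)


class EventMosaicSketch:
    """Toy per-event pipeline: rotation estimate plus per-pixel gradient template."""

    def __init__(self, height: int, width: int, contrast_threshold: float = 0.1):
        self.rotation = np.zeros(3)                  # crude rotation estimate (axis-angle)
        self.grad = np.zeros((height, width, 2))     # running log-intensity gradient estimate
        self.grad_var = np.ones((height, width, 2))  # per-pixel gradient uncertainty
        self.last_t = np.zeros((height, width))      # last event time at each pixel
        self.C = contrast_threshold                  # assumed log-intensity step per event

    def process(self, e: Event) -> None:
        dt = max(e.t - self.last_t[e.y, e.x], 1e-6)
        self.last_t[e.y, e.x] = e.t

        # 1) Tracking stand-in: nudge the rotation estimate; a real tracker would
        #    weight pose hypotheses by how well the event is predicted by the template.
        self.rotation += 1e-6 * e.polarity * np.array([e.y, e.x, 1.0])

        # 2) Reconstruction stand-in: scalar Kalman-style update of the gradient at
        #    this pixel from the measured brightness change rate (C / dt).
        meas = e.polarity * self.C / dt
        for k in range(2):
            gain = self.grad_var[e.y, e.x, k] / (self.grad_var[e.y, e.x, k] + 1.0)
            self.grad[e.y, e.x, k] += gain * (meas - self.grad[e.y, e.x, k])
            self.grad_var[e.y, e.x, k] *= (1.0 - gain)


if __name__ == "__main__":
    # Feed a stream of random synthetic events through the per-event loop.
    mosaic = EventMosaicSketch(height=128, width=128)
    rng = np.random.default_rng(0)
    t = 0.0
    for _ in range(1000):
        t += rng.exponential(1e-4)
        mosaic.process(Event(x=int(rng.integers(128)), y=int(rng.integers(128)),
                             polarity=int(rng.choice([-1, 1])), t=t))
    print("rotation estimate:", mosaic.rotation)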


Very nice view from the Qualcomm Austria Research Center office.
