[29/06/2017-14/07/2017] S. Korea and Japan Trips

posted 7 Aug 2017, 05:28 by Hanme Kim   [ updated 7 Aug 2017, 05:29 ]

I had a fantastic opportunity to visit many exciting companies and organisations in both S. Korea and Japan, and in particular to give talks about our event-based SLAM technology at:
  • Korea University
  • Korea Electronics Technology Institute (KETI)
  • Samsung Advanced Institute of Technology (SAIT)
  • Korea Advanced Institute of Science and Technology (KAIST)
  • Agency for Defense Development (ADD)


[02/06/2017] 1st International Workshop on Event-based Vision at ICRA 2017

posted 1 Aug 2017, 05:56 by Hanme Kim   [ updated 1 Aug 2017, 05:57 ]

I was honoured to give a presentation, "Event-Based SLAM at Slamcore", on behalf of our startup Slamcore at the first International Workshop on Event-based Vision at ICRA 2017.


[26/10/2016] Our work on the iniLabs website

posted 2 Nov 2016, 05:24 by Hanme Kim

Our event-camera-based SLAM methods [Kim et al., BMVC'14; Kim et al., ECCV'16] are featured on the iniLabs website.

[13/10/2016] We won the best paper award at ECCV 2016!

posted 14 Oct 2016, 14:07 by Hanme Kim

We won the best paper award at ECCV 2016! Big congratulations to Stefan and Andy as well! :)

The Qualcomm Innovation Fellowship European champion and the best industry paper award at BMVC, both in 2014, and now this at one of the best computer vision conferences?! Today is the best day of my PhD life! :)

Honestly, without my supervisor Andy, this would not have been possible, so my special thanks go to him!


[22/07/2016] Our ECCV'16 paper has been accepted as an oral presentation!

posted 22 Jul 2016, 04:26 by Hanme Kim   [ updated 6 Oct 2016, 13:52 ]

I am excited to share our latest work, "Real-Time 3D Reconstruction and 6-DoF Tracking with an Event Camera", in collaboration with Stefan Leutenegger and Andrew Davison, which will be presented at ECCV 2016!

Abstract:
We propose a method which can perform real-time 3D reconstruction from a single hand-held event camera with no additional sensing, and which works in unstructured scenes of which it has no prior knowledge. It is based on three decoupled probabilistic filters, each estimating 6-DoF camera motion, scene logarithmic (log) intensity gradient and scene inverse depth relative to a keyframe, and we build a real-time graph of these to track and model over an extended local workspace. We also upgrade the gradient estimate for each keyframe into an intensity image, allowing us to recover a real-time video-like intensity sequence with spatial and temporal super-resolution from the low bit-rate input event stream. To the best of our knowledge, this is the first algorithm provably able to track a general 6D motion along with reconstruction of arbitrary structure, including its intensity, and the reconstruction of grayscale video, relying exclusively on event camera data.
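The recursive, per-pixel filtering idea in the abstract can be illustrated with a toy scalar Kalman update. This is a hypothetical sketch under simplifying assumptions (a single scalar state, known image motion, made-up noise values), not the paper's actual filter equations:

```python
import random

def kalman_update(mean, var, z, h, r):
    """One scalar Kalman update for a measurement z = h * state + noise (variance r)."""
    s = h * var * h + r               # innovation variance
    k = var * h / s                   # Kalman gain
    mean = mean + k * (z - h * mean)  # corrected state estimate
    var = (1.0 - k * h) * var         # reduced uncertainty
    return mean, var

# Toy per-pixel gradient filter: pretend each event at a pixel measures the
# brightness change caused by a log-intensity gradient g moving with a known
# image motion, i.e. z ~ g * (motion * dt) + noise. All values are made up.
random.seed(0)
g_true = 2.0            # hypothetical true log-gradient component
mean, var = 0.0, 10.0   # broad prior over g
for _ in range(200):
    h = random.uniform(0.5, 1.5)             # motion * dt for this event
    z = g_true * h + random.gauss(0.0, 0.1)  # noisy event-derived measurement
    mean, var = kalman_update(mean, var, z, h, 0.01)
# mean now approximates g_true, and var has shrunk from the broad prior
```

In the method described above, filters of this recursive flavour run per pixel for the gradient and inverse-depth estimates while a separate filter tracks the 6-DoF camera pose; the toy only shows the update pattern, not the actual measurement models.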


[26/6-1/7/2016] CVPR 2016

posted 1 Jul 2016, 11:25 by Hanme Kim

I attended CVPR 2016 from 26th June until 1st July in Las Vegas, NV, USA, solely on behalf of our start-up, slamcore. All the meetings we had with other companies and individuals were very useful, and I believe some of them will soon become very exciting opportunities for us.




[11/02/2016] slamcore Limited

posted 9 May 2016, 00:29 by Hanme Kim


I am very excited to announce that I co-founded a start-up, slamcore Limited, with Jacek, Stefan, Andy and Owen. For more about us or our technologies, please visit www.slamcore.com or contact us at contact@slamcore.com.


[3-21/8/2015] IBM TrueNorth BootCamp

posted 25 Aug 2015, 13:58 by Hanme Kim   [ updated 25 Aug 2015, 14:02 ]

I attended the IBM TrueNorth BootCamp from 3rd to 21st August, 2015 in San Jose, CA, USA. I was truly honoured to take part in the three-week intensive training course alongside very enthusiastic IBM people from the SyNAPSE team and researchers from selected renowned academic, government and industrial research institutes!

Here is a link to a Wired news article.

 

[20/7/2015] Ongoing work with Michael Milford

posted 25 Aug 2015, 13:50 by Hanme Kim   [ updated 25 Aug 2015, 14:02 ]

While Michael Milford, Associate Professor at QUT (Queensland University of Technology), was visiting us recently, he showed, in collaboration with us, that his SeqSLAM approach could be combined with a simple visual odometry algorithm to obtain quite nice large-scale walking trajectories from only the very low bit-rate data of a DVS event camera. Some initial results of this ongoing research, which works towards a complete and generally applicable visual SLAM system using event cameras, were presented at ICRA and RSS workshops.

[1-2/7/2015] ViiHM 2nd Workshop

posted 25 Aug 2015, 13:47 by Hanme Kim

I attended the 2nd Visual Image Interpretation in Humans and Machines (ViiHM) workshop from 1st to 2nd July, 2015 in Bath, UK -- ViiHM is an EPSRC network for Biological and Computer Vision.

