Project description
Reaction time is critical when robots move through the real world: it can determine whether a robot avoids an obstacle or crashes into it. Many robots today can move only slowly in unknown environments because processing all the visual information takes time, and the way conventional cameras observe visual scenes is very inefficient. To move faster, robots have to become more efficient at performing "simultaneous localization and mapping" (SLAM), the process necessary for navigation.
We have developed a camera inspired by the processing in the human eye. It reacts much faster to changes and consumes less power than a conventional camera. In this project we developed the world's first real-time SLAM algorithms for this camera.
What is special about this project?
The Silicon Eye technology fuses the advantages of conventional machine vision with bio-inspired event-based processing. This type of sensor has great potential in applications such as robotics and smart glasses, and with this project we delivered proof that existing algorithms can be adapted to the output of our sensor.
Status/Results
The dynamic and active pixel vision sensor (DAVIS), which forms the basis of the Silicon Eye technology, has been successfully designed, produced and tested. It has the following advantages over a conventional camera:
1. Low reaction time of a few microseconds (vs. several milliseconds)
2. High dynamic range of 130 dB (vs. 60 dB)
3. No motion blur (vs. significant motion blur during fast movements)
4. Efficient data representation, leading to low system-level power consumption (see the sketch after this list)
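To make the last point concrete, the following is a minimal Python sketch of the address-event style of data representation used by sensors like the DAVIS: each pixel independently emits a timestamped brightness-change event instead of contributing to a full frame. This is an illustration under stated assumptions, not Insightness or DAVIS driver code, and the names (Event, events_in_window) are hypothetical.

    from dataclasses import dataclass
    from typing import Iterable, List

    @dataclass
    class Event:
        t_us: int       # timestamp in microseconds (microsecond-scale latency)
        x: int          # pixel column (0..239 on a 240 x 180 sensor)
        y: int          # pixel row (0..179)
        polarity: bool  # True = brightness increase, False = decrease

    def events_in_window(events: Iterable[Event], t0_us: int, t1_us: int) -> List[Event]:
        # Only pixels whose brightness changes emit events, so the stream is
        # sparse for mostly static scenes; this sparsity is where the
        # bandwidth and power savings come from.
        return [e for e in events if t0_us <= e.t_us < t1_us]

    # A fast edge sweeping across three pixels yields just three events within
    # 15 microseconds, instead of two full 240 x 180 frames.
    stream = [Event(10, 50, 80, True), Event(15, 51, 80, True), Event(25, 52, 80, True)]
    print(events_in_window(stream, 0, 20))  # the first two events

Because every event carries its own microsecond timestamp, an algorithm can react as soon as a single event arrives rather than waiting for the next frame, which is what enables the low reaction times listed above.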
A market analysis at the Consumer Electronics Show (CES 2015) and in Silicon Valley, including direct feedback from some of the most relevant players in the fields of mobile robotics and smart glasses, revealed a strong demand for the targeted products.
With the successful outcome of this project, Insightness AG can now market these novel event-based visual positioning systems worldwide. A granted CTI project will allow further development of the chips, and potential partnerships with interested Fortune 100 companies will enable the technology to be developed further and integrated.
Our project received support from ETH Zurich, University of Zurich, NCCR Robotics, CTI, Venture Kick and Venture Leaders.
Publications
C. Brandli, R. Berner, M. Yang, S.-C. Liu and T. Delbruck, «A 240 × 180 130 dB 3 µs Latency Global Shutter Spatiotemporal Vision Sensor», IEEE Journal of Solid-State Circuits, 2014;
E. Mueggler, B. Huber, D. Scaramuzza, «Event-based, 6-DOF Pose Tracking for High-Speed Maneuvers», 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS);
A. Censi, D. Scaramuzza, «Low-Latency Event-Based Visual Odometry», 2014 IEEE International Conference on Robotics and Automation (ICRA).
Media coverage
Links
People involved in the project
Last update of this project description: 13.11.2020