Vision is an important sensory modality relied upon extensively not only by humans, but also by much smaller organisms, such as Drosophila melanogaster (fruit flies). Among other tasks, Drosophila rely on vision for navigation, speed control, collision avoidance, and landing. Perhaps even more incredible than the tasks they perform is the extremely low Size, Weight, and Power (SWAP) of the neural circuitry they use to achieve such performance. A typical Drosophila melanogaster specimen is only 2.5 mm long and weighs a quarter of a milligram! The objective of this project is to use our understanding of biological systems to improve artificial visual systems, particularly in terms of accuracy, robustness, and SWAP.
The Neuromorphic Vision group makes use of bio-inspired change detection sensors (often referred to as "silicon retinae") which sense the visual scene in a manner analogous to certain neurons found in primate retina. The data output by these sensors can be likened to neural "spikes", which are the primary form of communication between neurons. The nature of the spiking data from such cameras opens the possibility for efficient, real-time, asynchronous visual processing using spiking neural networks.
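As a rough illustration of how such a change-detection sensor produces spike-like events, the sketch below simulates one step of DVS-style operation on two grayscale frames: a pixel emits an event when its log-intensity change crosses a threshold, with polarity encoding brightening or dimming. The function name, frame representation, and threshold value are illustrative assumptions, not the group's actual pipeline; real silicon retinae compare intensity continuously and asynchronously per pixel rather than frame to frame.

```python
import math

def change_events(prev_frame, new_frame, threshold=0.2):
    """Return (x, y, polarity) events for pixels whose log-intensity
    changed by at least `threshold` between two grayscale frames.

    Frames are lists of rows of positive intensity values. This is a
    hypothetical frame-differencing sketch of what a silicon retina
    does continuously in analog circuitry at each pixel.
    """
    events = []
    for y, (prev_row, new_row) in enumerate(zip(prev_frame, new_frame)):
        for x, (p, n) in enumerate(zip(prev_row, new_row)):
            delta = math.log(n) - math.log(p)  # log-intensity change
            if abs(delta) >= threshold:
                # polarity +1 for brightening, -1 for dimming
                events.append((x, y, 1 if delta > 0 else -1))
    return events

prev = [[100, 100], [100, 100]]
new  = [[140, 100], [100,  70]]
print(change_events(prev, new))  # → [(0, 0, 1), (1, 1, -1)]
```

Only the two pixels that changed produce output; static pixels stay silent, which is what makes the sparse, event-driven data stream so well suited to asynchronous spiking neural networks.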
In collaboration with teams from around the world, the Neuromorphic Vision group is developing a bio-inspired visual system which could be used to aid a small mobile vehicle in a real-world environment. At SINAPSE, the team focuses mainly on the development of spiking neural processing algorithms to exploit the efficiency and robustness which bio-inspired methods promise to offer, allowing for real-time processing of visual information using minimal power, even for high-speed tasks.
Applications for vision are not limited to mobile robotics. At SINAPSE we are also exploring the use of vision to improve the functionality of active prosthetic arms for amputees, helping the arm visually recognize and sense nearby objects for manipulation.