In recent years, thanks to the availability of large amounts of annotated data and the increased computational capability of high-performance computing platforms, we have witnessed a resurgence of artificial intelligence (AI) and neuro-inspired computation. AI systems that outperform human beings in image classification tasks have been demonstrated. However, these systems still lag well behind human beings in terms of speed and energy efficiency. The intensive computational requirements of AI recognition systems force the AI applications on our portable devices to offload computation to the cloud. It has been forecast that by the year 2025, one-fifth of the world's electricity will be consumed by the internet.
The development of efficient information coding schemes and low-power AI hardware platforms is a must if we want to witness the spread of AI systems while keeping the energy budget affordable. Current state-of-the-art AI systems are based on an information coding and processing paradigm that is quite different from the way biological brains code and process information. Taking vision as an example, state-of-the-art computational vision systems code and process information as sequences of static frames, whereas biological neurons produce and communicate sequences of spikes. In this context, spiking neural networks, the so-called third generation of neural networks, have emerged to emulate the efficiency of information coding and computation in biological brains.
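To make the frame-versus-spike contrast concrete, the sketch below converts a static grayscale frame into spike trains using rate coding, one common encoding scheme in spiking neural networks (the function name, rates, and parameters are illustrative assumptions, not part of the project described here):

```python
import numpy as np

def rate_encode(frame, n_steps=1000, max_rate=0.2, seed=0):
    """Rate-code a grayscale frame (pixel values in [0, 1]) into spike trains.

    Each pixel fires independently at every timestep with a probability
    proportional to its intensity (a simple Poisson-like rate encoder).
    Returns a binary array of shape (n_steps, *frame.shape).
    """
    rng = np.random.default_rng(seed)
    probs = np.clip(frame, 0.0, 1.0) * max_rate  # per-step firing probability
    return (rng.random((n_steps,) + frame.shape) < probs).astype(np.uint8)

# A bright pixel spikes often; a dark pixel rarely; a black pixel never.
frame = np.array([[0.0, 0.5],
                  [1.0, 0.1]])
spikes = rate_encode(frame)
rates = spikes.mean(axis=0)  # empirical firing rate per pixel
```

Unlike a frame, which carries every pixel at every sampling instant, the resulting spike trains carry information only when and where activity occurs, which is the source of the energy efficiency the text alludes to.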
However, spiking neural computing systems still lack the maturity of conventional frame-based computing systems in terms of theoretical development, learning and control algorithms, and the availability of event-based sensors, event-based hardware computing platforms, and event-based robotic actuators.
The NANO-MIND project aims to advance the theoretical and hardware development of neuromorphic spiking neural systems, from the sensing level through the processing level up to the control and actuation level.