Spanish National Research Council · University of Seville
Author: Leñero Bardallo, Juan A.
Year: Since 2002
All publications
On the implementation of asynchronous sun sensors
J.A. Leñero-Bardallo, R. Carmona-Galán and A. Rodríguez-Vázquez
Conference - IS&T International Symposium on Electronic Imaging 2019
[abstract]
Abstract not available

Live Demonstration: A Miniaturized Two-Axis Low Latency and Low-Power Sun Sensor for Attitude Determination of Sounding Rockets
L. Farian, J.A. Lenero-Bardallo and P. Hafliger
Conference - IEEE International Symposium on Circuits and Systems ISCAS 2018
[abstract]
This demo shows a first prototype two-axis miniaturized spiking sun sensor. The device is composed of spiking pixels and uses a novel Time-to-First-n-Spikes with time-out readout mode to reduce bandwidth consumption and post-processing computation. Thanks to on-chip processing and compression of the angle information, the sensor produces much less data and is much faster than digital sun sensors. Its response latency is 88 µs, and its average power consumption is 6.3 µW. An integrated circuit with the core electronics was fabricated in the AMS 0.35 µm CMOS image sensor process and was integrated inside a very small QFN64 package with micro-optics on top.

On the characterization of light sources irradiation profiles with an HDR image sensor
J.A. Leñero-Bardallo, J. Fernández-Berni, R. Carmona-Galán and A. Rodríguez-Vázquez
Conference - ACM International Conference on Distributed Smart Cameras ICDSC 2018
[abstract]
We demonstrate how light emissions of very bright light sources can be rendered with an HDR image sensor with linear operation. We showcase the device's usefulness for studying transient variations of very high illumination levels and for determining the irradiance profile of light sources. The sensor can track transient illumination changes at video rates, preserving details of darker regions within the visual scene.

Fast luminance measurement method for asynchronous spiking pixels
J.A. Leñero-Bardallo and F.J. García-Pacheco
Journal Paper - Electronics Letters, vol. 54, no. 8, pp 492-494, 2018
IET    DOI: 10.1049/el.2017.3834    ISSN: 0013-5194    » doi
[abstract]
A new method to obtain a continuous and fast measurement of light intensity is presented. It is targeted at Integrate-and-Fire pixels that pulse with a frequency proportional to illumination. The procedure is intended to speed up the readout of dimly illuminated pixels. It does not require synchronisation of different digital signals, making it compatible with continuous pixel operation. The fundamentals of the method are described, and experimental results validating the theory are provided.
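The following is a minimal illustrative sketch, not the method of the paper: it merely pictures how luminance can be read from integrate-and-fire spike timing under the assumption that spike frequency equals k times the intensity, and how the elapsed time since the last spike bounds the intensity of a dim pixel without waiting for its next event (k, the timestamps and the readout instant are hypothetical).

```python
# Illustrative sketch only (hedged): not the method of the paper above, just a
# simple picture of luminance readout from integrate-and-fire spike timing,
# assuming spike frequency f = k * I (k is a hypothetical constant).

def luminance_estimate(spike_times, t_now, k=1.0):
    """Estimate pixel intensity from asynchronous spike timestamps.

    spike_times -- sorted spike timestamps (s) of one pixel
    t_now       -- current readout instant (s)
    """
    if not spike_times:
        return 0.0                                # no spike yet: treat as dark
    elapsed = t_now - spike_times[-1]             # time since the last spike
    if len(spike_times) >= 2:
        period = spike_times[-1] - spike_times[-2]
        i_last = 1.0 / (k * period)               # estimate from the last full period
    else:
        i_last = float("inf")
    # For a dim pixel the next spike may take long to arrive; the elapsed time
    # since the last spike already gives an upper bound 1/(k*elapsed) that can
    # be reported immediately instead of waiting for the next event.
    i_bound = 1.0 / (k * elapsed) if elapsed > 0 else float("inf")
    return min(i_last, i_bound)

# Example: a pixel that spiked at t=0 s and t=0.01 s, read at t=0.5 s, is
# reported as at most ~2 intensity units rather than the stale 100 units.
print(luminance_estimate([0.0, 0.01], 0.5))       # -> ~2.04
```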

Applications of event-based image sensors - Review and analysis
J.A. Leñero-Bardallo, R. Carmona-Galán and A. Rodríguez-Vázquez
Journal Paper - International Journal of Circuit Theory and Applications, vol. 46, no. 9, pp 1620-1630, 2018
JOHN WILEY & SONS    DOI: 10.1002/cta.2546    ISSN: 0098-9886    » doi
[abstract]
The spread of event-driven asynchronous vision sensors in recent years has significantly increased industrial interest in them and broadened their application scenarios. This article reviews the main fields of application that event-based image sensors have found during the last 20 years. We focus on describing applications where such devices can outperform conventional frame-based sensors. The practical functions of the three main families of asynchronous event-based sensors are analyzed. The article also studies the factors that currently drive the demand for sensors that minimize power and bandwidth consumption. Moreover, the technological factors that have facilitated the development of asynchronous sensors are discussed.

Asynchronous spiking pixel with programmable sensitivity to illumination
J.A. Leñero-Bardallo, M. Delgado-Restituto, R. Carmona-Galan and A. Rodriguez-Vazquez
Journal Paper - IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 65, no. 11, pp 3854-3863, 2018
IEEE    DOI: 10.1109/TCSI.2018.2857220    ISSN: 1549-8328    » doi
[abstract]
A spiking pixel to be used in image sensor arrays for asynchronous frame-based operation is presented. The pixel features both local and global adaptive sensitivity to the illumination level. Local adaptation is performed by adjusting the voltage stored in an embedded analog memory according to the average illumination within a neighborhood. Global adaptation to the overall illumination of the array is implemented by adjusting a voltage value common to all the pixels. These programming capabilities allow full control on the sensor sensitivity, pixel output data flow, and energy consumption, thus, overcoming the limitations observed in current image sensors based on spiking pixels. Experimental results validate the functionality of the proposal.

On the analysis and detection of flames with an asynchronous spiking image sensor
J.A. Leñero-Bardallo, J.M. Guerrero-Rodríguez, R. Carmona-Galán and A. Rodríguez-Vázquez
Journal Paper - IEEE Sensors Journal, vol. 18, no. 16, pp 6588-6595, 2018
IEEE    DOI: 10.1109/JSEN.2018.2851063    ISSN: 1530-437X    » doi
[abstract]
We have investigated the capabilities of a custom asynchronous spiking image sensor operating in the Near Infrared (NIR) band to study flame radiation emissions, monitor their transient activity, and detect their presence. Asynchronous sensors have inherent advantages, i.e., good temporal resolution, high dynamic range, and low data redundancy. This makes them competitive against Infrared (IR) cameras and CMOS frame-based NIR imagers. In the article, we analyze, discuss and compare the experimental data measured with our sensor against results obtained with conventional devices. A set of measurements has been taken to study the flame emission levels and their transient variations. Moreover, a flame detection algorithm, adapted to our sensor's asynchronous outputs, has been developed. Results show that asynchronous spiking sensors have excellent potential for flame analysis and monitoring.
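As an illustration only (the paper's detection algorithm is not reproduced here), one simple way to flag flame-like activity from asynchronous pixel events is to look for pixels whose event rate is both high and strongly fluctuating, since flames flicker; the thresholds, window parameters and event format below are all hypothetical.

```python
# Hedged sketch, not the algorithm from the paper above: flag pixels whose
# NIR event rate is high and oscillates strongly within a short time window.
from collections import defaultdict
import statistics

def flame_pixels(events, window=1.0, bins=10, rate_thr=500.0, flicker_thr=0.3):
    """events: iterable of (timestamp_s, x, y) tuples within `window` seconds."""
    per_pixel = defaultdict(lambda: [0] * bins)
    for t, x, y in events:
        b = min(int(t / window * bins), bins - 1)    # time bin inside the window
        per_pixel[(x, y)][b] += 1
    flagged = []
    for (x, y), counts in per_pixel.items():
        rate = sum(counts) / window                  # mean event rate (events/s)
        mean = sum(counts) / bins
        if mean == 0:
            continue
        flicker = statistics.pstdev(counts) / mean   # relative rate fluctuation
        if rate > rate_thr and flicker > flicker_thr:
            flagged.append((x, y))
    return flagged
```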

CMOS Vision Sensors: Embedding Computer Vision at Imaging Front-Ends
A. Rodríguez-Vázquez, J. Fernández-Berni, J.A. Leñero-Bardallo, I. Vornicu and R. Carmona-Galán
Journal Paper - IEEE Circuits and Systems Magazine, vol. 18, no. 2, pp 90-107, 2018
IEEE    DOI: 10.1109/MCAS.2018.2821772    ISSN: 1531-636X    » doi
[abstract]
CMOS Image Sensors (CIS) are key for imaging technologies. These chips are conceived for capturing optical scenes focused on their surface, and for delivering electrical images, commonly in digital format. CISs may incorporate intelligence; however, their smartness basically concerns calibration, error correction and other similar tasks. The term CVIS (CMOS VIsion Sensors) defines another class of sensor front-ends which are aimed at performing vision tasks right at the focal plane. They have been running under names such as computational image sensors, vision sensors and silicon retinas, among others. CVISs and CISs are similar regarding physical implementation. However, while the inputs of both CISs and CVISs are images captured by photo-sensors placed at the focal plane, the primary outputs of CVISs may not be images but either image features or even decisions based on the spatio-temporal analysis of the scenes. We may hence state that CVISs are more ‘intelligent’ than CISs as they focus on information instead of on raw data. Actually, CVIS architectures capable of extracting and interpreting the information contained in images, and prompting reaction commands thereof, have been explored for years in academia, and industrial applications are recently ramping up. One of the challenges for CVIS architects is incorporating computer vision concepts into the design flow. The endeavor is ambitious because the imaging and computer vision communities are rather disjoint groups talking different languages. The Cellular Nonlinear Network Universal Machine (CNNUM) paradigm, proposed by Profs. Chua and Roska, defined an adequate framework for such conciliation as it is particularly well suited for hardware-software co-design. This paper overviews CVIS chips that were conceived and prototyped at the IMSE Vision Lab over the past twenty years. Some of them fit the CNNUM paradigm while others are tangential to it. All of them employ per-pixel mixed-signal processing circuitry to achieve sensor-processing concurrency in the quest of fast operation with a reduced energy budget.

A Miniaturized Two-Axis Ultra Low Latency and Low-Power Sun Sensor for Attitude Determination of Micro Space Probes
L. Farian, P. Häfliger and J.A. Leñero-Bardallo
Journal Paper - IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 65, no. 5, pp 1543-1554, 2018
IEEE    DOI: 10.1109/TCSI.2017.2763990    ISSN: 1549-8328    » doi
[abstract]
This paper describes the design, fabrication process, and comprehensive experimental results of a first prototype two-axis miniaturized spiking sun sensor. The sun sensor is a fusion of analog and digital sensor types, such that it takes advantage of the spatial selectivity of digital sensors and is not limited by the global frame rate as in analog sun sensors. It is composed of spiking pixels, and uses a novel Time-to-First-n-Spikes with time-out readout mode to reduce bandwidth consumption and post-processing computation. A thin glass lid with a metal-deposited pattern serves as a mask projecting a light pattern onto the sensor. The sun sensor is able to extract a profile of the incident light in the form of time-stamped events. Its latency depends on light intensity and, for medium radiance conditions, is equal to 88 μs. The sun sensor consumes 6.3 μW in normal operation, has a precision of 0.98°, and has a field of view of 144°. The high temporal resolution, low power consumption, and small QFN64 package make this sun sensor suitable for space probe and sounding rocket applications, where low temporal latency and payload size are essential. This sun sensor is designed to be employed in the sounding rocket attitude determination system as part of the 4DSpace research initiative to study ionospheric plasma disturbances.

Sun sensor based on a luminance spiking pixel array
J.A. Lenero-Bardallo, L. Farian, J.M. Guerrero-Rodriguez, R. Carmona-Galan and A. Rodriguez-Vazquez
Journal Paper - IEEE Sensors Journal, vol. 17, no. 20, pp 6578-6588, 2017
IEEE    DOI: 10.1109/JSEN.2017.2749414    ISSN: 1530-437X    » doi
[abstract]
We present a novel sun sensor concept. It is the very first sun sensor built with an Address Event Representation (AER) spiking pixel matrix. Its pixels spike with a frequency proportional to illumination. It offers remarkable advantages over conventional digital sun sensors based on Active Pixel Sensor (APS) pixels. Its output data flow is greatly reduced. It is possible to resolve the sun position by receiving just a single event when operating in Time-to-First-Spike (TFS) mode. It operates with a latency on the order of milliseconds. It has a higher dynamic range than APS image sensors (higher than 100 dB). A custom algorithm to compute the centroid of the illuminated pixels is presented. Experimental results are provided.
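A minimal sketch, for illustration only and not the paper's custom algorithm: in Time-to-First-Spike mode the brightest (sun-lit) pixels fire first, so a rough sun position can be taken as the centroid of the addresses of the first few events received (the event format and the value of n are assumptions).

```python
# Hedged sketch: centroid of the addresses of the first n events in TFS mode.
def sun_centroid(events, n=16):
    """events: list of (timestamp, x, y) sorted by timestamp."""
    first = events[:n]                      # earliest events = brightest pixels
    if not first:
        return None
    cx = sum(x for _, x, _ in first) / len(first)
    cy = sum(y for _, _, y in first) / len(first)
    return cx, cy                           # centroid in pixel coordinates

# Example: three early events clustered around column 10, row 20.
print(sun_centroid([(0.10e-3, 9, 20), (0.12e-3, 10, 21), (0.15e-3, 11, 19)], n=3))
# -> (10.0, 20.0)
```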

Towards Bioinspired Close-Loop Local Motor Control: a Simulated Approach Supporting Neuromorphic Implementations
F. Pérez-Peña, J.A. Leñero-Bardallo, E. Chicca and A. Linares-Barranco
Conference - IEEE International Symposium on Circuits and Systems ISCAS 2017
[abstract]
Despite being well established in robotics, classical motor controllers have several disadvantages: they pose a high computational load, therefore requiring powerful devices; they are not easy to tune; and they are not suited for neuroprosthetics. In contrast, bio-inspired controllers do not transform the controller output, so no delays are introduced and a smooth response is achieved; they are also highly scalable. Finally, the most important feature of bio-inspired controllers is that they could integrate learning features to make them adaptable to new tasks within the same hardware robotic platform. We present the model and simulation of a spiking neural network for low-level motor control. The proposed neural network acts as a motor controller and produces pulsed signals which can be directly interfaced with commercial DC motors. The simulated network is compatible with neuromorphic VLSI implementation and paves the way to the implementation of bio-inspired motor controllers which are compact, low power, scalable and compatible with neuroprosthetics. The network presented is inspired by the current knowledge about biological motor control: it comprises an alpha motoneuron population for driving the motor and spindle populations to provide the feedback and close the loop. The spikes from the motoneuron population are stretched to a fixed pulse width and supplied to the simulated motor, i.e., Pulse Frequency Modulation (PFM) is used. This paper presents software simulations using the Brian simulator for a position controller. Our controller is a first step toward a novel bio-inspired motor control approach suitable for robotics as well as neuroprosthetics.
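As a rough illustration of the Pulse Frequency Modulation drive described above (not the paper's Brian model), the sketch below stretches each motoneuron spike into a pulse of fixed width, so that the motor effectively sees a duty cycle that grows with the spike rate; pulse width, time step and spike trains are made-up numbers.

```python
# Hedged PFM sketch: each spike is held high for a fixed pulse width.
def pfm_waveform(spike_times, pulse_width=2e-3, t_end=0.1, dt=1e-4):
    """Return a list of 0/1 samples of the motor drive signal."""
    n = int(round(t_end / dt))
    wave = [0] * n
    for ts in spike_times:
        start = int(round(ts / dt))
        stop = min(int(round((ts + pulse_width) / dt)), n)
        for i in range(start, stop):
            wave[i] = 1                     # output held high for one pulse width
    return wave

# A faster spike train yields a higher duty cycle seen by the motor.
slow = pfm_waveform([0.01, 0.05, 0.09])
fast = pfm_waveform([0.01 * k for k in range(1, 10)])
print(sum(slow) / len(slow), sum(fast) / len(fast))   # -> 0.06 and 0.18
```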

On the design of sun sensors with event-based operation
J.A. Leñero-Bardallo, L. Farian, J.M. Guerrero-Rodríguez, R. Carmona-Galán and A. Rodríguez-Vázquez
Conference - Workshop on the Architecture of Smart Cameras WASC 2017
[abstract]
Abstract not available

Real-time phase correlation based integrated system for seizure detection
J.B. Romaine, M. Delgado-Restituto, J.A. Leñero-Bardallo and A. Rodríguez-Vázquez
Conference - Bio-MEMS and Medical Microdevices III Conference 2017
[abstract]
This paper reports a low-area, low-power, integer-based digital processor for the calculation of phase synchronization between two neural signals. The processor calculates the phase-frequency content of a signal by identifying the specific time periods associated with two consecutive minima. The simplicity of this phase-frequency content identifier allows the digital processor to use only basic digital blocks, such as registers, counters, adders and subtractors, without incorporating any complex multiplication and/or division algorithms. In fact, the processor, fabricated in a 0.18 μm CMOS process, only occupies an area of 0.0625 mm² and consumes 12.5 nW from a 1.2 V supply voltage when operated at 128 kHz. These low-area, low-power features make the proposed processor a valuable computing element in closed-loop neural prostheses for the treatment of neural diseases, such as epilepsy, or for extracting functional connectivity maps between different recording sites in the brain.
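A hedged software sketch of the principle described above (the fabricated integer processor is not reproduced): the instantaneous phase is approximated from the time elapsed since the last local minimum divided by the period between the two most recent minima, and a standard mean-resultant-length index is then used here to quantify synchronization; the index choice and all signals are assumptions.

```python
# Hedged sketch of minima-based phase estimation and a synchronization index.
import math

def local_minima(x):
    """Indices of samples that are smaller than both neighbours."""
    return [i for i in range(1, len(x) - 1) if x[i] < x[i - 1] and x[i] < x[i + 1]]

def phase(x):
    """Piecewise-linear phase estimate in radians, NaN before the 2nd minimum."""
    mins, ph = local_minima(x), [float("nan")] * len(x)
    for a, b in zip(mins, mins[1:]):
        for i in range(a, b):
            ph[i] = 2 * math.pi * (i - a) / (b - a)   # 0 .. 2*pi between minima
    return ph

def sync_index(x1, x2):
    """Mean resultant length of the phase difference (1 = perfectly locked)."""
    p1, p2 = phase(x1), phase(x2)
    d = [a - b for a, b in zip(p1, p2) if not (math.isnan(a) or math.isnan(b))]
    if not d:
        return 0.0
    return math.hypot(sum(math.cos(v) for v in d), sum(math.sin(v) for v in d)) / len(d)

# Two sinusoids with a constant phase lag give an index close to 1.
t = [k * 0.01 for k in range(500)]
s1 = [math.sin(2 * math.pi * 5 * v) for v in t]
s2 = [math.sin(2 * math.pi * 5 * v + 0.8) for v in t]
print(round(sync_index(s1, s2), 2))                   # -> 1.0
```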

A sun sensor implemented with an asynchronous luminance vision sensor
J.A. Leñero-Bardallo, L. Farian, J.M. Guerrero-Rodríguez, R. Carmona-Galán and A. Rodríguez-Vázquez
Conference - European Solid-State Circuits Conference ESSCIRC 2017
[abstract]
A sun sensor implemented with a spiking pixel matrix is reported. It is the very first sun sensor based on an asynchronous event-based pixel array. A limitation associated with classic digital sun sensors is overcome with this approach: only the pixels illuminated by sunlight are read out. Hence, the output data flow is greatly reduced. The computational load to resolve the sun position is quite low compared with prior sensors. The sensor's latency is on the order of milliseconds. The advantages over implementations with APS pixels are a lower data flow, lower latency, and higher dynamic range.

Pipeline AER Arbitration with Event Aging
J.A. Leñero-Bardallo, F. Pérez-Peña, R. Carmona-Galán and A. Rodríguez-Vázquez
Conference - IEEE International Symposium on Circuits and Systems ISCAS 2017
[abstract]
We present a simple circuit to handle communication between cells of neuromorphic arrays. It allows cells to operate continuously without waiting for acknowledgement signals back from the AER (Address Event Representation) arbitration circuitry. The module also implements aging of cell requests, i.e., old requests to access the AER bus are automatically discarded to give priority to more recent ones and alleviate bus congestion. The new arbitration scheme has been implemented and tested. A particular application scenario with an image sensor with spiking pixels that sense light continuously is explained. Experimental data obtained with real visual scenes are provided.
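A minimal behavioral sketch of the event-aging idea (not the circuit itself): pending requests carry a timestamp and are discarded instead of granted once their age exceeds a maximum; the service time, maximum age and integer time units are assumptions.

```python
# Hedged behavioral model of AER arbitration with request aging.
from collections import deque

def serve_with_aging(requests, service_time=1, max_age=10):
    """requests: list of (t_request, address) sorted by t_request; times in ticks."""
    queue, granted, dropped, t = deque(requests), [], [], 0
    while queue:
        t_req, addr = queue.popleft()
        t = max(t, t_req)                    # arbiter free and request present
        if t - t_req > max_age:
            dropped.append(addr)             # stale request: discard, no bus access
        else:
            granted.append(addr)             # grant and transmit the event
            t += service_time                # bus busy while the event is sent
    return granted, dropped

# A burst of 20 simultaneous requests: with these numbers the last ones are
# already older than max_age when their turn arrives and are dropped.
g, d = serve_with_aging([(0, a) for a in range(20)])
print(len(g), len(d))                        # -> 11 9
```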

A Wide Linear Dynamic Range Image Sensor Based on Asynchronous Self-Reset and Tagging of Saturation Events
J.A. Leñero-Bardallo, R. Carmona-Galán and A. Rodríguez-Vázquez
Journal Paper - IEEE Journal of Solid-State Circuits, vol. 52, no. 6, pp 1605-1617, 2017
IEEE    DOI: 10.1109/JSSC.2017.2679058    ISSN: 0018-9200    » doi
[abstract]
We report a high dynamic range (HDR) image sensor with a linear response that overcomes some of the limitations of sensors with self-reset pixels. It operates like an active pixel sensor, but its pixels have a novel asynchronous event-based overflow detection mechanism. Whenever the voltage at the integration capacitance reaches a programmable threshold, the pixel self-resets and asynchronously sends out an event indicating this. At the end of the integration period, the voltage at the integration capacitance is digitized and read out. By combining this information with the number of events fired by each pixel, it is possible to render linear HDR images. Event operation is transparent to the final user. There is no limit on the number of self-resets per pixel. The output data format is compatible with frame-based devices. The sensor was fabricated in the AMS 0.18-μm HV technology. A detailed system description and experimental results are provided in this paper. The sensor can render images with an intra-scene dynamic range of up to 130 dB with linear outputs. The pixel pitch is 25 μm and the sensor power consumption is 58.6 mW.
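The reconstruction described above can be summarized, with notation of our own rather than the paper's, by a single relation: the total photo-generated signal combines the counted number of self-reset events times the programmable voltage swing with the residual voltage digitized at the end of exposure.

```latex
% Illustrative relation only; symbols are ours, not taken from the paper.
\begin{equation}
  S_{\mathrm{pixel}} \;\propto\;
  N_{\mathrm{events}}\,\Delta V_{\mathrm{th}} \;+\; V_{\mathrm{residual}},
\end{equation}
% N_events    : self-reset events fired by the pixel during exposure
% Delta V_th  : programmable voltage swing that triggers a self-reset
% V_residual  : voltage digitized at the end of the integration period
```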

In the quest of vision-sensors-on-chip: Pre-processing sensors for data reduction
A. Rodríguez-Vázquez, R. Carmona-Galán, J. Fernández-Berni, V. Brea, J.A. Leñero-Bardallo
Conference - IS&T International Symposium on Electronic Imaging 2017
[abstract]
This paper shows that the implementation of vision systems benefits from the use of sensing front-end chips with embedded pre-processing capabilities, called CVIS. Such embedded pre-processors reduce the amount of data to be delivered for subsequent processing. This strategy, which is also adopted by natural vision systems, relaxes system-level requirements regarding data storage and communications and enables highly compact and fast vision systems. The paper includes several proof-of-concept CVIS chips with embedded pre-processing and illustrates their potential advantages.

Image dynamic range extension by using stacked (unmatched) photodiodes in CMOS
R. Carmona-Galán, J. A. Leñero-Bardallo, J. Fernández-Berni and Á. Rodríguez-Vázquez
Conference - Workshop on the Architecture of Smart Cameras WASC 2016
[abstract]
Capturing images containing unevenly illuminated areas within the same frame is very useful in application fields like surveillance, assisted driving, intelligent transportation, or industrial applications with high intra-scene contrast. Without the appropriate dynamic range to allocate these diverse illumination values, obtaining a detailed view of the brightest zones can easily obscure other elements in the scene. In order to increase the image dynamic range within the same frame, different techniques have been developed: using a sensor with a companding scheme, providing the means to avoid saturation, or employing multiple image captures. The problem with multiple captures is that the lack of correlation between the different integration times can generate nonexistent edges and distort the interpretation of the scene. In order to realize multiple captures in parallel, we need to be simultaneously sensitive to different illumination ranges. CMOS technology offers a variety of devices to capture light in the visible and near infrared range. If a deep n-well is available, these structures can be stacked so that spatial alignment is obtained by construction (Fig. 1a). The conversion gain of the different photodiodes is defined by their capacitance per unit area (Fig. 1b); therefore, each of them will render a different voltage for the same light intensity. This discrepancy in the response can be exploited to extract information from different illumination ranges simultaneously. In this way, light can be sensed in parallel with different conversion gains and the resulting output voltages can then be digitized and combined into a single digital word with a larger number of bits. This mechanism for dynamic range extension does not depend on differences in exposure time, so artifacts related to unmatched dynamics in the sensor and the scene can be avoided.

Integer-based digital processor for the estimation of phase synchronization between neural signals
J.B. Romaine, M. Delgado-Restituto, J.A. Lenero-Bardallo and A. Rodriguez-Vazquez
Conference - Conference on Ph.D Research in Microelectronics and Electronics PRIME 2016
[abstract]
This paper reports a low-area, low-power, integer-based neural digital processor for the calculation of phase synchronization between two neural signals. The processor calculates the phase-frequency content of a signal by identifying the specific time periods associated with two consecutive minima. The simplicity of this phase-frequency content identifier allows the digital processor to use only basic digital blocks, such as registers, counters, adders and subtractors, without incorporating any complex multiplication and/or division algorithms. The low area and power consumption make the processor an extremely scalable device which would work well in closed-loop neural prostheses for the treatment of neural diseases.

Enhanced Sensitivity of CMOS Image Sensors by Stacked Diodes
J.A. Lenero-Bardallo, M. Delgado-Restituto, R. Carmona-Galan and A. Rodriguez-Vazquez
Journal Paper - IEEE Sensors Journal, vol. 16, no. 23, pp 8448-8455, 2016
IEEE    DOI: 10.1109/JSEN.2016.2611759    ISSN: 1530-437X    » doi
[abstract]
We have investigated and compared the performance of photodiodes built with stacked p/n junctions operating in parallel versus conventional ones made with single p/n junctions. We propose a method to characterize and compare photodiode sensitivity. For this purpose, a dedicated chip in the standard AMS 180-nm HV technology has been fabricated. Four different sensor structures were implemented and compared. Experimental results are provided. Measurements show sensitivity enhancement ranging from 55% to 70% within the 500-1100 nm spectral region. The largest increment occurs in the near infrared band (up to 62%). Such results make stacked photodiodes suitable candidates for the implementation of photosensors in vision chips designed in standard CMOS technologies.

A Time-to-First-n-Spikes and Time-out Read-out Extension to the AER Arbitration System
L. Farian, J.A. Lenero-Bardallo and P. Hafliger
Conference - International Conference on Event-Based Control, Communication and Signal Processing EBCCSP 2016
[abstract]
This paper describes extensions of the asynchronous Address Event Representation (AER) arbitration mechanism for VLSI neural networks to obtain a Time-to-First-n-Spikes (TFnS) read-out operation in addition to the standard free-running read-out. The global reset of the neurons can be programmed to occur after the first n spikes or after a given time-out, whichever of these two conditions occurs first. This solution does not require any modifications inside the array of artificial neurons, but uses only a few additional blocks in the periphery. Like the AER communication, the additional blocks also operate asynchronously, are fully integrated with the AER communication periphery, and do not affect the system's scalability. The extensions of the AER arbitration system are described, and supporting simulations are provided.
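A behavioral sketch of the TFnS-with-time-out read-out described above, modelled in software for illustration (n, the time-out value and the event format are assumptions): events are accepted until either n spikes have been read or the time-out expires, after which a global reset would start the next acquisition cycle.

```python
# Hedged software model of a Time-to-First-n-Spikes read-out with time-out.
def tfns_readout(event_stream, n=8, timeout=1e-3):
    """event_stream: iterable of (timestamp_s, address) sorted by timestamp."""
    frame, t_start = [], None
    for t, addr in event_stream:
        if t_start is None:
            t_start = t                      # first event opens the acquisition
        if t - t_start > timeout:
            break                            # time-out reached: stop reading
        frame.append(addr)
        if len(frame) >= n:
            break                            # first n spikes collected
    return frame                             # a global reset would follow in hardware

# Example: take at most 8 addresses from a stream of (t, address) events.
print(tfns_readout([(i * 1e-4, i) for i in range(50)], n=8, timeout=1e-3))
```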

Demo: HDR image sensor with linear response and asynchronous detection of saturation
J.A. Leñero-Bardallo, R. Carmona-Galan and A. Rodriguez-Vazquez
Conference - International Conference on Distributed Smart Cameras ICDSC 2016
[abstract]
Abstract not available

Pixel-wise parameter adaptation for single-exposure extension of the image dynamic range
R. Carmona-Galán, J.A. Leñero-Bardallo, J. Fernández-Berni and A. Rodríguez-Vázquez
Conference - International Conference on Distributed Smart Cameras ICDSC 2016
[abstract]
High dynamic range imaging is central in application fields like surveillance, intelligent transportation and advanced driving assistance systems. In some scenarios, methods for dynamic range extension based on multiple captures have shown limitations in capturing the dynamics of the scene. Artifacts appear that can put at risk the correct segmentation of objects in the image. We have developed several techniques for the on-chip implementation of single-exposure extension of the dynamic range. We work on the upper extreme of the range, i.e., managing the available full-well capacity. Parameters are adapted pixel-wise in order to accommodate a high intra-scene range of illuminations.

ADCs for Image Sensors: Review and Performance Analysis
Juan A. Leñero-Bardallo and Angel Rodríguez-Vázquez
Book Chapter - Analog Electronics for Radiation Detection, pp 47-70, 2016
CRC PRESS    ISBN: 978-1-498-70356-7    
[abstract]
Abstract not available

A high dynamic range linear vision sensor with event asynchronous and frame-based synchronous operation
J.A. Leñero-Bardallo, R. Carmona-Galán and A. Rodríguez-Vázquez
Conference - IS&T International Symposium on Electronic Imaging 2016
[abstract]
We present a novel High-Dynamic-Range (HDR) image sensor with linear output. Photogenerated charge is continuously integrated at every pixel without saturating. Each time the photodiode voltage reaches a programmable threshold, the pixel resets and starts integrating charge again. With an event-based approach, it is possible to count the number of times (if any) that a pixel has saturated during exposure. Pixel illumination is represented with a 20-bit word. The most significant 12 bits represent the number of times that the pixel has saturated during exposure. The least significant 8 bits are the result of an analog-to-digital conversion at the end of exposure. Thus, pixels provide linear outputs proportional to light intensity. A dynamic range of 120 dB is expected. The maximum dynamic range that can be measured is limited by the maximum event rate that the chip peripheral circuitry can handle and by the memory space dedicated to storing the event information. The pixel pitch is 25 μm. A prototype sensor with 128 x 96 pixels has been implemented in the AMS 180 nm CMOS-HV technology. In this article, the pixel operation is explained, and preliminary experimental results and snapshots are provided.
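A tiny sketch of the 20-bit output word described above; the packing order and the assumption that the 8-bit ADC code spans one full saturation swing are ours, not taken from the paper.

```python
# Hedged sketch of the 20-bit linear pixel code (12 b count + 8 b ADC).
def pixel_code(n_saturations, adc_code):
    """Pack the pixel output into a 20-bit intensity code."""
    assert 0 <= n_saturations < (1 << 12) and 0 <= adc_code < (1 << 8)
    # Proportional to illumination if the 8-bit ADC spans one saturation swing.
    return (n_saturations << 8) | adc_code

# A pixel that saturated 3 times and ended the exposure at ADC code 40:
print(pixel_code(3, 40))        # -> 3*256 + 40 = 808
```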

A Bio-Inspired Vision Sensor with Dual Operation and Readout Modes
J.A. Leñero-Bardallo, P. Häfliger, R. Carmona-Galán and A. Rodriguez-Vazquez
Journal Paper - IEEE Sensors Journal, vol. 16, no. 2, pp. 317-330, 2016
IEEE    DOI: 10.1109/JSEN.2015.2483898    ISSN: 1530-437X    » doi
[abstract]
This paper presents a novel event-based vision sensor with two operation modes: 1) intensity mode and 2) spatial contrast detection. They can be combined with two different readout approaches: 1) pulse density modulation and 2) time-to-first-spike. The sensor is conceived to be a node of a smart camera network made up of several independent and autonomous nodes that send information to a central one. The user can toggle the operation and the readout modes with two control bits. The sensor has low latency (below 1 ms under average illumination conditions), low current consumption (19 mA), and a reduced data flow when detecting spatial contrast. A new approach to compute the spatial contrast, based on inter-pixel event communication and less prone to mismatch effects than diffusive networks, is proposed. The sensor was fabricated in the standard AMS4M2P 0.35-μm process. A detailed system-level description and experimental results are provided.

A high dynamic range image sensor with linear response based on asynchronous event detection
J.A. Leñero-Bardallo, R. Carmona-Galán and Á. Rodríguez-Vázquez
Conference - European Conference on Circuit Theory and Design ECCTD 2015
[abstract]
This paper investigates the potential of an image sensor that combines event-based asynchronous outputs with conventional integration of photocurrents. Pixel voltages can be read out following a traditional approach with a source follower and an analog-to-digital converter. Furthermore, pixels have circuitry to implement Pulse Density Modulation (PDM), sending out pulses with a frequency that is proportional to the photocurrent. Both read-out approaches operate simultaneously. Their information is combined to render high dynamic range images. In this paper, we explain the new vision sensor concept and develop a theoretical analysis of the expected performance in the standard AMS 0.18 µm HV technology. Moreover, we provide a description of the vision sensor architecture and its main blocks.

Miniaturized Sun Sensor with In-Pixel Processing for Attitude Determination of Micro Space Probes
L. Farian, P. Häfliger and J.A. Leñero-Bardallo
Conference - IEEE International Conference on Event-Based Control, Communication and Signal Processing EBCCSP 2015
[abstract]
This work presents a novel concept for a miniaturized two-axis asynchronous Sun sensor which will be part of a sounding-rocket and micro-probe Attitude Determination System. The sensor is composed of only two lines of bio-inspired pixels performing asynchronous in-pixel parallel processing, and two optical slits aligned above the chip. The Sun location is determined with much higher temporal resolution than synchronous sensors could achieve. Furthermore, the sensor output data directly indicate the Sun's position, without the need for any further processing (except an initial calibration procedure). The power consumption is also expected to be small in comparison with traditional Sun sensors, thanks to abandoning the need for a clock. This approach occupies much less area than previously reported Sun sensors. Hence, the remaining chip area can be used to realise a system-on-chip integrating other sensor interfaces as well as processing and communication units. This allows a high degree of integration and miniaturization, which is essential for the planned micro-probe application. Simulations show the Sun sensor to have a temporal resolution of 1 ms, a standby power consumption of only a few picowatts, and an achieved resolution of ~0.5°.

A Bio-Inspired AER Temporal Tri-Color Differentiator Pixel Array
L. Farian, J.A. Leñero-Bardallo and P. Häfliger
Journal Paper - IEEE Transactions on Biomedical Circuits and Systems, vol. 9, no. 5, pp 686-698, 2015
IEEE    DOI: 10.1109/TBCAS.2015.2492460    ISSN: 1932-4545    » doi
[abstract]
This article investigates the potential of a bio-inspired vision sensor with pixels that detect transients between three primary colors. The in-pixel color processing is inspired by the retinal color opponency found in mammalian retinas. Color transitions in a pixel are represented by voltage spikes, which are akin to a neuron's action potential. These spikes are conveyed off-chip by the Address Event Representation (AER) protocol. To achieve sensitivity to three different color spectra within the visual spectrum, each pixel has three stacked photodiodes at different depths in the silicon substrate. The sensor has been fabricated in the standard TSMC 90 nm CMOS technology. A post-processing method to decode events into color transitions has been proposed and implemented as a custom interface to display real-time color changes in the visual scene. Experimental results are provided. Color transitions can be detected at high speed (up to 2.7 kHz). The sensor has a dynamic range of 58 dB and a power consumption of 22.5 mW. This type of sensor can be of use in industrial, robotics, automotive and other applications where essential information is contained in transient emission shifts within the visual spectrum.

Técnicas ICA de procesamiento de datos para señales de audio
J.A. Leñero-Bardallo and S.A. Cruces
Book - 144 p, 2014
PUBLICIA    ISBN: 978-3-639-55634-6    » link
[abstract]
The development of today's processors makes it possible to implement increasingly complex algorithms for processing audio and video signals in real time. This book is especially aimed at readers who want to become familiar with and learn Independent Component Analysis (ICA) and Blind Source Separation (BSS) techniques for real-time signal processing. The book explains how these signal processing techniques can be used to separate audio sources received at several receivers. It describes in detail the steps to follow to implement algorithms that use these information processing techniques in any programming language.

Using 3-D Technologies for Form Factor Improvement of Low-Power Vision Sensors
J. Fernández-Berni, S. Vargas, J.A. Leñero and B. Pérez-Verdú
Conference - IEEE Latin American Symposium on Circuits and Systems LASCAS 2014
[abstract]
While conventional CMOS active pixel sensors embed only the circuitry required for photo-detection, pixel addressing and voltage buffering, smart pixels also incorporate circuitry for data processing, data storage and control of data interchange. This additional circuitry enables data processing to be carried out concurrently with the acquisition of images, which is instrumental in reducing the amount of data needed to convey the information contained in the images. This way, more efficient vision systems can be built at the cost of a larger pixel pitch. Vertically-integrated 3-D technologies make it possible to keep the advantages of smart pixels while improving their form factor.

Live Demo: Real-time Focal-plane Face Obfuscation through Programmable Pixelation
J. Fernández-Berni, R. Carmona-Galán, R. del Río, J.A. Leñero-Bardallo, R. Kleihorst, W. Philips and Á. Rodríguez-Vázquez
Conference - Workshop on the Architecture of Smart Cameras WASC 2014
[abstract]
Privacy concerns are hindering the introduction of smart camera networks in application scenarios like retail analytics, factories or elderly care. Indeed, there is usually no need to deal with sensitive data when it comes to carrying out a meaningful visual analysis in these scenarios. Time spent by customers in front of a showcase, trajectories of workers around a manufacturing site or fall detection in a nursing home are three examples where video analytics can be performed without compromising privacy. Still, the idea of networked cameras pervasively collecting data generates social rejection in the face of sensitive information being tampered with by hackers or misused by legitimate users. New strategies must be developed in order to ensure privacy from the very point where sensitive data are generated: the sensors. Protection measures embedded on-chip at the front-end sensor of each network node significantly reduce the number of trusted system components as well as the impact of potential software flaws. In this demonstration, we present a full-custom QVGA vision sensor that can be reconfigured to implement programmable pixelation of image regions at the focal plane. According to the literature, pixelation provides the best performance in terms of balance between privacy protection and intelligibility of the surveyed scene.

Live demonstration: A Bio-Inspired AER Temporal Tri-Color Differentiator
L. Farian, J.A. Leñero-Bardallo and P. Häfliger
Conference - IEEE Biomedical Circuits and Systems Conference BioCAS 2014
DOI: 10.1109/BioCAS.2014.6981674    » doi
[abstract]
We demonstrate the first array of asynchronous event pixels that react to temporal color contrast of three different color spectra. The three different spectra are transduced into photo currents by stacked photo diodes. Temporal changes of the contrast of these three spectra are quantified as pulse density modulated signals and conveyed off-chip by the Address Event Representation (AER) protocol. The 16×16 pixel array has been fabricated in the standard TSMC 90nm CMOS process.

A dual operation mode bio-inspired pixel
J.A. Leñero-Bardallo and P. Häfliger
Journal Paper - IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 61, no. 11, pp 855-859, 2014
IEEE    DOI: 10.1109/TCSII.2014.2350352    ISSN: 1549-7747    » doi
[abstract]
A new bio-inspired pixel concept is proposed. It can compute the spatial contrast and provide intensity images. The pixel sends spikes to communicate with its neighbors, computing the spatial contrast. Its expected fixed-pattern noise within neuromorphic arrays is 1% without using calibration. Its fill factor is 8.5%. Furthermore, the pixel has two different readout modes: pulse density modulation and time to first spike. The user can toggle any time between operation and readout modes, depending on the desired image quality and the bandwidth and power consumption requirements. The pixel has been implemented in AMS4M2P 0.35-μm CMOS technology. Experimental results are provided.

Fire detection with a frame-less vision sensor working in the NIR band
J.A. Leñero-Bardallo, J. Fernández-Berni, R. Carmona-Galán, P. Häfliger and Á. Rodríguez-Vázquez
Conference - International Conference on Forest Fire Research ICFFR 2014
DOI: 10.14195/978-989-26-0884-6_151    » doi
[abstract]
This paper draws the attention of the community to the capabilities of an emerging generation of bio-inspired vision sensors to be used in fire detection systems. Their principle of operation will be described. Moreover, experimental results showing the performance of an event-based vision sensor will be provided. The sensor was intended to monitor flame activity without using optical filters. In this article, we will also extend this preliminary work and explore how its outputs can be processed to detect fire in the environment.

Review of ADCs for imaging
J.A. Leñero-Bardallo, J. Fernández-Berni and Á. Rodríguez-Vázquez
Conference - IS&T International Symposium on Electronic Imaging 2014
[abstract]
The aim of this article is to guide image sensor designers in optimizing the analog-to-digital conversion of pixel outputs. The most common ADC topologies for image sensors are presented and discussed. The specific ADC requirements of these sensors are analyzed and quantified. Finally, we present relevant recent contributions of specific ADCs for image sensors and compare them using a novel FOM.

Towards an ultra-low-power low-cost wireless visual sensor node for fine-grain detection of forest fires
J. Fernández-Berni, R. Carmona-Galán, J.A. Leñero-Bardallo, R. Kleihorst and Á. Rodríguez-Vázquez
Conference - International Conference on Forest Fire Research ICFFR 2014
[abstract]
Advances in electronics, sensor technologies, embedded hardware and software are boosting the application scenarios of wireless sensor networks. Specifically, the incorporation of visual capabilities into the nodes represents a milestone, and a challenge, in terms of the amount of information sensed and processed by these networks. The scarcity of resources (power, processing and memory) imposes strong restrictions on the vision hardware and algorithms suitable for implementation at the nodes. Both hardware and algorithms must be adapted to the particular characteristics of the targeted application. This makes it possible to achieve the required performance at lower energy and computational cost. We have followed this approach when addressing the detection of forest fires by means of wireless visual sensor networks. From the development of a smoke detection algorithm down to the design of a low-power smart imager, every step along the way has been influenced by the objective of reducing power consumption and computational resources as much as possible. Of course, reliability and robustness against false alarms have also been crucial requirements demanded by this specific application. All in all, we summarize in this paper our experience on this topic. In addition to a prototype vision system based on a full-custom smart imager, we also report results from a vision system based on ultra-low-power low-cost commercial imagers with a resolution of 30x30 pixels. Even for this small number of pixels, we have been able to detect smoke around 100 meters away without false alarms. For such tiny images, smoke is simply a moving grey stain within a blurry scene, but it features a particular spatio-temporal dynamics. As described in the manuscript, the key to succeeding with such low resolution thus lies in the adequate encoding of those dynamics at the algorithm level.

Form Factor Improvement of Smart-Pixels for Vision Sensors through 3-D Vertically-Integrated Technologies
A. Rodríguez-Vázquez, R. Carmona-Galán, J. Fernández-Berni, S. Vargas, J.A. Leñero, M. Suárez, V. Brea and B. Pérez-Verdú
Conference - IEEE Latin American Symposium on Circuits and Systems LASCAS 2014
[abstract]
While conventional CMOS active pixel sensors embed only the circuitry required for photo-detection, pixel addressing and voltage buffering, smart pixels also incorporate circuitry for data processing, data storage and control of data interchange. This additional circuitry enables data processing to be carried out concurrently with the acquisition of images, which is instrumental in reducing the amount of data needed to convey the information contained in the images. This way, more efficient vision systems can be built at the cost of a larger pixel pitch. Vertically-integrated 3-D technologies make it possible to keep the advantages of smart pixels while improving their form factor.

Smart imaging for power-efficient extraction of Viola-Jones local descriptors
J. Fernández-Berni, R. Carmona-Galán, R. del Río, J.A. Leñero-Bardallo, M. Suárez-Cambre and A. Rodríguez-Vázquez
Conference - IS&T International Symposium on Electronic Imaging 2014
[abstract]
In computer vision, local descriptors make it possible to summarize relevant visual cues through feature vectors. These vectors constitute inputs for trained classifiers which in turn enable different high-level vision tasks. While local descriptors certainly alleviate the computational load of subsequent processing stages by preventing them from handling raw images, they still have to deal with individual pixels. Feature vector extraction can thus become a major limitation for conventional embedded vision hardware. In this paper, we present a power-efficient sensing-processing array conceived to provide the computation of integral images at different scales. These images are intermediate representations that speed up feature extraction. In particular, the mixed-signal array operation is tailored for the extraction of Haar-like features. These features feed the cascade of classifiers at the core of the Viola-Jones framework. The processing lattice has been designed for the standard UMC 0.18 μm 1P6M CMOS process. In addition to integral image computation, the array can be reprogrammed to deliver other early vision tasks: concurrent rectangular area sum, block-wise HDR imaging, Gaussian pyramids and image pre-warping for subsequent reduced kernel filtering.
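For reference, a pure-software sketch of the integral-image and Haar-like feature computation that such an array accelerates (illustration only, not the chip's mixed-signal implementation):

```python
# Integral image (summed-area table) and a two-rectangle Haar-like feature.
def integral_image(img):
    """img: 2-D list of pixel values; returns the summed-area table."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        for x in range(w):
            ii[y + 1][x + 1] = img[y][x] + ii[y][x + 1] + ii[y + 1][x] - ii[y][x]
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of the w x h rectangle with top-left corner (x, y), via 4 look-ups."""
    return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x]

def haar_two_rect(ii, x, y, w, h):
    """Two-rectangle Haar-like feature: left half minus right half."""
    return rect_sum(ii, x, y, w // 2, h) - rect_sum(ii, x + w // 2, y, w // 2, h)

# Example on a tiny 4x4 image whose left half is brighter than the right half.
img = [[9, 9, 1, 1]] * 4
ii = integral_image(img)
print(rect_sum(ii, 0, 0, 4, 4), haar_two_rect(ii, 0, 0, 4, 4))   # -> 80 64
```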

Flame monitoring with an AER color vision sensor
J.A. Leñero-Bardallo, D.H. Bryn and P. Häfliger
Conference - IEEE International Symposium on Circuits and Systems ISCAS 2013
[abstract]
We present a new method to sense NIR (Near Infrared Radiation) with an asynchronous pixel-event AER (Address Event Representation) color vision sensor. The sensor's asynchronous output data flow can be processed by a computer to monitor oscillations of NIR illumination levels. Such variations can be indicators of fires or flames. The new sensor is an alternative to high-speed CMOS cameras with NIR filters as a front-end for flame detection and characterization methods. Our sensor maintains a high temporal resolution and low latency for regions that are bright in the NIR spectrum, e.g. flames and hot spots, at an overall low (scene-dependent) data rate and power consumption. This makes it especially suitable for Wireless Sensor Networks (WSNs). A dedicated real-time Java interface was programmed to process the sensor outputs. In this paper, we describe the algorithm for sensing NIR intensity and describe experiments that show oscillations of the NIR levels of a flame.

A dual operation mode bio-inspired vision sensor
J.A. Lenero-Bardallo and P. Häfliger
Conference - IEEE Biomedical Circuits and Systems Conference BioCAS 2013
[abstract]
We present a bio-inspired frame-free vision sensor with two different operation modes: spatial contrast computation and intensity mode. Cross-pixel communication for contrast computation uses spike signals. Two read-out methods, Pulse Density Modulation (PDM) and Time-to-First-Spike (TFS), are available. Both use Address Event Representation (AER) for off-chip communication. The user can toggle at any time between the different operation and read-out modes with two digital control signals that automatically set the bias settings. The sensor is aimed at applications where speed and a low output data flow are preferred, e.g. surveillance and industrial processes, while offering the possibility of providing detailed intensity images if necessary. The sensor does not need calibration.

Live Demonstration: A Bio-inspired Asynchronous Pixel Event Tri-color Vision Sensor
J.A. Leñero-Bardallo, D.H. Bryn and P. Häfliger
Conference - IEEE International Symposium on Circuits and Systems ISCAS 2012
[abstract]
We demonstrate the very first tri-color asynchronous pixel-event vision sensor, the latest addition to the growing family of bio-inspired AER (Address Event Representation) vision sensors. It is an asynchronous pulse/frequency density modulation (PDM/PFM) sensor (popularly known as an octopus retina) that employs stacked photodiodes for color separation. A simple linear combination of the sensor's inherent pseudo-color representation is employed to reconstruct an approximate RGB video stream. The 22×22 pixel array has been fabricated in the standard STM 90 nm CMOS process.

A Bioinspired 128x128 Pixel Dynamic-Vision-Sensor
T. Serrano-Gotarredona, J.A. Leñero-Bardallo and B. Linares-Barranco
Conference - Conference on Design of Circuits and Integrated Systems DCIS 2011
[abstract]
This paper presents a 128x128 dynamic vision sensor. Each pixel detects temporal changes in the local illumination. A minimum illumination temporal contrast of 10% can be detected. A compact preamplification stage has been introduced that improves the minimum detectable contrast over previous designs while at the same time reducing the pixel area by 1/3. The pixel responds to illumination changes in less than 3.6 μs. The ability of the sensor to capture very fast moving objects has been verified experimentally. A frame-based sensor capable of achieving this would require at least 100,000 frames per second.

A 3.6 μs latency asynchronous frame-free event-driven dynamic-vision-sensor
J.A. Leñero-Bardallo, T. Serrano-Gotarredona and B. Linares-Barranco
Journal Paper - IEEE Journal of Solid-State Circuits, vol. 46, no. 6, pp 1443-1455, 2011
IEEE    DOI: 10.1109/JSSC.2011.2118490    ISSN: 0018-9200    » doi
[abstract]
This paper presents a 128 x 128 dynamic vision sensor. Each pixel detects temporal changes in the local illumination. A minimum illumination temporal contrast of 10% can be detected. A compact preamplification stage has been introduced that improves the minimum detectable contrast over previous designs while at the same time reducing the pixel area by 1/3. The pixel responds to illumination changes in less than 3.6 μs. The ability of the sensor to capture very fast moving objects, rotating at 10,000 revolutions per second, has been verified experimentally. A frame-based sensor capable of achieving this would require at least 100,000 frames per second.

A calibrated spatial contrast AER vision sensor with adjustable contrast threshold
J.A. Leñero-Bardallo, T. Serrano-Gotarredona and B. Linares-Barranco
Conference - International Symposium on Circuits and Systems ISCAS 2010
[abstract]
Abstract not available

Study, design, implementation, and test of VLSI retinae sensitive to spatial and temporal contrast
J.A. Leñero-Bardallo
Thesis - Date of defense: 20/05/2010
UNIVERSIDAD DE SEVILLA, IMSE-CNM    » link
[abstract]
This thesis describes two AER (Address Event Representation) vision sensors capable of detecting spatial and temporal contrast. They are two biologically inspired vision systems that aim to emulate the behavior of the human retina. Unlike conventional vision systems, bio-inspired systems are not frame-based. This gives them a number of inherent advantages, such as a wide dynamic range, low power consumption and high speed. The document explains the operation of two AER sensors specifically designed for spatial and temporal contrast detection, and numerous experimental results are shown. As a conclusion, it is shown that for certain applications, such as high-speed motion detection, the implemented sensors offer better characteristics than the conventional vision systems available on the market.

A 100dB dynamic range event-driven spatial contrast sensor with 100μs response time and time-to-first-spike mode
J.A. Leñero-Bardallo, T. Serrano-Gotarredona and B. Linares-Barranco
Conference - European Solid State Circuits Conference ESSCIRC 2010
[abstract]
Bio-inspired vision sensors have some inherent advantages over conventional sequential-still-image sensors, such as high speed, low latency and reduced bandwidth and power consumption. In this paper, we present a new spatial contrast retina with signed output. Its output is zero if there is no contrast. The new sensor includes an optional Time-to-First-Spike (TFS) mode that combines the advantages of AER vision systems and frame-based ones. In TFS mode, times between consecutive frames can be adjusted dynamically by transmitting only relevant information. Both operation modes are ambient-light-independent, to first order. A 32x32 pixel prototype has been fabricated in 0.35 μm CMOS. Experimental results are provided.

A five-decade dynamic-range ambient-light-independent calibrated signed-spatial-contrast AER retina with 0.1-ms latency and optional time-to-first-spike mode
J.A. Leñero-Bardallo, T. Serrano-Gotarredona and B. Linares-Barranco
Journal Paper - IEEE Transactions on Circuits and Systems I-Regular Papers, vol. 57, no. 10, pp 2632-2643, 2010
IEEE    DOI: 10.1109/TCSI.2010.2046971    ISSN: 1549-8328    » doi
[abstract]
Address Event Representation (AER) is an emergent technology for assembling modular multiblock bio-inspired sensory and processing systems. Visual sensors (retinae) are among the first AER modules to be reported since the introduction of the technology. Spatial-contrast AER retinae are of special interest since they provide highly compressed data flow without reducing the relevant information required for performing recognition. The reported AER contrast retinae perform a contrast computation based on the ratio between a pixel's local light intensity and a spatially weighted average of its neighborhood. This resulted in compact circuits but with the penalty of all pixels generating output signals even if they sensed no contrast. In this paper, we present a spatial-contrast retina with a signed output: Contrast is computed as the relative difference (not the ratio) between a pixel's local light and its surrounding spatial average and normalized with respect to ambient light. As a result, contrast is ambient light independent, includes a sign, and the output will be zero if there is no contrast. Furthermore, an adjustable thresholding mechanism has been included, such that pixels remain silent until they sense an absolute contrast above the adjustable threshold. The pixel contrast-computation circuit is based on Boahen's biharmonic operator contrast circuit, which has been improved to include mismatch calibration and adaptive-current-based biasing. As a result, the contrast-computation circuit shows much less mismatch, is almost insensitive to ambient light illumination, and biasing is much less critical than in the original voltage biasing scheme. The retina includes an optional global reset mechanism for operation in ambient-light-independent Time-to-First-Spike contrast-computation mode. A 32x32 pixel test prototype has been fabricated in 0.35-μm CMOS. Experimental results are provided.
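A hedged formalization of the contrast definition described above, in notation of our own rather than the paper's: the signed, ambient-light-independent contrast of a pixel compares its photocurrent with a spatially weighted average of its neighbourhood, normalized by the ambient (average) level, and the pixel fires only above an adjustable threshold.

```latex
% Illustrative notation only; symbols are ours, not taken from the paper.
\begin{equation}
  c_{ij} \;=\; \frac{I_{ij} - \langle I \rangle_{\mathcal{N}(i,j)}}{\langle I \rangle},
  \qquad
  \text{pixel fires only if } |c_{ij}| > c_{\mathrm{th}},
\end{equation}
% <I>_N(i,j): weighted average over the neighbourhood of pixel (i,j)
% <I>       : ambient (global average) light level
% c_th      : adjustable contrast threshold
```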

A signed spatial contrast event spike retina chip
J.A. Leñero-Bardallo, T. Serrano-Gotarredona and B. Linares-Barranco
Conference - IEEE International Symposium on Circuits and Systems ISCAS 2010
[abstract]
Reported AER (Address Event Representation) contrast retinae perform a contrast computation based on the ratio between a pixel's local light intensity and a spatially weighted average of its neighbourhood. This results in compact circuits, but with the penalty of all pixels generating output signals even if they sense no contrast. In this paper we present a spatial contrast retina with bipolar output: contrast is computed as the relative normalized difference (not the ratio) between a pixel's local light and its weighted spatial average, normalized to average light. As a result contrast includes a sign, is ambient light independent, and the output will be zero if there is no contrast. Furthermore, an adjustable thresholding mechanism has been included, such that pixels remain silent until they sense an absolute contrast above the adjustable threshold. The pixel contrast computation circuit is based on Boahen's Biharmonic operator contrast circuit, which has been improved to include mismatch calibration and adaptive current based biasing. As a result, the contrast computation circuit shows much less mismatch, is almost insensitive to ambient light illumination, and biasing is much less critical than in the original voltage biasing scheme. The retina also includes an optional TFS (Time-to-First-Spike) integration mode. A full AER retina version has been fabricated and tested. In the present paper we provide preliminary experimental results.

A spatial calibrated AER contrast retina with adjustable contrast threshold
J.A. Leñero-Bardallo, T. Serrano-Gotarredona and B. Linares-Barranco
Conference - Conference on Design of Circuits and Integrated Systems DCIS 2009
[abstract]
Address Event Representation (AER) is an emergent technology for assembling modular multi-block bio-inspired sensory and processing systems. Visual sensors (retinae) are among the first AER modules to be reported since the introduction of the technology. Spatial contrast AER retinae are of special interest since they provide highly compressed data flow without reducing the relevant information required for performing recognition. Reported AER contrast retinae perform a contrast computation based on the ratio between a pixel's local light intensity and a spatially weighted average of its neighbourhood. This results in compact circuits, but with the penalty of all pixels generating output signals even if they sense no contrast. In this paper we present a spatial contrast retina with bipolar output: contrast is computed as the relative difference between a pixel's local light and its weighted spatial average. As a result, contrast includes a sign and the output will be zero if there is no contrast. Furthermore, an adjustable thresholding mechanism has been included, such that pixels remain silent until they sense an absolute contrast above the adjustable threshold. The pixel contrast computation circuit is based on Boahen's Biharmonic operator contrast circuit, which has been improved to include mismatch calibration and adaptive current based biasing. As a result, the contrast computation circuit shows much less mismatch, is almost insensitive to ambient light illumination, and biasing is much less critical than in the original voltage biasing scheme. A full AER retina version has been fabricated. In the present paper we provide simulation and preliminary experimental results.

A mismatch calibrated bipolar spatial contrast AER retina with adjustable contrast threshold
J.A. Leñero-Bardallo, T. Serrano-Gotarredona and B. Linares-Barranco
Conference - International Symposium on Circuits and Systems ISCAS 2009
[abstract]
Address Event Representation (AER) is an emergent technology for assembling modular multi-block bio-inspired sensory and processing systems. Visual sensors (retinae) are among the first AER modules to be reported since the introduction of the technology. Spatial contrast AER retinae are of special interest since they provide highly compressed data flow without reducing the relevant information required for performing recognition. Reported AER contrast retinae perform a contrast computation based on the ratio between a pixel's local light intensity and a spatially weighted average of its neighbourhood. This resulted in compact circuits, but with the penalty of all pixels generating output signals even if they sensed no contrast. In this paper we present a spatial contrast retina with bipolar output: contrast is computed as the relative difference between a pixel's local light and its weighted spatial average. As a result, contrast includes a sign and the output will be zero if there is no contrast. Furthermore, an adjustable thresholding mechanism has been included, such that pixels remain silent until they sense an absolute contrast above the adjustable threshold. The pixel contrast computation circuit is based on Boahen's Biharmonic operator contrast circuit, which has been improved to include mismatch calibration and adaptive current based biasing. As a result, the contrast computation circuit shows much less mismatch, is almost insensitive to ambient light illumination, and biasing is much less critical than in the original voltage biasing scheme. A full AER retina version has been submitted for fabrication. In the present paper we provide simulation results.

Compact calibration circuit for large neuromorphic arrays
J.A. Leñero-Bardallo, T. Serrano-Gotarredona and B. Linares-Barranco
Conference - International Symposium on Circuits and Systems ISCAS 2008
[abstract]
Low-current applications, like neuromorphic circuits, where operating currents can be as low as a few nanoamperes or less, suffer from huge transistor mismatches, resulting in around or less than 1-bit precision. Here we present a new calibration approach based on individually calibratable current sources made with MOS transistors of digitally adjustable length, which require only N unit transistors. The scheme includes a translinear-circuit-based tuning scheme, which allows the operating range of the calibrated circuits to be expanded, with graceful precision degradation, over 4 decades of operating currents. Experimental results are provided for 5-bit resolution DACs operating at 20 nA.

A calibration technique for very low current and compact tunable neuromorphic cells: Application to 5-bit 20-nA DACs
J.A. Leñero-Bardallo, T. Serrano-Gotarredona and B. Linares-Barranco
Journal Paper - IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 55, no. 6, pp 522-526, 2008
IEEE    DOI: 10.1109/TCSII.2007.916864    ISSN: 1549-7747    » doi
[abstract]
Low-current applications, like neuromorphic circuits, where operating currents can be as low as a few nanoamperes or less, suffer from huge transistor mismatches, resulting in around or less than 1-bit precision. Recently, a neuromorphic programmable-kernel 2-D convolution chip has been reported where each pixel included two compact calibrated digital-to-analog converters (DACs) of 5-bit resolution, for currents down to picoamperes. Those DACs were based on MOS ladder structures, which although compact require 3N + 1 unit transistors (N is the number of calibration bits). Here, we present a new calibration approach not based on ladders, but on individually calibratable current sources made with MOS transistors of digitally adjustable length, which require only N unit transistors. The scheme includes a translinear-circuit-based tuning scheme, which allows us to expand the operating range of the calibrated circuits with graceful precision degradation, over four decades of operating currents. Experimental results are provided for 5-bit resolution DACs operating at 20 nA using two different translinear tuning schemes. The maximum measured precision is 5.05 and 7.15 bits, respectively, for the two DAC schemes.

The stochastic I-Pot: A circuit block for programming bias currents
R. Serrano-Gotarredona, L. Camuñas-Mesa, T. Serrano-Gotarredona, J.A. Leñero-Bardallo and B. Linares-Barranco
Journal Paper - IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 54, no. 9, pp 760-764, 2007
IEEE    DOI: 10.1109/TCSII.2007.900881    ISSN: 1549-7747    » doi
[abstract]
In this brief, we present the "Stochastic I-Pot". It is a circuit element that allows for digitally programming a precise bias current ranging over many decades, from picoamperes up to hundreds of microamperes. I-Pot blocks can be chained within a chip to allow for an arbitrary number of programmable bias currents. The approach only requires providing the chip with three external pins, the use of an external current-measuring instrument, and a computer. This way, once all internal I-Pots have been characterized, they can be programmed through a computer to provide any desired current bias value with very low error. The circuit block turns out to be very practical for experimenting with new circuits (especially when a large number of biases is required), testing wide ranges of biases, introducing means for current mismatch calibration, offset compensation, etc., using a reduced number of chip pins. We show experimental results of generating bias currents with errors of 0.38% (8 bits) for currents varying from 176 μA to 19.6 pA. Temperature effects are characterized.
