Instrumentation & Measurement Magazine 23-8 - 19

such as computation and memory emulating neurons and
synapses, together with the polarization vision capabilities
of certain animal species, has the potential to revolutionize
and give rise to the next generation of highly efficient artificial intelligence vision systems. Traditional von Neumann computer architectures, although highly efficient at performing intensive computational tasks, lack the ability to recognize, analyze, and classify large sets of data. In contrast, neuromorphic computing exhibits bioinspired functions such
as parallel processing and human cognition. Unlike semiconductor-based devices, the human brain has the potential to learn, understand, and recognize images, all while using very little energy. A key feature of human cognitive vision systems, adopted by neuromorphic computer vision, is their inherent redundancy and their ability to focus on the most relevant parts of the scene, retrieving salient features [5]-[9]. These salient
features contribute to fast visual processing and lead to visual
functions such as recognition, localization, and tracking while
also reducing the amount of information the system processes
as well as the bandwidth and storage used. To achieve this,
neuromorphic technology integrates algorithms to support
near real-time data acquisition and processing with architectures built on innovative computing hardware to address
specific user applications. In parallel, considerable efforts are
being made to mirror human cognition by exploring the development of deep learning architectures such as convolutional
neural networks that mimic the connectivity and adaptation in
connection strengths of synapses, jointly with machine learning and deep learning algorithms. Expanding this knowledge to computer vision and pattern recognition could yield an array of highly efficient applications, spanning from autonomous vehicles (AVs), microsatellites, and nanosatellites to robotic systems for Industry 4.0 factories and smart cities, while providing enhanced logistics and sound economic and social benefits through improved efficiency, agility, decreased computational complexity, reduced payload, low power consumption, sustainable traffic flow, and route safety.

Neuromorphic or Retina Vision Sensors
Neuromorphic or retina vision sensors mimic and implement models of biological visual systems [15]-[23]. Event-based Dynamic Vision Sensors (DVSs) are a class of neuromorphic vision sensors that operate on pixel-autonomous detection of temporal contrast. The DVS is inspired by the biological retina, enabling the consideration of an additional physical parameter, namely, time. DVS-based systems operate asynchronously, driven by differential light intensity variations. In other words, this type of vision sensor is sensitive to dynamically evolving scenes and responds directly to changes, i.e., temporal contrast, on a per-pixel basis and in near real time. The asynchronous occurrence of spikes encodes input signal variations with exact time stamps. In contrast, full-frame operating cameras operate synchronously by capturing a certain number of frames per second, while each
pixel contains an absolute intensity value. In other words, one of the advantages of a DVS is that it encodes the information in a compressive way, sending spikes only when there is a relevant change in light intensity. Another advantage is that the exact spatio-temporal information of the object is preserved, with a reported precision in the spiking times on the order of 1 μs. As a result, these sensors are ideal candidates for high-speed processing and recognition systems. A schematic diagram of the operation of a DVS is shown in Fig. 1.

Fig. 1. Asynchronous event-based visual signal encoding example. Note the uneven spacing between events.

November 2020
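As a minimal single-pixel sketch of this encoding (the intensity samples and contrast threshold below are illustrative assumptions, not the actual DVS circuit), an event stream records only timestamped changes, while a frame stream stores every sample:

```python
import math

# Toy single-pixel intensity trace sampled at regular intervals; a frame
# camera would store all of these absolute values.
samples = [100, 100, 101, 150, 150, 150, 75, 75, 76, 75]
THRESHOLD = 0.15  # illustrative log-intensity contrast threshold

events = []                  # (timestamp, polarity) pairs
ref = math.log(samples[0])   # pixel's reference log-intensity level
for t, intensity in enumerate(samples[1:], start=1):
    delta = math.log(intensity) - ref
    if delta > THRESHOLD:        # brightness rose: ON event
        events.append((t, 1))
        ref = math.log(intensity)
    elif delta < -THRESHOLD:     # brightness fell: OFF event
        events.append((t, -1))
        ref = math.log(intensity)

# Only two timestamped events encode the whole ten-sample trace.
print(len(samples), "frame samples vs.", len(events), "events:", events)
```

The uneven spacing of the resulting event timestamps mirrors the uneven spacing between events in Fig. 1.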

Polarimetric Dynamic Vision Principles p(DVS)
A new class of sensors aimed at target motion classification, namely the "Polarimetric Dynamic Vision Sensor p(DVS)," has been introduced by Giakos and coworkers to further enhance the local spatial-temporal contrast of the DVS imaging sensor [10]-[14]. It can operate at high processing speeds and exhibit high spatial-temporal contrast, while requiring low bandwidth, power consumption, memory, and storage. This unique class of sensors
offers distinct advantages for a wide range of motion target detection, recognition, tracking, and classification problems. The
operational principle of p(DVS)s is shown in Fig. 2.
The presented system consists of a 128 × 128 pixel DVS128 coupled to a pair of high-contrast linear polarizers placed in parallel. Although the spatial resolution of the polarimetric DVS sensor is significantly lower than that of traditional cameras, it exhibits a high dynamic range of 120 dB and an excellent time resolution of 1 μs, as well as enhanced contrast and scatter rejection attributed to the applied polarimetric principles.
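The behavior of the polarizer pair rests on standard polarization optics. As an illustrative sketch of the underlying principle (not the authors' p(DVS) calibration), Malus's law gives the intensity transmitted through the second polarizer (the analyzer) as a function of its angle relative to the first:

```python
import math

def transmitted(i0, theta_deg):
    """Malus's law: intensity after an ideal analyzer at angle theta
    (degrees) relative to the first polarizer, I = I0 * cos^2(theta)."""
    return i0 * math.cos(math.radians(theta_deg)) ** 2

# Parallel polarizers (0 deg) transmit fully; crossed ones (90 deg)
# extinguish the beam, which is the basis of polarimetric contrast.
i0 = 1.0  # intensity after the first polarizer (arbitrary units)
for theta in (0, 30, 45, 60, 90):
    print(f"{theta:2d} deg -> {transmitted(i0, theta):.3f}")
```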
The DVS128 is an event-based sensor in the sense that it captures data only when the incident light intensity at a particular pixel varies beyond one of two specified thresholds. If the light intensity drops below the negative threshold, an OFF event is triggered. If the light intensity rises above the positive threshold, an ON event is triggered, as shown in Fig. 1. Each pixel in the DVS128 operates in this manner and
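The per-pixel operation described above can be sketched for a toy 2 × 2 array (the frame values, thresholds, and event-tuple layout below are illustrative assumptions, not the DVS128 circuit): each pixel keeps its own reference level and triggers ON or OFF events independently of its neighbors.

```python
import math

# Three consecutive "frames" of a toy 2x2 sensor (illustrative values).
frames = [
    [[100, 100], [100, 100]],
    [[100, 160], [100, 100]],
    [[100, 160], [60, 100]],
]
ON_THRESHOLD, OFF_THRESHOLD = 0.2, -0.2  # illustrative contrast thresholds

# Each pixel keeps its own reference log intensity and compares against
# it independently of all other pixels.
refs = [[math.log(v) for v in row] for row in frames[0]]
events = []  # (time, x, y, polarity)
for t, frame in enumerate(frames[1:], start=1):
    for y in range(2):
        for x in range(2):
            delta = math.log(frame[y][x]) - refs[y][x]
            if delta > ON_THRESHOLD:
                events.append((t, x, y, "ON"))
                refs[y][x] = math.log(frame[y][x])
            elif delta < OFF_THRESHOLD:
                events.append((t, x, y, "OFF"))
                refs[y][x] = math.log(frame[y][x])

print(events)  # each tuple localizes an event in space and time
```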
