To study the discrimination potential of the pDVS sensor against targets rotating at different speeds, a neural network was fed sequences of time event windows. Each input to the neural network consists of 200 time event windows, which equates to 100 ms of real-time information (500 μs per window). The data used for this study were split 90/10 into training and testing sets, respectively. The validation accuracy and validation loss for this neural network are shown in Fig. 4.
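
As an illustration, the following sketch shows how asynchronous pDVS events could be binned into fixed-duration time event windows and split 90/10. The field names, sensor resolution, and array shapes here are assumptions made for illustration, not the authors' implementation; only the window arithmetic (200 windows spanning 100 ms gives 500 μs per window) comes from the text.

    # Sketch: binning asynchronous pDVS events into fixed-duration time
    # event windows. Event field names and the 128 x 128 resolution are
    # assumptions, not the authors' implementation.
    import numpy as np

    WINDOW_US = 500   # 200 windows x 500 us = 100 ms per input sequence
    SEQ_LEN = 200     # time event windows per neural-network input
    H, W = 128, 128   # assumed sensor resolution

    def events_to_windows(events, t0):
        """Accumulate (t, x, y) events into SEQ_LEN binary frames."""
        windows = np.zeros((SEQ_LEN, H, W), dtype=np.float32)
        idx = ((events["t"] - t0) // WINDOW_US).astype(int)
        valid = (idx >= 0) & (idx < SEQ_LEN)
        # ON/OFF contrast events are folded into one channel for brevity
        windows[idx[valid], events["y"][valid], events["x"][valid]] = 1.0
        return windows

    def split(sequences, labels, seed=0):
        """90/10 train/test split over the collected sequences."""
        rng = np.random.default_rng(seed)
        order = rng.permutation(len(sequences))
        cut = int(0.9 * len(sequences))
        tr, te = order[:cut], order[cut:]
        return (sequences[tr], labels[tr]), (sequences[te], labels[te])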
The neural network trained on time event windows exhibits a training accuracy of 100%, a validation accuracy of 100%, a training loss of 1.07 × 10−3, and a validation loss of 9.61 × 10−4. In contrast, for the neural network trained on full-frame reconstructed images, training accuracy drops to 95%, validation accuracy drops to 89%, training loss rises to 0.37, and validation loss rises to 0.46, as shown in Fig. 5.

Fig. 5. Neural network validation with a sequence of 1000 full-frame images (256 μs/frame).
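
A minimal classifier sketch in the spirit of this comparison follows, written against TensorFlow/Keras; the article does not specify the framework, the architecture, or the number of rotational-speed classes, so all of those (including NUM_SPEEDS) are assumptions.

    # Minimal sketch of a sequence classifier over time event windows.
    # Framework, layer choices, and NUM_SPEEDS are assumptions; the
    # article reports only the accuracy/loss figures above.
    import tensorflow as tf
    from tensorflow.keras import layers

    NUM_SPEEDS = 4  # hypothetical number of rotational-speed classes

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(200, 128, 128, 1)),  # 200 windows per input
        layers.TimeDistributed(layers.Conv2D(8, 3, activation="relu")),
        layers.TimeDistributed(layers.GlobalAveragePooling2D()),
        layers.LSTM(32),  # aggregates the 100 ms temporal sequence
        layers.Dense(NUM_SPEEDS, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # model.fit(train_x, train_y, validation_data=(test_x, test_y))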


Conclusion


The outcome of this study indicates that by coupling deep learning with polarimetric dynamic vision sensor (pDVS) principles, efficient discrimination of targets operating at different rotational speeds is achieved. The system has the potential to operate at high processing speeds and low latency while exhibiting high spatial-temporal contrast, low bandwidth, low power consumption, and low memory and storage requirements. It has potential applications for a wide range of moving-target detection, recognition, tracking, and classification problems. Overall, bioinspired vision architectures that integrate human cognitive capabilities, such as computation and memory through neuron- and synapse-emulating circuits, with the polarization vision capabilities of certain animal species would lead to the next generation of highly efficient augmented intelligence vision systems.

Martin Nowak develops software to analyze neuromorphic data. He currently works full-time as a Software Engineer at Passport Corp. in Warwick, New York, and is a graduate student at Manhattan College. Martin specializes
