
estimate that a decision made in less than one second is a good
benchmark. Therefore, any system that can provide a real-time
reaction time (time from sensing the issue to implementing a
response mechanism) of less than one second can be considered faster than a human response and sufficient for a vehicle control system [14].
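As a rough illustration of this budget, the sketch below simply times one sense-decide-act cycle and flags cycles that exceed one second; the sense(), infer(), and actuate() callables are hypothetical placeholders for an application's own acquisition, inference, and control routines, not functions defined in this article.

```python
import time

def check_reaction_time(sense, infer, actuate, budget_s=1.0):
    """Run one sense-decide-act cycle and report whether it meets the budget."""
    t0 = time.perf_counter()
    observation = sense()          # e.g., grab a camera frame or a sensor reading
    decision = infer(observation)  # forward pass of the trained network
    actuate(decision)              # apply the control/response mechanism
    elapsed = time.perf_counter() - t0
    return elapsed, elapsed <= budget_s
```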
For fast output, one can design a CNN with few layers, such as the one proposed by researchers at the Universities of Calgary and Saskatchewan [15], where an architecture consisting of three convolutional layers followed by a 3 × 3 pooling layer was used for leaf counting in rosette plants. Moreover, in the last few years, a family of network architectures known as R-CNN has been proposed for object detection in real-time applications. The initial convolutional pipeline was proposed by Girshick et al. [16] in 2014 and won the Pascal VOC challenge. The authors subsequently proposed an updated version of the architecture (the so-called Fast R-CNN) [17], which jointly trained the CNN, the classifier, and the bounding-box regressor using fully connected layers. The network was further improved by Shaoqing Ren et al. (Faster R-CNN) [18], which replaced the external region-proposal step with a fully convolutional Region Proposal Network. Such networks are widely used in agricultural applications, since they allow reliable, real-time processing of the images, as detailed in [19]-[21].
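As a minimal sketch of such a compact architecture (not the exact network of [15]: the filter counts, the assumed 128 × 128 RGB input, and the regression head are illustrative), a three-convolutional-layer CNN with a 3 × 3 pooling stage could look as follows in PyTorch:

```python
import torch
import torch.nn as nn

class SmallCountingCNN(nn.Module):
    """Compact CNN in the spirit of [15]: three convolutional layers,
    a 3x3 pooling stage, and a small regression head for a count output.
    All layer sizes here are illustrative assumptions."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(kernel_size=3),          # the 3x3 pooling stage
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 42 * 42, 64), nn.ReLU(),
            nn.Linear(64, 1),                     # predicted leaf count
        )

    def forward(self, x):
        return self.head(self.features(x))

# One forward pass on a dummy 128x128 image to verify the shapes.
model = SmallCountingCNN()
count = model(torch.randn(1, 3, 128, 128))
```

With so few layers, a single forward pass involves only a handful of convolutions, which is what makes sub-second inference on modest hardware plausible.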
The discussion above has been focused on images. As
CNNs were very successful for tackling the problem of image
description when compared to what had been available before, the pretrained networks often use inputs in the form of
images consisting of matrices of unsigned integers with pixel
intensity values ranging from 0 to 255 and of fixed sizes that
vary from 224 by 224 to 331 by 331 such as the ones listed in
Table 1. If images of different sizes are available, one can also
rescale them to fit one of these networks. The image pattern
will be distorted, but the CNN network is expected to find
features from these distorted images as well as if the images
were kept in their original sizes. If audio is to be used, one can
transform these 1D signals into 2D images via time-frequency
or time-scale analysis [22]-[26]. Furthermore, a newer type
of deep network that allows for the inclusion of temporal behavior, such as the longshort-term recurrent neural network,
has proven to be very efficient for audio applications [27], [28]
where the inputs are not the standard unsigned 8-bit 2D data
that form images but real numbers that form 1D input vectors.
Inputs then can be in the form of data coming from sensors that
record vibration, humidity, temperature, etc.
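A minimal sketch of both preprocessing paths is shown below, assuming PIL and SciPy are available; the file name, sampling rate, and spectrogram parameters are illustrative placeholders rather than values from the references.

```python
import numpy as np
from PIL import Image
from scipy.signal import spectrogram

# Rescale an arbitrary-size photo to the 224x224 input expected by many
# pretrained CNNs; pixel values remain unsigned 8-bit integers in [0, 255].
# "leaf.jpg" is a placeholder file name.
img = Image.open("leaf.jpg").convert("RGB").resize((224, 224))
img_array = np.asarray(img, dtype=np.uint8)        # shape (224, 224, 3)

# Turn a 1-D audio/vibration signal into a 2-D time-frequency image so that
# the same image-oriented networks can be applied to it.
fs = 16000                                         # assumed sampling rate (Hz)
signal = np.random.randn(fs)                       # stand-in for one second of sensor data
f, t, Sxx = spectrogram(signal, fs=fs, nperseg=256)
log_spec = 10 * np.log10(Sxx + 1e-12)              # dB-scaled 2-D "image"
```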

Not Enough Data: What Can be Done?
As long as data is available and labeled with its correct classification to train a network, a prototype can be designed. This data requirement can actually be an obstacle, as many examples, on the order of thousands, are usually needed when using a deep network such as the CNNs mentioned before [29]. One way to alleviate the problem of a smaller data set is to create more images through operations such as rotation, scaling, and shifting of the original data. This step is known as data augmentation [30]. Nevertheless, it is highly recommended
to start with a large data set that includes different scenarios, such as possible occlusions of objects, different types of illumination, or any other factor that would generate very different images not considered during network training. In the case of audio, one must include cases that can be foreseen, such as noise coming from traffic if the system is to record near a public road. As indicated in [31], it is not just the amount of data but also its significance and usefulness for the application that has to be considered.
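As a hedged example of the augmentation step described above, a typical torchvision pipeline might look like the following; the rotation, translation, and scaling ranges are illustrative choices, not values prescribed by [30].

```python
from torchvision import transforms
from PIL import Image

# Each pass over the data set yields a slightly different rotated, scaled,
# and shifted version of every image, enlarging the effective training set.
augment = transforms.Compose([
    transforms.RandomRotation(degrees=15),               # rotation
    transforms.RandomAffine(degrees=0,
                            translate=(0.1, 0.1),        # shifting
                            scale=(0.9, 1.1)),           # scaling
    transforms.RandomHorizontalFlip(),
])

original = Image.open("sample.jpg")                      # placeholder file name
augmented = [augment(original) for _ in range(10)]       # 10 extra training images
```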
The scenario of not having enough data is likely when an innovative solution to a particular problem is sought. In such cases, one can start with a shallow network, i.e., a conventional neural network. Although not as complex as CNNs, these networks can still use low-resolution images and have been used in the past for self-driving vehicles. An example is the Autonomous Land Vehicle in a Neural Network (ALVINN), which back in 1989 used a simple neural network consisting of one hidden layer with 29 nodes, taking as input 30 × 32 video images as well as input from a range finder, and was trained to yield 45 direction outputs [32]. ALVINN can be considered a proof of concept; because of the accuracy demands of such an application, only recently has DL been adopted in commercial autonomous vehicles, achieving much better results than those reported in [32]. Usually such shallow networks take inputs in the form of features whose total number is much smaller than the number of pixels in an image. They execute very rapidly, as the network requires only a matrix multiplication. The results of this multiplication are evaluated with a sigmoid function: the outputs can be turned into logical ones and zeros by simply thresholding, or, without thresholding, the numerical output can indicate how probable the network found that particular output to be. This can lead to a preliminary system in which a shallow network is designed first, classifications made with high probability are used to create a new labeled database, and the less likely classifications are manually assessed by an expert, a process known as active learning [33]. Once a bigger data set is formed this way, DL can potentially improve the accuracy later.
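A minimal sketch of this idea follows, assuming hand-crafted feature vectors and randomly initialized (i.e., untrained) weights purely to show the mechanics: a one-hidden-layer forward pass with a sigmoid output, thresholding the confident cases, and routing the uncertain ones to a human expert. The feature dimension and hidden-layer size are arbitrary assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def shallow_forward(features, W1, b1, W2, b2):
    """One hidden layer followed by a sigmoid output: essentially two
    matrix multiplications, which is why inference is so fast."""
    hidden = sigmoid(features @ W1 + b1)
    return sigmoid(hidden @ W2 + b2)            # value in (0, 1), read as confidence

# Hypothetical setup: 12 hand-crafted features, 20 hidden nodes, 1 output.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(12, 20)), np.zeros(20)
W2, b2 = rng.normal(size=(20, 1)), np.zeros(1)
X = rng.normal(size=(100, 12))                  # stand-in for unlabeled samples

p = shallow_forward(X, W1, b1, W2, b2).ravel()
confident = (p > 0.9) | (p < 0.1)               # auto-label these (class = p > 0.5)
to_review = ~confident                          # send the rest to a human expert
```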

Training Databases Available
Computer vision applications require the training on a large
amount of images. A popular training strategy is to exploit
training databases such as COCO, PASCAL VOC, SUN. Microsoft COCO is a large-scale object detection, segmentation, and
captioning dataset for object segmentation and recognition in
context including 330 K images [34]. Pascal VOC [35] provides
standardized image data sets for object class recognition. SUN
[36] provides a benchmark for scene categorization. The Places
[37] database introduced by MIT provides a novel scene-centric database with 205 scene categories and 2.5 millions of
images with a category label. The importance of this dataset is
not only to provide a large data-set for network training, but
also enables international challenges, allowing researchers to
compare novel network architectures. However, training on
such large data-bases sometimes requires hardware and GPU
