
produce outputs, as explained earlier, that are fed into the second hidden layer. The latter repeats what the previous hidden layer performed and feeds its outputs as inputs into the next hidden layer. This process is repeated until the output layer produces the final output. If the final output differs from the expected output, the weights on the connections between the layers are adjusted, and the whole process is repeated starting from the input layer. In each iteration of training, the amount of deviation between the generated output and the expected output is measured; the function that measures this deviation is referred to as the loss function. The ultimate goal is to bring the loss as close to zero as possible, making the DL algorithm more accurate in extracting the correct features or patterns from a given set of input data. But this also means that DL inherently has some level of uncertainty, which we discuss next.
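
To make this train-measure-adjust loop concrete, here is a minimal sketch of one such training procedure. It assumes PyTorch, and the network sizes, data, and learning rate are illustrative placeholders rather than anything prescribed in this article:

```python
import torch
from torch import nn

# A small feedforward network with two hidden layers, mirroring the
# layer-by-layer flow described above (illustrative sizes).
model = nn.Sequential(
    nn.Linear(8, 16), nn.ReLU(),   # first hidden layer
    nn.Linear(16, 16), nn.ReLU(),  # second hidden layer
    nn.Linear(16, 1),              # output layer
)
loss_fn = nn.MSELoss()             # measures deviation from the expected output
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(32, 8)             # a batch of inputs (placeholder data)
y = torch.randn(32, 1)             # the corresponding expected outputs

for epoch in range(100):           # repeat, aiming to bring the loss near zero
    optimizer.zero_grad()
    y_hat = model(x)               # forward pass through all layers
    loss = loss_fn(y_hat, y)       # deviation between generated and expected output
    loss.backward()                # backpropagate the deviation
    optimizer.step()               # adjust the weights on the connections
```

Each pass through the loop performs exactly the cycle described above: a forward pass through the layers, a loss measurement, and a weight adjustment.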

Uncertainty in Deep Learning

DL applications vary in the level of uncertainty they can tolerate. While we can tolerate most Siri mistakes that appear regularly, critical applications like autonomous driving or content filtering for copyright legislation require models with much less uncertainty in their predictions. Therefore, let us shed some light on the uncertainty in DL techniques and how it is addressed by researchers.

Like any other measurement system, two types of effects can contribute to uncertainty in a DL system, namely: systematic and random [12]. Systematic effects occur when the DL system is trained with a dataset that is not sufficiently representative of the entire input domain, or if the DL model underfits the training data. Even if a DL model does not underfit the training data, it can reasonably predict measurement results only for data patterns that it has been trained with before; uncertainty may rise when it is fed with data patterns that it is seeing for the first time. This can be improved by training the DL model with more generalizable data and by using better hypothesis functions. Random effects, on the other hand, arise if the DL model slightly changes during prediction (not during training), such as with Monte Carlo Dropout, which causes the DL model to produce different outputs for the same input depending on the nodes that are dropped. These random effects will be larger if the DL model suffers from overfitting during training. Whether we are dealing with systematic or random effects, it is essential to quantify the level of uncertainty associated with the DL techniques. In DL, we can typically use the following methods to quantify the uncertainty [13]: Monte Carlo Dropout, Deep Ensembles (e.g., Distributional Parameter Estimation or Ensemble Averaging), Dropout Ensembles, Quantile Regression, and Gaussian Process Inference. Details are beyond the scope of this article; in fact, we are writing an article devoted to uncertainty in DL-based measurements, which we hope will be published in this same magazine in the near future.
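
To make the first of these methods concrete, the sketch below shows the core idea of Monte Carlo Dropout: keep dropout active at prediction time, run several forward passes for the same input, and treat the spread of the outputs as an uncertainty estimate. This is only a minimal sketch, assuming PyTorch; the model shape, dropout rate, and number of passes are illustrative choices, not anything prescribed above:

```python
import torch
from torch import nn

model = nn.Sequential(
    nn.Linear(8, 64), nn.ReLU(),
    nn.Dropout(p=0.5),             # dropout stays active during prediction
    nn.Linear(64, 1),
)

def mc_dropout_predict(model, x, n_samples=100):
    model.train()                  # keep dropout "on" (unlike model.eval())
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    # The mean is the prediction; the standard deviation quantifies the
    # random effect introduced by the dropped nodes.
    return samples.mean(dim=0), samples.std(dim=0)

x = torch.randn(1, 8)              # a single input (placeholder data)
y_mean, y_std = mc_dropout_predict(model, x)
```

The spread across the sampled outputs reflects exactly the random effect described above: each pass drops a different subset of nodes, so each pass yields a slightly different output for the same input.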

With the above general description about DL and its uncertainty in mind, let us now take a look at specific DL techniques used in practice.

Deep Learning Techniques
The DL techniques that we observed in our literature search include recurrent neural networks, convolutional neural networks, deep Boltzmann machines, deep belief networks, generative adversarial networks, and autoencoders.

Recurrent Neural Networks

Recurrent neural networks (RNNs) are developed to process sequential information. In these networks, the same task is applied for every element in the sequence of information, with the output being dependent on the previous computations [14]; thus the name recurrent. The typical architecture of an RNN is illustrated in Fig. 4a. In this figure, N designates part of a neural network, Xt refers to the inputs over time, and Yt refers to the corresponding outputs over time.

It is quite convenient to unfold the architecture in Fig. 4a to look like the one in Fig. 4b. In this figure, we can clearly see that a type of memory is employed in RNNs, where the information that has been calculated up to time t is preserved. One popular RNN model is the long short-term memory (LSTM) model proposed by Gers and Schmidhuber in [15]. Typical applications that benefit from RNNs are speech recognition, language modeling, translation, and image captioning [14].

Fig. 4. a) The typical architecture of an RNN. b) Unfolded RNN architecture.
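
To connect Fig. 4 to code, here is a minimal sketch, assuming PyTorch's nn.LSTM, of a network that maps each input Xt to an output Yt while its hidden state carries what has been computed up to time t; the class name, layer sizes, and data below are illustrative assumptions:

```python
import torch
from torch import nn

class SequenceModel(nn.Module):
    """Maps an input sequence Xt to an output sequence Yt."""
    def __init__(self, n_inputs=4, n_hidden=32, n_outputs=2):
        super().__init__()
        self.lstm = nn.LSTM(n_inputs, n_hidden, batch_first=True)
        self.head = nn.Linear(n_hidden, n_outputs)

    def forward(self, x):
        # h carries the "memory" of everything computed up to time t.
        h, _ = self.lstm(x)
        return self.head(h)        # one output per time step

model = SequenceModel()
x = torch.randn(1, 10, 4)          # one sequence of 10 time steps (placeholder)
y = model(x)                       # shape: (1, 10, 2), an output at every step
```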

Convolutional Neural Networks

Convolutional neural networks (CNNs) employ convolutional layers, subsampling (pooling) layers, and a final stage of a fully connected layer, as illustrated in Fig. 5. These networks are named convolutional due to their use of the convolution mathematical operation in their feature extraction process.
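
A minimal sketch of this three-stage pipeline, again assuming PyTorch (the layer sizes and the 28x28 single-channel input are illustrative, not taken from Fig. 5):

```python
import torch
from torch import nn

model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),  # convolutional feature extraction
    nn.ReLU(),
    nn.MaxPool2d(2),                            # subsampling (pooling) layer
    nn.Flatten(),
    nn.Linear(8 * 14 * 14, 10),                 # final fully connected stage
)

x = torch.randn(1, 1, 28, 28)   # one single-channel 28x28 input (placeholder)
logits = model(x)                # shape: (1, 10)
```

Here max pooling halves the spatial resolution, a common subsampling choice; the convolution stage extracts the features that the fully connected stage finally maps to the output.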