The log-likelihood function enables the calculation of the most likely parameters, or maximum likelihood, of the joint probability of independent events. Because the observations are assumed to be independent, the maximum likelihood reflects the most likely probability distribution from which the observations were sampled. Working with the log-likelihood simplifies the estimation of the maximum-likelihood parameters, whose calculation requires a differentiation that would otherwise be difficult to compute. In (5), the second term produces high loss values when the difference between the estimated mean and the observation is large, which pushes the network to adjust its weights to reduce this difference. When this difference cannot be reduced further, the variance in the denominator increases to compensate and thereby reduces the loss.
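
As a concrete illustration of this behavior, the following is a minimal sketch of a Gaussian negative log-likelihood loss of the kind described above; the function name, the small constant eps, and the example numbers are illustrative, and the exact constant terms of (5) may differ.

```python
import numpy as np

def gaussian_nll(y, mu, var, eps=1e-6):
    """Negative log-likelihood of y under N(mu, var), up to a constant.

    First term: 0.5 * log(var) penalizes large predicted variances.
    Second term: (y - mu)^2 / (2 * var) grows when the predicted mean
    is far from the observation; if that gap cannot be reduced, the
    network can only lower the loss by increasing var.
    """
    var = np.maximum(var, eps)            # guard against division by zero
    return 0.5 * np.log(var) + 0.5 * (y - mu) ** 2 / var

# Same prediction error, two different predicted variances:
print(gaussian_nll(y=2.0, mu=0.0, var=0.5))   # confident but wrong -> high loss
print(gaussian_nll(y=2.0, mu=0.0, var=4.0))   # larger variance absorbs the error
```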

Adversarial Training
Researchers have remarked that ANNs can produce widely divergent predictions for very similar inputs. For instance, objects appearing in nearly identical images may be classified differently. To address this issue, Szegedy et al. [14] proposed augmenting the training data with examples that are close to those already in it. These adversarial examples are chosen strategically to be similar to examples from the training data and yet to increase the loss during training. To this end, Goodfellow et al. [15] introduced the fast gradient sign method as a computationally fast approach to generating adversarial examples. Lakshminarayanan et al. [13] propose incorporating adversarial training into their probabilistic distribution prediction approach and posit that it improves the predictive accuracy of their method.
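
As an illustration of how such examples can be generated, the sketch below applies the fast gradient sign method to an arbitrary differentiable model; the toy model, loss function, and epsilon value are placeholders rather than the configuration used in [13] or [15].

```python
import torch
import torch.nn.functional as F

def fgsm_example(model, loss_fn, x, y, epsilon=0.01):
    """Fast gradient sign method: move each input component by epsilon
    in the direction that increases the loss."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = loss_fn(model(x_adv), y)
    loss.backward()                                   # d(loss)/d(x_adv)
    return (x_adv + epsilon * x_adv.grad.sign()).detach()

# Toy usage with a linear regression model (illustrative only).
model = torch.nn.Linear(4, 1)
x, y = torch.randn(8, 4), torch.randn(8, 1)
x_adv = fgsm_example(model, F.mse_loss, x, y)         # adversarial counterparts of x
```

During adversarial training, such perturbed inputs are typically added alongside the original examples so that the network also learns to fit them.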

Deep Ensemble
Ensemble learning involves training multiple models on the training dataset to improve predictive accuracy and, in some cases, reduce overfitting. At run time, the results of these models are fused using a variety of schemes. In addition to adversarial training, Lakshminarayanan et al. [13] propose using ensemble learning to further improve uncertainty estimation. To evaluate an input, every member of the ensemble makes a prediction, and all predictions are averaged to obtain a Gaussian mixture distribution (given that every member predicts a Gaussian distribution). The result is a model whose output is a random variable, with each member of the ensemble producing a Gaussian distribution p̂(y|x). Hence, the aleatoric uncertainty is estimated as the average of the variances predicted by the members of the ensemble, and the epistemic uncertainty corresponds to the variance of the means predicted by the members. Both components can be combined to produce an overall estimate of the model's uncertainty.

Fig. 3. (a) Typical ANN model that produces a single prediction for an input. (b) ANN model as proposed by Lakshminarayanan et al. [13] that produces a probability distribution for an input.
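
A minimal sketch of the fusion just described, assuming each ensemble member returns a predicted mean and variance for a given input (names, shapes, and example values are illustrative):

```python
import numpy as np

def fuse_ensemble(means, variances):
    """Combine per-member Gaussian predictions for one input.

    means, variances: arrays of shape (M,), one entry per ensemble member.
    Returns the mixture mean, the aleatoric and epistemic components,
    and their sum as an overall uncertainty estimate.
    """
    mu = means.mean()                      # mean of the Gaussian mixture
    aleatoric = variances.mean()           # average of the predicted variances
    epistemic = means.var()                # variance of the predicted means
    return mu, aleatoric, epistemic, aleatoric + epistemic

# Example with five hypothetical ensemble members:
means = np.array([1.9, 2.1, 2.0, 2.2, 1.8])
variances = np.array([0.30, 0.25, 0.40, 0.35, 0.20])
print(fuse_ensemble(means, variances))
```

The sum of the two components equals the variance of the resulting mixture distribution, which is one natural way to combine them into a single uncertainty value.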

Dropout Ensemble
Bachstein [16] presents an approach that combines aspects of
the Monte Carlo dropout and deep ensemble methods. Just
like the deep ensemble method, Bachstein proposes modifying ANNs to predict probability distributions and uses the negative log-likelihood as the loss function. However, instead of adversarial training, the proposed approach relies on dropout to build robustness into the network.
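
One plausible reading of such a combination is sketched below: a network with a dropout layer and a mean/variance output head is sampled several times with dropout kept active at inference, and the stochastic passes are fused like ensemble members. This is an illustration under those assumptions, not necessarily Bachstein's exact formulation.

```python
import torch
import torch.nn as nn

class MeanVarNet(nn.Module):
    """Toy network with dropout that outputs a mean and a variance."""
    def __init__(self, d_in=4, d_hidden=32):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(d_in, d_hidden), nn.ReLU(), nn.Dropout(p=0.2))
        self.mean_head = nn.Linear(d_hidden, 1)
        self.logvar_head = nn.Linear(d_hidden, 1)

    def forward(self, x):
        h = self.body(x)
        return self.mean_head(h), torch.exp(self.logvar_head(h))

@torch.no_grad()
def predict_with_dropout(model, x, n_samples=20):
    """Keep dropout active at inference and treat each stochastic
    forward pass as an ensemble member."""
    model.train()                          # leaves dropout enabled
    means, variances = zip(*(model(x) for _ in range(n_samples)))
    means, variances = torch.stack(means), torch.stack(variances)
    aleatoric = variances.mean(dim=0)      # average predicted variance
    epistemic = means.var(dim=0)           # spread of the predicted means
    return means.mean(dim=0), aleatoric + epistemic

model = MeanVarNet()
x = torch.randn(3, 4)
print(predict_with_dropout(model, x))
```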

Uncertainty Estimation without Re-Training
The methods surveyed above require modifications to the ANN design and/or training process. In practice, however, developers often employ pre-trained networks for classification or regression tasks. These networks are typically trained on large datasets, and retraining them may require vast computational resources and access to the training data, which is not always possible. To address this, Mi et al. [17] introduce an approach for uncertainty estimation on pre-trained networks that were not designed and/or trained with uncertainty approximation in mind. They define two scenarios: black-box and gray-box uncertainty estimation. In the black-box scenario, the developer has access to the trained model, but it is impractical to modify or retrain it. In the gray-box scenario, the developer has access to intermediate layers in the network but is still unable to modify the weights by retraining.

In the latter case, Mi et al. access the feature maps produced by the layers of Convolutional Neural Networks (CNNs). CNNs automate feature engineering by extracting the features that are useful for a particular regression or classification task; through the training process, the network learns which features are important for the problem in question. This feature extraction is performed progressively through multiple convolutional layers, and the output of each convolutional layer corresponds to features deemed useful during training. While the early layers produce general features that are applicable to numerous tasks, the last layers generate specific features that are optimized for the problem under consideration. To estimate uncertainty, Mi et al. [17] impose tolerable perturbation on the
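
The gray-box setting described above requires only read access to intermediate feature maps. The sketch below shows one way to obtain such feature maps from a frozen CNN using forward hooks; the stand-in network and layer choices are illustrative, and this covers only the feature-map access, not the perturbation-based estimation of Mi et al. [17].

```python
import torch
import torch.nn as nn

# Stand-in for a pre-trained CNN; in practice this would be a frozen,
# already-trained model whose weights cannot be changed.
cnn = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 10),
)

feature_maps = {}

def save_output(name):
    def hook(module, inputs, output):
        feature_maps[name] = output.detach()   # read-only copy of the feature map
    return hook

# Gray-box access: register hooks on intermediate layers without touching weights.
cnn[0].register_forward_hook(save_output("early_conv"))
cnn[2].register_forward_hook(save_output("late_conv"))

with torch.no_grad():
    cnn(torch.randn(1, 3, 32, 32))

print({name: tuple(fm.shape) for name, fm in feature_maps.items()})
```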
