
Figure 9. Filtering the signal through two stacked L1/L2 NCSAEs trained using the reduced MNIST data set with class labels 1, 2, and 6. The test image is a 28 × 28-pixel image unrolled into a vector of 784 values. Both the input test sample and the RFs of the first AE layer are presented as images. The architecture is 784-10-10-3. The weights of the output layer are plotted as a diagram, with one row for each output neuron and one column for every hidden neuron in the second layer [13]. Black pixels indicate negative weights, and white pixels indicate positive ones. The range of weights is scaled to [−1, 1] and mapped to the gray colormap: w = −1 is assigned to black, w = 0 to gray, and w = 1 to white.

[Figure 9 panels, left to right: the test sample (image); the weights (shown as images) and biases of the hidden neurons in layer 1, each image formed from the weights of a single neuron; the weights and biases of the hidden neurons in layer 2, each row being the weight vector of a single neuron; and the matrix of classification weights, in which each row represents one output neuron. The annotated processing steps are 1) the dot products of the input and the neuron weights in layer 1, 2) the activations h1, 3) the dot products of the layer-1 activations and the neuron weights in layer 2, 4) the activations h2, 5) the dot product with the classification-layer weights, with biases added, and 6) the softmax nonlinearity, which yields the class probabilities (0.0004 for "1", 0.9962 for "2", and 0.0034 for "6").]
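As a minimal sketch of the gray-scale weight diagram described in the caption (assuming NumPy and Matplotlib; a random matrix stands in for the trained 3 × 10 output-layer weights, which are not reproduced here):

    import numpy as np
    import matplotlib.pyplot as plt

    # Stand-in for the trained output-layer weights: 3 output neurons x 10 hidden neurons.
    # (Random values here; the article's figure shows the learned NCSAE weights.)
    rng = np.random.default_rng(0)
    W_out = rng.uniform(-1.0, 1.0, size=(3, 10))

    # Scale the weights into [-1, 1] by the largest magnitude, as in the caption.
    W_scaled = W_out / np.max(np.abs(W_out))

    # One row per output neuron, one column per hidden neuron;
    # w = -1 maps to black, w = 0 to gray, and w = +1 to white.
    plt.imshow(W_scaled, cmap="gray", vmin=-1.0, vmax=1.0)
    plt.xlabel("hidden neuron (layer 2)")
    plt.ylabel("output neuron")
    plt.colorbar(label="scaled weight")
    plt.show()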



output is discrete). Finally, in the fine-tuning stage, the weights of all the layers are tuned simultaneously in a supervised fashion to improve the classification accuracy [23]. To illustrate this concept, two L1/L2 NCSAEs were stacked and trained,
as described previously, on 150 samples of the digits 1, 2, and 6 from the MNIST handwritten data set, 50 samples from each digit category.
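As a rough illustration of this fine-tuning stage, the following NumPy sketch backpropagates a softmax cross-entropy loss through a 784-10-10-3 stack whose first two weight matrices are assumed to have been pretrained as autoencoders (random placeholders are used here, and the L1/L2 and nonnegativity penalties of the NCSAEs are omitted):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def softmax(z):
        e = np.exp(z - z.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    rng = np.random.default_rng(0)

    # Placeholder "pretrained" encoder weights (in the article these come from
    # the two NCSAEs); W3/b3 is the randomly initialized classification layer.
    W1, b1 = rng.normal(0, 0.01, (784, 10)), np.zeros(10)
    W2, b2 = rng.normal(0, 0.01, (10, 10)), np.zeros(10)
    W3, b3 = rng.normal(0, 0.01, (10, 3)), np.zeros(3)

    # Toy training batch standing in for the reduced-MNIST samples.
    X = rng.random((150, 784))
    labels = rng.integers(0, 3, size=150)
    Y = np.eye(3)[labels]                    # one-hot targets for digits 1, 2, 6

    lr = 0.5
    for epoch in range(200):
        # Forward pass through the stacked encoders and the softmax layer.
        h1 = sigmoid(X @ W1 + b1)
        h2 = sigmoid(h1 @ W2 + b2)
        p = softmax(h2 @ W3 + b3)

        # Backpropagate the cross-entropy loss through all layers at once,
        # so every weight matrix is updated simultaneously (fine-tuning).
        d3 = (p - Y) / len(X)
        d2 = (d3 @ W3.T) * h2 * (1 - h2)
        d1 = (d2 @ W2.T) * h1 * (1 - h1)

        W3 -= lr * h2.T @ d3;  b3 -= lr * d3.sum(axis=0)
        W2 -= lr * h1.T @ d2;  b2 -= lr * d2.sum(axis=0)
        W1 -= lr * X.T @ d1;   b1 -= lr * d1.sum(axis=0)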
The number of hidden neurons was chosen to obtain reasonably good classification accuracy while keeping the network rather small. For this subset of MNIST data, the hidden sizes of the first and second AEs were selected to be ten to allow easy inspection. The network is intentionally kept small, because the full MNIST data would require a larger hidden-layer size, and this could limit network interpretability.

The filtering of a test image of the digit 2 is shown in Figure 9. It can be seen that the fourth and seventh RFs of the first AE layer have dominant activations (with activation values 0.12 and 0.13, respectively), and they capture most of the information about the test input. They are also able to filter a distinct part of the input digit. The outputs of the first-layer sigmoid constitute higher-level features extracted from the test image, with emphasis on the fourth and seventh features. Subsequently, in the second layer, the second, sixth, eighth, and tenth neurons have dominant activations (with activation values 0.0914, 0.0691, 0.0607, and 0.0606, respectively), because they have stronger connections with the dominant neurons in the first layer than the rest. Last, in the softmax layer, the second neuron was 99.62% activated, because it has the strongest connections with the dominant neurons in the second layer, thereby classifying the test image as 2.

Computational Considerations and Software Libraries

Many existing DL libraries have good AE implementations, but their functionalities vary. The list of


