The convolution layer allows the local properties in the data to be discovered. It performs a convolution of the given image with a kernel by adopting a particular layer-connection rule (Figure 2). In particular, the kernel weights define the weights that connect a subset of the input neurons with a hidden one. As the network is trained, the weights of the kernel are adapted to extract the optimal local feature. However, if we use a single kernel, the layer will be highly specialized; hence, its generalization capabilities are limited. To get around this, multiple different kernels are randomly initialized; thus, different kernels will be learned. In other words, for a given input feature map, e.g., the input image, we can have a convolution layer that produces multiple output feature maps.
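
To make the connection rule concrete, the following Python/NumPy sketch (an illustration, not the implementation of the cited works; the function name conv_layer and the sizes are assumptions) applies one shared kernel over an input map and then an activation f, as in Figure 2, and shows how several randomly initialized kernels yield several output feature maps.

import numpy as np

def conv_layer(image, kernel, f=np.tanh):
    # "Valid" 2-D convolution (no kernel flip, as is common in CNNs):
    # every output neuron reuses the same kernel weights (a, b, c, d in Figure 2),
    # and the weighted sum is passed through the activation f.
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return f(out)

image = np.random.rand(4, 6)                                # toy input feature map
kernels = [0.1 * np.random.randn(2, 2) for _ in range(3)]   # randomly initialized kernels
feature_maps = [conv_layer(image, k) for k in kernels]
print([fm.shape for fm in feature_maps])                    # three 3 x 5 output feature maps
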
The LCN layer limits the effects of intensity variations over different feature maps. When looking at similar images, it may be that, after the convolution operation, the resulting feature maps span a different feature space. To deal with this problem, an LCN layer [37] is stacked at the output of the convolution layer. The layer, inspired by computational neuroscience models [38], [39], adopts a connection rule similar to that of the convolution layer. It performs local subtractive and divisive normalizations, enforcing a sort of local competition between adjacent features in a feature map and between features at the same spatial location in different feature maps [37].
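
A minimal sketch of such a normalization is given below, assuming a uniform (box) window in place of the Gaussian weighting of [37] and using SciPy only for the local averaging; the function name lcn and the window size are illustrative.

import numpy as np
from scipy.ndimage import uniform_filter

def lcn(feature_maps, size=3, eps=1e-6):
    # feature_maps: array of shape (n_maps, H, W).
    # The local mean/variance are taken over a size x size spatial window
    # and across all maps at that location, so adjacent features compete.
    x = np.asarray(feature_maps, dtype=float)
    window = (x.shape[0], size, size)
    mean = uniform_filter(x, size=window, mode='reflect')
    v = x - mean                                             # subtractive normalization
    sigma = np.sqrt(uniform_filter(v ** 2, size=window, mode='reflect'))
    return v / np.maximum(sigma, max(sigma.mean(), eps))     # divisive normalization

maps = np.random.rand(3, 5, 5) * np.array([1.0, 5.0, 0.2])[:, None, None]
print(lcn(maps).std())    # responses now live on a common, locally normalized scale
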
The pooling/subsampling layer adds robustness to small shifts of the input data by looking at groups of input neurons that can come from either a single input feature map or multiple feature maps [37]. While many different kinds of pooling layers, e.g., average pooling and L2 pooling, can be found in the literature [35], [37], the max-pooling layer [35] is currently extremely popular. It tells us if a feature is present in the considered group, but not precisely where.
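
A sketch of max pooling over non-overlapping 2 x 2 groups of one feature map follows (the group size and the non-overlapping choice are assumptions for illustration).

import numpy as np

def max_pool(feature_map, size=2):
    # Each output neuron reports the strongest response in its size x size group,
    # i.e., whether the feature is present there, but not exactly where.
    H, W = feature_map.shape
    H, W = H - H % size, W - W % size                # drop any ragged border
    x = feature_map[:H, :W].reshape(H // size, size, W // size, size)
    return x.max(axis=(1, 3))

fm = np.arange(16, dtype=float).reshape(4, 4)
print(max_pool(fm))      # 2 x 2 output; small shifts of the input barely change it
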
The fully connected layer is the common ANN layer, where every input neuron is connected to each layer neuron. While it can be used anywhere in the CNN architecture, it is generally the last layer [4], [36]. This is because the fully connected layer acts as a classifier by separating the input data space.
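
The sketch below shows this classifier role: the feature maps are flattened and fed to a fully connected layer whose weighted sums are turned into class scores by a softmax. The sizes and the softmax output are assumptions for illustration.

import numpy as np

def fully_connected(x, W, b):
    # Every input neuron is connected to every output neuron.
    return W @ x + b

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

features = np.random.rand(3, 5, 5).ravel()        # flattened stack of feature maps (75 values)
W = 0.01 * np.random.randn(10, features.size)     # 10 hypothetical classes
b = np.zeros(10)
probs = softmax(fully_connected(features, W, b))
print(probs.argmax(), probs.sum())                # predicted class; probabilities sum to 1
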
Learning Features by BP
The layers of a CNN define particular connection rules that constrain the neurons to consider only local information. The fully connected layer is not subject to such restrictions and is used to produce the final classification. Since CNNs are a particular type of ANN, the classification error can be backpropagated to learn the connection weights of each CNN layer. In particular, the convolution layer has weights that define the convolution kernel. As the network is trained, these are adapted to extract the optimal local features.
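
The toy example below, a sketch under simplifying assumptions (one linear convolution layer, a squared-error loss, and a hand-coded gradient instead of a full BP pass), illustrates the key point: because the kernel is shared, its gradient accumulates a contribution from every output position, and gradient descent drives the weights toward the local feature demanded by the data.

import numpy as np

def conv2d(x, k):
    # "Valid" convolution with a shared kernel (no flip, CNN convention).
    H, W = x.shape
    kh, kw = k.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
x = rng.random((6, 6))
target = conv2d(x, np.array([[1.0, -1.0], [0.0, 0.0]]))   # the "optimal" local feature: an edge
k = 0.1 * rng.standard_normal((2, 2))                     # randomly initialized kernel
for step in range(300):
    err = conv2d(x, k) - target                           # dLoss/dy for 0.5 * sum(err ** 2)
    grad = np.zeros_like(k)
    for i in range(err.shape[0]):                         # the shared kernel collects gradient
        for j in range(err.shape[1]):                     # from every output position
            grad += err[i, j] * x[i:i + 2, j:j + 2]
    k -= 0.02 * grad                                      # gradient-descent update
print(np.round(k, 2))                                     # close to the edge kernel above
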
What Are the Limits of CNNs?
CNNs are a special kind of ANN, so they suffer from similar problems, BP above all. Most importantly, CNNs demand a huge amount of labeled data for training. This boils down to weak generalization power when the number of training data is small and the number of free parameters is large, i.e., when the network is very deep. In such a case, the CNNs are prone to overfit; in other words, the network faces a case of overparameterization. In addition, as for ANNs, there is no fixed rule for defining a priori how many hidden layers and neurons should be employed to obtain optimal performance.
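
To see how quickly the free parameters pile up, consider a rough count for a small, entirely hypothetical CNN (all layer sizes below are made up for illustration).

# conv1: 32 kernels of 5 x 5 over 3 input maps, plus one bias per kernel
conv1 = 32 * (5 * 5 * 3 + 1)                #     2,432 weights
# conv2: 64 kernels of 5 x 5 over the 32 maps of the previous layer
conv2 = 64 * (5 * 5 * 32 + 1)               #    51,264 weights
# fully connected layer on 64 maps of 8 x 8, with 512 output neurons
fc = (64 * 8 * 8) * 512 + 512               # 2,097,664 weights
print(conv1 + conv2 + fc)                   # over two million parameters to fit from labeled data
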

[Figure 2 graphic: an input image grid, a 2 x 2 convolution mask with weights a, b, c, and d, and the resulting linearization into a two-layer network; see the caption below.]

Figure 2. An example of a convolution layer. The convolution of a two-dimensional signal with a kernel can be defined as a particular ANN with two layers. The input neurons consist of the signal values. The second-layer neurons are connected only with a subset of the input ones. The connection weights from the input to the second layer (i.e., a, b, c, and d) are the same for every neuron. Finally, the weighted input is processed by the function f, which is generally the activation function of an ANN.
