Systems, Man & Cybernetics - July 2015 - 20

Training
As in the standard NT training algorithms, the perceptron
tree training procedure generates a tree in a recursive
manner by partitioning the training set. The procedure
involves three steps: 1) training internal nodes, 2) determining leaf nodes, and 3) labeling them.
For a generic NT with heterogeneous in-node models
(e.g., splits, ANNs, and so on), the training procedure can
be written as shown in Algorithm 1. Starting from the root,
the following holds.
1) The TrainNode procedure is used to train the in-node model.
2) The training patterns are then classified by means of
the Classify procedure; hence, the training set is
partitioned.
3) If a partition satisfies the homogeneity condition, then a
leaf node is created and labeled with the corresponding
class. Under particular conditions, the in-node model
can be replaced by a more appropriate one (e.g., in the
perceptron tree, if the training patterns of a node are
not linearly separable, the trained perceptron is substituted by a split node).
4) If a partition is not homogeneous, a new child node is
added to the tree. The child node will be trained using
the patterns of the associated partition.
5) The training stops when all nodes become leaves.
The differences among the alternative NT schemes lie
in the TrainNode procedure and in the definition of homogeneity. The former decides which type of in-node model

Algorithm 1. NT Training.
Input: The training set X
Output: The trained NT model
Set Q_v ← {v_0} and Q_X ← {X};
while Q_v ≠ ∅ do
    v ← Pop(Q_v) and X′ ← Pop(Q_X);
    TrainNode(v, X′);
    (Q̃_v, Q̃_X) ← Classify(v, X′);
    while Q̃_X ≠ ∅ do
        if X* ← Pop(Q̃_X) is homogeneous then
            v* ← Pop(Q̃_v) is set to leaf;
        else
            Push(Q_v, Pop(Q̃_v));
            Push(Q_X, X*);
        end
    end
end
v_0 represents the root node, and X ∈ ℝ^(d×n) is the training set
consisting of n d-dimensional feature vectors. X′ ⊂ X is the
local training set (LTS) at a given node v. Q_v and Q_X are
the queues holding the nodes to train and the corresponding LTSs. TrainNode is the procedure used to train an ANN; it
may substitute the ANN node with a more appropriate classification scheme (e.g., a split node). Classify is the
procedure that, given a node v, classifies the related LTS
and produces the list of child nodes Q̃_v together with a list
of corresponding patterns Q̃_X. The homogeneity property of
a training set is a set of rules that define the correct classification of an LTS. Finally, Pop and Push are the usual queue-related procedures.
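The queue-based loop of Algorithm 1 can be sketched as runnable Python. Everything scheme-specific is an illustrative assumption here: the in-node model is a simple midpoint split on one feature (standing in for a trained perceptron), homogeneity is taken to mean a single-class LTS, and the Node class is ours, not from the original.

```python
from collections import deque

class Node:
    """A generic NT node: an in-node model plus children, or a leaf label."""
    def __init__(self):
        self.feature = None    # feature tested by the in-node model
        self.threshold = None  # split threshold (stand-in for a perceptron)
        self.children = []     # one child per partition of the LTS
        self.label = None      # class label once the node becomes a leaf

def is_homogeneous(lts):
    # Homogeneity test: every pattern in the LTS carries the same class.
    return len({y for _, y in lts}) <= 1

def train_node(node, lts):
    # TrainNode stand-in: split at the midpoint of the widest-ranging
    # feature. A real perceptron tree would train a perceptron here and
    # fall back to a split node if the LTS is not linearly separable.
    d = len(lts[0][0])
    node.feature = max(range(d), key=lambda j: max(x[j] for x, _ in lts)
                                             - min(x[j] for x, _ in lts))
    vals = [x[node.feature] for x, _ in lts]
    node.threshold = (min(vals) + max(vals)) / 2
    node.children = [Node(), Node()]

def classify(node, lts):
    # Classify stand-in: route each pattern to a child, partitioning the LTS.
    parts = [[], []]
    for x, y in lts:
        parts[x[node.feature] > node.threshold].append((x, y))
    return node.children, parts

def train_nt(X):
    # The queue-based training loop of Algorithm 1.
    root = Node()
    q_v, q_x = deque([root]), deque([X])
    while q_v:
        v, lts = q_v.popleft(), q_x.popleft()
        train_node(v, lts)
        children, parts = classify(v, lts)
        for child, part in zip(children, parts):
            if is_homogeneous(part):
                child.label = part[0][1] if part else None  # make it a leaf
            else:
                q_v.append(child)  # Push: train on its partition later
                q_x.append(part)
    return root
```

Training halts only when every partition becomes homogeneous, mirroring step 5; as the drawbacks listed later note, such convergence is not guaranteed for every scheme.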


has to be considered, while the latter defines the training
stop criteria.
Classification
The decision-making procedure in the NT is governed
by the Classify function. The class of a test pattern
is obtained by traversing the tree in a top-down fashion,
starting from the root node. At each node, the Classify
function classifies the pattern and, hence, selects the
children to which the pattern should be presented next.
The procedure stops when the pattern reaches a leaf node,
whose label gives the predicted class.
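The top-down traversal can be sketched in a few lines. The Node fields below (a threshold split as the in-node model, a label marking leaves) are illustrative assumptions, not the original's definitions.

```python
class Node:
    """Minimal NT node: a split-style in-node model, or a leaf label."""
    def __init__(self, feature=None, threshold=None, children=None, label=None):
        self.feature, self.threshold = feature, threshold
        self.children = children or []
        self.label = label

def predict(root, x):
    # Traverse top-down: at each internal node the in-node model selects
    # the child to visit next; stop at a leaf and return its class.
    node = root
    while node.label is None:
        node = node.children[x[node.feature] > node.threshold]
    return node.label
```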
What Are the Limits of NTs?
While NTs solve the problem of defining the ANN architecture a priori, they still have a number of drawbacks.
◆ There is a lack of control over the depth of the tree.
◆ Convergence is not guaranteed for all of the schemes.
◆ Training in-node models on small LTSs limits
generalization.
◆ Like ANNs, NTs work on features representing the
data.
In recent years, many studies have successfully addressed the aforementioned issues [33]. As with
ANNs, the success or failure of NTs for pattern classification still depends on the raw data representation. This representation
is usually built by means of hand-crafted feature-extraction algorithms tailored to the problem at hand. This
highlights the need for new learning architectures that can
induce discriminative, task-independent representations
directly from the raw data.
Learning Data Representations
To solve the raw data representation problem, in the last
few years, several different systems that are able to automatically learn proper data representations have been
investigated [34], [35]. However, much of the success of the
new deep architectures can be attributed to a relatively old
[36] but special kind of ANN that is able to capture signal
spatiotemporal dependences: the convolutional neural
network (CNN).
To make the introduction to CNNs more intuitive, in the
following sections, we use the example of images rather
than generic signals. Nevertheless, the discussion can be
generalized to any type of signal.
CNNs
In pattern-recognition problems, a vector is usually used
to represent a real signal. In the case of images, this vector
becomes a matrix. Within the matrix, there are groups
of pixels that share local (spatiotemporal) properties,
which CNNs are able to capture [36]. A CNN is a particular
type of ANN in which precise rules define the following
layer types: 1) a convolution layer, 2) a local contrast normalization (LCN) layer, 3) a pooling/subsampling
layer, and 4) a fully connected layer.
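Three of these four layer types can be illustrated with a minimal NumPy sketch. Two simplifications to flag: conv2d computes the unflipped cross-correlation commonly used in CNN implementations, and the LCN here normalizes over the whole feature map rather than a local Gaussian neighborhood.

```python
import numpy as np

def conv2d(img, kernel):
    # Convolution layer (valid mode): each output pixel is the sum of the
    # kernel times the local patch it covers.
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def local_contrast_norm(fmap, eps=1e-6):
    # LCN layer (simplified): subtract the mean response and divide by the
    # standard deviation, so responses are comparable across the map.
    return (fmap - fmap.mean()) / (fmap.std() + eps)

def max_pool(fmap, size=2):
    # Pooling/subsampling layer: keep only the strongest response in each
    # non-overlapping size x size block, shrinking the map.
    h, w = fmap.shape
    h, w = h - h % size, w - w % size
    blocks = fmap[:h, :w].reshape(h // size, size, w // size, size)
    return blocks.max(axis=(1, 3))
```

The fully connected layer is then an ordinary dense layer applied to the flattened pooled maps, as in a standard ANN.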


