IEEE Systems, Man and Cybernetics Magazine - January 2021 - 26

…applications. We address some advantages and disadvantages of these variants in relation to PCA. Utilizing the Modified National Institute of Standards and Technology (MNIST) digits and Fashion-MNIST image sets, we demonstrate application of CSA for image recognition and reconstruction compared to PCA. Finally, we mention how these PCA variants fit into a more general framework using tensors.

PCA in Image Applications

PCA has been successful in various image compression and recognition applications since the well-known work of Sirovich and Kirby [1] in 1987 and Turk and Pentland [2] in 1991. In the context of grayscale images with m rows and n columns of pixels, each image is considered a point in mn-dimensional space, expressed as an mn × 1 column vector. It was observed in [1] that sample face images lie in a much lower-dimensional subspace, and PCA was proposed as a means to find that subspace. PCA finds a new basis in the image space for which the new variables corresponding to the new basis vectors are uncorrelated, and most of the variance of the sample images is embodied in a relatively small number of these new variables. By retaining only the new variables embodying most of the variance, significant dimensionality reduction can result. Sample images, as well as new unknown images, can then be expressed by a relatively small set of coordinates, which are simply the projections of the images onto the new basis vectors. Thus, recognizing a new image involves determining which known image (or image class average, as in [2]) is closest to the unknown image's representation in the reduced-dimensional space ("face" space or, more generally, feature space); the category (class) or identity associated with the smallest distance is assigned to the unknown image (in [2], if the distance exceeds specified limits, the image may be considered a new face or not a face). For face image compression, it was noted in [1] and [2] that the original face images could be reconstructed as linear combinations of the new basis vectors (or eigenfaces) found by PCA. The reconstruction is more accurate as more eigenfaces are included, with perfect reconstruction using all eigenfaces.

In contrast to the conventional PCA approach in [1] and [2], where each image is treated as an mn × 1 column vector, the PCA variants we consider here preserve the original matrix form of the grayscale images. Depending on the PCA variant, the focus is on the rows [3], [4] (row 2DPCA), the columns [4] (column 2DPCA), or both the rows and columns [4] (B2DPCA) of the grayscale image matrix, rather than on the individual image pixels comprising the vectorized images, as in conventional PCA. This is advantageous, as it reduces or eliminates the impact of the so-called small sample size problem and the curse of dimensionality [3]-[6]. However, the 2DPCA variants produce a feature matrix rather than a feature vector, as in conventional PCA. Often, the 2DPCA feature matrix has more entries (or coefficients) than the PCA feature vector. B2DPCA, on the other hand, generally produces a much smaller feature matrix than 2DPCA, and the number of coefficients is often comparable to the number of coefficients in the PCA feature vector. For 2DPCA and B2DPCA, dimensionality reduction decreases the feature matrix size. Similar to PCA, a distance measure in the feature space defines the best match to an unknown image. Image reconstruction and compression with B2DPCA are also addressed in [4]-[6].

Numerous modifications of the original 2DPCA and B2DPCA algorithms have appeared. Examples include the use of the L1-norm (instead of the L2-norm) to reduce outlier sensitivity [7], the use of Haar-like functions that efficiently approximate eigenvectors in so-called binary 2DPCA [8], and extensions of the basic algorithm with techniques such as linear discriminant analysis (LDA), for example, B2DPCA + LDA [9].

"Unlike conventional PCA, the variants 2DPCA, B2DPCA, and CSA preserve the original image structure."

CSA, as outlined in [5], a special case of concurrent subspace analysis [10], is essentially an iterated version of B2DPCA. In the "Experiments: CSA and PCA" section of this article, we demonstrate end-to-end application of CSA utilizing two well-known image datasets: the MNIST handwritten digits [24] and Fashion-MNIST [25]. These datasets are often used for benchmarking machine learning image recognition algorithms. We give an overview of each step of CSA, including the mathematical background, the CSA algorithm, and the procedure for classifying unknown images. We show the impact on image classification accuracy and on image reconstruction as the feature matrix size varies with different levels of dimensionality reduction, as compared to conventional PCA.

Although our focus is on methods applicable directly to matrices, we briefly mention a broader perspective. The PCA variants 2DPCA, B2DPCA, and CSA, as applied to matrices (tensors of order 2), are special cases of more general methods that operate directly on data having the form of tensors, or multiway data arrays, of order 3 and higher. A hyperspectral image is a well-known example of a third-order tensor: a cube-like array with m rows and n columns of pixels and a third mode representing spectral bands. These methods, as applied to tensors (including matrices), are often called multilinear techniques, and they provide a lower-rank approximation to the original tensor data; we note that for tensors of order 3 and higher, the concept of rank is not as straightforward as for matrices [11]. Example methods include multilinear PCA (MPCA) [12], higher-order orthogonal iteration
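The project-then-compare-distances recipe of conventional PCA can be sketched in a few lines of NumPy. This is a minimal illustration of the vectorized-image pipeline described above (mean-center, find principal directions, project, assign the class of the nearest projected class average, as in [2]), not the article's exact implementation; the function names are ours.

```python
import numpy as np

def pca_basis(X, k):
    """Top-k principal directions of vectorized images.
    X: (num_images, m*n) array, one vectorized image per row.
    Returns the mean image and a (k, m*n) basis matrix."""
    mu = X.mean(axis=0)
    # Rows of Vt from the SVD of the centered data are the eigenvectors
    # of the sample covariance matrix, ordered by decreasing variance.
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def project(x, mu, B):
    """Coordinates of image x in the k-dimensional feature space."""
    return B @ (x - mu)

def classify(x, mu, B, class_means):
    """Assign the class whose projected class-average image is nearest."""
    y = project(x, mu, B)
    dists = {c: np.linalg.norm(y - project(m, mu, B))
             for c, m in class_means.items()}
    return min(dists, key=dists.get)
```

Reconstruction works the same way in reverse: an image retained as k coordinates is approximated by `mu + B.T @ y`, a linear combination of the k basis vectors (eigenfaces), and becomes exact when all basis vectors are kept.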

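The two-sided projection behind B2DPCA and CSA can likewise be sketched. The alternating scheme below is a minimal reading of "iterated B2DPCA": it assumes the images are already mean-centered, uses an identity initialization of our choosing, and carries our own function names; it is a sketch of the general idea, not the authors' reference algorithm.

```python
import numpy as np

def csa(images, d1, d2, iters=5):
    """Alternating two-sided projection: find column-orthonormal
    U (m x d1) and V (n x d2) so that Y = U.T @ X @ V is a small
    d1 x d2 feature matrix for each (mean-centered) m x n image X."""
    m, n = images[0].shape
    V = np.eye(n, d2)  # simple initialization (an assumption)
    for _ in range(iters):
        # Fix V; take U as the top-d1 eigenvectors of the row scatter.
        S = sum(X @ V @ V.T @ X.T for X in images)
        _, U = np.linalg.eigh(S)          # eigenvalues ascending
        U = U[:, ::-1][:, :d1]            # keep the largest d1
        # Fix U; take V as the top-d2 eigenvectors of the column scatter.
        S = sum(X.T @ U @ U.T @ X for X in images)
        _, V = np.linalg.eigh(S)
        V = V[:, ::-1][:, :d2]
    return U, V

def features(X, U, V):
    return U.T @ X @ V        # d1 x d2 feature matrix

def reconstruct(Y, U, V):
    return U @ Y @ V.T        # approximate m x n image
```

Running only the U step once (with V the identity) recovers a one-sided 2DPCA-style projection; iterating both steps couples the row and column subspaces, which is what makes the d1 x d2 feature matrix so much smaller than the 2DPCA feature matrix at comparable reconstruction quality.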