IEEE Systems, Man and Cybernetics Magazine - January 2020 - 18

analyzed in the cloud, the artists' emotion is conveyed to
the artwork by adjusting its lines and colors.
Artwork Data Sets
The artwork data sets include a personal, historical works data set for each artist. Aside from the content features, brain-wave data, and multimodal emotion data from the brain-wearable devices and the wearable-clothing emotional robot, which serve as the data sources for the machines' artistic creations, an original, large-scale artwork data set can also be input into the system. The data sets are classified according to the artists' different creation styles. After the motor imagery model classifies the style features of the EEG data, the system matches them to the style features of the historical works data set. In this way, the styles of the content features can be transferred to create artworks with specific styles and contents. The algorithm learns from and is trained on the combination of the historical data set, EEG data, and emotion data, so the system can generate paintings with specific emotional themes and styles. Because this is a digital game, it can be used to present AI creativity. The artwork data sets integrate the works of famous artists, forming a rich database. The system can also recommend style features for ordinary users, matching the style of the user's EEG motor imagery to extract the style features.
Intelligent Decision Fusion and
Creative Game Production
After obtaining the EEG data, the content-feature data of the real-time artistic creation, and the multimodal emotion data, it is necessary to rapidly transmit them to the cloud for intelligent decision fusion. First, the cloud uses a motor imagery algorithm to classify the style features of the EEG data and analyze the style desired by the artist. In this article, the style features are divided into four classes: oil paintings, traditional Chinese paintings, sketches, and cartoons. They are then matched with the style features of the historical artworks, which were previously uploaded to the cloud. After the EEG data style is matched to that of the corresponding historical data set, the system confirms the actual style of the artistic creation. The content features are determined by the work draft created by the artist in real time.
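As an illustration of this matching step, the predicted style class can simply index into the historical works collections. The class names and data layout below are hypothetical stand-ins, not the system's actual schema:

```python
# Hypothetical mapping from the motor imagery model's predicted
# class index to the historical works data set for that style.
STYLE_CLASSES = ["oil_painting", "traditional_chinese", "sketch", "cartoon"]

def match_style(class_id: int, historical_sets: dict) -> list:
    """Return the historical works whose style matches the predicted class."""
    return historical_sets[STYLE_CLASSES[class_id]]

# Toy historical data set: three placeholder works per style.
sets = {s: [f"{s}_{i}" for i in range(3)] for s in STYLE_CLASSES}
assert match_style(2, sets) == ["sketch_0", "sketch_1", "sketch_2"]
```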
To form an artwork with specific contents and styles, the Visual Geometry Group (VGG)-19 network is deployed in the cloud to extract and rebuild content and style features. In addition, the AI algorithm in the cloud includes an attention-based recurrent neural network (RNN) for recognizing and analyzing emotion data. The RNN can effectively memorize the relevant feature information for a specific context. By introducing an attention mechanism into the RNN framework, a new weight-pooling strategy can be implemented in the network to emphasize the parts of the voice signal that carry intense emotional features.
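The weight-pooling idea can be sketched as follows: frame-level features from the RNN are scored against a learned attention query, and the softmax-normalized scores weight the frames so that emotionally intense segments dominate the utterance-level vector. The features and query here are random stand-ins, not the paper's trained parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

def attention_pool(H, u):
    """Attention-style weight pooling over a sequence of hidden states.

    H: (T, d) frame-level features; u: (d,) attention query vector.
    Frames whose features align with u receive larger weights, which
    is how intense emotional segments of the voice are emphasized.
    """
    scores = H @ u                        # (T,) alignment scores
    a = np.exp(scores - scores.max())     # numerically stable softmax
    a /= a.sum()                          # attention weights sum to 1
    return a @ H, a                       # weighted sum = utterance vector

T, d = 50, 16
H = rng.standard_normal((T, d))           # stand-in RNN hidden states
u = rng.standard_normal(d)                # stand-in learned query
utt, weights = attention_pool(H, u)
assert utt.shape == (d,)
assert np.isclose(weights.sum(), 1.0)
```

In the real network the query `u` is learned jointly with the RNN, so the pooling focuses on whatever frames are most predictive of the emotion label.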

After the artists' emotion is recognized in the cloud, the artworks' contents and style are rectified: changes to the lines and colors are used to express the artists' state of mind while creating the works.
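A minimal sketch of such emotion-driven rectification, assuming a valence-arousal emotion representation; the mapping and coefficients are illustrative assumptions, not the system's actual rule:

```python
import numpy as np

def emotion_rectify(rgb, valence, arousal):
    """Hypothetical color rectification: shift an RGB image toward
    warmer, brighter tones for positive, high-arousal emotion and
    cooler, darker tones otherwise. valence, arousal are in [-1, 1]."""
    img = rgb.astype(float)
    img *= 1.0 + 0.2 * arousal             # arousal scales brightness
    img[..., 0] *= 1.0 + 0.15 * valence    # positive valence warms (more red)
    img[..., 2] *= 1.0 - 0.15 * valence    # ... and cools less (less blue)
    return np.clip(img, 0, 255).astype(np.uint8)

# A flat gray image shifts warm under positive emotion.
img = np.full((4, 4, 3), 128, dtype=np.uint8)
warm = emotion_rectify(img, valence=1.0, arousal=0.5)
assert warm[..., 0].mean() > warm[..., 2].mean()
```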
The Algorithm Model
The CreativeBioMan system's creativity is determined by the performance of the AI algorithm deployed in the cloud. As described in this article, a motor imagery model is used to classify the styles of the EEG data. A VGG-19 network is then used to rebuild style and contents to create a new artwork. An attention-based RNN is employed to analyze and recognize the emotion data, raising the accuracy of emotion recognition. The system rectifies the colors and lines of the artworks according to the emotion-recognition result, fusing the emotion into the works. The overall algorithm procedure is depicted in Figure 2.
EEG Data Processing and Motor Imagery Model
The common spatial pattern (CSP) method is frequently used in brain-computer interface (BCI) research based on EEG data. The data set is required to be labeled, with the class of each experiment known. For the task of brain-signal classification, the data collected in a single experiment form a matrix of size N × P, denoted E_i, where N is the number of signal-collection channels, P is the number of samples per channel, and i signifies the ith class. If there are M experiments in the ith class, then there are M matrixes E_i. Unlike the traditional mean-normalization method for obtaining the class covariance, the M matrixes E_i of the same class are concatenated along the sample direction to obtain the entire EEG signal data T_i of the ith class, of size N × (M · P). The corresponding spatial covariance is then obtained from the matrix T_i of the ith class as
    C_i = T_i T_i^T / tr(T_i T_i^T).  (1)
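Equation (1) can be computed directly in a few lines; the trial counts and dimensions below are arbitrary toy values:

```python
import numpy as np

rng = np.random.default_rng(7)
N, M, P = 8, 5, 200   # channels, trials per class, samples per trial

# Concatenate the M trials of class i along the sample axis: T_i is N x (M*P).
Ti = np.hstack([rng.standard_normal((N, P)) for _ in range(M)])
assert Ti.shape == (N, M * P)

# Normalized spatial covariance per (1): C_i = T_i T_i^T / tr(T_i T_i^T).
Ci = Ti @ Ti.T
Ci /= np.trace(Ci)

assert Ci.shape == (N, N)
assert np.isclose(np.trace(Ci), 1.0)   # normalization makes the trace 1
```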

For i ∈ {1, 2}, the CSP method is used to obtain the space-filtering matrix W of the two classes, which satisfies (2) and (3):

    W^T C_1 W = K_1,  (2)

    W^T C_2 W = K_2.  (3)
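One standard way to obtain a W that simultaneously diagonalizes both class covariances, as in (2) and (3), is to whiten the composite covariance C_1 + C_2 and then diagonalize the whitened C_1. This is a textbook CSP construction, sketched here on random toy data rather than real EEG:

```python
import numpy as np

rng = np.random.default_rng(42)

def spatial_cov(T):
    """Normalized spatial covariance per (1): C = T T^T / tr(T T^T)."""
    C = T @ T.T
    return C / np.trace(C)

def csp_filters(C1, C2):
    """Spatial filter W with W^T C1 W and W^T C2 W both diagonal."""
    vals, vecs = np.linalg.eigh(C1 + C2)
    Pw = vecs @ np.diag(vals ** -0.5) @ vecs.T   # whitening transform
    d, B = np.linalg.eigh(Pw @ C1 @ Pw.T)        # diagonalize whitened C1
    return Pw.T @ B                              # columns are spatial filters

# Toy two-class data: N channels, concatenated trials of length L.
N, L = 8, 1000
T1 = rng.standard_normal((N, L))
T2 = rng.standard_normal((N, L)) * np.linspace(0.5, 2.0, N)[:, None]
C1, C2 = spatial_cov(T1), spatial_cov(T2)
W = csp_filters(C1, C2)

K1 = W.T @ C1 @ W
K2 = W.T @ C2 @ W
# Both projected covariances are diagonal, and K1 + K2 is the identity,
# so a filter that maximizes variance for one class minimizes it for the other.
assert np.allclose(K1 - np.diag(np.diag(K1)), 0, atol=1e-8)
assert np.allclose(K1 + K2, np.eye(N), atol=1e-8)
```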

The artwork style in this article is a four-class problem. A one-versus-one strategy is chosen to build the CSP for multiple classes. The four classes are combined pairwise, and six space-filtering matrixes W are obtained. The best six column vectors are chosen from each space-filtering matrix, with each column vector serving as a filter. There are 36 (i.e., 6 × 6) column vectors in total; therefore, the 36 × N mixed space-filtering matrix
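The pairwise combination bookkeeping can be sketched as follows. Random orthogonal matrices stand in for the six pairwise CSP filter matrices, and the column-selection heuristic (three from each end of the eigenvalue spectrum, a common CSP choice) is an assumption rather than the paper's stated rule:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
N, n_classes = 8, 4

pairs = list(itertools.combinations(range(n_classes), 2))
assert len(pairs) == 6   # one two-class CSP problem per class pair

rows = []
for _ in pairs:
    # Stand-in for the pairwise CSP filter matrix W (N x N); in the real
    # system this comes from the two-class CSP satisfying (2) and (3).
    W = np.linalg.qr(rng.standard_normal((N, N)))[0]
    # Keep the six most discriminative columns: three from each end of
    # the eigenvalue spectrum (a common CSP selection heuristic).
    best = np.hstack([W[:, :3], W[:, -3:]])   # N x 6
    rows.append(best.T)                       # each column becomes a filter row

# Stack the 6 x 6 = 36 selected filters into the mixed filtering matrix.
W_mixed = np.vstack(rows)
assert W_mixed.shape == (36, N)
```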


