
"We are offering an emotion mapper/reader/model that is essentially similar to a lie detector."

…result, we were able to learn and predict in 74.58% of the cases the emotional perception of specific images by an individual within a given session. The proposed approach contributes to a better understanding of the nature of perception and learning. It also offers opportunities to build tools and devices that can be useful in advertising, medicine, and entertainment, and for predicting the reactions of professionals such as firefighters and police to specific situations.

The Concept of the Proposed Method
We start with the observation that people's perceptions and learning are neither clearly probabilistic nor deterministic. A simple illustration of this is the emotions that arise if we consider a memorable scene we may have seen just once in our life, or think of the effort it will take to learn a foreign language. Learning is clearly linked with emotions and associations and is not simply a matter of repeating something n times. Associations in learning have been studied earlier from a psychological point of view, e.g., in [10]. In computational intelligence, fuzzy set theory [5] was proposed to mathematically represent subjective preferences and claims in dealing with perception. However, traditional approaches require a priori defined or assumed types of membership functions (in the case of fuzzy set theory [5]) or probability density functions (pdfs, in the probabilistic approach [13]). In addition, to build a credible model, the probabilistic approach requires a large enough (or, theoretically, infinite) amount of data, as well as assumptions about their independence and orthogonality.
In this article, we propose building a macrolevel cybernetic model of the mapping of the sensory input to the emotions represented by the observable PVs. We consider this (Figure 1, phase A) as a step toward a better understanding of the learning process and as a (subconscious) part of the more complex problem of human learning.
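To make the input/output roles of this mapping concrete, here is a minimal Python sketch of phase A viewed as a black box: a stimulus feature vector goes in and an estimate of the observable PVs comes out. The class name, the method names, and the nearest-neighbour lookup used as the "memory" are our own illustrative assumptions, not the model developed in this article.

```python
import numpy as np

class EmotionMapper:
    """Illustrative phase-A interface: stimulus feature vector -> observable PVs.

    A nearest-neighbour 'memory' stands in for the article's actual model,
    purely to make the input/output roles concrete.
    """

    def __init__(self):
        self.inputs = []     # stimulus feature vectors u_k seen so far
        self.responses = []  # corresponding observable-PV vectors y_{k+Δ_E}

    def observe(self, u_k, y_k):
        """Store one (stimulus, delayed PV response) pair."""
        self.inputs.append(np.asarray(u_k, dtype=float))
        self.responses.append(np.asarray(y_k, dtype=float))

    def predict(self, u_new):
        """Return the PV response of the closest previously seen stimulus."""
        u_new = np.asarray(u_new, dtype=float)
        distances = [np.linalg.norm(u_new - u) for u in self.inputs]
        return self.responses[int(np.argmin(distances))]

# Toy usage: two remembered stimuli (features are made up), then a query.
mapper = EmotionMapper()
mapper.observe([0.2, 0.9], [72.0, 2.1])   # y = [bpm, µS]
mapper.observe([0.8, 0.1], [85.0, 2.6])
print(mapper.predict([0.75, 0.2]))        # -> [85.  2.6]
```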
We will focus on phase A and consider only visual and audio inputs u_k (without limiting the generality of the concept) and the observable PVs of heart rate (bpm) and skin conductance (µS). In this study, instead of real visual similarity between images or audio similarity between musical selections, we have a two-dimensional (2-D) vector, y_{k+Δ_E} = [bpm  µS]_{k+Δ_E}, where Δ_E denotes the delay in expressing emotions in response to a sensory (in this case, visual) input. (Usually, this delay is reported to be less than 10 s; obviously, it is person-specific and differs for different types of sensory inputs.) Δ_A denotes the delay in (re)actions, which may be longer or may never take place and is not a subject of this article. The cognitive feedback (the dotted line in Figure 1) exists in real life in most cases when we are awake and conscious, but we will ignore it here and consider only the steady state. (In reality, there are also transients related to the fact that we continuously receive different sensory inputs.)
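As a purely illustrative reading of these definitions, the Python sketch below pairs each stimulus onset u_k with the physiological samples taken Δ_E seconds later, producing the 2-D vectors y_{k+Δ_E} = [bpm  µS]. The synthetic signals, the fixed 5-s delay, and the nearest-sample lookup are our assumptions, not data or code from the study.

```python
import numpy as np

# Illustrative stand-ins (not the study's data): timestamps in seconds.
stimulus_times = np.array([10.0, 40.0, 70.0])        # onset of each image/sound u_k
pv_times = np.arange(0.0, 100.0, 0.5)                 # sampling grid of the physiological sensors
heart_rate = 70 + 5 * np.sin(pv_times / 7.0)          # bpm (synthetic)
skin_conduct = 2.0 + 0.1 * np.cos(pv_times / 11.0)    # µS (synthetic)

DELTA_E = 5.0  # assumed emotional-response delay in seconds (article: typically < 10 s, person-specific)

def pv_vector_at(t):
    """Return the 2-D observable-PV vector [bpm, µS] at the sample closest to time t."""
    i = int(np.argmin(np.abs(pv_times - t)))
    return np.array([heart_rate[i], skin_conduct[i]])

# Pair every stimulus u_k with its delayed response y_{k+Δ_E}.
pairs = [(k, pv_vector_at(t_k + DELTA_E)) for k, t_k in enumerate(stimulus_times)]
for k, y in pairs:
    print(f"u_{k}: y = [bpm={y[0]:.1f}, µS={y[1]:.2f}]")
```

Because the article notes that Δ_E is person-specific, a fixed value like the one above would in practice have to be tuned per individual.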
We argue that everything else is learned through experience and propose a method to mathematically model this process within the framework of systems theory and cybernetics. To the best of our knowledge, this is the first such comprehensive model within a cybernetics frame. We also argue that human learning is always linked with emotions and cognitive feedback, and thus there is a disposition (positive or negative) to learning. However, this learning is very often of the unsupervised type (especially in adults), in the broader sense that no external teacher or external feedback or stimulus may be present, and still we can continue to learn autonomously and to dynamically evolve our own understanding of the world [14]. This hypothesis has strong links with the ARTSCENE concept by Grossberg and Huang in regard to threshold and arousal when new information is presented [11].

We consider a layered hierarchical structure of making sense of sensory inputs and learning from them, as depicted in Figure 2. In particular, we consider a three-layer structure (L = 3), where layer 1 corresponds to forming a relatively small number of concepts with clear semantic meaning, aggregating and integrating the raw multimodal data streams (visual, auditory, olfactory, and so forth). The process of forming concepts based on clouds of data with similar properties is not considered in detail in this article, but it can be automated, as described in [15]; a minimal illustrative sketch is given after Figure 1. This process can be done with cognitive feedback (the dotted line in Figure 1) or, in some cases, subconsciously, in a fully unsupervised manner, when we associate some…

Figure 1. A systems theory-based view of the mapping of sensory inputs to emotions (and further to actions). (The figure depicts the sensory inputs — visual, auditory, tactile, olfactory, and gustatory — entering as u_k; phase A, subconscious: mapping the sensory inputs to emotions expressed by observable PVs, PV_{k+Δ_E} / y_{k+Δ_E}; phase B, conscious: decision making and (re)action Z_{k+Δ_A}; goals, external and internal; and a dotted cognitive-feedback path.)
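As noted above, the sketch below illustrates one simple way that "forming concepts based on clouds of data with similar properties" could be automated: a distance-threshold grouping of feature vectors in Python. It is only an assumed stand-in, not the procedure of [15]; the function name, the radius, and the toy data are ours.

```python
import numpy as np

def form_concept_clouds(samples, radius=1.0):
    """Group feature vectors into 'clouds': a sample joins the nearest existing
    cloud if that cloud's centre is within `radius`, otherwise it starts a new
    cloud. Returns a list of (centre, members) pairs."""
    clouds = []  # each cloud: [centre (np.ndarray), list of member vectors]
    for x in samples:
        x = np.asarray(x, dtype=float)
        if clouds:
            dists = [np.linalg.norm(x - c[0]) for c in clouds]
            j = int(np.argmin(dists))
            if dists[j] <= radius:
                clouds[j][1].append(x)
                clouds[j][0] = np.mean(clouds[j][1], axis=0)  # update centre
                continue
        clouds.append([x, [x]])
    return [(c[0], c[1]) for c in clouds]

# Toy multimodal feature vectors (assumed, not from the article).
data = [[0.1, 0.2], [0.15, 0.25], [3.0, 3.1], [3.2, 2.9], [0.05, 0.3]]
for centre, members in form_concept_clouds(data, radius=0.5):
    print(f"concept centre {np.round(centre, 2)} with {len(members)} member(s)")
```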