
The proposed system achieved >89% accuracy for activity recognition and >98% for activity level recognition. These results demonstrate that, with our designed smart cushion, the activity levels of seated users can be monitored to encourage physical exercise when necessary, especially for wheelchair users, who suffer from prolonged exposure to high pressure on the buttocks, which may cause pressure ulcers [19].
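As an informal illustration, the recognized activities can be grouped into the activity levels summarized in Table 5 with a simple lookup. The following Python sketch assumes hypothetical activity labels output by the activity classifier; the label strings are illustrative placeholders, not the actual class names used in our system.

```python
# Minimal sketch: map recognized seated activities to the activity
# levels of Table 5. The activity label strings are hypothetical
# placeholders for whatever classes the activity classifier outputs.
ACTIVITY_LEVEL = {
    "reading": "light",
    "desk_work": "light",
    "conversation": "light",
    "swing_left_right": "moderate",
    "swing_front_back": "moderate",
    "upper_body_exercise": "vigorous",
}

def activity_level(activity: str) -> str:
    """Return the Table 5 activity level for a recognized activity."""
    return ACTIVITY_LEVEL.get(activity, "unknown")

if __name__ == "__main__":
    print(activity_level("desk_work"))         # light
    print(activity_level("swing_left_right"))  # moderate
```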
Body Language Expression Recognition
The recognition of seated postures and activities not only supports health-care applications (such as the ones discussed previously) but also enables completely different scenarios. The automatic recognition of postures and body movements can support the analysis of kinesics (often referred to as body language). In our research [50], we designed a method to recognize emotion-relevant activities based on the analysis of seated actions. We were specifically interested in basic daily-life seated activities (in terms of postures combined with gestures) that carry emotional valence.
The processing workflow, depicted in Figure 6, is composed of the following three main tasks:
1) posture and gesture synthesis
2) data synchronization and normalization, in which the posture and gesture data are synchronized and normalized to generalize the feature sets for emotion-relevant activity recognition
3) feature fusion and classification (steps 2 and 3 are sketched after this list).
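As an informal illustration of steps 2 and 3, the following Python sketch resamples the posture features (from the pressure sensors) and the gesture features (from the accelerometer) onto a common time base, z-score normalizes them, and fuses them by concatenation before classification. The sampling rates, feature dimensions, and the concatenation-based fusion are illustrative assumptions for this sketch, not the exact design of our system.

```python
import numpy as np

def synchronize(posture, t_posture, gesture, t_gesture, t_common):
    """Resample both feature streams onto a common time base (step 2).

    posture: (Np, Dp) posture features sampled at times t_posture
    gesture: (Ng, Dg) gesture features sampled at times t_gesture
    t_common: 1D array of target timestamps
    """
    def resample(x, t_src, t_dst):
        # Linear interpolation, one feature dimension at a time.
        return np.stack([np.interp(t_dst, t_src, x[:, d])
                         for d in range(x.shape[1])], axis=1)
    return resample(posture, t_posture, t_common), resample(gesture, t_gesture, t_common)

def normalize(x):
    """Z-score normalization so feature sets generalize across users (step 2)."""
    return (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-8)

def fuse(posture, gesture):
    """Feature fusion by concatenation before classification (step 3)."""
    return np.concatenate([posture, gesture], axis=1)

# Synthetic example: pressure features at 10 Hz, accelerometer at 50 Hz
# (illustrative rates and dimensions only).
t_p = np.arange(0, 5, 0.1)
t_g = np.arange(0, 5, 0.02)
posture = np.random.rand(len(t_p), 4)
gesture = np.random.rand(len(t_g), 3)
t_common = np.arange(0, 5, 0.1)

p_sync, g_sync = synchronize(posture, t_p, gesture, t_g, t_common)
fused = fuse(normalize(p_sync), normalize(g_sync))
print(fused.shape)  # (50, 7) fused feature vectors
```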
Our approach employs a hierarchical classification: first, postures and gestures are classified independently, and then they are organized into sequences of vectors, which are, in turn, classified using HMMs.

Table 5. The activity levels and representative activity examples.

Activity Level     | Examples of Activities
Light intensity    | Reading, desk work, and conversation
Moderate intensity | Swing left-right or front-back
Vigorous intensity | Upper-body exercises

HMMs effectively model sequences of states and find wide application in pattern recognition. In our method, each type of activity is modeled by a dedicated trained HMM; therefore, when a new discrete observation sequence is available, the appropriate HMM is selected through maximum likelihood. The results obtained from the recognition of four different body language expressions (interest, frustration, sadness, and happiness) are promising, with an average accuracy of 91.8%.
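To make the selection step concrete, the following Python sketch evaluates a discrete observation sequence against a set of pre-trained HMMs (one per activity) with the standard scaled forward algorithm and returns the activity whose model yields the maximum likelihood. The model parameters and activity names shown are illustrative placeholders, not the trained parameters of our system.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM
    (forward algorithm with scaling to avoid numerical underflow).

    obs: sequence of observation symbol indices
    pi:  (N,) initial state probabilities
    A:   (N, N) state transition matrix
    B:   (N, M) emission matrix (state x symbol)
    """
    alpha = pi * B[:, obs[0]]
    scale = alpha.sum()
    log_lik = np.log(scale)
    alpha = alpha / scale
    for t in range(1, len(obs)):
        alpha = (alpha @ A) * B[:, obs[t]]
        scale = alpha.sum()
        log_lik += np.log(scale)
        alpha = alpha / scale
    return log_lik

def classify(obs, models):
    """Pick the activity whose dedicated HMM maximizes the likelihood of obs."""
    return max(models, key=lambda name: forward_log_likelihood(obs, *models[name]))

# Illustrative two-state, three-symbol models for two activities
# (placeholders standing in for the trained per-activity HMMs).
models = {
    "interest": (np.array([0.6, 0.4]),
                 np.array([[0.7, 0.3], [0.4, 0.6]]),
                 np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])),
    "frustration": (np.array([0.5, 0.5]),
                    np.array([[0.6, 0.4], [0.5, 0.5]]),
                    np.array([[0.2, 0.2, 0.6], [0.6, 0.3, 0.1]])),
}

print(classify([0, 1, 1, 2, 0], models))
```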
Conclusion and Future Directions
In this article, we reviewed methods, technologies, and applications of an emerging smart object known as a smart cushion. We showed its great application potential, as it is well suited to monitoring several parameters of people performing tasks that require them to be seated. The article broadly covered the current state of the art, with a focus on our previous related research studies and results. We presented a diverse range of applications enabled by our designed smart cushion, spanning from medical scenarios to automotive setups. Although there is consolidated research on activity recognition based on the smart cushion, several open research challenges remain. The following research directions and challenges are the most interesting:
◆ Smart cushion design: New types of materials should
be applied to improve the smart cushion design. To
make the recognition of activities more precise,
more sensing parameters should be added, e.g., from
biological signals such as electrocardiograms and
electromyograms.
◆ Big data issues: Human daily life activities will generate a huge volume of data, so it is critical to effectively
exploit cloud technology [51], [52] and big data analytics [53] to develop more user-centric applications.

Figure 5. The activity level assessment workflow. AAI: activity assessment index.


Figure 6. The emotion-relevant activity processing workflow.


