
+90°. We asked four subjects to follow this cue with the
developed TMI in real time. From the experiments where
90 random cues were given to each of the four subjects, the
mean absolute error between the cued and the input angles
was 18.9°, which indicates the detection accuracy of the
interface. As a practical application, we applied the TMI to a
wheelchair-control task that enabled subjects to turn a
wheelchair in the intended direction. The developed
interface could precisely distinguish between different
tongue directions. This fine controllability enables the
wheelchair to be maneuvered effortlessly, in a manner
similar to driving with a steering wheel.
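
As a rough illustration of how such a detection-accuracy figure is obtained, the sketch below computes the mean absolute error between cued and detected tongue angles. The NumPy-based helper, the variable names, and the synthetic detector output are illustrative assumptions, not the authors' evaluation code.

```python
import numpy as np

def mean_absolute_error_deg(cued_angles, detected_angles):
    """Mean absolute error (in degrees) between cued and detected tongue angles."""
    cued = np.asarray(cued_angles, dtype=float)
    detected = np.asarray(detected_angles, dtype=float)
    return float(np.mean(np.abs(cued - detected)))

# Illustrative usage: 90 random cues in [-90°, +90°] for one subject,
# scored against hypothetical detector output (here, noisy copies of the cues).
rng = np.random.default_rng(0)
cues = rng.uniform(-90.0, 90.0, size=90)
detected = cues + rng.normal(0.0, 20.0, size=90)  # placeholder, not real data
print(f"MAE: {mean_absolute_error_deg(cues, detected):.1f} deg")
```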
From this applied research, we identified unique
advantages of the developed interface. First, the interface
detects the tongue's position from sensors located outside
the mouth, so it is more comfortable and hygienic than
existing TMIs [11]-[13] that require the insertion of sensors
or magnets inside the mouth. Second, the interface enables
continuous analog manipulation, unlike conventional
interfaces that generate only discrete commands.
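
To make the notion of continuous analog manipulation concrete, the following sketch maps a continuously detected tongue angle onto differential wheel speeds, the way a steering wheel would. The mapping, the gains, and the function name are hypothetical and are not drawn from the authors' wheelchair controller.

```python
def angle_to_wheel_speeds(tongue_angle_deg, base_speed=0.5, max_angle=90.0):
    """Map a tongue angle in [-90°, +90°] to (left, right) wheel speeds.

    0° drives straight; a positive angle speeds up the left wheel and slows
    the right wheel, turning the chair to the right, and vice versa.
    """
    steering = max(-1.0, min(1.0, tongue_angle_deg / max_angle))  # clamp to [-1, 1]
    left = base_speed * (1.0 + steering)
    right = base_speed * (1.0 - steering)
    return left, right

# A +45° tongue angle yields a gentle right turn; 0° drives straight.
print(angle_to_wheel_speeds(45.0))  # (0.75, 0.25)
print(angle_to_wheel_speeds(0.0))   # (0.5, 0.5)
```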
GKP Patterns for Language-Related
Tongue Movements
In the previous section, we described the GKP
response for horizontal tongue movements. The next
step would be to describe the GKP response for vertical or frontal movements. Before explaining these
responses, it is necessary to note that vertical and frontal tongue movements are closely related
to vocalization tasks. The tongue is one of the major
articulatory organs and is involved in the articulation of various consonants. For example, dental consonants, such as /θ/ and /ð/, are articulated with a
forward movement of the tongue to touch the teeth.
Velar phonemes, such as /g/ and /k/, are articulated
with the backward and upward movements of the
tongue to touch the soft palate. Since the vertical and
frontal movements are closely related to vocalization, it is more informative to describe
GKP responses for language-related tongue movements
than to describe responses for the two directional movements separately. We describe such language-related
GKP patterns based on our extensive experimental results
presented in [14].
To investigate the GKP response for language-related tongue movements, we first examined the relationship
between GKP and each phoneme. A phoneme is a basic
unit of a language's phonology, so investigating
this relationship is the first step toward understanding
language-related GKP patterns. Generally, phonemes
can be divided into consonants and vowels. Consonants are articulated with complete or partial closure
of the vocal tract, whereas vowels are articulated
with an open vocal tract [15]. Vowels do not involve
contact between the tongue and the lips or buccal wall,
so vocalizing them does not generate GKP signals. In the
case of consonants, however, the tongue may touch various
articulators, so consonant phonemes can exhibit
various GKP patterns.
These consonant phonemes can be categorized
according to the place of articulation, which is one of
the most widely used phonetic features in articulatory phonetics. The place of articulation classifies
phonemes according to where the closure of
the vocal tract occurs, as shown in Figure 3. In the
International Phonetic Alphabet chart revised in 2005
[16], 11 places of articulation are distinguished.
Among these, we selected seven types of consonants with distinctive tongue positions. We
recorded and analyzed the GKP responses for these seven
types of consonants, expecting that they might show
distinctive GKP patterns due to their distinctive
tongue positions.
From each type of consonant, we selected a representative phoneme: 1) /ð/ for dental, 2) /d/ for alveolar, 3) /ɭ/ for retroflex, 4) /dʒ/ for palato-alveolar, 5)
/ɟ/ for palatal, 6) /g/ for velar, and 7) /m/ for bilabial.
Then, we recorded five trials of EEG signals for each
selected phoneme. A single trial consisted of two visual cues. The first cue presented the phonetic symbol
/a/, then the second cue presented one of the selected
phonemes. While pronouncing /a/, the tongue does not
contact any other tissue, so the potential levels in this state
can be used as a reference point free of GKP
response. We showed these visual cues to four subjects
and asked them to pronounce the presented phonemes.
We specifically asked the subjects to pronounce the
second phoneme slowly, maintaining contact
between the tongue and the other articulators until the cue
finished.
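
The recording paradigm just described can be summarized as a simple cue schedule. The sketch below builds one under stated assumptions: the 3-s cue length is taken from the figure description later in this section, while the randomized trial order and the use of place-of-articulation names as stand-ins for the phonetic symbols are our own illustrative choices.

```python
import random

# Place-of-articulation names stand in for the representative phonemes listed above.
PLACES = ["dental", "alveolar", "retroflex", "palato-alveolar",
          "palatal", "velar", "bilabial"]

def build_cue_schedule(places, trials_per_phoneme=5, cue_length_s=3.0, seed=0):
    """Build a list of trials, each pairing an /a/ reference cue with one
    target-phoneme cue of the same length."""
    targets = [p for p in places for _ in range(trials_per_phoneme)]
    random.Random(seed).shuffle(targets)  # randomized order is an assumption
    return [[("/a/", cue_length_s), (target, cue_length_s)] for target in targets]

schedule = build_cue_schedule(PLACES)
print(len(schedule), "trials; first trial:", schedule[0])
```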
EEG signals were recorded with the same device
and method described in the previous section. The
example signals obtained from one subject are plotted
in Figure 4. We plotted six trials of signals for three
types of consonants (alveolar, retroflex, and velar); the
length of each cue was 3 s. The potential increases and
decreases evoked by tongue movements are indicated by
arrows. The potential levels in the frontal region
(Fp1, Fp2) increased when the subject pronounced the retroflex consonant /ɭ/, whereas they decreased when he
pronounced alveolar or velar consonants. Conversely,
the potential levels in the occipital region (O1,
O2) decreased for the retroflex consonant, whereas they
increased for the alveolar and velar consonants.
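
A minimal sketch of how the arrowed potential shifts could be quantified, assuming each trial is available as a NumPy array of shape (channels, samples) with the /a/ reference cue in the first 3 s and the consonant cue in the following 3 s. The channel list matches the electrodes named above, but the 512-Hz sampling rate and the rest of the code are illustrative assumptions rather than the authors' analysis code.

```python
import numpy as np

CHANNELS = ["Fp1", "Fp2", "O1", "O2"]  # electrodes discussed above

def potential_shift(trial, fs=512, cue_length_s=3.0):
    """Mean potential during the consonant cue minus the mean potential during
    the preceding /a/ reference cue, per channel (positive = increase)."""
    n = int(fs * cue_length_s)
    baseline = trial[:, :n].mean(axis=1)        # /a/ reference segment
    consonant = trial[:, n:2 * n].mean(axis=1)  # target-consonant segment
    return dict(zip(CHANNELS, consonant - baseline))

# Illustrative usage with synthetic data in place of recorded EEG.
rng = np.random.default_rng(0)
trial = rng.normal(0.0, 5.0, size=(len(CHANNELS), 512 * 6))
print(potential_shift(trial))
```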
To investigate the spatial patterns of GKP, we measured the potential differences evoked when the tongue
moved to the designated positions in each place of articulation. We collected five trials for each type of consonant,


