Instrumentation & Measurement Magazine, vol. 26, no. 3 (May 2023), p. 39

Using Physiological Signals and Machine
Learning Algorithms to Measure
Attentiveness During Robot-Assisted Social
Skills Intervention: A Case Study of Two
Children with Autism Spectrum Disorder
Karla Conn Welch, Robert Pennington, Saipruthvi Vanaparthy, Ha Manh Do,
Rohit Narayanan, Dan Popa, Gregory Barnes, and Grace Kuravackel
Individuals with autism spectrum disorder (ASD) often
face barriers in accessing opportunities across a range
of educational, employment, and social contexts. One of
these barriers is the development of effective communication
skills sufficient for navigating the social demands of everyday
environments. Fortunately, researchers have established
evidence-based practices (EBP) for teaching critical communication
skills to individuals with ASD [1]. One EBP that has
received a great deal of attention over the last few decades is
technology-aided instruction and intervention (TAII) [1],[2].
TAII is an instructional practice in which technology is an essential
component and is used to facilitate behavior change.
Further, it encompasses a wide range of applications including
computer-assisted instruction, virtual and augmented reality,
augmentative and alternative communication, and robot-assisted
intervention [2].
Over the last two decades, there has been increased interest
in the application of robot-assisted interventions (RAI) for
teaching social skills to learners with ASD [3]. RAI involves
the use of robots to deliver, augment, or support intervention
practices. Researchers have employed robots to teach a myriad
of social skills including joint attention [4], text messaging
[5], interpreting and making gestures [4], and emotion recognition
[6]. During RAI, robots can serve as direct instructional
agents, emitting directives or modeling targeted behaviors.
For example, Pennington and colleagues used an autonomous,
programmable humanoid (NAO) robot to verbally direct students
how to generate text messages [5]. The robot provided
a directive to the participant, waited for the participant to indicate
they were finished prior to emitting the next directive,
and then offered to perform a dance at the end of the session.
Similarly, another study used an NAO robot to assist in facilitating
game scenarios during pivotal response training [7].
During sessions, therapists controlled the robot to present
stimuli, prompt participants to respond, and provide reinforcing
feedback. The application of robots as change agents
during social skills intervention may offer several benefits, including
increasing the reinforcing properties of intervention
packages, limiting human errors in implementation fidelity,
and reducing social requirements of interpreting subtle social
cues emitted by the human instructors (e.g., facial expression,
changes in intonation).
The majority of investigations in which a robot served as
an instructor involved the programming of one-way (e.g., robot
presents a directive and the child responds) and two-way
(e.g., robot presents a vocal directive, the child responds, the
robot detects response and provides feedback or emits next directive)
interactions, or a Wizard of Oz approach in which the
instructor directs the actions of the robot. These approaches are
limited in that they rely on an external human change agent to
observe, detect, and respond to subtle changes in the learner's
attention, which is essential to skill acquisition and may be
difficult for some learners with ASD to sustain. This reliance on an additional
human change agent to facilitate attention during
robot-learner interactions ultimately serves as a barrier to the
autonomous application of robots to support student learning.
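As an illustration, the two-way pattern described above (robot emits a vocal directive, detects the child's response, then reinforces or re-prompts) can be sketched as a simple control loop. This is a hypothetical sketch only; the robot API names (`say`, `detect_response`, `dance`) are placeholder stubs, not the interface used in any of the cited studies:

```python
import time

class TwoWaySession:
    """Sketch of a two-way robot-learner interaction loop: the robot
    presents each directive, waits for a detected response, and either
    delivers reinforcing feedback or re-prompts once. All robot and
    sensor calls are hypothetical stubs."""

    def __init__(self, robot, directives, timeout_s=10.0):
        self.robot = robot
        self.directives = directives
        self.timeout_s = timeout_s

    def run(self):
        for directive in self.directives:
            self.robot.say(directive)            # robot emits the directive
            if self._wait_for_response():
                self.robot.say("Nice work!")     # reinforcing feedback
            else:
                # one re-prompt if no response was detected in time
                self.robot.say("Let's try again: " + directive)
                self._wait_for_response()
        self.robot.dance()                       # end-of-session reward

    def _wait_for_response(self):
        """Poll the robot's sensors until a response is detected
        (e.g., touch, speech, gesture) or the timeout elapses."""
        start = time.time()
        while time.time() - start < self.timeout_s:
            if self.robot.detect_response():
                return True
            time.sleep(0.1)
        return False
```

The Wizard of Oz approach replaces `detect_response` with a human operator's judgment; the autonomy barrier discussed above is precisely that this detection step is hard to automate for subtle attentional cues.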
A way to address this issue is by incorporating software
to detect and respond to physiological correlates for attention
within RAI packages. Physiological signals can be an
especially useful communication link for children with ASD,
whose outward expressions of affect may be less apparent than
those of typically developing children [8]. However, the implicit
physiological signals of children with ASD do indicate
emotional changes to stimuli, including during social interactions
[8].
Researchers in affective computing have reported physiological
signals to be a reliable source of objective information
related to users' emotional reactions, because of the signals'
connections to the autonomic nervous system [8]-[12]. However,
conducting affective computing research presents
challenges in data collection. For example, although computer
vision could be used to process video streams of users
and their reactions [13], this signal presents privacy issues and
poses a challenge of collecting video from certain (e.g., preferably
head-on) angles at all times [13],[14], whereas privacy is
less of a concern with wrist-worn devices [15]. Additionally,
some physiological signals are more cumbersome to collect
than others, which leads to lower adoption rates among users [16], such
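To make the measurement idea concrete, the following is a minimal sketch of classifying attentiveness from wrist-worn physiological signals: per-window features are extracted and fed to a simple classifier. The choice of features (mean electrodermal activity and mean heart rate), the window format, the class labels, and the nearest-centroid classifier are all illustrative assumptions for exposition, not the pipeline used in this study:

```python
from statistics import mean

def extract_features(window):
    """Compute per-window features from synchronized wrist-sensor samples.
    `window` is a list of (eda, hr) tuples: electrodermal activity in
    microsiemens and heart rate in beats per minute. Features here are
    simply the per-channel means (an assumed, minimal feature set)."""
    eda = [sample[0] for sample in window]
    hr = [sample[1] for sample in window]
    return (mean(eda), mean(hr))

class NearestCentroid:
    """Tiny nearest-centroid classifier: fit stores one centroid per
    label; predict returns the label of the closest centroid by
    squared Euclidean distance."""

    def fit(self, X, y):
        self.centroids = {}
        for label in set(y):
            rows = [x for x, lab in zip(X, y) if lab == label]
            self.centroids[label] = tuple(mean(col) for col in zip(*rows))
        return self

    def predict(self, x):
        def sq_dist(c):
            return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(self.centroids, key=lambda lab: sq_dist(self.centroids[lab]))
```

In practice such a classifier would be trained on windows labeled by behavioral coders (attentive vs. inattentive) and then queried online during RAI sessions so the robot can adapt its behavior without a human in the loop.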