processing a subject's physiological data as well as selecting classification algorithms.
In the current investigation, we studied 12 features extracted from signals collected by the E4. This approach strikes a balance between extracting over 100 features [9] and studying a single feature [10], [20], and it is grounded in years of research indicating that these 12 features are significant for affective computing applied to children with ASD [8], [12]. Further analysis, such as principal component analysis, could be used to test how much each feature impacts the classification metrics. The current offline analysis did not face the bandwidth constraints that can arise with real-time transmission and processing of data; thus, reducing the number of signals collected and/or features extracted may become necessary as we move to real-time processing and closed-loop feedback during RAI. Therefore, ranking features will be part of our future work, as sketched below.
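As a rough illustration of that feature-ranking step, the Python sketch below scores each feature by permutation importance (one standard ranking technique; the article does not prescribe a specific method). The feature names, the synthetic `X` and `y`, and the random-forest classifier are all placeholders, not the study's data or model.

```python
# Minimal feature-ranking sketch, assuming X is an (n_samples x 12)
# matrix of E4-derived features and y holds coded affect labels.
# Names and data below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

feature_names = [f"e4_feature_{i}" for i in range(12)]  # hypothetical names

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))      # stand-in for real E4 features
y = rng.integers(0, 2, size=200)    # stand-in for attentive/inattentive labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

# Permutation importance: how much does shuffling one feature's values
# degrade held-out accuracy? Larger drops suggest more useful features.
result = permutation_importance(clf, X_te, y_te, n_repeats=20, random_state=0)
for name, score in sorted(zip(feature_names, result.importances_mean),
                          key=lambda t: t[1], reverse=True):
    print(f"{name}: {score:.3f}")
```

A ranking like this would also indicate which features could be dropped first if real-time bandwidth becomes the binding constraint.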
Challenges remain in collecting enough samples, and sufficiently accurate samples, for the algorithms to learn more precise thresholds between one affective state and another. Since individually designed models have often been shown to outperform group-designed models, we would like to continue collecting data with these subjects. However, a limitation is that a model trained for an individual can only assist that subject.
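This trade-off between individually designed and group-designed models can be pictured with a short sketch; `subject_data` is a hypothetical mapping from subject ID to that subject's feature matrix and labels, and the SVM and cross-validation settings are illustrative rather than the study's protocol.

```python
# Hedged sketch contrasting per-subject ("individual") models with a
# pooled ("group") model. subject_data is a hypothetical dict of the
# form {subject_id: (X, y)} holding E4 features and affect labels.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def individual_scores(subject_data):
    """One model per subject: often more accurate, but each model
    can only assist the subject it was trained for."""
    return {sid: cross_val_score(SVC(), X, y, cv=5).mean()
            for sid, (X, y) in subject_data.items()}

def group_score(subject_data):
    """One model pooled across subjects: more broadly applicable,
    but typically less accurate for any given individual."""
    X = np.vstack([X for X, _ in subject_data.values()])
    y = np.concatenate([y for _, y in subject_data.values()])
    return cross_val_score(SVC(), X, y, cv=5).mean()
```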
The algorithms were able to produce better-than-chance accuracy for our affective states of interest. Furthermore, the models have F1-scores > 0.67 for both classes; therefore, these models would agree with a human coder's label of attentiveness or inattentiveness more than two-thirds of the time. During future interventions, the models could be used by a robot to test a new sample of physiological data, return a prediction of affect (e.g., attentiveness or inattentiveness), and decide which actions to take during RAI. For example, if the model predicts that the subject's signals indicate attentiveness, the robot can choose to continue practicing a social skill; if the model predicts inattentiveness, the robot can remind the subject of the directions and of staying on task. The current investigation lays the groundwork for future research on the use of an affect-sensitive robot to develop social skills in learners with ASD.
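One way to picture this closed loop is the sketch below, which pairs a trained classifier with the decision rule just described. The `extract_features` helper and the `Robot` interface are hypothetical stand-ins; an actual RAI system would replace them with the real E4 feature pipeline and robot platform.

```python
# Illustrative closed-loop decision step for RAI. Assumes `model` is a
# scikit-learn-style classifier trained on the 12 E4 features, with
# labels 1 = attentive and 0 = inattentive.
import numpy as np

def extract_features(window):
    """Hypothetical stand-in: reduce a raw E4 signal window to a
    1 x 12 feature vector."""
    return np.asarray(window, dtype=float).reshape(1, -1)[:, :12]

class Robot:
    """Hypothetical robot interface; real actions depend on the platform."""
    def continue_social_skill_practice(self):
        print("Continuing social-skill practice.")
    def remind_directions_and_stay_on_task(self):
        print("Reminding subject of the directions and staying on task.")

def rai_step(model, window, robot):
    """Predict affect from a new window of data, then act on it."""
    features = extract_features(window)
    attentive = model.predict(features)[0] == 1
    if attentive:
        robot.continue_social_skill_practice()    # keep practicing the skill
    else:
        robot.remind_directions_and_stay_on_task()
```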
Acknowledgments
The authors wish to acknowledge student researchers Janet
Pulgares Soriano, Rebecca Castelly, Sabrina Daisey, Natalie
Warning, Zen Kim, and Jacob Berdichevsky for their contributions
to the research. We also want to thank the reliability
coders from the College of Education and Human Development,
the staff at the Norton Children's Autism Center, the
subjects, and their families. All procedures performed in studies
involving human participants were in accordance with the
ethical standards of the institutional and/or national research
committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. Informed consent
was obtained from all individual participants included in
the study. This research was supported by the National Science
Foundation (NSF) under Smart and Connected Health (SCH)
Grant #1838808 and REU Grant #1950137.
References
[1] J. R. Steinbrenner et al., Evidence-Based Practices for Children, Youth, and Young Adults with Autism. Chapel Hill, NC, USA: The University of North Carolina at Chapel Hill, Frank Porter Graham Child Development Institute, National Clearinghouse on Autism Evidence and Practice Review Team, 2020.
[2] E. E. Barton, J. E. Pustejovsky, D. M. Maggin, and B. Reichow, "Technology-aided instruction and intervention for students with ASD: a meta-analysis using novel methods of estimating effect sizes for single-case research," Remedial and Special Ed., vol. 38, no. 6, pp. 371-386, 2017.
[3] L. I. Ismail, T. Verhoeven, J. Dambre, and F. Wyffels, "Leveraging robotics research for children with autism: a review," Int. J. Social Robotics, vol. 11, no. 3, pp. 389-410, 2019.
[4] W. C. So et al., "A robot-based play-drama intervention may improve the joint attention and functional play behaviors of Chinese-speaking preschoolers with autism spectrum disorder: a pilot study," J. Autism and Developmental Disorders, vol. 50, no. 2, pp. 467-481, 2020.
[5] R. Pennington, M. N. Saadatzi, K. C. Welch, and R. Scott, "Using robot-assisted instruction to teach students with intellectual disabilities to use personal narrative in text messages," J. Special Ed. Technol., vol. 29, no. 4, pp. 49-58, 2014.
[6] F. Marino et al., "Outcomes of a robot-assisted social-emotional understanding intervention for young children with autism spectrum disorders," J. Autism and Developmental Disorders, vol. 50, no. 6, pp. 1973-1987, 2020.
[7] M. W. De Korte et al., "Self-initiations in young children with autism during pivotal response treatment with and without robot assistance," Autism, vol. 24, no. 8, pp. 2117-2128, 2020.
[8] K. C. Welch, "Physiological signals of autistic children can be useful," IEEE Instrum. Meas. Mag., vol. 15, no. 1, pp. 28-32, Feb. 2012.
[9] O. AlZoubi, S. D'Mello, and R. A. Calvo, "Detecting naturalistic expressions of nonbasic affect using physiological signals," IEEE Trans. Affect. Comput., vol. 3, no. 3, Jul.-Sep. 2012.
[10] R. Assabumrungrat et al., "Ubiquitous affective computing: a review," IEEE Sens. J., vol. 22, no. 3, pp. 1867-1881, Feb. 2022.
[11] C. Saitis and K. Kalimeri, "Multimodal classification of stressful environments in visually impaired mobility using EEG and peripheral biosignals," IEEE Trans. Affect. Comput., vol. 12, no. 1, pp. 203-214, Jan.-Mar. 2021.
[12] K. C. Welch, U. Lahiri, Z. W. Warren, and N. Sarkar, "A system to measure physiological response during social interaction in VR for children with ASD," in Computational Models for Biomedical Reasoning and Problem Solving, C. Chen and S. Cheung, Eds. Hershey, PA, USA: IGI Global, 2019, pp. 1-33.
[13] S. D'Mello, A. Kappas, and J. Gratch, "The affective computing approach to affect measurement," Emotion Rev., vol. 10, no. 2, pp. 174-183, Apr. 2018.
[14] K. C. Welch, C. Harnett, and Y.-C. Lee, "A review on measuring affect with practical sensors to monitor driver behavior," Safety, vol. 5, no. 4, pp. 72-89, 2019.
[15] H. Lee, S. Kang, and U. Lee, "Understanding privacy risks and perceived benefits in open dataset collection for mobile affective computing," Proc. ACM Interact. Mob. Wearable Ubiquitous Technol., vol. 6, no. 2, Jun. 2022.