
model needs to work for robots with very different embodiments, ranging from humanoids to robot cars and quadrotors, to name a few.
The approach we suggest to increase the fluidity of human-robot interaction is to leverage the interactive models humans have naturally developed to interact with each other. When people work together, the way an action is performed lets the partner intuitively grasp several unspoken properties of the ongoing interaction, making it more efficient and synchronized. For instance, from a person's movements it is possible to infer how confident they are in what they are doing [7], how heavy or fragile the manipulated object is [8], and what they intend to do with that object [9]. We posit that robots should be enabled to tap into this flow of information by both reading and sending these covert signals within the interaction.
Importantly, we need robots that can understand us but that, at the same time, can be easily understood and anticipated by us. Only through such bidirectional, mutual understanding can the interaction evolve in a safe, natural, seamless way, much as it does in human-human exchanges.

Designing Robots to Predict Human Needs
A key human ability is anticipating what others intend to do or might need. Forming expectations about others' actions and intentions increases the efficiency of the interaction by limiting the need for elaborate verbal exchanges and drastically cutting delays. To form expectations, the robot needs to assess the internal, hidden state of its partners: in particular, their intended goal and, to some extent, their motivations and feelings. Between humans this is achieved through a continuous exchange of tacit, covert signals, subtly communicated in the way we behave. For instance, the direction of the human gaze correlates with the focus of attention, and is exploited to understand the role of each participant in an interaction and to pace turn taking [10], whereas the velocity with which an action is performed can reveal the actor's emotional state or intentions [9], [11].
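As a concrete illustration of how a robot might exploit the first of these signals, consider the minimal sketch below, which estimates a partner's focus of attention by casting a ray along the tracked gaze direction and selecting the nearest known object. The object list, the distance threshold, and the availability of a gaze estimate are assumptions made for the example; this is not a method proposed by any of the works cited above.

```python
import numpy as np

def focus_of_attention(gaze_origin, gaze_dir, objects, max_offset=0.15):
    """Return the name of the object closest to the gaze ray, or None.

    gaze_origin : 3D eye/head position (assumed to come from a head tracker)
    gaze_dir    : estimated gaze direction vector
    objects     : dict mapping object names to 3D positions
    max_offset  : largest ray-to-object distance (m) still counted as "looked at"
    """
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    best_name, best_dist = None, max_offset
    for name, pos in objects.items():
        v = np.asarray(pos) - gaze_origin
        t = float(np.dot(v, gaze_dir))
        if t <= 0:  # object lies behind the observer
            continue
        # perpendicular distance from the object to the gaze ray
        dist = float(np.linalg.norm(v - t * gaze_dir))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

# Example: the partner looks roughly toward the cup
objects = {"cup": [0.6, 0.1, 0.0], "hammer": [0.6, -0.4, 0.0]}
print(focus_of_attention(np.zeros(3), np.array([1.0, 0.15, 0.0]), objects))
# -> cup
```

A real system would smooth the gaze estimate over time and fuse it with head orientation, but even this crude geometric test conveys how a covert signal becomes actionable information for the robot.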
Some of these signals are physiologically embedded in human behavior and do not need to be added voluntarily for the sake of communication; hence they do not even require the sender's awareness. Others, still based on the way humans move, have an explicit communicative intent (such as waving the hand to say goodbye, or pointing to indicate something relevant), but they are so intrinsic to human culture that interpreting them entails no conscious effort. A robot reading similar signals could decide when to act and what to do in an interaction
without requiring any learning or adaptation on the human side, promoting a natural and intuitive (i.e., more humanized) collaboration. There is already evidence that sensitivity to these signals facilitates human-robot interaction and makes it more pleasant and acceptable. For instance, it has been shown that a robot monitoring head orientation can disambiguate verbal expressions on the basis of the participants' gaze direction, making the interaction more natural, pleasant, and efficient [12]. The ability to read eye motion can inform the robot of which object the person might need in a collaborative task [10], with no need to process explicit verbal or gestural instructions [13]. Beyond gaze, the properties of body motion can also help the robot act as a more intuitive collaborator. The ability to detect the regularities of biological motion in a scene enables a robot to recognize human activities even when no human shape is in sight (e.g., when only the tool being used is visible [14]), and subtle variations in action kinematics can inform the robot of the human's intention [15]. It has also been demonstrated that combining the anticipation of human motion trajectories with models of the potential uses of common objects allows a robot to predict upcoming human actions in sufficient detail to plan its responses in advance [16].
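To make this last point more tangible, the sketch below implements such an anticipatory loop under strong simplifying assumptions: the wrist is tracked, hand motion is extrapolated linearly, and object affordances are reduced to a hand-written table pairing each object with a likely human action and a supportive robot response. None of these choices reflect the actual models of [16]; they only illustrate the structure of the idea.

```python
import numpy as np

# Hypothetical affordance table: the human action each object suggests,
# and a supportive robot response (illustrative values only).
AFFORDANCES = {
    "cup":    ("drink", "fetch_coaster"),
    "kettle": ("pour",  "hold_cup_steady"),
    "knife":  ("cut",   "stabilize_board"),
}

def predict_reach_target(hand_positions, objects, horizon=5):
    """Extrapolate the hand trajectory linearly; return the nearest object.

    hand_positions : (T, 3) array of recent wrist positions (assumed tracked)
    objects        : dict mapping object names to 3D positions
    horizon        : number of mean-velocity steps to project into the future
    """
    hand = np.asarray(hand_positions)
    velocity = np.mean(np.diff(hand, axis=0), axis=0)  # mean step displacement
    future = hand[-1] + horizon * velocity             # naive linear prediction
    return min(objects, key=lambda n: np.linalg.norm(objects[n] - future))

def anticipatory_response(hand_positions, objects):
    """Choose a robot action before the human reach is completed."""
    target = predict_reach_target(hand_positions, objects)
    action, response = AFFORDANCES.get(target, (None, "wait"))
    return target, action, response

objects = {"cup": np.array([0.5, 0.2, 0.0]), "knife": np.array([0.5, -0.3, 0.0])}
track = np.array([[0.0, 0.0, 0.0], [0.1, 0.04, 0.0], [0.2, 0.08, 0.0]])
print(anticipatory_response(track, objects))
# -> ('cup', 'drink', 'fetch_coaster')
```

The value of acting on the prediction rather than waiting for the reach to finish is precisely the cut in delays discussed above: the robot's supportive action can begin while the human's action is still unfolding.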

Human-robot interaction will
be facilitated by a sensitivity to
physiological signals.

A robot able to "read" body motion will also be able to detect the affective state of the interacting partner, and to use this information to adapt its behavior accordingly (for a recent review of the automatic recognition of affective expression from body movements, see [11]).
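A minimal sketch of how such body-motion cues could feed a coarse affect estimate is given below. The kinematic features (speed, acceleration, spatial extent) echo those commonly discussed in this literature, but the thresholds and labels are illustrative placeholders rather than the learned classifiers surveyed in [11]; a practical system would fit this mapping to labeled data.

```python
import numpy as np

def motion_features(positions, dt=0.1):
    """Compute coarse kinematic features of a tracked body point.

    positions : (T, 3) array of positions (e.g., the wrist)
    dt        : sampling interval in seconds (assumed)
    """
    vel = np.diff(positions, axis=0) / dt
    acc = np.diff(vel, axis=0) / dt
    return {
        "mean_speed": float(np.mean(np.linalg.norm(vel, axis=1))),
        "mean_accel": float(np.mean(np.linalg.norm(acc, axis=1))),
        "extent": float(np.ptp(positions, axis=0).sum()),  # spatial range covered
    }

def coarse_affect(features):
    """Map features to a coarse affect label (illustrative thresholds)."""
    if features["mean_speed"] > 0.8 and features["mean_accel"] > 2.0:
        return "agitated"  # fast, jerky movement
    if features["mean_speed"] < 0.2:
        return "subdued"   # slow, low-energy movement
    return "neutral"

# Example: a slow, smooth reach reads as "subdued"
slow_reach = np.linspace([0.0, 0.0, 0.0], [0.1, 0.0, 0.0], 30)
print(coarse_affect(motion_features(slow_reach)))  # -> subdued
```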
In summary, the first step toward making future robots considerate of humans will be to enable them to sense and understand the subtle signals that humans naturally exchange during everyday interactions. This ability is the basis of the process required to develop robots gifted with the kind of intuition found in our best human collaborators. Conversely, as explained in the next section, for humans to be considerate of robots, it is necessary to embed in robot motion the same implicit messages used by humans.
