
in the case of an autonomous system, the damage or complication that took place should have a direct link to the strategy adopted by the system, and there should be an obvious deviation from clinical guidelines and evidence-based medical practice. However, there is no clear answer regarding the accountability of the surgeon who acts only as an observer when complications arise. The surgeon usually follows clinical pathways and evidence-based medicine to choose the treatment plan. Even so, choosing the right treatment plan can raise concerns about the concept of loss of chance for patients. This concept refers to a patient's right to undergo the optimal procedure. Thus, for patients who were treated with an intervention according to the standard of care, the question is whether they could have been treated with a better intervention plan. This debate is considered one of the biggest challenges on the journey toward trusted autonomy because of the difficulty of providing proof.

Final Thoughts
I predict that autonomous systems will become an integral part of our daily lives in the not-so-distant future. This integration will require autonomous entities to gain our trust. Such a future is inevitable and will follow a path similar to the assimilation of industrial robots into traditional factories. Society will soon see a need to define regulations for these autonomous entities, and we have seen, from our experience with driverless vehicles, that developing such regulations can be very complex because of ethical and legal constraints. Therefore, I argue that society should keep it simple and maintain a single set of rules that applies to both human and nonhuman entities, even if that implies nonhuman entities do not abide by the so-called traditional laws of robotics [19].
I also contend that experiments involving trust and any form of autonomous system should be designed with a "distrust situation first" principle, as the opposite is extremely difficult to achieve. Finally, I discussed the Autonomy of Things, the difference between automated and autonomous systems, intention-aware autonomous systems, and methods of building trust between humans and autonomous systems.

About the Author
Saeid Nahavandi (saeid.nahavandi@deakin.edu.au) earned his Ph.D. degree from Durham University, United Kingdom, in 1991. He is an Alfred Deakin professor, pro vice-chancellor, chair of engineering, and director of the Institute for Intelligent Systems Research and Innovation at Deakin University, Australia. His research interests include the modeling of complex systems, robotics, and haptics. He has published more than 800 scientific papers in various international journals and conference proceedings. He is a Fellow of Engineers Australia and the Institution of Engineering and Technology and a Senior Member of the IEEE.

aids," Int. J. Man-Mach. Stud., vol. 27, no. 5-6, pp. 527-539, 1987. doi: 10.1016/S00207373(87)80013-5.
[16] J. D. Lee and K. A. See, "Trust in automation: Designing for appropriate reliance,"
Hum. Factors, vol. 46, no. 1, pp. 50-80, 2004. doi: 10.1518/hfes.46.1.50_30392.
[17] K. Saleh, M. Hossny, and S. Nahavandi, "Intent prediction of pedestrians via
motion trajectories using stacked recurrent neural networks," IEEE Trans. Intell.
Vehicles, vol. 3, no. 4, pp. 414-424, 2018.
[18] D. D. Salvucci, A. Liu, and E. R. Boer, "Control and monitoring during lane changes," Vision Vehicles, vol. 9, 2001.
[19] Wikipedia, "Three laws of robotics." Accessed on: Jan. 15, 2019. [Online]. Available: https://en.wikipedia.org/wiki/Three_Laws_of_Robotics#Applications_to_future_
technology
