A View of Impairment
According to the World Health Organization, approximately 285 million people worldwide are visually impaired, of whom 39 million are blind. Visual impairment degrades one's independent mobility and deteriorates a person's quality of life. In the visually impaired community, white canes are currently the most efficient and widely used mobility tools. A white cane provides haptic feedback for obstacle avoidance. However, it cannot provide the information necessary for wayfinding, that is, taking a path toward the destination with awareness of position and orientation. Also, as a point-contact device, a white cane has a limited range and cannot provide a "full picture" of its surroundings. To address these limitations, robotic navigation aids (RNAs) have been introduced to replace or enhance white canes, but only limited success has been achieved. To date, no RNA has effectively addressed both the obstacle-avoidance and wayfinding problems. The main technical challenge is that both problems must be addressed inside a small platform with limited resources.

Existing RNAs can be classified into three categories: the robotic wheelchair [1], the robotic guide-dog (RGD) [2], [3], and the electronic white cane (EWC) [4]-[6]. A robotic wheelchair is well suited for a blind person with a lower-extremity disability. However, it gives its user an unpleasant sense of being controlled, and safety concerns will keep the blind from using robotic wheelchairs for their mobility needs. An RGD leads the blind user along a walkable direction toward the destination; in this case, the user walks by him- or herself. An RGD can be passive or active. A passive RGD [2] indicates the desired travel direction by steering its wheels, and the user pushes the RGD forward. A passive RGD gives its users the sense that they are controlling the device but requires extra workload that might cause them fatigue. An active RGD [3], however, generates an additional forward movement to lead the user to the destination. Therefore, it can take on a certain payload and does not require the user to push, causing no fatigue to the user. However, the robot-centric motion may cause safety concerns.

In addition to the aforementioned disadvantages, both robotic wheelchairs and RGDs lack portability. This issue makes the EWC an appealing solution. An EWC is a handheld device that detects obstacles in its vicinity. The Nottingham obstacle detector [4] uses a sonar for obstacle detection. The C-5 laser cane [5] triangulates range using three laser-diode/photodiode pairs. The "virtual white cane" [6] measures obstacle distance with a triangulation system comprising a laser pointer and a camera; the user obtains multiple range measurements by swinging the EWC. In spite of their portability, these EWCs 1) provide only limited obstacle information due to their restricted sensing capability, 2) do not provide location information for wayfinding, and 3) may limit or deny the use of a white cane.
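To make the triangulation idea concrete, the following Python sketch estimates obstacle range from the image position of a laser spot, the same geometric principle used by a laser-pointer-plus-camera ranger such as [6]. The focal length, baseline, and function name are illustrative assumptions, not parameters of the actual devices.

```python
import math

def laser_range(pixel_offset_px: float,
                focal_length_px: float = 700.0,   # assumed camera focal length in pixels
                baseline_m: float = 0.10,         # assumed laser-camera separation in meters
                laser_angle_rad: float = 0.0) -> float:
    """Estimate obstacle distance from the image position of a laser spot.

    The laser pointer is offset from the camera by `baseline_m` and projects a
    spot onto the obstacle. The spot appears `pixel_offset_px` pixels away from
    the principal point; the farther the obstacle, the smaller the offset.
    All values here are illustrative, not taken from [6].
    """
    # Bearing of the laser spot relative to the camera's optical axis.
    bearing = math.atan2(pixel_offset_px, focal_length_px)
    # The camera, the laser emitter, and the spot form a triangle with a known
    # baseline; solve for the range along the optical axis.
    denom = math.tan(bearing) - math.tan(laser_angle_rad)
    if abs(denom) < 1e-9:
        raise ValueError("rays are (nearly) parallel; range is unobservable")
    return baseline_m / denom

# Example: a spot 35 pixels off-center maps to roughly 2 m for these parameters.
print(f"estimated range: {laser_range(35.0):.2f} m")
```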
With respect to wayfinding, GPS has been widely used in portable navigation aids for the visually impaired [7], [8]. However, this approach cannot be used in GPS-denied indoor environments. To address this problem and the third disadvantage of the EWCs, a portable indoor localization aid is introduced in [9], where a sensor package comprising an inertial measurement unit (IMU) and a two-dimensional (2-D) laser scanner is mounted on a white cane for device pose estimation. An extended Kalman filter (EKF) is employed to predict the device's pose from the data of the IMU and a user-worn pedometer and then to update the prediction using the laser scans as observations. The pose-estimation method requires a map of the environment, whose walls must be vertical, to predict the laser measurements.
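The predict/update pattern used in [9] can be sketched in a few lines. The following minimal EKF example predicts a planar pose from step-length and yaw-rate inputs and corrects it with a single range observation; the state layout, models, and noise values are assumptions for illustration and do not reproduce the filter in [9].

```python
import numpy as np

# State: [x, y, heading]. Prediction uses a pedometer step length and a gyro
# yaw rate; the update uses a range measurement to a landmark at a known
# position (standing in for a predicted laser return).

def ekf_predict(x, P, step_len, yaw_rate, dt, Q):
    th = x[2]
    x_pred = x + np.array([step_len * np.cos(th),
                           step_len * np.sin(th),
                           yaw_rate * dt])
    # Jacobian of the motion model with respect to the state.
    F = np.array([[1.0, 0.0, -step_len * np.sin(th)],
                  [0.0, 1.0,  step_len * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update_range(x, P, z, landmark, R):
    dx, dy = landmark[0] - x[0], landmark[1] - x[1]
    r_pred = np.hypot(dx, dy)                            # expected range
    H = np.array([[-dx / r_pred, -dy / r_pred, 0.0]])    # measurement Jacobian
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                       # Kalman gain
    x_new = x + (K @ np.array([z - r_pred])).ravel()
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

# One predict/update cycle with made-up numbers.
x = np.array([0.0, 0.0, 0.0])
P = np.eye(3) * 0.1
Q = np.diag([0.02, 0.02, 0.01])
R = np.array([[0.05]])
x, P = ekf_predict(x, P, step_len=0.7, yaw_rate=0.1, dt=0.5, Q=Q)
x, P = ekf_update_range(x, P, z=4.1, landmark=(5.0, 0.0), R=R)
print(x)
```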
In [10], we conceive a new RNA, the co-robotic cane (CRC), for the indoor navigation of a blind person. The CRC uses a 3-D camera for both pose estimation and object recognition in an unknown indoor environment. The pose-estimation method does not require any prior knowledge of the environment. The object-recognition method detects indoor structures and objects, some of which may be used as navigational waypoints. The CRC is a co-robot: it can detect human intent and use the intent to automatically select its use mode. Recently, we designed and fabricated the CRC. This article presents the three key technology components (human intent detection, pose estimation, and 3-D object recognition) and the fabrication of the CRC. It is an extended version of [10].
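The CRC's pose-estimation method itself is presented later in this article. As generic background, the sketch below shows a standard building block for estimating camera motion between two depth frames without a prior map: the Kabsch/SVD solution for the rigid rotation and translation that aligns matched 3-D points. It illustrates the principle only and is not the CRC's algorithm.

```python
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rotation R and translation t with dst ~= R @ src + t.

    `src` and `dst` are (N, 3) arrays of matched 3-D points, e.g., features
    observed by a depth camera in two consecutive frames. This is the generic
    Kabsch/SVD solution, not the CRC's pose estimator.
    """
    src_c = src - src.mean(axis=0)             # center both point sets
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)  # 3x3 cross-covariance
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                         # guard against reflections
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Synthetic check: rotate/translate random points and recover the motion.
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(50, 3))
yaw = 0.2
R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
t_true = np.array([0.3, -0.1, 0.05])
R_est, t_est = rigid_transform(pts, pts @ R_true.T + t_true)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))
```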
Overview of the CRC
The conceptual CRC is an indoor navigation aid, as depicted in Figure 1. The CRC is a computer-vision-enhanced white cane that allows a blind traveler to "see" better and farther. It provides the user with the desired travel direction in an intuitive way and offers a friendly human-device interface. The computer vision system, comprising a 3-D camera, a three-axis gyro, and a Gumstix Overo AirSTORM COM computer, provides both the pose-estimation and object-recognition functions. The CRC's pose is used to provide the user with position and heading information, and some of the detected objects (e.g., stairways) can be used as navigational waypoints and others for obstacle avoidance. The CRC can be used in either robot cane (active) mode or white cane (passive) mode. In the active mode, it guides the user by steering itself into the desired direction of travel, while in the passive mode it functions as a computer-vision-enhanced white cane. The CRC is a co-robot: it can detect human intent and use the intent to select a suitable mode automatically.
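How the CRC maps detected human intent to its two modes is detailed in the human-intent-detection discussion later in the article. As a placeholder, the sketch below shows one way such intent-driven switching between the robot cane (active) and white cane (passive) modes could be organized; the intent score, thresholds, and class names are hypothetical.

```python
from enum import Enum, auto

class Mode(Enum):
    ACTIVE = auto()    # robot cane mode: the device steers toward the goal
    PASSIVE = auto()   # white cane mode: computer-vision-enhanced white cane

class ModeSelector:
    """Hypothetical intent-to-mode mapping with hysteresis.

    `intent_score` is assumed to come from a human-intent detector (how
    strongly the user signals "guide me"); the thresholds are made up.
    """
    def __init__(self, enter_active: float = 0.7, exit_active: float = 0.3):
        self.mode = Mode.PASSIVE
        self.enter_active = enter_active
        self.exit_active = exit_active

    def update(self, intent_score: float) -> Mode:
        # Hysteresis avoids rapid toggling when the intent estimate is noisy.
        if self.mode is Mode.PASSIVE and intent_score >= self.enter_active:
            self.mode = Mode.ACTIVE
        elif self.mode is Mode.ACTIVE and intent_score <= self.exit_active:
            self.mode = Mode.PASSIVE
        return self.mode

selector = ModeSelector()
for score in (0.1, 0.8, 0.6, 0.2):   # a short, made-up intent trace
    print(score, selector.update(score).name)
```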