
clutch, respectively. Both controller boards are connected to the Gumstix's USB ports. An IMU is used to measure the cane's rotation rate. Using ground-truth rotation provided by a motion capture system, we found that the maximum S value with no slip is ~1°/s, so we simply use 2°/s as the threshold for determining motion compliance. Since the S value is much larger than 2°/s when the user swings the cane in the active mode, the threshold results in reliable detection of motion compliance. This has been validated by experiments. With the current design, the CRC's maximum rotational (yaw) speed in the active mode is 30°/s. We tested the CRC on the ground in a few buildings on campus; at this maximum speed, S < 2°/s (no slip is produced). Currently, the CRC weighs close to 1 kg. The majority of the weight (0.7 kg) is located 20 cm from the center of the hand grip to make the cane easy to swing. Our tests reveal that no discomfort is produced when the user swings the CRC back and forth. The CRC's weight may be reduced if a lighter camera is used.

Figure 3. (a) The first CRC prototype (front and rear views), showing the SR4000 camera, the Gumstix computer, the servo controller, the Teensy++ 2.0, and the active rolling tip (ART); (b) the ART, consisting of a gearhead, an encoder, a servo motor, ball bearings, a rolling tip, an electromagnetic clutch, and a flexible coupling.

3-D Camera: SwissRanger SR4000
The SR4000 is a small (65 × 65 × 68 mm³) 3-D time-of-flight camera. It illuminates the environment with modulated infrared light and measures ranges up to 5 m (accuracy: ±1 cm; resolution: 176 × 144 pixels) using phase-shift measurement. The camera produces range, intensity, and confidence data at a rate of up to 54 frames per second. The SR4000 has much better range-measurement accuracy for distant objects and data completeness than a stereo/RGB-D camera (e.g., the Microsoft Kinect). This may result in better pose-estimation and object-recognition performance. In addition, the camera's smaller dimensions make it suitable for the CRC.
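As a brief, illustrative sketch of the phase-shift ranging principle mentioned above (not the SR4000's actual firmware), the snippet below converts a measured phase shift into a range. The 30-MHz modulation frequency is an assumption made here only because c / (2·f_mod) reproduces the 5-m measurement range quoted in the text; the article does not state the camera's modulation frequency.

```python
# Sketch: phase-shift time-of-flight ranging (illustrative only).
# Assumption: 30-MHz modulation, chosen because c / (2 * f_mod) = ~5 m,
# matching the 5-m range quoted in the text.
import math

C = 299_792_458.0   # speed of light (m/s)
F_MOD = 30e6        # assumed modulation frequency (Hz)

def phase_to_range(phase_rad: float) -> float:
    """Convert a measured phase shift (0 to 2*pi rad) into a range in meters."""
    unambiguous_range = C / (2.0 * F_MOD)   # ~5 m for 30 MHz
    return (phase_rad / (2.0 * math.pi)) * unambiguous_range

# Example: a phase shift of pi corresponds to roughly half the 5-m range (~2.5 m).
print(round(phase_to_range(math.pi), 3))
```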

6-DOF Pose Estimation

Egomotion Estimation
The camera's pose change between two views is determined by an egomotion estimation method called visual range odometry (VRO) [11] and the iterative closest point (ICP) algorithm. The VRO method extracts and matches scale-invariant feature transform (SIFT) features [12] in two consecutive intensity images. As the features' 3-D coordinates are known from the depth data, the feature-tracking process results in two associated 3-D point sets, $\{p_i\}$ and $\{q_i\}$. The rotation and translation matrices, R and T, between the two point sets can be determined by minimizing the error residual

$e^{2} = \sum_{i=1}^{N} \left\| p_{i} - Rq_{i} - T \right\|^{2}$,   (1)

where N is the number of matched SIFT features.
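To make the VRO front end concrete, the following sketch builds the matched 3-D point sets $\{p_i\}$ and $\{q_i\}$ from two consecutive frames using OpenCV's SIFT implementation. This is an illustrative reconstruction, not the authors' code: the function name match_point_sets, the Lowe ratio value of 0.7, and the assumption that per-pixel 3-D coordinate maps (xyz1, xyz2) are available alongside each intensity image are all assumptions for illustration.

```python
# Sketch: building the matched 3-D point sets {p_i}, {q_i} from two consecutive
# frames (intensity image + per-pixel 3-D coordinates). Illustrative only.
import cv2
import numpy as np

def match_point_sets(intensity1, intensity2, xyz1, xyz2, ratio=0.7):
    """intensity1/2: 8-bit grayscale images; xyz1/2: HxWx3 arrays of 3-D points (m).
    Returns two Nx3 arrays of matched 3-D points: p (from frame 1) and q (from frame 2)."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(intensity1, None)
    kp2, des2 = sift.detectAndCompute(intensity2, None)

    # Nearest-neighbor matching with Lowe's ratio test to discard ambiguous matches.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in knn if m.distance < ratio * n.distance]

    p, q = [], []
    for m in good:
        u1, v1 = map(int, np.round(kp1[m.queryIdx].pt))
        u2, v2 = map(int, np.round(kp2[m.trainIdx].pt))
        P, Q = xyz1[v1, u1], xyz2[v2, u2]
        # Keep only features with valid (finite) 3-D measurements.
        if np.all(np.isfinite(P)) and np.all(np.isfinite(Q)):
            p.append(P)
            q.append(Q)
    return np.asarray(p), np.asarray(q)
```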



This least-squares data-fitting problem is solved by the singular value decomposition (SVD) method [13]. As SIFT feature matching may produce incorrect feature correspondences (outliers), a random sample consensus (RANSAC) process is used to reject the outliers. The resulting inliers are then used to estimate R and T, from which the camera's pose change is determined. In this article, the camera pose is described by the X, Y, and Z coordinates and the Euler angles (yaw, pitch, and roll). To estimate the pose change more accurately, a Gaussian filter is used to reduce the noise in the intensity and range data, and SIFT features with low confidence are discarded [14] for the VRO computation.
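A minimal sketch of the two steps just described: the closed-form SVD solution of (1) and a simple RANSAC loop that rejects outlier correspondences before a final refit on the inliers. The iteration count (200) and the 3-cm inlier threshold are illustrative assumptions, not the authors' settings.

```python
# Sketch: closed-form least-squares solution of (1) via SVD, wrapped in a basic
# RANSAC loop for outlier rejection. Thresholds/iterations are illustrative.
import numpy as np

def estimate_rigid_transform(p, q):
    """Find R, T minimizing sum_i ||p_i - R q_i - T||^2 for Nx3 arrays p, q."""
    p_c, q_c = p.mean(axis=0), q.mean(axis=0)
    H = (q - q_c).T @ (p - p_c)                   # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    T = p_c - R @ q_c
    return R, T

def ransac_rigid_transform(p, q, iters=200, inlier_thresh=0.03, seed=0):
    """RANSAC over 3-point minimal samples; refit on the largest inlier set."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(p), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(p), size=3, replace=False)
        R, T = estimate_rigid_transform(p[idx], q[idx])
        residuals = np.linalg.norm(p - (q @ R.T + T), axis=1)
        inliers = residuals < inlier_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Final estimate from all inliers.
    return estimate_rigid_transform(p[best_inliers], q[best_inliers])
```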
To overcome the VRO's performance degradation in visual-feature-sparse environments (e.g., an open area with a textureless floor), an ICP-based shape tracker is devised to refine the alignment of the two point sets and, thus, the camera's pose change. In this work, a convex hull is created using the 3-D points of the matched SIFT features, and the 3-D data points within the convex hull are used for the ICP calculation. This scheme substantially reduces the computational time of a standard ICP process, which uses all data points. The proposed egomotion estimation method is termed VRO-based fast ICP (VRO-FICP) [15].
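The convex-hull cropping idea behind VRO-FICP can be sketched as follows: keep only the range-image points that fall inside the hull of the matched SIFT points, then run a small point-to-point ICP on that subset, initialized with the VRO estimate. This is an illustrative reconstruction using SciPy, reusing estimate_rigid_transform() from the previous sketch; the Delaunay-based inside-hull test, the 20 iterations, and the overall structure are assumptions, not the authors' implementation.

```python
# Sketch: VRO-FICP-style refinement. Crop a cloud to the convex hull of the matched
# SIFT points (e.g., cropped = crop_to_convex_hull(cloud, q_inliers)), then refine
# (R, T) with a basic point-to-point ICP. Assumes estimate_rigid_transform() from
# the previous sketch is in scope.
import numpy as np
from scipy.spatial import Delaunay, cKDTree

def crop_to_convex_hull(points, hull_points):
    """Keep only the rows of `points` lying inside the convex hull of `hull_points`."""
    tri = Delaunay(hull_points)
    return points[tri.find_simplex(points) >= 0]

def icp_refine(source, target, R, T, iterations=20):
    """Refine an initial rigid transform (R, T) mapping `source` toward `target`."""
    tree = cKDTree(target)
    for _ in range(iterations):
        transformed = source @ R.T + T
        _, idx = tree.query(transformed)          # closest-point correspondences
        # Re-fit the full transform from the current correspondences.
        R, T = estimate_rigid_transform(target[idx], source)
    return R, T
```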


Pose Error Minimization by Pose Graph Optimization
Visual feature tracking by a state estimation filter, such as the extended Kalman filter (EKF), and pose graph optimization (PGO) methods have been proposed to


