Both object recognition/scene description-focused and
navigation-focused papers were able to recognize only a very
limited set of objects. The number of object classes reported
varied between four and five, with the exception of one paper
that reported the ability to differentiate among "1,000 kinds of
objects" with no further details [54]. The most common objects
recognized by the proposed solutions were vehicles, bicycles,
pedestrians, and static obstacles [2], [8], [55], [56] in an outdoor scenario and doors, corridors, halls, and junctions [47],
[49] indoors. Depending on the implementation, performance
measures for differentiating between moving objects (vehicle, bicycle, and pedestrian) ranged from 0.82 to 0.95 in terms
of precision and from 0.69 to 0.96 in terms of recall. Measures
were similar for detecting and recognizing static obstacles
(precision = 0.9-0.93, recall = 0.79-0.95) [5], [52], [55].
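For readers less familiar with these metrics, the sketch below shows how precision and recall are computed from detection counts; the numbers are purely illustrative and are not taken from any of the reviewed papers.

```python
# Precision/recall from raw detection counts; the values below are
# illustrative placeholders, not results from the reviewed publications.
def precision(tp: int, fp: int) -> float:
    """Fraction of reported detections that are correct: TP / (TP + FP)."""
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    """Fraction of ground-truth objects that were found: TP / (TP + FN)."""
    return tp / (tp + fn)

# e.g., 82 correct detections, 18 false alarms, 30 missed objects:
print(precision(82, 18))  # 0.82
print(recall(82, 30))     # ~0.73
```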
Due to the specifics of the inclusion criteria of this review,
all selected publications used live camera feeds as input for the
proposed tools. The use of QR-coded locations was proposed
for indoor navigation [47], [49] in combination with motion
sensors and building maps [48]. Mocanu et al. suggested the
use of ultrasonic sensors, which work together with computer
vision algorithms to enable accurate distance estimation to
detected objects [55]. Distance estimation to obstacles is also feasible with stereo cameras [56], an approach described in two of the selected publications [54], [57]. A typical output of all these
systems was an audio signal, often communicated to the user
through bone-conduction headphones. Dasila et al. proposed
using special binaural sound techniques to enhance the user
experience and immerse the user into a 3D audio surround-based representation of the environment [58].
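To illustrate the stereo-camera route to distance estimation mentioned above, the following is a minimal sketch using OpenCV's block-matching stereo correspondence; the focal length, baseline, and file names are assumed placeholders, and the reviewed systems do not necessarily use this exact pipeline.

```python
import cv2

FOCAL_LENGTH_PX = 700.0  # assumed focal length in pixels (placeholder)
BASELINE_M = 0.1         # assumed distance between the two cameras in meters

# Rectified grayscale frames from the left and right cameras (placeholders).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block-matching stereo correspondence; OpenCV returns disparity in
# fixed-point format (scaled by 16), hence the division below.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(float) / 16.0

# Depth from similar triangles: Z = f * B / d, valid where disparity > 0.
valid = disparity > 0
depth_m = FOCAL_LENGTH_PX * BASELINE_M / disparity[valid]
print("Nearest obstacle at roughly %.2f m" % depth_m.min())
```

The resulting depth values could then drive the audio feedback layer, for example by modulating the pitch or repetition rate of the signal sent to the bone-conduction headphones.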
The options for processing the live video stream can be divided into local (data processed using the computational power of the smartphone) and remote (video transmitted to a more powerful device or cloud infrastructure for processing). The majority of the solutions included in this review took a local data-processing approach (ten papers), while the remaining five processed data remotely: two used a dedicated laptop computer [48], [52], and three transmitted the video stream to a cloud infrastructure for processing [54], [58], [59].
Local video processing was implemented on a variety of
Android and iOS smartphones. Solutions reached a performance level ranging from 5 frames per second (fps) [53] to
10 fps [2], [47], [52] in evaluation experiments (based on five
papers). The choice of algorithms used in data-processing pipelines varied greatly. The most popular choices, which also yielded the highest performance (in terms of fps), were the scale-invariant feature transform (SIFT) [60] and speeded up robust features (SURF) [61].
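As a sketch of how such local features are extracted in practice, the snippet below uses OpenCV's SIFT implementation (available in the main module from OpenCV 4.4 onward; SURF lives in the optional xfeatures2d contrib module). The input file name is a placeholder.

```python
import cv2

# Extract SIFT keypoints and descriptors from a single camera frame.
img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(img, None)

# Each descriptor is a 128-dimensional vector that is largely invariant to
# scale and rotation, ready for matching or downstream classification.
print(len(keypoints), descriptors.shape)
```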
Algorithms used in data-processing pipelines often rely on a classical two-step composition: the camera image is first processed using engineering-based computer vision techniques (e.g., feature and descriptor extraction), and a machine learning model (e.g., a classifier) then utilizes the output of the first step. Common choices for the first step were feature descriptors (e.g., histogram of oriented gradients (HOG)
[62], SIFT [60], and SURF [61]). For example, two publications
used HOG for classifying detected obstacles [2], [56], two used
SURF descriptors to recognize visual situations and construct
a guidance system [47], [49], and one proposed a system relying on features from accelerated segment test (FAST) [63]
descriptors for obstacle handling [55]. Support vector machines (SVMs) [2], [55], [56] and template matching [47], [49]
were identified as popular machine learning methods for the
second step. These methods are fast to execute even in embedded computing platforms, which makes them attractive in
low-computational-resource scenarios.
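To make the two-step composition concrete, here is a minimal sketch pairing a HOG descriptor (step one) with a linear SVM (step two), using scikit-image and scikit-learn; the training data is random placeholder material, and the reviewed papers do not necessarily use these libraries.

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

def extract_hog(image: np.ndarray) -> np.ndarray:
    # Step 1: engineering-based feature extraction (HOG descriptor).
    return hog(image, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

# Placeholder data: 64x64 grayscale patches with binary labels
# (e.g., pedestrian vs. background); a real system trains on labeled images.
patches = np.random.rand(100, 64, 64)
labels = np.random.randint(0, 2, size=100)

# Step 2: a machine learning model consumes the descriptors.
features = np.array([extract_hog(p) for p in patches])
classifier = LinearSVC().fit(features, labels)
print(classifier.predict(features[:5]))
```

Both steps are cheap at inference time, which is consistent with the embedded-platform appeal noted above.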
Another class of algorithms used in the identified publications relied on direct computer vision approaches without an adaptive machine learning component. For example, Garcia and Nahapetian [50] attempted to detect corridors using Canny edge detection followed by a Hough transform [64], and Elloumi et al. proposed a camera localization algorithm based on orthogonal vanishing points [51]. Both approaches were designed to work only in indoor environments, and the tests conducted by the authors were limited to a single environment, which makes the practical evaluation of the results rather preliminary.
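A generic version of the Canny-plus-Hough idea is sketched below; the thresholds are illustrative, and this is not Garcia and Nahapetian's actual implementation.

```python
import cv2
import numpy as np

# Detect long straight edges that may correspond to corridor boundaries.
img = cv2.imread("corridor.png", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(img, 50, 150)

# Probabilistic Hough transform over the edge map; parameters would need
# tuning for the specific camera and environment.
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=10)
print(0 if lines is None else len(lines), "line segments found")
```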
It is important to note that only a few studies (i.e., [52], [53],
and [59]) investigated computer vision methods based on deep
neural networks (DNNs), which perform feature extraction
and task modelling (e.g., classification or recognition) steps
in a single differentiable neural structure. The parameters of both steps are estimated jointly from the data by minimizing a single cost function, so that the features are optimized together with the task-modelling component. In most cases, neural networks are known to achieve better and more robust results than approaches that rely on the aforementioned engineering-based feature extractors [65]-[67].

Conclusions and Discussion
The goal of this paper was to provide strong evidence about
the development of the field and cover as many relevant R&D
projects as possible. Systematic reviews have been common practice in the medical domain for some time, and they are now making their way into more interdisciplinary and technology-oriented fields such as medical informatics, e-health, and other technology-related research domains. For example, the PRISMA systematic review method has been applied in materials science, optoelectronics, electronics, energy, IT and communications, neuroscience, robotics, and other fields. In contrast to other types of review techniques (e.g., traditional reviews and meta-analyses), the strengths of a systematic review lie in its transparent and reproducible methodology for including all available evidence and in its objective assessment of the validity and relevance of each included study. In this sense,
the PRISMA systematic review method, which we employed
in this paper, serves well in medical research and also in other
domains to obtain a representative overview of the latest
achievements in the field.
Let us briefly summarize and compare a few observations concerning different technological instrumentation
