Systems, Man & Cybernetics - April 2016 - 16

A key challenge in deploying context-driven health-care applications involves energy-efficient determination or inference of high-level context information from low-level sensor data streams.

The wide availability of smart health-care appliances and a variety of standalone and integrated sensor devices makes it increasingly easy to ubiquitously and continuously monitor an individual's health-related vital signals and her activity behavior and to integrate such medical and activity data into health-care information systems. We are already witnessing early commercial activity in this space, centered on remote monitoring of elderly individuals and chronically ill patients within smart assisted-living homes. A combination of body-worn medical and nonmedical sensors (e.g., sensors to monitor blood oxygenation or accelerometers to monitor movements) and in situ sensors (e.g., thermal and motion detectors) is used to continuously monitor and automatically determine an individual's context in such smart environments. Broadly speaking, context here refers to a variety of dynamically changing states, related either to an individual's specific activities (e.g., walking versus sleeping) or biomedical conditions (e.g., elevated blood pressure, shortness of breath, or arrhythmia), or to surrounding environmental conditions (e.g., atmospheric ozone levels or ambient temperature). In many health- and wellness-related applications, such context is the critical enabler of various capabilities, such as alerting a first responder if the individual is judged to be sleeping for an abnormal period of time or flagging a potential health risk by analyzing wellness data to detect shortness of breath after everyday physical activities.
In many scenarios of practical interest, the data streams are generated by a variety of battery-operated standalone or embedded sensors (e.g., accelerometers on a smartphone), and the act of transmitting the sensor streams to a backend server for context extraction can impose a significant energy burden. Accordingly, a crucial technical challenge in the area of sensor-based pervasive health-care applications centers on the question of how one can efficiently and reliably convert streams of low-level sensor-generated data into high-level abstractions of context. Previous work in the broader field of sensor-driven context inferencing has largely assumed that the type and amount of low-level sensor data available to a specific application are invariant. This prior work has therefore focused on how to 1) automatically map low-level sensor data to appropriate abstractions of context states and 2) empirically establish whether the accuracy of the inferred context is sufficient to enable automated adaptation [4].
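As a minimal illustration of step 1), mapping low-level sensor data to context states, the following sketch classifies a window of raw accelerometer samples into coarse activity states. The threshold values and state names are illustrative assumptions for this sketch, not values taken from the article or from any cited system.

```python
import math

# Hypothetical thresholds (illustrative assumptions): the standard
# deviation of the acceleration magnitude over a window loosely
# separates rest, walking, and more vigorous movement.
REST_MAX_STD = 0.05   # in g; assumed cutoff for "resting"
WALK_MAX_STD = 0.60   # in g; assumed cutoff for "walking"

def magnitude(sample):
    """Euclidean norm of one (x, y, z) accelerometer sample, in g."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def infer_activity(window):
    """Map a window of raw (x, y, z) samples to a coarse context state."""
    mags = [magnitude(s) for s in window]
    mean = sum(mags) / len(mags)
    std = math.sqrt(sum((m - mean) ** 2 for m in mags) / len(mags))
    if std < REST_MAX_STD:
        return "resting"
    if std < WALK_MAX_STD:
        return "walking"
    return "vigorous"

# A stationary sensor reads roughly 1 g (gravity only) with almost
# no variation, so the window below maps to the "resting" state.
still_window = [(0.0, 0.0, 1.0)] * 50
print(infer_activity(still_window))  # prints "resting"
```

The point of the sketch is that the feature extraction (here, a windowed standard deviation) can run on the device itself, so only the inferred context state, rather than the raw sample stream, needs to be transmitted.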
In this article, we take a somewhat contrarian view and ask the question: How can we support the varying context requirements of multiple emerging context-dependent health applications while simultaneously trying to minimize the energy overhead of the sensor data collection process? In contrast, our previous work dealt with a single context-dependent health application [12]. Our work is influenced by the observation that the landscape of remote health/wellness monitoring applications is changing from the earlier stove-piped model (where each application was customized for an explicit set of sensor devices) to a more fungible, standards-based model, where the underlying sensors are viewed as common, shareable resources that are simultaneously utilized by multiple applications:
◆ Smart assisted-living environments are gradually being equipped with a variety of different networked sensors (e.g., cameras, motion sensors, or light sensors) capable of programmatic data retrieval and control.
◆ Sensor-based health monitoring applications are growing, both in number and in the variety of medical contexts being monitored. In large part, the explosion of apps on the Apple App Store and Google Play is responsible for this recent phenomenon; prominent examples of health-care-related applications include Stress Check, Stress Doctor, Instant Heart Rate (http://www.azumio.com/), SmartRunner (http://www.smartrunner.com/pages/), etc.

As an illustration, consider a remote context monitoring scenario (shown in Figure 1) in a smart assisted-living environment in which an elderly person resides. The smart home may be equipped with many sensors [light, humidity, electrocardiogram (ECG), electromyography, etc.], some of which may be body-worn while others may be embedded in the environment. A variety of applications and stakeholders (e.g., fall monitoring by a caregiver, wellness activity monitoring by a doctor, or vital sign monitoring by a nurse) need to access this low-level sensed information to abstract high-level context (both physiological and activity) about the resident. An important observation is that a specific application's context can be satisfied by different possible combinations of sensor data types. For example, the fall-detection application may utilize data either from multiple video cameras, from a set of body-worn accelerometers and wall-mounted motion sensors, from a set of audio sensors, or from some arbitrary combination of these.

The preceding example motivates the need for a matchmaking software infrastructure that mediates between the context-driven health and wellness applications and the set of available sensors in a way that
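The matchmaking idea behind such an infrastructure can be sketched as follows. Each application exposes the alternative sensor combinations that can satisfy its context requirement, and the matchmaker picks one alternative per application so that the total energy of the shared set of activated sensors is minimized. All sensor names, energy costs, and requirement sets below are hypothetical values invented for this sketch, not figures from the article.

```python
# Hypothetical per-sensor energy costs (mW); illustrative assumptions only.
SENSOR_COST = {
    "video_cam": 900.0,
    "accelerometer": 15.0,
    "motion_sensor": 40.0,
    "audio": 120.0,
}

# Alternative sensor sets that could satisfy a fall-detection context
# requirement, echoing the example in the text (values are assumed).
FALL_DETECTION_ALTERNATIVES = [
    {"video_cam"},
    {"accelerometer", "motion_sensor"},
    {"audio"},
]

def matchmake(apps):
    """Choose one sensor-set alternative per application so that the
    total energy of the union of activated sensors is minimized.
    Sensors chosen for one application are shared with the others."""
    best_set, best_cost = None, float("inf")

    def search(i, active):
        nonlocal best_set, best_cost
        if i == len(apps):
            cost = sum(SENSOR_COST[s] for s in active)
            if cost < best_cost:
                best_set, best_cost = set(active), cost
            return
        # Brute-force over each alternative of application i; fine for
        # a handful of applications, exponential in general.
        for alt in apps[i]:
            search(i + 1, active | alt)

    search(0, set())
    return best_set, best_cost

sensors, cost = matchmake([FALL_DETECTION_ALTERNATIVES])
print(sensors, cost)  # {'accelerometer', 'motion_sensor'} 55.0
```

Because the activated sensors are modeled as shared resources, adding a second application whose requirement can be met by the accelerometer alone leaves the total cost unchanged; this is exactly the fungible, multi-application sharing the article argues for. A deployable matchmaker would replace the brute-force search with a heuristic or integer-programming formulation.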