
really speak like this... yet. However,
the recently established field of XAI
(Explainable AI) is giving deep learning
systems a voice with which they can
convey the reasons behind their decisions
and actions in a way that is natural
and easy for humans to understand.
The future of autonomous vehicles
means deployment for very long-term
missions in unexplored locations and
environments, from the deep sea to
outer space. When they come across
an anomaly, they need to know that
what they're observing is new and
interesting, and either make the decision
to get closer or follow it to learn more.
But creating a curious machine is not
easy. To be curious, a machine must
discern what's out of the ordinary; in
other words, it must perform anomaly
detection. Anomaly detection is
complicated. It requires the machine
to perceive something (detection) and
know what it is or isn't (classification).
This applies to a stationary object, like
an undersea mine, or a moving one,
or even groups of objects moving
together, like marine life, which can
require observation over days or weeks
to gather enough data to determine
what normal behavior is (that is, a
baseline in a pattern of life analysis).
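To make the pattern-of-life idea concrete, here is a minimal Python sketch of how a vehicle might accumulate observations over days or weeks and then flag a value that falls far outside the learned baseline. The class name, threshold, and sample counts are illustrative assumptions, not drawn from any fielded system.

    import math

    class PatternOfLifeBaseline:
        """Learns what 'normal' looks like from weeks of observations,
        then flags values that fall far outside that baseline."""

        def __init__(self, z_threshold=3.0, min_samples=14):
            self.values = []                 # history gathered over days or weeks
            self.z_threshold = z_threshold   # how far from normal counts as anomalous
            self.min_samples = min_samples   # don't judge until enough history exists

        def update(self, value):
            # Record one observation, e.g., today's count of contacts in a sector.
            self.values.append(value)

        def is_anomalous(self, value):
            if len(self.values) < self.min_samples:
                return False                 # not enough history to call anything unusual
            mean = sum(self.values) / len(self.values)
            var = sum((v - mean) ** 2 for v in self.values) / len(self.values)
            std = math.sqrt(var) or 1e-9     # avoid dividing by zero on a flat history
            return abs(value - mean) / std > self.z_threshold

    # Usage: two weeks of ordinary observations, then a surprising one.
    baseline = PatternOfLifeBaseline()
    for daily_count in [12, 9, 11, 10, 13, 12, 8, 11, 10, 12, 9, 11, 10, 13]:
        baseline.update(daily_count)
    print(baseline.is_anomalous(45))         # True: far outside the normal pattern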
How Do We Construct Curiosity?
To be a curious member of a human-machine
team, at-sea vehicles require
more advanced R&D in many fields:
perception, AI algorithms, world
modeling, navigation, anomaly detection,
human-understandable communication,
and more. Let's examine a few.
Perception: Signal Processing
and Data Interpretation. Visual
sensors (EO/IR) become less effective
underwater, so their information
must be fused with acoustic sensors
(sonar, Doppler, vibrometers).
Advances in onboard image
processing are needed to remove
artifacts like " marine snow " -bright
spots in underwater images from the
reflection of particles suspended in
the water-so the images are clear
enough to support decision-making.
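As a simple illustration of this kind of artifact removal, the sketch below uses a median filter, a standard way to suppress small bright speckle, to knock down marine-snow-like spots. Operational pipelines are far more sophisticated; the OpenCV calls and file name here are assumptions made for the example.

    import cv2

    def suppress_marine_snow(gray_image, kernel_size=5):
        # Median filtering removes small bright outliers (speckle) while keeping
        # larger structures, such as the seafloor or a mine-like object, intact.
        return cv2.medianBlur(gray_image, kernel_size)

    # Usage: clean an underwater frame before handing it to a detector.
    frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)   # hypothetical frame
    if frame is not None:
        cleaned = suppress_marine_snow(frame)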
[Image: AutoTRap Onboard™ automated target recognition and sonar image processing being tested on Teledyne Gavia platforms for mine detection. Credit: Charles River Analytics]
All sensors must become smaller
and require less energy to support
longer missions to fully explore an
area or track species' patterns of life.
Onboard Intelligence: Autonomy,
Planning, Situational and Self-Awareness.
We need a big shift in AI
and machine learning to incorporate
true situational awareness; that is,
awareness of the surrounding external
and internal environment. Artificial
" intelligence " doesn't guarantee mission
success-how AI performs depends
on its knowledge of the environment
and everything in it, including itself.
To accurately identify whether
something is worth investigating in an
image of the seafloor, an AI must also
know the seafloor type, the state of the
water column, and what is usual for that
environment. It also needs to be aware
of its own performance; if something is
wrong with a sensor, it must discount
that sensor's observations, the same way
we would if we were wearing glasses
with a smudge on the lens. At a higher
level, it needs to know if it has enough
experience in a certain environment,
or if it has previously performed well
or poorly in that environment.
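One way to picture this self-awareness is a fusion rule that weights each sensor's estimate by its current health, so a degraded sensor is largely ignored, much as we discount what we see through a smudged lens. The sensors, health scores, and weighting rule below are illustrative assumptions only.

    def fuse_estimates(readings):
        """readings: list of (value, health) pairs with health in [0, 1].
        Returns a health-weighted average, so a suspect sensor contributes
        little to the fused picture."""
        total_weight = sum(health for _, health in readings)
        if total_weight == 0:
            return None                          # no trustworthy observations at all
        return sum(value * health for value, health in readings) / total_weight

    # Usage: the sonar's self-check reports degraded health (0.2), so its range
    # estimate is heavily discounted relative to a healthy second source.
    fused_range = fuse_estimates([
        (41.0, 0.2),   # sonar, suspect
        (38.5, 0.9),   # second estimate, healthy
    ])
    print(round(fused_range, 2))                 # 38.95, pulled toward the healthy sensor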
All this information must feed into a
decision-making process that supports
curious behaviors. Decision-making
software, that is, the combination
of AI algorithms and knowledge
models that can make a machine
exhibit curiosity, must be able to run
on small, lightweight, low-power
processors onboard the vehicle.
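A curious behavior selector does not have to be heavyweight. The sketch below scores each candidate behavior by expected novelty minus an energy penalty and picks the best, the kind of rule that fits comfortably on a small onboard processor. The behaviors, scores, and weighting are made-up values for illustration.

    def choose_behavior(candidates, energy_weight=0.5):
        """candidates: list of (name, novelty, energy_cost) tuples.
        Picks the behavior with the best novelty-versus-cost trade-off."""
        return max(candidates, key=lambda c: c[1] - energy_weight * c[2])[0]

    # Usage: an unexpected sonar contact makes investigating it attractive,
    # but loitering nearby wins because it buys novelty at low energy cost.
    plan = choose_behavior([
        ("continue_survey",    0.1, 1.0),
        ("approach_contact",   0.8, 2.0),
        ("loiter_and_observe", 0.6, 0.5),
    ])
    print(plan)   # loiter_and_observe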
Anomaly Detection: Target
Detection, Classification, and World
Modeling. Advances in target detection
and classification on their own can't
determine whether an object is unusual.
That judgment requires a model of the
environment, and an understanding
of what is normal within it.
Currently, baseline parameters
are mostly provided by humans, but
by replacing these hardcoded rules
with complex models, a machine can
know, for example, how water and
sediment tend to mix in a certain
underwater environment, and how
the information being returned
by the sonar sensor should be
interpreted differently as a result.
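A toy version of that shift from hardcoded rules to an environment model: the same sonar return can be unremarkable over one seafloor type and striking over another, so the anomaly score is conditioned on the environment. The seafloor classes and statistics below are invented for the example.

    WORLD_MODEL = {
        # seafloor type: (expected backscatter mean, standard deviation)
        "silty_sediment": (20.0, 4.0),
        "rocky_outcrop":  (45.0, 8.0),
    }

    def anomaly_score(backscatter, seafloor_type):
        # Distance from what the world model says is normal for this environment.
        mean, std = WORLD_MODEL[seafloor_type]
        return abs(backscatter - mean) / std

    # Usage: the same return of 42 is ordinary over rock, but worth a closer
    # look over silty sediment.
    print(round(anomaly_score(42.0, "rocky_outcrop"), 2))    # 0.38
    print(round(anomaly_score(42.0, "silty_sediment"), 2))   # 5.5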
Development of a world model
is in progress for the topography
of the seafloor. Companies like
Terradepth are using a fleet of
autonomous submersibles to generate
a set of detailed bathymetric maps
so we can "Google Earth" above
and below the water's surface.
Human-Machine Collaboration and
Communication. Current methods
of docking and transferring data
miss opportunities for discovery.
By transferring (or even live-
