Instrumentation & Measurement Magazine 25-9 - 25

related to the current input or to the memorized sequences. The
developed architecture can memorize both sequences and
sub-sequences of events: once correctly classified, each
presented element activates not only a neuron in the ring just
outside the previously active one, but also a neuron in the
first ring of the context layer, which can become the generator
of a subsequence.
Fig. 4b illustrates an example of sequence and subsequence
learning. Assume the sequence "A-B-C-D" of four objects,
classified as A, B, C and D, is to be learned. The neuron
corresponding to the first classified element "A" is activated
in the first ring of the context layer (Fig. 4b, top-left).
This neuron diffuses its activity to all neurons in the
second ring, where, at the second step, a conditioning signal
coming from the resonant neurons of the classification layer
activates the neuron corresponding to object "B." The sequence
A-B thus starts to emerge (Fig. 4b, top-right). Subsequently,
the conditioning signal for element "B" also activates the
corresponding neuron in the first ring, giving rise to a new
sequence (indeed, a subsequence). As a result (Fig. 4b,
bottom-left), two traces are now active: "A-B" and "B." When
element "C" from the classification layer stimulates the neuron
corresponding to object "C" in the context layer, the traces
"A-B-C" and "B-C" are generated, and the neuron corresponding
to object "C" also becomes active in the first ring (thus
starting a new subsequence with "C"). In Fig. 4b, bottom-right,
we can see the evolution of the complete sequence trace
("A-B-C-D"), as well as all the other subsequences
("B-C-D," "C-D," "D").
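The extension-plus-restart rule described above can be condensed into a few lines of code. This is a minimal illustrative sketch of the mechanism, not the paper's spiking implementation: the ring-by-ring neuron activations are abstracted into lists of labels, and the function name is our own.

```python
# Minimal sketch of the ring-structured context layer: each active
# trace occupies successive rings; every newly classified element
# extends all existing traces by one ring (the conditioning signal)
# and also starts a new (sub)sequence trace in the first ring.

def present(traces, element):
    """Extend every active trace with the new element and open a new trace."""
    extended = [trace + [element] for trace in traces]  # conditioning signal
    extended.append([element])                          # first-ring activation
    return extended

traces = []
for element in "ABCD":
    traces = present(traces, element)

# After presenting A, B, C, D the context layer holds the full
# sequence together with all of its suffix subsequences.
print(["-".join(t) for t in traces])
# → ['A-B-C-D', 'B-C-D', 'C-D', 'D']
```

The four resulting traces match the four traces of Fig. 4b, bottom-right.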
The developed computational model was embedded as
a neural controller in a roving robot (see Fig. 5), expected to
navigate in scenarios inspired by biological experiments performed
with honeybees [15]. The sensory input is provided
by the on-board omnidirectional camera (Fig. 5a), and the
features of the detected objects of interest (i.e.,
landmarks) are processed through the LSM to control the robot's
navigation while negotiating a maze-like scenario. In the proposed
experiment, the robot's behavioral response consists of a
turning action followed by a forward motion to reach a new
decision point in the maze. During the learning phase, the robot
memorizes two sequences of actions corresponding to two
different routes that ultimately yield reward signals of
different amplitudes (R=1 and R=3 in the reported experiment,
shown in Fig. 5b).
Each visually detected landmark (i.e., a symbol shown
on one of the monitors positioned within the arena) is considered
part of a sequence to be properly classified and stored when a
rewarding signal is acquired. After learning, during the recall
phase, the robot can internally generate the expected sequence
even in the presence of noisy or missing input signals. In the
experiment illustrated in Fig. 5, at the beginning of the testing
phase, the monitors show the initial landmarks of the two learned
sequences. Starting from this condition, the robot internally
simulates both sequences and then chooses (and follows) the path
associated with the largest overall reward.
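The recall-phase decision can be sketched as follows. The two reward amplitudes (R=1 and R=3) come from the experiment in Fig. 5, but the landmark labels and the replay/selection logic shown here are illustrative assumptions of ours, not the paper's spiking circuitry.

```python
# Hedged sketch of the recall-phase choice: the robot sees the initial
# landmark of each learned route, internally replays the candidate
# sequences, and follows the one with the largest stored reward.
learned_routes = {
    ("L1", "L2", "L3"): 1,   # hypothetical landmark sequence, reward R=1
    ("L4", "L5", "L6"): 3,   # hypothetical landmark sequence, reward R=3
}

def choose_route(initial_landmarks, routes):
    """Replay every route whose first landmark is currently visible
    and return the one with the largest associated reward."""
    candidates = {r: R for r, R in routes.items() if r[0] in initial_landmarks}
    return max(candidates, key=candidates.get)

best = choose_route({"L1", "L4"}, learned_routes)
print(best)  # → ('L4', 'L5', 'L6'), the route with reward R=3
```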
December 2022
Ready for Outdoor Implementation
The insect brain computational model built in recent years was
demonstrated through indoor experiments. It is now ready to
be deployed in real-life conditions, exploiting the latest robust
implementations of biomimetic robots. In the last year, a
quadruped platform, designed and built upon the MIT Mini Cheetah
software-hardware project [16], was duly modified
in our laboratory to host the sensory system needed for autonomous
outdoor navigation and inspection (Fig. 6). This platform
was used as a prototype for monitoring landslide terrain and
releasing environmental sensors. The additional hardware consisted
of an external inertial measurement unit, a GPS receiver, and a PC
able to receive the target position and, as a function of the current
robot position, execute a series of segment-wise trajectories
following an optimized off-line routine. The addition of
the insect brain-inspired architecture will provide on-board
decision-making capabilities and adaptive terrain locomotion
with proprioceptive feedback for improved autonomy.
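The segment-wise trajectory execution mentioned above can be sketched in a few lines. The function names, the number of segments, and the planar-coordinate assumption are ours; the actual off-line routine on the platform is not published in this form.

```python
import math

# Illustrative sketch of segment-wise navigation: the straight line
# from the current position fix to the target is split into waypoints,
# each reached by a turn-then-walk command.

def segment_path(current, target, n_segments=4):
    """Split the straight line from current to target into waypoints."""
    (x0, y0), (x1, y1) = current, target
    return [(x0 + (x1 - x0) * k / n_segments,
             y0 + (y1 - y0) * k / n_segments)
            for k in range(1, n_segments + 1)]

def heading_deg(a, b):
    """Bearing in degrees from point a to point b in a local planar frame."""
    return math.degrees(math.atan2(b[1] - a[1], b[0] - a[0]))

waypoints = segment_path((0.0, 0.0), (8.0, 6.0))
print(waypoints)  # → [(2.0, 1.5), (4.0, 3.0), (6.0, 4.5), (8.0, 6.0)]
print(round(heading_deg((0.0, 0.0), waypoints[0]), 1))  # → 36.9
```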
Although this quadruped platform is equipped with a
microprocessor-based board on which bio-inspired control
strategies (e.g., LSM-based networks) can be implemented,
other solutions for neuromorphic computing are available in
the literature. The adoption of FPGA-based solutions, as
well as of other hardware devices based on photonics or
specifically designed integrated circuits, can significantly
improve performance when large networks are implemented [17].
The address-event representation (AER) protocol is one of the
methods adopted to speed up communication by reducing data
transfer: only the addresses of the neurons where a spike
event occurs are transmitted. The aim is to reduce energy
consumption, a key requirement in autonomous robot applications.
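The data-reduction idea behind AER can be illustrated with a toy encoder. This is only a conceptual sketch: real AER links use asynchronous handshaking and hardware timestamps, which are omitted here.

```python
# Illustrative address-event representation (AER): instead of
# transmitting the full activity vector at every time step, only the
# (time, address) pairs of neurons that spiked are sent.

def aer_encode(spikes):
    """Return the (timestep, neuron_address) events of a binary spike raster."""
    return [(t, n)
            for t, frame in enumerate(spikes)
            for n, s in enumerate(frame) if s]

# 4 time steps x 8 neurons, sparse activity
raster = [
    [0, 0, 1, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 0],
    [1, 0, 0, 0, 0, 1, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 1],
]
events = aer_encode(raster)
print(events)  # → [(0, 2), (2, 0), (2, 5), (3, 7)]
print(len(events), "events instead of", 4 * 8, "transmitted values")
```

With sparse spiking activity, the event list stays far smaller than the dense raster, which is where the energy saving comes from.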
Conclusions and Future Perspectives
Insect-inspired neural architectures were briefly presented
to act as spiking neural controllers in bioinspired robotics.
The unique features of reservoir computing structures, combined
with the capability of directly processing time-dependent signals,
make these structures very similar to the simplest insect
brain networks. An additional feature that makes these structures
interesting for future developments is the possibility to embed
several read-out maps into a unique dynamic structure (i.e., the
neuron lattice), extracting from the same lattice different
mappings that can be associated with parallel, concurrently
executed tasks, reducing the computational complexity
according to the neural reuse paradigm. In particular,
the dynamic lattice should receive information from different
time-dependent sensory modalities (as usually found in the
insect brain), and collectively and dynamically process them to
constitute a unique memory and knowledge reservoir. This is
ready to be exploited through different read-out maps, each trained
to match a different environment-tailored behavioral response.
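The neural-reuse idea of several read-out maps sharing one lattice can be sketched with a toy reservoir. The reservoir size, input signal, and the two tasks below are illustrative assumptions, and the linear least-squares training stands in for whatever read-out learning rule is actually used.

```python
import numpy as np

# One shared dynamic lattice (a toy random recurrent reservoir) is
# driven by a single input stream; two independent linear read-out
# maps are then trained on the very same state trajectory to solve
# two parallel tasks.
rng = np.random.default_rng(0)
n_neurons, n_steps = 50, 200
W = rng.normal(0.0, 1.0 / np.sqrt(n_neurons), (n_neurons, n_neurons))
w_in = rng.normal(size=n_neurons)
u = np.sin(0.1 * np.arange(n_steps))          # shared input signal

states = np.zeros((n_steps, n_neurons))
x = np.zeros(n_neurons)
for t in range(n_steps):
    x = np.tanh(W @ x + w_in * u[t])          # shared lattice dynamics
    states[t] = x

# Two task-specific read-out maps trained on the same states
y1 = u                                        # task 1: reconstruct the input
y2 = np.sin(0.1 * (np.arange(n_steps) - 5))   # task 2: a delayed copy
r1, *_ = np.linalg.lstsq(states, y1, rcond=None)
r2, *_ = np.linalg.lstsq(states, y2, rcond=None)

rmse1 = float(np.sqrt(np.mean((states @ r1 - y1) ** 2)))
rmse2 = float(np.sqrt(np.mean((states @ r2 - y2) ** 2)))
print(f"read-out 1 RMSE: {rmse1:.3f}, read-out 2 RMSE: {rmse2:.3f}")
```

The lattice is computed once; only the read-out vectors differ per task, which is the computational saving the neural reuse paradigm points to.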
The two examples reported in this paper, although implemented
separately, could be fused, with a unique dynamic lattice
serving both tasks concurrently. This opens the
way to extremely interesting applications of spiking neural