Figure 3. Quad-engine core segmentation network data flow. An input frame arrives from external memory (the ISP real-time client) over AXI into the NeuPro-M AI processor's common subsystem with its shared L2 memory (MSS); the frame is divided, with overlapping areas, among NPM engines #0 through #3, each with its own L1 memory, for parallel processing; the output is reassembled in the shared subsystem memory. (Image: CEVA)
subsystem; and the NPM engine. The processor can include anywhere from one to eight engines, selected to meet the needs of a particular application, so its processing power scales with the number of engines. "That's how you get more and more horsepower," said Abraham.
The NPM common subsystem is in constant communication with the NPM engine. That channel is monitored to make sure it does not become a bottleneck, so that data keeps flowing into the system. The inferencing AI runs on two datasets: the data itself, perhaps an image, and the weights, which are applied to the data in order to perform the inferencing. The common subsystem keeps the channel open by applying compression to both the data and the weights.
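As a rough illustration of why that compression matters, the sketch below (a generic example; the article does not describe CEVA's actual scheme, and every name and size here is illustrative) quantizes a hypothetically pruned weight tensor to 8 bits and run-length encodes it before it would cross the memory channel, then compares the traffic with the uncompressed float32 tensor.

```python
# Illustrative only: simple int8 quantization plus run-length encoding of a
# pruned weight tensor, standing in for whatever compression the NPM common
# subsystem applies to data and weights before they cross the memory channel.
import numpy as np

def quantize_int8(w: np.ndarray) -> tuple[np.ndarray, float]:
    """Map float32 weights to int8 with a single per-tensor scale."""
    scale = float(np.abs(w).max()) / 127.0
    if scale == 0.0:
        scale = 1.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def run_length_encode(q: np.ndarray) -> list[tuple[int, int]]:
    """Encode the flattened tensor as (value, run length) pairs; this pays off
    when many weights are exactly zero, as in a pruned network."""
    flat = q.ravel()
    runs, run_val, run_len = [], int(flat[0]), 1
    for v in flat[1:]:
        if int(v) == run_val:
            run_len += 1
        else:
            runs.append((run_val, run_len))
            run_val, run_len = int(v), 1
    runs.append((run_val, run_len))
    return runs

rng = np.random.default_rng(0)
weights = rng.standard_normal((256, 256)).astype(np.float32)
weights[np.abs(weights) < 1.0] = 0.0       # pretend the network was pruned

q, scale = quantize_int8(weights)
runs = run_length_encode(q)
raw_bytes = weights.nbytes                 # float32 traffic, uncompressed
packed_bytes = len(runs) * 3               # ~1-byte value + 2-byte run length (illustrative packing)
print(f"uncompressed: {raw_bytes} B, compressed: {packed_bytes} B, "
      f"ratio ~{raw_bytes / packed_bytes:.1f}x")
```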
Parallel processing can be implemented both by using multiple engines and by using the coprocessors within each engine; every engine contains five coprocessors and a shared internal memory.
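For readers who want that hierarchy spelled out, here is a small descriptive sketch in Python. It assumes nothing about CEVA's tooling; the class names and memory sizes are illustrative, while the one-to-eight engine range and the five coprocessors per engine come from the article.

```python
# Descriptive model of the structure described in the text: a common subsystem
# with shared L2 memory serving one to eight engines, each engine holding five
# coprocessors and its own shared internal (L1) memory. Sizes are placeholders.
from dataclasses import dataclass, field

COPROCESSORS_PER_ENGINE = 5                # per the article

@dataclass
class Engine:
    index: int
    l1_mem_kb: int = 512                   # illustrative, not a CEVA figure
    coprocessors: int = COPROCESSORS_PER_ENGINE

@dataclass
class NeuProM:
    n_engines: int                         # configurable from one to eight
    l2_shared_mem_mb: int = 2              # illustrative, not a CEVA figure
    engines: list = field(default_factory=list)

    def __post_init__(self):
        if not 1 <= self.n_engines <= 8:
            raise ValueError("the processor can include one to eight engines")
        self.engines = [Engine(i) for i in range(self.n_engines)]

    def parallel_lanes(self) -> int:
        """Total coprocessor lanes across both levels of parallelism."""
        return sum(e.coprocessors for e in self.engines)

npm = NeuProM(n_engines=4)                 # the four-engine configuration of Figure 3
print(npm.parallel_lanes())                # 20 lanes: 4 engines x 5 coprocessors
```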
Example - Controlling a Vehicle with a Four-Engine NPM
Figure 3 illustrates a simple automotive
application of parallel processing. The left
side of the figure shows an image of the
road, which is captured by a front-facing
camera. A processor within the vehicle
blocks out the opposing lane to simplify
the computations needed to keep the vehicle
centered on its side of the road and
stores the image in memory. The stored
image is input from the vehicle's memory
to the NPM common subsystem, which in
this example is serving four engines. The
software then decides what the use case is
- what is needed - and how to divide the
image in order to attain maximum performance
with minimum power (high utilization)
for the desired function. In this
case, the NPM divides the image into four
parts, with some overlap, and each part is
sent to a different engine. The AI inferencing
is then run on each of the four segments
of the road. The four segments are then stitched back together in the subsystem memory, from which the result is output to the perception layer elsewhere in the SoC to perform the desired tasks.
This example illustrates the two levels of parallel processing: across the four engines, each working on a different segment of the image, and within each engine, where the computations are shared among the five internal coprocessors.
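That data flow can be sketched in ordinary Python standing in for the NPM toolchain, whose API the article does not show: the frame is cut into four overlapping vertical strips, each strip is processed by one of four workers playing the role of the engines, and the per-strip results are stitched back into a full-frame output. The threshold "network," the overlap width, and the image size are placeholders.

```python
# Sketch of the Figure 3 flow: split with overlap, run inference per "engine",
# stitch the owned (non-overlapping) columns back into one output map.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

N_ENGINES = 4
OVERLAP = 8                                  # columns shared at each seam ("overlap area")

def split_with_overlap(frame, n, overlap):
    """Cut the frame into n vertical strips that share `overlap` columns at the seams.
    Returns (owned_start, strip_start, strip) triples plus the owned width."""
    width = frame.shape[1]
    step = width // n
    strips = []
    for i in range(n):
        lo = max(0, i * step - overlap)
        hi = min(width, (i + 1) * step + overlap)
        strips.append((i * step, lo, frame[:, lo:hi]))
    return strips, step

def run_inference(strip):
    """Stand-in for the per-engine segmentation network: just a brightness threshold."""
    return (strip > 128).astype(np.uint8)

def stitch(frame_shape, pieces, step):
    """Write each engine's owned columns back into a full-size output map,
    discarding the overlap that was only needed for context at the seams."""
    out = np.zeros(frame_shape, dtype=np.uint8)
    for owned_start, strip_start, seg in pieces:
        offset = owned_start - strip_start
        out[:, owned_start:owned_start + step] = seg[:, offset:offset + step]
    return out

frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)   # stand-in road image
strips, step = split_with_overlap(frame, N_ENGINES, OVERLAP)
with ThreadPoolExecutor(max_workers=N_ENGINES) as pool:              # four "engines" in parallel
    segs = list(pool.map(run_inference, [s[2] for s in strips]))
pieces = [(owned, lo, seg) for (owned, lo, _), seg in zip(strips, segs)]
segmentation = stitch(frame.shape, pieces, step)                     # handed to the perception layer
```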
Optimization Via Software
AI functions chiefly through convolution, a mathematical operation on two functions that produces a third function expressing how the shape of one is modified by the other. The mathematician Shmuel Winograd devised a method of performing convolution in half the usual number of steps. CEVA implemented this theoretical idea in its processors to achieve the same precision as normal convolution, but with nearly 2x acceleration - a gain in performance along with a reduction in power. This can be done in each of the five coprocessors within the engine.
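The "half the usual number of steps" claim can be made concrete with the standard Winograd F(2,3) transform; this is the textbook construction, not necessarily CEVA's exact implementation. Two outputs of a 3-tap filter cost four multiplications instead of six, and the filter-side factors can be precomputed once per layer.

```python
# Winograd F(2,3): two outputs of a 3-tap filter with 4 multiplies instead of 6.
import numpy as np

def conv_direct(d, g):
    """Two outputs of a sliding 3-tap filter (CNN-style, no kernel flip): 6 multiplies."""
    return np.array([d[0]*g[0] + d[1]*g[1] + d[2]*g[2],
                     d[1]*g[0] + d[2]*g[1] + d[3]*g[2]])

def conv_winograd_f23(d, g):
    """Same two outputs with only 4 multiplies; the g-dependent factors can be
    computed once per layer, so they add no per-tile cost."""
    m1 = (d[0] - d[2]) * g[0]
    m2 = (d[1] + d[2]) * (g[0] + g[1] + g[2]) / 2
    m3 = (d[2] - d[1]) * (g[0] - g[1] + g[2]) / 2
    m4 = (d[1] - d[3]) * g[2]
    return np.array([m1 + m2 + m3, m2 - m3 - m4])

d = np.array([1.0, 2.0, 3.0, 4.0])           # one input tile of four samples
g = np.array([0.5, -1.0, 0.25])              # three filter taps
print(conv_direct(d, g))                     # [-0.75 -1.  ]
print(conv_winograd_f23(d, g))               # identical result with fewer multiplies
```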
Another trick is to operate differently on different data types, depending on which is optimal for a particular application. For example, simultaneous localization and mapping (SLAM) requires very high accuracy, so floating-point arithmetic must be used. For other applications, fixed-point arithmetic with a set number of bits is perfectly adequate. In this way the automobile manufacturer can choose the computational method that works best for each function within the vehicle.
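To illustrate that trade-off with made-up numbers (not CEVA benchmarks), the sketch below quantizes two kinds of data to an 8-bit fixed-point grid: wide-range quantities of the sort SLAM works with lose meters of precision, while bounded, normalized activations survive with negligible error.

```python
# Fixed point vs. floating point: error introduced by an 8-bit fixed-point grid
# for a wide-range quantity (SLAM-like landmark coordinates) versus a bounded one.
import numpy as np

def to_fixed_point(x, n_bits, x_max):
    """Quantize to a signed fixed-point grid covering [-x_max, x_max]."""
    levels = 2 ** (n_bits - 1) - 1
    scale = x_max / levels
    return np.clip(np.round(x / scale), -levels, levels) * scale

rng = np.random.default_rng(1)
landmarks = rng.uniform(-500.0, 500.0, size=10_000)   # map coordinates in meters (illustrative)
activations = np.tanh(rng.standard_normal(10_000))    # normalized values in [-1, 1]

for name, x, x_max in [("landmarks [m]", landmarks, 500.0),
                       ("activations  ", activations, 1.0)]:
    err_fixed = np.abs(x - to_fixed_point(x, 8, x_max)).max()
    err_float = np.abs(x - x.astype(np.float32)).max()
    print(f"{name}: max error, 8-bit fixed point = {err_fixed:.4f}, float32 = {err_float:.2e}")
```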
By using both software manipulation and hardware optimization, you can gain significant acceleration - up to 16x with NeuPro-M, according to Abraham.
Summing it Up
This has been an overview of the
internal functioning of a particular AI
processor as it processes the data from
a variety of sensors - radar, lidar,
sonar, cameras - and makes decisions.
The NPM is a heterogeneous processor: it can operate on different data types and optimize its efficiency, as measured by TOPS/watt, by using two levels of parallel processing as well as targeted software design.
This article was written by Ed Brown, Editor of Sensor Technology. For more information, contact Ed at edward.brown@saemediagroup.com or visit http://info.hotims.com/82319-164.