
IEEE Spectrum Feature

As an underclassman at Princeton in the
late 1950s, Forney wasn't necessarily
set on a career in engineering. He did
ultimately decide to pursue a bachelor
of science in engineering, but not out of
any burning desire to invent or design.
"I thought, 'I'll keep my options
open,' but without any great intention to become an engineer," Forney
recalls, relaxing in the sunny sitting
area off the kitchen of his Cambridge,
Mass., home one December afternoon
last year. But an elective course in thermodynamics taught by John Archibald
Wheeler stirred a sense of discovery
in him.
"I really liked his approach," Forney
says of the legendary physicist's down-to-earth teaching style. "It was much
more of an engineering course than a
physics course. And he [assigned] a term
paper instead of a final exam."
For the paper, Forney decided to
read Léon Brillouin's 1956 book, Science and Information Theory. The
book tackled thermodynamics in the
context of information theory, then a
fledgling field, founded about a decade
earlier with a groundbreaking paper by
Claude Shannon. In that 1948 paper, "A
Mathematical Theory of Communication," Shannon laid out the mathematical foundation for the transmission of
information (the centenary of his birth
is being celebrated this year).
PUSHING THE LIMITS: Forney [right] and Robert Gallager confer at MIT. This photograph was taken around 1965, Forney says.

Forney says he was struck by Brillouin's resolution of the problem of Maxwell's demon, a thought experiment in which energy appears to be
created for free by sorting molecules
by their speed. Brillouin noted that
every physical system contains information, and extracting that information (in the case of Maxwell's demon, the speed of particles) always costs energy, precisely enough to satisfy the laws of thermodynamics. "Information
isn't free; it comes at a cost," Forney
says, in summary. "Probably I could
poke holes in that now. But certainly
as an undergraduate this was all very
interesting."
He carried this interest to graduate
school at MIT, in 1961, where he found
a whirlwind of research activity. Shannon himself had recently arrived from
Bell Labs, and a research group was trying to extend his work and find practical uses for it.
In a master's thesis on information
theory and quantum mechanics, and a
doctoral dissertation on error-correcting codes, Forney displayed a bloodhound's nose for finding the right
problems and the right questions to
ask about those problems. The 1990
IEEE Medal of Honor recipient, Robert Gallager, a young faculty member
in the mid-1960s and now an emeritus
professor at MIT's Research Laboratory of Electronics, singles out Forney's doctoral thesis as a leap forward
in the field.
At that time, digital technology was
taking off, and researchers were hunting for coding schemes: ways of transforming streams of 1s and 0s into a form
that could be carried from place to
place with little power and few errors.
In his celebrated 1948 paper, Shannon
had already worked out the ultimate
limit to such efficiency, a maximum
achievable error-free data rate for any
communications channel. But reaching that limit was easier said than done.
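
To give a sense of what such a limit looks like: for the textbook case of a channel that flips each transmitted bit independently with probability p, the maximum error-free rate works out to 1 minus the binary entropy of p. The short Python sketch below computes that standard result; it is an illustration only, not anything drawn from Shannon's or Forney's own calculations.

import math

def binary_symmetric_capacity(p):
    # Shannon capacity, in information bits per transmitted bit, of a
    # channel that flips each bit independently with probability p.
    if p == 0.0 or p == 1.0:
        return 1.0
    binary_entropy = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1.0 - binary_entropy

# Even a channel that corrupts 1 bit in 10 can, in principle, carry
# about 0.53 error-free information bits per transmitted bit.
print(binary_symmetric_capacity(0.1))  # roughly 0.531

No coding scheme can push past that number on such a channel; the hard part was approaching it with a decoder of manageable complexity.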
Disturbances during transmission
will flip bits at random. To tackle errors,
Shannon proposed adding redundant
bits to a sequence of data before transmission to create an encoded packet.
The longer the packet, the less likely
it would be corrupted to look like
another potential sequence. This
approach could push transmission to its
limits, but it posed a practical challenge
at the decoding stage. The straightforward, brute-force approach would
compare the incoming sequence with
every possible transmitted sequence to
find the most likely one. This process
could work for relatively short packets, Forney says. But longer packets,
which would be needed to obtain very
low error rates, would quickly exhaust
the computational power of a decoder,
even with today's technology.
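
As a rough illustration of that blowup, here is a toy Python sketch of exactly the brute-force strategy described above, assuming a small, standard (7,4) Hamming code rather than anything Forney used: it compares the received word against every possible codeword and keeps the closest one. With 4 message bits that is 16 comparisons; with 1,000 message bits it would be 2^1000.

from itertools import product

def encode(message, generator):
    # Multiply the message bits by the generator matrix over GF(2).
    return tuple(sum(m & g for m, g in zip(message, column)) % 2
                 for column in zip(*generator))

def brute_force_decode(received, k, generator):
    # Exhaustive maximum-likelihood decoding: try all 2**k messages and
    # keep the codeword closest to the received word in Hamming distance.
    # The loop runs 2**k times, which is why long packets are hopeless here.
    best_message, best_distance = None, float("inf")
    for message in product((0, 1), repeat=k):
        codeword = encode(message, generator)
        distance = sum(r != c for r, c in zip(received, codeword))
        if distance < best_distance:
            best_message, best_distance = message, distance
    return best_message

# Toy example: a (7,4) Hamming code, 4 message bits -> 7 channel bits.
G = [
    (1, 0, 0, 0, 1, 1, 0),
    (0, 1, 0, 0, 1, 0, 1),
    (0, 0, 1, 0, 0, 1, 1),
    (0, 0, 0, 1, 1, 1, 1),
]
sent = encode((1, 0, 1, 1), G)
corrupted = tuple(b ^ f for b, f in zip(sent, (0, 0, 1, 0, 0, 0, 0)))  # flip one bit
print(brute_force_decode(corrupted, 4, G))  # recovers (1, 0, 1, 1)

Picking the codeword at the smallest Hamming distance is the "most likely" choice when bit flips are independent and rare; the trouble is that the number of candidates doubles with every added message bit.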
Researchers proposed various coding schemes to try to achieve data rates
close to the Shannon limit with low
error rates and reasonable decoding
complexity. But there was no single
code that could do it all.
Forney had another idea. Why not
break the problem down and use multiple, complementary codes to encode
and decode data, one operating on the
outcome of the other? A simple inner
code operating directly on the input
and output of a communications channel could achieve a moderate error rate
at data rates near Shannon's limit. An
outer code, used before data enters the
inner code and after exiting it, could
drive error rates down further using a


