
Tambr uses sentiment analysis to generate the pitches, durations, and intervals of the output melodies in a way that corresponds to the sentiment of the novel, implementing algorithmic composition of literature-based music at a level of musicality not previously explored.
the study of it highly subjective, and an exhaustive generative mapping of timbre space is beyond the scope of this project. The machine-learning algorithm used by Tambr leverages a consistent, human-labeled description of synthesizer timbres, which will be discussed later in detail.

On generating music from literature
The first work describing a process
of generating music from literature
was published in 2014 by Hannah
Davis with her system TransProse.
This article describes Tambr, an
advancement that builds on the approach first explored by TransProse.
The central contributions of Tambr
are as follows:
■■Tambr uses up to 12 simultaneous voices to synthesize music, while TransProse used only piano.
■■Tambr uses natural language processing algorithms to extract the topics in the novel.
■■Tambr varies the intensity of notes based on sentiment analysis.
■■Tambr follows the thematic arc and climax of the novel.
■■Tambr takes musical timbre into account when selecting voices.
Tambr works toward the same goal as TransProse but takes an approach more centered on how the instruments sound. It attempts to push the current state of the art in algorithmic composition from literature by applying new techniques, informed by standard natural language processing methods, to get closer to the underlying narrative.

Processing the novel
The input to Tambr is a text file
containing a piece of literature. The
text can be any size, though larger
texts tend to work better with the topic extraction algorithm used. The
next step after reading the text file
is to extract the relevant topics from
the novel.
The program performs a term frequency-inverse document frequency (TF-IDF) transform on each paragraph of the text, which weights how relevant each word is to a given paragraph based on the number of other paragraphs in which it appears. This gives common words such as the, a, of, and on low weights, while more distinctive, potentially defining words receive correspondingly higher weights. The program represents the TF-IDF-transformed values in a document-term matrix and performs a nonnegative matrix factorization on that matrix, which can help uncover latent (hidden) relationships and topics not explicitly present in a data set.
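A minimal sketch of this topic-extraction step, assuming scikit-learn as the toolkit and blank-line-separated paragraphs (the article does not name the library Tambr uses), might look like this:

```python
# Sketch of TF-IDF + nonnegative matrix factorization topic extraction.
# The library choice (scikit-learn) and parameter values are assumptions,
# not details taken from the article.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

def extract_topics(text, n_topics=8, terms_per_topic=10):
    # Treat each blank-line-separated paragraph as one "document."
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]

    # TF-IDF weights each word by how specific it is to a paragraph,
    # so words that appear in most paragraphs (the, a, of, on) score low.
    vectorizer = TfidfVectorizer(stop_words="english")
    doc_term = vectorizer.fit_transform(paragraphs)  # document-term matrix

    # Nonnegative matrix factorization groups words that tend to
    # co-occur across paragraphs into latent topics.
    nmf = NMF(n_components=n_topics, random_state=0)
    nmf.fit(doc_term)

    terms = vectorizer.get_feature_names_out()
    topics = []
    for weights in nmf.components_:
        top = weights.argsort()[::-1][:terms_per_topic]
        topics.append({terms[i] for i in top})
    return topics  # e.g., [{"caterpillar", "alice", "hatter", ...}, ...]
```

Each returned topic is an unlabeled set of at most ten terms, matching the output format described next.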
The output of this process is a list of topics, where each topic is a set of at most ten terms that define it. For example, {caterpillar, Alice, king, hatter, queen, gryphon} is one of the topics extracted from Lewis Carroll's Alice's Adventures in Wonderland. The topic extraction algorithm itself does not generate labels for the topics; it only returns a group of words that defines each one. A reader interpreting this example topic might recognize it as a list of the novel's central characters. Another example of an extracted topic is
{war, party, power, doublethink, society}, which was taken from George
Orwell's 1984. We can see that this
topic characterizes the defining traits
of the dystopian society described in
the novel.

Selecting the synthesizer
The database of synthesizers was
taken from Apple Logic Pro X's (digital audio workstation) synthesizer
list, and it included 650+ synthesizers, which were separated by
category as per Logic's default categorizations: electronic dance music
bass, percussion, plucked, rhythmic, soundscape, strings, pad, lead,
classics, brass, bell, and bass. There
is no large, explicitly human-labeled database of how synthesizers sound in terms of their timbre, but Tambr exploits a substitute: the synthesizer names themselves serve as a consistent set of labels indicating how the synthesizers might sound. The naming convention used by Apple Logic Pro X is consistently descriptive, with synthesizers named {Alien Alarm, Fog Machine, Sheets of Metal, Robot Talk, Noise Pump, Laser Brain, Dark Movements, Black Sun, Distant Air, Flying Waves, and Peaceful Meadow}.
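As an illustration of how this label set might be organized, the structure below uses the categories and example names quoted above; the layout itself is hypothetical, since the article does not describe Tambr's internal representation:

```python
# Hypothetical layout of the synthesizer-label database. The category list
# and preset names come from the text above; which preset belongs to which
# category is not stated there, so no such mapping is assumed here.
SYNTH_CATEGORIES = [
    "electronic dance music bass", "percussion", "plucked", "rhythmic",
    "soundscape", "strings", "pad", "lead", "classics", "brass", "bell",
    "bass",
]

SYNTH_NAMES = [
    "Alien Alarm", "Fog Machine", "Sheets of Metal", "Robot Talk",
    "Noise Pump", "Laser Brain", "Dark Movements", "Black Sun",
    "Distant Air", "Flying Waves", "Peaceful Meadow",
    # ... roughly 650 presets in total
]

def label_tokens(name):
    # The descriptive preset name is the "consistent human label" Tambr
    # exploits: "Peaceful Meadow" -> ["peaceful", "meadow"] for matching.
    return name.lower().split()
```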
It is now the synthesizer-selection module's job to assign to each previously extracted topic the synthesizer whose name is most similar in meaning. In this novel approach, Tambr implements a search engine for synthesizers based on a large semantic database popularized by Google, together with the open-source tool word2vec.
Every word has a context, and words with similar contexts probably have similar meanings. Take these contexts, for example:
■■I am going to eat a pie.
■■I am going to eat a burrito.
■■I am going to eat a banana.
■■I am going to eat a burger.
■■I am going to eat an X.
These words appear in similar contexts, so a reader can infer that, although we do not know what X is, it is probably a food.
Google word2vec is a two-layer
artificial neural network whose output allows us to represent the entire contexts of words as a series of
numerical values in a manageable
vector space, making it possible to
effectively process the meaning of
words with computational methods.
One simple way of computing how similar words are, now that we have their context vectors (called word embeddings in the natural language
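One common way to finish this step is to average the word vectors of a topic and of each synthesizer name and compare them with cosine similarity; the sketch below assumes that approach, the gensim library, and Google's pretrained news vectors, none of which is confirmed by the text:

```python
# Sketch of matching a topic to a synthesizer name with word2vec vectors.
# The gensim library, the pretrained vector file, and vector averaging are
# illustrative assumptions; the article only names word2vec itself.
import numpy as np
from gensim.models import KeyedVectors

vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True)

def phrase_vector(words):
    # Average the embeddings of the in-vocabulary words of a topic or name.
    vecs = [vectors[w] for w in words if w in vectors]
    return np.mean(vecs, axis=0) if vecs else None

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_synth(topic_terms, synth_names):
    # Pick the synthesizer whose name is closest in meaning to the topic.
    topic_vec = phrase_vector(topic_terms)
    best_score, best_name = -2.0, None
    for name in synth_names:
        name_vec = phrase_vector(name.lower().split())
        if topic_vec is not None and name_vec is not None:
            score = cosine(topic_vec, name_vec)
            if score > best_score:
                best_score, best_name = score, name
    return best_name

# Hypothetical call: a dark, oppressive topic should land far from a name
# like "Peaceful Meadow" and closer to one like "Dark Movements."
# best_synth({"war", "party", "power", "doublethink", "society"},
#            ["Peaceful Meadow", "Dark Movements", "Laser Brain"])
```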


