The Bridge - Issue 2, 2018 - 16

Feature
RELATED WORK

Although the specific form of dropout that we experimentally study here appeared only recently, the idea of dropping out nodes or edges has a longer history. For example, in the 1990s genetic algorithms were used to learn which nodes belonged in a neural network (Ronald and Schoenauer, 1994).

Dropout can be viewed as a form of bootstrap aggregation, or bagging (Breiman 1996), in which multiple models are trained on subsets of the data and their predictions are combined. Unlike the most straightforward bagging implementation, the models share weights even though dropout gives them different structures at each step of training (Krizhevsky et al. 2012).

This work has emphasized the relationship between overfitting and neural network expressivity. Curiously, neural networks have a built-in ability to avoid overfitting even when they are capable of memorizing the input set (Zhang et al. 2016). It is not clear how this finding relates to the current work.

FUTURE WORK

We plan to expand our experiments to include a larger variety of datasets, and to further explore the parameters that could influence the optimum dropout rate, including dataset size, number of features, number of hidden nodes, and separability.
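One way to choose a dropout rate for a given dataset is a simple sweep over candidate rates on held-out data. The sketch below shows an inverted-dropout forward pass and a minimal sweep harness; the candidate rates and the `evaluate` stand-in are illustrative assumptions, not the training procedure used in this study.

```python
import numpy as np

def dropout_forward(h, rate, rng, train=True):
    # Inverted dropout: zero each unit with probability `rate` and scale
    # survivors by 1 / (1 - rate), so expected activations are unchanged
    # and no rescaling is needed at inference time.
    if not train or rate == 0.0:
        return h
    keep = 1.0 - rate
    mask = rng.random(h.shape) < keep
    return h * mask / keep

def best_dropout_rate(rates, evaluate):
    # Score each candidate rate (e.g. by training the network and
    # measuring validation accuracy) and return the best one.
    scores = {r: evaluate(r) for r in rates}
    best = max(scores, key=scores.get)
    return best, scores

# Illustrative usage: a made-up validation score that peaks at rate 0.2.
rng = np.random.default_rng(0)
h = rng.standard_normal((4, 8))
h_dropped = dropout_forward(h, rate=0.5, rng=rng)

best, scores = best_dropout_rate(
    [0.0, 0.1, 0.2, 0.3, 0.5],
    evaluate=lambda r: 1.0 - (r - 0.2) ** 2,  # hypothetical evaluator
)
```

In a real run, `evaluate` would train the network at the given rate and return a held-out metric; the harness itself does not change.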

We would also like to consider the number of hidden nodes in addition to the dropout rate and dataset size. This is important because the dropout rate determines how many hidden nodes are active during training. We would expect increasing the number of hidden nodes to have an effect similar to reducing the dropout rate, except when the dropout rate is close to zero. Varying the number of hidden nodes also lets us control the expressiveness of the neural network and provides an alternative way to avoid overfitting.

Another parameter we could explore is the number of hidden layers used during training. This parameter was not varied in this study because the network cannot be trained with more than approximately three hidden layers without becoming too expressive.

CONCLUSIONS

The optimum dropout rates for the credit card default, breast cancer, and bank marketing datasets were not consistent. Furthermore, dataset size appeared to have a large effect on the optimum dropout rate, with smaller datasets performing better at low dropout rates. Given this variation across the studied models, we do not recommend a universal dropout rate; too many factors vary between datasets for a common optimum to exist. We therefore recommend that, for each application of the deep neural network studied here, the dropout rate be tuned before the model is deployed.
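The interchange between hidden-layer width and dropout rate discussed above can be made concrete: with dropout rate p, an n-unit hidden layer has about n·(1 − p) units active per training step, so widening the layer and lowering the rate both raise the expected active width. A small illustrative calculation (the widths and rates below are made-up values, not configurations from this study):

```python
def expected_active_units(n_hidden, rate):
    # Each hidden unit survives a dropout step independently with
    # probability (1 - rate), so the expected active width is n * (1 - p).
    return n_hidden * (1.0 - rate)

# A wide layer with high dropout and a narrower layer with low dropout
# can have the same expected active width per training step.
wide_high_dropout = expected_active_units(200, 0.5)
narrow_low_dropout = expected_active_units(125, 0.2)
```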


