Electronic Theses and Dissertations

Date of Award


Document Type


Degree Name

M.S. in Engineering Science


Department

Computer and Information Science

First Advisor

Dawn Wilkins

Second Advisor

Byunghyun Jang

Third Advisor

Yixin Chen

Relational Format



Deep neural networks (DNNs), and artificial neural networks (ANNs) in general, have recently received a great deal of attention from both the media and the machine learning community at large. DNNs have produced world-class results in a variety of domains, including image recognition, speech recognition, sequence modeling, and natural language processing. Many of the most exciting recent deep neural network studies have made improvements by hardcoding less about the network and giving the network more control over its own parameters, allowing greater flexibility within the model. Although much research has introduced trainable hyperparameters into transformation layers (GRU [7], LSTM [13], etc.), the introduction of hyperparameters into activation layers has been largely ignored. This paper serves several purposes: (1) to equip the reader with background knowledge, including theory and best practices for DNNs, that contextualizes the contributions of this paper; (2) to describe and verify the effectiveness of current techniques in the literature that utilize hyperparameters in the activation layer; and (3) to introduce new activation layers that bring hyperparameters into the model, including activation pools (APs) and parametric activation pools (PAPs), and to study the effectiveness of these new constructs on popular image recognition datasets.
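To make the idea of a trainable parameter inside an activation layer concrete, here is a minimal NumPy sketch of one such technique from the existing literature, the parametric ReLU (PReLU), in which the negative-side slope `a` is learned during training rather than fixed in advance. This is an illustrative example of the general approach the abstract describes, not an implementation of the thesis's activation pools; the function names are our own.

```python
import numpy as np

def prelu(x, a):
    """Parametric ReLU: identity for x >= 0, learned slope `a` for x < 0."""
    return np.where(x >= 0, x, a * x)

def prelu_grad_a(x, a):
    """Gradient of PReLU with respect to the trainable slope `a`.

    Zero on the non-negative side (the output there does not depend on
    `a`); equal to the input x on the negative side, so `a` can be
    updated by backpropagation like any other network parameter.
    """
    return np.where(x >= 0, 0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(prelu(x, a=0.25))  # -> [-0.5 -0.125 0. 1.5]
```

With `a = 0` this reduces to the ordinary ReLU, and with a fixed small `a` to leaky ReLU; making `a` trainable is precisely the kind of hand-off of control from the designer to the network that the abstract highlights.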


Emphasis: Computer Science


