Files
PatrickMartellThesis112617__final format approved LW 11-27-17.pdf (1.43 MB)
Hierarchical Auto-Associative Polynomial Convolutional Neural Networks
Author Info
Martell, Patrick Keith
Permalink:
http://rave.ohiolink.edu/etdc/view?acc_num=dayton1513164029518038
Abstract Details
Year and Degree
2017, Master of Science (M.S.), University of Dayton, Electrical Engineering.
Abstract
Convolutional neural networks (CNNs) lack ample methods for improving performance that do not involve adding more input data, modifying existing data, or changing the network design. This work adds to the methods that require neither more data nor a trial-and-error approach to network design. Specifically, this thesis demonstrates that, all other factors being equal, a polynomial layer inserted into a CNN has great potential to improve classification rates. Several existing methods aim to fill the same gap; this research investigates an alternative solution. Most other methods in this problem space improve the performance of existing layers, for example by modifying the type of pooling or the activation function. Other methods discussed later, Dropout and DropConnect, zero out nodes or connections, respectively, to improve performance. This research focuses on adding a new type of layer to typical CNNs: the polynomial layer. This layer adds local connectivity to each perceptron, creating N connections raised up to the Nth power of the perceptron's initial value. The layer can be placed in either the convolutional portion or the fully connected portion of the network, with the idea that the higher dimensionality allows a better description of the input space. The idea was tested on two classification datasets with 10 classes each, MNIST and CIFAR10, which contain 28×28 grayscale and 32×32 RGB images, respectively. The polynomial layer universally enabled the tested CNN to perform better on the MNIST data, and convolutional-layer polynomials aided CNNs trained at a lower learning rate on the CIFAR10 dataset. Looking forward, more CNN designs should be analyzed, along with more learning rates, including variable rates. Tests on a wider range of datasets would also enable a broader understanding.
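The polynomial layer described in the abstract expands each perceptron's value into N copies raised to successive powers. The following is a minimal sketch of that expansion in NumPy; the function name, parameter names, and shapes are illustrative assumptions, not the author's implementation.

```python
import numpy as np

def polynomial_expansion(activations, degree=3):
    """Expand each activation x into the powers [x, x**2, ..., x**degree].

    Hypothetical sketch of the thesis's polynomial-layer idea: each
    unit gains `degree` locally connected values, raising the
    dimensionality of the representation before the next layer.
    """
    # Stack x**1 .. x**degree along a new trailing axis.
    return np.stack([activations ** p for p in range(1, degree + 1)], axis=-1)

# Example: a batch of 2 feature vectors with 3 units each.
x = np.array([[1.0, 2.0, 0.5],
              [0.0, -1.0, 3.0]])
expanded = polynomial_expansion(x, degree=3)
# The shape grows from (2, 3) to (2, 3, 3): the new axis holds the powers,
# e.g. the unit with value 2.0 becomes [2.0, 4.0, 8.0].
```

In an actual CNN the expanded values would feed a learned weighted combination (per the abstract, in either the convolutional or the fully connected portion), so the network can exploit the higher-order terms.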
Committee
Vijayan Asari, Ph.D. (Advisor)
Theus Aspiras, Ph.D. (Committee Member)
Eric Balster, Ph.D. (Committee Member)
Pages
53 p.
Subject Headings
Electrical Engineering
Keywords
Convolutional Neural Network; Polynomial; CNN; Classification; MNIST
Recommended Citations
APA Style (7th edition)
Martell, P. K. (2017). Hierarchical Auto-Associative Polynomial Convolutional Neural Networks [Master's thesis, University of Dayton]. OhioLINK Electronic Theses and Dissertations Center. http://rave.ohiolink.edu/etdc/view?acc_num=dayton1513164029518038
MLA Style (8th edition)
Martell, Patrick. Hierarchical Auto-Associative Polynomial Convolutional Neural Networks. 2017. University of Dayton, Master's thesis. OhioLINK Electronic Theses and Dissertations Center, http://rave.ohiolink.edu/etdc/view?acc_num=dayton1513164029518038.
Chicago Manual of Style (17th edition)
Martell, Patrick. "Hierarchical Auto-Associative Polynomial Convolutional Neural Networks." Master's thesis, University of Dayton, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=dayton1513164029518038
Document number:
dayton1513164029518038
Download Count:
461
Copyright Info
© 2017, all rights reserved.
This open access ETD is published by University of Dayton and OhioLINK.