Files
Theus_DisFinal_final format approved LW 12-14-15.pdf (2.27 MB)
Hierarchical Autoassociative Polynomial Network for Deep Learning of Complex Manifolds
Aspiras, Theus Herrera
Permalink:
http://rave.ohiolink.edu/etdc/view?acc_num=dayton1449104879
Year and Degree
2015, Doctor of Philosophy (Ph.D.), University of Dayton, Electrical Engineering.
Abstract
Artificial neural networks are an extensively explored area of research. With these networks, biological neural networks can be modeled mathematically for many different purposes. The architecture explored here is the nonlinear line attractor (NLA) network, which uses a polynomial weighting scheme instead of a linear weighting scheme for specific tasks. Our research on this architecture found that it converges well toward a specific trained pattern and diverges for untrained patterns. We have also improved the architecture with a Gaussian weighting scheme, which introduces modularity and reduces redundancy in the network. Testing the new weighting scheme on different datasets gave better convergence characteristics, quicker training times, and improved recognition rates. The NLA architecture, however, is not able to reduce dimensionality, so a nonlinear dimensionality reduction technique is used. To improve the architecture further, we must be able to decompose the NLA structure to alleviate problems in the original design and allow further improvements. We propose the hierarchical autoassociative polynomial network (HAP Net), which reorders the NLA architecture to include different ways of using polynomial weighting. In each layer, orders of each input are connected by a weight set, which can be trained by a backpropagation algorithm. By combining different architectures based on an understanding of MLP, attractor, and modular networks, we create a multi-purpose architecture that includes all aspects of the previous architectures and is far better suited for classification and recognition tasks. Experiments conducted on the standard MNIST dataset show very promising results for the HAP Net framework.
Research is ongoing to evaluate the performance of HAP Net on various datasets and to incorporate advanced learning strategies, convolutional neural networks, and extreme learning machines.
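The polynomial weighting idea described in the abstract can be sketched in a few lines: each layer holds one weight set per polynomial order of the input, and all weight sets are updated by backpropagation. This is an illustrative reconstruction only; the class name, initialization, learning rate, and toy regression task are assumptions, not the dissertation's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

class PolynomialLayer:
    """Sketch of a polynomially weighted layer (assumed form):
    z = W_1 @ x + W_2 @ x**2 + ... + b, one weight matrix per order,
    with no cross terms between inputs."""

    def __init__(self, n_in, n_out, order=2, lr=0.1):
        self.order, self.lr = order, lr
        # one weight matrix per polynomial order (hypothetical init scale)
        self.W = [rng.normal(0, 0.1, (n_out, n_in)) for _ in range(order)]
        self.b = np.zeros(n_out)

    def forward(self, x):
        # cache each power of the input for the backward pass
        self.x_pows = [x ** (k + 1) for k in range(self.order)]
        return sum(W @ xp for W, xp in zip(self.W, self.x_pows)) + self.b

    def backward(self, err):
        # gradient step on 0.5*err**2 for each order's weight set
        for k in range(self.order):
            self.W[k] -= self.lr * np.outer(err, self.x_pows[k])
        self.b -= self.lr * err

# toy usage: fit y = 2x^2 - x, a curve no purely linear layer can represent
layer = PolynomialLayer(n_in=1, n_out=1, order=2)
xs = np.linspace(-1.0, 1.0, 21)
ys = 2 * xs**2 - xs
for _ in range(500):
    for x, y in zip(xs, ys):
        pred = layer.forward(np.array([x]))
        layer.backward(pred - np.array([y]))
```

Because each input contributes every power of itself through its own weight set, the layer fits the quadratic target that a single linear weight set cannot, which is the core motivation for polynomial over linear weighting.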
Committee
Vijayan Asari, Ph.D. (Committee Chair)
Raul Ordonez, Ph.D. (Committee Member)
Eric Balster, Ph.D. (Committee Member)
Wesam Sakla, Ph.D. (Committee Member)
Pages
83 p.
Subject Headings
Computer Engineering; Electrical Engineering
Keywords
Polynomial Neural Network; Complex Manifolds; Deep Learning; Nonlinear Weighting; Modular; Classification; MNIST; HAP Net
Recommended Citations
APA Style (7th edition)
Aspiras, T. H. (2015). Hierarchical Autoassociative Polynomial Network for Deep Learning of Complex Manifolds [Doctoral dissertation, University of Dayton]. OhioLINK Electronic Theses and Dissertations Center. http://rave.ohiolink.edu/etdc/view?acc_num=dayton1449104879

MLA Style (8th edition)
Aspiras, Theus. Hierarchical Autoassociative Polynomial Network for Deep Learning of Complex Manifolds. 2015. University of Dayton, Doctoral dissertation. OhioLINK Electronic Theses and Dissertations Center, http://rave.ohiolink.edu/etdc/view?acc_num=dayton1449104879.

Chicago Manual of Style (17th edition)
Aspiras, Theus. "Hierarchical Autoassociative Polynomial Network for Deep Learning of Complex Manifolds." Doctoral dissertation, University of Dayton, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=dayton1449104879
Document number:
dayton1449104879
Download Count:
657
Copyright Info
© 2015, all rights reserved.
This open access ETD is published by University of Dayton and OhioLINK.