Files
29107.pdf (2.81 MB)
Built-In Self Training of Hardware-Based Neural Networks
Anderson, Thomas
Permalink:
http://rave.ohiolink.edu/etdc/view?acc_num=ucin1512039036199393
Abstract Details
Year and Degree
2017, MS, University of Cincinnati, Engineering and Applied Science: Computer Engineering.
Abstract
Artificial neural networks and deep learning are topics of increasing interest in computing, which has spurred investigation into dedicated hardware, such as accelerators, to speed up training and inference. This work proposes a new hardware architecture, Built-In Self Training (BISTr), for both training a network and performing inference. The architecture combines principles from the Built-In Self Testing (BIST) VLSI paradigm with the backpropagation learning algorithm. The primary focus of the work is to present the BISTr architecture and verify its efficacy.

Development of the architecture began with an analysis of the backpropagation algorithm and the derivation of new equations. Once the derivations were complete, the hardware was designed and all of the functional components were tested in VHDL from the bottom level to the top. An automatic synthesis tool was created to generate the code used and tested in the experimental phase. The application tested in the experiments was function approximation. The new architecture was trained successfully for a couple of the test cases; the remaining test cases were not successful, but the failures were due to the data representation used in the VHDL code rather than the hardware design itself.

The area overhead of the added hardware and the speed performance were analyzed briefly. The results showed that (1) the area overhead was significant (around 3 times the area without the additional hardware) and (2) the theoretical speed performance of the design is very good. The BISTr architecture was thus shown to work and to have good theoretical speed performance; however, the architecture presented in this work cannot be implemented for large neural networks because of the large area overhead.
Further work would be required to expand upon and improve the idea presented in this paper: (1) development of an alternative design that is more practical to implement, (2) more rigorous testing of area and speed, (3) implementation of other training methods and functionality, and (4) additions to the synthesis tool to increase its capability.
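The backpropagation training that BISTr maps into hardware can be sketched in software. The following is a minimal, hypothetical illustration of online gradient-descent training of a one-hidden-layer network on a function-approximation task; the network size, learning rate, and target function are illustrative choices and are not taken from the thesis:

```python
# Software sketch of the backpropagation loop underlying the BISTr design.
# All hyperparameters here are hypothetical illustration choices.
import math
import random

random.seed(0)

N_HIDDEN = 8   # hidden sigmoid units (illustrative)
LR = 0.1       # learning rate (illustrative)

w1 = [random.uniform(-1, 1) for _ in range(N_HIDDEN)]  # input -> hidden weights
b1 = [0.0] * N_HIDDEN                                  # hidden biases
w2 = [random.uniform(-1, 1) for _ in range(N_HIDDEN)]  # hidden -> output weights
b2 = 0.0                                               # output bias

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x):
    """Forward pass: sigmoid hidden layer, linear output unit."""
    h = [sigmoid(w1[i] * x + b1[i]) for i in range(N_HIDDEN)]
    y = sum(w2[i] * h[i] for i in range(N_HIDDEN)) + b2
    return h, y

def train_step(x, target):
    """One backpropagation update for one sample, squared-error loss."""
    global b2
    h, y = forward(x)
    err = y - target  # dLoss/dy for loss = 0.5 * (y - target)^2
    for i in range(N_HIDDEN):
        grad_w2 = err * h[i]                          # output-layer gradient
        delta_h = err * w2[i] * h[i] * (1.0 - h[i])   # chain rule; sigmoid' = h(1-h)
        w2[i] -= LR * grad_w2
        w1[i] -= LR * delta_h * x
        b1[i] -= LR * delta_h
    b2 -= LR * err
    return 0.5 * err * err

# Approximate f(x) = x^2 on [0, 1] -- a function-approximation task of the
# kind used in the thesis experiments.
samples = [i / 20.0 for i in range(21)]
first_epoch_loss = sum(train_step(x, x * x) for x in samples)
for _ in range(2000):
    last_epoch_loss = sum(train_step(x, x * x) for x in samples)
print(first_epoch_loss, last_epoch_loss)
```

The hardware contribution of the thesis lies in mapping the per-weight update inside `train_step` onto BIST-style on-chip structures so the network trains itself in place, rather than in the algorithm itself, which is standard backpropagation.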
Committee
Wen-Ben Jone, Ph.D. (Committee Chair)
Ali Minai, Ph.D. (Committee Member)
Ranganadha Vemuri, Ph.D. (Committee Member)
Pages
123 p.
Subject Headings
Computer Engineering
Keywords
neural networks; backpropagation algorithm; training; accelerator; hardware; function approximation
Recommended Citations
Citations
APA Style (7th edition)
Anderson, T. (2017). Built-In Self Training of Hardware-Based Neural Networks [Master's thesis, University of Cincinnati]. OhioLINK Electronic Theses and Dissertations Center. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1512039036199393
MLA Style (8th edition)
Anderson, Thomas. Built-In Self Training of Hardware-Based Neural Networks. 2017. University of Cincinnati, Master's thesis. OhioLINK Electronic Theses and Dissertations Center, http://rave.ohiolink.edu/etdc/view?acc_num=ucin1512039036199393.
Chicago Manual of Style (17th edition)
Anderson, Thomas. "Built-In Self Training of Hardware-Based Neural Networks." Master's thesis, University of Cincinnati, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1512039036199393
Document number:
ucin1512039036199393
Download Count:
387
Copyright Info
© 2017, some rights reserved.
Built-In Self Training of Hardware-Based Neural Networks by Thomas Anderson is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. Based on a work at etd.ohiolink.edu.
This open access ETD is published by University of Cincinnati and OhioLINK.