File
shaodanDissertation.pdf (1.94 MB)
Direct Optimization for Classification with Boosting
Author
Zhai, Shaodan
Permalink:
http://rave.ohiolink.edu/etdc/view?acc_num=wright1453001665
Year and Degree
2015, Doctor of Philosophy (PhD), Wright State University, Computer Science and Engineering PhD.
Abstract
Boosting, one of the state-of-the-art classification approaches, is widely used in industry for a broad range of problems. Existing boosting methods typically formulate classification as a convex optimization problem by replacing the performance measure with a convex surrogate. While convex surrogates are computationally efficient to optimize globally, they are sensitive to outliers and statistically inconsistent under some conditions. Moreover, although boosting's success is often ascribed to margin maximization, few boosting approaches are designed to maximize the margin directly. In this research, we design novel boosting algorithms that directly optimize non-convex performance measures, including the empirical classification error and margin functions, without resorting to surrogates or approximations. We first apply this approach to binary classification and then extend it to more complicated problems, including multi-class, semi-supervised, and multi-label classification. These extensions are non-trivial: each requires mathematically re-formulating the optimization problem, defining new objectives and designing new algorithms tailored to the specific learning task. We also establish theoretical properties of the optimization objectives, which explain why these objectives are chosen and how the algorithms optimize them efficiently. Finally, we show experimentally that the proposed approaches achieve results competitive with or better than state-of-the-art convex-relaxation boosting methods, and that they perform especially well in noisy settings.
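To illustrate the core idea of the abstract, the sketch below contrasts with surrogate-based boosting (e.g., AdaBoost's exponential loss) by greedily adding the decision stump and coefficient that most reduce the ensemble's empirical 0-1 error directly. This is a minimal toy in NumPy, not the dissertation's actual algorithm; the names `direct_boost` and `stump_predict` and the small coefficient grid `alphas` are illustrative assumptions.

```python
import numpy as np

def stump_predict(x, threshold, sign):
    """Decision stump on 1-D data: predict `sign` where x > threshold, else -sign."""
    return np.where(x > threshold, sign, -sign)

def direct_boost(x, y, n_rounds=10, alphas=(0.25, 0.5, 1.0)):
    """Greedy boosting that directly minimizes empirical 0-1 error.

    Each round tries every (threshold, sign, alpha) candidate and keeps the
    one whose addition yields the lowest training classification error of
    the whole ensemble -- no convex surrogate loss is involved. Labels y
    are assumed to be in {-1, +1}.
    """
    score = np.zeros_like(x, dtype=float)   # running ensemble score F(x)
    ensemble = []
    for _ in range(n_rounds):
        best = None
        best_err = np.mean(np.sign(score) != y)  # current 0-1 error
        for t in np.unique(x):                   # candidate thresholds
            for s in (-1, 1):
                h = stump_predict(x, t, s)
                for a in alphas:
                    err = np.mean(np.sign(score + a * h) != y)
                    if err < best_err:
                        best_err, best = err, (t, s, a)
        if best is None:        # no candidate lowers the 0-1 error: stop
            break
        t, s, a = best
        score += a * stump_predict(x, t, s)
        ensemble.append(best)
    return ensemble, np.mean(np.sign(score) != y)
```

Because the objective is the non-convex 0-1 error itself, each round performs an exhaustive search over candidates rather than a gradient step on a smooth loss; this is what makes direct optimization robust to outliers but harder to scale, the trade-off the dissertation addresses.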
Committee
Shaojun Wang, Ph.D. (Advisor)
Keke Chen, Ph.D. (Committee Member)
Krishnaprasad Thirunarayan, Ph.D. (Committee Member)
Chaocheng Huang, Ph.D. (Committee Member)
Pages
126 p.
Subject Headings
Computer Science
Keywords
computer science
Recommended Citations
APA Style (7th edition)
Zhai, S. (2015). Direct Optimization for Classification with Boosting [Doctoral dissertation, Wright State University]. OhioLINK Electronic Theses and Dissertations Center. http://rave.ohiolink.edu/etdc/view?acc_num=wright1453001665

MLA Style (8th edition)
Zhai, Shaodan. Direct Optimization for Classification with Boosting. 2015. Wright State University, Doctoral dissertation. OhioLINK Electronic Theses and Dissertations Center, http://rave.ohiolink.edu/etdc/view?acc_num=wright1453001665.

Chicago Manual of Style (17th edition)
Zhai, Shaodan. "Direct Optimization for Classification with Boosting." Doctoral dissertation, Wright State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=wright1453001665
Document number:
wright1453001665
Download Count:
750
Copyright Info
© 2015, all rights reserved.
This open access ETD is published by Wright State University and OhioLINK.