Generalization error rates for margin-based classifiers

Park, Changyi

Abstract Details

2005, Doctor of Philosophy, Ohio State University, Statistics.
Margin-based classifiers defined by functional margins are generally believed to yield high performance in classification. In this thesis, a general theory that quantifies the generalization error of a margin classifier is presented. The theory captures the trade-off between geometric margins and training errors, as well as the complexity of the classification problem. It permits an investigation of the generalization ability of both convex and nonconvex margin classifiers, including support vector machines (SVM), kernel logistic regression (KLR), and ψ-learning. Our theory indicates that the generalization error rates for a certain class of nonconvex losses may converge substantially faster than those for convex losses. Illustrative examples for both linear and nonlinear classification are provided.
Xiaotong Shen (Advisor)
63 p.
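The losses compared in the abstract are all functions of the functional margin m = y·f(x). As a rough illustration of the convex-versus-nonconvex distinction, the sketch below implements the SVM hinge loss, the KLR logistic loss, and one common piecewise form of a ψ-type loss; the exact ψ shape used in the thesis may differ, so this form is an assumption for illustration only.

```python
import math

# Surrogate losses as functions of the functional margin m = y * f(x).
# The psi shape below is one common illustration from the psi-learning
# literature, not necessarily the thesis's exact definition.

def hinge(m):
    """SVM hinge loss: convex, zero for margins >= 1."""
    return max(0.0, 1.0 - m)

def logistic(m):
    """Kernel logistic regression loss: convex, strictly positive."""
    return math.log(1.0 + math.exp(-m))

def psi(m):
    """A psi-type loss: nonconvex and bounded above, unlike the hinge
    loss, which grows without bound for negative margins."""
    if m >= 1.0:
        return 0.0
    if m >= 0.0:
        return 2.0 * (1.0 - m)
    return 2.0  # capped for negative margins: the source of nonconvexity


# Nonconvexity check: the chord from m=-1 to m=1 lies below psi at m=0.
chord_midpoint = 0.5 * (psi(-1.0) + psi(1.0))
print(psi(0.0) > chord_midpoint)   # psi violates the convexity inequality
print(hinge(0.0) <= 0.5 * (hinge(-1.0) + hinge(1.0)))  # hinge satisfies it
```

The boundedness of the ψ-type loss for badly misclassified points (m < 0) is what distinguishes it from the convex surrogates, at the cost of a nonconvex optimization problem.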

Recommended Citations

  • Park, C. (2005). Generalization error rates for margin-based classifiers [Doctoral dissertation, Ohio State University]. OhioLINK Electronic Theses and Dissertations Center. http://rave.ohiolink.edu/etdc/view?acc_num=osu1124282485

    APA Style (7th edition)

  • Park, Changyi. Generalization error rates for margin-based classifiers. 2005. Ohio State University, Doctoral dissertation. OhioLINK Electronic Theses and Dissertations Center, http://rave.ohiolink.edu/etdc/view?acc_num=osu1124282485.

    MLA Style (8th edition)

  • Park, Changyi. "Generalization error rates for margin-based classifiers." Doctoral dissertation, Ohio State University, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=osu1124282485

    Chicago Manual of Style (17th edition)