TY - JOUR
T1 - Tuning support vector machines for minimax and Neyman-Pearson classification
AU - Davenport, Mark A.
AU - Baraniuk, Richard G.
AU - Scott, Clayton D.
N1 - Funding Information:
The work of Mark A. Davenport and Richard G. Baraniuk was supported by US National Science Foundation (NSF) Grant CCF-0431150 and the Texas Instruments Leadership University Program. The work of Clayton D. Scott was partially supported by NSF Vertical Integration of Research and Education Grant 0240068 while he was a postdoctoral fellow at Rice University (for more details, see www.eecs.umich.edu/cscott).
Copyright:
Copyright 2010 Elsevier B.V. All rights reserved.
PY - 2010
Y1 - 2010
AB - This paper studies the training of support vector machine (SVM) classifiers with respect to the minimax and Neyman-Pearson criteria. In principle, these criteria can be optimized in a straightforward way using a cost-sensitive SVM. In practice, however, because these criteria require especially accurate error estimation, standard techniques for tuning SVM parameters, such as cross-validation, can lead to poor classifier performance. To address this issue, we first prove that the usual cost-sensitive SVM, here called the 2C-SVM, is equivalent to another formulation called the 2ν-SVM. We then exploit a characterization of the 2ν-SVM parameter space to develop a simple yet powerful approach to error estimation based on smoothing. In an extensive experimental study, we demonstrate that smoothing significantly improves the accuracy of cross-validation error estimates, leading to dramatic performance gains. Furthermore, we propose coordinate descent strategies that offer significant gains in computational efficiency, with little to no loss in performance.
KW - error estimation
KW - minimax classification
KW - Neyman-Pearson classification
KW - parameter selection
KW - support vector machine
UR - http://www.scopus.com/inward/record.url?scp=77956056467&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=77956056467&partnerID=8YFLogxK
U2 - 10.1109/TPAMI.2010.29
DO - 10.1109/TPAMI.2010.29
M3 - Article
C2 - 20724764
AN - SCOPUS:77956056467
VL - 32
SP - 1888
EP - 1898
JO - IEEE Transactions on Pattern Analysis and Machine Intelligence
JF - IEEE Transactions on Pattern Analysis and Machine Intelligence
SN - 0162-8828
IS - 10
M1 - 5401162
ER -