Tuning support vector machines for minimax and Neyman-Pearson classification

Mark A. Davenport, Richard G. Baraniuk, Clayton D. Scott

Research output: Contribution to journal › Article › peer-review

61 Scopus citations


This paper studies the training of support vector machine (SVM) classifiers with respect to the minimax and Neyman-Pearson criteria. In principle, these criteria can be optimized in a straightforward way using a cost-sensitive SVM. In practice, however, because these criteria require especially accurate error estimation, standard techniques for tuning SVM parameters, such as cross-validation, can lead to poor classifier performance. To address this issue, we first prove that the usual cost-sensitive SVM, here called the 2C-SVM, is equivalent to another formulation called the 2ν-SVM. We then exploit a characterization of the 2ν-SVM parameter space to develop a simple yet powerful approach to error estimation based on smoothing. In an extensive experimental study, we demonstrate that smoothing significantly improves the accuracy of cross-validation error estimates, leading to dramatic performance gains. Furthermore, we propose coordinate descent strategies that offer significant gains in computational efficiency, with little to no loss in performance.
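The core idea of the smoothing approach can be illustrated in isolation. The sketch below is not the authors' exact procedure: it uses a synthetic, noisy "cross-validation" error surface over a hypothetical 2D cost-parameter grid (standing in for the 2ν-SVM parameter plane) rather than errors from a trained SVM, then compares selecting parameters by the raw minimax score versus a window-smoothed score.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2D grid over the two cost parameters of a cost-sensitive SVM
# (standing in for the 2nu-SVM parameter plane). Each cell holds a noisy
# cross-validation estimate of a class-conditional error rate. The smooth
# quadratic "true" surfaces below are synthetic, for illustration only.
n = 21
g0, g1 = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n), indexing="ij")
true_err_pos = 0.5 * (g0 - 0.3) ** 2 + 0.1
true_err_neg = 0.5 * (g1 - 0.6) ** 2 + 0.1
cv_err_pos = true_err_pos + rng.normal(0.0, 0.05, (n, n))  # noisy CV estimates
cv_err_neg = true_err_neg + rng.normal(0.0, 0.05, (n, n))

def smooth(surface, w=2):
    """Box-filter each cell over a (2w+1) x (2w+1) window of grid neighbors."""
    out = np.empty_like(surface)
    for i in range(surface.shape[0]):
        for j in range(surface.shape[1]):
            out[i, j] = surface[max(i - w, 0):i + w + 1,
                                max(j - w, 0):j + w + 1].mean()
    return out

# Minimax criterion: choose the parameters minimizing the max of the two
# class-conditional errors. (A Neyman-Pearson variant would instead minimize
# the miss rate subject to a constraint on the false-alarm rate.)
raw_score = np.maximum(cv_err_pos, cv_err_neg)
smooth_score = np.maximum(smooth(cv_err_pos), smooth(cv_err_neg))

raw_pick = np.unravel_index(raw_score.argmin(), raw_score.shape)
smooth_pick = np.unravel_index(smooth_score.argmin(), smooth_score.shape)

# Score each pick on the noise-free surfaces: smoothing averages out the
# estimation noise, so it tends to select parameters closer to the true
# minimax optimum.
true_score = np.maximum(true_err_pos, true_err_neg)
print("raw pick true error:   ", true_score[raw_pick])
print("smooth pick true error:", true_score[smooth_pick])
```

The paper's actual method operates on real cross-validation estimates and exploits the structure of the 2ν-SVM parameter space; this toy grid only shows why smoothing the error surface before selecting parameters helps when the individual estimates are noisy.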

Original language: English (US)
Article number: 5401162
Pages (from-to): 1888-1898
Number of pages: 11
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Issue number: 10
State: Published - 2010


Keywords

  • error estimation
  • Minimax classification
  • Neyman-Pearson classification
  • parameter selection
  • support vector machine

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Software
  • Computational Theory and Mathematics
  • Applied Mathematics


