Abstract
In this paper, we address the challenging problem of selecting tuning parameters for high-dimensional sparse regression. We propose a simple and computationally efficient method, called path thresholding (PaTh), that transforms any tuning-parameter-dependent sparse regression algorithm into an asymptotically tuning-free sparse regression algorithm. More specifically, we prove that, as the problem size grows (in both the number of variables and the number of observations), PaTh performs accurate sparse regression, under appropriate conditions, without requiring a user-specified tuning parameter. In finite-dimensional settings, we demonstrate that PaTh can alleviate the computational burden of model selection algorithms by significantly reducing the search space of tuning parameters.
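To illustrate the tuning-parameter sensitivity that motivates the paper, the sketch below runs a standard tuning-parameter-dependent sparse regression algorithm (the lasso, solved here by proximal gradient descent / ISTA) at several regularization levels. This is not the PaTh method itself, only a minimal demonstration, assuming synthetic Gaussian data, of how the recovered support depends heavily on the tuning parameter `lam` that PaTh aims to eliminate.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 100, 200, 5                      # observations, variables, true sparsity
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:k] = 3.0                        # k-sparse ground truth
y = X @ beta_true + 0.1 * rng.standard_normal(n)

def lasso_ista(X, y, lam, n_iter=500):
    """Proximal gradient (ISTA) for the lasso objective
    0.5 * ||y - X b||^2 + lam * ||b||_1; `lam` is the tuning parameter."""
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the smooth part
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)
        z = b - grad / L
        # soft-thresholding: the proximal operator of the l1 penalty
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return b

# The estimated support varies drastically with the tuning parameter:
for lam in (0.1, 10.0, 200.0):
    support = np.flatnonzero(np.abs(lasso_ista(X, y, lam)) > 1e-6)
    print(f"lam = {lam:6.1f}: {len(support)} nonzero coefficients")
```

A tiny `lam` leaves the solution nearly dense, while a large `lam` prunes aggressively; in practice one must search over a grid of such values, which is precisely the cost that a tuning-free procedure like PaTh is designed to avoid.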
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 948-957 |
| Number of pages | 10 |
| Journal | Journal of Machine Learning Research |
| Volume | 33 |
| State | Published - 2014 |
| Event | 17th International Conference on Artificial Intelligence and Statistics, AISTATS 2014 |
| Location | Reykjavik, Iceland |
| Duration | Apr 22, 2014 → Apr 25, 2014 |
ASJC Scopus subject areas
- Software
- Artificial Intelligence
- Control and Systems Engineering
- Statistics and Probability