Relaxed Clipping: A Global Training Method for Robust Regression and Classification

Dec 1, 2010 · Y. Yu, M. Yang, L. Xu, M. White, D. Schuurmans
Abstract
Robust regression and classification are often thought to require non-convex loss functions that prevent scalable, global training. However, this view neglects the possibility of reformulated training methods that yield practically solvable alternatives. A natural way to make a loss function more robust to outliers is to truncate loss values that exceed a maximum threshold. We demonstrate that a relaxation of this form of 'loss clipping' can be made globally solvable and applicable to any standard loss while guaranteeing robustness against outliers. We present a generic procedure that can be applied to standard loss functions and demonstrate improved robustness in regression and classification problems.
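To make the clipping idea in the abstract concrete, here is a minimal sketch of a clipped squared loss: per-example losses above a threshold are truncated, so a single gross outlier contributes at most a constant to the objective. This shows only the basic (non-convex) clipped loss, not the paper's convex relaxation; the names `clipped_loss` and `tau` are illustrative choices, not from the paper.

```python
import numpy as np

def clipped_loss(residuals, tau=1.0):
    """Clipped squared loss: truncate per-example losses at threshold tau.

    Losses above tau contribute only the constant tau, so outliers cannot
    dominate the training objective. (Illustrative sketch only; the paper
    proposes a globally solvable relaxation of this clipped objective.)
    """
    base = residuals ** 2          # any standard loss could be substituted here
    return np.minimum(base, tau)   # clip: min(loss, tau)

# Example: one gross outlier among small residuals.
r = np.array([0.1, -0.2, 0.05, 10.0])
print((r ** 2).sum())              # unclipped: the outlier dominates (~100)
print(clipped_loss(r).sum())       # clipped: the outlier adds at most tau = 1.0
```

Under the unclipped squared loss, the outlier's contribution swamps the inliers; after clipping, every example's influence is bounded, which is the robustness property the relaxation is designed to preserve.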
Publication
Advances in Neural Information Processing Systems (NeurIPS)