Linearized Smooth Additive Classifiers

Subhransu Maji
Toyota Technological Institute at Chicago, Chicago, IL 60637, USA
smaji@ttic.edu

Abstract. We consider a framework for learning additive classifiers based on regularized empirical risk minimization, where the regularization favors "smooth" functions. We present representations of classifiers for which the optimization problem can be solved efficiently. The first family of such classifiers is derived from a penalized spline formulation due to Eilers and Marx, which we modify to enable linearization. The second is a novel family of classifiers based on classes of orthogonal basis functions with orthogonal derivatives. Both families lead to explicit feature embeddings that can be used with off-the-shelf linear solvers such as LIBLINEAR to obtain additive classifiers. Compared to the state of the art in additive classifier training, the proposed classifiers offer better trade-offs between training time, memory overhead, and accuracy.

LNCS 7583, p. 239 ff.
lncs@springer.com
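The idea of an explicit feature embedding for additive classifiers can be sketched as follows: each input dimension is expanded through a small set of basis functions, so that a linear model in the embedded space is an additive function of the original features. The sketch below is illustrative only, under assumed conventions: it uses a cosine basis (one family whose members and whose derivatives are both mutually orthogonal on [0, 1]), and the function name `additive_embedding` and the choice of four basis functions are hypothetical, not the paper's exact construction.

```python
import numpy as np

def additive_embedding(X, n_basis=4):
    """Expand each feature through a cosine basis (illustrative sketch).

    Assumes features are scaled to [0, 1]. phi_k(x) = sqrt(2) * cos(pi*k*x)
    is orthonormal on [0, 1], and the derivatives phi_k'(x), which are
    proportional to sin(pi*k*x), are mutually orthogonal as well.
    A linear classifier trained on the output is additive in the input:
    f(x) = sum_j f_j(x_j), with each f_j a linear combination of phi_k.
    """
    X = np.asarray(X, dtype=float)          # shape (n_samples, n_features)
    k = np.arange(1, n_basis + 1)           # basis indices 1..n_basis
    # Broadcast to shape (n_samples, n_features, n_basis).
    feats = np.sqrt(2.0) * np.cos(np.pi * X[:, :, None] * k)
    # Flatten so an off-the-shelf linear solver (e.g. LIBLINEAR) can be used.
    return feats.reshape(X.shape[0], -1)

# Usage: embed, then hand the result to any linear solver.
X = np.random.rand(5, 3)                    # 5 samples, 3 features in [0, 1]
Z = additive_embedding(X, n_basis=4)        # shape (5, 12)
```

The weight vector learned on `Z` then defines one smooth one-dimensional function per original feature, which is what makes the resulting classifier additive.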