Abstract
Traditional extensions of the binary support vector machine (SVM) to multiclass problems
are either heuristics or require solving a large dual optimization problem. Here, a generalized
multiclass SVM, called GenSVM, is proposed. In this method, classification boundaries
for a K-class problem are constructed in a (K − 1)-dimensional space using a simplex
encoding. Additionally, several different weightings of the misclassification errors are incorporated
in the loss function, such that it generalizes three existing multiclass SVMs
through a single optimization problem. An iterative majorization algorithm is derived that
solves the optimization problem without the need of a dual formulation. This algorithm has
the advantage that it can use warm starts during cross validation and during a grid search,
which significantly speeds up the training phase. Rigorous numerical experiments compare
linear GenSVM with seven existing multiclass SVMs on both small and large data sets.
These comparisons show that the proposed method is competitive with existing methods
in both predictive accuracy and training time, and that it significantly outperforms several
existing methods on these criteria.
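
To make the simplex encoding concrete, the sketch below is a minimal illustration, not the paper's implementation: it constructs a regular simplex centered at the origin in (K − 1) dimensions and classifies a projected point by its nearest vertex. The function names `simplex_encoding` and `nearest_vertex`, the specific construction, and the nearest-vertex rule are assumptions for illustration and need not match the exact encoding matrix or decision rule used in GenSVM. For K = 3, the vertices form an equilateral triangle in the plane.

```python
import numpy as np

def simplex_encoding(K):
    """Rows are the K vertices of a regular simplex centered at the origin
    in R^(K-1), with unit edge length.  Illustrative construction only: the
    encoding matrix used by GenSVM may differ by a rotation or scaling."""
    # Standard basis vectors of R^K, centered at their mean; these K points
    # form a regular simplex inside the hyperplane {x : sum(x) = 0}.
    V = np.eye(K) - np.full((K, K), 1.0 / K)
    # Orthonormal basis of that (K-1)-dimensional hyperplane via SVD.
    _, _, Vt = np.linalg.svd(V)
    B = Vt[: K - 1].T                  # shape (K, K-1)
    # Express the centered vertices in this basis and rescale to unit edges.
    return (V @ B) / np.sqrt(2.0)

def nearest_vertex(s, U):
    """Assign the class whose simplex vertex is closest to the projected
    point s (a natural decision rule, used here purely for illustration)."""
    return int(np.argmin(np.linalg.norm(U - s, axis=1)))

if __name__ == "__main__":
    U = simplex_encoding(3)                 # equilateral triangle in R^2
    print(np.round(U, 3))                   # 3 vertices, pairwise distance 1
    print(nearest_vertex(np.array([0.4, 0.1]), U))
```

In this picture, training amounts to learning the linear map that sends each object into the (K − 1)-dimensional space so that it lands near the vertex of its own class, which is what the weighted misclassification errors in the GenSVM loss penalize.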
| Original language | English |
|---|---|
| Pages (from-to) | 1-42 |
| Number of pages | 42 |
| Journal | Journal of Machine Learning Research |
| Volume | 17 |
| Issue number | 225 |
| Publication status | Published - 2016 |