arXiv:2006.15791 [cs.LG]

Probabilistic Classification Vector Machine for Multi-Class Classification

Shengfei Lyu, Xing Tian, Yang Li, Bingbing Jiang, Huanhuan Chen

Published 2020-06-29 (Version 1)

The probabilistic classification vector machine (PCVM) synthesizes the advantages of the support vector machine and the relevance vector machine, delivering a sparse Bayesian solution to classification problems. However, the PCVM is currently applicable only to binary cases. Extending the PCVM to multi-class cases via heuristic voting strategies such as one-vs-rest or one-vs-one often leads to a dilemma in which the binary classifiers make contradictory predictions, and those strategies may forfeit the benefits of probabilistic outputs. To overcome this problem, we extend the PCVM and propose a multi-class probabilistic classification vector machine (mPCVM). Two learning algorithms, a top-down one and a bottom-up one, are implemented in the mPCVM. The top-down algorithm obtains maximum a posteriori (MAP) point estimates of the parameters via an expectation-maximization algorithm, while the bottom-up algorithm is an incremental paradigm that maximizes the marginal likelihood. The mPCVM is extensively evaluated on synthetic and benchmark data sets and shows superior performance, especially when the investigated problem has a large number of classes.
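The one-vs-rest dilemma mentioned above can be made concrete with a small sketch. The hand-set linear scorers below are purely illustrative (they are not the PCVM or mPCVM): one-vs-rest trains independent binary classifiers whose hard votes can conflict, whereas a joint multi-class model, like the mPCVM, assigns a single probability distribution over all classes.

```python
import numpy as np

# Hypothetical hand-set binary scorers for three classes (illustration only).
# Each answers "is this point in my class?" independently of the others.
scores = {
    "A": lambda x: x[0] - 0.5,    # votes yes when x0 > 0.5
    "B": lambda x: -x[0] - 0.5,   # votes yes when x0 < -0.5
    "C": lambda x: x[1] - 0.5,    # votes yes when x1 > 0.5
}

def ovr_votes(x):
    """Classes whose independent binary scorer votes 'yes' (score > 0)."""
    return [c for c, f in scores.items() if f(x) > 0]

def joint_softmax(x):
    """A joint model instead normalizes all scores into one distribution."""
    s = np.array([f(x) for f in scores.values()])
    p = np.exp(s - s.max())
    return dict(zip(scores, p / p.sum()))

x = np.array([0.9, 0.9])
print(ovr_votes(x))      # ['A', 'C'] -- two binary classifiers both claim x
print(joint_softmax(x))  # a single distribution; probabilities sum to 1
```

For the point above, two one-vs-rest classifiers simultaneously claim it (and for a point near the origin, none would), while the joint softmax output always yields a coherent probability distribution. This is the kind of contradiction a natively multi-class probabilistic model avoids.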

Related articles:
arXiv:1202.3770 [cs.LG] (Published 2012-02-14)
Hierarchical Maximum Margin Learning for Multi-Class Classification
arXiv:2106.08864 [cs.LG] (Published 2021-06-16)
Multi-Class Classification from Single-Class Data with Confidences
arXiv:0908.4144 [cs.LG] (Published 2009-08-28)
ABC-LogitBoost for Multi-class Classification