TY - JOUR
T1 - Solution path for manifold regularized semisupervised classification
AU - Wang, Gang
AU - Wang, Fei
AU - Chen, Tao
AU - Yeung, Dit-Yan
AU - Lochovsky, Frederick H.
PY - 2012/4
Y1 - 2012/4
N2 - Traditional learning algorithms use only labeled data for training. However, labeled examples are often difficult or time-consuming to obtain, since they require substantial human labeling effort. On the other hand, unlabeled data are often relatively easy to collect. Semisupervised learning addresses this problem by combining large quantities of unlabeled data with labeled data to build better learning algorithms. In this paper, we use the manifold regularization approach to formulate the semisupervised learning problem as a regularization framework that balances a tradeoff between loss and penalty. We investigate different implementations of the loss function and identify the methods with the least computational expense. The regularization hyperparameter, which determines the balance between loss and penalty, is crucial to model selection. Accordingly, we derive an algorithm that can fit the entire path of solutions for every value of the hyperparameter. Its computational complexity after preprocessing is quadratic only in the number of labeled examples rather than in the total number of labeled and unlabeled examples.
AB - Traditional learning algorithms use only labeled data for training. However, labeled examples are often difficult or time-consuming to obtain, since they require substantial human labeling effort. On the other hand, unlabeled data are often relatively easy to collect. Semisupervised learning addresses this problem by combining large quantities of unlabeled data with labeled data to build better learning algorithms. In this paper, we use the manifold regularization approach to formulate the semisupervised learning problem as a regularization framework that balances a tradeoff between loss and penalty. We investigate different implementations of the loss function and identify the methods with the least computational expense. The regularization hyperparameter, which determines the balance between loss and penalty, is crucial to model selection. Accordingly, we derive an algorithm that can fit the entire path of solutions for every value of the hyperparameter. Its computational complexity after preprocessing is quadratic only in the number of labeled examples rather than in the total number of labeled and unlabeled examples.
KW - manifold regularization
KW - semi-supervised classification
KW - solution path
UR - https://www.webofscience.com/wos/woscc/full-record/WOS:000302097000003
UR - https://openalex.org/W2159880745
UR - https://www.scopus.com/pages/publications/84859009842
U2 - 10.1109/TSMCB.2011.2168205
DO - 10.1109/TSMCB.2011.2168205
M3 - Journal Article
C2 - 22010154
SN - 1083-4419
VL - 42
SP - 308
EP - 319
JO - IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
JF - IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
IS - 2
M1 - 6046145
ER -