"MutexMatch: Semi-Supervised Learning With Mutex-Based Consi

"MutexMatch: Semi-Supervised Learning With Mutex-Based Consistency Regu" by Yue Duan, Zhen Zhao et al.

The core issue in semi-supervised learning (SSL) lies in how to effectively leverage unlabeled data: most existing methods place great emphasis on exploiting high-confidence samples yet seldom fully explore the use of low-confidence samples. In this article, we aim to utilize low-confidence samples in a novel way with our proposed mutex-based consistency regularization, namely MutexMatch. Specifically, high-confidence samples are required to predict exactly "what it is" by the conventional true-positive classifier (TPC), while low-confidence samples are assigned a simpler goal—predicting "what it is not" by the true-negative classifier (TNC). In this way, we not only mitigate pseudo-labeling errors but also make full use of the low-confidence unlabeled data through consistency of the dissimilarity degree. MutexMatch achieves superior performance on multiple benchmark datasets, i.e., Canadian Institute for Advanced Research (CIFAR)-10, CIFAR-100, street view house numbers (SVHN), self-taught learning 10 (STL-10), and mini-ImageNet. More importantly, our method shows further superiority when labeled data are scarce, e.g., 92.23% accuracy with only 20 labeled samples on CIFAR-10. Code has been released at https://github.com/NJUyued/MutexMatch4SSL.
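The TPC/TNC split described in the abstract can be sketched as a confidence-based branching rule: confident predictions yield a positive pseudo-label ("what it is"), while uncertain predictions yield only a negative label ("what it is not"). The sketch below is illustrative only, not the authors' implementation; the threshold value, the `split_pseudo_labels` helper, and the choice of the least-likely class as the negative label are all assumptions.

```python
# Illustrative sketch (assumed, not from the MutexMatch codebase):
# high-confidence samples get a positive pseudo-label via argmax,
# low-confidence samples get a negative ("is not") label, taken here
# as the class with the lowest predicted probability.
import numpy as np

def split_pseudo_labels(logits: np.ndarray, threshold: float = 0.95):
    """Return (positive_labels, negative_labels); -1 marks 'unused'."""
    # Numerically stable softmax over the class dimension.
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    conf = probs.max(axis=1)
    # Confident rows: keep argmax as a positive pseudo-label.
    pos = np.where(conf >= threshold, probs.argmax(axis=1), -1)
    # Uncertain rows: the easier task is naming a class it is NOT.
    neg = np.where(conf < threshold, probs.argmin(axis=1), -1)
    return pos, neg

logits = np.array([[8.0, 0.5, 0.1],    # confident -> positive label 0
                   [1.0, 0.9, 0.2]])   # uncertain -> negative label 2
pos, neg = split_pseudo_labels(logits)
```

In the full method, both branches are trained with consistency regularization between weakly and strongly augmented views; this sketch only shows how the two label types could be derived from one forward pass.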

Related Keywords

Canadian Institute For Advanced Research, Data Models, Entropy, Labeling, Mutex-Based Consistency Regularization, Predictive Models, Semi-Supervised Classification, Semi-Supervised Learning, Task Analysis, Training
