Generalization Guarantees of Self-Training of Halfspaces under Label Noise Corruption

Conference paper, 2023

Abstract

We investigate the generalization properties of a self-training algorithm with halfspaces. The approach iteratively learns a list of halfspaces from labeled and unlabeled training data, where each iteration consists of two steps: exploration and pruning. In the exploration phase, a halfspace is found sequentially by maximizing the unsigned margin among unlabeled examples, and pseudo-labels are assigned to those whose distance to it exceeds the current threshold. These pseudo-labels are presumed to be corrupted by noise. The training set is then augmented with the noisy pseudo-labeled examples, and a new classifier is trained. This process is repeated until no unlabeled examples remain for pseudo-labeling. In the pruning phase, pseudo-labeled samples whose distance to the last halfspace is greater than the associated unsigned margin are discarded. We prove that the misclassification error of the resulting sequence of classifiers is bounded, and we show that the resulting semi-supervised approach never degrades performance compared to a classifier learned from the initial labeled training set alone. Experiments carried out on a variety of benchmarks demonstrate the efficiency of the proposed approach compared to state-of-the-art methods.
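To make the exploration and pruning steps concrete, below is a minimal Python sketch of such a loop. It is an illustration, not the authors' implementation: the base learner (scikit-learn's LinearSVC), the fixed unsigned-margin threshold theta, and the function name self_train_halfspaces are assumptions made for exposition; the paper selects its halfspaces and thresholds by maximizing the unsigned margin and comes with guarantees this sketch does not reproduce.

```python
import numpy as np
from sklearn.svm import LinearSVC

def self_train_halfspaces(X_lab, y_lab, X_unlab, theta=1.0):
    """Illustrative self-training loop (exploration + pruning).

    `theta` is a hypothetical fixed unsigned-margin threshold; the paper
    instead derives thresholds from the unsigned margins themselves.
    Assumes y_lab contains both classes.
    """
    X, y = X_lab.copy(), y_lab.copy()
    remaining = X_unlab.copy()
    n_lab = len(y_lab)
    halfspaces = []

    # Exploration: fit a halfspace, pseudo-label unlabeled points lying
    # far from the decision boundary, augment the training set, retrain.
    while len(remaining) > 0:
        clf = LinearSVC().fit(X, y)
        margins = np.abs(clf.decision_function(remaining))  # unsigned margins
        confident = margins >= theta
        if not confident.any():
            break  # no unlabeled example clears the threshold
        pseudo = clf.predict(remaining[confident])  # possibly noisy labels
        X = np.vstack([X, remaining[confident]])
        y = np.concatenate([y, pseudo])
        remaining = remaining[~confident]
        halfspaces.append(clf)

    # Pruning: discard pseudo-labeled points whose distance to the last
    # halfspace exceeds the associated unsigned-margin threshold.
    if halfspaces and len(X) > n_lab:
        dist = np.abs(halfspaces[-1].decision_function(X[n_lab:]))
        keep = dist <= theta
        X = np.vstack([X[:n_lab], X[n_lab:][keep]])
        y = np.concatenate([y[:n_lab], y[n_lab:][keep]])

    return halfspaces, X, y
```

Note that in this sketch the initial labeled examples are never pruned, mirroring the abstract's claim that the semi-supervised procedure never degrades performance relative to a classifier trained on the labeled data alone.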

Dates and versions

hal-04763760, version 1 (02-11-2024)

Cite

Lies Hadjadj, Massih-Reza Amini, Sana Louhichi. Generalization Guarantees of Self-Training of Halfspaces under Label Noise Corruption. Thirty-Second International Joint Conference on Artificial Intelligence (IJCAI-23), Aug 2023, Macau, China, pp. 3777-3785. ⟨10.24963/IJCAI.2023/420⟩. ⟨hal-04763760⟩