Pseudo Label
329 papers with code • 0 benchmarks • 0 datasets
A lightweight but very powerful technique for semi-supervised learning
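As a concrete illustration of the general idea (not the method of any specific paper below), a minimal self-training loop can be sketched: train on the labeled set, predict on unlabeled data, keep only high-confidence predictions as pseudo labels, and retrain. This sketch uses scikit-learn on synthetic data; the confidence threshold of 0.9 is an illustrative choice, not a canonical value.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic two-class data: two Gaussian blobs.
n = 100
X = np.vstack([rng.normal(-2, 1, size=(n, 2)), rng.normal(2, 1, size=(n, 2))])
y = np.array([0] * n + [1] * n)

# Small labeled set, large unlabeled pool.
idx = rng.permutation(2 * n)
lab, unlab = idx[:20], idx[20:]

# Step 1: train on labeled data only.
clf = LogisticRegression().fit(X[lab], y[lab])

# Step 2: pseudo-label unlabeled points, keeping only confident predictions.
proba = clf.predict_proba(X[unlab])
keep = proba.max(axis=1) >= 0.9  # confidence threshold guards against noisy pseudo labels

# Step 3: retrain on labeled data plus confident pseudo-labeled data.
X_aug = np.vstack([X[lab], X[unlab][keep]])
y_aug = np.concatenate([y[lab], proba[keep].argmax(axis=1)])
clf = LogisticRegression().fit(X_aug, y_aug)
```

In practice this loop is often repeated for several rounds, with the threshold or the fraction of retained pseudo labels adjusted as the model improves.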
Benchmarks
These leaderboards are used to track progress in Pseudo Label
Libraries
Use these libraries to find Pseudo Label models and implementations
Most implemented papers
Semi-supervised Left Atrium Segmentation with Mutual Consistency Training
Such mutual consistency encourages the two decoders to have consistent and low-entropy predictions and enables the model to gradually capture generalized features from these unlabeled challenging regions.
Semi-DETR: Semi-Supervised Object Detection with Detection Transformers
Specifically, we propose a Stage-wise Hybrid Matching strategy that combines the one-to-many assignment and one-to-one assignment strategies to improve the training efficiency of the first stage and thus provide high-quality pseudo labels for the training of the second stage.
Roll With the Punches: Expansion and Shrinkage of Soft Label Selection for Semi-supervised Fine-Grained Learning
While semi-supervised learning (SSL) has yielded promising results, a more realistic SSL scenario remains to be explored, in which the unlabeled data exhibits extremely high recognition difficulty, e.g., fine-grained visual classification in the context of SSL (SS-FGVC).
Learning without Exact Guidance: Updating Large-scale High-resolution Land Cover Maps from Low-resolution Historical Labels
However, it is still a non-trivial task hindered by complex ground details, various landforms, and the scarcity of accurate training labels over a wide-span geographic area.
Mutual Mean-Teaching: Pseudo Label Refinery for Unsupervised Domain Adaptation on Person Re-identification
In order to mitigate the effects of noisy pseudo labels, we propose an unsupervised framework, Mutual Mean-Teaching (MMT), that softly refines the pseudo labels in the target domain, learning better features from the target domain via off-line refined hard pseudo labels and on-line refined soft pseudo labels in an alternating training manner.
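MMT's full pipeline is beyond a short snippet, but the exponential-moving-average weight update at the heart of mean-teacher methods in general (not MMT-specific code) can be sketched as follows; the dictionary-of-arrays representation of model weights is a simplification for illustration:

```python
import numpy as np

def ema_update(teacher, student, alpha=0.999):
    """Move teacher weights toward student weights by an exponential moving average.

    The teacher evolves slowly (alpha close to 1), so its predictions are a
    temporally smoothed, more stable source of soft pseudo labels.
    """
    return {k: alpha * teacher[k] + (1 - alpha) * student[k] for k in teacher}

# Toy example: one weight tensor per model.
student = {"w": np.array([1.0, 2.0])}
teacher = {"w": np.array([0.0, 0.0])}
teacher = ema_update(teacher, student, alpha=0.9)
```

After each optimizer step on the student, the teacher is updated this way rather than by gradient descent, which is what makes its soft targets smoother than the student's own predictions.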
Generative Pseudo-label Refinement for Unsupervised Domain Adaptation
We exploit this finding in an iterative procedure where a generative model and a classifier are jointly trained: in turn, the generator allows sampling cleaner data from the target distribution, and the classifier allows associating better labels with target samples, progressively refining target pseudo-labels.
Weakly supervised discriminative feature learning with state information for person identification
We evaluate our model on unsupervised person re-identification and pose-invariant face recognition.
SoQal: Selective Oracle Questioning for Consistency Based Active Learning of Cardiac Signals
One way to mitigate this burden is via active learning (AL) which involves the (a) acquisition and (b) annotation of informative unlabelled instances.
Improved Mutual Mean-Teaching for Unsupervised Domain Adaptive Re-ID
SDA, a domain-translation-based framework, focuses on carefully translating the source-domain images to the target domain.
Delving into Inter-Image Invariance for Unsupervised Visual Representations
In this work, we present a comprehensive empirical study to better understand the role of inter-image invariance learning from three main constituting components: pseudo-label maintenance, sampling strategy, and decision boundary design.