Recently, unsupervised deep hashing has attracted increasing attention, mainly because of its potential to learn binary codes without identity annotations. However, because labels are predicted by pretext tasks, unsupervised deep hashing becomes unstable when learning with noisy labels. To mitigate this issue, we propose a simple but effective self-supervised hash learning approach based on dual pseudo agreement. By adding a consistency constraint, our method suppresses corrupted labels and encourages generalization for effective knowledge distillation. Specifically, we use the refined pseudo labels as a stabilizing constraint to train hash codes, which implicitly encodes the semantic structure of the data into the learned Hamming space.
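The "dual pseudo agreement" idea can be illustrated with a minimal sketch. The function name and the two-head setup below are assumptions for illustration, not the paper's actual implementation: two pretext heads each predict a pseudo label per sample, and a label is trusted only when both predictions agree.

```python
# Hypothetical sketch of dual-pseudo-agreement label filtering.
# labels_a and labels_b are pseudo labels from two pretext heads;
# a sample's label is kept only when the two heads agree.

def filter_by_agreement(labels_a, labels_b):
    """Return {sample_index: label} for samples whose pseudo labels agree."""
    return {i: a for i, (a, b) in enumerate(zip(labels_a, labels_b)) if a == b}

# Example: the heads disagree on sample 1, so only samples 0 and 2 survive.
kept = filter_by_agreement([3, 5, 7], [3, 2, 7])
```

Only the surviving labels would then act as the stabilizing constraint during hash-code training.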
Based on the stable pseudo labels, we propose a self-supervised hashing method with mutual information and a noise contrastive loss. Throughout hash learning, the stable pseudo labels and data distributions work together as teachers to guide binary code learning. Extensive experiments on three publicly available datasets demonstrate that the proposed method consistently outperforms state-of-the-art methods by large margins.
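A noise contrastive loss of this kind is commonly instantiated as an InfoNCE-style objective, which is a noise-contrastive-estimation lower bound on mutual information. The sketch below shows the generic per-anchor form for one positive similarity and a set of negative similarities; the function name and temperature value are illustrative assumptions, not the paper's exact formulation.

```python
import math

def info_nce(sim_pos, sims_neg, temperature=0.5):
    """Generic InfoNCE-style noise contrastive loss for one anchor:
    -log( exp(s+/t) / (exp(s+/t) + sum_j exp(s-_j/t)) ).
    sim_pos: similarity to the positive (e.g. same pseudo label);
    sims_neg: similarities to negatives (noise samples)."""
    logits = [sim_pos / temperature] + [s / temperature for s in sims_neg]
    m = max(logits)  # subtract the max for a numerically stable log-sum-exp
    log_denom = m + math.log(sum(math.exp(l - m) for l in logits))
    return log_denom - sim_pos / temperature
```

Raising the positive similarity (or lowering the negatives') decreases the loss, so minimizing it pulls codes with agreeing pseudo labels together in Hamming space while pushing noise samples apart.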