AWEncoder: Adversarial Watermarking Pre-Trained Encoders in Contrastive Learning
As a self-supervised learning paradigm, contrastive learning has been widely used to pre-train a powerful encoder as an effective feature extractor for various downstream tasks. This process requires a large amount of unlabeled training data and computational resources, which makes the pre-trained encoder a valuable intellectual property of its owner.