[KD][CLS][OD] Improving Knowledge Distillation via Regularizing Feature Norm and Direction
[SSL][DG][CLS] Generalized Semi-Supervised Learning via Self-Supervised Feature Adaptation
[TTA][VLM] SwapPrompt: Test-Time Prompt Adaptation for Vision-Language Models
[VLM][CLS] EVA-CLIP: Improved Training Techniques for CLIP at Scale
[VLM][SS][OD] DenseCLIP: Language-Guided Dense Prediction with Context-Aware Prompting