

Poster

DeIL: Direct-and-Inverse CLIP for Open-World Few-Shot Learning

Shuai Shao · Yu Bai · Yan WANG · Bao-di Liu · Yicong Zhou

Arch 4A-E Poster #420
Fri 21 Jun 5 p.m. PDT — 6:30 p.m. PDT

Abstract:

Open-World Few-Shot Learning (OFSL) is a critical research area that focuses on accurately identifying target samples in environments with scarce data and unreliable labels, and therefore has substantial practical significance. Recently, the evolution of foundation models such as CLIP has revealed their strong representation capacity, even in settings with limited resources and data. This development has prompted a significant shift in focus, from the traditional approach of “building models from scratch” to a strategy centered on “efficiently utilizing the capabilities of foundation models to extract prior knowledge tailored for OFSL and apply it judiciously”. Against this backdrop, we introduce Direct-and-Inverse CLIP (DeIL), a method that leverages our proposed “Direct-and-Inverse” concept to adapt CLIP-based methods to OFSL. This concept transforms conventional single-step classification into a two-stage process: first filtering out less probable categories, then accurately determining the specific category of each sample. DeIL comprises two key components: a frozen pre-trainer for data denoising and a tunable adapter for precise final classification. In experiments, DeIL achieves state-of-the-art performance on 11 datasets. Code: https://github.com/The-Shuai/DeIL.
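To make the two-stage “Direct-and-Inverse” idea concrete, below is a minimal sketch of such a pipeline on CLIP-style features. This is not the paper’s implementation (it omits the frozen pre-trainer and tunable adapter entirely); the function name `direct_inverse_classify`, the fixed `keep_ratio` filtering rule, and the softmax scoring are all illustrative assumptions.

```python
# Hypothetical sketch of a two-stage "inverse then direct" classifier.
# Assumptions (not from the paper): L2-normalized CLIP-style embeddings,
# a fixed keep-ratio for the inverse filtering stage, softmax scoring.
import numpy as np

def direct_inverse_classify(image_feat, text_feats, keep_ratio=0.3):
    """Classify an image in two stages.

    image_feat: (d,) L2-normalized image embedding.
    text_feats: (C, d) L2-normalized class-prompt embeddings.
    keep_ratio: fraction of classes retained after the inverse stage.
    """
    # Stage 1 (inverse): rule out the least probable categories by
    # discarding the classes with the lowest image-text similarity.
    sims = text_feats @ image_feat                  # (C,) cosine similarities
    k = max(1, int(len(sims) * keep_ratio))
    candidates = np.argsort(sims)[-k:]              # surviving class indices

    # Stage 2 (direct): pick the specific category among the survivors only.
    logits = sims[candidates]
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return int(candidates[np.argmax(probs)])
```

The design intuition, as described in the abstract, is that eliminating clearly wrong categories first shrinks the label space the final (direct) decision must discriminate over, which can help when labels are noisy and examples are few.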
