Paper in Workshop: VAND: Visual Anomaly and Novelty Detection - 3rd Edition
SK-RD4AD: Skip-Connected Reverse Distillation for Robust One-Class Anomaly Detection
EunJu Park · Taekyung Kim · Minju Kim · Hojun Lee · Gil-Jun Lee
Anomaly detection plays a critical role in industrial, healthcare, and security applications by enabling early identification of defects. While reverse knowledge distillation (KD) has shown promise for one-class anomaly detection, existing models often suffer from deep feature loss due to excessive compression in the student network, limiting their ability to detect fine-grained anomalies. We propose SK-RD4AD, a novel framework that introduces non-corresponding skip connections from intermediate teacher layers to deeper student layers. This cross-hierarchical feature transfer preserves multi-scale representations, enhancing both semantic alignment and anomaly localization. Extensive experiments on MVTec-AD, VisA, and VAD demonstrate that SK-RD4AD consistently outperforms prior methods: it improves AUROC by 3.5% on VAD, AUPRO by 21% on VisA, and achieves a 1% gain on MVTec-AD. The model is particularly robust on challenging cases such as the Transistor category in MVTec-AD and generalizes well across diverse domains. Our results establish SK-RD4AD as a robust and scalable solution for real-world one-class anomaly detection. Code is available at: https://github.com/pej0918/SK-RD4AD
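To make the core idea concrete, here is a minimal NumPy sketch (not the authors' implementation; all shapes, names, and the projection are hypothetical) of a non-corresponding skip connection: a shallower teacher feature map is pooled and channel-projected to the size of a deeper student feature map and added to it, and an anomaly map is then scored by per-location cosine distance between teacher and student features, as is standard in reverse-distillation methods.

```python
import numpy as np

rng = np.random.default_rng(0)

def avg_pool2x2(x):
    """Spatially downsample a (C, H, W) feature map by 2 via average pooling."""
    c, h, w = x.shape
    return x.reshape(c, h // 2, 2, w // 2, 2).mean(axis=(2, 4))

def skip_fuse(student_deep, teacher_mid, proj):
    """Non-corresponding skip connection (hypothetical sketch):
    pool the intermediate teacher feature to the deeper spatial size,
    map its channels with `proj` (C_out, C_in), and add to the student."""
    t = avg_pool2x2(teacher_mid)           # match spatial resolution
    t = np.einsum('oc,chw->ohw', proj, t)  # match channel dimension
    return student_deep + t

def cosine_anomaly_map(f_t, f_s, eps=1e-8):
    """Per-location anomaly score: 1 - cosine similarity over channels."""
    num = (f_t * f_s).sum(axis=0)
    den = np.linalg.norm(f_t, axis=0) * np.linalg.norm(f_s, axis=0) + eps
    return 1.0 - num / den

# Toy shapes: intermediate teacher layer (64, 16, 16), deeper layers (128, 8, 8).
teacher_mid = rng.standard_normal((64, 16, 16))
teacher_deep = rng.standard_normal((128, 8, 8))
student_deep = rng.standard_normal((128, 8, 8))
proj = rng.standard_normal((128, 64)) * 0.1  # hypothetical channel projection

fused = skip_fuse(student_deep, teacher_mid, proj)
amap = cosine_anomaly_map(teacher_deep, fused)
print(amap.shape)  # (8, 8) anomaly map at the deep layer's resolution
```

The sketch only illustrates the data flow: the skip path lets the deeper student representation retain information from a shallower teacher stage, which is the mechanism the abstract credits for preserving multi-scale features.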