

Poster

RepAn: Enhanced Annealing through Re-parameterization

Xiang Fei · Xiawu Zheng · Yan Wang · Fei Chao · Chenglin Wu · Liujuan Cao

Arch 4A-E Poster #95
Wed 19 Jun 5 p.m. PDT — 6:30 p.m. PDT

Abstract: The simulated annealing algorithm aims to improve model convergence through multiple restarts of training. However, existing annealing algorithms overlook the correlation between different cycles, neglecting the potential for incremental learning. We contend that a fixed network structure prevents the model from recognizing distinct features at different training stages. To this end, we propose RepAn, which redesigns the irreversible re-parameterization (Rep) method and integrates it with annealing to enhance training. Specifically, the network goes through Rep, expansion, restoration, and backpropagation operations during training, iterating through these steps in each annealing round. The method generalizes well and is easy to apply, and we provide theoretical explanations for its effectiveness. Experiments demonstrate that our method improves baseline performance by $6.38\%$ on the CIFAR-100 dataset and $2.80\%$ on ImageNet, achieving state-of-the-art performance in the Rep field. The code is available in our supplementary material.
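The per-round loop the abstract describes (Rep, expansion, restoration, backpropagation, repeated each annealing cycle) can be sketched as follows. This is a minimal illustrative sketch, not the authors' released implementation: it assumes a RepVGG-style block with parallel 3x3 and 1x1 convolution branches, and the `fuse`/`expand` helpers, the zero-initialized restored branch, and the cosine learning-rate schedule are all assumptions made for the example.

```python
# Hypothetical sketch of annealed re-parameterization training.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RepBlock(nn.Module):
    """Two parallel conv branches, algebraically equivalent to one 3x3 conv."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv3 = nn.Conv2d(channels, channels, 3, padding=1, bias=True)
        self.conv1 = nn.Conv2d(channels, channels, 1, bias=True)

    def forward(self, x):
        return self.conv3(x) + self.conv1(x)

    def fuse(self) -> nn.Conv2d:
        """Rep step: merge both branches into a single 3x3 conv (exact equivalence)."""
        fused = nn.Conv2d(self.conv3.in_channels, self.conv3.out_channels,
                          3, padding=1, bias=True)
        # Pad the 1x1 kernel to 3x3 so the two kernels can be summed.
        k1 = F.pad(self.conv1.weight, [1, 1, 1, 1])
        fused.weight.data = self.conv3.weight.data + k1
        fused.bias.data = self.conv3.bias.data + self.conv1.bias.data
        return fused

    @staticmethod
    def expand(fused: nn.Conv2d) -> "RepBlock":
        """Expansion/restoration step: re-open the multi-branch structure.

        The fused kernel seeds the 3x3 branch; the 1x1 branch restarts at zero,
        so the expanded block initially computes the same function (an assumed
        restoration scheme, chosen here only to keep the sketch self-consistent).
        """
        block = RepBlock(fused.in_channels)
        block.conv3.weight.data = fused.weight.data.clone()
        block.conv3.bias.data = fused.bias.data.clone()
        nn.init.zeros_(block.conv1.weight)
        nn.init.zeros_(block.conv1.bias)
        return block


# One annealing round: fuse -> expand -> train with a decaying learning rate.
block = RepBlock(8)
for round_idx in range(3):
    block = RepBlock.expand(block.fuse())  # carry knowledge across rounds
    opt = torch.optim.SGD(block.parameters(), lr=0.1)
    sched = torch.optim.lr_scheduler.CosineAnnealingLR(opt, T_max=100)
    for step in range(100):
        x = torch.randn(4, 8, 16, 16)
        loss = block(x).square().mean()  # placeholder objective
        opt.zero_grad()
        loss.backward()
        opt.step()
        sched.step()
```

The property this sketch tries to preserve is the one the abstract emphasizes: each annealing round starts from a network functionally equivalent to the previous round's fused model, so learning accumulates across cycles instead of being discarded at every restart.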
