Monday, May 22, 2023

CVPR 2023 - Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation

In this episode we discuss Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation by Gaurav Patel, Konda Reddy Mopuri, and Qiang Qiu. The paper tackles the non-stationary distribution of pseudo-samples in Adversarial Data-Free Knowledge Distillation (DFKD): as the generator's output distribution shifts, the student risks forgetting what it learned from earlier samples. The proposed framework, Learning to Retain while Acquiring, casts learning from newly generated samples as a meta-train task and retaining knowledge on previously encountered samples as a meta-test task. The authors also identify an implicit aligning factor between the two tasks, showing that their student update strategy enforces a common gradient direction serving both objectives. The method's effectiveness is demonstrated through extensive evaluation and comparison on multiple datasets.
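For listeners who want a concrete picture of the update, below is a minimal, first-order PyTorch sketch of how such a meta-train/meta-test student step could look. All names (student, teacher, new_samples, replay_samples, kd_loss) and hyperparameters are hypothetical placeholders; this illustrates the general idea rather than the authors' implementation.

```python
# Hypothetical sketch: a first-order meta-train/meta-test student update for
# adversarial DFKD. Not the paper's code; the replay memory bank, learning
# rate, and distillation temperature are all assumptions.
import copy
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=4.0):
    # Standard KL-divergence distillation loss (temperature T is an assumption).
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

def student_meta_update(student, teacher, new_samples, replay_samples, lr=0.01):
    # Cache teacher targets for both batches; the teacher is frozen in DFKD.
    with torch.no_grad():
        t_new, t_old = teacher(new_samples), teacher(replay_samples)

    # Meta-train: provisional step on freshly generated pseudo-samples
    # (knowledge acquisition), taken on a cloned "fast" copy of the student.
    fast = copy.deepcopy(student)
    inner_opt = torch.optim.SGD(fast.parameters(), lr=lr)
    inner_opt.zero_grad()
    kd_loss(fast(new_samples), t_new).backward()
    inner_opt.step()

    # Meta-test: measure retention of the provisionally updated student on
    # previously generated samples drawn from a memory bank.
    retain_loss = kd_loss(fast(replay_samples), t_old)
    g_retain = torch.autograd.grad(retain_loss, tuple(fast.parameters()))

    # First-order combined update on the real student: both gradients are
    # applied together, nudging the parameters toward a common direction
    # that serves acquisition and retention at once.
    params = tuple(student.parameters())
    g_acquire = torch.autograd.grad(kd_loss(student(new_samples), t_new), params)
    with torch.no_grad():
        for p, ga, gr in zip(params, g_acquire, g_retain):
            p.add_(ga + gr, alpha=-lr)
```

In the full adversarial loop, a step like this would alternate with generator updates that produce new_samples, while replay_samples would be drawn from a buffer of earlier generations; the first-order combination shown here stands in for the implicit gradient alignment the paper analyzes.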
