# What Knowledge Gets Distilled in Knowledge Distillation?
**NeurIPS 2023**

**Authors:** Utkarsh Ojha\*, Yuheng Li\*, Anirudh Sundara Rajan\*, Yingyu Liang, Yong Jae Lee (\*joint first authors)
## Links
- 📄 Paper