Description
1. A self-distillation scheme built on distilling between different augmented/distorted views of the same input, processed by the same student network.
2. An MMD loss distilling the features extracted from the different augmented/distorted views.
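For reference, a minimal sketch of the two losses in PyTorch. This assumes batched logits and flattened feature vectors from the two views; the temperature `T` and the Gaussian-kernel bandwidth `sigma` are illustrative placeholders, not values taken from the paper or from KD-Lib.

```python
import torch
import torch.nn.functional as F

def kl_consistency_loss(logits_a, logits_b, T=3.0):
    # Symmetric KL between the softened predictions of the two distorted views.
    # T is a hypothetical temperature.
    log_p_a = F.log_softmax(logits_a / T, dim=1)
    log_p_b = F.log_softmax(logits_b / T, dim=1)
    kl_ab = F.kl_div(log_p_a, log_p_b, reduction="batchmean", log_target=True)
    kl_ba = F.kl_div(log_p_b, log_p_a, reduction="batchmean", log_target=True)
    return (T ** 2) * (kl_ab + kl_ba)

def mmd_loss(feat_a, feat_b, sigma=1.0):
    # Gaussian-kernel MMD between the two views' feature batches of shape (B, D).
    # sigma is a hypothetical bandwidth.
    def kernel(x, y):
        sq_dist = torch.cdist(x, y) ** 2
        return torch.exp(-sq_dist / (2 * sigma ** 2))
    return (kernel(feat_a, feat_a).mean()
            + kernel(feat_b, feat_b).mean()
            - 2 * kernel(feat_a, feat_b).mean())
```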
Modifications
Removing the MMD loss and retaining only the KL loss is probably fine, since the KL loss alone already achieves competitive performance.
In my local experiments on CIFAR-10/100, the method proves to be a very powerful self-distillation scheme even without the MMD loss (see the sketch of a KL-only training step below).
It also shows strong compatibility with other distillation schemes and can be used as a plug-in component.
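A rough sketch of what a training step for the KL-only variant could look like, reusing `kl_consistency_loss` from above. The `augment` transform, model, and optimizer are placeholders for illustration, not part of KD-Lib's API:

```python
def train_step(model, augment, x, y, optimizer, T=3.0):
    # Two independent distortions of the same batch.
    x_a, x_b = augment(x), augment(x)
    logits_a, logits_b = model(x_a), model(x_b)
    # Supervised loss on both views plus the KL consistency term (MMD dropped).
    ce = F.cross_entropy(logits_a, y) + F.cross_entropy(logits_b, y)
    loss = ce + kl_consistency_loss(logits_a, logits_b, T)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```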
Hi @yiqings, thanks for raising this issue. Unfortunately, development for KD-Lib has stalled for now, but we will be sure to keep this issue in mind when / if we resume.
Also, do let me know if you would be interested in contributing an implementation for this paper.