# Dataset Distillation via Knowledge Distillation: Towards Efficient Self-Supervised Pre-training of Deep Networks

Official repository for "Dataset Distillation via Knowledge Distillation: Towards Efficient Self-Supervised Pre-training of Deep Networks", which presents MKDT, a method for distilling datasets for efficient self-supervised pre-training.

## Obtaining Teacher Model
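
MKDT first pre-trains a teacher model on the full dataset with a self-supervised objective; the teacher's representations serve as the distillation targets in the later steps. The following is a minimal, hypothetical sketch of this step (PyTorch, a ResNet-18 backbone, CIFAR-10, and a SimCLR-style contrastive loss are all assumptions for illustration; the repository's actual script may use a different architecture, dataset, and SSL algorithm):

```python
# Hypothetical sketch: pre-train a teacher encoder with a SimCLR-style
# contrastive (NT-Xent) loss. All names and hyperparameters are illustrative.
import torch
import torch.nn.functional as F
from torchvision import datasets, transforms, models

augment = transforms.Compose([
    transforms.RandomResizedCrop(32),
    transforms.RandomHorizontalFlip(),
    transforms.ColorJitter(0.4, 0.4, 0.4, 0.1),
    transforms.ToTensor(),
])

class TwoCrops:
    """Return two independent augmentations of the same image."""
    def __init__(self, t):
        self.t = t
    def __call__(self, x):
        return self.t(x), self.t(x)

def nt_xent(z1, z2, tau=0.5):
    """SimCLR contrastive loss over a batch of paired embeddings."""
    z = F.normalize(torch.cat([z1, z2]), dim=1)
    sim = z @ z.t() / tau
    sim.fill_diagonal_(float("-inf"))           # exclude self-similarity
    n = z1.size(0)
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)

data = datasets.CIFAR10("data", train=True, download=True,
                        transform=TwoCrops(augment))
loader = torch.utils.data.DataLoader(data, batch_size=256,
                                     shuffle=True, drop_last=True)

teacher = models.resnet18(num_classes=128)      # 128-d output as projection
opt = torch.optim.SGD(teacher.parameters(), lr=0.06,
                      momentum=0.9, weight_decay=5e-4)

for epoch in range(100):
    for (x1, x2), _ in loader:                  # labels are never used
        loss = nt_xent(teacher(x1), teacher(x2))
        opt.zero_grad()
        loss.backward()
        opt.step()

torch.save(teacher.state_dict(), "teacher.pt")  # hypothetical checkpoint name
```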

## Generating Expert Trajectories

```sh
python …
```
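
The key idea in MKDT is to avoid matching high-variance SSL trajectories directly: student networks are instead trained with knowledge distillation, regressing the frozen teacher's representations under an MSE loss, and their parameter checkpoints are saved as expert trajectories. Below is a rough sketch of recording one such trajectory, continuing the assumptions above (the small ConvNet student, file names, and hyperparameters are illustrative, not the repository's actual interface):

```python
# Hypothetical sketch: record one expert trajectory by training a student
# to match the frozen teacher's representations (knowledge distillation).
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import datasets, transforms, models

def make_student():
    # Small ConvNet student; the exact architecture is an assumption.
    return nn.Sequential(
        nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(), nn.AvgPool2d(2),
        nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(128, 128),            # match the teacher's output dimension
    )

data = datasets.CIFAR10("data", train=True, transform=transforms.ToTensor())
loader = torch.utils.data.DataLoader(data, batch_size=256, shuffle=True)

teacher = models.resnet18(num_classes=128)
teacher.load_state_dict(torch.load("teacher.pt"))
teacher.eval()

student = make_student()
opt = torch.optim.SGD(student.parameters(), lr=0.1, momentum=0.9)

trajectory = [copy.deepcopy(student.state_dict())]    # keep the initialization
for epoch in range(20):
    for x, _ in loader:                               # labels are unused
        with torch.no_grad():
            target = teacher(x)                       # teacher representation
        loss = F.mse_loss(student(x), target)         # KD objective
        opt.zero_grad()
        loss.backward()
        opt.step()
    trajectory.append(copy.deepcopy(student.state_dict()))

torch.save(trajectory, "expert_trajectory_0.pt")
```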

## Distilling Dataset

```sh
python …
```
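
The synthetic set is then optimized by trajectory matching: a student is initialized from a checkpoint on an expert trajectory, unrolled for a few differentiable SGD steps on the synthetic data, and the synthetic set is updated so that the unrolled parameters land close to a later expert checkpoint (normalized by the distance the expert itself traveled). The sketch below continues the assumptions above; treating both the images and their target representations as learnable, and initializing them randomly, are simplifications of this sketch rather than the repository's exact procedure:

```python
# Hypothetical sketch: distill a synthetic set by matching expert
# knowledge-distillation trajectories. Names and sizes are illustrative.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.func import functional_call

def make_student():
    # Must mirror the architecture used for the expert trajectories.
    return nn.Sequential(
        nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(), nn.AvgPool2d(2),
        nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(128, 128),
    )

trajectory = torch.load("expert_trajectory_0.pt")     # list of state_dicts
net = make_student()

# Learnable synthetic set: images plus their target representations.
syn_x = torch.randn(100, 3, 32, 32, requires_grad=True)
syn_y = torch.randn(100, 128, requires_grad=True)
opt = torch.optim.Adam([syn_x, syn_y], lr=0.01)

N, M, inner_lr = 10, 2, 0.01        # unroll length, expert gap, student lr

for it in range(1000):
    t = random.randrange(len(trajectory) - M)
    start = {k: v.clone() for k, v in trajectory[t].items()}
    target = trajectory[t + M]

    # Differentiable unroll of N SGD steps on the synthetic data.
    params = {k: v.clone().requires_grad_(True) for k, v in start.items()}
    for _ in range(N):
        out = functional_call(net, params, (syn_x,))
        inner = F.mse_loss(out, syn_y)
        grads = torch.autograd.grad(inner, list(params.values()),
                                    create_graph=True)
        params = {k: p - inner_lr * g
                  for (k, p), g in zip(params.items(), grads)}

    # Normalized parameter-matching loss, as in trajectory matching.
    num = sum(((params[k] - target[k]) ** 2).sum() for k in params)
    den = sum(((start[k] - target[k]) ** 2).sum() for k in params)
    loss = num / den
    opt.zero_grad()
    loss.backward()
    opt.step()

torch.save({"images": syn_x.detach(), "targets": syn_y.detach()},
           "distilled.pt")
```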

## Evaluating Distilled Dataset

```sh
python …
```
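
A distilled set is evaluated by pre-training a freshly initialized network on it (regressing the stored target representations) and then fitting a linear probe on labeled downstream data. A hypothetical sketch under the same assumptions, with CIFAR-10 standing in for the downstream task:

```python
# Hypothetical sketch: pre-train on the distilled set, then linear-probe.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import datasets, transforms

def make_student():
    # Must mirror the architecture used during distillation.
    return nn.Sequential(
        nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(), nn.AvgPool2d(2),
        nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(128, 128),
    )

blob = torch.load("distilled.pt")
syn_x, syn_y = blob["images"], blob["targets"]

# 1) Pre-train on the distilled set by regressing the stored targets.
net = make_student()
opt = torch.optim.SGD(net.parameters(), lr=0.01, momentum=0.9)
for step in range(1000):
    loss = F.mse_loss(net(syn_x), syn_y)
    opt.zero_grad()
    loss.backward()
    opt.step()

# 2) Freeze the encoder and train a linear probe on labeled data.
for p in net.parameters():
    p.requires_grad_(False)
probe = nn.Linear(128, 10)                        # CIFAR-10 head (assumption)
popt = torch.optim.Adam(probe.parameters(), lr=1e-3)

train = datasets.CIFAR10("data", train=True, transform=transforms.ToTensor())
loader = torch.utils.data.DataLoader(train, batch_size=256, shuffle=True)
for epoch in range(10):
    for x, y in loader:
        loss = F.cross_entropy(probe(net(x)), y)
        popt.zero_grad()
        loss.backward()
        popt.step()

# 3) Report linear-probe accuracy on the downstream test set.
test = datasets.CIFAR10("data", train=False, transform=transforms.ToTensor())
tl = torch.utils.data.DataLoader(test, batch_size=512)
with torch.no_grad():
    correct = sum((probe(net(x)).argmax(1) == y).sum().item() for x, y in tl)
print(f"linear-probe accuracy: {correct / len(test):.3f}")
```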

## BibTeX

n/a
