I will demonstrate the critical concepts of diffusion models on a toy 2D distribution first, and then apply the same concepts to the EMNIST dataset.
Completed the following:
- Ground-truth data estimation / error estimation / score estimation of the diffusion process (Toy Examples - 1)
- Cosine and linear noise schedules (Toy Examples - 1)
- Clipping to improve stabilization of the generation process (Toy Examples - 2)
- Time embedding to encode the timestep in the model (Toy Examples - improvements)
- Classifier-free guidance and semi-supervised model training (Toy Examples - improvements)
- Faster sampling in the generation process by striding the denoising steps (Toy Examples - improvements)
- EMNIST data generation using U-Nets and JAX
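The cosine and linear schedules listed above can be sketched as follows. This is a NumPy sketch (function names and defaults are my own, not taken from the repo's notebooks); it computes the cumulative signal level ᾱ_t for both schedule types:

```python
import numpy as np

def linear_alpha_bar(T, beta_start=1e-4, beta_end=0.02):
    # Linear beta schedule: alpha_bar_t is the cumulative product of (1 - beta_t).
    betas = np.linspace(beta_start, beta_end, T)
    return np.cumprod(1.0 - betas)

def cosine_alpha_bar(T, s=0.008):
    # Cosine schedule (Nichol & Dhariwal, 2021): alpha_bar follows a
    # squared-cosine curve, decaying more gently at the start and end.
    t = np.arange(T + 1) / T
    f = np.cos((t + s) / (1 + s) * np.pi / 2) ** 2
    return f[1:] / f[0]

ab_lin = linear_alpha_bar(1000)
ab_cos = cosine_alpha_bar(1000)
```

Both curves decrease monotonically from near 1 (almost clean data) to near 0 (almost pure noise); the cosine schedule destroys information more gradually in the early timesteps.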
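The time embedding used to encode the timestep can be sketched with the standard sinusoidal construction. This is a minimal NumPy sketch (the function name is my own), not the repo's exact implementation:

```python
import numpy as np

def sinusoidal_time_embedding(t, dim):
    # Sinusoidal timestep embedding (as in Transformers / DDPM): pairs of
    # sin/cos features at geometrically spaced frequencies encode scalar t.
    half = dim // 2
    freqs = np.exp(-np.log(10000.0) * np.arange(half) / half)
    angles = t * freqs
    return np.concatenate([np.sin(angles), np.cos(angles)])
```

The resulting vector is fed to the model (typically through a small MLP) so a single network can denoise at every timestep.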
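Classifier-free guidance combines the conditional and unconditional noise predictions at sampling time. A minimal sketch of the combination rule (function name is my own):

```python
import numpy as np

def cfg_combine(eps_uncond, eps_cond, guidance_scale):
    # Classifier-free guidance: extrapolate from the unconditional prediction
    # toward the conditional one. guidance_scale = 0 recovers the plain
    # conditional model; larger values strengthen the conditioning signal.
    return (1.0 + guidance_scale) * eps_cond - guidance_scale * eps_uncond
```

Training for this is what motivates the semi-supervised setup: the label is randomly dropped during training so one model learns both the conditional and unconditional predictions.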
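The strided-sampling speedup amounts to denoising along an evenly spaced subsequence of the training timesteps instead of all of them. A small sketch under that assumption (function name is my own):

```python
import numpy as np

def strided_timesteps(T, num_steps):
    # Pick an evenly strided subsequence of the T training timesteps,
    # returned in descending order for the reverse (denoising) pass.
    ts = np.linspace(0, T - 1, num_steps).round().astype(int)
    return ts[::-1]
```

For example, with T = 1000 and num_steps = 50 the reverse process visits only 50 timesteps, cutting generation cost by roughly 20x.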
Unexplored ideas
- Ideas yet to explore: Ideas Notebook
Notebook | Github Link | Colab
---|---|---
Basic: Predicting Original Distribution | Vanilla Implementation |
Predicting Error and Score Function | Error / Score Prediction |
Classifier-free Guidance and Other Improvements | Advanced Concepts |
EMNIST Denoising and Conditional Generation | Colab EMNIST |