The purpose of this issue is to track all the changes we need to make in our tests so that they are more robust.
FSDP sharding
Test correct initialization of LoRA params when using meta device (à la `test_lora_meta_device_init_fsdp`)
Better testing for eval recipe: add test for generation and log likelihood tasks #1873
Integration testing for DPO #1411
Regression testing in torchtune #1306
Add recipe tests which use DoRA (see [bug] DoRA is broken #1903)
Config CI: [WIP] Config Continuous Integration (CCI) #1717
Add regression test for peak memory (especially with gradient accumulation enabled)
We should have tests that run for 2 epochs instead of 1, to make sure intermediate checkpoints are saved correctly
Add tests for Gemma2 attention
Too many of our tests use torchtune-format checkpoints (e.g.). We should keep a couple of these, but the majority of our test checkpoints should be in HF format.
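For the peak-memory regression item above, a minimal sketch of what such a test could look like. All names and the budget here are hypothetical; on GPU the real test would use `torch.cuda.reset_peak_memory_stats()` / `torch.cuda.max_memory_allocated()`, but `tracemalloc` stands in so the sketch stays runnable on CPU:

```python
import tracemalloc

# Hypothetical memory budget; a real regression test would pin this to a
# value measured from a known-good run of the recipe.
PEAK_MEMORY_BUDGET_BYTES = 50 * 1024 * 1024


def run_training_step(num_accumulation_steps: int = 4) -> list:
    # Stand-in for one optimizer step with gradient accumulation:
    # several micro-batch gradient buffers are held live at once before
    # being released, which is exactly what inflates peak memory.
    grads = [bytearray(1024 * 1024) for _ in range(num_accumulation_steps)]
    return grads


def test_peak_memory_regression():
    tracemalloc.start()
    run_training_step()
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    assert peak <= PEAK_MEMORY_BUDGET_BYTES, (
        f"peak memory {peak} bytes exceeds budget {PEAK_MEMORY_BUDGET_BYTES}"
    )
```

The key property being tested is that peak (not final) memory stays under a fixed budget, so a regression that briefly holds extra buffers live, e.g. a broken gradient-accumulation path, fails the test even if memory is freed afterwards.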