Commit

disable test_retain_grad_hidden_states_attentions on SeamlessM4TModelWithTextInputTest (#28169)

disable retain_grad_hidden_states_attentions on SeamlessM4TModelWithTextInputTest
dwyatte authored Dec 21, 2023
1 parent 1d77735 commit e268d7e
Showing 1 changed file with 6 additions and 0 deletions.
tests/models/seamless_m4t/test_modeling_seamless_m4t.py (6 additions, 0 deletions)
@@ -751,6 +751,12 @@ def test_training_gradient_checkpointing_use_reentrant(self):
     def test_training_gradient_checkpointing_use_reentrant_false(self):
         pass
 
+    @unittest.skip(
+        reason="In training model, the first encoder layer is sometimes skipped. Training is not supported yet, so the test is ignored."
+    )
+    def test_retain_grad_hidden_states_attentions(self):
+        pass
+
 
 @require_torch
 class SeamlessM4TGenerationTest(unittest.TestCase):

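For context on what the disabled test exercises: the common test_retain_grad_hidden_states_attentions checks that gradients can be retained on intermediate model outputs such as hidden states and attention weights. A minimal, self-contained sketch of that pattern in plain PyTorch follows; the TinyEncoder module is hypothetical and stands in for the actual SeamlessM4T model and test code.

    import torch
    import torch.nn as nn

    # Hypothetical stand-in for a model exposing an intermediate activation;
    # the real test inspects SeamlessM4T's hidden states and attentions.
    class TinyEncoder(nn.Module):
        def __init__(self):
            super().__init__()
            self.layer1 = nn.Linear(4, 4)
            self.layer2 = nn.Linear(4, 4)

        def forward(self, x):
            hidden = torch.tanh(self.layer1(x))  # intermediate "hidden state"
            return self.layer2(hidden), hidden

    model = TinyEncoder()
    output, hidden = model(torch.randn(2, 4))

    # Non-leaf tensors discard their .grad after backward() by default;
    # retain_grad() keeps it, which is the property the skipped test asserts.
    hidden.retain_grad()
    output.sum().backward()
    assert hidden.grad is not None

If a layer is sometimes skipped in training mode, as the skip reason states for the first encoder layer, its hidden state never participates in the backward pass and the retained gradient can be None, which is presumably why the test fails and is disabled here.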