Fix model integration CI #26322

Merged · 32 commits into huggingface:main on Oct 2, 2023

Conversation

ArthurZucker (Collaborator)

What does this PR do?

Fixes the CI that was broken by #23909.
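For context, a rough way to reproduce the failing CI locally is sketched below. The exact test paths are an assumption based on the models touched in the commit list further down (wav2vec2, byt5), not something stated in the PR description.

```python
# Hedged sketch: run the tokenization tests for a couple of the models this PR
# touches. The test file paths are assumptions inferred from the commit list.
import subprocess

subprocess.run(
    [
        "python", "-m", "pytest", "-q",
        "tests/models/wav2vec2/test_tokenization_wav2vec2.py",
        "tests/models/byt5/test_tokenization_byt5.py",
    ],
    check=True,
)
```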

HuggingFaceDocBuilderDev commented on Sep 21, 2023:

The documentation is not available anymore as the PR was closed or merged.

ArthurZucker marked this pull request as ready for review on September 28, 2023 at 10:09.
LysandreJik self-assigned this on Oct 2, 2023.
LysandreJik (Member) left a comment:


LGTM! Thanks for investigating and fixing the failing tests @ArthurZucker

ArthurZucker merged commit 63864e0 into huggingface:main on Oct 2, 2023. 3 checks passed.
blbadger pushed a commit to blbadger/transformers that referenced this pull request on Nov 8, 2023:

* fix wav2vec2
* nit
* stash
* one more file to update
* fix byt5
* vocab size is 256, don't change that!
* use other revision
* test persimon in smaller size
* style
* tests
* nits
* update add tokens from pretrained
* test tokenization
* nits
* potential fnet fix?
* more nits
* nits
* correct test
* assert close
* udpate
* ouch
* fix it
* some more nits
* FINALLU
* use `adept` checkpoints
* more adept checkpoints
* that was invlved!
EduardoPach pushed a commit to EduardoPach/transformers that referenced this pull request on Nov 18, 2023 (same commit message as above).
Development

Successfully merging this pull request may close these issues.

length of tokenizer changes when using main branch causing batch_decode to fail
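For context, here is a minimal sketch (not taken from the PR; the checkpoint name is an arbitrary example) of the pattern the linked issue describes breaking: `len(tokenizer)` drifting between versions, so ids produced against the old vocabulary size can make `batch_decode` fail.

```python
# Rough sketch of the regression described in the linked issue; the checkpoint
# name below is an illustrative example, not one used in this PR.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Downstream code often caches len(tokenizer), e.g. to size model embeddings.
vocab_size = len(tokenizer)

# Adding a token should grow the reported length by exactly the number added.
tokenizer.add_tokens(["<my_new_token>"])
assert len(tokenizer) == vocab_size + 1

# If the reported length changes unexpectedly between releases, ids encoded
# against the old length can make batch_decode fail or return garbage.
batch = tokenizer(["hello world", "<my_new_token> hello"], padding=True)
print(tokenizer.batch_decode(batch["input_ids"], skip_special_tokens=True))
```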