'do_sample' model default cannot be overridden #35372

Open
2 of 4 tasks
Zoher15 opened this issue Dec 20, 2024 · 1 comment

Zoher15 commented Dec 20, 2024

System Info

transformers 4.47.1, Python 3.10.

While using Qwen2-VL-Instruct (whose default config sets do_sample=True), I am unable to override that setting by passing do_sample=False in the generation kwargs. I had to modify generation/utils.py to force the override.

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

  1. Use a model such as Qwen2-VL-Instruct whose default generation config sets do_sample=True
  2. Try to override that setting through the kwargs passed to model.generate, as in the sketch below
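
A minimal sketch of the reproduction (the checkpoint name, prompt, and max_new_tokens value are illustrative assumptions; any Qwen2-VL-Instruct variant whose generation_config ships with do_sample=True applies):

```python
import torch
from transformers import AutoProcessor, Qwen2VLForConditionalGeneration

model_id = "Qwen/Qwen2-VL-7B-Instruct"  # assumed checkpoint, for illustration only
model = Qwen2VLForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)
processor = AutoProcessor.from_pretrained(model_id)

# The checkpoint's generation_config defaults to sampling.
print(model.generation_config.do_sample)  # True

messages = [
    {"role": "user", "content": [{"type": "text", "text": "Describe a cat in one sentence."}]}
]
prompt = processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = processor(text=[prompt], return_tensors="pt").to(model.device)

# Passing do_sample=False here should force greedy decoding, but per this
# report the sampling defaults from the model's generation_config still apply.
out = model.generate(**inputs, do_sample=False, max_new_tokens=32)
print(processor.batch_decode(out, skip_special_tokens=True)[0])
```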

Expected behavior

If I set do_sample=False when calling model.generate, it should override the model's default generation config.
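
A sketch of one possible interim workaround (hypothetical, reusing model and inputs from the reproduction sketch above) is to pass an explicit GenerationConfig so the checkpoint's sampling defaults are bypassed:

```python
from transformers import GenerationConfig

# An explicit config takes the place of the checkpoint's generation_config,
# so its do_sample=True default should not be merged in (assumption, not verified here).
greedy_config = GenerationConfig(do_sample=False, max_new_tokens=32)
out = model.generate(**inputs, generation_config=greedy_config)
print(processor.batch_decode(out, skip_special_tokens=True)[0])
```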

Zoher15 added the bug label Dec 20, 2024
@Rocketknight1 (Member) commented:

cc @zucchini-nlp @gante
