
About hyper-parameters #3

Open
fukuzawa-e opened this issue Jun 7, 2024 · 3 comments

Comments

@fukuzawa-e

python run_zerogen.py --alpha ${ALPHA} --beta ${BETA} --eta ${ETA} --k ${K} --condition_method add \
    --task ${TASK} --decoding_len ${LENGTH} --alpha_scale --alpha_activasize ${ALPHA_HAT} \
    --beta_scale --beta_activesize 0.2 --beta_upper ${BETA_HAT} --n_obj ${N} --kw_mode max --k2t

With the above command I want to generate longer outputs, for example more than 100 words. I set --decoding_len to 100, but the length of the results barely changed; they are still short.
I then tried setting only --decoding_len to 100. That does produce longer outputs, but the quality of the generated text is poor.
I would like to understand the effect of each parameter, but the paper gives no details. Is there any documentation explaining what each parameter means?

@ImKeTT
Owner

ImKeTT commented Jun 13, 2024

Thank you for your interest in our work.
To generate high-quality long controllable texts, you need to fine-tune the base LM on a corpus of longer texts rather than simply tweaking the decoding_len parameter.

@fukuzawa-e
Author

Thank you so much for your reply.
Regarding fine-tuning the base LM, did you use SimCTG to fine-tune the new LM?

@ImKeTT
Owner

ImKeTT commented Jun 13, 2024

Yes, I employed this codebase for fine-tuning the base LM.
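For context, SimCTG pairs contrastive fine-tuning with contrastive search decoding, which helps avoid the repetitive, degenerate text you can get when simply extending the decoding length. At each step it scores a candidate token v as (1 − α)·p(v) minus α times a degeneration penalty (the maximum cosine similarity between the candidate's representation and those of the tokens already generated). Below is a minimal plain-Python sketch of that scoring rule with toy, made-up probabilities and embeddings (not taken from any actual model), just to illustrate why a less probable but less repetitive token can win:

```python
import math

def cosine(u, v):
    # Cosine similarity between two vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_score(prob, cand_vec, prev_vecs, alpha):
    # Contrastive-search scoring: reward model confidence, penalize
    # candidates whose representation is too similar to any token
    # generated so far (the degeneration penalty).
    penalty = max(cosine(cand_vec, p) for p in prev_vecs)
    return (1 - alpha) * prob - alpha * penalty

# Toy example: one previously generated token, two candidates.
prev = [[1.0, 0.0]]  # hypothetical hidden state of the prior token
repetitive = contrastive_score(0.9, [1.0, 0.0], prev, alpha=0.6)
novel = contrastive_score(0.5, [0.0, 1.0], prev, alpha=0.6)
# The less probable but non-repetitive candidate scores higher.
print(repetitive, novel)
```

With α = 0.6 the repetitive candidate scores 0.4·0.9 − 0.6·1.0 = −0.24, while the novel one scores 0.4·0.5 − 0.6·0.0 = 0.2, so the decoder picks the token that keeps the text from looping.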
