
About training losses and evaluation parameter settings #56

Open
MrCrims opened this issue Aug 15, 2024 · 1 comment

Comments

@MrCrims

MrCrims commented Aug 15, 2024

Hello, I'm confused about my training of GPT-XL at image size 384×384. After 300 epochs, the training loss is around 6.9, and the FID is only 6.8. Could you share your detailed training settings?
Besides, when I try to reproduce your results with c2i_XL_384.pt, my best result is 2.73, which differs from the 2.62 reported in the paper. Could you share your detailed evaluation settings, including cfg-scale, etc.?
I would be grateful if you could help me.
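
For context, cfg-scale here refers to the classifier-free guidance weight applied when sampling tokens for evaluation. Below is a minimal sketch of the usual logit combination; the function signature, `null_class_id`, and the default scale are illustrative assumptions, not this repository's actual API or settings.

```python
import torch

def cfg_logits(model, tokens, class_ids, null_class_id, cfg_scale=1.75):
    # Conditional forward pass with the real class labels.
    cond = model(tokens, class_ids)
    # Unconditional pass using a "null" class embedding (assumed to exist).
    uncond = model(tokens, torch.full_like(class_ids, null_class_id))
    # Standard classifier-free guidance: push logits away from the
    # unconditional prediction by cfg_scale.
    return uncond + cfg_scale * (cond - uncond)
```

Since reported FID is fairly sensitive to this scale, small differences in cfg-scale (or in the number of sampled images) could plausibly account for a gap like 2.73 vs. 2.62.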

@Alpha-X-95

Would you mind sharing your training loss curves for c2i and t2i?
