UserWarning error #6565
Unanswered
Accursed115 asked this question in Q&A
I've been trying to load some models through this program. Models like GPT-2 and Qwen have been working fine, but models like Midnight-Miqu-70B-v1.5 refuse to load because of a user warning that keeps coming up, and I cannot figure out how to change or remove it:
D:\text-generation-webui-1.16\installer_files\env\Lib\site-packages\transformers\generation\configuration_utils.py:600: UserWarning: `do_sample` is set to `False`. However, `min_p` is set to `0.0` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `min_p`.
I've looked in the configuration file and nothing stood out as needing to be changed. I also looked through the parameters, and even though the default "min_p" preset has do_sample checked and min_p set to 0.5, the model still does not load and forces the WebUI to close each time.
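For reference, the warning comes from transformers validating the generation settings: min_p is only applied when sampling is enabled. Below is a minimal sketch of the two resolutions the warning text itself suggests (the 0.05 value is just an illustrative placeholder, not a recommended setting):

```python
from transformers import GenerationConfig

# Option 1: enable sampling so min_p is actually used
# (0.05 is only an example value).
cfg_sampling = GenerationConfig(do_sample=True, min_p=0.05)

# Option 2: stay in greedy/beam mode and leave min_p unset,
# so the validator has nothing to warn about.
cfg_greedy = GenerationConfig(do_sample=False)
```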
Is there something I am missing? Another model, MN-12B-Celeste-V1.9, has the same or a similar issue: the warning appears in the command window and the model partially loads, but then fails with:
RuntimeError: CUDA error: out of memory
CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
For debugging consider passing CUDA_LAUNCH_BLOCKING=1
Compile with TORCH_USE_CUDA_DSA to enable device-side assertions
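That second failure looks like a plain VRAM shortfall rather than the sampler warning itself. A quick way to check how much GPU memory is actually free before a load attempt (a sketch assuming a CUDA build of PyTorch; device index 0 is just the first GPU):

```python
import torch

# Report free vs. total VRAM on the first GPU. A 12B or 70B model in
# 16-bit weights needs far more memory than most single consumer cards
# provide, so an out-of-memory error here usually means quantization
# or CPU offload is required rather than a configuration problem.
free, total = torch.cuda.mem_get_info(0)
print(f"free: {free / 1e9:.1f} GB / total: {total / 1e9:.1f} GB")
```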