Issues: imoneoi/openchat
#225 TypeError: AsyncLLMEngine.generate() got an unexpected keyword argument 'prompt' (opened Jun 10, 2024 by zombak79)
#215 About the performance of Llama models (Llama 2, Llama 3) [question] (opened Apr 25, 2024 by huazhenliu)
#214 Hugging Face is down: OpenChat can't be started [enhancement] (opened Apr 22, 2024 by antioxidanz)
#213 Use Instructor in OpenChat [question] (opened Apr 20, 2024 by ramandada)
#208 Single GPU vs. multiple GPUs (tensor parallel): suggestion for the API server (opened Mar 21, 2024 by fdm-git)
#204 WARNING: Error in configuration: macro '\frac' failed its substitution! (opened Mar 20, 2024 by chengzhen123)
#202 Error when using openchat/openchat-3.5-0106-gemma in text-generation-inference (opened Mar 18, 2024 by houghtonweihu)
#200 Was the chat template applied in ochat/config/conversation_template.py? (opened Mar 12, 2024 by houghtonweihu)
#199 llama_model_load: error loading model: create_tensor: tensor 'output.weight' not found (opened Mar 11, 2024 by wac81)
#195 Shall we add a contributing guide (a .md linked from, and separate from, the README)? (opened Mar 10, 2024 by KemingHe)