I ran the example, ingested state_of_the_union, and asked a few questions; every single time I got this error:
openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens, however you requested 4231 tokens (3975 in your prompt; 256 for the completion). Please reduce your prompt; or completion length.
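The arithmetic in the error message: the request asks for 3975 prompt tokens plus 256 completion tokens, i.e. 4231 in total, which exceeds the model's 4097-token window. A minimal sketch (plain Python, no API calls; the 4097 limit and the numbers are taken from the error above) of clamping the completion budget so a request would fit:

```python
# Context window reported in the error (text-davinci-003).
MAX_CONTEXT = 4097

def clamp_completion(prompt_tokens: int, requested_completion: int) -> int:
    """Return the largest completion budget that still fits in the context window."""
    available = MAX_CONTEXT - prompt_tokens
    if available <= 0:
        raise ValueError("Prompt alone exceeds the context window; trim the prompt.")
    return min(requested_completion, available)

# Numbers from the error: 3975 prompt + 256 completion = 4231 > 4097,
# so only 4097 - 3975 = 122 tokens remain for the completion.
print(clamp_completion(3975, 256))
```

This only explains the failure; the real fix is to send fewer prompt tokens or request a smaller completion.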
I don't know if it is really related, but for me the issue was resolved by specifying which OpenAI model to use in query_data.py:
llm=OpenAI(model_name="gpt-3.5-turbo", temperature=0)
Without that, it defaults to text-davinci, which is far more expensive. You can check which model is actually being used at https://platform.openai.com/account/usage
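Independent of which model is configured, the overflow can also be avoided by capping how much retrieved context gets stuffed into the prompt. A rough sketch of that idea, assuming a crude 4-characters-per-token estimate (a real chain would count tokens with a proper tokenizer such as tiktoken; `select_chunks` and the budget numbers are illustrative, not part of the example code):

```python
def select_chunks(chunks, token_budget, chars_per_token=4):
    """Greedily keep retrieved chunks until the estimated token budget is spent."""
    selected, used = [], 0
    for chunk in chunks:
        # Crude per-chunk token estimate: length / chars_per_token, rounded up-ish.
        cost = len(chunk) // chars_per_token + 1
        if used + cost > token_budget:
            break
        selected.append(chunk)
        used += cost
    return selected

# Three ~400-character chunks, ~101 estimated tokens each; a 250-token
# budget fits only the first two, leaving room for the completion.
docs = ["a" * 400, "b" * 400, "c" * 400]
print(len(select_chunks(docs, token_budget=250)))
```

Smaller chunk sizes at ingestion time achieve the same effect: each retrieved document then costs fewer tokens, so the assembled prompt stays under the model's window.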