With the OpenAI Codex model we can only use 4096 tokens in total (prompt + generation), so we need to either develop a scheme to chain multiple completions together (perhaps some sort of stitching mechanism), or error out when #tokens(input files) > (MAX_TOKENS - #tokens(prompt)) / 2.
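The "error out" branch above can be sketched as a simple budget check. This is a minimal sketch, not the project's actual code: `count_tokens` is a hypothetical helper (here a crude whitespace split, purely so the example runs; a real tokenizer would be model-specific).

```python
MAX_TOKENS = 4096  # Codex total budget: prompt + generation share it


def count_tokens(text: str) -> int:
    # Placeholder tokenizer: whitespace split only. Real Codex
    # tokenization produces different (usually higher) counts.
    return len(text.split())


def check_budget(prompt: str, input_files: list[str]) -> int:
    """Return the token count of the input files, or raise if they
    exceed (MAX_TOKENS - #tokens(prompt)) / 2 as proposed above."""
    file_tokens = sum(count_tokens(f) for f in input_files)
    limit = (MAX_TOKENS - count_tokens(prompt)) // 2
    if file_tokens > limit:
        raise ValueError(
            f"input files use {file_tokens} tokens, exceeding the "
            f"limit of {limit}; reduce the file set"
        )
    return file_tokens
```

Halving the remaining budget reserves roughly equal room for the generation, matching the inequality in the issue text.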
guymguym changed the title from "Handle completion overflow" to "OpenAI Codex tokens limit - identify and help the user workaround the limit by reducing the file set" on Apr 26, 2022.
We have tokenizer code that cannot be published publicly; we could work around this by exposing it through an API or some other mechanism. Keeping this at TBD until the requirements become clearer.