Don't error on valid OpenAI top_p values. #2231

Closed · wants to merge 1 commit

Conversation

ErikKaum (Member):

@michael-conrad I've created this branch to run CI checks 👍

Don't error on valid OpenAI `top_p` values.

* Closes: guidance-ai/guidance#945
* Closes: #2222
```diff
@@ -247,7 +247,7 @@ impl Validation {
         // for the user
         let top_p = top_p
             .map(|value| {
-                if value <= 0.0 || value >= 1.0 {
+                if value < 0.0 || value > 1.0 {
```
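For illustration, here is a self-contained sketch (illustrative types only, not TGI's actual Validation code or error types) comparing how the old and new checks treat the edge values 0.0 and 1.0:

```rust
// Illustrative only: compare the old strict bounds with the PR's relaxed bounds.
fn validate_top_p_old(value: f32) -> Result<f32, String> {
    // Old check: rejects 0.0 and 1.0 along with anything outside (0, 1).
    if value <= 0.0 || value >= 1.0 {
        return Err(format!("`top_p` must be strictly between 0 and 1, got {value}"));
    }
    Ok(value)
}

fn validate_top_p_new(value: f32) -> Result<f32, String> {
    // PR's check: only rejects values outside [0, 1], so 0.0 and 1.0 pass.
    if value < 0.0 || value > 1.0 {
        return Err(format!("`top_p` must be between 0 and 1, got {value}"));
    }
    Ok(value)
}

fn main() {
    for v in [0.0_f32, 0.5, 1.0, 1.5] {
        println!("{v}: old={:?}, new={:?}", validate_top_p_old(v), validate_top_p_new(v));
    }
}
```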
OlivierDehaene (Member), Jul 15, 2024:

0.0 is not a valid value for the backend.

value==1.0 and value==0.0 are tricky values because the intent is not clear. Does the user want sampling to be on but without modifying the temperature of the softmax? Or should sampling be off?
Hence, we didn't allow them.

What does the OpenAI API do when these values are used?

ErikKaum (Member Author):

Let me dig into their code to make sure 👍


Based on what concrete information I could quickly find via Google, this is my understanding:

  • top_p=1.0 is top_p = 100%, which is valid.
  • top_p=0.0 is technically not possible, but seems to be treated as a very small number instead by the OpenAI endpoint. (Server-side clamping to something like 0.000000001?)

https://community.openai.com/t/what-is-the-top-p-value-range/612433

ErikKaum (Member Author), Jul 22, 2024:

Sorry for the late response, but after a bit of reading it seems:

  • top_p=1 should be fine
  • top_p=0 is not, but OpenAI accepts it; it's unclear how they actually treat it server-side.

Here are some of the other discussions:

Given that there's no official statement from OpenAI, I feel a bit hesitant to make an assumption about what they're doing, even if it's most likely clamping the value to e.g. 0.000000001.

Suggestion: allow top_p=1 in the interest of not breaking client code, but keep rejecting top_p=0, since there is no clear interpretation of it (see the sketch below).

Would this work for you?
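A minimal sketch of the rule suggested above (illustrative types, not TGI's actual validation): accept top_p=1, keep rejecting top_p=0 and anything outside (0, 1].

```rust
// Illustrative validation for the suggested compromise (not TGI's actual code).
fn validate_top_p(value: f32) -> Result<f32, String> {
    if value <= 0.0 || value > 1.0 {
        return Err(format!("`top_p` must be in (0, 1], got {value}"));
    }
    Ok(value) // 1.0 is accepted; 0.0 is still an error
}
```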

Reply:

I think allowing top_p=1.0 should work fine for those using the guidance lib via the API.

As for top_p=0.0: if the official API accepts this value, and you are creating an API that is supposed to be compatible, should it not accept this value too, even if the actual behavior is slightly different, given that OpenAI doesn't say what happens in this scenario?

It seems the general consensus is that OpenAI clamps it to some very small non-zero value.

Shouldn't the TGI API accept the same values as the official OpenAI API, even if we don't know what OpenAI is really doing inside their black box for top_p=0? Setting top_p=0.000001 or similar seems to match OpenAI's behavior, based on the conversations reviewed.

Narsil (Collaborator), Jul 23, 2024:

top_p=1.0 really means no top_p, right?
We can safely ignore the parameter. In general, I personally prefer avoiding silencing/discarding user input. If a user asks for something, we should respect it; if that thing is nonsensical, we should error out.

For this adaptation layer, we can definitely indulge here (even though I don't see any reason this should ever be an issue in practice; just don't set top_p if you don't want to use this filter).

top_p=0.0 really means something impossible, since you're not selecting anything. This is a user bug. Again, I'd rather error out than do something weird that no one has any clue about. top_p=0.001 is effectively greedy, but there are better ways to ask for greedy (even then, is it really? top_p is rarely p > 0.999).

Error > Undefined behavior in my book.
I'd rather not follow OpenAI's example here, since it IS undefined behavior. It's quite probable we'd do things differently than them, and then people would complain that we're doing things differently, and we'd be forced to choose between keeping existing "non-OpenAI-compliant" behavior that people might already depend on, and "fixing" it to align, which would be a breaking change for something we're not responsible for (OpenAI could also change their behavior, since it doesn't seem to be defined).

Also, and finally, there's a reason extreme values are an issue. We have had bugs filed because models were doing crazy things with temperature=0.0001. This is almost greedy, and the user's intention was to get greedy output, but t=0.0001 is not exactly greedy. The values are so extreme that they cause numerical instabilities, and the logits ended up full of NaNs. So there IS a difference between greedy and t=0.00001.
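To make the numerical-stability point concrete, here is a toy sketch (not TGI's actual sampling code) of how a near-zero temperature turns a naive softmax into NaNs:

```rust
// Toy example: dividing logits by a tiny temperature pushes exp() to infinity
// in f32, and inf / inf yields NaN.
fn naive_softmax(logits: &[f32], temperature: f32) -> Vec<f32> {
    let exps: Vec<f32> = logits.iter().map(|&l| (l / temperature).exp()).collect();
    let sum: f32 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}

fn main() {
    let logits = [10.0_f32, 9.0, 1.0];
    println!("t=1.0    -> {:?}", naive_softmax(&logits, 1.0));    // well-behaved probabilities
    println!("t=0.0001 -> {:?}", naive_softmax(&logits, 0.0001)); // [NaN, NaN, NaN]
}
```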

Collaborator:

Regardless, the fix for OpenAI compatibility is not here, but in the OpenAI layer (where we can adapt values to our stricter format).
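A minimal sketch of what such an adaptation could look like (a hypothetical helper, not an existing TGI function); it assumes out-of-range values are still rejected by the downstream validation:

```rust
// Hypothetical adaptation-layer helper: translate OpenAI-style edge values
// into the stricter internal format before validation runs.
fn adapt_openai_top_p(top_p: Option<f32>) -> Option<f32> {
    match top_p {
        // top_p = 1.0 means "no nucleus filtering": drop the parameter.
        Some(v) if v == 1.0 => None,
        // top_p = 0.0 has no clear meaning; clamping to a tiny value (as
        // OpenAI is suspected to do) is one option, erroring out is another.
        Some(v) if v == 0.0 => Some(1e-9),
        other => other,
    }
}
```

Keeping this translation in the OpenAI-compatible route would leave the core validation strict, in line with the "Error > Undefined behavior" stance above.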

ErikKaum (Member Author):

I think for now I'd lean towards closing this.

The main argument being Error > Undefined behavior.
It means we unfortunately aren't 100% in line with the OpenAI spec, but since the top_p=0 behavior isn't documented, there really isn't a way to implement it either.

If there is still demand for this, I think we can discuss treating top_p=1.0 as no top_p.

Hopefully this makes sense to you, and feel free to reopen this 👍
