[OpenAI] Support Top Logprob in Chat Completions #29432
base: main
Conversation
API change check: APIView has identified API-level changes in this PR and created the following API reviews.
```ts
logprobs?: boolean;
/**
 * An integer between 0 and 5 specifying the number of most likely tokens to return at each token position,
 * each with an associated log probability. `logprobs` must be set to `true` if this parameter is used.
 */
topLogprobs?: number;
```
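The constraint documented above (an integer from 0 to 5, usable only when `logprobs` is `true`) can be sketched as a small validation helper. This is a hypothetical illustration, not part of the SDK in this PR:

```typescript
// Hypothetical helper illustrating the documented constraint:
// `topLogprobs` must be an integer between 0 and 5, and requires
// `logprobs` to be set to `true`.
interface LogprobOptions {
  logprobs?: boolean;
  topLogprobs?: number;
}

function validateLogprobOptions(options: LogprobOptions): void {
  const { logprobs, topLogprobs } = options;
  if (topLogprobs !== undefined) {
    if (!Number.isInteger(topLogprobs) || topLogprobs < 0 || topLogprobs > 5) {
      throw new Error("topLogprobs must be an integer between 0 and 5.");
    }
    if (logprobs !== true) {
      throw new Error("logprobs must be set to true when topLogprobs is used.");
    }
  }
}
```

For example, `{ logprobs: true, topLogprobs: 3 }` passes, while `{ topLogprobs: 3 }` alone would be rejected.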
I wonder, do we need two knobs to control this feature? Can't we just do with `topLogprobs`? Perhaps it is more of a question for the architects.
This is a pattern in the OpenAI API, so most languages are surfacing the same pattern. In .NET, they rename `logprobs` as `EnableLogProbabilities` and `topLogprobs` as `LogProbabilitiesPerToken`.
Packages impacted by this PR
@azure/openai
Issues associated with this PR
#29199
Describe the problem that is addressed by this PR
What are the possible designs available to address the problem? If there are more than one possible design, why was the one in this PR chosen?
Are there test cases added in this PR? (If not, why?)
Provide a list of related PRs (if any)
Command used to generate this PR: (Applicable only to SDK release request PRs)
Checklists