Add responseLogProbs and logProbs parameters to generateContentReq #266

Merged: 9 commits, Sep 26, 2024
5 changes: 5 additions & 0 deletions .changeset/funny-pillows-whisper.md
@@ -0,0 +1,5 @@
---
"@google/generative-ai": minor
---

Added the responseLogProbs and logProbs parameters
5 changes: 5 additions & 0 deletions .changeset/weak-stingrays-serve.md
@@ -0,0 +1,5 @@
---
"@google/generative-ai": minor
---

Add GenerateContentResponse's avgLogProbs and logProbsResult
26 changes: 26 additions & 0 deletions common/api-review/generative-ai.api.md
@@ -9,7 +9,9 @@ export interface BaseParams {
frequencyPenalty?: number;
// (undocumented)
generationConfig?: GenerationConfig;
logProbs?: number;
presencePenalty?: number;
responseLogProbs?: boolean;
// (undocumented)
safetySettings?: SafetySetting[];
}
@@ -62,6 +64,13 @@ export interface CachedContentBase {
tools?: Tool[];
}

// @public
export interface Candidate {
logProbability: number;
token: string;
tokenID: number;
}

// @public
export class ChatSession {
constructor(apiKey: string, model: string, params?: StartChatParams, _requestOptions?: RequestOptions);
@@ -371,6 +380,7 @@ export interface FunctionResponsePart {

// @public
export interface GenerateContentCandidate {
avgLogProbs?: number;
// (undocumented)
citationMetadata?: CitationMetadata;
// (undocumented)
@@ -381,6 +391,7 @@ export interface GenerateContentCandidate {
finishReason?: FinishReason;
// (undocumented)
index: number;
logProbsResult?: LogprobsResult;
// (undocumented)
safetyRatings?: SafetyRating[];
}
@@ -467,10 +478,14 @@ export class GenerativeModel {
// (undocumented)
generationConfig: GenerationConfig;
// (undocumented)
logProbs?: number;
// (undocumented)
model: string;
// (undocumented)
presencePenalty?: number;
// (undocumented)
responseLogProbs?: boolean;
// (undocumented)
safetySettings: SafetySetting[];
startChat(startChatParams?: StartChatParams): ChatSession;
// (undocumented)
@@ -577,6 +592,12 @@ export interface InlineDataPart {
text?: never;
}

// @public
export interface LogprobsResult {
chosenCandidates: Candidate[];
topCandidates: TopCandidates[];
}

// @public
export interface ModelParams extends BaseParams {
// (undocumented)
@@ -730,6 +751,11 @@ export interface ToolConfig {
functionCallingConfig: FunctionCallingConfig;
}

// @public
export interface TopCandidates {
candidates: Candidate[];
}

// @public
export interface UsageMetadata {
cachedContentTokenCount?: number;
13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.baseparams.logprobs.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [BaseParams](./generative-ai.baseparams.md) &gt; [logProbs](./generative-ai.baseparams.logprobs.md)

## BaseParams.logProbs property

Only valid if responseLogProbs is set to true. Sets the number of top logprobs to return at each decoding step in the logProbsResult.

**Signature:**

```typescript
logProbs?: number;
```
2 changes: 2 additions & 0 deletions docs/reference/main/generative-ai.baseparams.md
@@ -18,6 +18,8 @@ export interface BaseParams
| --- | --- | --- | --- |
| [frequencyPenalty?](./generative-ai.baseparams.frequencypenalty.md) | | number | _(Optional)_ Frequency penalty applied to the next token's logprobs, multiplied by the number of times each token has been seen in the response so far. |
| [generationConfig?](./generative-ai.baseparams.generationconfig.md) | | [GenerationConfig](./generative-ai.generationconfig.md) | _(Optional)_ |
| [logProbs?](./generative-ai.baseparams.logprobs.md) | | number | _(Optional)_ Only valid if responseLogProbs is set to true. Sets the number of top logprobs to return at each decoding step in the logProbsResult. |
| [presencePenalty?](./generative-ai.baseparams.presencepenalty.md) | | number | _(Optional)_ Presence penalty applied to the next token's logprobs if the token has already been seen in the response. |
| [responseLogProbs?](./generative-ai.baseparams.responselogprobs.md) | | boolean | _(Optional)_ If true, export the logprobs results in the response. |
| [safetySettings?](./generative-ai.baseparams.safetysettings.md) | | [SafetySetting](./generative-ai.safetysetting.md)<!-- -->\[\] | _(Optional)_ |
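
Taken together, the two new request parameters work as a pair: `responseLogProbs` switches logprob reporting on, and `logProbs` chooses how many top candidates to report per decoding step. Below is a minimal sketch of building such a request fragment; it uses a local stand-in interface rather than the SDK's full `BaseParams`, and the helper name and guard logic are illustrative assumptions, not part of the library.

```typescript
// Local mirror of the two relevant BaseParams fields (illustrative,
// not the SDK's full interface).
interface BaseParamsSketch {
  responseLogProbs?: boolean;
  logProbs?: number;
}

// logProbs is only meaningful when responseLogProbs is true, so this
// helper keeps the pair consistent and rejects nonsensical counts.
function withLogProbs(topN: number): BaseParamsSketch {
  if (!Number.isInteger(topN) || topN < 1) {
    throw new RangeError(`logProbs must be a positive integer, got ${topN}`);
  }
  return { responseLogProbs: true, logProbs: topN };
}

const params = withLogProbs(2);
console.log(params); // { responseLogProbs: true, logProbs: 2 }
```

Since `ModelParams` extends `BaseParams`, a fragment like this could plausibly be spread into the object passed to `getGenerativeModel()`, though the exact wiring depends on the SDK version in use.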

13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.baseparams.responselogprobs.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [BaseParams](./generative-ai.baseparams.md) &gt; [responseLogProbs](./generative-ai.baseparams.responselogprobs.md)

## BaseParams.responseLogProbs property

If true, export the logprobs results in the response.

**Signature:**

```typescript
responseLogProbs?: boolean;
```
13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.candidate.logprobability.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [Candidate](./generative-ai.candidate.md) &gt; [logProbability](./generative-ai.candidate.logprobability.md)

## Candidate.logProbability property

The candidate's log probability.

**Signature:**

```typescript
logProbability: number;
```
22 changes: 22 additions & 0 deletions docs/reference/main/generative-ai.candidate.md
@@ -0,0 +1,22 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [Candidate](./generative-ai.candidate.md)

## Candidate interface

Candidate for the logprobs token and score.

**Signature:**

```typescript
export interface Candidate
```

## Properties

| Property | Modifiers | Type | Description |
| --- | --- | --- | --- |
| [logProbability](./generative-ai.candidate.logprobability.md) | | number | The candidate's log probability. |
| [token](./generative-ai.candidate.token.md) | | string | The candidate's token string value. |
| [tokenID](./generative-ai.candidate.tokenid.md) | | number | The candidate's token id value. |
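
Since `logProbability` is a natural-log value, `Math.exp` recovers an ordinary probability, which is usually easier to read. A small sketch using a locally declared mirror of the `Candidate` shape; the sample token and id values are made up.

```typescript
// Local stand-in for the documented Candidate shape.
interface Candidate {
  token: string;
  tokenID: number;
  logProbability: number;
}

// logProbability is a natural log, so Math.exp maps it back to a
// probability in [0, 1] for human-readable display.
function describeCandidate(c: Candidate): string {
  const pct = (Math.exp(c.logProbability) * 100).toFixed(1);
  return `${JSON.stringify(c.token)} (id ${c.tokenID}): ${pct}%`;
}

// Hypothetical sample values, not real model output.
console.log(
  describeCandidate({ token: "Hello", tokenID: 9259, logProbability: -0.25 }),
); // "Hello" (id 9259): 77.9%
```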

13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.candidate.token.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [Candidate](./generative-ai.candidate.md) &gt; [token](./generative-ai.candidate.token.md)

## Candidate.token property

The candidate's token string value.

**Signature:**

```typescript
token: string;
```
13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.candidate.tokenid.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [Candidate](./generative-ai.candidate.md) &gt; [tokenID](./generative-ai.candidate.tokenid.md)

## Candidate.tokenID property

The candidate's token id value.

**Signature:**

```typescript
tokenID: number;
```
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerateContentCandidate](./generative-ai.generatecontentcandidate.md) &gt; [avgLogProbs](./generative-ai.generatecontentcandidate.avglogprobs.md)

## GenerateContentCandidate.avgLogProbs property

Average log probability score of the candidate.

**Signature:**

```typescript
avgLogProbs?: number;
```
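
The docs do not spell out how `avgLogProbs` is computed. A plausible reading — an assumption on my part, not documented behavior — is the mean of the chosen tokens' `logProbability` values, which this sketch computes client-side for comparison against the reported field.

```typescript
// Local stand-in for the per-token Candidate shape.
interface Candidate {
  logProbability: number;
}

// Assumed relationship (not documented): avgLogProbs ≈ the mean
// logProbability over the chosen candidates, one per decoding step.
function meanLogProb(chosen: Candidate[]): number {
  if (chosen.length === 0) return NaN;
  const sum = chosen.reduce((acc, c) => acc + c.logProbability, 0);
  return sum / chosen.length;
}

const steps = [{ logProbability: -0.1 }, { logProbability: -0.3 }];
console.log(meanLogProb(steps)); // -0.2
```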
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerateContentCandidate](./generative-ai.generatecontentcandidate.md) &gt; [logProbsResult](./generative-ai.generatecontentcandidate.logprobsresult.md)

## GenerateContentCandidate.logProbsResult property

Log-likelihood scores for the response tokens and top tokens.

**Signature:**

```typescript
logProbsResult?: LogprobsResult;
```
2 changes: 2 additions & 0 deletions docs/reference/main/generative-ai.generatecontentcandidate.md
@@ -16,10 +16,12 @@ export interface GenerateContentCandidate

| Property | Modifiers | Type | Description |
| --- | --- | --- | --- |
| [avgLogProbs?](./generative-ai.generatecontentcandidate.avglogprobs.md) | | number | _(Optional)_ Average log probability score of the candidate. |
| [citationMetadata?](./generative-ai.generatecontentcandidate.citationmetadata.md) | | [CitationMetadata](./generative-ai.citationmetadata.md) | _(Optional)_ |
| [content](./generative-ai.generatecontentcandidate.content.md) | | [Content](./generative-ai.content.md) | |
| [finishMessage?](./generative-ai.generatecontentcandidate.finishmessage.md) | | string | _(Optional)_ |
| [finishReason?](./generative-ai.generatecontentcandidate.finishreason.md) | | [FinishReason](./generative-ai.finishreason.md) | _(Optional)_ |
| [index](./generative-ai.generatecontentcandidate.index.md) | | number | |
| [logProbsResult?](./generative-ai.generatecontentcandidate.logprobsresult.md) | | [LogprobsResult](./generative-ai.logprobsresult.md) | _(Optional)_ Log-likelihood scores for the response tokens and top tokens. |
| [safetyRatings?](./generative-ai.generatecontentcandidate.safetyratings.md) | | [SafetyRating](./generative-ai.safetyrating.md)<!-- -->\[\] | _(Optional)_ |

11 changes: 11 additions & 0 deletions docs/reference/main/generative-ai.generativemodel.logprobs.md
@@ -0,0 +1,11 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerativeModel](./generative-ai.generativemodel.md) &gt; [logProbs](./generative-ai.generativemodel.logprobs.md)

## GenerativeModel.logProbs property

**Signature:**

```typescript
logProbs?: number;
```
2 changes: 2 additions & 0 deletions docs/reference/main/generative-ai.generativemodel.md
@@ -26,8 +26,10 @@ export declare class GenerativeModel
| [cachedContent](./generative-ai.generativemodel.cachedcontent.md) | | [CachedContent](./generative-ai.cachedcontent.md) | |
| [frequencyPenalty?](./generative-ai.generativemodel.frequencypenalty.md) | | number | _(Optional)_ |
| [generationConfig](./generative-ai.generativemodel.generationconfig.md) | | [GenerationConfig](./generative-ai.generationconfig.md) | |
| [logProbs?](./generative-ai.generativemodel.logprobs.md) | | number | _(Optional)_ |
| [model](./generative-ai.generativemodel.model.md) | | string | |
| [presencePenalty?](./generative-ai.generativemodel.presencepenalty.md) | | number | _(Optional)_ |
| [responseLogProbs?](./generative-ai.generativemodel.responselogprobs.md) | | boolean | _(Optional)_ |
| [safetySettings](./generative-ai.generativemodel.safetysettings.md) | | [SafetySetting](./generative-ai.safetysetting.md)<!-- -->\[\] | |
| [systemInstruction?](./generative-ai.generativemodel.systeminstruction.md) | | [Content](./generative-ai.content.md) | _(Optional)_ |
| [toolConfig?](./generative-ai.generativemodel.toolconfig.md) | | [ToolConfig](./generative-ai.toolconfig.md) | _(Optional)_ |
@@ -0,0 +1,11 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [GenerativeModel](./generative-ai.generativemodel.md) &gt; [responseLogProbs](./generative-ai.generativemodel.responselogprobs.md)

## GenerativeModel.responseLogProbs property

**Signature:**

```typescript
responseLogProbs?: boolean;
```
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsResult](./generative-ai.logprobsresult.md) &gt; [chosenCandidates](./generative-ai.logprobsresult.chosencandidates.md)

## LogprobsResult.chosenCandidates property

Length = total number of decoding steps. The chosen candidates may or may not be in topCandidates.

**Signature:**

```typescript
chosenCandidates: Candidate[];
```
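
Because the chosen candidates may or may not appear in `topCandidates`, a per-step membership check is a handy diagnostic when inspecting a response. A sketch using locally declared mirrors of the documented shapes; the sample tokens and ids are invented.

```typescript
// Local stand-ins for the documented shapes.
interface Candidate {
  token: string;
  tokenID: number;
  logProbability: number;
}
interface TopCandidates {
  candidates: Candidate[];
}

// For each decoding step i, report whether the chosen token also
// appears among that step's top candidates (it may not).
function chosenInTop(chosen: Candidate[], top: TopCandidates[]): boolean[] {
  return chosen.map((c, i) =>
    (top[i]?.candidates ?? []).some((t) => t.tokenID === c.tokenID),
  );
}

// Hypothetical sample data: step 0's chosen token is in the top list,
// step 1's is not.
const result = chosenInTop(
  [
    { token: "a", tokenID: 5, logProbability: -0.2 },
    { token: "b", tokenID: 9, logProbability: -1.1 },
  ],
  [
    {
      candidates: [
        { token: "a", tokenID: 5, logProbability: -0.2 },
        { token: "c", tokenID: 7, logProbability: -0.9 },
      ],
    },
    {
      candidates: [
        { token: "d", tokenID: 2, logProbability: -0.4 },
        { token: "e", tokenID: 3, logProbability: -0.6 },
      ],
    },
  ],
);
console.log(result); // [ true, false ]
```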
21 changes: 21 additions & 0 deletions docs/reference/main/generative-ai.logprobsresult.md
@@ -0,0 +1,21 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsResult](./generative-ai.logprobsresult.md)

## LogprobsResult interface

Logprobs Result

**Signature:**

```typescript
export interface LogprobsResult
```

## Properties

| Property | Modifiers | Type | Description |
| --- | --- | --- | --- |
| [chosenCandidates](./generative-ai.logprobsresult.chosencandidates.md) | | [Candidate](./generative-ai.candidate.md)<!-- -->\[\] | Length = total number of decoding steps. The chosen candidates may or may not be in topCandidates. |
| [topCandidates](./generative-ai.logprobsresult.topcandidates.md) | | [TopCandidates](./generative-ai.topcandidates.md)<!-- -->\[\] | Length = total number of decoding steps. |

13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.logprobsresult.topcandidates.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [LogprobsResult](./generative-ai.logprobsresult.md) &gt; [topCandidates](./generative-ai.logprobsresult.topcandidates.md)

## LogprobsResult.topCandidates property

Length = total number of decoding steps.

**Signature:**

```typescript
topCandidates: TopCandidates[];
```
3 changes: 3 additions & 0 deletions docs/reference/main/generative-ai.md
@@ -40,6 +40,7 @@
| [BatchEmbedContentsResponse](./generative-ai.batchembedcontentsresponse.md) | Response from calling [GenerativeModel.batchEmbedContents()](./generative-ai.generativemodel.batchembedcontents.md)<!-- -->. |
| [CachedContent](./generative-ai.cachedcontent.md) | Describes <code>CachedContent</code> interface for sending to the server (if creating) or received from the server (using getters or list methods). |
| [CachedContentBase](./generative-ai.cachedcontentbase.md) | |
| [Candidate](./generative-ai.candidate.md) | Candidate for the logprobs token and score. |
| [CitationMetadata](./generative-ai.citationmetadata.md) | Citation metadata that may be found on a [GenerateContentCandidate](./generative-ai.generatecontentcandidate.md)<!-- -->. |
| [CitationSource](./generative-ai.citationsource.md) | A single citation source. |
| [CodeExecutionResult](./generative-ai.codeexecutionresult.md) | Result of executing the <code>ExecutableCode</code>. Only generated when using code execution, and always follows a <code>Part</code> containing the <code>ExecutableCode</code>. |
@@ -74,6 +75,7 @@
| [GenerationConfig](./generative-ai.generationconfig.md) | Config options for content-related requests |
| [GenerativeContentBlob](./generative-ai.generativecontentblob.md) | Interface for sending an image. |
| [InlineDataPart](./generative-ai.inlinedatapart.md) | Content part interface if the part represents an image. |
| [LogprobsResult](./generative-ai.logprobsresult.md) | Logprobs Result |
| [ModelParams](./generative-ai.modelparams.md) | Params passed to [GoogleGenerativeAI.getGenerativeModel()](./generative-ai.googlegenerativeai.getgenerativemodel.md)<!-- -->. |
| [PromptFeedback](./generative-ai.promptfeedback.md) | If the prompt was blocked, this will be populated with <code>blockReason</code> and the relevant <code>safetyRatings</code>. |
| [RequestOptions](./generative-ai.requestoptions.md) | Params passed to getGenerativeModel() or GoogleAIFileManager(). |
@@ -85,6 +87,7 @@
| [StartChatParams](./generative-ai.startchatparams.md) | Params for [GenerativeModel.startChat()](./generative-ai.generativemodel.startchat.md)<!-- -->. |
| [TextPart](./generative-ai.textpart.md) | Content part interface if the part represents a text string. |
| [ToolConfig](./generative-ai.toolconfig.md) | Tool config. This config is shared for all tools provided in the request. |
| [TopCandidates](./generative-ai.topcandidates.md) | Candidates with top log probabilities at each decoding step |
| [UsageMetadata](./generative-ai.usagemetadata.md) | Metadata on the generation request's token usage. |

## Variables
13 changes: 13 additions & 0 deletions docs/reference/main/generative-ai.topcandidates.candidates.md
@@ -0,0 +1,13 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [TopCandidates](./generative-ai.topcandidates.md) &gt; [candidates](./generative-ai.topcandidates.candidates.md)

## TopCandidates.candidates property

Sorted by log probability in descending order.

**Signature:**

```typescript
candidates: Candidate[];
```
20 changes: 20 additions & 0 deletions docs/reference/main/generative-ai.topcandidates.md
@@ -0,0 +1,20 @@
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@google/generative-ai](./generative-ai.md) &gt; [TopCandidates](./generative-ai.topcandidates.md)

## TopCandidates interface

Candidates with top log probabilities at each decoding step

**Signature:**

```typescript
export interface TopCandidates
```

## Properties

| Property | Modifiers | Type | Description |
| --- | --- | --- | --- |
| [candidates](./generative-ai.topcandidates.candidates.md) | | [Candidate](./generative-ai.candidate.md)<!-- -->\[\] | Sorted by log probability in descending order. |
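
The descending sort means the most likely alternative at a step is always `candidates[0]`, and a defensive check of the invariant is cheap if you would rather not rely on it. Local stand-in types again, purely for illustration.

```typescript
// Local stand-in for the documented Candidate shape.
interface Candidate {
  token: string;
  logProbability: number;
}

// candidates is documented as sorted by logProbability descending,
// so the most likely alternative at a step is candidates[0].
function mostLikely(candidates: Candidate[]): Candidate | undefined {
  return candidates[0];
}

// Cheap defensive check of the documented sort invariant.
function isSortedDescending(candidates: Candidate[]): boolean {
  return candidates.every(
    (c, i) => i === 0 || candidates[i - 1].logProbability >= c.logProbability,
  );
}
```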

2 changes: 2 additions & 0 deletions src/methods/generate-content.test.ts
@@ -44,6 +44,8 @@ const fakeRequestParams: GenerateContentRequest = {
],
presencePenalty: 0.5,
frequencyPenalty: 0.1,
responseLogProbs: true,
logProbs: 2,
};

describe("generateContent()", () => {