Merged
22 changes: 11 additions & 11 deletions docs/dotprompt.md
@@ -15,7 +15,7 @@ might call `greeting.prompt`:

```none
---
-model: vertexai/gemini-1.0-pro
+model: vertexai/gemini-1.5-flash
config:
temperature: 0.9
input:
@@ -193,7 +193,7 @@ the file itself, you can also override these values on a per-call basis:

```ts
const result = await greetingPrompt.generate({
-  model: 'google-genai/gemini-pro',
+  model: 'vertexai/gemini-1.5-pro',
config: {
temperature: 1.0,
},
@@ -210,7 +210,7 @@ You can set the format and output schema of a prompt to coerce into JSON:

```none
---
-model: vertexai/gemini-1.0-pro
+model: vertexai/gemini-1.5-flash
input:
schema:
theme: string
@@ -267,7 +267,7 @@ The `{{role}}` helper provides a simple way to construct multi-message prompts:

```none
---
-model: vertexai/gemini-1.0-pro
+model: vertexai/gemini-1.5-flash
input:
schema:
userQuestion: string
@@ -317,7 +317,7 @@ use the `{{media}}` helper:

```none
---
-model: vertexai/gemini-1.0-pro-vision
+model: vertexai/gemini-1.5-flash
input:
schema:
photoUrl: string
@@ -416,16 +416,16 @@ production environment side-by-side with existing versions. Dotprompt supports
this through its **variants** feature.

To create a variant, create a `[name].[variant].prompt` file. For instance, if
-you were using Gemini 1.0 Pro in your prompt but wanted to see if Gemini 1.5 Pro
-would perform better, you might create two files:
+you were using Gemini 1.5 Flash in your prompt but wanted to see if Gemini 1.5
+Pro would perform better, you might create two files:

- `my_prompt.prompt`: the "baseline" prompt
-- `my_prompt.gemini15.prompt`: a variant named "gemini"
+- `my_prompt.geminipro.prompt`: a variant named "geminipro"

To use a prompt variant, specify the `variant` option when loading:

```ts
-const myPrompt = await prompt('my_prompt', { variant: 'gemini15' });
+const myPrompt = await prompt('my_prompt', { variant: 'geminipro' });
```
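The `[name].[variant].prompt` convention above amounts to simple filename resolution. A minimal sketch of that mapping, assuming `variantFilename` is a hypothetical helper for illustration, not part of the Dotprompt API:

```typescript
// Illustrative sketch only: map a prompt name plus an optional variant
// to the .prompt filename the docs describe. Not Dotprompt's real resolver.
function variantFilename(name: string, variant?: string): string {
  return variant ? `${name}.${variant}.prompt` : `${name}.prompt`;
}
```

Under this convention, loading with `{ variant: 'geminipro' }` would resolve to `my_prompt.geminipro.prompt`, while omitting the option resolves to the baseline `my_prompt.prompt`.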

The name of the variant is included in the metadata of generation traces, so you
@@ -447,7 +447,7 @@ Once a helper is defined you can use it in any prompt:

```none
---
-model: vertexai/gemini-1.5-pro
+model: vertexai/gemini-1.5-flash
input:
schema:
name: string
@@ -491,7 +491,7 @@ const myPrompt = await loadPromptUrl('https://example.com/my_prompt.prompt');
// Define a prompt in code
const myPrompt = defineDotprompt(
{
-    model: 'vertexai/gemini-1.0-pro',
+    model: 'vertexai/gemini-1.5-flash',
input: {
schema: z.object({
name: z.string(),
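The `.prompt` files edited throughout this diff pair a YAML front-matter block (delimited by `---` lines) with a template body. A toy splitter, assuming well-formed delimiters — this is illustrative, not Dotprompt's actual parser:

```typescript
// Toy sketch: separate a .prompt source into its front matter and
// template body. Assumes a well-formed leading `---` ... `---` block;
// not the library's real parsing logic.
function splitPromptSource(source: string): { frontMatter: string; template: string } {
  const lines = source.split('\n');
  if (lines[0] !== '---') {
    // No front matter: the whole file is the template.
    return { frontMatter: '', template: source };
  }
  const end = lines.indexOf('---', 1);
  return {
    frontMatter: lines.slice(1, end).join('\n'),
    template: lines.slice(end + 1).join('\n'),
  };
}
```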
2 changes: 1 addition & 1 deletion docs/index.md
@@ -186,7 +186,7 @@ into a single file for easier testing and organization.

```none
---
-model: vertexai/gemini-1.0-pro
+model: vertexai/gemini-1.5-flash
config:
temperature: 0.9
input:
2 changes: 1 addition & 1 deletion docs/models.md
@@ -31,7 +31,7 @@ authentication. For example, Vertex API uses the Google Auth Library so it can
pull required credentials using Application Default Credentials.

To use models provided by the plugin, you can either refer to them by name (e.g.
-`'vertexai/gemini-1.0-pro'`) or some plugins export model ref objects which
+`'vertexai/gemini-1.5-flash'`) or some plugins export model ref objects which
provide additional type info about the model capabilities and options.

```js
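The model ref idea mentioned in `docs/models.md` — a string identifier bundled with type information about capabilities — can be pictured with a small sketch. The shape below is invented for illustration; real plugins define their own ref types:

```typescript
// Illustrative only: a "model ref" pairs the string name with capability
// metadata so tooling can type-check model options. The fields here are
// assumptions for the sketch, not a plugin's actual export.
interface ModelRef {
  name: string;
  supportsMedia: boolean;
}

const geminiFlash: ModelRef = {
  name: 'vertexai/gemini-1.5-flash',
  supportsMedia: true,
};

// Accept either a plain name or a ref, as the docs describe.
function modelName(m: ModelRef | string): string {
  return typeof m === 'string' ? m : m.name;
}
```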
6 changes: 3 additions & 3 deletions docs/prompts.md
@@ -23,7 +23,7 @@ call models this way for straight-forward use cases.
import { generate } from '@genkit-ai/ai';

generate({
-  model: 'googleai/gemini-pro',
+  model: 'googleai/gemini-1.5-flash-latest',
prompt: 'You are a helpful AI assistant named Walt.',
});
```
@@ -37,7 +37,7 @@ function helloPrompt(name: string) {
}

generate({
-  model: 'googleai/gemini-pro',
+  model: 'googleai/gemini-1.5-flash-latest',
prompt: helloPrompt('Fred'),
});
```
@@ -85,7 +85,7 @@ generate(
renderPrompt({
prompt: helloPrompt,
input: { name: 'Fred' },
-    model: 'googleai/gemini-pro',
+    model: 'googleai/gemini-1.5-flash-latest',
})
);
```
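The `renderPrompt` pattern in `docs/prompts.md` treats a prompt as a function from typed input to prompt text, rendered before generation. A self-contained sketch of that pattern, without Genkit — the `helloPrompt` wording below is illustrative, not the docs' hidden function body:

```typescript
// Conceptual sketch of the function-prompt pattern: a prompt is a function
// from typed input to text, and rendering is just applying it. This is not
// Genkit's implementation; names and wording are illustrative.
type PromptFn<I> = (input: I) => string;

const helloPrompt: PromptFn<{ name: string }> = ({ name }) =>
  `You are a helpful AI assistant named Walt. Say hello to ${name}.`;

function renderPromptText<I>(prompt: PromptFn<I>, input: I): string {
  return prompt(input);
}
```

Separating the prompt function from the model choice, as in the diff above, lets the same prompt be rendered against any model name passed at call time.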