Commit 954d372

mbleigh authored and pavelgj committed
chore: Adds .guides folders to a few Genkit packages (#3614)
Co-authored-by: Pavel Jbanov <pavelgj@gmail.com>
1 parent cfe7954 commit 954d372

File tree

11 files changed: +672 -7 lines changed

js/genkit/.guides/config.json

Lines changed: 39 additions & 0 deletions
```json
{
  "mcpServers": {
    "genkit": {
      "command": "genkit",
      "args": ["mcp"]
    }
  },
  "docs": [
    {
      "url": "https://genkit.dev/docs/models.md",
      "name": "generate-content",
      "title": "Generate Content",
      "description": "how to generate content (text, structured data, images, videos) with Genkit."
    },
    {
      "url": "https://genkit.dev/docs/flows.md",
      "name": "flows",
      "title": "Using Flows to build GenAI Workflows",
      "description": "how to construct strongly typed AI workflows with Genkit."
    },
    {
      "url": "https://genkit.dev/docs/tool-calling.md",
      "name": "tool-calling",
      "title": "Tool Calling",
      "description": "an in-depth guide to providing tools/functions to Genkit for GenAI"
    },
    {
      "url": "https://genkit.dev/docs/interrupts.md",
      "name": "tool-calling/interrupts",
      "title": "Interrupts (Tool Calling with Human-in-the-Loop)",
      "description": "instructions on how to use interrupts to provide human-in-the-loop capabilities to Genkit agents"
    },
    {
      "url": "https://genkit.dev/docs/context.md",
      "name": "context",
      "description": "how to pass context to tools and flows without exposing sensitive data to the LLM"
    }
  ]
}
```

js/genkit/.guides/setup.md

Lines changed: 30 additions & 0 deletions
Follow these instructions to set up Genkit in the current codebase. These instructions are general-purpose and have not been written with specific codebase knowledge, so use your best judgement when following them.

0. Tell the user "I'm going to check out your workspace and set you up to use Genkit for GenAI workflows."
1. If the current workspace is empty or is a starter template, your goal will be to create a simple image generation flow that allows someone to generate an image based on a prompt and selectable style. If the current workspace is not empty, you will create a simple example flow to help get the user started.
2. Check to see if any Genkit provider plugin (such as `@genkit-ai/google-genai` or `@genkit-ai/oai-compat` or others; may start with `genkitx-*`) is installed. If not, install `@genkit-ai/google-genai`. If this is a Next.js app, install `@genkit-ai/next` as well.
3. Search the codebase for the exact string `genkit(` (remember to escape regexes properly), which would indicate that the user has already set up Genkit in the codebase. If found, there is no need to set it up again; tell the user "Genkit is already configured in this app." and exit this workflow.
4. Create an `ai` directory in the primary source directory of the project (this may be e.g. `src` but is project-dependent).
5. Create `{sourceDir}/ai/genkit.ts` and populate it using the example below. DO NOT add a `next` plugin to the file, ONLY add a model provider plugin to the plugins array:

```ts
import { genkit, z } from 'genkit';
import { googleAI } from '@genkit-ai/google-genai';

export const ai = genkit({
  plugins: [googleAI()],
});

export { z };
```

6. Create `{sourceDir}/ai/tools` and `{sourceDir}/ai/flows` directories, but leave them empty for now.
7. Create `{sourceDir}/ai/index.ts` and populate it with the following (change the import to match import aliases in `tsconfig.json` as needed):

```ts
import './genkit.js';
// import each created flow, tool, etc. here for use in the Genkit Dev UI
```

8. Add a `genkit:ui` script to `package.json` that runs `npx genkit-cli start -- npx tsx {sourceDir}/ai/index.ts` (or `pnpm dlx` / `yarn dlx` for those package managers). DO NOT try to run the script now.
9. Tell the user "Genkit is now configured and ready for use." as setup is now complete. Also remind them to set appropriate env variables (e.g. `GEMINI_API_KEY` for Google providers). Wait for the user to prompt further before creating any specific flows.
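
For step 8, the resulting `package.json` entry might look like this (a sketch assuming `src` as the source directory and npm as the package manager):

```json
{
  "scripts": {
    "genkit:ui": "npx genkit-cli start -- npx tsx src/ai/index.ts"
  }
}
```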

js/genkit/.guides/style.md

Lines changed: 1 addition & 0 deletions
- Prefer destructuring generate calls, e.g. `const {text} = await ai.generate(...)`

js/genkit/.guides/usage.md

Lines changed: 49 additions & 0 deletions
## Basic Example

```ts
import { ai, z } from "@/ai/genkit"; // or wherever genkit is initialized

const myTool = ai.defineTool({name, description, inputSchema: z.object(...)}, (input) => {...});

const {text} = await ai.generate({
  model: googleAI.model('gemini-2.5-flash'), // optional if default model is configured
  system: "the system instructions", // optional
  prompt: "the content of the prompt",
  // OR, for multi-modal content
  prompt: [{text: "what is this image?"}, {media: {url: "data:image/png;base64,..."}}],
  tools: [myTool],
});

// structured output
const CharacterSchema = z.object({...}); // make sure to use .describe() on fields
const {output} = await ai.generate({
  prompt: "generate an RPG character",
  output: {schema: CharacterSchema},
});
```

## Important API Clarifications

**IMPORTANT:** This app uses Genkit v1.19, which has changed significantly from pre-1.0 versions. Important changes include:

```ts
const response = await ai.generate(...);

response.text // CORRECT 1.x syntax
response.text() // INCORRECT pre-1.0 syntax

response.output // CORRECT 1.x syntax
response.output() // INCORRECT pre-1.0 syntax

const {stream, response} = ai.generateStream(...); // IMPORTANT: no `await` needed
for await (const chunk of stream) { } // CORRECT 1.x syntax
for await (const chunk of stream()) { } // INCORRECT pre-1.0 syntax
await response; // CORRECT 1.x syntax
await response(); // INCORRECT pre-1.0 syntax

await ai.generate({..., model: googleAI.model('gemini-2.5-flash')}); // CORRECT 1.x syntax
await ai.generate({..., model: gemini15Pro}); // INCORRECT pre-1.0 syntax
```

- Use `import {z} from "genkit"` when you need Zod, to get an implementation consistent with Genkit.
- When defining Zod schemas, ONLY use basic scalar, object, and array types. Use `.optional()` when needed and `.describe('...')` to add descriptions for output schemas.
- Genkit has many capabilities; make sure to read the docs when you need to use them.
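
The 1.x streaming shape described above — an async-iterable `stream` plus an awaitable `response`, with no `await` on the call itself — can be sketched with a plain async generator. This is a hypothetical stand-in (`fakeGenerateStream` is not a Genkit API), shown only to illustrate the iteration pattern:

```typescript
// Stand-in for ai.generateStream(...): returns an async iterable `stream`
// plus a `response` promise that resolves once streaming completes.
function fakeGenerateStream(chunks: string[]) {
  let full = "";
  let resolve!: (v: { text: string }) => void;
  const response = new Promise<{ text: string }>((r) => (resolve = r));
  async function* gen() {
    for (const c of chunks) {
      full += c;
      yield { text: c }; // each chunk mirrors the {text} shape of a response chunk
    }
    resolve({ text: full }); // resolve the final response after the last chunk
  }
  return { stream: gen(), response };
}

async function main() {
  const { stream, response } = fakeGenerateStream(["Hel", "lo"]); // no `await` here
  let seen = "";
  for await (const chunk of stream) seen += chunk.text; // iterate `stream` directly
  const final = await response; // then await `response` for the complete result
  return { seen, final: final.text };
}
```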
Lines changed: 110 additions & 0 deletions
Genkit's Express integration makes it easy to expose Genkit flows as Express API endpoints:

```ts
import express from 'express';
import { expressHandler } from '@genkit-ai/express';
import { simpleFlow } from './flows/simple-flow.js';

const app = express();
app.use(express.json());

app.post('/simpleFlow', expressHandler(simpleFlow));

app.listen(8080);
```

You can also handle auth using context providers:

```ts
import { UserFacingError } from 'genkit';
import { ContextProvider, RequestData } from 'genkit/context';

interface Context {
  auth: { user: string };
}

const context: ContextProvider<Context> = (req: RequestData) => {
  if (req.headers['authorization'] !== 'open sesame') {
    throw new UserFacingError('PERMISSION_DENIED', 'not authorized');
  }
  return {
    auth: {
      user: 'Ali Baba',
    },
  };
};

app.post(
  '/simpleFlow',
  expressHandler(simpleFlow, { context })
);
```
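
The context-provider contract above boils down to: inspect the request data, throw a `UserFacingError` to reject, or return a context object. A minimal dependency-free stand-in (the `RequestData` and `UserFacingError` here are local sketches, not the real genkit exports) shows that shape:

```typescript
// Local sketches of the genkit types, for illustration only.
type RequestData = { headers: Record<string, string> };

class UserFacingError extends Error {
  constructor(public status: string, message: string) {
    super(message);
  }
}

// Inspect request data; throw to reject, or return the context object
// that flows and tools will receive.
function contextProvider(req: RequestData) {
  if (req.headers['authorization'] !== 'open sesame') {
    throw new UserFacingError('PERMISSION_DENIED', 'not authorized');
  }
  return { auth: { user: 'Ali Baba' } };
}
```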
39+
40+
Flows and actions exposed using the `expressHandler` function can be accessed using `genkit/beta/client` library:
41+
42+
```ts
43+
import { runFlow, streamFlow } from 'genkit/beta/client';
44+
45+
const result = await runFlow({
46+
url: `http://localhost:${port}/simpleFlow`,
47+
input: 'say hello',
48+
});
49+
50+
console.log(result); // hello
51+
```
52+
53+
```ts
54+
// set auth headers (when using auth policies)
55+
const result = await runFlow({
56+
url: `http://localhost:${port}/simpleFlow`,
57+
headers: {
58+
Authorization: 'open sesame',
59+
},
60+
input: 'say hello',
61+
});
62+
63+
console.log(result); // hello
64+
```
65+
66+
```ts
67+
// and streamed
68+
const result = streamFlow({
69+
url: `http://localhost:${port}/simpleFlow`,
70+
input: 'say hello',
71+
});
72+
for await (const chunk of result.stream) {
73+
console.log(chunk);
74+
}
75+
console.log(await result.output);
76+
```
77+
78+
You can use `startFlowServer` to quickly expose multiple flows and actions:
79+
80+
```ts
81+
import { startFlowServer } from '@genkit-ai/express';
82+
import { genkit } from 'genkit';
83+
84+
const ai = genkit({});
85+
86+
export const menuSuggestionFlow = ai.defineFlow(
87+
{
88+
name: 'menuSuggestionFlow',
89+
},
90+
async (restaurantTheme) => {
91+
// ...
92+
}
93+
);
94+
95+
startFlowServer({
96+
flows: [menuSuggestionFlow],
97+
});
98+
```
99+
100+
You can also configure the server:
101+
102+
```ts
103+
startFlowServer({
104+
flows: [menuSuggestionFlow],
105+
port: 4567,
106+
cors: {
107+
origin: '*',
108+
},
109+
});
110+
```
Lines changed: 106 additions & 0 deletions
---
title: Edit images with `gemini-2.5-flash-image-preview` (aka "Nano Banana")
description: read this if you need to perform sophisticated image edits such as background removal, pose matching, character replacement, or relighting on an existing image
---

The `gemini-2.5-flash-image-preview` model (also known as "Nano Banana") can perform sophisticated image edits.

- You must ALWAYS add `{config: {responseModalities: ['TEXT', 'IMAGE']}}` to your `ai.generate` calls when using this model.

<example>
```ts
// generate an image from a prompt

import { ai } from "@/ai/genkit"; // or wherever genkit is initialized
import { googleAI } from "@genkit-ai/google-genai";

const {media} = await ai.generate({
  model: googleAI.model('gemini-2.5-flash-image-preview'),
  config: {responseModalities: ['TEXT', 'IMAGE']},
  prompt: "generate a picture of a unicorn wearing a space suit on the moon",
});

return media.url; // --> "data:image/png;base64,..."
```
</example>

<example>
```ts
// edit an image with a text prompt

import { ai } from "@/ai/genkit"; // or wherever genkit is initialized
import { googleAI } from "@genkit-ai/google-genai";

const {media} = await ai.generate({
  model: googleAI.model('gemini-2.5-flash-image-preview'),
  config: {responseModalities: ['TEXT', 'IMAGE']},
  prompt: [
    {text: "change the person's outfit to a banana costume"},
    {media: {url: "https://..." /* or 'data:...' */}},
  ],
});

return media.url; // --> "data:image/png;base64,..."
```
</example>

<example>
```ts
// combine multiple images together

import { ai } from "@/ai/genkit"; // or wherever genkit is initialized
import { googleAI } from "@genkit-ai/google-genai";

const {personImageUri, animalImageUri, sceneryImageUri} = await loadImages(...);

const {media} = await ai.generate({
  model: googleAI.model('gemini-2.5-flash-image-preview'),
  config: {responseModalities: ['TEXT', 'IMAGE']},
  prompt: [
    {text: "[PERSON]:\n"},
    {media: {url: personImageUri}},
    {text: "\n[ANIMAL]:\n"},
    {media: {url: animalImageUri}},
    {text: "\n[SCENERY]:\n"},
    // IMPORTANT: the model tends to match aspect ratio of the *last* image provided
    {media: {url: sceneryImageUri}},
    {text: "make an image of [PERSON] riding a giant version of [ANIMAL] with a background of [SCENERY]"},
  ],
});

return media.url; // --> "data:image/png;base64,..."
```
</example>

<example>
```ts
// use an annotated image to guide generation

import { ai } from "@/ai/genkit"; // or wherever genkit is initialized
import { googleAI } from "@genkit-ai/google-genai";

const originalImageUri = "data:..."; // the original image
const annotatedImageUri = "data:..."; // the image with annotations on top of it

const {media} = await ai.generate({
  model: googleAI.model('gemini-2.5-flash-image-preview'),
  config: {responseModalities: ['TEXT', 'IMAGE']},
  prompt: [
    {text: "follow the instructions in the following annotated image:"},
    {media: {url: annotatedImageUri}},
    {text: "\n\napply the annotated instructions to the original image, making sure to follow the instructions of the annotations.\n\noriginal image:\n"},
    {media: {url: originalImageUri}},
  ],
});

return media.url; // --> "data:image/png;base64,..."
```
</example>

## Prompting tips for image editing

- For complex edits, prefer a chain of small edits to a single complex edit. Feed the output of one generation as input to the next.
- Be specific and detailed about the edits you want to make.
- Be clear whether added images are meant as style or subject references.
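
The examples above pass images as data URIs. A small helper along these lines (an assumption — Node.js runtime, not part of Genkit) can build one from a local file for use in `{media: {url: ...}}`:

```typescript
import { readFileSync } from 'node:fs';

// Read an image file and encode it as a data URI suitable for {media: {url}}.
function toDataUri(path: string, mimeType = 'image/png'): string {
  const base64 = readFileSync(path).toString('base64');
  return `data:${mimeType};base64,${base64}`;
}
```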
