Commit 135802a (1 parent: c6d6027) — 4 changed files with 176 additions and 1 deletion.
<!-- Autogenerated by weave; DO NOT EDIT -->

# Ollama plugin

The Ollama plugin provides interfaces to any of the local LLMs supported by
[Ollama](https://ollama.com/).

## Prerequisites

This plugin requires that you first install and run the Ollama server. You can
follow the instructions on the [Download Ollama](https://ollama.com/download)
page.

Use the Ollama CLI to download the models you are interested in. For example:

```posix-terminal
ollama pull gemma2
```

For development, you can run Ollama on your development machine. Deployed apps
usually run Ollama on a separate, GPU-accelerated machine from the app backend
that runs Genkit.

## Configuration

To use this plugin, call `ollama.Init()`, specifying the address of your Ollama
server:

```go
import "github.com/firebase/genkit/go/plugins/ollama"
```

```go
// Init with Ollama's default local address.
if err := ollama.Init(ctx, "http://127.0.0.1:11434"); err != nil {
	return err
}
```

## Usage

To generate content, you first need to create a model definition based on the
model you installed and want to use. For example, if you installed Gemma 2:

```go
model := ollama.DefineModel(
	ollama.ModelDefinition{
		Name: "gemma2",
		Type: "chat", // "chat" or "generate"
	},
	&ai.ModelCapabilities{
		Multiturn:  true,
		SystemRole: true,
		Tools:      false,
		Media:      false,
	},
)
```

Then, you can use the model reference to send requests to your Ollama server:

```go
genRes, err := model.Generate(ctx, ai.NewGenerateRequest(
	nil, ai.NewUserTextMessage("Tell me a joke.")), nil)
if err != nil {
	return err
}
```

See [Generating content](models.md) for more information.
# Ollama plugin

The Ollama plugin provides interfaces to any of the local LLMs supported by
[Ollama](https://ollama.com/).

## Prerequisites

This plugin requires that you first install and run the Ollama server. You can
follow the instructions on the [Download Ollama](https://ollama.com/download)
page.

Use the Ollama CLI to download the models you are interested in. For example:

```posix-terminal
ollama pull gemma2
```

For development, you can run Ollama on your development machine. Deployed apps
usually run Ollama on a separate, GPU-accelerated machine from the app backend
that runs Genkit.

## Configuration

To use this plugin, call `ollama.Init()`, specifying the address of your Ollama
server:

```go
import "github.com/firebase/genkit/go/plugins/ollama"
```

%include ../go/internal/doc-snippets/ollama.go init

## Usage

To generate content, you first need to create a model definition based on the
model you installed and want to use. For example, if you installed Gemma 2:

%include ../go/internal/doc-snippets/ollama.go definemodel

Then, you can use the model reference to send requests to your Ollama server:

%include ../go/internal/doc-snippets/ollama.go gen

See [Generating content](models.md) for more information.
// Copyright 2024 Google LLC
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package snippets

import (
	"context"

	"github.com/firebase/genkit/go/ai"
	"github.com/firebase/genkit/go/plugins/ollama"
)

func ollamaEx(ctx context.Context) error {
	var err error

	//!+init
	// Init with Ollama's default local address.
	if err := ollama.Init(ctx, "http://127.0.0.1:11434"); err != nil {
		return err
	}
	//!-init

	//!+definemodel
	model := ollama.DefineModel(
		ollama.ModelDefinition{
			Name: "gemma2",
			Type: "chat", // "chat" or "generate"
		},
		&ai.ModelCapabilities{
			Multiturn:  true,
			SystemRole: true,
			Tools:      false,
			Media:      false,
		},
	)
	//!-definemodel

	//!+gen
	genRes, err := model.Generate(ctx, ai.NewGenerateRequest(
		nil, ai.NewUserTextMessage("Tell me a joke.")), nil)
	if err != nil {
		return err
	}
	//!-gen

	_ = genRes

	return nil
}