ollama[major]: Integration package #5990

Merged: 17 commits, Jul 20, 2024
Changes from 12 commits
30 changes: 27 additions & 3 deletions docs/core_docs/docs/integrations/chat/ollama.mdx
@@ -13,14 +13,14 @@ For a complete list of supported models and model variants, see the [Ollama mode

## Setup

-Follow [these instructions](https://github.com/jmorganca/ollama) to set up and run a local Ollama instance.
+Follow [these instructions](https://github.com/jmorganca/ollama) to set up and run a local Ollama instance. Then, download the `@langchain/ollama` package.

import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx";

<IntegrationInstallTooltip></IntegrationInstallTooltip>

```bash npm2yarn
-npm install @langchain/community
+npm install @langchain/ollama
```

## Usage
@@ -30,6 +30,28 @@ import OllamaExample from "@examples/models/chat/integration_ollama.ts";

<CodeBlock language="typescript">{OllamaExample}</CodeBlock>

## Tools

Ollama now offers support for native tool calling. The example below demonstrates how you can invoke a tool from an Ollama model.

import OllamaToolsExample from "@examples/models/chat/integration_ollama_tools.ts";

<CodeBlock language="typescript">{OllamaToolsExample}</CodeBlock>

:::tip
You can see the LangSmith trace of the above example [here](https://smith.langchain.com/public/940f4279-6825-4d19-9653-4c50d3c70625/r)
:::

Since `ChatOllama` supports the `.bindTools()` method, you can also call `.withStructuredOutput()` to get structured output from the model.

import OllamaWSOExample from "@examples/models/chat/integration_ollama_wso.ts";

<CodeBlock language="typescript">{OllamaWSOExample}</CodeBlock>

:::tip
You can see the LangSmith trace of the above example [here](https://smith.langchain.com/public/ed113c53-1299-4814-817e-1157c9eac47e/r)
:::

## JSON mode

Ollama also supports a JSON mode that coerces model outputs to only return JSON. Here's an example of how this can be useful for extraction:
@@ -38,7 +60,9 @@ import OllamaJSONModeExample from "@examples/models/chat/integration_ollama_json

<CodeBlock language="typescript">{OllamaJSONModeExample}</CodeBlock>

-You can see a simple LangSmith trace of this here: https://smith.langchain.com/public/92aebeca-d701-4de0-a845-f55df04eff04/r
+:::tip
+You can see a simple LangSmith trace of this [here](https://smith.langchain.com/public/1fbd5660-b7fd-41c3-9d3a-a6ecc735277c/r)
+:::

## Multimodal models

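As an aside on the JSON-mode section above, here is a minimal sketch of what enabling it looks like with the new package. The `format: "json"` option mirrors Ollama's API; treat the model name and prompt as illustrative assumptions rather than text from this diff:

```typescript
import { ChatOllama } from "@langchain/ollama";

// Assumption: Ollama's `format: "json"` flag coerces output to valid JSON.
const model = new ChatOllama({
  model: "llama3", // illustrative model name
  format: "json",
});

const result = await model.invoke(
  `Translate "I love programming" into German, responding only with a JSON object with "original" and "translated" keys.`
);
console.log(result.content);
```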
8 changes: 7 additions & 1 deletion docs/core_docs/docs/integrations/chat/ollama_functions.mdx
@@ -4,6 +4,10 @@ sidebar_label: Ollama Functions

# Ollama Functions

:::tip
The LangChain Ollama integration package has official support for tool calling. [Click here to view the documentation](/docs/integrations/chat/ollama#tools).
:::

LangChain offers an experimental wrapper around open source models run locally via [Ollama](https://github.com/jmorganca/ollama)
that gives it the same API as OpenAI Functions.

@@ -48,7 +52,9 @@ import OllamaFunctionsExtraction from "@examples/models/chat/ollama_functions/ex

<CodeBlock language="typescript">{OllamaFunctionsExtraction}</CodeBlock>

-You can see a LangSmith trace of what this looks like here: https://smith.langchain.com/public/31457ea4-71ca-4e29-a1e0-aa80e6828883/r
+:::tip
+You can see a simple LangSmith trace of this [here](https://smith.langchain.com/public/74692bfc-0224-4221-b187-ddbf20d7ecc0/r)
+:::

## Customization

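For readers landing on the deprecated `OllamaFunctions` page, a hedged sketch of the migration path to the new package's native tool calling; the calculator tool is purely illustrative:

```typescript
import { ChatOllama } from "@langchain/ollama";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

// Illustrative tool; any structured tool works the same way.
const addTool = tool(async ({ a, b }) => String(a + b), {
  name: "add",
  description: "Add two numbers",
  schema: z.object({ a: z.number(), b: z.number() }),
});

// Previously you would wrap the model in OllamaFunctions and pass `functions`;
// the new ChatOllama supports .bindTools() directly.
const model = new ChatOllama({ model: "llama3-groq-tool-use" });
const modelWithTools = model.bindTools([addTool]);

const res = await modelWithTools.invoke("What is 2 + 3? Use the 'add' tool.");
console.log(res.tool_calls);
```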
2 changes: 1 addition & 1 deletion docs/core_docs/docs/tutorials/local_rag.ipynb
@@ -153,7 +153,7 @@
"metadata": {},
"outputs": [],
"source": [
"import { ChatOllama } from \"@langchain/community/chat_models/ollama\";\n",
"import { ChatOllama } from \"@langchain/ollama\";\n",
"\n",
"const ollamaLlm = new ChatOllama({\n",
" baseUrl: \"http://localhost:11434\", // Default value\n",
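As an aside, a self-contained version of the updated notebook cell (the model name here is a placeholder assumption; the rest of the cell is collapsed in this view):

```typescript
import { ChatOllama } from "@langchain/ollama";

const ollamaLlm = new ChatOllama({
  baseUrl: "http://localhost:11434", // Default value
  model: "llama3", // placeholder; the notebook's actual model is not shown in this diff
});
```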
1 change: 1 addition & 0 deletions examples/package.json
@@ -52,6 +52,7 @@
"@langchain/mistralai": "workspace:*",
"@langchain/mongodb": "workspace:*",
"@langchain/nomic": "workspace:*",
"@langchain/ollama": "workspace:*",
"@langchain/openai": "workspace:*",
"@langchain/pinecone": "workspace:*",
"@langchain/qdrant": "workspace:*",
2 changes: 1 addition & 1 deletion examples/src/models/chat/integration_ollama.ts
@@ -1,4 +1,4 @@
import { ChatOllama } from "@langchain/community/chat_models/ollama";
import { ChatOllama } from "@langchain/ollama";
import { StringOutputParser } from "@langchain/core/output_parsers";

const model = new ChatOllama({
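The remainder of `integration_ollama.ts` is collapsed above; a sketch of how the imported `StringOutputParser` is typically chained, with illustrative constructor options:

```typescript
import { ChatOllama } from "@langchain/ollama";
import { StringOutputParser } from "@langchain/core/output_parsers";

const model = new ChatOllama({
  baseUrl: "http://localhost:11434", // illustrative; this is Ollama's default
  model: "llama3", // illustrative model name
});

// Piping through StringOutputParser yields plain strings instead of AIMessage objects.
const chain = model.pipe(new StringOutputParser());
const text = await chain.invoke("Tell me a joke about bears.");
console.log(text);
```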
14 changes: 9 additions & 5 deletions examples/src/models/chat/integration_ollama_json_mode.ts
@@ -1,4 +1,4 @@
import { ChatOllama } from "@langchain/community/chat_models/ollama";
import { ChatOllama } from "@langchain/ollama";
import { ChatPromptTemplate } from "@langchain/core/prompts";

const prompt = ChatPromptTemplate.fromMessages([
@@ -25,8 +25,12 @@ const result = await chain.invoke({
console.log(result);

/*
-AIMessage {
-  content: '{"original": "I love programming", "translated": "Ich liebe das Programmieren"}',
-  additional_kwargs: {}
-}
+AIMessage {
+  content: '{\n' +
+    '"original": "I love programming",\n' +
+    '"translated": "Ich liebe Programmierung"\n' +
+    '}',
+  response_metadata: { ... },
+  usage_metadata: { ... }
+}
*/
2 changes: 1 addition & 1 deletion examples/src/models/chat/integration_ollama_multimodal.ts
@@ -1,4 +1,4 @@
import { ChatOllama } from "@langchain/community/chat_models/ollama";
import { ChatOllama } from "@langchain/ollama";
import { HumanMessage } from "@langchain/core/messages";
import * as fs from "node:fs/promises";

44 changes: 44 additions & 0 deletions examples/src/models/chat/integration_ollama_tools.ts
@@ -0,0 +1,44 @@
import { tool } from "@langchain/core/tools";
import { ChatOllama } from "@langchain/ollama";
import { z } from "zod";

const weatherTool = tool((_) => "Da weather is weatherin", {
  name: "get_current_weather",
  description: "Get the current weather in a given location",
  schema: z.object({
    location: z.string().describe("The city and state, e.g. San Francisco, CA"),
  }),
});

// Define the model
const model = new ChatOllama({
  model: "llama3-groq-tool-use",
});

// Bind the tool to the model
const modelWithTools = model.bindTools([weatherTool]);

const result = await modelWithTools.invoke(
  "What's the weather like today in San Francisco? Ensure you use the 'get_current_weather' tool."
);

console.log(result);
/*
AIMessage {
  "content": "",
  "tool_calls": [
    {
      "name": "get_current_weather",
      "args": {
        "location": "San Francisco, CA"
      },
      "type": "tool_call"
    }
  ],
  "usage_metadata": {
    "input_tokens": 177,
    "output_tokens": 30,
    "total_tokens": 207
  }
}
*/
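The `AIMessage` above only requests the tool call; actually running it is left to the caller. A sketch continuing from the example, assuming `@langchain/core`'s support for invoking a tool directly with a `ToolCall` (which returns a `ToolMessage`):

```typescript
// Continuing from the example above: route the model's tool call to the tool.
const toolCall = result.tool_calls?.[0];
if (toolCall) {
  // Invoking the tool with the ToolCall yields a ToolMessage that can be
  // appended to the message history for a follow-up model call.
  const toolMessage = await weatherTool.invoke(toolCall);
  console.log(toolMessage.content); // "Da weather is weatherin"
}
```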
25 changes: 25 additions & 0 deletions examples/src/models/chat/integration_ollama_wso.ts
@@ -0,0 +1,25 @@
import { ChatOllama } from "@langchain/ollama";
import { z } from "zod";

// Define the model
const model = new ChatOllama({
  model: "llama3-groq-tool-use",
});

// Define the tool schema you'd like the model to use.
const schema = z.object({
  location: z.string().describe("The city and state, e.g. San Francisco, CA"),
});

// Pass the schema to the withStructuredOutput method to bind it to the model.
const modelWithTools = model.withStructuredOutput(schema, {
  name: "get_current_weather",
});

const result = await modelWithTools.invoke(
  "What's the weather like today in San Francisco? Ensure you use the 'get_current_weather' tool."
);
console.log(result);
/*
{ location: 'San Francisco, CA' }
*/
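If you need the raw `AIMessage` alongside the parsed object, `withStructuredOutput` also accepts an `includeRaw` option in LangChain core; a short sketch continuing from the example above:

```typescript
// Sketch: includeRaw returns both the raw message and the parsed result.
const modelWithRaw = model.withStructuredOutput(schema, {
  name: "get_current_weather",
  includeRaw: true,
});

const rawAndParsed = await modelWithRaw.invoke(
  "What's the weather like today in San Francisco?"
);
console.log(rawAndParsed.parsed); // { location: 'San Francisco, CA' }
console.log(rawAndParsed.raw); // the underlying AIMessage with the tool call
```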
32 changes: 24 additions & 8 deletions examples/src/models/chat/ollama_functions/extraction.ts
@@ -44,20 +44,36 @@ const model = new OllamaFunctions({
});

// Use a JsonOutputFunctionsParser to get the parsed JSON response directly.
-const chain = await prompt.pipe(model).pipe(new JsonOutputFunctionsParser());
+const chain = prompt.pipe(model).pipe(new JsonOutputFunctionsParser());

const response = await chain.invoke({
  input:
    "Alex is 5 feet tall. Claudia is 1 foot taller than Alex and jumps higher than him. Claudia has orange hair and Alex is blonde.",
});

-console.log(response);
+console.log(JSON.stringify(response, null, 2));

/*
-{
-  people: [
-    { name: 'Alex', height: 5, hairColor: 'blonde' },
-    { name: 'Claudia', height: 6, hairColor: 'orange' }
-  ]
-}
+{
+  "people": [
+    {
+      "name": "Alex",
+      "height": 5,
+      "hairColor": "blonde"
+    },
+    {
+      "name": "Claudia",
+      "height": {
+        "$num": 1,
+        "add": [
+          {
+            "name": "Alex",
+            "prop": "height"
+          }
+        ]
+      },
+      "hairColor": "orange"
+    }
+  ]
+}
*/
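Note the updated example output above: the model returns a nested expression object for Claudia's height rather than the computed value 6, a reminder that local models may not strictly conform to the requested schema and that parsed outputs are worth validating.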
8 changes: 8 additions & 0 deletions libs/langchain-community/src/chat_models/ollama.ts
@@ -20,11 +20,19 @@ import {
type OllamaMessage,
} from "../utils/ollama.js";

/**
* @deprecated Deprecated in favor of the `@langchain/ollama` package. Import from `@langchain/ollama` instead.
*/
export interface ChatOllamaInput extends OllamaInput {}

/**
* @deprecated Deprecated in favor of the `@langchain/ollama` package. Import from `@langchain/ollama` instead.
*/
export interface ChatOllamaCallOptions extends BaseLanguageModelCallOptions {}

/**
* @deprecated Deprecated in favor of the `@langchain/ollama` package. Import from `@langchain/ollama` instead.
*
* A class that enables calls to the Ollama API to access large language
* models in a chat-like fashion. It extends the SimpleChatModel class and
* implements the OllamaInput interface.
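The deprecation notices above amount to a one-line import change for most users; a minimal before/after sketch (model name illustrative):

```typescript
// Before (deprecated, from @langchain/community):
// import { ChatOllama } from "@langchain/community/chat_models/ollama";

// After (new dedicated integration package):
import { ChatOllama } from "@langchain/ollama";

const model = new ChatOllama({ model: "llama3" }); // illustrative model name
```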
@@ -17,15 +17,24 @@ You must always select one of the above tools and respond with only a JSON objec
"tool_input": <parameters for the selected tool, matching the tool's JSON schema>
}}`;

/**
* @deprecated Deprecated in favor of the `@langchain/ollama` package. Import `ChatOllama` from `@langchain/ollama` instead.
*/
export interface ChatOllamaFunctionsCallOptions
extends BaseFunctionCallOptions {}

/**
* @deprecated Deprecated in favor of the `@langchain/ollama` package. Import `ChatOllama` from `@langchain/ollama` instead.
*/
export type OllamaFunctionsInput = Partial<ChatOllamaInput> &
  BaseChatModelParams & {
    llm?: ChatOllama;
    toolSystemPromptTemplate?: string;
  };

/**
* @deprecated Deprecated in favor of the `@langchain/ollama` package. Import `ChatOllama` from `@langchain/ollama` instead.
*/
export class OllamaFunctions extends BaseChatModel<ChatOllamaFunctionsCallOptions> {
llm: ChatOllama;

74 changes: 74 additions & 0 deletions libs/langchain-ollama/.eslintrc.cjs
@@ -0,0 +1,74 @@
module.exports = {
  extends: [
    "airbnb-base",
    "eslint:recommended",
    "prettier",
    "plugin:@typescript-eslint/recommended",
  ],
  parserOptions: {
    ecmaVersion: 12,
    parser: "@typescript-eslint/parser",
    project: "./tsconfig.json",
    sourceType: "module",
  },
  plugins: ["@typescript-eslint", "no-instanceof"],
  ignorePatterns: [
    ".eslintrc.cjs",
    "scripts",
    "node_modules",
    "dist",
    "dist-cjs",
    "*.js",
    "*.cjs",
    "*.d.ts",
  ],
  rules: {
    "no-process-env": 2,
    "no-instanceof/no-instanceof": 2,
    "@typescript-eslint/explicit-module-boundary-types": 0,
    "@typescript-eslint/no-empty-function": 0,
    "@typescript-eslint/no-shadow": 0,
    "@typescript-eslint/no-empty-interface": 0,
    "@typescript-eslint/no-use-before-define": ["error", "nofunc"],
    "@typescript-eslint/no-unused-vars": ["warn", { args: "none" }],
    "@typescript-eslint/no-floating-promises": "error",
    "@typescript-eslint/no-misused-promises": "error",
    camelcase: 0,
    "class-methods-use-this": 0,
    "import/extensions": [2, "ignorePackages"],
    "import/no-extraneous-dependencies": [
      "error",
      { devDependencies: ["**/*.test.ts"] },
    ],
    "import/no-unresolved": 0,
    "import/prefer-default-export": 0,
    "keyword-spacing": "error",
    "max-classes-per-file": 0,
    "max-len": 0,
    "no-await-in-loop": 0,
    "no-bitwise": 0,
    "no-console": 0,
    "no-restricted-syntax": 0,
    "no-shadow": 0,
    "no-continue": 0,
    "no-void": 0,
    "no-underscore-dangle": 0,
    "no-use-before-define": 0,
    "no-useless-constructor": 0,
    "no-return-await": 0,
    "consistent-return": 0,
    "no-else-return": 0,
    "func-names": 0,
    "no-lonely-if": 0,
    "prefer-rest-params": 0,
    "new-cap": ["error", { properties: false, capIsNew: false }],
  },
  overrides: [
    {
      files: ["**/*.test.ts"],
      rules: {
        "@typescript-eslint/no-unused-vars": "off",
      },
    },
  ],
};
7 changes: 7 additions & 0 deletions libs/langchain-ollama/.gitignore
@@ -0,0 +1,7 @@
index.cjs
index.js
index.d.ts
index.d.cts
node_modules
dist
.yarn
19 changes: 19 additions & 0 deletions libs/langchain-ollama/.prettierrc
@@ -0,0 +1,19 @@
{
  "$schema": "https://json.schemastore.org/prettierrc",
  "printWidth": 80,
  "tabWidth": 2,
  "useTabs": false,
  "semi": true,
  "singleQuote": false,
  "quoteProps": "as-needed",
  "jsxSingleQuote": false,
  "trailingComma": "es5",
  "bracketSpacing": true,
  "arrowParens": "always",
  "requirePragma": false,
  "insertPragma": false,
  "proseWrap": "preserve",
  "htmlWhitespaceSensitivity": "css",
  "vueIndentScriptAndStyle": false,
  "endOfLine": "lf"
}