community[minor],docs[minor]: Add ChromeAI chat model #5903

Merged (16 commits) on Jun 28, 2024
58 changes: 58 additions & 0 deletions docs/core_docs/docs/integrations/chat/chrome_ai.mdx
@@ -0,0 +1,58 @@
---
sidebar_label: ChromeAI
---

import CodeBlock from "@theme/CodeBlock";

# ChatChromeAI

:::info
This feature is **experimental** and is subject to change.
:::
:::note
The `Built-in AI Early Preview Program` by Google is currently in beta. To apply for access or find more information, please visit [this link](https://developer.chrome.com/docs/ai/built-in).
:::

ChatChromeAI leverages WebGPU and Gemini Nano to run LLMs directly in the browser, without the need for an internet connection.
This enables faster, private inference without data ever leaving the user's device.

## Getting started

Once you've been granted access to the program, follow all steps to download the model.

After the model has downloaded, you can start using `ChatChromeAI` in the browser as follows:

```typescript
import { ChatChromeAI } from "langchain/experimental/chat_models/chrome_ai";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatChromeAI({
temperature: 0.5, // Optional, defaults to 0.5
topK: 40, // Optional, defaults to 40
});

const message = new HumanMessage("Write me a short poem please");

const response = await model.invoke([message]);
console.log(response.content);
```

### Streaming

`ChatChromeAI` also supports streaming its response in chunks:

```typescript
import { AIMessageChunk } from "@langchain/core/messages";

let fullMessage: AIMessageChunk | undefined = undefined;
for await (const chunk of await model.stream([message])) {
if (!fullMessage) {
fullMessage = chunk;
} else {
fullMessage = fullMessage.concat(chunk);
}
console.log(fullMessage.content);
}
```
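The accumulation pattern above can be sketched in isolation. The following is a minimal mock (a stand-in class, not the real `AIMessageChunk`) that illustrates how `concat` merges streamed chunks into one full message:

```typescript
// Stand-in for AIMessageChunk that only merges string content,
// illustrating the accumulate-by-concat pattern used above.
class FakeChunk {
  constructor(public content: string) {}

  concat(other: FakeChunk): FakeChunk {
    return new FakeChunk(this.content + other.content);
  }
}

// Simulated stream yielding partial message content.
async function* fakeStream(): AsyncGenerator<FakeChunk> {
  for (const piece of ["Roses ", "are ", "red."]) {
    yield new FakeChunk(piece);
  }
}

async function main(): Promise<void> {
  let full: FakeChunk | undefined;
  for await (const chunk of fakeStream()) {
    full = full === undefined ? chunk : full.concat(chunk);
  }
  console.log(full?.content); // "Roses are red."
}

main();
```

The real `AIMessageChunk.concat` also merges metadata fields, but the content-joining behavior is the part the streaming loop relies on.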

We also provide a simple demo application that you can copy to start running `ChatChromeAI` in your browser right away.
Navigate to the [README.md](../../../../../langchain/src/experimental/chrome_ai/app/README.md) in the `./app` directory of the integration for more instructions.
1 change: 1 addition & 0 deletions environment_tests/test-exports-bun/src/entrypoints.js
@@ -68,6 +68,7 @@ export * from "langchain/experimental/babyagi";
export * from "langchain/experimental/generative_agents";
export * from "langchain/experimental/plan_and_execute";
export * from "langchain/experimental/chains/violation_of_expectations";
export * from "langchain/experimental/chat_models/chrome_ai";
export * from "langchain/experimental/masking";
export * from "langchain/experimental/prompts/custom_format";
export * from "langchain/evaluation";
1 change: 1 addition & 0 deletions environment_tests/test-exports-cf/src/entrypoints.js
@@ -68,6 +68,7 @@ export * from "langchain/experimental/babyagi";
export * from "langchain/experimental/generative_agents";
export * from "langchain/experimental/plan_and_execute";
export * from "langchain/experimental/chains/violation_of_expectations";
export * from "langchain/experimental/chat_models/chrome_ai";
export * from "langchain/experimental/masking";
export * from "langchain/experimental/prompts/custom_format";
export * from "langchain/evaluation";
1 change: 1 addition & 0 deletions environment_tests/test-exports-cjs/src/entrypoints.js
@@ -68,6 +68,7 @@ const experimental_babyagi = require("langchain/experimental/babyagi");
const experimental_generative_agents = require("langchain/experimental/generative_agents");
const experimental_plan_and_execute = require("langchain/experimental/plan_and_execute");
const experimental_chains_violation_of_expectations = require("langchain/experimental/chains/violation_of_expectations");
const experimental_chat_models_chrome_ai = require("langchain/experimental/chat_models/chrome_ai");
const experimental_masking = require("langchain/experimental/masking");
const experimental_prompts_custom_format = require("langchain/experimental/prompts/custom_format");
const evaluation = require("langchain/evaluation");
1 change: 1 addition & 0 deletions environment_tests/test-exports-esbuild/src/entrypoints.js
@@ -68,6 +68,7 @@ import * as experimental_babyagi from "langchain/experimental/babyagi";
import * as experimental_generative_agents from "langchain/experimental/generative_agents";
import * as experimental_plan_and_execute from "langchain/experimental/plan_and_execute";
import * as experimental_chains_violation_of_expectations from "langchain/experimental/chains/violation_of_expectations";
import * as experimental_chat_models_chrome_ai from "langchain/experimental/chat_models/chrome_ai";
import * as experimental_masking from "langchain/experimental/masking";
import * as experimental_prompts_custom_format from "langchain/experimental/prompts/custom_format";
import * as evaluation from "langchain/evaluation";
1 change: 1 addition & 0 deletions environment_tests/test-exports-esm/src/entrypoints.js
@@ -68,6 +68,7 @@ import * as experimental_babyagi from "langchain/experimental/babyagi";
import * as experimental_generative_agents from "langchain/experimental/generative_agents";
import * as experimental_plan_and_execute from "langchain/experimental/plan_and_execute";
import * as experimental_chains_violation_of_expectations from "langchain/experimental/chains/violation_of_expectations";
import * as experimental_chat_models_chrome_ai from "langchain/experimental/chat_models/chrome_ai";
import * as experimental_masking from "langchain/experimental/masking";
import * as experimental_prompts_custom_format from "langchain/experimental/prompts/custom_format";
import * as evaluation from "langchain/evaluation";
1 change: 1 addition & 0 deletions environment_tests/test-exports-vercel/src/entrypoints.js
@@ -68,6 +68,7 @@ export * from "langchain/experimental/babyagi";
export * from "langchain/experimental/generative_agents";
export * from "langchain/experimental/plan_and_execute";
export * from "langchain/experimental/chains/violation_of_expectations";
export * from "langchain/experimental/chat_models/chrome_ai";
export * from "langchain/experimental/masking";
export * from "langchain/experimental/prompts/custom_format";
export * from "langchain/evaluation";
1 change: 1 addition & 0 deletions environment_tests/test-exports-vite/src/entrypoints.js
@@ -68,6 +68,7 @@ export * from "langchain/experimental/babyagi";
export * from "langchain/experimental/generative_agents";
export * from "langchain/experimental/plan_and_execute";
export * from "langchain/experimental/chains/violation_of_expectations";
export * from "langchain/experimental/chat_models/chrome_ai";
export * from "langchain/experimental/masking";
export * from "langchain/experimental/prompts/custom_format";
export * from "langchain/evaluation";
4 changes: 4 additions & 0 deletions langchain/.gitignore
@@ -510,6 +510,10 @@ experimental/chains/violation_of_expectations.cjs
experimental/chains/violation_of_expectations.js
experimental/chains/violation_of_expectations.d.ts
experimental/chains/violation_of_expectations.d.cts
experimental/chat_models/chrome_ai.cjs
experimental/chat_models/chrome_ai.js
experimental/chat_models/chrome_ai.d.ts
experimental/chat_models/chrome_ai.d.cts
experimental/masking.cjs
experimental/masking.js
experimental/masking.d.ts
1 change: 1 addition & 0 deletions langchain/langchain.config.js
@@ -190,6 +190,7 @@ export const config = {
"experimental/plan_and_execute": "experimental/plan_and_execute/index",
"experimental/chains/violation_of_expectations":
"experimental/chains/violation_of_expectations/index",
"experimental/chat_models/chrome_ai": "experimental/chrome_ai/chat_models",
"experimental/masking": "experimental/masking/index",
"experimental/prompts/custom_format": "experimental/prompts/custom_format",
"experimental/prompts/handlebars": "experimental/prompts/handlebars",
13 changes: 13 additions & 0 deletions langchain/package.json
@@ -522,6 +522,10 @@
"experimental/chains/violation_of_expectations.js",
"experimental/chains/violation_of_expectations.d.ts",
"experimental/chains/violation_of_expectations.d.cts",
"experimental/chat_models/chrome_ai.cjs",
"experimental/chat_models/chrome_ai.js",
"experimental/chat_models/chrome_ai.d.ts",
"experimental/chat_models/chrome_ai.d.cts",
"experimental/masking.cjs",
"experimental/masking.js",
"experimental/masking.d.ts",
@@ -2075,6 +2079,15 @@
"import": "./experimental/chains/violation_of_expectations.js",
> Hey there! 👋 I noticed that this PR introduces a new dependency change related to the "chrome_ai" chat model. This comment is to flag the change for maintainers to review, as it impacts the project's dependencies. Great work, and looking forward to the review!

"require": "./experimental/chains/violation_of_expectations.cjs"
},
"./experimental/chat_models/chrome_ai": {
"types": {
"import": "./experimental/chat_models/chrome_ai.d.ts",
"require": "./experimental/chat_models/chrome_ai.d.cts",
"default": "./experimental/chat_models/chrome_ai.d.ts"
},
"import": "./experimental/chat_models/chrome_ai.js",
"require": "./experimental/chat_models/chrome_ai.cjs"
},
"./experimental/masking": {
"types": {
"import": "./experimental/masking.d.ts",
22 changes: 22 additions & 0 deletions langchain/src/experimental/chrome_ai/app/README.md
@@ -0,0 +1,22 @@
# ChatChromeAI

This is a simple application designed to run in the browser that uses WebGPU and Gemini Nano.
Gemini Nano is an LLM that Google Chrome embeds in the browser. As of 06/26/2024 it is still in beta. To request access or find more information, please visit [this link](https://developer.chrome.com/docs/ai/built-in).

## Getting Started

To run this application, you'll first need to build the local dependencies. From the root of the `langchain-ai/langchainjs` repo, run the following command:

```bash
yarn build --filter=langchain --filter=@langchain/openai
```

Once the dependencies are built, navigate into this directory (`langchain/src/experimental/chrome_ai/app`) and run the following commands:

```bash
yarn install # install the dependencies

yarn start # start the application
```

Then, open your browser and navigate to [`http://127.0.0.1:8080/src/chrome_ai.html`](http://127.0.0.1:8080/src/chrome_ai.html).
16 changes: 16 additions & 0 deletions langchain/src/experimental/chrome_ai/app/package.json
@@ -0,0 +1,16 @@
{
> Hey there! 👋 I noticed that this PR introduces changes to the dev and hard dependencies in the package.json file. I've flagged this for your review to ensure everything aligns with the project's requirements. Keep up the great work!

"name": "chrome_ai",
"packageManager": "yarn@3.4.1",
"scripts": {
"start": "rm -rf ./dist && yarn webpack && yarn http-server -c-1 -p 8080"
},
"devDependencies": {
"http-server": "^14.0.1",
"webpack": "^5.92.1",
"webpack-cli": "^5.1.4"
},
"dependencies": {
"@langchain/openai": "file:../../../../../libs/langchain-openai",
"langchain": "file:../../../../"
}
}
111 changes: 111 additions & 0 deletions langchain/src/experimental/chrome_ai/app/src/chrome_ai.html
@@ -0,0 +1,111 @@
<!DOCTYPE html>
<html>
<head>
<title>ChatChromeAI Example</title>
<style>
body {
font-family: Arial, sans-serif;
max-width: 800px;
margin: 0 auto;
padding: 20px;
background-color: #f0f0f0;
}
h1 {
color: #333;
text-align: center;
}
button {
background-color: #4caf50;
border: none;
color: white;
padding: 10px 20px;
text-align: center;
text-decoration: none;
display: inline-block;
font-size: 16px;
margin: 4px 2px;
cursor: pointer;
border-radius: 4px;
}
#destroyButton {
background-color: #f44336;
}
form {
background-color: white;
padding: 20px;
border-radius: 8px;
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
}
input[type="text"] {
width: 100%;
padding: 12px 20px;
margin: 8px 0;
box-sizing: border-box;
border: 2px solid #ccc;
border-radius: 4px;
}
#responseContainer {
background-color: white;
padding: 20px;
border-radius: 8px;
box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1);
margin-top: 20px;
}
.stats {
display: flex;
justify-content: space-around;
margin-top: 10px;
}
.stat-pill {
padding: 5px 10px;
border-radius: 20px;
font-size: 14px;
color: white;
}
</style>
</head>
<body>
<h1>LangChain.js🦜🔗 - ChatChromeAI Example</h1>

<button id="destroyButton">Destroy Model</button>

<form id="inputForm">
<label for="inputField">Enter your input:</label><br />
<input
type="text"
id="inputField"
name="inputField"
autocomplete="off"
/><br />
<button type="submit">Submit</button>
</form>

<div id="responseContainer">
<div id="responseText"></div>
<div id="statsContainer">
<div class="stats">
<span
class="stat-pill"
style="background-color: #3498db"
id="firstTokenTime"
>First Token: -- ms</span
>
<span
class="stat-pill"
style="background-color: #2ecc71"
id="totalTime"
>Total Time: -- ms</span
>
<span
class="stat-pill"
style="background-color: #e74c3c"
id="totalTokens"
>Total Tokens: --</span
>
</div>
</div>
</div>

<script src="../dist/bundle.js"></script>
</body>
</html>
68 changes: 68 additions & 0 deletions langchain/src/experimental/chrome_ai/app/src/index.js
@@ -0,0 +1,68 @@
import { ChatChromeAI } from "langchain/experimental/chat_models/chrome_ai";
import { encodingForModel } from "@langchain/core/utils/tiktoken";

const model = new ChatChromeAI();
const destroyButton = document.getElementById("destroyButton");
const inputForm = document.getElementById("inputForm");
const submitButton = inputForm.querySelector("button[type='submit']");

// Initialize the model when the page loads
window.addEventListener("load", async () => {
try {
await model.initialize();
destroyButton.disabled = false;
submitButton.disabled = false;
} catch (error) {
console.error("Failed to initialize model:", error);
alert("Failed to initialize model. Please try refreshing the page.");
}
});

destroyButton.addEventListener("click", () => {
model.destroy();
destroyButton.disabled = true;
submitButton.disabled = true;
});

inputForm.addEventListener("submit", async (event) => {
event.preventDefault();
const input = document.getElementById("inputField").value;
const humanMessage = ["human", input];

// Clear previous response
const responseTextElement = document.getElementById("responseText");
responseTextElement.textContent = "";

let fullMsg = "";
let timeToFirstTokenMs = 0;
let totalTimeMs = 0;
try {
const startTime = performance.now();
  for await (const chunk of await model.stream([humanMessage])) {
if (timeToFirstTokenMs === 0) {
timeToFirstTokenMs = performance.now() - startTime;
}
fullMsg += chunk.content;
// Update the response element with the new content
responseTextElement.textContent = fullMsg;
}
totalTimeMs = performance.now() - startTime;
} catch (error) {
console.error("An error occurred:", error);
responseTextElement.textContent = "An error occurred: " + error.message;
}

const encoding = await encodingForModel("gpt2");
const numTokens = encoding.encode(fullMsg).length;

// Update the stat pills
document.getElementById(
"firstTokenTime"
).textContent = `First Token: ${Math.round(timeToFirstTokenMs)} ms`;
document.getElementById("totalTime").textContent = `Total Time: ${Math.round(
totalTimeMs
)} ms`;
document.getElementById(
"totalTokens"
).textContent = `Total Tokens: ${numTokens}`;
});
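The timing bookkeeping in the submit handler above can be factored into a small standalone helper. This is an illustrative sketch (the names are not part of the PR); the injectable clock is an assumption added to make the logic easy to test outside the browser:

```typescript
// Illustrative helper mirroring the timing logic in the submit handler:
// records time to first chunk and total elapsed time while consuming
// an async iterable of message chunks.
interface StreamStats {
  timeToFirstTokenMs: number;
  totalTimeMs: number;
  text: string;
}

async function consumeWithStats(
  stream: AsyncIterable<{ content: string }>,
  now: () => number = () => performance.now() // injectable clock
): Promise<StreamStats> {
  const start = now();
  let timeToFirstTokenMs = 0;
  let text = "";
  for await (const chunk of stream) {
    if (timeToFirstTokenMs === 0) {
      // First chunk observed: record time to first token.
      timeToFirstTokenMs = now() - start;
    }
    text += chunk.content;
  }
  return { timeToFirstTokenMs, totalTimeMs: now() - start, text };
}
```

Separating the measurement from the DOM updates keeps the handler focused on rendering, and the fake-clock parameter lets the latency math be verified deterministically.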
10 changes: 10 additions & 0 deletions langchain/src/experimental/chrome_ai/app/webpack.config.js
@@ -0,0 +1,10 @@
const path = require("path");

module.exports = {
entry: "./src/index.js",
output: {
filename: "bundle.js",
path: path.resolve(__dirname, "dist"),
},
mode: "development",
};