
mistralai[patch]: Fix flaky test using callbacks #6001

Merged 2 commits on Jul 8, 2024. Showing changes from all commits.
File filter

Filter by extension

Filter by extension

Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
53 changes: 34 additions & 19 deletions libs/langchain-mistralai/src/tests/llms.int.test.ts
@@ -1,7 +1,12 @@
+/* eslint-disable no-process-env */
+
 import { test, expect } from "@jest/globals";
 import { CallbackManager } from "@langchain/core/callbacks/manager";
 import { MistralAI } from "../llms.js";
 
+// Save the original value of the 'LANGCHAIN_CALLBACKS_BACKGROUND' environment variable
+const originalBackground = process.env.LANGCHAIN_CALLBACKS_BACKGROUND;
+
 test("Test MistralAI", async () => {
   const model = new MistralAI({
     maxTokens: 5,

A review bot flagged the new `process.env` access on the `eslint-disable` line above:

> Hey there! 👋 I noticed that the recent change in this PR explicitly accesses an environment variable via process.env. This comment is to flag the change for maintainers to review. Let me know if you have any questions!
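The hunk above captures the variable's original value at module load so the test can put it back afterwards. The same save/override/restore pattern can be sketched generically; `withEnvOverride` and `EXAMPLE_CALLBACKS_FLAG` below are illustrative names, not part of this PR, which inlines the equivalent logic with `try`/`finally`:

```typescript
// A minimal sketch of the save/override/restore pattern this PR applies.
// `withEnvOverride` and `EXAMPLE_CALLBACKS_FLAG` are hypothetical names.
const KEY = "EXAMPLE_CALLBACKS_FLAG";

function withEnvOverride<T>(key: string, value: string, fn: () => T): T {
  const original = process.env[key]; // may be undefined if the variable is unset
  process.env[key] = value;
  try {
    return fn();
  } finally {
    // Restore the exact prior state, including "unset". Assigning `undefined`
    // to a process.env key stores the string "undefined" in Node, so a delete
    // is needed to truly unset it.
    if (original === undefined) {
      delete process.env[key];
    } else {
      process.env[key] = original;
    }
  }
}

const seen = withEnvOverride(KEY, "false", () => process.env[KEY]);
console.log(seen); // "false" inside the override
console.log(KEY in process.env); // false afterwards: the variable is unset again
```

One caveat worth noting: the PR's `finally` block assigns `originalBackground` back directly, so if the variable started out unset, Node will coerce that `undefined` into the string `"undefined"` rather than unsetting it; the `delete` branch above sidesteps that.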
@@ -75,27 +80,37 @@ test("Test MistralAI with signal in call options", async () => {
 }, 5000);
 
 test("Test MistralAI in streaming mode", async () => {
-  let nrNewTokens = 0;
-  let streamedCompletion = "";
+  // Running LangChain callbacks in the background will sometimes cause the callbackManager to execute
+  // after the test/llm call has already finished & returned. Set that environment variable to false
+  // to prevent that from happening.
+  process.env.LANGCHAIN_CALLBACKS_BACKGROUND = "false";
 
-  const model = new MistralAI({
-    maxTokens: 5,
-    model: "codestral-latest",
-    streaming: true,
-    callbacks: CallbackManager.fromHandlers({
-      async handleLLMNewToken(token: string) {
-        nrNewTokens += 1;
-        streamedCompletion += token;
-      },
-    }),
-  });
-  const res = await model.invoke(
-    "Log 'Hello world' to the console in javascript: "
-  );
-  console.log({ res }, "Test MistralAI in streaming mode");
+  try {
+    let nrNewTokens = 0;
+    let streamedCompletion = "";
+
+    const model = new MistralAI({
+      maxTokens: 5,
+      model: "codestral-latest",
+      streaming: true,
+      callbacks: CallbackManager.fromHandlers({
+        async handleLLMNewToken(token: string) {
+          nrNewTokens += 1;
+          streamedCompletion += token;
+        },
+      }),
+    });
+    const res = await model.invoke(
+      "Log 'Hello world' to the console in javascript: "
+    );
+    console.log({ res }, "Test MistralAI in streaming mode");
 
-  expect(nrNewTokens > 0).toBe(true);
-  expect(res).toBe(streamedCompletion);
+    expect(nrNewTokens > 0).toBe(true);
+    expect(res).toBe(streamedCompletion);
+  } finally {
+    // Reset the environment variable
+    process.env.LANGCHAIN_CALLBACKS_BACKGROUND = originalBackground;
+  }
 });
 
 test("Test MistralAI stream method", async () => {

(remaining lines of the file are collapsed in the diff view)