
LangSmith expose is not working with Azure OpenAI services #9438

Closed
mrcmoresi opened this issue Aug 18, 2023 · 4 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature Ɑ: models Related to LLMs or chat model modules 🤖:question A specific question about the codebase, product, project, or how to use a feature

Comments


mrcmoresi commented Aug 18, 2023

Issue you'd like to raise.

Hi everyone, I'm trying to deploy and use LangSmith locally.
I deployed it in a Docker container using:

langsmith start --expose --openai-api-key=<my Azure OpenAI key>

The Docker container looks healthy (screenshot omitted). I opened all the relevant ports to rule out any problem there; I'm running LangSmith on a remote machine.

I set up the following environment variables:
LANGCHAIN_TRACING_V2=true
LANGCHAIN_ENDPOINT=https://cc23-20-79-217-xxx.ngrok.io
LANGCHAIN_API_KEY=
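
(For completeness, a minimal sketch of how these variables can be set from Python before any chains are created; the values simply mirror the ones above, with the ngrok URL standing in for the self-hosted endpoint.)

import os

# Point LangChain tracing at the self-hosted LangSmith instance
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_ENDPOINT"] = "https://cc23-20-79-217-xxx.ngrok.io"  # self-hosted endpoint exposed via ngrok
os.environ["LANGCHAIN_API_KEY"] = ""  # left empty for the local deployment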

But the web interface does not load any projects (screenshot omitted).

When I try to access the LangSmith endpoint directly, it returns:

{
"detail": "Not Found"
}

Using the chat example from this repo:
https://github.com/langchain-ai/langsmith-cookbook/tree/main/feedback-examples/streamlit

I can see at https://cc23-20-79-217-xxx.ngrok.io that the runs are being tracked, but I can't see them in the frontend.

Debugging the frontend, it fails while fetching the tenants: it requests http://127.0.0.1:1984/tenants, whereas (if I understand correctly) it should request http://20.79.217.xxx:1984/tenants (screenshot omitted).
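
(For anyone reproducing this, a rough sketch that compares the two URLs from another machine; it assumes the backend listens on port 1984, as in the failing request above, and uses the requests library.)

import requests

# Compare the loopback URL the frontend actually calls with the externally
# reachable address of the remote machine (IP redacted as in the screenshots).
for base in ("http://127.0.0.1:1984", "http://20.79.217.xxx:1984"):
    try:
        resp = requests.get(f"{base}/tenants", timeout=5)
        print(base, resp.status_code)
    except requests.RequestException as exc:
        print(base, "unreachable:", exc)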

Could it be a problem with Azure OpenAI, or did I do something wrong with the installation?

Thanks in advance

Suggestion:

No response

@dosubot dosubot bot added Ɑ: models Related to LLMs or chat model modules 🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature 🤖:question A specific question about the codebase, product, project, or how to use a feature labels Aug 18, 2023

dosubot bot commented Aug 18, 2023

🤖

Hi there,

Thanks for reaching out and providing detailed information about your issue. Based on the information you've provided, it seems like the issue might not be directly related to Azure OpenAI, but rather to the setup of your environment variables or the LangSmith platform itself.

Firstly, ensure that you have correctly set up the environment variables LANGCHAIN_TRACING_V2, LANGCHAIN_PROJECT, LANGCHAIN_ENDPOINT, and LANGCHAIN_API_KEY. LANGCHAIN_TRACING_V2 should be set to "true", LANGCHAIN_PROJECT should be the name of your project, LANGCHAIN_ENDPOINT should be "https://api.smith.langchain.com", and LANGCHAIN_API_KEY should be your API key.

Here's an example of how you can set these variables in Python:

import os
from uuid import uuid4

unique_id = uuid4().hex[0:8]
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_PROJECT"] = f"Tracing Walkthrough - {unique_id}"
os.environ["LANGCHAIN_ENDPOINT"] = "https://api.smith.langchain.com"
os.environ["LANGCHAIN_API_KEY"] = ""  # Update to your API key

Secondly, make sure you have created a LangSmith account and generated an API key. You can do this at https://smith.langchain.com/.

Lastly, check if the LangSmith platform is currently operational. As mentioned in the guide, LangSmith is in closed beta and they are in the process of rolling it out to more users.

If you have checked all these and the issue persists, it might be a good idea to reach out to LangSmith support at support@langchain.dev for further assistance.

Regarding the "Not Found" error, a common mistake that could lead to this error is if the BACKEND_URL for the frontend service is not correctly set to the URL of the backend service. In this case, the frontend service would not be able to communicate with the backend service, leading to a "Not Found" error. Another possible issue could be if the PORT for the backend service is not correctly exposed or if the LANGCHAIN_ENV is not correctly set.
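
As a rough way to check whether the backend is reachable at the URL you configured (a sketch only; it assumes a recent version of the langsmith Python SDK and the LANGCHAIN_ENDPOINT / LANGCHAIN_API_KEY values exported above):

import os
from langsmith import Client

# Point the SDK at the self-hosted backend instead of the default cloud endpoint.
client = Client(
    api_url=os.environ["LANGCHAIN_ENDPOINT"],      # e.g. the ngrok URL of the deployment
    api_key=os.environ.get("LANGCHAIN_API_KEY"),   # may be empty for a local deployment
)

# If this call fails, the backend URL that clients (and the frontend's BACKEND_URL)
# point to is likely not reachable from outside the host.
print([p.name for p in client.list_projects()])

If the SDK can list projects but the web UI still cannot, the problem most likely lies in the frontend's backend URL configuration rather than in Azure OpenAI.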

I hope this helps! Let me know if you have any other questions or if there's anything else I can assist you with.

Best,
Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.


@sudevschiz

Hi @mrcmoresi, I am facing the same issue while trying to host the application on a GCP VM. Were you able to fix this?

@mrcmoresi
Author

Hi @sudevschiz, unfortunately I couldn't spend more time on it. In the coming weeks I'll try to find a solution to this problem.


dosubot bot commented Dec 26, 2023

Hi, @mrcmoresi,

I'm helping the LangChain team manage their backlog and am marking this issue as stale. From what I understand, you reported an issue with the Langsmith expose feature not working with Azure OpenAI services. You deployed Langsmith in a Docker container, set up the necessary environment variables, but the interface is not loading the projects. The frontend is failing to fetch tenants from the correct endpoint, and you suspect it may be a problem with Azure OpenAI or the installation process. Detailed troubleshooting steps were provided, and another user mentioned facing a similar issue while hosting the application in a GCP VM. You acknowledged the issue and expressed intent to spend more time finding a solution in the following weeks.

Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or the issue will be automatically closed in 7 days. Thank you!

@dosubot dosubot bot added the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Dec 26, 2023
@dosubot dosubot bot closed this as not planned (won't fix, can't repro, duplicate, stale) Jan 30, 2024
@dosubot dosubot bot removed the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Jan 30, 2024