
🐛 Bug Report: CohereInstrumentor does not instrument anything #2338

Open
1 task done
bbaabemhp opened this issue Nov 26, 2024 · 5 comments
Labels
bug Something isn't working

Comments

@bbaabemhp

Which component is this bug for?

Cohere Instrumentation

📜 Description

I use the CohereInstrumentor, but it does not create any spans for my trace.

👟 Reproduction steps

import logging

import cohere
from azure.monitor.opentelemetry import configure_azure_monitor
from opentelemetry import trace
from opentelemetry.instrumentation.cohere import CohereInstrumentor
from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor
from opentelemetry.instrumentation.langchain import LangchainInstrumentor
from opentelemetry.instrumentation.logging import LoggingInstrumentor
from opentelemetry.instrumentation.openai import OpenAIInstrumentor
from opentelemetry.instrumentation.psycopg import PsycopgInstrumentor
from opentelemetry.instrumentation.requests import RequestsInstrumentor
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter
from opentelemetry.instrumentation.httpx import HTTPXClientInstrumentor
from dataproduct.settings import settings
from opentelemetry.instrumentation.asyncio import AsyncioInstrumentor


def configure_tracing(app=None):
    if settings.APPLICATIONINSIGHTS_CONNECTION_STRING:
        configure_azure_monitor(
            connection_string=settings.APPLICATIONINSIGHTS_CONNECTION_STRING,
            enable_live_metrics=True,
            logger_name="dataproduct",
        )
        print("Config done")
    else:
        provider = TracerProvider()
        console_exporter = ConsoleSpanExporter()
        provider.add_span_processor(BatchSpanProcessor(console_exporter))
        trace.set_tracer_provider(provider)
    RequestsInstrumentor().instrument()
    HTTPXClientInstrumentor().instrument()
    AsyncioInstrumentor().instrument()
    CohereInstrumentor().instrument()
    OpenAIInstrumentor().instrument()
    # LangchainInstrumentor().instrument()
    PsycopgInstrumentor().instrument(enable_commenter=False, skip_dep_check=True)
    LoggingInstrumentor().instrument(
        set_logging_format=True,
        log_level=logging.DEBUG,
    )
    if app:
        FastAPIInstrumentor.instrument_app(app)
    return trace.get_tracer(
        __name__,
    )

_ = configure_tracing()

async def reranking(docs):
    co = cohere.AsyncClient(
        base_url=settings.COHERE_AZURE_ENDPOINT,
        api_key=settings.COHERE_AZURE_KEY,
    )
    rerank_response = await co.rerank(
        query=search.search,
        documents=sorted_data[:100],
        rank_fields=["chunk_text"],
        top_n=search.no_hits,
    )
    return rerank_response

👍 Expected behavior

It should produce spans for the Cohere rerank call.

👎 Actual Behavior with Screenshots

I only get the span from the HTTPXClientInstrumentor.


🤖 Python Version

3.11

📃 Provide any additional context for the Bug.

opentelemetry-instrumentation-cohere = "^0.33.12"

👀 Have you spent some time to check if this bug has been raised before?

  • I checked and didn't find similar issue

Are you willing to submit PR?

None

@dosubot dosubot bot added the bug Something isn't working label Nov 26, 2024

dosubot bot commented Nov 26, 2024

Hey there, @bbaabemhp! I'm here to help you with any issues you're experiencing. I can assist with bug fixes, answer questions, and guide you on becoming a contributor. Let me know how I can assist you!

The issue with the CohereInstrumentor not creating any spans might be due to the suppression of instrumentation. The code checks for two context keys, _SUPPRESS_INSTRUMENTATION_KEY and SUPPRESS_LANGUAGE_MODEL_INSTRUMENTATION_KEY, before proceeding with span creation. If either of these keys is set in the context, the instrumentation is suppressed, and no spans will be created. Ensure that these keys are not set in your context when you want spans to be created [1].
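The suppression pattern described above can be sketched as follows. This is a minimal illustration using only stdlib contextvars; the real instrumentation checks keys on the OpenTelemetry context, and the variable and span names here are illustrative stand-ins, not the library's actual objects.

```python
from contextvars import ContextVar

# Illustrative stand-in for the library's suppression context key.
_SUPPRESS_INSTRUMENTATION = ContextVar("suppress_instrumentation", default=False)

spans = []

def instrumented_call(payload):
    # When suppression is set in the current context, skip span creation
    # but still perform the underlying call.
    if _SUPPRESS_INSTRUMENTATION.get():
        return f"result:{payload}"
    spans.append(f"span:{payload}")
    return f"result:{payload}"

instrumented_call("a")                          # records a span
token = _SUPPRESS_INSTRUMENTATION.set(True)
instrumented_call("b")                          # suppressed: no span recorded
_SUPPRESS_INSTRUMENTATION.reset(token)
```

If some other library in the stack sets such a key around your Cohere calls, spans are silently skipped even though the instrumentor is installed.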

Additionally, verify that you have properly instrumented your application using the CohereInstrumentor as shown in the example usage:

from opentelemetry.instrumentation.cohere import CohereInstrumentor

CohereInstrumentor().instrument()

Also, check if the TRACELOOP_TRACE_CONTENT environment variable is set to false, as this could affect the logging of prompts, completions, and embeddings, although it should not prevent span creation [2].


@bbaabemhp
Author

BTW, the OpenAIInstrumentor isn't working either :/

@bbaabemhp
Author

Small addition: it seems to be a problem with the async clients of Cohere and OpenAI. With the sync clients, it works.
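A minimal sketch of why a sync-only patch would behave exactly like this: an instrumentor that wraps only the synchronous client method leaves the async path untraced. The class names mirror the Cohere SDK's `Client`/`AsyncClient` split, but they are stand-ins defined here, not the real library.

```python
import asyncio

spans = []

class Client:
    def rerank(self, query):
        return f"sync:{query}"

class AsyncClient:
    async def rerank(self, query):
        return f"async:{query}"

def instrument_sync_only():
    # Hypothetical instrumentor that patches only the sync method.
    original = Client.rerank
    def wrapper(self, query):
        spans.append("cohere.rerank")  # span recorded for sync calls only
        return original(self, query)
    Client.rerank = wrapper

instrument_sync_only()
Client().rerank("q")                     # traced: wrapper runs
asyncio.run(AsyncClient().rerank("q"))   # untraced: async method never wrapped
```

This matches the observed symptom: sync calls produce spans, async calls fall through to the unwrapped method.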

@nirga
Member

nirga commented Nov 27, 2024

Thanks @bbaabemhp!
We'll take a look at the Cohere instrumentation. It looks like you're right: we're indeed missing the async instrumentation.
Regarding OpenAI, we do have async instrumentation there. Can you paste a snippet of what you're trying to run?

@bbaabemhp
Author

Thanks for the fast reply!

So regarding OpenAI: I use the OpenAIInstrumentor together with the LangchainInstrumentor.

The problem is that both of them produce traces, even though I never call OpenAI directly in this method (Langchain itself works fine, though).

Here is a snippet of my code:

async def summarize_history(self, chat_history: list, question: str) -> ConversationSummary:
    prompt = get_system_prompt("summarize-history")
    model_name = prompt.config.get("model", None)

    model = self.get_model(model_name)
    structured_model = model.with_structured_output(ConversationSummary)

    langchain_prompt = ChatPromptTemplate.from_template(
        prompt.get_langchain_prompt()
    )

    langchain_prompt.metadata = {"langfuse_prompt": prompt}

    chain = langchain_prompt | structured_model

    langfuse_context.update_current_observation(
        input=chat_history, model=model_name
    )

    answer = await chain.ainvoke(
        {
            "history": chat_history,
            "question": question,
        },
        config={"callbacks": [langfuse_handler]},
    )

    return answer

I just make an asynchronous call with Langchain's ainvoke().


The problem is that the OpenAI calls made internally by Langchain get their own extra spans, which is not ideal. But it could also come from my HTTPXClientInstrumentor.
