
Instrumentation LlamaIndex #122

Closed
amaitammar opened this issue Sep 18, 2024 · 6 comments · Fixed by #128

@amaitammar

Hello,

I'm running into some difficulties with the LlamaIndex instrumentation.

We observe that the instrumentation works with query engines, but not with some other calls such as agent.chat, llm.predict_and_call, etc.
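
For contrast, here is a minimal sketch of a query-engine call that does get instrumented (illustrative only: it assumes a default LLM/embedding model is configured, e.g. via OPENAI_API_KEY, and the document text is a placeholder):

from llama_index.core import Document, VectorStoreIndex

# Build a tiny index over one placeholder document.
index = VectorStoreIndex.from_documents(
    [Document(text="Paris is the capital of France.")]
)
query_engine = index.as_query_engine()

# This call is picked up by lc.instrument_llamaindex() and its steps
# appear on Literal AI, unlike agent.chat / llm.predict_and_call.
response = query_engine.query("What is the capital of France?")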

Additionally, some threads produce errors.
(screenshot of the failing thread attached)

With the following message in the console:
console_error.txt

Environment
python=3.12
literalai==0.0.622
llama-index==0.11.10

Here is an example of the code whose steps I would like to log:

import os

from llama_index.core import Settings  # needed for Settings.llm below
from llama_index.core.llms import ChatMessage
from llama_index.core.tools import FunctionTool
from literalai import LiteralClient

lc = LiteralClient(
    api_key=os.environ["LITERAL_API_KEY"]  # key assumed to come from the environment
)

lc.instrument_llamaindex()


def multiply(a: float, b: float) -> float:
    """Multiply two numbers and return the product."""
    return a * b

def add(a: float, b: float) -> float:
    """Add two numbers and return the sum."""
    return a + b

tools = [
    FunctionTool.from_defaults(fn=multiply),
    FunctionTool.from_defaults(fn=add)
]

query = "What is 20+(2*4)? Use a tool to calculate every step."
chat_history = [
    ChatMessage(
        role="user",
        content=query)
]


# Log the user message and the tool-calling run inside a named thread.
with lc.thread(name="test instru") as thread:
    lc.message(chat_history[-1].content)
    with lc.run(thread_id=thread.id) as run:
        response = Settings.llm.predict_and_call(
            tools,
            chat_history=chat_history,
            error_on_no_tool_call=False,
        )

    lc.message(str(response))

Best regards,
Adrien

@desaxce
Contributor

desaxce commented Sep 18, 2024

Thanks @amait41 for the report, I'm looking into it.

@willydouhard
Contributor

Can you provide the thread JSON that is causing this issue? You can find it in the Network tab of the Chrome dev tools. Look for a GraphQL query named threadDetail. That would help us reproduce.

@amaitammar
Author

amaitammar commented Sep 24, 2024

Thanks for your help, here is the file:
graphql_query.txt
Let me know if I didn't provide the correct file :)

desaxce linked a pull request Sep 26, 2024 that will close this issue
@willydouhard
Contributor

This is great, thank you!

After inspecting the thread, we can see that there is a circular dependency between step c693d870-5cdc-4bfc-b14a-2766b5c8a7a8 of type retrieval and step d09ce25c-7e52-580f-8820-a9a97e2cb17c of type run.

This should never happen. Is this happening when using only the LlamaIndex instrumentation?

In the next release of the platform the UI will gracefully handle this edge case, but we also need to find the root cause.
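
To make the edge case concrete, here is a hedged sketch of how such a cycle can be spotted from a thread's steps (the id -> parent-id mapping is an assumption for illustration, not the platform's actual schema):

# Hypothetical sketch: steps maps step id -> parent step id (or None).
def has_cycle(steps: dict[str, str | None]) -> bool:
    for start in steps:
        seen = set()
        node = start
        while node is not None:
            if node in seen:
                return True  # a step is (transitively) its own ancestor
            seen.add(node)
            node = steps.get(node)
    return False

# In the thread above, the retrieval step and the run step reference
# each other, so has_cycle(...) would return True for their two ids.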

@desaxce
Contributor

desaxce commented Sep 29, 2024

> In the next release of the platform the UI will gracefully handle this edge case, but we also need to find the root cause.

I managed to reproduce the circular dependencies with a call to engine.query() when two calls to literalai_client.instrument_llamaindex() had been made beforehand. Enforced single instrumentation in this PR.
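
For illustration, a hedged sketch of the kind of guard that enforces single instrumentation (the flag and wrapper names are assumptions, not the SDK's actual internals; see the linked PR for the real change):

from literalai import LiteralClient

_already_instrumented = False  # hypothetical module-level flag

def instrument_llamaindex_once(client: LiteralClient) -> None:
    """Register the LlamaIndex handlers at most once; duplicate
    registration is what produced the circular step dependencies."""
    global _already_instrumented
    if _already_instrumented:
        return  # second and later calls are no-ops
    _already_instrumented = True
    client.instrument_llamaindex()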

@desaxce
Contributor

desaxce commented Oct 1, 2024

@amait41 We just released:

  • version 0.0.625 of the Python SDK: we updated the LlamaIndex instrumentation documentation to make explicit the Thread/Run/Generation entities you can expect to see on Literal AI
  • version v0.0.625-beta of our platform, which fixes the Thread display issues when the associated Steps have circular dependencies

On the latter point, the error came from a double instrumentation of LlamaIndex, which is no longer possible with the latest version of the SDK.
