Labels: bug (Something isn't working)
Description
Which component is this bug for?
VertexAI Instrumentation
📜 Description
Passing a list of messages to generate_content combines them all into a single string and stores/presents them as a single user prompt instead of keeping them separate.
👟 Reproduction steps
```python
import vertexai
from traceloop.sdk import Traceloop
from vertexai.generative_models import Content, GenerativeModel, Part

Traceloop.init()
vertexai.init()

model = GenerativeModel('gemini-1.5-flash-002')
response = model.generate_content(
    [
        Content(
            role='user',
            parts=[
                Part.from_text("What's 2+2?"),
            ],
        ),
        Content(
            role='assistant',
            parts=[
                Part.from_text('5'),
            ],
        ),
        Content(
            role='user',
            parts=[
                Part.from_text('really?'),
            ],
        ),
    ]
)
```

👍 Expected behavior
Attributes look something like:
```json
{
  "gen_ai.prompt.0.role": "user",
  "gen_ai.prompt.0.content": "What's 2+2?\n",
  "gen_ai.prompt.1.role": "assistant",
  "gen_ai.prompt.1.content": "5\n",
  "gen_ai.prompt.2.role": "user",
  "gen_ai.prompt.2.content": "really?\n",
  "gen_ai.completion.3.role": "assistant",
  "gen_ai.completion.3.content": "Oops! My apologies. 2 + 2 = 4. I'm still under development and learning to perform these calculations correctly.\n"
}
```

👎 Actual Behavior with Screenshots
Attributes look like:
```json
{
  "gen_ai.prompt.0.user": "role: \"user\"\nparts {\n  text: \"What\\'s 2+2?\"\n}\n\nrole: \"assistant\"\nparts {\n  text: \"5\"\n}\n\nrole: \"user\"\nparts {\n  text: \"really?\"\n}\n\n",
  "gen_ai.completion.0.role": "assistant",
  "gen_ai.completion.0.content": "Oops! My apologies. 2 + 2 = 4. I'm still under development and learning to perform these calculations correctly.\n"
}
```
🤖 Python Version
No response
📃 Provide any additional context for the Bug.
Lines 124 to 138 in d92a50e:

```python
def _set_input_attributes(span, args, kwargs, llm_model):
    if should_send_prompts() and args is not None and len(args) > 0:
        prompt = ""
        for arg in args:
            if isinstance(arg, str):
                prompt = f"{prompt}{arg}\n"
            elif isinstance(arg, list):
                for subarg in arg:
                    prompt = f"{prompt}{subarg}\n"
        _set_span_attribute(
            span,
            f"{SpanAttributes.LLM_PROMPTS}.0.user",
            prompt,
        )
```
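One possible direction for a fix (a sketch only, not the actual Traceloop implementation): instead of concatenating everything into one string under `gen_ai.prompt.0.user`, enumerate each `Content` in the list and record its role and text under its own `gen_ai.prompt.<i>.role` / `gen_ai.prompt.<i>.content` index. The minimal `Content`/`Part` classes and the plain dict below are hypothetical stand-ins for the `vertexai.generative_models` types and the `_set_span_attribute`/span machinery:

```python
# Hypothetical stand-ins for vertexai.generative_models.Part / Content.
class Part:
    def __init__(self, text):
        self.text = text

class Content:
    def __init__(self, role, parts):
        self.role = role
        self.parts = parts

def set_input_attributes(attributes, args):
    """Record each message under its own gen_ai.prompt.<i> index
    instead of concatenating all messages into one attribute."""
    index = 0
    for arg in args:
        if isinstance(arg, str):
            # A bare string prompt is treated as a single user message.
            attributes[f"gen_ai.prompt.{index}.role"] = "user"
            attributes[f"gen_ai.prompt.{index}.content"] = arg
            index += 1
        elif isinstance(arg, list):
            for content in arg:
                # Join the text of all parts belonging to this message.
                text = "".join(part.text for part in content.parts)
                attributes[f"gen_ai.prompt.{index}.role"] = content.role
                attributes[f"gen_ai.prompt.{index}.content"] = text
                index += 1

attrs = {}
set_input_attributes(attrs, [[
    Content("user", [Part("What's 2+2?")]),
    Content("assistant", [Part("5")]),
    Content("user", [Part("really?")]),
]])
```

With the reproduction input above, this yields separate `gen_ai.prompt.0` through `gen_ai.prompt.2` entries, matching the expected behavior.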
👀 Have you spent some time to check if this bug has been raised before?
- I checked and didn't find a similar issue
Are you willing to submit PR?
None