-
Related to the question above, we have managed to get to the prompt details by creating a new tool package externally that amends the AOI tool. Rather than creating an external tool, is it possible to create an inline tool in the same project?
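In case it helps, here is a minimal sketch of what an inline (non-packaged) tool could look like, assuming a script tool defined directly in the flow directory; the file name `log_prompt.py` and the function name are hypothetical:

```python
# log_prompt.py -- a script tool living in the same flow directory (hypothetical name).
# Referenced from flow.dag.yaml as a python node instead of being shipped as a package.
from promptflow import tool


@tool
def log_prompt(prompt: str) -> str:
    # Minimal inline tool: receives the rendered prompt from an upstream node
    # and passes it through; persistence/analysis logic would go here.
    print(f"prompt length: {len(prompt)} characters")
    return prompt
```

The node would then be referenced from the flow definition as a `python` node whose source points at this file, so no external package is required.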
-
@svenski It seems your ask is similar to this thread: #1079. Could you please elaborate on which part of the flow_artifacts you want? This data is mainly an internal data structure and may change at any time, so it's unlikely we could provide a public API to access all of the details. We would like to know more about your scenario and see if we can help.
-
We want to access the prompt request programmatically so we can store the prompt details for later analysis, cost tracking, and prompt statistics.
I can see that the request is stored in the flow_artifacts directory. Is there a good way of accessing it after each LLM node call? I'm considering wrapping the LLM (see https://github.com/microsoft/promptflow/blob/main/examples/tools/tool-package-quickstart/my_tool_package/tools/tool_with_custom_llm_type.py) -- would the request be accessible from there? I have not been able to find it in the API.
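For reference, here is a minimal sketch of that wrapping idea, loosely based on the custom LLM tool example linked above. It only renders the prompt and records it to a local JSONL file; the log path and the tool name are assumptions, and the actual model call is intentionally left out:

```python
import json
import time

from jinja2 import Template
from promptflow import tool
from promptflow.contracts.types import PromptTemplate

# Hypothetical location for captured prompts; adjust to your own storage.
PROMPT_LOG = "prompt_log.jsonl"


@tool
def logging_llm(prompt: PromptTemplate, **kwargs) -> str:
    # Render the prompt template with the node inputs, mirroring
    # tool_with_custom_llm_type.py from the tool-package-quickstart example.
    rendered = Template(prompt, trim_blocks=True, keep_trailing_newline=True).render(**kwargs)

    # Persist the rendered prompt for later cost/stat analysis.
    with open(PROMPT_LOG, "a", encoding="utf-8") as f:
        f.write(json.dumps({"timestamp": time.time(), "prompt": rendered}) + "\n")

    # The actual model call is omitted here; you would invoke your LLM
    # (e.g. via an Azure OpenAI connection) with `rendered` and return its output.
    return rendered
```

This captures the request at the point it is built rather than reading it back out of flow_artifacts, which the maintainers note is an internal structure.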