Which component is this feature for?
All Packages
🔖 Feature description
Add tracing support for Amazon SageMaker endpoints
🎤 Why is this feature needed?
Developers may deploy a variety of LLMs as endpoints on Amazon SageMaker. They'd query these endpoints using the InvokeEndpoint API to retrieve the LLM response.
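As a sketch of the usage pattern described above: a developer typically serializes a prompt payload and sends it to the endpoint with InvokeEndpoint. The endpoint name and payload shape below are hypothetical, and the actual network call is shown commented out since it requires AWS credentials and a deployed endpoint.

```python
import json

# Hypothetical prompt payload; real payload shape depends on the deployed model container.
payload = json.dumps({"inputs": "Summarize OpenTelemetry in one sentence."})

# The call a tracing instrumentation would need to capture (requires AWS credentials):
# import boto3
# client = boto3.client("sagemaker-runtime")
# response = client.invoke_endpoint(
#     EndpointName="my-llm-endpoint",      # hypothetical endpoint name
#     ContentType="application/json",
#     Body=payload,
# )
# completion = json.loads(response["Body"].read())

print(payload)
```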
✌️ How do you aim to achieve this?
Create a new opentelemetry-instrumentation-sagemaker package that instruments the sagemaker-runtime service and patches the invoke_endpoint method.

🔄️ Additional Information
No response
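The patching approach proposed under "How do you aim to achieve this?" can be sketched roughly as follows. This is a minimal illustration, not the eventual package: the tracer stand-in and the fake client class below are stubs invented for the example (a real implementation would use `opentelemetry.trace.get_tracer` and hook botocore's dynamically generated client), and the span and attribute names are assumptions.

```python
from contextlib import contextmanager

# Stand-in for an OpenTelemetry tracer; records spans in a list for illustration.
class _RecordingTracer:
    def __init__(self):
        self.spans = []

    @contextmanager
    def start_span(self, name, attributes=None):
        self.spans.append({"name": name, "attributes": attributes or {}})
        yield

# Stub standing in for the generated sagemaker-runtime client.
class FakeSageMakerRuntimeClient:
    def invoke_endpoint(self, EndpointName, Body, **kwargs):
        return {"Body": b'{"generated_text": "..."}'}

def patch_invoke_endpoint(client_cls, tracer):
    """Wrap invoke_endpoint so every call runs inside a span."""
    original = client_cls.invoke_endpoint

    def traced_invoke_endpoint(self, *args, **kwargs):
        endpoint = kwargs.get("EndpointName", "unknown")
        # Attribute name is an assumption for this sketch.
        with tracer.start_span(
            "sagemaker.invoke_endpoint",
            attributes={"aws.sagemaker.endpoint_name": endpoint},
        ):
            return original(self, *args, **kwargs)

    client_cls.invoke_endpoint = traced_invoke_endpoint

tracer = _RecordingTracer()
patch_invoke_endpoint(FakeSageMakerRuntimeClient, tracer)

client = FakeSageMakerRuntimeClient()
client.invoke_endpoint(EndpointName="my-llm-endpoint", Body=b"{}")
print(tracer.spans[0]["name"])  # prints: sagemaker.invoke_endpoint
```

The real package would likely register the wrapper through the instrumentation framework's `BaseInstrumentor` entry point rather than patching the class directly, so that instrumentation can be enabled and disabled cleanly.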
👀 Have you spent some time to check if this feature request has been raised before?
Are you willing to submit PR?
Yes I am willing to submit a PR!