
🚀 Feature: Tracing support for Amazon SageMaker endpoints #1660

Closed
bobbywlindsey opened this issue Jul 23, 2024 · 1 comment · Fixed by #2028

Comments

@bobbywlindsey
Contributor

Which component is this feature for?

All Packages

🔖 Feature description

Add tracing support for Amazon SageMaker endpoints

🎤 Why is this feature needed?

Developers may deploy a variety of LLMs as endpoints on Amazon SageMaker. They query these endpoints with the InvokeEndpoint API to retrieve LLM responses.
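To make the call pattern concrete, here is a minimal sketch of how a developer typically invokes a SageMaker endpoint. A stub client stands in for `boto3.client("sagemaker-runtime")` so the example runs without AWS credentials, and the endpoint name `my-llm-endpoint` is hypothetical; the call shape mirrors the real InvokeEndpoint API.

```python
import json

class StubSageMakerRuntime:
    """Stand-in for boto3's sagemaker-runtime client (no AWS needed)."""
    def invoke_endpoint(self, EndpointName, ContentType, Body):
        # A real endpoint would run model inference; we echo a canned reply.
        return {"Body": json.dumps({"generated_text": "Hello!"}).encode()}

client = StubSageMakerRuntime()
response = client.invoke_endpoint(
    EndpointName="my-llm-endpoint",
    ContentType="application/json",
    Body=json.dumps({"inputs": "Say hello"}).encode(),
)
result = json.loads(response["Body"])
print(result["generated_text"])  # Hello!
```

With the real client, `response["Body"]` is a `StreamingBody` that must be `.read()` before decoding; the stub returns raw bytes for simplicity.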

✌️ How do you aim to achieve this?

  1. Create an `opentelemetry-instrumentation-sagemaker` package.
  2. Create instrumentation for the `sagemaker-runtime` service and patch the `invoke_endpoint` method.
  3. Create a span for each request.
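The steps above can be sketched as follows. `FakeTracer` and `FakeSageMakerRuntime` are stand-ins so the sketch runs without opentelemetry or boto3 installed; the actual package would use the OpenTelemetry SDK and wrap the botocore client, and the span/attribute names here are illustrative, not the ones the PR ships.

```python
import functools

class FakeSpan:
    def __init__(self, name):
        self.name = name
        self.attributes = {}

    def set_attribute(self, key, value):
        self.attributes[key] = value

class FakeTracer:
    """Records spans in a list instead of exporting them."""
    def __init__(self):
        self.spans = []

    def start_span(self, name):
        span = FakeSpan(name)
        self.spans.append(span)
        return span

class FakeSageMakerRuntime:
    """Stand-in for the sagemaker-runtime client."""
    def invoke_endpoint(self, **kwargs):
        return {"Body": b'{"generated_text": "Hello!"}'}

def instrument_invoke_endpoint(client, tracer):
    """Patch invoke_endpoint so each call is recorded as a span (steps 2-3)."""
    original = client.invoke_endpoint

    @functools.wraps(original)
    def traced_invoke_endpoint(**kwargs):
        span = tracer.start_span("sagemaker.invoke_endpoint")
        # Record which endpoint was called; a real instrumentation would
        # also capture status and latency, and end the span on completion.
        span.set_attribute("sagemaker.endpoint_name", kwargs.get("EndpointName", ""))
        return original(**kwargs)

    client.invoke_endpoint = traced_invoke_endpoint

tracer = FakeTracer()
client = FakeSageMakerRuntime()
instrument_invoke_endpoint(client, tracer)
client.invoke_endpoint(EndpointName="my-llm", Body=b'{"inputs": "hi"}')
print(tracer.spans[0].name)  # sagemaker.invoke_endpoint
```

The wrap-and-reassign pattern shown here is the same shape most OpenTelemetry instrumentations use (often via the `wrapt` library) to intercept a library call without changing its signature.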

🔄️ Additional Information

No response

👀 Have you spent some time to check if this feature request has been raised before?

  • I checked and didn't find a similar issue

Are you willing to submit PR?

Yes I am willing to submit a PR!

@nirga
Member

nirga commented Jul 23, 2024

Yes @bobbywlindsey! We've been wanting to do this for a while! I see that you want to work on this, let me know how I can help!
