Commit dedf969

docs: Update Tracing Integrations to match standard format (GITBOOK-784)

mikeldking authored and gitbook-bot committed Aug 3, 2024
1 parent 04a8ea0 commit dedf969
Showing 12 changed files with 308 additions and 317 deletions.
4 changes: 2 additions & 2 deletions docs/tracing/integrations-tracing/README.md
@@ -4,8 +4,8 @@ Phoenix natively works with a variety of frameworks and SDKs across Python and J

## Python

<table data-full-width="true"><thead><tr><th width="176">Library</th><th width="299">Instrumentation</th><th width="125">Version</th></tr></thead><tbody><tr><td><a href="llamaindex.md">LlamaIndex</a></td><td><code>openinference-instrumentation-llama-index</code></td><td><a href="https://pypi.python.org/pypi/openinference-instrumentation-llama-index"><img src="https://camo.githubusercontent.com/f9b5663c14435cd2e280675aee8a86f23b1802679514ddbd9cd6d7b5e5d51a06/68747470733a2f2f696d672e736869656c64732e696f2f707970692f762f6f70656e696e666572656e63652d696e737472756d656e746174696f6e2d6c6c616d612d696e6465782e737667" alt="PyPI Version"></a></td></tr><tr><td><a href="langchain.md">LangChain</a></td><td><code>openinference-instrumentation-langchain</code></td><td><a href="https://pypi.python.org/pypi/openinference-instrumentation-langchain"><img src="https://camo.githubusercontent.com/17d2c9f2d42d6dd80a5e0defeed3d7d346444231761194d328e9f21b57c18eae/68747470733a2f2f696d672e736869656c64732e696f2f707970692f762f6f70656e696e666572656e63652d696e737472756d656e746174696f6e2d6c616e67636861696e2e737667" alt="PyPI Version"></a></td></tr><tr><td><a href="openai.md">OpenAI</a></td><td><code>openinference-instrumentation-openai</code></td><td><a href="https://pypi.python.org/pypi/openinference-instrumentation-openai"><img src="https://camo.githubusercontent.com/bb515c29aa0ef45bff47e0510f59ed6701c43457a90d574f537e43c24de9d80f/68747470733a2f2f696d672e736869656c64732e696f2f707970692f762f6f70656e696e666572656e63652d696e737472756d656e746174696f6e2d6f70656e61692e737667" alt="PyPI Version"></a></td></tr><tr><td><a href="mistralai.md">MistralAI</a></td><td><code>openinference-instrumentation-mistralai</code></td><td><a href="https://pypi.python.org/pypi/openinference-instrumentation-mistralai"><img src="https://img.shields.io/pypi/v/openinference-instrumentation-mistralai.svg" alt="PyPI Version"></a></td></tr><tr><td><a href="vertexai.md">VertexAI</a></td><td><code>openinference-instrumentation-vertexai</code></td><td><a href="https://pypi.python.org/pypi/openinference-instrumentation-vertexai"><img src="https://img.shields.io/pypi/v/openinference-instrumentation-vertexai.svg" alt="PyPI Version"></a></td></tr><tr><td><a href="dspy.md">DSPy</a></td><td><code>openinference-instrumentation-dspy</code></td><td><a href="https://pypi.python.org/pypi/openinference-instrumentation-dspy"><img src="https://camo.githubusercontent.com/414d13608ed7dd45f47e813034d6934993bcb49394a51910fa2f037efb4cd891/68747470733a2f2f696d672e736869656c64732e696f2f707970692f762f6f70656e696e666572656e63652d696e737472756d656e746174696f6e2d647370792e737667" alt="PyPI Version"></a></td></tr><tr><td><a href="bedrock.md">AWS Bedrock</a></td><td><code>openinference-instrumentation-bedrock</code></td><td><a href="https://pypi.python.org/pypi/openinference-instrumentation-bedrock"><img src="https://camo.githubusercontent.com/98735a9c821fdb27bf3c29ccf513af8de1fba8878bd6e424ee42f8c971df1afe/68747470733a2f2f696d672e736869656c64732e696f2f707970692f762f6f70656e696e666572656e63652d696e737472756d656e746174696f6e2d626564726f636b2e737667" alt="PyPI Version"></a></td></tr></tbody></table>
<table data-full-width="true"><thead><tr><th width="218">Library</th><th width="475">Instrumentation</th><th width="386">Version</th></tr></thead><tbody><tr><td><a href="llamaindex.md">LlamaIndex</a></td><td><code>openinference-instrumentation-llama-index</code></td><td><a href="https://pypi.python.org/pypi/openinference-instrumentation-llama-index"><img src="https://camo.githubusercontent.com/f9b5663c14435cd2e280675aee8a86f23b1802679514ddbd9cd6d7b5e5d51a06/68747470733a2f2f696d672e736869656c64732e696f2f707970692f762f6f70656e696e666572656e63652d696e737472756d656e746174696f6e2d6c6c616d612d696e6465782e737667" alt="PyPI Version"></a></td></tr><tr><td><a href="langchain.md">LangChain</a></td><td><code>openinference-instrumentation-langchain</code></td><td><a href="https://pypi.python.org/pypi/openinference-instrumentation-langchain"><img src="https://camo.githubusercontent.com/17d2c9f2d42d6dd80a5e0defeed3d7d346444231761194d328e9f21b57c18eae/68747470733a2f2f696d672e736869656c64732e696f2f707970692f762f6f70656e696e666572656e63652d696e737472756d656e746174696f6e2d6c616e67636861696e2e737667" alt="PyPI Version"></a></td></tr><tr><td><a href="openai.md">OpenAI</a></td><td><code>openinference-instrumentation-openai</code></td><td><a href="https://pypi.python.org/pypi/openinference-instrumentation-openai"><img src="https://camo.githubusercontent.com/bb515c29aa0ef45bff47e0510f59ed6701c43457a90d574f537e43c24de9d80f/68747470733a2f2f696d672e736869656c64732e696f2f707970692f762f6f70656e696e666572656e63652d696e737472756d656e746174696f6e2d6f70656e61692e737667" alt="PyPI Version"></a></td></tr><tr><td><a href="mistralai.md">MistralAI</a></td><td><code>openinference-instrumentation-mistralai</code></td><td><a href="https://pypi.python.org/pypi/openinference-instrumentation-mistralai"><img src="https://img.shields.io/pypi/v/openinference-instrumentation-mistralai.svg" alt="PyPI Version"></a></td></tr><tr><td><a href="vertexai.md">VertexAI</a></td><td><code>openinference-instrumentation-vertexai</code></td><td><a href="https://pypi.python.org/pypi/openinference-instrumentation-vertexai"><img src="https://img.shields.io/pypi/v/openinference-instrumentation-vertexai.svg" alt="PyPI Version"></a></td></tr><tr><td><a href="dspy.md">DSPy</a></td><td><code>openinference-instrumentation-dspy</code></td><td><a href="https://pypi.python.org/pypi/openinference-instrumentation-dspy"><img src="https://camo.githubusercontent.com/414d13608ed7dd45f47e813034d6934993bcb49394a51910fa2f037efb4cd891/68747470733a2f2f696d672e736869656c64732e696f2f707970692f762f6f70656e696e666572656e63652d696e737472756d656e746174696f6e2d647370792e737667" alt="PyPI Version"></a></td></tr><tr><td><a href="bedrock.md">AWS Bedrock</a></td><td><code>openinference-instrumentation-bedrock</code></td><td><a href="https://pypi.python.org/pypi/openinference-instrumentation-bedrock"><img src="https://camo.githubusercontent.com/98735a9c821fdb27bf3c29ccf513af8de1fba8878bd6e424ee42f8c971df1afe/68747470733a2f2f696d672e736869656c64732e696f2f707970692f762f6f70656e696e666572656e63652d696e737472756d656e746174696f6e2d626564726f636b2e737667" alt="PyPI Version"></a></td></tr><tr><td><a href="mistralai-1.md">Guardrails AI</a></td><td><code>openinference-instrumentation-guardrails</code></td><td></td></tr><tr><td><a href="haystack.md">Haystack</a></td><td><code>openinference-instrumentation-haystack</code></td><td></td></tr><tr><td><a href="mistralai-2.md">CrewAI</a></td><td><code>openinference-instrumentation-crewai</code></td><td></td></tr></tbody></table>

## JavaScript

<table data-full-width="true"><thead><tr><th>Library</th><th width="228">Instrumentation</th><th>Version</th></tr></thead><tbody><tr><td><a href="openai-node-sdk.md">OpenAI</a></td><td><code>@arizeai/openinference-instrumentation-openai</code></td><td><a href="https://www.npmjs.com/package/@arizeai/openinference-instrumentation-openai"><img src="https://camo.githubusercontent.com/e8d7d683994696e16d7620368f72a71929485bbfaad93848edfa813f631d53e2/68747470733a2f2f696d672e736869656c64732e696f2f6e706d2f762f406172697a6561692f6f70656e696e666572656e63652d696e737472756d656e746174696f6e2d6f70656e6169" alt="NPM Version"></a></td></tr><tr><td><a href="langchain.js.md">LangChainJS</a></td><td><code>@arizeai/openinference-instrumentation-langchain</code></td><td><a href="https://www.npmjs.com/package/@arizeai/openinference-instrumentation-langchain"><img src="../../.gitbook/assets/langchain-npm-version.png" alt=""></a></td></tr></tbody></table>
<table data-full-width="true"><thead><tr><th width="218">Library</th><th width="472">Instrumentation</th><th>Version</th></tr></thead><tbody><tr><td><a href="openai-node-sdk.md">OpenAI</a></td><td><code>@arizeai/openinference-instrumentation-openai</code></td><td><a href="https://www.npmjs.com/package/@arizeai/openinference-instrumentation-openai"><img src="https://camo.githubusercontent.com/e8d7d683994696e16d7620368f72a71929485bbfaad93848edfa813f631d53e2/68747470733a2f2f696d672e736869656c64732e696f2f6e706d2f762f406172697a6561692f6f70656e696e666572656e63652d696e737472756d656e746174696f6e2d6f70656e6169" alt="NPM Version"></a></td></tr><tr><td><a href="langchain.js.md">LangChainJS</a></td><td><code>@arizeai/openinference-instrumentation-langchain</code></td><td><a href="https://www.npmjs.com/package/@arizeai/openinference-instrumentation-langchain"><img src="../../.gitbook/assets/langchain-npm-version.png" alt=""></a></td></tr></tbody></table>
51 changes: 21 additions & 30 deletions docs/tracing/integrations-tracing/bedrock.md
@@ -8,59 +8,50 @@ description: >-

boto3 provides Python bindings to AWS services, including Bedrock, which gives access to a number of foundation models. Calls to these models can be instrumented using OpenInference, enabling OpenTelemetry-compliant observability of applications built using these models. Traces collected using OpenInference can be viewed in Phoenix.

## Install

```bash
pip install openinference-instrumentation-bedrock opentelemetry-exporter-otlp
```

## Setup

Set up [OpenTelemetry to point to a running Phoenix instance](https://docs.arize.com/phoenix/quickstart).
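If you don't yet have a tracer provider pointed at Phoenix, a minimal sketch looks like the following (this assumes a local Phoenix collector serving OTLP over HTTP at `http://127.0.0.1:6006`; see the quickstart for details):

```python
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Point the span exporter at the Phoenix server URL.
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(
    SimpleSpanProcessor(OTLPSpanExporter(endpoint="http://127.0.0.1:6006/v1/traces"))
)
trace_api.set_tracer_provider(tracer_provider)
```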

After starting a Phoenix server, instrument `boto3` prior to initializing a `bedrock-runtime` client. All clients created after instrumentation will send traces on all calls to `invoke_model`.

```python
import boto3
from openinference.instrumentation.bedrock import BedrockInstrumentor

BedrockInstrumentor().instrument()

# All clients created after instrumentation emit traces on calls to invoke_model.
session = boto3.session.Session()
client = session.client("bedrock-runtime")
```

## Run Bedrock

From here, you can run Bedrock as normal:

```python
import json

prompt = b'{"prompt": "Human: Hello there, how are you? Assistant:", "max_tokens_to_sample": 1024}'
response = client.invoke_model(modelId="anthropic.claude-v2", body=prompt)
response_body = json.loads(response.get("body").read())
print(response_body["completion"])
```

## Observe

Now that you have tracing set up, all calls to `invoke_model` will be streamed to your running Phoenix for observability and evaluation.
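From Python, you can also pull the collected spans into a dataframe for analysis. A small sketch, assuming the `arize-phoenix` package is installed and Phoenix is running locally on its default port:

```python
import phoenix as px

# Query the running Phoenix instance for the spans collected so far.
spans_df = px.Client().get_spans_dataframe()
print(spans_df.head())
```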

## Resources

* [Example notebook](https://github.com/Arize-ai/openinference/blob/main/python/instrumentation/openinference-instrumentation-bedrock/examples/bedrock\_example.py)
* [OpenInference package](https://github.com/Arize-ai/openinference/blob/main/python/instrumentation/openinference-instrumentation-bedrock)
* [Working examples](https://github.com/Arize-ai/openinference/blob/main/python/instrumentation/openinference-instrumentation-bedrock/examples)

83 changes: 58 additions & 25 deletions docs/tracing/integrations-tracing/dspy.md
@@ -8,43 +8,76 @@ description: Instrument and observe your DSPy application via the DSPyInstrument

Phoenix makes your DSPy applications observable by visualizing the underlying structure of each call to your compiled DSPy module.

## Install

```bash
pip install openinference-instrumentation-dspy dspy
```

## Setup

Set up [OpenTelemetry to point to a running Phoenix instance](https://docs.arize.com/phoenix/quickstart) and then initialize the DSPyInstrumentor before your application code.

```python
from openinference.instrumentation.dspy import DSPyInstrumentor

DSPyInstrumentor().instrument()
```

## Run DSPy

Now invoke your compiled DSPy module. Your traces should appear inside Phoenix.

```python
import dspy
from openinference.instrumentation import using_attributes


class BasicQA(dspy.Signature):
    """Answer questions with short factoid answers."""

    question = dspy.InputField()
    answer = dspy.OutputField(desc="often between 1 and 5 words")


if __name__ == "__main__":
    turbo = dspy.OpenAI(model="gpt-3.5-turbo")

    dspy.settings.configure(lm=turbo)

    # Optionally attach session, user, and prompt-template attributes to the emitted spans.
    with using_attributes(
        session_id="my-test-session",
        user_id="my-test-user",
        metadata={
            "test-int": 1,
            "test-str": "string",
            "test-list": [1, 2, 3],
            "test-dict": {
                "key-1": "val-1",
                "key-2": "val-2",
            },
        },
        tags=["tag-1", "tag-2"],
        prompt_template_version="v1.0",
        prompt_template_variables={
            "city": "Johannesburg",
            "date": "July 11th",
        },
    ):
        # Define the predictor.
        generate_answer = dspy.Predict(BasicQA)

        # Call the predictor on a particular input.
        pred = generate_answer(question="What is the capital of the united states?")
        print(f"Predicted Answer: {pred.answer}")
```
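The `using_attributes` context manager is optional: it tags every span emitted inside the block with session, user, metadata, tag, and prompt-template attributes, which you can then filter on in Phoenix.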

## Observe

Now that you have tracing set up, all predictions will be streamed to your running Phoenix for observability and evaluation.

![Traces and spans from an instrumented DSPy custom module.](https://storage.googleapis.com/arize-phoenix-assets/assets/docs/notebooks/dspy-tracing-tutorial/dspy\_spans\_and\_traces.gif)

## Resources

* [Example notebook](https://colab.research.google.com/github/Arize-ai/phoenix/blob/main/tutorials/tracing/dspy\_tracing\_tutorial.ipynb)
* [OpenInference package](https://github.com/Arize-ai/openinference/blob/main/python/instrumentation/openinference-instrumentation-dspy)
* [Working examples](https://github.com/Arize-ai/openinference/blob/main/python/examples/dspy-rag-fastapi)
40 changes: 21 additions & 19 deletions docs/tracing/integrations-tracing/haystack.md
@@ -4,35 +4,27 @@ description: Instrument LLM applications built with Haystack

# Haystack

Phoenix provides auto-instrumentation for [Haystack](https://haystack.deepset.ai/).

## Install

```bash
pip install openinference-instrumentation-haystack haystack-ai
```

## Setup

Set up [OpenTelemetry to point to a running Phoenix instance](https://docs.arize.com/phoenix/quickstart) and then initialize the HaystackInstrumentor before your application code.

```python
from openinference.instrumentation.haystack import HaystackInstrumentor

HaystackInstrumentor().instrument()
```

## Run Haystack

From here, you can set up your Haystack app as normal:

```python
from haystack import Pipeline

# … (pipeline construction collapsed in the diff view) …

pipeline.connect("prompt_builder", "llm")

# Define the question
question = "What is the location of the Hanging Gardens of Babylon?"
```
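To execute the pipeline, a minimal sketch might look like the following (the input mapping below is an assumption based on a typical Haystack 2.x prompt-builder pipeline; adjust it to your component names and template variables):

```python
# Run the pipeline; "prompt_builder" and "llm" are the components wired above.
result = pipeline.run({"prompt_builder": {"question": question}})
print(result["llm"]["replies"][0])
```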

## Observe

Now that you have tracing set up, all pipeline invocations will be streamed to your running Phoenix for observability and evaluation.

## Resources

* [Example notebook](https://github.com/Arize-ai/openinference/blob/main/python/instrumentation/openinference-instrumentation-haystack/examples/qa\_rag\_pipeline.py)
* [OpenInference package](https://github.com/Arize-ai/openinference/blob/main/python/instrumentation/openinference-instrumentation-haystack)
* [Working examples](https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-haystack/examples)
9 changes: 7 additions & 2 deletions docs/tracing/integrations-tracing/langchain.js.md
@@ -4,13 +4,13 @@

This module provides automatic instrumentation for LangChain.js (more specifically, the `@langchain/core` module), which may be used in conjunction with `@opentelemetry/sdk-trace-node`.

## Install

```bash
npm install --save @arizeai/openinference-instrumentation-langchain
```

## Setup

To load the LangChain instrumentation, manually instrument the `@langchain/core/callbacks/manager` module. The callbacks manager must be manually instrumented due to the non-traditional module structure in `@langchain/core`. Additional instrumentations can be registered as usual in the `registerInstrumentations` function.

```typescript
// … (imports and tracer-provider setup collapsed in the diff view) …

const lcInstrumentation = new LangChainInstrumentation();
lcInstrumentation.manuallyInstrument(CallbackManagerModule);
```

## Resources

* [Example project](https://github.com/Arize-ai/openinference/blob/main/js/packages/openinference-instrumentation-langchain/examples)
* [OpenInference package](https://github.com/Arize-ai/openinference/blob/main/js/packages/openinference-instrumentation-langchain)