
Release 3.31.0

@nr-swilloughby released this 27 Mar 20:08

3.31.0

Added

  • Integration packages to instrument AI model invocations (see below).
    • New package nrawsbedrock v1.0.0 introduced to instrument calls to the Amazon Bedrock Runtime Client API InvokeModel and InvokeModelWithResponseStream methods. It also provides a simple one-step method which performs streaming invocations and harvests the response stream data for you.
    • New package nropenai v1.0.0 introduced to instrument calls to OpenAI via the NRCreateChatCompletion, NRCreateChatCompletionStream, and NRCreateEmbedding functions (see the sketch after this list).
  • Dockerfile in the examples/server sample app which makes it easy to build a containerized, ready-to-run copy of the sample app for situations where that simplifies testing.
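
The snippet below is a minimal sketch of instrumenting a chat completion with the new nropenai package. Only NRCreateChatCompletion is named in these notes; the nropenai.NRNewClient constructor and the use of github.com/sashabaranov/go-openai request types are assumptions, so check the package documentation for exact signatures.

```go
package main

import (
	"fmt"
	"os"
	"time"

	"github.com/newrelic/go-agent/v3/integrations/nropenai"
	"github.com/newrelic/go-agent/v3/newrelic"
	openai "github.com/sashabaranov/go-openai"
)

func main() {
	// Create the agent application with AI monitoring enabled.
	app, err := newrelic.NewApplication(
		newrelic.ConfigAppName("OpenAI Example"), // placeholder app name
		newrelic.ConfigLicense(os.Getenv("NEW_RELIC_LICENSE_KEY")),
		newrelic.ConfigAIMonitoringEnabled(true),
	)
	if err != nil {
		panic(err)
	}
	app.WaitForConnection(10 * time.Second)

	// Assumed constructor name; wraps the OpenAI client for instrumentation.
	client := nropenai.NRNewClient(os.Getenv("OPENAI_API_KEY"))

	req := openai.ChatCompletionRequest{
		Model: openai.GPT3Dot5Turbo,
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: "Say hello."},
		},
	}

	// NRCreateChatCompletion performs the request and records AI monitoring data.
	resp, err := nropenai.NRCreateChatCompletion(client, req, app)
	if err != nil {
		panic(err)
	}
	fmt.Println(resp)

	app.Shutdown(10 * time.Second)
}
```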

Fixed

  • .Ignore was not ignoring the transaction. Fixes Issue #845.
  • Added nil error check in wrap function. Fixes Issue #862.
  • WrapBackgroundCore background logger was not sending logs to New Relic. Fixes Issue #859.
  • Corrected pgx5 integration example which caused a race condition. Thanks to @WillAbides! Fixes Issue #855.
  • Updated third-party library versions due to reported security or other supportability issues:
    • github.com/jackc/pgx/v5 to 5.5.4 in nrpgx5 integration
    • google.golang.org/protobuf to 1.33.0 in nrmicro and nrgrpc integrations
    • github.com/jackc/pgx/v4 to 4.18.2 in nrpgx integration

AI Monitoring Configuration

New configuration options specific to AI monitoring are available. These settings include:

  • AIMonitoring.Enabled, configured via ConfigAIMonitoringEnabled(bool) [default false]
  • AIMonitoring.Streaming.Enabled, configured via ConfigAIMonitoringStreamingEnabled(bool) [default true]
  • AIMonitoring.Content.Enabled, configured via ConfigAIMonitoringContentEnabled(bool) [default true]
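
For reference, here is a minimal sketch of enabling these options when creating the application; the application name is a placeholder and the license key is read from an environment variable. Streaming and content capture default to true and are shown explicitly only for completeness.

```go
package main

import (
	"os"

	"github.com/newrelic/go-agent/v3/newrelic"
)

func main() {
	// Enable AI monitoring at application creation time.
	app, err := newrelic.NewApplication(
		newrelic.ConfigAppName("AI Monitoring Example"), // placeholder name
		newrelic.ConfigLicense(os.Getenv("NEW_RELIC_LICENSE_KEY")),
		newrelic.ConfigAIMonitoringEnabled(true),
		newrelic.ConfigAIMonitoringStreamingEnabled(true),
		newrelic.ConfigAIMonitoringContentEnabled(true),
	)
	if err != nil {
		panic(err)
	}
	_ = app
}
```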

AI Monitoring Public API Methods

Two new AI-monitoring-related public API methods have been added as methods of the newrelic.Application value returned by newrelic.NewApplication.

AI Monitoring

New Relic AI monitoring is the industry’s first APM solution that provides end-to-end visibility into the key components of AI Large Language Model (LLM) applications. With AI monitoring, users can monitor, alert on, and debug AI-powered applications for reliability, latency, performance, security, and cost. AI monitoring also provides AI/LLM-specific insights (metrics, events, logs, and traces) which can easily be integrated to build advanced guardrails for enterprise security, privacy, and compliance.

AI monitoring offers custom-built insights and tracing for the complete lifecycle of an LLM’s prompts and responses, from raw user input to repaired/polished responses. AI monitoring provides built-in integrations with popular LLMs and components of the AI development stack. This release provides instrumentation for OpenAI and Bedrock.

When AI monitoring is enabled with ConfigAIMonitoringEnabled(true), the agent will capture AI/LLM-related data. This data will be visible under a new APM tab called AI Responses. See our AI Monitoring documentation for more details.

Support statement

We use the latest version of the Go language. At a minimum, you should use a version of Go no older than the oldest version still supported by the Go team themselves.
See the Go agent EOL Policy for details about supported versions of the Go agent and third-party components.