04.18.2025: Tracing for MCP Client-Server Applications 🔌

Available in Phoenix 8.26+

We're excited to announce a powerful capability in the OpenInference OSS library, openinference-instrumentation-mcp: seamless OTEL context propagation for MCP clients and servers.

What's New?

This release introduces automatic distributed tracing for Anthropic's Model Context Protocol (MCP). Using OpenTelemetry, you can now:

  • Propagate context across MCP client-server boundaries

  • Generate end-to-end traces of your AI system across services and languages

  • Gain full visibility into how models access and use external context

The openinference-instrumentation-mcp package handles this for you (a conceptual sketch follows this list) by:

  • Creating spans for MCP client operations

  • Injecting trace context into MCP requests

  • Extracting and continuing the trace context on the server

  • Associating the context with OTEL spans on the server side
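Conceptually, this is the same inject/extract pattern OTEL uses for HTTP headers, just carried inside MCP messages. The sketch below is illustrative only, not the package's internal code; the instrumentor performs these steps for you, and the use of the request's `_meta` field as the carrier is an assumption for the example.

```python
from opentelemetry import trace
from opentelemetry.propagate import inject, extract

tracer = trace.get_tracer("mcp-example")

# Client side: start a span and inject its context into a carrier that
# travels with the MCP request (here assumed to ride in the `_meta` field).
def call_tool_with_context(params: dict) -> dict:
    with tracer.start_as_current_span("mcp.client.call_tool"):
        carrier: dict = {}
        inject(carrier)  # writes traceparent/tracestate into the dict
        params.setdefault("_meta", {}).update(carrier)
        return params

# Server side: extract the carrier and continue the same trace.
def handle_tool_call(params: dict) -> None:
    ctx = extract(params.get("_meta", {}))  # rebuild the remote context
    with tracer.start_as_current_span("mcp.server.handle_tool", context=ctx):
        ...  # tool logic now runs inside the propagated, connected trace
```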

Setup

  1. Instrument both MCP client and server with OpenTelemetry.

  2. Add the openinference-instrumentation-mcp package.

  3. Spans will propagate across services, appearing as a single connected trace in Phoenix (see the setup sketch below).
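Here is a minimal sketch of steps 1 and 2, assuming Phoenix's phoenix.otel.register helper (from arize-phoenix-otel) and the MCPInstrumentor class exposed by openinference-instrumentation-mcp; the project name is a placeholder, and the same instrumentation calls run in both the client and the server process.

```python
# pip install arize-phoenix-otel openinference-instrumentation-mcp
from phoenix.otel import register
from openinference.instrumentation.mcp import MCPInstrumentor

# Point OTEL at your Phoenix collector; "mcp-demo" is a placeholder project name.
tracer_provider = register(project_name="mcp-demo")

# Patch MCP client/server calls so trace context is injected into outgoing
# requests and extracted on the receiving side automatically.
MCPInstrumentor().instrument(tracer_provider=tracer_provider)
```

Run this in both processes, then instrument the rest of your application as usual; the client and server spans will stitch together into one trace in Phoenix.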

Full example usage is available in the OpenInference repository.

A walkthrough video is also available.

Acknowledgments

Big thanks to Adrian Cole and Anuraag Agrawal for their contributions to this feature.
