Phoenix

© 2025 Arize AI

OpenInference


For an in-depth treatment of the OpenInference specification, please consult the spec

OpenInference is a specification for model inferences and LLM traces. It encompasses two data models:

Tracing

Captures the execution of an application that results in invocations of an LLM.
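To make the tracing data model concrete, here is a minimal sketch of what an exported OpenInference LLM span might look like. The attribute names (`openinference.span.kind`, `llm.model_name`, `input.value`, `output.value`) follow the OpenInference semantic conventions; the trace/span ids and payload values are illustrative placeholders, not output from a real instrumented application.

```python
import json
from datetime import datetime, timezone

# Sketch of a single OpenInference-style LLM span as a plain dict.
# Attribute keys follow the OpenInference semantic conventions; the
# ids and values below are placeholders for illustration only.
span = {
    "name": "llm_call",
    "context": {
        "trace_id": "0af7651916cd43dd8448eb211c80319c",  # placeholder id
        "span_id": "b7ad6b7169203331",                   # placeholder id
    },
    "start_time": datetime.now(timezone.utc).isoformat(),
    "attributes": {
        "openinference.span.kind": "LLM",
        "llm.model_name": "gpt-4o-mini",
        "input.value": "What is OpenInference?",
        "output.value": "A spec for model inferences and LLM traces.",
    },
}

# Serialize the span the way an exporter might, for inspection.
print(json.dumps(span, indent=2))
```

In practice these spans are produced automatically by OpenInference instrumentation libraries and exported over OpenTelemetry rather than constructed by hand; the dict above only shows the shape of the data.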

Inferences

Captures inference logs from a variety of model types and use cases.
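The inferences data model is tabular: one row per model prediction. The sketch below builds such a log with only the standard library; the column names (prediction id, timestamp, features, predicted and actual labels) reflect the kind of fields an inference log carries but are illustrative here, not the normative schema.

```python
import csv
import io

# Hedged sketch of an inference log: one row per prediction, with a
# prediction id, timestamp, input features, the model's prediction,
# and (when available) the ground-truth actual. Values are made up.
rows = [
    {"prediction_id": "p-001", "timestamp": "2025-01-01T00:00:00Z",
     "feature:age": 34, "feature:income": 52000,
     "prediction_label": "approved", "actual_label": "approved"},
    {"prediction_id": "p-002", "timestamp": "2025-01-01T00:05:00Z",
     "feature:age": 61, "feature:income": 18000,
     "prediction_label": "denied", "actual_label": "approved"},
]

# Write the log out as CSV, a common interchange format for such tables.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

Because actuals often arrive after predictions are made, the `actual_label` column may be filled in later by joining on the prediction id.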

https://github.com/Arize-ai/openinference