New Releases, Enhancements, + Changes

What's New

LLM Tracing in Arize

LLM Tracing is now available to all users in the Arize platform! With LLM traces and spans, you gain visibility into the discrete executions of your LLM-based applications. Instrument LLM tracing as part of a retrieval augmented generation (RAG) or agentic (e.g. LangChain or LlamaIndex) system to improve your ability to detect issues and troubleshoot them in an intuitive UI. Learn more →


Model Dashboards Tab Revamp

The model overview dashboards tab has gotten a new look! You can now find all supported templates for any model type in the new "Dashboard Templates" tab. Improvements include:

  • Search by dashboard name in the "All Dashboards" view

  • Easily find the default dashboard template for each model type

  • New detailed descriptions for each template

Python SDK v7.10.0

  • Added support for LLM spans in Python SDK batch ingestion

  • Multi-class model support now includes batch ingestion via Pandas

  • Support for larger schemas by sending the schema as metadata

  • Increased prediction ID limit to 512

Learn about Python SDK fixes and improvements here.
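To sketch what batch ingestion via Pandas looks like, the snippet below prepares a predictions DataFrame and trims prediction IDs to the new 512-character limit. The column names are assumptions, and the upload step is left as a comment so the snippet stays self-contained — see the SDK docs for exact usage.

```python
import pandas as pd

# Illustrative sketch only: column names are assumptions, and the actual
# Arize upload call is shown as a comment. See the SDK docs for exact usage.
records = pd.DataFrame({
    "prediction_id": ["user-123-" + "x" * 600],  # deliberately over-long ID
    "prediction_label": ["fraud"],
    "actual_label": ["not fraud"],
})

# Prediction IDs now support up to 512 characters; trim anything longer
# before sending the batch.
MAX_PREDICTION_ID_LEN = 512
records["prediction_id"] = records["prediction_id"].str.slice(0, MAX_PREDICTION_ID_LEN)

# With the arize package installed, the batch would then be sent roughly as:
# client = Client(space_key=..., api_key=...)
# client.log(dataframe=records, schema=..., model_id=..., model_type=...)
```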

The latest ebooks, self-guided course modules, and technical posts on topics like LLM evaluation and beyond:

  • RAG LLM: a roadmap to getting started

  • RAG Time: evaluate RAG with LLM evals and benchmarking

  • LLM Evaluations: everything you need to know

  • Interview: enterprise data strategy with Samsung Research America's Prashanth Rajendran
