How To: Annotations

Annotations are currently in Beta! Please reach out to support@arize.com to have annotations enabled in your space.

Annotations are custom labels that can be added to traces in LLM applications. AI engineers can use annotations to:

  • Hand-label / manually label data

  • Categorize spans or traces

  • Curate a dataset for experimentation

  • Log human feedback

User feedback is an important part of improving applications. Annotations enable teams to add their feedback to their trace data. Annotations can be added via the UI, with API support coming soon.

When are annotations used?

  • Find examples where LLM evals and human reviewers agree or disagree, to improve evals or flag cases for further review

  • Subject matter experts (e.g., doctors, legal experts, customer support specialists) are often needed to judge the quality of the application as a whole; their feedback complements automated evals

  • (Coming soon) Log feedback directly from an application (via API or latent label)

Adding Annotations

Annotations are labels that can be applied at the per-span level for LLM use cases. Each annotation is defined by a config that specifies its type (label or score); once defined, that config is available for any future annotation on the model.

Unstructured text annotations (notes) can also be added at any time.
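To make the config model above concrete, here is a minimal sketch of what an annotation record could look like. This is purely illustrative: the `Annotation` class and its fields are hypothetical assumptions, not Arize's actual API (which, per this doc, is not yet available).

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch only -- "Annotation" and its fields are illustrative,
# not part of Arize's actual (not-yet-released) annotations API.
@dataclass
class Annotation:
    span_id: str                   # the span this annotation is attached to
    name: str                      # annotation name, e.g. "correctness"
    label: Optional[str] = None    # categorical annotation, e.g. "correct"
    score: Optional[float] = None  # numeric annotation, e.g. 0.9
    note: Optional[str] = None     # unstructured free-text note

    def __post_init__(self):
        # per the config description, an annotation carries a label,
        # a score, or at minimum an unstructured note
        if self.label is None and self.score is None and self.note is None:
            raise ValueError("annotation needs a label, score, or note")

# Example: a human reviewer labels one span and attaches a note
ann = Annotation(
    span_id="span-123",
    name="correctness",
    label="correct",
    note="Verified against the source document.",
)
```

The label/score split mirrors the two config types described above: categorical hand-labels and numeric feedback scores, with notes available alongside either.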

Viewing Annotations on a Trace

Users can save and view annotations on a trace, and filter traces by annotation.

Coming soon: Annotations will also be available via API

Copyright © 2023 Arize AI, Inc