Feedback & Annotations
Phoenix supports adding Feedback and Annotations to your traces in a variety of ways, including generating LLM evaluation labels and logging human annotations.
To learn how to add human labels to your traces, either manually or programmatically, see Capture Feedback on Traces
To learn how to evaluate traces captured in Phoenix, see Evaluating Phoenix Traces
To learn how to upload your own evaluation labels into Phoenix, see Log Evaluation Results (a minimal sketch follows this list)
For more background on the concept of annotations, see Annotations
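As a quick illustration of uploading your own labels, the sketch below logs pre-computed evaluation results to Phoenix as a pandas DataFrame keyed by span ID. The span IDs, labels, and evaluation name here are hypothetical placeholders; it assumes a Phoenix instance is running and reachable from the client.

```python
import pandas as pd
import phoenix as px
from phoenix.trace import SpanEvaluations

# Hypothetical evaluation results: one row per span. The index name
# "context.span_id" tells Phoenix which span each row belongs to.
eval_df = pd.DataFrame(
    {
        "label": ["correct", "incorrect"],
        "score": [1, 0],
        "explanation": [
            "Answer matches the reference.",
            "Answer contradicts the retrieved context.",
        ],
    },
    index=pd.Index(
        ["7e2f08cb43bbbf97", "f8e5b2a1c3d4e6f0"], name="context.span_id"
    ),
)

# Attach the evaluations to the corresponding spans in Phoenix.
px.Client().log_evaluations(
    SpanEvaluations(eval_name="Q&A Correctness", dataframe=eval_df)
)
```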
Common types of feedback include:
LLM as a Judge generated evaluations
Code evaluations
End user feedback
Human labels / human annotations (see the sketch after this list)
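For the last two types, individual feedback can be attached to a span programmatically. The sketch below uses the annotations API of the Phoenix client; the span ID and annotation values are placeholders, and you would normally capture the span ID from the active span at request time. Exact method signatures may vary across Phoenix versions.

```python
from phoenix.client import Client

client = Client()  # assumes Phoenix is running at the default endpoint

# Record an end user's thumbs-up on a specific span.
# The span_id below is a hypothetical placeholder.
client.annotations.add_span_annotation(
    span_id="7e2f08cb43bbbf97",
    annotation_name="user feedback",
    annotator_kind="HUMAN",
    label="thumbs-up",
    score=1,
)
```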