Prompt Engineering
Iterate on low-performing prompt templates in Arize
Prompt analysis is an important component of troubleshooting your LLM's performance. Often, performance can be improved simply by comparing different prompt templates or iterating on the one you have.
Follow these steps to find prompt templates that need improvement, then iterate on and test new templates.
Select the evaluation metric you want to measure on the Performance Tracing page.
On the Table view, make sure the User Feedback and prompt_template_name columns are selected in the Primary Columns selector. Then sort the User Feedback column from lowest score to highest.
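If you have exported your traced data to a pandas DataFrame, you can reproduce this ranking offline. A minimal sketch follows; the column names (`prompt_template_name`, `user_feedback`) are assumptions mirroring the UI columns, not a fixed Arize schema.

```python
import pandas as pd

# Hypothetical export of traced LLM calls; column names mirror the
# UI columns and are assumptions, not a fixed Arize schema.
df = pd.DataFrame({
    "prompt_template_name": ["summarize_v1", "summarize_v2", "qa_v1", "summarize_v1"],
    "user_feedback": [0.2, 0.9, 0.7, 0.4],
})

# Average feedback per template, worst first -- the offline analogue
# of sorting the User Feedback column from lowest score to highest.
ranking = (
    df.groupby("prompt_template_name")["user_feedback"]
      .mean()
      .sort_values()
)
print(ranking)
```

Templates at the top of this ranking are the same ones that surface at the top of the sorted Table view.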

Click a row you want to investigate further to open the side panel.
The side panel shows the prompt template used, along with any other relevant variables, such as the prompt instruction (a sketch of this structure follows the screenshot below).

Prediction Details
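To make the side-panel terminology concrete, here is a minimal sketch of how a prompt template relates to its variables. The template text and variable names are illustrative, not taken from Arize.

```python
# A prompt template is the reusable scaffold; variables are the
# per-request values substituted into its placeholders.
TEMPLATE = (
    "You are a helpful assistant.\n"
    "Instruction: {prompt_instruction}\n"
    "Context: {context}\n"
    "Answer the user's question: {question}"
)

# Illustrative variables, as they might appear in the side panel.
variables = {
    "prompt_instruction": "Answer concisely and cite the context.",
    "context": "Arize supports prompt template tracking.",
    "question": "How do I compare prompt templates?",
}

prompt = TEMPLATE.format(**variables)
print(prompt)
```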
Click either "Edit Template" or the "Prompt Playground" tab to iterate on the template and compare responses.

Navigate to Prompt Playground
Edit the prompt variables, LLM parameters, or the prompt template itself, then re-run the prompt to compare responses before rolling out a new or revised template (see the sketch after the screenshot below).

Edit Prompt Template, Params, or Variables
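If you prefer scripting, the same edit-and-re-run loop can be reproduced in code. Here is a minimal sketch using the OpenAI Python SDK; the model name, templates, and parameter values are illustrative assumptions, not Arize's playground implementation.

```python
from openai import OpenAI  # pip install openai

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Two candidate templates to compare; both are illustrative.
templates = {
    "v1": "Summarize the following support ticket:\n{ticket}",
    "v2": "Summarize the following support ticket in two sentences, "
          "focusing on the customer's request:\n{ticket}",
}
variables = {"ticket": "My export job has been stuck at 90% since yesterday."}

for name, template in templates.items():
    response = client.chat.completions.create(
        model="gpt-4o-mini",   # assumed model; use whatever your playground uses
        temperature=0.2,       # an LLM parameter you might also vary
        messages=[{"role": "user", "content": template.format(**variables)}],
    )
    print(f"--- {name} ---")
    print(response.choices[0].message.content)
```

Run each variant over a handful of representative inputs before committing to a new template, since a single comparison can be noisy.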