Prompt Hub
The Prompt Hub is a centralized repository for managing, iterating on, and deploying prompt templates within the Arize platform. It serves as a collaborative workspace where users refine and store templates for a range of use cases, from production applications to experimentation.
Key features of the Prompt Hub include:
Template Management: Users can save templates directly from the Prompt Playground along with associated LLM parameters, function definitions, and metadata required to reproduce specific LLM calls.
Version Control: Every saved template supports versioning, enabling users to track updates, experiment with variations, and revert to previous versions if needed.
Collaboration and Reusability: Saved templates can be shared across teams, facilitating collaboration and consistency in production workflows. Templates can also be reloaded into the Prompt Playground or accessed via APIs for seamless integration into codebases and online tasks (see the sketch after this list).
Evaluation and Optimization: By saving outputs as experiments, users can compare templates, compute evaluation metrics, and analyze performance both quantitatively and qualitatively.
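For code-level access, a saved template can be pulled directly into an application. The snippet below is a minimal sketch of that pattern using a hypothetical client: `PromptHubClient`, `pull_prompt`, and the returned attributes are illustrative assumptions, not the documented Arize SDK surface.

```python
# Hypothetical client for illustration -- not the documented Arize SDK.
from prompt_hub_client import PromptHubClient  # assumed package/module name

client = PromptHubClient(
    space_id="YOUR_SPACE_ID",  # placeholder credentials
    api_key="YOUR_API_KEY",
)

# Pull the latest version of a saved prompt by name (assumed method).
prompt = client.pull_prompt("ticket-summarizer")

# The returned object is assumed to carry everything needed to
# reproduce the original LLM call: messages, model, and parameters.
messages = prompt.render(ticket_text="Customer cannot reset their password.")
print(prompt.model, prompt.parameters)
print(messages)
```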
The Prompt Hub provides a comprehensive solution for managing prompt workflows, ensuring traceability, reusability, and integration into production pipelines.
Any playground template can be saved to the Prompt Hub, which makes the hub especially valuable for production use cases and collaboration. The Prompt Hub stores the template together with the LLM parameters, function definitions, and other metadata necessary to reproduce the LLM call.
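To make that stored artifact concrete, here is a minimal, runnable sketch of the kind of record a saved template implies: the message template plus the model, parameters, and function (tool) definitions needed to replay the call. The field names are illustrative assumptions, not the platform's actual storage schema.

```python
from dataclasses import dataclass, field
from typing import Any

@dataclass
class SavedPrompt:
    """Illustrative record for a saved template; fields are assumptions."""
    name: str
    version: int
    template: list[dict[str, str]]  # chat messages containing {placeholders}
    model: str
    parameters: dict[str, Any]      # temperature, max_tokens, ...
    tools: list[dict[str, Any]] = field(default_factory=list)  # function definitions

    def render(self, **variables: str) -> list[dict[str, str]]:
        # Substitute template variables into each message.
        return [
            {"role": m["role"], "content": m["content"].format(**variables)}
            for m in self.template
        ]

prompt = SavedPrompt(
    name="ticket-summarizer",
    version=1,
    template=[
        {"role": "system", "content": "You summarize support tickets."},
        {"role": "user", "content": "Summarize: {ticket_text}"},
    ],
    model="gpt-4o-mini",
    parameters={"temperature": 0.2, "max_tokens": 256},
)

# Everything needed to reproduce the LLM call, assembled into one payload.
request = {
    "model": prompt.model,
    "messages": prompt.render(ticket_text="Customer cannot reset their password."),
    **prompt.parameters,
}
print(request)
```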
A saved prompt can be loaded into the Prompt Playground for additional iteration and version updates. This can be done directly from the prompt version view, the Prompt Hub listing page, or within the Prompt Playground itself, allowing for seamless editing and refinement.
Once the prompt is loaded into the Prompt Playground, the user can freely make changes and test the updated template on a dataset, following the same workflow described earlier. In the example below, the system template is modified to instruct the LLM to provide concise, one-sentence responses. Instead of creating a new prompt, the user updates the version of the existing prompt directly, preserving the connection to the original template from the Prompt Hub.
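As a sketch of what such a version update preserves, the snippet below bumps the version on the same prompt identity instead of creating a new prompt; the record layout and version-bump logic are assumptions for illustration, not the platform's internals.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class PromptVersion:
    """Illustrative version record; fields are assumptions."""
    prompt_name: str   # identity of the prompt in the Prompt Hub
    version: int
    system_template: str

v1 = PromptVersion(
    prompt_name="ticket-summarizer",
    version=1,
    system_template="You summarize support tickets.",
)

# Update the existing prompt rather than creating a new one: same
# identity, next version, with the system template tightened to
# force concise, one-sentence responses.
v2 = replace(
    v1,
    version=v1.version + 1,
    system_template="You summarize support tickets in exactly one concise sentence.",
)

assert v2.prompt_name == v1.prompt_name  # still linked to the original prompt
print(v1)
print(v2)
```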