By instrumenting the prompt template, users can take full advantage of the Arize prompt playground. You don't need to deploy a new template version to see whether changes to the prompt text or prompt variables have the intended effect. Instead, you can experiment with these changes in the playground UI.
We provide a setPromptTemplate function which allows you to set a template, version, and variables on context. You can use this utility in conjunction with context.with to set the active context. OpenInference will then pick up these attributes and add them to any spans created within the context.with callback. The components of a prompt template are:
| Field | Type | Example |
| --- | --- | --- |
| template | string | "Please describe the weather forecast for {city} on {date}" |
| version | string | "v1.0" |
| variables | Record<string, unknown> | {"city": "Johannesburg", "date": "July 11"} |
For Python, we provide a context manager (example below) to add a prompt template to the current OpenTelemetry Context. OpenInference will read this Context and pass the prompt template fields as span attributes, following the OpenInference semantic conventions. The context manager expects the following:

| Field | Type | Example |
| --- | --- | --- |
| template | str | "Please describe the weather forecast for {city} on {date}" |
| version | str | "v1.0" |
| variables | Dict[str, str] | {"city": "Johannesburg", "date": "July 11"} |

Refer to the code below for a working example: