Custom LLM Models

Arize supports any model that exposes an OpenAI-compatible API.

Add your custom model endpoints to begin accessing your model in Arize's prompt playground. Arize uses the OpenAI client to make calls to these endpoints.

Guide

Fill in the details of your endpoint

  • Name: the name you wish to give your endpoint. This is how it will appear in other areas of the UI, such as the prompt playground.

  • Model Name: the exact name of the provider's model.

  • Base URL: the URL of your custom endpoint. Arize uses the OpenAI client to call your endpoint, so the relevant paths are appended automatically: /chat/completions for chat and /completions for completions (see the sketch after this list).

  • API Key: the key used to access your OpenAI-compatible endpoint.

  • Headers: optional headers to add to your requests.
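To make the mapping concrete, here is a minimal sketch of how these fields correspond to an OpenAI-client call against a custom OpenAI-compatible endpoint. This is not Arize's internal code; the URL, API key, header, and model name below are placeholder values you would replace with your own.

```python
from openai import OpenAI

# Base URL and API Key fields: point the OpenAI client at your own endpoint.
client = OpenAI(
    base_url="https://my-model-host.example.com/v1",   # Base URL (placeholder)
    api_key="sk-my-custom-key",                        # API Key (placeholder)
    default_headers={"x-team": "ml-platform"},         # optional Headers (placeholder)
)

# Model Name field: the exact model name your provider expects.
response = client.chat.completions.create(
    model="my-provider-model",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```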

Your endpoint is now available in Arize's prompt playground.

You can open the prompt playground by choosing a generative LLM model that has prompts and responses and selecting a table row on the Performance Tracing tab, or by selecting Prompt Playground on an LLM span.

Choose the custom provider and the endpoint you've specified. Toggle the Use Chat switch on to use the chat endpoint, or off to use the legacy completions endpoint.
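For reference, the two toggle states correspond to the two OpenAI-style request paths. The sketch below (placeholder URL, key, and model name) shows the equivalent client calls.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://my-model-host.example.com/v1",  # placeholder Base URL
    api_key="sk-my-custom-key",                       # placeholder API Key
)

# Use Chat on  -> POST {base_url}/chat/completions
chat = client.chat.completions.create(
    model="my-provider-model",
    messages=[{"role": "user", "content": "Summarize this ticket."}],
)

# Use Chat off -> POST {base_url}/completions (legacy completions API)
legacy = client.completions.create(
    model="my-provider-model",
    prompt="Summarize this ticket.",
)
```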
