Trace Function Calls
Copyright © 2023 Arize AI, Inc
Function Calling enables LLMs to interact directly with external tools and APIs, making AI useful for real-world tasks like content generation and system operations. However, debugging function calls is challenging: workflows often chain multiple tools, errors can be opaque, and the LLM may pass incorrect parameters.
Arize streamlines this process by logging chat history and function calls to the platform with just a single line of code. Users can easily review function call traces in a human-readable format, load chats and calls into the prompt playground, and refine parameters to optimize workflows — all within a user-friendly UI.
Refer to the following QuickStart guide to get started with tracing function calls in OpenAI using auto-instrumentation.
Adding a single line of code to auto-instrument your OpenAI chat traces the entire chat history (including function calls) to the Arize platform. You can view your traces to debug the chat history and review whether the LLM called the appropriate functions with the expected parameters.
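As a minimal sketch of that setup, the snippet below assumes the `arize-otel` and `openinference-instrumentation-openai` packages are installed; the credential placeholders are hypothetical, and exact argument names may differ from your SDK version, so consult the QuickStart guide for the authoritative setup.

```python
# Sketch of auto-instrumenting OpenAI calls for Arize tracing.
# Assumes: `pip install arize-otel openinference-instrumentation-openai`
from arize.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

# Point an OpenTelemetry tracer provider at your Arize space.
# SPACE_ID / API_KEY below are placeholders for your own credentials.
tracer_provider = register(
    space_id="YOUR_SPACE_ID",
    api_key="YOUR_API_KEY",
    project_name="function-calling-demo",  # hypothetical project name
)

# The single line that instruments OpenAI chat calls, including function calls:
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```

After this runs, subsequent OpenAI chat completions (and any tool calls they contain) are exported as spans to the Arize platform without further code changes.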
The Function Output tab displays the functions and their arguments generated by the LLM, while the Function Definition tab shows the tools available to the LLM, defined in the OpenAI tools parameter.
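To make the two tabs concrete, here is a minimal, hypothetical example of what each one surfaces: a JSON-schema-style entry in the OpenAI tools parameter (what the Function Definition tab shows) and the arguments string a model emits when it calls that tool (what the Function Output tab shows). The `get_weather` tool is an illustrative name, not part of any API.

```python
import json

# Function Definition side: one entry in the OpenAI `tools` parameter.
# `get_weather` is a hypothetical tool used only for illustration.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

# Function Output side: the model returns tool-call arguments as a JSON string,
# which your application parses before invoking the real function.
raw_arguments = '{"city": "Paris"}'  # illustrative model output
parsed = json.loads(raw_arguments)
print(parsed["city"])  # -> Paris
```

Reviewing both sides together in the trace view is how you confirm the model picked the right tool and filled in parameters that match the declared schema.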
After tracing your application, you can load the span into the prompt playground to experiment with improving LLM performance. In the playground, you can replay the function call with new LLM parameters, adjust function definitions, modify chat messages, and more!
Stay tuned—we'll soon be adding function calling support for Gemini, Azure, Anthropic, and other models. We'll also add support for manual instrumentation of OpenAI function calls, complementing the existing auto-instrumentation feature.