LangFlow
LangFlow is an open-source visual framework that enables developers to rapidly design, prototype, and deploy custom applications powered by large language models (LLMs). Built on top of LangChain, LangFlow now integrates with Arize Phoenix so users can observe their LLM workflows. The integration gives developers granular visibility into the performance and behavior of their LangFlow applications: it captures detailed telemetry from LangFlow pipelines, letting teams identify bottlenecks, trace the flow of requests, debug issues faster, and keep their LLM-powered systems reliable and efficient.
Navigate to the LangFlow GitHub repo and clone the project locally.
In the repo root, create a `.env` file with all the Arize Phoenix variables. You can use the `.env.example` file as a template. Add the following environment variables to the `.env` file.
Note: This LangFlow integration is for Phoenix Cloud
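As a sketch, the Phoenix-related entries in `.env` look like the following. The values are placeholders, and the exact variable names can differ between LangFlow versions, so confirm them against the repo's `.env.example`:

```
# Arize Phoenix (Phoenix Cloud) settings -- placeholder values, check .env.example
PHOENIX_API_KEY=your-phoenix-api-key
```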
Start Docker Desktop, then build the images and run the containers (the first build takes around 10 minutes). In your terminal, change into the LangFlow directory and run the following commands.
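A hedged sketch of those commands, assuming the repo's default `docker-compose` setup (the compose file location varies between LangFlow versions, so check the repository's README first):

```shell
# From the LangFlow repository root, with Docker Desktop running:
if command -v docker >/dev/null; then
  docker compose build   # build the LangFlow images (slow on first run)
  docker compose up -d   # start the stack in the background
else
  echo "Docker is not installed or not on PATH"
fi
```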
We'll use the Simple Agent template for this tutorial.
Add your OpenAI Key to the Agent component in LangFlow
Go into the Playground and run the Agent
Navigate to your project in Phoenix Cloud (the project name should match your LangFlow Agent name):
https://app.phoenix.arize.com/
The AgentExecutor trace comes from Arize Phoenix instrumentation and captures what the LangChain code is doing as it runs inside the LangFlow components.
The other UUID trace is the native LangFlow tracing.