Portkey
Portkey is an AI Gateway and Control Panel that provides production-ready features for AI applications, including observability, reliability, and cost management. Learn how to instrument the Portkey SDK for comprehensive LLM tracing and monitoring in Arize.
Install the required packages for Portkey AI Gateway tracing:
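A minimal install sketch; the exact package names (`openinference-instrumentation-portkey`, `arize-otel`, `portkey-ai`) are assumptions based on Arize's OpenInference naming convention, so verify them against the current docs:

```shell
# Assumed package names -- verify against the current Arize/OpenInference docs
pip install openinference-instrumentation-portkey arize-otel portkey-ai
```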
Configure the PortkeyInstrumentor and tracer to send traces to Arize for LLM observability:
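A setup sketch, assuming the `arize-otel` `register` helper and the OpenInference `PortkeyInstrumentor`; the space ID, API key, and project name are placeholders you would replace with your own values:

```python
# Sketch of tracer setup, assuming the arize-otel `register` helper and the
# OpenInference PortkeyInstrumentor; credentials below are placeholders.
from arize.otel import register
from openinference.instrumentation.portkey import PortkeyInstrumentor

# Register an OpenTelemetry tracer provider that exports traces to Arize.
tracer_provider = register(
    space_id="YOUR_ARIZE_SPACE_ID",   # placeholder
    api_key="YOUR_ARIZE_API_KEY",     # placeholder
    project_name="portkey-tracing",   # hypothetical project name
)

# Instrument Portkey so every gateway call emits a trace span.
PortkeyInstrumentor().instrument(tracer_provider=tracer_provider)
```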
Test your Portkey integration with this example code and observe traces in Arize:
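A sketch of a test request, assuming the `portkey-ai` Python SDK and the tracer setup from the previous step; the API key, virtual key, and model name are placeholders:

```python
# Example request through Portkey's unified interface, assuming the
# portkey-ai Python SDK; keys and model name below are placeholders.
from portkey_ai import Portkey

client = Portkey(
    api_key="YOUR_PORTKEY_API_KEY",        # placeholder
    virtual_key="YOUR_PROVIDER_VIRTUAL_KEY",  # placeholder provider key
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; any model your provider supports
    messages=[{"role": "user", "content": "What is an AI gateway?"}],
)
print(response.choices[0].message.content)
```

Once instrumented, this call should appear as a trace in your Arize project.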
Start using your LLM application and monitor traces in Arize.
Arize provides comprehensive observability for Portkey's AI Gateway capabilities, automatically tracing:
Multiple Provider Calls: Track requests across different LLM providers (OpenAI, Anthropic, Cohere) through Portkey's unified interface
Provider Switching: Monitor seamless switching between AI providers
Cost Optimization: Track usage and costs across different LLM providers
Fallback and Retry Logic: Monitor automatic fallbacks and retry attempts when primary services fail
Load Balancing: Observe how requests are distributed across multiple models or providers
Latency Tracking: Monitor response times and performance metrics
Semantic Caching: Observe semantic-cache hits and misses to optimize costs
Request Deduplication: Track duplicate request handling
Performance Optimization: Identify bottlenecks and optimization opportunities
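The fallback and load-balancing behaviors listed above are driven by a Portkey gateway config. A minimal fallback sketch, assuming Portkey's documented `strategy`/`targets` config schema (the virtual keys are placeholders):

```python
# Hedged sketch of a Portkey gateway config enabling provider fallback;
# field names follow Portkey's config schema, virtual keys are placeholders.
fallback_config = {
    "strategy": {"mode": "fallback"},  # try targets in order until one succeeds
    "targets": [
        {"virtual_key": "openai-virtual-key"},     # primary provider (placeholder)
        {"virtual_key": "anthropic-virtual-key"},  # fallback provider (placeholder)
    ],
}
```

Passing such a config when constructing the Portkey client (e.g. `Portkey(config=fallback_config)`) lets the gateway handle retries and provider switching, and each attempt shows up as a span in Arize.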