Langfuse
Integrate Agno with Langfuse to send traces and gain insights into your agent’s performance.
Integrating Agno with Langfuse
Langfuse provides a robust platform for tracing and monitoring AI model calls. By integrating Agno with Langfuse, you can use OpenInference or OpenLIT instrumentation to send traces and gain insights into your agent’s performance.
Prerequisites
Install Dependencies
Ensure you have the necessary packages installed:
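The exact package set depends on which instrumentation route you choose below; a plausible install covering both paths (package names assumed from the OpenInference and OpenLIT ecosystems) is:

```shell
pip install agno openai langfuse opentelemetry-sdk opentelemetry-exporter-otlp openinference-instrumentation-agno openlit
```

If you only plan to use one of the two examples, you can drop the other's instrumentation package.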
Set Up a Langfuse Account
- Either self-host or sign up for an account at Langfuse.
- Obtain your public and secret API keys from the Langfuse dashboard.
Set Environment Variables
Configure your environment with the Langfuse API keys:
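A minimal setup might look like the following; the key values are placeholders, and the endpoint shown assumes the US data region (see the Data Regions note at the end of this page):

```shell
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
# OTLP endpoint for the US region; swap for EU or a local deployment as needed
export OTEL_EXPORTER_OTLP_ENDPOINT="https://us.cloud.langfuse.com/api/public/otel"
```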
Sending Traces to Langfuse
Example: Using Langfuse with OpenInference
This example demonstrates how to instrument your Agno agent with OpenInference and send traces to Langfuse.
Example: Using Langfuse with OpenLIT
This example demonstrates how to use Langfuse via OpenLIT to trace model calls.
Notes
- Environment Variables: Ensure your environment variables are correctly set for the API keys and OTLP endpoint.
- Data Regions: Adjust the `OTEL_EXPORTER_OTLP_ENDPOINT` for your data region or local deployment as needed. Available endpoints include:
  - `https://us.cloud.langfuse.com/api/public/otel` for the US region
  - `https://eu.cloud.langfuse.com/api/public/otel` for the EU region
  - `http://localhost:3000/api/public/otel` for local deployment
By following these steps, you can effectively integrate Agno with Langfuse, enabling comprehensive observability and monitoring of your AI agents.