# Integrating Agno with Langfuse
Langfuse provides a robust platform for tracing and monitoring AI model calls. By integrating Agno with Langfuse, you can use OpenInference or OpenLIT to send traces and gain insights into your agent's performance.

## Prerequisites
### Install Dependencies

Ensure you have the necessary packages installed:
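The exact package set depends on which instrumentation route you use below. A typical install covering both routes might look like this (the package names, in particular `openinference-instrumentation-agno`, are assumptions based on the usual PyPI distributions — check each project's docs for the current names):

```shell
pip install agno openai langfuse opentelemetry-sdk opentelemetry-exporter-otlp openinference-instrumentation-agno openlit
```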
### Set Up a Langfuse Account
- Either self-host or sign up for an account at Langfuse.
- Obtain your public and secret API keys from the Langfuse dashboard.
### Set Environment Variables

Configure your environment with the Langfuse API keys:
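Langfuse ingests traces over OpenTelemetry, so in addition to the two API keys you point the OTLP exporter at Langfuse and authenticate with a Basic-auth header built from those keys. A minimal sketch, assuming the US cloud endpoint and placeholder key values:

```python
import base64
import os

# Placeholder values; use the keys from your Langfuse dashboard.
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-your-public-key"
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-your-secret-key"

# Langfuse's OTLP endpoint uses Basic auth: public key as username, secret key as password.
LANGFUSE_AUTH = base64.b64encode(
    f"{os.environ['LANGFUSE_PUBLIC_KEY']}:{os.environ['LANGFUSE_SECRET_KEY']}".encode()
).decode()

# Pick the endpoint for your data region or local deployment (see the Notes section below).
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://us.cloud.langfuse.com/api/public/otel"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = f"Authorization=Basic {LANGFUSE_AUTH}"
```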
## Sending Traces to Langfuse

### Example: Using Langfuse with OpenInference
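A minimal sketch of this route, assuming the `openinference-instrumentation-agno` package and its `AgnoInstrumentor`, plus an `OpenAIChat` model (swap in whichever model and tools you actually use). It registers an OTLP span exporter, which picks up the endpoint and auth headers set in the prerequisites, and then instruments Agno so agent runs are traced:

```python
from agno.agent import Agent
from agno.models.openai import OpenAIChat
from openinference.instrumentation.agno import AgnoInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Route spans to the OTLP endpoint configured via the environment variables above.
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter()))
trace_api.set_tracer_provider(tracer_provider)

# Instrument Agno so agent and model calls are captured as spans.
AgnoInstrumentor().instrument()

agent = Agent(
    name="Langfuse Demo Agent",
    model=OpenAIChat(id="gpt-4o-mini"),
    markdown=True,
)
agent.print_response("What is Langfuse and why would I use it with an agent?")
```

After the run completes, the trace should appear in your Langfuse project's trace view.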
### Example: Using Langfuse with OpenLIT
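OpenLIT can reuse the same OTLP wiring. The sketch below assumes `openlit.init()` accepts a `tracer` and a `disable_batch` flag (check the OpenLIT docs for the current signature) and that the environment variables from the prerequisites are set:

```python
import openlit
from agno.agent import Agent
from agno.models.openai import OpenAIChat
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Same OTLP wiring as the OpenInference example: spans go to the Langfuse endpoint.
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter()))
trace_api.set_tracer_provider(tracer_provider)

# Initialize OpenLIT with this tracer so LLM and agent calls are auto-instrumented.
openlit.init(tracer=tracer_provider.get_tracer(__name__), disable_batch=True)

agent = Agent(model=OpenAIChat(id="gpt-4o-mini"), markdown=True)
agent.print_response("Summarize what traces Langfuse will receive for this run.")
```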
## Notes
- Environment Variables: Ensure your environment variables are correctly set for the API keys and OTLP endpoint.
- Data Regions: Adjust the `OTEL_EXPORTER_OTLP_ENDPOINT` for your data region or local deployment as needed. Available endpoints include:
  - `https://us.cloud.langfuse.com/api/public/otel` for the US region
  - `https://eu.cloud.langfuse.com/api/public/otel` for the EU region
  - `http://localhost:3000/api/public/otel` for local deployment