Use LiteLLM with Agno through an OpenAI-compatible proxy server. The `LiteLLMOpenAI` class connects to the LiteLLM proxy using an OpenAI-compatible interface:
The `LiteLLMOpenAI` class accepts the following parameters:
| Parameter | Type | Description | Default |
| --- | --- | --- | --- |
| `id` | `str` | Model identifier | `"gpt-4o"` |
| `name` | `str` | Display name for the model | `"LiteLLM"` |
| `provider` | `str` | Provider name | `"LiteLLM"` |
| `api_key` | `str` | API key (falls back to the `LITELLM_API_KEY` environment variable) | `None` |
| `base_url` | `str` | URL of the LiteLLM proxy server | `"http://0.0.0.0:4000"` |
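The defaults and the `api_key` fallback behavior in the table can be illustrated with a small stand-in sketch (this is not the real Agno class, only a stdlib mirror of the parameters above):

```python
import os
from dataclasses import dataclass
from typing import Optional


@dataclass
class ProxyModelConfig:
    """Illustrative stand-in mirroring LiteLLMOpenAI's documented parameters."""
    id: str = "gpt-4o"
    name: str = "LiteLLM"
    provider: str = "LiteLLM"
    api_key: Optional[str] = None
    base_url: str = "http://0.0.0.0:4000"

    def resolved_api_key(self) -> Optional[str]:
        # An explicit api_key wins; otherwise fall back to LITELLM_API_KEY.
        return self.api_key or os.getenv("LITELLM_API_KEY")


os.environ["LITELLM_API_KEY"] = "sk-demo"
cfg = ProxyModelConfig()
print(cfg.resolved_api_key())  # -> sk-demo (picked up from the environment)
```

Passing `api_key` explicitly takes precedence over the environment variable, which is useful when one process talks to several proxies with different credentials.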