LiteLLM OpenAI
Proxy Server Integration
LiteLLM can also be used as an OpenAI-compatible proxy server, allowing you to route requests to different models through a unified API.
Starting the Proxy Server
First, install LiteLLM with proxy support:
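```shell
pip install 'litellm[proxy]'
```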
Start the proxy server:
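For example, to serve a single model (this assumes the underlying provider's key, e.g. `OPENAI_API_KEY`, is already set in your environment):

```shell
# Serve gpt-4o through the proxy; LiteLLM listens on port 4000 by default
litellm --model gpt-4o
```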
Using the Proxy
The `LiteLLMOpenAI` class connects to the LiteLLM proxy using an OpenAI-compatible interface:
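A minimal sketch, assuming the `agno` package layout (`agno.models.litellm`) and a proxy running locally on the default port:

```python
from agno.agent import Agent
from agno.models.litellm import LiteLLMOpenAI

# Point the model at the proxy started above.
# api_key is omitted here, so it falls back to LITELLM_API_KEY.
agent = Agent(
    model=LiteLLMOpenAI(
        id="gpt-4o",
        base_url="http://0.0.0.0:4000",
    ),
    markdown=True,
)

agent.print_response("Share a two-sentence horror story.")
```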
Configuration Options
The `LiteLLMOpenAI` class accepts the following parameters:
| Parameter | Type | Description | Default |
|---|---|---|---|
| `id` | `str` | Model identifier | `"gpt-4o"` |
| `name` | `str` | Display name for the model | `"LiteLLM"` |
| `provider` | `str` | Provider name | `"LiteLLM"` |
| `api_key` | `str` | API key (falls back to the `LITELLM_API_KEY` environment variable) | `None` |
| `base_url` | `str` | URL of the LiteLLM proxy server | `"http://0.0.0.0:4000"` |
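Putting these together, a sketch that sets every parameter explicitly (the key value is a placeholder, not a real credential):

```python
from agno.models.litellm import LiteLLMOpenAI

model = LiteLLMOpenAI(
    id="gpt-4o",                     # model identifier routed by the proxy
    name="LiteLLM",                  # display name for the model
    provider="LiteLLM",              # provider name
    api_key="sk-...",                # placeholder; omit to fall back to LITELLM_API_KEY
    base_url="http://0.0.0.0:4000",  # URL of the LiteLLM proxy server
)
```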
Examples
Check out these examples in the cookbook:
Proxy Examples
View more examples here.