Many providers support the OpenAI API format. Use the OpenAILike model to access any of them by pointing base_url at the provider's endpoint.

Example

from os import getenv
from agno.agent import Agent
from agno.models.openai.like import OpenAILike

agent = Agent(
    model=OpenAILike(
        id="mistralai/Mixtral-8x7B-Instruct-v0.1",
        api_key=getenv("TOGETHER_API_KEY"),
        base_url="https://api.together.xyz/v1",
    )
)

# Print the response in the terminal
agent.print_response("Share a 2 sentence horror story.")

Parameters

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| id | str | "not-provided" | The id of the model to use |
| name | str | "OpenAILike" | The name of the model |
| provider | str | "OpenAILike" | The provider of the model |
| api_key | Optional[str] | "not-provided" | The API key for authentication |
| base_url | Optional[str] | None | The base URL for the API service |
| collect_metrics_on_completion | bool | False | Collect token metrics only from the final streaming chunk (for providers with cumulative token counts) |
OpenAILike extends the OpenAI-compatible interface and supports all parameters from OpenAIChat. Simply change the base_url and api_key to point to your preferred OpenAI-compatible service.

Responses API

For providers that implement the Open Responses API specification, use OpenResponses:
from agno.agent import Agent
from agno.models.openai import OpenResponses

agent = Agent(
    model=OpenResponses(
        id="your-model-id",
        base_url="https://your-provider.com/v1",
        api_key="your-api-key",
    ),
)

agent.print_response("Share a 2 sentence horror story.")
The Responses API is stateless by default: each request is independent, with no previous_response_id chaining. Where a provider has a dedicated model class, prefer it over the generic OpenResponses. See the OpenResponses reference for the full list of parameters.