Ollama
Run Large Language Models locally with Ollama
Ollama is a fantastic tool for running models locally.
Ollama supports multiple open-source models. See the library here.
We recommend experimenting to find the best-suited model for your use case. Here are some general recommendations:

- `llama3.3` models are good for most basic use cases.
- `qwen` models perform particularly well with tool use.
- `deepseek-r1` models have strong reasoning capabilities.
- `phi4` models are powerful while being very small in size.
Set up a model
Install ollama and run a model using

```shell
ollama run llama3.1
```
This gives you an interactive session with the model.
Alternatively, to download the model to be used in an Agno agent

```shell
ollama pull llama3.1
```
Example
After you have the model locally, use the Ollama model class to access it.
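A minimal sketch, assuming the `agno` package is installed and `llama3.1` has been pulled (the prompt is just an illustration):

```python
from agno.agent import Agent
from agno.models.ollama import Ollama

# Point the agent at the locally running Ollama model
agent = Agent(
    model=Ollama(id="llama3.1"),
    markdown=True,
)

# Print the response in the terminal
agent.print_response("Share a 2 sentence horror story.")
```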
View more examples here.
Params
| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| id | str | "llama3.1" | The ID of the model to use. |
| name | str | "Ollama" | The name of the model. |
| provider | str | "Ollama" | The provider of the model. |
| format | Optional[Any] | None | The format of the response. |
| options | Optional[Any] | None | Additional options to pass to the model. |
| keep_alive | Optional[Union[float, str]] | None | The keep-alive time for the model. |
| request_params | Optional[Dict[str, Any]] | None | Additional parameters to pass to the request. |
| host | Optional[str] | None | The host to connect to. |
| timeout | Optional[Any] | None | The timeout for the connection. |
| client_params | Optional[Dict[str, Any]] | None | Additional parameters to pass to the client. |
| client | Optional[OllamaClient] | None | A pre-configured instance of the Ollama client. |
| async_client | Optional[AsyncOllamaClient] | None | A pre-configured instance of the asynchronous Ollama client. |
| structured_outputs | bool | False | Whether to use structured outputs with this Model. |
| supports_structured_outputs | bool | True | Whether the Model supports structured outputs. |
`Ollama` is a subclass of the `Model` class and has access to the same params.
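As an illustrative sketch of the params above, here is one way to configure the model. The specific values are assumptions for demonstration: `http://localhost:11434` is Ollama's default local endpoint, and the `options` keys follow Ollama's own option names.

```python
from agno.agent import Agent
from agno.models.ollama import Ollama

agent = Agent(
    model=Ollama(
        id="llama3.1",
        host="http://localhost:11434",  # Ollama's default local endpoint (assumption)
        options={"temperature": 0.7},   # passed through to Ollama as model options
        keep_alive="5m",                # keep the model loaded between requests
    ),
)

agent.print_response("Summarize the benefits of running models locally.")
```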