The Ollama model provides access to locally hosted, open-source models.

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `id` | `str` | `"llama3.1"` | The ID of the model to use. |
| `name` | `str` | `"Ollama"` | The name of the model. |
| `provider` | `str` | `"Ollama"` | The provider of the model. |
| `format` | `Optional[Any]` | `None` | The format of the response. |
| `options` | `Optional[Any]` | `None` | Additional options to pass to the model. |
| `keep_alive` | `Optional[Union[float, str]]` | `None` | The keep-alive time for the model. |
| `request_params` | `Optional[Dict[str, Any]]` | `None` | Additional parameters to pass to the request. |
| `host` | `Optional[str]` | `None` | The host to connect to. |
| `timeout` | `Optional[Any]` | `None` | The timeout for the connection. |
| `client_params` | `Optional[Dict[str, Any]]` | `None` | Additional parameters to pass to the client. |
| `client` | `Optional[OllamaClient]` | `None` | A pre-configured instance of the Ollama client. |
| `async_client` | `Optional[AsyncOllamaClient]` | `None` | A pre-configured instance of the asynchronous Ollama client. |
| `structured_outputs` | `bool` | `False` | Whether to use structured outputs with this model. |
| `supports_structured_outputs` | `bool` | `True` | Whether the model supports structured outputs. |
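
For illustration, here is a minimal sketch of configuring the model with a few of the parameters above. The import path is an assumption and may differ in your installation; the parameter names come from the table, and `http://localhost:11434` is Ollama's default local endpoint.

```python
# Minimal sketch; the import path below is an assumption — adjust to your setup.
from agno.models.ollama import Ollama

model = Ollama(
    id="llama3.1",                  # ID of the model served by the local Ollama instance
    host="http://localhost:11434",  # Ollama's default local endpoint
    options={"temperature": 0.7},   # additional options passed to the model
    keep_alive="5m",                # keep the model loaded for 5 minutes after a request
)
```

Passing a pre-configured `client` or `async_client` is an alternative to `host`/`client_params` when you need full control over how the underlying Ollama client is constructed.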