`OllamaEmbedder` embeds text into vectors locally using Ollama.

The embedding model runs locally through Ollama. This example uses `openhermes`, so install Ollama and run `ollama pull openhermes` in your terminal before running the code.
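The setup step above can be sketched as the following terminal commands (assuming Ollama is already installed and running on your machine):

```shell
# Download the openhermes model so it is available locally
ollama pull openhermes

# Optionally verify the model now appears in the local model list
ollama list
```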

## Usage

```python cookbook/embedders/ollama_embedder.py
from agno.agent import AgentKnowledge
from agno.embedder.ollama import OllamaEmbedder
from agno.vectordb.pgvector import PgVector

# Generate an embedding for a single sentence
embeddings = OllamaEmbedder(id="openhermes").get_embedding(
    "The quick brown fox jumps over the lazy dog."
)

# Print the first few values and the total dimensionality
print(f"Embeddings: {embeddings[:5]}")
print(f"Dimensions: {len(embeddings)}")

# Use the embedder in a knowledge base
knowledge_base = AgentKnowledge(
    vector_db=PgVector(
        db_url="postgresql+psycopg://ai:ai@localhost:5532/ai",
        table_name="ollama_embeddings",
        embedder=OllamaEmbedder(),
    ),
    num_documents=2,
)
```
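`get_embedding` returns a plain Python list of floats, so you can compare two embeddings directly, for example with cosine similarity. A minimal sketch (the `cosine_similarity` helper below is our own illustration, not part of agno):

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


# With a running Ollama instance you could compare two sentences, e.g.:
# embedder = OllamaEmbedder(id="openhermes")
# score = cosine_similarity(
#     embedder.get_embedding("The quick brown fox"),
#     embedder.get_embedding("A fast auburn fox"),
# )
```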

## Params

| Parameter | Type | Default | Description |
| --------- | ---- | ------- | ----------- |
| `id` | `str` | `"openhermes"` | The name of the model used for generating embeddings. |
| `dimensions` | `int` | `4096` | The dimensionality of the embeddings generated by the model. |
| `host` | `str` | `-` | The host address for the API endpoint. |
| `timeout` | `Any` | `-` | The timeout duration for API requests. |
| `options` | `Any` | `-` | Additional options for configuring the API request. |
| `client_kwargs` | `Optional[Dict[str, Any]]` | `-` | Additional keyword arguments for configuring the API client. |
| `ollama_client` | `Optional[OllamaClient]` | `-` | An instance of `OllamaClient` to use for making API requests. |
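For example, if your Ollama server is not running on the default local address, you can point the embedder at it with `host` and `timeout`. A configuration sketch (the server address below is hypothetical; substitute your own):

```python
from agno.embedder.ollama import OllamaEmbedder

# Point the embedder at a remote Ollama server (hypothetical address)
embedder = OllamaEmbedder(
    id="openhermes",
    host="http://192.168.1.50:11434",
    timeout=30,  # fail fast if the server is unreachable
)
```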

## Developer Resources