This guide will walk you through:

  • Creating a minimal FastAPI app with an Agno agent
  • Containerizing it with Docker
  • Running it locally along with a PostgreSQL database for knowledge and memory

Setup

1

Create a new directory for your project

Create a new directory for your project and navigate to it:

mkdir my-project
cd my-project

After following this guide, your project structure should look like this:

my-project/
├── main.py
├── Dockerfile
├── requirements.txt
└── docker-compose.yml
2

Create a `requirements.txt` file and add the required dependencies:

requirements.txt
fastapi
agno
openai
pgvector
pypdf
psycopg[binary]
sqlalchemy
uvicorn

Step 1: Create a FastAPI App with an Agno Agent

1

Create a new Python file, e.g., `main.py`, and add the following code to create a minimal FastAPI app with an Agno agent:

main.py
from fastapi import FastAPI
from agno.agent import Agent
from agno.models.openai import OpenAIChat

app = FastAPI()

agent = Agent(
    model=OpenAIChat(id="gpt-4o"),
    description="You are a helpful assistant.",
    markdown=True,
)

@app.get("/ask")
async def ask(query: str):
    response = agent.run(query)
    return {"response": response.content}
2

Create and activate a virtual environment:

python -m venv .venv
source .venv/bin/activate
3

Install the required dependencies by running:

pip install -r requirements.txt
4

Set your OPENAI_API_KEY environment variable:

export OPENAI_API_KEY=your_api_key
5

Run the FastAPI app:

uvicorn main:app --reload

Step 2: Create a Dockerfile

1

In the same directory, create a new file named `Dockerfile` with the following content:

Dockerfile
FROM agnohq/python:3.12

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
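Because the `COPY . .` instruction copies the entire build context into the image, you may optionally add a `.dockerignore` file alongside the Dockerfile to keep local artifacts like the virtual environment out of the image (the entries below are conventional suggestions, not requirements of this guide):

```text
.venv
__pycache__
*.pyc
```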
2

Build the Docker image by running:

docker build -t my-agent-app .
3

Run the Docker container with:

docker run -p 8000:8000 -e OPENAI_API_KEY=your_api_key my-agent-app
4

Access your app

You can now access the FastAPI app at http://localhost:8000.

Step 3: Add Knowledge and Memory with PostgreSQL

1

Update your `main.py` file to include knowledge and memory storage using PostgreSQL:

main.py
from fastapi import FastAPI
from agno.agent import Agent
from agno.models.openai import OpenAIChat
from agno.knowledge.pdf_url import PDFUrlKnowledgeBase
from agno.vectordb.pgvector import PgVector
from agno.storage.agent.postgres import PostgresAgentStorage

app = FastAPI()

db_url = "postgresql+psycopg://agno:agno@db/agno"

knowledge_base = PDFUrlKnowledgeBase(
    urls=["https://agno-public.s3.amazonaws.com/recipes/ThaiRecipes.pdf"],
    vector_db=PgVector(table_name="recipes", db_url=db_url),
)
# recreate=True drops and re-ingests the knowledge table on every startup;
# consider switching to recreate=False after the first successful load
knowledge_base.load(recreate=True)

agent = Agent(
    model=OpenAIChat(id="gpt-4o"),
    description="You are a Thai cuisine expert!",
    knowledge=knowledge_base,
    storage=PostgresAgentStorage(table_name="agent_sessions", db_url=db_url),
    markdown=True,
)

@app.get("/ask")
async def ask(query: str):
    response = agent.run(query)
    return {"response": response.content}
2

Create a `docker-compose.yml` file in the same directory with the following content:

docker-compose.yml
services:
  app:
    build: .
    ports:
      - "8000:8000"
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    depends_on:
      db:
        condition: service_healthy

  db:
    image: agnohq/pgvector:16
    environment:
      POSTGRES_DB: agno
      POSTGRES_USER: agno
      POSTGRES_PASSWORD: agno
    volumes:
      - pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U agno"]
      interval: 2s
      timeout: 5s
      retries: 5

volumes:
  pgdata:
3

Run the Docker Compose setup with:

docker-compose up --build

This will start the FastAPI app and the PostgreSQL database, allowing your agent to use knowledge and memory storage.

You can now access the FastAPI app at http://localhost:8000 and interact with your agent that has knowledge and memory capabilities.

You can test the agent with curl (the `-G` and `--data-urlencode` flags ensure the query string is properly URL-encoded):

curl -G "http://localhost:8000/ask" --data-urlencode "query=What is the recipe for pad thai?"
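If you script requests yourself instead of using curl, remember that spaces and punctuation in the query must be URL-encoded. A minimal sketch using only the Python standard library (the URL assumes the app from this guide is running on port 8000):

```python
from urllib.parse import urlencode

# Build a safely encoded URL for the /ask endpoint;
# urlencode turns spaces into '+' and '?' into '%3F'
params = urlencode({"query": "What is the recipe for pad thai?"})
url = f"http://localhost:8000/ask?{params}"
print(url)
```

You can then pass `url` to any HTTP client to query the agent.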