db object: sessions, memory, knowledge, traces, schedules, approvals. The interface is identical across backends. Pick from JSON files (local or cloud), embedded (SQLite), relational (Postgres, MySQL), document (MongoDB, Firestore), key-value (Redis, DynamoDB), or distributed (SingleStore).
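To picture what "identical across backends" means, here is a minimal stdlib sketch: two toy backends expose the same methods, so caller code never changes when you swap them. The class and method names (InMemoryDb, JsonFileDb, upsert_session) are illustrative, not agno's actual API.

```python
import json
from pathlib import Path
from tempfile import mkdtemp

class InMemoryDb:
    """Toy backend: sessions live in a dict."""
    def __init__(self):
        self._rows = {}
    def upsert_session(self, session_id, data):
        self._rows[session_id] = json.dumps(data)
    def get_session(self, session_id):
        raw = self._rows.get(session_id)
        return json.loads(raw) if raw else None

class JsonFileDb:
    """Toy backend: one JSON file per session."""
    def __init__(self, root):
        self.root = Path(root)
    def upsert_session(self, session_id, data):
        (self.root / f"{session_id}.json").write_text(json.dumps(data))
    def get_session(self, session_id):
        path = self.root / f"{session_id}.json"
        return json.loads(path.read_text()) if path.exists() else None

# Caller code is identical regardless of backend.
for db in (InMemoryDb(), JsonFileDb(mkdtemp())):
    db.upsert_session("s1", {"history": ["hi"]})
    restored = db.get_session("s1")
```

Swapping Postgres for SQLite in agno works the same way: change the db object you construct, touch nothing else.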
What gets stored
| Table | Holds |
|---|---|
| agno_sessions | Conversation history per (user_id, session_id) |
| agno_memories | User memories the agent decides to keep |
| agno_knowledge | Embeddings |
| agno_traces, agno_spans | OpenTelemetry traces |
| agno_approvals | Pending and resolved HITL requests |
| agno_schedules, agno_schedule_runs | Cron jobs |
| agno_metrics, agno_evals | Metrics and eval results |
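The (user_id, session_id) keying of session history can be sketched with stdlib sqlite3. The DDL below is illustrative only; agno generates its own schema for agno_sessions.

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
# Illustrative schema: the real agno_sessions table has more columns,
# but the compound (user_id, session_id) key is the important part.
conn.execute(
    "CREATE TABLE agno_sessions ("
    " user_id TEXT, session_id TEXT, history TEXT,"
    " PRIMARY KEY (user_id, session_id))"
)

history = [{"role": "user", "content": "hi"}]
conn.execute(
    "INSERT INTO agno_sessions VALUES (?, ?, ?)",
    ("u1", "s1", json.dumps(history)),
)

# Resuming a conversation = one lookup by the compound key.
(row,) = conn.execute(
    "SELECT history FROM agno_sessions WHERE user_id=? AND session_id=?",
    ("u1", "s1"),
)
```

One user can hold many sessions, and the same session_id under a different user_id is a different row.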
Pick a backend
PostgresDb is the default for every tutorial template and the recommended production database. It pairs with PgVector to keep relational data and embeddings on the same engine.
| Backend | When to use |
|---|---|
| PostgresDb | Production. Vector + relational on one box. |
| SqliteDb | Local dev, single-user demos, edge deployments |
| MongoDb | Already on Mongo |
| MysqlDb | Already on MySQL |
| SinglestoreDb | Vector + analytics on one engine, high-throughput |
| RedisDb | Cache-friendly, ephemeral sessions |
| DynamoDb | AWS-native, serverless |
| FirestoreDb | GCP-native, serverless |
| GCSJsonDb | Cheap cold storage, knowledge as JSON in Cloud Storage |
| InMemoryDb | Tests, ephemeral demos |
Vector storage
Knowledge needs a vector store, and agno supports all the major vector databases out of the box.

Splitting concerns across databases
For larger deployments, split concerns by passing a dedicated database per role: one backend for sessions and traces, another for knowledge.

When db isn’t enough
Some agents need direct database connections in tools (SQL agents, BI agents, agents that update CRM tables). Pass them via dependencies:
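Agno's exact dependency-injection signature isn't shown here, so the sketch below uses plain stdlib to illustrate the pattern: the application owns the connection and hands it to the tool through a dependencies dict, instead of routing it through the agent's db. The dictionary key crm_conn is a hypothetical name.

```python
import sqlite3

def query_revenue(dependencies: dict) -> list:
    """Tool body: uses the injected connection, not the agent's own db."""
    conn = dependencies["crm_conn"]  # hypothetical key, injected by the app
    return conn.execute("SELECT customer, total FROM orders").fetchall()

# Wiring: the application creates and owns the connection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, total REAL)")
conn.execute("INSERT INTO orders VALUES ('acme', 1200.0)")

rows = query_revenue({"crm_conn": conn})
```

Keeping application connections out of db means agno's tables and your CRM tables can live on entirely different servers with different credentials.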
File and blob storage
For media that doesn’t belong in the relational store (generated images, audio, large PDFs), store the files in object storage and reference their paths in agno_knowledge or agno_sessions. Agno doesn’t ship a built-in blob abstraction; use S3, GCS, or whatever your platform provides.
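The pattern is: write the blob once, store only its reference in the database row. The sketch below stands in a temp directory for an S3/GCS bucket; the row shape and field names are illustrative, not agno's actual schema.

```python
import json
import pathlib
import tempfile

blob_root = pathlib.Path(tempfile.mkdtemp())  # stand-in for an S3/GCS bucket

# 1. Write the binary artifact to object storage.
pdf_path = blob_root / "reports" / "q3.pdf"
pdf_path.parent.mkdir(parents=True)
pdf_path.write_bytes(b"%PDF-1.7 ...")

# 2. Persist only the reference in the database row (illustrative shape).
knowledge_row = {"name": "Q3 report", "content_path": str(pdf_path)}
record = json.dumps(knowledge_row)
```

At read time the agent resolves content_path back to the object store, so the relational database stays small and cheap to back up.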
Migrations
AgentOS handles its own tables. Your application tables (data the agent reads via SQL tools, schemas the Engineer agent builds, your own data) are yours to migrate however you like: Alembic, raw SQL, dbt, your call. The Dash deploy tutorial shows the pattern: AgentOS comes up, runs its DDL automatically, then a one-time python scripts/generate_data.py run loads the application tables.
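The two-owner split can be sketched in a few lines: framework tables are created automatically at startup, application tables by your own one-time script. Both the DDL and the generate_data function here are illustrative, mirroring the tutorial's scripts/generate_data.py step.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Owner 1: AgentOS-managed tables, created automatically at startup
# (illustrative DDL; the real schema is generated by AgentOS).
conn.execute(
    "CREATE TABLE IF NOT EXISTS agno_sessions (user_id TEXT, session_id TEXT)"
)

# Owner 2: application tables, migrated and seeded by you.
def generate_data(conn: sqlite3.Connection) -> None:
    """One-time load script, analogous to scripts/generate_data.py."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (customer TEXT, total REAL)")
    conn.execute("INSERT INTO orders VALUES ('acme', 1200.0)")

generate_data(conn)
```

Because the two sets of tables have different owners, an AgentOS upgrade never touches your application schema, and your migrations never touch agno_* tables.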