vLLM
Code Generation
Code
cookbook/models/vllm/code_generation.py
from agno.agent import Agent
from agno.models.vllm import vLLM

agent = Agent(
    model=vLLM(id="deepseek-ai/deepseek-coder-6.7b-instruct"),
    description="You are an expert Python developer.",
    markdown=True,
)

agent.print_response(
    """Write a Python function that returns the nth Fibonacci number
    using dynamic programming."""
)
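To watch the generated code arrive token by token instead of waiting for the full reply, you can stream the response. A minimal variation on the script above (the follow-up prompt is illustrative; it assumes the same agent and a running vLLM server):

# Stream the response as it is generated
agent.print_response(
    "Add type hints and a docstring to the Fibonacci function.",
    stream=True,
)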
Usage
1. Create a virtual environment
Open the Terminal and create a Python virtual environment.
python3 -m venv .venv
source .venv/bin/activate
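On Windows, the activation command differs; from PowerShell or cmd use:

.venv\Scripts\activate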
2. Install Libraries
pip install -U agno openai vllm
3. Start vLLM server
vllm serve deepseek-ai/deepseek-coder-6.7b-instruct \
--dtype float32 \
--tool-call-parser pythonic
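vllm serve exposes an OpenAI-compatible API, by default at http://localhost:8000/v1, which the vLLM model class connects to. If your server listens on a different host or port, you can point the agent at it explicitly; a minimal sketch, assuming the vLLM class accepts a base_url parameter like Agno's other OpenAI-compatible providers:

from agno.agent import Agent
from agno.models.vllm import vLLM

# base_url is an assumption here: adjust it if your vLLM server is not
# running on the default http://localhost:8000/v1.
agent = Agent(
    model=vLLM(
        id="deepseek-ai/deepseek-coder-6.7b-instruct",
        base_url="http://localhost:8000/v1",
    ),
    markdown=True,
)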
4. Run Agent
python cookbook/models/vllm/code_generation.py