# LiteLLM OpenAI Streaming Agent

Make sure to start the LiteLLM proxy server before running the agent.
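The exact proxy command is not shown above; with the LiteLLM CLI installed (for example via `pip install 'litellm[proxy]'`), a typical invocation looks like the following, where the model id is a placeholder:

```shell
# Start a LiteLLM proxy serving an OpenAI model (placeholder model id).
# By default the proxy listens on port 4000.
litellm --model gpt-4o
```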
## Code

`cookbook/models/litellm_openai/basic_stream.py`
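The file's contents are not reproduced on this page. As a rough sketch only, and assuming Agno's `Agent` class and a `LiteLLMOpenAI` model wrapper (the import path, class name, and model id below are assumptions, so check the cookbook file itself), a streaming agent might look like:

```python
# Hypothetical sketch, not the actual cookbook file.
# Assumes agno is installed, a LiteLLM proxy is running,
# and OPENAI_API_KEY is set in the environment.
from agno.agent import Agent
from agno.models.litellm import LiteLLMOpenAI  # assumed import path

agent = Agent(
    # "gpt-4o" is a placeholder id for a model served by the proxy
    model=LiteLLMOpenAI(id="gpt-4o"),
    markdown=True,
)

# stream=True prints response tokens as they arrive
agent.print_response("Share a two-sentence horror story.", stream=True)
```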
## Usage

1. Create a virtual environment

   Open the Terminal and create a Python virtual environment.

2. Set your API key

3. Install libraries

4. Run the Agent
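The commands for each step are not included above; a typical macOS/Linux sequence might look like the following (the package list and the API key value are assumptions based on the file path and the OpenAI backend):

```shell
# 1. Create and activate a virtual environment
python3 -m venv .venv
source .venv/bin/activate

# 2. Set your API key (placeholder value)
export OPENAI_API_KEY=sk-...

# 3. Install libraries (assumed package set)
pip install -U openai litellm agno

# 4. Run the agent
python cookbook/models/litellm_openai/basic_stream.py
```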