## Authentication
Set your `FIREWORKS_API_KEY` environment variable. Get your key here.
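As a minimal sketch, you can verify the variable is set before constructing the model; the `"fw-demo-key"` fallback below is a hypothetical placeholder, not a real key:

```python
import os

# The Fireworks model reads FIREWORKS_API_KEY from the environment
# when no api_key argument is passed explicitly.
# "fw-demo-key" is a placeholder used here only so the snippet runs.
os.environ.setdefault("FIREWORKS_API_KEY", "fw-demo-key")
api_key = os.environ["FIREWORKS_API_KEY"]
```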
## Prompt caching
Prompt caching happens automatically when using our `Fireworks` model. You can read more about how Fireworks handles caching in their docs.
## Example
Use `Fireworks` with your `Agent`:
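A minimal sketch, assuming the `agno` package is installed and `FIREWORKS_API_KEY` is set (import paths may differ in your version):

```python
from agno.agent import Agent
from agno.models.fireworks import Fireworks

# Build an agent backed by the default Fireworks model.
agent = Agent(
    model=Fireworks(id="accounts/fireworks/models/llama-v3p1-405b-instruct"),
    markdown=True,
)

# Stream a response to the terminal.
agent.print_response("Share a 2 sentence horror story.")
```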
View more examples here.
## Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| `id` | `str` | `"accounts/fireworks/models/llama-v3p1-405b-instruct"` | The id of the Fireworks model to use |
| `name` | `str` | `"Fireworks"` | The name of the model |
| `provider` | `str` | `"Fireworks"` | The provider of the model |
| `api_key` | `Optional[str]` | `None` | The API key for Fireworks (defaults to the `FIREWORKS_API_KEY` env var) |
| `base_url` | `str` | `"https://api.fireworks.ai/inference/v1"` | The base URL for the Fireworks API |
`Fireworks` extends the OpenAI-compatible interface and supports most parameters from the OpenAI model.
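To illustrate what OpenAI compatibility means here, this standard-library sketch builds (but does not send) an OpenAI-style chat-completions request against the `base_url` above; the `build_chat_request` helper and the `"fw-demo-key"` fallback are hypothetical, for illustration only:

```python
import json
import os
import urllib.request

BASE_URL = "https://api.fireworks.ai/inference/v1"


def build_chat_request(prompt: str, model: str, api_key: str) -> urllib.request.Request:
    """Construct an OpenAI-style chat completion request (not sent here)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_chat_request(
    "Hello!",
    "accounts/fireworks/models/llama-v3p1-405b-instruct",
    os.environ.get("FIREWORKS_API_KEY", "fw-demo-key"),
)
```

Because the request shape matches OpenAI's chat-completions API, OpenAI-compatible clients can generally be pointed at this `base_url` directly.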