Overview
AI Prompts let you define prompt templates in your codebase alongside your tasks. When you deploy, Trigger.dev automatically versions your prompts. You can then:
- View all prompt versions in the dashboard
- Create overrides to change the prompt text or model without redeploying
- Track every generation that used each prompt version
- See token usage, cost, and latency metrics per prompt
- Manage prompts programmatically via SDK methods
Defining a prompt
Use prompts.define() to create a prompt with typed variables:
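A minimal sketch of a definition, assuming prompts is exported from the Trigger.dev SDK package and that the option names match the table below; the prompt id, variable names, and template are hypothetical, and the schema uses Zod:

```typescript
import { prompts } from "@trigger.dev/sdk";
import { z } from "zod";

// Hypothetical prompt definition; option names follow the table below.
export const summarizePrompt = prompts.define({
  id: "summarize-article",
  description: "Summarizes an article for the newsletter digest",
  model: "gpt-4o",
  config: { temperature: 0.3, maxTokens: 512 },
  variables: z.object({
    articleText: z.string(),
    tone: z.string().optional(),
  }),
  content:
    "Summarize the following article{{#tone}} in a {{tone}} tone{{/tone}}:\n\n{{articleText}}",
});
```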
Options
| Option | Type | Required | Description |
|---|---|---|---|
| id | string | Yes | Unique identifier (becomes the prompt slug) |
| description | string | No | Shown in the dashboard |
| model | string | No | Default model (e.g. "gpt-4o", "claude-sonnet-4-6") |
| config | object | No | Default config (temperature, maxTokens, etc.) |
| variables | Zod/ArkType schema | No | Schema for template variables (enables validation and dashboard UI) |
| content | string | Yes | The prompt template with {{variable}} placeholders |
Template syntax
Templates use Mustache-style placeholders:
- {{variableName}} — replaced with the variable value
- {{#conditionalVar}}...{{/conditionalVar}} — content only included if the variable is truthy
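The two substitution rules can be sketched in plain TypeScript. This is an illustrative re-implementation of the behavior described above, not the SDK's actual renderer:

```typescript
// Illustrative Mustache-style renderer: {{name}} substitution plus
// {{#flag}}...{{/flag}} conditional sections. Not the SDK's real code.
function renderTemplate(
  template: string,
  variables: Record<string, unknown>
): string {
  // Resolve conditional sections first: keep the inner content only
  // when the named variable is truthy.
  const withSections = template.replace(
    /\{\{#(\w+)\}\}([\s\S]*?)\{\{\/\1\}\}/g,
    (_match, name: string, inner: string) => (variables[name] ? inner : "")
  );
  // Then substitute simple placeholders with the variable value.
  return withSections.replace(/\{\{(\w+)\}\}/g, (_match, name: string) =>
    String(variables[name] ?? "")
  );
}
```

For example, `renderTemplate("Hi {{name}}{{#vip}} (VIP){{/vip}}", { name: "Ada", vip: true })` produces `"Hi Ada (VIP)"`, and the section drops out entirely when `vip` is falsy.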
Resolving a prompt
Via prompt handle
Call .resolve() on the handle returned by define():
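A sketch, assuming a summarizePrompt handle returned by prompts.define() with an articleText variable (the handle name, module path, and variable are hypothetical):

```typescript
import { summarizePrompt } from "./prompts"; // hypothetical handle from prompts.define()

// resolve() compiles the template with these variables and returns the
// active version (the override if one exists, otherwise current).
const prompt = await summarizePrompt.resolve({
  articleText: "Long article text...",
});
```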
Via standalone prompts.resolve()
Resolve any prompt by slug without needing a handle. Pass the prompt handle as a type parameter for full type safety; without it, variables are typed as Record&lt;string, unknown&gt;.
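For instance (the slug, handle, and module path are hypothetical; the generic-parameter shape is an assumption based on the description above):

```typescript
import { prompts } from "@trigger.dev/sdk";
import type { summarizePrompt } from "./prompts"; // hypothetical handle

// With the handle as a type parameter, variables are fully typed;
// without it they fall back to Record<string, unknown>.
const prompt = await prompts.resolve<typeof summarizePrompt>(
  "summarize-article",
  { articleText: "Long article text..." }
);
```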
Resolve options
You can resolve a specific version or label. resolve() returns the override version if one is active, otherwise the current (latest deployed) version.
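A sketch of both options, assuming the option names match the "Version resolution order" section (the slug and variables are hypothetical):

```typescript
import { prompts } from "@trigger.dev/sdk";

const vars = { articleText: "Long article text..." };

// Pin an exact version (hypothetical option shape):
const pinned = await prompts.resolve("summarize-article", vars, { version: 3 });

// Or resolve by label, which defaults to "current":
const labeled = await prompts.resolve("summarize-article", vars, {
  label: "current",
});
```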
Both promptHandle.resolve() and prompts.resolve() call the Trigger.dev API when a client is configured. During local dev with trigger dev, this means you’ll always get the server version (including overrides).

Using with the AI SDK
The resolved prompt integrates with the Vercel AI SDK via toAISDKTelemetry(). This links AI generation spans to the prompt in the dashboard.
generateText
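A sketch with generateText, assuming a hypothetical summarizePrompt handle; the `text` field on the resolved prompt (holding the compiled template) is an assumption, while toAISDKTelemetry() comes from this page:

```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { summarizePrompt } from "./prompts"; // hypothetical handle

const prompt = await summarizePrompt.resolve({
  articleText: "Long article text...",
});

const { text } = await generateText({
  model: openai("gpt-4o"),
  prompt: prompt.text, // assumed field holding the compiled template
  // Links this generation span to the prompt in the dashboard.
  experimental_telemetry: prompt.toAISDKTelemetry(),
});
```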
streamText
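The streaming variant looks much the same; again the handle, module path, and `text` field are assumptions:

```typescript
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";
import { chatPrompt } from "./prompts"; // hypothetical handle

const prompt = await chatPrompt.resolve({ customerName: "Ada" });

const result = streamText({
  model: openai("gpt-4o"),
  system: prompt.text, // assumed field holding the compiled template
  messages: [{ role: "user", content: "Hi!" }],
  experimental_telemetry: prompt.toAISDKTelemetry(),
});
```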
Custom telemetry metadata
Pass additional metadata to toAISDKTelemetry() that will appear on the generation span:
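A sketch; whether toAISDKTelemetry() accepts a metadata object in exactly this shape is an assumption, and the metadata keys are hypothetical:

```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { summarizePrompt } from "./prompts"; // hypothetical handle

const prompt = await summarizePrompt.resolve({ articleText: "Long article text..." });

const { text } = await generateText({
  model: openai("gpt-4o"),
  prompt: prompt.text, // assumed field holding the compiled template
  experimental_telemetry: prompt.toAISDKTelemetry({
    metadata: { feature: "newsletter", userId: "user_123" }, // hypothetical keys
  }),
});
```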
Using with chat.agent()
Prompts integrate with chat.agent() via chat.prompt — a run-scoped store for the resolved prompt. Store a prompt once in a lifecycle hook, then access it anywhere during the run.
chat.prompt.set() and chat.prompt()
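A sketch of the store-then-read pattern; the supportPrompt handle and its variables are hypothetical:

```typescript
import { chat } from "@trigger.dev/sdk";
import { supportPrompt } from "./prompts"; // hypothetical handle

// Store once, e.g. in a lifecycle hook at the start of the run:
chat.prompt.set(await supportPrompt.resolve({ customerName: "Ada" }));

// ...then read it back from anywhere during the same run:
const prompt = chat.prompt();
```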
chat.toStreamTextOptions()
Returns an options object ready to spread into streamText(). When a prompt is stored via chat.prompt.set(), it includes:
- system — the compiled prompt text
- model — resolved via the registry when provided
- temperature, maxTokens, etc. — from the prompt’s config
- experimental_telemetry — links generations to the prompt in the dashboard
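A sketch, assuming a prompt was previously stored with chat.prompt.set():

```typescript
import { streamText } from "ai";
import { chat } from "@trigger.dev/sdk";

// Spread the stored prompt's system text, model, config, and telemetry
// straight into streamText(); only messages are supplied here.
const result = streamText({
  ...chat.toStreamTextOptions(),
  messages: [{ role: "user", content: "Hi!" }],
});
```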
Reading the prompt
Access the stored prompt from anywhere in the run:

Prompt management SDK
The prompts namespace includes methods for managing prompts programmatically. These work both inside tasks and outside (e.g. scripts, API handlers) as long as an API client is configured.
List prompts
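For example (the property names on the returned items are assumptions):

```typescript
import { prompts } from "@trigger.dev/sdk";

// List every prompt in the current environment.
const all = await prompts.list();
for (const p of all) {
  console.log(p.slug); // property names on the result are assumptions
}
```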
List versions
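For example (the slug is hypothetical):

```typescript
import { prompts } from "@trigger.dev/sdk";

// All versions for one prompt, including labels and commit messages.
const versions = await prompts.versions("summarize-article");
```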
Create an override
Create a new override that takes priority over the deployed version:

Update an override
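A sketch of updating the active override; the body field names (content, commitMessage) are assumptions:

```typescript
import { prompts } from "@trigger.dev/sdk";

// Edit the active override in place (field names are assumptions).
await prompts.updateOverride("summarize-article", {
  content: "Summarize in exactly three bullet points:\n\n{{articleText}}",
  commitMessage: "Tighten output format",
});
```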
Remove an override
Remove the active override, reverting to the deployed version:

Promote a version
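For example (the slug and version number are hypothetical):

```typescript
import { prompts } from "@trigger.dev/sdk";

// Make version 3 the "current" version for this environment.
await prompts.promote("summarize-article", 3);
```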
All management methods
| Method | Description |
|---|---|
| prompts.list() | List all prompts in the current environment |
| prompts.versions(slug) | List all versions for a prompt |
| prompts.resolve(slug, variables?, options?) | Resolve a prompt by slug |
| prompts.promote(slug, version) | Promote a version to current |
| prompts.createOverride(slug, body) | Create an override |
| prompts.updateOverride(slug, body) | Update the active override |
| prompts.removeOverride(slug) | Remove the active override |
| prompts.reactivateOverride(slug, version) | Reactivate a removed override |
Overrides
Overrides let you change a prompt’s template or model from the dashboard or SDK without redeploying your code. When an override is active, resolve() returns the override version instead of the deployed version.
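Via the SDK, swapping the model might look like this (the slug and body field names are assumptions):

```typescript
import { prompts } from "@trigger.dev/sdk";

// Override the model without redeploying (field names are assumptions).
await prompts.createOverride("summarize-article", {
  model: "claude-sonnet-4-6",
  commitMessage: "Try Claude without redeploying",
});
```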
How overrides work
- Overrides take priority over the deployed (“current”) version
- Only one override can be active at a time
- Creating a new override replaces the previous one
- Removing an override reverts to the deployed version
- Overrides are environment-scoped (dev, staging, production are independent)
Creating an override (dashboard)
- Go to the prompt detail page
- Click Create Override
- Edit the template text and/or model
- Add an optional commit message
- Click Create override
Version resolution order
When resolve() is called, versions are resolved in this order:
- Specific version — if { version: N } is passed
- Override — if an override is active in this environment
- Label — if { label: "..." } is passed (defaults to "current")
- Current — the latest deployed version with the “current” label
Dashboard
Prompts list
The prompts list page shows all prompts in the current environment with the current or override version, default model, and a usage sparkline.

Prompt detail
Click a prompt to see:
- Template panel — the prompt template for the selected version
- Details tab — slug, description, model, config, source file, and variable schema
- Versions tab — all versions with labels, source, and commit messages
- Generations tab — every AI generation that used this prompt, with live polling
- Metrics tab — token usage, cost, and latency charts
AI span inspectors
When you use toAISDKTelemetry(), AI generation spans in the run trace get a custom inspector showing:
- Overview — model, provider, token usage, cost, input/output preview
- Messages — the full message thread
- Tools — tool definitions and tool call details
- Prompt — the linked prompt’s metadata, input variables, and template content

