AI nodes run a language model prompt during journey execution and return structured output. The response merges into the journey context, making it available to downstream branch conditions, send templates, and other nodes. Use AI nodes to generate personalized notification copy, classify users based on behavior data, score engagement signals, or enrich profiles with structured insights that drive downstream routing.
Configuration
Click the AI node on the canvas to open the summary panel, then click Configure to open the full configuration drawer.
Model
Select the LLM to use for this node. Available models:

| Provider | Model | Credits per invocation |
|---|---|---|
| OpenAI | GPT-5.5 | 6 |
| OpenAI | GPT-5.4 | 3 |
| OpenAI | GPT-5.4 Mini | 1 |
| OpenAI | GPT-5.4 Nano | 0.25 |
| OpenAI | GPT-5 Nano | 0.1 |
| Anthropic | Claude Opus 4.7 | 5 |
| Anthropic | Claude Opus 4.6 | 5 |
| Anthropic | Claude Sonnet 4.6 | 3 |
| Anthropic | Claude Haiku 4.5 | 1 |
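For budgeting, the per-invocation cost follows directly from the table. The helper below is an illustrative sketch only (it is not part of any Courier SDK); the +2 credit web-search surcharge for Claude models is described in the Web Search section below.

```python
# Illustrative credit calculator based on the model table above.
# Not a Courier API -- just a sanity-check sketch.
MODEL_CREDITS = {
    "GPT-5.5": 6,
    "GPT-5.4": 3,
    "GPT-5.4 Mini": 1,
    "GPT-5.4 Nano": 0.25,
    "GPT-5 Nano": 0.1,
    "Claude Opus 4.7": 5,
    "Claude Opus 4.6": 5,
    "Claude Sonnet 4.6": 3,
    "Claude Haiku 4.5": 1,
}
WEB_SEARCH_SURCHARGE = 2  # applies to Anthropic (Claude) models only


def invocation_cost(model: str, web_search: bool = False) -> float:
    """Credits consumed by one AI-node invocation."""
    cost = MODEL_CREDITS[model]
    if web_search:
        if not model.startswith("Claude"):
            raise ValueError("Web search is available for Anthropic (Claude) models only")
        cost += WEB_SEARCH_SURCHARGE
    return cost
```

For example, Claude Sonnet 4.6 with web search enabled costs 3 + 2 = 5 credits per run.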
Web Search
Toggle web search to let the model query the internet for real-time information before responding. This is available for Anthropic (Claude) models only. Each invocation with web search enabled costs an additional 2 credits. Use web search when the prompt needs current information that isn’t in the journey context; for example, looking up a company’s latest funding round before generating outreach copy.

Prompt
Write a natural-language prompt describing what you want the model to do. The prompt field supports {{variable}} interpolation; type {{ to see available fields from the trigger schema, profile, or upstream fetch responses.
For the onboarding nudge example below, the prompt might be:
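One possible wording (illustrative only; the variable names come from the example trigger schema):

```text
You are writing a day-3 onboarding nudge for {{user_name}}, who is on the
{{plan_name}} plan and has used these features so far: {{features_used}}.
Based on their recent activity, write a short, friendly nudge that
recommends the single most valuable feature they haven't tried yet.
```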
Output Schema
The output schema defines the structure of the model’s response. You can define it in two modes:

- Form mode (default) — Add fields with a name and type (string, number, or boolean). Optionally add a description to guide the model on what each field should contain.
- JSON mode — Write a JSON Schema directly. Use this for more complex schemas or when pasting from an existing definition.

For the onboarding nudge example, the form-mode schema would be:

| Field | Type | Description |
|---|---|---|
| subject_line | string | Personalized email subject under 60 characters |
| body_copy | string | 1-2 sentence nudge referencing the user’s activity |
| recommended_feature | string | One of: templates, automations, integrations, analytics |
| tone | string | One of: encouraging, celebratory, urgent |
The response fields merge into the journey context and are available downstream:

- In branch conditions: select `data.tone` or `data.recommended_feature` as the field
- In journey templates: use `{{subject_line}}` or `{{body_copy}}`
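In JSON mode, the equivalent of the form-mode schema above might look like this (a sketch; the exact JSON Schema dialect and required keywords may differ):

```json
{
  "type": "object",
  "properties": {
    "subject_line": {
      "type": "string",
      "description": "Personalized email subject under 60 characters"
    },
    "body_copy": {
      "type": "string",
      "description": "1-2 sentence nudge referencing the user's activity"
    },
    "recommended_feature": {
      "type": "string",
      "enum": ["templates", "automations", "integrations", "analytics"]
    },
    "tone": {
      "type": "string",
      "enum": ["encouraging", "celebratory", "urgent"]
    }
  },
  "required": ["subject_line", "body_copy", "recommended_feature", "tone"]
}
```

Enums are a useful substitute for the “One of: …” descriptions in form mode, since they constrain the model to exact values that branch conditions can match on.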
Conditions
Like other nodes, AI nodes support optional conditions. If the conditions are not met when the run reaches the node, it’s skipped and no credits are consumed.

Testing
Click Test in the configuration drawer to open the test panel. The test panel lets you run the prompt against the selected model with sample variable values and see the structured response before publishing the journey.
Example: Personalized Onboarding Nudge
A journey that uses the AI node to generate tailored onboarding messages based on each user’s activity:
- Trigger — Fires on day 3 after signup. The trigger schema includes `user_name`, `plan_name`, `features_used`, and `days_since_signup`.
- Fetch Data — Pulls the user’s recent activity summary from your application API.
- AI node — Given the user’s profile and activity, generates a personalized nudge with `subject_line`, `body_copy`, `recommended_feature`, and `tone`. Model: Claude Sonnet 4.6 (3 credits per run).
- Branch — Routes based on `data.tone`:
  - Path “Celebratory”: user has been active → sends a congratulatory email with `{{subject_line}}` and `{{body_copy}}`.
  - Path “Urgent”: user hasn’t engaged → sends a push notification to re-engage.
  - Default: sends a standard onboarding email with the AI-generated copy.
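The branch step above can be sketched as plain code (illustrative only; in Courier you configure these paths in the Branch node UI, not in code):

```python
def route(ai_output: dict) -> str:
    """Pick a branch path from the AI node's structured output.

    Mirrors the example branch, which routes on the `tone` field
    the AI node wrote into the journey context (data.tone).
    """
    tone = ai_output.get("tone")
    if tone == "celebratory":
        return "Celebratory"  # active user -> congratulatory email
    if tone == "urgent":
        return "Urgent"       # unengaged user -> re-engagement push
    return "Default"          # standard onboarding email
```

Because the output schema constrains `tone` to an enum, every run falls into exactly one of these paths.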
Common Patterns
- Score and classify users: Feed product usage, behavior, and profile data into the LLM. Route the journey based on risk level, intent, engagement, or any structured category the model returns.
- Generate personalized notifications: Give the model your journey context and get back tailored subject lines, body copy, and recommended actions. This yields personalized content for every recipient without dozens of template variants.
- Enrich user profiles: Classify users into personas, derive lifecycle stage, or generate account summaries. Persist outputs to the profile so every future journey starts with richer context.
- Structured output for downstream logic: Define an output schema with field names, types, and enums. The LLM returns structured JSON that branch conditions, send nodes, and downstream integrations can act on directly.

Debugging AI Nodes
Open Run Inspection and click the AI node step to see:

- The model used and whether web search was enabled
- The resolved prompt (with variables substituted)
- The output schema that was sent to the model
- The structured JSON response
- Token usage (input and output token counts)
What’s Next
- Branch — Route based on AI output fields
- Fetch Data — Enrich context before the AI node processes it
- Journey Templates — Use AI output as template variables
- Run Inspection — Debug AI node prompts and responses