# Workflow Builder Guide
This guide walks through creating, deploying, and monitoring workflows using the Observer UI.
## Prerequisites
Before building a workflow, you need at least two deployed environments (one for the orchestrator, one or more for downstream agents). See the Deployment Modes guide for setup instructions.
## Creating a Workflow

### Using the Canvas Tab
The Observer UI provides a visual canvas for composing workflows.
- Open Observer and navigate to the Workflows section
- Click "New Workflow"
- You'll see two tabs: Canvas (visual editor) and Generate (AI-assisted)
#### Adding Nodes
- Open the node palette on the left sidebar
- Drag a node onto the canvas for each deployment in your workflow:
  - Model node -- For headless LLM backends (billing agent, support agent, etc.)
  - App node -- For frontend applications
  - Bridge node -- For protocol adapters (e.g., Chatwoot)
- Click a node to configure it:
  - Select an existing deployment from the dropdown, or create a new one
  - Set the deployment type (`model`, `app`, or `bridge`)
  - For the orchestrator node, check "Is Orchestrator"
#### Connecting Nodes (Edges)
- Hover over a node's output port (right side) until the connection handle appears
- Click and drag to the target node's input port (left side)
- For edges from the orchestrator, a condition panel appears:
  - Intent name -- A short label (e.g., `billing`, `support`)
  - Description -- Natural-language description of when to route here (e.g., "Route billing inquiries, invoice questions, and payment issues")
> **Edge conditions are prompts**
>
> The description field is injected directly into the orchestrator's compiled system prompt. Write it the way you would instruct a human agent: clearly and specifically.
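To make the point that conditions are prompts concrete, the compilation step can be imagined as simple string assembly. This is a minimal sketch with a hypothetical edge structure and prompt wording; the actual compiler output format is platform-defined and not documented here:

```python
# Illustrative only: how edge intents and descriptions could be
# assembled into a routing section of a system prompt. The edge
# dict shape and the prompt wording are assumptions.

def compile_routing_prompt(edges):
    """Render one routing rule per orchestrator edge."""
    lines = ["You are an orchestrator. Route each request to exactly one intent:"]
    for edge in edges:
        lines.append(f"- {edge['intent']}: {edge['description']}")
    return "\n".join(lines)

edges = [
    {"intent": "billing",
     "description": "Route billing inquiries, invoice questions, and payment issues"},
    {"intent": "support",
     "description": "Route technical troubleshooting and bug reports"},
]

print(compile_routing_prompt(edges))
```

Whatever the real format looks like, the takeaway is the same: the description text reaches the LLM verbatim, so its quality directly determines routing quality.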
#### Canvas Controls

| Action | Shortcut | Description |
|---|---|---|
| Pan | Click + drag on canvas | Move the viewport |
| Zoom | Scroll wheel | Zoom in/out |
| Select | Click node | Select for configuration |
| Delete | Select + Backspace | Remove a node or edge |
| Undo | Cmd/Ctrl + Z | Undo last action |
### Using the Generate Tab
The Generate tab lets you describe a workflow in natural language:
- Switch to the Generate tab
- Describe your workflow:

  ```
  Create a customer support workflow with:
  - An orchestrator that routes between billing and technical support
  - A Chatwoot bridge for incoming webhooks
  - An Open WebUI frontend for live agents
  ```

- Click "Generate" -- the AI creates the workflow JSON and renders it on the canvas
- Review and adjust nodes, edges, and conditions on the canvas
- Click "Apply" to save
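For a description like the one above, the generated workflow JSON might look roughly like the following. The field names (`nodes`, `edges`, `is_orchestrator`, and so on) are illustrative assumptions, not the authoritative schema; see the Workflows concept page for the real format:

```json
{
  "name": "customer-support",
  "nodes": [
    {"id": "orchestrator", "type": "model", "is_orchestrator": true},
    {"id": "billing-agent", "type": "model"},
    {"id": "support-agent", "type": "model"},
    {"id": "chatwoot", "type": "bridge"},
    {"id": "open-webui", "type": "app"}
  ],
  "edges": [
    {
      "from": "orchestrator",
      "to": "billing-agent",
      "intent": "billing",
      "description": "Route billing inquiries, invoice questions, and payment issues"
    },
    {
      "from": "orchestrator",
      "to": "support-agent",
      "intent": "support",
      "description": "Route technical troubleshooting and bug reports"
    }
  ]
}
```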
## Configuring Routing Conditions
Routing conditions define how the orchestrator decides where to send each request. Each edge from the orchestrator has:
- Intent: A keyword identifier (used in logs and metrics)
- Description: A natural-language instruction for the orchestrator LLM
Effective conditions are specific and mutually exclusive:
| Good | Bad |
|---|---|
| "Route billing inquiries, invoice questions, payment disputes, and account upgrade requests" | "Route billing stuff" |
| "Route technical troubleshooting, bug reports, and how-to questions about product features" | "Route support questions" |
If conditions overlap, the orchestrator may route inconsistently. Use the observability auditor to inspect routing decisions and refine conditions.
## Deploying a Workflow
- Click "Deploy" in the workflow toolbar
- The platform validates the workflow:
  - All referenced deployments must exist
  - Exactly one node must be marked as orchestrator
  - All orchestrator edges must have conditions
- The workflow enters `deploying` status:
  - Environments that do not yet exist are provisioned
  - The workflow compiler generates the orchestrator system prompt and MCP tools
  - MCP tool registrations are pushed to the orchestrator deployment
- When all nodes report healthy, the workflow transitions to `active`
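The three validation rules above can be sketched as plain checks over the workflow graph. This is an illustrative Python sketch with assumed field names (`deployment`, `is_orchestrator`, `description`); the platform's actual validator is more thorough:

```python
# Illustrative pre-deploy validation: the three rules from the list
# above, expressed over a hypothetical workflow dict. All field names
# here are assumptions for the sake of the example.

def validate_workflow(workflow, existing_deployments):
    errors = []
    # Rule 1: all referenced deployments must exist.
    for node in workflow["nodes"]:
        if node["deployment"] not in existing_deployments:
            errors.append(f"unknown deployment: {node['deployment']}")
    # Rule 2: exactly one node must be marked as orchestrator.
    orchestrators = [n for n in workflow["nodes"] if n.get("is_orchestrator")]
    if len(orchestrators) != 1:
        errors.append(f"expected 1 orchestrator, found {len(orchestrators)}")
    # Rule 3: every edge leaving the orchestrator must carry a condition.
    orch_ids = {n["id"] for n in orchestrators}
    for edge in workflow["edges"]:
        if edge["from"] in orch_ids and not edge.get("description"):
            errors.append(f"edge {edge['from']} -> {edge['to']} missing condition")
    return errors
```

Running these checks locally before calling the deploy step catches the same errors the platform would reject, just earlier.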
### Deployment via CLI

You can also deploy a workflow from a JSON file:

```bash
# Create a workflow from JSON
lucid workflow create -f my-workflow.json

# Deploy it
lucid workflow deploy wf-customer-support

# Check status
lucid workflow status wf-customer-support
```
## Monitoring Workflow Status

### Status Dashboard
The workflow detail page in Observer shows:
- Overall status: `draft`, `deploying`, `active`, `stopped`, or `error`
- Node health: Per-node status with TEE attestation indicators
- Edge traffic: Request counts and latency per routing edge
- Recent traces: Latest orchestrator decisions with intent classifications
### Workflow Statuses

| Status | Meaning | Action |
|---|---|---|
| `draft` | Defined but not deployed | Click Deploy to provision |
| `deploying` | Environments being provisioned | Wait for completion |
| `active` | All nodes healthy, accepting traffic | Monitor in dashboard |
| `stopped` | Manually stopped, environments intact | Click Deploy to restart |
| `error` | One or more nodes failed | Check node-level errors |
### Debugging Routing
If the orchestrator is routing incorrectly:
- Open the Traces tab in the workflow detail view
- Filter by the misrouted intent
- Inspect the orchestrator's reasoning in the trace:
  - What intent did it classify?
  - Which MCP tool did it call?
  - What was the system prompt at the time?
- Adjust the edge condition descriptions and redeploy
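When traces can be exported for offline analysis, a quick way to spot misrouting is to tabulate classified intents against the MCP tools actually called. The record shape below is a hypothetical export format, assumed purely for illustration:

```python
from collections import Counter

# Hypothetical exported trace records. The Observer UI exposes the
# same information (classified intent, MCP tool called), but this
# dict shape is an assumption, not a documented export format.
traces = [
    {"intent": "billing", "tool": "route_to_billing"},
    {"intent": "support", "tool": "route_to_support"},
    {"intent": "billing", "tool": "route_to_support"},  # suspicious mismatch
]

# Count how often each classified intent led to each tool call.
# Mismatched (intent, tool) pairs point at overlapping or vague
# edge condition descriptions.
pairs = Counter((t["intent"], t["tool"]) for t in traces)
for (intent, tool), n in sorted(pairs.items()):
    print(f"{intent} -> {tool}: {n}")
```

A clean workflow shows each intent mapping to exactly one tool; any off-diagonal pair is a candidate for tightening the condition descriptions.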
## Workflow Passports and Attestation
Each active workflow generates a composite AI Passport that bundles attestations from all participating nodes.
### Viewing the Workflow Passport
- Open the workflow in Observer
- Click the Passport tab
- The composite passport shows:
  - Graph hash -- Cryptographic proof that the deployed topology matches the workflow definition
  - Node attestations -- Individual TEE attestations and auditor status for each node
  - Edge conditions -- The routing logic is included as evidence
### Verifying Programmatically

```bash
# Get the workflow passport
lucid passport show --workflow wf-customer-support

# Verify it
lucid passport verify pass-wf-abc123
```
The verification checks:
- Each node's TEE attestation is valid and current
- The graph hash matches the declared workflow topology
- All auditor chains are active and passing
> **Composite trust**
>
> A workflow passport proves not just that individual services are secure, but that the composition of services matches the declared architecture. Relying parties can verify that requests flow through the correct audit chains.
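As an illustration of the general technique, a graph hash can be computed as a digest over a canonical serialization of the topology. The canonicalization below is an assumption; the platform's actual scheme is not specified in this guide:

```python
import hashlib
import json

# Illustrative graph hash: SHA-256 over a canonical JSON serialization
# of node IDs and edges. Sorting makes the hash independent of the
# order in which nodes and edges were declared. The real platform's
# canonicalization scheme may differ.

def graph_hash(workflow):
    canonical = json.dumps(
        {
            "nodes": sorted(n["id"] for n in workflow["nodes"]),
            "edges": sorted((e["from"], e["to"]) for e in workflow["edges"]),
        },
        sort_keys=True,
        separators=(",", ":"),
    )
    return hashlib.sha256(canonical.encode()).hexdigest()
```

Because nodes and edges are sorted before hashing, two definitions that describe the same topology in a different order produce the same hash, which is what lets a relying party recompute the hash from the declared definition and compare it against the passport.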
## Next Steps
- Workflows concept page -- Deeper dive into workflow JSON schema and compilation
- Deployment Types -- Understanding `model`, `app`, and `bridge` types
- MCP -- How inter-service tool calls work
- AI Passports -- Passport verification details