Flows API
The Flow surface spans three routers in the product backend:
- `/api/pipeline/*` — running recipes, run history, observability, anomaly detection.
- `/api/flow/*` (flow_codegen) — visual ↔ SQL recipe round-tripping, zone management.
- `/api/flow/*` (flow_ai) — agent recipes and AI-driven node placement.
For the conceptual model — node types, zones, run modes, lineage — see Flow. The endpoints below cover programmatic access.
Run history
- `/api/pipeline/runs` 🔒 auth — List recent pipeline runs. Filterable by `status` (`running`, `success`, `failed`, `cancelled`), `recipe_type`, `since`. Default sort: `created_at` desc.
- `/api/pipeline/runs/{run_id}/details` 🔒 auth — Full details for a single run: parameters, compiled SQL, stdout, downstream lineage, exit status. Used by the run-detail panel in the UI.
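A minimal client sketch for the run-history endpoint, using only the standard library. The host, token, and the exact query-string names are taken from the filters documented above; everything else (`BASE`, the placeholder JWT) is an assumption you'd replace with your deployment's values.

```python
import json
import urllib.parse
import urllib.request

BASE = "https://flows.example.invalid"   # placeholder host -- substitute your own
TOKEN = "<JWT>"                          # all endpoints require a valid JWT

def run_filters(status=None, recipe_type=None, since=None):
    """Build the query-string filters accepted by /api/pipeline/runs,
    dropping any filter that wasn't supplied."""
    raw = {"status": status, "recipe_type": recipe_type, "since": since}
    return {k: v for k, v in raw.items() if v is not None}

def list_runs(**filters):
    """List recent runs (server default sort: created_at desc)."""
    qs = urllib.parse.urlencode(run_filters(**filters))
    req = urllib.request.Request(
        f"{BASE}/api/pipeline/runs?{qs}",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

For example, `list_runs(status="failed", since="2024-01-01")` would fetch only failed runs since that date.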
Triggering runs
- `/api/pipeline/run` 🔒 auth — Trigger a run. Body: `{target: {type, name}, scope: 'single'|'upstream'|'downstream'|'full'}`. Returns the new `run_id` immediately; runs are async — poll `/api/pipeline/runs/{run_id}/details` for status.
The four scopes:
| Scope | Builds |
|---|---|
| single | Just the target recipe. |
| upstream | Target + every upstream recipe needed to bring its inputs up-to-date. |
| downstream | Target + every downstream recipe that consumes its output. |
| full | Full lineage — upstream and downstream. |
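Triggering a run can be sketched as follows. The body shape and scope values come straight from the endpoint description above; the host, token, and the `"recipe"` target type used in the usage example are illustrative assumptions.

```python
import json
import urllib.request

BASE = "https://flows.example.invalid"   # placeholder host
TOKEN = "<JWT>"
SCOPES = {"single", "upstream", "downstream", "full"}

def run_payload(target_type, target_name, scope="single"):
    """Request body for /api/pipeline/run, shaped as documented above."""
    if scope not in SCOPES:
        raise ValueError(f"scope must be one of {sorted(SCOPES)}")
    return {"target": {"type": target_type, "name": target_name}, "scope": scope}

def trigger_run(target_type, target_name, scope="single"):
    """Trigger a run and return the new run_id. Runs are async, so poll
    /api/pipeline/runs/{run_id}/details afterwards for status."""
    body = json.dumps(run_payload(target_type, target_name, scope)).encode()
    req = urllib.request.Request(
        f"{BASE}/api/pipeline/run",
        data=body,
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["run_id"]
```

Validating the scope client-side fails fast on typos instead of waiting for a server-side rejection.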
Observability
- `/api/pipeline/api-health` 🔒 auth — Per-router health summary. Returns each router's recent error rate and median latency. Used by the platform's status dashboard.
- `/api/pipeline/anomalies` 🔒 auth — Active data-quality anomalies — rule violations detected on the last run of each dataset. Filterable by `severity`, `dataset_name`, `rule_id`.
- `/api/pipeline/anomalies/types` 🔒 auth — Distinct anomaly types with counts.
- `/api/pipeline/anomalies/rules` 🔒 auth — Configured data-quality rules — assertion type, target column, threshold.
- `/api/pipeline/anomalies/tolerances` 🔒 auth — Per-rule tolerances (e.g. allow a 1% null rate before flagging). Used to quiet noisy rules during a known issue.
- `/api/pipeline/anomalies/tolerances/{check_id}` 🔒 auth — Update the tolerance for a specific rule.
- `/api/pipeline/anomalies/{alert_id}/dismiss` 🔒 auth — Dismiss a single anomaly without resolving it (acknowledged; won't re-surface until conditions change).
- `/api/pipeline/anomalies/{alert_id}/resolve` 🔒 auth — Mark an anomaly as resolved (the underlying issue was fixed).
- `/api/pipeline/anomalies/bulk-dismiss` 🔒 auth — Dismiss many anomalies at once. Body: `{alert_ids: [...]}`.
- `/api/pipeline/anomalies/bulk-resolve` 🔒 auth — Resolve many anomalies at once.
- `/api/pipeline/row-audit` 🔒 auth — Per-row freshness audit — which datasets refreshed in the last N hours, the row-count delta, and the dbt run that produced the refresh.
- `/api/pipeline/dbt/status` 🔒 auth — Status of the most recent dbt run across all tenants — per-model success/failure and total duration.
- `/api/pipeline/freshness` 🔒 auth — Per-dataset freshness — `last_refresh_at`, `freshness_threshold`, and a derived `is_fresh` boolean. Drives the "stale data" badges in the Flow.
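When dismissing a large backlog of anomalies, batching the IDs keeps request bodies small. A sketch, with two stated assumptions: the 100-per-request batch size is illustrative (the doc states no server limit), and `post(path, body)` is a caller-supplied authenticated POST helper rather than a documented client.

```python
def chunked(alert_ids, size=100):
    """Split alert_ids into batches. The 100-per-request batch size is an
    assumption for illustration, not a documented server limit."""
    return [alert_ids[i:i + size] for i in range(0, len(alert_ids), size)]

def bulk_dismiss(alert_ids, post, size=100):
    """Dismiss anomalies in batches via /api/pipeline/anomalies/bulk-dismiss.
    `post(path, body)` is a caller-supplied helper that POSTs JSON with the
    Authorization header set."""
    for batch in chunked(alert_ids, size):
        post("/api/pipeline/anomalies/bulk-dismiss", {"alert_ids": batch})
```

The same pattern works unchanged against `/api/pipeline/anomalies/bulk-resolve`.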
Visual ↔ SQL round-trip
- `/api/flow/generate-sql` 🔒 auth — Compile a visual recipe (Prepare / Join / Group By / Stack) to dbt SQL. Body: the recipe's structured config. Returns the compiled SQL plus a list of inputs and inferred output columns.
- `/api/flow/parse-to-visual` 🔒 auth — Inverse of generate-sql — parse dbt SQL into a structured visual recipe config when possible. Returns null if the SQL doesn't fit any visual pattern (e.g. a window function with multiple partitions).
- `/api/flow/visual-lineage` 🔒 auth — Given a visual recipe config, return the column-level lineage edges it would emit when compiled. Used to preview lineage before saving.
- `/api/flow/preview-sql` 🔒 auth — Compile and run the recipe against a sample of input rows. Returns the result rows without persisting anything. Used by the recipe editor's Preview button.
- `/api/flow/recipe/{model_name}` 🔒 auth — Fetch the structured config for a recipe (visual structure or SQL body) by its dbt model name.
- `/api/flow/save-recipe-steps` 🔒 auth — Persist a structured recipe config. Triggers SQL compilation server-side and writes the dbt model file.
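A round-trip check — compile a visual config to SQL, parse it back, and compare — is a useful sanity test before saving. In this sketch, `post(path, body)` is a caller-supplied authenticated POST helper, and the `"sql"` response key is an assumption about the generate-sql payload shape (the doc says it returns the compiled SQL but not the field name).

```python
def round_trips(visual_config, post):
    """Return True if a visual recipe config survives the
    generate-sql / parse-to-visual round-trip unchanged."""
    compiled = post("/api/flow/generate-sql", visual_config)
    back = post("/api/flow/parse-to-visual", {"sql": compiled["sql"]})
    # parse-to-visual returns null (None) when the SQL fits no visual pattern
    return back is not None and back == visual_config
```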
Zones
- `/api/flow/zones` 🔒 auth — List zones in the current project.
- `/api/flow/zones` 🔒 auth — Create a zone. Body: `{name, color}`.
- `/api/flow/zones/{template_id}` 🔒 auth — Rename or recolor a zone.
- `/api/flow/zones/{template_id}` 🔒 auth — Delete a zone. Nodes assigned to it become unassigned.
- `/api/flow/zones/assign` 🔒 auth — Assign a node to a zone. Body: `{model_name, zone_id}`. Used by drag-and-drop on the Flow canvas.
- `/api/flow/zones/auto-assign` 🔒 auth — Auto-assign nodes to zones using the platform's heuristics (cluster by upstream connector, group raw inputs together). Returns the proposed assignments without committing them; the client confirms.
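Since auto-assign only proposes, committing means replaying each proposal through the assign endpoint. A sketch, assuming the proposal rows carry the same `model_name`/`zone_id` fields as the assign body documented above, and with `assign(body)` as a caller-supplied POST helper:

```python
def confirm_assignments(proposals, assign):
    """Commit /api/flow/zones/auto-assign proposals one node at a time via
    /api/flow/zones/assign. Returns the number of nodes assigned."""
    count = 0
    for proposal in proposals:
        assign({"model_name": proposal["model_name"],
                "zone_id": proposal["zone_id"]})
        count += 1
    return count
```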
AI / Agent recipes
- `/api/flow/ai-nodes` 🔒 auth — List AI-recipe nodes in the project. Filterable by `recipe_type` (`embed`, `llm_enrich`, `agent`, `rag_search`).
- `/api/flow/ai-recipes` 🔒 auth — Create an AI recipe. Body includes `recipe_type` (one of `embed`, `llm_enrich`, `agent`, `rag_search`), `inputs`, `run_config`. Idempotent — a duplicate `(project_id, model_name, recipe_type)` is a no-op.
- `/api/flow/ai-recipes` 🔒 auth — List the project's AI recipes.
- `/api/flow/ai-recipes/{recipe_id}` 🔒 auth — Full config for an AI recipe.
- `/api/flow/ai-recipes/{recipe_id}` 🔒 auth — Update an AI recipe's config or `run_config`.
- `/api/flow/ai-recipes/{recipe_id}` 🔒 auth — Delete an AI recipe.
- `/api/flow/ai-recipes/{recipe_id}/run` 🔒 auth — Trigger a run of an AI recipe. Returns the `run_id`; poll `/api/flow/ai-recipes/{recipe_id}/runs` for status.
- `/api/flow/ai-recipes/{recipe_id}/runs` 🔒 auth — Run history for an AI recipe.
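Building the create body can be sketched as a small helper. The field names and recipe types come from the create endpoint above; validating `recipe_type` client-side is an illustrative choice, not a server requirement. Because creation is idempotent on `(project_id, model_name, recipe_type)`, retrying a create with the same payload is safe.

```python
AI_RECIPE_TYPES = {"embed", "llm_enrich", "agent", "rag_search"}

def ai_recipe_payload(recipe_type, inputs, run_config=None):
    """Body for creating an AI recipe via /api/flow/ai-recipes."""
    if recipe_type not in AI_RECIPE_TYPES:
        raise ValueError(f"recipe_type must be one of {sorted(AI_RECIPE_TYPES)}")
    return {"recipe_type": recipe_type,
            "inputs": inputs,
            "run_config": run_config or {}}
```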
Lineage
For lineage queries see Flow → Lineage and the /api/lineage/* surface in the OpenAPI spec.
Auth and access
All endpoints require a valid JWT. The legacy layer gates writes on `require_role("editor", "admin")`; the new layer (planned) will use object-level permissions on the recipe's output dataset.
Run lifecycle
A run progresses through these states:
queued → running → (success | failed | cancelled)
Run status updates are written to pipeline_runs and visible via the /api/pipeline/runs/{run_id}/details endpoint. The frontend polls every second; for high-volume programmatic access, consider polling once every 5–10 seconds and using the run's updated_at timestamp to short-circuit unchanged states.
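The recommended polling pattern above can be sketched as a loop that short-circuits on an unchanged `updated_at`. Here `fetch_details()` is a caller-supplied function that GETs `/api/pipeline/runs/{run_id}/details`; the `status` and `updated_at` field names are taken from this doc.

```python
import time

TERMINAL = {"success", "failed", "cancelled"}

def wait_for_run(fetch_details, interval=5.0, timeout=3600.0):
    """Poll run details until a terminal state, skipping re-processing of
    payloads whose updated_at hasn't moved since the last poll."""
    last_seen = object()  # sentinel: the first payload always counts as new
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        details = fetch_details()
        if details["updated_at"] != last_seen:
            last_seen = details["updated_at"]
            if details["status"] in TERMINAL:
                return details
        time.sleep(interval)
    raise TimeoutError("run did not reach a terminal state in time")
```

The 5-second default matches the 5–10 second cadence suggested above for programmatic access.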
There is no streaming run-log endpoint today. The full stdout becomes available once the run completes; partial output during a long-running recipe is not exposed via the API. (UI-side, the recipe editor reads the platform's internal log buffer directly — that's not a public surface.)