# Changelog
All notable changes to wf are documented here. This project follows Semantic Versioning and Conventional Commits.
## [0.2.0] — 2026-03-13
Expands wf from a basic sequential workflow runner into a full infrastructure automation runtime. The execution engine, storage layer, and DAG model were substantially redesigned. Six new CLI commands, three execution modes, and a formal security model were added. The logger and config subsystems were refactored.
### Added

#### CLI Commands (new)

- `wf inspect <run-id>` — structured run details: registered variables, task timings, exit codes, forensic logs
- `wf status <run-id>` — live polling of an in-progress run with periodic refresh
- `wf audit <run-id>` — chronological, tamper-evident event trail for a run
- `wf diff <run-id-a> <run-id-b>` — side-by-side comparison of task outcomes and registered variables between two runs
- `wf export <run-id>` — export a full run record as `--format json` or `--format tar` (tar includes per-task log files)
- `wf health` — system health check: database connectivity, schema version, workflow directory, disk space
#### Execution Modes

- Parallel (`--parallel`, `--max-parallel N`) — level-locked concurrency; tasks within the same topological level run concurrently, bounded by a semaphore (default 4)
- Work-stealing (`--work-stealing`) — dependency-driven dispatch; a task is dispatched the moment all of its declared `depends_on` tasks complete, regardless of level — maximum throughput for sparse DAGs
- `--timeout <duration>` — per-run wall-clock limit; cancels the entire process group on expiry (context-aware: retry delays are also cancelled immediately)
- `--print-output` — buffer and emit task stdout/stderr atomically after completion (capped at 64 KiB on the progress channel; full content always written to the log file)
- `--var KEY=VALUE` — inject variables at run time without modifying the workflow file
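
A minimal sketch of the level-locked dispatch described above, independent of wf's actual internals: each level's tasks run concurrently under a counting semaphore, and a `WaitGroup` forms the level barrier. The name `runLevel` is illustrative, not wf's API.

```go
package main

import (
	"fmt"
	"sync"
)

// runLevel executes every task in one topological level concurrently,
// bounded by a counting semaphore (a buffered channel); the WaitGroup
// is the level barrier, so the next level cannot start until this one
// settles.
func runLevel(tasks []string, maxParallel int, run func(string)) {
	sem := make(chan struct{}, maxParallel)
	var wg sync.WaitGroup
	for _, t := range tasks {
		wg.Add(1)
		go func(task string) {
			defer wg.Done()
			sem <- struct{}{}        // acquire a slot
			defer func() { <-sem }() // release it when the task settles
			run(task)
		}(t)
	}
	wg.Wait()
}

func main() {
	done := make(chan string, 3)
	runLevel([]string{"build", "test", "lint"}, 2, func(t string) { done <- t })
	fmt.Println(len(done)) // 3
}
```

Work-stealing differs only in the dispatch trigger: instead of waiting for a whole level, each task would be released the moment its own pending-dependency counter reaches zero.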
#### Workflow Fields (new)

- `register` — capture the last non-empty stdout line of a task as a named variable, scoped to that task (consistent with the shell convention of `echo $RESULT` as the final command)
- `{{.varname}}` — safe regex-based variable interpolation in downstream `cmd` fields and `if` expressions (no `text/template` logic — by design)
- `if` — conditional task execution evaluated against runtime variable values; supports `==`, `!=`, `<`, `<=`, `>`, `>=`, `contains`, `starts_with`, `ends_with`, `matches`
- `matrix` — expand a single task definition into N independent nodes, one per parameter combination (e.g. three databases × five platforms = 15 nodes); matrix nodes are independent in the DAG and can run in parallel; matrix variable short names (e.g. `{{.env}}`) resolve inside the owning task without qualification
- `on_failure` (task-level) — wire a compensating `forensic` task to a specific task
- `on_failure` (workflow-level) — global failure handler that fires after the run settles
- `type = "forensic"` — tasks that run only on failure; excluded from normal DAG levels; receive `{{.failed_task}}` and `{{.error_message}}` automatically (Saga pattern)
- `ignore_failure` — continue workflow execution even if the task exits non-zero
- `retry_delay` — configurable delay between retry attempts (context-aware)
- `timeout` (per-task) — individual task execution limit; sends SIGKILL to the entire process group on expiry
- `env` — task-specific environment variable map
- `clean_env` / `--clean-env` — start the task with an empty environment rather than inheriting the parent process env
- `working_dir` — per-task working directory (validated: `/proc`, `/sys`, `/dev`, and null bytes are blocked)
- `tags` — workflow-level string array, stored as JSON; filterable via `wf runs --tag <tag>`
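
To make the new fields concrete, here is a hypothetical workflow sketch. The field names come from the list above, but the task names, commands, and exact value shapes (duration strings, the `if` expression form, how `on_failure` references its handler) are illustrative assumptions, not verified wf syntax.

```toml
# Hypothetical 0.2.0-era workflow; values and shapes are illustrative.
name = "release-check"
tags = ["demo"]
on_failure = "global-cleanup"        # workflow-level trap (shape assumed)

[tasks.detect]
cmd = "echo production"              # last non-empty stdout line is captured
register = "env_name"

[tasks.deploy]
depends_on = ["detect"]
if = "{{.env_name}} == production"   # condition on a registered variable
cmd = "./deploy.sh {{.env_name}}"
retries = 2
retry_delay = "5s"
timeout = "10m"
env = { DEPLOY_REGION = "eu-west-1" }
on_failure = "collect-state"         # task-level trap

[tasks.collect-state]
type = "forensic"                    # runs only when its owner fails
cmd = "echo {{.failed_task}}: {{.error_message}}"

[tasks.global-cleanup]
type = "forensic"
cmd = "./cleanup.sh"
```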
#### Storage (`internal/storage/` — replaces `internal/run/`)

- Full rewrite of the persistence layer: SQLite with WAL mode enabled
- KSUID run IDs (sortable, collision-free, URL-safe base62) — replaces simple integer IDs
- Versioned, forward-only schema migrations applied at startup
- Append-only `audit_trail` table — every state transition recorded with timestamp, immutable after write
- `context_snapshots` table — variable state persisted after each task; consumed by `wf resume` to restore the `ContextMap`
- `forensic_logs` table — output from forensic task executions stored separately
- `task_dependencies` table — DAG edges stored for `wf graph` and `wf diff`
- `dag_cache` table — DAG content hash cached to detect workflow file changes between runs and resumes
- All SQL uses parameterized queries — no string interpolation into queries
- Database file pre-created at mode `0600` before SQLite opens it
#### DAG Layer (`internal/dag/` — redesigned)

- `parser.go` — TOML → `WorkflowDefinition`; validates workflow name (path traversal prevention), `working_dir`, and `env` key names at parse time
- `builder.go` — `WorkflowDefinition` → `DAG`; expands matrix tasks, wires dependency edges, runs cycle detection (Kahn's algorithm), assigns topological levels → `DAG{Levels [][]*Node}`; populates `DAG.ForensicTasks` for task-level trap lookup
- `serialize.go` — DAG serialisation for `wf graph` output formats (HTML, ASCII, DOT, Mermaid, JSON)
- `Node` — all mutable fields (`State`, `Output`, `ExitCode`) protected by `sync.RWMutex` via thread-safe methods (`MarkRunning`, `MarkSuccess`, `MarkFailed`, `MarkSkipped`, `MarkEarlyFailed`, `Reset`, `GetState`)
- `MarkEarlyFailed()` — marks nodes whose dependencies failed as `NodeStateFailed`, allowing the executor to skip them cleanly
- `DAG.ForensicTasks map[string]*Node` — task-level forensic tasks registered here during build so the executor can look them up by name without polluting the normal node graph
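
The level assignment above can be sketched with Kahn's algorithm. This standalone version (not `builder.go` itself; `topoLevels` is a made-up name, and it works on task-name strings rather than `*Node`) repeatedly peels off all zero-in-degree tasks as one level and reports a cycle when no task is free.

```go
package main

import "fmt"

// topoLevels assigns each task to a topological level: level 0 has no
// dependencies, level N depends only on levels < N. deps maps a task to
// the tasks it depends on. A pass that frees no task means a cycle.
func topoLevels(deps map[string][]string) ([][]string, error) {
	indeg := map[string]int{}
	children := map[string][]string{}
	for task, ds := range deps {
		indeg[task] += 0 // ensure every task has an entry
		for _, d := range ds {
			indeg[task]++
			indeg[d] += 0
			children[d] = append(children[d], task)
		}
	}
	var levels [][]string
	remaining := len(indeg)
	for remaining > 0 {
		var level []string
		for task, d := range indeg {
			if d == 0 {
				level = append(level, task)
			}
		}
		if len(level) == 0 {
			return nil, fmt.Errorf("cycle detected")
		}
		for _, task := range level {
			delete(indeg, task)
			for _, c := range children[task] {
				indeg[c]--
			}
			remaining--
		}
		levels = append(levels, level)
	}
	return levels, nil
}

func main() {
	levels, err := topoLevels(map[string][]string{
		"build": nil, "test": {"build"}, "lint": {"build"}, "deploy": {"test", "lint"},
	})
	fmt.Println(len(levels), err) // 3 <nil>
}
```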
#### Executor Layer (`internal/executor/` — expanded)

- `Executor` interface: `Execute(ctx, dag, ctxMap)`, `Resume(ctx, runID)`, `GetStore()` — all three implementations conform to the same interface
- `sequential.go` — single-task-at-a-time, level-by-level (replaces the original monolithic executor)
- `parallel.go` — level-locked concurrency with semaphore (`--parallel`)
- `work_stealing.go` — dependency-driven dispatch (`--work-stealing`); goroutine per node, pending-dep counter, shared work queue
- `progress.go` — progress event system (`ProgressTaskStarted`, `ProgressTaskCompleted`, `ProgressTaskOutput`) feeding the `--print-output` renderer
- `doResume()` — shared resume logic: reload TOML, restore `ContextMap` from snapshots, pre-mark succeeded tasks as `NodeStateSuccess`, re-run remaining tasks
- Forensic trap wiring: task-level `on_failure` fires immediately on task failure; workflow-level fires after the run settles
- `limitedBuffer` — caps stdout + stderr capture at 10 MiB per task; silently drops writes beyond the cap
- `resetDAGState` — resets all nodes, including forensic and global trap nodes — safe DAG re-use across multiple `Execute` calls
#### Logging (`internal/logger/`)

- Migrated from `go.uber.org/zap` to `log/slog` (standard library, Go 1.21+) — no external logger dependency
- Comprehensive structured logging across the entire execution pipeline: every event carries `run_id`, `workflow`, `task`, `task_name`, `attempt`, `duration_ms`, `exit_code`, and other contextual fields
- Key log points: `task started` / `task completed` / `retrying task`, `run started` / `run completed` with full stats, `database opened`, `run created`, `variable set` / conflict warnings, forensic trap lifecycle
#### New Packages

- `internal/contextmap/` — thread-safe variable registry; `Set(taskID, name, value)`, `InterpolateCommand(taskID, cmd)`, `EvalCondition(expr)`, `Snapshot()`, `Restore(data)`. Regex-only `{{.var}}` substitutor — `text/template` was removed intentionally to prevent template injection
- `internal/security/` — centralised security validation: `ValidateWorkflowName()` (path traversal, null bytes, length), `ValidateWorkingDir()` (blocks `/proc`, `/sys`, `/dev`), `ValidateVariableName()` (alphanumeric + allowed symbols), `ValidateEnvKey()` (POSIX rules + deny-list for `LD_PRELOAD`, `LD_LIBRARY_PATH`, `DYLD_*`, and other dynamic-linker vars)
- `internal/tty/` — terminal detection for ANSI colour suppression; respects `NO_COLOR`, `TERM=dumb`, and non-TTY stdout
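
The regex-only substitution approach can be sketched as follows, assuming nothing about `internal/contextmap/` beyond what the bullet above says: placeholders are replaced by plain string lookup, so a value can never be evaluated as template logic. The function name and the exact pattern are illustrative.

```go
package main

import (
	"fmt"
	"regexp"
)

// varPattern matches {{.name}} placeholders with identifier-like names.
var varPattern = regexp.MustCompile(`\{\{\.([A-Za-z_][A-Za-z0-9_]*)\}\}`)

// interpolate replaces each known {{.var}} with its value via plain
// string substitution — no text/template, so injected values are inert.
// Unknown placeholders are left untouched.
func interpolate(cmd string, vars map[string]string) string {
	return varPattern.ReplaceAllStringFunc(cmd, func(m string) string {
		name := varPattern.FindStringSubmatch(m)[1]
		if v, ok := vars[name]; ok {
			return v
		}
		return m
	})
}

func main() {
	out := interpolate("deploy.sh {{.env}} {{.missing}}", map[string]string{"env": "prod"})
	fmt.Println(out) // deploy.sh prod {{.missing}}
}
```

Because replacement is literal, even a value like `{{.other}}` or `{{if}}` arrives in the command as inert text rather than something a template engine would execute.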
#### Configuration

- Config file paths set to platform-specific XDG/OS defaults: `~/.config/workflow/config.yaml` (Linux), `~/Library/Application Support/workflow/config.yaml` (macOS), `%AppData%\workflow\config.yaml` (Windows)
- Data directory defaults: `~/.cache/workflow/` (Linux), `~/Library/Caches/workflow/` (macOS), `%LocalAppData%\workflow\` (Windows)
- `config.example.yaml` documents all `WF_*` environment variable overrides
#### Test Suites

- `tests/security/security_test.go` — 22 security test functions covering path traversal, template injection, env-key deny-list, `working_dir` restrictions, output buffer cap, file permissions, SQL parameterization; must pass with `-race`
- `tests/integration/scenarios_test.go` — 15 scenario tests covering register last-line capture, matrix variable interpolation, working-dir pre-existence, task-level forensic traps, global forensic traps, `ignore_failure` continuation, condition skip/run, resume skipping succeeded tasks, parallel vs sequential equivalence, work-stealing diamond pattern, timeout, retry attempt count, `clean_env`, and concurrent-runs stress (race detector guard)
- `tests/examples/examples_test.go` — `TestExamplesValidate` parses and builds all 12 example workflows on every CI run; `TestExamplesRun` executes them end-to-end (skipped under `go test -short`)
- `internal/storage/store_test.go` — run/task CRUD, filter-by-status/tag/name/limit, migration idempotency
- `tests/benchmarks/` — performance benchmarks for DAG construction, execution modes, and storage operations
- Existing e2e and integration tests updated for the new storage layer and executor interface
#### Documentation

- Full documentation site (MkDocs + Material theme) — 53 pages covering getting started, concepts, CLI reference, guides, examples, security model, and architecture
- ReadTheDocs configuration (`.readthedocs.yaml`)
- GitHub Actions workflow for automated GitHub Pages deployment (`.github/workflows/docs.yml`)
- Twelve production-grade example workflows in `files/examples/` covering all major features (filenames unified with their `name` field — no numeric prefixes)
- `files/examples/GUIDE.md` — feature coverage matrix and per-example run commands
#### CI / Quality

- `.golangci.yml` — curated linter set: `errcheck`, `staticcheck`, `gosec`, `gocritic`, `bodyclose`, `noctx`, `ineffassign`, `unconvert`, `unparam`
- CI `lint` job — runs golangci-lint v2 on every push/PR
- CI `example validation` step — runs `TestExamplesValidate` on every push/PR across all three OS targets
- CI `benchmarks` step — runs all benchmarks on every push/PR
#### Release

- GoReleaser ldflags version injection wired through `main.go` → `cmd.SetVersionInfo()` — `wf --version` now shows build version, commit hash, and build date
- SBOM generation via Syft integrated into the release workflow
- `-j` short flag added for `--json` on `wf validate` and `wf run`
#### Branding & Organisation

- Repository migrated to `github.com/silocorp/workflow` — Go module path, all imports, install script, GoReleaser config, docs, and CI updated
- Placeholder logo (`docs/assets/logo.svg`) added; used as both logo and favicon in docs
- README redesigned with centered header, logo, and tagline
### Changed

#### Config refactor (`internal/config/`)

- Removed the package-level global `config.C`
- All config access now goes through `config.Get()`, which returns the singleton `Config` struct
- Eliminates the race condition that was possible when `config.C` was written during init and read from multiple goroutines
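
The global-to-accessor pattern can be sketched like this; it assumes only what the bullets above state (`Get()` returning a singleton `Config`), with the loading logic and fields illustrative.

```go
package main

import (
	"fmt"
	"sync"
)

// Config holds runtime settings; the fields here are placeholders.
type Config struct {
	LogLevel string
}

var (
	once sync.Once
	cfg  *Config
)

// Get lazily initialises the singleton exactly once. Unlike a writable
// package-level global, concurrent callers can never observe a
// half-written value: sync.Once publishes cfg with a happens-before edge.
func Get() *Config {
	once.Do(func() {
		cfg = &Config{LogLevel: "info"} // load defaults/file/env here
	})
	return cfg
}

func main() {
	fmt.Println(Get() == Get()) // true: every caller gets the same instance
}
```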
#### DAG package restructure

- `internal/dag/loader.go` — removed; loading logic merged into `dag/parser.go`
- `internal/dag/topo.go` — removed; topological sort moved into `dag/builder.go` as part of the `Build()` pipeline
- `internal/dag/validate.go` — removed; validation integrated into `builder.go` and `security/validate.go`
- `internal/dag/dag_test.go` — removed; replaced by `builder_test.go`
#### Storage package replacement

- `internal/run/` package (`model.go`, `store.go`, `run_test.go`) removed and replaced by `internal/storage/` with full WAL-mode SQLite, versioned migrations, and the expanded schema above
## [0.1.0] — 2026-02-03
Initial release of wf — a minimal, deterministic workflow orchestrator for local-first execution.
### Added

#### Core
- Single static Go binary, no CGo, no runtime dependencies
- TOML-based workflow definitions with strict DAG semantics
- Topological execution order (Kahn's algorithm)
- Deterministic, fail-fast task execution
- Per-task configurable retries
- Graceful cancellation via Ctrl+C (SIGINT/SIGTERM handling)
#### CLI Commands

- `wf init` — initialise workspace directories, database, and default config file
- `wf validate [workflow]` — validate workflow definitions (all or single), with `--json` output
- `wf run <workflow>` — execute a workflow, with `--dry-run` support
- `wf resume <run-id>` — resume a failed run from the point of failure, skipping succeeded tasks
- `wf list` — list available workflows
- `wf runs` — list run history with `--workflow`, `--status`, `--limit`, and `--json` filters
- `wf logs <run-id> [--task <task-id>]` — view per-task and per-run logs
- `wf graph <workflow>` — display DAG structure with `--detail` and `--format ascii` options
#### Persistence
- SQLite run history — one record per run, one record per task execution
- Per-task structured logs captured and stored on disk
- Run state committed on every transition (start, success, fail, retry) for crash recovery
#### Workflow Schema

- `name` — required workflow identifier
- `[tasks.<id>]` — task definition sections
- `cmd` — shell command to execute
- `depends_on` — list of upstream task dependencies
- `retries` — number of retry attempts (default: 0)
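
A minimal workflow using only this schema might look like the following; the workflow name, task names, and commands are illustrative.

```toml
# Minimal 0.1.0-era workflow: name, tasks, cmd, depends_on, retries.
name = "build-and-test"

[tasks.build]
cmd = "go build ./..."

[tasks.test]
cmd = "go test ./..."
depends_on = ["build"]
retries = 1
```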
#### Configuration
- Viper-based config with YAML config file, environment variables, and CLI flag overrides
- Platform-specific default paths (XDG on Linux, Application Support on macOS, AppData on Windows)
- `--config <path>`, `--log-level`, `--verbose`, and `--version` global flags
#### Validation Rules

- Workflow must have a `name`
- Tasks must have a `cmd`
- Task names: alphanumeric, hyphen, and underscore only
- No duplicate task names
- No missing dependencies
- No cycles in the DAG
- At least one task required
#### Cross-Platform

- Linux, macOS, and Windows support
- Pre-built binaries via GitHub Releases (`.github/workflows/release.yml`)