Overview
PyWorkflow includes a powerful CLI for managing workflows and monitoring runs directly from your terminal. The CLI provides commands to list, inspect, and execute workflows, as well as monitor their execution status and event logs.
Quickstart: Create a new project with sample workflows in seconds.
Workflow Management: List, inspect, and run workflows from the command line.
Run Monitoring: Check run status, view event logs, and debug executions.
Schedule Management: Create, manage, and monitor automated workflow schedules.
Scheduler: Run the local scheduler for triggering due schedules.
Worker Management: Start and manage Celery workers for distributed execution.
Installation
The CLI is included with PyWorkflow and available as the pyworkflow command:
pip install pyworkflow
pyworkflow --version
Global Options
These options apply to all commands:
| Option | Environment Variable | Description |
| --- | --- | --- |
| --module | PYWORKFLOW_MODULE | Python module to import for workflow discovery |
| --runtime | PYWORKFLOW_RUNTIME | Execution runtime: local (in-process) or celery (distributed). Default: celery |
| --storage | PYWORKFLOW_STORAGE_BACKEND | Storage backend: file or memory (default: file) |
| --storage-path | PYWORKFLOW_STORAGE_PATH | Path for file storage (default: ./workflow_data) |
| --output | - | Output format: table, json, or plain (default: table) |
| --verbose, -v | - | Enable verbose logging |
| --version | - | Show version information |
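Global options go before the subcommand and can be combined, for example:
pyworkflow --module myapp.workflows --output json --verbose runs list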
Configuration
Configuration File
Create a pyworkflow.config.yaml file in your project directory:
# pyworkflow.config.yaml
module: myapp.workflows
runtime: celery  # or "local"
storage:
  backend: file
  path: ./workflow_data
celery:
  broker: redis://localhost:6379/0
  result_backend: redis://localhost:6379/1
The YAML config file is the recommended approach. Place it in your working directory
and both the CLI and your Python code will automatically use it.
PyWorkflow also supports TOML configuration files (searched in order, walking up the directory tree):
pyworkflow.toml
.pyworkflow.toml
pyproject.toml (under [tool.pyworkflow] section)
TOML Configuration Examples
pyworkflow.toml
pyproject.toml
module = "myapp.workflows"
runtime = "celery"
[ storage ]
backend = "file"
path = "./workflow_data"
[ celery ]
broker = "redis://localhost:6379/0"
result_backend = "redis://localhost:6379/1"
[tool.pyworkflow]
module = "myapp.workflows"
runtime = "celery"

[tool.pyworkflow.storage]
backend = "file"
path = "./workflow_data"

[tool.pyworkflow.celery]
broker = "redis://localhost:6379/0"
result_backend = "redis://localhost:6379/1"
Priority Resolution
Configuration values are resolved in this order (highest to lowest priority):
| Priority | Source | Example |
| --- | --- | --- |
| 1 (highest) | CLI flags | --module myapp.workflows |
| 2 | Environment variables | PYWORKFLOW_MODULE=myapp.workflows |
| 3 | Config file | pyworkflow.config.yaml |
| 4 (lowest) | Defaults | runtime: local, durable: false |
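For example, when the same setting is supplied at two levels, the higher-priority source wins:
# Environment variable sets the module...
export PYWORKFLOW_MODULE=old.workflows
# ...but the CLI flag takes precedence
pyworkflow --module myapp.workflows workflows list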
Workflow Discovery
When you run pyworkflow worker run or other CLI commands, PyWorkflow needs to discover
and import your workflow modules. This happens in the following priority order:
Discovery Priority
| Priority | Source | Example |
| --- | --- | --- |
| 1 (highest) | --module flag | pyworkflow --module myapp.workflows worker run |
| 2 | PYWORKFLOW_DISCOVER env var | PYWORKFLOW_DISCOVER=myapp.workflows pyworkflow worker run |
| 3 (lowest) | pyworkflow.config.yaml | module: myapp.workflows in config file |
How Discovery Works
1. Module Import: PyWorkflow imports the specified Python module(s)
2. Decorator Registration: when the module loads, @workflow and @step decorators automatically register functions in the global registry (see the sketch below)
3. Project Root Detection: PyWorkflow automatically finds your project root (by looking for pyproject.toml, setup.py, or .git) and adds it to the Python path
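As a sketch of step 2, a discoverable module just defines decorated functions; importing the module is what registers them. The import path and decorator signatures below are assumptions for illustration, not the exact PyWorkflow API:
# myapp/workflows.py -- hypothetical sketch; the import path and
# decorator signatures are assumptions, not the exact API
from pyworkflow import step, workflow

@step
def send_welcome_email(user_id: str) -> str:
    # Registered in the global registry when this module is imported
    return f"sent welcome email to {user_id}"

@workflow
def onboarding_workflow(user_id: str) -> str:
    # Visible to `workflows list` once discovery imports this module
    return send_welcome_email(user_id)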
# Create pyworkflow.config.yaml in your project
cd myproject/
cat > pyworkflow.config.yaml << EOF
module: myapp.workflows
runtime: celery
storage:
  backend: file
  path: ./workflow_data
celery:
  broker: redis://localhost:6379/0
EOF
# Now just run - config is auto-detected
pyworkflow worker run
pyworkflow --module myapp.workflows worker run
export PYWORKFLOW_DISCOVER=myapp.workflows
pyworkflow worker run
# Or inline
PYWORKFLOW_DISCOVER=myapp.workflows pyworkflow worker run
Multiple Modules
You can discover workflows from multiple modules:
Config File
Environment Variable
# pyworkflow.config.yaml
modules:
  - myapp.workflows
  - myapp.tasks
  - myapp.handlers
# Comma-separated list
PYWORKFLOW_DISCOVER=myapp.workflows,myapp.tasks pyworkflow worker run
Commands
Workflow Commands
Manage and execute registered workflows.
workflows list
List all registered workflows:
pyworkflow --module myapp.workflows workflows list
Output Formats:
table (default): Shows Name, Max Duration, and Metadata columns
json: Array of workflow objects
plain: Simple list of workflow names
workflows info
Show detailed information about a specific workflow:
pyworkflow --module myapp.workflows workflows info my_workflow
Arguments:
| Argument | Required | Description |
| --- | --- | --- |
| WORKFLOW_NAME | Yes | Name of the workflow to inspect |
Output includes: Name, max duration, function details, module path, and docstring.
workflows run
Execute a workflow with optional arguments:
pyworkflow --module myapp.workflows workflows run my_workflow \
--arg user_id=123 --arg amount=50.00
Arguments:
| Argument | Required | Description |
| --- | --- | --- |
| WORKFLOW_NAME | Yes | Name of the workflow to run |
Options:
| Option | Description |
| --- | --- |
| --arg key=value | Workflow argument (repeatable, supports JSON values) |
| --args-json '{...}' | Workflow arguments as JSON object |
| --durable/--no-durable | Run in durable mode (default: durable) |
| --idempotency-key | Idempotency key for the execution |
Key-Value Arguments
JSON Arguments
pyworkflow workflows run order_process \
--arg order_id=12345 \
--arg amount=99.99 \
--arg items='["item1", "item2"]'
pyworkflow workflows run order_process \
--args-json '{"order_id": 12345, "amount": 99.99, "items": ["item1", "item2"]}'
Use --no-durable for quick, transient executions that don’t need persistence. Use --idempotency-key to prevent duplicate executions.
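For example, using the flags documented above:
# Quick transient run (not persisted)
pyworkflow workflows run my_workflow --no-durable --arg user_id=123
# Guard against duplicate executions
pyworkflow workflows run my_workflow --idempotency-key order-12345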
Run Commands
Monitor and debug workflow runs.
runs list
List workflow runs with optional filtering:
pyworkflow runs list --workflow my_workflow --status completed --limit 10
Options:
| Option | Description |
| --- | --- |
| --workflow | Filter by workflow name |
| --status | Filter by status: pending, running, suspended, completed, failed, cancelled |
| --limit | Maximum runs to display (default: 20) |
Output Formats:
table (default): Shows Run ID, Workflow, Status (color-coded), Started time, and Duration
json: Array of run objects with full details
plain: Simple list of Run IDs
runs status
Show detailed status of a specific run:
pyworkflow runs status run_abc123def456
Arguments:
| Argument | Required | Description |
| --- | --- | --- |
| RUN_ID | Yes | Workflow run identifier |
Output includes:
Run ID, Workflow name, Status
Created, Started, Completed timestamps
Duration
Input arguments
Result (if completed)
Error message (if failed)
runs logs
View the execution event log for a run:
pyworkflow runs logs run_abc123 --filter step_completed
Arguments:
| Argument | Required | Description |
| --- | --- | --- |
| RUN_ID | Yes | Workflow run identifier |
Options:
| Option | Description |
| --- | --- |
| --filter | Filter events by type (case-insensitive substring match) |
Event Types:
workflow_started, workflow_completed, workflow_failed, workflow_cancelled
step_started, step_completed, step_failed
sleep_started, sleep_resumed
hook_created, hook_received
cancellation_requested
runs cancel
Cancel a running or suspended workflow:
pyworkflow runs cancel run_abc123 --reason "User requested"
Arguments:
| Argument | Required | Description |
| --- | --- | --- |
| RUN_ID | Yes | Workflow run identifier |
Options:
| Option | Description |
| --- | --- |
| --wait/--no-wait | Wait for cancellation to complete (default: no-wait) |
| --timeout | Timeout in seconds when waiting (default: 30) |
| --reason | Reason for cancellation |
Examples:
# Cancel a workflow
pyworkflow runs cancel run_abc123
# Cancel with reason
pyworkflow runs cancel run_abc123 --reason "Customer cancelled order"
# Wait for cancellation to complete
pyworkflow runs cancel run_abc123 --wait --timeout 60
Cancellation is graceful - the workflow will stop at the next checkpoint (before a step, sleep, or hook), not immediately. See Cancellation for details.
Worker Commands
Manage Celery workers for distributed workflow execution.
worker run
Start a Celery worker to process workflow tasks:
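pyworkflow worker run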
Options:
| Option | Description |
| --- | --- |
| --workflow | Only process workflow orchestration tasks |
| --step | Only process step execution tasks |
| --schedule | Only process scheduled resumption tasks |
| --concurrency N | Number of worker processes (default: 1) |
| --loglevel LEVEL | Log level: debug, info, warning, error |
| --hostname NAME | Custom worker hostname |
| --beat | Also start Celery Beat scheduler |
| --autoscale MIN,MAX | Enable worker autoscaling (e.g., 2,10) |
| --max-tasks-per-child N | Replace worker child after N tasks |
| --prefetch-multiplier N | Task prefetch count per worker process |
| --time-limit SECONDS | Hard time limit for tasks |
| --soft-time-limit SECONDS | Soft time limit for tasks |
All Queues (Default)
Specialized Workers
Advanced Celery Options
# Start a worker processing all queues
pyworkflow worker run
# Terminal 1: Workflow orchestration
pyworkflow worker run --workflow
# Terminal 2: Step execution (scale this for heavy work)
pyworkflow worker run --step --concurrency 4
# Terminal 3: Scheduled tasks
pyworkflow worker run --schedule
# Enable autoscaling (min 2, max 10 workers)
pyworkflow worker run --step --autoscale 2,10
# Set task limits
pyworkflow worker run --max-tasks-per-child 100 --time-limit 300
# Pass arbitrary Celery arguments after --
pyworkflow worker run -- --max-memory-per-child=200000
# Combine PyWorkflow options with Celery passthrough
pyworkflow worker run --step --autoscale 2,8 -- --max-memory-per-child=150000
For production, run separate workers for each queue type. Scale step workers horizontally for computation-heavy workloads.
worker status
Show status of active Celery workers:
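pyworkflow worker status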
Displays worker names, status, concurrency, active tasks, and processed task counts.
worker queues
Show available task queues and their configuration:
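pyworkflow worker queues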
Scheduler Commands
Run the schedule executor for the local runtime.
scheduler run
Start the local scheduler that polls for due schedules:
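pyworkflow scheduler run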
Options:
| Option | Description |
| --- | --- |
| --poll-interval | Seconds between storage polls (default: 5.0) |
| --duration | Run for specified seconds then exit (default: run forever) |
Basic Usage
With Module
Testing
# Start scheduler with defaults
pyworkflow scheduler run
# With custom poll interval
pyworkflow scheduler run --poll-interval 10
# Discover workflows from a module
pyworkflow --module myapp.workflows scheduler run
# Run for 60 seconds (useful for testing)
pyworkflow scheduler run --duration 60
Use scheduler run for the local runtime. For the Celery runtime, use worker run --beat
or start Celery Beat separately.
Schedule Commands
Manage workflow schedules for automated execution.
schedules list
List all schedules with optional filtering:
pyworkflow schedules list --workflow my_workflow --status active --limit 10
Options:
| Option | Description |
| --- | --- |
| --workflow | Filter by workflow name |
| --status | Filter by status: active, paused, deleted |
| --limit | Maximum schedules to display (default: 20) |
Output includes: Schedule ID, Workflow, Status, Schedule description, Next Run time, Success rate
schedules create
Create a new schedule for a workflow:
pyworkflow schedules create my_workflow --cron "0 9 * * *"
Arguments:
| Argument | Required | Description |
| --- | --- | --- |
| WORKFLOW_NAME | Yes | Name of the workflow to schedule |
Options:
| Option | Description |
| --- | --- |
| --cron | Cron expression (e.g., "0 9 * * *" for daily at 9 AM) |
| --interval | Interval duration (e.g., 5m, 1h, 30s) |
| --timezone | Timezone for schedule (default: UTC) |
| --overlap | Overlap policy: skip, buffer_one, buffer_all, cancel_other, allow_all |
| --schedule-id | Custom schedule ID (optional) |
Cron Schedule
Interval Schedule
With Options
# Every day at 9 AM
pyworkflow schedules create daily_report --cron "0 9 * * *"
# Every Monday at midnight
pyworkflow schedules create weekly_cleanup --cron "0 0 * * 1"
# Every 5 minutes
pyworkflow schedules create health_check --interval 5m
# Every hour
pyworkflow schedules create sync_data --interval 1h
pyworkflow schedules create my_workflow \
--cron "0 9 * * *" \
--timezone "America/New_York" \
--overlap buffer_one \
--schedule-id my_custom_id
schedules show
Show detailed information about a schedule:
pyworkflow schedules show sched_abc123
Output includes:
Schedule ID, Workflow name, Status
Schedule specification (cron/interval)
Overlap policy, Timezone
Next run time, Last run time
Statistics: Total runs, Successful, Failed, Skipped
schedules pause
Pause a schedule (stops triggering new runs):
pyworkflow schedules pause sched_abc123
schedules resume
Resume a paused schedule:
pyworkflow schedules resume sched_abc123
Output includes: New next run time after resumption
schedules delete
Delete a schedule (soft delete):
pyworkflow schedules delete sched_abc123
pyworkflow schedules delete sched_abc123 --force # Skip confirmation
Options:
| Option | Description |
| --- | --- |
| --force | Delete without confirmation prompt |
schedules trigger
Manually trigger a schedule immediately:
pyworkflow schedules trigger sched_abc123
This executes the workflow immediately without affecting the regular schedule timing.
schedules update
Update an existing schedule:
pyworkflow schedules update sched_abc123 --cron "0 10 * * *"
pyworkflow schedules update sched_abc123 --overlap buffer_one
Options:
| Option | Description |
| --- | --- |
| --cron | New cron expression |
| --interval | New interval duration |
| --overlap | New overlap policy |
schedules backfill
Backfill missed runs for a schedule:
pyworkflow schedules backfill sched_abc123 \
--start 2024-01-01T00:00:00 \
--end 2024-01-31T23:59:59
Options:
| Option | Required | Description |
| --- | --- | --- |
| --start | Yes | Start time for backfill (ISO format) |
| --end | Yes | End time for backfill (ISO format) |
Backfill creates runs for all scheduled times in the range. For high-frequency schedules, this could create many runs.
Quickstart Command
Create a new PyWorkflow project with sample workflows.
quickstart
Scaffold a complete project structure with working examples:
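pyworkflow quickstart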
Options:
| Option | Description |
| --- | --- |
| --non-interactive | Run without prompts, use defaults |
| --skip-docker | Skip Docker services setup |
| --template TYPE | Project template: basic (default) |
| --storage TYPE | Storage backend: sqlite or file |
Examples:
# Interactive quickstart (recommended for first-time users)
pyworkflow quickstart
# Non-interactive with defaults
pyworkflow quickstart --non-interactive
# Without Docker services
pyworkflow quickstart --skip-docker
# Use file storage instead of SQLite
pyworkflow quickstart --storage file
Created Files:
myproject/
├── workflows/
│ ├── __init__.py # Exports all workflows
│ ├── orders.py # process_order workflow
│ └── notifications.py # send_notification workflow
├── pyworkflow.config.yaml # Configuration
└── docker-compose.yml # Docker services (if enabled)
Use pyworkflow quickstart to bootstrap a new project, then modify the sample
workflows or add your own in the workflows/ directory.
Setup Command
Configure the PyWorkflow environment for an existing project.
setup
Interactive setup that generates configuration and Docker files:
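pyworkflow setup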
Options:
| Option | Description |
| --- | --- |
| --non-interactive | Run without prompts (use defaults) |
| --skip-docker | Skip Docker infrastructure setup |
| --module PATH | Workflow module path (e.g., myapp.workflows) |
| --storage TYPE | Storage backend: file, memory, or sqlite |
| --storage-path PATH | Storage path for file/sqlite backends |
Examples:
# Interactive setup (recommended)
pyworkflow setup
# Non-interactive with defaults
pyworkflow setup --non-interactive
# Skip Docker setup
pyworkflow setup --skip-docker
# Specify options directly
pyworkflow setup --module myapp.workflows --storage sqlite
Output Formats
Control output format with the --output flag:
Table (Default)
JSON
Plain
┏━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━┳━━━━━━━━━━━┳━━━━━━━━━━━━━┳━━━━━━━━━━┓
┃ Run ID ┃ Workflow ┃ Status ┃ Started ┃ Duration ┃
┡━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━╇━━━━━━━━━━━╇━━━━━━━━━━━━━╇━━━━━━━━━━┩
│ run_abc123... │ onboarding │ completed │ 10:30:45 │ 1.2s │
│ run_def456... │ payment │ running │ 10:31:02 │ 0.5s │
└────────────────┴─────────────┴───────────┴─────────────┴──────────┘
pyworkflow --output json runs list
[
  {
    "run_id": "run_abc123...",
    "workflow": "onboarding",
    "status": "completed",
    "started_at": "2025-01-15T10:30:45Z",
    "duration": 1.2
  }
]
pyworkflow --output plain runs list
run_abc123...
run_def456...
Use --output json for scripting and automation. Use --output plain for simple lists suitable for piping to other commands.
Examples
Complete Workflow Lifecycle
# 1. List available workflows
pyworkflow --module myapp.workflows workflows list
# 2. Get details about a workflow
pyworkflow --module myapp.workflows workflows info onboarding_workflow
# 3. Run the workflow
pyworkflow --module myapp.workflows workflows run onboarding_workflow \
--arg user_id=user_123
# Output: Workflow started: run_abc123def456
# 4. Check the status
pyworkflow runs status run_abc123def456
# 5. View the event log
pyworkflow runs logs run_abc123def456
Debugging Failed Runs
# Find failed runs
pyworkflow runs list --status failed
# Check error details (verbose mode for full traceback)
pyworkflow --verbose runs status run_xyz789
# View events leading to failure
pyworkflow runs logs run_xyz789 --filter failed
Scripting with JSON Output
# Get failed run IDs for batch processing
pyworkflow --output json runs list --status failed | jq -r '.[].run_id'
# Export workflow list
pyworkflow --output json workflows list > workflows.json
Using Config File
With a pyworkflow.toml in your project:
module = "myapp.workflows"
[storage]
backend = "file"
path = "./data/workflows"
Commands become simpler:
# No --module needed
pyworkflow workflows list
pyworkflow workflows run my_workflow --arg foo=bar
Distributed Workflow Execution
Complete example of running workflows on Celery workers:
# 1. Setup and verify environment
pyworkflow setup --check
# 2. Start Redis (if not running)
docker run -d -p 6379:6379 redis:7-alpine
# 3. Start workers (in separate terminals)
pyworkflow worker run --workflow # Workflow orchestration
pyworkflow worker run --step # Step execution
pyworkflow worker run --schedule # Sleep resumption
# Or start all-in-one worker
pyworkflow worker run
# 4. Run a workflow (dispatched to Celery)
pyworkflow --module myapp.workflows workflows run my_workflow \
--arg user_id=123
# 5. Monitor execution
pyworkflow runs list
pyworkflow runs status run_abc123
pyworkflow runs logs run_abc123
Use --runtime local to run workflows in-process without Celery for testing or simple scripts.
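For example, the same run dispatched in-process instead of to Celery:
pyworkflow --runtime local --module myapp.workflows workflows run my_workflow --arg user_id=123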
Next Steps