Overview

PyWorkflow includes a powerful CLI for managing workflows and monitoring runs directly from your terminal. The CLI provides commands to list, inspect, and execute workflows, as well as monitor their execution status and event logs.

Quickstart

Create a new project with sample workflows in seconds.

Workflow Management

List, inspect, and run workflows from the command line.

Run Monitoring

Check run status, view event logs, and debug executions.

Schedule Management

Create, manage, and monitor automated workflow schedules.

Scheduler

Run the local scheduler for triggering due schedules.

Worker Management

Start and manage Celery workers for distributed execution.

Installation

The CLI is included with PyWorkflow and available as the pyworkflow command:
pip install pyworkflow
pyworkflow --version

Global Options

These options apply to all commands:
| Option | Environment Variable | Description |
| --- | --- | --- |
| --module | PYWORKFLOW_MODULE | Python module to import for workflow discovery |
| --runtime | PYWORKFLOW_RUNTIME | Execution runtime: local (in-process) or celery (distributed). Default: celery |
| --storage | PYWORKFLOW_STORAGE_BACKEND | Storage backend: file or memory (default: file) |
| --storage-path | PYWORKFLOW_STORAGE_PATH | Path for file storage (default: ./workflow_data) |
| --output | - | Output format: table, json, or plain (default: table) |
| --verbose, -v | - | Enable verbose logging |
| --version | - | Show version information |
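Global flags compose with any subcommand. As a sketch (the module name myapp.workflows is illustrative):

```shell
# In-process runtime with in-memory storage and JSON output:
# useful for quick local inspection without Redis or on-disk state
pyworkflow --module myapp.workflows --runtime local --storage memory \
    --output json workflows list
```

The same effect can be achieved by setting the corresponding environment variables instead of passing flags on every invocation.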

Configuration

Configuration File

Create a pyworkflow.config.yaml file in your project directory:
# pyworkflow.config.yaml
module: myapp.workflows

runtime: celery  # or "local"

storage:
  backend: file
  path: ./workflow_data

celery:
  broker: redis://localhost:6379/0
  result_backend: redis://localhost:6379/1
The YAML config file is the recommended approach. Place it in your working directory and both the CLI and your Python code will automatically use it.
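If you prefer not to keep a config file, the same settings can be supplied through the environment variables listed under Global Options (the values here mirror the YAML above):

```shell
# Mirror pyworkflow.config.yaml via environment variables;
# any pyworkflow command run in this shell picks them up
export PYWORKFLOW_MODULE=myapp.workflows
export PYWORKFLOW_RUNTIME=celery
export PYWORKFLOW_STORAGE_BACKEND=file
export PYWORKFLOW_STORAGE_PATH=./workflow_data
```

After exporting, commands such as pyworkflow workflows list need no --module flag.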

Alternative Config Formats

PyWorkflow also supports TOML configuration files (searched in order, walking up the directory tree):
  1. pyworkflow.toml
  2. .pyworkflow.toml
  3. pyproject.toml (under [tool.pyworkflow] section)
# pyworkflow.toml
module = "myapp.workflows"
runtime = "celery"

[storage]
backend = "file"
path = "./workflow_data"

[celery]
broker = "redis://localhost:6379/0"
result_backend = "redis://localhost:6379/1"

Priority Resolution

Configuration values are resolved in this order (highest to lowest priority):
| Priority | Source | Example |
| --- | --- | --- |
| 1 (highest) | CLI flags | --module myapp.workflows |
| 2 | Environment variables | PYWORKFLOW_MODULE=myapp.workflows |
| 3 | Config file | pyworkflow.config.yaml |
| 4 (lowest) | Defaults | runtime: local, durable: false |

Workflow Discovery

When you run pyworkflow worker run or other CLI commands, PyWorkflow needs to discover and import your workflow modules. This happens in the following priority order:

Discovery Priority

| Priority | Source | Example |
| --- | --- | --- |
| 1 (highest) | --module flag | pyworkflow --module myapp.workflows worker run |
| 2 | PYWORKFLOW_DISCOVER env var | PYWORKFLOW_DISCOVER=myapp.workflows pyworkflow worker run |
| 3 (lowest) | pyworkflow.config.yaml | module: myapp.workflows in config file |

How Discovery Works

  1. Module Import: PyWorkflow imports the specified Python module(s)
  2. Decorator Registration: When the module loads, @workflow and @step decorators automatically register functions in the global registry
  3. Project Root Detection: PyWorkflow automatically finds your project root (by looking for pyproject.toml, setup.py, or .git) and adds it to the Python path

Multiple Modules

You can discover workflows from multiple modules:
# pyworkflow.config.yaml
modules:
  - myapp.workflows
  - myapp.tasks
  - myapp.handlers

Commands

Workflow Commands

Manage and execute registered workflows.

workflows list

List all registered workflows:
pyworkflow --module myapp.workflows workflows list
Output Formats:
  • table (default): Shows Name, Max Duration, and Metadata columns
  • json: Array of workflow objects
  • plain: Simple list of workflow names

workflows info

Show detailed information about a specific workflow:
pyworkflow --module myapp.workflows workflows info my_workflow
Arguments:
| Argument | Required | Description |
| --- | --- | --- |
| WORKFLOW_NAME | Yes | Name of the workflow to inspect |
Output includes: Name, max duration, function details, module path, and docstring.

workflows run

Execute a workflow with optional arguments:
pyworkflow --module myapp.workflows workflows run my_workflow \
    --arg user_id=123 --arg amount=50.00
Arguments:
| Argument | Required | Description |
| --- | --- | --- |
| WORKFLOW_NAME | Yes | Name of the workflow to run |

Options:

| Option | Description |
| --- | --- |
| --arg key=value | Workflow argument (repeatable, supports JSON values) |
| --args-json '{...}' | Workflow arguments as JSON object |
| --durable/--no-durable | Run in durable mode (default: durable) |
| --idempotency-key | Idempotency key for the execution |
pyworkflow workflows run order_process \
    --arg order_id=12345 \
    --arg amount=99.99 \
    --arg items='["item1", "item2"]'
Use --no-durable for quick, transient executions that don’t need persistence. Use --idempotency-key to prevent duplicate executions.
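The repeatable --arg flags above can also be collapsed into a single JSON object. A sketch combining --args-json with an idempotency key (the workflow name order_process follows the earlier example; the key value is illustrative):

```shell
# Same arguments as the --arg form, passed as one JSON object;
# the idempotency key guards against accidental double-submission
pyworkflow workflows run order_process \
    --args-json '{"order_id": 12345, "amount": 99.99, "items": ["item1", "item2"]}' \
    --idempotency-key order-12345
```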

Run Commands

Monitor and debug workflow runs.

runs list

List workflow runs with optional filtering:
pyworkflow runs list --workflow my_workflow --status completed --limit 10
Options:
| Option | Description |
| --- | --- |
| --workflow | Filter by workflow name |
| --status | Filter by status: pending, running, suspended, completed, failed, cancelled |
| --limit | Maximum runs to display (default: 20) |
Output Formats:
  • table (default): Shows Run ID, Workflow, Status (color-coded), Started time, and Duration
  • json: Array of run objects with full details
  • plain: Simple list of Run IDs

runs status

Show detailed status of a specific run:
pyworkflow runs status run_abc123def456
Arguments:
| Argument | Required | Description |
| --- | --- | --- |
| RUN_ID | Yes | Workflow run identifier |
Output includes:
  • Run ID, Workflow name, Status
  • Created, Started, Completed timestamps
  • Duration
  • Input arguments
  • Result (if completed)
  • Error message (if failed)

runs logs

View the execution event log for a run:
pyworkflow runs logs run_abc123 --filter step_completed
Arguments:
| Argument | Required | Description |
| --- | --- | --- |
| RUN_ID | Yes | Workflow run identifier |

Options:

| Option | Description |
| --- | --- |
| --filter | Filter events by type (case-insensitive substring match) |
Event Types:
  • workflow_started, workflow_completed, workflow_failed, workflow_cancelled
  • step_started, step_completed, step_failed
  • sleep_started, sleep_resumed
  • hook_created, hook_received
  • cancellation_requested

runs cancel

Cancel a running or suspended workflow:
pyworkflow runs cancel run_abc123 --reason "User requested"
Arguments:
| Argument | Required | Description |
| --- | --- | --- |
| RUN_ID | Yes | Workflow run identifier |

Options:

| Option | Description |
| --- | --- |
| --wait/--no-wait | Wait for cancellation to complete (default: no-wait) |
| --timeout | Timeout in seconds when waiting (default: 30) |
| --reason | Reason for cancellation |
Examples:
# Cancel a workflow
pyworkflow runs cancel run_abc123

# Cancel with reason
pyworkflow runs cancel run_abc123 --reason "Customer cancelled order"

# Wait for cancellation to complete
pyworkflow runs cancel run_abc123 --wait --timeout 60
Cancellation is graceful: the workflow stops at the next checkpoint (before a step, sleep, or hook), not immediately. See Cancellation for details.

Worker Commands

Manage Celery workers for distributed workflow execution.

worker run

Start a Celery worker to process workflow tasks:
pyworkflow worker run
Options:
| Option | Description |
| --- | --- |
| --workflow | Only process workflow orchestration tasks |
| --step | Only process step execution tasks |
| --schedule | Only process scheduled resumption tasks |
| --concurrency N | Number of worker processes (default: 1) |
| --loglevel LEVEL | Log level: debug, info, warning, error |
| --hostname NAME | Custom worker hostname |
| --beat | Also start Celery Beat scheduler |
| --autoscale MIN,MAX | Enable worker autoscaling (e.g., 2,10) |
| --max-tasks-per-child N | Replace worker child after N tasks |
| --prefetch-multiplier N | Task prefetch count per worker process |
| --time-limit SECONDS | Hard time limit for tasks |
| --soft-time-limit SECONDS | Soft time limit for tasks |
# Start a worker processing all queues
pyworkflow worker run
For production, run separate workers for each queue type. Scale step workers horizontally for computation-heavy workloads.
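Putting that advice into practice, a dedicated step worker might combine the scaling and time-limit flags from the table above (all values are illustrative, not recommendations):

```shell
# Dedicated step worker: autoscale between 2 and 10 processes,
# recycle each child after 100 tasks, and cap task runtime
pyworkflow worker run --step \
    --autoscale 2,10 \
    --max-tasks-per-child 100 \
    --soft-time-limit 240 --time-limit 300 \
    --loglevel info
```

A similar invocation with --workflow (and typically lower concurrency) would handle orchestration in a separate process.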

worker status

Show status of active Celery workers:
pyworkflow worker status
Displays worker names, status, concurrency, active tasks, and processed task counts.

worker queues

Show available task queues and their configuration:
pyworkflow worker queues

Scheduler Commands

Run the schedule executor for local runtime.

scheduler run

Start the local scheduler that polls for due schedules:
pyworkflow scheduler run
Options:
| Option | Description |
| --- | --- |
| --poll-interval | Seconds between storage polls (default: 5.0) |
| --duration | Run for specified seconds then exit (default: run forever) |
# Start scheduler with defaults
pyworkflow scheduler run

# With custom poll interval
pyworkflow scheduler run --poll-interval 10
Use scheduler run for local runtime. For Celery runtime, use worker run --beat or start Celery Beat separately.

Schedule Commands

Manage workflow schedules for automated execution.

schedules list

List all schedules with optional filtering:
pyworkflow schedules list --workflow my_workflow --status active --limit 10
Options:
| Option | Description |
| --- | --- |
| --workflow | Filter by workflow name |
| --status | Filter by status: active, paused, deleted |
| --limit | Maximum schedules to display (default: 20) |
Output includes: Schedule ID, Workflow, Status, Schedule description, Next Run time, Success rate

schedules create

Create a new schedule for a workflow:
pyworkflow schedules create my_workflow --cron "0 9 * * *"
Arguments:
| Argument | Required | Description |
| --- | --- | --- |
| WORKFLOW_NAME | Yes | Name of the workflow to schedule |

Options:

| Option | Description |
| --- | --- |
| --cron | Cron expression (e.g., "0 9 * * *" for daily at 9 AM) |
| --interval | Interval duration (e.g., 5m, 1h, 30s) |
| --timezone | Timezone for schedule (default: UTC) |
| --overlap | Overlap policy: skip, buffer_one, buffer_all, cancel_other, allow_all |
| --schedule-id | Custom schedule ID (optional) |
# Every day at 9 AM
pyworkflow schedules create daily_report --cron "0 9 * * *"

# Every Monday at midnight
pyworkflow schedules create weekly_cleanup --cron "0 0 * * 1"
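Interval-based schedules use the same command. The sketch below pairs --interval with a timezone and an overlap policy from the options above (the schedule name hourly_sync is illustrative):

```shell
# Every hour, New York time; skip a tick if the previous run
# is still going rather than stacking overlapping executions
pyworkflow schedules create hourly_sync \
    --interval 1h \
    --timezone America/New_York \
    --overlap skip
```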

schedules show

Show detailed information about a schedule:
pyworkflow schedules show sched_abc123
Output includes:
  • Schedule ID, Workflow name, Status
  • Schedule specification (cron/interval)
  • Overlap policy, Timezone
  • Next run time, Last run time
  • Statistics: Total runs, Successful, Failed, Skipped

schedules pause

Pause a schedule (stops triggering new runs):
pyworkflow schedules pause sched_abc123

schedules resume

Resume a paused schedule:
pyworkflow schedules resume sched_abc123
Output includes: New next run time after resumption

schedules delete

Delete a schedule (soft delete):
pyworkflow schedules delete sched_abc123
pyworkflow schedules delete sched_abc123 --force  # Skip confirmation
Options:
| Option | Description |
| --- | --- |
| --force | Delete without confirmation prompt |

schedules trigger

Manually trigger a schedule immediately:
pyworkflow schedules trigger sched_abc123
This executes the workflow immediately without affecting the regular schedule timing.

schedules update

Update an existing schedule:
pyworkflow schedules update sched_abc123 --cron "0 10 * * *"
pyworkflow schedules update sched_abc123 --overlap buffer_one
Options:
| Option | Description |
| --- | --- |
| --cron | New cron expression |
| --interval | New interval duration |
| --overlap | New overlap policy |

schedules backfill

Backfill missed runs for a schedule:
pyworkflow schedules backfill sched_abc123 \
    --start 2024-01-01T00:00:00 \
    --end 2024-01-31T23:59:59
Options:
| Option | Required | Description |
| --- | --- | --- |
| --start | Yes | Start time for backfill (ISO format) |
| --end | Yes | End time for backfill (ISO format) |
Backfill creates runs for all scheduled times in the range. For high-frequency schedules, this could create many runs.
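Before backfilling a high-frequency schedule, it can help to estimate how many runs the window will produce. A quick shell sanity check for an hourly schedule across the January range used above:

```shell
# 31 days of hourly ticks = runs the January backfill would enqueue
DAYS=31
RUNS_PER_DAY=24
TOTAL=$(( DAYS * RUNS_PER_DAY ))
echo "Backfill would create $TOTAL runs"
# prints: Backfill would create 744 runs
```

For a schedule firing every 5 minutes, the same window would produce over 8,900 runs, so narrowing --start/--end first is usually wise.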

Quickstart Command

Create a new PyWorkflow project with sample workflows.

quickstart

Scaffold a complete project structure with working examples:
pyworkflow quickstart
Options:
| Option | Description |
| --- | --- |
| --non-interactive | Run without prompts, use defaults |
| --skip-docker | Skip Docker services setup |
| --template TYPE | Project template: basic (default) |
| --storage TYPE | Storage backend: sqlite or file |
Examples:
# Interactive quickstart (recommended for first-time users)
pyworkflow quickstart

# Non-interactive with defaults
pyworkflow quickstart --non-interactive

# Without Docker services
pyworkflow quickstart --skip-docker

# Use file storage instead of SQLite
pyworkflow quickstart --storage file
Created Files:
myproject/
├── workflows/
│   ├── __init__.py          # Exports all workflows
│   ├── orders.py            # process_order workflow
│   └── notifications.py     # send_notification workflow
├── pyworkflow.config.yaml   # Configuration
└── docker-compose.yml       # Docker services (if enabled)
Use pyworkflow quickstart to bootstrap a new project, then modify the sample workflows or add your own in the workflows/ directory.

Setup Command

Configure the PyWorkflow environment for an existing project.

setup

Interactive setup that generates configuration and Docker files:
pyworkflow setup
Options:
| Option | Description |
| --- | --- |
| --non-interactive | Run without prompts (use defaults) |
| --skip-docker | Skip Docker infrastructure setup |
| --module PATH | Workflow module path (e.g., myapp.workflows) |
| --storage TYPE | Storage backend: file, memory, or sqlite |
| --storage-path PATH | Storage path for file/sqlite backends |
Examples:
# Interactive setup (recommended)
pyworkflow setup

# Non-interactive with defaults
pyworkflow setup --non-interactive

# Skip Docker setup
pyworkflow setup --skip-docker

# Specify options directly
pyworkflow setup --module myapp.workflows --storage sqlite

Output Formats

Control output format with the --output flag:
pyworkflow runs list
┏━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━┳━━━━━━━━━━━┳━━━━━━━━━━━━━┳━━━━━━━━━━┓
┃ Run ID         ┃ Workflow    ┃ Status    ┃ Started     ┃ Duration ┃
┡━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━╇━━━━━━━━━━━╇━━━━━━━━━━━━━╇━━━━━━━━━━┩
│ run_abc123...  │ onboarding  │ completed │ 10:30:45    │ 1.2s     │
│ run_def456...  │ payment     │ running   │ 10:31:02    │ 0.5s     │
└────────────────┴─────────────┴───────────┴─────────────┴──────────┘
Use --output json for scripting and automation. Use --output plain for simple lists suitable for piping to other commands.

Examples

Complete Workflow Lifecycle

# 1. List available workflows
pyworkflow --module myapp.workflows workflows list

# 2. Get details about a workflow
pyworkflow --module myapp.workflows workflows info onboarding_workflow

# 3. Run the workflow
pyworkflow --module myapp.workflows workflows run onboarding_workflow \
    --arg user_id=user_123

# Output: Workflow started: run_abc123def456

# 4. Check the status
pyworkflow runs status run_abc123def456

# 5. View the event log
pyworkflow runs logs run_abc123def456

Debugging Failed Runs

# Find failed runs
pyworkflow runs list --status failed

# Check error details (verbose mode for full traceback)
pyworkflow --verbose runs status run_xyz789

# View events leading to failure
pyworkflow runs logs run_xyz789 --filter failed

Scripting with JSON Output

# Get failed run IDs for batch processing
pyworkflow --output json runs list --status failed | jq -r '.[].run_id'

# Export workflow list
pyworkflow --output json workflows list > workflows.json

Using Config File

With a pyworkflow.toml in your project:
module = "myapp.workflows"

[storage]
backend = "file"
path = "./data/workflows"
Commands become simpler:
# No --module needed
pyworkflow workflows list
pyworkflow workflows run my_workflow --arg foo=bar

Distributed Workflow Execution

Complete example of running workflows on Celery workers:
# 1. Setup and verify environment
pyworkflow setup --check

# 2. Start Redis (if not running)
docker run -d -p 6379:6379 redis:7-alpine

# 3. Start workers (in separate terminals)
pyworkflow worker run --workflow      # Workflow orchestration
pyworkflow worker run --step          # Step execution
pyworkflow worker run --schedule      # Sleep resumption

# Or start all-in-one worker
pyworkflow worker run

# 4. Run a workflow (dispatched to Celery)
pyworkflow --module myapp.workflows workflows run my_workflow \
    --arg user_id=123

# 5. Monitor execution
pyworkflow runs list
pyworkflow runs status run_abc123
pyworkflow runs logs run_abc123
Use --runtime local to run workflows in-process without Celery for testing or simple scripts.

Next Steps