Getting Started

Introduction

SPANE (Parallel Asynchronous Node Execution) is a workflow orchestration engine built on BullMQ and Redis. It executes workflows defined as directed acyclic graphs (DAGs) with parallel node processing, automatic retries, state persistence, and comprehensive error handling.

Warning: Experimental Project

SPANE is an active research and development project. APIs may change between versions. This is not recommended for production use without thorough testing and evaluation. We encourage feedback and contributions from early adopters.

Key Features

  • Parallel Execution: Execute multiple workflow nodes concurrently for maximum throughput
  • DAG-based Workflows: Define complex workflows as directed acyclic graphs
  • State Persistence: Choose between in-memory or PostgreSQL-backed state storage
  • Automatic Retries: Built-in retry policies with configurable backoff strategies
  • Circuit Breakers: Protect external services with automatic circuit breaking
  • Sub-workflows: Compose reusable workflow components up to 10 levels deep
  • Dead Letter Queue: Automatically route failed jobs to DLQ for inspection and retry
  • Rate Limiting: Per-node-type rate limiting and global worker rate limiting
  • Webhook & Schedule Triggers: Automatically trigger workflows via webhooks or cron schedules
  • Job Prioritization: Set job priorities (1-10) for critical workflow paths
  • Error Handling: Standardized error classes with comprehensive error codes
  • Validation: Runtime validation with Zod schemas
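To make the retry behavior above concrete, here is a generic sketch of how configurable backoff strategies typically work. This is an illustration only, not SPANE's actual retry API; the `RetryPolicy` shape and field names below are assumptions.

```typescript
// Illustrative only: a generic retry policy shape and delay calculation.
// SPANE's real retry configuration may use different names and fields.
interface RetryPolicy {
  maxAttempts: number;              // total number of times a node may run
  baseDelayMs: number;              // delay before the first retry
  strategy: 'fixed' | 'exponential';
}

// Delay before retry number `attempt` (1-based).
function retryDelay(policy: RetryPolicy, attempt: number): number {
  if (policy.strategy === 'fixed') return policy.baseDelayMs;
  // Exponential backoff: base * 2^(attempt - 1)
  return policy.baseDelayMs * 2 ** (attempt - 1);
}

const policy: RetryPolicy = { maxAttempts: 4, baseDelayMs: 500, strategy: 'exponential' };
// Successive retries wait 500ms, 1000ms, then 2000ms.
console.log([1, 2, 3].map((n) => retryDelay(policy, n)));
```

Exponential backoff spaces retries further apart on each failure, which gives a struggling downstream service time to recover instead of hammering it at a fixed interval.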

Architecture

SPANE consists of these core components:

  • WorkflowEngine: Main orchestrator that manages workflow lifecycle
  • NodeRegistry: Stores and retrieves node executors by type
  • NodeProcessor: Thin orchestrator that delegates to specialized handlers
  • Handlers: Modular execution logic for different node types
    • execution-handler — Regular node execution
    • delay-handler — Delay node processing
    • subworkflow-handler — Sub-workflow execution
    • child-enqueue-handler — Child node enqueueing
  • QueueManager: Manages BullMQ queues for node execution, workflow triggers, and DLQ
  • WorkerManager: Runs workers that consume jobs from queues
  • StateStore: Handles execution state (InMemory or PostgreSQL)
  • Error System: Standardized error handling with error codes and utility functions
  • Validation: Zod-based runtime validation for workflows and node configs
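The contract that ties the registry, processor, and handlers together is the node executor. Judging from its usage in the Quick Start below, the shape is roughly the following; this is a simplified sketch, and the real SPANE types likely carry additional fields (execution IDs, metadata, and so on).

```typescript
// Simplified sketch of the executor contract inferred from the Quick Start;
// the real SPANE types likely include more fields.
interface ExecutionContext {
  nodeConfig?: Record<string, unknown>; // per-node config from the workflow definition
  inputData?: unknown;                  // output of upstream nodes or the trigger payload
}

interface ExecutionResult {
  success: boolean;
  data?: unknown;
}

interface INodeExecutor {
  execute(context: ExecutionContext): Promise<ExecutionResult>;
}

// Any class implementing the interface can be registered under a type name
// and then referenced by that type in workflow definitions.
class EchoExecutor implements INodeExecutor {
  async execute(context: ExecutionContext): Promise<ExecutionResult> {
    return { success: true, data: context.inputData };
  }
}
```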

Quick Start

1. Installation

npm install @manyeya/spane
# or
bun add @manyeya/spane

2. Basic Setup

import { Redis } from 'ioredis';
import { WorkflowEngine, NodeRegistry, InMemoryExecutionStore } from '@manyeya/spane';
import type { WorkflowDefinition, INodeExecutor, ExecutionContext, ExecutionResult } from '@manyeya/spane';

// 1. Create a node executor
class HttpExecutor implements INodeExecutor {
  async execute(context: ExecutionContext): Promise<ExecutionResult> {
    const { url, method = 'GET' } = context.nodeConfig || {};

    const response = await fetch(url as string, {
      method,
      body: method !== 'GET' ? JSON.stringify(context.inputData) : undefined,
      headers: { 'Content-Type': 'application/json' }
    });

    const data = await response.json();
    return { success: true, data };
  }
}

class TransformExecutor implements INodeExecutor {
  async execute(context: ExecutionContext): Promise<ExecutionResult> {
    const transformed = {
      ...context.inputData,
      processedAt: new Date().toISOString()
    };
    return { success: true, data: transformed };
  }
}

// 2. Set up registry and engine
const redis = new Redis();
const registry = new NodeRegistry();
registry.register('http', new HttpExecutor());
registry.register('transform', new TransformExecutor());

const stateStore = new InMemoryExecutionStore();
const engine = new WorkflowEngine(registry, stateStore, redis);

// 3. Define workflow
const workflow: WorkflowDefinition = {
  id: 'fetch-and-transform',
  name: 'Fetch and Transform',
  entryNodeId: 'fetch',
  nodes: [
    {
      id: 'fetch',
      type: 'http',
      config: { url: 'https://api.example.com/data' },
      inputs: [],
      outputs: ['transform']
    },
    {
      id: 'transform',
      type: 'transform',
      config: {},
      inputs: ['fetch'],
      outputs: []
    }
  ]
};

// 4. Register and execute
await engine.registerWorkflow(workflow);
engine.startWorkers(5);

const executionId = await engine.enqueueWorkflow('fetch-and-transform', { userId: 123 });
console.log('Started execution:', executionId);

// 5. Check status
const execution = await stateStore.getExecution(executionId);
console.log('Status:', execution?.status);
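Because `enqueueWorkflow` returns as soon as the job is queued, a single status check may find the execution still in flight. A small polling helper can wait for a terminal state instead. The helper below is generic and only assumes that `getExecution` resolves to an object with a `status` field; the terminal status strings (`'completed'`, `'failed'`) are assumptions, so verify them against your SPANE version.

```typescript
// Generic polling helper: repeatedly reads execution state until it reaches a
// terminal status or the timeout elapses. The terminal status names here
// ('completed', 'failed') are assumptions about SPANE's status values.
type ExecutionLike = { status: string } | null | undefined;

async function waitForExecution(
  getExecution: (id: string) => Promise<ExecutionLike>,
  executionId: string,
  { intervalMs = 250, timeoutMs = 30_000 } = {}
): Promise<ExecutionLike> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const execution = await getExecution(executionId);
    if (execution && ['completed', 'failed'].includes(execution.status)) {
      return execution;
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`Execution ${executionId} did not finish within ${timeoutMs}ms`);
}

// Usage with the state store from the Quick Start:
// const final = await waitForExecution((id) => stateStore.getExecution(id), executionId);
```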

Requirements

  • Redis 6.0+: Required for BullMQ queue management
  • Node.js 18+ or Bun 1.0+: Runtime environment
  • PostgreSQL (optional): For persistent state storage
