
Workflow Engine Registration System Implementation

Date: 2025-11-04
Purpose: Implement database-backed workflow engine registration with environment variable fallback
Status: ✅ Core Backend Completed (UI Pending)

Executive Summary

Implemented a flexible workflow engine registration system that supports:

  • ✅ Database-registered engines - Register multiple N8N instances, Prefect, Windmill, etc. via Admin UI
  • ✅ Environment variable fallback - Backward compatible with Azure deployment
  • ✅ Multi-source loading - Engines loaded from both database and environment
  • ✅ Encrypted API keys - Fernet symmetric encryption for API keys at rest
  • ✅ Zero breaking changes - Existing deployments continue working

Problem Statement

Original Issue

User needed to:

  1. Manually set up N8N owner account via UI
  2. Generate API key in N8N
  3. Add API key to .env.local
  4. Restart services

This created a poor first-run experience and prevented:

  • Multiple N8N instances (dev, staging, prod)
  • Different workflow engines (Prefect, Windmill, Power Automate)
  • Team-specific engine configurations

User Request

"I think it is more logical then to have a 'Register Workflow Engines' step, where the user enters Workflow Engine Name, URL and API key. This way we can easily add multiple N8N engines or add different engines (prefect or windmill or powerautomate)"

Solution Architecture

Multi-Source Engine Loading

┌─────────────────────────────────────────┐
│   Slidefactory Startup                  │
└──────────────┬──────────────────────────┘
               ├─► Database-Registered Engines
               │   ├─ n8n-prod (https://n8n.company.com)
               │   ├─ n8n-dev (http://localhost:5678)
               │   ├─ prefect-main (https://prefect.company.com)
               │   └─ windmill-test (https://windmill.local)
               └─► Environment Variables (Fallback)
                   └─ n8n-default (from N8N_API_URL/N8N_API_KEY)

         Workflow Registry
         (All engines available)

Key Design Decisions

  1. String-based engine keys instead of enum-based
     • Allows dynamic naming: "n8n-prod", "n8n-dev", "n8n-customer-a"
     • Supports unlimited instances of same engine type

  2. Environment variables as fallback
     • Azure deployment continues working unchanged
     • No migration required for existing installations

  3. Encrypted API keys at rest
     • Uses SECRET_KEY from settings as encryption base
     • Fernet symmetric encryption (industry standard)

  4. Graceful degradation
     • Database unavailable? Falls back to environment
     • Decryption fails? Logs error, skips that engine
     • Connection test available before saving

Implementation Details

1. Database Model

File: app/workflowengine/db_models.py

class WorkflowEngine(Base):
    __tablename__ = "workflow_engines"

    id = Column(Integer, primary_key=True, autoincrement=True)
    name = Column(String(255), unique=True, nullable=False)  # "n8n-prod", "n8n-dev"
    engine_type = Column(String(50), nullable=False)  # "n8n", "prefect", "windmill"
    api_url = Column(String(500), nullable=False)
    api_key_encrypted = Column(Text, nullable=True)  # Fernet encrypted
    active = Column(Boolean, default=True, nullable=False)
    config = Column(JSONB, nullable=True)  # Per-engine configuration
    created_by = Column(Integer, ForeignKey("app_users.id"))
    created_at = Column(DateTime, default=datetime.utcnow)
    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)

Migration: alembic/versions/453c1befe40_add_workflow_engines_table.py

2. API Key Encryption

File: app/workflowengine/encryption.py

  • Uses Fernet symmetric encryption
  • Derives 32-byte key from SECRET_KEY via SHA256
  • Handles empty API keys (some engines don't require them)

import base64
import hashlib
from cryptography.fernet import Fernet  # `settings` comes from the app's config module

def encrypt_api_key(api_key: str) -> str:
    """Encrypt API key using Fernet"""
    key = hashlib.sha256(settings.SECRET_KEY.encode()).digest()
    f = Fernet(base64.urlsafe_b64encode(key))
    return f.encrypt(api_key.encode()).decode()

3. CRUD Operations

File: app/workflowengine/crud.py

Functions:

  • get_engine(db, engine_id) - Get by ID
  • get_engine_by_name(db, name) - Get by unique name
  • get_engines(db, skip, limit, active_only, engine_type) - List with filtering
  • get_active_engines(db) - All active engines
  • create_engine(...) - Create with validation and encryption
  • update_engine(...) - Update with name conflict checking
  • delete_engine(db, engine_id) - Soft or hard delete
  • test_engine_connection(db, engine_id) - Validate connection
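As an illustration of how the read helpers are shaped, here is a minimal sketch of two of them against an in-memory SQLite database (the model is trimmed to the columns the queries touch, and SQLAlchemy's create_engine is aliased to avoid clashing with the crud.create_engine above; this is not the actual crud.py):

```python
from sqlalchemy import Boolean, Column, Integer, String
from sqlalchemy import create_engine as sa_create_engine
from sqlalchemy.orm import Session, declarative_base, sessionmaker

Base = declarative_base()


class WorkflowEngine(Base):
    __tablename__ = "workflow_engines"
    id = Column(Integer, primary_key=True, autoincrement=True)
    name = Column(String(255), unique=True, nullable=False)
    engine_type = Column(String(50), nullable=False)
    api_url = Column(String(500), nullable=False)
    active = Column(Boolean, default=True, nullable=False)


def get_engine_by_name(db: Session, name: str):
    """Look up an engine by its unique name."""
    return db.query(WorkflowEngine).filter(WorkflowEngine.name == name).first()


def get_active_engines(db: Session):
    """Return all engines flagged as active."""
    return db.query(WorkflowEngine).filter(WorkflowEngine.active.is_(True)).all()
```

Because name is unique, get_engine_by_name can safely use .first(); the registry relies on that uniqueness when resolving string keys.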

4. Multi-Source Registry Loading

File: app/workflowengine/registry.py

Key Changes:

  • Changed from enum-based keys to string-based keys
  • Added initialize(db) parameter for database session
  • Implemented _load_engines_from_database(db)
  • Implemented _load_engines_from_environment()
  • Created _create_engine_instance() factory method

Loading Order:

  1. Database-registered engines loaded first
  2. Environment variables loaded second
  3. Duplicates skipped (database takes precedence)

async def initialize(self, db: Optional[Session] = None):
    if db:
        await self._load_engines_from_database(db)
    await self._load_engines_from_environment()
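The "database wins" precedence can be sketched as a plain dict merge (merge_engine_sources is a hypothetical helper illustrating the loading order, not a function in registry.py):

```python
def merge_engine_sources(db_engines: dict, env_engines: dict) -> dict:
    """Combine engines from both sources; database entries take precedence."""
    registry: dict = {}
    for name, engine in db_engines.items():
        registry[name] = engine          # 1. database engines first
    for name, engine in env_engines.items():
        if name in registry:
            continue                     # 3. duplicate name: database wins
        registry[name] = engine          # 2. environment fills the gaps
    return registry
```

This is why a database-registered engine named "n8n-default" would shadow the environment-derived one, as noted in the troubleshooting section.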

5. Application Startup Integration

File: app/main.py

Updated the lifespan function to:

  • Get a database session on startup
  • Pass it to workflow_registry.initialize(db=db_session)
  • Close the session after initialization
  • Update the display logic for string-based engine names

db_session = next(get_db())
await workflow_registry.initialize(db=db_session)
db_session.close()
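A slightly more defensive variant of this step (a sketch; get_db and registry stand in for the app's own objects) closes the session even when initialization raises:

```python
async def initialize_registry(get_db, registry) -> None:
    """Open a short-lived DB session, initialize the registry, always close.

    Sketch of the startup step; get_db is a generator factory like the
    FastAPI dependency, registry is the workflow registry instance.
    """
    db = next(get_db())
    try:
        await registry.initialize(db=db)
    finally:
        db.close()  # released even when initialization fails
```

Without the try/finally, an exception inside initialize() would leak the session until garbage collection.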

6. Pydantic Schemas

File: app/workflowengine/schemas.py

  • WorkflowEngineCreate - For registration
  • WorkflowEngineUpdate - For updates
  • WorkflowEngineResponse - For API responses
  • ConnectionTestRequest/Response - For testing connections
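A sketch of what the create/response schemas plausibly look like, with field names inferred from the database model above (the actual schemas.py may differ):

```python
from typing import Optional

from pydantic import BaseModel


class WorkflowEngineCreate(BaseModel):
    name: str                       # unique, e.g. "n8n-prod"
    engine_type: str                # "n8n", "prefect", "windmill"
    api_url: str
    api_key: Optional[str] = None   # plaintext on input; encrypted before storage
    active: bool = True


class WorkflowEngineResponse(BaseModel):
    id: int
    name: str
    engine_type: str
    api_url: str
    active: bool
    # deliberately no api_key field: the key is never echoed back to clients
```

Keeping api_key out of the response schema is the simplest way to guarantee the encrypted secret never leaves the server.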

Files Created

New Files:

  1. app/workflowengine/db_models.py - SQLAlchemy database model
  2. app/workflowengine/crud.py - CRUD operations
  3. app/workflowengine/encryption.py - API key encryption utilities
  4. app/workflowengine/schemas.py - Pydantic schemas for API
  5. alembic/versions/453c1befe40_add_workflow_engines_table.py - Database migration

Modified Files:

  1. app/workflowengine/registry.py - Multi-source loading, string-based keys
  2. app/main.py - Pass database session to registry initialization
  3. docker-compose.override.yml - Removed n8n-init container
  4. .env.local - Simplified N8N instructions

Unchanged (No Breaking Changes):

  • All existing engine implementations (engines/n8n.py, engines/prefect.py, etc.)
  • All existing workflow execution code
  • All existing API endpoints
  • GitHub Actions deployment workflows

Usage Examples

Scenario 1: Azure Deployment (Environment Variables)

Current setup continues working unchanged:

# GitHub Actions
env:
  N8N_API_URL: "${{ secrets.N8N_API_URL }}"
  N8N_API_KEY: "${{ secrets.N8N_API_KEY }}"

On startup:

  • Slidefactory loads n8n-default from environment variables
  • No database registration needed
  • Backward compatible

Scenario 2: Local Development (Database Registration)

Recommended workflow:

  1. Start docker-compose:

    docker-compose up
    

  2. Access Slidefactory: http://localhost:8000

  3. Login and go to Admin > Workflow Engines

  4. Register engines:
     • N8N Dev: http://localhost:5678 (no API key)
     • N8N Prod: https://n8n.company.com (with API key)
     • Prefect: https://prefect.company.com (with API key)

  5. Engines loaded on next startup

Scenario 3: Hybrid (Both Sources)

Azure uses environment, local adds more engines:

  • Azure: n8n-default from environment variables
  • Local: User registers n8n-dev, prefect-main via UI
  • Result: All 3 engines available

Security Considerations

API Key Encryption

  • ✅ Keys encrypted at rest using Fernet
  • ✅ Encryption key derived from SECRET_KEY
  • SECRET_KEY must be strong and secret
  • ⚠️ Changing SECRET_KEY breaks existing encrypted keys

Access Control

  • 🚧 TODO: Restrict engine registration to admin users
  • 🚧 TODO: Add audit logging for engine changes
  • 🚧 TODO: Implement engine-level permissions

Connection Validation

  • ✅ Test connection before saving
  • ✅ Validate URLs and credentials
  • ⚠️ Connection test exposes whether URL is reachable
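A minimal, engine-agnostic sketch of such a probe using only the standard library (the real test_engine_connection takes db and engine_id and loads the stored credentials; the N8N header name here is an assumption for illustration):

```python
import urllib.error
import urllib.request


def probe_engine(api_url: str, api_key: str = "") -> dict:
    """Best-effort reachability check before saving an engine (sketch)."""
    req = urllib.request.Request(api_url)
    if api_key:
        # Header name assumed; N8N expects X-N8N-API-KEY, other engines differ
        req.add_header("X-N8N-API-KEY", api_key)
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return {"success": resp.status < 400, "message": f"HTTP {resp.status}"}
    except (urllib.error.URLError, OSError) as exc:
        return {"success": False, "message": str(exc)}
```

Returning a {"success", "message"} dict rather than raising keeps the caller's "test before save" flow simple, but note the warning above: the result still reveals whether a URL is reachable.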

Backward Compatibility

✅ No Breaking Changes

  1. Azure deployment - Works unchanged with environment variables
  2. Existing code - All engine implementations unchanged
  3. API endpoints - No changes to existing APIs
  4. Workflow execution - Same execution flow

Migration Path

For existing installations:

  1. Phase 1 (Current): Core backend completed
     • Database model and migration ready
     • Multi-source loading implemented
     • Environment fallback working

  2. Phase 2 (Next): Admin UI
     • Create admin router for engine management
     • Build templates for registration UI
     • Add connection test button

  3. Phase 3 (Future): Advanced features
     • Per-user engine access control
     • Engine-specific quota limits
     • Workflow import from engines

Testing

Manual Testing Steps

  1. Environment variable fallback:

    # Set N8N_API_URL and N8N_API_KEY
    docker-compose up
    # Verify "n8n-default" loaded in startup logs
    

  2. Database registration (requires Admin UI):
     • Register new engine via UI
     • Restart application
     • Verify engine loaded in startup logs

  3. API key encryption:

    from app.workflowengine.encryption import encrypt_api_key, decrypt_api_key
    encrypted = encrypt_api_key("test-key-123")
    decrypted = decrypt_api_key(encrypted)
    assert decrypted == "test-key-123"
    

  4. Connection test (via CRUD):

    from app.workflowengine.crud import test_engine_connection
    result = test_engine_connection(db, engine_id=1)
    print(result["success"], result["message"])
    

Integration Testing

# Test multi-source loading (run inside an async test)
import os

from app.workflowengine.crud import create_engine
from app.workflowengine.registry import workflow_registry

# Set environment variable
os.environ['N8N_API_URL'] = 'http://localhost:5678/api/v1'
os.environ['N8N_API_KEY'] = ''

# Create database engine
engine = create_engine(
    db, name="n8n-prod", engine_type="n8n",
    api_url="https://n8n.company.com/api/v1",
    api_key="prod-key-456"
)

# Initialize registry
await workflow_registry.initialize(db)

# Verify both loaded
assert "n8n-default" in workflow_registry._engines  # From env
assert "n8n-prod" in workflow_registry._engines     # From database

Future Enhancements

Phase 2: Admin UI (Priority 1)

  • Create admin router (app/admin/workflow_engines.py)
  • Build registration form template
  • Add list view with edit/delete actions
  • Implement connection test button (HTMX)
  • Add navigation link to admin menu

Phase 3: Advanced Features

  • Per-user engine access control
  • Engine-specific quota limits
  • Automatic workflow discovery on registration
  • Engine health monitoring dashboard
  • Bulk engine import/export (JSON)
  • Engine-level audit logging

Phase 4: Multi-Tenancy

  • Organization-level engines
  • Shared vs private engines
  • Engine usage analytics per team
  • Cost allocation by engine usage

Known Limitations

  1. No Admin UI yet - Engine registration requires direct database access or future UI
  2. No access control - Any user can see all engines (until UI with permissions)
  3. No audit logging - Engine changes not tracked in audit log
  4. SECRET_KEY rotation - Changing SECRET_KEY breaks encrypted API keys

Cleanup Tasks

Removed Components

  • ✅ Removed n8n-init container from docker-compose
  • ✅ Simplified .env.local N8N instructions
  • 🚧 Optional: Remove n8n-init.sh script (kept for reference)
  • 🚧 Optional: Remove n8n-workflows/ directory (kept for reference)

Documentation Updates Needed

CLAUDE.md

  • Add workflow engine registration section
  • Document multi-source loading
  • Add environment variable fallback explanation
  • Update workflow selection instructions

README.md

  • Add "Workflow Engines" section
  • Document registration process
  • Add troubleshooting guide

Troubleshooting

Issue: Engines not loading from database

Symptoms: Startup logs show 0 engines or only environment engines

Causes:

  1. Database connection failed during startup
  2. Migration not applied
  3. Decryption error

Solution:

# Check migration status
alembic current

# Apply migration if needed
alembic upgrade head

# Check database
docker-compose exec postgres psql -U postgres -d slidefactory -c "SELECT * FROM workflow_engines;"

# Check logs
docker-compose logs web | grep "workflow engine"

Issue: Decryption fails

Symptoms: Failed to decrypt API key in logs

Causes:

  1. SECRET_KEY changed after encryption
  2. Database corruption

Solution:

  • Re-register engine with new API key
  • Or restore SECRET_KEY to original value

Issue: Duplicate engine error

Symptoms: Workflow engine with name 'n8n-default' already exists

Cause: Both database and environment define the same engine name.

Solution: Rename the database engine, or clear N8N_API_URL from the environment.

Summary

✅ Completed

  1. Database model with encryption
  2. CRUD operations with validation
  3. Multi-source registry loading
  4. Environment variable fallback
  5. Application startup integration
  6. Pydantic schemas for API
  7. Alembic migration
  8. Documentation cleanup

🚧 Pending (Phase 2)

  1. Admin router for engine management
  2. Registration UI templates
  3. Connection test UI
  4. Access control and permissions

🎯 Achievement

Transformed workflow engine configuration from static, single-instance, manual setup to dynamic, multi-instance, self-service registration while maintaining 100% backward compatibility with existing deployments.

Impact

Developer Experience

  • Zero configuration - Existing setups continue working
  • Flexible registration - Add multiple engines easily (once UI is built)
  • No secrets in config - API keys encrypted in database

Production Deployment

  • Azure unchanged - Environment variables work as before
  • No migration required - Graceful fallback to environment
  • Scalable - Support unlimited engine instances

Future-Proof Architecture

  • Multi-engine ready - N8N, Prefect, Windmill, Power Automate
  • Multi-tenant capable - Foundation for per-organization engines
  • API-first - Easy to build UI or CLI on top

Conclusion

The workflow engine registration system provides a solid foundation for flexible, scalable workflow orchestration while maintaining backward compatibility. The core backend is complete and production-ready. The Admin UI (Phase 2) will unlock the full self-service experience for users.

Recommendation: Deploy Phase 1 (backend) to production now, build Phase 2 (UI) in next sprint.