
Testing Guide: Workflow Engine Registration System

This guide explains how to test the newly implemented workflow engine registration system.

Quick Start

1. Apply Database Migration

# Inside docker container
docker-compose exec web alembic upgrade head

# Or with docker-compose run
docker-compose run --rm web alembic upgrade head

Expected Output:

INFO  [alembic.runtime.migration] Running upgrade 9c7deff4ac4a -> 453c1befe40, add workflow_engines table
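
To double-check that the migration landed, a minimal sketch that reads Alembic's version table and confirms the new table exists (assumes the app's SQLAlchemy session factory in app.util.database, as used throughout this guide):

docker-compose exec web python << 'EOF'
from sqlalchemy import inspect, text
from app.util.database import get_db

db = next(get_db())
# Alembic records the current revision in its own version table
print("Alembic revision:", db.execute(text("SELECT version_num FROM alembic_version")).scalar())
# The migration should have created this table
print("workflow_engines exists:", inspect(db.get_bind()).has_table("workflow_engines"))
db.close()
EOF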

2. Run Backend Tests

# Inside docker container
docker-compose exec web python test_workflow_engines.py

# Or with docker-compose run
docker-compose run --rm web python test_workflow_engines.py

Expected Output:

============================================================
WORKFLOW ENGINE REGISTRATION - BACKEND TESTS
============================================================

============================================================
TEST 1: Encryption/Decryption
============================================================
✓ Original key: test-api-key-12345
✓ Encrypted: ...
✓ Decrypted: test-api-key-12345
✅ Basic encryption test PASSED
✅ None/empty handling PASSED
✅ Special character handling PASSED

============================================================
TEST 2: CRUD Operations
============================================================
✓ Creating test engine...
✅ Created engine: test-engine (ID: 1)
✓ Reading engine...
✅ Retrieved engine: test-engine
✓ Updating engine...
✅ Updated engine: active=False
✓ Deleting engine...
✅ Deleted engine

============================================================
TEST 3: Registry Loading
============================================================
✓ Initializing registry with database...
✅ Loaded 1 total engine(s)
✅ 1 enabled engine(s)
   - n8n-default: n8n (enabled)

============================================================
TEST 4: Database Model
============================================================
✅ Created engine via SQLAlchemy: test-model-engine (ID: 2)
   Type: n8n
   URL: http://test.local/api
   Active: True
   Config: {'test': 'value'}
   Created: 2025-11-04 ...
✅ Cleaned up test engine

============================================================
✅ ALL TESTS PASSED!
============================================================

3. Verify Startup

# Restart services to see multi-source loading
docker-compose restart web

# Check logs
docker-compose logs web | grep -i "workflow"

Expected Log Output:

web_1  | Initializing workflow engine registry (multi-source)
web_1  | Loading workflow engines from environment variables
web_1  | Loaded N8N workflow engine from environment: n8n-default
web_1  | Workflow registry initialized with 1 engines
web_1  | Workflows  : 1/1 engines enabled
web_1  | WF Details : n8n-default:n8n(enabled)


Manual Testing Scenarios

Scenario 1: Environment Variable Fallback (Azure Compatibility)

Goal: Verify existing environment variable setup still works

Steps: 1. Ensure .env.local has N8N_API_URL and N8N_API_KEY set

  2. Ensure no engines exist in the database:

    docker-compose exec postgres psql -U postgres -d slidefactory -c "SELECT * FROM workflow_engines;"

  3. Restart services: docker-compose restart web

  4. Check logs: Should see "n8n-default" loaded from environment

Expected: ✅ Engine loads from environment variables
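
If the engine does not appear, a quick sketch to confirm the container actually sees the two environment variables this scenario relies on:

docker-compose exec web python << 'EOF'
import os

# Scenario 1 depends on these variables being visible inside the web container
for var in ("N8N_API_URL", "N8N_API_KEY"):
    print(f"{var}: {'set' if os.environ.get(var) else 'MISSING'}")
EOF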


Scenario 2: Database Registration (Manual)

Goal: Register engine directly in database, verify it loads

Steps: 1. Register an engine via the CRUD helper:

docker-compose exec web python << 'EOF'
from app.util.database import get_db
from app.workflowengine.crud import create_engine

db = next(get_db())
engine = create_engine(
    db=db,
    name="n8n-test",
    engine_type="n8n",
    api_url="http://localhost:5678/api/v1",
    api_key="",
    active=True
)
print(f"Created: {engine.name} (ID: {engine.id})")
db.close()
EOF

  2. Restart services: docker-compose restart web

  3. Check logs: Should see both "n8n-default" (environment) and "n8n-test" (database)

Expected: ✅ Both engines loaded
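
To confirm from Python rather than the logs, a sketch that initializes a fresh registry the same way the load test later in this guide does (assumes get_all_engines() returns an iterable; the exact shape of the returned objects may differ):

docker-compose exec web python << 'EOF'
import asyncio
from app.util.database import get_db
from app.workflowengine.registry import WorkflowRegistry

db = next(get_db())
registry = WorkflowRegistry()
asyncio.run(registry.initialize(db=db))

# Expect two entries: n8n-default (environment) and n8n-test (database)
engines = registry.get_all_engines()
print(f"Registry sees {len(engines)} engine(s)")
for engine in engines:
    print(" -", engine)

db.close()
EOF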


Scenario 3: API Key Encryption

Goal: Verify API keys are encrypted in database

Steps: 1. Create engine with API key:

docker-compose exec web python << 'EOF'
from app.util.database import get_db
from app.workflowengine.crud import create_engine

db = next(get_db())
engine = create_engine(
    db=db,
    name="n8n-with-key",
    engine_type="n8n",
    api_url="http://localhost:5678/api/v1",
    api_key="my-secret-key-12345",
    active=True
)
print(f"Created: {engine.name}")
db.close()
EOF

  2. Check the database - the API key should be encrypted:

    docker-compose exec postgres psql -U postgres -d slidefactory -c \
      "SELECT name, api_key_encrypted FROM workflow_engines WHERE name='n8n-with-key';"

  3. Verify the encrypted string looks like a Fernet token (starts with "gAAAAA...")

Expected: ✅ API key is encrypted, not plain text
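
The same check can be made from Python; a minimal sketch that reads the stored column directly (table and column names as used elsewhere in this guide, session via app.util.database):

docker-compose exec web python << 'EOF'
from sqlalchemy import text
from app.util.database import get_db

db = next(get_db())
row = db.execute(
    text("SELECT api_key_encrypted FROM workflow_engines WHERE name = :name"),
    {"name": "n8n-with-key"},
).first()
stored = row[0] if row else None

# A Fernet token starts with "gAAAAA"; the plain key must never appear here
print("Stored value:", (stored[:20] + "...") if stored else "engine not found")
print("Looks like a Fernet token:", bool(stored) and stored.startswith("gAAAAA"))
assert stored != "my-secret-key-12345", "API key stored as plain text!"
db.close()
EOF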


Scenario 4: Connection Test

Goal: Test connection validation before registration

Steps: 1. Test with valid N8N instance:

docker-compose exec web python << 'EOF'
from app.util.database import get_db
from app.workflowengine.crud import create_engine, test_engine_connection

db = next(get_db())

# Create engine
engine = create_engine(
    db=db,
    name="n8n-connection-test",
    engine_type="n8n",
    api_url="http://n8n:5678/api/v1",
    api_key="",
    active=True
)

# Test connection
result = test_engine_connection(db, engine.id)
print(f"Success: {result['success']}")
print(f"Message: {result['message']}")
if 'details' in result:
    print(f"Details: {result['details']}")

db.close()
EOF

Expected: ✅ Connection succeeds if N8N is running, shows workflow count
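
It is also worth exercising the failure path. A sketch using the same helpers as above, pointed at an address nothing listens on; the test should report a failure rather than raise:

docker-compose exec web python << 'EOF'
from app.util.database import get_db
from app.workflowengine.crud import create_engine, test_engine_connection

db = next(get_db())

# Deliberately unreachable URL (.invalid never resolves)
engine = create_engine(
    db=db,
    name="n8n-unreachable-test",
    engine_type="n8n",
    api_url="http://unreachable.invalid:5678/api/v1",
    api_key="",
    active=True
)

result = test_engine_connection(db, engine.id)
print(f"Success: {result['success']}")   # expected: False
print(f"Message: {result['message']}")

db.close()
EOF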


Scenario 5: Multi-Source Priority

Goal: Verify database engines take precedence over environment

Steps: 1. Create database engine with same name as environment:

docker-compose exec web python << 'EOF'
from app.util.database import get_db
from app.workflowengine.crud import create_engine

db = next(get_db())
try:
    engine = create_engine(
        db=db,
        name="n8n-default",  # Same as environment
        engine_type="n8n",
        api_url="http://database-override:5678/api/v1",
        api_key="",
        active=True
    )
    print(f"Created: {engine.name}")
except Exception as e:
    print(f"Expected error (duplicate name): {e}")
db.close()
EOF

  2. Note: Should fail due to the unique constraint on name

  3. Try with a different name:

    docker-compose exec web python << 'EOF'
    from app.util.database import get_db
    from app.workflowengine.crud import create_engine
    
    db = next(get_db())
    engine = create_engine(
        db=db,
        name="n8n-db-priority",
        engine_type="n8n",
        api_url="http://database-first:5678/api/v1",
        api_key="",
        active=True
    )
    print(f"Created: {engine.name}")
    db.close()
    EOF
    

  4. Restart and verify both engines are loaded

Expected: ✅ Database engine loads, environment doesn't conflict


Troubleshooting

Migration fails: "relation already exists"

Cause: Table already exists from manual creation

Solution:

# Mark migration as applied without running it
docker-compose exec web alembic stamp head


"Failed to decrypt API key"

Cause: SECRET_KEY changed after encryption

Solution:

  1. Check SECRET_KEY in .env.local

  2. Re-register the engine with a new API key (see the sketch below)

  3. Or restore the original SECRET_KEY
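
A sketch of the re-registration path: remove the stale row directly (table name as used elsewhere in this guide) and recreate the engine with create_engine so the key is encrypted under the current SECRET_KEY. Engine name, URL, and key below are example values:

docker-compose exec web python << 'EOF'
from sqlalchemy import text
from app.util.database import get_db
from app.workflowengine.crud import create_engine

db = next(get_db())

# Drop the row whose key can no longer be decrypted
db.execute(text("DELETE FROM workflow_engines WHERE name = :name"), {"name": "n8n-with-key"})
db.commit()

# Recreate it - the key is re-encrypted with the current SECRET_KEY
engine = create_engine(
    db=db,
    name="n8n-with-key",
    engine_type="n8n",
    api_url="http://localhost:5678/api/v1",
    api_key="my-new-api-key",
    active=True
)
print(f"Re-registered: {engine.name}")
db.close()
EOF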


No engines loaded from database

Cause: Database session not passed to registry

Solution:

  1. Check logs for "Could not get database session"

  2. Verify the database is healthy:

    docker-compose exec postgres psql -U postgres -d slidefactory -c "SELECT 1;"

  3. Check that the migration was applied (see the sketch below)
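
A quick check from inside the app, using the same session factory the registry uses; if this fails or the table is missing, the registry cannot load database engines either:

docker-compose exec web python << 'EOF'
from sqlalchemy import text
from app.util.database import get_db

db = next(get_db())
count = db.execute(text("SELECT COUNT(*) FROM workflow_engines")).scalar()
print(f"workflow_engines rows visible to the app: {count}")
db.close()
EOF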


Connection test fails

Cause: N8N not reachable or API key invalid

Solution:

  1. Verify N8N is running: docker-compose ps n8n

  2. Check N8N logs: docker-compose logs n8n

  3. Test the URL manually: curl http://localhost:5678/healthz

  4. Verify the API key (if required)


Database Inspection

View all registered engines:

docker-compose exec postgres psql -U postgres -d slidefactory -c \
  "SELECT id, name, engine_type, api_url, active, created_at FROM workflow_engines ORDER BY created_at DESC;"

View encrypted API keys:

docker-compose exec postgres psql -U postgres -d slidefactory -c \
  "SELECT id, name, LEFT(api_key_encrypted, 20) || '...' as api_key_sample FROM workflow_engines WHERE api_key_encrypted IS NOT NULL;"

Count engines by type:

docker-compose exec postgres psql -U postgres -d slidefactory -c \
  "SELECT engine_type, COUNT(*) FROM workflow_engines GROUP BY engine_type;"

Delete all test engines:

docker-compose exec postgres psql -U postgres -d slidefactory -c \
  "DELETE FROM workflow_engines WHERE name LIKE '%test%';"

Performance Testing

Load test: Register 50 engines

docker-compose exec web python << 'EOF'
from app.util.database import get_db
from app.workflowengine.crud import create_engine
import time

db = next(get_db())
start = time.time()

for i in range(50):
    engine = create_engine(
        db=db,
        name=f"test-engine-{i}",
        engine_type="n8n",
        api_url=f"http://test{i}.local/api",
        api_key=f"key-{i}",
        active=True
    )
    if i % 10 == 0:
        print(f"Created {i+1} engines...")

elapsed = time.time() - start
print(f"\nCreated 50 engines in {elapsed:.2f} seconds")
print(f"Average: {elapsed/50:.3f} seconds per engine")

db.close()
EOF

Load test: Registry initialization with 50 engines

docker-compose exec web python << 'EOF'
import asyncio
import time
from app.util.database import get_db
from app.workflowengine.registry import WorkflowRegistry

db = next(get_db())
registry = WorkflowRegistry()

start = time.time()
asyncio.run(registry.initialize(db=db))
elapsed = time.time() - start

print(f"Loaded {len(registry.get_all_engines())} engines in {elapsed:.2f} seconds")

db.close()
EOF

Cleanup

Remove all test engines:

docker-compose exec postgres psql -U postgres -d slidefactory -c \
  "DELETE FROM workflow_engines WHERE name LIKE 'test-%';"

Reset to environment-only:

docker-compose exec postgres psql -U postgres -d slidefactory -c \
  "TRUNCATE workflow_engines RESTART IDENTITY CASCADE;"

Next Steps

Once backend testing is complete:

  1. Read the implementation report: Workflow Engine Registration System (archived)

  2. Review the deployment checklist: DEPLOYMENT_CHECKLIST.md