Break free from DSL hell. Dimensigon orchestrations speak your language: Shell, Python, HTTP/REST, or nested workflows. One engine. Every language.
A financial services company needs to deploy applications across dev, staging, and production environments with language-agnostic workflows.
{
  "name": "deploy-financial-app",
  "version": 1,
  "steps": [
    {
      "name": "validate-deployment",
      "action_type": "shell",
      "code": "docker pull myapp:latest && docker inspect myapp:latest | jq '.RepoTags'",
      "target": "prod-servers"
    },
    {
      "name": "check-compliance",
      "action_type": "python",
      "code": "import hashlib\nwith open('/app/binary', 'rb') as f:\n    sha256 = hashlib.sha256(f.read()).hexdigest()\n    assert sha256 == '{{vault.expected_hash}}', 'Binary mismatch!'",
      "timeout": 30
    },
    {
      "name": "notify-audit",
      "action_type": "http",
      "method": "POST",
      "url": "https://audit.internal/api/v1/deployments",
      "headers": {
        "Authorization": "Bearer {{vault.audit_token}}"
      },
      "body": {
        "environment": "production",
        "version": "{{vault.app_version}}",
        "timestamp": "2025-02-20T06:00:00Z"
      }
    }
  ]
}
No custom DSL. No limitations. Use the right tool for each job.
Define base orchestrations, then compose them into complex workflows without duplication:
- health-check.json - Generic health check across any service
- deploy-service.json - Deploy + health check + rollback
- full-pipeline.json - Multi-service orchestration calling the above

Orchestrate 10 services with 3 nested calls instead of 100 steps.
// health-check.json (reusable)
{
  "name": "health-check",
  "steps": [
    {
      "action_type": "shell",
      "code": "curl -sf {{service_url}}/health"
    }
  ]
}
// full-pipeline.json (composes 3 services)
{
  "name": "deploy-all",
  "steps": [
    {
      "name": "deploy-web",
      "action_type": "orchestration",
      "orchestration_id": "deploy-service",
      "params": {
        "service": "web-api"
      }
    },
    {
      "name": "deploy-worker",
      "action_type": "orchestration",
      "orchestration_id": "deploy-service",
      "params": {
        "service": "worker"
      }
    }
  ]
}
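One way to picture what the engine does with nested calls: expand each `orchestration` step into the steps of the referenced definition before running anything. The sketch below is illustrative only; the registry, the `deploy-service` step contents, and the `expand` function are assumptions, not Dimensigon's actual internals.

```python
# Hypothetical sketch: recursively flatten nested "orchestration" steps.
# The registry contents below are invented for illustration.
REGISTRY = {
    "deploy-service": {
        "name": "deploy-service",
        "steps": [
            {"name": "deploy", "action_type": "shell",
             "code": "deploy.sh {{service}}"},
            {"name": "health", "action_type": "shell",
             "code": "curl -sf {{service_url}}/health"},
        ],
    },
}

def expand(orchestration, registry):
    """Replace each orchestration reference with the referenced steps."""
    flat = []
    for step in orchestration["steps"]:
        if step.get("action_type") == "orchestration":
            nested = registry[step["orchestration_id"]]
            flat.extend(expand(nested, registry))
        else:
            flat.append(step)
    return flat

pipeline = {
    "name": "deploy-all",
    "steps": [
        {"name": "deploy-web", "action_type": "orchestration",
         "orchestration_id": "deploy-service", "params": {"service": "web-api"}},
        {"name": "deploy-worker", "action_type": "orchestration",
         "orchestration_id": "deploy-service", "params": {"service": "worker"}},
    ],
}

print(len(expand(pipeline, REGISTRY)))  # → 4
```

Two nested calls, each expanding to two steps: that is how a handful of references can stand in for a much longer flat step list.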
// Orchestration combining all languages
{
  "name": "etl-pipeline",
  "steps": [
    {
      "name": "extract-data",
      "action_type": "shell",
      "code": "aws s3 sync s3://data-lake /tmp/data --profile prod"
    },
    {
      "name": "transform",
      "action_type": "python",
      "code": "import pandas as pd\ndf = pd.read_csv('/tmp/data/raw.csv')\ndf['timestamp'] = pd.to_datetime(df['timestamp'])\ndf.to_parquet('/tmp/data/transformed.parquet')"
    },
    {
      "name": "load",
      "action_type": "http",
      "method": "POST",
      "url": "https://dw.internal/api/ingest",
      "files": {
        "data": "/tmp/data/transformed.parquet"
      }
    }
  ]
}
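Mixing languages in one pipeline comes down to dispatching each step on its `action_type`. The handler functions below are stand-ins invented for this sketch; the real engine's execution model is not shown here.

```python
# Hypothetical dispatcher sketch: route a step to a handler by its
# "action_type" field. Handlers here just describe the step.
def run_shell(step):
    return f"shell: {step['code']}"

def run_python(step):
    return f"python: {len(step['code'])} chars of code"

def run_http(step):
    return f"http: {step['method']} {step['url']}"

HANDLERS = {"shell": run_shell, "python": run_python, "http": run_http}

def run_step(step):
    try:
        handler = HANDLERS[step["action_type"]]
    except KeyError:
        raise ValueError(f"unknown action_type: {step['action_type']}")
    return handler(step)

print(run_step({"action_type": "http", "method": "POST",
                "url": "https://dw.internal/api/ingest"}))
# → http: POST https://dw.internal/api/ingest
```

A table of handlers keyed by `action_type` is what lets one engine treat shell, Python, and HTTP steps uniformly.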
No duct-taping languages together. Everything runs in the orchestration engine with shared vault variables.
Shell for sysadmin tasks, Python for data work, HTTP for APIs. No compromises.
Build libraries of reusable orchestrations. Compose them at scale.
Teams use languages they already know. No DSL to master.
All languages get secret injection and variable substitution automatically.
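The `{{vault.audit_token}}` and `{{service_url}}` placeholders in the examples above hint at how this works: substitute variables into the step before handing it to its handler. The regex and `substitute` function below are a minimal sketch of that idea, not Dimensigon's actual templating rules (nesting, escaping, and vault lookup may differ).

```python
import re

# Hypothetical sketch of {{...}} placeholder substitution.
PLACEHOLDER = re.compile(r"\{\{\s*([\w.]+)\s*\}\}")

def substitute(text, variables):
    """Replace each {{key}} with its value; fail loudly on unknown keys."""
    def lookup(match):
        key = match.group(1)
        if key not in variables:
            raise KeyError(f"undefined variable: {key}")
        return str(variables[key])
    return PLACEHOLDER.sub(lookup, text)

code = "curl -sf {{service_url}}/health"
print(substitute(code, {"service_url": "https://web-api.internal"}))
# → curl -sf https://web-api.internal/health
```

Because substitution happens before dispatch, every action type, whether shell, Python, or HTTP, sees the same resolved values.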