Pipelines

CI/CD with visual pipeline graphs, structured logs, and pre-execution validation.

Drok Pipelines is a CI/CD system designed by engineers who have spent years fighting YAML indentation errors, deciphering opaque log output, and waiting for pipelines to fail on line 3 of a 200-line configuration file because of a typo that could have been caught before execution began.

Pipeline Configuration

Pipelines are defined in .lehub/pipeline.yml at the root of your repository.

# .lehub/pipeline.yml
 
stages:
  - name: build
    steps:
      - name: Install dependencies
        run: cargo build --release
      - name: Run linter
        run: cargo clippy -- -D warnings
 
  - name: test
    depends_on: [build]
    steps:
      - name: Unit tests
        run: cargo test --lib
      - name: Integration tests
        run: cargo test --test integration
        timeout: 10m
 
  - name: deploy
    depends_on: [test]
    when: branch == "main"
    steps:
      - name: Deploy to production
        run: ./scripts/deploy.sh
        secrets: [DEPLOY_KEY, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY]

Pre-Execution Validation

Every pipeline.yml is validated before execution. Drok parses the configuration and checks for:

  • Syntax errors — Invalid YAML structure, unknown keys, type mismatches
  • Dependency cycles — Stages that depend on each other in a cycle
  • Missing references — References to stages, secrets, or artifacts that do not exist
  • Security violations — Secrets used in contexts where they would be exposed (e.g., in logs)

If validation fails, the pipeline does not execute. The validation error is displayed immediately with the exact line and column of the problem. You do not wait 45 seconds for a container to spin up only to discover a typo.
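For example, a configuration with a dependency cycle is rejected at validation time, before any runner is provisioned:

```yaml
# This configuration fails validation before any container starts:
# "build" and "test" depend on each other, forming a dependency cycle.
stages:
  - name: build
    depends_on: [test]   # cycle: build -> test -> build
    steps:
      - name: Compile
        run: cargo build
  - name: test
    depends_on: [build]
    steps:
      - name: Test
        run: cargo test
```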

Visual Pipeline Graph

The pipeline execution page displays a visual directed acyclic graph (DAG) of your pipeline stages. Each node represents a stage, edges represent dependencies, and the graph updates in real time as stages complete.

  • Green nodes — Stage completed successfully
  • Red nodes — Stage failed
  • Blue nodes — Stage currently executing
  • Gray nodes — Stage waiting for dependencies
  • Yellow nodes — Stage skipped (condition not met)

Click any node to expand its step-by-step execution log.

Structured Logs

Pipeline logs are not a wall of undifferentiated text. They are structured:

  • Collapsible sections — Each step produces a collapsible log section. Expand only the steps you need to investigate.
  • Timestamp prefixes — Every log line includes a precise timestamp relative to step start.
  • ANSI color rendering — Terminal colors from your build tools render correctly in the web interface.
  • Search — Full-text search within pipeline logs with regex support.
  • Log persistence — Logs are retained for 90 days on Pro plans, 365 days on Enterprise.

Structured Annotations

Pipeline steps can emit structured annotations that surface in the merge request diff:

echo "::warning file=src/lib.rs,line=42::Consider using a bounded channel here"
echo "::error file=src/main.rs,line=15::Unused import"

Annotations appear inline in the merge request diff view, positioned at the referenced file and line.
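As a sketch, a step script can generate annotations programmatically. The `::warning` syntax matches the examples above; the file path and the TODO check are illustrative:

```shell
# Sketch: a step script that turns TODO comments into warning annotations.
# Write a small example source file, then emit one annotation per match.
printf 'fn main() {\n    // TODO: handle errors\n}\n' > /tmp/example.rs
annotations=$(grep -n 'TODO' /tmp/example.rs | while IFS=: read -r line _; do
  echo "::warning file=/tmp/example.rs,line=$line::Unresolved TODO"
done)
echo "$annotations"
```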

Triggers

Automatic Triggers

triggers:
  push:
    branches: [main, develop]
  merge_request:
    types: [opened, synchronize]
  tag:
    pattern: "v*"
  schedule:
    cron: "0 2 * * 1"  # Every Monday at 2 AM

Manual Triggers

Pipelines can be triggered manually from the web interface or CLI:

drok pipeline run --branch main
drok pipeline run --branch main --variable DEPLOY_ENV=staging
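A sketch of consuming a manually supplied variable in a step, assuming variables are injected as environment variables (matching the behavior documented for secrets):

```yaml
stages:
  - name: deploy
    steps:
      - name: Deploy
        # DEPLOY_ENV is supplied via --variable on manual runs.
        run: ./scripts/deploy.sh "$DEPLOY_ENV"
```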

API Triggers

curl -X POST https://drok.us/api/v1/repos/org/repo/pipelines \
  -H "Authorization: Bearer $DROK_TOKEN" \
  -d '{"ref": "main", "variables": {"DEPLOY_ENV": "staging"}}'

Environments

Define deployment environments with protection rules:

environments:
  staging:
    url: https://staging.your-app.com
  production:
    url: https://your-app.com
    protection:
      required_reviewers: ["@devops"]
      wait_timer: 30m

Production deployments can require manual approval from designated reviewers and enforce a cooldown timer between deployments.
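A sketch of binding a deploy stage to an environment so its protection rules apply. The `environment` key on a stage is an assumption here, shown for illustration:

```yaml
stages:
  - name: deploy
    depends_on: [test]
    environment: production   # assumed syntax for attaching protection rules
    steps:
      - name: Deploy
        run: ./scripts/deploy.sh
```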

Secrets

Secrets are stored encrypted and injected into pipeline steps as environment variables:

drok secret set DEPLOY_KEY --value "your-secret-value"
drok secret set AWS_ACCESS_KEY_ID --value "AKIA..."

Secrets are:

  • Encrypted at rest — AES-256-GCM with per-secret keys
  • Masked in logs — Any output matching a secret value is replaced with ***
  • Scoped — Secrets can be scoped to specific branches or environments
  • Audited — Secret access is logged in the organization audit log
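Masking can be pictured as a substitution pass over log output. This is a minimal sketch, not the actual masker: the secret value and log line are illustrative, and a real implementation must also handle regex metacharacters and multi-line secrets:

```shell
# Sketch of log masking: any output matching a secret value becomes ***.
DEPLOY_KEY="s3cr3t-value"
log_line="deploying with key s3cr3t-value to production"
masked=$(printf '%s\n' "$log_line" | sed "s/$DEPLOY_KEY/***/g")
echo "$masked"
```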

Artifacts

Pipeline steps can produce artifacts that persist after execution:

steps:
  - name: Build
    run: cargo build --release
    artifacts:
      paths:
        - target/release/my-binary
      expire_in: 30d

Artifacts are downloadable from the pipeline page and accessible via the API. They can be passed between stages:

stages:
  - name: build
    steps:
      - name: Compile
        run: cargo build --release
        artifacts:
          paths: [target/release/my-binary]
 
  - name: deploy
    depends_on: [build]
    steps:
      - name: Deploy binary
        run: ./scripts/deploy.sh target/release/my-binary
        needs_artifacts_from: [build]

Caching

Cache dependencies between pipeline runs to avoid redundant downloads:

cache:
  key: "${CI_COMMIT_REF_SLUG}"
  paths:
    - target/
    - .cargo/registry/
    - node_modules/

Cache keys support variable interpolation. Use branch-specific keys for isolation or shared keys for cross-branch caching.
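Two common keying strategies, sketched below using the `CI_COMMIT_REF_SLUG` variable from the example above:

```yaml
# Branch-isolated cache: each branch warms and reuses its own cache.
cache:
  key: "${CI_COMMIT_REF_SLUG}"
  paths: [target/]
---
# Shared cache: faster first runs on new branches, at the risk of staleness.
cache:
  key: "rust-deps-v1"
  paths: [.cargo/registry/]
```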

Services

Spin up auxiliary services (databases, message queues) for integration testing:

stages:
  - name: integration
    services:
      - name: postgres
        image: postgres:16
        env:
          POSTGRES_DB: test
          POSTGRES_USER: test
          POSTGRES_PASSWORD: test
      - name: redis
        image: redis:7
    steps:
      - name: Run integration tests
        run: cargo test --test integration
        env:
          DATABASE_URL: postgres://test:test@postgres:5432/test
          REDIS_URL: redis://redis:6379
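Service containers can take a few seconds to start accepting connections. A sketch of an explicit readiness wait before the tests run; the retry count and the `pg_isready` invocation are illustrative:

```yaml
steps:
  - name: Wait for postgres
    run: |
      # Poll for up to 30 seconds before giving up.
      for i in $(seq 1 30); do
        pg_isready -h postgres -U test && exit 0
        sleep 1
      done
      echo "postgres did not become ready" >&2
      exit 1
```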

Sandboxed Integrations

Third-party pipeline integrations run in sandboxed environments with zero-trust network policies. Each integration:

  • Runs in an isolated container with no access to host resources
  • Has network access restricted to explicitly declared endpoints
  • Cannot access secrets unless explicitly granted
  • Is audited for all API calls and resource access

Matrix Builds

Run the same pipeline across multiple configurations:

stages:
  - name: test
    matrix:
      rust: ["1.75", "1.76", "stable", "nightly"]
      os: ["ubuntu-latest", "macos-latest"]
    steps:
      - name: Test on Rust ${{ matrix.rust }}
        run: |
          rustup override set ${{ matrix.rust }}
          cargo test

Matrix builds execute in parallel. The pipeline graph displays each matrix combination as a separate node.
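The expansion is a Cartesian product of the matrix axes. The sketch below enumerates the combinations the 4x2 matrix above produces, one per pipeline node:

```shell
# Sketch: each (rust, os) pair becomes one parallel stage instance.
combos=$(for rust in 1.75 1.76 stable nightly; do
  for os in ubuntu-latest macos-latest; do
    echo "test [rust=$rust, os=$os]"
  done
done)
echo "$combos"
```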

Migrating from GitHub Actions

See Migration for converting GitHub Actions workflows to Drok Pipelines. The general mapping:

GitHub Actions               Drok Pipelines
.github/workflows/*.yml      .lehub/pipeline.yml
jobs                         stages
steps                        steps
needs                        depends_on
if                           when
secrets.MY_SECRET            $MY_SECRET (injected as env var)
actions/checkout@v4          Automatic (repository is always checked out)
actions/cache@v4             cache block