Portable AI workflows in simple YAML

Orchestrate LLM agents with built-in observability, cost tracking, and security.

Review PRs for security issues. Run via CLI, GitHub webhook, or CI pipeline.

name: security-review
description: Review PRs for security issues

inputs:
  - name: owner
    type: string
  - name: repo
    type: string
  - name: pr_number
    type: number

steps:
  - id: get_diff
    github.get_pull_request_diff:
      owner: "{{.inputs.owner}}"
      repo: "{{.inputs.repo}}"
      number: "{{.inputs.pr_number}}"

  - id: review
    type: llm
    model: balanced
    prompt: |
      Review this diff for security vulnerabilities.
      Flag any issues found with severity and remediation.

      {{.steps.get_diff.content}}

  - id: comment
    github.create_comment:
      body: "{{.steps.review.response}}"

Everything you need for production AI workflows

YAML-First

Define complex workflows in simple, readable YAML. No SDK required.

Any LLM

Swap between Anthropic, OpenAI, Ollama, or other providers with a single config change.
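As a sketch of what that could look like (the key and model names below are illustrative assumptions, not Conductor's documented schema), switching providers might be a one-line edit:

```yaml
# Hypothetical provider config -- key names and values are illustrative,
# not Conductor's documented schema.
llm:
  provider: anthropic   # swap to "openai" or "ollama" to change backends
  model: balanced       # alias used by the example workflow above
```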

Production Ready

Built-in observability, cost tracking, and security controls.

Flexible Runtime

CLI, API server, webhooks, or scheduled execution.

Connectors

GitHub, Slack, Jira integrations with zero code.
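Connector steps would follow the same pattern as the github.* steps in the example above; a Slack notification step might look like this (the slack.post_message action name and its fields are assumptions modeled on that example):

```yaml
# Hypothetical connector step -- the action name and fields are
# illustrative, patterned after the github.* steps above.
- id: notify
  slack.post_message:
    channel: "#security"
    text: "{{.steps.review.response}}"
```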

Auditable

Security teams review declarative YAML, not agent code scattered across codebases.

How it works

1

Write YAML

Define your workflow in a simple YAML file

2

Configure

Set your LLM provider credentials

3

Run

Execute from CLI, API, or on a schedule

4

Iterate

Refine prompts and add more steps
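Under assumed conventions (provider credentials in an environment variable and a `conductor run` subcommand, neither of which is confirmed here), steps 2 and 3 might look like:

```shell
# Hypothetical invocation -- subcommand, flag names, and input values
# are illustrative assumptions.
export ANTHROPIC_API_KEY=...          # step 2: configure credentials
conductor run security-review.yaml \
  --input owner=acme --input repo=api --input pr_number=42   # step 3: run
```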

Ready to get started?

Install Conductor and run your first workflow in minutes.

go install github.com/tombee/conductor@latest
Read the Docs