Nanci

ci without the git commit -m 'try fix ci' over and over
ci where if it passes locally, it passes in the cloud
ci where build artifacts are just return values
ci where env vars and cwd stay put between commands
ci where @autocache decides when to rerun, so you don't have to
ci you can step through with a debugger
ci where jobs are just async functions
ci you can debug on your laptop
ci written in plain Python
ci that fits in one Python file
ci where the tests that didn't change don't run
ci without the YAML headaches

A syntax you'll remember, because you know it already

@job(image="rust:latest")
@autocache
async def test(verbose: bool = False):
    await ci.upload("leviathan/", "/app")
    "cd /app"
    if verbose:
        "cargo test --color always -- --nocapture"
    else:
        "cargo test --color always"

@job(image="rust:latest")
async def build() -> dict[str, Artifact]:
    await ci.upload("leviathan/", "/app")
    "cd /app"
    "apt-get update -qq"
    "apt-get install -y -qq gcc-x86-64-linux-gnu"

    binaries = {}
    for target, linker in TARGETS:
        f"rustup target add {target}"

        target_upper = target.upper().replace('-', '_')
        env = f"CARGO_TARGET_{target_upper}_LINKER={linker} " if linker else ""
        f"{env}cargo build --release --target {target}"

        binary_path = f"/app/target/{target}/release/leviathan"
        size = int((await ci.exec(f"wc -c < {binary_path}")).strip())
        print(f"  {target}: {size:,} bytes")

        binaries[target] = await ci.download(binary_path, AsArtifact())

    return binaries

Ifs, loops, functions, dictionaries, JSON parsing, string manipulation — you already know how to write them. So why wrestle them into YAML just to run a pipeline?

Nanci pipelines are plain Python.

A live view of every run

Every pipeline run is captured and streamed live to a web UI. Watch jobs progress in real time, inspect logs, and share results with your team.

ANSI colors and animations fully supported.

Test and fix your pipelines locally, before you push

Nanci uses the same engine locally and in the cloud — so you can run python nanci_ci.py and get the exact same behaviour you'd see in CI.

Iterate fast, catch failures early, fix them before a push ever leaves your machine.

Watch your pipelines 🏃 in a terminal UI.

Stop guessing. Start stepping.

Because pipelines run locally as plain Python, you can attach any debugger you like — pdb, VS Code, PyCharm. Set a breakpoint inside a job, step through execution, inspect variables.

No more guessing what went wrong from logs alone.

Debugging a Nanci pipeline in VS Code

Jobs that know when to skip themselves

Autocached job

Add @autocache to a job and Nanci figures out what it depends on — files, arguments, even the job's own code. If none of that changed, the job is skipped.

No conditions to write. No cache keys to maintain.
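The idea can be sketched in a few lines of plain Python. This is a stdlib-only illustration, not Nanci's actual implementation: it hashes the job's compiled bytecode together with its arguments and skips the run when that hash matches the last successful one. (Nanci also tracks file dependencies; this sketch does not.)

```python
import asyncio
import hashlib

# job name -> hash of its last successful run
_last_run: dict[str, str] = {}

def autocache(fn):
    # Sketch only: the cache key covers the job's own code (via its
    # compiled bytecode) and the arguments it was called with.
    async def wrapper(*args, **kwargs):
        key = hashlib.sha256(
            fn.__code__.co_code + repr((args, kwargs)).encode()
        ).hexdigest()
        if _last_run.get(fn.__name__) == key:
            return "skipped"  # nothing relevant changed since last run
        result = await fn(*args, **kwargs)
        _last_run[fn.__name__] = key
        return result
    return wrapper

@autocache
async def double(x: int) -> int:
    # stand-in for a real job body
    return x * 2
```

Run `double(2)` twice and the second call is skipped; change the argument, or the function's code, and it runs again.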

A shell that doesn't forget

In most CI platforms, every line is its own little amnesiac shell — cd somewhere, export a variable, and watch it vanish on the next line.

It's a quirk you just learn to work around — until you don't have to.

No such gotchas in Nanci. The working directory and environment stick around. No hidden resets, no surprises.

@job
async def release():
    "cd /app"
    "export TAG=$(git describe --tags --abbrev=0)"
    "export GOOS=linux GOARCH=amd64"

    # cd and every export are still in effect here
    "go build -o bin/app-$TAG ."
    "scp bin/app-$TAG deploy@prod:/opt/app/"

Uncomplicated artifacts

@job
async def build() -> Artifact:
    "cargo build --release"
    return await ci.download(
        "/app/target/release/binary",
        AsArtifact(),
    )

@job
async def publish(binary: Artifact):
    await ci.upload(binary, "/deploy/binary")
    "systemctl restart app"

Other tools make you upload, store, and re-download files between jobs.

In Nanci, you just return them.
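Because jobs are just async functions, one job's artifact is simply another job's argument. The sketch below mirrors the build/publish example above using plain asyncio and no Nanci imports; the bytes value stands in for a real Artifact.

```python
import asyncio

async def build() -> bytes:
    # stand-in for compiling and downloading the release binary
    return b"binary-contents"

async def publish(binary: bytes) -> str:
    # stand-in for uploading the binary and restarting the service
    return f"deployed {len(binary)} bytes"

async def pipeline() -> str:
    # the artifact flows from build to publish as a plain return value
    return await publish(await build())

print(asyncio.run(pipeline()))
```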

First-class GitHub integration

Nanci reports job statuses back to GitHub in real time. See which jobs passed, which are running, and jump straight to the detailed logs — all without leaving your pull request.

Push your code, and let Nanci take it from there.

Nanci GitHub integration

Architecture

Nanci architecture diagram

The webhook listener is a lightweight, always-on service whose only job is to receive push events from GitHub and durably enqueue them. This keeps the critical path of accepting triggers fast and resilient.

Server instances pull work from that queue and orchestrate the run: they write the initial state to the database, open a check on the GitHub commit via the API, then enqueue a message for a runner to pick up.

Runner instances pick up those messages and execute the CI pipeline using the same Nanci Engine that runs locally on your machine — sandboxed inside a VM to prevent untrusted pipeline code from escaping. As the pipeline progresses, results are streamed back to a server instance which keeps both the database and the GitHub checks UI in sync in real time.

Servers and runners scale independently, with the queue naturally distributing load between them.
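The listener's critical path can be sketched in a few lines: check that the delivery really came from GitHub, then durably enqueue it. This is an illustration, not Nanci's code; the function names are invented and the in-memory list stands in for the real queue. The signature check itself follows GitHub's documented scheme, an HMAC-SHA256 of the request body sent in the X-Hub-Signature-256 header.

```python
import hashlib
import hmac
import json

WEBHOOK_SECRET = b"change-me"  # the secret configured on the GitHub App

def verify_signature(payload: bytes, signature_header: str) -> bool:
    # GitHub sends "X-Hub-Signature-256: sha256=<hex HMAC of the body>"
    expected = "sha256=" + hmac.new(WEBHOOK_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_header)

def handle_push(payload: bytes, signature_header: str, queue: list[str]) -> bool:
    # accept fast: validate, enqueue, return; orchestration happens later
    if not verify_signature(payload, signature_header):
        return False
    queue.append(json.dumps({"event": "push", "body": payload.decode()}))
    return True
```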

Get Started

Writing the pipeline

  1. pip install nanci
  2. Create a file called nanci_ci.py
  3. Write your first pipeline:
    from nanci import job
    import asyncio
    
    @job
    async def hello_world():
        "echo Hello World!"
    
    asyncio.run(hello_world())
    

Running it locally

  1. python nanci_ci.py

Running in the cloud

  1. Create a GitHub App that users can install on their repos, and note down the client ID and private key.
  2. If you don't have a public domain yet, get a webhook proxy URL at smee.io.
  3. Generate a random secret key for signing JWT tokens and cookies.
  4. Provide an Ubuntu base disk image as base.qcow2.
  5. Embed the Nanci Engine into it by running bake_nanci_into_vm_image.sh.
  6. Fill in the following environment variables used by docker-compose.yml:
    GH_CLIENT_ID=
    GH_PRIVATE_KEY=
    
    NANCI_RUNNER_RABBIT_MQ_URL=
    # any free port on the host
    # each runner instance must use a different one
    NANCI_RUNNER_VM_SSH_PORT=
    # directory containing the EFI firmware files
    NANCI_RUNNER_EFI_DIR_PATH=
    # path to the baked qcow2 image from step 5
    NANCI_RUNNER_BASE_IMAGE_PATH=
    NANCI_RUNNER_GITHUB__CLIENT_ID=
    NANCI_RUNNER_GITHUB__PRIVATE_KEY=
    # URL of the Nanci Server, or a load balancer in front of it
    NANCI_RUNNER_SERVER_URL=
    
    COOKIE_KEY=
    JWT_KEY=
    SMEE_URL=
  7. docker compose up -d
  8. Open localhost:9090 in your browser.
  9. Create an organization and a project.
  10. Link the project to a GitHub repo by authorizing the app you created in step 1.
  11. Push a commit and watch Nanci take it from there.

Docs