Specification

Concepts & Glossary

A reference for every term used in the ALP specification. If you encounter an unfamiliar term in another document, look it up here.


Core Terms

Assembly Line Protocol (ALP)

The formal name of this specification. ALP defines the contracts between the Server, Runner, Operator, and Agent roles. A server that implements the ALP API is an ALP-compliant server. A runner that speaks the ALP polling protocol is an ALP client.

Assembly Line

An ordered sequence of Stations that defines how a type of work is processed. An Assembly Line is a template — it does not run by itself. When a Task is submitted against an Assembly Line, a Task instance is created and flows through the Stations one by one.

Analogous to: GitHub Actions workflow, factory production line

Station

One discrete unit of work in an Assembly Line. Each Station has an Agent Definition that specifies what prompt and configuration to give the Agent. When a Task reaches a Station, the Server dispatches a Job to a matching Runner.

Analogous to: GitHub Actions job, factory workstation

Station Operator — see Operator

Task

The unit of work that flows through an Assembly Line from start to finish. A Task is created when a user (or external trigger) submits a Task Card. One Assembly Line can have many Tasks in flight simultaneously, each at a different Station.

Analogous to: GitHub Actions workflow run, factory work order

Task Card

The submission form that creates a Task. A Task Card has two required fields: title (a short name) and description (the full context, in markdown). Everything the Agent needs to understand the work should be in the description.

Job

A specific Station execution dispatched to a Runner. When a Task reaches a Station, the Server creates a Job and queues it. One Task produces one Job per Station it reaches. If a Station is retried, a new Job is created.

Run

A single execution of a Job on a specific Runner. One Job has one Run in the normal case. A retry creates a new Run. (Note: in the current implementation, Job and Run are often used interchangeably at the API level.)

Transition Rule

A condition that controls when and where a Task moves after a Station completes. Without Transition Rules, the default behavior applies (success → next Station, failure → halt). With Transition Rules, you can express human review gates, retries, and (in a future version) conditional branching.

Analogous to: GitHub Actions if: success() condition, factory routing rule

Gate (Human Review Gate)

A pause point in an Assembly Line. When a Task reaches a Gate, it enters the awaiting_review state and waits until a human approves or rejects it. Defined as part of a Transition Rule.

Assembly Line Repository

A hosted git repository provisioned per Task, used as a shared workspace for Agents across all Stations. Each Station's Agent clones it, writes its outputs (reports, generated files, artifacts), commits, and pushes. The next Station clones it and reads the accumulated outputs.

The Assembly Line Repository is the primary mechanism for passing state between Stations. It also serves as a full audit trail — every file produced by every Agent is versioned in git.

Analogous to: GitHub Actions artifact storage, factory shared workbench

Server

The ALP service that manages Assembly Lines, the Task queue, and the Runner registry. The Server is the authoritative coordinator — it decides which Runner gets which Job based on label matching, and it applies Transition Rules when Jobs complete.

Reference implementation: agentics.dk (closed source)

Runner

A long-running process that registers with an ALP Server, polls for Jobs, and delegates execution to an Operator. The Runner is infrastructure — it has no AI capability of its own. It is the glue between the Server and the Operator/Agent stack.

A single Runner can proxy to multiple Operators running different Agent types.

Reference implementation: pks-cli (open source, C#)

Operator (Station Operator)

The component spawned by the Runner for each Job. It prepares the execution environment (workspace, git credentials, environment variables), starts the Agent, streams output to viewers, and detects completion. The Operator knows how to run a specific type of Agent — it is the translation layer between infrastructure and AI.

Reference implementation: vibecast (open source, Go) — uses tmux + ttyd + Claude Code

Agent

The AI that executes the Station's task. It receives a prompt and context from the Operator and produces output by working in the execution environment. The Agent has no direct knowledge of the Server, the Assembly Line, or ALP itself. From the Agent's perspective, it is given a task to complete and an environment to work in.

Reference implementation: Claude Code

Agent Definition

The payload the Server sends to the Runner when dispatching a Job. It contains everything the Runner and Operator need to configure and run the Agent: the prompt, repository to clone, labels, timeouts, and the Assembly Line Repository credentials.

Label

A capability tag on a Station or a Runner. The Server matches Station labels to Runner labels when dispatching Jobs. A Runner's registered labels must be a superset of the Station's required labels for a match.

Examples: linux, claude-code, gpu, x86_64, vibecheck


The Critical Distinction: Runner vs Agent

The Runner is not the Agent.

| | Runner | Agent |
|---|---|---|
| What it is | Infrastructure daemon | AI model |
| Knows about | ALP Server, polling, tokens | Prompt, workspace, tools |
| Communicates with | ALP Server (HTTP) | Operator (process I/O, MCP) |
| Lifetime | Long-running, always on | Per-job, exits when done |
| Examples | pks-cli | Claude Code, GPT-4 CLI |

A Runner can manage multiple Operators simultaneously, each running a different type of Agent. The Runner does not decide what the Agent does — that is determined by the Station's Agent Definition on the Server.


Security Concepts

Credential Server

A Unix socket hosted by the Runner at /run/alp/cred.sock, bind-mounted into the devcontainer. Agents call it to request scoped, short-lived (JIT) tokens for external services. The Runner holds all real credentials; the Agent only ever receives proxy tokens scoped to the current Job.

See 11-security.md

Egress Proxy

An HTTP/HTTPS proxy run by the Runner (typically at host-gateway:3128) through which all outbound container traffic routes. The proxy enforces an allow-list of permitted destinations, performs Token Swap on requests, logs all egress, and can enforce Human DMZ Gates on sensitive operations.

See 11-security.md

Workload Identity

The runtime identity of an Agent, derived from its ALP_JOB_ID and ALP_STATION_LABELS rather than from a static secret. The Credential Server uses workload identity to determine which services an Agent is permitted to access. Modelled on GitHub Actions OIDC / Azure federated credentials.

See 11-security.md

JIT Token (Just-In-Time Token)

A short-lived, scoped credential issued by the Credential Server to a requesting Agent. A JIT token is valid only for the current Job's lifetime and only for the service and scopes it was issued for. It is not the real service credential — it is a proxy reference that the Egress Proxy swaps for the real credential at the DMZ boundary.

Token Swap

The action performed by the Egress Proxy when forwarding an Agent's outbound request: the proxy recognises the JIT token in the Authorization header, replaces it with the real service credential stored on the host, and forwards the request. The real credential never leaves the host process.

Human DMZ Gate

An approval gate implemented at the Egress Proxy level (as distinct from a pipeline-level Gate in Transition Rules). A Human DMZ Gate holds a live, in-flight HTTP request from the Agent and waits for human approval before forwarding it. Used for sensitive operations like production deploys.

Sandbox

The isolated execution environment (typically a devcontainer) in which the Operator and Agent run. The sandbox has no host filesystem access, no Docker daemon access, and no raw service credentials. All privilege is mediated by the Runner via the Credential Server and Egress Proxy.