Each Job has a single build plan. When a build of the job is created, the plan determines what happens.
A build plan is a sequence of steps to execute. These steps may fetch or update Resources, or execute Tasks.
A new build of the job is scheduled whenever get steps with trigger: true have new versions available.
To visualize the job in the pipeline, resources that appear in get steps are drawn as inputs, and resources that appear in put steps are drawn as outputs.
A simple unit test job may look something like:
name: banana-unit
plan:
- get: banana
  trigger: true
- task: unit
  file: banana/task.yml
This job says: get the banana resource, and run a task step called unit, using the configuration from the task.yml file fetched by the banana step.
When new versions of banana are detected, a new build of banana-unit will be scheduled, because we've set trigger: true.
Jobs can depend on resources that are produced by or pass through upstream jobs, by configuring passed: [job-a, job-b] on the get step.
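For instance, a get step constrained to versions that have made it through two upstream jobs might be sketched like this (some-resource, job-a, and job-b are placeholder names, not part of the examples in this section):

- get: some-resource        # placeholder resource name
  trigger: true
  passed: [job-a, job-b]    # only versions that passed builds of both jobs

Only versions of some-resource that have appeared in successful builds of both job-a and job-b will be considered.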
Putting these pieces together, if we were to propagate banana from the above example into an integration suite with another apple component (pretending we also defined its apple-unit job), the configuration for the integration job may look something like:
name: fruit-basket-integration
plan:
- aggregate:
  - get: banana
    trigger: true
    passed: [banana-unit]
  - get: apple
    trigger: true
    passed: [apple-unit]
  - get: integration-suite
    trigger: true
- task: integration
  file: integration-suite/task.yml
Note the use of the aggregate step to collect multiple inputs at once.
With this example we've configured a tiny pipeline that will automatically run unit tests for two components, and continuously run integration tests against whichever versions pass both unit tests.
This can be further chained into later "stages" of your pipeline; for example, you may want to continuously deliver an artifact built from whichever components pass fruit-basket-integration.
To push artifacts, you would use a put step that targets the destination resource. For example:
name: deliver-food
plan:
- aggregate:
  - get: banana
    trigger: true
    passed: [fruit-basket-integration]
  - get: apple
    trigger: true
    passed: [fruit-basket-integration]
  - get: baggy
    trigger: true
- task: shrink-wrap
  file: baggy/shrink-wrap.yml
- put: bagged-food
  params:
    bag: shrink-wrap/bagged.tgz
This presumes that there's a bagged-food resource defined, which understands that the bag parameter points to a file to ship up to the resource's location.
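As a sketch of what that definition could look like (the resource type and its source fields here are entirely hypothetical; the actual type and configuration depend on where the artifact is shipped):

resources:
- name: bagged-food
  type: food-bag                        # hypothetical custom resource type
  source:
    destination: s3://food-deliveries   # hypothetical upload location

The resource type's put implementation would read the bag parameter from the build plan and upload the named file to the configured destination.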
Note that both banana and apple list the same job as an upstream dependency. This guarantees that deliver-food will only trigger when versions of both of these dependencies pass through the same build of the integration job (and, transitively, their individual unit jobs).
This prevents bad apples or bruised bananas from being delivered. (I'm sorry.)