Advanced GitLab CI/CD
Agenda
1. CI/CD Basics: a quick review
2. Pipeline structure and ways to hack it (Pipeline Architectures)
3. Variables: a foundation before using rules
4. Controlling when your jobs run (the "rules" keyword)
5. Holding, securing, and sharing results (Job Artifacts)
6. Assembling pipelines from components (using "includes" and "extends")
This workshop assumes that participants are familiar with basic GitLab CI/CD. It presents advanced concepts and techniques for people who are active users of GitLab CI.
We do not cover Runner configuration or any of the CD-specific features such as Environments and Deployments.
GitLab CI/CD Basics Review
GitLab Recommended Process
(Diagram: the DevOps lifecycle stages Manage, Plan, Create, Verify, Package, Secure, Release, Configure, Monitor, and Protect, with the recommended flow: Epics, Milestones, and Issues; assign issue; push code; create merge request; automated build / test and scan; Review App; collaboration and review; push fixes; approval; merge accepted; release; deploy.)
GitLab CI/CD
Anatomy of a GitLab CI/CD build
Pipeline
○ Set of one or more jobs. Optionally organized into stages
Stages
○ Collection of jobs to be run in parallel
○ e.g. Build, Test, Deploy
Jobs
○ Scripts that perform tasks
○ e.g. npm test; mvn install; etc.
Environments
○ Where we deploy (Test, Review, Staging, Canary, Prod)
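To tie these pieces together, here is a minimal sketch of a .gitlab-ci.yml with stages, jobs, and an environment. The job names, commands, and the staging environment are illustrative assumptions, not taken from the slides:

# Minimal sketch: stages, jobs, and an environment (names are illustrative)
stages:
  - build
  - test
  - deploy

build-app:
  stage: build
  script:
    - npm ci
    - npm run build

test-app:
  stage: test
  script:
    - npm test

deploy-staging:
  stage: deploy
  script:
    - ./deploy.sh staging        # hypothetical deploy script
  environment:
    name: staging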
Pipeline Architectures
Pipeline Architectures
● The building blocks of your pipeline.
○ Basic
○ Directed Acyclic Graph
○ Parent/Child Pipelines
○ Dynamic Child Pipelines
○ Multi-Project Pipelines
Basic pipelines
● Simplest pipeline in GitLab
● Jobs run independently, sometimes on different runners
● All jobs in a stage must complete successfully before proceeding to the next stage
● Control the pipeline with pipeline options
Pipeline Architectures - Basic pipeline with options
(Diagram: jobs grouped into the stages "Build", "Test", and "Deploy")
● A job with allow_failure: true lets the pipeline proceed even though the job failed
● A job with when: manual waits for someone (with the right permissions) to click the "Play" button
● Other options: when: delayed, when: on_failure, when: always
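A minimal sketch of how those options can look in .gitlab-ci.yml; the job names and scripts are illustrative assumptions:

unit-tests:
  stage: test
  script: ./run-unit-tests.sh      # hypothetical test script
  allow_failure: true              # pipeline proceeds even if this job fails

deploy-prod:
  stage: deploy
  script: ./deploy.sh production   # hypothetical deploy script
  when: manual                     # waits for someone to press "Play"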
Needs, or Directed Acyclic Graph (DAG)
● Define job dependencies to optimize pipeline flow
● Jobs still run independently, sometimes on different runners
● Dependent jobs can proceed to the next stage without waiting for other jobs in their stage to finish
Pipeline Architectures - Directed Acyclic Graph (DAG)
The Android test doesn't have to wait for the iOS build to complete.
Jobs still run in stages, but the "needs" keyword overrides the need for the previous stage to complete entirely.
Directed Acyclic Graph Pipelines
Example: needs

linux:build:
  stage: build
  script: echo "Building linux..."

mac:build:
  stage: build
  script: echo "Building mac..."

lint:
  stage: test
  needs: []
  script: echo "Linting..."

linux:rspec:
  stage: test
  needs: ["linux:build"]
  script: echo "Running rspec on linux..."

mac:rspec:
  stage: test
  needs: ["mac:build"]
  script: echo "Running rspec on mac..."

production:
  stage: deploy
  script: echo "Running production..."

Linux path: the linux:rspec job will run as soon as the linux:build job finishes, without waiting for mac:build to finish.
macOS path: the mac:rspec job will run as soon as the mac:build job finishes, without waiting for linux:build to finish.
The lint job will run immediately, without waiting for the build jobs.
The production job runs as soon as all previous jobs finish.
Source: https://docs.gitlab.com/ee/ci/yaml/#needs
Additional example: needs with artifacts

test-job1:
  stage: test
  needs:
    - job: build_job1
      artifacts: true

test-job2:
  stage: test
  needs:
    - job: build_job2
      artifacts: false

test-job3:
  needs:
    - job: build_job1
      artifacts: true
    - job: build_job2
    - build_job3

The test-job1 job downloads the build_job1 artifacts.
The test-job2 job does not download the build_job2 artifacts.
The test-job3 job downloads the artifacts from all three build jobs, because artifacts is true, or defaults to true, for all three needed jobs.
Source: https://docs.gitlab.com/ee/ci/yaml/#needsartifacts
Parent/Child Pipelines
● Run child pipelines independently of each other.
● Separates the pipeline configuration into multiple files to keep things simple.
● Combine with DAG pipelines to achieve the benefits of both.
● Useful to branch out long-running tasks into separate pipelines.
Pipeline Architectures - Parent/Child Pipelines
(Screenshot: the pipeline you're viewing, plus the downstream pipelines that are triggered from within it.)
Pipeline Architectures - Parent/Child Pipelines for Monorepos
● Run the trigger job only if there are changes in the listed files
● Include files from elsewhere in the project; the trigger must reference a YAML file
● strategy: depend means this pipeline is held until the other (child) pipeline finishes
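A sketch of what such a monorepo trigger job could look like; the paths, file names, and job name are assumptions, not taken from the slide's screenshot:

# Parent .gitlab-ci.yml (sketch; paths and names are illustrative)
trigger-frontend:
  trigger:
    include: frontend/.gitlab-ci.yml   # child pipeline config stored inside the project
    strategy: depend                   # parent waits for the child pipeline result
  rules:
    - changes:
        - frontend/**/*                # only trigger when these files change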
Dynamic Pipelines
● Generate pipeline configuration at build time
● Use the generated configuration at a later stage to run as a child pipeline
● Useful to use a single pipeline configuration with different settings to support a matrix of targets and architectures
Pipeline Architectures - Dynamic Pipelines
(Screenshot: .gitlab-ci.yml and the generated test.gitlab-ci.yml)
● Place a generated YAML file in the job artifact store
● Reference it later to actually run the pipeline
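A minimal sketch of that pattern, reusing the test.gitlab-ci.yml file name from the slide; the generator script and job names are assumptions:

generate-config:
  stage: build
  script:
    - ./generate-ci.sh > test.gitlab-ci.yml   # hypothetical generator script
  artifacts:
    paths:
      - test.gitlab-ci.yml                    # store the generated YAML as an artifact

run-generated:
  stage: test
  trigger:
    include:
      - artifact: test.gitlab-ci.yml          # reference the generated file
        job: generate-config                  # from this job's artifacts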
Pipeline Architectures - Dynamic Pipelines
The replacement happened in the parent pipeline.
Multi-project Pipelines
● A pipeline in one project can trigger a pipeline in another project
● You can specify a specific branch
● You can pass variables to downstream pipelines
● If the downstream pipeline fails, it will not fail the upstream pipeline
● Useful when building / deploying large applications that are made up of different components that have their own project and build pipeline
Documentation link:
https://docs.gitlab.com/ee/ci/multi_project_pipelines.html
Pipeline Architectures - Multi-Project Pipelines

Upstream project:

test:
  stage: test
  script:
    - echo "I am building!"

bridge:
  stage: test
  variables:
    ENVIRONMENT: staging          # pass variables to downstream projects
  trigger:
    project: my/project           # specify project and branch
    branch: master
    strategy: depend
    forward:
      pipeline_variables: true    # define what variables to forward

Downstream project:

staging:
  stage: deploy
  script:
    - echo "Deploying to $ENVIRONMENT!"

The job with the trigger is referred to as the "bridge" job. The downstream pipeline can be anything.
Variables
Variables in UI at project level
Variables can also be set at the
Group and Instance level
Variables entered for the pipeline run
Hot tip!
Prepopulate the keys and values using URL parameters:
.../pipelines/new?ref=<branch>&var[<variable_key>]=<value>
Variables defined in the CI configuration
variables:
  GLOBAL_VAR: test

build:
  variables:
    ENVIRONMENT: staging
  script:
    - echo "Trying $GLOBAL_VAR on $ENVIRONMENT"
Secrets
deploy:
  stage: deploy
  secrets:
    DATABASE_PASSWORD:
      vault:  # Translates to secret: `ops/data/production/db`, field: `password`
        engine:
          name: kv-v2
          path: ops
        path: production/db
        field: password
      file: false
      # Shortened version:
      # vault: production/db/password@ops
Inherited variables
build:
  stage: build
  script:
    - echo "BUILD_VARIABLE=value_from_build_job" >> build.env
  artifacts:
    reports:
      dotenv: build.env

deploy:
  stage: deploy
  variables:
    BUILD_VARIABLE: value_from_deploy_job
  script:
    - echo "$BUILD_VARIABLE" # Output is: 'value_from_build_job' due to precedence
How variables are processed
Order of precedence (highest first):
1. Values for this run, set manually (UI) or in an API request
2. Values configured for the Project, Group, or Instance
3. Values inherited from jobs in previous stages
4. Values in YAML using the variables: block
5. Values from GitLab "Predefined Variables"
(Diagram: the main GitLab server runs the CI engine, which hands CI jobs to the GitLab Runner.)
Rules
When are pipelines run?
New commit, new branch, new tag, new merge request, manual, API call, scheduled
Rules: The basics
(Screenshot: a job with its job name and a new "rules" block)
● "if" statements can reference variables, including the predefined ones, as in this case
● As it claims, this job will only run when the pipeline is kicked off from the web form
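A sketch of the kind of job the screenshot describes; the job name and script are assumptions, while CI_PIPELINE_SOURCE is a real predefined variable:

run-from-web-only:                           # hypothetical job name
  script: echo "Started from the web form"
  rules:
    - if: '$CI_PIPELINE_SOURCE == "web"'     # "web" means the pipeline was started from the UI form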
Rules Quick Reference
Clauses: if, changes, exists (or none)
Operators: ==, !=, =~, !~, &&, || (or just the variable on its own)
Results: when, allow_failure, start_in (or none)
"when" options: always, never, on_success, on_failure, manual, delayed
Rules Quick Reference
Default values (as standalone attributes):
● when: on_success
● allow_failure: false
The job is added when:
● A rule matches and has either when: on_success, when: delayed, or when: always
● No rule matches, but the last clause is (as a standalone attribute) when: on_success, when: delayed, or when: always
The job is not added when:
● No rule matches, and there is no standalone when: on_success, when: delayed, or when: always
● A rule matches with when: never
Rules Example 01
job:
  script: "echo Hello, Rules!"
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
    - if: '$CI_PIPELINE_SOURCE == "push"'
● Control when merge request pipelines run.
○ if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
● Control when both branch pipelines and tag pipelines run.
○ if: '$CI_PIPELINE_SOURCE == "push"'
Rules Example 02
● This job will not run if this was triggered from a merge request
○ if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
● This job will not run if pipeline was scheduled
○ if: '$CI_PIPELINE_SOURCE == "schedule"'
● Otherwise, this job will run if the previous stage was successful
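A minimal sketch of rules matching this description; the job name and script are assumptions:

build-job:                           # hypothetical job name
  script: echo "Building..."
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
      when: never                    # skip when triggered from a merge request
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
      when: never                    # skip when the pipeline was scheduled
    - when: on_success               # otherwise run if the previous stage succeeded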
Rules Example 03
● This job will not run if CUSTOM_VARIABLE is equal to false
○ if: '$CUSTOM_VARIABLE == "false"'
● Otherwise, this job will run even if the previous stage had a failure
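A minimal sketch of rules matching this description; the job name and script are assumptions:

test-job:                            # hypothetical job name
  script: echo "Testing..."
  rules:
    - if: '$CUSTOM_VARIABLE == "false"'
      when: never                    # do not run when CUSTOM_VARIABLE is "false"
    - when: always                   # otherwise run even if the previous stage failed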
Rules Example 04
● This job will be in the pipeline and will be a manual job
○ if: '$VAR == "string value"' and there was a change to Dockerfile or any file in
docker/scripts
● Otherwise, this job will not run
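A minimal sketch of rules matching this description; the job name and script are assumptions:

docker-build:                        # hypothetical job name
  script: echo "Building image..."
  rules:
    - if: '$VAR == "string value"'
      changes:                       # AND there was a change to these files
        - Dockerfile
        - docker/scripts/*
      when: manual                   # added to the pipeline as a manual job
    # no other rule matches, so otherwise the job is not added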
Rules Example 05
"main" '
● This job will run 3 hours after triggered and will be allowed to fail (will not prevent
further stages from firing)
○ if: '$CI_COMMIT_BRANCH == "main"'
● Otherwise, this job will not run
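A minimal sketch of rules matching this description; the job name and script are assumptions:

delayed-job:                         # hypothetical job name
  script: echo "Running later..."
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
      when: delayed                  # run 3 hours after the pipeline is triggered
      start_in: '3 hours'
      allow_failure: true            # will not block later stages if it fails
    # otherwise the job is not added to the pipeline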
Workflow Rules
workflow:
  rules:
    - if: $CI_COMMIT_MESSAGE =~ /-wip$/
      when: never
    - if: $CI_COMMIT_TAG
      when: never
    - when: always
● This pipeline will not run if the commit message ends with "-wip"
  ○ if: '$CI_COMMIT_MESSAGE =~ /-wip$/'
● This pipeline will not run if this was triggered by a tag being applied
  ○ if: '$CI_COMMIT_TAG'
● Otherwise, this pipeline will run
Workflow - Variables depending on rules
workflow:
  rules:
    - if: $CI_COMMIT_REF_NAME =~ /-wip$/
      variables:
        DOCKER_FILE: test.dockerfile
    - if: $CI_COMMIT_REF_NAME == $CI_DEFAULT_BRANCH
      variables:
        DOCKER_FILE: main.dockerfile
    - when: always
● The variable DOCKER_FILE will be set differently based on the rules.
● This pipeline will always run
Artifacts
CI configuration
● Allows for saving of build artifacts and/or the output of any job
● Are available for use by subsequent jobs
● Can pull in any combination of paths and/or files
● Use exclude to limit what is added
● Use dependencies to limit what gets downloaded by subsequent jobs
● Use when to determine if artifacts will be stored or not
● Use expire_in to determine when artifacts will be destroyed
artifacts:
  when: on_success
  paths:
    - bin/target
    - "*.exe"
  exclude:
    - throw-away.exe
  expire_in: 3 mos
https://docs.gitlab.com/ce/ci/yaml/README.html#artifacts
Artifact download by URL
Three ways of accessing artifacts via secure HTTPS:
Build product for a specific CI pipeline run:
https://gitlab.com/fpotter/examples/c-with-artifact/-/jobs/207917559/artifacts/file/helloworld
Build product for the latest CI pipeline run on a specific branch (main):
https://gitlab.com/fpotter/examples/c-with-artifact/-/jobs/artifacts/main/raw/helloworld?job=build
All build products (as a Zip) for the latest CI pipeline run on a specific branch:
https://gitlab.com/fpotter/examples/c-with-artifact/-/jobs/artifacts/main/download?job=build
Artifact download in GitLab UI
(Screenshots: on the Pipelines page, on the Jobs page, on a specific Job, and in the artifact browser for a Job)
Artifact administration
● In a self-managed GitLab instance, job artifacts may be stored in local storage or object storage
● Artifact expiration times can be configured at the instance level
● Artifact downloads fall under GitLab's access control
Container and language-specific package registries
● Docker registry
● Dependency Proxy
● Language-specific package registries
dependencies
● Reduce the number of artifacts passed from previous jobs
● Improve job performance by preventing large artifacts from being downloaded for each job
● An empty array will skip downloading artifacts
● Only from jobs in previous stages

build:osx:
  stage: build
  script: make build:osx
  artifacts:
    paths:
      - binaries/

build:linux:
  stage: build
  script: make build:linux
  artifacts:
    paths:
      - binaries/

test:osx:
  stage: test
  script: make test:osx
  dependencies:
    - build:osx

test:linux:
  stage: test
  script: make test:linux
  dependencies:
    - build:linux

deploy:
  stage: deploy
  script: make deploy
Source: https://docs.gitlab.com/ee/ci/yaml/#dependencies
Include and Extends
Include
● Allows the inclusion of external YAML files
● Helps to break down the CI/CD configuration into multiple files and increases readability for long
configuration files
● Possible to have template files stored in a central repository, which projects then include in their configuration files
● Helps avoid duplicated configuration, for example, global default variables for all projects
● Can be nested (up to 100 includes)
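A short sketch of include in practice; the file names and the central template project path are assumptions:

include:
  - local: 'ci/build.gitlab-ci.yml'            # another file in this repository
  - project: 'my-group/ci-templates'           # hypothetical central template repository
    ref: main
    file: 'templates/defaults.gitlab-ci.yml'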
Extends
● Defines entry names that a job using extends is going to inherit from
● An alternative to YAML anchors, and is more flexible and readable
● Performs a merge
● Supports multilevel inheritance (but more than three levels is not recommended)
● Can merge from hidden jobs (e.g. .tests) or from regular jobs
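A sketch of what the extends merge looks like, following the pattern of the example in the GitLab docs; the job names and values here are illustrative, not from the slide:

# Hidden job used as a template
.tests:
  script: rake test
  stage: test
  only:
    refs:
      - branches

rspec:
  extends: .tests
  script: rake rspec
  only:
    variables:
      - $RSPEC

# After the merge, rspec effectively becomes:
# rspec:
#   script: rake rspec          # the job's own key wins
#   stage: test                 # inherited from .tests
#   only:
#     refs:
#       - branches              # merged from .tests
#     variables:
#       - $RSPEC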
Using Includes and Extends together
● Extends works across configuration files with include
If you have a local included.yml with a .template job, you can extend it from your .gitlab-ci.yml.
This will run a job called useTemplate that runs echo Hello! as defined in the .template job, and uses the alpine Docker image as defined in the local job.
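A sketch of those two files, reconstructed from the description above rather than copied from the slide's screenshot:

# included.yml
.template:
  script:
    - echo Hello!

# .gitlab-ci.yml
include: included.yml

useTemplate:
  image: alpine
  extends: .template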
Q&A
Contact GitLab