Jenkins Pipeline - Intermediate
COURSE OBJECTIVES
● After completing this training module, you should understand the following:
■ How to create and modify Pipeline code without using Blue Ocean
■ How to create and configure a Shared Library
■ How to call a Shared Library custom step from a Pipeline
■ How to create and use a Resource File
■ More about creating robust, maintainable Pipelines
COURSE MODULES
AUDIENCE
APPROACH
● This course teaches you how to create and run a Jenkins Declarative Pipeline
using shared libraries.
■ Students modify a real-life Pipeline that builds, tests and deploys an application.
● The course uses one project:
■ Lab project: Students are given a list of tasks and are expected to figure out
how to implement a Pipeline that implements those tasks.
CLASSROOM
Introduction
Lab exercises are a key component of your CloudBees Jenkins training. Each student has a
self-contained lab environment that includes all of the plugins and dependencies that are required for this
course.
This workbook contains a sequence of lab exercises through which you will learn to work with
Pipelines.
If you encounter problems with your software installation, or if you do not understand any of the
instructions, please ask your instructor for help.
Lab exercises
The solution to each task is located at the end of that task. Please try to solve the assignment by yourself
and look at the solutions only if you get stuck or want to validate your work.
IMPORTANT
This training is language-agnostic, so you are not expected to know any particular language. While
we recommend that you familiarize yourself with ancillary technologies such as Apache
Maven, Gradle, Ant, NPM, Apache Groovy, Docker and Git/GitHub, you should be able to
complete the lab exercises by copying commands and text that are given in the class. You
will not need any additional tools.
● Jenkins Master — The Jenkins Master dashboard; this is the environment you will use for most
of the work in this course.
● Gitserver — Git repository page for the projects in this course. If you are used to GitHub, this
page will look familiar although it is actually running Gitea, which gives you a local advanced Git
server with a web interface from which to browse repositories, authenticate, do pull requests and
reviews.
● DevBox — A bash shell environment that provides a command line interface.
For the labs associated with this class, we will not be using DevBox.
Credentials To Use
Use the id of butler and the password of butler for all credentials in your lab environment.
Blue Ocean
The Blue Ocean plugin is installed in your lab environment. To open it:
○ Click "Open Blue Ocean" in the side bar.
● Switch to Classic Web UI
○ Click on the arrow button to switch to "Jenkins Web UI".
○ Click "Open Blue Ocean" in the side bar to switch back.
Introduction
This document explains how to install and start your CloudBees Lab Environment.
Please follow all the steps carefully, before running any Lab exercise.
A Virtual Machine (VM) will be used for hosting your Lab Environment:
This VM runs using the VirtualBox hypervisor, and is managed and automated by Vagrant (see
requirements below).
Both of these tools are Open Source, free and multi-platform, but they have the following requirements:
Common Requirements
● The following ports must be allowed access to your instance’s domain (which is localhost):
○ 5000
○ 5001
○ 5002
○ 5003
○ 20000
○ 30001
○ 30002
● The following protocols must be allowed by any antivirus/firewall software:
○ HTTP
○ HTTPS
○ Websockets
■ Some antivirus software, such as Kaspersky and McAfee, might block websockets
silently
■ You can test websockets from this page: WebSockets Test
■ For more about security and websockets: Testing Websockets
● Even if the training lab runs in offline mode, open Internet access is recommended
○ HTTP Proxy can only be configured for Jenkins operations
Hardware Requirements
Software Requirements
● For all OSes, download and install the latest (64-bit) versions of:
○ VirtualBox (An Open Source Hypervisor from Oracle):
■ Downloads page: https://www.virtualbox.org/wiki/Downloads
■ Make sure to download the appropriate binary for your OS
We encourage you to download the latest available version of VirtualBox. However, the last version we tested with this courseware was 6.0.12, so if you run into trouble with the latest version, please try using this one.
■ Windows users: If you have HyperV installed, VirtualBox may throw errors with the code VERR_VMX_NO_VMX. Disable the HyperV service and reboot.
○ Vagrant (An Open Source VM manager):
■ Downloads page: https://www.vagrantup.com/downloads.html
■ Make sure to download the appropriate binary for your OS
We encourage you to download the latest available version of Vagrant. However, the last version we tested with this courseware was 2.2.5, so if you run into trouble with the latest version, please try using this one.
● For Windows only, download the latest version of Git for Windows
TIP
Git for Windows provides a bash-compliant shell and an OpenSSH client
● Right click this link to the virtual machine’s ZIP archive to open it in a new tab or window
○ The archive will download to your local disc
● Extract the virtual machine ZIP archive to your local disc
○ This archive contains your virtual machine image and automated settings in a folder
named training-pipeline-intermediate
TIP
The command line is required to start the Virtual Machine without having to deal with any specific configuration.
● Using the command line cd, navigate to the unarchived folder, which should be located on your Desktop:
● cd ~/Desktop/training-pipeline-intermediate/
TIP
● The ~ special character means "full path to the user home folder"
● Desktop may vary depending on your operating system: it can be lower case, or localized in your operating system’s language.
● Use the command line ls to check the content of this directory. There should be a file named Vagrantfile here:
ls -1
Vagrantfile
● Now you are able to start the Virtual Machine, using the vagrant command:
● vagrant up
● You need to be able to stop and start the Virtual Machine whenever you want. Let’s do it now:
○ From the training-pipeline-intermediate folder that contains a Vagrantfile:
○ Stop the VM "gracefully" with the vagrant "halt" command:
○ vagrant halt
TIP
Once the VM is in the stopped state, you can safely do anything else, like stopping your computer.
○ Start the Virtual Machine again:
○ vagrant up
TIP
Any Vagrant command can be used here. For more information, please check the Vagrant Documentation - https://www.vagrantup.com/docs/cli/
● This page is available on your web browser at this URL: Lab Home Page
● Username: butler
● Password: butler
● You will see an HTML page that lists the services hosted on your Lab Environment.
● Each service will be detailed in the next steps.
The first time you access your Jenkins instance, you should see it populated with an existing
multibranch-pipeline project: pipeline-lab:
When you click the project (pipeline-lab) you should see a list of the existing branches with their
corresponding pipelines; in the lab environment provided, there is only a master branch.
If instead you get a "This folder is empty" message, you must re-scan the project.
Figure 3. Re-scan Multibranch Pipeline, if folder is empty
After re-scanning, the different branches that have pipelines should appear.
Troubleshooting
General workflow
If you face any issue with the lab during this course, please read this troubleshooting guide first.
If you still cannot use the lab environment, depending on your training type:
apt-get update
○ ...
○ Then remove the plugin vagrant-vbguest, by using the command
○ vagrant plugin uninstall vagrant-vbguest
○ If the error is related to VT-x is not available
...
○ Stderr: VBoxManage.exe: error: VT-x is not available (VERR_VMX_NO_VMX)
○ Make sure you disable the HyperV service as stated in the 'Software Requirements' of
this document
● Is your VM started?
○ Open the VirtualBox GUI and check the state.
○ With your command line, use vagrant status within your labs directory.
○ On your process manager, verify that you have a VBoxHeadless process.
● Is your VM reachable with SSH?
○ Is Vagrant aware of port forwarding (using vagrant port)?
○ In the VirtualBox GUI, do you see the port forwarding?
○ Do you have any firewall rule that would block traffic on your localhost (lo, loopback,
etc.) interface, on the forwarded port (generally 2222)?
● When stuck, always try rebooting the VM once
● If you need to submit an issue (Self Paced training only), try to run your latest vagrant command
in debug mode (see example below ), and copy-paste the result in a text file or in
https://gist.github.com/
VAGRANT_LOG=debug vagrant up
RECENT FEATURES
● equals
● changeRequest
● buildingTag
● tag
● beforeAgent
EQUALS
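● Returns true when the expected value and the actual value are equal; a minimal example (the same condition is used in the pipeline later in this module):
■ when { equals expected: 2, actual: currentBuild.number }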
CHANGEREQUEST
● Returns true if this Pipeline is building a change request, such as a GitHub or Bitbucket pull
request
■ when { changeRequest() }
● You can make more detailed checks by using a filter against the change request, allowing you to ask "was
this change request created by [email protected]?"
■ when { changeRequest authorEmail: "[email protected]" }
● You can also do pattern matching against the filters using a comparator, to determine whether the
pull request was from anyone with an email address ending in @example.com
■ when { changeRequest authorEmail: "[\\w_-.]+@example.com", comparator: 'REGEXP' }
BUILDINGTAG
● A simple condition that just checks if the Pipeline is running against a tag in SCM, rather
than a branch or a specific commit reference
■ when { buildingTag() }
TAG
● A more detailed equivalent of buildingTag, allowing you to check against the tag name itself
pipeline {
agent any
stages {
stage('Build') {
steps {
sh 'make package'
}
}
stage('Test') {
when { equals expected: 2, actual: currentBuild.number }
steps {
sh 'make check'
}
}
stage('Deploy') {
when { tag "release-*" }
steps {
echo 'Deploying only because this commit is tagged...'
sh 'make deploy'
}
}
}
}
BEFOREAGENT
● Allows you to specify that the when conditions should be evaluated before entering the
agent for the stage
● When beforeAgent true is specified, you will not have access to the agent’s workspace, but
you can avoid unnecessary SCM checkouts and waiting for a valid agent to be available
pipeline {
agent none
stages {
stage('Example Build') {
steps {
echo 'Hello World'
}
}
stage('Example Deploy') {
agent {
label "some-label"
}
when {
beforeAgent true
branch 'production'
}
steps {
echo 'Deploying'
}
}
}
}
● fixed
● regression
FIXED
● Checks to see if the current run is successful and if the previous run was either failed or
unstable
REGRESSION
● Checks to see if the current run’s status is worse than the previous run’s status
● If the previous run was successful and the current run is unstable, this fires and its block of
steps executes
● It also runs if the previous run was unstable and the current run is a failure, etc
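A short sketch of how these two conditions can be used in a post section (the echo messages here are illustrative, not taken from the lab projects):
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh 'make check'
            }
        }
    }
    post {
        fixed {
            echo 'Back to normal: the previous run failed or was unstable'
        }
        regression {
            echo 'This run is worse than the previous run'
        }
    }
}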
NEW OPTIONS
● checkoutToSubdirectory
● newContainerPerStage
CHECKOUTTOSUBDIRECTORY
● Allows you to override the location that the automatic SCM checkout uses
● Using checkoutToSubdirectory("foo"), your Pipeline checks out your repository to
$WORKSPACE/foo, rather than the default of $WORKSPACE
NEWCONTAINERPERSTAGE
● If you are using a top-level docker or dockerfile agent and want to ensure that each of your
stages runs in a fresh container of the same image
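A minimal sketch combining both options (the image name and subdirectory name are illustrative):
pipeline {
    agent {
        docker { image 'maven:3-alpine' }
    }
    options {
        checkoutToSubdirectory('foo')
        newContainerPerStage()
    }
    stages {
        stage('Build') {
            steps {
                // the automatic SCM checkout now lives in $WORKSPACE/foo
                sh 'ls foo'
            }
        }
    }
}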
INPUT DIRECTIVE TO STAGE
pipeline {
agent any
stages {
stage('Example') {
input {
message "Should we continue?"
ok "Yes, we should."
submitter "alice,bob"
parameters {
string(name: 'PERSON', defaultValue: 'Mr Jenkins', description: 'Who should I say hello to?')
}
}
when {
equals expected: "Fred", actual: "${PERSON}"
}
steps {
echo "Hello, ${PERSON}, nice to meet you."
}
}
}
}
USING DOCKER WITH PIPELINE
Pipeline has built-in support for interacting with Docker from within a Jenkinsfile
CUSTOMIZE THE EXECUTION ENVIRONMENT
pipeline {
agent {
docker { image 'node:7-alpine' }
}
stages {
stage('Test') {
steps {
sh 'node --version'
}
}
}
}
● Many build tools download external dependencies and cache them locally for future re-use
● Pipeline supports adding custom arguments that are passed to Docker, allowing users
to specify custom Docker Volumes to mount
■ These can be used for caching data on the agent between Pipeline runs
CACHING DATA FOR CONTAINERS
pipeline {
agent {
docker {
image 'maven:3-alpine'
args '-v $HOME/.m2:/root/.m2'
}
}
stages {
stage('Build') {
steps {
sh 'mvn -B'
}
}
}
}
USING MULTIPLE CONTAINERS
pipeline {
agent none
stages {
stage('Back-end') {
agent {
docker { image 'maven:3-alpine' }
}
steps {
sh 'mvn --version'
}
}
stage('Front-end') {
agent {
docker { image 'node:7-alpine' }
}
steps {
sh 'node --version'
}
}
}
}
USING A DOCKERFILE
Dockerfile
FROM node:7-alpine
RUN apk add -U subversion
Jenkinsfile
pipeline {
agent { dockerfile true }
stages {
stage('Test') {
steps {
sh 'node --version'
sh 'svn --version'
}
}
}
}
SPECIFY A DOCKER LABEL
SCRIPTED PIPELINE
OVERVIEW
DECLARATIVE VS SCRIPTED
● Declarative limits what is available to the user with a more strict and pre-defined structure,
making it an ideal choice for simpler continuous delivery pipelines
● Scripted provides very few limits
■ The only limits on structure and syntax are defined by Groovy itself, not by
Pipeline-specific systems
■ Useful when you have more complex requirements than Declarative can
support out of the box
■ BUT it has few safeguards against errors you might make
● Use the script step to execute a block of Scripted syntax in a Declarative Pipeline
SO WHAT DOES ALL THIS MEAN?
● https://jenkins.io/doc/book/pipeline/syntax/#scripted-pipeline
USING A JENKINSFILE
STRING INTERPOLATION
● Groovy supports declaring a string with either single quotes or double quotes:
def singlyQuoted = 'Hello'
def doublyQuoted = "World"
STRING INTERPOLATION
● String interpolation only works for strings in double-quotes, not for strings in single-quotes.
■ For example, this code:
def username = 'Jenkins'
echo 'Hello Mr. ${username}'
echo "I said, Hello Mr. ${username}"
■ Results in:
Hello Mr. ${username}
I said, Hello Mr. Jenkins
● You can see that the dollar-sign ($) based string interpolation works for the string that is in
double quotes but does not work for the string in single quotes
USING ENVIRONMENT VARIABLES
pipeline {
agent any
stages {
stage('Example') {
steps {
echo "Running ${env.BUILD_ID} on ${env.JENKINS_URL}"
}
}
}
}
SETTING ENVIRONMENT VARIABLES
pipeline {
agent any
environment {
CC = 'clang'
}
stages {
stage('Example') {
environment {
DEBUG_FLAGS = '-g'
}
steps {
sh 'printenv'
}
}
}
}
CREDENTIALS
pipeline {
agent any
stages {
stage("test") {
steps {
withCredentials([usernameColonPassword(variable: 'SERVICE_CREDS', credentialsId:
'my-cred-id')]) {
sh """
echo "Service user is $SERVICE_CREDS_USR"
echo "Service password is $SERVICE_CREDS_PSW"
curl -u $SERVICE_CREDS https://myservice.example.com
"""
}
}
}
}
}
CREDENTIALS
The environment variable specified is set to username:password and two additional environment
variables are defined automatically: MYVARNAME_USR and MYVARNAME_PSW respectively.
pipeline {
agent any
environment {
SERVICE_CREDS = credentials('my-prefined-username-password')
}
stages {
stage("test") {
steps {
sh """
echo "Service user is $SERVICE_CREDS_USR"
echo "Service password is $SERVICE_CREDS_PSW"
curl -u $SERVICE_CREDS https://myservice.example.com
"""
}
}
}
}
SECRET TEXT
The environment variable specified will be set to the Secret Text content
pipeline {
agent any
environment {
SOME_SECRET_TEXT = credentials('jenkins-secret-text-id')
}
stages {
stage("test") {
steps {
sh """
echo "secret text is $SOME_SECRET_TEXT"
"""
}
}
}
}
SECRET FILE
The environment variable specified will be set to the location of the file that is temporarily created
pipeline {
agent any
environment {
SOME_SECRET_FILE = credentials('jenkins-secret-file-id')
}
stages {
stage("test") {
steps {
sh """
echo "secret file location is $SOME_SECRET_FILE"
"""
}
}
}
}
SSH WITH PRIVATE KEY
The environment variable specified will be set to the location of the SSH key file that is temporarily
created and two additional environment variables may be automatically defined: MYVARNAME_USR and
MYVARNAME_PSW (holding the passphrase).
pipeline {
agent any
environment {
SSH_CREDS = credentials('my-prefined-ssh-creds')
}
stages {
stage("test") {
steps {
sh """
echo "SSH private key is located at $SSH_CREDS"
echo "SSH user is $SSH_CREDS_USR"
echo "SSH passphrase is $SSH_CREDS_PSW"
"""
}
}
}
}
WHAT IF MY CREDENTIAL ISN’T ONE OF THESE FOUR?
An unsupported credentials type causes the Pipeline to fail with an error message.
HANDLING PARAMETERS
pipeline {
agent any
parameters {
string(name: 'Greeting', defaultValue: 'Hello', description: 'How should I greet the world?')
}
stages {
stage('Example') {
steps {
echo "${params.Greeting} World!"
}
}
}
}
HANDLING FAILURE
pipeline {
agent any
stages {
stage('Test') {
steps {
sh 'make check'
}
}
}
post {
always {
junit '**/target/*.xml'
}
failure {
mail to: 'team@example.com', subject: 'The Pipeline failed :('
}
}
}
OPTIONAL STEP ARGUMENTS
Pipeline follows the Groovy language convention of allowing parentheses to be omitted around method
arguments
OPTIONAL STEP ARGUMENTS
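For example, the following invocations are equivalent (the commands themselves are just illustrations):
sh 'make check'
sh('make check')
echo 'Hello World'
echo('Hello World')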
MULTIBRANCH PIPELINES
● From the Jenkins Dashboard, click on New Item in the left frame
■ Enter the name of your new Pipeline in the box that is provided
■ Choose "Multibranch Pipeline" from the list provided and click "OK"
● Choose your SCM from the list under "Branch Sources"
■ Fill in the fields that are displayed to configure your SCM
■ (Optional) Configure a webhook from SCM
● Push a Jenkinsfile on any branch
■ Merge branch: jobs automatically managed
● Everything is automated, which greatly reduces the administrative tasks
CREATE A NEW JOB OF TYPE "MULTIBRANCH PIPELINE"
ORGANIZATION SCANNING
● Currently only works with GitHub Organization folders and Bitbucket Team/Project folders
■ Corresponding branch source plugins must be installed
■ Other SCMs may be supported in the future
● Admin selects the job type associated with the SCM type
■ One credential (API token generally) needed
■ Maps to an "organization folder" or "team/project" as top level
● Each repository maps to a Multibranch pipeline
■ Inside the "organization folder" or "team/project"
■ More automation
■ Automate webhooks creation
BUT WHAT IF I’M STILL USING SUBVERSION?
OVERVIEW
● A separate SCM repo that contains reusable custom steps that can be called from Pipelines
● Configured once per Jenkins instance
● Cloned at build time
● Loaded and used as code libraries for Jenkins Pipelines
● Modifications made to a shared library custom step are applied to all Pipelines that call that
custom step
NOTES ABOUT SHARED LIBRARIES
● Extremely powerful
● Learning curve
■ First step is not easy
■ Requires deeper understanding of Pipeline
● Adds some overhead
■ Testing
■ Maintenance
● Many uses
■ Take time to read the documentation
FOR FURTHER READING
● vars directory contains scripts that define custom steps accessible from a Pipeline.
● All custom steps are defined in the root of the vars directory
■ You cannot use subfolders within the vars directory
● Each file should define one step
■ The name should be the name of that step, camelCased, with the .groovy suffix.
● The matching .txt file, if present, can contain documentation
■ This will be processed through the system’s configured markup formatter
RESOURCES DIRECTORY
OTHER DIRECTORIES
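As a quick reference, a typical Shared Library repository layout looks roughly like this (vars, resources and src are the standard directory names; the file names are illustrative):
(root)
+- vars/
|   +- helloWorld.groovy          # defines the helloWorld custom step
|   +- helloWorld.txt             # optional documentation for that step
+- resources/
|   +- scripts/build.sh           # non-Groovy files, loaded with the libraryResource step
+- src/
    +- org/example/Utilities.groovy   # optional Groovy classes, added to the classpath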
LAB EXERCISE
Task: Create a simple Jenkinsfile to verify that the library is set up correctly
● Click on New Item
● Enter the item name test-shared-library and select Pipeline
● Click OK
● Scroll down to the Pipeline text area and paste the following in:
@Library('shared-library') _
pipeline {
agent { label 'java' }
stages {
stage('verify') {
steps {
helloWorld(name: 'fred')
}
}
}
}
● Click Save
● Click Build Now
● Click on the Blue Ball to the left hand side of the #1
● Scroll down and verify that you see
● Create a file that has the desired name of our custom step
● Add code to a call() method inside that file
■ Code the custom step exactly as you would code it in a Pipeline
■ If the custom step is for code you created in a Pipeline, you can
basically copy-and-paste that code
● Check the file into the SCM repository
■ For testing, check the new custom step into a branch other than master
CREATE A HELLOWORLDSIMPLE CUSTOM STEP
Jenkinsfile
pipeline {
agent any
stages {
stage('hello') {
steps {
sh "echo Hello world, Fred. It is Friday."
}
}
}
}
vars/helloWorldSimple.groovy
def call(String name, String dayOfWeek) {
sh "echo Hello World ${name}. It is ${dayOfWeek}."
}
LAB EXERCISE
For this task, we will use the Gitea editor to create the custom step.
vars/postBuildSuccess.groovy
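The contents of this file are not reproduced here; a minimal sketch, inferred from how the step is used later in this course (where postBuildSuccess(stashName: "Java 7") replaces an archiveArtifacts step and a stash step), might be:
def call(Map config = [:]) {
    // archive the built jars, then stash the build output under the supplied name
    archiveArtifacts 'target/*.jar'
    stash(name: config.stashName, includes: 'target/**')
}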
Jenkinsfile
@Library('shared-starter') _
pipeline {
agent any
stages {
stage('hello') {
steps {
helloWorldSimple("Fred","Friday")
}
}
}
}
vars/helloWorldSimple.groovy
def call(String name, String dayOfWeek) {
sh "echo Hello World ${name}. It is ${dayOfWeek}."
}
HELLOWORLD EXAMPLE
Jenkinsfile
@Library('shared-starter') _
pipeline {
agent any
stages {
stage('hello') {
steps {
helloWorld(name: "Fred", dayOfWeek: "Friday")
}
}
}
}
vars/helloWorld.groovy
def call(Map config = [:]) {
sh "echo Hello World ${config.name}. It is ${config.dayOfWeek}."
}
● In the "Pipelines - Fundamentals" course, we created code to send email and Slack
notifications when a build starts, when it completes and when it fails
● Let’s look at that code and then turn it into a custom step that any Pipeline can call
NOTIFICATIONS WHEN BUILD STARTS
stages {
stage ('Start') {
steps {
// send build started notifications
slackSend (
color: '#FFFF00',
message: "STARTED: Job '${env.JOB_NAME} [${env.BUILD_NUMBER}]' (${env.BUILD_URL})"
)
}
}
}
NOTIFICATIONS WHEN BUILD SUCCEEDS
post {
success {
slackSend (
color: '#00FF00',
message: "SUCCESSFUL: Job '${env.JOB_NAME} [${env.BUILD_NUMBER}]' (${env.BUILD_URL})"
)
emailext (
subject: "SUCCESSFUL: Job '${env.JOB_NAME} [${env.BUILD_NUMBER}]'",
body: """<p>SUCCESSFUL: Job '${env.JOB_NAME} [${env.BUILD_NUMBER}]':</p>
<p>Check console output at "<a href='${env.BUILD_URL}'>${env.JOB_NAME}
[${env.BUILD_NUMBER}]</a>"</p>""",
recipientProviders: [[$class: 'DevelopersRecipientProvider']]
)
}
}
post {
failure {
slackSend (
color: '#FF0000',
message: "FAILED: Job '${env.JOB_NAME} [${env.BUILD_NUMBER}]' (${env.BUILD_URL})"
)
emailext (
subject: "FAILED: Job '${env.JOB_NAME} [${env.BUILD_NUMBER}]'",
body: """<p>FAILED: Job '${env.JOB_NAME} [${env.BUILD_NUMBER}]':</p>
<p>Check console output at "<a href='${env.BUILD_URL}'>${env.JOB_NAME}
[${env.BUILD_NUMBER}]</a>"</p>""",
recipientProviders: [[$class: 'DevelopersRecipientProvider']]
)
}
}
HERE WE GO
vars/sendNotifications.groovy
def call(Map config = [:]) {
slackSend (
color: "${config.slackSendColor}",
message: "${config.message}: Job '${env.JOB_NAME} [${env.BUILD_NUMBER}]'
(${env.BUILD_URL})"
)
}
Jenkinsfile
stages {
stage ('Start') {
steps {
sendNotifications(
slackSendColor: "#FFFF00",
message: "STARTED"
)
}
}
}
Jenkinsfile
post {
success {
sendNotifications(
slackSendColor: "#00FF00",
message: "SUCCESSFUL"
)
}
}
NOTIFICATIONS WHEN BUILD FAILS
Jenkinsfile
post {
failure {
sendNotifications(
slackSendColor: "#FF0000",
message: "FAILED"
)
}
}
Jenkinsfile
stages {
stage ('Start') {
steps {
sendNotificationsStart()
}
}
}
vars/sendNotificationsStart.groovy
def call() {
sendNotifications(
slackSendColor: "#FFFF00",
message: "STARTED"
)
}
Jenkinsfile
post {
success {
sendNotificationsSuccess()
}
}
vars/sendNotificationsSuccess.groovy
def call() {
sendNotifications(
slackSendColor: "#00FF00",
message: "SUCCESSFUL"
)
}
Jenkinsfile
post {
failure {
sendNotificationsFailure()
}
}
vars/sendNotificationsFailure.groovy
def call() {
sendNotifications(
slackSendColor: "#FF0000",
message: "FAILED"
)
}
LAB EXERCISE
● Modify existing Pipeline to use the custom step from the previous lab
For this task, we will use the Gitea editor to modify the existing Pipeline.
@Library('shared-library') _
● Scroll down to the post { success … } } section of the Build Java 7 stage
● Replace
archiveArtifacts 'target/*.jar'
stash(name: 'Java 7', includes: 'target/**')
with
postBuildSuccess(stashName: "Java 7")
● The job should automatically start once the commit has completed.
Solution
Jenkinsfile
@Library('shared-library') _
pipeline {
agent none
stages {
stage('Fluffy Build') {
parallel {
stage('Build Java 8') {
agent {
node {
label 'java8'
}
}
steps {
sh "./jenkins/build.sh"
}
post {
success {
stash(name: 'Java 8', includes: 'target/**')
}
}
}
stage('Build Java 7') {
agent {
node {
label 'java7'
}
}
steps {
sh './jenkins/build.sh'
}
post {
success {
postBuildSuccess(stashName: "Java 7")
}
}
}
}
}
stage('Fluffy Test') {
parallel {
stage('Backend Java 8') {
agent {
node {
label 'java8'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-backend.sh'
}
post {
always {
junit 'target/surefire-reports/**/TEST*.xml'
}
}
}
stage('Frontend') {
agent {
node {
label 'java8'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-frontend.sh'
}
post {
always {
junit 'target/test-results/**/TEST*.xml'
}
}
}
stage('Performance Java 8') {
agent {
node {
label 'java8'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-performance.sh'
}
}
stage('Static Java 8') {
agent {
node {
label 'java8'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-static.sh'
}
}
stage('Backend Java 7') {
agent {
node {
label 'java7'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-backend.sh'
}
post {
always {
junit 'target/surefire-reports/**/TEST*.xml'
}
}
}
stage('Frontend Java 7') {
agent {
node {
label 'java7'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-frontend.sh'
}
post {
always {
junit 'target/test-results/**/TEST*.xml'
}
}
}
stage('Performance Java 7') {
agent {
node {
label 'java7'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-performance.sh'
}
}
stage('Static Java 7') {
agent {
node {
label 'java7'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-static.sh'
}
}
}
}
stage('Confirm Deploy') {
when {
branch 'master'
}
steps {
timeout(time: 3, unit: 'MINUTES' ) {
input(message: "Okay to Deploy to Staging?", ok: "Let's Do it!")
}
}
}
stage('Fluffy Deploy') {
when {
branch 'master'
}
agent {
node {
label 'java7'
}
}
steps {
unstash 'Java 7'
sh "./jenkins/deploy.sh ${params.DEPLOY_TO}"
}
}
}
parameters {
string(name: 'DEPLOY_TO', defaultValue: 'dev', description: '')
}
}
LIBRARYRESOURCE
USING LIBRARYRESOURCE
● From our previous example, instead of doing an inline body for the email,
let’s load the body of the message from a file
STARTING POINT
vars/sendNotifications.groovy
def call(Map config = [:]) {
<... removed Slack ...>
// send to email
emailext (
subject: "${config.message}: Job '${env.JOB_NAME} [${env.BUILD_NUMBER}]'",
body: """<p>${config.message}: Job '${env.JOB_NAME} [${env.BUILD_NUMBER}]':</p>
<p>Check console output at "<a href='${env.BUILD_URL}'>${env.JOB_NAME}
[${env.BUILD_NUMBER}]</a>"</p>""",
recipientProviders: [[$class: 'DevelopersRecipientProvider']]
)
}
BODY OF EMAIL
resources/emailtemplates/build-results.html
<p>$message: Job '$applicationName [$buildNumber]':</p>
<p>Check console output at <a href="$buildUrl">$applicationName [$buildNumber]</a></p>
LOAD THE FILE
vars/sendNotifications.groovy
def renderTemplate(input, binding) {
def engine = new groovy.text.GStringTemplateEngine()
def template = engine.createTemplate(input).make(binding)
return template.toString()
}
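The renderTemplate helper can then be combined with the libraryResource step, which loads a file from the library’s resources directory; a sketch of how the two might be wired together inside vars/sendNotifications.groovy (the binding keys simply mirror the variables used in the template above):
def emailBody = renderTemplate(
    libraryResource('emailtemplates/build-results.html'),
    [
        message:         config.message,
        applicationName: env.JOB_NAME,
        buildNumber:     env.BUILD_NUMBER,
        buildUrl:        env.BUILD_URL
    ]
)
emailext (
    subject: "${config.message}: Job '${env.JOB_NAME} [${env.BUILD_NUMBER}]'",
    body: emailBody,
    recipientProviders: [[$class: 'DevelopersRecipientProvider']]
)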
LAB EXERCISE
For this task, we will copy the contents of jenkins/build.sh from pipeline-lab to resources/scripts/build.sh in
the shared-library repository.
For this task, we will use the Gitea editor to create the custom step.
sh './jenkins/build.sh'
with
runLinuxScript(name: "build.sh")
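The body of runLinuxScript is not shown in this extract; a possible sketch, assuming the script has been copied to resources/scripts/ in the shared-library repository as described above:
// vars/runLinuxScript.groovy
def call(Map config = [:]) {
    // load the script from the library's resources directory and write it into the workspace
    def scriptContents = libraryResource "scripts/${config.name}"
    writeFile file: config.name, text: scriptContents
    sh "chmod +x ./${config.name}"
    sh "./${config.name}"
}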
Solution
Jenkinsfile
@Library('shared-library') _
pipeline {
agent none
stages {
stage('Fluffy Build') {
parallel {
stage('Build Java 8') {
agent {
node {
label 'java8'
}
}
steps {
runLinuxScript(name: "build.sh")
}
post {
success {
stash(name: 'Java 8', includes: 'target/**')
}
}
}
stage('Build Java 7') {
agent {
node {
label 'java7'
}
}
steps {
runLinuxScript(name: "build.sh")
}
post {
success {
postBuildSuccess(stashName: "Java 7")
}
}
}
}
}
stage('Fluffy Test') {
parallel {
stage('Backend Java 8') {
agent {
node {
label 'java8'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-backend.sh'
}
post {
always {
junit 'target/surefire-reports/**/TEST*.xml'
}
}
}
stage('Frontend') {
agent {
node {
label 'java8'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-frontend.sh'
}
post {
always {
junit 'target/test-results/**/TEST*.xml'
}
}
}
stage('Performance Java 8') {
agent {
node {
label 'java8'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-performance.sh'
}
}
stage('Static Java 8') {
agent {
node {
label 'java8'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-static.sh'
}
}
stage('Backend Java 7') {
agent {
node {
label 'java7'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-backend.sh'
}
post {
always {
junit 'target/surefire-reports/**/TEST*.xml'
}
}
}
stage('Frontend Java 7') {
agent {
node {
label 'java7'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-frontend.sh'
}
post {
always {
junit 'target/test-results/**/TEST*.xml'
}
}
}
stage('Performance Java 7') {
agent {
node {
label 'java7'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-performance.sh'
}
}
stage('Static Java 7') {
agent {
node {
label 'java7'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-static.sh'
}
}
}
}
stage('Confirm Deploy') {
when {
branch 'master'
}
steps {
timeout(time: 3, unit: 'MINUTES' ) {
input(message: "Okay to Deploy to Staging?", ok: "Let's Do it!")
}
}
}
stage('Fluffy Deploy') {
when {
branch 'master'
}
agent {
node {
label 'java7'
}
}
steps {
unstash 'Java 7'
sh "./jenkins/deploy.sh ${params.DEPLOY_TO}"
}
}
}
parameters {
string(name: 'DEPLOY_TO', defaultValue: 'dev', description: '')
}
}
SIMPLIFYING JENKINSFILES
Jenkinsfile
@Library('shared-starter') _
helloWorldPipeline(name: "Fred", dayOfWeek: "Friday")
vars/helloWorldPipeline.groovy
def call(Map pipelineParams) {
pipeline {
agent any
stages {
stage('hello') {
steps {
helloWorld(name: "${pipelineParams.name}", dayOfWeek: "${pipelineParams.dayOfWeek}")
}
}
}
}
}
● Pipeline gives you the ability to add your own DSL elements
● Pipeline is itself a DSL, so you can extend it
WHY YOU WOULD WANT YOUR OWN DSL
Jenkinsfile
@Library('shared-starter') _
helloWorldPipeline {
name = "Fred"
dayOfWeek = "Friday"
}
vars/helloWorldPipeline.groovy
def call(body) {
def pipelineParams= [:]
body.resolveStrategy = Closure.DELEGATE_FIRST
body.delegate = pipelineParams
body()
pipeline {
agent any
stages {
stage('hello') {
steps {
helloWorld(name: "${pipelineParams.name}", dayOfWeek: "${pipelineParams.dayOfWeek}")
}
}
}
}
}
LAB EXERCISE
def call(body) {
def pipelineParams= [:]
body.resolveStrategy = Closure.DELEGATE_FIRST
body.delegate = pipelineParams
body()
!!!REPLACEME!!!
}
● Replace !!!REPLACEME!!! with the contents from the Jenkinsfile from the pipeline-lab repository.
Be sure to not copy over the @Library annotation.
● Add a commit message and click Commit Changes
Task: Modify the new custom step to use a parameter passed through corporatePipeline
● Click on corporatePipeline.groovy
● Click on the pencil in the upper right hand corner to enter edit mode
● Remove the parameters directive
● Change the ${params.DEPLOY_TO} parameter to ${pipelineParams.deployTo}
● Add a commit message and click Commit Changes
For this task, we will use the Gitea editor to modify the existing Pipeline.
@Library('shared-library') _
corporatePipeline {
deployTo = "dev"
}
Solution
vars/corporatePipeline.groovy
def call(body) {
def pipelineParams= [:]
body.resolveStrategy = Closure.DELEGATE_FIRST
body.delegate = pipelineParams
body()
pipeline {
agent none
stages {
stage('Fluffy Build') {
parallel {
stage('Build Java 8') {
agent {
node {
label 'java8'
}
}
post {
success {
stash(name: 'Java 8', includes: 'target/**')
}
}
steps {
runLinuxScript(name: "build.sh")
}
}
stage('Build Java 7') {
agent {
node {
label 'java7'
}
}
post {
success {
postBuildSuccess(stashName: "Java 7")
}
}
steps {
runLinuxScript(name: "build.sh")
}
}
}
}
stage('Fluffy Test') {
parallel {
stage('Backend Java 8') {
agent {
node {
label 'java8'
}
}
post {
always {
junit 'target/surefire-reports/**/TEST*.xml'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-backend.sh'
}
}
stage('Frontend') {
agent {
node {
label 'java8'
}
}
post {
always {
junit 'target/test-results/**/TEST*.xml'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-frontend.sh'
}
}
stage('Performance Java 8') {
agent {
node {
label 'java8'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-performance.sh'
}
}
stage('Static Java 8') {
agent {
node {
label 'java8'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-static.sh'
}
}
stage('Backend Java 7') {
agent {
node {
label 'java7'
}
}
post {
always {
junit 'target/surefire-reports/**/TEST*.xml'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-backend.sh'
}
}
stage('Frontend Java 7') {
agent {
node {
label 'java7'
}
}
post {
always {
junit 'target/test-results/**/TEST*.xml'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-frontend.sh'
}
}
stage('Performance Java 7') {
agent {
node {
label 'java7'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-performance.sh'
}
}
stage('Static Java 7') {
agent {
node {
label 'java7'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-static.sh'
}
}
}
}
stage('Confirm Deploy') {
when {
branch 'master'
}
steps {
timeout(time: 3, unit: 'MINUTES') {
input(message: 'Okay to Deploy to Staging?', ok: 'Let\'s Do it!')
}
}
}
stage('Fluffy Deploy') {
agent {
node {
label 'java7'
}
}
when {
branch 'master'
}
steps {
unstash 'Java 7'
sh "./jenkins/deploy.sh ${pipelineParams.deployTo}"
}
}
}
}
}
DURABILITY
● Most basic Pipelines that just build and test the code
■ They frequently write build and test data and can easily be rerun
● Your Jenkins instance shows high iowait numbers
● Your Jenkins instance uses a networked file system or magnetic storage
● You run many Pipelines at the same time
● You run Pipelines with many steps (more than several hundred)
WHEN NOT TO USE HIGHER-PERFORMANCE DURABILITY SETTINGS
pipeline {
agent any
stages {
stage('Example') {
steps {
echo 'Hello World'
}
}
}
options {
durabilityHint('PERFORMANCE_OPTIMIZED')
}
}
BEST PRACTICES FOR DURABILITY SETTINGS
● Scaling Pipelines
LAB EXERCISE
Durability
● Modify the existing custom step to override the global durability value
● Using Classic view, review the most recent log from a successful master branch run.
● Search for Running in Durability level. You should see a value of PERFORMANCE_OPTIMIZED.
Solution
vars/corporatePipeline.groovy
def call(body) {
def pipelineParams = [:]
body.resolveStrategy = Closure.DELEGATE_FIRST
body.delegate = pipelineParams
body()
pipeline {
agent none
stages {
stage('Fluffy Build') {
parallel {
stage('Build Java 8') {
agent {
node {
label 'java8'
}
}
post {
success {
stash(name: 'Java 8', includes: 'target/**')
}
}
steps {
runLinuxScript(name: "build.sh")
}
}
stage('Build Java 7') {
agent {
node {
label 'java7'
}
}
post {
success {
postBuildSuccess(stashName: "Java 7")
}
}
steps {
runLinuxScript(name: "build.sh")
}
}
}
}
stage('Fluffy Test') {
parallel {
stage('Backend Java 8') {
agent {
node {
label 'java8'
}
}
post {
always {
junit 'target/surefire-reports/**/TEST*.xml'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-backend.sh'
}
}
stage('Frontend') {
agent {
node {
label 'java8'
}
}
post {
always {
junit 'target/test-results/**/TEST*.xml'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-frontend.sh'
}
}
stage('Performance Java 8') {
agent {
node {
label 'java8'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-performance.sh'
}
}
stage('Static Java 8') {
agent {
node {
label 'java8'
}
}
steps {
unstash 'Java 8'
sh './jenkins/test-static.sh'
}
}
stage('Backend Java 7') {
agent {
node {
label 'java7'
}
}
post {
always {
junit 'target/surefire-reports/**/TEST*.xml'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-backend.sh'
}
}
stage('Frontend Java 7') {
agent {
node {
label 'java7'
}
}
post {
always {
junit 'target/test-results/**/TEST*.xml'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-frontend.sh'
}
}
stage('Performance Java 7') {
agent {
node {
label 'java7'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-performance.sh'
}
}
stage('Static Java 7') {
agent {
node {
label 'java7'
}
}
steps {
unstash 'Java 7'
sh './jenkins/test-static.sh'
}
}
}
}
stage('Confirm Deploy') {
when {
branch 'master'
}
steps {
timeout(time: 3, unit: 'MINUTES') {
input(message: 'Okay to Deploy to Staging?', ok: 'Let\'s Do it!')
}
}
}
stage('Fluffy Deploy') {
agent {
node {
label 'java7'
}
}
when {
branch 'master'
}
steps {
unstash 'Java 7'
sh "./jenkins/deploy.sh ${pipelineParams.deployTo}"
}
}
}
options {
durabilityHint('MAX_SURVIVABILITY')
}
}
}
SEQUENTIAL STAGES
pipeline {
agent none
stages {
stage("build and deploy") {
parallel {
stage("windows") {
stages {
stage("build") {
steps {
bat "run-build.bat"
}
}
stage("deploy") {
steps {
bat "run-deploy.bat"
}
}
}
}
stage("linux") {
stages {
stage("build") {
steps {
sh "./run-build.sh"
}
}
}
}
}
}
}
}
SEQUENTIAL STAGES
● Use sequential stages to ensure that stages using the same agent use the
same workspace even though you are using multiple agents in your Pipeline
■ Use a parent stage with an agent directive on it
○ Then all the stages inside its stages directive run on the same
executor,
in the same workspace.
SEQUENTIAL STAGES
pipeline {
agent none
stages {
stage("build") {
steps {
sh "./build.sh"
stage("test") {
steps {
sh "./test.sh"
post {
success {
stash name: "artifacts", includes: "artifacts/**/*"
...
SEQUENTIAL STAGES
...
input {
ok "Yes, we should."
submitter "alice,bob"
}
}
agent {
docker "our-deploy-tools-image"
steps {
sh "./deploy.sh"
LAB EXERCISE
Sequential Stages
● Click on the pencil in the upper right hand corner of corporatePipelineSequential.groovy to enter
edit mode
● Modify the pipeline to run parallel sequential stages to replace the existing Fluffy Build and Fluffy
Test stages. There should be one sequential stage for Java 7 and one sequential stage for Java
8.
● Commit the changes to corporatePipelineSequential.groovy
● Open the Jenkinsfile in the pipeline-lab repo and change corporatePipeline to
corporatePipelineSequential.
● Commit the changes to the master branch
● The job should start automatically on the master branch
● Verify the job completed successfully
● Go take a look at the job visualization using Blue Ocean. Notice the difference from prior runs.
Solution
vars/corporatePipelineSequential.groovy
def call(body) {
def pipelineParams = [:]
body.resolveStrategy = Closure.DELEGATE_FIRST
body.delegate = pipelineParams
body()
pipeline {
agent none
stages {
parallel {
stage('java8') {
stages {
stage("build8") {
steps {
runLinuxScript(name: "build.sh")
post {
success {
steps {
sh './jenkins/test-backend.sh'
post {
always {
junit 'target/surefire-reports/**/TEST*.xml'
stage('Frontend') {
steps {
sh './jenkins/test-frontend.sh'
post {
always {
junit 'target/test-results/**/TEST*.xml'
}
}
steps {
sh './jenkins/test-performance.sh'
steps {
sh './jenkins/test-static.sh'
}
}
stage('java7') {
stages {
stage("build7") {
steps {
post {
success {
}
}
steps {
sh './jenkins/test-backend.sh'
post {
always {
junit 'target/surefire-reports/**/TEST*.xml'
steps {
unstash 'Java 7'
sh './jenkins/test-frontend.sh'
post {
always {
junit 'target/test-results/**/TEST*.xml'
steps {
sh './jenkins/test-performance.sh'
}
}
steps {
sh './jenkins/test-static.sh'
stage('Confirm Deploy') {
steps {
timeout(time: 3, unit: 'MINUTES') {
stage('Fluffy Deploy') {
steps {
sh "./jenkins/deploy.sh ${pipelineParams.deployTo}"
}
options {
durabilityHint('MAX_SURVIVABILITY')
● You can restart any completed Declarative Pipeline from any top-level stage
that ran in that Pipeline
● This allows you to rerun a Pipeline from a stage that failed due to transient or
environmental considerations
HOW TO USE
● You are prompted to choose from a list of top-level stages that were executed
in the original run, in the order they were executed
● Stages that were skipped due to an earlier failure are not available to be restarted,
but stages that were skipped due to a when condition not being satisfied are available
● The parent stage for a group of parallel stages, or for a group of nested stages to be
run sequentially, is also not available - only top-level stages are allowed
HOW TO USE
● Once you choose a stage from which to restart and click submit,
a new build, with a new build number, starts
■ All inputs are the same, including SCM information, build parameters,
and the contents of any stash artifacts
● All stages before the selected stage are skipped and the Pipeline
starts executing at the selected stage
● From that point on, the Pipeline runs as normal
PRESERVING STASHES FOR USE WITH RESTARTED STAGES
● Normally, when you run the stash step in your Pipeline, the resulting stash of artifacts
is cleared when the Pipeline completes, regardless of the result of the Pipeline
● Since stash artifacts are not accessible outside of the Pipeline run that created them,
this has not created any limitations on usage
● With Declarative stage restarting, you may want to be able to unstash artifacts
from a stage that ran before the stage from which you are restarting
PRESERVING STASHES FOR USE WITH RESTARTED STAGES
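The preserveStashes option addresses this; a minimal sketch (the buildCount value is illustrative and matches the value used in the lab solution later in this module):
pipeline {
    agent any
    options {
        // keep the stashes from the five most recent completed runs
        preserveStashes(buildCount: 5)
    }
    stages {
        stage('Build') {
            steps {
                sh './jenkins/build.sh'
                stash(name: 'Java 8', includes: 'target/**')
            }
        }
    }
}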
LAB EXERCISE
Restart Stage
● Modify the existing custom step to use preserveStashes so that a stage can be restarted successfully
● Open the job run from the master branch in the Classic UI
● Click on Build Now
● Click on the progress bar for the job in the left nav to open the scrolling console log
● Wait for the input stage and then click on Abort
● The job should finish in an ABORTED state
● In the breadcrumb at the top of the page, click on the job number that you just aborted
● On the left nav, click on Restart from Stage
● From the dropdown, select Confirm Deploy
● Click Run
● Click on the progress bar for the job in the left nav to open the scrolling console log
● Wait for the input stage and then click on Let’s Do It!
● The job should complete successfully
NOTE
You may have to refresh the page (Cmd+R/Ctrl+R) in order to see the correct rendering. This is a known issue with Blue Ocean and Restart from Stage.
●
Click on Let’s Do It!
● The job should complete successfully
Solution
vars/corporatePipelineSequential.groovy
def call(body) {
def pipelineParams= [:]
body.resolveStrategy = Closure.DELEGATE_FIRST
body.delegate = pipelineParams
body()
pipeline {
agent none
stages {
stage('Build and Test Java') {
parallel {
stage('java8') {
agent { label 'java8' }
stages {
stage("build8") {
steps {
runLinuxScript(name: "build.sh")
}
post {
success {
stash(name: 'Java 8', includes: 'target/**')
}
}
}
stage('Backend Java 8') {
steps {
unstash 'Java 8'
sh './jenkins/test-backend.sh'
}
post {
always {
junit 'target/surefire-reports/**/TEST*.xml'
}
}
}
stage('Frontend') {
steps {
unstash 'Java 8'
sh './jenkins/test-frontend.sh'
}
post {
always {
junit 'target/test-results/**/TEST*.xml'
}
}
}
stage('Performance Java 8') {
steps {
unstash 'Java 8'
sh './jenkins/test-performance.sh'
}
}
stage('Static Java 8') {
steps {
unstash 'Java 8'
sh './jenkins/test-static.sh'
}
}
}
}
stage('java7') {
agent { label 'java7' }
stages {
stage("build7") {
steps {
runLinuxScript(name: "build.sh")
}
post {
success {
postBuildSuccess(stashName: "Java 7")
}
}
}
stage('Backend Java 7') {
steps {
unstash 'Java 7'
sh './jenkins/test-backend.sh'
}
post {
always {
junit 'target/surefire-reports/**/TEST*.xml'
}
}
}
stage('Frontend Java 7') {
steps {
unstash 'Java 7'
sh './jenkins/test-frontend.sh'
}
post {
always {
junit 'target/test-results/**/TEST*.xml'
}
}
}
stage('Performance Java 7') {
steps {
unstash 'Java 7'
sh './jenkins/test-performance.sh'
}
}
stage('Static Java 7') {
steps {
unstash 'Java 7'
sh './jenkins/test-static.sh'
}
}
}
}
}
}
stage('Confirm Deploy') {
when { branch 'master' }
steps {
timeout(time: 3, unit: 'MINUTES') {
input(message: 'Okay to Deploy to Staging?', ok: 'Let\'s Do it!')
}
}
}
stage('Fluffy Deploy') {
agent { label 'java7' }
when { branch 'master' }
steps {
unstash 'Java 7'
sh "./jenkins/deploy.sh ${pipelineParams.deployTo}"
}
}
}
options {
durabilityHint('MAX_SURVIVABILITY')
preserveStashes(buildCount: 5)
}
}
}
GROOVY SANDBOX
● The sandbox provides a safe location to test Scripted Pipeline code that
has not been thoroughly tested and reviewed
● "Unsafe" Pipeline code includes calls that are not known to be safe
■ The code itself may actually be benign
● The mischief that unsafe code can do includes:
■ Disclosure of information (secrets, proprietary information, or other
confidential information being accessed by the Pipeline)
■ Modification/deletion of data in the Jenkins master
● Unsafe code can be inserted in a Pipeline intentionally or accidentally
WHITELIST
● The whitelist defines each method call, object construction and field access that can be
used
■ The Script Security plugin includes a small default whitelist
■ Plugins may add operations to that list
■ Administrators may add operations to that list
■ Method signatures can be pre-whitelisted with Groovy either on boot (using
init.groovy.d)
or in the script console
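For example, a signature can be pre-approved from the Script Console with something like the following (a sketch; the signature string is illustrative, not one required by this course):
import org.jenkinsci.plugins.scriptsecurity.scripts.ScriptApproval

// add a method signature to the whitelist so that sandboxed Pipelines may call it
ScriptApproval.get().approveSignature('method java.lang.String toUpperCase')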
ADMINISTRATION OF WHITELIST
● When a script fails because it uses an operation that is not in the whitelist,
that operation is added to an approval queue
■ The administrator can approve the script and it will run
● The administrator is also given a list of pending operation approvals
■ Click Approve next to an operation to add it to the whitelist
■ This makes that operation available to all sandboxed scripts on the Jenkins
instance
● Administrator can instead click Approve assuming permission check for getItems
■ The call is permitted when run as an actual user who is on the ACL
■ The call is forbidden when run as the system user
■ This button is shown only for method calls and constructors
■ Use it only when you know that Jenkins is doing a permission check
MORE REMARKS ABOUT SCRIPT SECURITY
R. Tyler Croy’s Do not disable the Groovy sandbox blog is a fun discussion of
why sandboxes are important and includes an example script that could
destroy your Jenkins instance were it allowed to run.
OTHER HINTS
KEEP IT SIMPLE!
● Avoid Pipeline XML or JSON parsing using Groovy’s XmlSlurper and JsonSlurper
■ Groovy implementations are complex and very brittle for Pipeline usage
■ XmlSlurper and JsonSlurper carry a high memory and CPU cost in Pipelines
● xmllint and XMLStarlet are command-line tools offering XML extraction using XPath
● jq offers the same functionality for JSON
● These extraction tools may be coupled with curl or wget to fetch information from an HTTP
API
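As an illustration of this approach, a stage can shell out to curl and jq instead of parsing JSON in Groovy (a sketch; the URL and field name are illustrative, and curl and jq are assumed to be installed on the agent):
stage('Check service version') {
    steps {
        sh '''
            curl -s https://myservice.example.com/api/status > status.json
            jq -r '.version' status.json > version.txt
        '''
        script {
            // read the extracted value back into the Pipeline
            env.SERVICE_VERSION = readFile('version.txt').trim()
        }
        echo "Service version is ${env.SERVICE_VERSION}"
    }
}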
USE EXTERNAL SCRIPTS AND TOOLS
● Processing data
● Communicating interactively with REST APIs
● Parsing/templating larger XML or JSON files
● Nontrivial integration with external APIs
● Simulations and complex calculations
● Business logic
COMMAND-LINE CLIENTS FOR APIS
● Many software vendors provide easy command-line clients for their tools
in various programming languages
■ These are often robust, performant and easy to use
● Use shell or batch steps to integrate these tools, which can be written in any language
■ For a Java client, use a command like:
■ sh "java -jar client.jar $
endPointUrl $inputData"
● Avoid inputs that might contain shell metacharacters. A construction like the following
solves this problem:
writeFile file: 'input.json', text: inputData
sh 'java -jar client.jar $endPointUrl input.json'
REDUCE THE NUMBER OF STEPS IN THE PIPELINE