11 Jenkins
Code is pushed into repositories several times a day, and over a period of time all of it gets merged. Traditional software development methods don't dictate how frequently or regularly you integrate all of the source on a project. Programmers can work separately for hours, days, or even weeks on the same source without realizing how many conflicts (and perhaps bugs) they are generating.
1. Integration is painful
Agile teams produce workable and robust code in each iteration. If all that code is built and evaluated together, it returns a lot of conflicts, bugs and errors. Developers need to resolve those conflicts and issues before moving to the next iteration. The more programmers share the code, the more problematic this becomes. For these reasons, agile teams often choose to use Continuous Integration.
The source code can be built, packaged and deployed manually, but there are build tools that make a developer's life easier when it comes to building an artefact or even deploying it. These are called build automation tools (a minimal command-line sketch follows the list below):
• Ant
• Maven
• Gradle
• Msbuild
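As a minimal command-line sketch of what these tools do (assuming a Java project with a pom.xml in the current directory), Maven can compile, test and package the code in one step:
mvn clean package    # compiles, runs the unit tests and creates the artefact (e.g. a .war) under target/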
Unit Testing
Unit testing simply verifies that an individual unit of code (mostly a function) works as expected. Along with writing the code, the developer writes test cases that can be executed at build time. Some test cases can even be generated automatically.
The objective of unit testing is to isolate a section of code (a unit) and verify its correctness.
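For example, in a Maven project the unit tests under src/test/java are executed automatically as part of the build, and they can also be run on their own (a minimal sketch):
mvn test    # compiles the code and runs only the unit tests (e.g. JUnit)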
The Problem
Developers write the code and build it on their local systems. Once they test and verify the code locally, they push it to a centralised repository such as GitHub. In this way, all the developers push their code to the VCS several times a day. Developers work in their own silos or caves and keep writing code until they finish a particular task or the project. Now, if all the code which developers have pushed into the VCS is built and tested, it returns lots and lots of conflicts and errors, due to which the build will fail.
The Solution
To get around this very problem, whenever a developer pushes code to the VCS it should be fetched, built and tested by a build server right away.
4. What is Jenkins
Jenkins is a continuous integration server which can fetch the latest code from the VCS, build it, test it and notify the developers. Jenkins can do many more things apart from just being a CI server.
It was originally known as Hudson; Oracle Inc. now owns Hudson. Jenkins is an open source project written by Kohsuke Kawaguchi.
Jenkins is a Java-based web application. As a prerequisite, we first need to set up Java on the machine to run the Jenkins server.
Extensible
Jenkins comes with a lot of goodies, but it is not limited to them; Jenkins' main power is its extensibility, which is achieved by installing plugins into it.
The Jenkins open source community has written tons of plugins, and these plugins can do a variety of tasks, such as integration with external tools or servers.
➢ VCS plugins – Git, Subversion (SVN), etc.
➢ Build plugins – Maven, Ant, MSBuild, etc.
➢ Notification plugins – email, chat, SMS, etc.
➢ Cloud plugins – create cloud instances, deploy code to cloud services, etc.
➢ Testing plugins – code analysis, unit test cases, static code analysis, etc.
The list of plugins is very long; whenever we want Jenkins to do some task, we just search for that plugin and most of the time we will find something.
For example, if you want Jenkins to deploy a Java artefact to a Tomcat server, search for the plugin named "Deploy to container".
Jenkins can be installed on Windows, Linux or macOS; Jenkins just needs Java to run.
In this tutorial, we will install Jenkins on an Ubuntu server. You can set up a VM or a cloud instance.
Prereqs
A Java Runtime Environment (JRE) would be enough to run Jenkins, but we will install the JDK because we will set up Maven later on and build some Java code, and building Java code requires the JDK.
sudo add-apt-repository ppa:openjdk-r/ppa
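With the PPA added, the JDK itself can be installed and verified as follows (a sketch assuming OpenJDK 8, which is the version used later in this chapter):
sudo apt-get update
sudo apt-get install openjdk-8-jdk -y
java -version    # verify the installation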
Installing Jenkins
wget -q -O - https://pkg.jenkins.io/debian/jenkins-ci.org.key | sudo apt-key add -
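After importing the key, the Jenkins apt repository has to be added and Jenkins installed; these are the same steps used by the installation script later in this chapter:
sudo sh -c 'echo deb http://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'
sudo apt-get update
sudo apt-get install jenkins -y
sudo systemctl start jenkins    # Jenkins listens on port 8080 by default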
Click Create new jobs or New Item => Enter the name "first-jenkins-job" =>
Select Freestyle project => OK
After saving the project, we land on the job's dashboard. Click Build Now
to execute the project.
We have seen from the above example that setting up a Jenkins job is not such a challenging task, but you need to know what information goes into the Jenkins job. In the next task, we will create an actual build job.
This Java source code can be built by Maven. It also has the pom.xml file which Maven needs to build the code.
Go to Jenkins main dashboard => Click New Item => Give a name to your job
=> Freestyle => OK
Go to build section => From drop down select Invoke top-level Maven targets
=> In the goals give “install” => Save
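Behind the scenes, this build step is roughly equivalent to running Maven inside the job's workspace (a sketch; the workspace path below is only the typical default and the job name is a placeholder):
cd /var/lib/jenkins/workspace/<job-name>    # hypothetical default workspace location
mvn install                                 # the goal configured in the build step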
• Generate artefact
• Archive artefact
Workspace:
The workspace is the place where all the data of the job gets stored, e.g. source code, artefacts, etc. Every job in Jenkins has its own workspace.
9. Jenkins Administration
Now that we have had a little taste of Jenkins and how to run jobs, we can look at how to administer Jenkins. Jenkins offers amazing features and is very flexible; we can set it up as per our need. Jenkins can do a variety of tasks apart from just being a CI server, but we need to configure it accordingly. In the coming sections, we will see how flexible and extensible Jenkins is.
You can open Jenkins admin settings by clicking on Manage Jenkins from main dashboard.
Note: Sometimes after making a config change you may need to restart Jenkins.
In the browser, you can use the below URL to restart Jenkins.
http://<JenkinsIP>:8080/restart
Manage Plugins
Plugins are the most powerful feature of Jenkins. You can customise Jenkins as per your need by installing and setting up plugins. You can use Jenkins to automate almost anything; it's just a matter of which plugins you set up, and there is a wide variety to choose from. We have already used some plugins in our build job, like Git SCM and Invoke top-level Maven targets.
Some plugins come installed in Jenkins by default, and you can then install any other plugin as per your choice and need.
➢ Available
The list of available plugins to install. Find your plugin using the filter, put a checkmark on the plugin you want and click "Install without restart". If the setting does not take effect, restart the Jenkins server. Every plugin has a wiki page; click on the plugin to read its wiki.
➢ Advanced
Sometimes you sit behind a proxy server and don't have a direct internet connection. In that case you won't be able to see the list of available plugins and won't be able to install them. You can enter proxy settings on this page, restart Jenkins, and then you will see the list of plugins to choose from.
If you are a Java programmer and have written your own plugin, you can upload your plugin to Jenkins in .hpi format.
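Plugins can also be installed from the command line using the Jenkins CLI (a sketch; it assumes Jenkins is reachable on localhost and that jenkins-cli.jar has been downloaded from <jenkins-url>/jnlpJars/jenkins-cli.jar, with any required credentials omitted):
java -jar jenkins-cli.jar -s http://localhost:8080/ install-plugin git
java -jar jenkins-cli.jar -s http://localhost:8080/ safe-restart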
Configure System
Security Realm
First, establish the user authentication method. For smaller, more informal installations, you can use
Jenkins' own user database. For enterprise installations, you will want to use your corporate service,
which allows users to log in to Jenkins with their usual username and password.
On a Linux host you have the option of using either the Active Directory plugin or LDAP-based authentication. To configure LDAP to work with Active Directory, provide the following:
Server: mydomaincontroller.mycompany.com:389
Root DN: dc=mycompany,dc=com
User Search Filter: sAMAccountName={0}
Manager DN: cn=mymanageruser,ou=users,ou=na,ou=mycompany,dc=mycompany,dc=com
Manager Password: *****
Note that the correct Manager DN value can vary greatly depending on your Active Directory set up.
UNIX NIS
To set up Network Information System:
Authorization
The Authorization section of the Configure Global Security page allows you to configure what users are
allowed to do once authenticated.
Matrix-based Security
Matrix-based security offers the most precise control over user privileges.
4. Give yourself full access by checking the entire row for your user name.
5. Repeat for other users who deserve full access. The configuration should look like the picture below:
6. Click Save at the bottom of the page. You will be taken back to the top page. Now Jenkins is successfully secured.
If you set up a service like NIS, Active Directory or LDAP, you can now log in to Jenkins using your
network credentials. If you are using Jenkins' own user database, create a user account for yourself:
If everything works smoothly, you are now logged on as yourself with full permissions. If something
goes wrong, follow this to reset the security setting.
5. Now you need to connect your slave machine to the master using the following steps.
a. Open a browser on the slave machine and go to the Jenkins master server URL (http://yourjenkinsmaster:8080).
b. Go to Manage Jenkins > Manage Nodes and click on the newly created slave machine. You will need to log in as someone that has the "Connect" slave permission if you have configured global security.
c. Click on the Launch button to launch the agent from the browser on the slave.
Note: If you encounter a connection issue, you can enlarge the popup window to see the master port used and check your network configuration (firewall, port forwarding, etc.).
6. If you want the service to run on start-up of the slave machine, do the following (Windows-only directions):
In the slave agent program running on your slave machine, click File --> Install as Windows Service.
Nexus
Software repositories, or repository managers, are becoming a very central part of Continuous Integration and Continuous Delivery projects.
We have seen in our second build job that whenever we run the job it creates the gameoflife.war artefact. This artefact gets replaced every time we run the job.
If we generate an artefact that does not work or has issues, we may need to go back to the previous version of the artefact. If we start versioning artefacts in Jenkins, we may fill up Jenkins disk space very quickly, as these jobs run several times a day. For this we need a mechanism for versioning and storing our versioned artefacts in some centralised place.
For that very purpose we can use Nexus Repository Manager.
Project Setup
Jenkins Plugin Setup
Install the following plugins:
• Git plugin
Checks out source code from GitHub; integrates Jenkins with Git.
• Zentimestamp plugin
Creates a variable named $BUILD_TIMESTAMP which can be used for versioning/naming our artefact.
After installing the plugin, we have to set its value from the Configure System page:
Manage Jenkins => Configure System => Global properties.
• Nexus plugin
Uploads our versioned artefact to the Nexus repository; integrates Nexus with Jenkins.
Add Build step --> Invoke top level maven project --> In Goals enter "install"
Build verification.
In your project's dashboard => Go to the workspace => gameoflife-web => target
You should see gameoflife.war.
Nexus setup
We will set up the Nexus server on CentOS in this tutorial.
Create a CentOS VM or cloud instance and log in to it.
Follow the below steps to set up Nexus:
export RUN_AS_USER=root
wget http://www.sonatype.org/downloads/nexus-latest-bundle.tar.gz
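Before Nexus can be started, the downloaded archive has to be extracted; a sketch of the steps assumed between the download and the start command (the versioned directory name is a placeholder):
tar -xzf nexus-latest-bundle.tar.gz -C /usr/local
ln -s /usr/local/nexus-<version> /usr/local/nexus    # replace <version> with the extracted directory name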
cd /usr/local
/usr/local/nexus/bin/nexus start
Open the Jenkins build job => Add build step => Nexus artefact uploader.
Nexus output
Log in to Nexus and verify the repository data. You should see a versioned artefact there.
If you run this job multiple times, you will see that every run produces an artefact with a new name.
At any point in time we can fall back to an older version of the artefact if something breaks in a newer version.
Steps:
1. Install checkstyle plugin.
2. In the Maven build step, update the Goals to run Checkstyle (see the sketch after these steps).
3. Click Post build action and select “Publish checkstyle analysis results”.
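A sketch of what the updated Goals field typically looks like once Checkstyle is added to the Maven build (the checkstyle:checkstyle goal comes from the maven-checkstyle-plugin; confirm it against your own job configuration):
install checkstyle:checkstyle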
Tomcat setup
Login as root user
apt-get update
wget https://www-eu.apache.org/dist/tomcat/tomcat-8/v8.5.15/bin/apache-tomcat-8.5.15.zip
mv apache-tomcat-8.5.15.zip /opt
cd /opt
unzip apache-tomcat-8.5.15.zip
cd apache-tomcat-8.5.15
vi conf/tomcat-users.xml
Replace
<!--
<role rolename="tomcat"/>
<role rolename="role1"/>
-->
</tomcat-users>
with
-->
<role rolename="manager-gui"/>
</tomcat-users>
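For Jenkins (or any other client) to actually authenticate against Tomcat with this role, a user entry is normally also added inside <tomcat-users>. A sketch follows; the username, password and the extra manager-script role are assumptions not shown in the original snippet:
<user username="admin" password="admin" roles="manager-gui,manager-script"/>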
Replace (this Valve is typically found in webapps/manager/META-INF/context.xml under the Tomcat installation directory)
<Valve className="org.apache.catalina.valves.RemoteAddrValve"
allow="127\.\d+\.\d+\.\d+|::1|0:0:0:0:0:0:0:1" />
</Context>
with
<!--
<Valve className="org.apache.catalina.valves.RemoteAddrValve"
allow="127\.\d+\.\d+\.\d+|::1|0:0:0:0:0:0:0:1" />
-->
</Context>
bin/startup.sh
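To verify that Tomcat started correctly, you can check that port 8080 responds (a quick sketch; the host is a placeholder):
curl -I http://<tomcat-ip>:8080/    # should return an HTTP 200 response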
This project is a complete build pipeline which will build the Java source code, create a war package, run static code analysis, deploy the package to a staging Tomcat server and then deploy it to a prod Tomcat server.
The pipeline consists of the below-mentioned Jenkins jobs working together to create the entire continuous delivery pipeline.
1. Package job – This job will check out source code from the Git repository, use Maven to build the code, archive the artefacts, and trigger the static code analysis job and the staging deploy job.
2. Static code analysis – This job will check out source code from the Git repository, use Maven to run Checkstyle code analysis and publish a graph of the static code analysis results.
3. Deploy to Staging Tomcat server – This job will copy the war artefact from the package job, deploy the artefact to the Staging Tomcat server and trigger the Prod deploy job.
4. Deploy to Prod Tomcat server – This job will copy the war artefact from the package job and deploy the artefact to the Prod Tomcat server.
Prerequisites:
1. Jenkins server: - Jenkins server should have openjdk 1.7, Git, Maven (Follow Jenkins setup doc)
2. Plugins: - git, maven, copy artifacts, deploy to container, checkstyle, build pipeline
3. Staging tomcat server: - Create a vm or ec2 instance with ubuntu OS & Follow tomcat setup
doc.
4. Prod tomcat server:- Create a vm or ec2 instance with ubuntu OS & Follow tomcat setup doc.
• Through a script (see the curl sketch below)
• From a browser
• Github webhooks
◦ In the GitHub repo, go to Settings => Webhooks => Enter the Jenkins URL and token => Add Webhook.
Now whenever there is a commit in the Git repo, it will trigger the Jenkins job by hitting the Jenkins remote URL.
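For the script-based trigger, Jenkins exposes a remote build URL; with "Trigger builds remotely" enabled on the job and an authentication token configured, a sketch looks like this (host, job name and token are placeholders):
curl -X POST "http://<jenkins-host>:8080/job/first-jenkins-job/build?token=<TOKEN>"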
Build Periodically
If we want to schedule our job to be executed, for example, every night at 8 pm, we can use a cron-like format to specify the time.
This field follows the syntax of cron (with minor differences). Specifically, each line consists of 5
fields separated by TAB or whitespace:
MINUTE HOUR DOM MONTH DOW
MINUTE Minutes within the hour (0–59)
HOUR The hour of the day (0–23)
DOM The day of the month (1–31)
MONTH The month (1–12)
DOW The day of the week (0–7), where 0 and 7 are Sunday
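For the "every night at 8 pm" example mentioned above, the schedule field would be:
0 20 * * *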
Poll SCM
It's similar to a cron job, but instead of running the job at the interval it checks whether there are any new commits in the VCS repository (such as a GitHub repo) and executes the job only if there are.
DOCUMENTED BY KARUNAKAR G.
SonarQube is an open source platform for continuous inspection of code quality. It reports on duplicated code, coding standards, unit tests, code coverage and code complexity.
SonarQube provides a central place to view and define the rules used during the analysis of projects. These rulesets are organised in quality profiles. Every member of the organisation can see which rules are applied to their project.
Every project administrator can choose which quality profile is used for the project.
Required Plugins
✓ SonarQube scanner
✓ Maven
SonarQube LifeCycle
• Java-8-oracle,
• Sonarqube version 5.6.6,
• mysql-server 5.7.
SonarQube Installation
sudo apt-get install unzip
wget https://sonarsource.bintray.com/Distribution/sonarqube/sonarqube-5.6.6.zip
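The archive then has to be extracted; a sketch assuming we keep SonarQube under /opt (the target directory is an assumption, not stated in the original):
unzip sonarqube-5.6.6.zip
sudo mv sonarqube-5.6.6 /opt/sonarqube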
$ mysql -u root -p
Create the MySQL database and the sonar user:
mysql> CREATE DATABASE sonar CHARACTER SET utf8 COLLATE utf8_general_ci;
mysql> CREATE USER 'sonar' IDENTIFIED BY 'sonar';
mysql> GRANT ALL ON sonar.* TO 'sonar'@'%' IDENTIFIED BY 'sonar';
mysql> GRANT ALL ON sonar.* TO 'sonar'@'localhost' IDENTIFIED BY 'sonar';
mysql> FLUSH PRIVILEGES;
# User credentials.
# Permissions to create tables, indices and triggers must be granted to the JDBC user.
# The schema must be created first.
sonar.jdbc.username=sonar
sonar.jdbc.password=sonar
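The JDBC connection URL also has to be set in sonar.properties; a sketch assuming MySQL on localhost and the sonar database created above, followed by starting the server from the /opt/sonarqube location assumed earlier:
sonar.jdbc.url=jdbc:mysql://localhost:3306/sonar?useUnicode=true&characterEncoding=utf8
/opt/sonarqube/bin/linux-x86-64/sonar.sh start    # start SonarQube after the database is configured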
Pass/Fail Notification:
Once an analysis is done, a report is sent to the SonarQube server to be integrated. At the end of this
integration, a standard web hook mechanism lets you notify any external system to do whatever you
want: trigger an alarm, update a wallboard, and notify a chat room.
As the first, and only, universal Artefact Repository Manager on the market, JFrog Artifactory fully
supports software packages created by any language or technology.
Artifactory is the only enterprise-ready repository manager available today, supporting secure,
clustered, High Availability Docker registries.
Integrating with all major CI/CD and DevOps tools, Artifactory provides an end-to-end, automated
and bullet-proof solution for tracking artefacts from development to production.
Chef
The concept of “Infrastructure as Code” has been widely adopted by most enterprise IT
organizations. Chef provides IT and DevOps with the tools they need to manage the different
environments they need to spin up. Through support for Chef Cookbook repositories, Artifactory
brings a new dimension to Infrastructure as Code. By managing configuration packages through a
binary repository, IT and DevOps organizations working hard on configuration management with
Chef now have many more capabilities at their fingertips.
Puppet
The concept of “Infrastructure as Code” has been widely adopted by most enterprise IT
organizations. Puppet provides IT and DevOps with the tools they need to manage the different
environments they need to spin up. Through support for Puppet repositories, Artifactory brings a
new dimension to Infrastructure as Code. By managing configuration packages through a binary
repository, IT and DevOps organizations working hard on configuration management with Puppet
now have many more capabilities at their fingertips.
Docker
Use Artifactory to manage your in-house Docker images. Distribute and share your images among
teams across your organization, whether on-site or at remote locations, just like using Docker Hub
Enterprise. Control access to your images using secure “docker pull”, and never have to rely on the
internet to access them. Once your images are stored in your repository, find them easily with smart
search.
Distribution Repository
Artifactory takes its integration with JFrog Bintray to the next step with Distribution Repositories
streaming liquid software from Artifactory to Bintray. Distribution repositories provide an easy way
to move artefacts from Artifactory to Bintray, for distribution to end users. As opposed to other
repositories in Artifactory, distribution repositories are not typed to a particular package format, but
rather, are governed by a set of rules that give fine-grained control over how to specify exactly
where an artefact in the distribution repository should be routed to in its corresponding repository in
Bintray.
Build Integration
Jenkins/Hudson, TeamCity and Bamboo
Stream your builds of liquid software into Artifactory from your favorite CI Server together with
exhaustive build environment information captured during deployment to enable fully reproducible
builds that continuously update computer systems and devices. Promote builds and use the build's
Bill of Materials to view deployed modules with their published artefacts and dependencies in all
scopes. See where specific artefacts are used and receive warnings when required build
dependencies are removed. Link back to the build information in the CI server and vice versa.
Currently, Jenkins, Hudson, JetBrains TeamCity and Atlassian Bamboo are supported.
Git LFS
Do you use Git for source code control? So do many others. But what about the binary assets that
go along with your source code? Git is not the best solution for that. “GitHub LFS,” you say? Well,
there is a better solution still. Artifactory is a fully-fledged Git LFS (Large File Storage) repository
and can optimize your workflow when working with large media files and other binary resources.
Artifactory fully supports the Git LFS API, so all you need to do is configure your Git client to
point to Artifactory as the Large File Storage repository.
One of the greatest advantages, especially in the Java world, is that development teams have the
freedom to choose and build modular environments by integrating the tools that they like, need and
that were adopted by their organization.
This is why our users build their projects using Maven, TeamCity, Ivy, Hudson, Gradle, Bamboo and, recently, Jenkins. The bottom line is that you should not be concerned about the ability of the tools to integrate while designing your CI stack! But things are changing, even more so now with the fork of Hudson CI (currently owned by Oracle), and Artifactory has come to be the only binary repository manager that gives you real freedom to choose. During the past year, JFrog's team has been developing a number of open source plug-ins for Artifactory users that are tightly integrated with the world's leading tools and vendors. The following comparison tables are more than just a list of features – they are what we envision when we think about software development tooling, and how we ensure that our users keep their freedom of choice.
Build tools integration
jcenter delivers libraries through a CDN, which means improvements in CI and developer builds.
Avoid trojan code
jcenter is the largest Java repository on earth, so whatever is available on Maven Central is available on jcenter as well.
It is incredibly easy to upload your own library to Bintray; there is no need to sign it or do any of the complex things you have to do on Maven Central.
Friendly UI
If you want to upload your library to Maven Central, you can do it easily with a single click on the Bintray site.
JFROG INSTALLATION
Go to www.jfrog.com/downloads-artifactory-pro/ and click on Artifactory Pro Standalone.
Before running the JFrog services, please ensure the essentials like Java, Git, Maven and Jenkins are installed. For convenience, I have written a script to install all the prerequisites.
#!/bin/bash
# NOTE: the 'if' branch was missing from the original listing; a minimal OS check is
# added here as an assumption so that the else/fi structure below remains runnable.
if [ -f /etc/redhat-release ]; then
    echo "RPM Based OS Detected - not covered in this snippet"
else
    echo "Debian Based OS Detected"
    sleep 3
    echo "Installing Java-JDK, Jenkins, Maven"
    sudo apt-get update
    sudo apt-get install openjdk-8-jdk -y
    sudo apt-get install openjdk-8-jre -y
    sudo apt-get install maven -y
    sudo apt-get install wget -y
    wget -q -O - https://pkg.jenkins.io/debian/jenkins-ci.org.key | sudo apt-key add -
    sudo sh -c 'echo deb http://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'
    sudo apt-get update -y
    sudo apt-get install jenkins -y
    sudo apt-get install git -y
    echo "Configuring services.... Please Wait"
    sleep 5
    sudo systemctl stop ufw
    sudo systemctl start jenkins
fi
After the installation of Java, Maven, Git and Jenkins, navigate to the below directory to start the JFrog services.
# cd Downloads/artifactory-pro-5.3.1/bin
# bash installservices.sh
# bash artifactory.sh
#cd /home
#git clone https://github.com/JfrogDev/project-examples.git
Now, in order to tell Maven where to copy the artefact after building the code, we need to edit the pom.xml of the code.
On the JFrog Artifactory home page press SET ME UP, copy the deployment snippet and paste it into your pom.xml file.
# cd /home/project-examples/artifactory-maven-plugin-example
# vi pom.xml
Change the credentials in the pom so that your build can copy the artefact to your repository.
Here we are creating a Maven settings file so that we can tell Maven to download dependencies from jcenter instead of Maven Central.
#cd /home/project-examples/artifactory-maven-plugin-example
# mvn deploy
I am installing JFrog Artifactory on the local machine, so the JFrog server will be my local machine.
Do the following: check your email, then copy and paste the link into the box and click Next.
Now we must set up our own Maven repository so that we can upload the artefacts from Jenkins.
Follow these steps to create the repository in JFrog.
Go to WELCOME, ADMIN at the top right side of the screen; a drop-down will appear as soon as you place the cursor over it.
Now select ==> Local Repository
==> The New Local Repository page opens, where we can enter a repository name of our choice and then click Save & Finish.
1. apt-get update -y
2. apt-get install openjdk-8-jdk -y
3. apt-get install jenkins -y
4. apt-get install git
5. apt-get install maven -y
1. Zentimestamp plugin
2. Artifactory Plugin
3. Maven Invoker plugin
4. GitHub Authentication plugin
Now let's clone the Git repo we created on GitHub:
git clone <url>
Now go to the path:
/home/project-examples/artifactory-maven-plugin-example
and move the files and directories to our Git directory:
==> mv -f * /home/jfrog-project1
Go to the JFrog URL and select the local repo which we have specified.
Now go to /home and create a directory named .m2:
==> mkdir .m2, then open a file named settings.xml
In JFrog, select our repo, click on Generate Settings, copy the settings and paste them into the settings.xml file.
Now copy the Git repository URL into the Jenkins Source Code Management (Git) section.
Continuous Integration is a mandatory procedure for setting up code delivery pipelines, and it is the first thing we should automate in the DevOps lifecycle.
Jenkins is the most famous CI tool.
Jenkins provides you with many features and can also be used for other purposes, such as deployments and cloud automation.
Jenkins integrates with almost every DevOps or developer tool in the market.
Later in the book we will see Ansible, Puppet and Docker; Jenkins integrates with these tools very nicely through its plugins.
I would say it is one of the most important tools in DevOps.