Devops Manual

The document outlines the Software Development Life Cycle (SDLC) and Agile methodologies, detailing stages such as requirement gathering, design, implementation, testing, deployment, and maintenance. It emphasizes the importance of Agile principles like iterative development, customer collaboration, and adaptability, along with various Agile frameworks such as Scrum and Extreme Programming (XP). Additionally, it introduces the DevOps lifecycle, which integrates development and operations for continuous delivery and improvement of software.


Exercise 1:

Reference course name: Software Engineering and Agile Software Development


Get an understanding of the stages in the software development life cycle, the process
models, the values and principles of agility, and the need for agile software development.
This will enable you to work in projects following an agile approach to software
development.
Solve the questions given in the reference course to gauge your understanding of
the topic.

SDLC

The Software Development Life Cycle (SDLC) is a structured process used for designing,
developing, testing, and deploying software. It ensures software is high-quality, meets user
requirements, and is delivered within time and budget constraints. Here’s an overview of the
typical stages in the SDLC:
1. Requirement Gathering and Analysis
 Purpose: Understand what the client or end-users need from the software.
 Activities:
o Stakeholder interviews, surveys, and meetings.
o Documenting functional and non-functional requirements.
 Outcome: A detailed Software Requirements Specification (SRS) document outlining
what the software should do.

2. Feasibility Study (Planning)


 Purpose: Analyze whether the project is financially, technically, and operationally
feasible.
 Activities:
o Evaluate costs, resources, and timelines.
o Assess risks and establish a high-level plan.
 Outcome: A feasibility report and high-level project plan.

3. System Design
 Purpose: Create a blueprint for the software’s architecture.
 Activities:
o Design both high-level architecture (database structure, technology stack) and
detailed designs (UI/UX design, API endpoints).
 Outcome: Detailed design documents, often including UI mockups, ER diagrams, and
data flow diagrams.

4. Implementation (Coding)
 Purpose: Actual development or coding of the software.
 Activities:
o Programmers write code according to the design documents.
o Code is typically written in small, iterative phases, especially if following Agile
or similar methodologies.
 Outcome: The first working version of the software (also known as alpha or beta
versions).

5. Testing
 Purpose: Ensure the software functions as expected, is bug-free, and meets the original
requirements.
 Activities:
o Unit Testing, Integration Testing, System Testing, and User Acceptance Testing
(UAT).
o Identify and fix bugs, performance issues, or gaps in functionality.
 Outcome: A stable version of the software that’s ready for deployment.

6. Deployment
 Purpose: Release the software to a live environment for use.
 Activities:
o Deploy the software to production servers.
o Provide necessary documentation and training to end users.
o Monitor the system post-deployment for any immediate issues.
 Outcome: The software is now live and in use by the end users.
7. Maintenance
 Purpose: Ensure the software remains functional over time.
 Activities:
o Fix bugs that weren’t caught during testing.
o Implement updates, upgrades, and improvements.
o Perform regular maintenance like optimizing databases, applying security patches,
etc.
 Outcome: Ongoing support and updates to keep the software running smoothly.

Common SDLC Models:


There are different models for implementing the SDLC, depending on the project’s nature,
scope, and constraints. Some of the popular ones include:
1. Waterfall Model: A linear approach where each phase must be completed before the
next one begins.
2. Agile Model: An iterative and incremental approach that emphasizes flexibility and
customer feedback.
3. V-Model: An extension of the waterfall model that emphasizes testing at each stage of
development.
4. Iterative Model: Focuses on repeating the development cycle to refine and build the
software incrementally.
5. Spiral Model: Combines iterative development with risk management to ensure risks are
identified and handled at each stage.

SDLC Benefits:
 Structured Approach: Provides a clear process with defined stages, ensuring
consistency and quality.
 Risk Management: Early identification of risks and issues through analysis and
planning.
 Cost Efficiency: Better planning and design help avoid costly fixes and scope creep.
 Customer Satisfaction: Involving stakeholders in the early stages ensures that the final
product meets their needs.
Exercise 2:

Reference course name: Development & Testing with Agile: Extreme Programming
Get a working knowledge of using extreme automation through XP programming
practices of test first development, refactoring and automating test case writing.
Solve the questions in the “Take test” module given in the reference course to
gauge your understanding of the topic.

AGILE METHODOLOGY
Agile Methodology is a flexible, iterative approach to software development that emphasizes
collaboration, customer feedback, and small, rapid releases. Unlike traditional development
models like Waterfall, Agile breaks down projects into smaller, manageable units called
"iterations" or "sprints," allowing for continuous improvement and adaptation throughout the
development process.
Key Concepts of Agile:
1. Iterative Development: Projects are divided into smaller parts, which are developed,
tested, and delivered in iterations (typically 1-4 weeks).
2. Customer Collaboration: Constant feedback from customers ensures the product meets
their needs.
3. Adaptability: Agile embraces changing requirements, even late in the development
process.
4. Cross-functional Teams: Agile teams include members from different disciplines, like
developers, testers, and business analysts, working together in close collaboration.
5. Continuous Feedback: Frequent reviews and feedback help ensure the project remains
on track and is adjusted as needed.

Core Values of Agile (According to the Agile Manifesto):


1. Individuals and interactions over processes and tools.
2. Working software over comprehensive documentation.
3. Customer collaboration over contract negotiation.
4. Responding to change over following a plan.
Key Agile Frameworks/Methodologies:
There are several frameworks within the Agile methodology that teams can adopt, depending on
their project and organizational needs:
1. Scrum
 Scrum is the most widely used Agile framework.
 Projects are broken down into "sprints" (time-boxed iterations, usually 1-4 weeks).
 Teams hold daily "stand-ups" or daily meetings to assess progress.
 After each sprint, there’s a Sprint Review and Sprint Retrospective to reflect on what
went well and what can be improved.
 Roles in Scrum:
o Product Owner: Represents the stakeholders and the voice of the customer.
o Scrum Master: Facilitates the process, removes roadblocks, and ensures Agile
principles are followed.
o Development Team: A self-organized team of professionals working to deliver
increments of the product.
2. Kanban
 Kanban focuses on visualizing the work process and managing flow.
 Work items are represented on a board (physical or digital) with columns that represent
the stages of the development process (e.g., "To Do," "In Progress," "Done").
 There are no prescribed time frames (like Scrum sprints), but the focus is on limiting
work in progress (WIP) to improve flow.
 Key principles:
o Visualize work.
o Limit WIP.
o Manage flow.
o Make process policies explicit.
3. Extreme Programming (XP)
 XP is a more technical Agile framework focusing on improving software quality.
 Practices include Test-Driven Development (TDD), pair programming, and
continuous integration.
 Encourages frequent releases in short development cycles to improve productivity and
introduce checkpoints for customer requirements.
4. Lean Development
 Inspired by Lean Manufacturing, Lean in Agile focuses on minimizing waste and
maximizing value.
 It encourages continuous delivery, reducing bottlenecks, and improving team efficiency.

Agile Practices:
Agile teams use specific practices to promote collaboration, flexibility, and quality:
1. User Stories: Simple, informal descriptions of features told from the perspective of the
user.
o Example: “As a [user], I want [feature] so that [benefit].”
2. Backlog Grooming (Refinement): Regularly reviewing the product backlog to ensure it
is up-to-date and prioritized.
3. Daily Standups (Daily Scrum): A short, daily meeting (usually 15 minutes) where team
members discuss:
o What they did yesterday.
o What they plan to do today.
o Any blockers or impediments.
4. Sprint Planning: A meeting at the start of each sprint to plan what will be delivered and
how the team will approach the work.
5. Sprint Review: At the end of the sprint, the team demonstrates the completed work to
stakeholders and gathers feedback.
6. Sprint Retrospective: A meeting after each sprint to reflect on the process and find ways
to improve in the next sprint.
7. Burndown Chart: A visual representation of the work remaining in the sprint. It tracks
progress over time.
Agile Roles:
1. Product Owner: Owns the product backlog, prioritizes user stories, and ensures that the
team is working on the most valuable features.
2. Scrum Master (in Scrum): Ensures the Agile process is being followed, facilitates
communication, and removes obstacles.
3. Development Team: A group of cross-functional members who develop the product,
including developers, testers, designers, etc.

Benefits of Agile:
 Faster Delivery: Regular iterations mean that features can be released frequently and
users see value sooner.
 Flexibility: Agile welcomes changes, making it easier to adapt to changing customer
needs.
 Customer Satisfaction: Continuous customer feedback ensures that the product aligns
with their expectations.
 Improved Quality: Testing is integrated throughout the development process, helping
catch defects early.
 Transparency: Frequent updates and reviews keep everyone on the same page about
project progress.

Challenges of Agile:
 Changing Requirements: Constant changes may overwhelm the team if not managed
properly.
 Team Collaboration: Agile demands a high level of communication and collaboration,
which may be difficult for distributed teams.
 Client Involvement: Agile requires constant involvement from stakeholders, which may
not always be feasible.
When to Use Agile:
 Projects with changing requirements: Agile works best in environments where
requirements evolve over time.
 Customer-centered development: If customer feedback is crucial and expected
throughout the project, Agile provides the flexibility needed.
 Fast-moving projects: When rapid delivery of software is a priority, Agile’s iterative
nature helps get early versions out quickly.
Exercise 3:
DEVOPS LIFE CYCLE
The DevOps lifecycle refers to the continuous and iterative process that combines development
(Dev) and operations (Ops) practices to accelerate the delivery of high-quality software. It
integrates development, testing, deployment, and monitoring processes, allowing for continuous
feedback and improvements. The key objective of DevOps is to bridge the gap between
development and operations teams, enabling more frequent, reliable software releases.

Stages of the DevOps Lifecycle:


1. Planning
o Purpose: Define the scope and objectives of the project.
o Activities:
 Gathering requirements from stakeholders.
 Identifying features and priorities.
 Using tools like Jira, Trello, or Azure DevOps for task tracking.
o Outcome: A project plan and roadmap, often represented in the form of user
stories, features, or a backlog.
2. Development (Coding)
o Purpose: Write and build the code.
o Activities:
 Writing source code for new features or bug fixes.
 Version control using tools like Git, GitHub, or GitLab.
 Teams follow best coding practices and keep the codebase modular.
 Use branches and pull requests to ensure code quality.
o Outcome: A working, versioned codebase.

3. Continuous Integration (CI)


o Purpose: Automatically integrate and test changes to ensure the code works well
with the existing codebase.
o Activities:
 Developers commit code to the version control system frequently (daily or
even multiple times a day).
 The code is automatically built and tested using CI tools like Jenkins,
Travis CI, or CircleCI.
 Static code analysis, unit tests, and integration tests are executed during
the build process.
o Outcome: A tested and validated build.

4. Continuous Testing
o Purpose: Automate and execute various types of tests to ensure the software
works as expected.
o Activities:
 Automated tests, such as unit tests, integration tests, performance tests,
and security tests, are triggered with every code change.
 Tools like Selenium, JUnit, TestNG, or JMeter are commonly used.
 Automated test results provide feedback to developers in real-time,
allowing them to fix issues before deployment.
o Outcome: Verified and bug-free code ready for deployment.
5. Continuous Deployment (CD)
o Purpose: Automatically deploy the tested code to production or staging
environments.
o Activities:
 The code is deployed using automated pipelines that move it from testing
environments to production.
 Tools like Jenkins, GitLab CI/CD, AWS CodePipeline, or Kubernetes
handle deployment processes.
 Microservices architectures often use Docker and Kubernetes to manage
containerized applications for consistency across environments.
o Outcome: The code is continuously and automatically released into production or
staging environments.

6. Monitoring and Logging


o Purpose: Continuously monitor the software and infrastructure in production to
detect issues, performance bottlenecks, and user behavior.
o Activities:
 Using monitoring tools like Prometheus, Nagios, ELK Stack
(Elasticsearch, Logstash, Kibana), Grafana, or New Relic.
 Logging user interactions and system performance to detect and respond
to problems quickly.
 Set up alerting mechanisms to notify teams of anomalies or system
failures.
o Outcome: Real-time monitoring provides insights into system health,
performance, and usage patterns, ensuring stability and performance.

7. Continuous Feedback
o Purpose: Use feedback from users, system monitoring, and development teams to
improve the software.
o Activities:
 Collecting feedback from end-users, stakeholders, and automated
monitoring systems.
 Teams analyze logs and user metrics to identify areas of improvement.
 Applying feedback in the planning stage to continuously improve future
releases.
o Outcome: Better alignment with user needs, improved product performance, and
more efficient workflows.

8. Continuous Operations
o Purpose: Ensure the system runs smoothly 24/7 with little to no downtime.
o Activities:
 Automating infrastructure scaling, backup, and disaster recovery.
 Utilizing cloud services like AWS, Azure, or Google Cloud to ensure
high availability and scalability.
 Automating system patches, updates, and maintenance tasks to minimize
manual intervention.
o Outcome: A self-sustaining system that is reliable, scalable, and able to handle
high traffic and demand.
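The CI/CD stages above are commonly wired together in a pipeline definition. Below is a minimal sketch as a hypothetical Jenkins declarative pipeline; the stage names, Gradle commands, and deploy script are assumptions for illustration, not part of the original exercise:

```groovy
// Hypothetical Jenkinsfile: one possible shape for the CI/CD stages described above.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh './gradlew assemble' }   // compile and package the code
        }
        stage('Test') {
            steps { sh './gradlew test' }       // run automated tests on every change
        }
        stage('Deploy') {
            when { branch 'master' }            // deploy only from the main branch
            steps { sh './deploy.sh staging' }  // hypothetical deployment script
        }
    }
}
```

A failure in any stage stops the pipeline, which is exactly the continuous-feedback behavior the stages above describe.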

Key Concepts in DevOps:


1. Infrastructure as Code (IaC):
o Managing and provisioning infrastructure using code (scripts and templates)
rather than manual processes. Tools like Terraform, Ansible, and AWS
CloudFormation are often used to define infrastructure.
2. Microservices:
o Breaking down a large monolithic application into smaller, independent services
that can be developed, deployed, and scaled independently. Each service focuses
on a specific function.
3. Containerization:
o Packaging applications and their dependencies into lightweight containers using
tools like Docker. This ensures the software works consistently across different
environments.
4. Automation:
o Automation is a core part of DevOps, from infrastructure provisioning to code
deployment and testing. The goal is to reduce manual effort, human error, and
ensure faster releases.
5. Collaboration and Communication:
o DevOps emphasizes a culture of collaboration between development and
operations teams, supported by tools like Slack, Microsoft Teams, or JIRA to
ensure smooth communication across departments.
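The containerization concept above typically starts from a Dockerfile. A minimal hypothetical sketch for a Java service follows; the base image tag, jar path, and port are assumptions for illustration:

```dockerfile
# Hypothetical Dockerfile: packages a Java application together with its runtime
# so it behaves the same in every environment.
FROM eclipse-temurin:17-jre
WORKDIR /app
COPY build/libs/app.jar app.jar
EXPOSE 8080
CMD ["java", "-jar", "app.jar"]
```

Building this image (`docker build -t demo-app .`) bundles the application and its Java runtime into one artifact that runs identically on a developer laptop or a production cluster.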

DevOps Tools:
 Version Control: Git, GitHub, GitLab, Bitbucket.
 CI/CD: Jenkins, GitLab CI/CD, Travis CI, CircleCI.
 Configuration Management: Ansible, Puppet, Chef.
 Containerization: Docker, Kubernetes.
 Monitoring: Prometheus, Nagios, ELK Stack (Elasticsearch, Logstash, Kibana),
Grafana.
 Cloud Platforms: AWS, Azure, Google Cloud.

Benefits of DevOps:
 Faster Time to Market: Shorter development cycles and faster releases ensure quick
delivery of new features.
 Higher Quality: Continuous testing and integration lead to higher-quality software with
fewer bugs.
 Increased Collaboration: DevOps fosters a culture of collaboration between
development and operations teams, leading to better alignment and communication.
 Scalability: Automated infrastructure and containerized applications make scaling easier.
 Better Customer Experience: Continuous feedback loops allow quick adjustments to
meet user needs.
Challenges in DevOps:
 Cultural Shift: Adopting DevOps requires significant cultural changes, where
development and operations teams work closely together, which may be hard in
traditional environments.
 Tool Complexity: Managing the different tools and technologies in the DevOps
ecosystem can be complex.
 Security: In a rapid, automated release process, ensuring security at every step of the
pipeline (DevSecOps) becomes critical.
Exercise 4

Module name: Implementation of CI/CD with Java and open source stack
Configure the web application and version control using Git commands and
version control operations.

Version Control System

A Version Control System (VCS) is software that helps software developers
work together and maintain a complete history of their work.

Listed below are the functions of a VCS:

• Allows developers to work simultaneously.
• Does not allow overwriting of each other’s changes.
• Maintains a history of every version.

New version = Old version + changes

V1= abcd (old version)

V2= abcdef (new version)

V2= V1+ef
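The “new version = old version + changes” idea can be demonstrated with plain shell tools. A minimal sketch, assuming `diff` and `patch` are available (the file names are arbitrary):

```shell
# Demonstrating "new version = old version + changes" with diff and patch.
printf 'abcd\n' > v1.txt                     # V1, the old version
printf 'abcdef\n' > v2.txt                   # V2, the new version
diff v1.txt v2.txt > changes.txt || true     # record only the changes (diff exits 1 when files differ)
cp v1.txt rebuilt.txt
patch -s rebuilt.txt changes.txt             # apply the changes: V1 + changes
cmp -s rebuilt.txt v2.txt && echo identical  # the rebuilt version matches V2
```

This mirrors how a delta-based VCS reconstructs any version from a base version plus the recorded changes, rather than storing every version in full.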

Following are the types of VCS:

• Centralized version control system (CVCS).

• Distributed/decentralized version control system (DVCS).

• A centralized version control system (CVCS) uses a central server to
store all files and enables team collaboration. The major drawback of a
CVCS is its single point of failure: the central server. If the central
server goes down for an hour, then during that hour no one can
collaborate at all. In the worst case, if the disk of the central server
gets corrupted and a proper backup has not been taken, the entire history
of the project is lost. This is where a distributed version control
system (DVCS) comes into the picture.

Ex: Subversion (SVN)


• DVCS clients not only check out the latest snapshot of the directory,
they also fully mirror the repository. If the server goes down, the
repository from any client can be copied back to the server to restore
it. Every checkout is a full backup of the repository. Git does not rely
on a central server, which is why you can perform many operations while
offline: committing changes, creating branches, viewing logs, and more.
You need a network connection only to publish your changes and fetch the
latest changes.

Ex: Git

Centralized VCS                          Distributed VCS
--------------------------------------   --------------------------------------
Has only a remote repository             Has both a local and a remote
                                         repository
Developers must connect to the           No need to connect to the internet
internet to work                         for every operation
Holds a single copy of the code on       Also holds past versions locally
the server
Work directly in the central repo        Work directly in the local repo
Ex: SVN                                  Ex: Git


Git lifecycle:
Step -1 : Update the EC2 Instance

sudo apt-get update (if the instance is Ubuntu Linux)

sudo yum update (if the instance is Amazon Linux)

Step -2 : Install Git in the EC2 Instance

sudo apt-get install git -y (if the instance is Ubuntu Linux)

sudo yum install git -y (if the instance is Amazon Linux)

Step -3 : Create the directory for a project

mkdir project

Step -4 : change the directory to the project

cd project

step – 5: Initialize a Git Repository in the project repository

git init

step – 6 : create and edit files

creation of files:
==================

sudo touch param (creates an empty file)

sudo nano pavan (creates and opens the file in the nano editor)

sudo vi sarada (creates and opens the file in the vi/vim editor)

EDIT:
===========

Files can be edited with nano or vi.

for nano:
==============
sudo nano mounika

Ctrl + O ---- write (save) the content to the file (Ctrl + S also saves in newer nano versions)

Ctrl + X ---- exit the editor

for vi :
===========
sudo vi sarada

press "i" to enter insert mode

press the Escape key ("Esc") to come out of insert mode

type ":wq" to save and exit

(w writes (saves) the content; q quits the editor)

DIRECTORIES (FOLDER):
======================

sudo mkdir param

Step – 7: Check the Status of Files

See the status of your working directory (modified, staged, or untracked files).

git status

Step – 8: Add Files to Staging

Add files to the staging area (files ready for commit).

git add param pavan

To add all files:

git add .

Step – 9: Commit Changes

Commit the staged changes with a message.

git commit -m "commit message"


once we commit, the default branch is created, i.e. the master branch
Step – 10 : Create branches
git branch param
Check out to the param branch:
git checkout param
Step – 11: Create a branch and check it out in one step
git checkout -b param
Step – 12 : To check the list of branches
git branch
Step – 13 : Central repository work (GitHub)
Create a repository in GitHub:
Click on the "New" option.
Provide the required repository name (ex: project).

Scroll down and click "Create repository".


The repository is then available in the dashboard.

Step -14: Push the files from the local repository (Git) to the central repository (GitHub)

We can check whether a connection between Git and GitHub is available:

git remote -v

If fetch and push URLs are listed, the connection is available.

If they are not listed, the connection is not available.

ESTABLISH CONNECTION BETWEEN GIT AND GITHUB:

git remote add origin "github repository url"

EX: git remote add origin https://github.com/Paramesh4271/rvit.git

Now push the files from the local repository (Git) to the central repository (GitHub):

git push origin "branch name"

EX: git push origin master

Step -15: Pull the files from the central repository to the local repository

git fetch -- checks for changes in the central repository without merging them.

git pull origin "branch name" -- fetches and merges the changes.

ex: git pull origin rvit
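Steps 3 through 12 above can be run end to end as one script. A minimal sketch, assuming Git is installed and starting in an empty directory; the demo identity and commit message are invented, and the file and branch names follow the exercise:

```shell
#!/bin/sh
# Sketch of the local Git workflow from Steps 3-12 (assumes git is installed).
set -e
mkdir project && cd project                # Steps 3-4: create and enter the project directory
git init -q                                # Step 5: initialize a Git repository
git config user.email "demo@example.com"   # hypothetical identity, required before committing
git config user.name "Demo User"
touch param pavan                          # Step 6: create files
git status --short                         # Step 7: lists param and pavan as untracked (??)
git add param pavan                        # Step 8: stage the files
git commit -q -m "first commit"            # Step 9: the first commit creates the default branch
git checkout -q -b param                   # Step 11: create branch 'param' and switch to it
git branch                                 # Step 12: lists branches; '*' marks the current one
```

Pushing to GitHub (Steps 13-15) then only needs `git remote add origin <url>` followed by `git push origin <branch>`.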


Exercise 5:

Configure a static code analyzer which will perform static analysis of the web
application code and identify the coding practices that are not appropriate. Configure
the profiles and dashboard of the static code analysis tool.

Procedure:
Static code analysis: it helps us ensure overall code quality, fix bugs in the
early stages of development, and ensure that every developer uses the same
coding standards when writing code.

There are three basic tools that we are going to use for our static code
analysis: CheckStyle, Findbugs, PMD.

CheckStyle
CheckStyle is a tool that helps programmers write code that aligns with already agreed
upon coding standards. It automatically checks if the code adheres to the coding
standards used on a project.

FindBugs
Fine people of the University of Maryland built this fantastic tool for us. What it
basically does is, of course, find bugs in our code. FindBugs analyzes our code
and generates a report listing all the bugs that could cause a program to
misbehave. Good examples of the bugs that can be detected are infinite
loops, unused variables, security issues, threading issues, and many more.

PMD
PMD is another useful tool in our static code analysis toolbox. Besides reporting
many coding issues (e.g. possible memory leaks), PMD can check whether our code
is commented properly, whether our variables are named properly, and whether a
method contains more than a specified number of lines. PMD is highly configurable,
and the latest releases play quite well with Lombok annotations; previously, we
needed to define custom exclusion rules for PMD to play nice with Lombok.

Copy/Paste Detector (CPD) is an integrated part of PMD and is used to detect
duplicated code in a code base.

Setting Up Static Code Analysis


We are going to use Gradle as our build tool. Gradle already has a set of plugins that
we are going to use for our static code analysis.
Static analysis setup files will reside in ./gradle/static-code-analysis folder. Each of our
static analysis tools will have its own dedicated folder where we will hold additional
configuration files for each of the tools. Finally, we’ll use
staticCodeAnalysis.gradle to aggregate all our static code analysis settings. It will
contain all the configuration settings needed to run static code analysis for our project.

buildscript {

    repositories {
        mavenCentral()
    }

    dependencies {
        classpath 'de.aaschmid:gradle-cpd-plugin:1.1'
    }
}

apply plugin: 'checkstyle'

apply plugin: 'findbugs'

apply plugin: 'pmd'

apply plugin: de.aaschmid.gradle.plugins.cpd.CpdPlugin


Ok, let’s jump into setting the static code analysis.

Once the plugins are included in the build script, we can start configuring each of the
plugins. First, we are going to configure the CheckStyle plugin.

Setting CheckStyle
For CheckStyle, we are going to set the ignoreFailures flag, toolVersion, and
configFile, which points to the location of a configuration file. As a base for our
configuration file, we are going to use the Google Code Style settings. The configurations
are basically the same; the only difference is that we are going to use four-space
instead of two-space indentation. And that's it. Nothing more needs to be done for
CheckStyle. Let's set up FindBugs next:

checkstyle {

    toolVersion = '8.12'

    ignoreFailures = false

    configFile = file("${rootGradleDir}/static-code-analysis/checkstyle/checkstyle.xml")
}

We are explicitly setting the plugin not to ignore failures, i.e. the ignoreFailures
flag is set to false. This basically means that our project build will fail if we run
into any issue during the static code analysis check. If we think about it, this makes
a lot of sense: our CI/CD pipeline should fail if we run into any issue in our code
base. Be it a compile failure, a unit test failure, or a code analysis failure, as long
as we have an issue, we shouldn't be able to continue with the pipeline.

Setting FindBugs
In most cases, we should specify only toolVersion and ignoreFailures. There are other
options we could set here, such as specifying which bug detectors are going to be run
or to include/exclude lists of files that are going to be checked in our code base. For our
example, we will leave the default values here: all default bug detectors will be run, and
we are not going to exclude any file from FindBugs detection.

findbugs {

toolVersion = '3.0.1'

ignoreFailures = false

}
Setting PMD
For PMD, besides toolVersion and ignoreFailures, we are going to set the rule sets for
our code analysis. We can set the rule sets in two ways: specify them directly inside
the PMD plugin configuration using the ruleSets array, or extract the rule sets into a
separate XML file and reference that file using the ruleSetFiles configuration
parameter. We are going to choose the latter option, since it is more descriptive and
allows us to provide exclusions to the default rule set categories. For the codestyle
category, we are excluding the DefaultPackage and OnlyOneReturn rules. Check
ruleset.xml for the full setup.

pmd {

    toolVersion = '6.7.0'

    ignoreFailures = false

    ruleSetFiles = files("${rootGradleDir}/static-code-analysis/pmd/ruleset.xml")

    ruleSets = []

    rulePriority = 3
}

Setting CPD
For Copy/Paste bug detection, we need to configure the CPD plugin. First, let's set
minimumTokenCount to 100. This means the plugin will report a duplicate-code bug
if it finds roughly 5-10 lines of the same code in separate places; if only four lines
of code match, the bug will not be reported. One useful option, especially if
we are using frameworks, is to set ignoreAnnotations to true. It allows us
to ignore "false positives" where classes or methods have the same 5-6 lines of
annotations. Finally, we'll enable XML report generation by setting xml.enabled
to true.

cpd {

    language = 'java'

    toolVersion = '6.0.0'

    minimumTokenCount = 100 // approximately 5-10 lines
}

cpdCheck {

    reports {
        text.enabled = false
        xml.enabled = true
    }

    ignoreAnnotations = true

    source = sourceSets.main.allJava
}

For remaining static analysis report plugins, we will enable generation of the HTML
report instead of XML one.

tasks.withType(Checkstyle) {
    reports {
        xml.enabled false
        html.enabled true
    }
}

tasks.withType(FindBugs) {
    reports {
        xml.enabled false
        html.enabled true
    }
}

tasks.withType(Pmd) {
    reports {
        xml.enabled false
        html.enabled true
    }
}

Great! We are done with the static analysis code configuration. Now, we just need to
include staticCodeAnalysis.gradle into our Gradle build script:

apply from: "${rootGradleDir}/staticCodeAnalysis.gradle"

Running Static Code Analysis


Static code analysis plugins will run with the same Java version used to run Gradle.

Each plugin adds its own dependencies to the Java plugin's check task (e.g. pmdMain,
cpdCheck). Whenever we run ./gradlew clean build, the check task is triggered
internally and the static analysis steps run for our project. If any of the code
analysis steps fail, the build fails as well. Static code analysis reports are
generated under ./build/reports.

If in some situations we need to loosen the specified static code rules, we can always
suppress static analysis errors using the @SuppressWarnings annotation. For example,
to suppress the warning for having too many methods in a class, we can
put @SuppressWarnings("PMD.TooManyMethods") on the given class.

We advise keeping static analysis “on” for the test classes as well. We should always
treat tests as an integrated part of our project. Test code should conform to the same
styles/rules we use throughout our project.
Exercise 6:

AIM:
Module Name: Implementation of CICD with Java and open source stack

Write a build script to build the application using a build automation tool like Maven.
Create a folder structure that will run the build script and invoke the various software
development build stages. This script should invoke the static analysis tool and unit
test cases and deploy the application to a web application server like Tomcat.

Procedure:

What is Maven

Maven is a powerful project management tool based on the POM (project object
model). It is used for project builds, dependency management and documentation. It simplifies
the build process like ANT, but it is far more advanced than ANT.
In short, Maven is a tool that can be used for building and
managing any Java-based project. Maven makes the day-to-day work of Java
developers easier and generally helps with the comprehension of any Java-based
project.

What maven does

Maven does a lot of helpful tasks, like:


1. We can easily build a project using Maven.
2. We can easily add jars and other dependencies of the project with the help
of Maven.
3. Maven provides project information (log documents, dependency list, unit test
reports etc.).
4. Maven is very helpful for a project while updating the central repository of JARs
and other dependencies.
5. With the help of Maven we can build any number of projects into output
types like JAR, WAR etc. without doing any scripting.
6. Using Maven we can easily integrate our project with a source control system
(such as Subversion or Git).
How does Maven work?

Core Concepts of Maven:


1. POM Files: Project Object Model (POM) files are XML files that contain
information about the project and configuration details such as
dependencies, source directory, plugins, goals etc. used by Maven to build the
project. When you execute a Maven command you give Maven a POM
file to execute the command against. Maven reads the pom.xml file to accomplish its
configuration and operations.
2. Dependencies and Repositories: Dependencies are external Java libraries
required by the project, and repositories are directories of packaged JAR files. The
local repository is just a directory on your machine's hard drive. If a
dependency is not found in the local Maven repository, Maven downloads
it from a central Maven repository and puts it in your local repository.
3. Build Life Cycles, Phases and Goals: A build life cycle consists of a
sequence of build phases, and each build phase consists of a sequence of goals.
A Maven command is the name of a build life cycle, phase or goal. If a life cycle is
requested by a Maven command, all build phases in that life cycle
are executed. If a build phase is requested, all build phases before
it in the defined sequence are executed too.
4. Build Profiles: Build profiles are sets of configuration values which allow you
to build your project using different configurations. For example, you may need
to build your project for your local computer, for development and for test. To
enable different builds you can add different build profiles to your POM file
using its profiles element; they can be triggered in a variety of ways.
5. Build Plugins: Build plugins are used to perform specific goals. You can add a
plugin to the POM file. Maven has some standard plugins you can use, and you can
also implement your own in Java.
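The phase ordering described in point 3 can be illustrated with a small sketch: requesting one phase runs every phase that precedes it in the default life cycle (the phase list here is abbreviated to a few well-known phases):

```shell
# Toy model of Maven's default life cycle: requesting "package"
# also runs validate, compile and test first, but not install or deploy.
phases="validate compile test package install deploy"
requested="package"

> phases_run.txt
for p in $phases; do
  echo "$p" >> phases_run.txt      # this phase executes
  [ "$p" = "$requested" ] && break # stop at the requested phase
done

cat phases_run.txt
```

Running `mvn package` against a real project behaves the same way: install and deploy are skipped because they come after the requested phase.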

Installation process of Maven

The installation of Maven includes the following steps:


1. Verify that your system has Java installed; if not, install Java.
2. Check whether the JAVA_HOME environment variable is set; if not, set it.
3. Download Maven.
4. Unpack the Maven zip anywhere on your system.
5. Add the bin directory of the created directory apache-maven-3.5.3 (this depends
on your installed version) to the PATH environment variable and system
variables.
6. Open cmd and run the mvn -v command. If it prints output like the following, the
installation is complete.

Apache Maven 3.5.3 (3383c37e1f9e9b3bc3df5050c29c8aff9f295297; 2018-02-25T01:19:05+05:30)
Maven home: C:\apache-maven-3.5.3\bin\..
Java version: 1.8.0_151, vendor: Oracle Corporation
Java home: C:\Program Files\Java\jdk1.8.0_151\jre
Default locale: en_US, platform encoding: Cp1252
OS name: "windows 10", version: "10.0", arch: "amd64", family: "windows"

Maven pom.xml file

POM (Project Object Model) is the key to operating Maven. Maven reads the pom.xml
file to accomplish its configuration and operations. It is an XML file that contains
information about the project and configuration details such
as dependencies, source directory, plugins and goals used by Maven to build the
project.
A sample pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">

    <modelVersion>4.0.0</modelVersion>
    <groupId>com.project.loggerapi</groupId>
    <artifactId>LoggerApi</artifactId>
    <version>0.0.1-SNAPSHOT</version>

    <!-- Add typical dependencies for a web application -->
    <dependencies>
        <dependency>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-api</artifactId>
            <version>2.11.0</version>
        </dependency>
    </dependencies>
</project>

Elements used for Creating pom.xml file


1. project- It is the root element of the pom.xml file.
2. modelVersion- The version of the POM model you are using. Use version
4.0.0 for Maven 2 and Maven 3.
3. groupId- The id of the project's group. It is unique, and most
often you will use a group id similar to the root Java package name of
the project, like the groupId com.project.loggerapi we used.
4. artifactId- The name of the project you are building. In our
example, the name of our project is LoggerApi.
5. version- Contains the version number of the project. If your
project has been released in different versions, it is useful to give the version of
your project.
Other elements of the pom.xml file:
1. dependencies- Defines the list of dependencies of the project.
2. dependency- Defines a dependency and is used inside the
dependencies tag. Each dependency is described by its groupId, artifactId and
version.
3. name- Used to give a name to our Maven project.
4. scope- Defines the scope of a dependency; it can be
compile, runtime, test, provided, system etc.
5. packaging- Used to package our project into an output
type like JAR, WAR etc.
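Putting the additional elements together, a sketch of a pom.xml that also uses name, packaging and a dependency scope could look like this (the artifact coordinates reuse the earlier example; the JUnit dependency is illustrative). The snippet writes the file and confirms the new elements are present:

```shell
# Sketch: pom.xml using name, packaging and a test-scoped dependency.
cat > pom.xml <<'EOF'
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.project.loggerapi</groupId>
    <artifactId>LoggerApi</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>Logger API</name>
    <packaging>war</packaging>   <!-- build a WAR instead of the default JAR -->
    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.12</version>
            <scope>test</scope>  <!-- only on the test classpath -->
        </dependency>
    </dependencies>
</project>
EOF

grep -n '<packaging>' pom.xml
```

With packaging set to war, `mvn package` would produce a war file instead of a jar.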

Maven Repository

Maven repositories are directories of packaged JAR files with some metadata. The
metadata consists of POM files for the projects each packaged JAR file belongs to,
including what external dependencies each packaged JAR has. This metadata
enables Maven to download dependencies of your dependencies recursively until all
dependencies are downloaded and put onto your local machine.
Maven has three types of repository:
1. Local repository
2. Central repository
3. Remote repository
Maven searches for dependencies in these repositories: first in the local
repository, then in the central repository, then in the remote repository if a remote
repository is specified in the POM.

1. Local repository- A local repository is a directory on the machine of the
developer. This repository contains all the dependencies Maven downloads.
Maven only needs to download a dependency once, even if multiple projects
depend on it.
By default, the Maven local repository is the user_home/.m2 directory.
Example - C:\Users\asingh\.m2
2. Central repository- The central Maven repository is created by the Maven
community. Maven looks in this central repository for any dependencies needed
but not found in your local repository. Maven then downloads these
dependencies into your local repository.
3. Remote repository- A remote repository is a repository on a web server from
which Maven can download dependencies. It is often used for hosting projects
internal to an organization. Maven downloads these dependencies into your
local repository as well.
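The search order can be sketched with plain directories standing in for the three repository types (directory and artifact names are illustrative only):

```shell
# Toy model of dependency resolution: local -> central -> remote.
mkdir -p repo-local repo-central repo-remote
# Pretend the artifact is only published in the central repository.
echo "jar bytes" > repo-central/log4j-api-2.11.0.jar

for repo in repo-local repo-central repo-remote; do
  if [ -f "$repo/log4j-api-2.11.0.jar" ]; then
    echo "resolved from $repo"
    # Like Maven, cache the download in the local repository.
    cp "$repo/log4j-api-2.11.0.jar" repo-local/
    break
  fi
done
```

The second time the dependency is needed it is found in repo-local immediately, which is exactly why Maven only downloads each artifact once.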

Practical Application Of Maven

When working on a Java project that contains a lot of dependencies, builds and
requirements, handling all those things manually is very difficult and
tiresome, so a tool that can do this work is very helpful.
Maven is such a build management tool: it can add dependencies, manage the
classpath of the project, generate war and jar files automatically, and do many other
things.
Pros and Cons of using Maven

Pros:
1. Maven can automatically add all the dependencies required for the project
by reading the pom file.
2. One can easily build a project into a jar, war etc. as per requirement
using Maven.
3. Maven makes it easy to start a project in different environments, and one
doesn't need to handle dependency injection, builds, processing, etc.
4. Adding a new dependency is very easy: one just has to add the
dependency entry to the pom file.
Cons:
1. Maven requires a Maven installation on the system and a Maven
plugin for the IDE.
2. If the Maven coordinates for an existing dependency are not
available, then one cannot add that dependency using Maven.

When should someone use Maven

One can use the Maven build tool in the following conditions:
1. When there are a lot of dependencies for the project; it is
easy to handle those dependencies using Maven.
2. When dependency versions update frequently; then one only has
to update the version ID in the pom file to update the dependencies.
3. Continuous builds, integration and testing can be easily
handled by using Maven.
4. When one needs an easy way to generate documentation
from the source code, compile source code, or package compiled code
into JAR files or ZIP files.
Commands:

1. For server update

sudo apt-get update

2. For JDK installation

sudo apt-get install default-jdk -y

3. For Maven installation

sudo apt-get install maven -y

4. For Maven operations

mvn clean - cleans the previous build output

mvn compile - compiles the sources described in pom.xml (produces .class files)

mvn install - installs the built snapshot into the local repository

mvn package - produces the artifact (jar, war, ear)
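The commands above can be combined into a single build script of the kind this exercise asks for. The sketch below writes such a script; the Tomcat path and the use of the Maven Checkstyle plugin for static analysis are assumptions to adjust to your installation. mvn checkstyle:check runs static analysis, mvn test runs the unit tests, mvn package produces the war, and the final copy deploys it to Tomcat:

```shell
# Sketch of a build script for this exercise. TOMCAT_HOME is an assumption.
cat > build.sh <<'EOF'
#!/bin/sh
set -e                      # stop at the first failing stage

TOMCAT_HOME=/opt/tomcat     # adjust to your Tomcat installation

mvn clean                   # clean previous build output
mvn checkstyle:check        # static analysis (fails the build on violations)
mvn test                    # unit tests
mvn package                 # produce target/*.war
cp target/*.war "$TOMCAT_HOME/webapps/"   # deploy to Tomcat
EOF
chmod +x build.sh
```

Because of `set -e`, a static analysis violation or a failing unit test stops the script before the war is ever copied to Tomcat.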

Output (screenshots):

1. Server update
2. JDK installation
3. Maven installation
4. Maven version check
5. Clone the repository from GitHub
6. mvn clean
7. mvn compile
8. mvn install
9. mvn package
Exercise 7:
AIM:

Module Name: Implementation of CICD with Java and open source stack Configure
the Jenkins tool with the required paths, path variables, users and pipeline views.

Procedure:
Jenkins has a very active product community and a set of really useful plugins that
suit most software projects: you can build software, deploy software,
websites or portals to various places (including AWS, DigitalOcean or bare-metal
servers), and run unit tests. It can be integrated with the communication tools of your
choice, like Slack, HipChat or email.
If you haven't had a chance to try Jenkins earlier, feel free to use the tutorial below to
get started.

Manual installation

In order to install Jenkins, we will need:

 A Unix system. I would recommend a Debian-based machine, like Ubuntu Server LTS

 A Java runtime environment installed. I usually use Java 8

 Get the base Jenkins setup

 Install the necessary plugins

 Put everything behind your web server.


Install Java

The easiest way to install Java is through the apt-get package manager:

sudo apt-get install python-software-properties

sudo add-apt-repository ppa:webupd8team/java

sudo apt-get update

Once you added ppa above, you can install java with the following command:

sudo apt-get install oracle-java8-installer

Get base Jenkins Setup

You will need to execute a series of the following commands, namely: add the
Jenkins signing key, register Jenkins apt sources, update package lists, and install
Jenkins package.

wget -q -O - http://pkg.jenkins-ci.org/debian/jenkins-ci.org.key | sudo apt-key add -

echo deb http://pkg.jenkins-ci.org/debian binary/ | sudo tee /etc/apt/sources.list.d/jenkins.list

sudo apt-get update

sudo apt-get install jenkins


By default, it will install the base Jenkins setup, which is insecure. You will need to
go to the host where your Jenkins is installed, for example: http://jenkins-host:8080/.

Navigate to Manage Jenkins (on the left) and choose the "Configure Global Security"
item on the page loaded.

Now look below at the Matrix-based security section (select it if it is not already
selected), and make sure Anonymous only has the "Read" right under the View
group.

Click Save at the bottom of the page. After the page reloads, you'll see a login form;
simply ignore it and go to the home page (for example, http://jenkins-host:8080/).
You'll see a signup form, and the first account signed up will be the administrator.

The Power of Plugins

Jenkins would not be so powerful without plugins. Usually, I install these plugins by
default:

 Bitbucket:
The BitBucket plugin is designed to offer integration between BitBucket and
Jenkins. BitBucket offers a Jenkins hook, but that one just triggers a build for
a specific job on commit, nothing more. The BitBucket plugin, like the
GitHub plugin, uses the POST hook payload to check which job has to be
triggered based on the changed repository/branch.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/BitBucket+Plugin
 bitbucket-pullrequest-builder:
This plugin builds pull requests from Bitbucket.org. This is a must-have
plugin if you perform a QA deploy for each submitted pull request.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/Bitbucket+pullrequest+builder+plugin
 build-pipeline-plugin:
This plugin provides a Build Pipeline View of upstream and downstream
connected jobs that typically form a build pipeline. In addition, it offers the
ability to define manual triggers for jobs that require intervention prior to
execution, e.g. an approval process outside of Jenkins. Provides a nice
visualization of the paths and flows.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/Build+Pipeline+Plugin
 copyartifact:
Adds a build step to copy artifacts from another project. The plugin lets you
specify which build to copy artifacts from (e.g. the last successful/stable build,
by build number, or by a build parameter). You can also control the copying
process by filtering the files being copied, specifying a destination directory
within the target project, etc.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/Copy+Artifact+Plugin
 credentials:
Provides a centralized store for credentials (usernames and passwords, SSH
keys, certificates and other secrets) that can be shared by jobs and other
plugins.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/Credentials+Plugin
 delivery-pipeline-plugin:
Visualisation of delivery/build pipelines; renders pipelines based on
upstream/downstream jobs. When using Jenkins as a build server, the
Delivery Pipeline Plugin makes it possible to visualise one or more delivery
pipelines in the same view, even in full screen.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/Delivery+Pipeline+Plugin
 environment-script:
The Environment Script Plugin allows you to have a script run after SCM
checkout, before the build. If the script fails (exit code isn't zero), the build is
marked as failed.
Any output on standard out is parsed as environment variables that are
applied to the build. It supports "override syntax" to append paths to PATH-
like variables.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/Environment+Script+Plugin
 git:
Supports the popular Git version control system.
 ghprb:
This plugin builds pull requests in GitHub. It's another must-have plugin if
your software development life cycle includes deploying pull requests to a PR
environment for testing.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/GitHub+pull+request+builder+plugin
 greenballs: The funniest plugin: changes Jenkins to use green balls instead of
blue for successful builds.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/Green+Balls
 hipchat:
This plugin allows your team to set up build notifications to be sent to
HipChat rooms. To enable notifications, add "HipChat Notifications" as a
post-build step.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/HipChat+Plugin
 junit:
Allows JUnit-format test results to be published. Note: a number of tools,
including Karma, PHPUnit and others, can publish test results in the
JUnit format. Thus, this is a must-have plugin for unit test flows.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/JUnit+Plugin
 matrix-auth:
Offers matrix-based security authorization strategies (global and per-project).
This is quite handy if you have a build server shared across several teams.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/Matrix+Authorization+Strategy+Plugin
 parameterized-trigger:
This plugin lets you trigger new builds when your build has completed, with
various ways of specifying parameters for the new build.
You can add multiple configurations: each has a list of projects to trigger, a
condition for when to trigger them (based on the result of the current build),
and a parameters section.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Trigger+Plugin
 rebuild:
This plays nice with the parameterized-trigger plugin. The plugin allows the
user to rebuild a parametrized build without entering the parameters again.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/Rebuild+Plugin
 ssh: You can use the SSH Plugin to run shell commands on a remote machine
via ssh.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/SSH+plugin
 s3: Allows uploading artifacts to S3 with multiple options.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/S3+Plugin
 throttle-concurrents:
This plugin allows for throttling the number of concurrent builds of a project
running per node or globally.
Unfortunately, this is also a must-have plugin for Node (0.10-0.12) projects
with NPM: two concurrent npm installs often fail.
Plugin URL: https://wiki.jenkins-ci.org/display/JENKINS/Throttle+Concurrent+Builds+Plugin
Plugins are installed using the Plugin Manager in the Manage Jenkins section.

Putting Jenkins Behind a Web Server


Usually I hide Jenkins behind nginx. A typical configuration looks like the one
below:

server {

    listen 443 ssl;

    server_name jenkins.vagrant.dev;

    ssl_certificate     /etc/nginx/jenkins_selfsigned.crt;

    ssl_certificate_key /etc/nginx/jenkins_selfsigned.key;

    location / {

        proxy_pass http://127.0.0.1:8080;

        proxy_set_header Host $host;

        proxy_set_header X-Real-IP $remote_addr;

        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

        proxy_redirect off;

        proxy_connect_timeout 150;

        proxy_send_timeout 100;

        proxy_read_timeout 100;

    }
}

Automated installation

Do I install Jenkins manually each time? Of course not; I do it often for my
customers.
With Ansible and the sa-box-jenkins role, a new Jenkins installation can be deployed while
you drink your coffee.
Let's prepare a basic bootstrap project that you can reuse in the future.
It includes the following files:

 bootstrap.sh - installs ansible alongside with dependencies.

 init.sh - initializes 3rd party dependencies

 .projmodules - fully compatible with the .gitmodules git syntax; specifies the list of
the dependencies that will be used by the playbook.
In particular, it includes ansible-developer_recipes (a repository
with a set of handy deployment recipes) by default,
and an Ansible role called sa-box-bootstrap responsible for the box-securing steps
(assuming you plan to put Jenkins on remote hosts).
[submodule "public/ansible_developer_recipes"]

path = public/ansible_developer_recipes

url = git@github.com:Voronenko/ansible-developer_recipes.git

[submodule "roles/sa-box-bootstrap"]

path = roles/sa-box-bootstrap

url = git@github.com:softasap/sa-box-bootstrap.git

[submodule "roles/sa-box-jenkins"]

path = roles/sa-box-jenkins

url = git@github.com:softasap/sa-box-jenkins.git

 hosts - list here the initial box credentials that were provided to you for the
server. Note: jenkins-bootstrap assumes you have a fresh box with root
access only. If your box is already secured, adjust the credentials appropriately.
[jenkins-bootstrap]

jenkins_bootstrap ansible_ssh_host=192.168.0.17 ansible_ssh_user=yourrootuser


ansible_ssh_pass=yourpassword
[jenkins]

jenkins ansible_ssh_host=192.168.0.17 ansible_ssh_user=jenkins

 jenkins_vars.yml - set here specific environment overrides, like your


preferred deploy user name and keys.

 jenkins_bootstrap.yml - the first step: box securing. Creates the jenkins user and

secures the box using the sa-box-bootstrap role.
See more details about the sa-box-bootstrap role.
In order to override params for sa-box-bootstrap, pass the parameters as in
the example below:
- hosts: all

  vars_files:

    - ./jenkins_vars.yml

  roles:

    - {
        role: "sa-box-bootstrap",
        root_dir: "{{playbook_dir}}/public/ansible_developer_recipes",
        deploy_user: "{{jenkins_user}}",
        deploy_user_keys: "{{jenkins_authorized_keys}}"
      }
 jenkins.yml - provisioning script that configures Jenkins with a set of plugins and
users.

 jenkins_vars.yml - configuration options for the Jenkins deployment.

 setup_jenkins.sh - shell script that invokes the deployment in two steps: initial box
bootstrapping and Jenkins setup
#!/bin/sh
ansible-playbook jenkins_bootstrap.yml --limit jenkins_bootstrap

ansible-playbook jenkins.yml --limit jenkins

Configuration Options for Automated Installation

You need to override:

 jenkins_authorized_keys (the list of keys that allow you to log in to the
Jenkins box as the jenkins user)

 jenkins_domain - your agency domain

 jenkins_host - name of the Jenkins host (the site will be bound to
jenkins_host.jenkins_domain)

 java_version - your Java choice (6, 7 and 8 are supported)

jenkins_user: jenkins

jenkins_authorized_keys:

- "{{playbook_dir}}/components/files/ssh/vyacheslav.pub"

jenkins_domain: "vagrant.dev"

jenkins_host: "jenkins"

java_version: 8

- jenkins_users: the list of users with passwords to create. Admin and deploy are required
users.
Admin is used to manage the instance; deploy is used to access the artifacts via
deployment scripts.
If you don't override the passwords, the default ones (per role) will be used, which is not
the best for public deployments.

jenkins_users:

  - {
      name: "Admin",
      password: "AAAdmin",
      email: "no-reply@localhost"
    }

  - {
      name: "deploy",
      password: "DeDeDeDeploy",
      email: "no-reply@localhost"
    }

 jenkins_plugins Your choice of plugins to install. By default:


jenkins_plugins:

- bitbucket # https://wiki.jenkins-ci.org/display/JENKINS/BitBucket+Plugin

- bitbucket-pullrequest-builder

- build-pipeline-plugin

- copyartifact # https://wiki.jenkins-ci.org/display/JENKINS/Copy+Artifact+Plugin

- credentials # https://wiki.jenkins-ci.org/display/JENKINS/Credentials+Plugin

- delivery-pipeline-plugin # https://wiki.jenkins-ci.org/display/JENKINS/Delivery+Pipeline+Plugin

- environment-script # https://wiki.jenkins-ci.org/display/JENKINS/Environment+Script+Plugin

- git

- ghprb # https://wiki.jenkins-ci.org/display/JENKINS/GitHub+pull+request+builder+plugin

- greenballs # https://wiki.jenkins-ci.org/display/JENKINS/Green+Balls

- hipchat # https://wiki.jenkins-ci.org/display/JENKINS/HipChat+Plugin

- junit # https://wiki.jenkins-ci.org/display/JENKINS/JUnit+Plugin

- matrix-auth # https://wiki.jenkins-ci.org/display/JENKINS/Matrix+Authorization+Strategy+Plugin

- matrix-project # https://wiki.jenkins-ci.org/display/JENKINS/Matrix+Project+Plugin

- parameterized-trigger # https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Trigger+Plugin

- rebuild # https://wiki.jenkins-ci.org/display/JENKINS/Rebuild+Plugin

- ssh

- s3 # https://wiki.jenkins-ci.org/display/JENKINS/S3+Plugin

- throttle-concurrents # https://wiki.jenkins-ci.org/display/JENKINS/Throttle+Concurrent+Builds+Plugin

The Code in Action

You can download this template code from this repository. In order to use it, fork it,
adjust the parameters to your needs, and use it.

Running it is as simple as ./setup_jenkins.sh

Note: do not type the data highlighted in red.


Output (screenshots):

1. Server update
2. JDK installation
3. Maven installation
4. Maven version check
5. Jenkins installation
6. Check Jenkins version
7. Set up the dashboard in the browser
8. Provide the administrator password
9. Provide user details
10. Jenkins dashboard
11. Jenkins path
Exercise 8:

AIM:

Module name: Implementation of CICD with Java and open source stack

Configure the Jenkins pipeline to call the build script jobs and configure to run
it whenever there is a change made to an application in the version control
system. Make a change to the background color of the landing page of the web
application and check if the configured pipeline runs.

Procedure:

There are six steps to building a pipeline with Jenkins. But, before you begin those six
steps, make sure you have the following in your system.

 Java Development Kit


 Knowledge to execute some basic Linux commands
The steps to build CI/CD pipeline with Jenkins are:

1. Download Jenkins

 Download Jenkins from the Jenkins downloads page:
https://www.jenkins.io/download/.
 Download the file 'Generic Java package (.war)'.

2. Execute Jenkins as a Java binary

 Open the terminal window and enter cd <your path>.

 Use the command java -jar ./jenkins.war to run the WAR file.

3. Create a Jenkins Job

 Open the web browser and go to localhost:8080.

 The Jenkins dashboard opens; create new jobs there.
4. Create a Pipeline Job

 Select and define what Jenkins job that is to be created.


 Select Pipeline, give it a name and click OK.
 Scroll down and find the pipeline section.
 Either directly write a pipeline script or retrieve the Jenkins file from SCM
(Source Code Management).

5. Configure and Execute a Pipeline Job With a Direct Script

 Choose Pipeline script as the Destination and paste the Jenkins file content in
the Script from the GitHub.
 Click on Save to keep the changes.
 Now click on the Build Now to process the build.
 To check the output, click on any stage and click Log; a message will appear
on the screen.

6. Configure and Execute a Pipeline With SCM

 Copy the GitHub repository URL by clicking on Clone or download.


 Now, click on Configure to modify the existing job.
 Scroll to the Advanced Project Options setting and select Pipeline script from
the SCM option.
 Paste the GitHub repository URL here.
 Type Jenkinsfile in the Script, and then click on the Save button.
 Next, click on Build Now to execute the job again.
 There will be an additional stage in this case, i.e., Declarative: Checkout
SCM.
 Click on any stage and click on Log.
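For step 5 (a direct pipeline script) or for the Jenkinsfile kept in SCM, a minimal declarative pipeline could look like the following sketch (the stage names and the Maven commands are assumptions; the snippet writes the file so its structure is visible):

```shell
# Sketch of a minimal declarative Jenkinsfile for a Maven project.
cat > Jenkinsfile <<'EOF'
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh 'mvn -B clean package' }
        }
        stage('Test') {
            steps { sh 'mvn -B test' }
        }
    }
}
EOF

grep -n 'stage(' Jenkinsfile
```

Committing this Jenkinsfile to the root of the repository is what lets the "Pipeline script from SCM" option find and run it.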
After you have grasped all the essential steps to build a CI/CD pipeline using Jenkins,
a hands-on demonstration will serve as the icing on the cake.

Demo - To Build a CI/CD Pipeline With Jenkins

Go to your Jenkins Portal:

 Click on ‘Create a job’.


 In the item name dialog box, you may enter the ‘pipeline’.
 Select the pipeline job type in the list below.
 Click on OK.

A configuration related to the pipeline opens on the screen.

 Scroll down on that page.


 There in the dialog box, choose GitHub+Maven.

Some steps will appear on the screen. The next step is to integrate the Jenkins file into
the Version Control system.

So, to do that, you must:

 Select ‘Pipeline script from SCM’.


 Then in the SCM dialog box, select Git.
 ‘Jenkins file’ is the name of the Script.
 Add the Git repository URL.
 You can add the credentials if any.

The credentials can be added with the help of the ‘Add’ option.

 Then save the configuration


A page now appears on the screen that gives you various options like ‘Build Now’,
‘Delete Pipeline’, ‘Configure’, etc.
 Click on the Build Now option.
The pipeline will start downloading. The checkout will be visible on the screen and
you can see the build being complete on the screen.

You can go to the console output option to check the log that is taking place.

You will soon be able to see that all the segments of the pipeline are completed. The
artifact will be present to download. The war file can be downloaded using that link.

The entire process helps us understand how the whole Pipeline is configured. Using
similar types of steps, different kinds of automation pipelines can be configured.

Using CI/ CD Pipelines and IndexedDB for a jQuery UI Web Application

We'll be storing some client-side data in
IndexedDB storage. This technique can be a convenient and robust solution for
storing some data in the browser.
Using IndexedDB is a much better alternative to the Document Object Model (DOM)
for storing and querying client-side data because of how easy it is to manage and
query the data. Let's get started with our project.

Creating a New Project

First, create a new directory for our project and initialize it:

$ mkdir jquery-ui-pipeline-demo
$ cd jquery-ui-pipeline-demo
$ git init
$ bin/pipeline create

The "pipeline create" command lets you specify the template from which your code will be
built and tested. The workflow is similar to using yarn, except that here the web
toolbelt takes the place of yarn.

The only difference is that you won't need to use your own build environment to run
tests. The web toolbelt will do this for you automatically.

That's all you need to do to create a new project.

Building a jQuery UI Web App Using a Pipeline

To use the pipeline to build your sample app, we will open a new terminal window
and type:

$ bin/pipeline build

This command will take a while because it’s building and testing the app on our local
development machine.

After the build process is complete, you can see that the completed project is stored in
our index.html file. We can use these files in other apps or even deploy the app to a
server to test it out.
Exercise 9:

AIM:

Module name: Implementation of CICD with Java and open source stack

Create a pipeline view of the Jenkins pipeline used in Exercise 8 and configure it with
user-defined messages.

Procedure:

Building a CI/CD Pipeline Using Jenkins and Docker:

Step 1: In your Terminal or CLI, Start and enable Jenkins and docker

systemctl start jenkins


systemctl enable jenkins
systemctl start docker

Step 2: In your Jenkins console click on New Item from where you will create your
first job.

Step 3: After you click on New Item, choose the Freestyle project option, give it a

name and save it.

Step 4: In the configuration section select SCM and you will add the git repo link and

save it.
Step 5: Then you will select Build option and choose to Execute shell

Step 6: Provide the shell commands. Here the job builds the archive to produce a WAR

file: it takes the code that was already pulled, then uses the build tool to install the

package. In short, it installs the dependencies and compiles the application.
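The exact commands depend on your project; for a typical Maven project, the Execute shell field for this job might contain something like the following sketch (goals and layout are assumptions, not taken from the exercise):

```shell
# Build the project and package it as a WAR (assumes a Maven project
# with a pom.xml at the repository root)
mvn clean package
```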

Step 7: Similarly you will create a new job as before.

Step 8: Click on the Freestyle project option and save it with a proper name.
Step 9: Again repeat step 4, In the configuration section select SCM and you will add

the git repo link and save it.

Step 10: Repeat step 5, You will select Build option and choose to Execute shell

Step 11: You will now write the shell commands for the integration phase and build the

container.
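For Step 11, the shell commands typically build a Docker image from the checked-out code; a minimal sketch (the image name and Dockerfile location are assumptions):

```shell
# Build a Docker image from the Dockerfile in the workspace root
docker build -t sample-app:latest .
```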

Step 12: Again you will create a new job as before in previous steps.
Step 13: Select freestyle project and provide the item name (here I have given Job3)

and click on OK.

Step 14: Again repeat step 4, In the configuration section select SCM and you will add

the git repo link and save it.

Step 15: Repeat step 10, You will select Build option and choose to Execute shell.

Step 16: Write the shell commands. These verify the container files, and the

deployment will be done on port 8180. Save the job.
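For Step 16, the deployment command runs the container and maps it to port 8180; a sketch (container and image names are assumptions):

```shell
# Remove any previous container, then run the new image on port 8180
docker rm -f sample-app 2>/dev/null || true
docker run -d --name sample-app -p 8180:8080 sample-app:latest
```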


Step 17: Now, you will choose job 1 and click to configure.

Step 18:From the build actions, You will choose post-build and click on build other

projects

Step 19: You will need to provide the name of the project to build after the job 1 and

then click save


Step 20: Now, you will choose job 2 and click to configure.

Step 21: From the build actions, You will choose post-build and click on build other

projects
Step 22: You will need to provide the name of the project to build after the job 2 and
then click save

Step 23: Let's create a pipeline view by clicking the + sign on the dashboard.

Step 24: Now choose and select Build Pipeline View and add a name.
Step 25: Choose Job 1 as the initial job and click OK to save.

Step 26: Run the pipeline to start the CICD process.

Step 27: After you build the job, verify it by opening localhost:8180/sample.text in
your browser. This is the port where your app is running.
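The three chained freestyle jobs above can also be sketched as a single declarative Jenkinsfile, which is effectively what the pipeline view visualises (the stage contents are assumptions based on the steps in this exercise):

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // Job 1: compile and package the application as a WAR
                sh 'mvn clean package'
            }
        }
        stage('Containerize') {
            steps {
                // Job 2: build the Docker image from the workspace
                sh 'docker build -t sample-app:latest .'
            }
        }
        stage('Deploy') {
            steps {
                // Job 3: run the container, exposing it on port 8180
                sh 'docker run -d -p 8180:8080 sample-app:latest'
            }
        }
    }
}
```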
Exercise 10

AIM:

Module name: Implementation of CICD with Java and open source stack

Integrate SonarQube code analysis and a quality gate stage into a Jenkins pipeline,
failing the build when the quality gate is not met.

Procedure:

SonarQube is an excellent tool for measuring code quality, using static analysis to find
code smells, bugs, vulnerabilities, and poor test coverage. Rather than manually
analysing the reports, why not automate the process by integrating SonarQube with
your Jenkins continuous integration pipeline? This way, you can configure a quality
gate based on your own requirements, ensuring bad code always fails the build.

You’ll learn exactly how to do that in this article, through a full worked example where
we add SonarQube analysis and SonarQube quality gate stages to a Jenkins pipeline.
SonarQube refresher

SonarQube works by running a local process to scan your project, called


the SonarQube scanner. This sends reports to a central server, known as the
SonarQube server.

The SonarQube server also has a UI where you can browse these reports. They look
like this:

Quality gates

In SonarQube a quality gate is a set of conditions that must be met in order for a project
to be marked as passed. In the above example the project met all the conditions.

Here’s an example where things didn’t go so well.


Clicking on the project name gives full details of the failure.

Here you can see that a condition failed because the maintainability rating was
a D rather than an A.

SonarQube and Jenkins

Running a SonarQube scan from a build on your local workstation is fine, but a robust
solution needs to include SonarQube as part of the continuous integration process. If
you add SonarQube analysis into a Jenkins pipeline, you can ensure that if the quality
gate fails then the pipeline won’t continue to further stages such as publish or release.

After all, nobody wants to release crappy code into production.

To do this, we can use the SonarQube Scanner plugin for Jenkins. It includes two
features that we’re going to make use of today:

1. SonarQube server configuration – the plugin lets you set your SonarQube server
location and credentials. This information is then used in a SonarQube analysis
pipeline stage to send code analysis reports to that SonarQube server.
2. SonarQube Quality Gate webhook – when a code analysis report is submitted
to SonarQube, unfortunately it doesn’t respond synchronously with the result of
whether the report passed the quality gate or not. To do this, a webhook call
must be configured in SonarQube to call back into Jenkins to allow our pipeline
to continue (or fail). The SonarQube Scanner Jenkins plugin makes this
webhook available.

Here’s a full breakdown of the interaction between Jenkins and SonarQube:

1. a Jenkins pipeline is started

2. the SonarQube scanner is run against a code project, and the analysis report is
sent to SonarQube server

3. SonarQube finishes the analysis and checks whether the project meets the
configured Quality Gate
4. SonarQube sends a pass or failure result back to the Jenkins webhook exposed
by the plugin

5. the Jenkins pipeline continues if the analysis result is a pass, or optionally
fails otherwise
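The webhook call in step 4 is an HTTP POST from SonarQube to Jenkins. Its JSON body looks roughly like this (abbreviated sketch; the exact fields vary by SonarQube version, so consult your server's webhook documentation):

```json
{
  "serverUrl": "http://localhost:9000",
  "status": "SUCCESS",
  "project": { "key": "my-project", "name": "My Project" },
  "qualityGate": {
    "name": "Sonar way",
    "status": "OK"
  }
}
```

The `qualityGate.status` field is what `waitForQualityGate` inspects to decide whether the pipeline may continue.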

Full worked example

Let’s get our hands dirty with a worked example. We’ll run through all the steps in
the UI manually as this is the best way to understand the setup.

In this example we’ll:

1. get Jenkins and SonarQube up and running


2. install the SonarQube Scanner Jenkins plugin and configure it to point to our
SonarQube instance
3. configure SonarQube to call the Jenkins webhook when project analysis is
finished
4. create two Jenkins pipelines

 one that runs against a codebase with zero issues (I wish all my code
was like this)


 one that runs against a codebase with bad code issues

5. run the pipelines and see it all working

You’ll need to make sure you have Docker installed before carrying on.

Fast track: to get up and running quickly check out this GitHub repository. Everything
is setup through configuration-as-code, except the steps under Configure SonarQube
below.

Running Jenkins and SonarQube

What better way to start these two services than with Docker Compose? Create the
following file docker-compose.yml:

version: "3"
services:
  sonarqube:
    image: sonarqube:lts
    ports:
      - 9000:9000
    networks:
      - mynetwork
    environment:
      - SONAR_FORCEAUTHENTICATION=false
  jenkins:
    image: jenkins/jenkins:2.319.1-jdk11
    ports:
      - 8080:8080
    networks:
      - mynetwork
networks:
  mynetwork:

 we’re configuring two containers in Docker Compose: Jenkins and SonarQube


 the Docker images used come from the official repositories in Docker Hub
 we’re adding both containers to the same network so they can talk to each
other
 for demo purposes SonarQube authentication is disabled so Jenkins won’t
need to pass a token

Running docker-compose up in the directory containing the file will start Jenkins
on http://localhost:8080 and SonarQube on http://localhost:9000. Awesomeness!

Configuring the SonarQube Scanner Jenkins plugin

Grab the Jenkins administrator password from the Jenkins logs in the console output
of the Docker Compose command you just ran.

jenkins_1 | Please use the following password to proceed to installation:


jenkins_1 |
jenkins_1 | 7efed7f025ee430c8938beaa975f5dde
Head over to your Jenkins instance and paste in the password.
On the next page choose Select plugins to install and install only
the pipeline and git plugins. The SonarQube Scanner plugin we’ll have to install
afterwards since this Getting Started page doesn’t give us the full choice of plugins.

In the final steps you’ll have to create a user and confirm the Jenkins URL
of http://localhost:8080.

Once complete head over to Manage Jenkins > Manage Plugins > Available and search
for sonar. Select the SonarQube Scanner plugin and click Install without restart.
Once the plugin is installed, let’s configure it!

Go to Manage Jenkins > Configure System and scroll down to the SonarQube
servers section. This is where we’ll add details of our SonarQube server so Jenkins can
pass its details to our project’s build when we run it.

Click the Add SonarQube button. Now add a Name for the server, such as SonarQube.
The Server URL will be http://sonarqube:9000. Remember to click Save.
Configuring SonarQube

Let’s jump over to SonarQube. Click Log in at the top-right of the page, and log in with
the default credentials of admin/admin. You’ll then have to set a new password.

Now go to Administration > Configuration > Webhooks. This is where we can add
webhooks that get called when project analysis is completed. In our case we need to
configure SonarQube to call Jenkins to let it know the results of the analysis.

Click Create, and in the popup that appears give the webhook a name of Jenkins, set the
URL to http://jenkins:8080/sonarqube-webhook and click Create.

In this case, the URL has the path sonarqube-webhook which is exposed by the
SonarQube Scanner plugin we installed earlier.

Adding a quality gate

SonarQube comes with its own Sonar way quality gate enabled by default. If you click
on Quality Gates you can see the details of this.
It’s all about making sure that new code is of a high quality. In this example we want
to check the quality of existing code, so we need to create a new quality gate.

Click Create, then give the quality gate a name. I’ve called mine Tom Way

Click Save then on the next screen click Add Condition. Select On Overall Code. Search
for the metric Maintainability Rating and choose worse than A. This means that if
existing code is not maintainable then the quality gate will fail. Click Add Condition to
save the condition.
Finally click Set as Default at the top of the page to make sure that this quality gate
will apply to any new code analysis.

Creating Jenkins pipelines

Last thing to do is setup two Jenkins pipelines:

1. A pipeline which runs against a code project over at the sonarqube-jacoco-code-
coverage GitHub repository. The code here is decent enough that the pipeline
should pass.
2. A pipeline which runs against the same project, but uses the bad-code branch.
The code here is so bad that the pipeline should fail.

Good code pipeline

Back in Jenkins click New Item and give it a name of sonarqube-good-code, select
the Pipeline job type, then click OK.

Scroll down to the Pipeline section of the configuration page and enter the following
declarative pipeline script in the Script textbox:

pipeline {
    agent any
    stages {
        stage('Clone sources') {
            steps {
                git url: 'https://github.com/tkgregory/sonarqube-jacoco-code-coverage.git'
            }
        }
        stage('SonarQube analysis') {
            steps {
                withSonarQubeEnv('SonarQube') {
                    sh "./gradlew sonarqube"
                }
            }
        }
        stage("Quality gate") {
            steps {
                waitForQualityGate abortPipeline: true
            }
        }
    }
}

The script has three stages:

1. in the Clone sources stage code is cloned from the GitHub repository mentioned
earlier
2. in the SonarQube analysis stage we use
the withSonarQubeEnv('Sonarqube') method exposed by the plugin to wrap the
Gradle build of the code repository. This provides all the configuration required
for the build to know where to find SonarQube. Note that the project build itself
must have a way of running SonarQube analysis, which in this case is done by
running ./gradlew sonarqube. For more information about running SonarQube
analysis in a Gradle build see this article
3. in the Quality gate stage we use the waitForQualityGate method exposed by the
plugin to wait until the SonarQube server has called the Jenkins webhook. The
abortPipeline flag means if the SonarQube analysis result is a failure, we abort
the pipeline.
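For stage 2 to work, the code project's build.gradle must apply the SonarQube Gradle plugin so that the sonarqube task exists; a minimal sketch (the plugin version here is an assumption — use whatever your project pins):

```groovy
plugins {
    id 'java'
    // Adds the "sonarqube" task that the pipeline invokes via ./gradlew sonarqube
    id 'org.sonarqube' version '3.3'
}
```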

Click Save to save the pipeline.

SonarQube magic: all the withSonarQubeEnv method does is export some


environment variables that the project’s build understands. By adding a pipeline step
which runs the command printenv wrapped in withSonarQubeEnv, you’ll be able to
see environment variables such as SONAR_HOST_URL being set. These get picked
up by the Gradle build of the code project to tell it which SonarQube server to connect
to.
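For example, a temporary diagnostic stage like the following sketch prints the Sonar-related variables the plugin injects:

```groovy
stage('Show Sonar environment') {
    steps {
        withSonarQubeEnv('SonarQube') {
            // SONAR_HOST_URL and related variables are only
            // exported inside this block
            sh 'printenv | grep SONAR'
        }
    }
}
```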

Bad code pipeline

Create another pipeline in the same way, but name it sonarqube-bad-code. The pipeline
script is almost exactly the same, except this time we need to check out the
bad-code branch of the same repository.

pipeline {
    agent any
    stages {
        stage('Clone sources') {
            steps {
                git branch: 'bad-code', url: 'https://github.com/tkgregory/sonarqube-jacoco-code-coverage.git'
            }
        }
        stage('SonarQube analysis') {
            steps {
                withSonarQubeEnv('SonarQube') {
                    sh "./gradlew sonarqube"
                }
            }
        }
        stage("Quality gate") {
            steps {
                waitForQualityGate abortPipeline: true
            }
        }
    }
}

 in the Clone sources stage we’re now also specifying the branch attribute to
point to the bad-code branch

Again, click Save.

You should now have two Jenkins jobs waiting to be run.

Any guesses as to what we’re going to do next?

SonarQube analysis and quality gate stages in action

Yes, that’s right, now it’s time to run our pipelines!


Let’s run the sonarqube-good-code pipeline first.

You should get a build with all three stages passing.


If we head over to SonarQube we can see that indeed our project has passed the
quality gate.
Now let’s run the sonarqube-bad-code pipeline. Remember this is running against
some really bad code!

You’ll be able to see that the Quality gate stage of the pipeline has failed. Exactly
what we wanted, blocking any future progress of this pipeline.

In the build’s Console Output you’ll see the message ERROR: Pipeline aborted due to
quality gate failure: ERROR which shows that the pipeline failed for the right reason.

Over in SonarQube you’ll see that this time it’s reporting a Quality Gate failure.

Looks like we got some code smells on our hands!

Click on the project name for more details.

We can see that the maintainability rating has dropped to B because of the two code
smells. This doesn’t meet our quality gate, which requires a minimum A rating.
Final thoughts

You’ve seen that integrating SonarQube quality gates into Jenkins is straightforward
using the SonarQube Scanner Jenkins plugin. To apply this to a production setup, I
also suggest that you:

 remove the SONAR_FORCEAUTHENTICATION environment variable from


SonarQube & configure the webhook in Jenkins to require an authentication
token (see the SonarQube Scanner plugin configuration)
 consider running SonarQube analysis on feature branches, so developers get
early feedback on whether their code changes are good before merging into
master. However, multi-branch analysis does require a paid subscription to
SonarQube.

For full details about setting up SonarQube analysis in a Gradle code project, see How
To Measure Code Coverage Using SonarQube and Jacoco. If you’re using Maven,
check out this documentation from SonarQube.

Exercise 11

AIM:

Module name: Implementation of CICD with Java and open source stack

Configure a Jenkins job to run JUnit tests and publish the JUnit test result report.

Procedure:

Jenkins provides out-of-the-box functionality for JUnit, and provides a host of plugins
for unit testing with other technologies, an example being MSTest for .Net unit tests.
The link https://wiki.jenkins-ci.org/display/JENKINS/xUnit+Plugin gives the list of
unit testing plugins available.
Example of a Junit Test in Jenkins

The following example will consider

 A simple HelloWorldTest class based on Junit.


 Ant as the build tool within Jenkins to build the class accordingly.
Step 1 − Go to the Jenkins dashboard and Click on the existing HelloWorld project
and choose the Configure option
Step 2 − Browse to the section to Add a Build step and choose the option to Invoke
Ant.

Step 3 − Click on the Advanced button.


Step 4 − In the build file section, enter the location of the build.xml file.

Step 5 − Next click the option to Add post-build option and choose the option of
“Publish Junit test result report”
Step 6 − In the Test report XMLs field, enter the location as shown below. Ensure that
Reports is a folder created in the HelloWorld project workspace. The “*.xml” pattern
tells Jenkins to pick up the result XML files produced by running the JUnit test
cases. These XML files will then be converted into reports which can be viewed later.
Once done, click the Save option at the end.

Step 7 − Once saved, you can click on the Build Now option.
Once the build is completed, a status of the build will show if the build was successful
or not. In the Build output information, you will now notice an additional section called
Test Result. In our case, we entered a negative Test case so that the result would fail
just as an example.
One can go to the Console output to see further information. But what’s more
interesting is that if you click on Test Result, you will now see a drill down of the Test
results.
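The build.xml referenced in Step 4 needs a target that runs the JUnit tests and writes the XML results into the Reports folder that Jenkins reads; a minimal Ant sketch (target names, source paths, and the classpath reference are assumptions):

```xml
<target name="test" depends="compile">
    <mkdir dir="Reports"/>
    <junit printsummary="yes" haltonfailure="no">
        <classpath refid="test.classpath"/>
        <!-- The XML formatter produces the files Jenkins
             collects via the Reports/*.xml pattern -->
        <formatter type="xml"/>
        <batchtest todir="Reports">
            <fileset dir="src/test" includes="**/*Test.java"/>
        </batchtest>
    </junit>
</target>
```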

Exercise 12

AIM:

Module name: Implementation of CICD with Java and open source stack

Configure SonarQube quality gates and set up a Jenkins job that fails the build when
the code does not meet them.

Procedure:

Code analysis in the agile product development cycle is one of the important and
necessary practices for avoiding failures and defects arising from continuous
changes to the source code. There are a few good reasons to include it in our
development lifecycle.

 It can help to find vulnerabilities in the distant corners of your application.

Even for code paths that are never exercised at runtime, static analysis has a
high probability of finding those vulnerabilities.
 You can define your project specific rules, and they will be ensured to follow
without any manual intervention.
 It can help to find the bug early in the development cycle, which means less
cost to fix them.

More importantly this you can include in your build process once and use it always
without having to do any manual steps.

Challenge

Now let’s talk about the actual challenge. SonarQube does help us to gain
visibility into our code base. However, you will soon realize that having visibility
into code isn't enough; to take real advantage of code analysis, we need to make use
of the different data insights that we get from SonarQube.

One way is to enforce standards and regulate them across all teams within the
organization. Quality Gates are exactly what we need here, and they are the best way
to ensure that standards are met and regulated across all the projects in your
organization. A Quality Gate can be defined as a set of threshold measures set on your
project, such as Code Coverage, Technical Debt Measure, Number of Blocker/Critical
Issues, Security Rating, Unit Test Pass Rate, and more.

Enforce Quality Gates

Failing your build jobs when the code doesn’t meet the criteria set in Quality Gates
is the way to go. We are using Jenkins as our CI tool, so we want to set up the
Jenkins job to fail if the code doesn’t meet the quality gates.
In this article, we are going to setup following

1. Quality gate metrics setup in SonarQube.


2. Configure Jenkins job to fail the build when not meeting Quality Gates.

Jenkins job setup

Prerequisites

 Install Jenkins plugin “sonar-quality-gates-plugin” if not already present.


 Email-ext plugin for Jenkins to be able to send emails.
 Jenkins project configured i.e. successfully passing the build already.

Here is the snapshot of the job that currently passing build before Quality Gates setup.

Let’s set up quality gate metrics in the SonarQube server. For demo purposes we are
going to create a quality gate for the single metric “Code Coverage”, but there are
more metrics available that you should consider selecting when creating quality gates.

Login to sonarqube as admin → go to Quality Gates

Click on Create -> Add Condition -> choose the metric (in this example, we selected
Code Coverage) -> select the operator along with the warning and error thresholds.

Select the project from the available list to which you want to associate this quality
gate. We have selected sample miqp project for which we have set up Jenkins job.
Now go to the Jenkins job and configure the quality gate validation. Click on the job
and go to Post-build Actions and provide the project details you have associated with
Quality Gate created in the earlier steps.

Run the Jenkins job again and verify the build status post quality check enabled.
As we can see, the code passed the build; however, it doesn't pass the quality gate
check, so the build fails in the end. We can verify this against the project status
in the SonarQube server.
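You can also query the project's quality gate status directly from SonarQube's web API; a sketch (the project key and credentials here are assumptions for this demo setup):

```shell
# Ask SonarQube for the quality gate status of the analysed project
curl -u admin:admin \
  "http://localhost:9000/api/qualitygates/project_status?projectKey=sample-miqp"
```

The JSON response reports the overall status (OK or ERROR) together with the individual conditions, matching what the Jenkins plugin evaluates.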
