Jenkins Guided Tour


Getting started with the Guided Tour

This guided tour introduces you to the basics of using Jenkins and its main feature,
Jenkins Pipeline. This tour uses the "standalone" Jenkins distribution, which runs
locally on your own machine.

Prerequisites

For this tour, you will require:

- A machine with:
  - 256 MB of RAM, although more than 2 GB is recommended
  - 10 GB of drive space (for Jenkins and your Docker image)
- The following software installed:
  - Java 8 or 11 (either a JRE or a Java Development Kit (JDK) is fine)
  - Docker (navigate to the Get Docker site to access the Docker download that’s suitable for your platform)

Download and run Jenkins

1. Download Jenkins.
2. Open up a terminal in the download directory.
3. Run java -jar jenkins.war --httpPort=8080.
4. Browse to http://localhost:8080.
5. Follow the instructions to complete the installation.
When the installation is complete, you can start putting Jenkins to work!

Creating your first Pipeline


What is a Jenkins Pipeline?

Jenkins Pipeline (or simply "Pipeline") is a suite of plugins which supports implementing and integrating continuous delivery pipelines into Jenkins.

A continuous delivery pipeline is an automated expression of your process for getting software from version control right through to your users and customers.

Jenkins Pipeline provides an extensible set of tools for modeling simple-to-complex delivery pipelines "as code". The definition of a Jenkins Pipeline is typically written into a text file (called a Jenkinsfile) which in turn is checked into a project’s source control repository. [1]

For more information about Pipeline and what a Jenkinsfile is, refer to the
respective Pipeline and Using a Jenkinsfile sections of the User Handbook.

To get started quickly with Pipeline:

1. Copy one of the examples below into your repository and name it Jenkinsfile.
2. Click the New Item menu within Jenkins.
3. Provide a name for your new item (e.g. My-Pipeline) and select Multibranch Pipeline.
4. Click the Add Source button, choose the type of repository you want to use and fill in the details.
5. Click the Save button and watch your first Pipeline run!

You may need to modify one of the example Jenkinsfiles to make it run with your project. Try modifying the sh command to run the same command you would run on your local machine.
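
For example, if your project is built with Maven, you might replace the version check in the Java example below with your actual build command. This is only a sketch: mvn clean package here stands in for whatever you run locally.

pipeline {
    agent { docker { image 'maven:3.3.3' } }
    stages {
        stage('build') {
            steps {
                // run the same command you would run on your local machine
                sh 'mvn clean package'
            }
        }
    }
}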

After you have set up your Pipeline, Jenkins will automatically detect any new branches or pull requests that are created in your repository and start running Pipelines for them.


Quick Start Examples


Below are some easily copied and pasted examples of a simple Pipeline with various
languages.

Java

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent { docker { image 'maven:3.3.3' } }
    stages {
        stage('build') {
            steps {
                sh 'mvn --version'
            }
        }
    }
}

Node.js / JavaScript

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent { docker { image 'node:14-alpine' } }
    stages {
        stage('build') {
            steps {
                sh 'npm --version'
            }
        }
    }
}

Ruby

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent { docker { image 'ruby' } }
    stages {
        stage('build') {
            steps {
                sh 'ruby --version'
            }
        }
    }
}

Python

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent { docker { image 'python:3.5.1' } }
    stages {
        stage('build') {
            steps {
                sh 'python --version'
            }
        }
    }
}

PHP

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent { docker { image 'php' } }
    stages {
        stage('build') {
            steps {
                sh 'php --version'
            }
        }
    }
}

Go

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent { docker { image 'golang' } }
    stages {
        stage('build') {
            steps {
                sh 'go version'
            }
        }
    }
}

Running multiple steps


Pipelines are made up of multiple steps that allow you to build, test and deploy
applications. Jenkins Pipeline allows you to compose multiple steps in an easy way
that can help you model any sort of automation process.

Think of a "step" like a single command which performs a single action. When a step
succeeds it moves onto the next step. When a step fails to execute correctly the
Pipeline will fail.

When all the steps in the Pipeline have successfully completed, the Pipeline is
considered to have successfully executed.

Linux, BSD, and Mac OS

On Linux, BSD, and Mac OS (Unix-like) systems, the sh step is used to execute a
shell command in a Pipeline.

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'echo "Hello World"'
                sh '''
                    echo "Multiline shell steps works too"
                    ls -lah
                '''
            }
        }
    }
}

Windows

Windows-based systems should use the bat step for executing batch commands.

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                bat 'set'
            }
        }
    }
}

Timeouts, retries and more


There are some powerful steps that "wrap" other steps which can easily solve
problems like retrying (retry) steps until successful or exiting if a step takes too long
(timeout).

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                retry(3) {
                    sh './flakey-deploy.sh'
                }

                timeout(time: 3, unit: 'MINUTES') {
                    sh './health-check.sh'
                }
            }
        }
    }
}

The "Deploy" stage retries the flakey-deploy.sh script 3 times, and then waits for up
to 3 minutes for the health-check.sh script to execute. If the health check script does
not complete in 3 minutes, the Pipeline will be marked as having failed in the
"Deploy" stage.

"Wrapper" steps such as timeout and retry may contain other steps,
including timeout or retry.
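
They can be nested either way around. As a minimal sketch, reusing the ./health-check.sh script from the example above, the following retries the health check up to 5 times, with each individual attempt limited to 1 minute:

pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                retry(5) {
                    // each individual attempt must finish within 1 minute
                    timeout(time: 1, unit: 'MINUTES') {
                        sh './health-check.sh'
                    }
                }
            }
        }
    }
}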

We can compose these steps together. For example, if we wanted to retry our
deployment 5 times, but never want to spend more than 3 minutes in total before
failing the stage:

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                timeout(time: 3, unit: 'MINUTES') {
                    retry(5) {
                        sh './flakey-deploy.sh'
                    }
                }
            }
        }
    }
}

Finishing up

When the Pipeline has finished executing, you may need to run clean-up steps or perform some actions based on the outcome of the Pipeline. These actions can be performed in the post section.

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh 'echo "Fail!"; exit 1'
            }
        }
    }
    post {
        always {
            echo 'This will always run'
        }
        success {
            echo 'This will run only if successful'
        }
        failure {
            echo 'This will run only if failed'
        }
        unstable {
            echo 'This will run only if the run was marked as unstable'
        }
        changed {
            echo 'This will run only if the state of the Pipeline has changed'
            echo 'For example, if the Pipeline was previously failing but is now successful'
        }
    }
}

Defining execution environments


In the previous section you may have noticed the agent directive in each of the
examples. The agent directive tells Jenkins where and how to execute the Pipeline,
or subset thereof. As you might expect, the agent is required for all Pipelines.

Under the hood, there are a few things agent causes to happen:

- All the steps contained within the block are queued for execution by Jenkins. As soon as an executor is available, the steps will begin to execute.
- A workspace is allocated which will contain files checked out from source control as well as any additional working files for the Pipeline.

There are several ways to define agents for use in Pipeline; for this tour we will only focus on using an ephemeral Docker container.

Pipeline is designed to easily use Docker images and containers to run inside. This
allows the Pipeline to define the environment and tools required without having to
configure various system tools and dependencies on agents manually. This
approach allows you to use practically any tool which can be packaged in a Docker
container.

For more agent specification options, consult the syntax reference.

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent {
        docker { image 'node:14-alpine' }
    }
    stages {
        stage('Test') {
            steps {
                sh 'node --version'
            }
        }
    }
}



When the Pipeline executes, Jenkins will automatically start the specified container
and execute the defined steps within it:

[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] sh
[guided-tour] Running shell script
+ node --version
v14.15.0
[Pipeline] }
[Pipeline] // stage
[Pipeline] }

Using environment variables


Environment variables can be set globally, like the example below, or per stage. As
you might expect, setting environment variables per stage means they will only apply
to the stage in which they’re defined.

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent {
        label '!windows'
    }

    environment {
        DISABLE_AUTH = 'true'
        DB_ENGINE = 'sqlite'
    }

    stages {
        stage('Build') {
            steps {
                echo "Database engine is ${DB_ENGINE}"
                echo "DISABLE_AUTH is ${DISABLE_AUTH}"
                sh 'printenv'
            }
        }
    }
}
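
The example above sets its variables globally. Here is a minimal sketch of the per-stage form, where DEBUG_FLAGS is a hypothetical variable that only applies inside the Build stage:

pipeline {
    agent any
    stages {
        stage('Build') {
            environment {
                DEBUG_FLAGS = '-g' // hypothetical value; visible only within this stage
            }
            steps {
                sh 'printenv | grep DEBUG_FLAGS'
            }
        }
    }
}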



This approach to defining environment variables from within the Jenkinsfile can be
very useful for instructing scripts, such as a Makefile, to configure the build or tests
differently to run them inside of Jenkins.

See "Using Environment Variables" for more details on using environment variables in Pipelines.

Environment variables may also be set by Jenkins plugins. Refer to the documentation of the specific plugins for environment variable names and descriptions for those plugins.

Another common use for environment variables is to set or override "dummy" credentials in build or test scripts. Because it’s (obviously) a bad idea to put credentials directly into a Jenkinsfile, Jenkins Pipeline allows users to quickly and safely access pre-defined credentials in the Jenkinsfile without ever needing to know their values.
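
As a minimal sketch of this pattern, assuming a username/password credential has already been created in Jenkins under the hypothetical ID example-db-creds, the credentials() helper binds it to environment variables without ever exposing the secret value in the Jenkinsfile:

pipeline {
    agent any
    environment {
        // hypothetical credential ID; for username/password credentials Jenkins
        // also exposes DB_CREDS_USR and DB_CREDS_PSW alongside DB_CREDS
        DB_CREDS = credentials('example-db-creds')
    }
    stages {
        stage('Test') {
            steps {
                sh 'echo "The bound values are masked in the console output"'
            }
        }
    }
}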

For more information, see "Credentials in the Environment".

Recording tests and artifacts


While testing is a critical part of a good continuous delivery pipeline, most people
don’t want to sift through thousands of lines of console output to find information
about failing tests. To make this easier, Jenkins can record and aggregate test
results so long as your test runner can output test result files. Jenkins typically
comes bundled with the junit step, but if your test runner cannot output JUnit-style
XML reports, there are additional plugins which process practically any widely-used
test report format.

To collect our test results and artifacts, we will use the post section.

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh './gradlew check'
            }
        }
    }
    post {
        always {
            junit 'build/reports/**/*.xml'
        }
    }
}

This will always grab the test results and let Jenkins track them, calculate trends and
report on them. A Pipeline that has failing tests will be marked as "UNSTABLE",
denoted by yellow in the web UI. That is distinct from the "FAILED" state, denoted by
red.

Pipeline execution will by default proceed even when the build is unstable. To skip deployment after test failures in Declarative syntax, use the skipStagesAfterUnstable option. In Scripted syntax, you may check currentBuild.currentResult == 'SUCCESS'.
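
A minimal sketch of the Declarative option (the Deployment example later in this tour uses the same option; the ./deploy.sh script here is hypothetical):

pipeline {
    agent any
    options {
        // skip any stages that follow once the build result becomes UNSTABLE
        skipStagesAfterUnstable()
    }
    stages {
        stage('Test') {
            steps {
                sh './gradlew check'
            }
        }
        stage('Deploy') {
            steps {
                sh './deploy.sh'
            }
        }
    }
}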

When there are test failures, it is often useful to grab built artifacts from Jenkins for local analysis and investigation. This is made practical by Jenkins’s built-in support for storing "artifacts", files generated during the execution of the Pipeline.

This is easily done with the archiveArtifacts step and a file-globbing expression, as is demonstrated in the example below:

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh './gradlew build'
            }
        }
        stage('Test') {
            steps {
                sh './gradlew check'
            }
        }
    }

    post {
        always {
            archiveArtifacts artifacts: 'build/libs/**/*.jar', fingerprint: true
            junit 'build/reports/**/*.xml'
        }
    }
}

If more than one parameter is specified in the archiveArtifacts step, then each parameter’s name must be specified explicitly in the step code, i.e. artifacts for the artifact’s path and file name, and fingerprint to choose that option. If you only need to specify the artifacts' path and file name(s), then you can omit the parameter name artifacts, e.g.

archiveArtifacts 'build/libs/**/*.jar'

Recording tests and artifacts in Jenkins is useful for quickly and easily surfacing
information to various members of the team. In the next section we’ll talk about how
to tell those members of the team what’s been happening in our Pipeline.

Cleaning up and notifications


Since the post section of a Pipeline is guaranteed to run at the end of a Pipeline’s execution, we can add some steps to perform finalization, notification, or other end-of-Pipeline tasks.

See Glossary - Build Status for the different build statuses: SUCCESS, UNSTABLE, and FAILED.

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent any
    stages {
        stage('No-op') {
            steps {
                sh 'ls'
            }
        }
    }
    post {
        always {
            echo 'One way or another, I have finished'
            deleteDir() /* clean up our workspace */
        }
        success {
            echo 'I succeeded!'
        }
        unstable {
            echo 'I am unstable :/'
        }
        failure {
            echo 'I failed :('
        }
        changed {
            echo 'Things were different before...'
        }
    }
}

There are plenty of ways to send notifications; below are a few snippets demonstrating how to send notifications about a Pipeline to an email address, a Hipchat room, or a Slack channel.

Email

post {
    failure {
        mail to: 'team@example.com',
             subject: "Failed Pipeline: ${currentBuild.fullDisplayName}",
             body: "Something is wrong with ${env.BUILD_URL}"
    }
}

Hipchat

post {
    failure {
        hipchatSend message: "Attention @here ${env.JOB_NAME} #${env.BUILD_NUMBER} has failed.",
            color: 'RED'
    }
}

Slack

post {
    success {
        slackSend channel: '#ops-room',
            color: 'good',
            message: "The pipeline ${currentBuild.fullDisplayName} completed successfully."
    }
}

Now that the team is being notified when things are failing, unstable, or even succeeding, we can finish our continuous delivery pipeline with the exciting part: shipping!

Deployment

The most basic continuous delivery pipeline will have, at minimum, three stages which should be defined in a Jenkinsfile: Build, Test, and Deploy. For this section we will focus primarily on the Deploy stage, but it should be noted that stable Build and Test stages are an important precursor to any deployment activity.

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent any
    options {
        skipStagesAfterUnstable()
    }
    stages {
        stage('Build') {
            steps {
                echo 'Building'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying'
            }
        }
    }
}

Stages as Deployment Environments


One common pattern is to extend the number of stages to capture additional
deployment environments, like "staging" or "production", as shown in the following
snippet.

stage('Deploy - Staging') {
    steps {
        sh './deploy staging'
        sh './run-smoke-tests'
    }
}

stage('Deploy - Production') {
    steps {
        sh './deploy production'
    }
}

In this example, we’re assuming that whatever "smoke tests" are run by our ./run-smoke-tests script are sufficient to qualify or validate a release to the production environment. This kind of pipeline that automatically deploys code all the way through to production can be considered an implementation of "continuous deployment." While this is a noble ideal, for many there are good reasons why continuous deployment might not be practical, but those teams can still enjoy the benefits of continuous delivery. [1] Jenkins Pipeline readily supports both.

Asking for human input to proceed


Often when passing between stages, especially environment stages, you may want
human input before continuing. For example, to judge if the application is in a good
enough state to "promote" to the production environment. This can be accomplished
with the input step. In the example below, the "Sanity check" stage actually blocks
for input and won’t proceed without a person confirming the progress.

Jenkinsfile (Declarative Pipeline)


pipeline {
    agent any
    stages {
        /* "Build" and "Test" stages omitted */

        stage('Deploy - Staging') {
            steps {
                sh './deploy staging'
                sh './run-smoke-tests'
            }
        }

        stage('Sanity check') {
            steps {
                input "Does the staging environment look ok?"
            }
        }

        stage('Deploy - Production') {
            steps {
                sh './deploy production'
            }
        }
    }
}

Conclusion

This Guided Tour introduced you to the basics of using Jenkins and Jenkins
Pipeline. Because Jenkins is extremely extensible, it can be modified and configured
to handle practically any aspect of automation. To learn more about what Jenkins
can do, check out the User Handbook, or the Jenkins blog for the latest events,
tutorials, and updates.
