Jenkins Pipelines are used to define, manage, and automate the entire lifecycle of software delivery processes (a.k.a. CI/CD pipelines). These pipelines provide a structured way to define the steps and stages involved in building, testing, deploying, and delivering software applications.
Jenkins Pipelines enable teams to automate repetitive tasks and ensure consistent and reliable software releases.
Let’s take the Docker image build process as an example…
In our development workflow, whenever we wanted to create a new version of our application, we had to manually run the docker build command to create a Docker image.
This process involved remembering the right set of build arguments and manually managing images, and we had no mechanism to track the build history.
With Jenkins Pipelines, we can automate the build process, making it consistent and reproducible. In addition, Jenkins provides a clear view of the build status, logs, and any errors encountered during the build.
Jenkinsfile
A Jenkins pipeline is defined in a file usually called Jenkinsfile, stored as part of the code repository.
In this file you instruct Jenkins on how to build, test, and deploy your application by specifying a series of stages, steps, and configurations.
There are two main types of syntax for defining Jenkins pipelines in a Jenkinsfile: Declarative Syntax and Scripted Syntax.
The Jenkinsfile typically consists of multiple stages, each of which performs specific steps, such as building the code as a Docker image, running tests, or deploying the software to a Kubernetes cluster.
Let’s create a declarative pipeline that builds a Docker image for the Roberta app.
In branch main, create a file called build.Jenkinsfile in the root directory, as in the following template:

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'ls'
                sh 'echo building...'
            }
        }
    }
}
```
The Jenkinsfile above is written in Declarative Pipeline syntax. Let’s break down each part of the code:
- pipeline { ... }: This is the outermost block that encapsulates the entire pipeline definition.
- agent any: This directive specifies the agent where the pipeline stages will run. The any keyword indicates that the pipeline can run on any available agent (Jenkins agent or worker). Agents are the compute resources where the pipeline stages are executed.
- stages { ... }: The stages block contains a series of stages that define the major steps in the pipeline. Stages represent different phases of the software delivery process.
- stage('Build') { ... }: This directive defines a specific stage named “Build”. Each stage represents a logical phase of your pipeline, such as building, testing, deploying, etc.
- steps { ... }: Inside the stage block, the steps block contains the individual steps or tasks to be executed within that stage.
- sh 'echo building...': This step executes a shell command. It uses the sh (shell) step to run the shell command ‘echo building…’, which prints “building…” to the console.

Now create the pipeline job in Jenkins:

- From the Jenkins dashboard, create a new job, give it a name (e.g. RobertaBuild), and choose Pipeline.
- Generate a GitHub personal access token with the scope repo,read:user,user:email,write:repo_hook. Click here to create a token with this scope.
- Store the token as a Jenkins credential, with github as the credentials ID.
- Specify branch main, as we want this pipeline to be triggered upon changes in branch main.
- Specify build.Jenkinsfile as the path of the file defining this pipeline.
- Test the pipeline: add another sh step to the build.Jenkinsfile, commit & push, and see the triggered job.

Well done! You’ve implemented an automated build pipeline for the Roberta app.
Let’s discuss the execution stages when your pipeline is running:
The Build phase builds the app source code and stores the build artifact somewhere, making it ready to be deployed.
In our case, a Docker image is our build artifact, but in general, there are many other build tools that can be used in different programming languages and contexts (e.g. Maven, npm, Gradle, etc.).
We now want to complete the build.Jenkinsfile pipeline, such that on every run of this job, a new Docker image of the app will be built and stored in a container registry (DockerHub or ECR).

Modify build.Jenkinsfile to build a Docker image and push it. Your stage might look like:

```groovy
stage('Build') {
    steps {
        sh '''
            docker login ...
            docker build ...
            docker tag ...
            docker push ...
        '''
    }
}
```
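As a concrete sketch (the image name, tag scheme, and credential ID below are placeholders, not values taken from this tutorial), a Build stage using the environment directive and the BUILD_NUMBER variable might look like:

```groovy
pipeline {
    agent any
    environment {
        // hypothetical values - replace with your own registry and repository
        IMAGE_NAME = 'myuser/roberta'
        IMAGE_TAG  = "0.1.${BUILD_NUMBER}"   // unique per run; never tag as 'latest'
    }
    stages {
        stage('Build') {
            steps {
                // 'dockerhub' is an assumed username/password credential ID in Jenkins
                withCredentials([usernamePassword(credentialsId: 'dockerhub',
                        usernameVariable: 'DOCKER_USER', passwordVariable: 'DOCKER_PASS')]) {
                    sh '''
                        echo $DOCKER_PASS | docker login -u $DOCKER_USER --password-stdin
                        docker build -t ${IMAGE_NAME}:${IMAGE_TAG} .
                        docker push ${IMAGE_NAME}:${IMAGE_TAG}
                    '''
                }
            }
        }
    }
}
```

Tagging with BUILD_NUMBER gives every run a unique, traceable image tag, which is exactly the build-history tracking that the manual workflow lacked.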
A few notes:

- Split the work into multiple stages according to your needs.
- You can use the BUILD_NUMBER or BUILD_TAG environment variables to tag your Docker images, but don’t tag the images as latest.
- Use the environment directive to store global variables and make your pipeline a bit more elegant.

Let’s create another new Jenkins pipeline that deploys the image we’ve just built to a Kubernetes cluster.
We would like to trigger the Deploy pipeline after every successful run of the Build pipeline.
Create another Jenkinsfile called deploy.Jenkinsfile. In this pipeline we will define the deployment steps for the Roberta app:
```groovy
pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                // complete this code to deploy to a real k8s cluster
                sh '# kubectl apply -f ....'
            }
        }
    }
}
```
Create another Jenkins pipeline job (e.g. RobertaDeploy), fill it similarly to the Build pipeline, but don’t trigger this pipeline as a result of a GitHub hook event (why?).

We now want every successful Build pipeline run to automatically trigger the Deploy pipeline. We can achieve this using the following two steps:
In build.Jenkinsfile, add a stage that uses the build step to trigger the Deploy pipeline:

```groovy
stage('Trigger Deploy') {
    steps {
        build job: '<deploy-job-name>', wait: false, parameters: [
            string(name: 'ROBERTA_IMAGE_URL', value: "<full-url-to-docker-image>")
        ]
    }
}
```
Where:

- <deploy-job-name> is the name of your Deploy pipeline (should be RobertaDeploy).
- <full-url-to-docker-image> is the full URL of the Docker image you’ve just built. Use environment variables to build it dynamically according to the image tag, e.g. value: "${IMAGE_NAME}:${IMAGE_TAG}".

In deploy.Jenkinsfile, define a string parameter that will be passed to this pipeline from the Build pipeline:

```groovy
pipeline {
    agent ..

    // add the below line at the same level as `agent` and `stages`:
    parameters { string(name: 'ROBERTA_IMAGE_URL', defaultValue: '', description: '') }

    stages ...
}
```
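To illustrate how the parameter is consumed (the deployment and container names below are hypothetical), the Deploy stage could reference params.ROBERTA_IMAGE_URL like this:

```groovy
stage('Deploy') {
    steps {
        // 'roberta' as deployment and container name is an assumption for this sketch
        sh "kubectl set image deployment/roberta roberta=${params.ROBERTA_IMAGE_URL}"
    }
}
```

kubectl set image updates the container image of the deployment, triggering a rolling update to the image that the Build pipeline just pushed.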
Test your simple CI/CD pipeline end-to-end.
Enter the interactive self-check page
Use the post directive and the docker image prune command to clean up the built Docker images from the disk.
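A minimal sketch of such a cleanup (the prune flags here are one possible choice):

```groovy
post {
    always {
        // remove dangling images left by the build; runs whether the build succeeded or failed
        sh 'docker image prune -f'
    }
}
```

The post block sits at the same level as stages in the pipeline definition.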
Review some additional pipeline features available in the options directive. Add the options{} clause with the relevant features for the Build and Deploy pipelines.
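For example (these particular options are suggestions, not requirements of this exercise):

```groovy
options {
    timeout(time: 10, unit: 'MINUTES')             // abort a stuck build
    disableConcurrentBuilds()                      // avoid two runs racing on the same workspace
    buildDiscarder(logRotator(numToKeepStr: '10')) // keep only the last 10 builds
}
```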
Jenkins does not clean the workspace by default after a build. Jenkins retains the contents of the workspace between builds to improve performance by avoiding the need to re-fetch and recreate the entire workspace each time a build runs.
Cleaning the workspace can help ensure that no artifacts from previous builds interfere with the current build.
Configure a stage('Clean Workspace') stage to clean the workspace before or after a build.
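Assuming the Workspace Cleanup plugin is installed (it provides the cleanWs step), such a stage might look like:

```groovy
stage('Clean Workspace') {
    steps {
        cleanWs()   // wipes the job workspace on the agent
    }
}
```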
Integrate snyk image vulnerability scanning into your build-deploy pipeline.
Using the withCredentials step, read your Snyk API secret into a SNYK_TOKEN env var, and perform the security testing using a simple sh step and the snyk CLI. To ignore specific issues, use the snyk ignore command:

snyk ignore --id=<ISSUE_ID>
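A sketch of such a scan stage (the credential ID 'snyk-token' and the image reference are assumptions):

```groovy
stage('Security Scan') {
    steps {
        // bind the Snyk API token stored in Jenkins as a 'Secret text' credential
        withCredentials([string(credentialsId: 'snyk-token', variable: 'SNYK_TOKEN')]) {
            // the snyk CLI reads SNYK_TOKEN from the environment automatically
            sh "snyk container test ${IMAGE_NAME}:${IMAGE_TAG}"
        }
    }
}
```

Issues ignored with snyk ignore are recorded in a .snyk policy file; commit it to the repo so the pipeline respects it.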
Create a new Jenkins pipeline and code the corresponding Jenkinsfile, to periodically back up the Jenkins server to S3. The pipeline should:

- Archive the /var/lib/jenkins directory into a .tar.gz file, excluding some files.
- Upload the .tar.gz file to an S3 bucket.

https://www.jenkins.io/blog/2017/02/15/declarative-notifications/
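A possible shape for the backup pipeline described above (the cron schedule, exclude patterns, and bucket name are placeholders):

```groovy
pipeline {
    agent any
    triggers {
        cron('0 3 * * *')   // run daily at 03:00 - example schedule
    }
    stages {
        stage('Backup') {
            steps {
                sh '''
                    # archive the Jenkins home, excluding bulky transient dirs (example excludes)
                    tar --exclude='workspace' --exclude='caches' \
                        -czf jenkins-backup.tar.gz /var/lib/jenkins
                    # 'my-jenkins-backups' is a placeholder bucket name
                    aws s3 cp jenkins-backup.tar.gz s3://my-jenkins-backups/
                '''
            }
        }
    }
}
```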