Local Development with Jenkins Pipelines

I have been writing a lot of pipelines lately. Pipelines, just like any other code, should be run locally during development before they run on production Jenkins. Otherwise, one seemingly minor change could stop a product's pipeline in its tracks. Local development with rapid feedback is a given for applications. Today, even infrastructure can be developed with rapid feedback using tools such as test-kitchen and ChefSpec. I doubt anyone has any qualms with that. So why can’t the same be said for the glue that brings applications and infrastructure together? At Liatrio, our viewpoint is that local development with rapid feedback is essential for pipelines.

Before Jenkins pipelines, writing jobs in the Groovy job DSL was king, and local development was easy. Whether you were running Jenkins locally in a Vagrant box or a Docker container, it was easy to mount a directory from your host machine and have Jenkins execute code as you worked on it in your editor. Life was good – well, good if you like external job configuration with potentially hundreds of lines of Groovy. Pipelines, on the other hand, are a different animal.

Creating pipelines is now simpler than creating a freestyle job, but local development is not as straightforward. The pipeline job type does not have all the configuration options that freestyle jobs have, such as the ability to specify a custom workspace. The only two options for using a Jenkinsfile are entering one in-line (into a text box) or referencing one in a Git repo. Having to create a new branch (so as not to impact current work) and then commit and push for every change is slow and can be quite frustrating, especially when debugging.

I’ve created a solution available at https://github.com/liatrio/pipeline-developer. I’ll walk you through it. The goal here is to be able to immediately run a pipeline locally without needing to manually copy code or make a git commit before running changes.


Let’s Get Started

1. One directory above the project you would like to work on, clone https://github.com/liatrio/pipeline-developer.git


2. Change line 7 in the docker-compose file to point to the directory where your code lives. The default points to a project called sample-pipeline-spring-petclinic one directory up. This repo is available as an example at https://github.com/liatrio/sample-pipeline-spring-petclinic .
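For reference, the volume mapping in question looks something like the sketch below. This is illustrative, not a verbatim copy of the repo's compose file (the exact keys and line numbers may differ); the host path on the left is the part you change.

```yaml
# Sketch of the relevant docker-compose section. The left side of the
# volume entry is the host directory holding your pipeline project; the
# right side is where it appears inside the Jenkins container.
services:
  jenkins:
    ports:
      - "18080:8080"
    volumes:
      - ../sample-pipeline-spring-petclinic:/pipeline-dev
```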

3. Then run docker-compose up. A Jenkins instance should come up and be available at http://localhost:18080 within moments.


4. There should be one job called ‘pipeline-updater’. Click Build. It will execute, create a pipeline job called devPipeline, and run your pipeline code in it.

How it works

The docker-compose volume mounts your pipeline project to a directory on the Jenkins container called /pipeline-dev. The “pipeline-updater” job creates the job called “devPipeline” using the Groovy job DSL and puts the Jenkinsfile definition in as an in-line Jenkinsfile. It then copies all of the files into that pipeline job’s workspace. Finally, it kicks off the job itself. In this example the updater job is saved as a config.xml and placed directly on Jenkins while the container is built.
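A minimal sketch of what such a DSL script can look like is below. This is an illustrative reconstruction, not the exact contents of the repo's CreateDevJob.groovy: it reads the mounted Jenkinsfile from the seed job's workspace and defines an in-line pipeline job from it.

```groovy
// Illustrative Job DSL sketch: build the devPipeline job from the
// Jenkinsfile in the seed job's workspace (which is the mounted
// /pipeline-dev directory in this setup).
def jenkinsfileText = readFileFromWorkspace('Jenkinsfile')

pipelineJob('devPipeline') {
    definition {
        cps {
            script(jenkinsfileText)   // in-line pipeline definition
            sandbox()                 // run in the Groovy sandbox
        }
    }
}
```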

Using Your Own Jenkins Instance

To use this on your own local Jenkins instance, you will need the same docker-compose volume setting (or a linked directory in a Vagrant setup). We want to avoid manual setup, so create a freestyle job for running some DSL. Have the job pull down the same repo and reference “CreateDevJob.groovy”, the DSL job definition. Running that job should create the pipeline-updater job. If you have any questions, just tweet at me @bjaminstein .
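Besides running the DSL, the updater job's other responsibility boils down to a copy step: syncing the mounted project into the devPipeline job's workspace so the in-line pipeline can see the repo's files. The stand-in demo below shows the idea; the real container paths would be /pipeline-dev and /var/jenkins_home/jobs/devPipeline/workspace, but local stand-in directories are used here so the sketch runs anywhere.

```shell
# Stand-in demo of the updater job's copy step. Local directories stand in
# for /pipeline-dev (the mounted project) and the devPipeline workspace.
SRC=./pipeline-dev
DEST=./jenkins_home/jobs/devPipeline/workspace
mkdir -p "$SRC" "$DEST"
printf 'pipeline { agent any\n  stages { }\n}\n' > "$SRC/Jenkinsfile"

# Copy everything in the project into the pipeline job's workspace.
cp -r "$SRC"/. "$DEST"/
ls "$DEST"
```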

Next Steps

The example provided uses a very simple pipeline, but this workflow will work for any jenkinsfile. More often than not, a jenkinsfile will be used with shared libraries to share the same functionality among multiple pipelines. Shared libraries are git repos with methods for to be used by jenkins files. This makes sense for deploy logic that would be shared among multiple pipelines. I’ll cover this in my next post, as well as how to work with them locally.
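To make the shared-library idea concrete, here is a hedged sketch of a Jenkinsfile that pulls deploy logic from a library. The library name 'deploy-utils' and the step deployTo are made-up names for illustration; they are not from the example repo.

```groovy
// Hypothetical Jenkinsfile using a shared library. 'deploy-utils' would be
// a library configured in Jenkins, and deployTo a global step defined in
// the library's vars/ directory.
@Library('deploy-utils') _

pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                deployTo 'staging'   // shared deploy logic, reused across pipelines
            }
        }
    }
}
```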


If you have any comments or questions, reach out to us @liatrio or ben@liatrio
Liatrio is a DevOps consulting firm focusing on helping enterprises get better at software delivery using DevOps and Rapid Release philosophies. We work as “boots on the ground” change agents, helping our clients re-imagine their daily work and get better at delivery one day at a time. Liatrio is also hiring! If you enjoy being part of a team that is solving challenges around software delivery automation, deployment pipelines, and large-scale transformations, reach out to us via the contact page on our website.

This Post Has 2 Comments
  1. I followed your example https://liatrio.com/local-development-with-jenkins-pipelines/ and I saw it create a pipeline job called devPipeline in Jenkins. (http://localhost:18080) Nice.

However when I build the pipeline-updater, it fails.

    Started by user anonymous
    [EnvInject] – Loading node environment variables.
    Building in workspace /pipeline-dev
    [/pipeline-dev] $ /bin/sh -xe /tmp/jenkins7007542338989192023.sh
    + [ ! -d /var/jenkins_home/jobs/devPipeline/ ]
    + cp -r ../pipeline-dev/AnsiColorBuildWrapper.groovy ../pipeline-dev/README.md /var/jenkins_home/jobs/devPipeline/workspace/
    Processing provided DSL script
    ERROR: (script, line 4) File Jenkinsfile does not exist in workspace
    Finished: FAILURE

Any help would be appreciated.

    1. Hey Steve,

      Did you alter the docker-compose file to point to the repo you’re working on? Can you post a link to that repo?

      -Ben
