Blue Ocean 1.0.0 was released in early April 2017, and it looks great. There has been some criticism from the community, but we think Blue Ocean has come a long way and is continuing to grow. Admittedly, adopting Blue Ocean won’t come without some growing pains, but the overall benefits will help make your software delivery better. Here, I’ll share some history, some of the main benefits of Blue Ocean, and some example pipelines for software and infrastructure as code.
Pipelines of Old
We already have a pipeline; what’s wrong with it? In short, both implementation and visualization could be better.
Pipeline implementation, from its beginnings through today, varies from project to project. Historically, pipelines were created manually by an admin, and these jobs were tedious, prone to error, and, without proper backups, subject to loss if the server went down. With hundreds of jobs, a project-wide change means the maintainer has to verify that the change landed in every single job. Implementation is much simpler with a streamlined way to keep the configuration with the project files. Groovy scripting and Jenkinsfile support have partially remedied this, but it can always be better.
Visualizing a pipeline in Jenkins before Blue Ocean wasn’t aesthetically pleasing and required manual setup of the views. An admin could use the Build Pipeline Plugin for a visual representation of the pipeline, but that took extra time. Developers might dig deep into folders to find the job they needed, only to be met with the wall of text that the console outputs. Visualization is improved by automatically generating views for users and helping them navigate through messy log files.
Benefits
Blue Ocean is a project that is designed to enhance the user’s experience with the software. Some of the main benefits are:
- Jenkinsfile and Pipeline Integration
- Visualization of Pipelines and Projects
- A focus on what the current user is interested in
- Easy-to-Read Logs
Jenkinsfile and Pipeline Integration
It is true that standard Jenkins can use pipelines and a Jenkinsfile, but the native support built into Blue Ocean rises above. A Jenkinsfile is a configuration file that defines the pipeline for a particular project. It is designed to live alongside your code in a repo, giving your configuration accountability and reliability through source control. This close coupling of Jenkinsfile and repo let Blue Ocean implement an easy way to scan for pipelines via the “New Pipeline” option.
This option brings up the Create Pipeline view, which includes step-by-step instructions for adding a new pipeline to the list. If you use GitHub, you can choose the repository you want to use or have Blue Ocean scan the entire organization for Jenkinsfiles.
Otherwise, you can set up a URL and credentials to access a repo hosted elsewhere. Once completed, the application informs you that the new pipeline has been created and is ready to view. An example Jenkinsfile can be found at the bottom of this document.
Blue Ocean and the Jenkinsfile provide great feedback on pipeline configuration. Getting an error an hour into a build because of a bad value or a misspelling is a thing of the past: a Declarative Jenkinsfile is linted at the start of a job and fails fast if something is incorrect. Linting before committing is also possible using the Jenkins CLI. Both of these features add to the benefits of pipeline as code and make building software less painful.
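For reference, here is a minimal sketch of a Declarative Jenkinsfile, the kind of file Blue Ocean scans for and the linter validates. The stage name and shell command are placeholders:

```groovy
// Declarative Pipeline: the whole definition lives in a pipeline block
pipeline {
    agent any                   // run on any available executor
    stages {
        stage('Build') {
            steps {
                sh 'make'       // placeholder build step
            }
        }
    }
}
```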
Visualization of Pipelines and Projects
In 2017, UI/UX design isn’t just for consumers anymore. Out of the box, Blue Ocean has a more modern, streamlined appearance that makes it easy to work in. Aside from aesthetics, the views are personalized to show your favorite projects, the projects you work on, and all of their statuses, so you can focus on the items important to you. Favorites and active projects are listed on top, with color coding to draw quick attention to what is broken. Easily select favorites by clicking the star on the right-hand side of the page and, in good Jenkins fashion, the weather report is still available with newer icons.
Blue Ocean also includes a new pipeline view that shows the steps the project takes through the pipeline. The main sections of the view map to the stages in the project’s Jenkinsfile. If a stage has any parallel steps, those are shown stacked. If a stage fails, the view clearly outlines it, and you can dig further into the issue by selecting the failing step to see its log.
Easy-to-Read Logs
Speaking of logs, most people hate having to dig through huge log files to determine what went wrong with a build or test. Blue Ocean breaks up the console output into the same sections that the pipeline view shows. On top of that, those same sections may have multiple steps, and each step gets its own break in the console output.
Blue Ocean Pipeline Automation: Basic Examples
Now you have seen some of the features of Blue Ocean pipeline automation, but how about actually using them? Moving to Jenkinsfile and pipeline as code takes some serious thought and effort, but it is worth it in the long run. It is worth noting that there are two types of Jenkinsfile, Scripted and Declarative; a minimal Scripted skeleton is shown below for contrast, and the examples that follow are both Declarative.
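A Scripted Jenkinsfile is plain Groovy wrapped in a node block; here is a minimal sketch, with the stage name and shell command as placeholders:

```groovy
// Scripted Pipeline: imperative Groovy inside a node block
node {
    stage('Build') {
        sh 'make'   // placeholder build step
    }
}
```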
Here are two basic examples to look at:
Node.js example
Node.js is a popular JavaScript runtime that is used in projects all over. If you are here after viewing the Linux Foundation’s Intro to Continuous Delivery course, then you are already familiar with the project Dromedary. If you aren’t, Dromedary is a demo application by Stelligent (thanks!). You are welcome to spin up your own instance of Jenkins with all of the Blue Ocean plugins to try this.
Let’s dive into the Jenkinsfile; a sketch of the full file follows the breakdown below.
- agent
  - The agent defines the workspace the pipeline runs in.
  - This build uses a Dockerfile placed at the top level of the software’s repository. The Dockerfile takes the Node.js image and creates a new image that includes Gulp.
- stages
  - An Initialize stage installs the node_modules necessary for testing.
  - Unit testing runs with Gulp.
  - Convergence testing (there is no real convergence testing here; it exists just for the sake of example).
  - The parallelized steps all run within a single stage.
  - You can spread these parallelized steps out over different agents to save on time and configuration.
  - A Build stage runs Gulp; in your own software you can do whatever you see fit.
- Deploy
  - Deploy the application with your preferred option.
  - Some options are Heroku, AWS, an artifact repo, etc.
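Putting that together, here is a hedged sketch of what this Jenkinsfile could look like. The Gulp task names, the placeholder convergence test, and the echo-only deploy step are illustrative assumptions, not Dromedary’s actual configuration:

```groovy
pipeline {
    // Build inside an image created from the Dockerfile at the repo root
    // (the Node.js image with Gulp added on top).
    agent { dockerfile true }

    stages {
        stage('Initialize') {
            steps {
                sh 'npm install'    // install the node_modules needed for testing
            }
        }
        stage('Test') {
            steps {
                // Both test branches run in parallel within this stage; each branch
                // could also be given its own agent to spread the work across nodes.
                parallel(
                    'Unit': {
                        sh 'gulp test'                          // assumed Gulp task name
                    },
                    'Convergence': {
                        echo 'Convergence tests would run here' // placeholder, per the example
                    }
                )
            }
        }
        stage('Build') {
            steps {
                sh 'gulp build'     // assumed Gulp task; do whatever fits your project
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploy with your preferred option: Heroku, AWS, an artifact repo, etc.'
            }
        }
    }
}
```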
Chef Infrastructure Example
Liatrio’s infrastructure code also goes through CI. Historically, we have used Chef, but this methodology works with whatever infrastructure tool you prefer, assuming testing is possible.
This example is straightforward; a sketch of its Jenkinsfile follows the breakdown. Let’s take a look:
- The agent for this pipeline is any, which will use the master agent in this case.
- The stages are similar to the previous example.
  - The Setup stage prints the version, for log verification if need be, and fetches the cookbook’s dependencies with Berkshelf.
  - Acceptance testing runs in parallel so it’s easy to see what failed, if anything did.
  - Test Kitchen is a stage that purposely fails for the sake of the description. It runs convergence (integration) testing of the cookbook.
- Post – runs after the build is complete.
  - Success – prints in this example, but you could send Slack messages, emails, etc. on a successful build.
  - Failure – prints on a failed build, but you could send emails, Slack messages, etc. on a failed build.
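Here is a hedged sketch of what that Jenkinsfile could look like. The specific commands, particularly the version command and the parallel acceptance checks (Foodcritic and RuboCop here), are assumptions for illustration:

```groovy
pipeline {
    agent any    // no label, so in this setup the master agent runs the pipeline

    stages {
        stage('Setup') {
            steps {
                sh 'chef --version'     // print the version for log verification (assumed command)
                sh 'berks install'      // fetch the cookbook's dependencies with Berkshelf
            }
        }
        stage('Acceptance') {
            steps {
                // Run the acceptance checks in parallel so it is easy to see which one failed.
                parallel(
                    'Foodcritic': { sh 'foodcritic .' },    // assumed lint tools
                    'RuboCop': { sh 'rubocop .' }
                )
            }
        }
        stage('Test Kitchen') {
            steps {
                // Convergence (integration) testing of the cookbook; in the original
                // demo this step fails on purpose.
                sh 'kitchen test'
            }
        }
    }

    post {
        success {
            echo 'Build succeeded'      // swap in Slack messages, emails, etc.
        }
        failure {
            echo 'Build failed'         // swap in emails, Slack messages, etc.
        }
    }
}
```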
Final Thoughts
The above examples are trivial at best, but they show the options and capabilities. Blue Ocean pipeline automation is a new take on an old problem, and there will be growing pains. However, we hope we’ve shown that it’s worth it.
If you have any comments or questions, reach out to us.