2 Approaches to Ensure Success in Your DevOps Journey

Suja Uddin Mullick Feb 18 - 6 min read


DevOps makes continuous software delivery simple for both development and operations teams through a set of tools and best practices. To understand the power of DevOps, we chose a standard development environment with a suite of applications: Git, Gerrit, Jenkins, JIRA, and Nagios. We studied setting up such a traditional environment and compared it with a more modern approach based on Docker containers.

Introduction

In this article we will discuss DevOps and both the traditional and container-based ways of approaching it. For this purpose, we will assume a fictitious software company (the client) that wants to streamline its development and delivery process.

What is DevOps?

DevOps means many things to many people. The one that is closest to our view is “DevOps is the practice of operations and development engineers participating together in the entire service lifecycle, from design through the development process to production support”.

In practice, it is the continuous delivery of good-quality code, with tools that make the process easy. Many tools work in tandem to ensure that only good-quality code reaches production. Our client wants to use the following tools.

DevOps Tools

  • Git – Most popular distributed version control system.
  • Gerrit – Code review tool.
  • Jenkins – Continuous integration tool.
  • JIRA – Bug tracking tool.

Development Workflow

We came up with the following workflow that captures a typical development life cycle:

  • A developer will commit their changes to the staging area of a branch.
  • Gerrit will watch for commits in the staging area.
  • Jenkins will watch Gerrit for new change sets to review. It will then trigger a set of jobs to run on the patch set. The results are shared with both Gerrit and JIRA. Based on the commit message the appropriate JIRA issue will be updated.
  • When reviewers accept the change, it will be ready to commit.

In order to let Jenkins auto-update JIRA issues, a commit-message pattern was enforced for all commits. This allowed Jenkins to find and update the specific JIRA issue automatically.
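As an illustration, such a convention could require every commit message to begin with the JIRA issue key. The key format and the check below are our assumptions, not the client's actual pattern; a commit-msg hook would apply the same regex to the message file it receives:

```shell
# Hypothetical convention: every commit message starts with a JIRA issue
# key, e.g. "DEVOPS-42: add Gerrit trigger config".
msg="DEVOPS-42: add Gerrit trigger config"

# The same regex a commit-msg hook would run against the message.
if echo "$msg" | grep -Eq '^[A-Z][A-Z0-9]+-[0-9]+: '; then
  echo "commit message OK: issue key found"
else
  echo "commit message rejected: no JIRA issue key" >&2
fi
```

With the key reliably at the front of the message, a Jenkins job can extract it and call JIRA's API to update that issue.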

DevOps Operations Workflow

Operations teams were more concerned with provisioning machines (physical or virtual), installing the suite of applications, and eventually monitoring those machines and applications. Alert notifications are important too, so that any anomalies can be addressed as early as possible. For monitoring and alert notification we used Nagios.
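For reference, a minimal Nagios object definition for one monitored machine looks like the sketch below. The host name, address, and template names are illustrative assumptions, not our actual configuration:

```
# Hypothetical host and service definitions for Nagios.
define host {
    use        linux-server       ; inherit a standard host template
    host_name  jenkins-box
    address    192.168.1.10
}

define service {
    use                  generic-service
    host_name            jenkins-box
    service_description  HTTP
    check_command        check_http
}
```

Nagios polls each defined service on its schedule and raises the alert notifications mentioned above when a check fails.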

Two Types of DevOps Approach

Traditional Approach

The traditional approach is to manually install these tools on bare-metal boxes or virtual machines and configure them to talk to each other.

Following are the brief steps for a traditional DevOps infrastructure:

  • Git/Gerrit, Jenkins, and JIRA are installed on a single machine or across multiple machines.
  • Gerrit projects and accounts are created, and access is granted as per the requirements.
  • The required plugins are installed on Jenkins (i.e. Gerrit Trigger, Git Client, JIRA Issue Updater, Git, etc.).
  • The previously installed plugins are configured on Jenkins.
  • The Jenkins SSH key is added to the Gerrit account.
  • Multiple accounts, with a few issues, are created in JIRA.

Now the whole DevOps infrastructure is ready to be used.
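Two of those manual steps can be sketched in shell form. The host names, Gerrit account name, and plugin IDs below are illustrative assumptions, so the commands are wrapped as functions rather than run directly:

```shell
# Install Jenkins plugins from the CLI (plugin IDs are examples).
install_jenkins_plugins() {
  java -jar jenkins-cli.jar -s "$JENKINS_URL" \
    install-plugin gerrit-trigger git-client jira
}

# Add the Jenkins SSH public key to a Gerrit account so Jenkins can
# stream events and fetch change sets. Gerrit's ssh interface accepts
# a key on stdin with "--add-ssh-key -". Account name is hypothetical.
add_jenkins_key_to_gerrit() {
  ssh -p 29418 "admin@$GERRIT_HOST" gerrit set-account jenkins \
    --add-ssh-key - < ~/.ssh/id_rsa.pub
}
```

Each of the remaining list items is a similar one-off of clicking through UIs or running CLI commands, which is exactly the repetition that motivates the automation below.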

DevOps Automation Services via Python Scripts
We automated the installation, configuration, and monitoring workflow using Python scripts. The actual Python code for downloading, installing, and configuring Git, Gerrit, Jenkins, JIRA, and Nagios can be found in the following GitHub repository.
https://github.com/sujauddinmullick/dev_ops_traditional

Container Approach

Automating the installation and configuration relieves us of some of the pain of setting up infrastructure. But think of a situation where the client's environment dependencies conflict with our DevOps infrastructure dependencies. To solve this problem, we tried isolating the DevOps environment from the existing environment, using Docker Engine to set up these tools.

Docker Engine builds on Linux containers. A Linux container is an operating-system-level virtualization method for running multiple isolated Linux systems (containers) on a single control host[2]. It differs from virtual machines in many ways; one striking difference is that containers share the host's kernel and library files while virtual machines do not. This makes containers more lightweight than virtual machines.

Getting a Linux container up and running by hand is, once again, a difficult task. Docker makes it simple: it is built on top of Linux containers, and it makes it easy to create, deploy, and run applications using containers.
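The kernel-sharing point is easy to verify with Docker itself: `uname -r` inside a container reports the host's kernel release, not a separate guest kernel, and the container starts in about a second. The `alpine` image is just a convenient small example, and the Docker step is skipped gracefully when no daemon is available:

```shell
# The host's kernel release.
host_kernel=$(uname -r)
echo "host kernel:      $host_kernel"

# Inside a container, 'uname -r' prints the same value, because the
# container shares the host's kernel. Requires a running Docker daemon.
if command -v docker > /dev/null; then
  container_kernel=$(docker run --rm alpine uname -r)
  echo "container kernel: $container_kernel"
fi
```

A virtual machine, by contrast, boots its own kernel, which is where most of its extra startup time and memory footprint come from.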

A Dockerfile is used to create a container image: it contains the instructions Docker follows to assemble the image. For example, the following Dockerfile will build a Jenkins image.

Sample Dockerfile for Jenkins:

FROM jenkins
MAINTAINER sujauddin
# Install plugins
COPY plugins.txt /usr/local/etc/plugins.txt
RUN /usr/local/bin/plugins.sh /usr/local/etc/plugins.txt
# Add gerrit-trigger plugin config file
COPY gerrit-trigger.xml /usr/local/etc/gerrit-trigger.xml
COPY gerrit-trigger.xml /var/jenkins_home/gerrit-trigger.xml
# Add Jenkins URL and system admin e-mail config file
COPY jenkins.model.JenkinsLocationConfiguration.xml /usr/local/etc/jenkins.model.JenkinsLocationConfiguration.xml
COPY hudson.plugins.JIRA.JIRAProjectProperty.xml /var/jenkins_home/hudson.plugins.JIRA.JIRAProjectProperty.xml
COPY jenkins.model.JenkinsLocationConfiguration.xml /var/jenkins_home/jenkins.model.JenkinsLocationConfiguration.xml
#COPY jenkins.model.ArtifactManagerConfiguration.xml /var/jenkins_home/jenkins.model.ArtifactManagerConfiguration.xml
# Add setup script.
COPY jenkins-setup.sh /usr/local/bin/jenkins-setup.sh
# Add cloud setting in config file.
COPY config.xml /usr/local/etc/config.xml
COPY jenkins-cli.jar /usr/local/etc/jenkins-cli.jar
COPY jenkins_job.xml /usr/local/etc/jenkins_job.xml
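Turning this Dockerfile into an image is a single command, run from the directory that contains it. The tag name below is our illustrative choice, and the build is guarded so it is a no-op where Docker or the Dockerfile is absent:

```shell
# Build the custom Jenkins image from the Dockerfile in the current
# directory (tag name is an example, not from the original setup).
image_tag="sujauddin/jenkins-custom"
if command -v docker > /dev/null && [ -f Dockerfile ]; then
  docker build -t "$image_tag" .
fi
```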

We can run the previously built images inside Docker containers. To set up and run a set of containers, the docker-compose tool is used: it takes a docker-compose.yml file and builds and runs all the containers defined there. The actual compose file we used for Git, Gerrit, Jenkins, and JIRA is given below.

docker-compose.yml

final_gerrit:
  image: sujauddin/docker_gerrit_final
  restart: always
  ports:
    - 8020:8080
    - 29418:29418

final_JIRA:
  build: ./docker-JIRA
  ports:
    - 8025:8080
  restart: always

final_jenkins:
  build: ./docker-jenkins
  restart: always
  ports:
    - 8023:8080
    - 8024:50000
  links:
    - final_JIRA
    - final_gerrit

final_DevOpsnagios:
  image: tpires/nagios
  ports:
    - 8036:80
    - 8037:5666
    - 8038:2266
  restart: always

With one command we got all the containers up and running with all the necessary configurations done so that the whole DevOps workflow runs smoothly. Isn’t that cool?
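That one command is docker-compose's standard bring-up, run from the directory containing docker-compose.yml (guarded here so it is a no-op where the tool or the file is absent):

```shell
# Build images where needed and start every container defined in
# docker-compose.yml, detached so they keep running in the background.
compose_file="docker-compose.yml"
if command -v docker-compose > /dev/null && [ -f "$compose_file" ]; then
  docker-compose up -d
  docker-compose ps    # confirm all four services are up
fi
```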

Conclusion

Clearly, the Docker-based approach is easier to set up and more efficient. Containers can be deployed quickly (usually in a few seconds), can be ported along with the application and its dependencies, and have a minimal memory footprint.
The result: a happy client!
