
Getting [meta] with GitLab CI/CD: building build images

An alternative title for this post could have been:

I heard you liked Docker, so I put dind in your CI.

# Getting Started

It should be clear by now that I love building stuff with GitLab CI/CD. From DNS to breakfast is a pretty wide range. However, past those "fun" use cases, I also like to share some ~~best~~ practices I have acquired through my years of using GitLab CI/CD, for software and non-software projects alike.

I crossed out "best" above because I don't really like the term "best practices." It implies that there is only one right answer to a given question - which is the opposite of the point of computer science. Sure, there are better and worse ways to do something - but much like many things in life, you have to find what works for you. "The best camera is the one you have with you" comes to mind when building CI/CD for projects. Something that works is better than something that's pretty.

But enough of my digression; let's get to the practice I wanted to share in this post: building build images as part of the build process. Yes - it is precisely as meta as it sounds.

# Why?

Often when building a particular project, you may have several unique build dependencies. In many languages, package managers solve the majority, if not all, of these dependencies - at least at build time (think npm, RubyGems, or Maven). However, when we are building and deploying (CI/CD, remember) from a machine that is not our own, that may not be enough. There may be a few dependencies we need from elsewhere.

The language libraries themselves are one such dependency - to build Java I'm going to need the JDK or JRE. To build Node, I'll need...well, Node, and so on. In a Docker-based environment, those languages and dependencies typically have an official image on Docker Hub (the JRE from Oracle or Node from Node.js, for instance). Assume, however, that I need a few other things not included in either those official Docker images or the package manager I'm using. For instance, maybe I need a CLI tool for deployment (AWS, Heroku, Firebase, etc.), or a testing framework or tool like Selenium or headless Chrome, or some other tool for packaging, testing, or deployment.

Sometimes there is a Docker image on Docker Hub for these combinations - or some of them - but not always a maintained one. One easy solution could be to install the tools at the start of every job that needs them; this can even be "automated" using something like the `before_script` keyword, as in the sketch below. However, that adds time to our pipeline and seems inefficient - is there a better way?
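For illustration, the naive approach looks roughly like this (a minimal sketch; the job name, stage, and deploy command are placeholders, not something from a real project):

```yaml
# Inefficient approach: reinstall the CLI tool in every job that needs it
deploy-example:
  image: node:10
  stage: deploy
  before_script:
    # This download and install runs again on every single pipeline
    - npm install -g firebase-tools
  script:
    - firebase deploy --only hosting
```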

# Enter the GitLab Docker registry

Since GitLab is a single application for the entire DevOps lifecycle, it ships out of the box with a built-in Docker registry. This can be a useful tool when deploying code in a containerized environment: we can build our application into a container and send it off into Kubernetes or some other Docker orchestrator.

However, I also see this registry as an opportunity to save time in my pipeline (and to save a round trip to Docker Hub on every run!). For builds that require some of these extra dependencies, I like to build a "build" Docker image, so I have an image with all of them baked right in. Then, as part of my pipeline, I can build that image at the start - either on every run or only when changes are made - and the rest of the pipeline can consume it as the base image.

# Putting it in practice

As an example, let's see what it looks like to build a simple Docker image for deploying to Google Firebase.

Firebase is a "backend as a service" tool that provides a database, authentication, and other services across platforms (web, iOS, and Android). It also includes web hosting and several other items that can be deployed through a CLI, which makes getting started really easy. You can deploy the whole stack with `firebase deploy`, or deploy just a part (like serverless functions) with a command like `firebase deploy --only functions`.

Making this work in a CI/CD world requires a few extra steps though. We'll need a Node Docker image that has the firebase CLI in it, so let's make a simple Dockerfile to do that.

Putting this Dockerfile in `.meta/Dockerfile`:

```dockerfile
# Start from the official Node image and add the Firebase CLI on top
FROM node:10

RUN npm install -g firebase-tools
```

After that, I'll add a job to the front of my pipeline.

Added to the front of my `.gitlab-ci.yml`:

```yaml
meta-build-image:
  image: docker:stable
  services:
    - docker:dind
  stage: prepare
  script:
    # Log in to the project's built-in registry using the predefined CI/CD variables
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    - cd .meta
    - docker build -t $CI_REGISTRY/group/project/buildimage:latest .
    - docker push $CI_REGISTRY/group/project/buildimage:latest
  only:
    refs:
      - master
    changes:
      - .meta/Dockerfile
```

Let's break down that job:

  1. We use the `docker:stable` image with a `docker:dind` service.
  2. The stage is my first stage, called `prepare` (see the sketch after this list for declaring the stages).
  3. In the script, we log in to the GitLab registry with the built-in CI/CD variables, then build and push the image. For more details, see the GitLab documentation on building Docker images.
  4. We only run this on master, and only when `.meta/Dockerfile` changes. This makes sure we are deliberate about when the build image changes. We could also tag with the commit SHA or another scheme here to control exactly which image version each pipeline uses.
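Since the job above uses `stage: prepare` and the deploy jobs below use `stage: deploy`, those stages need to be declared near the top of `.gitlab-ci.yml`. A minimal sketch, assuming just these two stages:

```yaml
# Declare the pipeline stages so "prepare" runs before "deploy"
stages:
  - prepare
  - deploy
```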

Now, in jobs further down the pipeline, I can use the latest build of the Docker image like this:

```yaml
firestore:
  # Use the custom build image we just pushed to the GitLab registry
  image: registry.gitlab.com/group/project/buildimage
  stage: deploy
  script:
    - firebase deploy --only firestore
  only:
    changes:
      - .firebase-config/firestore.rules
      - .firebase-config/firestore.indexes.json
```

In this job, we only run when something about the Firestore (the database from Firebase) configuration changes, and when it does, we run the Firestore deploy command in CI. I also added a deploy token as a GitLab CI/CD variable, based on the Firebase documentation for using Firebase with CI.
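Assuming that variable is named `FIREBASE_TOKEN` (the name is up to you) and holds a token generated with `firebase login:ci`, the job can pass it to the CLI explicitly; the Firebase CLI also reads a `FIREBASE_TOKEN` environment variable on CI machines, so either form should work. A sketch of the explicit version:

```yaml
# Variant of the firestore job that passes the CI token explicitly.
# Assumes a CI/CD variable named FIREBASE_TOKEN created from `firebase login:ci`.
firestore:
  image: registry.gitlab.com/group/project/buildimage
  stage: deploy
  script:
    - firebase deploy --only firestore --token "$FIREBASE_TOKEN"
  only:
    changes:
      - .firebase-config/firestore.rules
      - .firebase-config/firestore.indexes.json
```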

# Summary

In the end, this helps speed up pipelines by ensuring that you have a custom-built build image that you control. You don't have to rely on unstable or unmaintained Docker Hub images or even have a Docker Hub account yourself to get started.

To learn more about GitLab CI/CD, you can visit the GitLab website or the CI/CD docs. There's also a lot more to learn about the GitLab Docker registry.
