Creating a CI Pipeline with GitLab
GitLab and Google Container Registry are a Great Couple
The purpose of this article is to cover how to create a simple CI pipeline that uses GitLab to push newly built Docker images to Google Container Registry on GCP (Google Cloud Platform).
Firstly, you will need your code in a GitLab Project.
If you’ve ever created a GitHub repo, this should be pretty straightforward; if not, check out the very detailed GitLab docs!
Next, your codebase needs to be Docker ready.
This means ready-to-go Dockerfiles for containerizing your client and server code. My previous article explains how to use Docker to containerize such an application, including all the configuration for a React/Node.js set-up.
Now, we move over to Google Cloud Platform!
Creating a New Google Cloud Project
After signing in to GCP and creating an account, you will be taken to a home dashboard similar to the image below,
From here, simply click ‘CREATE PROJECT’, enter a ‘Project Name’, and click ‘CREATE’.
In the side navigation, scroll down until you find ‘Container Registry’ under ‘CI/CD’,
Once you’ve clicked on ‘Container Registry’, you will have to enable billing for your GCP account. Google offers USD$300 of credit to new accounts to get started, which is more than enough for our purposes here. However, feel free to disable billing and delete your project later to prevent incurring any charges.
Create a Service Account
We need to create a cloud service account, which will provide us with credentials that we can use later to authenticate between GitLab and GCP.
Again, in the side navigation, select ‘IAM & Admin’ > ‘Service Accounts’ > Click ‘+ Create Service Account’,
Steps 2 and 3 are optional but highly recommended if your projects are under an organisation with multiple users, since you should always follow the principle of least privilege where possible.
Once the service account has been created, select it from the service accounts list and select the ‘KEYS’ tab.
Select ‘Create new key’ and choose the JSON option, since this will give you a downloadable JSON file with the service account credentials.
Don’t share this JSON file, and be sure to keep it somewhere safe (just as you would with the login details to your own bank account!). Service account keys allow any individual or system to authenticate to your GCP account, so they should be as restricted as possible.
Add Service Account Key to GitLab
In your GitLab account, go to the project settings and then select ‘CI/CD’,
and then expand on the ‘Variables’ section,
As you can see, I have already added the key for our SocioProphet codebase, but to add a new secret variable, simply click ‘Add variable’.
Fill out the ‘Key’ field, which in my case I called ‘GCLOUD_SERVICE_KEY’, and then copy and paste the contents of the JSON key file into the ‘Value’ field. Simply leave all the other options as they are and click ‘Add variable’. Your GCP credentials have now been added to your GitLab project!
Initializing the GCloud CLI
The gcloud CLI provides an SDK for interacting with your Google Cloud account and the various projects you may have, all from the command line. This is especially important since we need to write terminal-based commands in our GitLab CI configuration file. The documentation won’t lead you astray and has installation instructions for Linux, macOS and Windows. Check out the docs titled Installing Google Cloud SDK.
By this stage, we have a GCP project with the Google Container Registry enabled, a GitLab project with the GCP service account key stored as a secret variable, and the Cloud SDK CLI installed. Now we can create the actual pipeline!
GitLab CI Pipeline Configuration
First, create a file called .gitlab-ci.yml in the root of your repository.
You have a lot of options and flexibility over what this pipeline does; however, for now we will add some basic stages such as build, test, and docker_image.
To use CI/CD in GitLab, you will need to install runners for your project. The GitLab documentation explains this perfectly.
Configuration Base
Since each stage needs an environment to run in, we set that first;
image: node:latest
Then we can define the stages for the pipeline;
stages:
  - build
  - test
  - docker_image
Build Stage
The build stage is going to build your application (compiling client code, typically with something like Webpack) to make sure everything is running smoothly.
build:
  stage: build
  script:
    - make build_web
    - make build_client
We name this section ‘build’ and point it at the build stage we listed under ‘stages’. Then we add the required scripts. Yours may be something like
...
script:
  - npm run build
However, I use bash scripts and Makefiles to automate a lot of this stuff. If you’re interested I wrote an article explaining how to set this up titled, Using a Makefile with Your Web App.
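If you don’t use Makefiles, a plain npm-based build job might look something like the sketch below (the client/ and server/ directory names and cache paths are assumptions for a typical React/Node.js layout):

```yaml
build:
  stage: build
  # Cache installed dependencies between pipeline runs so repeat
  # builds are faster (paths are assumptions for this layout).
  cache:
    paths:
      - client/node_modules/
      - server/node_modules/
  script:
    - cd client && npm install && npm run build
    - cd ../server && npm install
```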
Test Stage
Next, if you have testing implemented in your application using Jest or something similar, that can be added in the test stage as follows;
test:
  stage: test
  script:
    - "cd ./[your-project]/client/ && yarn test"
Your script may be similar but I’m sure you get the idea 😃
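If your test runner can emit a JUnit XML report, GitLab will surface failing tests directly in merge requests. A sketch, assuming Jest with the jest-junit reporter added to the client package:

```yaml
test:
  stage: test
  script:
    # --reporters=jest-junit writes a junit.xml report in the
    # working directory (assumes jest-junit is a dev dependency).
    - "cd ./[your-project]/client/ && yarn test --ci --reporters=jest-junit"
  artifacts:
    reports:
      junit: ./[your-project]/client/junit.xml
```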
Docker Stage
Finally, the stage we’ve been building up to!
docker_image:
  stage: docker_image
  image: docker:stable
  services:
    - docker:dind
We have to define a separate image for this stage to run in since we are using Docker itself here, so just use the latest stable version of Docker! The docker:dind (Docker-in-Docker) service is what allows the runner to execute docker commands inside the job.
Setting Variables
variables:
  GCP_PROJECT_ID: [your project id]
  IMAGE_NAME_CLIENT: [the name of your client docker image]
  IMAGE_NAME_SERVER: [the name of your server docker image]
In the GCP Console header there is a drop-down menu showing your project name. After selecting it, a pop-up will appear listing the project name in the left column and the project ID in the right column. That project ID is what we use for the GCP_PROJECT_ID variable.
The next two variables can of course be called whatever you’d like them to be.
Authenticating to Google Container Registry using the Service Key
Now we can start defining the scripts for the docker stage. Firstly, we authenticate to GCR using the variable we created earlier, GCLOUD_SERVICE_KEY.
script:
  - docker login -u _json_key -p "$GCLOUD_SERVICE_KEY" https://gcr.io
This tells Docker to log in to GCR at ‘https://gcr.io’ using the JSON value of the service key variable as the password.
Building the Docker Images
First, let’s build the client image;
...
  - "cd [into where your client Dockerfile is] && docker build -t gcr.io/$GCP_PROJECT_ID/$IMAGE_NAME_CLIENT:release-candidate ."
This command will build the docker image (according to your Dockerfile) and tag it as gcr.io/[project id]/[image name]:release-candidate
Also, don’t forget the tag version can be whatever you find appropriate for the image.
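As a quick sanity check, here is how those variables expand into the final image tag in plain shell (the project and image names are hypothetical examples):

```shell
# Hypothetical example values -- substitute your own.
GCP_PROJECT_ID="my-gcp-project"
IMAGE_NAME_CLIENT="web-client"

# The same tag expression used in the pipeline's docker build step.
TAG="gcr.io/$GCP_PROJECT_ID/$IMAGE_NAME_CLIENT:release-candidate"
echo "$TAG"
# → gcr.io/my-gcp-project/web-client:release-candidate
```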
Next, build the server image;
...
  - "cd ../[into where your server Dockerfile is] && docker build -t gcr.io/$GCP_PROJECT_ID/$IMAGE_NAME_SERVER:release-candidate ."
Pushing the Docker Images to the Container Registry
Now, we can push the images to the Google Container Registry!
...
  - docker push gcr.io/$GCP_PROJECT_ID/$IMAGE_NAME_CLIENT:release-candidate
  - docker push gcr.io/$GCP_PROJECT_ID/$IMAGE_NAME_SERVER:release-candidate
Full YAML Configuration
Your final configuration should look similar to below;
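Putting all the pieces together, the complete .gitlab-ci.yml (assembled from the snippets above; the Makefile targets, placeholder paths and the ‘master’ branch name reflect my set-up, so adjust them for yours) looks like this:

```yaml
image: node:latest

stages:
  - build
  - test
  - docker_image

build:
  stage: build
  script:
    - make build_web
    - make build_client

test:
  stage: test
  script:
    - "cd ./[your-project]/client/ && yarn test"

docker_image:
  stage: docker_image
  image: docker:stable
  services:
    - docker:dind
  variables:
    GCP_PROJECT_ID: [your project id]
    IMAGE_NAME_CLIENT: [the name of your client docker image]
    IMAGE_NAME_SERVER: [the name of your server docker image]
  script:
    # Authenticate to GCR with the service account key stored in GitLab.
    - docker login -u _json_key -p "$GCLOUD_SERVICE_KEY" https://gcr.io
    # Build and tag the client and server images.
    - "cd [into where your client Dockerfile is] && docker build -t gcr.io/$GCP_PROJECT_ID/$IMAGE_NAME_CLIENT:release-candidate ."
    - "cd ../[into where your server Dockerfile is] && docker build -t gcr.io/$GCP_PROJECT_ID/$IMAGE_NAME_SERVER:release-candidate ."
    # Push both images to the Container Registry.
    - docker push gcr.io/$GCP_PROJECT_ID/$IMAGE_NAME_CLIENT:release-candidate
    - docker push gcr.io/$GCP_PROJECT_ID/$IMAGE_NAME_SERVER:release-candidate
  only:
    - master
```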
The ‘only’ rule at the end of the docker_image job tells the pipeline to ONLY run when code is pushed to the ‘master’ branch, which in my case is the production branch.
After the pipeline runs (GitLab will detect when code is pushed to the repo), you will get a status message saying ‘passed’.
This means the new Docker images have been pushed to GCR and we can check them out! Go to your GCP Console and look under the images tab of ‘Container Registry’.
These images can then be used as part of a managed instance group deployment or on a Container-Optimized OS virtual machine in Compute Engine.
I hope this article helped you get started on your CI/CD journey!