Setting up your first GCP GitHub Action

We are huge fans of GitHub and their initiatives to support developers. We've been considering GitHub Actions at QUID to augment our existing CI/CD systems, so I spent some time experimenting to see what it offers. I'll outline my findings in this post.

For demonstration purposes, we will use a repo hosting a very simple personal landing page: nothing more than an HTML file and a few images. A Google Cloud Storage bucket hosts the static files and serves them to the public. To deploy a new version, all we have to do is copy the static files to the storage bucket, overwriting any previous ones.

Without any automation, we would have to run the gsutil command-line tool or log into the Google Cloud Console and upload the files through the web UI. Let's see how to skip this manual step and run things automatically.
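For context, the manual deploy we are about to automate amounts to a single copy command. This is a sketch; "my-landing-page" is a placeholder bucket name, and your file layout may differ:

```shell
# Copy the site's files into the bucket, overwriting previous versions.
# "my-landing-page" is a placeholder; substitute your own bucket name.
gsutil -m cp -r index.html images/ gs://my-landing-page/
```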

The Guide

At the time of writing, Actions is still in beta. If you get invited to the beta, you will see an Actions tab inside your repositories, between Pull Requests and Projects.

Click on Actions and follow the prompts to add your first workflow.

This will bring you to the workflow creator screen.

Step 1 - The trigger

Give your workflow a name that tells you what it does; I will call mine deploy-to-prod-on-push. I will leave the default setting of running on push. Going through the Run drop-down list, you have access to almost every event GitHub emits, so there will be lots of room to play around once you get this first flow going.

The blue dot with an arrow is what you use to connect your actions together. Think of it like connecting boxes in a process flow diagram in Microsoft Visio.

Step 2 - Auth

The first step in our flow after detecting a push needs to be authentication. This is because your GCP project's APIs are restricted to your credentials, and GitHub needs a way to authenticate before it can communicate with GCP. To do this, there is a premade GCP Auth action that goes along with the GCP CLI action.

The best way to authenticate the action is by creating a service account on GCP with permission to write only to your storage bucket. Head over to the IAM settings to make one.

The service account's key comes in the form of a JSON file, containing a set of cryptographic credentials that will authenticate our action.
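If you prefer the CLI over the IAM web UI, the equivalent setup might look roughly like this. The account name "deploy-bot" and bucket "my-landing-page" are placeholders, and `<project_id>` stands in for your own project:

```shell
# Create a service account for deploys ("deploy-bot" is a placeholder name).
gcloud iam service-accounts create deploy-bot --display-name "Landing page deployer"

# Grant it object-admin access on just the one bucket.
gsutil iam ch serviceAccount:deploy-bot@<project_id>.iam.gserviceaccount.com:objectAdmin gs://my-landing-page

# Download the JSON key file that the Auth action will consume.
gcloud iam service-accounts keys create ~/<account_id>.json \
  --iam-account deploy-bot@<project_id>.iam.gserviceaccount.com
```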

In the Auth step, search for the GCloud Auth action package and click use.

As stated in the Auth repo, for GitHub to digest this JSON file, we will need to base64-encode it and pass it to GitHub as a secret. To encode it, simply run a command in your terminal

base64 ~/<account_id>.json

and copy the string it generates.
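One gotcha worth noting: GNU base64, the common version on Linux, wraps its output at 76 characters by default, and a wrapped, multi-line string pasted into a secret may not decode cleanly. Passing -w 0 keeps everything on one line (macOS's BSD base64 does not wrap, so the plain form above is fine there). The demo file below is a stand-in for your real key file:

```shell
# GNU coreutils base64 wraps at 76 chars by default; -w 0 disables wrapping.
# Demonstrate on a stand-in file (your real file is the service account key).
printf '{"type": "service_account"}' > /tmp/demo-key.json
base64 -w 0 /tmp/demo-key.json
```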

GitHub made it super easy for us to enter our secrets. Scrolling down from the action creator form, there is a Secrets section. Simply hit "Create a new secret", enter GCLOUD_AUTH as the secret name, paste the base64 string as the secret value, and click Add secret.

No other config for this step is required as we will be running our gsutil command in the next action step. Hit Done to complete this step.

Step 3 - Run your command

Now for the part we are most interested in: running the gsutil command to sync our repo with our storage bucket.

gsutil cp . gs://
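The bucket name is elided above; with a placeholder filled in, and the flags the generated workflow uses, the complete command would look something like this ("my-landing-page" is hypothetical):

```shell
# -m parallelizes the copy, -r recurses into directories,
# -a public-read makes the uploaded objects publicly readable.
# "my-landing-page" is a placeholder bucket name.
gsutil -m cp -a public-read -r . gs://my-landing-page
```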

To do this, we now need to add the GCP CLI action to our flow. Drag the dot from the bottom of our auth step and create a new action. Look for the Google Cloud SDK action and click use.

We can now enter the command we want to run and click Done to finish this final step.

Step 4 - Results

The flow builder we used is a UI that generates the code below. As you get more advanced with your build steps, it may become easier to modify your actions in a code editor instead of the UI.

workflow "Deploy to prod" {
  on = "push"
  resolves = ["GitHub Action for Google Cloud"]
}

action "GitHub Action for Docker" {
  uses = "actions/gcloud/auth@dc2b6c3bc6efde1869a9d4c21fcad5c125d19b81"
  secrets = ["GCLOUD_AUTH"]
}

action "GitHub Action for Google Cloud" {
  uses = "actions/gcloud/cli@dc2b6c3bc6efde1869a9d4c21fcad5c125d19b81"
  needs = ["GitHub Action for Docker"]
  runs = "gsutil -m cp -a public-read -r . gs://"
}

We can now make some small changes to our landing page's index.html and push them to GitHub. If everything was done correctly, GitHub will run the workflow; it provides logs for the containers the actions run in, so you can inspect them when errors happen.


GitHub did an amazing job making a tool that sets up automation within your organization as painlessly as possible. The two points that stood out to me were:

  1. The UI is very intuitive for people new to automation, allowing us to build a CI pipeline without needing to learn to read and write YAML or Groovy files.
  2. With access to tons of GitHub events to hook onto and a repo full of premade actions to choose from, you'll most likely find a way to link events in GitHub to your open-source tools.

The only downside is that it's still in beta. Some things I'll be looking for in the public release, before moving some of our pipelines to Actions, are competitive pricing and hopefully some advanced features like running tests in parallel.

If you'd like to learn more about GCP, make sure to check out our blog post on making a serverless application in GCP, which goes a lot more in depth into GCP's tools. If you have any questions about GitHub or GCP, don't hesitate to contact me at

Happy hacking!

Title Photo by lan deng


About William Chou

Wielder of JavaScript and student in the school of serverless cloud services.
  • Toronto, Ontario, Canada