Use Dagger with GitHub Actions and Google Cloud
Watch a live demo of this tutorial in the Dagger Community Call (12 Jan 2023). For more demos, join the next Dagger Community Call.
Introduction
This tutorial teaches you how to use a Dagger pipeline to continuously build and deploy a Node.js application with GitHub Actions on Google Cloud Run. You will learn how to:
- Configure a Google Cloud service account and assign it the correct roles
- Create a Google Cloud Run service accessible at a public URL
- Create a Dagger pipeline using a Dagger SDK
- Run the Dagger pipeline on your local host to manually build and deploy the application on Google Cloud Run
- Use the same Dagger pipeline with GitHub Actions to automatically build and deploy the application on Google Cloud Run on every repository commit
Requirements
This tutorial assumes that:
- You have a basic understanding of the JavaScript programming language.
- You have a basic understanding of GitHub Actions. If not, learn about GitHub Actions.
- You have a Go, Python or Node.js development environment. If not, install Go, Python or Node.js.
- You have the Dagger CLI installed in your development environment. If not, install the Dagger CLI.
- You have Docker installed and running on the host system. If not, install Docker.
- You have the Google Cloud CLI installed. If not, install the Google Cloud CLI.
- You have a Google Cloud account and a Google Cloud project with billing enabled. If not, register for a Google Cloud account, create a Google Cloud project and enable billing.
- You have a GitHub account and a GitHub repository containing a Node.js Web application. This repository should also be cloned locally in your development environment. If not, register for a GitHub account, install the GitHub CLI and follow the steps in Appendix A to create and populate a local and GitHub repository with an example Express application.
Step 1: Create a Google Cloud service account
The Dagger pipeline demonstrated in this tutorial (re)builds a container image of an application every time a new commit is added to the application's repository. It then publishes the container image to Google Container Registry and deploys it at a public URL using Google Cloud infrastructure.
This requires the following:
- A Google Cloud service account with all necessary privileges
- A Google Cloud Run service with a public URL and defined resource/capacity/access rules
- Access to various Google Cloud APIs
This step discusses how to create a Google Cloud service account. If you already have a Google Cloud service account and key for your project, skip to Step 2.
Create a Google Cloud service account, as follows:

- Log in to the Google Cloud Console and select your project.
- From the navigation menu, click `IAM & Admin` -> `Service Accounts`.
- Click `Create Service Account`.
- In the `Service account details` section, enter a string in the `Service account ID` field. This string forms the prefix of the unique service account email address.
- Click `Create and Continue`.
- In the `Grant this service account access to project` section, select the `Service Account Token Creator` and `Editor` roles.
- Click `Continue`.
- Click `Done`.
Once the service account is created, the Google Cloud Console displays it in the service account list, as shown below. Note the service account email address, as you will need it in the next step.
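If you prefer the command line, the same service account and role bindings can also be created with the Google Cloud CLI. The sketch below is an alternative to the console steps above; `PROJECT` is your Google Cloud project ID and `dagger-ci` is an example service account ID (adjust both as needed):

```shell
# create the service account (dagger-ci is an example ID)
gcloud iam service-accounts create dagger-ci --project=PROJECT

# grant the Editor role
gcloud projects add-iam-policy-binding PROJECT \
  --member="serviceAccount:dagger-ci@PROJECT.iam.gserviceaccount.com" \
  --role="roles/editor"

# grant the Service Account Token Creator role
gcloud projects add-iam-policy-binding PROJECT \
  --member="serviceAccount:dagger-ci@PROJECT.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountTokenCreator"
```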
Next, create a JSON key for the service account as follows:

- From the navigation menu, click `IAM & Admin` -> `Service Accounts`.
- Click the newly-created service account in the list of service accounts.
- Click the `Keys` tab on the service account detail page.
- Click `Add Key` -> `Create new key`.
- Select the `JSON` key type.
- Click `Create`.
The key is created and automatically downloaded to your local host through your browser as a JSON file.
Store the JSON service account key file safely as it cannot be retrieved again.
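The key can also be generated from the command line. A sketch, assuming SERVICE-ACCOUNT-ID is the service account email address noted earlier:

```shell
# create a JSON key and save it to key.json in the current directory
gcloud iam service-accounts keys create key.json \
  --iam-account=SERVICE-ACCOUNT-ID
```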
Step 2: Configure Google Cloud APIs and a Google Cloud Run service
The next step is to enable access to the required Google Cloud APIs:

- From the navigation menu, select the `APIs & Services` -> `Enabled APIs & services` option.
- Select the `Enable APIs and Services` option.
- On the `API Library` page, search for and select the `Cloud Run API` entry.
- On the API detail page, click `Enable`.
- Repeat the previous two steps for the `IAM Service Account Credentials API`.
Once the APIs are enabled, the Google Cloud Console displays the updated status of the APIs.
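Both APIs can also be enabled in a single command with the Google Cloud CLI. A sketch, with `PROJECT` standing in for your project ID:

```shell
# enable the Cloud Run API and the IAM Service Account Credentials API
gcloud services enable run.googleapis.com iamcredentials.googleapis.com \
  --project=PROJECT
```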
The final step is to create a Google Cloud Run service and corresponding public URL endpoint. This service will eventually host the container deployed by the Dagger pipeline.

- From the navigation menu, select the `Serverless` -> `Cloud Run` product.
- Select the `Create Service` option.
- Select the `Deploy one revision from an existing container image` option. Click `Test with a sample container` to have a container image URL pre-filled.
- Continue configuring the service with the following inputs:
  - Service name: `myapp` (modify as needed)
  - Region: `us-central1` (modify as needed)
  - CPU allocation and pricing: `CPU is only allocated during request processing`
  - Minimum number of instances: `0` (modify as needed)
  - Maximum number of instances: `1` (modify as needed)
  - Ingress: `Allow all traffic`
  - Authentication: `Allow unauthenticated invocations`
- Click `Create` to create the service.
The new service is created. The Google Cloud Console displays the service details, including its public URL, on the service detail page, as shown below.
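An equivalent service can also be created from the command line. The following sketch uses Google's public sample container image as the initial revision, mirroring the `Test with a sample container` option in the console (adjust the service name and region if you chose different values):

```shell
# create the Cloud Run service with the sample container image
gcloud run deploy myapp \
  --image=us-docker.pkg.dev/cloudrun/container/hello \
  --region=us-central1 \
  --allow-unauthenticated \
  --min-instances=0 \
  --max-instances=1
```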
Step 3: Create the Dagger pipeline
The next step is to create a Dagger pipeline to do the heavy lifting: build a container image of the application, release it to Google Container Registry and deploy it on Google Cloud Run.
- Go
- Node.js
- Python
-

  In the application directory, install the Dagger SDK and the Google Cloud Run client library as development dependencies:

  ```shell
  go get dagger.io/dagger@latest
  go get cloud.google.com/go/run/apiv2
  ```
-

  Create a new sub-directory named `ci`. Within the `ci` directory, create a file named `main.go` and add the following code to it. Replace the PROJECT placeholder with your Google Cloud project identifier and adjust the region (`us-central1`) and service name (`myapp`) if you specified different values when creating the Google Cloud Run service in Step 2.

  ```go
  package main

  import (
  	"context"
  	"fmt"
  	"os"

  	run "cloud.google.com/go/run/apiv2"
  	runpb "cloud.google.com/go/run/apiv2/runpb"

  	"dagger.io/dagger"
  )

  const GCR_SERVICE_URL = "projects/PROJECT/locations/us-central1/services/myapp"
  const GCR_PUBLISH_ADDRESS = "gcr.io/PROJECT/myapp"

  func main() {
  	// create Dagger client
  	ctx := context.Background()
  	daggerClient, err := dagger.Connect(ctx, dagger.WithLogOutput(os.Stderr))
  	if err != nil {
  		panic(err)
  	}
  	defer daggerClient.Close()

  	// get working directory on host
  	source := daggerClient.Host().Directory(".", dagger.HostDirectoryOpts{
  		Exclude: []string{"ci", "node_modules"},
  	})

  	// build application
  	node := daggerClient.Container(dagger.ContainerOpts{Platform: "linux/amd64"}).
  		From("node:16")

  	c := node.
  		WithDirectory("/src", source).
  		WithWorkdir("/src").
  		WithExec([]string{"cp", "-R", ".", "/home/node"}).
  		WithWorkdir("/home/node").
  		WithExec([]string{"npm", "install"}).
  		WithEntrypoint([]string{"npm", "start"})

  	// publish container to Google Container Registry
  	addr, err := c.Publish(ctx, GCR_PUBLISH_ADDRESS)
  	if err != nil {
  		panic(err)
  	}

  	// print ref
  	fmt.Println("Published at:", addr)

  	// create Google Cloud Run client
  	gcrClient, err := run.NewServicesClient(ctx)
  	if err != nil {
  		panic(err)
  	}
  	defer gcrClient.Close()

  	// define service request
  	gcrRequest := &runpb.UpdateServiceRequest{
  		Service: &runpb.Service{
  			Name: GCR_SERVICE_URL,
  			Template: &runpb.RevisionTemplate{
  				Containers: []*runpb.Container{
  					{
  						Image: addr,
  						Ports: []*runpb.ContainerPort{
  							{
  								Name:          "http1",
  								ContainerPort: 1323,
  							},
  						},
  					},
  				},
  			},
  		},
  	}

  	// update service
  	gcrOperation, err := gcrClient.UpdateService(ctx, gcrRequest)
  	if err != nil {
  		panic(err)
  	}

  	// wait for service request completion
  	gcrResponse, err := gcrOperation.Wait(ctx)
  	if err != nil {
  		panic(err)
  	}

  	// print ref
  	fmt.Println("Deployment for image", addr, "now available at", gcrResponse.Uri)
  }
  ```

  This file performs the following operations:
  - It imports the Dagger and Google Cloud Run client libraries.
  - It creates a Dagger client with `Connect()`. This client provides an interface for executing commands against the Dagger engine.
  - It uses the client's `Host().Directory()` method to obtain a reference to the current directory on the host, excluding the `node_modules` and `ci` directories. This reference is stored in the `source` variable.
  - It uses the client's `Container().From()` method to initialize a new container from a base image. The additional `Platform` argument to the `Container()` method instructs Dagger to build for a specific architecture. In this example, the base image is the `node:16` image and the architecture is `linux/amd64`, which is one of the architectures supported by Google Cloud. This method returns a `Container` representing an OCI-compatible container image.
  - It uses the previous `Container` object's `WithDirectory()` method to return the container image with the host directory written at the `/src` path, and the `WithWorkdir()` method to set the working directory in the container.
  - It chains the `WithExec()` method to copy the contents of the working directory to the `/home/node` directory in the container and then uses the `WithWorkdir()` method to change the working directory in the container to `/home/node`.
  - It chains the `WithExec()` method again to install dependencies with `npm install` and sets the container entrypoint using the `WithEntrypoint()` method.
  - It uses the container object's `Publish()` method to publish the container to Google Container Registry, and prints the SHA identifier of the published image.
  - It creates a Google Cloud Run client, updates the Google Cloud Run service defined in Step 2 to use the published container image, and requests a service update.
-

  Run the following command to update `go.sum`:

  ```shell
  go mod tidy
  ```
-

  In the application directory, install the Dagger SDK and the Google Cloud Run client library as development dependencies:

  ```shell
  npm install @dagger.io/dagger@latest @google-cloud/run --save-dev
  ```
-

  Create a new sub-directory named `ci`. Within the `ci` directory, create a file named `index.mjs` and add the following code to it. Replace the PROJECT placeholder with your Google Cloud project identifier and adjust the region (`us-central1`) and service name (`myapp`) if you specified different values when creating the Google Cloud Run service in Step 2.

  ```javascript
  import { connect } from "@dagger.io/dagger"
  import { ServicesClient } from "@google-cloud/run"

  const GCR_SERVICE_URL = "projects/PROJECT/locations/us-central1/services/myapp"
  const GCR_PUBLISH_ADDRESS = "gcr.io/PROJECT/myapp"

  // initialize Dagger client
  connect(
    async (daggerClient) => {
      // get reference to the project directory
      const source = daggerClient
        .host()
        .directory(".", { exclude: ["node_modules/", "ci/"] })

      // get Node image
      const node = daggerClient
        .container({ platform: "linux/amd64" })
        .from("node:16")

      // mount cloned repository into Node image
      // install dependencies
      const c = node
        .withDirectory("/src", source)
        .withWorkdir("/src")
        .withExec(["cp", "-R", ".", "/home/node"])
        .withWorkdir("/home/node")
        .withExec(["npm", "install"])
        .withEntrypoint(["npm", "start"])

      // publish container to Google Container Registry
      const gcrContainerPublishResponse = await c.publish(GCR_PUBLISH_ADDRESS)

      // print ref
      console.log(`Published at: ${gcrContainerPublishResponse}`)

      // initialize Google Cloud Run client
      const gcrClient = new ServicesClient()

      // define service request
      const gcrServiceUpdateRequest = {
        service: {
          name: GCR_SERVICE_URL,
          template: {
            containers: [
              {
                image: gcrContainerPublishResponse,
                ports: [
                  {
                    name: "http1",
                    containerPort: 3000,
                  },
                ],
              },
            ],
          },
        },
      }

      // update service
      const [gcrServiceUpdateOperation] = await gcrClient.updateService(
        gcrServiceUpdateRequest,
      )
      const [gcrServiceUpdateResponse] = await gcrServiceUpdateOperation.promise()

      // print ref
      console.log(
        `Deployment for image ${gcrContainerPublishResponse} now available at ${gcrServiceUpdateResponse.uri}`,
      )
    },
    { LogOutput: process.stderr },
  )
  ```

  This file performs the following operations:
  - It imports the Dagger and Google Cloud Run client libraries.
  - It creates a Dagger client with `connect()`. This client provides an interface for executing commands against the Dagger engine.
  - It uses the client's `host().directory()` method to obtain a reference to the current directory on the host, excluding the `node_modules` and `ci` directories. This reference is stored in the `source` variable.
  - It uses the client's `container().from()` method to initialize a new container from a base image. The additional `platform` argument to the `container()` method instructs Dagger to build for a specific architecture. In this example, the base image is the `node:16` image and the architecture is `linux/amd64`, which is one of the architectures supported by Google Cloud. This method returns a `Container` representing an OCI-compatible container image.
  - It uses the previous `Container` object's `withDirectory()` method to return the container image with the host directory written at the `/src` path, and the `withWorkdir()` method to set the working directory in the container.
  - It chains the `withExec()` method to copy the contents of the working directory to the `/home/node` directory in the container and then uses the `withWorkdir()` method to change the working directory in the container to `/home/node`.
  - It chains the `withExec()` method again to install dependencies with `npm install` and sets the container entrypoint using the `withEntrypoint()` method.
  - It uses the container object's `publish()` method to publish the container to Google Container Registry, and prints the SHA identifier of the published image.
  - It creates a Google Cloud Run client, updates the Google Cloud Run service defined in Step 2 to use the published container image, and requests a service update.
-

  In the application directory, create a virtual environment and install the Dagger SDK and the Google Cloud Run client library:

  ```shell
  pip install dagger-io google-cloud-run
  ```
-

  Create a new sub-directory named `ci`. Within the `ci` directory, create a file named `main.py` and add the following code to it. Replace the PROJECT placeholder with your Google Cloud project identifier and adjust the region (`us-central1`) and service name (`myapp`) if you specified different values when creating the Google Cloud Run service in Step 2.

  ```python
  import sys

  import anyio
  from google.cloud import run_v2

  import dagger

  GCR_SERVICE_URL = "projects/PROJECT/locations/us-central1/services/myapp"
  GCR_PUBLISH_ADDRESS = "gcr.io/PROJECT/myapp"


  async def main():
      # initialize Dagger client
      async with dagger.Connection(dagger.Config(log_output=sys.stderr)) as client:
          # get reference to the project directory
          source = client.host().directory(".", exclude=["node_modules", "ci"])

          # get Node image
          node = client.container(platform=dagger.Platform("linux/amd64")).from_(
              "node:16"
          )

          # mount source code directory into Node image
          # install dependencies
          # set entrypoint
          c = (
              node.with_directory("/src", source)
              .with_workdir("/src")
              .with_exec(["cp", "-R", ".", "/home/node"])
              .with_workdir("/home/node")
              .with_exec(["npm", "install"])
              .with_entrypoint(["npm", "start"])
          )

          # publish container to Google Container Registry
          addr = await c.publish(GCR_PUBLISH_ADDRESS)

          print(f"Published at: {addr}")

          # create Google Cloud Run client
          gcr_client = run_v2.ServicesAsyncClient()

          # define a service request
          gcr_request = run_v2.UpdateServiceRequest(
              service=run_v2.Service(
                  name=GCR_SERVICE_URL,
                  template=run_v2.RevisionTemplate(
                      containers=[
                          run_v2.Container(
                              image=addr,
                              ports=[
                                  run_v2.ContainerPort(
                                      name="http1",
                                      container_port=1323,
                                  ),
                              ],
                          ),
                      ],
                  ),
              )
          )

          # update service
          gcr_operation = await gcr_client.update_service(request=gcr_request)

          # wait for service request completion
          response = await gcr_operation.result()

          print(f"Deployment for image {addr} now available at {response.uri}.")


  anyio.run(main)
  ```

  This file performs the following operations:
  - It imports the Dagger and Google Cloud Run client libraries.
  - It creates a Dagger client with `dagger.Connection()`. This client provides an interface for executing commands against the Dagger engine.
  - It uses the client's `host().directory()` method to obtain a reference to the current directory on the host, excluding the `node_modules` and `ci` directories. This reference is stored in the `source` variable.
  - It uses the client's `container().from_()` method to initialize a new container from a base image. The additional `platform` argument to the `container()` method instructs Dagger to build for a specific architecture. In this example, the base image is the `node:16` image and the architecture is `linux/amd64`, which is one of the architectures supported by Google Cloud. This method returns a `Container` representing an OCI-compatible container image.
  - It uses the previous `Container` object's `with_directory()` method to mount the host directory into the container at the `/src` mount point, and the `with_workdir()` method to set the working directory in the container.
  - It chains the `with_exec()` method to copy the contents of the working directory to the `/home/node` directory in the container and then uses the `with_workdir()` method to change the working directory in the container to `/home/node`.
  - It chains the `with_exec()` method again to install dependencies with `npm install` and sets the container entrypoint using the `with_entrypoint()` method.
  - It uses the container object's `publish()` method to publish the container to Google Container Registry, and prints the SHA identifier of the published image.
  - It creates a Google Cloud Run client, updates the Google Cloud Run service defined in Step 2 to use the published container image, and requests a service update.
Most `Container` object methods return a revised `Container` object representing the new state of the container. This makes it easy to chain methods together. Dagger evaluates pipelines "lazily", so the chained operations are only executed when required - in this case, when the container is published. Learn more about lazy evaluation in Dagger.
Step 4: Test the Dagger pipeline on the local host
Configure credentials for the Google Cloud SDK on the local host, as follows:
-

  Configure Docker credentials for Google Container Registry on the local host using the following commands. Replace the SERVICE-ACCOUNT-ID placeholder with the service account email address created in Step 1, and the SERVICE-ACCOUNT-KEY-FILE placeholder with the location of the service account JSON key file downloaded in Step 1.

  ```shell
  gcloud auth activate-service-account SERVICE-ACCOUNT-ID --key-file=SERVICE-ACCOUNT-KEY-FILE
  gcloud auth configure-docker
  ```

  This step is necessary because Dagger relies on the host's Docker credentials and authorizations when publishing to remote registries.
-

  Set the `GOOGLE_APPLICATION_CREDENTIALS` environment variable to the location of the service account JSON key file, replacing the SERVICE-ACCOUNT-KEY-FILE placeholder in the following command. This variable is used by the Google Cloud Run client library during the client authentication process.

  ```shell
  export GOOGLE_APPLICATION_CREDENTIALS=SERVICE-ACCOUNT-KEY-FILE
  ```
Once credentials are configured, test the Dagger pipeline by running the command below:
- Go

  ```shell
  dagger run go run ci/main.go
  ```

- Node.js

  ```shell
  dagger run node ci/index.mjs
  ```

- Python

  ```shell
  dagger run python ci/main.py
  ```
Dagger performs the operations defined in the pipeline script, logging each operation to the console. At the end of the process, the built container is deployed to Google Cloud Run and a message similar to the one below appears in the console output:
```
Deployment for image gcr.io/PROJECT/myapp@sha256:b1cf... now available at https://...run.app
```
Browse to the URL shown in the deployment message to see the running application.
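You can also verify the deployment from the terminal instead of the browser. A sketch; the URL below is a hypothetical example, so substitute the one printed by your own pipeline run:

```shell
# check that the service responds with HTTP 200
# (the URL is a placeholder; use the one from your deployment output)
curl -i https://myapp-abc123-uc.a.run.app/
```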
If you deployed the example application from Appendix A, you should see a page similar to that shown below:
Step 5: Create a GitHub Actions workflow
Dagger executes your pipelines entirely as standard OCI containers. This means that the same pipeline will run the same way, whether on your local machine or a remote server.
This also means that it's very easy to move your Dagger pipeline from your local host to GitHub Actions - all that's needed is to commit and push the pipeline script from your local clone to your GitHub repository, and then define a GitHub Actions workflow to run it on every commit.
-

  Commit and push the pipeline script and related changes to the application's GitHub repository:

  ```shell
  git add .
  git commit -a -m "Added pipeline"
  git push
  ```
- In the GitHub repository, create a new workflow file at `.github/workflows/main.yml` with the content matching your pipeline's SDK language:

  Go:

  ```yaml
  name: 'ci'

  on:
    push:
      branches:
        - master

  jobs:
    dagger:
      runs-on: ubuntu-latest
      steps:
        - name: Checkout
          uses: actions/checkout@v3
        - id: 'auth'
          name: 'Authenticate to Google Cloud'
          uses: 'google-github-actions/auth@v1'
          with:
            token_format: 'access_token'
            credentials_json: '${{ secrets.GOOGLE_CREDENTIALS }}'
        - name: Login to Google Container Registry
          uses: docker/login-action@v2
          with:
            registry: gcr.io
            username: oauth2accesstoken
            password: ${{ steps.auth.outputs.access_token }}
        - name: Setup Go
          uses: actions/setup-go@v4
          with:
            go-version: '>=1.20'
        - name: Install
          run: go get dagger.io/dagger@latest cloud.google.com/go/run/apiv2
        - name: Install Dagger CLI
          run: cd /usr/local && { curl -L https://dl.dagger.io/dagger/install.sh | sh; cd -; }
        - name: Release and deploy with Dagger
          run: dagger run go run ci/main.go
  ```

  Node.js:

  ```yaml
  name: 'ci'

  on:
    push:
      branches:
        - master

  jobs:
    dagger:
      runs-on: ubuntu-latest
      steps:
        - name: Checkout
          uses: actions/checkout@v3
        - id: 'auth'
          name: 'Authenticate to Google Cloud'
          uses: 'google-github-actions/auth@v1'
          with:
            token_format: 'access_token'
            credentials_json: '${{ secrets.GOOGLE_CREDENTIALS }}'
        - name: Login to Google Container Registry
          uses: docker/login-action@v2
          with:
            registry: gcr.io
            username: oauth2accesstoken
            password: ${{ steps.auth.outputs.access_token }}
        - name: Setup node
          uses: actions/setup-node@v3
          with:
            node-version: 16.13.x
            cache: npm
        - name: Install
          run: npm install
        - name: Install Dagger CLI
          run: cd /usr/local && { curl -L https://dl.dagger.io/dagger/install.sh | sh; cd -; }
        - name: Release and deploy with Dagger
          run: dagger run node ci/index.mjs
  ```

  Python:

  ```yaml
  name: 'ci'

  on:
    push:
      branches:
        - master

  jobs:
    dagger:
      runs-on: ubuntu-latest
      steps:
        - name: Checkout
          uses: actions/checkout@v3
        - id: 'auth'
          name: 'Authenticate to Google Cloud'
          uses: 'google-github-actions/auth@v1'
          with:
            token_format: 'access_token'
            credentials_json: '${{ secrets.GOOGLE_CREDENTIALS }}'
        - name: Login to Google Container Registry
          uses: docker/login-action@v2
          with:
            registry: gcr.io
            username: oauth2accesstoken
            password: ${{ steps.auth.outputs.access_token }}
        - name: Setup Python
          uses: actions/setup-python@v4
          with:
            python-version: '3.10'
        - name: Install
          run: pip install dagger-io google-cloud-run
        - name: Install Dagger CLI
          run: cd /usr/local && { curl -L https://dl.dagger.io/dagger/install.sh | sh; cd -; }
        - name: Release and deploy with Dagger
          run: dagger run python ci/main.py
  ```

  This workflow runs on every commit to the repository
  `master` branch. It consists of a single job with seven steps, as below:

  - The first step uses the Checkout action to check out the latest source code from the `master` branch to the GitHub runner.
  - The second step uses the Authenticate to Google Cloud action to authenticate to Google Cloud. It requires a service account key in JSON format, which it expects to find in the `GOOGLE_CREDENTIALS` GitHub secret. This step sets various environment variables (including the `GOOGLE_APPLICATION_CREDENTIALS` variable required by the Google Cloud Run SDK) and returns an access token as output, which is used to authenticate the next step.
  - The third step uses the Docker Login action and the access token from the previous step to authenticate to Google Container Registry from the GitHub runner. This is necessary because Dagger relies on the host's Docker credentials and authorizations when publishing to remote registries.
  - The fourth and fifth steps download and install the programming language and required dependencies (such as the Dagger SDK and the Google Cloud Run SDK) on the GitHub runner.
  - The sixth step downloads and installs the Dagger CLI on the GitHub runner.
  - The seventh and final step executes the Dagger pipeline.
The Authenticate to Google Cloud action looks for a JSON service account key in the `GOOGLE_CREDENTIALS` GitHub secret. Create this secret as follows:

- Navigate to the `Settings` -> `Secrets` -> `Actions` page in the GitHub Web interface.
- Click `New repository secret` to create a new secret.
- Configure the secret with the following inputs:
  - Name: `GOOGLE_CREDENTIALS`
  - Secret: The contents of the service account JSON key file downloaded in Step 1.
- Click `Add secret` to save the secret.
Step 6: Test the Dagger pipeline on GitHub
Test the Dagger pipeline by committing a change to the GitHub repository.
If you are using the example application described in Appendix A, the following commands modify and commit a simple change to the application's index page:
```shell
git pull
sed -i 's/Dagger/Dagger on GitHub/g' routes/index.js
git add routes/index.js
git commit -a -m "Update welcome message"
git push
```
The commit triggers the GitHub Actions workflow defined in Step 5. The workflow runs the various steps of the `dagger` job, including the pipeline script.
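If you use the GitHub CLI, you can follow the workflow run from the terminal instead of the GitHub Web interface. A sketch, assuming the workflow file is named `main.yml` as in Step 5:

```shell
# list recent runs of the workflow
gh run list --workflow=main.yml

# stream the progress of the most recent run
gh run watch
```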
At the end of the process, a new version of the built container image is released to Google Container Registry and deployed on Google Cloud Run. A message similar to the one below appears in the GitHub Actions log:
```
Deployment for image gcr.io/PROJECT/myapp@sha256:h4si... now available at https://...run.app
```
Browse to the URL shown in the deployment message to see the running application. If you deployed the example application with the additional modification above, you see a page similar to that shown below:
Conclusion
This tutorial walked you through the process of creating a Dagger pipeline to continuously build and deploy a Node.js application on Google Cloud Run. It used the Dagger SDKs and explained key concepts, objects and methods available in the SDKs to construct a Dagger pipeline.
Dagger executes your pipelines entirely as standard OCI containers. This means that pipelines can be tested and debugged locally, and that the same pipeline will run consistently on your local machine, a CI runner, a dedicated server, or any container hosting service. This portability is one of Dagger's key advantages, and this tutorial demonstrated it in action by using the same pipeline on the local host and on GitHub.
Use the API Key Concepts page and the Go, Node.js and Python SDK References to learn more about Dagger.
Appendix A: Create a GitHub repository with an example Express application
This tutorial assumes that you have a GitHub repository with a Node.js Web application. If not, follow the steps below to create a GitHub repository and commit an example Express application to it.
-

  Log in to GitHub using the GitHub CLI:

  ```shell
  gh auth login
  ```

-

  Create a directory for the Express application:

  ```shell
  mkdir myapp
  cd myapp
  ```

-

  Create a skeleton Express application:

  ```shell
  npx express-generator
  ```

-

  Make a minor modification to the application's index page:

  ```shell
  sed -i -e 's/Express/Dagger/g' routes/index.js
  ```

-

  Initialize a local Git repository for the application:

  ```shell
  git init
  ```

-

  Add a `.gitignore` file and commit the application code:

  ```shell
  echo node_modules >> .gitignore
  git add .
  git commit -a -m "Initial commit"
  ```

-

  Create a private repository in your GitHub account and push the changes to it:

  ```shell
  gh repo create myapp --push --source . --private
  ```