# Deploy DAGs from Google Cloud Storage to Astro
## Prerequisites
- A Google Cloud Storage (GCS) bucket.
- An Astro Deployment with DAG-only deploys enabled.
- A Deployment API token, Workspace API token, or Organization API token.
- An Astro project containing your project configurations.
## DAG deploy template
This CI/CD template can be used to deploy DAGs from a single GCS bucket to a single Astro Deployment. When you create or modify a DAG in the GCS bucket, a Cloud Function triggers and initializes an Astro project to deploy your DAGs using the Astro CLI.
To deploy any non-DAG code changes to Astro, you need to trigger a standard image deploy with your Astro project. When you do this, your Astro project must include the latest version of the DAGs from your GCS bucket. If your Astro project's `dags` folder isn't up to date with your GCS bucket when you trigger this deploy, your DAGs revert to the versions hosted in your Astro project.
1. Download the latest Astro CLI binary from GitHub releases, then rename the file to `astro_cli.tar.gz`. For example, to use Astro CLI version 1.40.0 in your template, download `astro_1.40.0_linux_amd64.tar.gz` and rename it to `astro_cli.tar.gz`.
2. In your GCS bucket, create the following new folders:

   - `dags`
   - `cli_binary`
3. Add `astro_cli.tar.gz` to `cli_binary`.
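Because GCS "folders" are just object name prefixes, you can create the folder and upload the archive in one step with `gsutil`; the bucket name below is a placeholder:

```shell
# Uploading to the cli_binary/ prefix implicitly creates the folder.
gsutil cp astro_cli.tar.gz gs://your-astro-bucket/cli_binary/astro_cli.tar.gz
```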
4. Create a Cloud Run Function with the Python 3.12 runtime in the same region as your storage bucket. Use the inline editor to create your function.
5. Create a Cloud Storage trigger with the following configuration:

   - Event provider: Select Cloud Storage.
   - Event: Select `google.cloud.storage.object.v1.finalized`.
   - Bucket: Select your storage bucket.
   - Service account: Ensure the service account you use has the Cloud Run Invoker role.
6. Choose the runtime service account on the Security tab of the Cloud Run Functions settings. Ensure that the service account has Storage Object Viewer (`storage.objects.list`) access to the Google Cloud Storage bucket.
7. Under the Containers settings, set the following environment variables for your Cloud Function:

   - `ASTRO_HOME`: `/tmp`
   - `ASTRO_API_TOKEN`: The value of your Workspace or Organization API token.
   - `ASTRO_DEPLOYMENT_ID`: Your Deployment ID.
   - `BUCKET`: The name of your GCS bucket.

   For production Deployments, ensure that you store the value of your API token in a secrets backend. See Secret Manager overview.
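If you prefer the command line to the console, the function, trigger, and environment variables above can be approximated with a single `gcloud` command; the function name, region, and bucket below are placeholders, and `--trigger-bucket` corresponds to the `finalized` event:

```shell
# Deploys a 2nd-gen function with a Cloud Storage trigger on object creation.
# In production, supply ASTRO_API_TOKEN from Secret Manager (for example with
# --set-secrets) rather than as a plain environment variable.
gcloud functions deploy astro-deploy-fn \
  --gen2 \
  --runtime=python312 \
  --region=us-central1 \
  --source=. \
  --entry-point=astro_deploy \
  --trigger-bucket=your-astro-bucket \
  --set-env-vars=ASTRO_HOME=/tmp,ASTRO_DEPLOYMENT_ID=your-deployment-id,BUCKET=your-astro-bucket
```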
8. When editing the function source, change the function entry point to `astro_deploy`.
9. Add the following code to `main.py`:
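The template's `main.py` code isn't reproduced in this excerpt. As a rough, untested sketch of what such a function might look like, assuming the `dags/` and `cli_binary/` bucket layout and the environment variables from the steps above (helper names are illustrative, and the 1st-gen background-function signature is used here; a CloudEvent function would instead use the `functions-framework` decorator):

```python
import os
import subprocess
import tarfile


def is_dag_object(name: str) -> bool:
    """Return True if a changed object is a DAG file under the dags/ prefix."""
    return name.startswith("dags/") and name.endswith(".py")


def astro_deploy(event, context):
    """Triggered by a change to an object in the GCS bucket."""
    # Deferred import so the module can be loaded without GCP dependencies.
    from google.cloud import storage

    # Ignore changes that aren't DAG files (for example, the CLI archive itself).
    if not is_dag_object(event["name"]):
        return

    bucket_name = os.environ["BUCKET"]
    deployment_id = os.environ["ASTRO_DEPLOYMENT_ID"]

    workdir = "/tmp/astro_project"
    os.makedirs(workdir, exist_ok=True)

    client = storage.Client()
    bucket = client.bucket(bucket_name)

    # Download and extract the Astro CLI binary uploaded to cli_binary/.
    archive = "/tmp/astro_cli.tar.gz"
    bucket.blob("cli_binary/astro_cli.tar.gz").download_to_filename(archive)
    with tarfile.open(archive) as tar:
        tar.extractall("/tmp")
    astro_bin = "/tmp/astro"

    # Initialize a throwaway Astro project, then mirror the bucket's dags/ folder.
    subprocess.run([astro_bin, "dev", "init"], cwd=workdir, input=b"y\n", check=True)
    for blob in client.list_blobs(bucket_name, prefix="dags/"):
        if blob.name.endswith("/"):
            continue
        local_path = os.path.join(workdir, blob.name)
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        blob.download_to_filename(local_path)

    # DAG-only deploy to the target Deployment; ASTRO_API_TOKEN authenticates.
    subprocess.run([astro_bin, "deploy", deployment_id, "--dags"],
                   cwd=workdir, check=True)
```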
10. Add the dependency `google-cloud-storage` to the `requirements.txt` file for your Cloud Function. See Specifying Dependencies in Python.
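A minimal `requirements.txt` for this function could look like the following (the version pin is illustrative):

```text
google-cloud-storage>=2.10.0
```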
11. (Optional) If you want the function to trigger when DAGs are deleted as well as created or modified, create another Cloud Storage trigger with the following configuration:

    - Event provider: Select Cloud Storage.
    - Event: Select `google.cloud.storage.object.v1.deleted`.
    - Bucket: Select your storage bucket.
    - Service account: Ensure the service account you use has the Cloud Run Invoker role.
12. If you haven't already, deploy your complete Astro project to your Deployment. See Deploy code.
13. Add your DAGs to the `dags` folder in your storage bucket.
14. In the Astro UI, select a Workspace, click Deployments, and then select your Deployment. Confirm that your deploy worked by checking the Deployment's DAG bundle version. The version's name should include the time that you added the DAGs to your GCS bucket.