Using GCP Artifact Registry with Cloud Functions V2 (TypeScript)

When deploying Cloud Functions on Google Cloud Platform (GCP) that rely on build artifacts such as Docker images, you can use Artifact Registry to store, manage, and secure those artifacts. This matters for Cloud Functions because the function's code is packaged into a container image, which is best kept in a managed registry like GCP's Artifact Registry.
First, you'll create an Artifact Registry repository, which is where your Docker images or other artifacts will reside. Cloud Functions themselves can be triggered by HTTP requests or by events from various GCP services, which makes them a good fit for event-driven workloads.
In this example, we create an Artifact Registry repository to store our Docker images and then deploy a Cloud Function using Cloud Functions V2 whose container image is built into that repository. The function is configured with an example Pub/Sub event trigger; it can just as easily be configured for HTTP invocation instead.
To get started with Pulumi in GCP, make sure you have the Pulumi CLI installed and you're authenticated with GCP. You should also have a Pulumi project initialized where you can add the following TypeScript code.
Let’s begin by creating the Pulumi program.
```typescript
import * as pulumi from '@pulumi/pulumi';
import * as gcp from '@pulumi/gcp';

// Create an Artifact Registry repository for our Docker images
const myRepository = new gcp.artifactregistry.Repository('myRepository', {
    location: 'us-central1', // Choose the appropriate location here
    format: 'DOCKER', // Other formats such as MAVEN or NPM are also supported, depending on your needs
    repositoryId: 'my-function-repo', // ID for the repository
    description: 'Repository for my Cloud Function Docker images',
});

// Now, create a Cloud Function (V2) whose container image is built by Cloud Build
// and stored in the Artifact Registry repository above
const myCloudFunction = new gcp.cloudfunctionsv2.Function('myCloudFunction', {
    project: gcp.config.project, // The project to deploy into, defaulting to the project from the gcp provider configuration
    location: 'us-central1', // Cloud Functions V2 are regional, so we pick the same location as the Artifact Registry
    description: 'My Function',
    buildConfig: {
        entryPoint: 'handler', // The name of the exported function in your source code
        runtime: 'nodejs16', // Set the runtime to match your language and version
        dockerRepository: myRepository.id, // Store the built container image in our Artifact Registry repository
        source: {
            // For the source, use repoSource or storageSource depending on where your code lives
            // (a Cloud Source Repository or a Cloud Storage bucket). Populate the specific fields
            // based on your source location and method.
        },
    },
    serviceConfig: {
        availableMemory: '256M', // Memory available to each function instance
        timeoutSeconds: 30, // Maximum request execution time (optional)
    },
    eventTrigger: {
        eventType: 'google.cloud.pubsub.topic.v1.messagePublished', // The type of event that triggers the function
        pubsubTopic: 'projects/my-project/topics/my-pubsub-topic', // Full resource name of the Pub/Sub topic that triggers the function
    },
});

// Export the URL of the Cloud Function
export const cloudFunctionUrl = myCloudFunction.serviceConfig.apply(sc => sc?.uri);
```
In the code above:
- We create an Artifact Registry repository using `gcp.artifactregistry.Repository`. The repository is set up to store Docker images and is located in `us-central1`.
- We then define a Cloud Function (V2) with `gcp.cloudfunctionsv2.Function`, specifying its `entryPoint`, `runtime`, and `source`. The `source` is where your function's source code resides; it is built into a Docker image that is stored in the repository referenced by `dockerRepository`.
- In `serviceConfig`, we define settings such as the memory available to the function and the timeout for function execution.
- `eventTrigger` is set up for an example Pub/Sub trigger (see the sketch after this list). Replace it with the trigger you actually need, or omit it entirely for an HTTP-invoked function.
- Finally, we export the Cloud Function URI that is created on deployment. You can use this URL to invoke the function once it's deployed.
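As a minimal sketch of the trigger wiring, assuming you manage the topic in the same Pulumi program (the topic name `my-pubsub-topic` is just an example), you could create it with `gcp.pubsub.Topic` and pass its fully qualified name to the function's `eventTrigger`:

```typescript
import * as gcp from '@pulumi/gcp';

// Example topic; any existing Pub/Sub topic in the same project works as well.
const myTopic = new gcp.pubsub.Topic('myTopic', {
    name: 'my-pubsub-topic',
});

// In the Function definition above, the eventTrigger could then reference the topic
// resource instead of a hard-coded string:
//
//   eventTrigger: {
//       eventType: 'google.cloud.pubsub.topic.v1.messagePublished',
//       pubsubTopic: myTopic.id, // resolves to projects/<project>/topics/my-pubsub-topic
//   },
```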
To run this program, execute `pulumi up` in your Pulumi project directory, and ensure that you've authenticated Pulumi with GCP correctly. Please note that in a real scenario you would need your source code in a Cloud Source Repository or a Cloud Storage bucket, and your Dockerfile properly set up, so that Cloud Build can build your function's container image. The details of setting up a Cloud Build trigger for the Artifact Registry are not covered in this example; a sketch of the Cloud Storage approach follows.
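As a sketch of the Cloud Storage option, assuming your function's code lives in a local `./function` directory (the bucket and archive names below are illustrative), you could upload it as a zip archive and point `buildConfig.source.storageSource` at the uploaded object:

```typescript
import * as pulumi from '@pulumi/pulumi';
import * as gcp from '@pulumi/gcp';

// Bucket to hold the zipped function source; the configuration here is just an example.
const sourceBucket = new gcp.storage.Bucket('sourceBucket', {
    location: 'US',
    uniformBucketLevelAccess: true,
});

// Upload the local ./function directory (containing index.js, package.json, etc.) as a zip archive.
const sourceArchive = new gcp.storage.BucketObject('sourceArchive', {
    bucket: sourceBucket.name,
    source: new pulumi.asset.FileArchive('./function'),
});

// The buildConfig.source block of the Function above would then point at the uploaded object:
//
//   source: {
//       storageSource: {
//           bucket: sourceBucket.name,
//           object: sourceArchive.name,
//       },
//   },
```

With this in place, Cloud Build packages the uploaded source into a container image on each deployment and pushes it to the Artifact Registry repository.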
Remember to replace placeholders such as `entryPoint`, `runtime`, `source`, and `pubsubTopic` with values that correspond to your specific use case.

Before running your Pulumi program, also ensure that you've granted the appropriate IAM permissions for Pulumi to act on your GCP resources. This might involve granting roles such as `roles/artifactregistry.writer` for the Artifact Registry and `roles/cloudfunctions.developer` for Cloud Functions to the service account you're using; a sketch of these grants follows. By following the structure laid out here, you can deploy your Cloud Function (V2) as infrastructure as code with Pulumi and manage artifacts effectively with GCP Artifact Registry.
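As a minimal sketch of those grants, assuming deployments run as a hypothetical service account `pulumi-deployer@my-project.iam.gserviceaccount.com` in a project named `my-project`, the bindings could themselves be managed with Pulumi via `gcp.projects.IAMMember`:

```typescript
import * as gcp from '@pulumi/gcp';

// Example service account email; substitute the account Pulumi deploys as.
const deployerEmail = 'pulumi-deployer@my-project.iam.gserviceaccount.com';

// Allow the account to push images to Artifact Registry repositories in the project.
const registryWriter = new gcp.projects.IAMMember('registryWriter', {
    project: 'my-project',
    role: 'roles/artifactregistry.writer',
    member: `serviceAccount:${deployerEmail}`,
});

// Allow the account to create and update Cloud Functions.
const functionsDeveloper = new gcp.projects.IAMMember('functionsDeveloper', {
    project: 'my-project',
    role: 'roles/cloudfunctions.developer',
    member: `serviceAccount:${deployerEmail}`,
});
```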