Creating a GCP Logging Project Sink

How do I create a GCP logging project sink?

To create a GCP logging project sink using Pulumi, define a gcp.logging.ProjectSink resource. The sink exports logs from the specified project to a destination such as a BigQuery dataset, a Cloud Storage bucket, or a Pub/Sub topic. Below is an example in TypeScript.

import * as pulumi from "@pulumi/pulumi";
import * as gcp from "@pulumi/gcp";

// Define the GCP project ID
const projectId = "my-gcp-project";

// Define the destination for the logs (e.g., a BigQuery dataset)
const dataset = new gcp.bigquery.Dataset("myDataset", {
    datasetId: "my_dataset",
    project: projectId,
    location: "US",
});

// Define the logging project sink
const logSink = new gcp.logging.ProjectSink("myLogSink", {
    name: "my-log-sink",
    project: projectId,
    destination: pulumi.interpolate`bigquery.googleapis.com/projects/${projectId}/datasets/${dataset.datasetId}`,
    filter: "logName:projects/my-gcp-project/logs/my-log",
    uniqueWriterIdentity: true,
});

// Export the sink name and destination
export const sinkName = logSink.name;
export const sinkDestination = logSink.destination;

In this example:

  • We define a BigQuery dataset as the destination for the logs.
  • We create a logging project sink that exports logs to the defined BigQuery dataset.
  • We use the pulumi.interpolate function to construct the destination URL for the BigQuery dataset.
  • We specify a filter to select specific logs to export.
  • We set uniqueWriterIdentity to true so the sink writes with its own dedicated service account; that account must be granted write access to the destination before logs will flow.
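Because the unique writer identity is a service account created for the sink, you typically also need to grant it write access on the destination. A minimal sketch, continuing from the example above (the resource name sinkWriterBinding is illustrative; for a BigQuery destination the role roles/bigquery.dataEditor is a common choice):

```typescript
// Grant the sink's writer identity permission to write to BigQuery.
// `writerIdentity` is an output of the form
// "serviceAccount:<generated-account>@gcp-sa-logging.iam.gserviceaccount.com".
const sinkWriterBinding = new gcp.projects.IAMMember("sinkWriterBinding", {
    project: projectId,
    role: "roles/bigquery.dataEditor",
    member: logSink.writerIdentity,
});
```

Without this grant, the sink is created successfully but log entries are dropped with permission errors.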

This program will create a GCP logging project sink that exports logs to a BigQuery dataset. The sink name and destination are exported as stack outputs.
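The same pattern works for the other destination types mentioned earlier; only the destination URI changes. For example, a sketch that routes logs to a Cloud Storage bucket instead (the bucket and sink names here are illustrative):

```typescript
// Create a bucket to receive exported logs
const logBucket = new gcp.storage.Bucket("logBucket", {
    project: projectId,
    location: "US",
});

// Cloud Storage destinations use "storage.googleapis.com/<bucket>";
// Pub/Sub destinations use "pubsub.googleapis.com/projects/<project>/topics/<topic>".
const storageLogSink = new gcp.logging.ProjectSink("storageLogSink", {
    project: projectId,
    destination: pulumi.interpolate`storage.googleapis.com/${logBucket.name}`,
    uniqueWriterIdentity: true,
});
```

As with the BigQuery example, the sink's writer identity needs write access on the bucket (for example, roles/storage.objectCreator).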

