Configure a GCP Logging Project Sink with Pulumi

How do I configure a GCP logging project sink with Pulumi?

In this guide, we will configure a project-level logging sink on Google Cloud using Pulumi. A logging sink exports logs to a specified destination such as a Cloud Storage bucket, BigQuery dataset, or Pub/Sub topic. We will create a logging sink that exports logs to a BigQuery dataset.

Key Points

  • We will create a BigQuery dataset to store the logs.
  • We will create a logging sink that exports logs to the created dataset.
  • We will grant the necessary permissions to the logging sink.

import * as pulumi from "@pulumi/pulumi";
import * as gcp from "@pulumi/gcp";

// Create a BigQuery dataset
const dataset = new gcp.bigquery.Dataset("logDataset", {
    datasetId: "log_dataset",
    friendlyName: "Log Dataset",
    description: "A dataset to store logs",
    location: "US",
});

// Create a logging sink
const logSink = new gcp.logging.ProjectSink("logSink", {
    name: "log-sink",
    // Use gcp.config.project (the GCP project ID from the provider config);
    // pulumi.getProject() would return the Pulumi project name instead.
    destination: pulumi.interpolate`bigquery.googleapis.com/projects/${gcp.config.project}/datasets/${dataset.datasetId}`,
    filter: "severity>=ERROR",
    uniqueWriterIdentity: true,
});

// Grant the logging sink write access to the dataset
const datasetAccess = new gcp.bigquery.DatasetAccess("datasetAccess", {
    datasetId: dataset.datasetId,
    role: "WRITER",
    // writerIdentity has the form "serviceAccount:<email>", but
    // userByEmail expects the bare email, so strip the prefix.
    userByEmail: logSink.writerIdentity.apply(wi => wi.replace("serviceAccount:", "")),
});
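The writer identity returned by the sink is a string of the form `serviceAccount:<email>`, while BigQuery's `userByEmail` field expects just the email. A small helper (the name `writerIdentityToEmail` is illustrative, not part of any Pulumi API) makes the transformation explicit and easy to unit-test:

```typescript
// Strip the "serviceAccount:" prefix from a sink's writer identity,
// yielding the bare email that BigQuery's userByEmail field expects.
// (Hypothetical helper for illustration.)
function writerIdentityToEmail(writerIdentity: string): string {
    return writerIdentity.replace(/^serviceAccount:/, "");
}

// Example:
const email = writerIdentityToEmail(
    "serviceAccount:p123456789-0001@gcp-sa-logging.iam.gserviceaccount.com"
);
// email === "p123456789-0001@gcp-sa-logging.iam.gserviceaccount.com"
```

In the resource code above, the same logic runs inside `logSink.writerIdentity.apply(...)`, since the identity is only known after the sink is created.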

Summary

In this guide, we configured a project-level logging sink on Google Cloud using Pulumi. We created a BigQuery dataset, set up a logging sink to export logs to the dataset, and granted the necessary write permissions to the sink. This setup allows for efficient log management and analysis in Google Cloud.
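As a sanity check, the sink destination used above follows the pattern `bigquery.googleapis.com/projects/<project>/datasets/<dataset>`. A plain helper (hypothetical, for illustration only) shows the format independently of any Pulumi resources:

```typescript
// Build a BigQuery destination string for a Cloud Logging sink.
// Format: bigquery.googleapis.com/projects/<project>/datasets/<dataset>
// (Hypothetical helper for illustration; project and dataset IDs are examples.)
function bigQueryDestination(project: string, datasetId: string): string {
    return `bigquery.googleapis.com/projects/${project}/datasets/${datasetId}`;
}

console.log(bigQueryDestination("my-gcp-project", "log_dataset"));
// → bigquery.googleapis.com/projects/my-gcp-project/datasets/log_dataset
```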

Deploy this code

Want to deploy this code? Sign up for a free Pulumi account to deploy in a few clicks.
