
How do I save EC2 app logs to S3?

To save EC2 application logs to an S3 bucket using Pulumi, you will need to set up an S3 bucket and configure your EC2 instances to send their logs to CloudWatch Logs. CloudWatch Logs cannot write continuously to S3 on its own, so the logs are streamed into the bucket through a Kinesis Data Firehose delivery stream, connected to the log group by a subscription filter.

Here is a step-by-step guide to achieve this:

  1. Create an S3 Bucket: This bucket will be used to store your logs.
  2. Create a CloudWatch Log Group: This log group will collect logs from your EC2 instances.
  3. Create a CloudWatch Log Stream: This log stream will be associated with the log group.
  4. Create a Kinesis Data Firehose Delivery Stream: This stream delivers log records to the S3 bucket.
  5. Create IAM Roles and Policies: These allow Firehose to write to the S3 bucket and CloudWatch Logs to put records into Firehose.
  6. Create a Subscription Filter: This streams events from the log group into the delivery stream.
  7. Configure EC2 Instances: Configure your EC2 instances to send logs to CloudWatch.

Below is the Pulumi program written in TypeScript:

import * as pulumi from "@pulumi/pulumi";
import * as aws from "@pulumi/aws";

// Create an S3 bucket to store logs
const logBucket = new aws.s3.Bucket("logBucket", {
    bucket: "my-ec2-app-logs",
    acl: "private",
});

// Create a CloudWatch log group
const logGroup = new aws.cloudwatch.LogGroup("logGroup", {
    name: "/aws/ec2/app-logs",
    retentionInDays: 7, // Retain logs for 7 days
});

// Create a CloudWatch log stream
const logStream = new aws.cloudwatch.LogStream("logStream", {
    logGroupName: logGroup.name,
    name: "app-log-stream",
});

// Create an IAM role that Kinesis Data Firehose assumes to write to S3.
// CloudWatch Logs cannot deliver continuously to S3 directly, so Firehose
// sits between the log group and the bucket.
const firehoseRole = new aws.iam.Role("firehoseRole", {
    assumeRolePolicy: aws.iam.assumeRolePolicyForPrincipal({
        Service: "firehose.amazonaws.com",
    }),
});

// Attach a policy to the role that allows writing into the S3 bucket
const firehosePolicy = new aws.iam.RolePolicy("firehosePolicy", {
    role: firehoseRole.id,
    policy: logBucket.arn.apply(arn => JSON.stringify({
        Version: "2012-10-17",
        Statement: [
            {
                Effect: "Allow",
                Action: ["s3:PutObject", "s3:GetBucketLocation", "s3:ListBucket", "s3:AbortMultipartUpload"],
                Resource: [arn, `${arn}/*`],
            },
        ],
    })),
});

// Create a Firehose delivery stream that delivers records to the S3 bucket
const deliveryStream = new aws.kinesis.FirehoseDeliveryStream("logDeliveryStream", {
    destination: "extended_s3",
    extendedS3Configuration: {
        roleArn: firehoseRole.arn,
        bucketArn: logBucket.arn,
    },
});

// Create an IAM role that CloudWatch Logs assumes to put records into Firehose
const logsRole = new aws.iam.Role("logsRole", {
    assumeRolePolicy: aws.iam.assumeRolePolicyForPrincipal({
        Service: "logs.amazonaws.com",
    }),
});

const logsPolicy = new aws.iam.RolePolicy("logsPolicy", {
    role: logsRole.id,
    policy: deliveryStream.arn.apply(arn => JSON.stringify({
        Version: "2012-10-17",
        Statement: [
            {
                Effect: "Allow",
                Action: ["firehose:PutRecord", "firehose:PutRecordBatch"],
                Resource: arn,
            },
        ],
    })),
});

// Subscribe the log group to the delivery stream; an empty filter pattern
// matches every log event
const logSubscription = new aws.cloudwatch.LogSubscriptionFilter("logSubscription", {
    logGroup: logGroup.name,
    filterPattern: "",
    destinationArn: deliveryStream.arn,
    roleArn: logsRole.arn,
}, { dependsOn: [logsPolicy] });

// Export the name of the bucket and log group
export const bucketName = logBucket.bucket;
export const logGroupName = logGroup.name;

Explanation

  1. S3 Bucket: We create an S3 bucket named my-ec2-app-logs to store the logs. The bucket is private for security reasons.
  2. CloudWatch Log Group: We create a CloudWatch log group named /aws/ec2/app-logs to collect logs from your EC2 instances. The logs are retained for 7 days.
  3. CloudWatch Log Stream: We create a log stream named app-log-stream within the log group to organize the logs.
  4. Firehose Delivery Stream: We create a Kinesis Data Firehose delivery stream with the S3 bucket as its destination. Firehose batches the incoming log records and writes them to the bucket.
  5. IAM Roles and Policies: We create one role that Firehose assumes to write objects into the bucket, and one role that CloudWatch Logs assumes to put records into the delivery stream. Each policy document is built inside an apply callback once the relevant ARN is known.
  6. Subscription Filter: We create a subscription filter with an empty filter pattern, so every event in the log group is streamed to Firehose and ends up in S3.
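Inside the apply callbacks, the policy document is just a JSON string assembled from the resource's ARN. That construction can be sketched in isolation (buildS3WritePolicy is a hypothetical helper for illustration, not part of the Pulumi API):

```typescript
// Build an IAM policy document granting s3:PutObject on every object
// in the given bucket, as a JSON string.
function buildS3WritePolicy(bucketArn: string): string {
    return JSON.stringify({
        Version: "2012-10-17",
        Statement: [
            {
                Effect: "Allow",
                Action: "s3:PutObject",
                // "<bucket ARN>/*" matches the objects, not the bucket itself
                Resource: `${bucketArn}/*`,
            },
        ],
    });
}
```

In the Pulumi program the bucket ARN is only known after deployment, which is why the same construction runs inside `.apply(...)` rather than as a plain function call.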

Next Steps

  1. Configure EC2 Instances: You need to configure your EC2 instances to send logs to CloudWatch. This can be done by installing the CloudWatch agent on your instances and configuring it to send the desired logs to the log group created above.
  2. Verify Logs: Once everything is set up, you can verify that the logs are being sent to the S3 bucket by checking the bucket’s contents.
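For step 1, a minimal CloudWatch agent configuration that ships an application log file to the log group created above might look like the following sketch (the file path /var/log/app.log is an assumed example; substitute your application's actual log file):

```json
{
  "logs": {
    "logs_collected": {
      "files": {
        "collect_list": [
          {
            "file_path": "/var/log/app.log",
            "log_group_name": "/aws/ec2/app-logs",
            "log_stream_name": "app-log-stream"
          }
        ]
      }
    }
  }
}
```

The instance profile attached to the EC2 instance needs permission to publish logs (the AWS-managed CloudWatchAgentServerPolicy covers this), and the agent must be started after the configuration file is in place. For step 2, delivered objects can be listed with the AWS CLI, e.g. aws s3 ls s3://my-ec2-app-logs/ --recursive.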

This setup ensures that your EC2 application logs are stored in an S3 bucket, providing a durable and scalable storage solution for your logs.

Deploy this code

Want to deploy this code? Sign up for a free Pulumi account to deploy in a few clicks.
