1. Capturing EC2 Instance Network Flows for Security Analysis


    To capture EC2 instance network flows for security analysis, we will use VPC Flow Logs. VPC Flow Logs let you capture information about the IP traffic going to and from network interfaces in your Virtual Private Cloud (VPC). You can use this feature to monitor the traffic reaching your instances, diagnose overly restrictive security group rules, monitor traffic reaching an Elastic Load Balancer, and more.

    In order to set this up using Pulumi and AWS, you will need to create a Flow Log for your VPC, subnet, or network interface. The data captured by Flow Logs can be stored in Amazon S3 or CloudWatch Logs.
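
    If you prefer the Amazon S3 destination, the flow log can point at a bucket instead of a log group. Below is a minimal, hedged sketch of that variant; the bucket and resource names are illustrative and are not part of the program later in this guide, which uses CloudWatch Logs:

    import pulumi
    import pulumi_aws as aws

    # Minimal sketch: deliver VPC flow logs to an S3 bucket. Unlike the CloudWatch
    # Logs destination, the S3 destination does not use an IAM delivery role.
    flow_log_bucket = aws.s3.Bucket("flow-log-bucket")

    vpc = aws.ec2.get_vpc(default=True)

    s3_flow_log = aws.ec2.FlowLog(
        "s3-flow-log",
        log_destination=flow_log_bucket.arn,
        log_destination_type="s3",
        traffic_type="ALL",
        vpc_id=vpc.id,
    )

    pulumi.export("s3_flow_log_id", s3_flow_log.id)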

    Below is a Pulumi program written in Python that demonstrates how you might set up Flow Logs for an EC2 instance to capture the network flows:

    import json

    import pulumi
    import pulumi_aws as aws

    # First, you need an existing VPC and an EC2 instance whose network flows you
    # want to capture. Here's how to reference those existing resources with Pulumi.
    vpc = aws.ec2.get_vpc(default=True)

    # The subnet and default security group are looked up here for reference only;
    # the flow log below only needs the VPC ID.
    subnet = aws.ec2.get_subnet(id="subnet-123456")  # Replace with your actual subnet ID
    security_group = aws.ec2.get_security_group(vpc_id=vpc.id, name="default")

    # Assuming you already have an EC2 instance running that you want to monitor.
    # Replace 'i-0123456789abcdef0' with your actual instance ID. The flow log below
    # is attached at the VPC level, so this instance's traffic is included.
    instance_id = "i-0123456789abcdef0"

    # Create an IAM policy that allows publishing flow logs to a CloudWatch Logs group.
    flow_log_policy = aws.iam.Policy(
        "flow-log-policy",
        policy=json.dumps({
            "Version": "2012-10-17",
            "Statement": [{
                "Action": [
                    "logs:CreateLogGroup",
                    "logs:CreateLogStream",
                    "logs:PutLogEvents",
                    "logs:DescribeLogGroups",
                    "logs:DescribeLogStreams",
                ],
                "Resource": "arn:aws:logs:*:*:*",
                "Effect": "Allow",
            }],
        }),
    )

    # Create an IAM role that the VPC Flow Logs service is allowed to assume.
    flow_log_role = aws.iam.Role(
        "flow-log-role",
        assume_role_policy=json.dumps({
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "sts:AssumeRole",
                "Effect": "Allow",
                "Principal": {
                    "Service": "vpc-flow-logs.amazonaws.com",
                },
            }],
        }),
    )

    # Attach the policy to the role.
    aws.iam.RolePolicyAttachment(
        "flow-log-role-policy-attachment",
        policy_arn=flow_log_policy.arn,
        role=flow_log_role.name,
    )

    # Create a CloudWatch Logs group to store the flow logs.
    flow_log_group = aws.cloudwatch.LogGroup("flow-log-group")

    # Create a flow log to monitor network traffic, attaching it to the VPC.
    vpc_flow_log = aws.ec2.FlowLog(
        "vpc-flow-log",
        iam_role_arn=flow_log_role.arn,
        log_destination=flow_log_group.arn,
        traffic_type="ALL",
        vpc_id=vpc.id,
    )

    pulumi.export("flow_log_id", vpc_flow_log.id)
    pulumi.export("flow_log_group_name", flow_log_group.name)

    In the above program, we perform the following steps:

    1. Look up the default VPC and a specific subnet. The flow log below is attached to the VPC, so traffic across the whole VPC is monitored.
    2. Reference the ID of the running EC2 instance you want to monitor (a placeholder in the code).
    3. We create an IAM Policy and Role for the VPC Flow Logs service to publish logs to CloudWatch.
    4. We then attach the policy to the role.
    5. Next, we create a CloudWatch Logs Group. This is where all the captured logs will be stored.
    6. Finally, we create a Flow Log resource and attach it to the VPC, specifying that it should log all traffic ("ALL") and deliver the logs to the CloudWatch Logs group. The iam_role_arn is set to the ARN of the role we created earlier, which grants the permissions needed to push logs to CloudWatch. If you only want to capture traffic for a single instance rather than the whole VPC, see the sketch after this list.
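
    To capture traffic for just one instance, attach the flow log to the instance's primary network interface instead of the VPC. The sketch below is a hedged example that extends the program above (it reuses flow_log_role and flow_log_group); eni-0123456789abcdef0 is a placeholder for your instance's primary ENI ID:

    # Minimal sketch: capture flows for a single instance's primary network interface
    # instead of the whole VPC. Replace 'eni-0123456789abcdef0' with the instance's
    # primary ENI ID (shown in the EC2 console under the instance's networking details).
    instance_flow_log = aws.ec2.FlowLog(
        "instance-flow-log",
        iam_role_arn=flow_log_role.arn,
        log_destination=flow_log_group.arn,
        traffic_type="ALL",
        eni_id="eni-0123456789abcdef0",
    )

    pulumi.export("instance_flow_log_id", instance_flow_log.id)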

    You need to replace placeholders such as subnet-123456, i-0123456789abcdef0, and eni-0123456789abcdef0 with your actual subnet ID, EC2 instance ID, and network interface ID, respectively.

    Exporting flow_log_id and flow_log_group_name makes it easy to locate the Flow Log and its CloudWatch Logs group later, either from the AWS console/CLI or via Pulumi's stack outputs (for example, pulumi stack output flow_log_group_name).

    Once this is set up and flow logs are being captured, you can analyze them with tools such as Amazon Athena, an Elasticsearch/OpenSearch cluster, or custom scripts to gain security insights from the data. A short example of the custom-script approach follows.
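
    As a hedged illustration of the custom-script route, the sketch below uses boto3 (assumed to be installed and configured with AWS credentials) to scan the last hour of flow log records for REJECTed connections, i.e. traffic blocked by security groups or network ACLs. The log group name and the time window are placeholder assumptions you would adjust:

    import time

    import boto3

    # Placeholder: use your actual log group name, e.g. the value printed by
    # `pulumi stack output flow_log_group_name`.
    LOG_GROUP_NAME = "flow-log-group-1234567"

    logs = boto3.client("logs")

    # The default flow log record format is space-delimited:
    #   version account-id interface-id srcaddr dstaddr srcport dstport protocol
    #   packets bytes start end action log-status
    # This CloudWatch Logs filter pattern matches records whose action is REJECT.
    filter_pattern = (
        '[version, account, eni, source, destination, srcport, destport, '
        'protocol, packets, bytes, windowstart, windowend, action="REJECT", flowlogstatus]'
    )

    now_ms = int(time.time() * 1000)
    paginator = logs.get_paginator("filter_log_events")
    for page in paginator.paginate(
        logGroupName=LOG_GROUP_NAME,
        startTime=now_ms - 60 * 60 * 1000,  # last hour
        filterPattern=filter_pattern,
    ):
        for event in page["events"]:
            fields = event["message"].split()
            srcaddr, dstaddr, dstport = fields[3], fields[4], fields[6]
            print(f"REJECT {srcaddr} -> {dstaddr}:{dstport}")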