FinOps With Pulumi

Matt Small, Richard Shade

What is FinOps?

The FinOps Foundation eloquently defines FinOps as “an evolving cloud financial management discipline and cultural practice that enables organizations to get maximum business value by helping engineering, finance, technology and business teams to collaborate on data-driven spending decisions.” Simply put, FinOps is the continuous effort to control cloud spend.

Just as organizations have folded operations-focused best practices into their software development cycles and considered where security best practices fit along the way, financial best practices can also be codified by developers writing cloud programs.
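As a flavor of what codifying a financial guardrail might look like (this is a minimal sketch, not the example from the post; the limit amount and subscriber address are placeholders), a Pulumi TypeScript program can declare an AWS Budgets alert alongside the infrastructure it watches:

```typescript
import * as aws from "@pulumi/aws";

// A monthly cost budget with an alert when actual spend crosses 80% of the
// limit. The dollar amount and email address below are illustrative values.
const monthlyBudget = new aws.budgets.Budget("team-monthly-budget", {
    budgetType: "COST",
    timeUnit: "MONTHLY",
    limitAmount: "1000",
    limitUnit: "USD",
    notifications: [{
        comparisonOperator: "GREATER_THAN",
        notificationType: "ACTUAL",
        threshold: 80,
        thresholdType: "PERCENTAGE",
        subscriberEmailAddresses: ["finops@example.com"],
    }],
});

export const budgetName = monthlyBudget.name;
```

Because the budget lives in the same program as the resources it guards, the spending policy is versioned, reviewed, and deployed like any other piece of infrastructure.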

Read more →

Organizing AWS Accounts With Pulumi

Praneet Loke

In an enterprise organization, an IT self-service “vending machine” allows employees to quickly and easily request and receive access to pre-approved cloud resources. Behind the scenes, Pulumi programs may orchestrate any of the requisite resources. We will look at an example of using Pulumi to create an AWS child account within an AWS Organization.
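As a rough sketch of the idea (not the exact program from the post; the root ID, email, and role name are placeholders), creating a member account under an organizational unit looks roughly like this in TypeScript:

```typescript
import * as aws from "@pulumi/aws";

// An organizational unit to hold vended sandbox accounts. The parentId is a
// placeholder for your organization's root ID.
const sandboxOu = new aws.organizations.OrganizationalUnit("sandbox-ou", {
    name: "sandbox",
    parentId: "r-examplerootid",
});

// The child (member) account itself. The email must be globally unique, and
// the role name is what administrators assume to manage the new account.
const childAccount = new aws.organizations.Account("team-sandbox", {
    name: "team-sandbox",
    email: "aws+team-sandbox@example.com",
    parentId: sandboxOu.id,
    roleName: "OrganizationAccountAccessRole",
});

export const accountId = childAccount.id;
```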

Read more →

Automating Pulumi Import with Manually Created Resources

Josh Kodroff

A few weeks ago, I was speaking with a consultant at one of the big firms who asked me how he could introduce Pulumi into a client’s organization when that client had created many infrastructure resources manually through the AWS console and was running production workloads on those resources.

Introducing modern cloud infrastructure tooling and automation is relatively simple (or at least more straightforward) when organizations decide to adopt IaC from the start of their cloud journey, but what about organizations that have gone far enough down the path of manually created cloud infrastructure to see the perils of that approach? Many teams come to this realization only after they’ve deployed too many production workloads to start over from scratch. If your organization is evaluating Pulumi as an IaC solution, it’s worth bringing these resources under management because of the low effort and high value of having a single pane of glass for all of your resources.
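For context, the Pulumi CLI can bulk-import existing resources from a JSON file and generate matching source code as it goes. A hypothetical import file (the resource names and IDs below are placeholders) looks like this:

```json
{
    "resources": [
        {
            "type": "aws:s3/bucket:Bucket",
            "name": "legacy-assets",
            "id": "legacy-assets-bucket"
        },
        {
            "type": "aws:ec2/securityGroup:SecurityGroup",
            "name": "legacy-web-sg",
            "id": "sg-0123456789abcdef0"
        }
    ]
}
```

Running the import brings every listed resource into the stack's state in one pass:

```bash
pulumi import --file resources.json
```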

Read more →

Achieving Amazing Performance in the Pulumi CLI

Robbie McKinstry

This is the first post in a series about performance optimizations we’ve made to the Pulumi CLI. Over the last six months at Pulumi, the Platform Team has been working on a project we call “Amazing Performance.” Amazing Performance is a new initiative to improve the throughput and latency of the Pulumi CLI not only for power users but for everyone. By the end of June 2022, we had assembled a list of issues containing both high-value improvements requiring a sizable investment and low-hanging fruit for quick wins. The full list, including the items we have yet to tackle, is contained in a tracking issue on GitHub. This blog series will cover the highlights.

Read more →

The Easier Way to Create Pulumi Providers in Go

Kyle Dixler

We are excited to announce that we’ve updated the Pulumi Provider Boilerplate to make custom provider implementation easier!

This major update brings a wealth of usability improvements to the Pulumi Provider Boilerplate by incorporating our brand new Pulumi Go Provider SDK.

Read more →

How To Build An ETL Pipeline With Amazon Redshift & AWS Glue

Christian Nunciato

In our last episode, Deploying a Data Warehouse with Pulumi and Amazon Redshift, we covered using Pulumi to load unstructured data from Amazon S3 into an Amazon Redshift cluster. That went well, but you may recall that at the end of that post, we were left with a few unanswered questions:

  • How do we avoid importing and processing the same data twice?
  • How can we transform the data during the ingestion process?
  • What are our options for loading data automatically — for example, on a regular schedule?

These are the kinds of questions you’ll almost always have when setting up a data-processing (or ETL) pipeline — and every platform tends to answer them a little differently.
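To give a taste of the scheduling piece, here is a hedged Pulumi TypeScript sketch of a Glue job run on an hourly cron trigger; the script location, IAM role, and names are placeholders rather than the post's actual resources:

```typescript
import * as aws from "@pulumi/aws";

// A Glue job pointing at an ETL script in S3. The role ARN and script path
// stand in for resources defined elsewhere in the program.
const etlJob = new aws.glue.Job("load-events", {
    roleArn: "arn:aws:iam::123456789012:role/glue-etl-role",
    glueVersion: "3.0",
    command: {
        name: "glueetl",
        scriptLocation: "s3://my-etl-scripts/load-events.py",
    },
});

// Run the job every hour so newly landed files are picked up automatically.
const hourlyTrigger = new aws.glue.Trigger("load-events-hourly", {
    type: "SCHEDULED",
    schedule: "cron(0 * * * ? *)",
    actions: [{ jobName: etlJob.name }],
});

export const triggerName = hourlyTrigger.name;
```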

Read more →

How to Create and Share a Pulumi Template

Christian Nunciato

Last month, we released our first set of architecture templates — configurable Pulumi projects designed to make it easy to bootstrap new stacks for common cloud architectures like static websites, containers, virtual machines, and Kubernetes clusters. Architecture templates are a great way to get a new project up and running quickly, and they’ve already grown quite popular with our users, several of whom have asked whether it’s possible to create templates of their own.
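For a sense of what a template looks like under the hood, a minimal Pulumi.yaml with template metadata might look like the sketch below; the project name, description, and config keys are illustrative, not taken from the post:

```yaml
name: static-website-starter
runtime: typescript
description: A starter project for a static website on AWS
template:
  description: A configurable static website served from S3
  config:
    aws:region:
      description: The AWS region to deploy into
      default: us-west-2
    indexDocument:
      description: The file to serve at the site root
      default: index.html
```

A template published to a Git repository can then be consumed directly (the URL here is a placeholder):

```bash
pulumi new https://github.com/acme/pulumi-templates/tree/main/static-website-starter
```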

Read more →

Transferring Stacks in the Pulumi Service Just Got Easier

Meagan Cojocar

Exactly three years ago, we added support in the Pulumi Service to transfer stacks from an Individual account to a Pulumi organization and between Pulumi organizations. We heard from customers that they love this feature but find it both hard to discover and tedious when moving a large workload from one organization to another or from Individual accounts to organizations. We are excited to announce bulk stack transfer to address this feedback, along with a new organization setup wizard to make the feature easier to discover.

Read more →