Posts Tagged guest-post

Kenshoo Migrates to AWS with Pulumi

Danny Zalkind is the DevOps group manager for Kenshoo, an award-winning intelligent marketing platform. He brings 15 years of experience managing tech teams to his current role, where he's dedicated to enabling Kenshoo R&D to efficiently produce and serve software. You can find him on LinkedIn.

Kenshoo is an independent, global marketing platform for strategy, measurement, and best-of-breed activation across all of the world’s most influential digital channels. Kenshoo’s solution provides data-driven insights and optimization technology to help companies make informed decisions and scale performance across critical publishers.

Kenshoo has a highly technical engineering organization with more than 350 software engineers, data experts, and DevOps engineers.

Read more →

Automate Your Infrastructure with Automation API and C#

Note: Joshua Studt is a Solutions Architect at Financial Independence Group and a Pulumi Community member who contributed the C# package for Automation API. Currently available in public preview, Pulumi's Automation API enables you to provision your infrastructure programmatically using the Pulumi engine. Today, we are excited to announce C# support for Automation API, enabling .NET developers to automate infrastructure deployments, create complex orchestration workflows, build custom ops tooling, and build cloud frameworks.
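
Automation API exposes the Pulumi engine as a library rather than a CLI. As a rough sketch of the idea (shown here in TypeScript; the C# package follows the same pattern), an "inline program" can be defined, deployed, and torn down entirely from application code. The project, stack, and bucket names below are hypothetical.

```typescript
import { LocalWorkspace } from "@pulumi/pulumi/automation";
import * as aws from "@pulumi/aws";

async function deploy() {
    // Create (or select) a stack whose Pulumi program is an inline function.
    const stack = await LocalWorkspace.createOrSelectStack({
        projectName: "autoapi-demo",          // hypothetical project name
        stackName: "dev",
        program: async () => {
            const bucket = new aws.s3.Bucket("app-bucket");
            return { bucketName: bucket.bucket };
        },
    });

    // Configure and run an update, streaming engine output to the console.
    await stack.setConfig("aws:region", { value: "us-west-2" });
    const result = await stack.up({ onOutput: console.log });
    console.log(`bucket: ${result.outputs.bucketName.value}`);
}

deploy().catch(err => { console.error(err); process.exit(1); });
```

Because the deployment is just a function call, it can be embedded in a CLI tool, a CI job, or a long-running service.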

Read more →

Credijusto Manages Authentication with Auth0 and Pulumi

Guest author Fernando Carletti, Lead DevOps Engineer, writes about using the Pulumi Auth0 provider to manage resources at Credijusto.

Auth0 allows you to simplify your authentication process. The Auth0 provider lets you manage Auth0 resources such as Applications, Databases, Social Connections, and APIs. Here at Credijusto, we use it to manage authentication from the front end through all the APIs that serve each request, offloading the complexity of authentication to Auth0.

For this article, we will start a new Pulumi project in a fresh Auth0 account, fully configure it for a backend and a single-page application, and set up a connection to GitHub that allows your apps to authenticate with it using OAuth.
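
To give a taste of what that configuration looks like, here is a minimal, hypothetical sketch using the Pulumi Auth0 provider in TypeScript: a single-page application client, an API (resource server) for the backend, and a GitHub social connection. Resource names and identifiers are placeholders, and the Auth0 provider credentials (domain, client ID, client secret) are assumed to come from stack configuration or environment variables.

```typescript
import * as auth0 from "@pulumi/auth0";

// Single-page application that users log in to.
const spa = new auth0.Client("frontend", {
    appType: "spa",
    callbacks: ["https://app.example.com/callback"],   // placeholder URL
    allowedLogoutUrls: ["https://app.example.com"],
});

// Backend API that the SPA calls with an access token.
const api = new auth0.ResourceServer("backend-api", {
    identifier: "https://api.example.com",              // placeholder audience
    signingAlg: "RS256",
});

// GitHub social connection so your apps can authenticate users via OAuth.
const github = new auth0.Connection("github", {
    strategy: "github",
    enabledClients: [spa.id],
    options: {
        clientId: "GITHUB_OAUTH_CLIENT_ID",              // placeholder
        clientSecret: "GITHUB_OAUTH_CLIENT_SECRET",      // placeholder; store as a config secret
    },
});
```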

Read more →

How Pinpoint Manages Kubernetes Costs and Deployments

This guest blog was contributed by Andrew Kunzel and Michael Goode of Pinpoint. Andrew is the Director of Backend Engineering, and Michael is a Platform Operations Engineer.

At Pinpoint, Kubernetes is the most powerful tool in our arsenal. It allows us to deploy and rapidly scale our applications with speed and efficiency that continues to delight our customers. In recent years, managed services like AWS EKS have made it easier than ever to leverage the power of Kubernetes in even the smallest of organizations. Yet even with these new conveniences, managing all of this infrastructure can be a daunting task. Right out of the gate, we knew that we wanted to avoid the burden of maintaining repositories full of home-brewed deployment scripts and domain-specific languages like YAML.
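
The alternative Pulumi offers is to describe Kubernetes objects in a general-purpose language instead of YAML. The sketch below is not Pinpoint's actual code, just a minimal illustration of what a Deployment looks like as TypeScript with Pulumi's Kubernetes provider; the image and labels are placeholders.

```typescript
import * as k8s from "@pulumi/kubernetes";

const labels = { app: "web" };   // placeholder label set

// A Deployment expressed as code: typed, composable, and checked at compile time,
// rather than maintained as a standalone YAML manifest.
const web = new k8s.apps.v1.Deployment("web", {
    spec: {
        replicas: 2,
        selector: { matchLabels: labels },
        template: {
            metadata: { labels },
            spec: {
                containers: [{
                    name: "web",
                    image: "nginx:1.25",          // placeholder image
                    ports: [{ containerPort: 80 }],
                }],
            },
        },
    },
});

export const deploymentName = web.metadata.name;
```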

Read more →

Three Infrastructure as Code Blog Posts You Should Read

We are always excited when people join the Infrastructure as Code community and write about their experiences. Pulumi can be used for a range of common tasks, such as standardizing VPC builds, building vSphere virtual machines, or deploying your infrastructure from a CI/CD pipeline. Whether it's TypeScript, JavaScript, or Python, you can build your infrastructure with your language and tools of choice. Here are three new blog posts that show, with code examples, how to use Pulumi for these tasks.
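
To give a flavor of the first of those tasks, here is a minimal, hypothetical sketch of a standardized VPC using Pulumi Crosswalk for AWS (the awsx library) in TypeScript; the name and availability-zone count are placeholders.

```typescript
import * as awsx from "@pulumi/awsx";

// A VPC with sensible defaults: public and private subnets spread across
// availability zones, plus the route tables and gateways they need.
const vpc = new awsx.ec2.Vpc("standard-vpc", {
    numberOfAvailabilityZones: 2,    // placeholder; pick what your org standardizes on
});

// Export the IDs so downstream stacks (clusters, databases, ...) can reuse them.
export const vpcId = vpc.vpcId;
export const publicSubnetIds = vpc.publicSubnetIds;
export const privateSubnetIds = vpc.privateSubnetIds;
```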

Read more →

Mapbox IOT-as-code with Pulumi Crosswalk for AWS

Guest Author: Chris Toomey, Solution Architect Lead @ Mapbox

With 8 billion+ connected IoT devices and 2 billion GPS-equipped smartphones already online, logistics businesses are tracking assets at every step in the supply chain. At this scale and complexity, it is imperative to have a flexible way to ingest, process, and act upon this data, without sacrificing security or best practices.

To meet this need, Mapbox has created an Asset Tracking Solution that uses Pulumi's open-source JavaScript libraries (AWS, AWSX), available with multi-language support through Pulumi Crosswalk for AWS. Pulumi Crosswalk for AWS is an open-source framework that streamlines the creation, deployment, and management of AWS services, with built-in AWS best practices and minimal lines of code in common programming languages.

In this blog, we will show snippets of the JavaScript code that uses Pulumi to program AWS service APIs and build the Mapbox solution. To see the full architecture in action with a live bike race across America, please refer to this webinar recorded on June 13th, 2019, and the Mapbox asset tracking solution. Also see this blog post about the Race Across America, which was showcased live during the webinar.
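
As a simplified, hypothetical sketch of the ingest side of such a solution (not the actual Mapbox code), the classic awsx API Gateway component lets an HTTPS endpoint, its Lambda handler, and the table it writes to be declared in a few lines of TypeScript; the names and the request shape below are placeholders.

```typescript
import * as aws from "@pulumi/aws";
import * as awsx from "@pulumi/awsx";

// Table that stores the latest reported position for each tracked asset.
const positions = new aws.dynamodb.Table("positions", {
    attributes: [{ name: "deviceId", type: "S" }],
    hashKey: "deviceId",
    billingMode: "PAY_PER_REQUEST",
});

// HTTPS endpoint that devices POST their GPS readings to. The route's
// eventHandler becomes an AWS Lambda function automatically.
const api = new awsx.apigateway.API("asset-tracking", {
    routes: [{
        path: "/location",
        method: "POST",
        eventHandler: async (ev) => {
            const AWS = require("aws-sdk");
            const raw = ev.isBase64Encoded
                ? Buffer.from(ev.body ?? "", "base64").toString()
                : ev.body ?? "{}";
            const reading = JSON.parse(raw);   // e.g. { deviceId, lat, lng, ts } -- placeholder shape
            await new AWS.DynamoDB.DocumentClient()
                .put({ TableName: positions.name.get(), Item: reading })
                .promise();
            return { statusCode: 200, body: JSON.stringify({ ok: true }) };
        },
    }],
});

export const url = api.url;
```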

Read more →

Managing your MySQL databases with Pulumi

One of the most critical components of an application’s infrastructure is its database, and one of the most popular databases in use in the cloud today is MySQL.

Pulumi can already be used to create managed MySQL instances in a wide variety of clouds, including AWS, Azure, and GCP. In addition, Pulumi recently added support for managing the contents of those MySQL instances themselves: creating databases, managing permissions, and other common tasks.

In this post, we’ll walk through a quick tutorial of how to use this new Pulumi MySQL provider to manage existing and new MySQL databases.
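
As a preview of the shape of that tutorial, here is a hypothetical TypeScript sketch with the Pulumi MySQL provider: it creates a database, a user, and a grant tying them together. The names and password handling are placeholders, and the provider's connection settings (endpoint, admin credentials) are assumed to come from stack configuration.

```typescript
import * as mysql from "@pulumi/mysql";

// A database for the application.
const appDb = new mysql.Database("app-db", {
    name: "app",                       // placeholder database name
});

// A user that the application connects as.
const appUser = new mysql.User("app-user", {
    user: "app",
    host: "%",                         // allow connections from any host; tighten in practice
    plaintextPassword: "change-me",    // placeholder; use a config secret instead
});

// Grant the user the privileges it needs on the new database.
const appGrant = new mysql.Grant("app-grant", {
    user: appUser.user,
    host: appUser.host,
    database: appDb.name,
    privileges: ["SELECT", "INSERT", "UPDATE", "DELETE"],
});
```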

Read more →

Data science on demand: spinning up a Wallaroo cluster

This guest post is from Simon Zelazny of Wallaroo Labs. Find out how Wallaroo Labs powered their cluster provisioning with Pulumi for data science on demand.

Last month, we took a long-running pandas classifier and made it run faster by leveraging Wallaroo’s parallelization capabilities. This time around, we’d like to kick it up a notch and see if we can keep scaling out to meet higher demand. We’d also like to be as economical as possible: provision infrastructure as needed and de-provision it when we’re done processing.
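
The provisioning half of that goal (infrastructure when needed, gone when we're done) is a natural fit for Pulumi. As a hypothetical sketch, not the exact setup from the post, here is how a handful of worker instances might be declared in TypeScript; the instance count, AMI, and instance type are placeholders, and `pulumi destroy` tears everything down after the batch run.

```typescript
import * as aws from "@pulumi/aws";
import * as pulumi from "@pulumi/pulumi";

// How many Wallaroo workers to run for this batch; read from stack config
// so the cluster can be resized per run (e.g. `pulumi config set workerCount 4`).
const config = new pulumi.Config();
const workerCount = config.getNumber("workerCount") ?? 3;

const workers: aws.ec2.Instance[] = [];
for (let i = 0; i < workerCount; i++) {
    workers.push(new aws.ec2.Instance(`wallaroo-worker-${i}`, {
        ami: "ami-0123456789abcdef0",   // placeholder AMI with Wallaroo preinstalled
        instanceType: "c5.xlarge",      // placeholder instance type
        tags: { role: "wallaroo-worker" },
    }));
}

// Export the private IPs so the job runner can address each worker.
export const workerIps = workers.map(w => w.privateIp);
```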

If you don’t feel like reading the post linked above, here’s a short summary of the situation: there’s a batch job that you’re running every hour, on the hour. This job receives a CSV file and classifies each row of the file, using a Pandas-based algorithm. The run-time of the job is starting to near the one-hour mark, and there’s concern that the pipeline will break down once the input data grows past a particular point.

In the blog post, we show how to split up the input data into smaller dataframes, and distribute them among workers in an ad-hoc Wallaroo cluster, running on one physical machine. Parallelizing the work in this manner buys us a lot of time, and the batch job can continue processing increasing amounts of data.

Read more →