Duration: 1 hour
Standing up production-ready infrastructure to store and analyze your real-time data feeds used to be a major undertaking. Now, with a few lines of code you can spin up all the resources you need in the cloud. In this session, we’ll introduce you to Apache Kafka, an open-source distributed event streaming platform capable of handling trillions of events a day. We’ll show you how to quickly provision and connect Kafka clusters using Confluent Cloud, a fully managed, cloud-native platform built by the original creators of Kafka. With the Pulumi Confluent provider, you’ll learn how to easily provision Kafka and a complete data streaming platform using your favorite programming languages.
- Josh Kodroff, Solutions Architect, Pulumi
- Spencer Shumway, Senior Product Manager, Confluent
Join us to learn:
- How to provision a data streaming platform using modern programming languages
- How to secure your platform with Role-Based Access Control (RBAC)
- How to use connectors and other common features
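As a taste of what the session covers, here is a minimal sketch of provisioning a Kafka cluster with the Pulumi Confluent Cloud provider (`@pulumi/confluentcloud`). The environment and cluster names, the cloud, and the region are illustrative choices, not values from this session.

```typescript
import * as confluentcloud from "@pulumi/confluentcloud";

// A Confluent Cloud environment to hold the cluster (name is illustrative).
const env = new confluentcloud.Environment("demo-env", {
    displayName: "demo",
});

// A basic, single-zone Kafka cluster on AWS; cloud and region are example values.
const cluster = new confluentcloud.KafkaCluster("demo-cluster", {
    displayName: "demo-cluster",
    availability: "SINGLE_ZONE",
    cloud: "AWS",
    region: "us-east-2",
    basic: {},
    environment: { id: env.id },
});

// Export the bootstrap endpoint that Kafka clients will connect to.
export const bootstrapEndpoint = cluster.bootstrapEndpoint;
```

Running `pulumi up` against this program (with Confluent Cloud credentials configured) provisions the environment and cluster and prints the bootstrap endpoint as a stack output.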
Get Started with Pulumi
- Create an AWS S3 Bucket then modify the bucket to host a static website.
- Create an Azure Resource Group and Storage Account, then export the storage account’s connection string.
- Create a Google Cloud Storage Bucket and apply labels to that bucket.
- Create a Kubernetes NGINX deployment and add a config value for minikube deployments.
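The first getting-started item above, an S3 bucket configured as a static website, can be sketched like this with `@pulumi/aws`. The bucket name and index document are illustrative.

```typescript
import * as aws from "@pulumi/aws";

// Create the bucket and configure it to serve a static website.
const bucket = new aws.s3.Bucket("site-bucket", {
    website: { indexDocument: "index.html" },
});

// Export the website endpoint so it appears as a stack output.
export const websiteUrl = bucket.websiteEndpoint;
```

To "modify the bucket" as the tutorial describes, you would add the `website` argument to an existing bucket and run `pulumi up` again; Pulumi computes and applies the difference.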
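The Azure item can be sketched with the `@pulumi/azure-native` provider: create a resource group and storage account, then assemble and export the connection string from the account keys. Resource names and the SKU are illustrative.

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as resources from "@pulumi/azure-native/resources";
import * as storage from "@pulumi/azure-native/storage";

const rg = new resources.ResourceGroup("rg");

const account = new storage.StorageAccount("sa", {
    resourceGroupName: rg.name,
    sku: { name: "Standard_LRS" },
    kind: "StorageV2",
});

// Look up the account keys once the account exists.
const keys = storage.listStorageAccountKeysOutput({
    resourceGroupName: rg.name,
    accountName: account.name,
});

// Build the connection string and mark it secret so it is encrypted in state.
export const connectionString = pulumi.secret(
    pulumi.interpolate`DefaultEndpointsProtocol=https;AccountName=${account.name};AccountKey=${keys.keys[0].value};EndpointSuffix=core.windows.net`,
);
```

Wrapping the output in `pulumi.secret` keeps the key out of plain-text CLI output and state.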
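The Google Cloud item is the shortest of the four: a storage bucket with labels attached, using `@pulumi/gcp`. The label keys and values here are illustrative.

```typescript
import * as gcp from "@pulumi/gcp";

// A Cloud Storage bucket with labels applied at creation time.
const bucket = new gcp.storage.Bucket("my-bucket", {
    location: "US",
    labels: {
        environment: "dev",
        team: "data",
    },
});

export const bucketName = bucket.name;
```

Adding or changing a label later is the same workflow: edit the `labels` map and run `pulumi up`.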
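Finally, the Kubernetes item can be sketched with `@pulumi/kubernetes`. The config value mentioned in the bullet (here called `isMinikube`, an illustrative name) switches behavior for local minikube clusters, which typically cannot provision cloud load balancers.

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as k8s from "@pulumi/kubernetes";

// Read a per-stack config value, e.g. set via `pulumi config set isMinikube true`.
const config = new pulumi.Config();
const isMinikube = config.getBoolean("isMinikube") ?? false;

const appLabels = { app: "nginx" };

// A one-replica NGINX deployment.
const deployment = new k8s.apps.v1.Deployment("nginx", {
    spec: {
        selector: { matchLabels: appLabels },
        replicas: 1,
        template: {
            metadata: { labels: appLabels },
            spec: { containers: [{ name: "nginx", image: "nginx" }] },
        },
    },
});

// On minikube, expose the app as a ClusterIP instead of a cloud LoadBalancer.
const service = new k8s.core.v1.Service("nginx", {
    spec: {
        type: isMinikube ? "ClusterIP" : "LoadBalancer",
        selector: appLabels,
        ports: [{ port: 80 }],
    },
});

export const deploymentName = deployment.metadata.name;
```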