Posts Tagged ai

Deploy LangServe Apps with Pulumi on AWS (RAG & Chatbot)

We all know how easy it is to create, deploy, and manage any cloud infrastructure with Pulumi using your favorite programming language. With the rise of artificial intelligence (AI), more and more developers are working on LLM-powered applications and services. With this comes a growing need for the same ease of use when creating, deploying, and managing the infrastructure behind these applications.

In this blog post, we will show you how this can be achieved by combining Pulumi and LangServe.
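
As a rough illustration of the pattern (not the article's exact program), the sketch below uses Pulumi's Python SDKs to run a pre-built LangServe container image on AWS ECS Fargate behind a load balancer; the image URI, port, and container sizes are placeholder assumptions:

```python
import pulumi
import pulumi_aws as aws
import pulumi_awsx as awsx

# ECS cluster that will host the LangServe service.
cluster = aws.ecs.Cluster("langserve-cluster")

# Public application load balancer fronting the service.
lb = awsx.lb.ApplicationLoadBalancer("langserve-lb")

# Fargate service running the LangServe container. The image URI is a
# placeholder; port 8000 is the typical LangServe default.
service = awsx.ecs.FargateService(
    "langserve-service",
    cluster=cluster.arn,
    assign_public_ip=True,
    desired_count=1,
    task_definition_args=awsx.ecs.FargateServiceTaskDefinitionArgs(
        container=awsx.ecs.TaskDefinitionContainerDefinitionArgs(
            name="langserve",
            image="123456789012.dkr.ecr.us-east-1.amazonaws.com/langserve-app:latest",
            cpu=512,
            memory=1024,
            essential=True,
            port_mappings=[
                awsx.ecs.TaskDefinitionPortMappingArgs(
                    container_port=8000,
                    target_group=lb.default_target_group,
                ),
            ],
        ),
    ),
)

# Public URL of the deployed LangServe app.
pulumi.export("url", lb.load_balancer.dns_name)
```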

Read more →

Pinecone Provider Now Available for Pulumi

Hello, Pulumi Pinecone Provider! 👋

The Pinecone integration with Pulumi offers a native way to manage Pinecone indexes, including the newly announced serverless indexes. Use any of Pulumi's supported languages to create, update, and remove your Pinecone indexes with ease. This integration brings Infrastructure as Code principles to your Pinecone resources, helping you work even more efficiently. It also lets you tap into Pulumi's wide range of providers, giving you a diverse and powerful set of tools to enhance your development work.
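
As a hedged sketch of what this can look like in Python, the snippet below creates a serverless index; the resource and argument names follow the provider's announced schema, and the index name, dimension, and region are placeholder assumptions worth checking against the provider docs:

```python
import pulumi
import pulumi_pinecone as pinecone

# A serverless Pinecone index. The dimension must match the embedding model
# you plan to use; 1536 below is only an example (OpenAI ada-002 size).
index = pinecone.PineconeIndex(
    "example-index",
    name="example-index",
    dimension=1536,
    metric="cosine",
    spec={
        "serverless": {
            "cloud": "aws",
            "region": "us-west-2",
        },
    },
)

pulumi.export("indexName", index.name)
```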

Read more →

Deploy Cloud Infrastructure in 30 Seconds with Pulumi AI

Earlier this year we launched Pulumi AI, a purpose-built AI assistant that can create Infrastructure as Code (IaC) from natural language prompts using Pulumi. Since launch, we've seen incredible adoption of Pulumi AI, with over 200,000 questions asked so far and growing fast. Pulumi AI is popular with users new to Pulumi and/or new to the cloud, but it is also heavily used by many of the most advanced IaC users and organizations to quickly discover solutions to new problems they need to solve. Over the last few months, we've driven major improvements to Pulumi AI: the recently launched Pulumi AI Answers pages with thousands of AI-generated answers to common questions, improvements to code generation correctness and performance, and expansion of the languages supported by Pulumi AI.

Today, we are taking the next big step, introducing support for deploying cloud infrastructure directly from Pulumi AI. Going from idea to running cloud infrastructure is just a natural language prompt away!

Read more →

How AI is Transforming DevOps: AI Talks for DevOps Insights

The integration of artificial intelligence (AI) with DevOps signals a new era in software development. DevOps possesses unique characteristics and needs that make it exceptionally compatible with AI augmentation. Because code fundamentally relies on language, and large language models (LLMs) serve as the core of GPT functionality, these models are particularly well-suited for tasks such as code generation. This article unpacks the topics addressed during our "AI: Friends or Foe | AI Talks for DevOps" event in San Francisco.

Read more →

LangChain for DevOps: Learn LLM & GenAI for Dev, Sec & Ops

The emergence of DevOps revolutionized software development. Now, AI-powered tools like LangChain are accelerating that transformation. Unsurprisingly, our distinguished speaker at the launch of Pulumi's in-person AI Talks, Patrick Debois, who coined the term "DevOps," has recently turned his attention to LLM and GenAI Ops using the LangChain framework.

Read more →

Pulumi Insights and AI in the Pulumi CLI

Earlier this year we introduced Pulumi Insights, a collection of features that bring intelligence to cloud infrastructure using Pulumi. Two key components of that launch were Pulumi AI, a generative AI assistant purpose-built to create cloud infrastructure using natural language, and Pulumi Resource Search, multi-cloud search and analytics across every cloud resource and environment in your organization.

Today, we are excited to bring Pulumi Insights into the pulumi CLI with the new pulumi org search and pulumi ai commands. These commands put AI and resource search at your fingertips, right where Pulumi users spend most of their time: in the terminal, iterating on their cloud infrastructure.

Read more →

Deploy an AI/ML Chatbot Frontend on Vercel with Pulumi

The process of taking an idea and turning it into reality has been nothing short of extraordinary since we started innovating with artificial intelligence. With this technology, machines learn about and communicate with people, while also helping us in ways we never could have imagined only a few years ago. If you've been following along, you might recall "The Real AI Challenge is Cloud, not Code!", where we used Python and Pulumi to deploy a chatbot API (named katwalk).

Read more →

Deploy AI Models on Amazon SageMaker using Pulumi Python IaC

Running models from Hugging Face on Amazon SageMaker is a popular deployment option for AI/ML services. While the SageMaker console allows you to provision these cloud resources by hand, that deployment pattern is labor-intensive to document and prone to human error when reproduced as a regular operations practice. Infrastructure as Code (IaC) offers a reliable, easily repeatable deployment practice. By writing this IaC with Pulumi, practitioners can choose Python for their infrastructure code and seamlessly develop both AI application code and IaC in the same language.
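
As an illustrative sketch of this pattern (not the article's exact program), the Pulumi Python snippet below registers a SageMaker model backed by a Hugging Face inference container and exposes it through an endpoint; the container image URI, Hub model ID, and instance type are placeholder assumptions:

```python
import json
import pulumi
import pulumi_aws as aws

# Execution role that SageMaker assumes to run the model container.
role = aws.iam.Role(
    "sagemaker-role",
    assume_role_policy=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "sagemaker.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }],
    }),
)
aws.iam.RolePolicyAttachment(
    "sagemaker-full-access",
    role=role.name,
    policy_arn="arn:aws:iam::aws:policy/AmazonSageMakerFullAccess",
)

# SageMaker model pointing at a Hugging Face inference container (placeholder
# image URI) with the Hub model ID passed in via environment variables.
model = aws.sagemaker.Model(
    "hf-model",
    execution_role_arn=role.arn,
    primary_container=aws.sagemaker.ModelPrimaryContainerArgs(
        image="<hugging-face-llm-dlc-image-uri>",  # placeholder
        environment={
            "HF_MODEL_ID": "tiiuae/falcon-7b-instruct",  # example model
            "HF_TASK": "text-generation",
        },
    ),
)

# Endpoint configuration and endpoint that actually serve the model.
endpoint_config = aws.sagemaker.EndpointConfiguration(
    "hf-endpoint-config",
    production_variants=[aws.sagemaker.EndpointConfigurationProductionVariantArgs(
        variant_name="AllTraffic",
        model_name=model.name,
        instance_type="ml.g5.2xlarge",
        initial_instance_count=1,
    )],
)
endpoint = aws.sagemaker.Endpoint(
    "hf-endpoint",
    endpoint_config_name=endpoint_config.name,
)

pulumi.export("endpointName", endpoint.name)
```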

Read more →

The Real AI Challenge is Cloud, not Code!

The AI industry is stealing the show as tech's gold rush of the 2020s. Looking at ChatGPT's record-setting user growth and its rapid adoption by top brands through third-party integrations, it is not surprising that the hype suggests this is the beginning of a major digital transformation.

However, using AI/ML in your own products comes with some major challenges and obstacles. Below is a diagram of the end-to-end workflow of building and using an AI model: preparing the data, training a model, fine-tuning a model, hosting and running a model, building a backend service to serve the model, and building the user interface that interacts with the model. Most AI engineers are only involved in a few steps of the process. However, one challenge is common across the entire workflow: creating and managing the cloud infrastructure is hard.

Read more →

Resource Search - AI Assist is Generally Available

Pulumi Cloud Resource Search AI Assist functionality is now generally available to all organizations! In addition, we have shipped some improvements that make the feature easier to use and more discoverable: a toggle on the search bar, suggested queries, and an "I'm Feeling Lucky" button that generates a random query for you.

Read more →