Posts Tagged pinecone

Low-Code LLM Apps with LocalAI, Flowise, and Pulumi on AWS

In a previous blog post, we discussed how easy it is to build a 🦜️🔗 LangChain LLM application and use 🦜️🏓 LangServe and Pulumi to deploy it on an AWS Fargate cluster. We even went a step further and deployed a Pinecone index, all in a few lines of Pulumi code, to provide a vector store for the LLM application. This time, let me walk you down a different path for creating an LLM application.

Read more →

Pinecone Provider Now Available for Pulumi

Hello, Pulumi Pinecone Provider! 👋 The Pinecone integration with Pulumi offers a native way to manage Pinecone indexes, including the newly announced serverless indexes. Use any of Pulumi's supported languages to create, update, and remove your Pinecone indexes. This integration brings Infrastructure as Code principles to your Pinecone resources and lets you tap into Pulumi's wide range of providers, giving you a diverse and powerful set of tools for your development work.
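
To illustrate what the provider enables, here is a minimal sketch of defining a serverless Pinecone index in TypeScript. The package name, the PineconeIndex resource, and its property shapes follow the provider's published examples, but treat them as assumptions and confirm against the provider documentation; the dimension, metric, cloud, and region values are illustrative placeholders.

```typescript
import * as pinecone from "@pinecone-database/pulumi";

// Sketch: a serverless Pinecone index managed as Pulumi infrastructure.
// Resource and property names are assumed from the provider's examples;
// dimension, metric, cloud, and region are placeholder values.
const exampleIndex = new pinecone.PineconeIndex("example-index", {
    name: "example-index",
    dimension: 1536,                     // e.g. OpenAI embedding dimension
    metric: pinecone.IndexMetric.Cosine, // similarity metric used for queries
    spec: {
        serverless: {
            cloud: pinecone.ServerlessSpecCloud.Aws,
            region: "us-west-2",
        },
    },
});

// Export the index host so applications can connect to it.
export const indexHost = exampleIndex.host;
```

Because the index is an ordinary Pulumi resource, it can be composed with resources from other providers, for example wiring the exported host into an application deployed on AWS Fargate, in the same program.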

Read more →