Dynamic Secrets for AI Pipeline Jobs with Vault
When working with AI pipelines, it is common to need access to sensitive data and credentials, whether for data stores, APIs, or other cloud services. Storing these secrets securely while keeping them accessible to your AI jobs is a challenge. This is where HashiCorp Vault comes into play: Vault is an identity-based secrets and encryption management system.
A common approach is to use Vault's dynamic secrets capabilities, which ensure that each job or task instance gets a unique set of credentials, generated on the fly with a limited lifetime. This reduces the risk of credential leakage and makes it possible to track and audit all access to the secrets.
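To make the idea concrete, here is what consuming a dynamic secret looks like from a job's point of view. This is a minimal, illustrative sketch using the `hvac` Python client; it assumes a database secrets engine is already mounted at Vault's default `database` path with a role named `ai-pipeline-db` (the setup we build below uses a different engine, but the lease mechanics are the same).

```python
import hvac

# Illustrative only: assumes a database secrets engine is mounted at "database"
# with a Vault role named "ai-pipeline-db" already configured.
client = hvac.Client(url="https://vault.example.com", token="s.jobScopedToken")

# Each call mints a fresh username/password pair with its own lease.
resp = client.secrets.database.generate_credentials(name="ai-pipeline-db")

print(resp["data"]["username"])   # unique per request
print(resp["lease_duration"])     # seconds until Vault revokes the credentials
print(resp["lease_id"])           # handle for renewal, revocation, and auditing
```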
To help you set this up using Pulumi with Python, I'm going to walk you through creating dynamic secrets access for your AI pipeline jobs. We will configure a secrets engine on an existing Vault server, define an access policy, and create a role that your AI jobs will assume to obtain short-lived credentials for these secrets.
In this example, we will assume you have a Vault server already deployed and an administrative token available to configure Vault. We are not deploying Vault from scratch here, but rather setting up the dynamic secrets component using Pulumi.
Note that you need to have the Pulumi CLI installed and configured with the appropriate credentials. Additionally, you should have the Pulumi Vault provider (the pulumi_vault package) installed in your working environment.
Let's start by writing the Pulumi program:
```python
import pulumi
import pulumi_vault as vault

# Configure the Vault provider with your server's address and an administrative token.
vault_provider = vault.Provider("vault",
    address="https://vault.example.com",
    token="s.myVaultAdminToken",
)

# Enable a KV (Key-Value) version 2 secrets engine, if not already enabled.
kv_engine = vault.Mount("kv-engine",
    path="kv",
    type="kv-v2",
    opts=pulumi.ResourceOptions(provider=vault_provider),
)

# Define the policy that grants AI jobs read access to their secrets.
# Adjust the path and capabilities to your security requirements.
ai_job_policy = vault.Policy("ai-job-access-policy",
    name="ai-job-access-policy",
    policy="""
path "kv/data/ai-pipeline/*" {
  capabilities = ["read"]
}
""",
    opts=pulumi.ResourceOptions(provider=vault_provider),
)

# Enable the AppRole auth method and create the role that AI pipeline jobs assume.
# Tokens issued through this role carry the policy above and expire quickly, so
# each job run gets its own short-lived credentials.
approle = vault.AuthBackend("approle",
    type="approle",
    opts=pulumi.ResourceOptions(provider=vault_provider),
)

role = vault.approle.AuthBackendRole("ai-pipeline-role",
    backend=approle.path,
    role_name="ai-pipeline-job",
    token_policies=[ai_job_policy.name],
    token_ttl=900,  # 15-minute tokens
    opts=pulumi.ResourceOptions(provider=vault_provider),
)

# Export the values the AI jobs need to request their secrets.
pulumi.export("dynamic_secrets_path", kv_engine.path)
pulumi.export("approle_role_id", role.role_id)
```
Before this code can be executed using Pulumi CLI commands like `pulumi up`, let's go through what each step does:

- We begin by creating a Vault provider to interact with your Vault instance. Replace the `address` and `token` with your actual Vault server address and a valid administrative token; see the configuration sketch after this list for a way to keep the token out of source code.
- The `vault.Mount` resource enables a Key-Value (KV v2) secrets engine on the Vault server if it isn't already enabled. This engine holds the secrets your AI pipeline needs; the dynamic part comes from the short-lived, per-job tokens used to read them.
- The `vault.Policy` resource defines `ai-job-access-policy` using HashiCorp Configuration Language (HCL). Adjust its paths and capabilities to align with your specific security needs.
- The `vault.approle.AuthBackendRole` resource creates the role your AI jobs assume via Vault's AppRole auth method. Each job run authenticates against this role and receives its own token, restricted to the access policy and expiring after the configured TTL.
- Finally, we export the path of the KV secrets engine and the AppRole `role_id`, which your AI jobs use to request their credentials from Vault.
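As noted in the first bullet, a common pattern is to keep the admin token out of source code and read it from Pulumi's encrypted stack configuration instead. A minimal sketch, assuming you have run `pulumi config set --secret vaultToken <token>` (the key name `vaultToken` is just an example):

```python
import pulumi
import pulumi_vault as vault

# Read the admin token from Pulumi's encrypted stack configuration instead of
# hardcoding it. "vaultToken" is an example config key set with `--secret`.
config = pulumi.Config()
vault_token = config.require_secret("vaultToken")

vault_provider = vault.Provider("vault",
    address="https://vault.example.com",
    token=vault_token,  # kept encrypted in Pulumi config and state
)
```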
Save this file and run `pulumi up` to apply your configuration. Make sure Vault is reachable from your machine and the Pulumi CLI is properly configured.

This sets up the backend for dynamic secrets with Vault for your AI pipeline jobs. The actual retrieval of these secrets is done by your AI jobs at execution time, using Vault's APIs or client libraries, as sketched below.