1. Storing AI Model Inputs in Azure Cosmos DB Containers

    To create storage for AI model inputs on Azure Cosmos DB, you will need to create a Cosmos DB account, a database within that account, and a container within that database. The Cosmos DB account acts as the management unit for your databases, the database serves as a namespace to manage multiple containers, and the containers are where the actual items for your AI models will be stored.

    Below is a Pulumi program in Python that will create these resources:

    1. Cosmos DB Account: The top-level resource, which provides the endpoint used to interact with the Cosmos DB service.
    2. SQL Database: A SQL API database created under the Cosmos DB account; it can contain multiple containers.
    3. SQL Container: A container defined within the SQL database that actually stores the data.

    Each SQL container is provisioned with a certain amount of throughput, measured in Request Units per second (RU/s). Consider how much throughput to provision based on the expected workload.
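
    If the expected workload is spiky or hard to estimate up front, autoscale throughput is an alternative to a fixed value: you set a maximum RU/s and the service scales between 10% and 100% of it. As a minimal sketch (the 4,000 RU/s ceiling is purely illustrative), the container's options argument in the program below could be swapped for:

    options=azure_native.documentdb.CreateUpdateOptionsArgs(
        # Autoscale is mutually exclusive with a fixed `throughput` value;
        # this scales the container between 400 and 4000 RU/s.
        autoscale_settings=azure_native.documentdb.AutoscaleSettingsArgs(
            max_throughput=4000,  # illustrative ceiling
        ),
    )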

    Here's what the Pulumi code looks like for setting up AI model input storage on Azure Cosmos DB Containers:

    import pulumi
    import pulumi_azure_native as azure_native

    # Create a new resource group to contain the Cosmos DB account
    resource_group = azure_native.resources.ResourceGroup('ai_model_resources')

    # Create a new Cosmos DB account with a SQL API type
    account = azure_native.documentdb.DatabaseAccount('cosmosdb_account',
        resource_group_name=resource_group.name,
        kind="GlobalDocumentDB",
        locations=[azure_native.documentdb.LocationArgs(
            location_name="East US",
            failover_priority=0,
            is_zone_redundant=False,
        )],
        database_account_offer_type="Standard",
        enable_multiple_write_locations=True,
    )

    # Create a new SQL API database within the Cosmos DB account
    database = azure_native.documentdb.SqlResourceSqlDatabase('sql_database',
        resource_group_name=resource_group.name,
        account_name=account.name,
        resource=azure_native.documentdb.SqlDatabaseResourceArgs(
            id="aiModelInputsDatabase",
        ),
        options=azure_native.documentdb.CreateUpdateOptionsArgs(
            throughput=400,  # Database-level throughput in RU/s
        ),
    )

    # Create a new SQL container within the SQL API database
    container = azure_native.documentdb.SqlResourceSqlContainer('sql_container',
        resource_group_name=resource_group.name,
        account_name=account.name,
        database_name=database.name,
        resource=azure_native.documentdb.SqlContainerResourceArgs(
            id="ModelInputs",
            partition_key=azure_native.documentdb.ContainerPartitionKeyArgs(
                paths=["/ModelId"],
                kind="Hash",
            ),
        ),
        options=azure_native.documentdb.CreateUpdateOptionsArgs(
            throughput=1000,  # Higher throughput; adjust based on actual usage
        ),
    )

    # Look up the account keys, then export the endpoint and primary master key
    keys = azure_native.documentdb.list_database_account_keys_output(
        resource_group_name=resource_group.name,
        account_name=account.name,
    )
    pulumi.export('endpoint', account.document_endpoint)
    pulumi.export('primary_master_key', pulumi.Output.secret(keys.primary_master_key))

    This program sets up the infrastructure for storing AI model inputs. Once the deployment completes, the endpoint of the Cosmos DB service and its primary master key are exported (the key as a secret). These outputs can be used to configure your AI application to connect to the Cosmos DB container and store or retrieve data.
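
    To illustrate how an application might consume these outputs, here is a minimal sketch using the azure-cosmos Python SDK to write one input item. The endpoint and key placeholders stand in for the exported stack outputs, and the item shape is purely illustrative:

    from azure.cosmos import CosmosClient

    # Values exported by the Pulumi program (e.g. read from stack outputs)
    endpoint = "<cosmos-db-endpoint>"
    key = "<primary-master-key>"

    client = CosmosClient(endpoint, credential=key)
    database_client = client.get_database_client("aiModelInputsDatabase")
    container_client = database_client.get_container_client("ModelInputs")

    # Write a single model-input document; "ModelId" matches the partition key path
    container_client.upsert_item({
        "id": "input-0001",
        "ModelId": "sentiment-model-v1",
        "payload": {"text": "example model input"},
    })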

    Detailed Explanation:

    • azure_native.resources.ResourceGroup: Represents an Azure Resource Group, a logical container that holds related resources for an Azure solution.
    • azure_native.documentdb.DatabaseAccount: Creates an Azure Cosmos DB account, which is required before you can create any databases or containers.
    • azure_native.documentdb.SqlResourceSqlDatabase: Creates a SQL API database within the Cosmos DB account that serves as a namespace for your containers.
    • azure_native.documentdb.SqlResourceSqlContainer: Creates an individual container within the database that actually stores the items (see the indexing-policy sketch after this list).
    • pulumi.export: Outputs generated values from the Pulumi program, such as the endpoint and primary master key of the Cosmos DB account, for use in other systems or for reference.
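
    Containers also accept an indexing policy through SqlContainerResourceArgs. As a hedged sketch, assuming queries filter only on /ModelId (the path choices are illustrative, not a recommendation), the container's resource argument could be extended like this:

    resource=azure_native.documentdb.SqlContainerResourceArgs(
        id="ModelInputs",
        partition_key=azure_native.documentdb.ContainerPartitionKeyArgs(
            paths=["/ModelId"],
            kind="Hash",
        ),
        indexing_policy=azure_native.documentdb.IndexingPolicyArgs(
            indexing_mode="consistent",
            # Index only /ModelId and exclude everything else to reduce write RU costs
            included_paths=[azure_native.documentdb.IncludedPathArgs(path="/ModelId/?")],
            excluded_paths=[azure_native.documentdb.ExcludedPathArgs(path="/*")],
        ),
    ),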

    Note on Throughput:

    Throughput settings, represented by the throughput parameter, are important for both performance and cost. Set the throughput according to your application's requirements; it is measured in Request Units per second (RU/s). You can start with the values shown above and adjust as necessary based on observed performance and cost.
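
    Containers can also share the throughput provisioned at the database level instead of reserving their own. A minimal sketch, assuming the same database as above: a second, hypothetical container created without an options argument draws from the database's shared 400 RU/s pool:

    shared_container = azure_native.documentdb.SqlResourceSqlContainer('shared_container',
        resource_group_name=resource_group.name,
        account_name=account.name,
        database_name=database.name,
        resource=azure_native.documentdb.SqlContainerResourceArgs(
            id="SharedModelInputs",  # hypothetical container name
            partition_key=azure_native.documentdb.ContainerPartitionKeyArgs(
                paths=["/ModelId"],
                kind="Hash",
            ),
        ),
        # No `options` block: the container shares the database's 400 RU/s
    )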