1. OCI Container Registry for ML Model Deployment

    To deploy an ML model using Oracle Cloud Infrastructure (OCI) Container Registry, you'll need to create a container repository where your Docker images can be stored. These images typically contain your trained machine learning model wrapped within an application that exposes the model over HTTP for inference requests.
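    For context, here is a minimal sketch of the kind of application such an image might contain: a small Flask app that loads a serialized model and serves predictions over HTTP. The file name, the model.pkl artifact, and the /predict route are illustrative; substitute your own framework and model-loading code.

    # inference_app.py - minimal HTTP wrapper around a trained model (illustrative sketch).
    import pickle

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    # Assumes a scikit-learn-style model serialized to model.pkl inside the image.
    with open("model.pkl", "rb") as f:
        model = pickle.load(f)

    @app.route("/predict", methods=["POST"])
    def predict():
        # Expects a JSON body such as {"features": [[1.0, 2.0, 3.0]]}.
        payload = request.get_json(force=True)
        predictions = model.predict(payload["features"])
        return jsonify({"predictions": predictions.tolist()})

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)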

    OCI's Container Registry is a managed Docker container registry service for storing and sharing container images. It's integrated with OCI Identity and Access Management (IAM) for securing access to your repositories.

    Here is a Pulumi program in Python that sets up an OCI Container Registry repository where you can push Docker images containing your ML models. Before running this code, ensure your OCI credentials are set up in your Pulumi environment, and set the compartment OCID and region key in your stack configuration as shown in the comments below.

    import pulumi
    import pulumi_oci as oci

    config = pulumi.Config()
    # The OCID of the compartment where the repository will be created.
    # Set it with: pulumi config set compartmentOcid <your-compartment-ocid>  (example key name)
    compartment_ocid = config.require("compartmentOcid")
    # The short region key used in the registry hostname, e.g. "iad" or "phx".
    # Set it with: pulumi config set regionKey <your-region-key>  (example key name)
    region_key = config.require("regionKey")

    # Create an OCI Container Repository for ML model deployment.
    ml_repo = oci.artifacts.ContainerRepository(
        "mlModelRepo",
        # The compartment where the repository will be created.
        compartment_id=compartment_ocid,
        # User-friendly display name; it also becomes the repository path segment.
        display_name="ml-model-repo",
        # Whether the repository is public or private.
        # If it is public, no authentication is required to pull images from it.
        is_public=True,
        # Readme describing the contents or usage of the repository.
        readme=oci.artifacts.ContainerRepositoryReadmeArgs(
            content="ML Model Repository for storing Docker images",
            format="TEXT_PLAIN",
        ),
        # Example of setting free-form tags on the resource.
        freeform_tags={
            "owner": "ml-team",
            "usage": "inference",
        },
    )

    # The full image path in OCI Container Registry follows the pattern
    # <region-key>.ocir.io/<tenancy-namespace>/<repository-name>.
    repository_url = pulumi.Output.concat(
        region_key, ".ocir.io/", ml_repo.namespace, "/", ml_repo.display_name
    )

    # Export the repository URL as an output of the Pulumi program.
    pulumi.export("repository_url", repository_url)

    Let's explore what the code is doing:

    • We import the pulumi and pulumi_oci modules, which contain the classes and functions needed to interact with Pulumi and OCI, respectively.
    • We create a container repository using oci.artifacts.ContainerRepository which creates a repository in the given compartment.
    • compartment_id specifies the OCI compartment under which the repository is created; this is where you organize your cloud resources. In this program it is read from the stack configuration.
    • display_name is the human-readable name for your repository.
    • is_public determines the visibility of your repository. Public repositories can be accessed without authentication.
    • You can optionally add a readme to help describe the contents or usage of the repository.
    • freeform_tags allows you to attach key-value metadata to the repository. This can be helpful for organizing and managing access at a later stage.

    After the deployment of this Pulumi program, the repository URL will be exported. You can then use a tool like Docker to tag your local image with this URL and push the image to OCI Container Registry, making it available for deployment on OCI services such as OCI Functions or Container Engine for Kubernetes.

    To use this setup, you'll need to complete the following steps (a sketch that automates steps 1-4 from within Pulumi follows the list):

    1. Build a Docker image containing your ML model and application.
    2. Tag the image appropriately using the repository URL from the output.
    3. Authenticate the Docker CLI with OCI's Container Registry.
    4. Push the image to the OCI Container Registry (using the docker push command).
    5. [Optional] Deploy the image to an OCI service that can run containers, set up any required networking and authentication, and expose it for inference requests.
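    If you would rather keep the image build and push inside the Pulumi program itself, the pulumi_docker package can automate steps 1-4. Below is a minimal sketch; it assumes the Dockerfile lives in ./app, that repository_url is the output constructed in the program above, and that your OCIR username and auth token are stored as stack config values named ocirUser and ocirToken (all of these names are illustrative).

    import pulumi
    import pulumi_docker as docker

    config = pulumi.Config()
    # OCIR credentials; the config key names here are examples.
    ocir_user = config.require("ocirUser")           # e.g. "<tenancy-namespace>/<username>"
    ocir_token = config.require_secret("ocirToken")  # an OCI auth token, stored as a secret

    # repository_url is the value built in the program above,
    # e.g. "iad.ocir.io/<tenancy-namespace>/ml-model-repo".
    image = docker.Image(
        "mlModelImage",
        image_name=pulumi.Output.concat(repository_url, ":v1"),
        build=docker.DockerBuildArgs(context="./app"),  # directory containing the Dockerfile
        registry=docker.RegistryArgs(
            server="iad.ocir.io",  # region-specific registry endpoint; match your region key
            username=ocir_user,
            password=ocir_token,
        ),
    )

    # Export the fully qualified image name that was pushed.
    pulumi.export("image_name", image.image_name)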

    Make sure you have the OCI CLI installed and configured with the permissions required to push images to the registry. Also, remember that managing access and users is critical for production services, so ensure your IAM policies are set up correctly.