1. Pub/Sub for Asynchronous Model Inference Triggering

    To set up Pub/Sub for asynchronous model inference triggering, you'll need to create a Pub/Sub topic to which applications can send messages containing the data to be processed. These messages will then be delivered to a subscription connected to a service (like a cloud function or a containerized application) that will perform the model inference.
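
    For example, once the topic exists, an upstream application can publish an inference request with the google-cloud-pubsub client library. This is a minimal sketch and not part of the Pulumi program below: the project ID, payload shape, and topic name are placeholders (Pulumi auto-names resources, so use the topic name exported by the program further down).

    import json

    from google.cloud import pubsub_v1

    # Placeholders: substitute your project ID and the exported topic name.
    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("my-gcp-project", "model-inference-topic")

    # The message body carries whatever the inference service expects,
    # e.g. a reference to an input object in Cloud Storage.
    payload = {"input_uri": "gs://my-bucket/images/example.jpg"}
    future = publisher.publish(topic_path, data=json.dumps(payload).encode("utf-8"))
    print("Published message ID:", future.result())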

    In Google Cloud, Pub/Sub facilitates this by letting you create 'topics' to which messages are sent and 'subscriptions' that receive those messages. Here's how you can set up a simple Pub/Sub infrastructure using Pulumi and the pulumi_gcp provider package for Google Cloud resources.

    This program does the following:

    1. Creates a Pub/Sub topic named model-inference-topic.
    2. Creates a Pub/Sub subscription named model-inference-subscription that is associated with the topic.
    3. Grants a service account permission to publish messages to the topic. This service account is a placeholder in this example and should be replaced with the identity of the application that actually publishes inference requests.

    import pulumi
    import pulumi_gcp as gcp

    # Create a Pub/Sub topic that will be used to send messages for processing.
    model_inference_topic = gcp.pubsub.Topic("model-inference-topic")

    # Create a subscription to the newly created topic where the messages will be received.
    model_inference_subscription = gcp.pubsub.Subscription(
        "model-inference-subscription",
        topic=model_inference_topic.name,
        ack_deadline_seconds=20  # Time to acknowledge receipt of a message. Adjust as needed.
    )

    # An example service account that will publish messages to the topic.
    # Replace this placeholder with the service account your publishing application actually uses.
    pubsub_publisher = gcp.serviceaccount.Account(
        "pubsub-publisher",
        account_id="pubsub-publisher",
        display_name="PubSub Publisher Service Account"
    )

    # IAM binding that allows the service account to publish to the topic.
    pubsub_publisher_iam = gcp.pubsub.TopicIAMMember(
        "pubsub-publisher-iam",
        topic=model_inference_topic.name,
        role="roles/pubsub.publisher",
        member=pulumi.Output.concat("serviceAccount:", pubsub_publisher.email)
    )

    # Export the resource names so that they can be easily accessed if needed.
    pulumi.export("topic_name", model_inference_topic.name)
    pulumi.export("subscription_name", model_inference_subscription.name)
    pulumi.export("publisher_service_account_email", pubsub_publisher.email)

    Explanation:

    • Pub/Sub Topic: A topic is a resource to which messages are sent. Any application or service with the correct permissions can publish to the topic.
    • Pub/Sub Subscription: A subscription is attached to a topic and receives messages published to that topic. In this case, the service that performs the model inference would pull messages from this subscription.
    • Service Account: Represents an identity that will have permission to interact with the topic. Typically, this will be the account under which your model inference service runs.
    • IAM Permissions: Manage access to the topic, granting the defined service account the ability to publish messages. The inference workers need a matching subscriber role on the subscription, sketched just after this list.
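
    The Pulumi program above only grants publish rights. If the inference workers run under their own service account, that account also needs pull rights on the subscription. A minimal sketch of that binding, assuming a hypothetical worker service account email, could be appended to the same program:

    # Continuing in the same __main__.py (pulumi_gcp is already imported as gcp).
    # Hypothetical service account used by the inference workers; replace with your own.
    inference_worker_email = "inference-worker@my-gcp-project.iam.gserviceaccount.com"

    # Allow that account to pull and acknowledge messages from the subscription.
    subscriber_iam = gcp.pubsub.SubscriptionIAMMember(
        "model-inference-subscriber-iam",
        subscription=model_inference_subscription.name,
        role="roles/pubsub.subscriber",
        member=f"serviceAccount:{inference_worker_email}",
    )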

    After setting up this infrastructure, your model inference service can listen on the subscription for new messages to process asynchronously.
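
    As a sketch of that consumer side, assuming the worker uses the google-cloud-pubsub client library (the project ID, subscription name, and run_inference function below are placeholders; use the subscription name exported by the Pulumi program):

    import json

    from google.cloud import pubsub_v1

    def run_inference(payload: dict) -> None:
        # Placeholder: call your model here, e.g. load the input referenced by payload["input_uri"].
        print("Running inference for", payload)

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path("my-gcp-project", "model-inference-subscription")

    def callback(message: pubsub_v1.subscriber.message.Message) -> None:
        payload = json.loads(message.data.decode("utf-8"))
        run_inference(payload)
        message.ack()  # Acknowledge within ack_deadline_seconds, or the message is redelivered.

    # Open a streaming pull and process messages until interrupted.
    streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
    with subscriber:
        streaming_pull_future.result()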

    Make sure to replace the placeholder service account with the one your application actually uses, and adjust ack_deadline_seconds so your inference code has enough time to acknowledge each message before Pub/Sub redelivers it.

    To execute this program with Pulumi, save the code in a file named __main__.py, and then run pulumi up from the command line in the same directory to create the resources. Note that the Pulumi CLI must be installed and configured with Google Cloud credentials before running the command.