1. Streaming Real-time Mixed Reality Content with Azure Media Services


    Streaming real-time mixed reality content requires a setup that can capture, encode, and distribute streams to various platforms and devices. Azure Media Services provides live and on-demand streaming, encoding, and indexing services that are highly scalable and customizable.

    Below you'll find a Pulumi program written in Python, designed to set up the infrastructure for streaming real-time mixed reality content with Azure Media Services. The program will create a live event for ingesting live streams, a live output for processing the stream, a streaming endpoint to distribute the content, and an asset to store and stream the content.


    1. Live Event: This is the entry point for a live stream on Azure Media Services. The live event will receive live input streams, which it can then make available for processing and streaming.

    2. Live Output: This is connected to the live event. It defines the settings for how the incoming live stream is recorded and made available for streaming.

    3. Streaming Endpoint: This is the origin server from which content gets delivered to users. When enabled and properly scaled, it can deliver content to a wide audience.

    4. Asset: This is where the recorded content is stored. The content can be streamed live during the event or made available afterward as video on demand.

    Pulumi Program

    import pulumi
    import pulumi_azure_native as azure_native

    # Configure the location for our resources
    location = "East US"

    # Create a Resource Group
    resource_group = azure_native.resources.ResourceGroup(
        "resourceGroup",
        resource_group_name="mediaResourceGroup",
        location=location)

    # Create a Media Services account
    # (account names must be lowercase letters and numbers only)
    media_service = azure_native.media.MediaService(
        "mediaService",
        resource_group_name=resource_group.name,
        location=location,
        account_name="mediaserviceaccount")

    # Create an Asset to hold the recorded content.
    # It is created before the Live Output, which references it.
    asset = azure_native.media.Asset(
        "asset",
        resource_group_name=resource_group.name,
        account_name=media_service.name,
        asset_name="myAsset",
        description="Asset holding mixed reality content")

    # Create a Live Event that ingests the RTMP stream
    live_event = azure_native.media.LiveEvent(
        "liveEvent",
        resource_group_name=resource_group.name,
        account_name=media_service.name,
        live_event_name="liveEventName",
        location=location,
        input=azure_native.media.LiveEventInputArgs(
            streaming_protocol="RTMP"),
        description="Live event for mixed reality streaming")

    # Create a Live Output that records the stream into the asset
    live_output = azure_native.media.LiveOutput(
        "liveOutput",
        resource_group_name=resource_group.name,
        account_name=media_service.name,
        live_event_name=live_event.name,
        live_output_name="liveOutputName",
        archive_window_length="PT1H",  # retain 1 hour for playback
        asset_name=asset.name)

    # Create a Streaming Endpoint to deliver content to viewers
    streaming_endpoint = azure_native.media.StreamingEndpoint(
        "streamingEndpoint",
        resource_group_name=resource_group.name,
        account_name=media_service.name,
        location=location,
        streaming_endpoint_name="endpointName",
        scale_units=1)

    # Outputs
    pulumi.export("liveEventIngestUrl", live_event.input.endpoints.apply(
        lambda eps: eps[0].url if eps else "No endpoint"))
    pulumi.export("streamingEndpointHostName", streaming_endpoint.host_name)

    Each of these resources has been declared with its corresponding Python class from the Azure Native provider in Pulumi.

    • The Live Event is created with the RTMP protocol, ideal for live streaming.
    • The Live Output records the stream and makes it available with an archive window length of 1 hour (PT1H), indicating how long the content will be retained for playback.
    • The Streaming Endpoint is set up with a single scale unit initially, but this can be scaled according to the expected audience size.
    • And finally, the Asset holds the recorded mixed reality content, which can be streamed live or played back on demand after the event. The Live Output references the Asset by name, so the Asset must exist before the Live Output is created.
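    To actually play back the content, a streaming locator must be published for the asset, and the playback URL is then assembled from the streaming endpoint's host name, the locator ID, and a manifest name. The helper below sketches how such a URL is composed; the host name, locator ID, and manifest name in the example are placeholder values, not outputs of the program above.

    ```python
    from typing import Optional

    def playback_url(host_name: str, locator_id: str, manifest: str,
                     fmt: Optional[str] = None) -> str:
        """Build an Azure Media Services streaming URL.

        fmt: None for Smooth Streaming, "m3u8-aapl" for HLS,
        "mpd-time-csf" for DASH.
        """
        url = f"https://{host_name}/{locator_id}/{manifest}.ism/manifest"
        if fmt:
            url += f"(format={fmt})"
        return url

    # Example with placeholder values for host name and locator ID:
    hls_url = playback_url(
        "endpointname-mediaserviceaccount.streaming.media.azure.net",
        "0000-1111-2222", "stream", "m3u8-aapl")
    ```

    In a Pulumi program, the host name would come from `streaming_endpoint.host_name`, and the locator could be declared with an `azure_native.media.StreamingLocator` resource pointing at the asset.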

    After you've set up this infrastructure, you can use Azure Media Services APIs to control the stream, such as starting and stopping the live event.
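    Starting and stopping can also be scripted with the azure-mgmt-media SDK. The sketch below assumes that SDK (and azure-identity) is installed and reuses the resource names from the program above; the subscription ID is a placeholder you must supply.

    ```python
    # Minimal sketch of controlling a live event via the azure-mgmt-media SDK.
    # The client is passed in so the helpers stay decoupled from credentials.

    def start_live_event(client, resource_group: str, account: str, event: str):
        """Start a live event and wait for the long-running operation."""
        return client.live_events.begin_start(resource_group, account, event).result()

    def stop_live_event(client, resource_group: str, account: str, event: str):
        """Stop a live event, keeping its live outputs, and wait for completion."""
        return client.live_events.begin_stop(
            resource_group, account, event,
            {"remove_outputs_on_stop": False}).result()

    def make_client(subscription_id: str):
        # Imported here so the helpers above are usable without the Azure SDKs.
        from azure.identity import DefaultAzureCredential
        from azure.mgmt.media import AzureMediaServices
        return AzureMediaServices(DefaultAzureCredential(), subscription_id)

    # Usage (requires Azure credentials; "<subscription-id>" is a placeholder):
    # client = make_client("<subscription-id>")
    # start_live_event(client, "mediaResourceGroup", "mediaserviceaccount",
    #                  "liveEventName")
    ```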

    Running the Program

    Ensure you have Pulumi installed and configured with Azure credentials. Then, you can run this code in a Pulumi project by placing it in a __main__.py file and running pulumi up from the command line in the same directory. This will provision the infrastructure in your Azure account.

    Remember, live streaming involves additional considerations, such as encoding settings and stream redundancy, that you may need to fine-tune based on your requirements.