
AWS Native is in preview. AWS Classic is fully supported.

AWS Native v0.102.0 published on Tuesday, Apr 16, 2024 by Pulumi

aws-native.sagemaker.InferenceExperiment


    Resource Type definition for AWS::SageMaker::InferenceExperiment

    Create InferenceExperiment Resource

    Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.

    Constructor syntax

    new InferenceExperiment(name: string, args: InferenceExperimentArgs, opts?: CustomResourceOptions);
    @overload
    def InferenceExperiment(resource_name: str,
                            args: InferenceExperimentArgs,
                            opts: Optional[ResourceOptions] = None)
    
    @overload
    def InferenceExperiment(resource_name: str,
                            opts: Optional[ResourceOptions] = None,
                            model_variants: Optional[Sequence[InferenceExperimentModelVariantConfigArgs]] = None,
                            type: Optional[InferenceExperimentType] = None,
                            role_arn: Optional[str] = None,
                            endpoint_name: Optional[str] = None,
                            name: Optional[str] = None,
                            kms_key: Optional[str] = None,
                            data_storage_config: Optional[InferenceExperimentDataStorageConfigArgs] = None,
                            desired_state: Optional[InferenceExperimentDesiredState] = None,
                            schedule: Optional[InferenceExperimentScheduleArgs] = None,
                            shadow_mode_config: Optional[InferenceExperimentShadowModeConfigArgs] = None,
                            status_reason: Optional[str] = None,
                            tags: Optional[Sequence[_root_inputs.TagArgs]] = None,
                            description: Optional[str] = None)
    func NewInferenceExperiment(ctx *Context, name string, args InferenceExperimentArgs, opts ...ResourceOption) (*InferenceExperiment, error)
    public InferenceExperiment(string name, InferenceExperimentArgs args, CustomResourceOptions? opts = null)
    public InferenceExperiment(String name, InferenceExperimentArgs args)
    public InferenceExperiment(String name, InferenceExperimentArgs args, CustomResourceOptions options)
    
    type: aws-native:sagemaker:InferenceExperiment
    properties: # The arguments to resource properties.
    options: # Bag of options to control resource's behavior.
    
    

    Parameters

    TypeScript
    name string
    The unique name of the resource.
    args InferenceExperimentArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    Python
    resource_name str
    The unique name of the resource.
    args InferenceExperimentArgs
    The arguments to resource properties.
    opts ResourceOptions
    Bag of options to control resource's behavior.
    Go
    ctx Context
    Context object for the current deployment.
    name string
    The unique name of the resource.
    args InferenceExperimentArgs
    The arguments to resource properties.
    opts ResourceOption
    Bag of options to control resource's behavior.
    C#
    name string
    The unique name of the resource.
    args InferenceExperimentArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    Java
    name String
    The unique name of the resource.
    args InferenceExperimentArgs
    The arguments to resource properties.
    options CustomResourceOptions
    Bag of options to control resource's behavior.

    Example

    The following reference example uses placeholder values for all input properties. A TypeScript example is shown below; examples for the other supported languages are coming soon.

    const inferenceExperimentResource = new aws_native.sagemaker.InferenceExperiment("inferenceExperimentResource", {
        modelVariants: [{
            infrastructureConfig: {
                infrastructureType: aws_native.sagemaker.InferenceExperimentModelInfrastructureConfigInfrastructureType.RealTimeInference,
                realTimeInferenceConfig: {
                    instanceCount: 0,
                    instanceType: "string",
                },
            },
            modelName: "string",
            variantName: "string",
        }],
        type: aws_native.sagemaker.InferenceExperimentType.ShadowMode,
        roleArn: "string",
        endpointName: "string",
        name: "string",
        kmsKey: "string",
        dataStorageConfig: {
            destination: "string",
            contentType: {
                csvContentTypes: ["string"],
                jsonContentTypes: ["string"],
            },
            kmsKey: "string",
        },
        desiredState: aws_native.sagemaker.InferenceExperimentDesiredState.Running,
        schedule: {
            endTime: "string",
            startTime: "string",
        },
        shadowModeConfig: {
            shadowModelVariants: [{
                samplingPercentage: 0,
                shadowModelVariantName: "string",
            }],
            sourceModelVariantName: "string",
        },
        statusReason: "string",
        tags: [{
            key: "string",
            value: "string",
        }],
        description: "string",
    });
    
    

    InferenceExperiment Resource Properties

    To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

    Inputs

    The InferenceExperiment resource accepts the following input properties:

    C#
    EndpointName string
    ModelVariants List<Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentModelVariantConfig>
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    RoleArn string
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
    Type Pulumi.AwsNative.SageMaker.InferenceExperimentType
    The type of the inference experiment that you want to run.
    DataStorageConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentDataStorageConfig
    Description string
    The description of the inference experiment.
    DesiredState Pulumi.AwsNative.SageMaker.InferenceExperimentDesiredState
    The desired state of the experiment after a start or stop operation.
    KmsKey string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
    Name string
    The name for the inference experiment.
    Schedule Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentSchedule
    ShadowModeConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentShadowModeConfig
    StatusReason string
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    Tags List<Pulumi.AwsNative.Inputs.Tag>
    An array of key-value pairs to apply to this resource.
    Go
    EndpointName string
    ModelVariants []InferenceExperimentModelVariantConfigArgs
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    RoleArn string
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
    Type InferenceExperimentType
    The type of the inference experiment that you want to run.
    DataStorageConfig InferenceExperimentDataStorageConfigArgs
    Description string
    The description of the inference experiment.
    DesiredState InferenceExperimentDesiredState
    The desired state of the experiment after a start or stop operation.
    KmsKey string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
    Name string
    The name for the inference experiment.
    Schedule InferenceExperimentScheduleArgs
    ShadowModeConfig InferenceExperimentShadowModeConfigArgs
    StatusReason string
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    Tags []TagArgs
    An array of key-value pairs to apply to this resource.
    Java
    endpointName String
    modelVariants List<InferenceExperimentModelVariantConfig>
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    roleArn String
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
    type InferenceExperimentType
    The type of the inference experiment that you want to run.
    dataStorageConfig InferenceExperimentDataStorageConfig
    description String
    The description of the inference experiment.
    desiredState InferenceExperimentDesiredState
    The desired state of the experiment after a start or stop operation.
    kmsKey String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
    name String
    The name for the inference experiment.
    schedule InferenceExperimentSchedule
    shadowModeConfig InferenceExperimentShadowModeConfig
    statusReason String
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    tags List<Tag>
    An array of key-value pairs to apply to this resource.
    TypeScript
    endpointName string
    modelVariants InferenceExperimentModelVariantConfig[]
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    roleArn string
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
    type InferenceExperimentType
    The type of the inference experiment that you want to run.
    dataStorageConfig InferenceExperimentDataStorageConfig
    description string
    The description of the inference experiment.
    desiredState InferenceExperimentDesiredState
    The desired state of the experiment after a start or stop operation.
    kmsKey string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
    name string
    The name for the inference experiment.
    schedule InferenceExperimentSchedule
    shadowModeConfig InferenceExperimentShadowModeConfig
    statusReason string
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    tags Tag[]
    An array of key-value pairs to apply to this resource.
    Python
    endpoint_name str
    model_variants Sequence[InferenceExperimentModelVariantConfigArgs]
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    role_arn str
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
    type InferenceExperimentType
    The type of the inference experiment that you want to run.
    data_storage_config InferenceExperimentDataStorageConfigArgs
    description str
    The description of the inference experiment.
    desired_state InferenceExperimentDesiredState
    The desired state of the experiment after a start or stop operation.
    kms_key str
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
    name str
    The name for the inference experiment.
    schedule InferenceExperimentScheduleArgs
    shadow_mode_config InferenceExperimentShadowModeConfigArgs
    status_reason str
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    tags Sequence[TagArgs]
    An array of key-value pairs to apply to this resource.
    YAML
    endpointName String
    modelVariants List<Property Map>
    An array of ModelVariantConfig objects. Each ModelVariantConfig object in the array describes the infrastructure configuration for the corresponding variant.
    roleArn String
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to access model artifacts and container images, and manage Amazon SageMaker Inference endpoints for model deployment.
    type "ShadowMode"
    The type of the inference experiment that you want to run.
    dataStorageConfig Property Map
    description String
    The description of the inference experiment.
    desiredState "Running" | "Completed" | "Cancelled"
    The desired state of the experiment after a start or stop operation.
    kmsKey String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance that hosts the endpoint.
    name String
    The name for the inference experiment.
    schedule Property Map
    shadowModeConfig Property Map
    statusReason String
    The error message or client-specified reason from the StopInferenceExperiment API that explains the status of the inference experiment.
    tags List<Property Map>
    An array of key-value pairs to apply to this resource.
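    The four properties listed first (endpointName, modelVariants, roleArn, type) are the required inputs. A minimal sketch in Pulumi YAML, in which every concrete value (the endpoint, role ARN, model and variant names, and instance type) is a placeholder assumption:

    ```yaml
    # Minimal sketch; all names and ARNs below are placeholders.
    resources:
      shadowExperiment:
        type: aws-native:sagemaker:InferenceExperiment
        properties:
          type: ShadowMode                      # the only supported experiment type
          endpointName: my-endpoint             # existing SageMaker endpoint (placeholder)
          roleArn: arn:aws:iam::123456789012:role/sagemaker-experiment-role  # placeholder
          modelVariants:
            - modelName: my-model               # placeholder SageMaker Model name
              variantName: variant-a            # placeholder variant name
              infrastructureConfig:
                infrastructureType: RealTimeInference
                realTimeInferenceConfig:
                  instanceCount: 1
                  instanceType: ml.m5.large     # placeholder instance type
    ```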

    Outputs

    All input properties are implicitly available as output properties. Additionally, the InferenceExperiment resource produces the following output properties:

    C#
    Arn string
    The Amazon Resource Name (ARN) of the inference experiment.
    CreationTime string
    The timestamp at which you created the inference experiment.
    EndpointMetadata Pulumi.AwsNative.SageMaker.Outputs.InferenceExperimentEndpointMetadata
    Id string
    The provider-assigned unique ID for this managed resource.
    LastModifiedTime string
    The timestamp at which you last modified the inference experiment.
    Status Pulumi.AwsNative.SageMaker.InferenceExperimentStatus
    The status of the inference experiment.
    Go
    Arn string
    The Amazon Resource Name (ARN) of the inference experiment.
    CreationTime string
    The timestamp at which you created the inference experiment.
    EndpointMetadata InferenceExperimentEndpointMetadata
    Id string
    The provider-assigned unique ID for this managed resource.
    LastModifiedTime string
    The timestamp at which you last modified the inference experiment.
    Status InferenceExperimentStatus
    The status of the inference experiment.
    Java
    arn String
    The Amazon Resource Name (ARN) of the inference experiment.
    creationTime String
    The timestamp at which you created the inference experiment.
    endpointMetadata InferenceExperimentEndpointMetadata
    id String
    The provider-assigned unique ID for this managed resource.
    lastModifiedTime String
    The timestamp at which you last modified the inference experiment.
    status InferenceExperimentStatus
    The status of the inference experiment.
    TypeScript
    arn string
    The Amazon Resource Name (ARN) of the inference experiment.
    creationTime string
    The timestamp at which you created the inference experiment.
    endpointMetadata InferenceExperimentEndpointMetadata
    id string
    The provider-assigned unique ID for this managed resource.
    lastModifiedTime string
    The timestamp at which you last modified the inference experiment.
    status InferenceExperimentStatus
    The status of the inference experiment.
    Python
    arn str
    The Amazon Resource Name (ARN) of the inference experiment.
    creation_time str
    The timestamp at which you created the inference experiment.
    endpoint_metadata InferenceExperimentEndpointMetadata
    id str
    The provider-assigned unique ID for this managed resource.
    last_modified_time str
    The timestamp at which you last modified the inference experiment.
    status InferenceExperimentStatus
    The status of the inference experiment.
    YAML
    arn String
    The Amazon Resource Name (ARN) of the inference experiment.
    creationTime String
    The timestamp at which you created the inference experiment.
    endpointMetadata Property Map
    id String
    The provider-assigned unique ID for this managed resource.
    lastModifiedTime String
    The timestamp at which you last modified the inference experiment.
    status "Creating" | "Created" | "Updating" | "Starting" | "Stopping" | "Running" | "Completed" | "Cancelled"
    The status of the inference experiment.
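    The outputs above can be referenced by other resources or exported from the stack. A hypothetical Pulumi YAML fragment exporting two of them (the resource name shadowExperiment is a placeholder):

    ```yaml
    # Export the computed outputs of a hypothetical resource named shadowExperiment.
    outputs:
      experimentArn: ${shadowExperiment.arn}        # ARN assigned on creation
      experimentStatus: ${shadowExperiment.status}  # e.g. Creating, Running, Completed
    ```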

    Supporting Types

    InferenceExperimentCaptureContentTypeHeader, InferenceExperimentCaptureContentTypeHeaderArgs

    CsvContentTypes List<string>
    The list of all content type headers that SageMaker will treat as CSV and capture accordingly.
    JsonContentTypes List<string>
    The list of all content type headers that SageMaker will treat as JSON and capture accordingly.
    The properties above are shown for C#; the same properties apply in Go, Java, TypeScript, Python (snake_case names), and YAML.

    InferenceExperimentDataStorageConfig, InferenceExperimentDataStorageConfigArgs

    Destination string
    The Amazon S3 bucket where the inference request and response data is stored.
    ContentType Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentCaptureContentTypeHeader
    KmsKey string
    The AWS Key Management Service key that Amazon SageMaker uses to encrypt captured data at rest using Amazon S3 server-side encryption.
    The properties above are shown for C#; the same properties apply in Go, Java, TypeScript, Python (snake_case names), and YAML.

    InferenceExperimentDesiredState, InferenceExperimentDesiredStateArgs

    The values are Running, Completed, and Cancelled. Per-language enum member names:
    C#, Java, TypeScript: Running, Completed, Cancelled
    Go: InferenceExperimentDesiredStateRunning, InferenceExperimentDesiredStateCompleted, InferenceExperimentDesiredStateCancelled
    Python: RUNNING, COMPLETED, CANCELLED
    YAML: "Running", "Completed", "Cancelled"

    InferenceExperimentEndpointMetadata, InferenceExperimentEndpointMetadataArgs

    EndpointName string
    EndpointConfigName string
    The name of the endpoint configuration.
    EndpointStatus Pulumi.AwsNative.SageMaker.InferenceExperimentEndpointMetadataEndpointStatus
    The status of the endpoint. For the possible values, see the InferenceExperimentEndpointMetadataEndpointStatus type below.
    The properties above are shown for C#; the same properties apply in Go, Java, TypeScript, Python (snake_case names), and YAML.

    InferenceExperimentEndpointMetadataEndpointStatus, InferenceExperimentEndpointMetadataEndpointStatusArgs

    The values are Creating, Updating, SystemUpdating, RollingBack, InService, OutOfService, Deleting, and Failed. Per-language enum member names:
    C#, Java, TypeScript: Creating, Updating, SystemUpdating, RollingBack, InService, OutOfService, Deleting, Failed
    Go: each value prefixed with InferenceExperimentEndpointMetadataEndpointStatus (for example, InferenceExperimentEndpointMetadataEndpointStatusInService)
    Python: CREATING, UPDATING, SYSTEM_UPDATING, ROLLING_BACK, IN_SERVICE, OUT_OF_SERVICE, DELETING, FAILED
    YAML: "Creating", "Updating", "SystemUpdating", "RollingBack", "InService", "OutOfService", "Deleting", "Failed"

    InferenceExperimentModelInfrastructureConfig, InferenceExperimentModelInfrastructureConfigArgs

    infrastructureType "RealTimeInference"
    The type of infrastructure for the model variant; RealTimeInference is the only supported value.
    realTimeInferenceConfig Property Map

    InferenceExperimentModelInfrastructureConfigInfrastructureType, InferenceExperimentModelInfrastructureConfigInfrastructureTypeArgs

    The only value is RealTimeInference (Go: InferenceExperimentModelInfrastructureConfigInfrastructureTypeRealTimeInference; Python: REAL_TIME_INFERENCE; YAML: "RealTimeInference").

    InferenceExperimentModelVariantConfig, InferenceExperimentModelVariantConfigArgs

    InfrastructureConfig Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentModelInfrastructureConfig
    ModelName string
    The name of the Amazon SageMaker Model entity.
    VariantName string
    The name of the variant.
    The properties above are shown for C#; the same properties apply in Go, Java, TypeScript, Python (snake_case names), and YAML.

    InferenceExperimentRealTimeInferenceConfig, InferenceExperimentRealTimeInferenceConfigArgs

    InstanceCount int
    The number of instances of the type specified by InstanceType.
    InstanceType string
    The instance type the model is deployed to.
    The properties above are shown for C#; the same properties apply in Go, Java, TypeScript, Python (snake_case names), and YAML.

    InferenceExperimentSchedule, InferenceExperimentScheduleArgs

    EndTime string
    The timestamp at which the inference experiment ended or will end.
    StartTime string
    The timestamp at which the inference experiment started or will start.
    The properties above are shown for C#; the same properties apply in Go, Java, TypeScript, Python (snake_case names), and YAML.
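    The schedule timestamps are plain strings; CloudFormation timestamp fields generally expect ISO 8601 values, so a fragment might look like the following (the dates themselves are placeholders):

    ```yaml
    # A one-week experiment window; the dates are placeholders.
    schedule:
      startTime: "2024-05-01T00:00:00Z"   # assumed ISO 8601 UTC timestamp
      endTime: "2024-05-08T00:00:00Z"
    ```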

    InferenceExperimentShadowModeConfig, InferenceExperimentShadowModeConfigArgs

    ShadowModelVariants List<Pulumi.AwsNative.SageMaker.Inputs.InferenceExperimentShadowModelVariantConfig>
    List of shadow variant configurations.
    SourceModelVariantName string
    The name of the production variant, which takes all the inference requests.
    The properties above are shown for C#; the same properties apply in Go, Java, TypeScript, Python (snake_case names), and YAML.
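    A shadow-mode configuration pairs one production (source) variant with one or more shadow variants, each with a sampling percentage. A sketch with placeholder variant names:

    ```yaml
    # Replicate 50% of requests from the production variant to the shadow variant.
    shadowModeConfig:
      sourceModelVariantName: production-variant   # placeholder production variant name
      shadowModelVariants:
        - shadowModelVariantName: shadow-variant   # placeholder shadow variant name
          samplingPercentage: 50
    ```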

    InferenceExperimentShadowModelVariantConfig, InferenceExperimentShadowModelVariantConfigArgs

    SamplingPercentage int
    The percentage of inference requests that Amazon SageMaker replicates from the production variant to the shadow variant.
    ShadowModelVariantName string
    The name of the shadow variant.
    The properties above are shown for C#; the same properties apply in Go, Java, TypeScript, Python (snake_case names), and YAML.

    InferenceExperimentStatus, InferenceExperimentStatusArgs

    The values are Creating, Created, Updating, Starting, Stopping, Running, Completed, and Cancelled. Per-language enum member names:
    C#, Java, TypeScript: Creating, Created, Updating, Starting, Stopping, Running, Completed, Cancelled
    Go: each value prefixed with InferenceExperimentStatus (for example, InferenceExperimentStatusRunning)
    Python: CREATING, CREATED, UPDATING, STARTING, STOPPING, RUNNING, COMPLETED, CANCELLED
    YAML: "Creating", "Created", "Updating", "Starting", "Stopping", "Running", "Completed", "Cancelled"

    InferenceExperimentType, InferenceExperimentTypeArgs

    The only value is ShadowMode (Go: InferenceExperimentTypeShadowMode; Python: SHADOW_MODE; YAML: "ShadowMode").

    Tag, TagArgs

    Key string
    The key name of the tag.
    Value string
    The value of the tag.
    The properties above are shown for C#; the same properties apply in Go, Java, TypeScript, Python (snake_case names), and YAML.

    Package Details

    Repository
    AWS Native pulumi/pulumi-aws-native
    License
    Apache-2.0