
AWS Native is in preview. AWS Classic is fully supported.

AWS Native v0.102.0 published on Tuesday, Apr 16, 2024 by Pulumi

aws-native.sagemaker.ModelBiasJobDefinition


    Resource Type definition for AWS::SageMaker::ModelBiasJobDefinition

    Create ModelBiasJobDefinition Resource

    Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.

    Constructor syntax

    new ModelBiasJobDefinition(name: string, args: ModelBiasJobDefinitionArgs, opts?: CustomResourceOptions);
    @overload
    def ModelBiasJobDefinition(resource_name: str,
                               args: ModelBiasJobDefinitionArgs,
                               opts: Optional[ResourceOptions] = None)
    
    @overload
    def ModelBiasJobDefinition(resource_name: str,
                               opts: Optional[ResourceOptions] = None,
                               job_resources: Optional[ModelBiasJobDefinitionMonitoringResourcesArgs] = None,
                               model_bias_app_specification: Optional[ModelBiasJobDefinitionModelBiasAppSpecificationArgs] = None,
                               model_bias_job_input: Optional[ModelBiasJobDefinitionModelBiasJobInputArgs] = None,
                               model_bias_job_output_config: Optional[ModelBiasJobDefinitionMonitoringOutputConfigArgs] = None,
                               role_arn: Optional[str] = None,
                               endpoint_name: Optional[str] = None,
                               job_definition_name: Optional[str] = None,
                               model_bias_baseline_config: Optional[ModelBiasJobDefinitionModelBiasBaselineConfigArgs] = None,
                               network_config: Optional[ModelBiasJobDefinitionNetworkConfigArgs] = None,
                               stopping_condition: Optional[ModelBiasJobDefinitionStoppingConditionArgs] = None,
                               tags: Optional[Sequence[_root_inputs.CreateOnlyTagArgs]] = None)
    func NewModelBiasJobDefinition(ctx *Context, name string, args ModelBiasJobDefinitionArgs, opts ...ResourceOption) (*ModelBiasJobDefinition, error)
    public ModelBiasJobDefinition(string name, ModelBiasJobDefinitionArgs args, CustomResourceOptions? opts = null)
    public ModelBiasJobDefinition(String name, ModelBiasJobDefinitionArgs args)
    public ModelBiasJobDefinition(String name, ModelBiasJobDefinitionArgs args, CustomResourceOptions options)
    
    type: aws-native:sagemaker:ModelBiasJobDefinition
    properties: # The arguments to resource properties.
    options: # Bag of options to control resource's behavior.
    
    

    Parameters

    name string
    The unique name of the resource.
    args ModelBiasJobDefinitionArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    resource_name str
    The unique name of the resource.
    args ModelBiasJobDefinitionArgs
    The arguments to resource properties.
    opts ResourceOptions
    Bag of options to control resource's behavior.
    ctx Context
    Context object for the current deployment.
    name string
    The unique name of the resource.
    args ModelBiasJobDefinitionArgs
    The arguments to resource properties.
    opts ResourceOption
    Bag of options to control resource's behavior.
    name string
    The unique name of the resource.
    args ModelBiasJobDefinitionArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    name String
    The unique name of the resource.
    args ModelBiasJobDefinitionArgs
    The arguments to resource properties.
    options CustomResourceOptions
    Bag of options to control resource's behavior.

    Example

    The following reference example uses placeholder values for all input properties.

    
    const modelBiasJobDefinitionResource = new aws_native.sagemaker.ModelBiasJobDefinition("modelBiasJobDefinitionResource", {
        jobResources: {
            clusterConfig: {
                instanceCount: 0,
                instanceType: "string",
                volumeSizeInGb: 0,
                volumeKmsKeyId: "string",
            },
        },
        modelBiasAppSpecification: {
            configUri: "string",
            imageUri: "string",
            environment: "any",
        },
        modelBiasJobInput: {
            groundTruthS3Input: {
                s3Uri: "string",
            },
            batchTransformInput: {
                dataCapturedDestinationS3Uri: "string",
                datasetFormat: {
                    csv: {
                        header: false,
                    },
                    json: {
                        line: false,
                    },
                    parquet: false,
                },
                localPath: "string",
                endTimeOffset: "string",
                featuresAttribute: "string",
                inferenceAttribute: "string",
                probabilityAttribute: "string",
                probabilityThresholdAttribute: 0,
                s3DataDistributionType: aws_native.sagemaker.ModelBiasJobDefinitionBatchTransformInputS3DataDistributionType.FullyReplicated,
                s3InputMode: aws_native.sagemaker.ModelBiasJobDefinitionBatchTransformInputS3InputMode.Pipe,
                startTimeOffset: "string",
            },
            endpointInput: {
                endpointName: "string",
                localPath: "string",
                endTimeOffset: "string",
                featuresAttribute: "string",
                inferenceAttribute: "string",
                probabilityAttribute: "string",
                probabilityThresholdAttribute: 0,
                s3DataDistributionType: aws_native.sagemaker.ModelBiasJobDefinitionEndpointInputS3DataDistributionType.FullyReplicated,
                s3InputMode: aws_native.sagemaker.ModelBiasJobDefinitionEndpointInputS3InputMode.Pipe,
                startTimeOffset: "string",
            },
        },
        modelBiasJobOutputConfig: {
            monitoringOutputs: [{
                s3Output: {
                    localPath: "string",
                    s3Uri: "string",
                    s3UploadMode: aws_native.sagemaker.ModelBiasJobDefinitionS3OutputS3UploadMode.Continuous,
                },
            }],
            kmsKeyId: "string",
        },
        roleArn: "string",
        endpointName: "string",
        jobDefinitionName: "string",
        modelBiasBaselineConfig: {
            baseliningJobName: "string",
            constraintsResource: {
                s3Uri: "string",
            },
        },
        networkConfig: {
            enableInterContainerTrafficEncryption: false,
            enableNetworkIsolation: false,
            vpcConfig: {
                securityGroupIds: ["string"],
                subnets: ["string"],
            },
        },
        stoppingCondition: {
            maxRuntimeInSeconds: 0,
        },
        tags: [{
            key: "string",
            value: "string",
        }],
    });
    

    ModelBiasJobDefinition Resource Properties

    To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

    Inputs

    The ModelBiasJobDefinition resource accepts the following input properties:

    JobResources Pulumi.AwsNative.SageMaker.Inputs.ModelBiasJobDefinitionMonitoringResources
    ModelBiasAppSpecification Pulumi.AwsNative.SageMaker.Inputs.ModelBiasJobDefinitionModelBiasAppSpecification
    ModelBiasJobInput Pulumi.AwsNative.SageMaker.Inputs.ModelBiasJobDefinitionModelBiasJobInput
    ModelBiasJobOutputConfig Pulumi.AwsNative.SageMaker.Inputs.ModelBiasJobDefinitionMonitoringOutputConfig
    RoleArn string
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
    EndpointName string
    JobDefinitionName string
    ModelBiasBaselineConfig Pulumi.AwsNative.SageMaker.Inputs.ModelBiasJobDefinitionModelBiasBaselineConfig
    NetworkConfig Pulumi.AwsNative.SageMaker.Inputs.ModelBiasJobDefinitionNetworkConfig
    StoppingCondition Pulumi.AwsNative.SageMaker.Inputs.ModelBiasJobDefinitionStoppingCondition
    Tags List<Pulumi.AwsNative.Inputs.CreateOnlyTag>
    An array of key-value pairs to apply to this resource.
    jobResources Property Map
    modelBiasAppSpecification Property Map
    modelBiasJobInput Property Map
    modelBiasJobOutputConfig Property Map
    roleArn String
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
    endpointName String
    jobDefinitionName String
    modelBiasBaselineConfig Property Map
    networkConfig Property Map
    stoppingCondition Property Map
    tags List<Property Map>
    An array of key-value pairs to apply to this resource.

    Outputs

    All input properties are implicitly available as output properties. Additionally, the ModelBiasJobDefinition resource produces the following output properties:

    CreationTime string
    The time at which the job definition was created.
    Id string
    The provider-assigned unique ID for this managed resource.
    JobDefinitionArn string
    The Amazon Resource Name (ARN) of the job definition.
    CreationTime string
    The time at which the job definition was created.
    Id string
    The provider-assigned unique ID for this managed resource.
    JobDefinitionArn string
    The Amazon Resource Name (ARN) of the job definition.
    creationTime String
    The time at which the job definition was created.
    id String
    The provider-assigned unique ID for this managed resource.
    jobDefinitionArn String
    The Amazon Resource Name (ARN) of the job definition.
    creationTime string
    The time at which the job definition was created.
    id string
    The provider-assigned unique ID for this managed resource.
    jobDefinitionArn string
    The Amazon Resource Name (ARN) of the job definition.
    creation_time str
    The time at which the job definition was created.
    id str
    The provider-assigned unique ID for this managed resource.
    job_definition_arn str
    The Amazon Resource Name (ARN) of the job definition.
    creationTime String
    The time at which the job definition was created.
    id String
    The provider-assigned unique ID for this managed resource.
    jobDefinitionArn String
    The Amazon Resource Name (ARN) of the job definition.

    Supporting Types

    CreateOnlyTag, CreateOnlyTagArgs

    Key string
    The key name of the tag
    Value string
    The value of the tag
    Key string
    The key name of the tag
    Value string
    The value of the tag
    key String
    The key name of the tag
    value String
    The value of the tag
    key string
    The key name of the tag
    value string
    The value of the tag
    key str
    The key name of the tag
    value str
    The value of the tag
    key String
    The key name of the tag
    value String
    The value of the tag

    ModelBiasJobDefinitionBatchTransformInput, ModelBiasJobDefinitionBatchTransformInputArgs

    DataCapturedDestinationS3Uri string
    A URI that identifies the Amazon S3 storage location where Batch Transform Job captures data.
    DatasetFormat Pulumi.AwsNative.SageMaker.Inputs.ModelBiasJobDefinitionDatasetFormat
    LocalPath string
    Path to the filesystem where the endpoint data is available to the container.
    EndTimeOffset string
    Monitoring end time offset, e.g. PT0H
    FeaturesAttribute string
    JSONPath to locate features in a JSON Lines dataset
    InferenceAttribute string
    Index or JSONPath to locate predicted label(s)
    ProbabilityAttribute string
    Index or JSONPath to locate probabilities
    ProbabilityThresholdAttribute double
    S3DataDistributionType Pulumi.AwsNative.SageMaker.ModelBiasJobDefinitionBatchTransformInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    S3InputMode Pulumi.AwsNative.SageMaker.ModelBiasJobDefinitionBatchTransformInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    StartTimeOffset string
    Monitoring start time offset, e.g. -PT1H
    DataCapturedDestinationS3Uri string
    A URI that identifies the Amazon S3 storage location where Batch Transform Job captures data.
    DatasetFormat ModelBiasJobDefinitionDatasetFormat
    LocalPath string
    Path to the filesystem where the endpoint data is available to the container.
    EndTimeOffset string
    Monitoring end time offset, e.g. PT0H
    FeaturesAttribute string
    JSONPath to locate features in a JSON Lines dataset
    InferenceAttribute string
    Index or JSONPath to locate predicted label(s)
    ProbabilityAttribute string
    Index or JSONPath to locate probabilities
    ProbabilityThresholdAttribute float64
    S3DataDistributionType ModelBiasJobDefinitionBatchTransformInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    S3InputMode ModelBiasJobDefinitionBatchTransformInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    StartTimeOffset string
    Monitoring start time offset, e.g. -PT1H
    dataCapturedDestinationS3Uri String
    A URI that identifies the Amazon S3 storage location where Batch Transform Job captures data.
    datasetFormat ModelBiasJobDefinitionDatasetFormat
    localPath String
    Path to the filesystem where the endpoint data is available to the container.
    endTimeOffset String
    Monitoring end time offset, e.g. PT0H
    featuresAttribute String
    JSONPath to locate features in a JSON Lines dataset
    inferenceAttribute String
    Index or JSONPath to locate predicted label(s)
    probabilityAttribute String
    Index or JSONPath to locate probabilities
    probabilityThresholdAttribute Double
    s3DataDistributionType ModelBiasJobDefinitionBatchTransformInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3InputMode ModelBiasJobDefinitionBatchTransformInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    startTimeOffset String
    Monitoring start time offset, e.g. -PT1H
    dataCapturedDestinationS3Uri string
    A URI that identifies the Amazon S3 storage location where Batch Transform Job captures data.
    datasetFormat ModelBiasJobDefinitionDatasetFormat
    localPath string
    Path to the filesystem where the endpoint data is available to the container.
    endTimeOffset string
    Monitoring end time offset, e.g. PT0H
    featuresAttribute string
    JSONPath to locate features in a JSON Lines dataset
    inferenceAttribute string
    Index or JSONPath to locate predicted label(s)
    probabilityAttribute string
    Index or JSONPath to locate probabilities
    probabilityThresholdAttribute number
    s3DataDistributionType ModelBiasJobDefinitionBatchTransformInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3InputMode ModelBiasJobDefinitionBatchTransformInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    startTimeOffset string
    Monitoring start time offset, e.g. -PT1H
    data_captured_destination_s3_uri str
    A URI that identifies the Amazon S3 storage location where Batch Transform Job captures data.
    dataset_format ModelBiasJobDefinitionDatasetFormat
    local_path str
    Path to the filesystem where the endpoint data is available to the container.
    end_time_offset str
    Monitoring end time offset, e.g. PT0H
    features_attribute str
    JSONPath to locate features in a JSON Lines dataset
    inference_attribute str
    Index or JSONPath to locate predicted label(s)
    probability_attribute str
    Index or JSONPath to locate probabilities
    probability_threshold_attribute float
    s3_data_distribution_type ModelBiasJobDefinitionBatchTransformInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3_input_mode ModelBiasJobDefinitionBatchTransformInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    start_time_offset str
    Monitoring start time offset, e.g. -PT1H
    dataCapturedDestinationS3Uri String
    A URI that identifies the Amazon S3 storage location where Batch Transform Job captures data.
    datasetFormat Property Map
    localPath String
    Path to the filesystem where the endpoint data is available to the container.
    endTimeOffset String
    Monitoring end time offset, e.g. PT0H
    featuresAttribute String
    JSONPath to locate features in a JSON Lines dataset
    inferenceAttribute String
    Index or JSONPath to locate predicted label(s)
    probabilityAttribute String
    Index or JSONPath to locate probabilities
    probabilityThresholdAttribute Number
    s3DataDistributionType "FullyReplicated" | "ShardedByS3Key"
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3InputMode "Pipe" | "File"
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    startTimeOffset String
    Monitoring start time offset, e.g. -PT1H
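The startTimeOffset and endTimeOffset fields above are ISO 8601 durations applied relative to the monitoring job's scheduled run time. The following sketch shows that interpretation; parseOffsetHours and monitoringWindow are illustrative helpers, not part of the Pulumi SDK, and only the hour-only forms shown in these docs (e.g. "-PT1H", "PT0H") are handled.

```typescript
// Parse an hour-only ISO 8601 duration offset such as "-PT1H" or "PT0H"
// into a signed number of hours. Illustrative helper, not a Pulumi API.
function parseOffsetHours(offset: string): number {
    const m = /^(-)?PT(\d+)H$/.exec(offset);
    if (!m) throw new Error(`unsupported offset: ${offset}`);
    const hours = parseInt(m[2], 10);
    return m[1] ? -hours : hours;
}

// Compute the monitoring window relative to the job's scheduled time.
function monitoringWindow(scheduledTime: Date, startOffset: string, endOffset: string) {
    const hourMs = 60 * 60 * 1000;
    return {
        start: new Date(scheduledTime.getTime() + parseOffsetHours(startOffset) * hourMs),
        end: new Date(scheduledTime.getTime() + parseOffsetHours(endOffset) * hourMs),
    };
}
```

For example, startTimeOffset "-PT1H" with endTimeOffset "PT0H" covers the hour immediately preceding the scheduled run.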

    ModelBiasJobDefinitionBatchTransformInputS3DataDistributionType, ModelBiasJobDefinitionBatchTransformInputS3DataDistributionTypeArgs

    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    ModelBiasJobDefinitionBatchTransformInputS3DataDistributionTypeFullyReplicated
    FullyReplicated
    ModelBiasJobDefinitionBatchTransformInputS3DataDistributionTypeShardedByS3Key
    ShardedByS3Key
    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    FULLY_REPLICATED
    FullyReplicated
    SHARDED_BY_S3_KEY
    ShardedByS3Key
    "FullyReplicated"
    FullyReplicated
    "ShardedByS3Key"
    ShardedByS3Key

    ModelBiasJobDefinitionBatchTransformInputS3InputMode, ModelBiasJobDefinitionBatchTransformInputS3InputModeArgs

    Pipe
    Pipe
    File
    File
    ModelBiasJobDefinitionBatchTransformInputS3InputModePipe
    Pipe
    ModelBiasJobDefinitionBatchTransformInputS3InputModeFile
    File
    Pipe
    Pipe
    File
    File
    Pipe
    Pipe
    File
    File
    PIPE
    Pipe
    FILE
    File
    "Pipe"
    Pipe
    "File"
    File

    ModelBiasJobDefinitionClusterConfig, ModelBiasJobDefinitionClusterConfigArgs

    InstanceCount int
    The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    InstanceType string
    The ML compute instance type for the processing job.
    VolumeSizeInGb int
    The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.
    VolumeKmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.
    InstanceCount int
    The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    InstanceType string
    The ML compute instance type for the processing job.
    VolumeSizeInGb int
    The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.
    VolumeKmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.
    instanceCount Integer
    The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    instanceType String
    The ML compute instance type for the processing job.
    volumeSizeInGb Integer
    The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.
    volumeKmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.
    instanceCount number
    The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    instanceType string
    The ML compute instance type for the processing job.
    volumeSizeInGb number
    The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.
    volumeKmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.
    instance_count int
    The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    instance_type str
    The ML compute instance type for the processing job.
    volume_size_in_gb int
    The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.
    volume_kms_key_id str
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.
    instanceCount Number
    The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    instanceType String
    The ML compute instance type for the processing job.
    volumeSizeInGb Number
    The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.
    volumeKmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.

    ModelBiasJobDefinitionConstraintsResource, ModelBiasJobDefinitionConstraintsResourceArgs

    S3Uri string
    The Amazon S3 URI of the baseline constraints file that the current monitoring job should be validated against.
    S3Uri string
    The Amazon S3 URI of the baseline constraints file that the current monitoring job should be validated against.
    s3Uri String
    The Amazon S3 URI of the baseline constraints file that the current monitoring job should be validated against.
    s3Uri string
    The Amazon S3 URI of the baseline constraints file that the current monitoring job should be validated against.
    s3_uri str
    The Amazon S3 URI of the baseline constraints file that the current monitoring job should be validated against.
    s3Uri String
    The Amazon S3 URI of the baseline constraints file that the current monitoring job should be validated against.

    ModelBiasJobDefinitionCsv, ModelBiasJobDefinitionCsvArgs

    Header bool
    A boolean flag indicating whether the given CSV file has a header
    Header bool
    A boolean flag indicating whether the given CSV file has a header
    header Boolean
    A boolean flag indicating whether the given CSV file has a header
    header boolean
    A boolean flag indicating whether the given CSV file has a header
    header bool
    A boolean flag indicating whether the given CSV file has a header
    header Boolean
    A boolean flag indicating whether the given CSV file has a header

    ModelBiasJobDefinitionDatasetFormat, ModelBiasJobDefinitionDatasetFormatArgs

    Csv ModelBiasJobDefinitionCsv
    Json ModelBiasJobDefinitionJson
    Parquet bool

    ModelBiasJobDefinitionEndpointInput, ModelBiasJobDefinitionEndpointInputArgs

    EndpointName string
    LocalPath string
    Path to the filesystem where the endpoint data is available to the container.
    EndTimeOffset string
    Monitoring end time offset, e.g. PT0H
    FeaturesAttribute string
    JSONPath to locate features in a JSON Lines dataset
    InferenceAttribute string
    Index or JSONPath to locate predicted label(s)
    ProbabilityAttribute string
    Index or JSONPath to locate probabilities
    ProbabilityThresholdAttribute double
    S3DataDistributionType Pulumi.AwsNative.SageMaker.ModelBiasJobDefinitionEndpointInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    S3InputMode Pulumi.AwsNative.SageMaker.ModelBiasJobDefinitionEndpointInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    StartTimeOffset string
    Monitoring start time offset, e.g. -PT1H
    EndpointName string
    LocalPath string
    Path to the filesystem where the endpoint data is available to the container.
    EndTimeOffset string
    Monitoring end time offset, e.g. PT0H
    FeaturesAttribute string
    JSONPath to locate features in a JSON Lines dataset
    InferenceAttribute string
    Index or JSONPath to locate predicted label(s)
    ProbabilityAttribute string
    Index or JSONPath to locate probabilities
    ProbabilityThresholdAttribute float64
    S3DataDistributionType ModelBiasJobDefinitionEndpointInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    S3InputMode ModelBiasJobDefinitionEndpointInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    StartTimeOffset string
    Monitoring start time offset, e.g. -PT1H
    endpointName String
    localPath String
    Path to the filesystem where the endpoint data is available to the container.
    endTimeOffset String
    Monitoring end time offset, e.g. PT0H
    featuresAttribute String
    JSONPath to locate features in a JSON Lines dataset
    inferenceAttribute String
    Index or JSONPath to locate predicted label(s)
    probabilityAttribute String
    Index or JSONPath to locate probabilities
    probabilityThresholdAttribute Double
    s3DataDistributionType ModelBiasJobDefinitionEndpointInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3InputMode ModelBiasJobDefinitionEndpointInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    startTimeOffset String
    Monitoring start time offset, e.g. -PT1H
    endpointName string
    localPath string
    Path to the filesystem where the endpoint data is available to the container.
    endTimeOffset string
    Monitoring end time offset, e.g. PT0H
    featuresAttribute string
    JSONPath to locate features in a JSON Lines dataset
    inferenceAttribute string
    Index or JSONPath to locate predicted label(s)
    probabilityAttribute string
    Index or JSONPath to locate probabilities
    probabilityThresholdAttribute number
    s3DataDistributionType ModelBiasJobDefinitionEndpointInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3InputMode ModelBiasJobDefinitionEndpointInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    startTimeOffset string
    Monitoring start time offset, e.g. -PT1H
    endpoint_name str
    local_path str
    Path to the filesystem where the endpoint data is available to the container.
    end_time_offset str
    Monitoring end time offset, e.g. PT0H
    features_attribute str
    JSONPath to locate features in a JSON Lines dataset
    inference_attribute str
    Index or JSONPath to locate predicted label(s)
    probability_attribute str
    Index or JSONPath to locate probabilities
    probability_threshold_attribute float
    s3_data_distribution_type ModelBiasJobDefinitionEndpointInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3_input_mode ModelBiasJobDefinitionEndpointInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    start_time_offset str
    Monitoring start time offset, e.g. -PT1H
    endpointName String
    localPath String
    Path to the filesystem where the endpoint data is available to the container.
    endTimeOffset String
    Monitoring end time offset, e.g. PT0H
    featuresAttribute String
    JSONPath to locate features in a JSON Lines dataset
    inferenceAttribute String
    Index or JSONPath to locate predicted label(s)
    probabilityAttribute String
    Index or JSONPath to locate probabilities
    probabilityThresholdAttribute Number
    s3DataDistributionType "FullyReplicated" | "ShardedByS3Key"
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3InputMode "Pipe" | "File"
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    startTimeOffset String
    Monitoring start time offset, e.g. -PT1H

    ModelBiasJobDefinitionEndpointInputS3DataDistributionType, ModelBiasJobDefinitionEndpointInputS3DataDistributionTypeArgs

    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    ModelBiasJobDefinitionEndpointInputS3DataDistributionTypeFullyReplicated
    FullyReplicated
    ModelBiasJobDefinitionEndpointInputS3DataDistributionTypeShardedByS3Key
    ShardedByS3Key
    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    FULLY_REPLICATED
    FullyReplicated
    SHARDED_BY_S3_KEY
    ShardedByS3Key
    "FullyReplicated"
    FullyReplicated
    "ShardedByS3Key"
    ShardedByS3Key

    ModelBiasJobDefinitionEndpointInputS3InputMode, ModelBiasJobDefinitionEndpointInputS3InputModeArgs

    Pipe
    Pipe
    File
    File
    ModelBiasJobDefinitionEndpointInputS3InputModePipe
    Pipe
    ModelBiasJobDefinitionEndpointInputS3InputModeFile
    File
    Pipe
    Pipe
    File
    File
    Pipe
    Pipe
    File
    File
    PIPE
    Pipe
    FILE
    File
    "Pipe"
    Pipe
    "File"
    File

    ModelBiasJobDefinitionJson, ModelBiasJobDefinitionJsonArgs

    Line bool
    A boolean flag indicating whether the dataset is in JSON Lines format
    Line bool
    A boolean flag indicating whether the dataset is in JSON Lines format
    line Boolean
    A boolean flag indicating whether the dataset is in JSON Lines format
    line boolean
    A boolean flag indicating whether the dataset is in JSON Lines format
    line bool
    A boolean flag indicating whether the dataset is in JSON Lines format
    line Boolean
    A boolean flag indicating whether the dataset is in JSON Lines format

    ModelBiasJobDefinitionModelBiasAppSpecification, ModelBiasJobDefinitionModelBiasAppSpecificationArgs

    ConfigUri string
    The S3 URI to an analysis configuration file
    ImageUri string
    The container image to be run by the monitoring job.
    Environment object
    Sets the environment variables in the Docker container
    ConfigUri string
    The S3 URI to an analysis configuration file
    ImageUri string
    The container image to be run by the monitoring job.
    Environment interface{}
    Sets the environment variables in the Docker container
    configUri String
    The S3 URI to an analysis configuration file
    imageUri String
    The container image to be run by the monitoring job.
    environment Object
    Sets the environment variables in the Docker container
    configUri string
    The S3 URI to an analysis configuration file
    imageUri string
    The container image to be run by the monitoring job.
    environment any
    Sets the environment variables in the Docker container
    config_uri str
    The S3 URI to an analysis configuration file
    image_uri str
    The container image to be run by the monitoring job.
    environment Any
    Sets the environment variables in the Docker container
    configUri String
    The S3 URI to an analysis configuration file
    imageUri String
    The container image to be run by the monitoring job.
    environment Any
    Sets the environment variables in the Docker container
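    A minimal Python sketch of the app specification above — the S3 URI, container image, and environment values are placeholders, not real endpoints:

    ```python
    import pulumi_aws_native as aws_native

    # Sketch only: config_uri and image_uri below are hypothetical values.
    # config_uri points at a Clarify analysis configuration file in S3;
    # image_uri is the monitoring container to run.
    app_spec = aws_native.sagemaker.ModelBiasJobDefinitionModelBiasAppSpecificationArgs(
        config_uri="s3://example-bucket/bias/analysis_config.json",
        image_uri="123456789012.dkr.ecr.us-east-1.amazonaws.com/sagemaker-clarify:latest",
        environment={"LOG_LEVEL": "INFO"},
    )
    ```

    The resulting object is passed as the model_bias_app_specification argument of the ModelBiasJobDefinition constructor shown above.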

    ModelBiasJobDefinitionModelBiasBaselineConfig, ModelBiasJobDefinitionModelBiasBaselineConfigArgs

    ModelBiasJobDefinitionModelBiasJobInput, ModelBiasJobDefinitionModelBiasJobInputArgs

    ModelBiasJobDefinitionMonitoringGroundTruthS3Input, ModelBiasJobDefinitionMonitoringGroundTruthS3InputArgs

    S3Uri string
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    S3Uri string
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    s3Uri String
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    s3Uri string
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    s3_uri str
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    s3Uri String
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.

    ModelBiasJobDefinitionMonitoringOutput, ModelBiasJobDefinitionMonitoringOutputArgs

    ModelBiasJobDefinitionMonitoringOutputConfig, ModelBiasJobDefinitionMonitoringOutputConfigArgs

    MonitoringOutputs List<Pulumi.AwsNative.SageMaker.Inputs.ModelBiasJobDefinitionMonitoringOutput>
    Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.
    KmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.
    MonitoringOutputs []ModelBiasJobDefinitionMonitoringOutput
    Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.
    KmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.
    monitoringOutputs List<ModelBiasJobDefinitionMonitoringOutput>
    Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.
    kmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.
    monitoringOutputs ModelBiasJobDefinitionMonitoringOutput[]
    Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.
    kmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.
    monitoring_outputs Sequence[ModelBiasJobDefinitionMonitoringOutput]
    Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.
    kms_key_id str
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.
    monitoringOutputs List<Property Map>
    Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.
    kmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.

    ModelBiasJobDefinitionMonitoringResources, ModelBiasJobDefinitionMonitoringResourcesArgs

    ModelBiasJobDefinitionNetworkConfig, ModelBiasJobDefinitionNetworkConfigArgs

    EnableInterContainerTrafficEncryption bool
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    EnableNetworkIsolation bool
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    VpcConfig Pulumi.AwsNative.SageMaker.Inputs.ModelBiasJobDefinitionVpcConfig
    EnableInterContainerTrafficEncryption bool
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    EnableNetworkIsolation bool
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    VpcConfig ModelBiasJobDefinitionVpcConfig
    enableInterContainerTrafficEncryption Boolean
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    enableNetworkIsolation Boolean
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    vpcConfig ModelBiasJobDefinitionVpcConfig
    enableInterContainerTrafficEncryption boolean
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    enableNetworkIsolation boolean
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    vpcConfig ModelBiasJobDefinitionVpcConfig
    enable_inter_container_traffic_encryption bool
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    enable_network_isolation bool
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    vpc_config ModelBiasJobDefinitionVpcConfig
    enableInterContainerTrafficEncryption Boolean
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    enableNetworkIsolation Boolean
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    vpcConfig Property Map
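    Tying the network settings together, a hedged Python sketch (the security group and subnet IDs are placeholders):

    ```python
    import pulumi_aws_native as aws_native

    # Sketch only: IDs are placeholders. Inter-container traffic encryption
    # adds security for distributed jobs at some runtime cost; network
    # isolation controls the containers' inbound/outbound network access.
    network_config = aws_native.sagemaker.ModelBiasJobDefinitionNetworkConfigArgs(
        enable_inter_container_traffic_encryption=True,
        enable_network_isolation=False,
        vpc_config=aws_native.sagemaker.ModelBiasJobDefinitionVpcConfigArgs(
            security_group_ids=["sg-0123456789abcdef0"],
            subnets=["subnet-0123456789abcdef0"],
        ),
    )
    ```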

    ModelBiasJobDefinitionS3Output, ModelBiasJobDefinitionS3OutputArgs

    LocalPath string
    The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.
    S3Uri string
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    S3UploadMode Pulumi.AwsNative.SageMaker.ModelBiasJobDefinitionS3OutputS3UploadMode
    Whether to upload the results of the monitoring job continuously or after the job completes.
    LocalPath string
    The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.
    S3Uri string
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    S3UploadMode ModelBiasJobDefinitionS3OutputS3UploadMode
    Whether to upload the results of the monitoring job continuously or after the job completes.
    localPath String
    The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.
    s3Uri String
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    s3UploadMode ModelBiasJobDefinitionS3OutputS3UploadMode
    Whether to upload the results of the monitoring job continuously or after the job completes.
    localPath string
    The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.
    s3Uri string
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    s3UploadMode ModelBiasJobDefinitionS3OutputS3UploadMode
    Whether to upload the results of the monitoring job continuously or after the job completes.
    local_path str
    The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.
    s3_uri str
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    s3_upload_mode ModelBiasJobDefinitionS3OutputS3UploadMode
    Whether to upload the results of the monitoring job continuously or after the job completes.
    localPath String
    The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.
    s3Uri String
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    s3UploadMode "Continuous" | "EndOfJob"
    Whether to upload the results of the monitoring job continuously or after the job completes.
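    As a sketch of how the output types compose — bucket name, local path, and KMS key alias are placeholders, and the assumption that a monitoring output wraps an s3_output field follows the CloudFormation schema rather than anything shown on this page:

    ```python
    import pulumi_aws_native as aws_native

    # Sketch only: values are placeholders. "Continuous" uploads results
    # while the job runs; "EndOfJob" uploads once the job completes.
    output_config = aws_native.sagemaker.ModelBiasJobDefinitionMonitoringOutputConfigArgs(
        monitoring_outputs=[
            aws_native.sagemaker.ModelBiasJobDefinitionMonitoringOutputArgs(
                s3_output=aws_native.sagemaker.ModelBiasJobDefinitionS3OutputArgs(
                    local_path="/opt/ml/processing/output",
                    s3_uri="s3://example-bucket/bias-reports/",
                    s3_upload_mode="Continuous",
                ),
            ),
        ],
        kms_key_id="alias/example-key",
    )
    ```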

    ModelBiasJobDefinitionS3OutputS3UploadMode, ModelBiasJobDefinitionS3OutputS3UploadModeArgs

    Continuous
    Continuous
    EndOfJob
    EndOfJob
    ModelBiasJobDefinitionS3OutputS3UploadModeContinuous
    Continuous
    ModelBiasJobDefinitionS3OutputS3UploadModeEndOfJob
    EndOfJob
    Continuous
    Continuous
    EndOfJob
    EndOfJob
    Continuous
    Continuous
    EndOfJob
    EndOfJob
    CONTINUOUS
    Continuous
    END_OF_JOB
    EndOfJob
    "Continuous"
    Continuous
    "EndOfJob"
    EndOfJob

    ModelBiasJobDefinitionStoppingCondition, ModelBiasJobDefinitionStoppingConditionArgs

    MaxRuntimeInSeconds int
    The maximum runtime allowed in seconds.
    MaxRuntimeInSeconds int
    The maximum runtime allowed in seconds.
    maxRuntimeInSeconds Integer
    The maximum runtime allowed in seconds.
    maxRuntimeInSeconds number
    The maximum runtime allowed in seconds.
    max_runtime_in_seconds int
    The maximum runtime allowed in seconds.
    maxRuntimeInSeconds Number
    The maximum runtime allowed in seconds.

    ModelBiasJobDefinitionVpcConfig, ModelBiasJobDefinitionVpcConfigArgs

    SecurityGroupIds List<string>
    The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.
    Subnets List<string>
    The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.
    SecurityGroupIds []string
    The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.
    Subnets []string
    The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.
    securityGroupIds List<String>
    The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.
    subnets List<String>
    The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.
    securityGroupIds string[]
    The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.
    subnets string[]
    The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.
    security_group_ids Sequence[str]
    The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.
    subnets Sequence[str]
    The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.
    securityGroupIds List<String>
    The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.
    subnets List<String>
    The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.

    Package Details

    Repository
    AWS Native pulumi/pulumi-aws-native
    License
    Apache-2.0