AWS Native v0.63.0, May 25, 2023

aws-native.sagemaker.DataQualityJobDefinition

Resource Type definition for AWS::SageMaker::DataQualityJobDefinition

Create DataQualityJobDefinition Resource

new DataQualityJobDefinition(name: string, args: DataQualityJobDefinitionArgs, opts?: CustomResourceOptions);
@overload
def DataQualityJobDefinition(resource_name: str,
                             opts: Optional[ResourceOptions] = None,
                             data_quality_app_specification: Optional[DataQualityJobDefinitionDataQualityAppSpecificationArgs] = None,
                             data_quality_baseline_config: Optional[DataQualityJobDefinitionDataQualityBaselineConfigArgs] = None,
                             data_quality_job_input: Optional[DataQualityJobDefinitionDataQualityJobInputArgs] = None,
                             data_quality_job_output_config: Optional[DataQualityJobDefinitionMonitoringOutputConfigArgs] = None,
                             endpoint_name: Optional[str] = None,
                             job_definition_name: Optional[str] = None,
                             job_resources: Optional[DataQualityJobDefinitionMonitoringResourcesArgs] = None,
                             network_config: Optional[DataQualityJobDefinitionNetworkConfigArgs] = None,
                             role_arn: Optional[str] = None,
                             stopping_condition: Optional[DataQualityJobDefinitionStoppingConditionArgs] = None,
                             tags: Optional[Sequence[DataQualityJobDefinitionTagArgs]] = None)
@overload
def DataQualityJobDefinition(resource_name: str,
                             args: DataQualityJobDefinitionArgs,
                             opts: Optional[ResourceOptions] = None)
func NewDataQualityJobDefinition(ctx *Context, name string, args DataQualityJobDefinitionArgs, opts ...ResourceOption) (*DataQualityJobDefinition, error)
public DataQualityJobDefinition(string name, DataQualityJobDefinitionArgs args, CustomResourceOptions? opts = null)
public DataQualityJobDefinition(String name, DataQualityJobDefinitionArgs args)
public DataQualityJobDefinition(String name, DataQualityJobDefinitionArgs args, CustomResourceOptions options)
type: aws-native:sagemaker:DataQualityJobDefinition
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.

name string
The unique name of the resource.
args DataQualityJobDefinitionArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
resource_name str
The unique name of the resource.
args DataQualityJobDefinitionArgs
The arguments to resource properties.
opts ResourceOptions
Bag of options to control resource's behavior.
ctx Context
Context object for the current deployment.
name string
The unique name of the resource.
args DataQualityJobDefinitionArgs
The arguments to resource properties.
opts ResourceOption
Bag of options to control resource's behavior.
name string
The unique name of the resource.
args DataQualityJobDefinitionArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
name String
The unique name of the resource.
args DataQualityJobDefinitionArgs
The arguments to resource properties.
options CustomResourceOptions
Bag of options to control resource's behavior.

DataQualityJobDefinition Resource Properties

To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

Inputs

The DataQualityJobDefinition resource accepts the following input properties:

DataQualityAppSpecification Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionDataQualityAppSpecificationArgs
DataQualityJobInput Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionDataQualityJobInputArgs
DataQualityJobOutputConfig Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionMonitoringOutputConfigArgs
JobResources Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionMonitoringResourcesArgs
RoleArn string

The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.

DataQualityBaselineConfig Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionDataQualityBaselineConfigArgs
EndpointName string
JobDefinitionName string
NetworkConfig Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionNetworkConfigArgs
StoppingCondition Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionStoppingConditionArgs
Tags List<Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionTagArgs>

An array of key-value pairs to apply to this resource.

dataQualityAppSpecification Property Map
dataQualityJobInput Property Map
dataQualityJobOutputConfig Property Map
jobResources Property Map
roleArn String

The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.

dataQualityBaselineConfig Property Map
endpointName String
jobDefinitionName String
networkConfig Property Map
stoppingCondition Property Map
tags List<Property Map>

An array of key-value pairs to apply to this resource.
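Putting the required inputs together, a minimal resource definition in the Pulumi YAML form shown above might look like the following sketch. The image URI, endpoint name, bucket, and role ARN are placeholder values, not values from this page:

```yaml
resources:
  dataQualityJob:
    type: aws-native:sagemaker:DataQualityJobDefinition
    properties:
      dataQualityAppSpecification:
        imageUri: 123456789012.dkr.ecr.us-east-1.amazonaws.com/monitor-analyzer   # placeholder image
      dataQualityJobInput:
        endpointInput:
          endpointName: my-endpoint                # placeholder endpoint
          localPath: /opt/ml/processing/input
      dataQualityJobOutputConfig:
        monitoringOutputs:
          - s3Output:
              localPath: /opt/ml/processing/output
              s3Uri: s3://my-bucket/monitoring     # placeholder bucket
      jobResources:
        clusterConfig:
          instanceCount: 1
          instanceType: ml.m5.xlarge
          volumeSizeInGB: 20
      roleArn: arn:aws:iam::123456789012:role/SageMakerMonitoringRole   # placeholder role
```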

Outputs

All input properties are implicitly available as output properties. Additionally, the DataQualityJobDefinition resource produces the following output properties:

CreationTime string

The time at which the job definition was created.

Id string

The provider-assigned unique ID for this managed resource.

JobDefinitionArn string

The Amazon Resource Name (ARN) of the job definition.

CreationTime string

The time at which the job definition was created.

Id string

The provider-assigned unique ID for this managed resource.

JobDefinitionArn string

The Amazon Resource Name (ARN) of the job definition.

creationTime String

The time at which the job definition was created.

id String

The provider-assigned unique ID for this managed resource.

jobDefinitionArn String

The Amazon Resource Name (ARN) of the job definition.

creationTime string

The time at which the job definition was created.

id string

The provider-assigned unique ID for this managed resource.

jobDefinitionArn string

The Amazon Resource Name (ARN) of the job definition.

creation_time str

The time at which the job definition was created.

id str

The provider-assigned unique ID for this managed resource.

job_definition_arn str

The Amazon Resource Name (ARN) of the job definition.

creationTime String

The time at which the job definition was created.

id String

The provider-assigned unique ID for this managed resource.

jobDefinitionArn String

The Amazon Resource Name (ARN) of the job definition.

Supporting Types

DataQualityJobDefinitionBatchTransformInput

DataCapturedDestinationS3Uri string

A URI that identifies the Amazon S3 storage location where Batch Transform Job captures data.

DatasetFormat Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionDatasetFormat
LocalPath string

Path to the filesystem where the endpoint data is available to the container.

S3DataDistributionType Pulumi.AwsNative.SageMaker.DataQualityJobDefinitionBatchTransformInputS3DataDistributionType

Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.

S3InputMode Pulumi.AwsNative.SageMaker.DataQualityJobDefinitionBatchTransformInputS3InputMode

Whether Pipe or File mode is used to transfer data for the monitoring job. Pipe mode is recommended for large datasets; File mode is useful for small files that fit in memory. Defaults to File.

DataCapturedDestinationS3Uri string

A URI that identifies the Amazon S3 storage location where Batch Transform Job captures data.

DatasetFormat DataQualityJobDefinitionDatasetFormat
LocalPath string

Path to the filesystem where the endpoint data is available to the container.

S3DataDistributionType DataQualityJobDefinitionBatchTransformInputS3DataDistributionType

Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.

S3InputMode DataQualityJobDefinitionBatchTransformInputS3InputMode

Whether Pipe or File mode is used to transfer data for the monitoring job. Pipe mode is recommended for large datasets; File mode is useful for small files that fit in memory. Defaults to File.

dataCapturedDestinationS3Uri String

A URI that identifies the Amazon S3 storage location where Batch Transform Job captures data.

datasetFormat DataQualityJobDefinitionDatasetFormat
localPath String

Path to the filesystem where the endpoint data is available to the container.

s3DataDistributionType DataQualityJobDefinitionBatchTransformInputS3DataDistributionType

Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.

s3InputMode DataQualityJobDefinitionBatchTransformInputS3InputMode

Whether Pipe or File mode is used to transfer data for the monitoring job. Pipe mode is recommended for large datasets; File mode is useful for small files that fit in memory. Defaults to File.

dataCapturedDestinationS3Uri string

A URI that identifies the Amazon S3 storage location where Batch Transform Job captures data.

datasetFormat DataQualityJobDefinitionDatasetFormat
localPath string

Path to the filesystem where the endpoint data is available to the container.

s3DataDistributionType DataQualityJobDefinitionBatchTransformInputS3DataDistributionType

Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.

s3InputMode DataQualityJobDefinitionBatchTransformInputS3InputMode

Whether Pipe or File mode is used to transfer data for the monitoring job. Pipe mode is recommended for large datasets; File mode is useful for small files that fit in memory. Defaults to File.

data_captured_destination_s3_uri str

A URI that identifies the Amazon S3 storage location where Batch Transform Job captures data.

dataset_format DataQualityJobDefinitionDatasetFormat
local_path str

Path to the filesystem where the endpoint data is available to the container.

s3_data_distribution_type DataQualityJobDefinitionBatchTransformInputS3DataDistributionType

Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.

s3_input_mode DataQualityJobDefinitionBatchTransformInputS3InputMode

Whether Pipe or File mode is used to transfer data for the monitoring job. Pipe mode is recommended for large datasets; File mode is useful for small files that fit in memory. Defaults to File.

dataCapturedDestinationS3Uri String

A URI that identifies the Amazon S3 storage location where Batch Transform Job captures data.

datasetFormat Property Map
localPath String

Path to the filesystem where the endpoint data is available to the container.

s3DataDistributionType "FullyReplicated" | "ShardedByS3Key"

Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.

s3InputMode "Pipe" | "File"

Whether Pipe or File mode is used to transfer data for the monitoring job. Pipe mode is recommended for large datasets; File mode is useful for small files that fit in memory. Defaults to File.
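For monitoring a batch transform job rather than a live endpoint, the job input can point at the captured batch data. The following Pulumi YAML fragment is an illustrative sketch; the bucket name is a placeholder:

```yaml
dataQualityJobInput:
  batchTransformInput:
    dataCapturedDestinationS3Uri: s3://my-bucket/transform-capture   # placeholder bucket
    datasetFormat:
      csv:
        header: true
    localPath: /opt/ml/processing/input
    s3DataDistributionType: FullyReplicated
    s3InputMode: File
```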

DataQualityJobDefinitionBatchTransformInputS3DataDistributionType

FullyReplicated
FullyReplicated
ShardedByS3Key
ShardedByS3Key
DataQualityJobDefinitionBatchTransformInputS3DataDistributionTypeFullyReplicated
FullyReplicated
DataQualityJobDefinitionBatchTransformInputS3DataDistributionTypeShardedByS3Key
ShardedByS3Key
FullyReplicated
FullyReplicated
ShardedByS3Key
ShardedByS3Key
FullyReplicated
FullyReplicated
ShardedByS3Key
ShardedByS3Key
FULLY_REPLICATED
FullyReplicated
SHARDED_BY_S3_KEY
ShardedByS3Key
"FullyReplicated"
FullyReplicated
"ShardedByS3Key"
ShardedByS3Key

DataQualityJobDefinitionBatchTransformInputS3InputMode

Pipe
Pipe
File
File
DataQualityJobDefinitionBatchTransformInputS3InputModePipe
Pipe
DataQualityJobDefinitionBatchTransformInputS3InputModeFile
File
Pipe
Pipe
File
File
Pipe
Pipe
File
File
PIPE
Pipe
FILE
File
"Pipe"
Pipe
"File"
File

DataQualityJobDefinitionClusterConfig

InstanceCount int

The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.

InstanceType string

The ML compute instance type for the processing job.

VolumeSizeInGB int

The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.

VolumeKmsKeyId string

The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.

InstanceCount int

The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.

InstanceType string

The ML compute instance type for the processing job.

VolumeSizeInGB int

The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.

VolumeKmsKeyId string

The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.

instanceCount Integer

The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.

instanceType String

The ML compute instance type for the processing job.

volumeSizeInGB Integer

The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.

volumeKmsKeyId String

The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.

instanceCount number

The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.

instanceType string

The ML compute instance type for the processing job.

volumeSizeInGB number

The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.

volumeKmsKeyId string

The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.

instance_count int

The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.

instance_type str

The ML compute instance type for the processing job.

volume_size_in_gb int

The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.

volume_kms_key_id str

The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.

instanceCount Number

The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.

instanceType String

The ML compute instance type for the processing job.

volumeSizeInGB Number

The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.

volumeKmsKeyId String

The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.

DataQualityJobDefinitionConstraintsResource

S3Uri string

The Amazon S3 URI of the baseline constraints file that the current monitoring job should be validated against.

S3Uri string

The Amazon S3 URI of the baseline constraints file that the current monitoring job should be validated against.

s3Uri String

The Amazon S3 URI of the baseline constraints file that the current monitoring job should be validated against.

s3Uri string

The Amazon S3 URI of the baseline constraints file that the current monitoring job should be validated against.

s3_uri str

The Amazon S3 URI of the baseline constraints file that the current monitoring job should be validated against.

s3Uri String

The Amazon S3 URI of the baseline constraints file that the current monitoring job should be validated against.
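A constraints resource is typically referenced from the baseline config, pointing at a constraints file produced by an earlier baselining job. An illustrative sketch (the bucket and key are placeholders):

```yaml
dataQualityBaselineConfig:
  constraintsResource:
    s3Uri: s3://my-bucket/baseline/constraints.json   # placeholder location
```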

DataQualityJobDefinitionCsv

Header bool

A boolean flag indicating whether the given CSV file has a header.

Header bool

A boolean flag indicating whether the given CSV file has a header.

header Boolean

A boolean flag indicating whether the given CSV file has a header.

header boolean

A boolean flag indicating whether the given CSV file has a header.

header bool

A boolean flag indicating whether the given CSV file has a header.

header Boolean

A boolean flag indicating whether the given CSV file has a header.

DataQualityJobDefinitionDataQualityAppSpecification

ImageUri string

The container image to be run by the monitoring job.

ContainerArguments List<string>

An array of arguments for the container used to run the monitoring job.

ContainerEntrypoint List<string>

Specifies the entrypoint for a container used to run the monitoring job.

Environment object

Sets the environment variables in the Docker container

PostAnalyticsProcessorSourceUri string

An Amazon S3 URI to a script that is called after analysis has been performed. Applicable only for the built-in (first party) containers.

RecordPreprocessorSourceUri string

An Amazon S3 URI to a script that is called per row prior to running analysis. It can base64 decode the payload and convert it into a flattened JSON so that the built-in container can use the converted data. Applicable only for the built-in (first party) containers.

ImageUri string

The container image to be run by the monitoring job.

ContainerArguments []string

An array of arguments for the container used to run the monitoring job.

ContainerEntrypoint []string

Specifies the entrypoint for a container used to run the monitoring job.

Environment interface{}

Sets the environment variables in the Docker container

PostAnalyticsProcessorSourceUri string

An Amazon S3 URI to a script that is called after analysis has been performed. Applicable only for the built-in (first party) containers.

RecordPreprocessorSourceUri string

An Amazon S3 URI to a script that is called per row prior to running analysis. It can base64 decode the payload and convert it into a flattened JSON so that the built-in container can use the converted data. Applicable only for the built-in (first party) containers.

imageUri String

The container image to be run by the monitoring job.

containerArguments List<String>

An array of arguments for the container used to run the monitoring job.

containerEntrypoint List<String>

Specifies the entrypoint for a container used to run the monitoring job.

environment Object

Sets the environment variables in the Docker container

postAnalyticsProcessorSourceUri String

An Amazon S3 URI to a script that is called after analysis has been performed. Applicable only for the built-in (first party) containers.

recordPreprocessorSourceUri String

An Amazon S3 URI to a script that is called per row prior to running analysis. It can base64 decode the payload and convert it into a flattened JSON so that the built-in container can use the converted data. Applicable only for the built-in (first party) containers.

imageUri string

The container image to be run by the monitoring job.

containerArguments string[]

An array of arguments for the container used to run the monitoring job.

containerEntrypoint string[]

Specifies the entrypoint for a container used to run the monitoring job.

environment any

Sets the environment variables in the Docker container

postAnalyticsProcessorSourceUri string

An Amazon S3 URI to a script that is called after analysis has been performed. Applicable only for the built-in (first party) containers.

recordPreprocessorSourceUri string

An Amazon S3 URI to a script that is called per row prior to running analysis. It can base64 decode the payload and convert it into a flattened JSON so that the built-in container can use the converted data. Applicable only for the built-in (first party) containers.

image_uri str

The container image to be run by the monitoring job.

container_arguments Sequence[str]

An array of arguments for the container used to run the monitoring job.

container_entrypoint Sequence[str]

Specifies the entrypoint for a container used to run the monitoring job.

environment Any

Sets the environment variables in the Docker container

post_analytics_processor_source_uri str

An Amazon S3 URI to a script that is called after analysis has been performed. Applicable only for the built-in (first party) containers.

record_preprocessor_source_uri str

An Amazon S3 URI to a script that is called per row prior to running analysis. It can base64 decode the payload and convert it into a flattened JSON so that the built-in container can use the converted data. Applicable only for the built-in (first party) containers.

imageUri String

The container image to be run by the monitoring job.

containerArguments List<String>

An array of arguments for the container used to run the monitoring job.

containerEntrypoint List<String>

Specifies the entrypoint for a container used to run the monitoring job.

environment Any

Sets the environment variables in the Docker container

postAnalyticsProcessorSourceUri String

An Amazon S3 URI to a script that is called after analysis has been performed. Applicable only for the built-in (first party) containers.

recordPreprocessorSourceUri String

An Amazon S3 URI to a script that is called per row prior to running analysis. It can base64 decode the payload and convert it into a flattened JSON so that the built-in container can use the converted data. Applicable only for the built-in (first party) containers.
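Beyond the image URI, the app specification can pass arguments and environment variables through to the container. The following fragment is an illustrative sketch; the image URI, entrypoint, arguments, and environment values are all placeholders:

```yaml
dataQualityAppSpecification:
  imageUri: 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-analyzer   # placeholder image
  containerEntrypoint:
    - /usr/bin/entrypoint.sh       # placeholder entrypoint
  containerArguments:
    - --log-level
    - info
  environment:
    OUTPUT_FORMAT: json            # placeholder environment variable
```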

DataQualityJobDefinitionDataQualityBaselineConfig

DataQualityJobDefinitionDataQualityJobInput

DataQualityJobDefinitionDatasetFormat

DataQualityJobDefinitionEndpointInput

EndpointName string
LocalPath string

Path to the filesystem where the endpoint data is available to the container.

S3DataDistributionType Pulumi.AwsNative.SageMaker.DataQualityJobDefinitionEndpointInputS3DataDistributionType

Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.

S3InputMode Pulumi.AwsNative.SageMaker.DataQualityJobDefinitionEndpointInputS3InputMode

Whether Pipe or File mode is used to transfer data for the monitoring job. Pipe mode is recommended for large datasets; File mode is useful for small files that fit in memory. Defaults to File.

EndpointName string
LocalPath string

Path to the filesystem where the endpoint data is available to the container.

S3DataDistributionType DataQualityJobDefinitionEndpointInputS3DataDistributionType

Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.

S3InputMode DataQualityJobDefinitionEndpointInputS3InputMode

Whether Pipe or File mode is used to transfer data for the monitoring job. Pipe mode is recommended for large datasets; File mode is useful for small files that fit in memory. Defaults to File.

endpointName String
localPath String

Path to the filesystem where the endpoint data is available to the container.

s3DataDistributionType DataQualityJobDefinitionEndpointInputS3DataDistributionType

Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.

s3InputMode DataQualityJobDefinitionEndpointInputS3InputMode

Whether Pipe or File mode is used to transfer data for the monitoring job. Pipe mode is recommended for large datasets; File mode is useful for small files that fit in memory. Defaults to File.

endpointName string
localPath string

Path to the filesystem where the endpoint data is available to the container.

s3DataDistributionType DataQualityJobDefinitionEndpointInputS3DataDistributionType

Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.

s3InputMode DataQualityJobDefinitionEndpointInputS3InputMode

Whether Pipe or File mode is used to transfer data for the monitoring job. Pipe mode is recommended for large datasets; File mode is useful for small files that fit in memory. Defaults to File.

endpoint_name str
local_path str

Path to the filesystem where the endpoint data is available to the container.

s3_data_distribution_type DataQualityJobDefinitionEndpointInputS3DataDistributionType

Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.

s3_input_mode DataQualityJobDefinitionEndpointInputS3InputMode

Whether Pipe or File mode is used to transfer data for the monitoring job. Pipe mode is recommended for large datasets; File mode is useful for small files that fit in memory. Defaults to File.

endpointName String
localPath String

Path to the filesystem where the endpoint data is available to the container.

s3DataDistributionType "FullyReplicated" | "ShardedByS3Key"

Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.

s3InputMode "Pipe" | "File"

Whether Pipe or File mode is used to transfer data for the monitoring job. Pipe mode is recommended for large datasets; File mode is useful for small files that fit in memory. Defaults to File.

DataQualityJobDefinitionEndpointInputS3DataDistributionType

FullyReplicated
FullyReplicated
ShardedByS3Key
ShardedByS3Key
DataQualityJobDefinitionEndpointInputS3DataDistributionTypeFullyReplicated
FullyReplicated
DataQualityJobDefinitionEndpointInputS3DataDistributionTypeShardedByS3Key
ShardedByS3Key
FullyReplicated
FullyReplicated
ShardedByS3Key
ShardedByS3Key
FullyReplicated
FullyReplicated
ShardedByS3Key
ShardedByS3Key
FULLY_REPLICATED
FullyReplicated
SHARDED_BY_S3_KEY
ShardedByS3Key
"FullyReplicated"
FullyReplicated
"ShardedByS3Key"
ShardedByS3Key

DataQualityJobDefinitionEndpointInputS3InputMode

Pipe
Pipe
File
File
DataQualityJobDefinitionEndpointInputS3InputModePipe
Pipe
DataQualityJobDefinitionEndpointInputS3InputModeFile
File
Pipe
Pipe
File
File
Pipe
Pipe
File
File
PIPE
Pipe
FILE
File
"Pipe"
Pipe
"File"
File

DataQualityJobDefinitionJson

Line bool

A boolean flag indicating whether the data is in JSON Lines format.

Line bool

A boolean flag indicating whether the data is in JSON Lines format.

line Boolean

A boolean flag indicating whether the data is in JSON Lines format.

line boolean

A boolean flag indicating whether the data is in JSON Lines format.

line bool

A boolean flag indicating whether the data is in JSON Lines format.

line Boolean

A boolean flag indicating whether the data is in JSON Lines format.

DataQualityJobDefinitionMonitoringOutput

DataQualityJobDefinitionMonitoringOutputConfig

MonitoringOutputs List<Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionMonitoringOutput>

Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.

KmsKeyId string

The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.

MonitoringOutputs []DataQualityJobDefinitionMonitoringOutput

Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.

KmsKeyId string

The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.

monitoringOutputs List<DataQualityJobDefinitionMonitoringOutput>

Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.

kmsKeyId String

The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.

monitoringOutputs DataQualityJobDefinitionMonitoringOutput[]

Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.

kmsKeyId string

The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.

monitoring_outputs Sequence[DataQualityJobDefinitionMonitoringOutput]

Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.

kms_key_id str

The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.

monitoringOutputs List<Property Map>

Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.

kmsKeyId String

The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.

DataQualityJobDefinitionMonitoringResources

DataQualityJobDefinitionNetworkConfig

EnableInterContainerTrafficEncryption bool

Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.

EnableNetworkIsolation bool

Whether to allow inbound and outbound network calls to and from the containers used for the processing job.

VpcConfig Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionVpcConfig
EnableInterContainerTrafficEncryption bool

Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.

EnableNetworkIsolation bool

Whether to allow inbound and outbound network calls to and from the containers used for the processing job.

VpcConfig DataQualityJobDefinitionVpcConfig
enableInterContainerTrafficEncryption Boolean

Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.

enableNetworkIsolation Boolean

Whether to allow inbound and outbound network calls to and from the containers used for the processing job.

vpcConfig DataQualityJobDefinitionVpcConfig
enableInterContainerTrafficEncryption boolean

Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.

enableNetworkIsolation boolean

Whether to allow inbound and outbound network calls to and from the containers used for the processing job.

vpcConfig DataQualityJobDefinitionVpcConfig
enable_inter_container_traffic_encryption bool

Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.

enable_network_isolation bool

Whether to allow inbound and outbound network calls to and from the containers used for the processing job.

vpc_config DataQualityJobDefinitionVpcConfig
enableInterContainerTrafficEncryption Boolean

Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.

enableNetworkIsolation Boolean

Whether to allow inbound and outbound network calls to and from the containers used for the processing job.

vpcConfig Property Map
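
The three NetworkConfig properties above can be sketched as a plain dict. This is a minimal, hypothetical illustration of the property shape (the helper name and dict representation are assumptions, not the actual Pulumi SDK classes):

```python
# Hypothetical sketch of the NetworkConfig property shape as a plain dict.
# The helper and key names mirror the properties documented above; this is
# not the real Pulumi SDK API.

def make_network_config(encrypt_traffic=False, isolate=False, vpc_config=None):
    """Assemble a NetworkConfig-shaped dict; vpcConfig is optional."""
    config = {
        "enableInterContainerTrafficEncryption": encrypt_traffic,
        "enableNetworkIsolation": isolate,
    }
    if vpc_config is not None:
        config["vpcConfig"] = vpc_config
    return config

cfg = make_network_config(encrypt_traffic=True)
```

Note that enabling inter-container traffic encryption is independent of network isolation: the first encrypts traffic between distributed processing containers, while the second blocks inbound and outbound calls entirely.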

DataQualityJobDefinitionS3Output

LocalPath string

The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.

S3Uri string

A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.

S3UploadMode Pulumi.AwsNative.SageMaker.DataQualityJobDefinitionS3OutputS3UploadMode

Whether to upload the results of the monitoring job continuously or after the job completes.

LocalPath string

The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.

S3Uri string

A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.

S3UploadMode DataQualityJobDefinitionS3OutputS3UploadMode

Whether to upload the results of the monitoring job continuously or after the job completes.

localPath String

The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.

s3Uri String

A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.

s3UploadMode DataQualityJobDefinitionS3OutputS3UploadMode

Whether to upload the results of the monitoring job continuously or after the job completes.

localPath string

The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.

s3Uri string

A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.

s3UploadMode DataQualityJobDefinitionS3OutputS3UploadMode

Whether to upload the results of the monitoring job continuously or after the job completes.

local_path str

The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.

s3_uri str

A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.

s3_upload_mode DataQualityJobDefinitionS3OutputS3UploadMode

Whether to upload the results of the monitoring job continuously or after the job completes.

localPath String

The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.

s3Uri String

A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.

s3UploadMode "Continuous" | "EndOfJob"

Whether to upload the results of the monitoring job continuously or after the job completes.
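
The S3Output properties above carry two documented constraints: LocalPath must be an absolute path, and the upload mode is one of the two enum values listed below. A minimal sketch, assuming a plain-dict representation and a hypothetical helper (the `s3://` prefix check is an assumption about well-formed S3 URIs, not a documented rule):

```python
# Hypothetical sketch: build an S3Output-shaped dict and check the
# documented constraints. Not the actual Pulumi SDK API.

def make_s3_output(local_path, s3_uri, upload_mode="EndOfJob"):
    if not local_path.startswith("/"):
        raise ValueError("LocalPath must be an absolute path")
    if not s3_uri.startswith("s3://"):  # assumption: S3 URIs use the s3:// scheme
        raise ValueError("S3Uri must be an s3:// URI")
    if upload_mode not in ("Continuous", "EndOfJob"):
        raise ValueError("unknown S3UploadMode")
    return {"localPath": local_path, "s3Uri": s3_uri, "s3UploadMode": upload_mode}

out = make_s3_output("/opt/ml/processing/output", "s3://my-bucket/monitoring/")
```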

DataQualityJobDefinitionS3OutputS3UploadMode

Continuous: Continuous
EndOfJob: EndOfJob
DataQualityJobDefinitionS3OutputS3UploadModeContinuous: Continuous
DataQualityJobDefinitionS3OutputS3UploadModeEndOfJob: EndOfJob
Continuous: Continuous
EndOfJob: EndOfJob
Continuous: Continuous
EndOfJob: EndOfJob
CONTINUOUS: Continuous
END_OF_JOB: EndOfJob
"Continuous": Continuous
"EndOfJob": EndOfJob

DataQualityJobDefinitionStatisticsResource

S3Uri string

The Amazon S3 URI for the baseline statistics file in Amazon S3 that the current monitoring job should be validated against.

S3Uri string

The Amazon S3 URI for the baseline statistics file in Amazon S3 that the current monitoring job should be validated against.

s3Uri String

The Amazon S3 URI for the baseline statistics file in Amazon S3 that the current monitoring job should be validated against.

s3Uri string

The Amazon S3 URI for the baseline statistics file in Amazon S3 that the current monitoring job should be validated against.

s3_uri str

The Amazon S3 URI for the baseline statistics file in Amazon S3 that the current monitoring job should be validated against.

s3Uri String

The Amazon S3 URI for the baseline statistics file in Amazon S3 that the current monitoring job should be validated against.

DataQualityJobDefinitionStoppingCondition

MaxRuntimeInSeconds int

The maximum runtime allowed in seconds.

MaxRuntimeInSeconds int

The maximum runtime allowed in seconds.

maxRuntimeInSeconds Integer

The maximum runtime allowed in seconds.

maxRuntimeInSeconds number

The maximum runtime allowed in seconds.

max_runtime_in_seconds int

The maximum runtime allowed in seconds.

maxRuntimeInSeconds Number

The maximum runtime allowed in seconds.
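
StoppingCondition holds a single field. A trivial sketch, assuming a plain-dict shape, expressing a one-hour limit in seconds:

```python
# Sketch: StoppingCondition as a plain dict; one hour expressed in seconds.
ONE_HOUR = 60 * 60  # 3600 seconds

stopping_condition = {"maxRuntimeInSeconds": ONE_HOUR}
```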

DataQualityJobDefinitionTag

Key string

The key name of the tag. You can specify a value that is 1 to 127 Unicode characters in length and cannot be prefixed with aws:. You can use any of the following characters: the set of Unicode letters, digits, whitespace, _, ., /, =, +, and -.

Value string

The value for the tag. You can specify a value that is 1 to 255 Unicode characters in length and cannot be prefixed with aws:. You can use any of the following characters: the set of Unicode letters, digits, whitespace, _, ., /, =, +, and -.

Key string

The key name of the tag. You can specify a value that is 1 to 127 Unicode characters in length and cannot be prefixed with aws:. You can use any of the following characters: the set of Unicode letters, digits, whitespace, _, ., /, =, +, and -.

Value string

The value for the tag. You can specify a value that is 1 to 255 Unicode characters in length and cannot be prefixed with aws:. You can use any of the following characters: the set of Unicode letters, digits, whitespace, _, ., /, =, +, and -.

key String

The key name of the tag. You can specify a value that is 1 to 127 Unicode characters in length and cannot be prefixed with aws:. You can use any of the following characters: the set of Unicode letters, digits, whitespace, _, ., /, =, +, and -.

value String

The value for the tag. You can specify a value that is 1 to 255 Unicode characters in length and cannot be prefixed with aws:. You can use any of the following characters: the set of Unicode letters, digits, whitespace, _, ., /, =, +, and -.

key string

The key name of the tag. You can specify a value that is 1 to 127 Unicode characters in length and cannot be prefixed with aws:. You can use any of the following characters: the set of Unicode letters, digits, whitespace, _, ., /, =, +, and -.

value string

The value for the tag. You can specify a value that is 1 to 255 Unicode characters in length and cannot be prefixed with aws:. You can use any of the following characters: the set of Unicode letters, digits, whitespace, _, ., /, =, +, and -.

key str

The key name of the tag. You can specify a value that is 1 to 127 Unicode characters in length and cannot be prefixed with aws:. You can use any of the following characters: the set of Unicode letters, digits, whitespace, _, ., /, =, +, and -.

value str

The value for the tag. You can specify a value that is 1 to 255 Unicode characters in length and cannot be prefixed with aws:. You can use any of the following characters: the set of Unicode letters, digits, whitespace, _, ., /, =, +, and -.

key String

The key name of the tag. You can specify a value that is 1 to 127 Unicode characters in length and cannot be prefixed with aws:. You can use any of the following characters: the set of Unicode letters, digits, whitespace, _, ., /, =, +, and -.

value String

The value for the tag. You can specify a value that is 1 to 255 Unicode characters in length and cannot be prefixed with aws:. You can use any of the following characters: the set of Unicode letters, digits, whitespace, _, ., /, =, +, and -.
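
The tag constraints above (key length 1 to 127, value length 1 to 255, no aws: prefix, restricted character set) can be checked client-side. A hypothetical sketch (the helper name and the regex encoding of "Unicode letters, digits, whitespace, _, ., /, =, +, and -" are assumptions):

```python
import re

# Hypothetical validator for the documented tag constraints.
# \w covers Unicode letters, digits, and _; \s covers whitespace;
# the rest of the class lists the remaining allowed characters.
_ALLOWED = re.compile(r"^[\w\s./=+-]*$", re.UNICODE)

def validate_tag(key, value):
    if not (1 <= len(key) <= 127):
        raise ValueError("tag key must be 1 to 127 characters")
    if not (1 <= len(value) <= 255):
        raise ValueError("tag value must be 1 to 255 characters")
    for text in (key, value):
        if text.lower().startswith("aws:"):
            raise ValueError("tags may not be prefixed with aws:")
        if not _ALLOWED.match(text):
            raise ValueError("disallowed character in tag")
    return {"key": key, "value": value}

tag = validate_tag("environment", "production")
```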

DataQualityJobDefinitionVpcConfig

SecurityGroupIds List<string>

The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.

Subnets List<string>

The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.

SecurityGroupIds []string

The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.

Subnets []string

The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.

securityGroupIds List<String>

The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.

subnets List<String>

The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.

securityGroupIds string[]

The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.

subnets string[]

The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.

security_group_ids Sequence[str]

The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.

subnets Sequence[str]

The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.

securityGroupIds List<String>

The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.

subnets List<String>

The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.
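
Both VpcConfig properties are required together, and security group IDs follow the documented sg-xxxxxxxx form. A hypothetical sketch, assuming a plain-dict shape (the helper name and the non-empty check are assumptions):

```python
# Hypothetical sketch of the VpcConfig shape with the documented
# "sg-" security group ID form. Not the actual Pulumi SDK API.

def make_vpc_config(security_group_ids, subnets):
    if not security_group_ids or not subnets:
        raise ValueError("both securityGroupIds and subnets are required")
    for sg in security_group_ids:
        if not sg.startswith("sg-"):
            raise ValueError(f"security group id {sg!r} is not in the form sg-xxxxxxxx")
    return {"securityGroupIds": list(security_group_ids), "subnets": list(subnets)}

vpc = make_vpc_config(["sg-0123456789abcdef0"], ["subnet-0123456789abcdef0"])
```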

Package Details

Repository
AWS Native pulumi/pulumi-aws-native
License
Apache-2.0