
AWS Native is in preview. AWS Classic is fully supported.

AWS Native v0.112.0 published on Wednesday, Jul 24, 2024 by Pulumi

aws-native.sagemaker.DataQualityJobDefinition

    Resource Type definition for AWS::SageMaker::DataQualityJobDefinition

    Create DataQualityJobDefinition Resource

    Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.

    Constructor syntax

    new DataQualityJobDefinition(name: string, args: DataQualityJobDefinitionArgs, opts?: CustomResourceOptions);
    @overload
    def DataQualityJobDefinition(resource_name: str,
                                 args: DataQualityJobDefinitionArgs,
                                 opts: Optional[ResourceOptions] = None)
    
    @overload
    def DataQualityJobDefinition(resource_name: str,
                                 opts: Optional[ResourceOptions] = None,
                                 data_quality_app_specification: Optional[DataQualityJobDefinitionDataQualityAppSpecificationArgs] = None,
                                 data_quality_job_input: Optional[DataQualityJobDefinitionDataQualityJobInputArgs] = None,
                                 data_quality_job_output_config: Optional[DataQualityJobDefinitionMonitoringOutputConfigArgs] = None,
                                 job_resources: Optional[DataQualityJobDefinitionMonitoringResourcesArgs] = None,
                                 role_arn: Optional[str] = None,
                                 data_quality_baseline_config: Optional[DataQualityJobDefinitionDataQualityBaselineConfigArgs] = None,
                                 endpoint_name: Optional[str] = None,
                                 job_definition_name: Optional[str] = None,
                                 network_config: Optional[DataQualityJobDefinitionNetworkConfigArgs] = None,
                                 stopping_condition: Optional[DataQualityJobDefinitionStoppingConditionArgs] = None,
                                 tags: Optional[Sequence[_root_inputs.CreateOnlyTagArgs]] = None)
    func NewDataQualityJobDefinition(ctx *Context, name string, args DataQualityJobDefinitionArgs, opts ...ResourceOption) (*DataQualityJobDefinition, error)
    public DataQualityJobDefinition(string name, DataQualityJobDefinitionArgs args, CustomResourceOptions? opts = null)
    public DataQualityJobDefinition(String name, DataQualityJobDefinitionArgs args)
    public DataQualityJobDefinition(String name, DataQualityJobDefinitionArgs args, CustomResourceOptions options)
    
    type: aws-native:sagemaker:DataQualityJobDefinition
    properties: # The arguments to resource properties.
    options: # Bag of options to control resource's behavior.
    
    

    Parameters

    name string
    The unique name of the resource.
    args DataQualityJobDefinitionArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    resource_name str
    The unique name of the resource.
    args DataQualityJobDefinitionArgs
    The arguments to resource properties.
    opts ResourceOptions
    Bag of options to control resource's behavior.
    ctx Context
    Context object for the current deployment.
    name string
    The unique name of the resource.
    args DataQualityJobDefinitionArgs
    The arguments to resource properties.
    opts ResourceOption
    Bag of options to control resource's behavior.
    name string
    The unique name of the resource.
    args DataQualityJobDefinitionArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    name String
    The unique name of the resource.
    args DataQualityJobDefinitionArgs
    The arguments to resource properties.
    options CustomResourceOptions
    Bag of options to control resource's behavior.
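
    The following is a minimal TypeScript sketch of creating this resource. The container image URI, endpoint name, S3 bucket, and IAM role ARN are hypothetical placeholders; substitute values from your own account.

    import * as aws_native from "@pulumi/aws-native";

    // All placeholder values below (image URI, endpoint name, bucket, role ARN) must be replaced.
    const jobDefinition = new aws_native.sagemaker.DataQualityJobDefinition("exampleDataQualityJobDefinition", {
        dataQualityAppSpecification: {
            // Container image that runs the monitoring job (placeholder URI).
            imageUri: "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-model-monitor:latest",
        },
        dataQualityJobInput: {
            endpointInput: {
                endpointName: "my-endpoint",             // placeholder endpoint with data capture enabled
                localPath: "/opt/ml/processing/input",
            },
        },
        dataQualityJobOutputConfig: {
            monitoringOutputs: [{
                s3Output: {
                    localPath: "/opt/ml/processing/output",
                    s3Uri: "s3://my-monitoring-bucket/reports", // placeholder bucket
                },
            }],
        },
        jobResources: {
            clusterConfig: {
                instanceCount: 1,
                instanceType: "ml.m5.xlarge",
                volumeSizeInGb: 20,
            },
        },
        roleArn: "arn:aws:iam::123456789012:role/SageMakerMonitoringRole", // placeholder role
        stoppingCondition: { maxRuntimeInSeconds: 3600 },
    });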

    DataQualityJobDefinition Resource Properties

    To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

    Inputs

    The DataQualityJobDefinition resource accepts the following input properties:

    DataQualityAppSpecification Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionDataQualityAppSpecification
    Specifies the container that runs the monitoring job.
    DataQualityJobInput Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionDataQualityJobInput
    A list of inputs for the monitoring job. Currently endpoints are supported as monitoring inputs.
    DataQualityJobOutputConfig Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionMonitoringOutputConfig
    The output configuration for monitoring jobs.
    JobResources Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionMonitoringResources
    Identifies the resources to deploy for a monitoring job.
    RoleArn string
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
    DataQualityBaselineConfig Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionDataQualityBaselineConfig
    Configures the constraints and baselines for the monitoring job.
    EndpointName string
    JobDefinitionName string
    The name for the monitoring job definition.
    NetworkConfig Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionNetworkConfig
    Specifies networking configuration for the monitoring job.
    StoppingCondition Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionStoppingCondition
    A time limit for how long the monitoring job is allowed to run before stopping.
    Tags List<Pulumi.AwsNative.Inputs.CreateOnlyTag>
    An array of key-value pairs to apply to this resource.
    DataQualityAppSpecification DataQualityJobDefinitionDataQualityAppSpecificationArgs
    Specifies the container that runs the monitoring job.
    DataQualityJobInput DataQualityJobDefinitionDataQualityJobInputArgs
    A list of inputs for the monitoring job. Currently endpoints are supported as monitoring inputs.
    DataQualityJobOutputConfig DataQualityJobDefinitionMonitoringOutputConfigArgs
    The output configuration for monitoring jobs.
    JobResources DataQualityJobDefinitionMonitoringResourcesArgs
    Identifies the resources to deploy for a monitoring job.
    RoleArn string
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
    DataQualityBaselineConfig DataQualityJobDefinitionDataQualityBaselineConfigArgs
    Configures the constraints and baselines for the monitoring job.
    EndpointName string
    JobDefinitionName string
    The name for the monitoring job definition.
    NetworkConfig DataQualityJobDefinitionNetworkConfigArgs
    Specifies networking configuration for the monitoring job.
    StoppingCondition DataQualityJobDefinitionStoppingConditionArgs
    A time limit for how long the monitoring job is allowed to run before stopping.
    Tags CreateOnlyTagArgs
    An array of key-value pairs to apply to this resource.
    dataQualityAppSpecification DataQualityJobDefinitionDataQualityAppSpecification
    Specifies the container that runs the monitoring job.
    dataQualityJobInput DataQualityJobDefinitionDataQualityJobInput
    A list of inputs for the monitoring job. Currently endpoints are supported as monitoring inputs.
    dataQualityJobOutputConfig DataQualityJobDefinitionMonitoringOutputConfig
    The output configuration for monitoring jobs.
    jobResources DataQualityJobDefinitionMonitoringResources
    Identifies the resources to deploy for a monitoring job.
    roleArn String
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
    dataQualityBaselineConfig DataQualityJobDefinitionDataQualityBaselineConfig
    Configures the constraints and baselines for the monitoring job.
    endpointName String
    jobDefinitionName String
    The name for the monitoring job definition.
    networkConfig DataQualityJobDefinitionNetworkConfig
    Specifies networking configuration for the monitoring job.
    stoppingCondition DataQualityJobDefinitionStoppingCondition
    A time limit for how long the monitoring job is allowed to run before stopping.
    tags List<CreateOnlyTag>
    An array of key-value pairs to apply to this resource.
    dataQualityAppSpecification DataQualityJobDefinitionDataQualityAppSpecification
    Specifies the container that runs the monitoring job.
    dataQualityJobInput DataQualityJobDefinitionDataQualityJobInput
    A list of inputs for the monitoring job. Currently endpoints are supported as monitoring inputs.
    dataQualityJobOutputConfig DataQualityJobDefinitionMonitoringOutputConfig
    The output configuration for monitoring jobs.
    jobResources DataQualityJobDefinitionMonitoringResources
    Identifies the resources to deploy for a monitoring job.
    roleArn string
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
    dataQualityBaselineConfig DataQualityJobDefinitionDataQualityBaselineConfig
    Configures the constraints and baselines for the monitoring job.
    endpointName string
    jobDefinitionName string
    The name for the monitoring job definition.
    networkConfig DataQualityJobDefinitionNetworkConfig
    Specifies networking configuration for the monitoring job.
    stoppingCondition DataQualityJobDefinitionStoppingCondition
    A time limit for how long the monitoring job is allowed to run before stopping.
    tags CreateOnlyTag[]
    An array of key-value pairs to apply to this resource.
    data_quality_app_specification DataQualityJobDefinitionDataQualityAppSpecificationArgs
    Specifies the container that runs the monitoring job.
    data_quality_job_input DataQualityJobDefinitionDataQualityJobInputArgs
    A list of inputs for the monitoring job. Currently endpoints are supported as monitoring inputs.
    data_quality_job_output_config DataQualityJobDefinitionMonitoringOutputConfigArgs
    The output configuration for monitoring jobs.
    job_resources DataQualityJobDefinitionMonitoringResourcesArgs
    Identifies the resources to deploy for a monitoring job.
    role_arn str
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
    data_quality_baseline_config DataQualityJobDefinitionDataQualityBaselineConfigArgs
    Configures the constraints and baselines for the monitoring job.
    endpoint_name str
    job_definition_name str
    The name for the monitoring job definition.
    network_config DataQualityJobDefinitionNetworkConfigArgs
    Specifies networking configuration for the monitoring job.
    stopping_condition DataQualityJobDefinitionStoppingConditionArgs
    A time limit for how long the monitoring job is allowed to run before stopping.
    tags Sequence[CreateOnlyTagArgs]
    An array of key-value pairs to apply to this resource.
    dataQualityAppSpecification Property Map
    Specifies the container that runs the monitoring job.
    dataQualityJobInput Property Map
    A list of inputs for the monitoring job. Currently endpoints are supported as monitoring inputs.
    dataQualityJobOutputConfig Property Map
    The output configuration for monitoring jobs.
    jobResources Property Map
    Identifies the resources to deploy for a monitoring job.
    roleArn String
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
    dataQualityBaselineConfig Property Map
    Configures the constraints and baselines for the monitoring job.
    endpointName String
    jobDefinitionName String
    The name for the monitoring job definition.
    networkConfig Property Map
    Specifies networking configuration for the monitoring job.
    stoppingCondition Property Map
    A time limit for how long the monitoring job is allowed to run before stopping.
    tags List<Property Map>
    An array of key-value pairs to apply to this resource.

    Outputs

    All input properties are implicitly available as output properties. Additionally, the DataQualityJobDefinition resource produces the following output properties:

    CreationTime string
    The time at which the job definition was created.
    Id string
    The provider-assigned unique ID for this managed resource.
    JobDefinitionArn string
    The Amazon Resource Name (ARN) of the job definition.
    CreationTime string
    The time at which the job definition was created.
    Id string
    The provider-assigned unique ID for this managed resource.
    JobDefinitionArn string
    The Amazon Resource Name (ARN) of the job definition.
    creationTime String
    The time at which the job definition was created.
    id String
    The provider-assigned unique ID for this managed resource.
    jobDefinitionArn String
    The Amazon Resource Name (ARN) of the job definition.
    creationTime string
    The time at which the job definition was created.
    id string
    The provider-assigned unique ID for this managed resource.
    jobDefinitionArn string
    The Amazon Resource Name (ARN) of the job definition.
    creation_time str
    The time at which the job definition was created.
    id str
    The provider-assigned unique ID for this managed resource.
    job_definition_arn str
    The Amazon Resource Name (ARN) of the job definition.
    creationTime String
    The time at which the job definition was created.
    id String
    The provider-assigned unique ID for this managed resource.
    jobDefinitionArn String
    The Amazon Resource Name (ARN) of the job definition.
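
    Output properties can be exported from your program or passed to other resources like any Pulumi output. A short sketch, continuing the hypothetical example above:

    // Export the ARN and creation time of the job definition created earlier.
    export const dataQualityJobDefinitionArn = jobDefinition.jobDefinitionArn;
    export const dataQualityJobDefinitionCreationTime = jobDefinition.creationTime;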

    Supporting Types

    CreateOnlyTag, CreateOnlyTagArgs

    Key string
    The key name of the tag
    Value string
    The value of the tag
    Key string
    The key name of the tag
    Value string
    The value of the tag
    key String
    The key name of the tag
    value String
    The value of the tag
    key string
    The key name of the tag
    value string
    The value of the tag
    key str
    The key name of the tag
    value str
    The value of the tag
    key String
    The key name of the tag
    value String
    The value of the tag

    DataQualityJobDefinitionBatchTransformInput, DataQualityJobDefinitionBatchTransformInputArgs

    DataCapturedDestinationS3Uri string
    A URI that identifies the Amazon S3 storage location where the Batch Transform Job captures data.
    DatasetFormat Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionDatasetFormat
    The dataset format for your batch transform job.
    LocalPath string
    Path to the filesystem where the endpoint data is available to the container.
    ExcludeFeaturesAttribute string
    Indexes or names of the features to be excluded from analysis
    S3DataDistributionType Pulumi.AwsNative.SageMaker.DataQualityJobDefinitionBatchTransformInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    S3InputMode Pulumi.AwsNative.SageMaker.DataQualityJobDefinitionBatchTransformInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    DataCapturedDestinationS3Uri string
    A URI that identifies the Amazon S3 storage location where the Batch Transform Job captures data.
    DatasetFormat DataQualityJobDefinitionDatasetFormat
    The dataset format for your batch transform job.
    LocalPath string
    Path to the filesystem where the endpoint data is available to the container.
    ExcludeFeaturesAttribute string
    Indexes or names of the features to be excluded from analysis
    S3DataDistributionType DataQualityJobDefinitionBatchTransformInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    S3InputMode DataQualityJobDefinitionBatchTransformInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    dataCapturedDestinationS3Uri String
    A URI that identifies the Amazon S3 storage location where the Batch Transform Job captures data.
    datasetFormat DataQualityJobDefinitionDatasetFormat
    The dataset format for your batch transform job.
    localPath String
    Path to the filesystem where the endpoint data is available to the container.
    excludeFeaturesAttribute String
    Indexes or names of the features to be excluded from analysis
    s3DataDistributionType DataQualityJobDefinitionBatchTransformInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3InputMode DataQualityJobDefinitionBatchTransformInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    dataCapturedDestinationS3Uri string
    A URI that identifies the Amazon S3 storage location where the Batch Transform Job captures data.
    datasetFormat DataQualityJobDefinitionDatasetFormat
    The dataset format for your batch transform job.
    localPath string
    Path to the filesystem where the endpoint data is available to the container.
    excludeFeaturesAttribute string
    Indexes or names of the features to be excluded from analysis
    s3DataDistributionType DataQualityJobDefinitionBatchTransformInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3InputMode DataQualityJobDefinitionBatchTransformInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    data_captured_destination_s3_uri str
    A URI that identifies the Amazon S3 storage location where the Batch Transform Job captures data.
    dataset_format DataQualityJobDefinitionDatasetFormat
    The dataset format for your batch transform job.
    local_path str
    Path to the filesystem where the endpoint data is available to the container.
    exclude_features_attribute str
    Indexes or names of the features to be excluded from analysis
    s3_data_distribution_type DataQualityJobDefinitionBatchTransformInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3_input_mode DataQualityJobDefinitionBatchTransformInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    dataCapturedDestinationS3Uri String
    A URI that identifies the Amazon S3 storage location where the Batch Transform Job captures data.
    datasetFormat Property Map
    The dataset format for your batch transform job.
    localPath String
    Path to the filesystem where the endpoint data is available to the container.
    excludeFeaturesAttribute String
    Indexes or names of the features to be excluded from analysis
    s3DataDistributionType "FullyReplicated" | "ShardedByS3Key"
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3InputMode "Pipe" | "File"
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
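
    For illustration, a batch transform input might be configured as sketched below and passed as dataQualityJobInput: { batchTransformInput }. The capture bucket and paths are hypothetical, and the csv dataset format block is an assumption based on the Csv supporting type on this page.

    // Hypothetical batch transform input; adjust the capture bucket and local path.
    const batchTransformInput = {
        dataCapturedDestinationS3Uri: "s3://my-capture-bucket/transform-output",
        datasetFormat: { csv: { header: true } },   // assumed shape based on the Csv type below
        localPath: "/opt/ml/processing/input",
        s3DataDistributionType: "FullyReplicated",  // default
        s3InputMode: "File",                        // default; Pipe is recommended for large datasets
    };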

    DataQualityJobDefinitionBatchTransformInputS3DataDistributionType, DataQualityJobDefinitionBatchTransformInputS3DataDistributionTypeArgs

    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    DataQualityJobDefinitionBatchTransformInputS3DataDistributionTypeFullyReplicated
    FullyReplicated
    DataQualityJobDefinitionBatchTransformInputS3DataDistributionTypeShardedByS3Key
    ShardedByS3Key
    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    FULLY_REPLICATED
    FullyReplicated
    SHARDED_BY_S3_KEY
    ShardedByS3Key
    "FullyReplicated"
    FullyReplicated
    "ShardedByS3Key"
    ShardedByS3Key

    DataQualityJobDefinitionBatchTransformInputS3InputMode, DataQualityJobDefinitionBatchTransformInputS3InputModeArgs

    Pipe
    Pipe
    File
    File
    DataQualityJobDefinitionBatchTransformInputS3InputModePipe
    Pipe
    DataQualityJobDefinitionBatchTransformInputS3InputModeFile
    File
    Pipe
    Pipe
    File
    File
    Pipe
    Pipe
    File
    File
    PIPE
    Pipe
    FILE
    File
    "Pipe"
    Pipe
    "File"
    File

    DataQualityJobDefinitionClusterConfig, DataQualityJobDefinitionClusterConfigArgs

    InstanceCount int
    The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    InstanceType string
    The ML compute instance type for the processing job.
    VolumeSizeInGb int
    The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.
    VolumeKmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.
    InstanceCount int
    The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    InstanceType string
    The ML compute instance type for the processing job.
    VolumeSizeInGb int
    The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.
    VolumeKmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.
    instanceCount Integer
    The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    instanceType String
    The ML compute instance type for the processing job.
    volumeSizeInGb Integer
    The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.
    volumeKmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.
    instanceCount number
    The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    instanceType string
    The ML compute instance type for the processing job.
    volumeSizeInGb number
    The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.
    volumeKmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.
    instance_count int
    The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    instance_type str
    The ML compute instance type for the processing job.
    volume_size_in_gb int
    The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.
    volume_kms_key_id str
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.
    instanceCount Number
    The number of ML compute instances to use in the model monitoring job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    instanceType String
    The ML compute instance type for the processing job.
    volumeSizeInGb Number
    The size of the ML storage volume, in gigabytes, that you want to provision. You must specify sufficient ML storage for your scenario.
    volumeKmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the model monitoring job.
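
    A sketch of a cluster configuration, passed as the clusterConfig of jobResources. The instance type, volume size, and KMS key ID are illustrative placeholders.

    // Illustrative sizing; the KMS key ID is a placeholder and may be omitted.
    const clusterConfig = {
        instanceCount: 1,                 // use a value greater than 1 only for distributed processing jobs
        instanceType: "ml.m5.xlarge",
        volumeSizeInGb: 30,
        volumeKmsKeyId: "1234abcd-12ab-34cd-56ef-1234567890ab", // optional placeholder KMS key
    };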

    DataQualityJobDefinitionConstraintsResource, DataQualityJobDefinitionConstraintsResourceArgs

    S3Uri string
    The Amazon S3 URI for the baseline constraints file in Amazon S3 that the current monitoring job should be validated against.
    S3Uri string
    The Amazon S3 URI for the baseline constraints file in Amazon S3 that the current monitoring job should be validated against.
    s3Uri String
    The Amazon S3 URI for the baseline constraints file in Amazon S3 that the current monitoring job should be validated against.
    s3Uri string
    The Amazon S3 URI for the baseline constraints file in Amazon S3 that the current monitoring job should be validated against.
    s3_uri str
    The Amazon S3 URI for the baseline constraints file in Amazon S3 that the current monitoring job should be validated against.
    s3Uri String
    The Amazon S3 URI for the baseline constraints file in Amazon S3 that the current monitoring job should be validated against.

    DataQualityJobDefinitionCsv, DataQualityJobDefinitionCsvArgs

    Header bool
    A Boolean flag indicating whether the given CSV file has a header.
    Header bool
    A Boolean flag indicating whether the given CSV file has a header.
    header Boolean
    A Boolean flag indicating whether the given CSV file has a header.
    header boolean
    A Boolean flag indicating whether the given CSV file has a header.
    header bool
    A Boolean flag indicating whether the given CSV file has a header.
    header Boolean
    A Boolean flag indicating whether the given CSV file has a header.

    DataQualityJobDefinitionDataQualityAppSpecification, DataQualityJobDefinitionDataQualityAppSpecificationArgs

    ImageUri string
    The container image to be run by the monitoring job.
    ContainerArguments List<string>
    An array of arguments for the container used to run the monitoring job.
    ContainerEntrypoint List<string>
    Specifies the entrypoint for a container used to run the monitoring job.
    Environment object
    Sets the environment variables in the Docker container
    PostAnalyticsProcessorSourceUri string
    An Amazon S3 URI to a script that is called after analysis has been performed. Applicable only for the built-in (first party) containers.
    RecordPreprocessorSourceUri string
    An Amazon S3 URI to a script that is called per row prior to running analysis. It can base64 decode the payload and convert it into flattened JSON so that the built-in container can use the converted data. Applicable only for the built-in (first party) containers.
    ImageUri string
    The container image to be run by the monitoring job.
    ContainerArguments []string
    An array of arguments for the container used to run the monitoring job.
    ContainerEntrypoint []string
    Specifies the entrypoint for a container used to run the monitoring job.
    Environment interface{}
    Sets the environment variables in the Docker container
    PostAnalyticsProcessorSourceUri string
    An Amazon S3 URI to a script that is called after analysis has been performed. Applicable only for the built-in (first party) containers.
    RecordPreprocessorSourceUri string
    An Amazon S3 URI to a script that is called per row prior to running analysis. It can base64 decode the payload and convert it into flattened JSON so that the built-in container can use the converted data. Applicable only for the built-in (first party) containers.
    imageUri String
    The container image to be run by the monitoring job.
    containerArguments List<String>
    An array of arguments for the container used to run the monitoring job.
    containerEntrypoint List<String>
    Specifies the entrypoint for a container used to run the monitoring job.
    environment Object
    Sets the environment variables in the Docker container
    postAnalyticsProcessorSourceUri String
    An Amazon S3 URI to a script that is called after analysis has been performed. Applicable only for the built-in (first party) containers.
    recordPreprocessorSourceUri String
    An Amazon S3 URI to a script that is called per row prior to running analysis. It can base64 decode the payload and convert it into flattened JSON so that the built-in container can use the converted data. Applicable only for the built-in (first party) containers.
    imageUri string
    The container image to be run by the monitoring job.
    containerArguments string[]
    An array of arguments for the container used to run the monitoring job.
    containerEntrypoint string[]
    Specifies the entrypoint for a container used to run the monitoring job.
    environment any
    Sets the environment variables in the Docker container
    postAnalyticsProcessorSourceUri string
    An Amazon S3 URI to a script that is called after analysis has been performed. Applicable only for the built-in (first party) containers.
    recordPreprocessorSourceUri string
    An Amazon S3 URI to a script that is called per row prior to running analysis. It can base64 decode the payload and convert it into flattened JSON so that the built-in container can use the converted data. Applicable only for the built-in (first party) containers.
    image_uri str
    The container image to be run by the monitoring job.
    container_arguments Sequence[str]
    An array of arguments for the container used to run the monitoring job.
    container_entrypoint Sequence[str]
    Specifies the entrypoint for a container used to run the monitoring job.
    environment Any
    Sets the environment variables in the Docker container
    post_analytics_processor_source_uri str
    An Amazon S3 URI to a script that is called after analysis has been performed. Applicable only for the built-in (first party) containers.
    record_preprocessor_source_uri str
    An Amazon S3 URI to a script that is called per row prior to running analysis. It can base64 decode the payload and convert it into flattened JSON so that the built-in container can use the converted data. Applicable only for the built-in (first party) containers.
    imageUri String
    The container image to be run by the monitoring job.
    containerArguments List<String>
    An array of arguments for the container used to run the monitoring job.
    containerEntrypoint List<String>
    Specifies the entrypoint for a container used to run the monitoring job.
    environment Any
    Sets the environment variables in the Docker container
    postAnalyticsProcessorSourceUri String
    An Amazon S3 URI to a script that is called after analysis has been performed. Applicable only for the built-in (first party) containers.
    recordPreprocessorSourceUri String
    An Amazon S3 URI to a script that is called per row prior to running analysis. It can base64 decode the payload and convert it into flattened JSON so that the built-in container can use the converted data. Applicable only for the built-in (first party) containers.
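
    A sketch of an app specification for a custom monitoring container; the image URI, entrypoint, arguments, and environment variables are hypothetical.

    // Placeholder image and entrypoint; environment keys are illustrative only.
    const dataQualityAppSpecification = {
        imageUri: "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-monitor:latest",
        containerEntrypoint: ["python3", "/opt/ml/code/monitor.py"],
        containerArguments: ["--log-level", "info"],
        environment: { MONITORING_MODE: "data-quality" },
    };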

    DataQualityJobDefinitionDataQualityBaselineConfig, DataQualityJobDefinitionDataQualityBaselineConfigArgs

    BaseliningJobName string
    The name of the job that performs baselining for the data quality monitoring job.
    ConstraintsResource Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionConstraintsResource
    The constraints resource for a monitoring job.
    StatisticsResource Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionStatisticsResource
    Configuration for monitoring constraints and monitoring statistics. These baseline resources are compared against the results of the current job from the series of jobs scheduled to collect data periodically.
    BaseliningJobName string
    The name of the job that performs baselining for the data quality monitoring job.
    ConstraintsResource DataQualityJobDefinitionConstraintsResource
    The constraints resource for a monitoring job.
    StatisticsResource DataQualityJobDefinitionStatisticsResource
    Configuration for monitoring constraints and monitoring statistics. These baseline resources are compared against the results of the current job from the series of jobs scheduled to collect data periodically.
    baseliningJobName String
    The name of the job that performs baselining for the data quality monitoring job.
    constraintsResource DataQualityJobDefinitionConstraintsResource
    The constraints resource for a monitoring job.
    statisticsResource DataQualityJobDefinitionStatisticsResource
    Configuration for monitoring constraints and monitoring statistics. These baseline resources are compared against the results of the current job from the series of jobs scheduled to collect data periodically.
    baseliningJobName string
    The name of the job that performs baselining for the data quality monitoring job.
    constraintsResource DataQualityJobDefinitionConstraintsResource
    The constraints resource for a monitoring job.
    statisticsResource DataQualityJobDefinitionStatisticsResource
    Configuration for monitoring constraints and monitoring statistics. These baseline resources are compared against the results of the current job from the series of jobs scheduled to collect data periodically.
    baselining_job_name str
    The name of the job that performs baselining for the data quality monitoring job.
    constraints_resource DataQualityJobDefinitionConstraintsResource
    The constraints resource for a monitoring job.
    statistics_resource DataQualityJobDefinitionStatisticsResource
    Configuration for monitoring constraints and monitoring statistics. These baseline resources are compared against the results of the current job from the series of jobs scheduled to collect data periodically.
    baseliningJobName String
    The name of the job that performs baselining for the data quality monitoring job.
    constraintsResource Property Map
    The constraints resource for a monitoring job.
    statisticsResource Property Map
    Configuration for monitoring constraints and monitoring statistics. These baseline resources are compared against the results of the current job from the series of jobs scheduled to collect data periodically.
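
    A sketch of a baseline configuration that points the monitoring job at previously generated baseline files; the job name and S3 locations are hypothetical.

    // Hypothetical baseline artifacts produced by an earlier baselining job.
    const dataQualityBaselineConfig = {
        baseliningJobName: "my-baselining-job",
        constraintsResource: { s3Uri: "s3://my-monitoring-bucket/baseline/constraints.json" },
        statisticsResource: { s3Uri: "s3://my-monitoring-bucket/baseline/statistics.json" },
    };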

    DataQualityJobDefinitionDataQualityJobInput, DataQualityJobDefinitionDataQualityJobInputArgs

    batchTransformInput Property Map
    Input object for the batch transform job.
    endpointInput Property Map
    Input object for the endpoint

    DataQualityJobDefinitionDatasetFormat, DataQualityJobDefinitionDatasetFormatArgs

    DataQualityJobDefinitionEndpointInput, DataQualityJobDefinitionEndpointInputArgs

    EndpointName string
    An endpoint in the customer's account that has DataCaptureConfig enabled.
    LocalPath string
    Path to the filesystem where the endpoint data is available to the container.
    ExcludeFeaturesAttribute string
    Indexes or names of the features to be excluded from analysis
    S3DataDistributionType Pulumi.AwsNative.SageMaker.DataQualityJobDefinitionEndpointInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    S3InputMode Pulumi.AwsNative.SageMaker.DataQualityJobDefinitionEndpointInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    EndpointName string
    An endpoint in the customer's account that has DataCaptureConfig enabled.
    LocalPath string
    Path to the filesystem where the endpoint data is available to the container.
    ExcludeFeaturesAttribute string
    Indexes or names of the features to be excluded from analysis
    S3DataDistributionType DataQualityJobDefinitionEndpointInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    S3InputMode DataQualityJobDefinitionEndpointInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    endpointName String
    An endpoint in the customer's account that has DataCaptureConfig enabled.
    localPath String
    Path to the filesystem where the endpoint data is available to the container.
    excludeFeaturesAttribute String
    Indexes or names of the features to be excluded from analysis
    s3DataDistributionType DataQualityJobDefinitionEndpointInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3InputMode DataQualityJobDefinitionEndpointInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    endpointName string
    An endpoint in the customer's account that has DataCaptureConfig enabled.
    localPath string
    Path to the filesystem where the endpoint data is available to the container.
    excludeFeaturesAttribute string
    Indexes or names of the features to be excluded from analysis
    s3DataDistributionType DataQualityJobDefinitionEndpointInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3InputMode DataQualityJobDefinitionEndpointInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    endpoint_name str
    An endpoint in the customer's account that has DataCaptureConfig enabled.
    local_path str
    Path to the filesystem where the endpoint data is available to the container.
    exclude_features_attribute str
    Indexes or names of the features to be excluded from analysis
    s3_data_distribution_type DataQualityJobDefinitionEndpointInputS3DataDistributionType
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3_input_mode DataQualityJobDefinitionEndpointInputS3InputMode
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
    endpointName String
    An endpoint in the customer's account that has DataCaptureConfig enabled.
    localPath String
    Path to the filesystem where the endpoint data is available to the container.
    excludeFeaturesAttribute String
    Indexes or names of the features to be excluded from analysis
    s3DataDistributionType "FullyReplicated" | "ShardedByS3Key"
    Whether input data distributed in Amazon S3 is fully replicated or sharded by an S3 key. Defaults to FullyReplicated.
    s3InputMode "Pipe" | "File"
    Whether Pipe or File is used as the input mode for transferring data for the monitoring job. Pipe mode is recommended for large datasets. File mode is useful for small files that fit in memory. Defaults to File.
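
    A sketch of an endpoint input block, passed as dataQualityJobInput: { endpointInput }. The endpoint name is a placeholder and must refer to an endpoint with data capture enabled.

    // Placeholder endpoint; DataCaptureConfig must already be enabled on it.
    const endpointInput = {
        endpointName: "my-endpoint",
        localPath: "/opt/ml/processing/input",
        s3DataDistributionType: "FullyReplicated",  // default
        s3InputMode: "File",                        // default
    };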

    DataQualityJobDefinitionEndpointInputS3DataDistributionType, DataQualityJobDefinitionEndpointInputS3DataDistributionTypeArgs

    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    DataQualityJobDefinitionEndpointInputS3DataDistributionTypeFullyReplicated
    FullyReplicated
    DataQualityJobDefinitionEndpointInputS3DataDistributionTypeShardedByS3Key
    ShardedByS3Key
    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    FULLY_REPLICATED
    FullyReplicated
    SHARDED_BY_S3_KEY
    ShardedByS3Key
    "FullyReplicated"
    FullyReplicated
    "ShardedByS3Key"
    ShardedByS3Key

    DataQualityJobDefinitionEndpointInputS3InputMode, DataQualityJobDefinitionEndpointInputS3InputModeArgs

    Pipe
    Pipe
    File
    File
    DataQualityJobDefinitionEndpointInputS3InputModePipe
    Pipe
    DataQualityJobDefinitionEndpointInputS3InputModeFile
    File
    Pipe
    Pipe
    File
    File
    Pipe
    Pipe
    File
    File
    PIPE
    Pipe
    FILE
    File
    "Pipe"
    Pipe
    "File"
    File

    DataQualityJobDefinitionJson, DataQualityJobDefinitionJsonArgs

    Line bool
    A Boolean flag indicating whether the file is in JSON Lines format.
    Line bool
    A Boolean flag indicating whether the file is in JSON Lines format.
    line Boolean
    A Boolean flag indicating whether the file is in JSON Lines format.
    line boolean
    A Boolean flag indicating whether the file is in JSON Lines format.
    line bool
    A Boolean flag indicating whether the file is in JSON Lines format.
    line Boolean
    A Boolean flag indicating whether the file is in JSON Lines format.

    DataQualityJobDefinitionMonitoringOutput, DataQualityJobDefinitionMonitoringOutputArgs

    S3Output Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionS3Output
    The Amazon S3 storage location where the results of a monitoring job are saved.
    S3Output DataQualityJobDefinitionS3Output
    The Amazon S3 storage location where the results of a monitoring job are saved.
    s3Output DataQualityJobDefinitionS3Output
    The Amazon S3 storage location where the results of a monitoring job are saved.
    s3Output DataQualityJobDefinitionS3Output
    The Amazon S3 storage location where the results of a monitoring job are saved.
    s3_output DataQualityJobDefinitionS3Output
    The Amazon S3 storage location where the results of a monitoring job are saved.
    s3Output Property Map
    The Amazon S3 storage location where the results of a monitoring job are saved.

    DataQualityJobDefinitionMonitoringOutputConfig, DataQualityJobDefinitionMonitoringOutputConfigArgs

    MonitoringOutputs List<Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionMonitoringOutput>
    Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.
    KmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.
    MonitoringOutputs []DataQualityJobDefinitionMonitoringOutput
    Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.
    KmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.
    monitoringOutputs List<DataQualityJobDefinitionMonitoringOutput>
    Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.
    kmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.
    monitoringOutputs DataQualityJobDefinitionMonitoringOutput[]
    Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.
    kmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.
    monitoring_outputs Sequence[DataQualityJobDefinitionMonitoringOutput]
    Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.
    kms_key_id str
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.
    monitoringOutputs List<Property Map>
    Monitoring outputs for monitoring jobs. This is where the output of the periodic monitoring jobs is uploaded.
    kmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the model artifacts at rest using Amazon S3 server-side encryption.
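
    A sketch of an output configuration with a single S3 output; the bucket and KMS key ID are placeholders.

    // Placeholder bucket and KMS key; monitoring results are uploaded to this location.
    const dataQualityJobOutputConfig = {
        monitoringOutputs: [{
            s3Output: {
                localPath: "/opt/ml/processing/output",
                s3Uri: "s3://my-monitoring-bucket/reports",
                s3UploadMode: "EndOfJob",   // or "Continuous"
            },
        }],
        kmsKeyId: "1234abcd-12ab-34cd-56ef-1234567890ab", // optional placeholder KMS key
    };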

    DataQualityJobDefinitionMonitoringResources, DataQualityJobDefinitionMonitoringResourcesArgs

    ClusterConfig Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionClusterConfig
    The configuration for the cluster resources used to run the processing job.
    ClusterConfig DataQualityJobDefinitionClusterConfig
    The configuration for the cluster resources used to run the processing job.
    clusterConfig DataQualityJobDefinitionClusterConfig
    The configuration for the cluster resources used to run the processing job.
    clusterConfig DataQualityJobDefinitionClusterConfig
    The configuration for the cluster resources used to run the processing job.
    cluster_config DataQualityJobDefinitionClusterConfig
    The configuration for the cluster resources used to run the processing job.
    clusterConfig Property Map
    The configuration for the cluster resources used to run the processing job.

    DataQualityJobDefinitionNetworkConfig, DataQualityJobDefinitionNetworkConfigArgs

    EnableInterContainerTrafficEncryption bool
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    EnableNetworkIsolation bool
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    VpcConfig Pulumi.AwsNative.SageMaker.Inputs.DataQualityJobDefinitionVpcConfig
    Specifies a VPC that your training jobs and hosted models have access to. Control access to and from your training and model containers by configuring the VPC.
    EnableInterContainerTrafficEncryption bool
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    EnableNetworkIsolation bool
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    VpcConfig DataQualityJobDefinitionVpcConfig
    Specifies a VPC that your training jobs and hosted models have access to. Control access to and from your training and model containers by configuring the VPC.
    enableInterContainerTrafficEncryption Boolean
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    enableNetworkIsolation Boolean
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    vpcConfig DataQualityJobDefinitionVpcConfig
    Specifies a VPC that your training jobs and hosted models have access to. Control access to and from your training and model containers by configuring the VPC.
    enableInterContainerTrafficEncryption boolean
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    enableNetworkIsolation boolean
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    vpcConfig DataQualityJobDefinitionVpcConfig
    Specifies a VPC that your training jobs and hosted models have access to. Control access to and from your training and model containers by configuring the VPC.
    enable_inter_container_traffic_encryption bool
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    enable_network_isolation bool
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    vpc_config DataQualityJobDefinitionVpcConfig
    Specifies a VPC that your training jobs and hosted models have access to. Control access to and from your training and model containers by configuring the VPC.
    enableInterContainerTrafficEncryption Boolean
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    enableNetworkIsolation Boolean
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    vpcConfig Property Map
    Specifies a VPC that your training jobs and hosted models have access to. Control access to and from your training and model containers by configuring the VPC.
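
    A sketch of a network configuration that attaches the monitoring job to a VPC; the security group and subnet IDs are placeholders.

    // Placeholder VPC identifiers; supply real security groups and subnets from your VPC.
    const networkConfig = {
        enableInterContainerTrafficEncryption: true,
        enableNetworkIsolation: false,
        vpcConfig: {
            securityGroupIds: ["sg-0123456789abcdef0"],
            subnets: ["subnet-0123456789abcdef0", "subnet-0fedcba9876543210"],
        },
    };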

    DataQualityJobDefinitionS3Output, DataQualityJobDefinitionS3OutputArgs

    LocalPath string
    The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.
    S3Uri string
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    S3UploadMode Pulumi.AwsNative.SageMaker.DataQualityJobDefinitionS3OutputS3UploadMode
    Whether to upload the results of the monitoring job continuously or after the job completes.
    LocalPath string
    The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.
    S3Uri string
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    S3UploadMode DataQualityJobDefinitionS3OutputS3UploadMode
    Whether to upload the results of the monitoring job continuously or after the job completes.
    localPath String
    The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.
    s3Uri String
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    s3UploadMode DataQualityJobDefinitionS3OutputS3UploadMode
    Whether to upload the results of the monitoring job continuously or after the job completes.
    localPath string
    The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.
    s3Uri string
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    s3UploadMode DataQualityJobDefinitionS3OutputS3UploadMode
    Whether to upload the results of the monitoring job continuously or after the job completes.
    local_path str
    The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.
    s3_uri str
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    s3_upload_mode DataQualityJobDefinitionS3OutputS3UploadMode
    Whether to upload the results of the monitoring job continuously or after the job completes.
    localPath String
    The local path to the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job. LocalPath is an absolute path for the output data.
    s3Uri String
    A URI that identifies the Amazon S3 storage location where Amazon SageMaker saves the results of a monitoring job.
    s3UploadMode "Continuous" | "EndOfJob"
    Whether to upload the results of the monitoring job continuously or after the job completes.

    DataQualityJobDefinitionS3OutputS3UploadMode, DataQualityJobDefinitionS3OutputS3UploadModeArgs

    Continuous
    Continuous
    EndOfJob
    EndOfJob
    DataQualityJobDefinitionS3OutputS3UploadModeContinuous
    Continuous
    DataQualityJobDefinitionS3OutputS3UploadModeEndOfJob
    EndOfJob
    Continuous
    Continuous
    EndOfJob
    EndOfJob
    Continuous
    Continuous
    EndOfJob
    EndOfJob
    CONTINUOUS
    Continuous
    END_OF_JOB
    EndOfJob
    "Continuous"
    Continuous
    "EndOfJob"
    EndOfJob

    DataQualityJobDefinitionStatisticsResource, DataQualityJobDefinitionStatisticsResourceArgs

    S3Uri string
    The Amazon S3 URI for the baseline statistics file in Amazon S3 that the current monitoring job should be validated against.
    S3Uri string
    The Amazon S3 URI for the baseline statistics file in Amazon S3 that the current monitoring job should be validated against.
    s3Uri String
    The Amazon S3 URI for the baseline statistics file in Amazon S3 that the current monitoring job should be validated against.
    s3Uri string
    The Amazon S3 URI for the baseline statistics file in Amazon S3 that the current monitoring job should be validated against.
    s3_uri str
    The Amazon S3 URI for the baseline statistics file in Amazon S3 that the current monitoring job should be validated against.
    s3Uri String
    The Amazon S3 URI for the baseline statistics file in Amazon S3 that the current monitoring job should be validated against.

    DataQualityJobDefinitionStoppingCondition, DataQualityJobDefinitionStoppingConditionArgs

    MaxRuntimeInSeconds int
    The maximum runtime allowed in seconds.
    MaxRuntimeInSeconds int
    The maximum runtime allowed in seconds.
    maxRuntimeInSeconds Integer
    The maximum runtime allowed in seconds.
    maxRuntimeInSeconds number
    The maximum runtime allowed in seconds.
    max_runtime_in_seconds int
    The maximum runtime allowed in seconds.
    maxRuntimeInSeconds Number
    The maximum runtime allowed in seconds.

    DataQualityJobDefinitionVpcConfig, DataQualityJobDefinitionVpcConfigArgs

    SecurityGroupIds List<string>
    The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.
    Subnets List<string>
    The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.
    SecurityGroupIds []string
    The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.
    Subnets []string
    The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.
    securityGroupIds List<String>
    The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.
    subnets List<String>
    The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.
    securityGroupIds string[]
    The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.
    subnets string[]
    The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.
    security_group_ids Sequence[str]
    The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.
    subnets Sequence[str]
    The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.
    securityGroupIds List<String>
    The VPC security group IDs, in the form sg-xxxxxxxx. Specify the security groups for the VPC that is specified in the Subnets field.
    subnets List<String>
    The IDs of the subnets in the VPC to which you want to connect your monitoring jobs.

    Package Details

    Repository
    AWS Native pulumi/pulumi-aws-native
    License
    Apache-2.0