We recommend new projects start with resources from the AWS provider.
Resource Type definition for AWS::SageMaker::ProcessingJob
Create ProcessingJob Resource
Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.
Constructor syntax
new ProcessingJob(name: string, args: ProcessingJobArgs, opts?: CustomResourceOptions);
@overload
def ProcessingJob(resource_name: str,
args: ProcessingJobArgs,
opts: Optional[ResourceOptions] = None)
@overload
def ProcessingJob(resource_name: str,
opts: Optional[ResourceOptions] = None,
app_specification: Optional[ProcessingJobAppSpecificationArgs] = None,
processing_resources: Optional[ProcessingJobProcessingResourcesArgs] = None,
role_arn: Optional[str] = None,
environment: Optional[ProcessingJobEnvironmentArgs] = None,
experiment_config: Optional[ProcessingJobExperimentConfigArgs] = None,
network_config: Optional[ProcessingJobNetworkConfigArgs] = None,
processing_inputs: Optional[Sequence[ProcessingJobProcessingInputsObjectArgs]] = None,
processing_job_name: Optional[str] = None,
processing_output_config: Optional[ProcessingJobProcessingOutputConfigArgs] = None,
stopping_condition: Optional[ProcessingJobStoppingConditionArgs] = None,
                  tags: Optional[Sequence[_root_inputs.CreateOnlyTagArgs]] = None)
func NewProcessingJob(ctx *Context, name string, args ProcessingJobArgs, opts ...ResourceOption) (*ProcessingJob, error)
public ProcessingJob(string name, ProcessingJobArgs args, CustomResourceOptions? opts = null)
public ProcessingJob(String name, ProcessingJobArgs args)
public ProcessingJob(String name, ProcessingJobArgs args, CustomResourceOptions options)
type: aws-native:sagemaker:ProcessingJob
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
Parameters
- name string
- The unique name of the resource.
- args ProcessingJobArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- resource_name str
- The unique name of the resource.
- args ProcessingJobArgs
- The arguments to resource properties.
- opts ResourceOptions
- Bag of options to control resource's behavior.
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args ProcessingJobArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args ProcessingJobArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- name String
- The unique name of the resource.
- args ProcessingJobArgs
- The arguments to resource properties.
- options CustomResourceOptions
- Bag of options to control resource's behavior.
ProcessingJob Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.
Inputs
In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.
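As a sketch of the dictionary-literal form, the two object inputs could be written as plain dicts whose keys mirror the snake_case Args fields documented below (the image URI and script name here are hypothetical placeholders, and the dicts are shown without the pulumi SDK):

```python
# Hypothetical dictionary-literal shapes for two object inputs.
# In a real program these dicts would be passed to
# aws_native.sagemaker.ProcessingJob(...) in place of the Args classes.
app_specification = {
    "image_uri": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-processor:latest",
    "container_entrypoint": ["python3", "process.py"],  # hypothetical script
}
processing_resources = {
    "cluster_config": {
        "instance_count": 1,
        "instance_type": "ml.m5.xlarge",
        "volume_size_in_gb": 30,
    },
}
```

Either form can be mixed freely; an argument class and a dict literal describing the same object are equivalent inputs.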
The ProcessingJob resource accepts the following input properties:
- AppSpecification Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobAppSpecification - Configuration to run a processing job in a specified container image.
- ProcessingResources Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobProcessingResources - Identifies the resources, ML compute instances, and ML storage volumes to deploy for a processing job. In distributed training, you specify more than one instance.
- RoleArn string - The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
- Environment Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobEnvironment - Sets the environment variables in the Docker container.
- ExperimentConfig Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobExperimentConfig - Associates a SageMaker job as a trial component with an experiment and trial. Specified when you call the CreateProcessingJob API.
- NetworkConfig Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobNetworkConfig - Networking options for a job, such as network traffic encryption between containers, whether to allow inbound and outbound network calls to and from containers, and the VPC subnets and security groups to use for VPC-enabled jobs.
- ProcessingInputs List<Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobProcessingInputsObject> - An array of inputs configuring the data to download into the processing container.
- ProcessingJobName string - The name of the processing job. The name must be unique within an AWS Region in the AWS account.
- ProcessingOutputConfig Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobProcessingOutputConfig - Contains information about the output location for the compiled model and the target device that the model runs on. TargetDevice and TargetPlatform are mutually exclusive, so you need to choose one of the two to specify your target device or platform. If you cannot find the device you want to use in the TargetDevice list, use TargetPlatform to describe the platform of your edge device and CompilerOptions if there are specific settings that are required or recommended for a particular TargetPlatform.
- StoppingCondition Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobStoppingCondition - Configures conditions under which the processing job should be stopped, such as how long the processing job has been running. After the condition is met, the processing job is stopped.
- Tags List<Pulumi.AwsNative.Inputs.CreateOnlyTag> - (Optional) An array of key-value pairs. For more information, see Using Cost Allocation Tags (https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/cost-alloc-tags.html#allocation-what) in the AWS Billing and Cost Management User Guide.
- AppSpecification ProcessingJobAppSpecificationArgs - Configuration to run a processing job in a specified container image.
- ProcessingResources ProcessingJobProcessingResourcesArgs - Identifies the resources, ML compute instances, and ML storage volumes to deploy for a processing job. In distributed training, you specify more than one instance.
- RoleArn string - The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
- Environment ProcessingJobEnvironmentArgs - Sets the environment variables in the Docker container.
- ExperimentConfig ProcessingJobExperimentConfigArgs - Associates a SageMaker job as a trial component with an experiment and trial. Specified when you call the CreateProcessingJob API.
- NetworkConfig ProcessingJobNetworkConfigArgs - Networking options for a job, such as network traffic encryption between containers, whether to allow inbound and outbound network calls to and from containers, and the VPC subnets and security groups to use for VPC-enabled jobs.
- ProcessingInputs []ProcessingJobProcessingInputsObjectArgs - An array of inputs configuring the data to download into the processing container.
- ProcessingJobName string - The name of the processing job. The name must be unique within an AWS Region in the AWS account.
- ProcessingOutputConfig ProcessingJobProcessingOutputConfigArgs - Contains information about the output location for the compiled model and the target device that the model runs on. TargetDevice and TargetPlatform are mutually exclusive, so you need to choose one of the two to specify your target device or platform. If you cannot find the device you want to use in the TargetDevice list, use TargetPlatform to describe the platform of your edge device and CompilerOptions if there are specific settings that are required or recommended for a particular TargetPlatform.
- StoppingCondition ProcessingJobStoppingConditionArgs - Configures conditions under which the processing job should be stopped, such as how long the processing job has been running. After the condition is met, the processing job is stopped.
- Tags []CreateOnlyTagArgs - (Optional) An array of key-value pairs. For more information, see Using Cost Allocation Tags (https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/cost-alloc-tags.html#allocation-what) in the AWS Billing and Cost Management User Guide.
- appSpecification ProcessingJobAppSpecification - Configuration to run a processing job in a specified container image.
- processingResources ProcessingJobProcessingResources - Identifies the resources, ML compute instances, and ML storage volumes to deploy for a processing job. In distributed training, you specify more than one instance.
- roleArn String - The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
- environment ProcessingJobEnvironment - Sets the environment variables in the Docker container.
- experimentConfig ProcessingJobExperimentConfig - Associates a SageMaker job as a trial component with an experiment and trial. Specified when you call the CreateProcessingJob API.
- networkConfig ProcessingJobNetworkConfig - Networking options for a job, such as network traffic encryption between containers, whether to allow inbound and outbound network calls to and from containers, and the VPC subnets and security groups to use for VPC-enabled jobs.
- processingInputs List<ProcessingJobProcessingInputsObject> - An array of inputs configuring the data to download into the processing container.
- processingJobName String - The name of the processing job. The name must be unique within an AWS Region in the AWS account.
- processingOutputConfig ProcessingJobProcessingOutputConfig - Contains information about the output location for the compiled model and the target device that the model runs on. TargetDevice and TargetPlatform are mutually exclusive, so you need to choose one of the two to specify your target device or platform. If you cannot find the device you want to use in the TargetDevice list, use TargetPlatform to describe the platform of your edge device and CompilerOptions if there are specific settings that are required or recommended for a particular TargetPlatform.
- stoppingCondition ProcessingJobStoppingCondition - Configures conditions under which the processing job should be stopped, such as how long the processing job has been running. After the condition is met, the processing job is stopped.
- tags List<CreateOnlyTag> - (Optional) An array of key-value pairs. For more information, see Using Cost Allocation Tags (https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/cost-alloc-tags.html#allocation-what) in the AWS Billing and Cost Management User Guide.
- appSpecification ProcessingJobAppSpecification - Configuration to run a processing job in a specified container image.
- processingResources ProcessingJobProcessingResources - Identifies the resources, ML compute instances, and ML storage volumes to deploy for a processing job. In distributed training, you specify more than one instance.
- roleArn string - The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
- environment ProcessingJobEnvironment - Sets the environment variables in the Docker container.
- experimentConfig ProcessingJobExperimentConfig - Associates a SageMaker job as a trial component with an experiment and trial. Specified when you call the CreateProcessingJob API.
- networkConfig ProcessingJobNetworkConfig - Networking options for a job, such as network traffic encryption between containers, whether to allow inbound and outbound network calls to and from containers, and the VPC subnets and security groups to use for VPC-enabled jobs.
- processingInputs ProcessingJobProcessingInputsObject[] - An array of inputs configuring the data to download into the processing container.
- processingJobName string - The name of the processing job. The name must be unique within an AWS Region in the AWS account.
- processingOutputConfig ProcessingJobProcessingOutputConfig - Contains information about the output location for the compiled model and the target device that the model runs on. TargetDevice and TargetPlatform are mutually exclusive, so you need to choose one of the two to specify your target device or platform. If you cannot find the device you want to use in the TargetDevice list, use TargetPlatform to describe the platform of your edge device and CompilerOptions if there are specific settings that are required or recommended for a particular TargetPlatform.
- stoppingCondition ProcessingJobStoppingCondition - Configures conditions under which the processing job should be stopped, such as how long the processing job has been running. After the condition is met, the processing job is stopped.
- tags CreateOnlyTag[] - (Optional) An array of key-value pairs. For more information, see Using Cost Allocation Tags (https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/cost-alloc-tags.html#allocation-what) in the AWS Billing and Cost Management User Guide.
- app_specification ProcessingJobAppSpecificationArgs - Configuration to run a processing job in a specified container image.
- processing_resources ProcessingJobProcessingResourcesArgs - Identifies the resources, ML compute instances, and ML storage volumes to deploy for a processing job. In distributed training, you specify more than one instance.
- role_arn str - The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
- environment ProcessingJobEnvironmentArgs - Sets the environment variables in the Docker container.
- experiment_config ProcessingJobExperimentConfigArgs - Associates a SageMaker job as a trial component with an experiment and trial. Specified when you call the CreateProcessingJob API.
- network_config ProcessingJobNetworkConfigArgs - Networking options for a job, such as network traffic encryption between containers, whether to allow inbound and outbound network calls to and from containers, and the VPC subnets and security groups to use for VPC-enabled jobs.
- processing_inputs Sequence[ProcessingJobProcessingInputsObjectArgs] - An array of inputs configuring the data to download into the processing container.
- processing_job_name str - The name of the processing job. The name must be unique within an AWS Region in the AWS account.
- processing_output_config ProcessingJobProcessingOutputConfigArgs - Contains information about the output location for the compiled model and the target device that the model runs on. TargetDevice and TargetPlatform are mutually exclusive, so you need to choose one of the two to specify your target device or platform. If you cannot find the device you want to use in the TargetDevice list, use TargetPlatform to describe the platform of your edge device and CompilerOptions if there are specific settings that are required or recommended for a particular TargetPlatform.
- stopping_condition ProcessingJobStoppingConditionArgs - Configures conditions under which the processing job should be stopped, such as how long the processing job has been running. After the condition is met, the processing job is stopped.
- tags Sequence[CreateOnlyTagArgs] - (Optional) An array of key-value pairs. For more information, see Using Cost Allocation Tags (https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/cost-alloc-tags.html#allocation-what) in the AWS Billing and Cost Management User Guide.
- appSpecification Property Map - Configuration to run a processing job in a specified container image.
- processingResources Property Map - Identifies the resources, ML compute instances, and ML storage volumes to deploy for a processing job. In distributed training, you specify more than one instance.
- roleArn String - The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
- environment Property Map - Sets the environment variables in the Docker container.
- experimentConfig Property Map - Associates a SageMaker job as a trial component with an experiment and trial. Specified when you call the CreateProcessingJob API.
- networkConfig Property Map - Networking options for a job, such as network traffic encryption between containers, whether to allow inbound and outbound network calls to and from containers, and the VPC subnets and security groups to use for VPC-enabled jobs.
- processingInputs List<Property Map> - An array of inputs configuring the data to download into the processing container.
- processingJobName String - The name of the processing job. The name must be unique within an AWS Region in the AWS account.
- processingOutputConfig Property Map - Contains information about the output location for the compiled model and the target device that the model runs on. TargetDevice and TargetPlatform are mutually exclusive, so you need to choose one of the two to specify your target device or platform. If you cannot find the device you want to use in the TargetDevice list, use TargetPlatform to describe the platform of your edge device and CompilerOptions if there are specific settings that are required or recommended for a particular TargetPlatform.
- stoppingCondition Property Map - Configures conditions under which the processing job should be stopped, such as how long the processing job has been running. After the condition is met, the processing job is stopped.
- tags List<Property Map> - (Optional) An array of key-value pairs. For more information, see Using Cost Allocation Tags (https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/cost-alloc-tags.html#allocation-what) in the AWS Billing and Cost Management User Guide.
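Putting the YAML constructor shape together with the input properties above, a minimal declaration might look like the following sketch (the role ARN, image URI, and instance settings are hypothetical placeholders, not values from this reference):

```yaml
resources:
  myProcessingJob:
    type: aws-native:sagemaker:ProcessingJob
    properties:
      roleArn: arn:aws:iam::123456789012:role/my-sagemaker-role  # hypothetical role
      appSpecification:
        imageUri: 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-processor:latest
      processingResources:
        clusterConfig:
          instanceCount: 1
          instanceType: ml.m5.xlarge
          volumeSizeInGb: 30
      stoppingCondition:
        maxRuntimeInSeconds: 3600
```

Only appSpecification, processingResources, and roleArn are required; the remaining inputs are optional refinements.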
Outputs
All input properties are implicitly available as output properties. Additionally, the ProcessingJob resource produces the following output properties:
- AutoMlJobArn string - The ARN of an AutoML job associated with this processing job.
- CreationTime string - The time at which the processing job was created.
- ExitMessage string - An optional string, up to one KB in size, that contains metadata from the processing container when the processing job exits.
- FailureReason string - A string, up to one KB in size, that contains the reason a processing job failed, if it failed.
- Id string - The provider-assigned unique ID for this managed resource.
- LastModifiedTime string - The time at which the processing job was last modified.
- MonitoringScheduleArn string - The ARN of a monitoring schedule for an endpoint associated with this processing job.
- ProcessingEndTime string - The time at which the processing job completed.
- ProcessingJobArn string - The Amazon Resource Name (ARN) of the processing job.
- ProcessingJobStatus Pulumi.AwsNative.SageMaker.ProcessingJobStatus - Provides the status of a processing job.
- ProcessingStartTime string - The time at which the processing job started.
- TrainingJobArn string - The ARN of a training job associated with this processing job.
- AutoMlJobArn string - The ARN of an AutoML job associated with this processing job.
- CreationTime string - The time at which the processing job was created.
- ExitMessage string - An optional string, up to one KB in size, that contains metadata from the processing container when the processing job exits.
- FailureReason string - A string, up to one KB in size, that contains the reason a processing job failed, if it failed.
- Id string - The provider-assigned unique ID for this managed resource.
- LastModifiedTime string - The time at which the processing job was last modified.
- MonitoringScheduleArn string - The ARN of a monitoring schedule for an endpoint associated with this processing job.
- ProcessingEndTime string - The time at which the processing job completed.
- ProcessingJobArn string - The Amazon Resource Name (ARN) of the processing job.
- ProcessingJobStatus ProcessingJobStatus - Provides the status of a processing job.
- ProcessingStartTime string - The time at which the processing job started.
- TrainingJobArn string - The ARN of a training job associated with this processing job.
- autoMlJobArn String - The ARN of an AutoML job associated with this processing job.
- creationTime String - The time at which the processing job was created.
- exitMessage String - An optional string, up to one KB in size, that contains metadata from the processing container when the processing job exits.
- failureReason String - A string, up to one KB in size, that contains the reason a processing job failed, if it failed.
- id String - The provider-assigned unique ID for this managed resource.
- lastModifiedTime String - The time at which the processing job was last modified.
- monitoringScheduleArn String - The ARN of a monitoring schedule for an endpoint associated with this processing job.
- processingEndTime String - The time at which the processing job completed.
- processingJobArn String - The Amazon Resource Name (ARN) of the processing job.
- processingJobStatus ProcessingJobStatus - Provides the status of a processing job.
- processingStartTime String - The time at which the processing job started.
- trainingJobArn String - The ARN of a training job associated with this processing job.
- autoMlJobArn string - The ARN of an AutoML job associated with this processing job.
- creationTime string - The time at which the processing job was created.
- exitMessage string - An optional string, up to one KB in size, that contains metadata from the processing container when the processing job exits.
- failureReason string - A string, up to one KB in size, that contains the reason a processing job failed, if it failed.
- id string - The provider-assigned unique ID for this managed resource.
- lastModifiedTime string - The time at which the processing job was last modified.
- monitoringScheduleArn string - The ARN of a monitoring schedule for an endpoint associated with this processing job.
- processingEndTime string - The time at which the processing job completed.
- processingJobArn string - The Amazon Resource Name (ARN) of the processing job.
- processingJobStatus ProcessingJobStatus - Provides the status of a processing job.
- processingStartTime string - The time at which the processing job started.
- trainingJobArn string - The ARN of a training job associated with this processing job.
- auto_ml_job_arn str - The ARN of an AutoML job associated with this processing job.
- creation_time str - The time at which the processing job was created.
- exit_message str - An optional string, up to one KB in size, that contains metadata from the processing container when the processing job exits.
- failure_reason str - A string, up to one KB in size, that contains the reason a processing job failed, if it failed.
- id str - The provider-assigned unique ID for this managed resource.
- last_modified_time str - The time at which the processing job was last modified.
- monitoring_schedule_arn str - The ARN of a monitoring schedule for an endpoint associated with this processing job.
- processing_end_time str - The time at which the processing job completed.
- processing_job_arn str - The Amazon Resource Name (ARN) of the processing job.
- processing_job_status ProcessingJobStatus - Provides the status of a processing job.
- processing_start_time str - The time at which the processing job started.
- training_job_arn str - The ARN of a training job associated with this processing job.
- autoMlJobArn String - The ARN of an AutoML job associated with this processing job.
- creationTime String - The time at which the processing job was created.
- exitMessage String - An optional string, up to one KB in size, that contains metadata from the processing container when the processing job exits.
- failureReason String - A string, up to one KB in size, that contains the reason a processing job failed, if it failed.
- id String - The provider-assigned unique ID for this managed resource.
- lastModifiedTime String - The time at which the processing job was last modified.
- monitoringScheduleArn String - The ARN of a monitoring schedule for an endpoint associated with this processing job.
- processingEndTime String - The time at which the processing job completed.
- processingJobArn String - The Amazon Resource Name (ARN) of the processing job.
- processingJobStatus "Completed" | "InProgress" | "Stopping" | "Stopped" | "Failed" - Provides the status of a processing job.
- processingStartTime String - The time at which the processing job started.
- trainingJobArn String - The ARN of a training job associated with this processing job.
Supporting Types
CreateOnlyTag, CreateOnlyTagArgs
A set of tags to apply to the resource.
ProcessingJobAppSpecification, ProcessingJobAppSpecificationArgs
Configures the processing job to run a specified Docker container image.
- ImageUri string - The container image to be run by the processing job.
- ContainerArguments List<string> - The arguments for a container used to run a processing job.
- ContainerEntrypoint List<string> - The entrypoint for a container used to run a processing job.
- ImageUri string - The container image to be run by the processing job.
- ContainerArguments []string - The arguments for a container used to run a processing job.
- ContainerEntrypoint []string - The entrypoint for a container used to run a processing job.
- imageUri String - The container image to be run by the processing job.
- containerArguments List<String> - The arguments for a container used to run a processing job.
- containerEntrypoint List<String> - The entrypoint for a container used to run a processing job.
- imageUri string - The container image to be run by the processing job.
- containerArguments string[] - The arguments for a container used to run a processing job.
- containerEntrypoint string[] - The entrypoint for a container used to run a processing job.
- image_uri str - The container image to be run by the processing job.
- container_arguments Sequence[str] - The arguments for a container used to run a processing job.
- container_entrypoint Sequence[str] - The entrypoint for a container used to run a processing job.
- imageUri String - The container image to be run by the processing job.
- containerArguments List<String> - The arguments for a container used to run a processing job.
- containerEntrypoint List<String> - The entrypoint for a container used to run a processing job.
ProcessingJobAthenaDatasetDefinition, ProcessingJobAthenaDatasetDefinitionArgs
Configuration for Athena Dataset Definition input.
- Catalog string - The name of the data catalog used in Athena query execution.
- Database string - The name of the database used in the Athena query execution.
- OutputFormat Pulumi.AwsNative.SageMaker.ProcessingJobAthenaDatasetDefinitionOutputFormat - The data storage format for Athena query results.
- OutputS3Uri string - The location in Amazon S3 where Athena query results are stored.
- QueryString string - The SQL query statements, to be executed.
- KmsKeyId string - The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data generated from an Athena query execution.
- OutputCompression Pulumi.AwsNative.SageMaker.ProcessingJobAthenaDatasetDefinitionOutputCompression - The compression used for Athena query results.
- WorkGroup string - The name of the workgroup in which the Athena query is being started.
- Catalog string - The name of the data catalog used in Athena query execution.
- Database string - The name of the database used in the Athena query execution.
- OutputFormat ProcessingJobAthenaDatasetDefinitionOutputFormat - The data storage format for Athena query results.
- OutputS3Uri string - The location in Amazon S3 where Athena query results are stored.
- QueryString string - The SQL query statements, to be executed.
- KmsKeyId string - The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data generated from an Athena query execution.
- OutputCompression ProcessingJobAthenaDatasetDefinitionOutputCompression - The compression used for Athena query results.
- WorkGroup string - The name of the workgroup in which the Athena query is being started.
- catalog String - The name of the data catalog used in Athena query execution.
- database String - The name of the database used in the Athena query execution.
- outputFormat ProcessingJobAthenaDatasetDefinitionOutputFormat - The data storage format for Athena query results.
- outputS3Uri String - The location in Amazon S3 where Athena query results are stored.
- queryString String - The SQL query statements, to be executed.
- kmsKeyId String - The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data generated from an Athena query execution.
- outputCompression ProcessingJobAthenaDatasetDefinitionOutputCompression - The compression used for Athena query results.
- workGroup String - The name of the workgroup in which the Athena query is being started.
- catalog string - The name of the data catalog used in Athena query execution.
- database string - The name of the database used in the Athena query execution.
- outputFormat ProcessingJobAthenaDatasetDefinitionOutputFormat - The data storage format for Athena query results.
- outputS3Uri string - The location in Amazon S3 where Athena query results are stored.
- queryString string - The SQL query statements, to be executed.
- kmsKeyId string - The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data generated from an Athena query execution.
- outputCompression ProcessingJobAthenaDatasetDefinitionOutputCompression - The compression used for Athena query results.
- workGroup string - The name of the workgroup in which the Athena query is being started.
- catalog str - The name of the data catalog used in Athena query execution.
- database str - The name of the database used in the Athena query execution.
- output_format ProcessingJobAthenaDatasetDefinitionOutputFormat - The data storage format for Athena query results.
- output_s3_uri str - The location in Amazon S3 where Athena query results are stored.
- query_string str - The SQL query statements, to be executed.
- kms_key_id str - The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data generated from an Athena query execution.
- output_compression ProcessingJobAthenaDatasetDefinitionOutputCompression - The compression used for Athena query results.
- work_group str - The name of the workgroup in which the Athena query is being started.
- catalog String - The name of the data catalog used in Athena query execution.
- database String - The name of the database used in the Athena query execution.
- outputFormat "PARQUET" | "AVRO" | "ORC" | "JSON" | "TEXTFILE" - The data storage format for Athena query results.
- outputS3Uri String - The location in Amazon S3 where Athena query results are stored.
- queryString String - The SQL query statements, to be executed.
- kmsKeyId String - The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data generated from an Athena query execution.
- outputCompression "GZIP" | "SNAPPY" | "ZLIB" - The compression used for Athena query results.
- workGroup String - The name of the workgroup in which the Athena query is being started.
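To illustrate how these fields fit together, an Athena dataset definition could be sketched in the Python dictionary-literal form, with enum-typed fields given as their string values (the catalog, database, query, and S3 location below are hypothetical placeholders):

```python
# Hypothetical Athena dataset definition in dictionary-literal form.
# Enum-typed fields (output_format, output_compression) take the string
# values listed for the corresponding enum types.
athena_dataset_definition = {
    "catalog": "AwsDataCatalog",
    "database": "my_database",                    # hypothetical database
    "query_string": "SELECT * FROM my_table",     # hypothetical query
    "output_s3_uri": "s3://my-bucket/athena-results/",
    "output_format": "PARQUET",   # PARQUET, AVRO, ORC, JSON, or TEXTFILE
    "output_compression": "SNAPPY",  # optional: GZIP, SNAPPY, or ZLIB
}
```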
ProcessingJobAthenaDatasetDefinitionOutputCompression, ProcessingJobAthenaDatasetDefinitionOutputCompressionArgs
- Gzip GZIP
- Snappy SNAPPY
- Zlib ZLIB
- ProcessingJobAthenaDatasetDefinitionOutputCompressionGzip GZIP
- ProcessingJobAthenaDatasetDefinitionOutputCompressionSnappy SNAPPY
- ProcessingJobAthenaDatasetDefinitionOutputCompressionZlib ZLIB
- Gzip GZIP
- Snappy SNAPPY
- Zlib ZLIB
- Gzip GZIP
- Snappy SNAPPY
- Zlib ZLIB
- GZIP GZIP
- SNAPPY SNAPPY
- ZLIB ZLIB
- "GZIP" GZIP
- "SNAPPY" SNAPPY
- "ZLIB" ZLIB
ProcessingJobAthenaDatasetDefinitionOutputFormat, ProcessingJobAthenaDatasetDefinitionOutputFormatArgs
Valid values are PARQUET, AVRO, ORC, JSON, and TEXTFILE. Each SDK renders the members in its own convention:

- C#, Java, TypeScript: Parquet, Avro, Orc, Json, Textfile
- Go: ProcessingJobAthenaDatasetDefinitionOutputFormatParquet, ProcessingJobAthenaDatasetDefinitionOutputFormatAvro, ProcessingJobAthenaDatasetDefinitionOutputFormatOrc, ProcessingJobAthenaDatasetDefinitionOutputFormatJson, ProcessingJobAthenaDatasetDefinitionOutputFormatTextfile
- Python: PARQUET, AVRO, ORC, JSON, TEXTFILE
- YAML/JSON: "PARQUET", "AVRO", "ORC", "JSON", "TEXTFILE"
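To catch typos in these two enum-valued fields before a deployment, the allowed values can be mirrored as plain Python enums. This is an illustrative sketch, not part of the SDK (the SDK ships its own enum classes under the names in the headings above):

```python
from enum import Enum

class OutputFormat(Enum):
    """Storage formats accepted for Athena query results."""
    PARQUET = "PARQUET"
    AVRO = "AVRO"
    ORC = "ORC"
    JSON = "JSON"
    TEXTFILE = "TEXTFILE"

class OutputCompression(Enum):
    """Compression codecs accepted for Athena query results."""
    GZIP = "GZIP"
    SNAPPY = "SNAPPY"
    ZLIB = "ZLIB"

# The raw YAML/JSON value is the member's .value; constructing from a
# string validates it, raising ValueError on anything unsupported.
assert OutputFormat("PARQUET") is OutputFormat.PARQUET
```

Passing the `.value` string (or letting the SDK coerce the enum member) keeps configuration files and strongly typed programs in agreement.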
ProcessingJobClusterConfig, ProcessingJobClusterConfigArgs
Configuration for the cluster used to run a processing job.

- InstanceCount int - The number of ML compute instances to use in the processing job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
- InstanceType Pulumi.AwsNative.SageMaker.ProcessingJobClusterConfigInstanceType - The ML compute instance type for the processing job.
- VolumeSizeInGb int - The size of the ML storage volume in gigabytes that you want to provision. You must specify sufficient ML storage for your scenario.
- VolumeKmsKeyId string - The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the processing job.
The same four properties appear in every SDK with language-appropriate casing and types: Go uses InstanceCount, InstanceType, VolumeSizeInGb, and VolumeKmsKeyId; Java and TypeScript use instanceCount, instanceType, volumeSizeInGb, and volumeKmsKeyId; Python uses instance_count, instance_type, volume_size_in_gb, and volume_kms_key_id; and YAML takes instanceType as one of the quoted instance type strings listed under ProcessingJobClusterConfigInstanceType below.
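A minimal cluster configuration, sketched as the plain camelCase mapping the YAML form uses, with a few sanity checks mirroring the documented constraints. The concrete values are illustrative only:

```python
# Illustrative cluster config in the YAML-style camelCase shape.
cluster_config = {
    "instanceCount": 2,       # use a value greater than 1 for distributed jobs
    "instanceType": "ml.m5.xlarge",
    "volumeSizeInGb": 30,     # size sufficiently for your input + output data
    "volumeKmsKeyId": None,   # optional KMS key for the ML storage volume
}

def check_cluster_config(cfg: dict) -> bool:
    """Basic sanity checks mirroring the documented constraints."""
    return (
        isinstance(cfg.get("instanceCount"), int) and cfg["instanceCount"] >= 1
        and isinstance(cfg.get("volumeSizeInGb"), int) and cfg["volumeSizeInGb"] >= 1
        and str(cfg.get("instanceType", "")).startswith("ml.")
    )
```

The `startswith("ml.")` check is a cheap local guard; the authoritative list of accepted instance types is the enum documented below.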
ProcessingJobClusterConfigInstanceType, ProcessingJobClusterConfigInstanceTypeArgs
Valid values are the ml.* instance type strings below. Per-SDK member names are mechanical renderings of these strings: C#, Java, and TypeScript use Pascal case (MlT3Medium, MlG4dnXlarge), Go prefixes the enum type name (ProcessingJobClusterConfigInstanceTypeMlT3Medium), Python uses upper snake case (ML_T3_MEDIUM), and YAML/JSON use the quoted string itself ("ml.t3.medium").

- ml.t3: medium, large, xlarge, 2xlarge
- ml.m4: xlarge, 2xlarge, 4xlarge, 10xlarge, 16xlarge
- ml.c4: xlarge, 2xlarge, 4xlarge, 8xlarge
- ml.c5: xlarge, 2xlarge, 4xlarge, 9xlarge, 18xlarge
- ml.m5: large, xlarge, 2xlarge, 4xlarge, 12xlarge, 24xlarge
- ml.r5: large, xlarge, 2xlarge, 4xlarge, 8xlarge, 12xlarge, 16xlarge, 24xlarge
- ml.g4dn: xlarge, 2xlarge, 4xlarge, 8xlarge, 12xlarge, 16xlarge
- ml.g5: xlarge, 2xlarge, 4xlarge, 8xlarge, 12xlarge, 16xlarge, 24xlarge, 48xlarge
- ml.r5d: large, xlarge, 2xlarge, 4xlarge, 8xlarge, 12xlarge, 16xlarge, 24xlarge
- ml.g6 and ml.g6e: xlarge, 2xlarge, 4xlarge, 8xlarge, 12xlarge, 16xlarge, 24xlarge, 48xlarge
- ml.m6i: large, xlarge, 2xlarge, 4xlarge, 8xlarge, 12xlarge, 16xlarge, 24xlarge, 32xlarge
- ml.c6i: xlarge, 2xlarge, 4xlarge, 8xlarge, 12xlarge, 16xlarge, 24xlarge, 32xlarge
- ml.m7i, ml.c7i, and ml.r7i: large, xlarge, 2xlarge, 4xlarge, 8xlarge, 12xlarge, 16xlarge, 24xlarge, 48xlarge
- Processing
Job Cluster Config Instance Type Ml T3Medium ml.t3.medium- Processing
Job Cluster Config Instance Type Ml T3Large ml.t3.large- Processing
Job Cluster Config Instance Type Ml T3Xlarge ml.t3.xlarge- Processing
Job Cluster Config Instance Type Ml T32xlarge ml.t3.2xlarge- Processing
Job Cluster Config Instance Type Ml M4Xlarge ml.m4.xlarge- Processing
Job Cluster Config Instance Type Ml M42xlarge ml.m4.2xlarge- Processing
Job Cluster Config Instance Type Ml M44xlarge ml.m4.4xlarge- Processing
Job Cluster Config Instance Type Ml M410xlarge ml.m4.10xlarge- Processing
Job Cluster Config Instance Type Ml M416xlarge ml.m4.16xlarge- Processing
Job Cluster Config Instance Type Ml C4Xlarge ml.c4.xlarge- Processing
Job Cluster Config Instance Type Ml C42xlarge ml.c4.2xlarge- Processing
Job Cluster Config Instance Type Ml C44xlarge ml.c4.4xlarge- Processing
Job Cluster Config Instance Type Ml C48xlarge ml.c4.8xlarge- Processing
Job Cluster Config Instance Type Ml C5Xlarge ml.c5.xlarge- Processing
Job Cluster Config Instance Type Ml C52xlarge ml.c5.2xlarge- Processing
Job Cluster Config Instance Type Ml C54xlarge ml.c5.4xlarge- Processing
Job Cluster Config Instance Type Ml C59xlarge ml.c5.9xlarge- Processing
Job Cluster Config Instance Type Ml C518xlarge ml.c5.18xlarge- Processing
Job Cluster Config Instance Type Ml M5Large ml.m5.large- Processing
Job Cluster Config Instance Type Ml M5Xlarge ml.m5.xlarge- Processing
Job Cluster Config Instance Type Ml M52xlarge ml.m5.2xlarge- Processing
Job Cluster Config Instance Type Ml M54xlarge ml.m5.4xlarge- Processing
Job Cluster Config Instance Type Ml M512xlarge ml.m5.12xlarge- Processing
Job Cluster Config Instance Type Ml M524xlarge ml.m5.24xlarge- Processing
Job Cluster Config Instance Type Ml R5Large ml.r5.large- Processing
Job Cluster Config Instance Type Ml R5Xlarge ml.r5.xlarge- Processing
Job Cluster Config Instance Type Ml R52xlarge ml.r5.2xlarge- Processing
Job Cluster Config Instance Type Ml R54xlarge ml.r5.4xlarge- Processing
Job Cluster Config Instance Type Ml R58xlarge ml.r5.8xlarge- Processing
Job Cluster Config Instance Type Ml R512xlarge ml.r5.12xlarge- Processing
Job Cluster Config Instance Type Ml R516xlarge ml.r5.16xlarge- Processing
Job Cluster Config Instance Type Ml R524xlarge ml.r5.24xlarge- Processing
Job Cluster Config Instance Type Ml G4dn Xlarge ml.g4dn.xlarge- Processing
Job Cluster Config Instance Type Ml G4dn2xlarge ml.g4dn.2xlarge- Processing
Job Cluster Config Instance Type Ml G4dn4xlarge ml.g4dn.4xlarge- Processing
Job Cluster Config Instance Type Ml G4dn8xlarge ml.g4dn.8xlarge- Processing
Job Cluster Config Instance Type Ml G4dn12xlarge ml.g4dn.12xlarge- Processing
Job Cluster Config Instance Type Ml G4dn16xlarge ml.g4dn.16xlarge- Processing
Job Cluster Config Instance Type Ml G5Xlarge ml.g5.xlarge- Processing
Job Cluster Config Instance Type Ml G52xlarge ml.g5.2xlarge- Processing
Job Cluster Config Instance Type Ml G54xlarge ml.g5.4xlarge- Processing
Job Cluster Config Instance Type Ml G58xlarge ml.g5.8xlarge- Processing
Job Cluster Config Instance Type Ml G516xlarge ml.g5.16xlarge- Processing
Job Cluster Config Instance Type Ml G512xlarge ml.g5.12xlarge- Processing
Job Cluster Config Instance Type Ml G524xlarge ml.g5.24xlarge- Processing
Job Cluster Config Instance Type Ml G548xlarge ml.g5.48xlarge- Processing
Job Cluster Config Instance Type Ml R5d Large ml.r5d.large- Processing
Job Cluster Config Instance Type Ml R5d Xlarge ml.r5d.xlarge- Processing
Job Cluster Config Instance Type Ml R5d2xlarge ml.r5d.2xlarge- Processing
Job Cluster Config Instance Type Ml R5d4xlarge ml.r5d.4xlarge- Processing
Job Cluster Config Instance Type Ml R5d8xlarge ml.r5d.8xlarge- Processing
Job Cluster Config Instance Type Ml R5d12xlarge ml.r5d.12xlarge- Processing
Job Cluster Config Instance Type Ml R5d16xlarge ml.r5d.16xlarge- Processing
Job Cluster Config Instance Type Ml R5d24xlarge ml.r5d.24xlarge- Processing
Job Cluster Config Instance Type Ml G6Xlarge ml.g6.xlarge- Processing
Job Cluster Config Instance Type Ml G62xlarge ml.g6.2xlarge- Processing
Job Cluster Config Instance Type Ml G64xlarge ml.g6.4xlarge- Processing
Job Cluster Config Instance Type Ml G68xlarge ml.g6.8xlarge- Processing
Job Cluster Config Instance Type Ml G612xlarge ml.g6.12xlarge- Processing
Job Cluster Config Instance Type Ml G616xlarge ml.g6.16xlarge- Processing
Job Cluster Config Instance Type Ml G624xlarge ml.g6.24xlarge- Processing
Job Cluster Config Instance Type Ml G648xlarge ml.g6.48xlarge- Processing
Job Cluster Config Instance Type Ml G6e Xlarge ml.g6e.xlarge- Processing
Job Cluster Config Instance Type Ml G6e2xlarge ml.g6e.2xlarge- Processing
Job Cluster Config Instance Type Ml G6e4xlarge ml.g6e.4xlarge- Processing
Job Cluster Config Instance Type Ml G6e8xlarge ml.g6e.8xlarge- Processing
Job Cluster Config Instance Type Ml G6e12xlarge ml.g6e.12xlarge- Processing
Job Cluster Config Instance Type Ml G6e16xlarge ml.g6e.16xlarge- Processing
Job Cluster Config Instance Type Ml G6e24xlarge ml.g6e.24xlarge- Processing
Job Cluster Config Instance Type Ml G6e48xlarge ml.g6e.48xlarge- Processing
Job Cluster Config Instance Type Ml M6i Large ml.m6i.large- Processing
Job Cluster Config Instance Type Ml M6i Xlarge ml.m6i.xlarge- Processing
Job Cluster Config Instance Type Ml M6i2xlarge ml.m6i.2xlarge- Processing
Job Cluster Config Instance Type Ml M6i4xlarge ml.m6i.4xlarge- Processing
Job Cluster Config Instance Type Ml M6i8xlarge ml.m6i.8xlarge- Processing
Job Cluster Config Instance Type Ml M6i12xlarge ml.m6i.12xlarge- Processing
Job Cluster Config Instance Type Ml M6i16xlarge ml.m6i.16xlarge- Processing
Job Cluster Config Instance Type Ml M6i24xlarge ml.m6i.24xlarge- Processing
Job Cluster Config Instance Type Ml M6i32xlarge ml.m6i.32xlarge- Processing
Job Cluster Config Instance Type Ml C6i Xlarge ml.c6i.xlarge- Processing
Job Cluster Config Instance Type Ml C6i2xlarge ml.c6i.2xlarge- Processing
Job Cluster Config Instance Type Ml C6i4xlarge ml.c6i.4xlarge- Processing
Job Cluster Config Instance Type Ml C6i8xlarge ml.c6i.8xlarge- Processing
Job Cluster Config Instance Type Ml C6i12xlarge ml.c6i.12xlarge- Processing
Job Cluster Config Instance Type Ml C6i16xlarge ml.c6i.16xlarge- Processing
Job Cluster Config Instance Type Ml C6i24xlarge ml.c6i.24xlarge- Processing
Job Cluster Config Instance Type Ml C6i32xlarge ml.c6i.32xlarge- Processing
Job Cluster Config Instance Type Ml M7i Large ml.m7i.large- Processing
Job Cluster Config Instance Type Ml M7i Xlarge ml.m7i.xlarge- Processing
Job Cluster Config Instance Type Ml M7i2xlarge ml.m7i.2xlarge- Processing
Job Cluster Config Instance Type Ml M7i4xlarge ml.m7i.4xlarge- Processing
Job Cluster Config Instance Type Ml M7i8xlarge ml.m7i.8xlarge- Processing
Job Cluster Config Instance Type Ml M7i12xlarge ml.m7i.12xlarge- Processing
Job Cluster Config Instance Type Ml M7i16xlarge ml.m7i.16xlarge- Processing
Job Cluster Config Instance Type Ml M7i24xlarge ml.m7i.24xlarge- Processing
Job Cluster Config Instance Type Ml M7i48xlarge ml.m7i.48xlarge- Processing
Job Cluster Config Instance Type Ml C7i Large ml.c7i.large- Processing
Job Cluster Config Instance Type Ml C7i Xlarge ml.c7i.xlarge- Processing
Job Cluster Config Instance Type Ml C7i2xlarge ml.c7i.2xlarge- Processing
Job Cluster Config Instance Type Ml C7i4xlarge ml.c7i.4xlarge- Processing
Job Cluster Config Instance Type Ml C7i8xlarge ml.c7i.8xlarge- Processing
Job Cluster Config Instance Type Ml C7i12xlarge ml.c7i.12xlarge- Processing
Job Cluster Config Instance Type Ml C7i16xlarge ml.c7i.16xlarge- Processing
Job Cluster Config Instance Type Ml C7i24xlarge ml.c7i.24xlarge- Processing
Job Cluster Config Instance Type Ml C7i48xlarge ml.c7i.48xlarge- Processing
Job Cluster Config Instance Type Ml R7i Large ml.r7i.large- Processing
Job Cluster Config Instance Type Ml R7i Xlarge ml.r7i.xlarge- Processing
Job Cluster Config Instance Type Ml R7i2xlarge ml.r7i.2xlarge- Processing
Job Cluster Config Instance Type Ml R7i4xlarge ml.r7i.4xlarge- Processing
Job Cluster Config Instance Type Ml R7i8xlarge ml.r7i.8xlarge- Processing
Job Cluster Config Instance Type Ml R7i12xlarge ml.r7i.12xlarge- Processing
Job Cluster Config Instance Type Ml R7i16xlarge ml.r7i.16xlarge- Processing
Job Cluster Config Instance Type Ml R7i24xlarge ml.r7i.24xlarge- Processing
Job Cluster Config Instance Type Ml R7i48xlarge ml.r7i.48xlarge
- Ml
T3Medium ml.t3.medium- Ml
T3Large ml.t3.large- Ml
T3Xlarge ml.t3.xlarge- Ml
T32xlarge ml.t3.2xlarge- Ml
M4Xlarge ml.m4.xlarge- Ml
M42xlarge ml.m4.2xlarge- Ml
M44xlarge ml.m4.4xlarge- Ml
M410xlarge ml.m4.10xlarge- Ml
M416xlarge ml.m4.16xlarge- Ml
C4Xlarge ml.c4.xlarge- Ml
C42xlarge ml.c4.2xlarge- Ml
C44xlarge ml.c4.4xlarge- Ml
C48xlarge ml.c4.8xlarge- Ml
C5Xlarge ml.c5.xlarge- Ml
C52xlarge ml.c5.2xlarge- Ml
C54xlarge ml.c5.4xlarge- Ml
C59xlarge ml.c5.9xlarge- Ml
C518xlarge ml.c5.18xlarge- Ml
M5Large ml.m5.large- Ml
M5Xlarge ml.m5.xlarge- Ml
M52xlarge ml.m5.2xlarge- Ml
M54xlarge ml.m5.4xlarge- Ml
M512xlarge ml.m5.12xlarge- Ml
M524xlarge ml.m5.24xlarge- Ml
R5Large ml.r5.large- Ml
R5Xlarge ml.r5.xlarge- Ml
R52xlarge ml.r5.2xlarge- Ml
R54xlarge ml.r5.4xlarge- Ml
R58xlarge ml.r5.8xlarge- Ml
R512xlarge ml.r5.12xlarge- Ml
R516xlarge ml.r5.16xlarge- Ml
R524xlarge ml.r5.24xlarge- Ml
G4dn Xlarge ml.g4dn.xlarge- Ml
G4dn2xlarge ml.g4dn.2xlarge- Ml
G4dn4xlarge ml.g4dn.4xlarge- Ml
G4dn8xlarge ml.g4dn.8xlarge- Ml
G4dn12xlarge ml.g4dn.12xlarge- Ml
G4dn16xlarge ml.g4dn.16xlarge- Ml
G5Xlarge ml.g5.xlarge- Ml
G52xlarge ml.g5.2xlarge- Ml
G54xlarge ml.g5.4xlarge- Ml
G58xlarge ml.g5.8xlarge- Ml
G516xlarge ml.g5.16xlarge- Ml
G512xlarge ml.g5.12xlarge- Ml
G524xlarge ml.g5.24xlarge- Ml
G548xlarge ml.g5.48xlarge- Ml
R5d Large ml.r5d.large- Ml
R5d Xlarge ml.r5d.xlarge- Ml
R5d2xlarge ml.r5d.2xlarge- Ml
R5d4xlarge ml.r5d.4xlarge- Ml
R5d8xlarge ml.r5d.8xlarge- Ml
R5d12xlarge ml.r5d.12xlarge- Ml
R5d16xlarge ml.r5d.16xlarge- Ml
R5d24xlarge ml.r5d.24xlarge- Ml
G6Xlarge ml.g6.xlarge- Ml
G62xlarge ml.g6.2xlarge- Ml
G64xlarge ml.g6.4xlarge- Ml
G68xlarge ml.g6.8xlarge- Ml
G612xlarge ml.g6.12xlarge- Ml
G616xlarge ml.g6.16xlarge- Ml
G624xlarge ml.g6.24xlarge- Ml
G648xlarge ml.g6.48xlarge- Ml
G6e Xlarge ml.g6e.xlarge- Ml
G6e2xlarge ml.g6e.2xlarge- Ml
G6e4xlarge ml.g6e.4xlarge- Ml
G6e8xlarge ml.g6e.8xlarge- Ml
G6e12xlarge ml.g6e.12xlarge- Ml
G6e16xlarge ml.g6e.16xlarge- Ml
G6e24xlarge ml.g6e.24xlarge- Ml
G6e48xlarge ml.g6e.48xlarge- Ml
M6i Large ml.m6i.large- Ml
M6i Xlarge ml.m6i.xlarge- Ml
M6i2xlarge ml.m6i.2xlarge- Ml
M6i4xlarge ml.m6i.4xlarge- Ml
M6i8xlarge ml.m6i.8xlarge- Ml
M6i12xlarge ml.m6i.12xlarge- Ml
M6i16xlarge ml.m6i.16xlarge- Ml
M6i24xlarge ml.m6i.24xlarge- Ml
M6i32xlarge ml.m6i.32xlarge- Ml
C6i Xlarge ml.c6i.xlarge- Ml
C6i2xlarge ml.c6i.2xlarge- Ml
C6i4xlarge ml.c6i.4xlarge- Ml
C6i8xlarge ml.c6i.8xlarge- Ml
C6i12xlarge ml.c6i.12xlarge- Ml
C6i16xlarge ml.c6i.16xlarge- Ml
C6i24xlarge ml.c6i.24xlarge- Ml
C6i32xlarge ml.c6i.32xlarge- Ml
M7i Large ml.m7i.large- Ml
M7i Xlarge ml.m7i.xlarge- Ml
M7i2xlarge ml.m7i.2xlarge- Ml
M7i4xlarge ml.m7i.4xlarge- Ml
M7i8xlarge ml.m7i.8xlarge- Ml
M7i12xlarge ml.m7i.12xlarge- Ml
M7i16xlarge ml.m7i.16xlarge- Ml
M7i24xlarge ml.m7i.24xlarge- Ml
M7i48xlarge ml.m7i.48xlarge- Ml
C7i Large ml.c7i.large- Ml
C7i Xlarge ml.c7i.xlarge- Ml
C7i2xlarge ml.c7i.2xlarge- Ml
C7i4xlarge ml.c7i.4xlarge- Ml
C7i8xlarge ml.c7i.8xlarge- Ml
C7i12xlarge ml.c7i.12xlarge- Ml
C7i16xlarge ml.c7i.16xlarge- Ml
C7i24xlarge ml.c7i.24xlarge- Ml
C7i48xlarge ml.c7i.48xlarge- Ml
R7i Large ml.r7i.large- Ml
R7i Xlarge ml.r7i.xlarge- Ml
R7i2xlarge ml.r7i.2xlarge- Ml
R7i4xlarge ml.r7i.4xlarge- Ml
R7i8xlarge ml.r7i.8xlarge- Ml
R7i12xlarge ml.r7i.12xlarge- Ml
R7i16xlarge ml.r7i.16xlarge- Ml
R7i24xlarge ml.r7i.24xlarge- Ml
R7i48xlarge ml.r7i.48xlarge
- Ml
T3Medium ml.t3.medium- Ml
T3Large ml.t3.large- Ml
T3Xlarge ml.t3.xlarge- Ml
T32xlarge ml.t3.2xlarge- Ml
M4Xlarge ml.m4.xlarge- Ml
M42xlarge ml.m4.2xlarge- Ml
M44xlarge ml.m4.4xlarge- Ml
M410xlarge ml.m4.10xlarge- Ml
M416xlarge ml.m4.16xlarge- Ml
C4Xlarge ml.c4.xlarge- Ml
C42xlarge ml.c4.2xlarge- Ml
C44xlarge ml.c4.4xlarge- Ml
C48xlarge ml.c4.8xlarge- Ml
C5Xlarge ml.c5.xlarge- Ml
C52xlarge ml.c5.2xlarge- Ml
C54xlarge ml.c5.4xlarge- Ml
C59xlarge ml.c5.9xlarge- Ml
C518xlarge ml.c5.18xlarge- Ml
M5Large ml.m5.large- Ml
M5Xlarge ml.m5.xlarge- Ml
M52xlarge ml.m5.2xlarge- Ml
M54xlarge ml.m5.4xlarge- Ml
M512xlarge ml.m5.12xlarge- Ml
M524xlarge ml.m5.24xlarge- Ml
R5Large ml.r5.large- Ml
R5Xlarge ml.r5.xlarge- Ml
R52xlarge ml.r5.2xlarge- Ml
R54xlarge ml.r5.4xlarge- Ml
R58xlarge ml.r5.8xlarge- Ml
R512xlarge ml.r5.12xlarge- Ml
R516xlarge ml.r5.16xlarge- Ml
R524xlarge ml.r5.24xlarge- Ml
G4dn Xlarge ml.g4dn.xlarge- Ml
G4dn2xlarge ml.g4dn.2xlarge- Ml
G4dn4xlarge ml.g4dn.4xlarge- Ml
G4dn8xlarge ml.g4dn.8xlarge- Ml
G4dn12xlarge ml.g4dn.12xlarge- Ml
G4dn16xlarge ml.g4dn.16xlarge- Ml
G5Xlarge ml.g5.xlarge- Ml
G52xlarge ml.g5.2xlarge- Ml
G54xlarge ml.g5.4xlarge- Ml
G58xlarge ml.g5.8xlarge- Ml
G516xlarge ml.g5.16xlarge- Ml
G512xlarge ml.g5.12xlarge- Ml
G524xlarge ml.g5.24xlarge- Ml
G548xlarge ml.g5.48xlarge- Ml
R5d Large ml.r5d.large- Ml
R5d Xlarge ml.r5d.xlarge- Ml
R5d2xlarge ml.r5d.2xlarge- Ml
R5d4xlarge ml.r5d.4xlarge- Ml
R5d8xlarge ml.r5d.8xlarge- Ml
R5d12xlarge ml.r5d.12xlarge- Ml
R5d16xlarge ml.r5d.16xlarge- Ml
R5d24xlarge ml.r5d.24xlarge- Ml
G6Xlarge ml.g6.xlarge- Ml
G62xlarge ml.g6.2xlarge- Ml
G64xlarge ml.g6.4xlarge- Ml
G68xlarge ml.g6.8xlarge- Ml
G612xlarge ml.g6.12xlarge- Ml
G616xlarge ml.g6.16xlarge- Ml
G624xlarge ml.g6.24xlarge- Ml
G648xlarge ml.g6.48xlarge- Ml
G6e Xlarge ml.g6e.xlarge- Ml
G6e2xlarge ml.g6e.2xlarge- Ml
G6e4xlarge ml.g6e.4xlarge- Ml
G6e8xlarge ml.g6e.8xlarge- Ml
G6e12xlarge ml.g6e.12xlarge- Ml
G6e16xlarge ml.g6e.16xlarge- Ml
G6e24xlarge ml.g6e.24xlarge- Ml
G6e48xlarge ml.g6e.48xlarge- Ml
M6i Large ml.m6i.large- Ml
M6i Xlarge ml.m6i.xlarge- Ml
M6i2xlarge ml.m6i.2xlarge- Ml
M6i4xlarge ml.m6i.4xlarge- Ml
M6i8xlarge ml.m6i.8xlarge- Ml
M6i12xlarge ml.m6i.12xlarge- Ml
M6i16xlarge ml.m6i.16xlarge- Ml
M6i24xlarge ml.m6i.24xlarge- Ml
M6i32xlarge ml.m6i.32xlarge- Ml
C6i Xlarge ml.c6i.xlarge- Ml
C6i2xlarge ml.c6i.2xlarge- Ml
C6i4xlarge ml.c6i.4xlarge- Ml
C6i8xlarge ml.c6i.8xlarge- Ml
C6i12xlarge ml.c6i.12xlarge- Ml
C6i16xlarge ml.c6i.16xlarge- Ml
C6i24xlarge ml.c6i.24xlarge- Ml
C6i32xlarge ml.c6i.32xlarge- Ml
M7i Large ml.m7i.large- Ml
M7i Xlarge ml.m7i.xlarge- Ml
M7i2xlarge ml.m7i.2xlarge- Ml
M7i4xlarge ml.m7i.4xlarge- Ml
M7i8xlarge ml.m7i.8xlarge- Ml
M7i12xlarge ml.m7i.12xlarge- Ml
M7i16xlarge ml.m7i.16xlarge- Ml
M7i24xlarge ml.m7i.24xlarge- Ml
M7i48xlarge ml.m7i.48xlarge- Ml
C7i Large ml.c7i.large- Ml
C7i Xlarge ml.c7i.xlarge- Ml
C7i2xlarge ml.c7i.2xlarge- Ml
C7i4xlarge ml.c7i.4xlarge- Ml
C7i8xlarge ml.c7i.8xlarge- Ml
C7i12xlarge ml.c7i.12xlarge- Ml
C7i16xlarge ml.c7i.16xlarge- Ml
C7i24xlarge ml.c7i.24xlarge- Ml
C7i48xlarge ml.c7i.48xlarge- Ml
R7i Large ml.r7i.large- Ml
R7i Xlarge ml.r7i.xlarge- Ml
R7i2xlarge ml.r7i.2xlarge- Ml
R7i4xlarge ml.r7i.4xlarge- Ml
R7i8xlarge ml.r7i.8xlarge- Ml
R7i12xlarge ml.r7i.12xlarge- Ml
R7i16xlarge ml.r7i.16xlarge- Ml
R7i24xlarge ml.r7i.24xlarge- Ml
R7i48xlarge ml.r7i.48xlarge
- ML_T3_MEDIUM
ml.t3.medium- ML_T3_LARGE
ml.t3.large- ML_T3_XLARGE
ml.t3.xlarge- ML_T32XLARGE
ml.t3.2xlarge- ML_M4_XLARGE
ml.m4.xlarge- ML_M42XLARGE
ml.m4.2xlarge- ML_M44XLARGE
ml.m4.4xlarge- ML_M410XLARGE
ml.m4.10xlarge- ML_M416XLARGE
ml.m4.16xlarge- ML_C4_XLARGE
ml.c4.xlarge- ML_C42XLARGE
ml.c4.2xlarge- ML_C44XLARGE
ml.c4.4xlarge- ML_C48XLARGE
ml.c4.8xlarge- ML_C5_XLARGE
ml.c5.xlarge- ML_C52XLARGE
ml.c5.2xlarge- ML_C54XLARGE
ml.c5.4xlarge- ML_C59XLARGE
ml.c5.9xlarge- ML_C518XLARGE
ml.c5.18xlarge- ML_M5_LARGE
ml.m5.large- ML_M5_XLARGE
ml.m5.xlarge- ML_M52XLARGE
ml.m5.2xlarge- ML_M54XLARGE
ml.m5.4xlarge- ML_M512XLARGE
ml.m5.12xlarge- ML_M524XLARGE
ml.m5.24xlarge- ML_R5_LARGE
ml.r5.large- ML_R5_XLARGE
ml.r5.xlarge- ML_R52XLARGE
ml.r5.2xlarge- ML_R54XLARGE
ml.r5.4xlarge- ML_R58XLARGE
ml.r5.8xlarge- ML_R512XLARGE
ml.r5.12xlarge- ML_R516XLARGE
ml.r5.16xlarge- ML_R524XLARGE
ml.r5.24xlarge- ML_G4DN_XLARGE
ml.g4dn.xlarge- ML_G4DN2XLARGE
ml.g4dn.2xlarge- ML_G4DN4XLARGE
ml.g4dn.4xlarge- ML_G4DN8XLARGE
ml.g4dn.8xlarge- ML_G4DN12XLARGE
ml.g4dn.12xlarge- ML_G4DN16XLARGE
ml.g4dn.16xlarge- ML_G5_XLARGE
ml.g5.xlarge- ML_G52XLARGE
ml.g5.2xlarge- ML_G54XLARGE
ml.g5.4xlarge- ML_G58XLARGE
ml.g5.8xlarge- ML_G516XLARGE
ml.g5.16xlarge- ML_G512XLARGE
ml.g5.12xlarge- ML_G524XLARGE
ml.g5.24xlarge- ML_G548XLARGE
ml.g5.48xlarge- ML_R5D_LARGE
ml.r5d.large- ML_R5D_XLARGE
ml.r5d.xlarge- ML_R5D2XLARGE
ml.r5d.2xlarge- ML_R5D4XLARGE
ml.r5d.4xlarge- ML_R5D8XLARGE
ml.r5d.8xlarge- ML_R5D12XLARGE
ml.r5d.12xlarge- ML_R5D16XLARGE
ml.r5d.16xlarge- ML_R5D24XLARGE
ml.r5d.24xlarge- ML_G6_XLARGE
ml.g6.xlarge- ML_G62XLARGE
ml.g6.2xlarge- ML_G64XLARGE
ml.g6.4xlarge- ML_G68XLARGE
ml.g6.8xlarge- ML_G612XLARGE
ml.g6.12xlarge- ML_G616XLARGE
ml.g6.16xlarge- ML_G624XLARGE
ml.g6.24xlarge- ML_G648XLARGE
ml.g6.48xlarge- ML_G6E_XLARGE
ml.g6e.xlarge- ML_G6E2XLARGE
ml.g6e.2xlarge- ML_G6E4XLARGE
ml.g6e.4xlarge- ML_G6E8XLARGE
ml.g6e.8xlarge- ML_G6E12XLARGE
ml.g6e.12xlarge- ML_G6E16XLARGE
ml.g6e.16xlarge- ML_G6E24XLARGE
ml.g6e.24xlarge- ML_G6E48XLARGE
ml.g6e.48xlarge- ML_M6I_LARGE
ml.m6i.large- ML_M6I_XLARGE
ml.m6i.xlarge- ML_M6I2XLARGE
ml.m6i.2xlarge- ML_M6I4XLARGE
ml.m6i.4xlarge- ML_M6I8XLARGE
ml.m6i.8xlarge- ML_M6I12XLARGE
ml.m6i.12xlarge- ML_M6I16XLARGE
ml.m6i.16xlarge- ML_M6I24XLARGE
ml.m6i.24xlarge- ML_M6I32XLARGE
ml.m6i.32xlarge- ML_C6I_XLARGE
ml.c6i.xlarge- ML_C6I2XLARGE
ml.c6i.2xlarge- ML_C6I4XLARGE
ml.c6i.4xlarge- ML_C6I8XLARGE
ml.c6i.8xlarge- ML_C6I12XLARGE
ml.c6i.12xlarge- ML_C6I16XLARGE
ml.c6i.16xlarge- ML_C6I24XLARGE
ml.c6i.24xlarge- ML_C6I32XLARGE
ml.c6i.32xlarge- ML_M7I_LARGE
ml.m7i.large- ML_M7I_XLARGE
ml.m7i.xlarge- ML_M7I2XLARGE
ml.m7i.2xlarge- ML_M7I4XLARGE
ml.m7i.4xlarge- ML_M7I8XLARGE
ml.m7i.8xlarge- ML_M7I12XLARGE
ml.m7i.12xlarge- ML_M7I16XLARGE
ml.m7i.16xlarge- ML_M7I24XLARGE
ml.m7i.24xlarge- ML_M7I48XLARGE
ml.m7i.48xlarge- ML_C7I_LARGE
ml.c7i.large- ML_C7I_XLARGE
ml.c7i.xlarge- ML_C7I2XLARGE
ml.c7i.2xlarge- ML_C7I4XLARGE
ml.c7i.4xlarge- ML_C7I8XLARGE
ml.c7i.8xlarge- ML_C7I12XLARGE
ml.c7i.12xlarge- ML_C7I16XLARGE
ml.c7i.16xlarge- ML_C7I24XLARGE
ml.c7i.24xlarge- ML_C7I48XLARGE
ml.c7i.48xlarge- ML_R7I_LARGE
ml.r7i.large- ML_R7I_XLARGE
ml.r7i.xlarge- ML_R7I2XLARGE
ml.r7i.2xlarge- ML_R7I4XLARGE
ml.r7i.4xlarge- ML_R7I8XLARGE
ml.r7i.8xlarge- ML_R7I12XLARGE
ml.r7i.12xlarge- ML_R7I16XLARGE
ml.r7i.16xlarge- ML_R7I24XLARGE
ml.r7i.24xlarge- ML_R7I48XLARGE
ml.r7i.48xlarge
- "ml.t3.medium"
ml.t3.medium- "ml.t3.large"
ml.t3.large- "ml.t3.xlarge"
ml.t3.xlarge- "ml.t3.2xlarge"
ml.t3.2xlarge- "ml.m4.xlarge"
ml.m4.xlarge- "ml.m4.2xlarge"
ml.m4.2xlarge- "ml.m4.4xlarge"
ml.m4.4xlarge- "ml.m4.10xlarge"
ml.m4.10xlarge- "ml.m4.16xlarge"
ml.m4.16xlarge- "ml.c4.xlarge"
ml.c4.xlarge- "ml.c4.2xlarge"
ml.c4.2xlarge- "ml.c4.4xlarge"
ml.c4.4xlarge- "ml.c4.8xlarge"
ml.c4.8xlarge- "ml.c5.xlarge"
ml.c5.xlarge- "ml.c5.2xlarge"
ml.c5.2xlarge- "ml.c5.4xlarge"
ml.c5.4xlarge- "ml.c5.9xlarge"
ml.c5.9xlarge- "ml.c5.18xlarge"
ml.c5.18xlarge- "ml.m5.large"
ml.m5.large- "ml.m5.xlarge"
ml.m5.xlarge- "ml.m5.2xlarge"
ml.m5.2xlarge- "ml.m5.4xlarge"
ml.m5.4xlarge- "ml.m5.12xlarge"
ml.m5.12xlarge- "ml.m5.24xlarge"
ml.m5.24xlarge- "ml.r5.large"
- "ml.r5.large"
- "ml.r5.xlarge"
- "ml.r5.2xlarge"
- "ml.r5.4xlarge"
- "ml.r5.8xlarge"
- "ml.r5.12xlarge"
- "ml.r5.16xlarge"
- "ml.r5.24xlarge"
- "ml.g4dn.xlarge"
- "ml.g4dn.2xlarge"
- "ml.g4dn.4xlarge"
- "ml.g4dn.8xlarge"
- "ml.g4dn.12xlarge"
- "ml.g4dn.16xlarge"
- "ml.g5.xlarge"
- "ml.g5.2xlarge"
- "ml.g5.4xlarge"
- "ml.g5.8xlarge"
- "ml.g5.16xlarge"
- "ml.g5.12xlarge"
- "ml.g5.24xlarge"
- "ml.g5.48xlarge"
- "ml.r5d.large"
- "ml.r5d.xlarge"
- "ml.r5d.2xlarge"
- "ml.r5d.4xlarge"
- "ml.r5d.8xlarge"
- "ml.r5d.12xlarge"
- "ml.r5d.16xlarge"
- "ml.r5d.24xlarge"
- "ml.g6.xlarge"
- "ml.g6.2xlarge"
- "ml.g6.4xlarge"
- "ml.g6.8xlarge"
- "ml.g6.12xlarge"
- "ml.g6.16xlarge"
- "ml.g6.24xlarge"
- "ml.g6.48xlarge"
- "ml.g6e.xlarge"
- "ml.g6e.2xlarge"
- "ml.g6e.4xlarge"
- "ml.g6e.8xlarge"
- "ml.g6e.12xlarge"
- "ml.g6e.16xlarge"
- "ml.g6e.24xlarge"
- "ml.g6e.48xlarge"
- "ml.m6i.large"
- "ml.m6i.xlarge"
- "ml.m6i.2xlarge"
- "ml.m6i.4xlarge"
- "ml.m6i.8xlarge"
- "ml.m6i.12xlarge"
- "ml.m6i.16xlarge"
- "ml.m6i.24xlarge"
- "ml.m6i.32xlarge"
- "ml.c6i.xlarge"
- "ml.c6i.2xlarge"
- "ml.c6i.4xlarge"
- "ml.c6i.8xlarge"
- "ml.c6i.12xlarge"
- "ml.c6i.16xlarge"
- "ml.c6i.24xlarge"
- "ml.c6i.32xlarge"
- "ml.m7i.large"
- "ml.m7i.xlarge"
- "ml.m7i.2xlarge"
- "ml.m7i.4xlarge"
- "ml.m7i.8xlarge"
- "ml.m7i.12xlarge"
- "ml.m7i.16xlarge"
- "ml.m7i.24xlarge"
- "ml.m7i.48xlarge"
- "ml.c7i.large"
- "ml.c7i.xlarge"
- "ml.c7i.2xlarge"
- "ml.c7i.4xlarge"
- "ml.c7i.8xlarge"
- "ml.c7i.12xlarge"
- "ml.c7i.16xlarge"
- "ml.c7i.24xlarge"
- "ml.c7i.48xlarge"
- "ml.r7i.large"
- "ml.r7i.xlarge"
- "ml.r7i.2xlarge"
- "ml.r7i.4xlarge"
- "ml.r7i.8xlarge"
- "ml.r7i.12xlarge"
- "ml.r7i.16xlarge"
- "ml.r7i.24xlarge"
- "ml.r7i.48xlarge"
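The instance type strings above all follow the `ml.<family>.<size>` pattern. As an illustration only (this helper is not part of the provider), user input can be validated against a subset of the allowed values before constructing a cluster config:

```python
# Illustrative helper: validate a SageMaker processing instance type string.
# The set below is a small subset of the allowed values listed above.
ALLOWED_INSTANCE_TYPES = {
    "ml.r5.large", "ml.r5.xlarge", "ml.g4dn.xlarge",
    "ml.g5.2xlarge", "ml.m6i.large", "ml.c7i.48xlarge",
}

def is_valid_instance_type(value: str) -> bool:
    """Return True if value is one of the known instance type strings."""
    return value in ALLOWED_INSTANCE_TYPES

print(is_valid_instance_type("ml.g4dn.xlarge"))  # True
print(is_valid_instance_type("ml.t2.micro"))     # False
```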
ProcessingJobDatasetDefinition, ProcessingJobDatasetDefinitionArgs

Configuration for Dataset Definition inputs. The Dataset Definition input must specify exactly one of either AthenaDatasetDefinition or RedshiftDatasetDefinition types.

- AthenaDatasetDefinition Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobAthenaDatasetDefinition - Configuration for Athena Dataset Definition input.
- DataDistributionType ProcessingJobDatasetDefinitionDataDistributionType - Whether the generated dataset is FullyReplicated or ShardedByS3Key (default).
- InputMode ProcessingJobDatasetDefinitionInputMode - Whether to use File or Pipe input mode. In File (default) mode, Amazon SageMaker copies the data from the input source onto the local Amazon Elastic Block Store (Amazon EBS) volumes before starting your training algorithm. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your algorithm without using the EBS volume.
- LocalPath string - The local path where you want Amazon SageMaker to download the Dataset Definition inputs to run a processing job. LocalPath is an absolute path to the input data. This is a required parameter when AppManaged is False (default).
- RedshiftDatasetDefinition Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobRedshiftDatasetDefinition - Configuration for Redshift Dataset Definition input.

Property names in this and the following type definitions follow each SDK's conventions: PascalCase in .NET and Go, camelCase in TypeScript and Java (athenaDatasetDefinition), and snake_case in Python (athena_dataset_definition).
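A minimal sketch of a dataset definition, written as a plain Python dict whose keys mirror the Python SDK's snake_case arg names (catalog, database, and bucket names are hypothetical). It also shows a small check for the exactly-one-of constraint described above:

```python
# Sketch of a dataset definition input. Exactly one of
# athena_dataset_definition / redshift_dataset_definition may be set.
# All resource names below are hypothetical.
dataset_definition = {
    "athena_dataset_definition": {
        "catalog": "AwsDataCatalog",
        "database": "my_database",
        "query_string": "SELECT * FROM my_table",
        "output_s3_uri": "s3://my-bucket/athena-out/",
        "output_format": "PARQUET",
        "work_group": "primary",
    },
    "data_distribution_type": "ShardedByS3Key",  # the default
    "input_mode": "File",                        # the default
    "local_path": "/opt/ml/processing/input/dataset",
}

def specifies_exactly_one_source(d: dict) -> bool:
    """Enforce the exactly-one-of constraint on the dataset sources."""
    sources = ("athena_dataset_definition", "redshift_dataset_definition")
    return sum(k in d for k in sources) == 1

print(specifies_exactly_one_source(dataset_definition))  # True
```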
ProcessingJobDatasetDefinitionDataDistributionType, ProcessingJobDatasetDefinitionDataDistributionTypeArgs

- FullyReplicated - "FullyReplicated"
- ShardedByS3Key - "ShardedByS3Key"

Enum member names follow each SDK's conventions (for example, FULLY_REPLICATED and SHARDED_BY_S3_KEY in Python).
ProcessingJobDatasetDefinitionInputMode, ProcessingJobDatasetDefinitionInputModeArgs

- File - "File"
- Pipe - "Pipe"
ProcessingJobExperimentConfig, ProcessingJobExperimentConfigArgs

Associates a SageMaker job as a trial component with an experiment and trial.

- ExperimentName string - The name of an existing experiment to associate with the trial component.
- RunName string - The name of the experiment run to associate with the trial component.
- TrialComponentDisplayName string - The display name for the trial component. If this key isn't specified, the display name is the trial component name.
- TrialName string - The name of an existing trial to associate the trial component with. If not specified, a new trial is created.
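For illustration, an experiment config can be sketched as a dict mirroring the Python arg names (the experiment and run names here are hypothetical):

```python
# Sketch of an experiment config. Omitting "trial_name" means a new
# trial is created, per the property description above.
experiment_config = {
    "experiment_name": "churn-experiment",
    "run_name": "run-2024-01",
    "trial_component_display_name": "preprocessing",
}
print(sorted(experiment_config))
```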
ProcessingJobFeatureStoreOutput, ProcessingJobFeatureStoreOutputArgs

Configuration for processing job outputs in Amazon SageMaker Feature Store.

- FeatureGroupName string - The name of the Amazon SageMaker FeatureGroup to use as the destination for processing job output. Note that your processing script is responsible for putting records into your Feature Store.
ProcessingJobNetworkConfig, ProcessingJobNetworkConfigArgs

Networking options for a job, such as network traffic encryption between containers, whether to allow inbound and outbound network calls to and from containers, and the VPC subnets and security groups to use for VPC-enabled jobs.

- EnableInterContainerTrafficEncryption bool - Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
- EnableNetworkIsolation bool - Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
- VpcConfig Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobVpcConfig - Specifies an Amazon Virtual Private Cloud (VPC) that your SageMaker jobs, hosted models, and compute resources have access to. You can control access to and from your resources by configuring a VPC. For more information, see Give SageMaker Access to Resources in your Amazon VPC.
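A network config can be sketched the same way; the security group and subnet IDs below are hypothetical placeholders:

```python
# Sketch of a network config mirroring the Python arg names.
network_config = {
    "enable_inter_container_traffic_encryption": True,
    "enable_network_isolation": False,
    "vpc_config": {
        "security_group_ids": ["sg-0123456789abcdef0"],  # hypothetical
        "subnets": ["subnet-0123456789abcdef0"],         # hypothetical
    },
}
print(network_config["vpc_config"]["subnets"])
```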
ProcessingJobProcessingInputsObject, ProcessingJobProcessingInputsObjectArgs

The inputs for a processing job. The processing input must specify exactly one of either S3Input or DatasetDefinition types.

- InputName string - The name for the processing job input.
- AppManaged bool - When True, input operations such as data download are managed natively by the processing job application. When False (default), input operations are managed by Amazon SageMaker.
- DatasetDefinition Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobDatasetDefinition - Configuration for Dataset Definition inputs. The Dataset Definition input must specify exactly one of either AthenaDatasetDefinition or RedshiftDatasetDefinition types.
- S3Input Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobS3Input - Configuration for downloading input data from Amazon S3 into the processing container.
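As a sketch, an S3-backed processing input looks like the following (the bucket and paths are hypothetical; exactly one of `s3_input` or `dataset_definition` may be set):

```python
# Sketch of a processing input using S3 as the source.
processing_input = {
    "input_name": "raw-data",
    "app_managed": False,  # default: SageMaker manages the download
    "s3_input": {
        "s3_uri": "s3://my-bucket/raw/",           # hypothetical bucket
        "local_path": "/opt/ml/processing/input",
        "s3_data_type": "S3Prefix",
        "s3_input_mode": "File",
    },
}
print(processing_input["input_name"])
```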
ProcessingJobProcessingOutputConfig, ProcessingJobProcessingOutputConfigArgs

Configuration for uploading output from the processing container.

- Outputs List<Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobProcessingOutputsObject> - An array of outputs configuring the data to upload from the processing container.
- KmsKeyId string - The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the processing job output. KmsKeyId can be an ID of a KMS key, ARN of a KMS key, or alias of a KMS key. The KmsKeyId is applied to all outputs.
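An output config holding a single S3 output can be sketched as follows (the key alias, bucket, and paths are hypothetical):

```python
# Sketch of an output config. The KMS key applies to all outputs.
processing_output_config = {
    "kms_key_id": "alias/my-processing-key",  # hypothetical; ID or ARN also accepted
    "outputs": [
        {
            "output_name": "features",
            "s3_output": {
                "s3_uri": "s3://my-bucket/features/",      # hypothetical
                "local_path": "/opt/ml/processing/output",
                "s3_upload_mode": "EndOfJob",
            },
        }
    ],
}
print(len(processing_output_config["outputs"]))
```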
ProcessingJobProcessingOutputsObject, ProcessingJobProcessingOutputsObjectArgs

Describes the results of a processing job. The processing output must specify exactly one of either S3Output or FeatureStoreOutput types.

- OutputName string - The name for the processing job output.
- AppManaged bool - When True, output operations such as data upload are managed natively by the processing job application. When False (default), output operations are managed by Amazon SageMaker.
- FeatureStoreOutput Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobFeatureStoreOutput - Configuration for processing job outputs in Amazon SageMaker Feature Store.
- S3Output Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobS3Output - Configuration for uploading output data to Amazon S3 from the processing container.
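The Feature Store alternative can be sketched the same way (the feature group name is hypothetical; per the property description, the processing script itself is responsible for putting records into the group):

```python
# Sketch of a processing output that targets a Feature Store group
# rather than S3. Exactly one of s3_output / feature_store_output may be set.
feature_store_output = {
    "output_name": "online-features",
    "feature_store_output": {
        "feature_group_name": "customer-features",  # hypothetical group
    },
}
print(feature_store_output["output_name"])
```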
ProcessingJobProcessingResources, ProcessingJobProcessingResourcesArgs

Identifies the resources, ML compute instances, and ML storage volumes to deploy for a processing job. In distributed training, you specify more than one instance.

- ClusterConfig Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobClusterConfig - The configuration for the resources in a cluster used to run the processing job.
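A minimal sketch of the processing resources, using an instance type from the list earlier on this page (the count and volume size are illustrative choices, not defaults):

```python
# Sketch of processing resources: a two-instance cluster with 50 GB
# EBS volumes attached to each instance.
processing_resources = {
    "cluster_config": {
        "instance_count": 2,
        "instance_type": "ml.m6i.xlarge",
        "volume_size_in_gb": 50,
    },
}
print(processing_resources["cluster_config"]["instance_type"])
```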
ProcessingJobRedshiftDatasetDefinition, ProcessingJobRedshiftDatasetDefinitionArgs
Configuration for Redshift Dataset Definition input.- Cluster
Id string - The Redshift cluster Identifier.
- Cluster
Role stringArn - The IAM role attached to your Redshift cluster that Amazon SageMaker uses to generate datasets.
- Database string
- The name of the Redshift database used in Redshift query execution.
- Db
User string - The database user name used in Redshift query execution.
- Output
Format Pulumi.Aws Native. Sage Maker. Processing Job Redshift Dataset Definition Output Format - The data storage format for Redshift query results.
- Output
S3Uri string - The location in Amazon S3 where the Redshift query results are stored.
- Query
String string - The SQL query statements to be executed.
- Kms
Key stringId - The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data from a Redshift execution.
- Output
Compression Pulumi.Aws Native. Sage Maker. Processing Job Redshift Dataset Definition Output Compression - The compression used for Redshift query results.
- Cluster
Id string - The Redshift cluster Identifier.
- Cluster
Role stringArn - The IAM role attached to your Redshift cluster that Amazon SageMaker uses to generate datasets.
- Database string
- The name of the Redshift database used in Redshift query execution.
- Db
User string - The database user name used in Redshift query execution.
- Output
Format ProcessingJob Redshift Dataset Definition Output Format - The data storage format for Redshift query results.
- Output
S3Uri string - The location in Amazon S3 where the Redshift query results are stored.
- Query
String string - The SQL query statements to be executed.
- Kms
Key stringId - The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data from a Redshift execution.
- Output
Compression ProcessingJob Redshift Dataset Definition Output Compression - The compression used for Redshift query results.
- cluster
Id String - The Redshift cluster Identifier.
- cluster
Role StringArn - The IAM role attached to your Redshift cluster that Amazon SageMaker uses to generate datasets.
- database String
- The name of the Redshift database used in Redshift query execution.
- db
User String - The database user name used in Redshift query execution.
- output
Format ProcessingJob Redshift Dataset Definition Output Format - The data storage format for Redshift query results.
- output
S3Uri String - The location in Amazon S3 where the Redshift query results are stored.
- query
String String - The SQL query statements to be executed.
- kms
Key StringId - The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data from a Redshift execution.
- output
Compression ProcessingJob Redshift Dataset Definition Output Compression - The compression used for Redshift query results.
- clusterId string - The Redshift cluster Identifier.
- clusterRoleArn string - The IAM role attached to your Redshift cluster that Amazon SageMaker uses to generate datasets.
- database string - The name of the Redshift database used in Redshift query execution.
- dbUser string - The database user name used in Redshift query execution.
- outputFormat ProcessingJobRedshiftDatasetDefinitionOutputFormat - The data storage format for Redshift query results.
- outputS3Uri string - The location in Amazon S3 where the Redshift query results are stored.
- queryString string - The SQL query statements to be executed.
- kmsKeyId string - The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data from a Redshift execution.
- outputCompression ProcessingJobRedshiftDatasetDefinitionOutputCompression - The compression used for Redshift query results.
- cluster_id str - The Redshift cluster Identifier.
- cluster_role_arn str - The IAM role attached to your Redshift cluster that Amazon SageMaker uses to generate datasets.
- database str - The name of the Redshift database used in Redshift query execution.
- db_user str - The database user name used in Redshift query execution.
- output_format ProcessingJobRedshiftDatasetDefinitionOutputFormat - The data storage format for Redshift query results.
- output_s3_uri str - The location in Amazon S3 where the Redshift query results are stored.
- query_string str - The SQL query statements to be executed.
- kms_key_id str - The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data from a Redshift execution.
- output_compression ProcessingJobRedshiftDatasetDefinitionOutputCompression - The compression used for Redshift query results.
- clusterId String - The Redshift cluster Identifier.
- clusterRoleArn String - The IAM role attached to your Redshift cluster that Amazon SageMaker uses to generate datasets.
- database String - The name of the Redshift database used in Redshift query execution.
- dbUser String - The database user name used in Redshift query execution.
- outputFormat "PARQUET" | "CSV" - The data storage format for Redshift query results.
- outputS3Uri String - The location in Amazon S3 where the Redshift query results are stored.
- queryString String - The SQL query statements to be executed.
- kmsKeyId String - The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data from a Redshift execution.
- outputCompression "None" | "GZIP" | "SNAPPY" | "ZSTD" | "BZIP2" - The compression used for Redshift query results.
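Taken together, the properties above can be sketched as a plain Python mapping, using the Python SDK's snake_case names. This is an illustrative shape only; the cluster identifier, role ARN, database names, and bucket path below are placeholders, not values from this document.

```python
# Hypothetical values; keys mirror the RedshiftDatasetDefinition properties above.
redshift_dataset_definition = {
    "cluster_id": "my-redshift-cluster",          # placeholder cluster identifier
    "cluster_role_arn": "arn:aws:iam::123456789012:role/RedshiftSageMakerRole",
    "database": "analytics",                      # placeholder database name
    "db_user": "sagemaker_user",                  # placeholder database user
    "query_string": "SELECT * FROM events",       # placeholder SQL
    "output_s3_uri": "s3://my-bucket/redshift-output/",
    "output_format": "PARQUET",                   # "PARQUET" | "CSV"
    "output_compression": "GZIP",                 # "None" | "GZIP" | "SNAPPY" | "ZSTD" | "BZIP2"
}

# Light sanity checks on the enum-valued fields, per the unions above:
assert redshift_dataset_definition["output_format"] in {"PARQUET", "CSV"}
assert redshift_dataset_definition["output_compression"] in {"None", "GZIP", "SNAPPY", "ZSTD", "BZIP2"}
```

In a real program this mapping would be passed (as `ProcessingJobRedshiftDatasetDefinitionArgs`) inside a processing input's dataset definition rather than used standalone.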
ProcessingJobRedshiftDatasetDefinitionOutputCompression, ProcessingJobRedshiftDatasetDefinitionOutputCompressionArgs
- None None
- Gzip GZIP
- Snappy SNAPPY
- Zstd ZSTD
- Bzip2 BZIP2
- ProcessingJobRedshiftDatasetDefinitionOutputCompressionNone None
- ProcessingJobRedshiftDatasetDefinitionOutputCompressionGzip GZIP
- ProcessingJobRedshiftDatasetDefinitionOutputCompressionSnappy SNAPPY
- ProcessingJobRedshiftDatasetDefinitionOutputCompressionZstd ZSTD
- ProcessingJobRedshiftDatasetDefinitionOutputCompressionBzip2 BZIP2
- None None
- Gzip GZIP
- Snappy SNAPPY
- Zstd ZSTD
- Bzip2 BZIP2
- None None
- Gzip GZIP
- Snappy SNAPPY
- Zstd ZSTD
- Bzip2 BZIP2
- NONE None
- GZIP GZIP
- SNAPPY SNAPPY
- ZSTD ZSTD
- BZIP2 BZIP2
- "None" None
- "GZIP" GZIP
- "SNAPPY" SNAPPY
- "ZSTD" ZSTD
- "BZIP2" BZIP2
ProcessingJobRedshiftDatasetDefinitionOutputFormat, ProcessingJobRedshiftDatasetDefinitionOutputFormatArgs
- Parquet PARQUET
- Csv CSV
- ProcessingJobRedshiftDatasetDefinitionOutputFormatParquet PARQUET
- ProcessingJobRedshiftDatasetDefinitionOutputFormatCsv CSV
- Parquet PARQUET
- Csv CSV
- Parquet PARQUET
- Csv CSV
- PARQUET PARQUET
- CSV CSV
- "PARQUET" PARQUET
- "CSV" CSV
ProcessingJobS3Input, ProcessingJobS3InputArgs
Configuration for downloading input data from Amazon S3 into the processing container.
- S3DataType Pulumi.AwsNative.SageMaker.ProcessingJobS3InputS3DataType - Whether you use an S3Prefix or a ManifestFile for the data type. If you choose S3Prefix, S3Uri identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
- S3Uri string - The URI of the Amazon S3 prefix Amazon SageMaker downloads data required to run a processing job.
- LocalPath string - The local path in your container where you want Amazon SageMaker to write input data to. LocalPath is an absolute path to the input data and must begin with /opt/ml/processing/. LocalPath is a required parameter when AppManaged is False (default).
- S3CompressionType Pulumi.AwsNative.SageMaker.ProcessingJobS3InputS3CompressionType - Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. Gzip can only be used when Pipe mode is specified as the S3InputMode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
- S3DataDistributionType Pulumi.AwsNative.SageMaker.ProcessingJobS3InputS3DataDistributionType - Whether to distribute the data from Amazon S3 to all processing instances with FullyReplicated, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance.
- S3InputMode Pulumi.AwsNative.SageMaker.ProcessingJobS3InputS3InputMode - Whether to use File or Pipe input mode. In File mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
- S3DataType ProcessingJobS3InputS3DataType - Whether you use an S3Prefix or a ManifestFile for the data type. If you choose S3Prefix, S3Uri identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
- S3Uri string - The URI of the Amazon S3 prefix Amazon SageMaker downloads data required to run a processing job.
- LocalPath string - The local path in your container where you want Amazon SageMaker to write input data to. LocalPath is an absolute path to the input data and must begin with /opt/ml/processing/. LocalPath is a required parameter when AppManaged is False (default).
- S3CompressionType ProcessingJobS3InputS3CompressionType - Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. Gzip can only be used when Pipe mode is specified as the S3InputMode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
- S3DataDistributionType ProcessingJobS3InputS3DataDistributionType - Whether to distribute the data from Amazon S3 to all processing instances with FullyReplicated, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance.
- S3InputMode ProcessingJobS3InputS3InputMode - Whether to use File or Pipe input mode. In File mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
- s3DataType ProcessingJobS3InputS3DataType - Whether you use an S3Prefix or a ManifestFile for the data type. If you choose S3Prefix, S3Uri identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
- s3Uri String - The URI of the Amazon S3 prefix Amazon SageMaker downloads data required to run a processing job.
- localPath String - The local path in your container where you want Amazon SageMaker to write input data to. LocalPath is an absolute path to the input data and must begin with /opt/ml/processing/. LocalPath is a required parameter when AppManaged is False (default).
- s3CompressionType ProcessingJobS3InputS3CompressionType - Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. Gzip can only be used when Pipe mode is specified as the S3InputMode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
- s3DataDistributionType ProcessingJobS3InputS3DataDistributionType - Whether to distribute the data from Amazon S3 to all processing instances with FullyReplicated, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance.
- s3InputMode ProcessingJobS3InputS3InputMode - Whether to use File or Pipe input mode. In File mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
- s3DataType ProcessingJobS3InputS3DataType - Whether you use an S3Prefix or a ManifestFile for the data type. If you choose S3Prefix, S3Uri identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
- s3Uri string - The URI of the Amazon S3 prefix Amazon SageMaker downloads data required to run a processing job.
- localPath string - The local path in your container where you want Amazon SageMaker to write input data to. LocalPath is an absolute path to the input data and must begin with /opt/ml/processing/. LocalPath is a required parameter when AppManaged is False (default).
- s3CompressionType ProcessingJobS3InputS3CompressionType - Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. Gzip can only be used when Pipe mode is specified as the S3InputMode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
- s3DataDistributionType ProcessingJobS3InputS3DataDistributionType - Whether to distribute the data from Amazon S3 to all processing instances with FullyReplicated, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance.
- s3InputMode ProcessingJobS3InputS3InputMode - Whether to use File or Pipe input mode. In File mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
- s3_data_type ProcessingJobS3InputS3DataType - Whether you use an S3Prefix or a ManifestFile for the data type. If you choose S3Prefix, S3Uri identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
- s3_uri str - The URI of the Amazon S3 prefix Amazon SageMaker downloads data required to run a processing job.
- local_path str - The local path in your container where you want Amazon SageMaker to write input data to. LocalPath is an absolute path to the input data and must begin with /opt/ml/processing/. LocalPath is a required parameter when AppManaged is False (default).
- s3_compression_type ProcessingJobS3InputS3CompressionType - Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. Gzip can only be used when Pipe mode is specified as the S3InputMode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
- s3_data_distribution_type ProcessingJobS3InputS3DataDistributionType - Whether to distribute the data from Amazon S3 to all processing instances with FullyReplicated, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance.
- s3_input_mode ProcessingJobS3InputS3InputMode - Whether to use File or Pipe input mode. In File mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
- s3DataType "ManifestFile" | "S3Prefix" - Whether you use an S3Prefix or a ManifestFile for the data type. If you choose S3Prefix, S3Uri identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
- s3Uri String - The URI of the Amazon S3 prefix Amazon SageMaker downloads data required to run a processing job.
- localPath String - The local path in your container where you want Amazon SageMaker to write input data to. LocalPath is an absolute path to the input data and must begin with /opt/ml/processing/. LocalPath is a required parameter when AppManaged is False (default).
- s3CompressionType "None" | "Gzip" - Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. Gzip can only be used when Pipe mode is specified as the S3InputMode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
- s3DataDistributionType "FullyReplicated" | "ShardedByS3Key" - Whether to distribute the data from Amazon S3 to all processing instances with FullyReplicated, or whether the data from Amazon S3 is shared by Amazon S3 key, downloading one shard of data to each processing instance.
- s3InputMode "File" | "Pipe" - Whether to use File or Pipe input mode. In File mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
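Two of the constraints above are easy to get wrong: LocalPath must live under /opt/ml/processing/, and Gzip compression is only valid with Pipe input mode. The sketch below, using hypothetical placeholder values and the Python SDK's snake_case names, checks both before a job is declared:

```python
def validate_s3_input(s3_input: dict) -> None:
    """Check a few documented constraints on ProcessingJobS3Input (sketch only)."""
    assert s3_input["s3_data_type"] in {"ManifestFile", "S3Prefix"}
    # LocalPath must be an absolute path under /opt/ml/processing/.
    assert s3_input["local_path"].startswith("/opt/ml/processing/")
    # Gzip compression is only valid together with Pipe input mode.
    if s3_input.get("s3_compression_type") == "Gzip":
        assert s3_input.get("s3_input_mode") == "Pipe"

# Hypothetical input configuration mirroring the properties above.
s3_input = {
    "s3_data_type": "S3Prefix",
    "s3_uri": "s3://my-bucket/input/",            # placeholder prefix
    "local_path": "/opt/ml/processing/input",
    "s3_input_mode": "Pipe",                      # "File" | "Pipe"
    "s3_compression_type": "Gzip",                # "None" | "Gzip"
    "s3_data_distribution_type": "FullyReplicated",
}
validate_s3_input(s3_input)
```

Switching s3_input_mode to "File" while keeping "Gzip" would trip the last assertion, which is exactly the combination the documentation rules out.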
ProcessingJobS3InputS3CompressionType, ProcessingJobS3InputS3CompressionTypeArgs
- None None
- Gzip Gzip
- ProcessingJobS3InputS3CompressionTypeNone None
- ProcessingJobS3InputS3CompressionTypeGzip Gzip
- None None
- Gzip Gzip
- None None
- Gzip Gzip
- NONE None
- GZIP Gzip
- "None" None
- "Gzip" Gzip
ProcessingJobS3InputS3DataDistributionType, ProcessingJobS3InputS3DataDistributionTypeArgs
- FullyReplicated FullyReplicated
- ShardedByS3Key ShardedByS3Key
- ProcessingJobS3InputS3DataDistributionTypeFullyReplicated FullyReplicated
- ProcessingJobS3InputS3DataDistributionTypeShardedByS3Key ShardedByS3Key
- FullyReplicated FullyReplicated
- ShardedByS3Key ShardedByS3Key
- FullyReplicated FullyReplicated
- ShardedByS3Key ShardedByS3Key
- FULLY_REPLICATED FullyReplicated
- SHARDED_BY_S3_KEY ShardedByS3Key
- "FullyReplicated" FullyReplicated
- "ShardedByS3Key" ShardedByS3Key
ProcessingJobS3InputS3DataType, ProcessingJobS3InputS3DataTypeArgs
- ManifestFile ManifestFile
- S3Prefix S3Prefix
- ProcessingJobS3InputS3DataTypeManifestFile ManifestFile
- ProcessingJobS3InputS3DataTypeS3Prefix S3Prefix
- ManifestFile ManifestFile
- S3Prefix S3Prefix
- ManifestFile ManifestFile
- S3Prefix S3Prefix
- MANIFEST_FILE ManifestFile
- S3_PREFIX S3Prefix
- "ManifestFile" ManifestFile
- "S3Prefix" S3Prefix
ProcessingJobS3InputS3InputMode, ProcessingJobS3InputS3InputModeArgs
- File File
- Pipe Pipe
- ProcessingJobS3InputS3InputModeFile File
- ProcessingJobS3InputS3InputModePipe Pipe
- File File
- Pipe Pipe
- File File
- Pipe Pipe
- FILE File
- PIPE Pipe
- "File" File
- "Pipe" Pipe
ProcessingJobS3Output, ProcessingJobS3OutputArgs
Configuration for uploading output data to Amazon S3 from the processing container.
- S3UploadMode Pulumi.AwsNative.SageMaker.ProcessingJobS3OutputS3UploadMode - Whether to upload the results of the processing job continuously or after the job completes.
- S3Uri string - A URI that identifies the Amazon S3 bucket where you want Amazon SageMaker to save the results of a processing job.
- LocalPath string - The local path of a directory where you want Amazon SageMaker to upload its contents to Amazon S3. LocalPath is an absolute path to a directory containing output files. This directory will be created by the platform and exist when your container's entrypoint is invoked.
- S3UploadMode ProcessingJobS3OutputS3UploadMode - Whether to upload the results of the processing job continuously or after the job completes.
- S3Uri string - A URI that identifies the Amazon S3 bucket where you want Amazon SageMaker to save the results of a processing job.
- LocalPath string - The local path of a directory where you want Amazon SageMaker to upload its contents to Amazon S3. LocalPath is an absolute path to a directory containing output files. This directory will be created by the platform and exist when your container's entrypoint is invoked.
- s3UploadMode ProcessingJobS3OutputS3UploadMode - Whether to upload the results of the processing job continuously or after the job completes.
- s3Uri String - A URI that identifies the Amazon S3 bucket where you want Amazon SageMaker to save the results of a processing job.
- localPath String - The local path of a directory where you want Amazon SageMaker to upload its contents to Amazon S3. LocalPath is an absolute path to a directory containing output files. This directory will be created by the platform and exist when your container's entrypoint is invoked.
- s3UploadMode ProcessingJobS3OutputS3UploadMode - Whether to upload the results of the processing job continuously or after the job completes.
- s3Uri string - A URI that identifies the Amazon S3 bucket where you want Amazon SageMaker to save the results of a processing job.
- localPath string - The local path of a directory where you want Amazon SageMaker to upload its contents to Amazon S3. LocalPath is an absolute path to a directory containing output files. This directory will be created by the platform and exist when your container's entrypoint is invoked.
- s3_upload_mode ProcessingJobS3OutputS3UploadMode - Whether to upload the results of the processing job continuously or after the job completes.
- s3_uri str - A URI that identifies the Amazon S3 bucket where you want Amazon SageMaker to save the results of a processing job.
- local_path str - The local path of a directory where you want Amazon SageMaker to upload its contents to Amazon S3. LocalPath is an absolute path to a directory containing output files. This directory will be created by the platform and exist when your container's entrypoint is invoked.
- s3UploadMode "Continuous" | "EndOfJob" - Whether to upload the results of the processing job continuously or after the job completes.
- s3Uri String - A URI that identifies the Amazon S3 bucket where you want Amazon SageMaker to save the results of a processing job.
- localPath String - The local path of a directory where you want Amazon SageMaker to upload its contents to Amazon S3. LocalPath is an absolute path to a directory containing output files. This directory will be created by the platform and exist when your container's entrypoint is invoked.
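The output side mirrors the input side. A minimal sketch of the shape, with placeholder bucket and path values and the Python SDK's snake_case names:

```python
# Hypothetical output configuration; keys mirror the ProcessingJobS3Output
# properties above.
s3_output = {
    "s3_uri": "s3://my-bucket/processing-output/",   # placeholder bucket/prefix
    "local_path": "/opt/ml/processing/output",       # absolute directory path in the container
    "s3_upload_mode": "EndOfJob",                    # "Continuous" | "EndOfJob"
}

# Light sanity checks matching the descriptions above:
assert s3_output["s3_upload_mode"] in {"Continuous", "EndOfJob"}
assert s3_output["local_path"].startswith("/")       # LocalPath is an absolute path
```

"Continuous" uploads results while the job runs, which suits long jobs whose partial output is useful; "EndOfJob" uploads once after completion.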
ProcessingJobS3OutputS3UploadMode, ProcessingJobS3OutputS3UploadModeArgs
- Continuous Continuous
- EndOfJob EndOfJob
- ProcessingJobS3OutputS3UploadModeContinuous Continuous
- ProcessingJobS3OutputS3UploadModeEndOfJob EndOfJob
- Continuous Continuous
- EndOfJob EndOfJob
- Continuous Continuous
- EndOfJob EndOfJob
- CONTINUOUS Continuous
- END_OF_JOB EndOfJob
- "Continuous" Continuous
- "EndOfJob" EndOfJob
ProcessingJobStatus, ProcessingJobStatusArgs
- Completed Completed
- InProgress InProgress
- Stopping Stopping
- Stopped Stopped
- Failed Failed
- ProcessingJobStatusCompleted Completed
- ProcessingJobStatusInProgress InProgress
- ProcessingJobStatusStopping Stopping
- ProcessingJobStatusStopped Stopped
- ProcessingJobStatusFailed Failed
- Completed Completed
- InProgress InProgress
- Stopping Stopping
- Stopped Stopped
- Failed Failed
- Completed Completed
- InProgress InProgress
- Stopping Stopping
- Stopped Stopped
- Failed Failed
- COMPLETED Completed
- IN_PROGRESS InProgress
- STOPPING Stopping
- STOPPED Stopped
- FAILED Failed
- "Completed" Completed
- "InProgress" InProgress
- "Stopping" Stopping
- "Stopped" Stopped
- "Failed" Failed
ProcessingJobStoppingCondition, ProcessingJobStoppingConditionArgs
Configures conditions under which the processing job should be stopped, such as how long the processing job has been running. After the condition is met, the processing job is stopped.
- MaxRuntimeInSeconds int - Specifies the maximum runtime in seconds.
- MaxRuntimeInSeconds int - Specifies the maximum runtime in seconds.
- maxRuntimeInSeconds Integer - Specifies the maximum runtime in seconds.
- maxRuntimeInSeconds number - Specifies the maximum runtime in seconds.
- max_runtime_in_seconds int - Specifies the maximum runtime in seconds.
- maxRuntimeInSeconds Number - Specifies the maximum runtime in seconds.
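Since the stopping condition has a single integer property, a one-hour cap can be sketched as (snake_case as in the Python SDK; the one-hour figure is an arbitrary example):

```python
# Sketch: cap a processing job's runtime at one hour.
# max_runtime_in_seconds is the only ProcessingJobStoppingCondition property.
stopping_condition = {"max_runtime_in_seconds": 60 * 60}
```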
ProcessingJobVpcConfig, ProcessingJobVpcConfigArgs
Specifies an Amazon Virtual Private Cloud (VPC) that your SageMaker jobs, hosted models, and compute resources have access to. You can control access to and from your resources by configuring a VPC. For more information, see https://docs.aws.amazon.com/sagemaker/latest/dg/infrastructure-give-access.html
- SecurityGroupIds List<string> - The VPC security group IDs, in the form 'sg-xxxxxxxx'. Specify the security groups for the VPC that is specified in the 'Subnets' field.
- Subnets List<string> - The ID of the subnets in the VPC to which you want to connect your training job or model. For information about the availability of specific instance types, see https://docs.aws.amazon.com/sagemaker/latest/dg/regions-quotas.html
- SecurityGroupIds []string - The VPC security group IDs, in the form 'sg-xxxxxxxx'. Specify the security groups for the VPC that is specified in the 'Subnets' field.
- Subnets []string - The ID of the subnets in the VPC to which you want to connect your training job or model. For information about the availability of specific instance types, see https://docs.aws.amazon.com/sagemaker/latest/dg/regions-quotas.html
- securityGroupIds List<String> - The VPC security group IDs, in the form 'sg-xxxxxxxx'. Specify the security groups for the VPC that is specified in the 'Subnets' field.
- subnets List<String> - The ID of the subnets in the VPC to which you want to connect your training job or model. For information about the availability of specific instance types, see https://docs.aws.amazon.com/sagemaker/latest/dg/regions-quotas.html
- securityGroupIds string[] - The VPC security group IDs, in the form 'sg-xxxxxxxx'. Specify the security groups for the VPC that is specified in the 'Subnets' field.
- subnets string[] - The ID of the subnets in the VPC to which you want to connect your training job or model. For information about the availability of specific instance types, see https://docs.aws.amazon.com/sagemaker/latest/dg/regions-quotas.html
- security_group_ids Sequence[str] - The VPC security group IDs, in the form 'sg-xxxxxxxx'. Specify the security groups for the VPC that is specified in the 'Subnets' field.
- subnets Sequence[str] - The ID of the subnets in the VPC to which you want to connect your training job or model. For information about the availability of specific instance types, see https://docs.aws.amazon.com/sagemaker/latest/dg/regions-quotas.html
- securityGroupIds List<String> - The VPC security group IDs, in the form 'sg-xxxxxxxx'. Specify the security groups for the VPC that is specified in the 'Subnets' field.
- subnets List<String> - The ID of the subnets in the VPC to which you want to connect your training job or model. For information about the availability of specific instance types, see https://docs.aws.amazon.com/sagemaker/latest/dg/regions-quotas.html
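Both VpcConfig properties are plain string lists. A minimal sketch with placeholder IDs (the IDs below are fabricated examples, not real resources), including a light check of the 'sg-xxxxxxxx' form:

```python
import re

# Hypothetical VPC configuration; both IDs below are placeholders.
vpc_config = {
    "security_group_ids": ["sg-0123456789abcdef0"],
    "subnets": ["subnet-0123456789abcdef0"],
}

# Security group IDs take the form 'sg-xxxxxxxx' (an 'sg-' prefix and a hex suffix).
assert all(re.fullmatch(r"sg-[0-9a-f]+", s) for s in vpc_config["security_group_ids"])
```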
Package Details
- Repository
- AWS Native pulumi/pulumi-aws-native
- License
- Apache-2.0