
Google Cloud Native is in preview. Google Cloud Classic is fully supported.

Google Cloud Native v0.32.0 published on Wednesday, Nov 29, 2023 by Pulumi

google-native.dataplex/v1.Task

    Creates a task resource within a lake. Auto-naming is currently not supported for this resource.

    Create Task Resource

    new Task(name: string, args: TaskArgs, opts?: CustomResourceOptions);
    @overload
    def Task(resource_name: str,
             opts: Optional[ResourceOptions] = None,
             description: Optional[str] = None,
             display_name: Optional[str] = None,
             execution_spec: Optional[GoogleCloudDataplexV1TaskExecutionSpecArgs] = None,
             labels: Optional[Mapping[str, str]] = None,
             lake_id: Optional[str] = None,
             location: Optional[str] = None,
             notebook: Optional[GoogleCloudDataplexV1TaskNotebookTaskConfigArgs] = None,
             project: Optional[str] = None,
             spark: Optional[GoogleCloudDataplexV1TaskSparkTaskConfigArgs] = None,
             task_id: Optional[str] = None,
             trigger_spec: Optional[GoogleCloudDataplexV1TaskTriggerSpecArgs] = None)
    @overload
    def Task(resource_name: str,
             args: TaskArgs,
             opts: Optional[ResourceOptions] = None)
    func NewTask(ctx *Context, name string, args TaskArgs, opts ...ResourceOption) (*Task, error)
    public Task(string name, TaskArgs args, CustomResourceOptions? opts = null)
    public Task(String name, TaskArgs args)
    public Task(String name, TaskArgs args, CustomResourceOptions options)
    
    type: google-native:dataplex/v1:Task
    properties: # The arguments to resource properties.
    options: # Bag of options to control resource's behavior.
    
    
    name string
    The unique name of the resource.
    args TaskArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    resource_name str
    The unique name of the resource.
    args TaskArgs
    The arguments to resource properties.
    opts ResourceOptions
    Bag of options to control resource's behavior.
    ctx Context
    Context object for the current deployment.
    name string
    The unique name of the resource.
    args TaskArgs
    The arguments to resource properties.
    opts ResourceOption
    Bag of options to control resource's behavior.
    name string
    The unique name of the resource.
    args TaskArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    name String
    The unique name of the resource.
    args TaskArgs
    The arguments to resource properties.
    options CustomResourceOptions
    Bag of options to control resource's behavior.
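
    As a sketch, a Pulumi YAML program creating a recurring Spark task could look like the following. All names and values are placeholders, and the triggerSpec and spark sub-fields (type, schedule, mainJarFileUri) are assumed from the underlying Dataplex v1 API rather than documented in this section:

    ```yaml
    name: dataplex-task-example
    runtime: yaml
    resources:
      sparkTask:
        type: google-native:dataplex/v1:Task
        properties:
          project: my-project            # placeholder project ID
          location: us-central1          # placeholder region
          lakeId: my-lake                # placeholder lake ID
          taskId: my-spark-task          # required task identifier
          description: Hourly Spark aggregation job
          triggerSpec:                   # sub-fields assumed from the Dataplex v1 API
            type: RECURRING
            schedule: "0 * * * *"        # hourly, cron format
          executionSpec:
            serviceAccount: dataplex-sa@my-project.iam.gserviceaccount.com
            args:
              TASK_ARGS: "input,output"  # positional args as a comma-separated string
          spark:                         # sub-field assumed from the Dataplex v1 API
            mainJarFileUri: gs://my-bucket/jobs/aggregate.jar
    ```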

    Task Resource Properties

    To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

    Inputs

    The Task resource accepts the following input properties:

    ExecutionSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskExecutionSpec

    Spec related to how a task is executed.

    LakeId string
    TaskId string

    Required. Task identifier.

    TriggerSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskTriggerSpec

    Spec related to how often and when a task should be triggered.

    Description string

    Optional. Description of the task.

    DisplayName string

    Optional. User-friendly display name.

    Labels Dictionary<string, string>

    Optional. User-defined labels for the task.

    Location string
    Notebook Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskNotebookTaskConfig

    Config related to running scheduled Notebooks.

    Project string
    Spark Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskSparkTaskConfig

    Config related to running custom Spark tasks.

    ExecutionSpec GoogleCloudDataplexV1TaskExecutionSpecArgs

    Spec related to how a task is executed.

    LakeId string
    TaskId string

    Required. Task identifier.

    TriggerSpec GoogleCloudDataplexV1TaskTriggerSpecArgs

    Spec related to how often and when a task should be triggered.

    Description string

    Optional. Description of the task.

    DisplayName string

    Optional. User-friendly display name.

    Labels map[string]string

    Optional. User-defined labels for the task.

    Location string
    Notebook GoogleCloudDataplexV1TaskNotebookTaskConfigArgs

    Config related to running scheduled Notebooks.

    Project string
    Spark GoogleCloudDataplexV1TaskSparkTaskConfigArgs

    Config related to running custom Spark tasks.

    executionSpec GoogleCloudDataplexV1TaskExecutionSpec

    Spec related to how a task is executed.

    lakeId String
    taskId String

    Required. Task identifier.

    triggerSpec GoogleCloudDataplexV1TaskTriggerSpec

    Spec related to how often and when a task should be triggered.

    description String

    Optional. Description of the task.

    displayName String

    Optional. User-friendly display name.

    labels Map<String,String>

    Optional. User-defined labels for the task.

    location String
    notebook GoogleCloudDataplexV1TaskNotebookTaskConfig

    Config related to running scheduled Notebooks.

    project String
    spark GoogleCloudDataplexV1TaskSparkTaskConfig

    Config related to running custom Spark tasks.

    executionSpec GoogleCloudDataplexV1TaskExecutionSpec

    Spec related to how a task is executed.

    lakeId string
    taskId string

    Required. Task identifier.

    triggerSpec GoogleCloudDataplexV1TaskTriggerSpec

    Spec related to how often and when a task should be triggered.

    description string

    Optional. Description of the task.

    displayName string

    Optional. User-friendly display name.

    labels {[key: string]: string}

    Optional. User-defined labels for the task.

    location string
    notebook GoogleCloudDataplexV1TaskNotebookTaskConfig

    Config related to running scheduled Notebooks.

    project string
    spark GoogleCloudDataplexV1TaskSparkTaskConfig

    Config related to running custom Spark tasks.

    execution_spec GoogleCloudDataplexV1TaskExecutionSpecArgs

    Spec related to how a task is executed.

    lake_id str
    task_id str

    Required. Task identifier.

    trigger_spec GoogleCloudDataplexV1TaskTriggerSpecArgs

    Spec related to how often and when a task should be triggered.

    description str

    Optional. Description of the task.

    display_name str

    Optional. User-friendly display name.

    labels Mapping[str, str]

    Optional. User-defined labels for the task.

    location str
    notebook GoogleCloudDataplexV1TaskNotebookTaskConfigArgs

    Config related to running scheduled Notebooks.

    project str
    spark GoogleCloudDataplexV1TaskSparkTaskConfigArgs

    Config related to running custom Spark tasks.

    executionSpec Property Map

    Spec related to how a task is executed.

    lakeId String
    taskId String

    Required. Task identifier.

    triggerSpec Property Map

    Spec related to how often and when a task should be triggered.

    description String

    Optional. Description of the task.

    displayName String

    Optional. User-friendly display name.

    labels Map<String>

    Optional. User-defined labels for the task.

    location String
    notebook Property Map

    Config related to running scheduled Notebooks.

    project String
    spark Property Map

    Config related to running custom Spark tasks.

    Outputs

    All input properties are implicitly available as output properties. Additionally, the Task resource produces the following output properties:

    CreateTime string

    The time when the task was created.

    ExecutionStatus Pulumi.GoogleNative.Dataplex.V1.Outputs.GoogleCloudDataplexV1TaskExecutionStatusResponse

    Status of the latest task executions.

    Id string

    The provider-assigned unique ID for this managed resource.

    Name string

    The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.

    State string

    Current state of the task.

    Uid string

    System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.

    UpdateTime string

    The time when the task was last updated.

    CreateTime string

    The time when the task was created.

    ExecutionStatus GoogleCloudDataplexV1TaskExecutionStatusResponse

    Status of the latest task executions.

    Id string

    The provider-assigned unique ID for this managed resource.

    Name string

    The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.

    State string

    Current state of the task.

    Uid string

    System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.

    UpdateTime string

    The time when the task was last updated.

    createTime String

    The time when the task was created.

    executionStatus GoogleCloudDataplexV1TaskExecutionStatusResponse

    Status of the latest task executions.

    id String

    The provider-assigned unique ID for this managed resource.

    name String

    The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.

    state String

    Current state of the task.

    uid String

    System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.

    updateTime String

    The time when the task was last updated.

    createTime string

    The time when the task was created.

    executionStatus GoogleCloudDataplexV1TaskExecutionStatusResponse

    Status of the latest task executions.

    id string

    The provider-assigned unique ID for this managed resource.

    name string

    The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.

    state string

    Current state of the task.

    uid string

    System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.

    updateTime string

    The time when the task was last updated.

    create_time str

    The time when the task was created.

    execution_status GoogleCloudDataplexV1TaskExecutionStatusResponse

    Status of the latest task executions.

    id str

    The provider-assigned unique ID for this managed resource.

    name str

    The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.

    state str

    Current state of the task.

    uid str

    System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.

    update_time str

    The time when the task was last updated.

    createTime String

    The time when the task was created.

    executionStatus Property Map

    Status of the latest task executions.

    id String

    The provider-assigned unique ID for this managed resource.

    name String

    The relative resource name of the task, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}.

    state String

    Current state of the task.

    uid String

    System generated globally unique ID for the task. This ID will be different if the task is deleted and re-created with the same name.

    updateTime String

    The time when the task was last updated.
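
    Because every input is also available as an output, a program can export these properties with standard Pulumi YAML interpolation. A minimal sketch, assuming a resource named sparkTask declared elsewhere in the program:

    ```yaml
    outputs:
      taskName: ${sparkTask.name}    # relative resource name assigned by the service
      taskState: ${sparkTask.state}  # current state of the task
      taskUid: ${sparkTask.uid}      # system-generated globally unique ID
    ```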

    Supporting Types

    GoogleCloudDataplexV1JobResponse, GoogleCloudDataplexV1JobResponseArgs

    EndTime string

    The time when the job ended.

    ExecutionSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskExecutionSpecResponse

    Spec related to how a task is executed.

    Labels Dictionary<string, string>

    User-defined labels for the task.

    Message string

    Additional information about the current state.

    Name string

    The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.

    RetryCount int

    The number of times the job has been retried (excluding the initial attempt).

    Service string

    The underlying service running a job.

    ServiceJob string

    The full resource name for the job run under a particular service.

    StartTime string

    The time when the job was started.

    State string

    Execution state for the job.

    Trigger string

    Job execution trigger.

    Uid string

    System generated globally unique ID for the job.

    EndTime string

    The time when the job ended.

    ExecutionSpec GoogleCloudDataplexV1TaskExecutionSpecResponse

    Spec related to how a task is executed.

    Labels map[string]string

    User-defined labels for the task.

    Message string

    Additional information about the current state.

    Name string

    The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.

    RetryCount int

    The number of times the job has been retried (excluding the initial attempt).

    Service string

    The underlying service running a job.

    ServiceJob string

    The full resource name for the job run under a particular service.

    StartTime string

    The time when the job was started.

    State string

    Execution state for the job.

    Trigger string

    Job execution trigger.

    Uid string

    System generated globally unique ID for the job.

    endTime String

    The time when the job ended.

    executionSpec GoogleCloudDataplexV1TaskExecutionSpecResponse

    Spec related to how a task is executed.

    labels Map<String,String>

    User-defined labels for the task.

    message String

    Additional information about the current state.

    name String

    The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.

    retryCount Integer

    The number of times the job has been retried (excluding the initial attempt).

    service String

    The underlying service running a job.

    serviceJob String

    The full resource name for the job run under a particular service.

    startTime String

    The time when the job was started.

    state String

    Execution state for the job.

    trigger String

    Job execution trigger.

    uid String

    System generated globally unique ID for the job.

    endTime string

    The time when the job ended.

    executionSpec GoogleCloudDataplexV1TaskExecutionSpecResponse

    Spec related to how a task is executed.

    labels {[key: string]: string}

    User-defined labels for the task.

    message string

    Additional information about the current state.

    name string

    The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.

    retryCount number

    The number of times the job has been retried (excluding the initial attempt).

    service string

    The underlying service running a job.

    serviceJob string

    The full resource name for the job run under a particular service.

    startTime string

    The time when the job was started.

    state string

    Execution state for the job.

    trigger string

    Job execution trigger.

    uid string

    System generated globally unique ID for the job.

    end_time str

    The time when the job ended.

    execution_spec GoogleCloudDataplexV1TaskExecutionSpecResponse

    Spec related to how a task is executed.

    labels Mapping[str, str]

    User-defined labels for the task.

    message str

    Additional information about the current state.

    name str

    The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.

    retry_count int

    The number of times the job has been retried (excluding the initial attempt).

    service str

    The underlying service running a job.

    service_job str

    The full resource name for the job run under a particular service.

    start_time str

    The time when the job was started.

    state str

    Execution state for the job.

    trigger str

    Job execution trigger.

    uid str

    System generated globally unique ID for the job.

    endTime String

    The time when the job ended.

    executionSpec Property Map

    Spec related to how a task is executed.

    labels Map<String>

    User-defined labels for the task.

    message String

    Additional information about the current state.

    name String

    The relative resource name of the job, of the form: projects/{project_number}/locations/{location_id}/lakes/{lake_id}/tasks/{task_id}/jobs/{job_id}.

    retryCount Number

    The number of times the job has been retried (excluding the initial attempt).

    service String

    The underlying service running a job.

    serviceJob String

    The full resource name for the job run under a particular service.

    startTime String

    The time when the job was started.

    state String

    Execution state for the job.

    trigger String

    Job execution trigger.

    uid String

    System generated globally unique ID for the job.

    GoogleCloudDataplexV1TaskExecutionSpec, GoogleCloudDataplexV1TaskExecutionSpecArgs

    ServiceAccount string

    Service account to use to execute a task. If not provided, the default Compute service account for the project is used.

    Args Dictionary<string, string>

    Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.

    KmsKey string

    Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.

    MaxJobExecutionLifetime string

    Optional. The maximum duration after which the job execution expires.

    Project string

    Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.

    ServiceAccount string

    Service account to use to execute a task. If not provided, the default Compute service account for the project is used.

    Args map[string]string

    Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.

    KmsKey string

    Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.

    MaxJobExecutionLifetime string

    Optional. The maximum duration after which the job execution expires.

    Project string

    Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.

    serviceAccount String

    Service account to use to execute a task. If not provided, the default Compute service account for the project is used.

    args Map<String,String>

    Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.

    kmsKey String

    Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.

    maxJobExecutionLifetime String

    Optional. The maximum duration after which the job execution expires.

    project String

    Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.

    serviceAccount string

    Service account to use to execute a task. If not provided, the default Compute service account for the project is used.

    args {[key: string]: string}

    Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.

    kmsKey string

    Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.

    maxJobExecutionLifetime string

    Optional. The maximum duration after which the job execution expires.

    project string

    Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.

    service_account str

    Service account to use to execute a task. If not provided, the default Compute service account for the project is used.

    args Mapping[str, str]

    Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.

    kms_key str

    Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.

    max_job_execution_lifetime str

    Optional. The maximum duration after which the job execution expires.

    project str

    Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.

    serviceAccount String

    Service account to use to execute a task. If not provided, the default Compute service account for the project is used.

    args Map<String>

    Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.

    kmsKey String

    Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.

    maxJobExecutionLifetime String

    Optional. The maximum duration after which the job execution expires.

    project String

    Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.
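
    The placeholder interpolation and TASK_ARGS behavior described above can be illustrated with a hypothetical executionSpec fragment. Every name and value here is a placeholder, and the duration format for maxJobExecutionLifetime is an assumption:

    ```yaml
    executionSpec:
      serviceAccount: dataplex-sa@my-project.iam.gserviceaccount.com
      kmsKey: projects/1234567890/locations/us-central1/keyRings/my-ring/cryptoKeys/my-key
      maxJobExecutionLifetime: 3600s          # assumed duration format
      args:
        output_table: results_${task_id}      # ${task_id} is interpolated before launch
        run_label: ${job_time}                # ${job_time} is interpolated before launch
        TASK_ARGS: "gs://my-bucket/in,gs://my-bucket/out"  # positional args, passed last
    ```

    Because other keys are present alongside TASK_ARGS in this sketch, the positional arguments would be passed to the driver as the last argument, as the description above specifies.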

    GoogleCloudDataplexV1TaskExecutionSpecResponse, GoogleCloudDataplexV1TaskExecutionSpecResponseArgs

    Args Dictionary<string, string>

    Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.

    KmsKey string

    Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.

    MaxJobExecutionLifetime string

    Optional. The maximum duration after which the job execution expires.

    Project string

    Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.

    ServiceAccount string

    Service account to use to execute a task. If not provided, the default Compute service account for the project is used.

    Args map[string]string

    Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.

    KmsKey string

    Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.

    MaxJobExecutionLifetime string

    Optional. The maximum duration after which the job execution expires.

    Project string

    Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.

    ServiceAccount string

    Service account to use to execute a task. If not provided, the default Compute service account for the project is used.

    args Map<String,String>

    Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.

    kmsKey String

    Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.

    maxJobExecutionLifetime String

    Optional. The maximum duration after which the job execution expires.

    project String

    Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.

    serviceAccount String

    Service account to use to execute a task. If not provided, the default Compute service account for the project is used.

    args {[key: string]: string}

    Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.

    kmsKey string

    Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.

    maxJobExecutionLifetime string

    Optional. The maximum duration after which the job execution expires.

    project string

    Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.

    serviceAccount string

    Service account to use to execute a task. If not provided, the default Compute service account for the project is used.

    args Mapping[str, str]

    Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: - ${task_id} - ${job_time} To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. In case of other keys being present in the args, then TASK_ARGS will be passed as the last argument.

    kms_key str

    Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.

    max_job_execution_lifetime str

    Optional. The maximum duration after which the job execution expires.

    project str

    Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.

    service_account str

    Service account to use to execute a task. If not provided, the default Compute service account for the project is used.

    args Map<String>

    Optional. The arguments to pass to the task. The args can use placeholders of the format ${placeholder} as part of a key/value string. These will be interpolated before passing the args to the driver. Currently supported placeholders: ${task_id}, ${job_time}. To pass positional args, set the key as TASK_ARGS. The value should be a comma-separated string of all the positional arguments. To use a delimiter other than a comma, refer to https://cloud.google.com/sdk/gcloud/reference/topic/escaping. If other keys are present in the args, TASK_ARGS will be passed as the last argument.

    kmsKey String

    Optional. The Cloud KMS key to use for encryption, of the form: projects/{project_number}/locations/{location_id}/keyRings/{key-ring-name}/cryptoKeys/{key-name}.

    maxJobExecutionLifetime String

    Optional. The maximum duration after which the job execution expires.

    project String

    Optional. The project in which jobs are run. By default, the project containing the Lake is used. If a project is provided, the ExecutionSpec.service_account must belong to this project.

    serviceAccount String

    Service account to use to execute a task. If not provided, the default Compute service account for the project is used.
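
    The TASK_ARGS convention described above can be illustrated with a small sketch. This is plain Python, independent of any SDK, and the helper name build_task_args is hypothetical:

    ```python
    # Sketch: building an execution-spec args mapping for a Dataplex task.
    # The TASK_ARGS key carries positional arguments as a comma-separated
    # string; all other keys are passed as named arguments.

    def build_task_args(named: dict, positional: list) -> dict:
        """Hypothetical helper: merge named args with positional args under TASK_ARGS."""
        args = dict(named)
        if positional:
            # Positional args are joined with commas; see the gcloud
            # escaping topic for using a different delimiter.
            args["TASK_ARGS"] = ",".join(positional)
        return args

    args = build_task_args(
        {"--output_location": "gs://my-bucket/out", "run_id": "${task_id}"},
        ["input_a.csv", "input_b.csv"],
    )
    ```

    The ${task_id} placeholder is left verbatim here; per the field description, it is interpolated by the service before the args reach the driver.
    
    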

    GoogleCloudDataplexV1TaskExecutionStatusResponse, GoogleCloudDataplexV1TaskExecutionStatusResponseArgs

    LatestJob GoogleCloudDataplexV1JobResponse

    Latest job execution.

    UpdateTime string

    Last update time of the status.

    latestJob GoogleCloudDataplexV1JobResponse

    Latest job execution.

    updateTime String

    Last update time of the status.

    latestJob GoogleCloudDataplexV1JobResponse

    Latest job execution.

    updateTime string

    Last update time of the status.

    latest_job GoogleCloudDataplexV1JobResponse

    Latest job execution.

    update_time str

    Last update time of the status.

    latestJob Property Map

    Latest job execution.

    updateTime String

    Last update time of the status.

    GoogleCloudDataplexV1TaskInfrastructureSpec, GoogleCloudDataplexV1TaskInfrastructureSpecArgs

    batch Property Map

    Compute resources needed for a Task when using Dataproc Serverless.

    containerImage Property Map

    Container Image Runtime Configuration.

    vpcNetwork Property Map

    VPC network.

    GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResources, GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesArgs

    ExecutorsCount int

    Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2

    MaxExecutorsCount int

    Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000

    ExecutorsCount int

    Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2

    MaxExecutorsCount int

    Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000

    executorsCount Integer

    Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2

    maxExecutorsCount Integer

    Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000

    executorsCount number

    Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2

    maxExecutorsCount number

    Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000

    executors_count int

    Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2

    max_executors_count int

    Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000

    executorsCount Number

    Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2

    maxExecutorsCount Number

    Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000
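
    The bounds and the auto-scaling condition stated in the field descriptions above can be sketched as a small validation helper. This is a plain-Python illustration; the function name is hypothetical and the bounds (2–100 executors, 2–1000 max) are taken directly from the descriptions:

    ```python
    # Sketch: validating Dataproc Serverless batch compute resources per the
    # documented bounds. Auto-scaling is enabled when
    # max_executors_count > executors_count.

    def validate_batch_resources(executors_count: int = 2,
                                 max_executors_count: int = 1000) -> bool:
        """Return True if auto-scaling would be enabled (max > count)."""
        if not 2 <= executors_count <= 100:
            raise ValueError("executors_count must be between 2 and 100")
        if not 2 <= max_executors_count <= 1000:
            raise ValueError("max_executors_count must be between 2 and 1000")
        return max_executors_count > executors_count
    ```

    With the defaults (2 executors, max 1000), auto-scaling is enabled; setting both fields to the same value pins the executor count.
    
    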

    GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponse, GoogleCloudDataplexV1TaskInfrastructureSpecBatchComputeResourcesResponseArgs

    ExecutorsCount int

    Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2

    MaxExecutorsCount int

    Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000

    ExecutorsCount int

    Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2

    MaxExecutorsCount int

    Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000

    executorsCount Integer

    Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2

    maxExecutorsCount Integer

    Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000

    executorsCount number

    Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2

    maxExecutorsCount number

    Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000

    executors_count int

    Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2

    max_executors_count int

    Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000

    executorsCount Number

    Optional. Total number of job executors. Executor Count should be between 2 and 100. Default=2

    maxExecutorsCount Number

    Optional. Max configurable executors. If max_executors_count > executors_count, then auto-scaling is enabled. Max Executor Count should be between 2 and 1000. Default=1000

    GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntime, GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeArgs

    Image string

    Optional. Container image to use.

    JavaJars List<string>

    Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar

    Properties Dictionary<string, string>

    Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).

    PythonPackages List<string>

    Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz

    Image string

    Optional. Container image to use.

    JavaJars []string

    Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar

    Properties map[string]string

    Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).

    PythonPackages []string

    Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz

    image String

    Optional. Container image to use.

    javaJars List<String>

    Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar

    properties Map<String,String>

    Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).

    pythonPackages List<String>

    Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz

    image string

    Optional. Container image to use.

    javaJars string[]

    Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar

    properties {[key: string]: string}

    Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).

    pythonPackages string[]

    Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz

    image str

    Optional. Container image to use.

    java_jars Sequence[str]

    Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar

    properties Mapping[str, str]

    Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).

    python_packages Sequence[str]

    Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz

    image String

    Optional. Container image to use.

    javaJars List<String>

    Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar

    properties Map<String>

    Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).

    pythonPackages List<String>

    Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz
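
    The prefix:property key format described for the properties field (for example core:hadoop.tmp.dir) can be sketched as a small parser. This is an illustrative helper, not part of any SDK:

    ```python
    # Sketch: splitting a Dataproc-style "prefix:property" key into its parts,
    # e.g. "core:hadoop.tmp.dir" -> ("core", "hadoop.tmp.dir").

    def split_property_key(key: str):
        """Hypothetical helper: split on the first colon only."""
        prefix, _, prop = key.partition(":")
        if not prop:
            raise ValueError(f"expected 'prefix:property', got {key!r}")
        return prefix, prop

    properties = {
        "core:hadoop.tmp.dir": "/tmp/hadoop",
        "spark:spark.executor.memory": "4g",
    }
    parsed = {split_property_key(k): v for k, v in properties.items()}
    ```

    Splitting on the first colon only matters because the property part itself is dotted (and may in principle contain further colons).
    
    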

    GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponse, GoogleCloudDataplexV1TaskInfrastructureSpecContainerImageRuntimeResponseArgs

    Image string

    Optional. Container image to use.

    JavaJars List<string>

    Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar

    Properties Dictionary<string, string>

    Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).

    PythonPackages List<string>

    Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz

    Image string

    Optional. Container image to use.

    JavaJars []string

    Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar

    Properties map[string]string

    Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).

    PythonPackages []string

    Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz

    image String

    Optional. Container image to use.

    javaJars List<String>

    Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar

    properties Map<String,String>

    Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).

    pythonPackages List<String>

    Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz

    image string

    Optional. Container image to use.

    javaJars string[]

    Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar

    properties {[key: string]: string}

    Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).

    pythonPackages string[]

    Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz

    image str

    Optional. Container image to use.

    java_jars Sequence[str]

    Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar

    properties Mapping[str, str]

    Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).

    python_packages Sequence[str]

    Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz

    image String

    Optional. Container image to use.

    javaJars List<String>

    Optional. A list of Java JARS to add to the classpath. Valid input includes Cloud Storage URIs to Jar binaries. For example, gs://bucket-name/my/path/to/file.jar

    properties Map<String>

    Optional. Override to common configuration of open source components installed on the Dataproc cluster. The properties to set on daemon config files. Property keys are specified in prefix:property format, for example core:hadoop.tmp.dir. For more information, see Cluster properties (https://cloud.google.com/dataproc/docs/concepts/cluster-properties).

    pythonPackages List<String>

    Optional. A list of python packages to be installed. Valid formats include Cloud Storage URI to a PIP installable library. For example, gs://bucket-name/my/path/to/lib.tar.gz

    GoogleCloudDataplexV1TaskInfrastructureSpecResponse, GoogleCloudDataplexV1TaskInfrastructureSpecResponseArgs

    batch Property Map

    Compute resources needed for a Task when using Dataproc Serverless.

    containerImage Property Map

    Container Image Runtime Configuration.

    vpcNetwork Property Map

    VPC network.

    GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetwork, GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkArgs

    Network string

    Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.

    NetworkTags List<string>

    Optional. List of network tags to apply to the job.

    SubNetwork string

    Optional. The Cloud VPC sub-network in which the job is run.

    Network string

    Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.

    NetworkTags []string

    Optional. List of network tags to apply to the job.

    SubNetwork string

    Optional. The Cloud VPC sub-network in which the job is run.

    network String

    Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.

    networkTags List<String>

    Optional. List of network tags to apply to the job.

    subNetwork String

    Optional. The Cloud VPC sub-network in which the job is run.

    network string

    Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.

    networkTags string[]

    Optional. List of network tags to apply to the job.

    subNetwork string

    Optional. The Cloud VPC sub-network in which the job is run.

    network str

    Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.

    network_tags Sequence[str]

    Optional. List of network tags to apply to the job.

    sub_network str

    Optional. The Cloud VPC sub-network in which the job is run.

    network String

    Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.

    networkTags List<String>

    Optional. List of network tags to apply to the job.

    subNetwork String

    Optional. The Cloud VPC sub-network in which the job is run.

    GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponse, GoogleCloudDataplexV1TaskInfrastructureSpecVpcNetworkResponseArgs

    Network string

    Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.

    NetworkTags List<string>

    Optional. List of network tags to apply to the job.

    SubNetwork string

    Optional. The Cloud VPC sub-network in which the job is run.

    Network string

    Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.

    NetworkTags []string

    Optional. List of network tags to apply to the job.

    SubNetwork string

    Optional. The Cloud VPC sub-network in which the job is run.

    network String

    Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.

    networkTags List<String>

    Optional. List of network tags to apply to the job.

    subNetwork String

    Optional. The Cloud VPC sub-network in which the job is run.

    network string

    Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.

    networkTags string[]

    Optional. List of network tags to apply to the job.

    subNetwork string

    Optional. The Cloud VPC sub-network in which the job is run.

    network str

    Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.

    network_tags Sequence[str]

    Optional. List of network tags to apply to the job.

    sub_network str

    Optional. The Cloud VPC sub-network in which the job is run.

    network String

    Optional. The Cloud VPC network in which the job is run. By default, the Cloud VPC network named Default within the project is used.

    networkTags List<String>

    Optional. List of network tags to apply to the job.

    subNetwork String

    Optional. The Cloud VPC sub-network in which the job is run.

    GoogleCloudDataplexV1TaskNotebookTaskConfig, GoogleCloudDataplexV1TaskNotebookTaskConfigArgs

    Notebook string

    Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).

    ArchiveUris List<string>

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    FileUris List<string>

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    InfrastructureSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpec

    Optional. Infrastructure specification for the execution.

    Notebook string

    Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).

    ArchiveUris []string

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    FileUris []string

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    InfrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec

    Optional. Infrastructure specification for the execution.

    notebook String

    Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).

    archiveUris List<String>

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    fileUris List<String>

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec

    Optional. Infrastructure specification for the execution.

    notebook string

    Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).

    archiveUris string[]

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    fileUris string[]

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec

    Optional. Infrastructure specification for the execution.

    notebook str

    Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).

    archive_uris Sequence[str]

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    file_uris Sequence[str]

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructure_spec GoogleCloudDataplexV1TaskInfrastructureSpec

    Optional. Infrastructure specification for the execution.

    notebook String

    Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).

    archiveUris List<String>

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    fileUris List<String>

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructureSpec Property Map

    Optional. Infrastructure specification for the execution.
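
    Per the notebook field description above, execution args surface inside the notebook as environment variables named TASK_<key>. A minimal sketch of that mapping (the helper name is hypothetical):

    ```python
    # Sketch: how a notebook task's execution args become environment
    # variables, per the documented TASK_key=value convention.

    def args_to_env(args: dict) -> dict:
        """Hypothetical helper: prefix each arg key with TASK_."""
        return {f"TASK_{key}": value for key, value in args.items()}

    env = args_to_env({"input_path": "gs://my-bucket/data", "mode": "full"})
    ```

    Inside the notebook, the values would then be read with the usual environment lookup, e.g. os.environ["TASK_input_path"].
    
    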

    GoogleCloudDataplexV1TaskNotebookTaskConfigResponse, GoogleCloudDataplexV1TaskNotebookTaskConfigResponseArgs

    ArchiveUris List<string>

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    FileUris List<string>

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    InfrastructureSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecResponse

    Optional. Infrastructure specification for the execution.

    Notebook string

    Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).

    ArchiveUris []string

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    FileUris []string

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    InfrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse

    Optional. Infrastructure specification for the execution.

    Notebook string

    Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).

    archiveUris List<String>

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    fileUris List<String>

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse

    Optional. Infrastructure specification for the execution.

    notebook String

    Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).

    archiveUris string[]

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    fileUris string[]

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse

    Optional. Infrastructure specification for the execution.

    notebook string

    Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).

    archive_uris Sequence[str]

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    file_uris Sequence[str]

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructure_spec GoogleCloudDataplexV1TaskInfrastructureSpecResponse

    Optional. Infrastructure specification for the execution.

    notebook str

    Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).

    archiveUris List<String>

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    fileUris List<String>

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructureSpec Property Map

    Optional. Infrastructure specification for the execution.

    notebook String

    Path to input notebook. This can be the Cloud Storage URI of the notebook file or the path to a Notebook Content. The execution args are accessible as environment variables (TASK_key=value).

    GoogleCloudDataplexV1TaskSparkTaskConfig, GoogleCloudDataplexV1TaskSparkTaskConfigArgs

    ArchiveUris List<string>

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    FileUris List<string>

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    InfrastructureSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpec

    Optional. Infrastructure specification for the execution.

    MainClass string

    The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).

    MainJarFileUri string

    The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).

    PythonScriptFile string

    The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).

    SqlScript string

    The query text. The execution args are used to declare a set of script variables (set key="value";).

    SqlScriptFile string

    A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).

    ArchiveUris []string

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    FileUris []string

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    InfrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec

    Optional. Infrastructure specification for the execution.

    MainClass string

    The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).

    MainJarFileUri string

    The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).

    PythonScriptFile string

    The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).

    SqlScript string

    The query text. The execution args are used to declare a set of script variables (set key="value";).

    SqlScriptFile string

    A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).

    archiveUris List<String>

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    fileUris List<String>

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec

    Optional. Infrastructure specification for the execution.

    mainClass String

    The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).

    mainJarFileUri String

    The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).

    pythonScriptFile String

    The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).

    sqlScript String

    The query text. The execution args are used to declare a set of script variables (set key="value";).

    sqlScriptFile String

    A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).

    archiveUris string[]

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    fileUris string[]

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpec

    Optional. Infrastructure specification for the execution.

    mainClass string

    The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).

    mainJarFileUri string

    The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).

    pythonScriptFile string

    The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).

    sqlScript string

    The query text. The execution args are used to declare a set of script variables (set key="value";).

    sqlScriptFile string

    A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).

    archive_uris Sequence[str]

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    file_uris Sequence[str]

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructure_spec GoogleCloudDataplexV1TaskInfrastructureSpec

    Optional. Infrastructure specification for the execution.

    main_class str

    The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).

    main_jar_file_uri str

    The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).

    python_script_file str

    The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).

    sql_script str

    The query text. The execution args are used to declare a set of script variables (set key="value";).

    sql_script_file str

    A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).

    archiveUris List<String>

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    fileUris List<String>

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructureSpec Property Map

    Optional. Infrastructure specification for the execution.

    mainClass String

    The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).

    mainJarFileUri String

    The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).

    pythonScriptFile String

    The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).

    sqlScript String

    The query text. The execution args are used to declare a set of script variables (set key="value";).

    sqlScriptFile String

    A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).
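
    The Spark task fields above can be combined using the YAML form shown at the top of this page. A minimal sketch follows; the lake, location, service account, and gs:// paths are hypothetical placeholders, and exactly one of mainJarFileUri, mainClass, pythonScriptFile, sqlScript, or sqlScriptFile should be set:

    ```yaml
    # Hypothetical example: IDs, project, service account, and bucket paths are placeholders.
    resources:
      sparkTask:
        type: google-native:dataplex/v1:Task
        properties:
          lakeId: my-lake
          location: us-central1
          taskId: my-spark-task
          triggerSpec:
            type: ON_DEMAND
          executionSpec:
            serviceAccount: dataplex-task@my-project.iam.gserviceaccount.com
          spark:
            mainJarFileUri: gs://my-bucket/jars/app.jar
            fileUris:
              - gs://my-bucket/conf/app.conf
            archiveUris:
              - gs://my-bucket/deps/libs.zip
    ```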

    GoogleCloudDataplexV1TaskSparkTaskConfigResponse, GoogleCloudDataplexV1TaskSparkTaskConfigResponseArgs

    ArchiveUris List<string>

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    FileUris List<string>

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    InfrastructureSpec Pulumi.GoogleNative.Dataplex.V1.Inputs.GoogleCloudDataplexV1TaskInfrastructureSpecResponse

    Optional. Infrastructure specification for the execution.

    MainClass string

    The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).

    MainJarFileUri string

    The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).

    PythonScriptFile string

    The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).

    SqlScript string

    The query text. The execution args are used to declare a set of script variables (set key="value";).

    SqlScriptFile string

    A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).

    ArchiveUris []string

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    FileUris []string

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    InfrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse

    Optional. Infrastructure specification for the execution.

    MainClass string

    The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).

    MainJarFileUri string

    The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).

    PythonScriptFile string

    The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).

    SqlScript string

    The query text. The execution args are used to declare a set of script variables (set key="value";).

    SqlScriptFile string

    A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).

    archiveUris List<String>

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    fileUris List<String>

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse

    Optional. Infrastructure specification for the execution.

    mainClass String

    The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).

    mainJarFileUri String

    The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).

    pythonScriptFile String

    The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).

    sqlScript String

    The query text. The execution args are used to declare a set of script variables (set key="value";).

    sqlScriptFile String

    A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).

    archiveUris string[]

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    fileUris string[]

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructureSpec GoogleCloudDataplexV1TaskInfrastructureSpecResponse

    Optional. Infrastructure specification for the execution.

    mainClass string

    The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).

    mainJarFileUri string

    The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).

    pythonScriptFile string

    The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).

    sqlScript string

    The query text. The execution args are used to declare a set of script variables (set key="value";).

    sqlScriptFile string

    A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).

    archive_uris Sequence[str]

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    file_uris Sequence[str]

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructure_spec GoogleCloudDataplexV1TaskInfrastructureSpecResponse

    Optional. Infrastructure specification for the execution.

    main_class str

    The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).

    main_jar_file_uri str

    The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).

    python_script_file str

    The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).

    sql_script str

    The query text. The execution args are used to declare a set of script variables (set key="value";).

    sql_script_file str

    A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).

    archiveUris List<String>

    Optional. Cloud Storage URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

    fileUris List<String>

    Optional. Cloud Storage URIs of files to be placed in the working directory of each executor.

    infrastructureSpec Property Map

    Optional. Infrastructure specification for the execution.

    mainClass String

    The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris. The execution args are passed in as a sequence of named process arguments (--key=value).

    mainJarFileUri String

    The Cloud Storage URI of the jar file that contains the main class. The execution args are passed in as a sequence of named process arguments (--key=value).

    pythonScriptFile String

    The Cloud Storage URI of the main Python file to use as the driver. Must be a .py file. The execution args are passed in as a sequence of named process arguments (--key=value).

    sqlScript String

    The query text. The execution args are used to declare a set of script variables (set key="value";).

    sqlScriptFile String

    A reference to a query file. This can be the Cloud Storage URI of the query file or it can be the path to a SqlScript Content. The execution args are used to declare a set of script variables (set key="value";).

    GoogleCloudDataplexV1TaskTriggerSpec, GoogleCloudDataplexV1TaskTriggerSpecArgs

    Type Pulumi.GoogleNative.Dataplex.V1.GoogleCloudDataplexV1TaskTriggerSpecType

    Immutable. Trigger type of the user-specified Task.

    Disabled bool

    Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.

    MaxRetries int

    Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.

    Schedule string

    Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.

    StartTime string

    Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.

    Type GoogleCloudDataplexV1TaskTriggerSpecType

    Immutable. Trigger type of the user-specified Task.

    Disabled bool

    Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.

    MaxRetries int

    Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.

    Schedule string

    Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.

    StartTime string

    Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.

    type GoogleCloudDataplexV1TaskTriggerSpecType

    Immutable. Trigger type of the user-specified Task.

    disabled Boolean

    Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.

    maxRetries Integer

    Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.

    schedule String

    Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.

    startTime String

    Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.

    type GoogleCloudDataplexV1TaskTriggerSpecType

    Immutable. Trigger type of the user-specified Task.

    disabled boolean

    Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.

    maxRetries number

    Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.

    schedule string

    Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.

    startTime string

    Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.

    type GoogleCloudDataplexV1TaskTriggerSpecType

    Immutable. Trigger type of the user-specified Task.

    disabled bool

    Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.

    max_retries int

    Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.

    schedule str

    Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.

    start_time str

    Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.

    type "TYPE_UNSPECIFIED" | "ON_DEMAND" | "RECURRING"

    Immutable. Trigger type of the user-specified Task.

    disabled Boolean

    Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.

    maxRetries Number

    Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.

    schedule String

    Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.

    startTime String

    Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.
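
    The schedule format above (an optional "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}" prefix followed by a standard cron expression) can be illustrated with a small parsing sketch. split_cron_timezone is a hypothetical helper for illustration, not part of the provider:

    ```python
    import re

    def split_cron_timezone(schedule: str):
        """Split an optional CRON_TZ=/TZ= prefix off a cron schedule string.

        Mirrors the documented format: "CRON_TZ=${IANA_TIME_ZONE} <cron>" or
        "TZ=${IANA_TIME_ZONE} <cron>". Without a prefix, the schedule is
        interpreted as a bare cron expression and the timezone is None.
        """
        m = re.match(r"^(?:CRON_)?TZ=(\S+)\s+(.*)$", schedule)
        if m:
            return m.group(1), m.group(2)
        return None, schedule

    # The documented example: run at minute 1 of every hour, New York time.
    tz, cron = split_cron_timezone("CRON_TZ=America/New_York 1 * * * *")
    ```
    
    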

    GoogleCloudDataplexV1TaskTriggerSpecResponse, GoogleCloudDataplexV1TaskTriggerSpecResponseArgs

    Disabled bool

    Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.

    MaxRetries int

    Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.

    Schedule string

    Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.

    StartTime string

    Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.

    Type string

    Immutable. Trigger type of the user-specified Task.

    Disabled bool

    Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.

    MaxRetries int

    Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.

    Schedule string

    Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.

    StartTime string

    Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.

    Type string

    Immutable. Trigger type of the user-specified Task.

    disabled Boolean

    Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.

    maxRetries Integer

    Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.

    schedule String

    Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.

    startTime String

    Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.

    type String

    Immutable. Trigger type of the user-specified Task.

    disabled boolean

    Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.

    maxRetries number

    Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.

    schedule string

    Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.

    startTime string

    Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.

    type string

    Immutable. Trigger type of the user-specified Task.

    disabled bool

    Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.

    max_retries int

    Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.

    schedule str

    Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.

    start_time str

    Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.

    type str

    Immutable. Trigger type of the user-specified Task.

    disabled Boolean

    Optional. Prevent the task from executing. This does not cancel already running tasks. It is intended to temporarily disable RECURRING tasks.

    maxRetries Number

    Optional. Number of retry attempts before aborting. Set to zero to never attempt to retry a failed task.

    schedule String

    Optional. Cron schedule (https://en.wikipedia.org/wiki/Cron) for running tasks periodically. To explicitly set a timezone to the cron tab, apply a prefix in the cron tab: "CRON_TZ=${IANA_TIME_ZONE}" or "TZ=${IANA_TIME_ZONE}". The ${IANA_TIME_ZONE} may only be a valid string from IANA time zone database. For example, CRON_TZ=America/New_York 1 * * * *, or TZ=America/New_York 1 * * * *. This field is required for RECURRING tasks.

    startTime String

    Optional. The first run of the task will be after this time. If not specified, the task will run shortly after being submitted if ON_DEMAND and based on the schedule if RECURRING.

    type String

    Immutable. Trigger type of the user-specified Task.

    GoogleCloudDataplexV1TaskTriggerSpecType, GoogleCloudDataplexV1TaskTriggerSpecTypeArgs

    TypeUnspecified
    TYPE_UNSPECIFIED

    Unspecified trigger type.

    OnDemand
    ON_DEMAND

    The task runs once, shortly after task creation.

    Recurring
    RECURRING

    The task is scheduled to run periodically.

    GoogleCloudDataplexV1TaskTriggerSpecTypeTypeUnspecified
    TYPE_UNSPECIFIED

    Unspecified trigger type.

    GoogleCloudDataplexV1TaskTriggerSpecTypeOnDemand
    ON_DEMAND

    The task runs once, shortly after task creation.

    GoogleCloudDataplexV1TaskTriggerSpecTypeRecurring
    RECURRING

    The task is scheduled to run periodically.

    TypeUnspecified
    TYPE_UNSPECIFIED

    Unspecified trigger type.

    OnDemand
    ON_DEMAND

    The task runs once, shortly after task creation.

    Recurring
    RECURRING

    The task is scheduled to run periodically.

    TypeUnspecified
    TYPE_UNSPECIFIED

    Unspecified trigger type.

    OnDemand
    ON_DEMAND

    The task runs once, shortly after task creation.

    Recurring
    RECURRING

    The task is scheduled to run periodically.

    TYPE_UNSPECIFIED
    TYPE_UNSPECIFIED

    Unspecified trigger type.

    ON_DEMAND
    ON_DEMAND

    The task runs once, shortly after task creation.

    RECURRING
    RECURRING

    The task is scheduled to run periodically.

    "TYPE_UNSPECIFIED"
    TYPE_UNSPECIFIED

    Unspecified trigger type.

    "ON_DEMAND"
    ON_DEMAND

    The task runs once, shortly after task creation.

    "RECURRING"
    RECURRING

    The task is scheduled to run periodically.

    Package Details

    Repository
    Google Cloud Native pulumi/pulumi-google-native
    License
    Apache-2.0