Google Cloud Classic v6.52.0, Mar 22, 2023

gcp.dataflow.FlexTemplateJob

Import

This resource does not support import.

Create FlexTemplateJob Resource

new FlexTemplateJob(name: string, args: FlexTemplateJobArgs, opts?: CustomResourceOptions);
@overload
def FlexTemplateJob(resource_name: str,
                    opts: Optional[ResourceOptions] = None,
                    container_spec_gcs_path: Optional[str] = None,
                    labels: Optional[Mapping[str, Any]] = None,
                    name: Optional[str] = None,
                    on_delete: Optional[str] = None,
                    parameters: Optional[Mapping[str, Any]] = None,
                    project: Optional[str] = None,
                    region: Optional[str] = None,
                    skip_wait_on_job_termination: Optional[bool] = None)
@overload
def FlexTemplateJob(resource_name: str,
                    args: FlexTemplateJobArgs,
                    opts: Optional[ResourceOptions] = None)
func NewFlexTemplateJob(ctx *Context, name string, args FlexTemplateJobArgs, opts ...ResourceOption) (*FlexTemplateJob, error)
public FlexTemplateJob(string name, FlexTemplateJobArgs args, CustomResourceOptions? opts = null)
public FlexTemplateJob(String name, FlexTemplateJobArgs args)
public FlexTemplateJob(String name, FlexTemplateJobArgs args, CustomResourceOptions options)
type: gcp:dataflow:FlexTemplateJob
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.

name string
The unique name of the resource.
args FlexTemplateJobArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
resource_name str
The unique name of the resource.
args FlexTemplateJobArgs
The arguments to resource properties.
opts ResourceOptions
Bag of options to control resource's behavior.
ctx Context
Context object for the current deployment.
name string
The unique name of the resource.
args FlexTemplateJobArgs
The arguments to resource properties.
opts ResourceOption
Bag of options to control resource's behavior.
name string
The unique name of the resource.
args FlexTemplateJobArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
name String
The unique name of the resource.
args FlexTemplateJobArgs
The arguments to resource properties.
options CustomResourceOptions
Bag of options to control resource's behavior.
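The constructor signatures above all follow the same shape: a logical name, an args object, and optional resource options. A minimal Python sketch of creating the resource (the GCS path, job name, and parameter values are placeholders, not values from this page, and the program only runs under the Pulumi engine, e.g. via `pulumi up`):

```python
import pulumi
import pulumi_gcp as gcp

# A hypothetical Flex Template job. The container_spec_gcs_path must point
# at a real template spec JSON file uploaded to a GCS bucket you own.
big_data_job = gcp.dataflow.FlexTemplateJob(
    "big-data-job",
    container_spec_gcs_path="gs://my-bucket/templates/template.json",
    parameters={
        # Template-defined parameters; names depend on the template itself.
        "inputSubscription": "messages",
    },
)
```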

FlexTemplateJob Resource Properties

To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

Inputs

The FlexTemplateJob resource accepts the following input properties:

ContainerSpecGcsPath string

The GCS path to the Dataflow job Flex Template.

Labels Dictionary<string, object>

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. Note: This field is marked as deprecated as the API does not currently support adding labels. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

Name string

A unique name for the resource, required by Dataflow.

OnDelete string

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

Parameters Dictionary<string, object>

Key/value pairs to be passed to the Dataflow job (as used in the template). Additional pipeline options such as serviceAccount and workerMachineType can also be specified here.

Project string

The project in which the resource belongs. If it is not provided, the provider project is used.

Region string

The region in which the created job should run.

SkipWaitOnJobTermination bool

If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing the job from Pulumi state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.

ContainerSpecGcsPath string

The GCS path to the Dataflow job Flex Template.

Labels map[string]interface{}

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. Note: This field is marked as deprecated as the API does not currently support adding labels. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

Name string

A unique name for the resource, required by Dataflow.

OnDelete string

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

Parameters map[string]interface{}

Key/value pairs to be passed to the Dataflow job (as used in the template). Additional pipeline options such as serviceAccount and workerMachineType can also be specified here.

Project string

The project in which the resource belongs. If it is not provided, the provider project is used.

Region string

The region in which the created job should run.

SkipWaitOnJobTermination bool

If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing the job from Pulumi state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.

containerSpecGcsPath String

The GCS path to the Dataflow job Flex Template.

labels Map<String,Object>

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. Note: This field is marked as deprecated as the API does not currently support adding labels. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

name String

A unique name for the resource, required by Dataflow.

onDelete String

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters Map<String,Object>

Key/value pairs to be passed to the Dataflow job (as used in the template). Additional pipeline options such as serviceAccount and workerMachineType can also be specified here.

project String

The project in which the resource belongs. If it is not provided, the provider project is used.

region String

The region in which the created job should run.

skipWaitOnJobTermination Boolean

If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing the job from Pulumi state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.

containerSpecGcsPath string

The GCS path to the Dataflow job Flex Template.

labels {[key: string]: any}

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. Note: This field is marked as deprecated as the API does not currently support adding labels. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

name string

A unique name for the resource, required by Dataflow.

onDelete string

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters {[key: string]: any}

Key/value pairs to be passed to the Dataflow job (as used in the template). Additional pipeline options such as serviceAccount and workerMachineType can also be specified here.

project string

The project in which the resource belongs. If it is not provided, the provider project is used.

region string

The region in which the created job should run.

skipWaitOnJobTermination boolean

If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing the job from Pulumi state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.

container_spec_gcs_path str

The GCS path to the Dataflow job Flex Template.

labels Mapping[str, Any]

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. Note: This field is marked as deprecated as the API does not currently support adding labels. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

name str

A unique name for the resource, required by Dataflow.

on_delete str

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters Mapping[str, Any]

Key/value pairs to be passed to the Dataflow job (as used in the template). Additional pipeline options such as serviceAccount and workerMachineType can also be specified here.

project str

The project in which the resource belongs. If it is not provided, the provider project is used.

region str

The region in which the created job should run.

skip_wait_on_job_termination bool

If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing the job from Pulumi state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.

containerSpecGcsPath String

The GCS path to the Dataflow job Flex Template.

labels Map<Any>

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. Note: This field is marked as deprecated as the API does not currently support adding labels. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

name String

A unique name for the resource, required by Dataflow.

onDelete String

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters Map<Any>

Key/value pairs to be passed to the Dataflow job (as used in the template). Additional pipeline options such as serviceAccount and workerMachineType can also be specified here.

project String

The project in which the resource belongs. If it is not provided, the provider project is used.

region String

The region in which the created job should run.

skipWaitOnJobTermination Boolean

If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing the job from Pulumi state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.
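Putting the optional inputs above together, a hedged Python sketch showing region, the drain-on-destroy behavior, and skipWaitOnJobTermination (all project names, paths, and parameter values below are illustrative, and the template must actually accept the parameters you pass):

```python
import pulumi
import pulumi_gcp as gcp

job = gcp.dataflow.FlexTemplateJob(
    "streaming-job",
    container_spec_gcs_path="gs://my-bucket/templates/streaming.json",
    region="us-central1",
    on_delete="drain",                  # drain in-flight work on `pulumi destroy`
    skip_wait_on_job_termination=True,  # do not block on DRAINING/CANCELLING
    parameters={
        # Pipeline options may be mixed in with template parameters here.
        "inputSubscription": "projects/my-project/subscriptions/my-sub",
        "serviceAccount": "dataflow-runner@my-project.iam.gserviceaccount.com",
    },
)
```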

Outputs

All input properties are implicitly available as output properties. Additionally, the FlexTemplateJob resource produces the following output properties:

Id string

The provider-assigned unique ID for this managed resource.

JobId string

The unique ID of this job.

State string

The current state of the resource, selected from the JobState enum.

Id string

The provider-assigned unique ID for this managed resource.

JobId string

The unique ID of this job.

State string

The current state of the resource, selected from the JobState enum.

id String

The provider-assigned unique ID for this managed resource.

jobId String

The unique ID of this job.

state String

The current state of the resource, selected from the JobState enum.

id string

The provider-assigned unique ID for this managed resource.

jobId string

The unique ID of this job.

state string

The current state of the resource, selected from the JobState enum.

id str

The provider-assigned unique ID for this managed resource.

job_id str

The unique ID of this job.

state str

The current state of the resource, selected from the JobState enum.

id String

The provider-assigned unique ID for this managed resource.

jobId String

The unique ID of this job.

state String

The current state of the resource, selected from the JobState enum.
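Since all inputs are echoed back as outputs, a typical program exports the service-assigned values alongside them. A short Python sketch (the template path is a placeholder; jobId and state are only resolved once the deployment actually runs):

```python
import pulumi
import pulumi_gcp as gcp

job = gcp.dataflow.FlexTemplateJob(
    "example-job",
    container_spec_gcs_path="gs://my-bucket/templates/template.json",
)

# job_id and state are populated by the Dataflow service after creation.
pulumi.export("jobId", job.job_id)
pulumi.export("jobState", job.state)
```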

Look up Existing FlexTemplateJob Resource

Get an existing FlexTemplateJob resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.

public static get(name: string, id: Input<ID>, state?: FlexTemplateJobState, opts?: CustomResourceOptions): FlexTemplateJob
@staticmethod
def get(resource_name: str,
        id: str,
        opts: Optional[ResourceOptions] = None,
        container_spec_gcs_path: Optional[str] = None,
        job_id: Optional[str] = None,
        labels: Optional[Mapping[str, Any]] = None,
        name: Optional[str] = None,
        on_delete: Optional[str] = None,
        parameters: Optional[Mapping[str, Any]] = None,
        project: Optional[str] = None,
        region: Optional[str] = None,
        skip_wait_on_job_termination: Optional[bool] = None,
        state: Optional[str] = None) -> FlexTemplateJob
func GetFlexTemplateJob(ctx *Context, name string, id IDInput, state *FlexTemplateJobState, opts ...ResourceOption) (*FlexTemplateJob, error)
public static FlexTemplateJob Get(string name, Input<string> id, FlexTemplateJobState? state, CustomResourceOptions? opts = null)
public static FlexTemplateJob get(String name, Output<String> id, FlexTemplateJobState state, CustomResourceOptions options)
Resource lookup is not supported in YAML
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
resource_name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
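In Python, the lookup above might be used like this (the job ID shown is a made-up placeholder in Dataflow's job-ID format; substitute the real provider-assigned ID of an existing job):

```python
import pulumi
import pulumi_gcp as gcp

# Look up a previously created job by its provider ID. The resource is
# read, not created: no changes are made to the running job.
existing = gcp.dataflow.FlexTemplateJob.get(
    "existing-job",
    id="2023-01-01_00_00_00-1234567890123456789",
)

pulumi.export("existingState", existing.state)
```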
The following state arguments are supported:
ContainerSpecGcsPath string

The GCS path to the Dataflow job Flex Template.

JobId string

The unique ID of this job.

Labels Dictionary<string, object>

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. Note: This field is marked as deprecated as the API does not currently support adding labels. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

Name string

A unique name for the resource, required by Dataflow.

OnDelete string

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

Parameters Dictionary<string, object>

Key/value pairs to be passed to the Dataflow job (as used in the template). Additional pipeline options such as serviceAccount and workerMachineType can also be specified here.

Project string

The project in which the resource belongs. If it is not provided, the provider project is used.

Region string

The region in which the created job should run.

SkipWaitOnJobTermination bool

If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing the job from Pulumi state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.

State string

The current state of the resource, selected from the JobState enum.

ContainerSpecGcsPath string

The GCS path to the Dataflow job Flex Template.

JobId string

The unique ID of this job.

Labels map[string]interface{}

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. Note: This field is marked as deprecated as the API does not currently support adding labels. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

Name string

A unique name for the resource, required by Dataflow.

OnDelete string

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

Parameters map[string]interface{}

Key/value pairs to be passed to the Dataflow job (as used in the template). Additional pipeline options such as serviceAccount and workerMachineType can also be specified here.

Project string

The project in which the resource belongs. If it is not provided, the provider project is used.

Region string

The region in which the created job should run.

SkipWaitOnJobTermination bool

If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing the job from Pulumi state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.

State string

The current state of the resource, selected from the JobState enum.

containerSpecGcsPath String

The GCS path to the Dataflow job Flex Template.

jobId String

The unique ID of this job.

labels Map<String,Object>

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. Note: This field is marked as deprecated as the API does not currently support adding labels. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

name String

A unique name for the resource, required by Dataflow.

onDelete String

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters Map<String,Object>

Key/value pairs to be passed to the Dataflow job (as used in the template). Additional pipeline options such as serviceAccount and workerMachineType can also be specified here.

project String

The project in which the resource belongs. If it is not provided, the provider project is used.

region String

The region in which the created job should run.

skipWaitOnJobTermination Boolean

If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing the job from Pulumi state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.

state String

The current state of the resource, selected from the JobState enum.

containerSpecGcsPath string

The GCS path to the Dataflow job Flex Template.

jobId string

The unique ID of this job.

labels {[key: string]: any}

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. Note: This field is marked as deprecated as the API does not currently support adding labels. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

name string

A unique name for the resource, required by Dataflow.

onDelete string

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters {[key: string]: any}

Key/value pairs to be passed to the Dataflow job (as used in the template). Additional pipeline options such as serviceAccount and workerMachineType can also be specified here.

project string

The project in which the resource belongs. If it is not provided, the provider project is used.

region string

The region in which the created job should run.

skipWaitOnJobTermination boolean

If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing the job from Pulumi state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.

state string

The current state of the resource, selected from the JobState enum.

container_spec_gcs_path str

The GCS path to the Dataflow job Flex Template.

job_id str

The unique ID of this job.

labels Mapping[str, Any]

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. Note: This field is marked as deprecated as the API does not currently support adding labels. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

name str

A unique name for the resource, required by Dataflow.

on_delete str

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters Mapping[str, Any]

Key/value pairs to be passed to the Dataflow job (as used in the template). Additional pipeline options such as serviceAccount and workerMachineType can also be specified here.

project str

The project in which the resource belongs. If it is not provided, the provider project is used.

region str

The region in which the created job should run.

skip_wait_on_job_termination bool

If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing the job from Pulumi state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.

state str

The current state of the resource, selected from the JobState enum.

containerSpecGcsPath String

The GCS path to the Dataflow job Flex Template.

jobId String

The unique ID of this job.

labels Map<Any>

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. Note: This field is marked as deprecated as the API does not currently support adding labels. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

name String

A unique name for the resource, required by Dataflow.

onDelete String

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters Map<Any>

Key/value pairs to be passed to the Dataflow job (as used in the template). Additional pipeline options such as serviceAccount and workerMachineType can also be specified here.

project String

The project in which the resource belongs. If it is not provided, the provider project is used.

region String

The region in which the created job should run.

skipWaitOnJobTermination Boolean

If true, treat DRAINING and CANCELLING as terminal job states and do not wait for further changes before removing the job from Pulumi state and moving on. WARNING: this will lead to job name conflicts if you do not ensure that the job names are different, e.g. by embedding a release ID or by using a random_id.

state String

The current state of the resource, selected from the JobState enum.

Package Details

Repository
Google Cloud (GCP) Classic pulumi/pulumi-gcp
License
Apache-2.0
Notes

This Pulumi package is based on the google-beta Terraform Provider.