Google Cloud Classic v6.56.0 (May 18, 2023)

gcp.dataflow.Job


Import

Dataflow jobs can be imported using the job ID, e.g.:

 $ pulumi import gcp:dataflow/job:Job example 2022-07-31_06_25_42-11926927532632678660
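
If the job ID isn't at hand, the gcloud CLI (assuming it is installed and authenticated) can list the jobs in a region together with their IDs:

 $ gcloud dataflow jobs list --region=us-central1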

Create Job Resource

new Job(name: string, args: JobArgs, opts?: CustomResourceOptions);
@overload
def Job(resource_name: str,
        opts: Optional[ResourceOptions] = None,
        additional_experiments: Optional[Sequence[str]] = None,
        enable_streaming_engine: Optional[bool] = None,
        ip_configuration: Optional[str] = None,
        kms_key_name: Optional[str] = None,
        labels: Optional[Mapping[str, Any]] = None,
        machine_type: Optional[str] = None,
        max_workers: Optional[int] = None,
        name: Optional[str] = None,
        network: Optional[str] = None,
        on_delete: Optional[str] = None,
        parameters: Optional[Mapping[str, Any]] = None,
        project: Optional[str] = None,
        region: Optional[str] = None,
        service_account_email: Optional[str] = None,
        skip_wait_on_job_termination: Optional[bool] = None,
        subnetwork: Optional[str] = None,
        temp_gcs_location: Optional[str] = None,
        template_gcs_path: Optional[str] = None,
        transform_name_mapping: Optional[Mapping[str, Any]] = None,
        zone: Optional[str] = None)
@overload
def Job(resource_name: str,
        args: JobArgs,
        opts: Optional[ResourceOptions] = None)
func NewJob(ctx *Context, name string, args JobArgs, opts ...ResourceOption) (*Job, error)
public Job(string name, JobArgs args, CustomResourceOptions? opts = null)
public Job(String name, JobArgs args)
public Job(String name, JobArgs args, CustomResourceOptions options)
type: gcp:dataflow:Job
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.

name string
The unique name of the resource.
args JobArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
resource_name str
The unique name of the resource.
args JobArgs
The arguments to resource properties.
opts ResourceOptions
Bag of options to control resource's behavior.
ctx Context
Context object for the current deployment.
name string
The unique name of the resource.
args JobArgs
The arguments to resource properties.
opts ResourceOption
Bag of options to control resource's behavior.
name string
The unique name of the resource.
args JobArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
name String
The unique name of the resource.
args JobArgs
The arguments to resource properties.
options CustomResourceOptions
Bag of options to control resource's behavior.
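
For reference, here is a minimal creation sketch in TypeScript. The bucket paths and template parameters are hypothetical placeholders; templateGcsPath and tempGcsLocation are the only required inputs.

import * as gcp from "@pulumi/gcp";

// A minimal sketch of launching a templated Dataflow job. All gs://
// paths and parameter values below are hypothetical placeholders.
const bigDataJob = new gcp.dataflow.Job("big-data-job", {
    templateGcsPath: "gs://my-bucket/templates/template_file",
    tempGcsLocation: "gs://my-bucket/tmp_dir",
    parameters: {
        inputFile: "gs://my-bucket/input.txt",
        output: "gs://my-bucket/output",
    },
});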

Job Resource Properties

To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

Inputs

The Job resource accepts the following input properties (a combined usage sketch follows the list):

TempGcsLocation string

A writeable location on GCS for the Dataflow job to dump its temporary data.

TemplateGcsPath string

The GCS path to the Dataflow job template.

AdditionalExperiments List<string>

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

EnableStreamingEngine bool

Enable/disable the use of Streaming Engine for the job. Note that Streaming Engine is enabled by default for pipelines developed against the Beam SDK for Python v2.21.0 or later when using Python 3.

IpConfiguration string

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

KmsKeyName string

The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY

Labels Dictionary<string, object>

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

MachineType string

The machine type to use for the job.

MaxWorkers int

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

Name string

A unique name for the resource, required by Dataflow.

Network string

The network to which VMs will be assigned. If it is not provided, "default" will be used.

OnDelete string

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

Parameters Dictionary<string, object>

Key/Value pairs to be passed to the Dataflow job (as used in the template).

Project string

The project in which the resource belongs. If it is not provided, the provider project is used.

Region string

The region in which the created job should run.

ServiceAccountEmail string

The Service Account email used to create the job.

SkipWaitOnJobTermination bool

If set to true, Pulumi will treat DRAINING and CANCELLING as terminal states when deleting the resource, and will remove the resource from Pulumi state and move on. See above note.

Subnetwork string

The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL. For example "googleapis.com/compute/v1/projects/PROJECT_ID/regions/REGION/subnetworks/SUBNET_NAME".

TransformNameMapping Dictionary<string, object>

Only applicable when updating a pipeline. Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job. This field is not used outside of update.

Zone string

The zone in which the created job should run. If it is not provided, the provider zone is used.

TempGcsLocation string

A writeable location on GCS for the Dataflow job to dump its temporary data.

TemplateGcsPath string

The GCS path to the Dataflow job template.

AdditionalExperiments []string

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

EnableStreamingEngine bool

Enable/disable the use of Streaming Engine for the job. Note that Streaming Engine is enabled by default for pipelines developed against the Beam SDK for Python v2.21.0 or later when using Python 3.

IpConfiguration string

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

KmsKeyName string

The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY

Labels map[string]interface{}

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

MachineType string

The machine type to use for the job.

MaxWorkers int

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

Name string

A unique name for the resource, required by Dataflow.

Network string

The network to which VMs will be assigned. If it is not provided, "default" will be used.

OnDelete string

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

Parameters map[string]interface{}

Key/Value pairs to be passed to the Dataflow job (as used in the template).

Project string

The project in which the resource belongs. If it is not provided, the provider project is used.

Region string

The region in which the created job should run.

ServiceAccountEmail string

The Service Account email used to create the job.

SkipWaitOnJobTermination bool

If set to true, Pulumi will treat DRAINING and CANCELLING as terminal states when deleting the resource, and will remove the resource from Pulumi state and move on. See above note.

Subnetwork string

The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL. For example "googleapis.com/compute/v1/projects/PROJECT_ID/regions/REGION/subnetworks/SUBNET_NAME".

TransformNameMapping map[string]interface{}

Only applicable when updating a pipeline. Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job. This field is not used outside of update.

Zone string

The zone in which the created job should run. If it is not provided, the provider zone is used.

tempGcsLocation String

A writeable location on GCS for the Dataflow job to dump its temporary data.

templateGcsPath String

The GCS path to the Dataflow job template.

additionalExperiments List<String>

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

enableStreamingEngine Boolean

Enable/disable the use of Streaming Engine for the job. Note that Streaming Engine is enabled by default for pipelines developed against the Beam SDK for Python v2.21.0 or later when using Python 3.

ipConfiguration String

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

kmsKeyName String

The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY

labels Map<String,Object>

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

machineType String

The machine type to use for the job.

maxWorkers Integer

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

name String

A unique name for the resource, required by Dataflow.

network String

The network to which VMs will be assigned. If it is not provided, "default" will be used.

onDelete String

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters Map<String,Object>

Key/Value pairs to be passed to the Dataflow job (as used in the template).

project String

The project in which the resource belongs. If it is not provided, the provider project is used.

region String

The region in which the created job should run.

serviceAccountEmail String

The Service Account email used to create the job.

skipWaitOnJobTermination Boolean

If set to true, Pulumi will treat DRAINING and CANCELLING as terminal states when deleting the resource, and will remove the resource from Pulumi state and move on. See above note.

subnetwork String

The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL. For example "googleapis.com/compute/v1/projects/PROJECT_ID/regions/REGION/subnetworks/SUBNET_NAME".

transformNameMapping Map<String,Object>

Only applicable when updating a pipeline. Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job. This field is not used outside of update.

zone String

The zone in which the created job should run. If it is not provided, the provider zone is used.

tempGcsLocation string

A writeable location on GCS for the Dataflow job to dump its temporary data.

templateGcsPath string

The GCS path to the Dataflow job template.

additionalExperiments string[]

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

enableStreamingEngine boolean

Enable/disable the use of Streaming Engine for the job. Note that Streaming Engine is enabled by default for pipelines developed against the Beam SDK for Python v2.21.0 or later when using Python 3.

ipConfiguration string

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

kmsKeyName string

The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY

labels {[key: string]: any}

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

machineType string

The machine type to use for the job.

maxWorkers number

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

name string

A unique name for the resource, required by Dataflow.

network string

The network to which VMs will be assigned. If it is not provided, "default" will be used.

onDelete string

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters {[key: string]: any}

Key/Value pairs to be passed to the Dataflow job (as used in the template).

project string

The project in which the resource belongs. If it is not provided, the provider project is used.

region string

The region in which the created job should run.

serviceAccountEmail string

The Service Account email used to create the job.

skipWaitOnJobTermination boolean

If set to true, Pulumi will treat DRAINING and CANCELLING as terminal states when deleting the resource, and will remove the resource from Pulumi state and move on. See above note.

subnetwork string

The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL. For example "googleapis.com/compute/v1/projects/PROJECT_ID/regions/REGION/subnetworks/SUBNET_NAME".

transformNameMapping {[key: string]: any}

Only applicable when updating a pipeline. Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job. This field is not used outside of update.

zone string

The zone in which the created job should run. If it is not provided, the provider zone is used.

temp_gcs_location str

A writeable location on GCS for the Dataflow job to dump its temporary data.

template_gcs_path str

The GCS path to the Dataflow job template.

additional_experiments Sequence[str]

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

enable_streaming_engine bool

Enable/disable the use of Streaming Engine for the job. Note that Streaming Engine is enabled by default for pipelines developed against the Beam SDK for Python v2.21.0 or later when using Python 3.

ip_configuration str

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

kms_key_name str

The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY

labels Mapping[str, Any]

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

machine_type str

The machine type to use for the job.

max_workers int

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

name str

A unique name for the resource, required by Dataflow.

network str

The network to which VMs will be assigned. If it is not provided, "default" will be used.

on_delete str

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters Mapping[str, Any]

Key/Value pairs to be passed to the Dataflow job (as used in the template).

project str

The project in which the resource belongs. If it is not provided, the provider project is used.

region str

The region in which the created job should run.

service_account_email str

The Service Account email used to create the job.

skip_wait_on_job_termination bool

If set to true, Pulumi will treat DRAINING and CANCELLING as terminal states when deleting the resource, and will remove the resource from Pulumi state and move on. See above note.

subnetwork str

The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL. For example "googleapis.com/compute/v1/projects/PROJECT_ID/regions/REGION/subnetworks/SUBNET_NAME".

transform_name_mapping Mapping[str, Any]

Only applicable when updating a pipeline. Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job. This field is not used outside of update.

zone str

The zone in which the created job should run. If it is not provided, the provider zone is used.

tempGcsLocation String

A writeable location on GCS for the Dataflow job to dump its temporary data.

templateGcsPath String

The GCS path to the Dataflow job template.

additionalExperiments List<String>

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

enableStreamingEngine Boolean

Enable/disable the use of Streaming Engine for the job. Note that Streaming Engine is enabled by default for pipelines developed against the Beam SDK for Python v2.21.0 or later when using Python 3.

ipConfiguration String

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

kmsKeyName String

The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY

labels Map<Any>

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

machineType String

The machine type to use for the job.

maxWorkers Number

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

name String

A unique name for the resource, required by Dataflow.

network String

The network to which VMs will be assigned. If it is not provided, "default" will be used.

onDelete String

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters Map<Any>

Key/Value pairs to be passed to the Dataflow job (as used in the template).

project String

The project in which the resource belongs. If it is not provided, the provider project is used.

region String

The region in which the created job should run.

serviceAccountEmail String

The Service Account email used to create the job.

skipWaitOnJobTermination Boolean

If set to true, Pulumi will treat DRAINING and CANCELLING as terminal states when deleting the resource, and will remove the resource from Pulumi state and move on. See above note.

subnetwork String

The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL. For example "googleapis.com/compute/v1/projects/PROJECT_ID/regions/REGION/subnetworks/SUBNET_NAME".

transformNameMapping Map<Any>

Only applicable when updating a pipeline. Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job. This field is not used outside of update.

zone String

The zone in which the created job should run. If it is not provided, the provider zone is used.
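
To show how several of these inputs interact, here is a hedged TypeScript sketch of a streaming job configured to drain on pulumi destroy rather than being cancelled outright. The template and bucket paths are hypothetical placeholders.

import * as gcp from "@pulumi/gcp";

// onDelete: "drain" asks Dataflow to drain the streaming pipeline on
// destroy; skipWaitOnJobTermination lets Pulumi stop waiting once the
// job reaches DRAINING or CANCELLING instead of a fully terminal state.
const streamingJob = new gcp.dataflow.Job("streaming-job", {
    templateGcsPath: "gs://my-bucket/templates/streaming_template", // hypothetical
    tempGcsLocation: "gs://my-bucket/tmp_dir",                      // hypothetical
    enableStreamingEngine: true,
    maxWorkers: 5,
    onDelete: "drain",
    skipWaitOnJobTermination: true,
});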

Outputs

All input properties are implicitly available as output properties. Additionally, the Job resource produces the following output properties (an export sketch follows the list):

Id string

The provider-assigned unique ID for this managed resource.

JobId string

The unique ID of this job.

State string

The current state of the resource, selected from the JobState enum.

Type string

The type of this job, selected from the JobType enum.

Id string

The provider-assigned unique ID for this managed resource.

JobId string

The unique ID of this job.

State string

The current state of the resource, selected from the JobState enum.

Type string

The type of this job, selected from the JobType enum.

id String

The provider-assigned unique ID for this managed resource.

jobId String

The unique ID of this job.

state String

The current state of the resource, selected from the JobState enum.

type String

The type of this job, selected from the JobType enum.

id string

The provider-assigned unique ID for this managed resource.

jobId string

The unique ID of this job.

state string

The current state of the resource, selected from the JobState enum.

type string

The type of this job, selected from the JobType enum.

id str

The provider-assigned unique ID for this managed resource.

job_id str

The unique ID of this job.

state str

The current state of the resource, selected from the JobState enum.

type str

The type of this job, selected from the JobType enum.

id String

The provider-assigned unique ID for this managed resource.

jobId String

The unique ID of this job.

state String

The current state of the resource, selected from the JobState enum.

type String

The type of this job, selected from the JobType enum.
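
For illustration, these computed outputs can be exported from a stack like any other Pulumi outputs; streamingJob refers to the hypothetical resource from the sketch above.

export const dataflowJobId = streamingJob.jobId;     // unique Dataflow job ID
export const dataflowJobState = streamingJob.state;  // current JobState value
export const dataflowJobType = streamingJob.type;    // JobType (batch or streaming)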

Look up Existing Job Resource

Get an existing Job resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.

public static get(name: string, id: Input<ID>, state?: JobState, opts?: CustomResourceOptions): Job
@staticmethod
def get(resource_name: str,
        id: str,
        opts: Optional[ResourceOptions] = None,
        additional_experiments: Optional[Sequence[str]] = None,
        enable_streaming_engine: Optional[bool] = None,
        ip_configuration: Optional[str] = None,
        job_id: Optional[str] = None,
        kms_key_name: Optional[str] = None,
        labels: Optional[Mapping[str, Any]] = None,
        machine_type: Optional[str] = None,
        max_workers: Optional[int] = None,
        name: Optional[str] = None,
        network: Optional[str] = None,
        on_delete: Optional[str] = None,
        parameters: Optional[Mapping[str, Any]] = None,
        project: Optional[str] = None,
        region: Optional[str] = None,
        service_account_email: Optional[str] = None,
        skip_wait_on_job_termination: Optional[bool] = None,
        state: Optional[str] = None,
        subnetwork: Optional[str] = None,
        temp_gcs_location: Optional[str] = None,
        template_gcs_path: Optional[str] = None,
        transform_name_mapping: Optional[Mapping[str, Any]] = None,
        type: Optional[str] = None,
        zone: Optional[str] = None) -> Job
func GetJob(ctx *Context, name string, id IDInput, state *JobState, opts ...ResourceOption) (*Job, error)
public static Job Get(string name, Input<string> id, JobState? state, CustomResourceOptions? opts = null)
public static Job get(String name, Output<String> id, JobState state, CustomResourceOptions options)
Resource lookup is not supported in YAML
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
resource_name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
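
A hedged TypeScript sketch of such a lookup, reusing the job ID format from the import example near the top of this page:

import * as gcp from "@pulumi/gcp";

// Look up an existing Dataflow job by its provider ID. The ID below is
// the hypothetical value from the import example, not a real job.
const existingJob = gcp.dataflow.Job.get(
    "example",
    "2022-07-31_06_25_42-11926927532632678660",
);
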
The following state arguments are supported:
AdditionalExperiments List<string>

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

EnableStreamingEngine bool

Enable/disable the use of Streaming Engine for the job. Note that Streaming Engine is enabled by default for pipelines developed against the Beam SDK for Python v2.21.0 or later when using Python 3.

IpConfiguration string

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

JobId string

The unique ID of this job.

KmsKeyName string

The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY

Labels Dictionary<string, object>

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

MachineType string

The machine type to use for the job.

MaxWorkers int

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

Name string

A unique name for the resource, required by Dataflow.

Network string

The network to which VMs will be assigned. If it is not provided, "default" will be used.

OnDelete string

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

Parameters Dictionary<string, object>

Key/Value pairs to be passed to the Dataflow job (as used in the template).

Project string

The project in which the resource belongs. If it is not provided, the provider project is used.

Region string

The region in which the created job should run.

ServiceAccountEmail string

The Service Account email used to create the job.

SkipWaitOnJobTermination bool

If set to true, Pulumi will treat DRAINING and CANCELLING as terminal states when deleting the resource, and will remove the resource from Pulumi state and move on. See above note.

State string

The current state of the resource, selected from the JobState enum.

Subnetwork string

The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL. For example "googleapis.com/compute/v1/projects/PROJECT_ID/regions/REGION/subnetworks/SUBNET_NAME".

TempGcsLocation string

A writeable location on GCS for the Dataflow job to dump its temporary data.

TemplateGcsPath string

The GCS path to the Dataflow job template.

TransformNameMapping Dictionary<string, object>

Only applicable when updating a pipeline. Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job. This field is not used outside of update.

Type string

The type of this job, selected from the JobType enum.

Zone string

The zone in which the created job should run. If it is not provided, the provider zone is used.

AdditionalExperiments []string

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

EnableStreamingEngine bool

Enable/disable the use of Streaming Engine for the job. Note that Streaming Engine is enabled by default for pipelines developed against the Beam SDK for Python v2.21.0 or later when using Python 3.

IpConfiguration string

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

JobId string

The unique ID of this job.

KmsKeyName string

The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY

Labels map[string]interface{}

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

MachineType string

The machine type to use for the job.

MaxWorkers int

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

Name string

A unique name for the resource, required by Dataflow.

Network string

The network to which VMs will be assigned. If it is not provided, "default" will be used.

OnDelete string

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

Parameters map[string]interface{}

Key/Value pairs to be passed to the Dataflow job (as used in the template).

Project string

The project in which the resource belongs. If it is not provided, the provider project is used.

Region string

The region in which the created job should run.

ServiceAccountEmail string

The Service Account email used to create the job.

SkipWaitOnJobTermination bool

If set to true, Pulumi will treat DRAINING and CANCELLING as terminal states when deleting the resource, and will remove the resource from Pulumi state and move on. See above note.

State string

The current state of the resource, selected from the JobState enum.

Subnetwork string

The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL. For example "googleapis.com/compute/v1/projects/PROJECT_ID/regions/REGION/subnetworks/SUBNET_NAME".

TempGcsLocation string

A writeable location on GCS for the Dataflow job to dump its temporary data.

TemplateGcsPath string

The GCS path to the Dataflow job template.

TransformNameMapping map[string]interface{}

Only applicable when updating a pipeline. Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job. This field is not used outside of update.

Type string

The type of this job, selected from the JobType enum.

Zone string

The zone in which the created job should run. If it is not provided, the provider zone is used.

additionalExperiments List<String>

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

enableStreamingEngine Boolean

Enable/disable the use of Streaming Engine for the job. Note that Streaming Engine is enabled by default for pipelines developed against the Beam SDK for Python v2.21.0 or later when using Python 3.

ipConfiguration String

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

jobId String

The unique ID of this job.

kmsKeyName String

The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY

labels Map<String,Object>

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

machineType String

The machine type to use for the job.

maxWorkers Integer

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

name String

A unique name for the resource, required by Dataflow.

network String

The network to which VMs will be assigned. If it is not provided, "default" will be used.

onDelete String

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters Map<String,Object>

Key/Value pairs to be passed to the Dataflow job (as used in the template).

project String

The project in which the resource belongs. If it is not provided, the provider project is used.

region String

The region in which the created job should run.

serviceAccountEmail String

The Service Account email used to create the job.

skipWaitOnJobTermination Boolean

If set to true, Pulumi will treat DRAINING and CANCELLING as terminal states when deleting the resource, and will remove the resource from Pulumi state and move on. See above note.

state String

The current state of the resource, selected from the JobState enum.

subnetwork String

The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL. For example "googleapis.com/compute/v1/projects/PROJECT_ID/regions/REGION/subnetworks/SUBNET_NAME".

tempGcsLocation String

A writeable location on GCS for the Dataflow job to dump its temporary data.

templateGcsPath String

The GCS path to the Dataflow job template.

transformNameMapping Map<String,Object>

Only applicable when updating a pipeline. Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job. This field is not used outside of update.

type String

The type of this job, selected from the JobType enum.

zone String

The zone in which the created job should run. If it is not provided, the provider zone is used.

additionalExperiments string[]

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

enableStreamingEngine boolean

Enable/disable the use of Streaming Engine for the job. Note that Streaming Engine is enabled by default for pipelines developed against the Beam SDK for Python v2.21.0 or later when using Python 3.

ipConfiguration string

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

jobId string

The unique ID of this job.

kmsKeyName string

The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY

labels {[key: string]: any}

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

machineType string

The machine type to use for the job.

maxWorkers number

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

name string

A unique name for the resource, required by Dataflow.

network string

The network to which VMs will be assigned. If it is not provided, "default" will be used.

onDelete string

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters {[key: string]: any}

Key/Value pairs to be passed to the Dataflow job (as used in the template).

project string

The project in which the resource belongs. If it is not provided, the provider project is used.

region string

The region in which the created job should run.

serviceAccountEmail string

The Service Account email used to create the job.

skipWaitOnJobTermination boolean

If set to true, Pulumi will treat DRAINING and CANCELLING as terminal states when deleting the resource, and will remove the resource from Pulumi state and move on. See above note.

state string

The current state of the resource, selected from the JobState enum.

subnetwork string

The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL. For example "googleapis.com/compute/v1/projects/PROJECT_ID/regions/REGION/subnetworks/SUBNET_NAME".

tempGcsLocation string

A writeable location on GCS for the Dataflow job to dump its temporary data.

templateGcsPath string

The GCS path to the Dataflow job template.

transformNameMapping {[key: string]: any}

Only applicable when updating a pipeline. Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job. This field is not used outside of update.

type string

The type of this job, selected from the JobType enum.

zone string

The zone in which the created job should run. If it is not provided, the provider zone is used.

additional_experiments Sequence[str]

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

enable_streaming_engine bool

Enable/disable the use of Streaming Engine for the job. Note that Streaming Engine is enabled by default for pipelines developed against the Beam SDK for Python v2.21.0 or later when using Python 3.

ip_configuration str

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

job_id str

The unique ID of this job.

kms_key_name str

The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY

labels Mapping[str, Any]

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

machine_type str

The machine type to use for the job.

max_workers int

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

name str

A unique name for the resource, required by Dataflow.

network str

The network to which VMs will be assigned. If it is not provided, "default" will be used.

on_delete str

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters Mapping[str, Any]

Key/Value pairs to be passed to the Dataflow job (as used in the template).

project str

The project in which the resource belongs. If it is not provided, the provider project is used.

region str

The region in which the created job should run.

service_account_email str

The Service Account email used to create the job.

skip_wait_on_job_termination bool

If set to true, Pulumi will treat DRAINING and CANCELLING as terminal states when deleting the resource, and will remove the resource from Pulumi state and move on. See above note.

state str

The current state of the resource, selected from the JobState enum.

subnetwork str

The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL. For example "googleapis.com/compute/v1/projects/PROJECT_ID/regions/REGION/subnetworks/SUBNET_NAME".

temp_gcs_location str

A writeable location on GCS for the Dataflow job to dump its temporary data.

template_gcs_path str

The GCS path to the Dataflow job template.

transform_name_mapping Mapping[str, Any]

Only applicable when updating a pipeline. Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job. This field is not used outside of update.

type str

The type of this job, selected from the JobType enum.

zone str

The zone in which the created job should run. If it is not provided, the provider zone is used.

additionalExperiments List<String>

List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].

enableStreamingEngine Boolean

Enable/disable the use of Streaming Engine for the job. Note that Streaming Engine is enabled by default for pipelines developed against the Beam SDK for Python v2.21.0 or later when using Python 3.

ipConfiguration String

The configuration for VM IPs. Options are "WORKER_IP_PUBLIC" or "WORKER_IP_PRIVATE".

jobId String

The unique ID of this job.

kmsKeyName String

The name for the Cloud KMS key for the job. Key format is: projects/PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY

labels Map<Any>

User labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. NOTE: Google-provided Dataflow templates often provide default labels that begin with goog-dataflow-provided. Unless explicitly set in config, these labels will be ignored to prevent diffs on re-apply.

machineType String

The machine type to use for the job.

maxWorkers Number

The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.

name String

A unique name for the resource, required by Dataflow.

network String

The network to which VMs will be assigned. If it is not provided, "default" will be used.

onDelete String

One of "drain" or "cancel". Specifies behavior of deletion during pulumi destroy. See above note.

parameters Map<Any>

Key/Value pairs to be passed to the Dataflow job (as used in the template).

project String

The project in which the resource belongs. If it is not provided, the provider project is used.

region String

The region in which the created job should run.

serviceAccountEmail String

The Service Account email used to create the job.

skipWaitOnJobTermination Boolean

If set to true, Pulumi will treat DRAINING and CANCELLING as terminal states when deleting the resource, and will remove the resource from Pulumi state and move on. See above note.

state String

The current state of the resource, selected from the JobState enum.

subnetwork String

The subnetwork to which VMs will be assigned. Should be of the form "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL. For example "googleapis.com/compute/v1/projects/PROJECT_ID/regions/REGION/subnetworks/SUBNET_NAME".

tempGcsLocation String

A writeable location on GCS for the Dataflow job to dump its temporary data.

templateGcsPath String

The GCS path to the Dataflow job template.

transformNameMapping Map<Any>

Only applicable when updating a pipeline. Map of transform name prefixes of the job to be replaced with the corresponding name prefixes of the new job. This field is not used outside of update.

type String

The type of this job, selected from the JobType enum.

zone String

The zone in which the created job should run. If it is not provided, the provider zone is used.

Package Details

Repository
Google Cloud (GCP) Classic pulumi/pulumi-gcp
License
Apache-2.0
Notes

This Pulumi package is based on the google-beta Terraform Provider.