aws-native.appflow.getFlow
We recommend new projects start with resources from the AWS provider.
Resource schema for AWS::AppFlow::Flow.
Using getFlow
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
function getFlow(args: GetFlowArgs, opts?: InvokeOptions): Promise<GetFlowResult>
function getFlowOutput(args: GetFlowOutputArgs, opts?: InvokeOptions): Output<GetFlowResult>

def get_flow(flow_name: Optional[str] = None,
             opts: Optional[InvokeOptions] = None) -> GetFlowResult
def get_flow_output(flow_name: Optional[pulumi.Input[str]] = None,
             opts: Optional[InvokeOptions] = None) -> Output[GetFlowResult]

func LookupFlow(ctx *Context, args *LookupFlowArgs, opts ...InvokeOption) (*LookupFlowResult, error)
func LookupFlowOutput(ctx *Context, args *LookupFlowOutputArgs, opts ...InvokeOption) LookupFlowResultOutput

> Note: This function is named LookupFlow in the Go SDK.
public static class GetFlow 
{
    public static Task<GetFlowResult> InvokeAsync(GetFlowArgs args, InvokeOptions? opts = null)
    public static Output<GetFlowResult> Invoke(GetFlowInvokeArgs args, InvokeOptions? opts = null)
}

public static CompletableFuture<GetFlowResult> getFlow(GetFlowArgs args, InvokeOptions options)
public static Output<GetFlowResult> getFlow(GetFlowArgs args, InvokeOptions options)
fn::invoke:
  function: aws-native:appflow:getFlow
  arguments:
    # arguments dictionary

The following arguments are supported:
- FlowName string
- Name of the flow.
- FlowName string
- Name of the flow.
- flowName String
- Name of the flow.
- flowName string
- Name of the flow.
- flow_name str
- Name of the flow.
- flowName String
- Name of the flow.
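For example, a minimal TypeScript sketch of the output-form invocation (the flow name "my-flow" and the exported names are placeholders, not part of this schema):

import * as awsNative from "@pulumi/aws-native";

// Output form: accepts Input-wrapped arguments and returns an Output-wrapped result,
// so it composes directly with other resources and stack exports.
const flow = awsNative.appflow.getFlowOutput({ flowName: "my-flow" });

// The result exposes the output properties listed below, e.g. flowArn and flowStatus.
export const flowArn = flow.flowArn;
export const flowStatus = flow.flowStatus;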
getFlow Result
The following output properties are available:
- Description string
- Description of the flow.
- DestinationFlowConfigList List<Pulumi.AwsNative.AppFlow.Outputs.FlowDestinationFlowConfig>
- List of Destination connectors of the flow.
- FlowArn string
- ARN identifier of the flow.
- FlowStatus Pulumi.AwsNative.AppFlow.FlowStatus
- Flow activation status for Scheduled- and Event-triggered flows
- MetadataCatalogConfig Pulumi.AwsNative.AppFlow.Outputs.FlowMetadataCatalogConfig
- Configurations of metadata catalog of the flow.
- SourceFlowConfig Pulumi.AwsNative.AppFlow.Outputs.FlowSourceFlowConfig
- Configurations of Source connector of the flow.
- Tags List<Pulumi.AwsNative.Outputs.Tag>
- List of Tags.
- Tasks List<Pulumi.AwsNative.AppFlow.Outputs.FlowTask>
- List of tasks for the flow.
- TriggerConfig Pulumi.AwsNative.AppFlow.Outputs.FlowTriggerConfig
- Trigger settings of the flow.
- Description string
- Description of the flow.
- DestinationFlowConfigList []FlowDestinationFlowConfig
- List of Destination connectors of the flow.
- FlowArn string
- ARN identifier of the flow.
- FlowStatus FlowStatus
- Flow activation status for Scheduled- and Event-triggered flows
- MetadataCatalogConfig FlowMetadataCatalogConfig
- Configurations of metadata catalog of the flow.
- SourceFlowConfig FlowSourceFlowConfig
- Configurations of Source connector of the flow.
- Tags []Tag
- List of Tags.
- Tasks []FlowTask
- List of tasks for the flow.
- TriggerConfig FlowTriggerConfig
- Trigger settings of the flow.
- description String
- Description of the flow.
- destinationFlowConfigList List<FlowDestinationFlowConfig>
- List of Destination connectors of the flow.
- flowArn String
- ARN identifier of the flow.
- flowStatus FlowStatus
- Flow activation status for Scheduled- and Event-triggered flows
- metadataCatalogConfig FlowMetadataCatalogConfig
- Configurations of metadata catalog of the flow.
- sourceFlowConfig FlowSourceFlowConfig
- Configurations of Source connector of the flow.
- tags List<Tag>
- List of Tags.
- tasks List<FlowTask>
- List of tasks for the flow.
- triggerConfig FlowTriggerConfig
- Trigger settings of the flow.
- description string
- Description of the flow.
- destinationFlowConfigList FlowDestinationFlowConfig[]
- List of Destination connectors of the flow.
- flowArn string
- ARN identifier of the flow.
- flowStatus FlowStatus
- Flow activation status for Scheduled- and Event-triggered flows
- metadataCatalogConfig FlowMetadataCatalogConfig
- Configurations of metadata catalog of the flow.
- sourceFlowConfig FlowSourceFlowConfig
- Configurations of Source connector of the flow.
- tags Tag[]
- List of Tags.
- tasks FlowTask[]
- List of tasks for the flow.
- triggerConfig FlowTriggerConfig
- Trigger settings of the flow.
- description str
- Description of the flow.
- destination_flow_config_list Sequence[FlowDestinationFlowConfig]
- List of Destination connectors of the flow.
- flow_arn str
- ARN identifier of the flow.
- flow_status FlowStatus
- Flow activation status for Scheduled- and Event-triggered flows
- metadata_catalog_config FlowMetadataCatalogConfig
- Configurations of metadata catalog of the flow.
- source_flow_config FlowSourceFlowConfig
- Configurations of Source connector of the flow.
- tags Sequence[root_Tag]
- List of Tags.
- tasks Sequence[FlowTask]
- List of tasks for the flow.
- trigger_config FlowTriggerConfig
- Trigger settings of the flow.
- description String
- Description of the flow.
- destinationFlowConfigList List<Property Map>
- List of Destination connectors of the flow.
- flowArn String
- ARN identifier of the flow.
- flowStatus "Active" | "Suspended" | "Draft"
- Flow activation status for Scheduled- and Event-triggered flows
- metadataCatalogConfig Property Map
- Configurations of metadata catalog of the flow.
- sourceFlowConfig Property Map
- Configurations of Source connector of the flow.
- tags List<Property Map>
- List of Tags.
- tasks List<Property Map>
- List of tasks for the flow.
- triggerConfig Property Map
- Trigger settings of the flow.
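As a rough TypeScript sketch of consuming these outputs (again with "my-flow" as a placeholder flow name), nested properties such as triggerConfig and tasks can be read off the Output-wrapped result:

import * as awsNative from "@pulumi/aws-native";

const flow = awsNative.appflow.getFlowOutput({ flowName: "my-flow" });

// Nested output properties are lifted onto the Output; apply() unwraps them
// when a plain value is needed and handles the case where they are unset.
export const triggerType = flow.triggerConfig.apply(tc => tc?.triggerType);
export const taskCount = flow.tasks.apply(ts => ts?.length ?? 0);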
Supporting Types
FlowAggregationConfig  
- AggregationType Pulumi.AwsNative.AppFlow.FlowAggregationType
- Specifies whether Amazon AppFlow aggregates the flow records into a single file, or leaves them unaggregated.
- TargetFileSize int
- The desired file size, in MB, for each output file that Amazon AppFlow writes to the flow destination. For each file, Amazon AppFlow attempts to achieve the size that you specify. The actual file sizes might differ from this target based on the number and size of the records that each file contains.
- AggregationType FlowAggregationType
- Specifies whether Amazon AppFlow aggregates the flow records into a single file, or leaves them unaggregated.
- TargetFileSize int
- The desired file size, in MB, for each output file that Amazon AppFlow writes to the flow destination. For each file, Amazon AppFlow attempts to achieve the size that you specify. The actual file sizes might differ from this target based on the number and size of the records that each file contains.
- aggregationType FlowAggregationType
- Specifies whether Amazon AppFlow aggregates the flow records into a single file, or leaves them unaggregated.
- targetFileSize Integer
- The desired file size, in MB, for each output file that Amazon AppFlow writes to the flow destination. For each file, Amazon AppFlow attempts to achieve the size that you specify. The actual file sizes might differ from this target based on the number and size of the records that each file contains.
- aggregationType FlowAggregationType
- Specifies whether Amazon AppFlow aggregates the flow records into a single file, or leaves them unaggregated.
- targetFileSize number
- The desired file size, in MB, for each output file that Amazon AppFlow writes to the flow destination. For each file, Amazon AppFlow attempts to achieve the size that you specify. The actual file sizes might differ from this target based on the number and size of the records that each file contains.
- aggregation_type FlowAggregationType
- Specifies whether Amazon AppFlow aggregates the flow records into a single file, or leaves them unaggregated.
- target_file_size int
- The desired file size, in MB, for each output file that Amazon AppFlow writes to the flow destination. For each file, Amazon AppFlow attempts to achieve the size that you specify. The actual file sizes might differ from this target based on the number and size of the records that each file contains.
- aggregationType "None" | "SingleFile"
- Specifies whether Amazon AppFlow aggregates the flow records into a single file, or leaves them unaggregated.
- targetFileSize Number
- The desired file size, in MB, for each output file that Amazon AppFlow writes to the flow destination. For each file, Amazon AppFlow attempts to achieve the size that you specify. The actual file sizes might differ from this target based on the number and size of the records that each file contains.
FlowAggregationType  
FlowAmplitudeConnectorOperator   
FlowAmplitudeSourceProperties   
- Object string
- The object specified in the Amplitude flow source.
- Object string
- The object specified in the Amplitude flow source.
- object String
- The object specified in the Amplitude flow source.
- object string
- The object specified in the Amplitude flow source.
- object str
- The object specified in the Amplitude flow source.
- object String
- The object specified in the Amplitude flow source.
FlowConnectorOperator  
- Amplitude Pulumi.AwsNative.AppFlow.FlowAmplitudeConnectorOperator
- The operation to be performed on the provided Amplitude source fields.
- CustomConnector Pulumi.AwsNative.AppFlow.FlowOperator
- Operators supported by the custom connector.
- Datadog Pulumi.AwsNative.AppFlow.FlowDatadogConnectorOperator
- The operation to be performed on the provided Datadog source fields.
- Dynatrace Pulumi.AwsNative.AppFlow.FlowDynatraceConnectorOperator
- The operation to be performed on the provided Dynatrace source fields.
- GoogleAnalytics Pulumi.AwsNative.AppFlow.FlowGoogleAnalyticsConnectorOperator
- The operation to be performed on the provided Google Analytics source fields.
- InforNexus Pulumi.AwsNative.AppFlow.FlowInforNexusConnectorOperator
- The operation to be performed on the provided Infor Nexus source fields.
- Marketo Pulumi.AwsNative.AppFlow.FlowMarketoConnectorOperator
- The operation to be performed on the provided Marketo source fields.
- Pardot Pulumi.AwsNative.AppFlow.FlowPardotConnectorOperator
- The operation to be performed on the provided Salesforce Pardot source fields.
- S3 Pulumi.AwsNative.AppFlow.FlowS3ConnectorOperator
- The operation to be performed on the provided Amazon S3 source fields.
- Salesforce Pulumi.AwsNative.AppFlow.FlowSalesforceConnectorOperator
- The operation to be performed on the provided Salesforce source fields.
- SapoData Pulumi.AwsNative.AppFlow.FlowSapoDataConnectorOperator
- The operation to be performed on the provided SAPOData source fields.
- ServiceNow Pulumi.AwsNative.AppFlow.FlowServiceNowConnectorOperator
- The operation to be performed on the provided ServiceNow source fields.
- Singular Pulumi.AwsNative.AppFlow.FlowSingularConnectorOperator
- The operation to be performed on the provided Singular source fields.
- Slack Pulumi.AwsNative.AppFlow.FlowSlackConnectorOperator
- The operation to be performed on the provided Slack source fields.
- Trendmicro Pulumi.AwsNative.AppFlow.FlowTrendmicroConnectorOperator
- The operation to be performed on the provided Trend Micro source fields.
- Veeva Pulumi.AwsNative.AppFlow.FlowVeevaConnectorOperator
- The operation to be performed on the provided Veeva source fields.
- Zendesk Pulumi.AwsNative.AppFlow.FlowZendeskConnectorOperator
- The operation to be performed on the provided Zendesk source fields.
- Amplitude FlowAmplitudeConnectorOperator
- The operation to be performed on the provided Amplitude source fields.
- CustomConnector FlowOperator
- Operators supported by the custom connector.
- Datadog FlowDatadogConnectorOperator
- The operation to be performed on the provided Datadog source fields.
- Dynatrace FlowDynatraceConnectorOperator
- The operation to be performed on the provided Dynatrace source fields.
- GoogleAnalytics FlowGoogleAnalyticsConnectorOperator
- The operation to be performed on the provided Google Analytics source fields.
- InforNexus FlowInforNexusConnectorOperator
- The operation to be performed on the provided Infor Nexus source fields.
- Marketo FlowMarketoConnectorOperator
- The operation to be performed on the provided Marketo source fields.
- Pardot FlowPardotConnectorOperator
- The operation to be performed on the provided Salesforce Pardot source fields.
- S3 FlowS3ConnectorOperator
- The operation to be performed on the provided Amazon S3 source fields.
- Salesforce FlowSalesforceConnectorOperator
- The operation to be performed on the provided Salesforce source fields.
- SapoData FlowSapoDataConnectorOperator
- The operation to be performed on the provided SAPOData source fields.
- ServiceNow FlowServiceNowConnectorOperator
- The operation to be performed on the provided ServiceNow source fields.
- Singular FlowSingularConnectorOperator
- The operation to be performed on the provided Singular source fields.
- Slack FlowSlackConnectorOperator
- The operation to be performed on the provided Slack source fields.
- Trendmicro FlowTrendmicroConnectorOperator
- The operation to be performed on the provided Trend Micro source fields.
- Veeva FlowVeevaConnectorOperator
- The operation to be performed on the provided Veeva source fields.
- Zendesk FlowZendeskConnectorOperator
- The operation to be performed on the provided Zendesk source fields.
- amplitude FlowAmplitudeConnectorOperator
- The operation to be performed on the provided Amplitude source fields.
- customConnector FlowOperator
- Operators supported by the custom connector.
- datadog FlowDatadogConnectorOperator
- The operation to be performed on the provided Datadog source fields.
- dynatrace FlowDynatraceConnectorOperator
- The operation to be performed on the provided Dynatrace source fields.
- googleAnalytics FlowGoogleAnalyticsConnectorOperator
- The operation to be performed on the provided Google Analytics source fields.
- inforNexus FlowInforNexusConnectorOperator
- The operation to be performed on the provided Infor Nexus source fields.
- marketo FlowMarketoConnectorOperator
- The operation to be performed on the provided Marketo source fields.
- pardot FlowPardotConnectorOperator
- The operation to be performed on the provided Salesforce Pardot source fields.
- s3 FlowS3ConnectorOperator
- The operation to be performed on the provided Amazon S3 source fields.
- salesforce FlowSalesforceConnectorOperator
- The operation to be performed on the provided Salesforce source fields.
- sapoData FlowSapoDataConnectorOperator
- The operation to be performed on the provided SAPOData source fields.
- serviceNow FlowServiceNowConnectorOperator
- The operation to be performed on the provided ServiceNow source fields.
- singular FlowSingularConnectorOperator
- The operation to be performed on the provided Singular source fields.
- slack FlowSlackConnectorOperator
- The operation to be performed on the provided Slack source fields.
- trendmicro FlowTrendmicroConnectorOperator
- The operation to be performed on the provided Trend Micro source fields.
- veeva FlowVeevaConnectorOperator
- The operation to be performed on the provided Veeva source fields.
- zendesk FlowZendeskConnectorOperator
- The operation to be performed on the provided Zendesk source fields.
- amplitude FlowAmplitudeConnectorOperator
- The operation to be performed on the provided Amplitude source fields.
- customConnector FlowOperator
- Operators supported by the custom connector.
- datadog FlowDatadogConnectorOperator
- The operation to be performed on the provided Datadog source fields.
- dynatrace FlowDynatraceConnectorOperator
- The operation to be performed on the provided Dynatrace source fields.
- googleAnalytics FlowGoogleAnalyticsConnectorOperator
- The operation to be performed on the provided Google Analytics source fields.
- inforNexus FlowInforNexusConnectorOperator
- The operation to be performed on the provided Infor Nexus source fields.
- marketo FlowMarketoConnectorOperator
- The operation to be performed on the provided Marketo source fields.
- pardot FlowPardotConnectorOperator
- The operation to be performed on the provided Salesforce Pardot source fields.
- s3 FlowS3ConnectorOperator
- The operation to be performed on the provided Amazon S3 source fields.
- salesforce FlowSalesforceConnectorOperator
- The operation to be performed on the provided Salesforce source fields.
- sapoData FlowSapoDataConnectorOperator
- The operation to be performed on the provided SAPOData source fields.
- serviceNow FlowServiceNowConnectorOperator
- The operation to be performed on the provided ServiceNow source fields.
- singular FlowSingularConnectorOperator
- The operation to be performed on the provided Singular source fields.
- slack FlowSlackConnectorOperator
- The operation to be performed on the provided Slack source fields.
- trendmicro FlowTrendmicroConnectorOperator
- The operation to be performed on the provided Trend Micro source fields.
- veeva FlowVeevaConnectorOperator
- The operation to be performed on the provided Veeva source fields.
- zendesk FlowZendeskConnectorOperator
- The operation to be performed on the provided Zendesk source fields.
- amplitude FlowAmplitudeConnectorOperator
- The operation to be performed on the provided Amplitude source fields.
- custom_connector FlowOperator
- Operators supported by the custom connector.
- datadog FlowDatadogConnectorOperator
- The operation to be performed on the provided Datadog source fields.
- dynatrace FlowDynatraceConnectorOperator
- The operation to be performed on the provided Dynatrace source fields.
- google_analytics FlowGoogleAnalyticsConnectorOperator
- The operation to be performed on the provided Google Analytics source fields.
- infor_nexus FlowInforNexusConnectorOperator
- The operation to be performed on the provided Infor Nexus source fields.
- marketo FlowMarketoConnectorOperator
- The operation to be performed on the provided Marketo source fields.
- pardot FlowPardotConnectorOperator
- The operation to be performed on the provided Salesforce Pardot source fields.
- s3 FlowS3ConnectorOperator
- The operation to be performed on the provided Amazon S3 source fields.
- salesforce FlowSalesforceConnectorOperator
- The operation to be performed on the provided Salesforce source fields.
- sapo_data FlowSapoDataConnectorOperator
- The operation to be performed on the provided SAPOData source fields.
- service_now FlowServiceNowConnectorOperator
- The operation to be performed on the provided ServiceNow source fields.
- singular FlowSingularConnectorOperator
- The operation to be performed on the provided Singular source fields.
- slack FlowSlackConnectorOperator
- The operation to be performed on the provided Slack source fields.
- trendmicro FlowTrendmicroConnectorOperator
- The operation to be performed on the provided Trend Micro source fields.
- veeva FlowVeevaConnectorOperator
- The operation to be performed on the provided Veeva source fields.
- zendesk FlowZendeskConnectorOperator
- The operation to be performed on the provided Zendesk source fields.
- amplitude "BETWEEN"
- The operation to be performed on the provided Amplitude source fields.
- customConnector "PROJECTION" | "LESS_THAN" | "GREATER_THAN" | "CONTAINS" | "BETWEEN" | "LESS_THAN_OR_EQUAL_TO" | "GREATER_THAN_OR_EQUAL_TO" | "EQUAL_TO" | "NOT_EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP"
- Operators supported by the custom connector.
- datadog "PROJECTION" | "BETWEEN" | "EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP"
- The operation to be performed on the provided Datadog source fields.
- dynatrace "PROJECTION" | "BETWEEN" | "EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP"
- The operation to be performed on the provided Dynatrace source fields.
- googleAnalytics "PROJECTION" | "BETWEEN"
- The operation to be performed on the provided Google Analytics source fields.
- inforNexus "PROJECTION" | "BETWEEN" | "EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP"
- The operation to be performed on the provided Infor Nexus source fields.
- marketo "PROJECTION" | "LESS_THAN" | "GREATER_THAN" | "BETWEEN" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP"
- The operation to be performed on the provided Marketo source fields.
- pardot "PROJECTION" | "EQUAL_TO" | "NO_OP" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC"
- The operation to be performed on the provided Salesforce Pardot source fields.
- s3 "PROJECTION" | "LESS_THAN" | "GREATER_THAN" | "BETWEEN" | "LESS_THAN_OR_EQUAL_TO" | "GREATER_THAN_OR_EQUAL_TO" | "EQUAL_TO" | "NOT_EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP"
- The operation to be performed on the provided Amazon S3 source fields.
- salesforce "PROJECTION" | "LESS_THAN" | "CONTAINS" | "GREATER_THAN" | "BETWEEN" | "LESS_THAN_OR_EQUAL_TO" | "GREATER_THAN_OR_EQUAL_TO" | "EQUAL_TO" | "NOT_EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP"
- The operation to be performed on the provided Salesforce source fields.
- sapoData "PROJECTION" | "LESS_THAN" | "CONTAINS" | "GREATER_THAN" | "BETWEEN" | "LESS_THAN_OR_EQUAL_TO" | "GREATER_THAN_OR_EQUAL_TO" | "EQUAL_TO" | "NOT_EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP"
- The operation to be performed on the provided SAPOData source fields.
- serviceNow "PROJECTION" | "LESS_THAN" | "CONTAINS" | "GREATER_THAN" | "BETWEEN" | "LESS_THAN_OR_EQUAL_TO" | "GREATER_THAN_OR_EQUAL_TO" | "EQUAL_TO" | "NOT_EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP"
- The operation to be performed on the provided ServiceNow source fields.
- singular "PROJECTION" | "EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP"
- The operation to be performed on the provided Singular source fields.
- slack "PROJECTION" | "BETWEEN" | "EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP"
- The operation to be performed on the provided Slack source fields.
- trendmicro "PROJECTION" | "EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP"
- The operation to be performed on the provided Trend Micro source fields.
- veeva "PROJECTION" | "LESS_THAN" | "GREATER_THAN" | "BETWEEN" | "LESS_THAN_OR_EQUAL_TO" | "GREATER_THAN_OR_EQUAL_TO" | "EQUAL_TO" | "NOT_EQUAL_TO" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP"
- The operation to be performed on the provided Veeva source fields.
- zendesk "PROJECTION" | "GREATER_THAN" | "ADDITION" | "MULTIPLICATION" | "DIVISION" | "SUBTRACTION" | "MASK_ALL" | "MASK_FIRST_N" | "MASK_LAST_N" | "VALIDATE_NON_NULL" | "VALIDATE_NON_ZERO" | "VALIDATE_NON_NEGATIVE" | "VALIDATE_NUMERIC" | "NO_OP"
- The operation to be performed on the provided Zendesk source fields.
FlowConnectorType  
FlowCustomConnectorDestinationProperties    
- EntityName string
- The entity specified in the custom connector as a destination in the flow.
- CustomProperties Dictionary<string, string>
- The custom properties that are specific to the connector when it's used as a destination in the flow.
- ErrorHandlingConfig Pulumi.AwsNative.AppFlow.Inputs.FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the custom connector as destination.
- IdFieldNames List<string>
- List of fields used as ID when performing a write operation.
- WriteOperationType Pulumi.AwsNative.AppFlow.FlowWriteOperationType
- Specifies the type of write operation to be performed in the custom connector when it's used as destination.
- EntityName string
- The entity specified in the custom connector as a destination in the flow.
- CustomProperties map[string]string
- The custom properties that are specific to the connector when it's used as a destination in the flow.
- ErrorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the custom connector as destination.
- IdFieldNames []string
- List of fields used as ID when performing a write operation.
- WriteOperationType FlowWriteOperationType
- Specifies the type of write operation to be performed in the custom connector when it's used as destination.
- entityName String
- The entity specified in the custom connector as a destination in the flow.
- customProperties Map<String,String>
- The custom properties that are specific to the connector when it's used as a destination in the flow.
- errorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the custom connector as destination.
- idFieldNames List<String>
- List of fields used as ID when performing a write operation.
- writeOperationType FlowWriteOperationType
- Specifies the type of write operation to be performed in the custom connector when it's used as destination.
- entityName string
- The entity specified in the custom connector as a destination in the flow.
- customProperties {[key: string]: string}
- The custom properties that are specific to the connector when it's used as a destination in the flow.
- errorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the custom connector as destination.
- idFieldNames string[]
- List of fields used as ID when performing a write operation.
- writeOperationType FlowWriteOperationType
- Specifies the type of write operation to be performed in the custom connector when it's used as destination.
- entity_name str
- The entity specified in the custom connector as a destination in the flow.
- custom_properties Mapping[str, str]
- The custom properties that are specific to the connector when it's used as a destination in the flow.
- error_handling_config FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the custom connector as destination.
- id_field_names Sequence[str]
- List of fields used as ID when performing a write operation.
- write_operation_type FlowWriteOperationType
- Specifies the type of write operation to be performed in the custom connector when it's used as destination.
- entityName String
- The entity specified in the custom connector as a destination in the flow.
- customProperties Map<String>
- The custom properties that are specific to the connector when it's used as a destination in the flow.
- errorHandlingConfig Property Map
- The settings that determine how Amazon AppFlow handles an error when placing data in the custom connector as destination.
- idFieldNames List<String>
- List of fields used as ID when performing a write operation.
- writeOperationType "INSERT" | "UPSERT" | "UPDATE" | "DELETE"
- Specifies the type of write operation to be performed in the custom connector when it's used as destination.
FlowCustomConnectorSourceProperties    
- EntityName string
- The entity specified in the custom connector as a source in the flow.
- CustomProperties Dictionary<string, string>
- Custom properties that are required to use the custom connector as a source.
- DataTransferApi Pulumi.AwsNative.AppFlow.Inputs.FlowCustomConnectorSourcePropertiesDataTransferApiProperties
- The API of the connector application that Amazon AppFlow uses to transfer your data.
- EntityName string
- The entity specified in the custom connector as a source in the flow.
- CustomProperties map[string]string
- Custom properties that are required to use the custom connector as a source.
- DataTransferApi FlowCustomConnectorSourcePropertiesDataTransferApiProperties
- The API of the connector application that Amazon AppFlow uses to transfer your data.
- entityName String
- The entity specified in the custom connector as a source in the flow.
- customProperties Map<String,String>
- Custom properties that are required to use the custom connector as a source.
- dataTransferApi FlowCustomConnectorSourcePropertiesDataTransferApiProperties
- The API of the connector application that Amazon AppFlow uses to transfer your data.
- entityName string
- The entity specified in the custom connector as a source in the flow.
- customProperties {[key: string]: string}
- Custom properties that are required to use the custom connector as a source.
- dataTransferApi FlowCustomConnectorSourcePropertiesDataTransferApiProperties
- The API of the connector application that Amazon AppFlow uses to transfer your data.
- entity_name str
- The entity specified in the custom connector as a source in the flow.
- custom_properties Mapping[str, str]
- Custom properties that are required to use the custom connector as a source.
- data_transfer_api FlowCustomConnectorSourcePropertiesDataTransferApiProperties
- The API of the connector application that Amazon AppFlow uses to transfer your data.
- entityName String
- The entity specified in the custom connector as a source in the flow.
- customProperties Map<String>
- Custom properties that are required to use the custom connector as a source.
- dataTransferApi Property Map
- The API of the connector application that Amazon AppFlow uses to transfer your data.
FlowCustomConnectorSourcePropertiesDataTransferApiProperties        
FlowCustomConnectorSourcePropertiesDataTransferApiPropertiesType         
FlowDataTransferApi   
FlowDatadogConnectorOperator   
FlowDatadogSourceProperties   
- Object string
- The object specified in the Datadog flow source.
- Object string
- The object specified in the Datadog flow source.
- object String
- The object specified in the Datadog flow source.
- object string
- The object specified in the Datadog flow source.
- object str
- The object specified in the Datadog flow source.
- object String
- The object specified in the Datadog flow source.
FlowDestinationConnectorProperties   
- CustomConnector Pulumi.AwsNative.AppFlow.Inputs.FlowCustomConnectorDestinationProperties
- The properties that are required to query the custom Connector.
- EventBridge Pulumi.AwsNative.AppFlow.Inputs.FlowEventBridgeDestinationProperties
- The properties required to query Amazon EventBridge.
- LookoutMetrics Pulumi.AwsNative.AppFlow.Inputs.FlowLookoutMetricsDestinationProperties
- The properties required to query Amazon Lookout for Metrics.
- Marketo Pulumi.AwsNative.AppFlow.Inputs.FlowMarketoDestinationProperties
- The properties required to query Marketo.
- Redshift Pulumi.AwsNative.AppFlow.Inputs.FlowRedshiftDestinationProperties
- The properties required to query Amazon Redshift.
- S3 Pulumi.AwsNative.AppFlow.Inputs.FlowS3DestinationProperties
- The properties required to query Amazon S3.
- Salesforce Pulumi.AwsNative.AppFlow.Inputs.FlowSalesforceDestinationProperties
- The properties required to query Salesforce.
- SapoData Pulumi.AwsNative.AppFlow.Inputs.FlowSapoDataDestinationProperties
- The properties required to query SAPOData.
- Snowflake Pulumi.AwsNative.AppFlow.Inputs.FlowSnowflakeDestinationProperties
- The properties required to query Snowflake.
- Upsolver Pulumi.AwsNative.AppFlow.Inputs.FlowUpsolverDestinationProperties
- The properties required to query Upsolver.
- Zendesk Pulumi.AwsNative.AppFlow.Inputs.FlowZendeskDestinationProperties
- The properties required to query Zendesk.
- CustomConnector FlowCustomConnectorDestinationProperties
- The properties that are required to query the custom Connector.
- EventBridge FlowEventBridgeDestinationProperties
- The properties required to query Amazon EventBridge.
- LookoutMetrics FlowLookoutMetricsDestinationProperties
- The properties required to query Amazon Lookout for Metrics.
- Marketo FlowMarketoDestinationProperties
- The properties required to query Marketo.
- Redshift FlowRedshiftDestinationProperties
- The properties required to query Amazon Redshift.
- S3 FlowS3DestinationProperties
- The properties required to query Amazon S3.
- Salesforce FlowSalesforceDestinationProperties
- The properties required to query Salesforce.
- SapoData FlowSapoDataDestinationProperties
- The properties required to query SAPOData.
- Snowflake FlowSnowflakeDestinationProperties
- The properties required to query Snowflake.
- Upsolver FlowUpsolverDestinationProperties
- The properties required to query Upsolver.
- Zendesk FlowZendeskDestinationProperties
- The properties required to query Zendesk.
- customConnector FlowCustomConnectorDestinationProperties
- The properties that are required to query the custom Connector.
- eventBridge FlowEventBridgeDestinationProperties
- The properties required to query Amazon EventBridge.
- lookoutMetrics FlowLookoutMetricsDestinationProperties
- The properties required to query Amazon Lookout for Metrics.
- marketo FlowMarketoDestinationProperties
- The properties required to query Marketo.
- redshift FlowRedshiftDestinationProperties
- The properties required to query Amazon Redshift.
- s3 FlowS3DestinationProperties
- The properties required to query Amazon S3.
- salesforce FlowSalesforceDestinationProperties
- The properties required to query Salesforce.
- sapoData FlowSapoDataDestinationProperties
- The properties required to query SAPOData.
- snowflake FlowSnowflakeDestinationProperties
- The properties required to query Snowflake.
- upsolver FlowUpsolverDestinationProperties
- The properties required to query Upsolver.
- zendesk FlowZendeskDestinationProperties
- The properties required to query Zendesk.
- customConnector FlowCustomConnectorDestinationProperties
- The properties that are required to query the custom Connector.
- eventBridge FlowEventBridgeDestinationProperties
- The properties required to query Amazon EventBridge.
- lookoutMetrics FlowLookoutMetricsDestinationProperties
- The properties required to query Amazon Lookout for Metrics.
- marketo FlowMarketoDestinationProperties
- The properties required to query Marketo.
- redshift FlowRedshiftDestinationProperties
- The properties required to query Amazon Redshift.
- s3 FlowS3DestinationProperties
- The properties required to query Amazon S3.
- salesforce FlowSalesforceDestinationProperties
- The properties required to query Salesforce.
- sapoData FlowSapoDataDestinationProperties
- The properties required to query SAPOData.
- snowflake FlowSnowflakeDestinationProperties
- The properties required to query Snowflake.
- upsolver FlowUpsolverDestinationProperties
- The properties required to query Upsolver.
- zendesk FlowZendeskDestinationProperties
- The properties required to query Zendesk.
- custom_connector FlowCustomConnectorDestinationProperties
- The properties that are required to query the custom Connector.
- event_bridge FlowEventBridgeDestinationProperties
- The properties required to query Amazon EventBridge.
- lookout_metrics FlowLookoutMetricsDestinationProperties
- The properties required to query Amazon Lookout for Metrics.
- marketo FlowMarketoDestinationProperties
- The properties required to query Marketo.
- redshift FlowRedshiftDestinationProperties
- The properties required to query Amazon Redshift.
- s3 FlowS3DestinationProperties
- The properties required to query Amazon S3.
- salesforce FlowSalesforceDestinationProperties
- The properties required to query Salesforce.
- sapo_data FlowSapoDataDestinationProperties
- The properties required to query SAPOData.
- snowflake FlowSnowflakeDestinationProperties
- The properties required to query Snowflake.
- upsolver FlowUpsolverDestinationProperties
- The properties required to query Upsolver.
- zendesk FlowZendeskDestinationProperties
- The properties required to query Zendesk.
- customConnector Property Map
- The properties that are required to query the custom Connector.
- eventBridge Property Map
- The properties required to query Amazon EventBridge.
- lookoutMetrics Property Map
- The properties required to query Amazon Lookout for Metrics.
- marketo Property Map
- The properties required to query Marketo.
- redshift Property Map
- The properties required to query Amazon Redshift.
- s3 Property Map
- The properties required to query Amazon S3.
- salesforce Property Map
- The properties required to query Salesforce.
- sapoData Property Map
- The properties required to query SAPOData.
- snowflake Property Map
- The properties required to query Snowflake.
- upsolver Property Map
- The properties required to query Upsolver.
- zendesk Property Map
- The properties required to query Zendesk.
FlowDestinationFlowConfig   
- ConnectorType Pulumi.AwsNative.AppFlow.FlowConnectorType
- Destination connector type
- DestinationConnectorProperties Pulumi.AwsNative.AppFlow.Inputs.FlowDestinationConnectorProperties
- Destination connector details
- ApiVersion string
- The API version that the destination connector uses.
- ConnectorProfileName string
- Name of destination connector profile
- ConnectorType FlowConnectorType
- Destination connector type
- DestinationConnectorProperties FlowDestinationConnectorProperties
- Destination connector details
- ApiVersion string
- The API version that the destination connector uses.
- ConnectorProfileName string
- Name of destination connector profile
- connectorType FlowConnectorType
- Destination connector type
- destinationConnectorProperties FlowDestinationConnectorProperties
- Destination connector details
- apiVersion String
- The API version that the destination connector uses.
- connectorProfileName String
- Name of destination connector profile
- connectorType FlowConnectorType
- Destination connector type
- destinationConnectorProperties FlowDestinationConnectorProperties
- Destination connector details
- apiVersion string
- The API version that the destination connector uses.
- connectorProfileName string
- Name of destination connector profile
- connector_type FlowConnectorType
- Destination connector type
- destination_connector_properties FlowDestinationConnectorProperties
- Destination connector details
- api_version str
- The API version that the destination connector uses.
- connector_profile_name str
- Name of destination connector profile
- connectorType "SAPOData" | "Salesforce" | "Pardot" | "Singular" | "Slack" | "Redshift" | "S3" | "Marketo" | "Googleanalytics" | "Zendesk" | "Servicenow" | "Datadog" | "Trendmicro" | "Snowflake" | "Dynatrace" | "Infornexus" | "Amplitude" | "Veeva" | "CustomConnector" | "Event Bridge" | "Upsolver" | "Lookout Metrics" 
- Destination connector type
- destinationConnector Property MapProperties 
- Destination connector details
- apiVersion String
- The API version that the destination connector uses.
- connectorProfile StringName 
- Name of destination connector profile
FlowDynatraceConnectorOperator   
FlowDynatraceSourceProperties   
- Object string
- The object specified in the Dynatrace flow source.
- Object string
- The object specified in the Dynatrace flow source.
- object String
- The object specified in the Dynatrace flow source.
- object string
- The object specified in the Dynatrace flow source.
- object str
- The object specified in the Dynatrace flow source.
- object String
- The object specified in the Dynatrace flow source.
FlowErrorHandlingConfig   
- BucketName string
- Specifies the name of the Amazon S3 bucket.
- BucketPrefix string
- Specifies the Amazon S3 bucket prefix.
- FailOnFirstError bool
- Specifies if the flow should fail after the first instance of a failure when attempting to place data in the destination.
- BucketName string
- Specifies the name of the Amazon S3 bucket.
- BucketPrefix string
- Specifies the Amazon S3 bucket prefix.
- FailOnFirstError bool
- Specifies if the flow should fail after the first instance of a failure when attempting to place data in the destination.
- bucketName String
- Specifies the name of the Amazon S3 bucket.
- bucketPrefix String
- Specifies the Amazon S3 bucket prefix.
- failOnFirstError Boolean
- Specifies if the flow should fail after the first instance of a failure when attempting to place data in the destination.
- bucketName string
- Specifies the name of the Amazon S3 bucket.
- bucketPrefix string
- Specifies the Amazon S3 bucket prefix.
- failOnFirstError boolean
- Specifies if the flow should fail after the first instance of a failure when attempting to place data in the destination.
- bucket_name str
- Specifies the name of the Amazon S3 bucket.
- bucket_prefix str
- Specifies the Amazon S3 bucket prefix.
- fail_on_first_error bool
- Specifies if the flow should fail after the first instance of a failure when attempting to place data in the destination.
- bucketName String
- Specifies the name of the Amazon S3 bucket.
- bucketPrefix String
- Specifies the Amazon S3 bucket prefix.
- failOnFirstError Boolean
- Specifies if the flow should fail after the first instance of a failure when attempting to place data in the destination.
FlowEventBridgeDestinationProperties    
- Object string
- The object specified in the Amazon EventBridge flow destination.
- ErrorHandlingConfig Pulumi.AwsNative.AppFlow.Inputs.FlowErrorHandlingConfig
- The object specified in the Amplitude flow source.
- Object string
- The object specified in the Amazon EventBridge flow destination.
- ErrorHandlingConfig FlowErrorHandlingConfig
- The object specified in the Amplitude flow source.
- object String
- The object specified in the Amazon EventBridge flow destination.
- errorHandlingConfig FlowErrorHandlingConfig
- The object specified in the Amplitude flow source.
- object string
- The object specified in the Amazon EventBridge flow destination.
- errorHandlingConfig FlowErrorHandlingConfig
- The object specified in the Amplitude flow source.
- object str
- The object specified in the Amazon EventBridge flow destination.
- error_handling_config FlowErrorHandlingConfig
- The object specified in the Amplitude flow source.
- object String
- The object specified in the Amazon EventBridge flow destination.
- errorHandlingConfig Property Map
- The object specified in the Amplitude flow source.
FlowFileType  
FlowGlueDataCatalog   
- DatabaseName string
- A string containing the value for the tag
- RoleArn string
- A string containing the value for the tag
- TablePrefix string
- A string containing the value for the tag
- DatabaseName string
- A string containing the value for the tag
- RoleArn string
- A string containing the value for the tag
- TablePrefix string
- A string containing the value for the tag
- databaseName String
- A string containing the value for the tag
- roleArn String
- A string containing the value for the tag
- tablePrefix String
- A string containing the value for the tag
- databaseName string
- A string containing the value for the tag
- roleArn string
- A string containing the value for the tag
- tablePrefix string
- A string containing the value for the tag
- database_name str
- A string containing the value for the tag
- role_arn str
- A string containing the value for the tag
- table_prefix str
- A string containing the value for the tag
- databaseName String
- A string containing the value for the tag
- roleArn String
- A string containing the value for the tag
- tablePrefix String
- A string containing the value for the tag
FlowGoogleAnalyticsConnectorOperator    
FlowGoogleAnalyticsSourceProperties    
- Object string
- The object specified in the Google Analytics flow source.
- Object string
- The object specified in the Google Analytics flow source.
- object String
- The object specified in the Google Analytics flow source.
- object string
- The object specified in the Google Analytics flow source.
- object str
- The object specified in the Google Analytics flow source.
- object String
- The object specified in the Google Analytics flow source.
FlowIncrementalPullConfig   
- DatetimeTypeFieldName string
- A field that specifies the date time or timestamp field as the criteria to use when importing incremental records from the source.
- DatetimeTypeFieldName string
- A field that specifies the date time or timestamp field as the criteria to use when importing incremental records from the source.
- datetimeTypeFieldName String
- A field that specifies the date time or timestamp field as the criteria to use when importing incremental records from the source.
- datetimeTypeFieldName string
- A field that specifies the date time or timestamp field as the criteria to use when importing incremental records from the source.
- datetime_type_field_name str
- A field that specifies the date time or timestamp field as the criteria to use when importing incremental records from the source.
- datetimeTypeFieldName String
- A field that specifies the date time or timestamp field as the criteria to use when importing incremental records from the source.
FlowInforNexusConnectorOperator    
FlowInforNexusSourceProperties    
- Object string
- The object specified in the Infor Nexus flow source.
- Object string
- The object specified in the Infor Nexus flow source.
- object String
- The object specified in the Infor Nexus flow source.
- object string
- The object specified in the Infor Nexus flow source.
- object str
- The object specified in the Infor Nexus flow source.
- object String
- The object specified in the Infor Nexus flow source.
FlowLookoutMetricsDestinationProperties    
- Object string
- The object specified in the Amazon Lookout for Metrics flow destination.
- Object string
- The object specified in the Amazon Lookout for Metrics flow destination.
- object String
- The object specified in the Amazon Lookout for Metrics flow destination.
- object string
- The object specified in the Amazon Lookout for Metrics flow destination.
- object str
- The object specified in the Amazon Lookout for Metrics flow destination.
- object String
- The object specified in the Amazon Lookout for Metrics flow destination.
FlowMarketoConnectorOperator   
FlowMarketoDestinationProperties   
- Object string
- The object specified in the Marketo flow destination.
- ErrorHandlingConfig Pulumi.AwsNative.AppFlow.Inputs.FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- Object string
- The object specified in the Marketo flow destination.
- ErrorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- object String
- The object specified in the Marketo flow destination.
- errorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- object string
- The object specified in the Marketo flow destination.
- errorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- object str
- The object specified in the Marketo flow destination.
- error_handling_config FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- object String
- The object specified in the Marketo flow destination.
- errorHandlingConfig Property Map
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
FlowMarketoSourceProperties   
- Object string
- The object specified in the Marketo flow source.
- Object string
- The object specified in the Marketo flow source.
- object String
- The object specified in the Marketo flow source.
- object string
- The object specified in the Marketo flow source.
- object str
- The object specified in the Marketo flow source.
- object String
- The object specified in the Marketo flow source.
FlowMetadataCatalogConfig   
- GlueDataCatalog Pulumi.AwsNative.AppFlow.Inputs.FlowGlueDataCatalog
- Configurations of glue data catalog of the flow.
- GlueDataCatalog FlowGlueDataCatalog
- Configurations of glue data catalog of the flow.
- glueDataCatalog FlowGlueDataCatalog
- Configurations of glue data catalog of the flow.
- glueDataCatalog FlowGlueDataCatalog
- Configurations of glue data catalog of the flow.
- glue_data_catalog FlowGlueDataCatalog
- Configurations of glue data catalog of the flow.
- glueDataCatalog Property Map
- Configurations of glue data catalog of the flow.
FlowOperator 
FlowOperatorPropertiesKeys   
FlowPardotConnectorOperator   
FlowPardotSourceProperties   
- Object string
- The object specified in the Salesforce Pardot flow source.
- Object string
- The object specified in the Salesforce Pardot flow source.
- object String
- The object specified in the Salesforce Pardot flow source.
- object string
- The object specified in the Salesforce Pardot flow source.
- object str
- The object specified in the Salesforce Pardot flow source.
- object String
- The object specified in the Salesforce Pardot flow source.
FlowPathPrefix  
FlowPrefixConfig  
- PathPrefixHierarchy List<Pulumi.AwsNative.AppFlow.FlowPathPrefix>
- Specifies whether the destination file path includes either or both of the following elements: - EXECUTION_ID - The ID that Amazon AppFlow assigns to the flow run. 
- SCHEMA_VERSION - The version number of your data schema. Amazon AppFlow assigns this version number. The version number increases by one when you change any of the following settings in your flow configuration: 
- Source-to-destination field mappings 
- Field data types 
- Partition keys 
 
- PrefixFormat Pulumi.AwsNative.AppFlow.FlowPrefixFormat
- Determines the level of granularity for the date and time that's included in the prefix.
- PrefixType Pulumi.AwsNative.AppFlow.FlowPrefixType
- Determines the format of the prefix, and whether it applies to the file name, file path, or both.
- PathPrefixHierarchy []FlowPathPrefix
- Specifies whether the destination file path includes either or both of the following elements: - EXECUTION_ID - The ID that Amazon AppFlow assigns to the flow run. 
- SCHEMA_VERSION - The version number of your data schema. Amazon AppFlow assigns this version number. The version number increases by one when you change any of the following settings in your flow configuration: 
- Source-to-destination field mappings 
- Field data types 
- Partition keys 
 
- PrefixFormat FlowPrefixFormat
- Determines the level of granularity for the date and time that's included in the prefix.
- PrefixType FlowPrefixType
- Determines the format of the prefix, and whether it applies to the file name, file path, or both.
- pathPrefixHierarchy List<FlowPathPrefix>
- Specifies whether the destination file path includes either or both of the following elements: - EXECUTION_ID - The ID that Amazon AppFlow assigns to the flow run. 
- SCHEMA_VERSION - The version number of your data schema. Amazon AppFlow assigns this version number. The version number increases by one when you change any of the following settings in your flow configuration: 
- Source-to-destination field mappings 
- Field data types 
- Partition keys 
 
- prefixFormat FlowPrefixFormat
- Determines the level of granularity for the date and time that's included in the prefix.
- prefixType FlowPrefixType
- Determines the format of the prefix, and whether it applies to the file name, file path, or both.
- pathPrefixHierarchy FlowPathPrefix[]
- Specifies whether the destination file path includes either or both of the following elements: - EXECUTION_ID - The ID that Amazon AppFlow assigns to the flow run. 
- SCHEMA_VERSION - The version number of your data schema. Amazon AppFlow assigns this version number. The version number increases by one when you change any of the following settings in your flow configuration: 
- Source-to-destination field mappings 
- Field data types 
- Partition keys 
 
- prefixFormat FlowPrefixFormat
- Determines the level of granularity for the date and time that's included in the prefix.
- prefixType FlowPrefixType
- Determines the format of the prefix, and whether it applies to the file name, file path, or both.
- path_prefix_hierarchy Sequence[FlowPathPrefix]
- Specifies whether the destination file path includes either or both of the following elements: - EXECUTION_ID - The ID that Amazon AppFlow assigns to the flow run. 
- SCHEMA_VERSION - The version number of your data schema. Amazon AppFlow assigns this version number. The version number increases by one when you change any of the following settings in your flow configuration: 
- Source-to-destination field mappings 
- Field data types 
- Partition keys 
 
- prefix_format FlowPrefix Format 
- Determines the level of granularity for the date and time that's included in the prefix.
- prefix_type FlowPrefix Type 
- Determines the format of the prefix, and whether it applies to the file name, file path, or both.
- pathPrefixHierarchy List<"EXECUTION_ID" | "SCHEMA_VERSION">
- Specifies whether the destination file path includes either or both of the following elements: - EXECUTION_ID - The ID that Amazon AppFlow assigns to the flow run. 
- SCHEMA_VERSION - The version number of your data schema. Amazon AppFlow assigns this version number. The version number increases by one when you change any of the following settings in your flow configuration: 
- Source-to-destination field mappings 
- Field data types 
- Partition keys 
 
- prefixFormat "YEAR" | "MONTH" | "DAY" | "HOUR" | "MINUTE"
- Determines the level of granularity for the date and time that's included in the prefix.
- prefixType "FILENAME" | "PATH" | "PATH_AND_FILENAME"
- Determines the format of the prefix, and whether it applies to the file name, file path, or both.
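For orientation, here is a minimal sketch of how a prefix configuration might be written in TypeScript, using the camelCase property names and enum values listed above. The particular values are illustrative assumptions, not defaults.
const prefixConfig = {
    prefixType: "PATH_AND_FILENAME",                         // apply the prefix to both the file path and the file name
    prefixFormat: "DAY",                                     // include the date in the prefix, down to the day
    pathPrefixHierarchy: ["EXECUTION_ID", "SCHEMA_VERSION"], // add both optional path elements described above
};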
FlowPrefixFormat  
FlowPrefixType  
FlowRedshiftDestinationProperties   
- IntermediateBucketName string
- The intermediate bucket that Amazon AppFlow uses when moving data into Amazon Redshift.
- Object string
- The object specified in the Amazon Redshift flow destination.
- BucketPrefix string
- The object key for the bucket in which Amazon AppFlow places the destination files.
- ErrorHandlingConfig Pulumi.AwsNative.AppFlow.Inputs.FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the Amazon Redshift destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- IntermediateBucketName string
- The intermediate bucket that Amazon AppFlow uses when moving data into Amazon Redshift.
- Object string
- The object specified in the Amazon Redshift flow destination.
- BucketPrefix string
- The object key for the bucket in which Amazon AppFlow places the destination files.
- ErrorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the Amazon Redshift destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- intermediateBucketName String
- The intermediate bucket that Amazon AppFlow uses when moving data into Amazon Redshift.
- object String
- The object specified in the Amazon Redshift flow destination.
- bucketPrefix String
- The object key for the bucket in which Amazon AppFlow places the destination files.
- errorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the Amazon Redshift destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- intermediateBucketName string
- The intermediate bucket that Amazon AppFlow uses when moving data into Amazon Redshift.
- object string
- The object specified in the Amazon Redshift flow destination.
- bucketPrefix string
- The object key for the bucket in which Amazon AppFlow places the destination files.
- errorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the Amazon Redshift destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- intermediate_bucket_name str
- The intermediate bucket that Amazon AppFlow uses when moving data into Amazon Redshift.
- object str
- The object specified in the Amazon Redshift flow destination.
- bucket_prefix str
- The object key for the bucket in which Amazon AppFlow places the destination files.
- error_handling_config FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the Amazon Redshift destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- intermediateBucketName String
- The intermediate bucket that Amazon AppFlow uses when moving data into Amazon Redshift.
- object String
- The object specified in the Amazon Redshift flow destination.
- bucketPrefix String
- The object key for the bucket in which Amazon AppFlow places the destination files.
- errorHandlingConfig Property Map
- The settings that determine how Amazon AppFlow handles an error when placing data in the Amazon Redshift destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
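As a sketch only, the TypeScript object below shows how these Redshift destination properties might be combined. The table and bucket names are hypothetical, and the failOnFirstError field is an assumed member of FlowErrorHandlingConfig (documented with that type, not on this listing).
const redshiftDestinationProperties = {
    object: "public.sales_orders",              // hypothetical Redshift table
    intermediateBucketName: "appflow-staging",  // hypothetical staging bucket
    bucketPrefix: "redshift/",
    errorHandlingConfig: {
        failOnFirstError: false,                // assumed field: keep inserting records after an error
    },
};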
FlowS3ConnectorOperator  
FlowS3DestinationProperties  
- BucketName string
- The Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- BucketPrefix string
- The object key for the destination bucket in which Amazon AppFlow places the files.
- S3OutputFormatConfig Pulumi.AwsNative.AppFlow.Inputs.FlowS3OutputFormatConfig
- The configuration that determines how Amazon AppFlow should format the flow output data when Amazon S3 is used as the destination.
- BucketName string
- The Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- BucketPrefix string
- The object key for the destination bucket in which Amazon AppFlow places the files.
- S3OutputFormatConfig FlowS3OutputFormatConfig
- The configuration that determines how Amazon AppFlow should format the flow output data when Amazon S3 is used as the destination.
- bucketName String
- The Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- bucketPrefix String
- The object key for the destination bucket in which Amazon AppFlow places the files.
- s3OutputFormatConfig FlowS3OutputFormatConfig
- The configuration that determines how Amazon AppFlow should format the flow output data when Amazon S3 is used as the destination.
- bucketName string
- The Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- bucketPrefix string
- The object key for the destination bucket in which Amazon AppFlow places the files.
- s3OutputFormatConfig FlowS3OutputFormatConfig
- The configuration that determines how Amazon AppFlow should format the flow output data when Amazon S3 is used as the destination.
- bucket_name str
- The Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- bucket_prefix str
- The object key for the destination bucket in which Amazon AppFlow places the files.
- s3_output_format_config FlowS3OutputFormatConfig
- The configuration that determines how Amazon AppFlow should format the flow output data when Amazon S3 is used as the destination.
- bucketName String
- The Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- bucketPrefix String
- The object key for the destination bucket in which Amazon AppFlow places the files.
- s3OutputFormatConfig Property Map
- The configuration that determines how Amazon AppFlow should format the flow output data when Amazon S3 is used as the destination.
FlowS3InputFormatConfig   
- S3InputFileType Pulumi.AwsNative.AppFlow.FlowS3InputFormatConfigS3InputFileType
- The file type that Amazon AppFlow gets from your Amazon S3 bucket.
- S3InputFileType FlowS3InputFormatConfigS3InputFileType
- The file type that Amazon AppFlow gets from your Amazon S3 bucket.
- s3InputFileType FlowS3InputFormatConfigS3InputFileType
- The file type that Amazon AppFlow gets from your Amazon S3 bucket.
- s3InputFileType FlowS3InputFormatConfigS3InputFileType
- The file type that Amazon AppFlow gets from your Amazon S3 bucket.
- s3_input_file_type FlowS3InputFormatConfigS3InputFileType
- The file type that Amazon AppFlow gets from your Amazon S3 bucket.
- s3InputFileType "CSV" | "JSON"
- The file type that Amazon AppFlow gets from your Amazon S3 bucket.
FlowS3InputFormatConfigS3InputFileType      
FlowS3OutputFormatConfig   
- AggregationConfig Pulumi.Aws Native. App Flow. Inputs. Flow Aggregation Config 
- The aggregation settings that you can use to customize the output format of your flow data.
- FileType Pulumi.Aws Native. App Flow. Flow File Type 
- Indicates the file type that Amazon AppFlow places in the Amazon S3 bucket.
- PrefixConfig Pulumi.Aws Native. App Flow. Inputs. Flow Prefix Config 
- Determines the prefix that Amazon AppFlow applies to the folder name in the Amazon S3 bucket. You can name folders according to the flow frequency and date.
- PreserveSourceDataTyping bool
- If your file output format is Parquet, use this parameter to set whether Amazon AppFlow preserves the data types in your source data when it writes the output to Amazon S3. - true: Amazon AppFlow preserves the data types when it writes to Amazon S3. For example, an integer of 1 in your source data is still an integer in your output.
- false: Amazon AppFlow converts all of the source data into strings when it writes to Amazon S3. For example, an integer of 1 in your source data becomes the string "1" in the output.
 
- AggregationConfig FlowAggregation Config 
- The aggregation settings that you can use to customize the output format of your flow data.
- FileType FlowFile Type 
- Indicates the file type that Amazon AppFlow places in the Amazon S3 bucket.
- PrefixConfig FlowPrefix Config 
- Determines the prefix that Amazon AppFlow applies to the folder name in the Amazon S3 bucket. You can name folders according to the flow frequency and date.
- PreserveSourceDataTyping bool
- If your file output format is Parquet, use this parameter to set whether Amazon AppFlow preserves the data types in your source data when it writes the output to Amazon S3. - true: Amazon AppFlow preserves the data types when it writes to Amazon S3. For example, an integer of 1 in your source data is still an integer in your output.
- false: Amazon AppFlow converts all of the source data into strings when it writes to Amazon S3. For example, an integer of 1 in your source data becomes the string "1" in the output.
 
- aggregationConfig FlowAggregation Config 
- The aggregation settings that you can use to customize the output format of your flow data.
- fileType FlowFile Type 
- Indicates the file type that Amazon AppFlow places in the Amazon S3 bucket.
- prefixConfig FlowPrefix Config 
- Determines the prefix that Amazon AppFlow applies to the folder name in the Amazon S3 bucket. You can name folders according to the flow frequency and date.
- preserveSourceDataTyping Boolean
- If your file output format is Parquet, use this parameter to set whether Amazon AppFlow preserves the data types in your source data when it writes the output to Amazon S3. - true: Amazon AppFlow preserves the data types when it writes to Amazon S3. For example, an integer of 1 in your source data is still an integer in your output.
- false: Amazon AppFlow converts all of the source data into strings when it writes to Amazon S3. For example, an integer of 1 in your source data becomes the string "1" in the output.
 
- aggregationConfig FlowAggregation Config 
- The aggregation settings that you can use to customize the output format of your flow data.
- fileType FlowFile Type 
- Indicates the file type that Amazon AppFlow places in the Amazon S3 bucket.
- prefixConfig FlowPrefix Config 
- Determines the prefix that Amazon AppFlow applies to the folder name in the Amazon S3 bucket. You can name folders according to the flow frequency and date.
- preserveSourceDataTyping boolean
- If your file output format is Parquet, use this parameter to set whether Amazon AppFlow preserves the data types in your source data when it writes the output to Amazon S3. - true: Amazon AppFlow preserves the data types when it writes to Amazon S3. For example, an integer of 1 in your source data is still an integer in your output.
- false: Amazon AppFlow converts all of the source data into strings when it writes to Amazon S3. For example, an integer of 1 in your source data becomes the string "1" in the output.
 
- aggregation_config FlowAggregation Config 
- The aggregation settings that you can use to customize the output format of your flow data.
- file_type FlowFile Type 
- Indicates the file type that Amazon AppFlow places in the Amazon S3 bucket.
- prefix_config FlowPrefix Config 
- Determines the prefix that Amazon AppFlow applies to the folder name in the Amazon S3 bucket. You can name folders according to the flow frequency and date.
- preserve_source_data_typing bool
- If your file output format is Parquet, use this parameter to set whether Amazon AppFlow preserves the data types in your source data when it writes the output to Amazon S3. - true: Amazon AppFlow preserves the data types when it writes to Amazon S3. For example, an integer of 1 in your source data is still an integer in your output.
- false: Amazon AppFlow converts all of the source data into strings when it writes to Amazon S3. For example, an integer of 1 in your source data becomes the string "1" in the output.
 
- aggregationConfig Property Map
- The aggregation settings that you can use to customize the output format of your flow data.
- fileType "CSV" | "JSON" | "PARQUET"
- Indicates the file type that Amazon AppFlow places in the Amazon S3 bucket.
- prefixConfig Property Map
- Determines the prefix that Amazon AppFlow applies to the folder name in the Amazon S3 bucket. You can name folders according to the flow frequency and date.
- preserveSourceDataTyping Boolean
- If your file output format is Parquet, use this parameter to set whether Amazon AppFlow preserves the data types in your source data when it writes the output to Amazon S3. - true: Amazon AppFlow preserves the data types when it writes to Amazon S3. For example, an integer of 1 in your source data is still an integer in your output.
- false: Amazon AppFlow converts all of the source data into strings when it writes to Amazon S3. For example, an integer of 1 in your source data becomes the string "1" in the output.
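A minimal TypeScript sketch of an S3 output format configuration follows, using the property names and enum values listed above; the aggregationType field is an assumed member of FlowAggregationConfig and the values shown are illustrative.
const s3OutputFormatConfig = {
    fileType: "PARQUET",
    preserveSourceDataTyping: true,    // keep an integer of 1 as an integer rather than the string "1"
    prefixConfig: {
        prefixType: "PATH",
        prefixFormat: "HOUR",
    },
    aggregationConfig: {
        aggregationType: "SingleFile", // assumed FlowAggregationConfig field and value
    },
};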
 
FlowS3SourceProperties  
- BucketName string
- The Amazon S3 bucket name where the source files are stored.
- BucketPrefix string
- The object key for the Amazon S3 bucket in which the source files are stored.
- S3InputFormatConfig Pulumi.AwsNative.AppFlow.Inputs.FlowS3InputFormatConfig
- When you use Amazon S3 as the source, the configuration format in which you provide the flow input data.
- BucketName string
- The Amazon S3 bucket name where the source files are stored.
- BucketPrefix string
- The object key for the Amazon S3 bucket in which the source files are stored.
- S3InputFormatConfig FlowS3InputFormatConfig
- When you use Amazon S3 as the source, the configuration format in which you provide the flow input data.
- bucketName String
- The Amazon S3 bucket name where the source files are stored.
- bucketPrefix String
- The object key for the Amazon S3 bucket in which the source files are stored.
- s3InputFormatConfig FlowS3InputFormatConfig
- When you use Amazon S3 as the source, the configuration format in which you provide the flow input data.
- bucketName string
- The Amazon S3 bucket name where the source files are stored.
- bucketPrefix string
- The object key for the Amazon S3 bucket in which the source files are stored.
- s3InputFormatConfig FlowS3InputFormatConfig
- When you use Amazon S3 as the source, the configuration format in which you provide the flow input data.
- bucket_name str
- The Amazon S3 bucket name where the source files are stored.
- bucket_prefix str
- The object key for the Amazon S3 bucket in which the source files are stored.
- s3_input_format_config FlowS3InputFormatConfig
- When you use Amazon S3 as the source, the configuration format in which you provide the flow input data.
- bucketName String
- The Amazon S3 bucket name where the source files are stored.
- bucketPrefix String
- The object key for the Amazon S3 bucket in which the source files are stored.
- s3InputFormatConfig Property Map
- When you use Amazon S3 as the source, the configuration format in which you provide the flow input data.
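For illustration, an S3 source configuration might look like the TypeScript sketch below; the bucket name and prefix are hypothetical, and the file type is one of the values listed above.
const s3SourceProperties = {
    bucketName: "my-source-bucket",     // hypothetical source bucket
    bucketPrefix: "incoming/orders",
    s3InputFormatConfig: {
        s3InputFileType: "CSV",         // or "JSON"
    },
};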
FlowSalesforceConnectorOperator   
FlowSalesforceDestinationProperties   
- Object string
- The object specified in the Salesforce flow destination.
- DataTransferApi Pulumi.AwsNative.AppFlow.FlowDataTransferApi
- Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data to Salesforce. - AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers to Salesforce. If your flow transfers fewer than 1,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
 - Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900 records, and it might use Bulk API 2.0 on the next day to transfer 1,100 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields. - By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output. - BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
- Note that Bulk API 2.0 does not transfer Salesforce compound fields. - REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.

- ErrorHandlingConfig Pulumi.AwsNative.AppFlow.Inputs.FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the Salesforce destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- IdFieldNames List<string>
- List of fields used as ID when performing a write operation.
- WriteOperationType Pulumi.AwsNative.AppFlow.FlowWriteOperationType
- This specifies the type of write operation to be performed in Salesforce. When the value is UPSERT, then idFieldNames is required.
- Object string
- The object specified in the Salesforce flow destination.
- DataTransferApi FlowDataTransferApi
- Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data to Salesforce. - AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers to Salesforce. If your flow transfers fewer than 1,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
 - Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900 records, and it might use Bulk API 2.0 on the next day to transfer 1,100 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields. - By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output. - BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
- Note that Bulk API 2.0 does not transfer Salesforce compound fields. - REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.

- ErrorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the Salesforce destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- IdFieldNames []string
- List of fields used as ID when performing a write operation.
- WriteOperationType FlowWriteOperationType
- This specifies the type of write operation to be performed in Salesforce. When the value is UPSERT, then idFieldNames is required.
- object String
- The object specified in the Salesforce flow destination.
- dataTransferApi FlowDataTransferApi
- Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data to Salesforce. - AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers to Salesforce. If your flow transfers fewer than 1,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
 - Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900 records, and it might use Bulk API 2.0 on the next day to transfer 1,100 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields. - By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output. - BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
- Note that Bulk API 2.0 does not transfer Salesforce compound fields. - REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.

- errorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the Salesforce destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames List<String>
- List of fields used as ID when performing a write operation.
- writeOperationType FlowWriteOperationType
- This specifies the type of write operation to be performed in Salesforce. When the value is UPSERT, then idFieldNames is required.
- object string
- The object specified in the Salesforce flow destination.
- dataTransferApi FlowDataTransferApi
- Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data to Salesforce. - AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers to Salesforce. If your flow transfers fewer than 1,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
 - Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900 records, and it might use Bulk API 2.0 on the next day to transfer 1,100 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields. - By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output. - BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
- Note that Bulk API 2.0 does not transfer Salesforce compound fields. - REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.

- errorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the Salesforce destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames string[]
- List of fields used as ID when performing a write operation.
- writeOperationType FlowWriteOperationType
- This specifies the type of write operation to be performed in Salesforce. When the value is UPSERT, then idFieldNames is required.
- object str
- The object specified in the Salesforce flow destination.
- data_transfer_api FlowDataTransferApi
- Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data to Salesforce. - AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers to Salesforce. If your flow transfers fewer than 1,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
 - Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900 records, and it might use Bulk API 2.0 on the next day to transfer 1,100 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields. - By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output. - BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
- Note that Bulk API 2.0 does not transfer Salesforce compound fields. - REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.

- error_handling_config FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the Salesforce destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- id_field_names Sequence[str]
- List of fields used as ID when performing a write operation.
- write_operation_type FlowWriteOperationType
- This specifies the type of write operation to be performed in Salesforce. When the value is UPSERT, then idFieldNames is required.
- object String
- The object specified in the Salesforce flow destination.
- dataTransferApi "AUTOMATIC" | "BULKV2" | "REST_SYNC"
- Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data to Salesforce. - AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers to Salesforce. If your flow transfers fewer than 1,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
 - Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900 records, and it might use Bulk API 2.0 on the next day to transfer 1,100 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields. - By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output. - BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
- Note that Bulk API 2.0 does not transfer Salesforce compound fields. - REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.

- errorHandlingConfig Property Map
- The settings that determine how Amazon AppFlow handles an error when placing data in the Salesforce destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames List<String>
- List of fields used as ID when performing a write operation.
- writeOperationType "INSERT" | "UPSERT" | "UPDATE" | "DELETE"
- This specifies the type of write operation to be performed in Salesforce. When the value is UPSERT, then idFieldNames is required.
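The TypeScript sketch below shows one way these Salesforce destination properties might be combined for an upsert; the Salesforce object and external ID field are hypothetical, and failOnFirstError is an assumed member of FlowErrorHandlingConfig.
const salesforceDestinationProperties = {
    object: "Account",                  // hypothetical Salesforce object
    dataTransferApi: "BULKV2",          // force Bulk API 2.0 for consistent output formatting
    writeOperationType: "UPSERT",
    idFieldNames: ["External_Id__c"],   // required because the write operation is UPSERT; field name is hypothetical
    errorHandlingConfig: {
        failOnFirstError: true,         // assumed FlowErrorHandlingConfig field
    },
};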
FlowSalesforceSourceProperties   
- Object string
- The object specified in the Salesforce flow source.
- DataTransferApi Pulumi.AwsNative.AppFlow.FlowDataTransferApi
- Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data from Salesforce. - AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers from Salesforce. If your flow transfers fewer than 1,000,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
 - Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900,000 records, and it might use Bulk API 2.0 on the next day to transfer 1,100,000 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields. - By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output. - BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
- Note that Bulk API 2.0 does not transfer Salesforce compound fields. - REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.

- EnableDynamicFieldUpdate bool
- The flag that enables dynamic fetching of new (recently added) fields in the Salesforce objects while running a flow.
- IncludeDeletedRecords bool
- Indicates whether Amazon AppFlow includes deleted files in the flow run.
- Object string
- The object specified in the Salesforce flow source.
- DataTransferApi FlowDataTransferApi
- Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data from Salesforce. - AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers from Salesforce. If your flow transfers fewer than 1,000,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
 - Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900,000 records, and it might use Bulk API 2.0 on the next day to transfer 1,100,000 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields. - By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output. - BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
- Note that Bulk API 2.0 does not transfer Salesforce compound fields. - REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.

- EnableDynamicFieldUpdate bool
- The flag that enables dynamic fetching of new (recently added) fields in the Salesforce objects while running a flow.
- IncludeDeletedRecords bool
- Indicates whether Amazon AppFlow includes deleted files in the flow run.
- object String
- The object specified in the Salesforce flow source.
- dataTransferApi FlowDataTransferApi
- Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data from Salesforce. - AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers from Salesforce. If your flow transfers fewer than 1,000,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
 - Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900,000 records, and it might use Bulk API 2.0 on the next day to transfer 1,100,000 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields. - By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output. - BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
- Note that Bulk API 2.0 does not transfer Salesforce compound fields. - REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.

- enableDynamicFieldUpdate Boolean
- The flag that enables dynamic fetching of new (recently added) fields in the Salesforce objects while running a flow.
- includeDeletedRecords Boolean
- Indicates whether Amazon AppFlow includes deleted files in the flow run.
- object string
- The object specified in the Salesforce flow source.
- dataTransferApi FlowDataTransferApi
- Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data from Salesforce. - AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers from Salesforce. If your flow transfers fewer than 1,000,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
 - Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900,000 records, and it might use Bulk API 2.0 on the next day to transfer 1,100,000 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields. - By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output. - BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
- Note that Bulk API 2.0 does not transfer Salesforce compound fields. - REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.

- enableDynamicFieldUpdate boolean
- The flag that enables dynamic fetching of new (recently added) fields in the Salesforce objects while running a flow.
- includeDeletedRecords boolean
- Indicates whether Amazon AppFlow includes deleted files in the flow run.
- object str
- The object specified in the Salesforce flow source.
- data_transfer_api FlowDataTransferApi
- Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data from Salesforce. - AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers from Salesforce. If your flow transfers fewer than 1,000,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
 - Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900,000 records, and it might use Bulk API 2.0 on the next day to transfer 1,100,000 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields. - By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output. - BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
- Note that Bulk API 2.0 does not transfer Salesforce compound fields. - REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.

- enable_dynamic_field_update bool
- The flag that enables dynamic fetching of new (recently added) fields in the Salesforce objects while running a flow.
- include_deleted_records bool
- Indicates whether Amazon AppFlow includes deleted files in the flow run.
- object String
- The object specified in the Salesforce flow source.
- dataTransferApi "AUTOMATIC" | "BULKV2" | "REST_SYNC"
- Specifies which Salesforce API is used by Amazon AppFlow when your flow transfers data from Salesforce. - AUTOMATIC - The default. Amazon AppFlow selects which API to use based on the number of records that your flow transfers from Salesforce. If your flow transfers fewer than 1,000,000 records, Amazon AppFlow uses Salesforce REST API. If your flow transfers 1,000,000 records or more, Amazon AppFlow uses Salesforce Bulk API 2.0.
 - Each of these Salesforce APIs structures data differently. If Amazon AppFlow selects the API automatically, be aware that, for recurring flows, the data output might vary from one flow run to the next. For example, if a flow runs daily, it might use REST API on one day to transfer 900,000 records, and it might use Bulk API 2.0 on the next day to transfer 1,100,000 records. For each of these flow runs, the respective Salesforce API formats the data differently. Some of the differences include how dates are formatted and null values are represented. Also, Bulk API 2.0 doesn't transfer Salesforce compound fields. - By choosing this option, you optimize flow performance for both small and large data transfers, but the tradeoff is inconsistent formatting in the output. - BULKV2 - Amazon AppFlow uses only Salesforce Bulk API 2.0. This API runs asynchronous data transfers, and it's optimal for large sets of data. By choosing this option, you ensure that your flow writes consistent output, but you optimize performance only for large data transfers.
- Note that Bulk API 2.0 does not transfer Salesforce compound fields. - REST_SYNC - Amazon AppFlow uses only Salesforce REST API. By choosing this option, you ensure that your flow writes consistent output, but you decrease performance for large data transfers that are better suited for Bulk API 2.0. In some cases, if your flow attempts to transfer a very large set of data, it might fail with a timeout error.

- enableDynamicFieldUpdate Boolean
- The flag that enables dynamic fetching of new (recently added) fields in the Salesforce objects while running a flow.
- includeDeletedRecords Boolean
- Indicates whether Amazon AppFlow includes deleted files in the flow run.
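A corresponding Salesforce source configuration might look like the TypeScript sketch below; the object name is hypothetical and the other values are illustrative.
const salesforceSourceProperties = {
    object: "Opportunity",              // hypothetical Salesforce object
    dataTransferApi: "AUTOMATIC",       // let AppFlow choose REST API vs. Bulk API 2.0 by record count
    enableDynamicFieldUpdate: true,     // pick up newly added Salesforce fields on each run
    includeDeletedRecords: false,
};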
FlowSapoDataConnectorOperator    
FlowSapoDataDestinationProperties    
- ObjectPath string
- The object path specified in the SAPOData flow destination.
- ErrorHandlingConfig Pulumi.AwsNative.AppFlow.Inputs.FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- IdFieldNames List<string>
- List of fields used as ID when performing a write operation.
- SuccessResponseHandlingConfig Pulumi.AwsNative.AppFlow.Inputs.FlowSuccessResponseHandlingConfig
- Determines how Amazon AppFlow handles the success response that it gets from the connector after placing data. For example, this setting would determine where to write the response from a destination connector upon a successful insert operation.
- WriteOperationType Pulumi.AwsNative.AppFlow.FlowWriteOperationType
- The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- ObjectPath string
- The object path specified in the SAPOData flow destination.
- ErrorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- IdFieldNames []string
- List of fields used as ID when performing a write operation.
- SuccessResponseHandlingConfig FlowSuccessResponseHandlingConfig
- Determines how Amazon AppFlow handles the success response that it gets from the connector after placing data. For example, this setting would determine where to write the response from a destination connector upon a successful insert operation.
- WriteOperationType FlowWriteOperationType
- The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- objectPath String
- The object path specified in the SAPOData flow destination.
- errorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames List<String>
- List of fields used as ID when performing a write operation.
- successResponseHandlingConfig FlowSuccessResponseHandlingConfig
- Determines how Amazon AppFlow handles the success response that it gets from the connector after placing data. For example, this setting would determine where to write the response from a destination connector upon a successful insert operation.
- writeOperationType FlowWriteOperationType
- The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- objectPath string
- The object path specified in the SAPOData flow destination.
- errorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames string[]
- List of fields used as ID when performing a write operation.
- successResponseHandlingConfig FlowSuccessResponseHandlingConfig
- Determines how Amazon AppFlow handles the success response that it gets from the connector after placing data. For example, this setting would determine where to write the response from a destination connector upon a successful insert operation.
- writeOperationType FlowWriteOperationType
- The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- object_path str
- The object path specified in the SAPOData flow destination.
- error_handling_config FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- id_field_names Sequence[str]
- List of fields used as ID when performing a write operation.
- success_response_handling_config FlowSuccessResponseHandlingConfig
- Determines how Amazon AppFlow handles the success response that it gets from the connector after placing data. For example, this setting would determine where to write the response from a destination connector upon a successful insert operation.
- write_operation_type FlowWriteOperationType
- The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- objectPath String
- The object path specified in the SAPOData flow destination.
- errorHandlingConfig Property Map
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames List<String>
- List of fields used as ID when performing a write operation.
- successResponseHandlingConfig Property Map
- Determines how Amazon AppFlow handles the success response that it gets from the connector after placing data. For example, this setting would determine where to write the response from a destination connector upon a successful insert operation.
- writeOperationType "INSERT" | "UPSERT" | "UPDATE" | "DELETE"
- The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
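As a sketch under stated assumptions, an SAPOData destination configuration might be written as follows in TypeScript; the OData object path and bucket are hypothetical, and the nested fields of FlowSuccessResponseHandlingConfig and FlowErrorHandlingConfig shown here are assumptions taken from those types rather than from this listing.
const sapoDataDestinationProperties = {
    objectPath: "/sap/opu/odata/sap/API_SALES_ORDER_SRV/A_SalesOrder", // hypothetical OData object path
    writeOperationType: "INSERT",         // also the default when omitted
    successResponseHandlingConfig: {
        bucketName: "appflow-responses",  // assumed fields; bucket is hypothetical
        bucketPrefix: "sap/",
    },
    errorHandlingConfig: {
        failOnFirstError: false,          // assumed FlowErrorHandlingConfig field
    },
};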
FlowSapoDataPaginationConfig    
- MaxPageSize int
- The maximum number of records that Amazon AppFlow receives in each page of the response from your SAP application. For transfers of OData records, the maximum page size is 3,000. For transfers of data that comes from an ODP provider, the maximum page size is 10,000.
- MaxPageSize int
- The maximum number of records that Amazon AppFlow receives in each page of the response from your SAP application. For transfers of OData records, the maximum page size is 3,000. For transfers of data that comes from an ODP provider, the maximum page size is 10,000.
- maxPageSize Integer
- The maximum number of records that Amazon AppFlow receives in each page of the response from your SAP application. For transfers of OData records, the maximum page size is 3,000. For transfers of data that comes from an ODP provider, the maximum page size is 10,000.
- maxPageSize number
- The maximum number of records that Amazon AppFlow receives in each page of the response from your SAP application. For transfers of OData records, the maximum page size is 3,000. For transfers of data that comes from an ODP provider, the maximum page size is 10,000.
- max_page_size int
- The maximum number of records that Amazon AppFlow receives in each page of the response from your SAP application. For transfers of OData records, the maximum page size is 3,000. For transfers of data that comes from an ODP provider, the maximum page size is 10,000.
- maxPageSize Number
- The maximum number of records that Amazon AppFlow receives in each page of the response from your SAP application. For transfers of OData records, the maximum page size is 3,000. For transfers of data that comes from an ODP provider, the maximum page size is 10,000.
FlowSapoDataParallelismConfig    
- MaxParallelism int
- The maximum number of processes that Amazon AppFlow runs at the same time when it retrieves your data from your SAP application.
- MaxParallelism int
- The maximum number of processes that Amazon AppFlow runs at the same time when it retrieves your data from your SAP application.
- maxParallelism Integer
- The maximum number of processes that Amazon AppFlow runs at the same time when it retrieves your data from your SAP application.
- maxParallelism number
- The maximum number of processes that Amazon AppFlow runs at the same time when it retrieves your data from your SAP application.
- max_parallelism int
- The maximum number of processes that Amazon AppFlow runs at the same time when it retrieves your data from your SAP application.
- maxParallelism Number
- The maximum number of processes that Amazon AppFlow runs at the same time when it retrieves your data from your SAP application.
FlowSapoDataSourceProperties    
- ObjectPath string
- The object path specified in the SAPOData flow source.
- PaginationConfig Pulumi.AwsNative.AppFlow.Inputs.FlowSapoDataPaginationConfig
- Sets the page size for each concurrent process that transfers OData records from your SAP instance.
- ParallelismConfig Pulumi.AwsNative.AppFlow.Inputs.FlowSapoDataParallelismConfig
- Sets the number of concurrent processes that transfer OData records from your SAP instance.
- ObjectPath string
- The object path specified in the SAPOData flow source.
- PaginationConfig FlowSapoDataPaginationConfig
- Sets the page size for each concurrent process that transfers OData records from your SAP instance.
- ParallelismConfig FlowSapoDataParallelismConfig
- Sets the number of concurrent processes that transfer OData records from your SAP instance.
- objectPath String
- The object path specified in the SAPOData flow source.
- paginationConfig FlowSapoDataPaginationConfig
- Sets the page size for each concurrent process that transfers OData records from your SAP instance.
- parallelismConfig FlowSapoDataParallelismConfig
- Sets the number of concurrent processes that transfer OData records from your SAP instance.
- objectPath string
- The object path specified in the SAPOData flow source.
- paginationConfig FlowSapoDataPaginationConfig
- Sets the page size for each concurrent process that transfers OData records from your SAP instance.
- parallelismConfig FlowSapoDataParallelismConfig
- Sets the number of concurrent processes that transfer OData records from your SAP instance.
- object_path str
- The object path specified in the SAPOData flow source.
- pagination_config FlowSapoDataPaginationConfig
- Sets the page size for each concurrent process that transfers OData records from your SAP instance.
- parallelism_config FlowSapoDataParallelismConfig
- Sets the number of concurrent processes that transfer OData records from your SAP instance.
- objectPath String
- The object path specified in the SAPOData flow source.
- paginationConfig Property Map
- Sets the page size for each concurrent process that transfers OData records from your SAP instance.
- parallelismConfig Property Map
- Sets the number of concurrent processes that transfer OData records from your SAP instance.
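For orientation, here is a minimal TypeScript sketch of an SAPOData source-properties object using the camelCase names documented above; the object path and numeric limits are illustrative placeholders rather than values from this reference.

// Hypothetical SAPOData source properties; names follow the TypeScript variant above.
const sapoDataSourceProperties = {
    objectPath: "MyODataService/Orders",       // placeholder OData object path
    paginationConfig: { maxPageSize: 3000 },   // OData transfers allow at most 3,000 records per page
    parallelismConfig: { maxParallelism: 5 },  // placeholder number of concurrent retrieval processes
};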
FlowScheduledTriggerProperties   
- ScheduleExpression string
- The scheduling expression that determines the rate at which the schedule will run, for example rate(5minutes).
- DataPullMode Pulumi.AwsNative.AppFlow.FlowScheduledTriggerPropertiesDataPullMode
- Specifies whether a scheduled flow has an incremental data transfer or a complete data transfer for each flow run.
- FirstExecutionFrom double
- Specifies the date range for the records to import from the connector in the first flow run.
- FlowErrorDeactivationThreshold int
- Defines how many times a scheduled flow fails consecutively before Amazon AppFlow deactivates it.
- ScheduleEndTime double
- The time at which the scheduled flow ends. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-27T13:00:00-07:00.
- ScheduleOffset double
- Specifies the optional offset that is added to the time interval for a schedule-triggered flow.
- ScheduleStartTime double
- The time at which the scheduled flow starts. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-26T13:00:00-07:00.
- TimeZone string
- Specifies the time zone used when referring to the dates and times of a scheduled flow, such as America/New_York. This time zone is only a descriptive label. It doesn't affect how Amazon AppFlow interprets the timestamps that you specify to schedule the flow. If you want to schedule a flow by using times in a particular time zone, indicate the time zone as a UTC offset in your timestamps. For example, the UTC offsets for the America/New_York time zone are -04:00 EDT and -05:00 EST.
- ScheduleExpression string
- The scheduling expression that determines the rate at which the schedule will run, for example rate(5minutes).
- DataPullMode FlowScheduledTriggerPropertiesDataPullMode
- Specifies whether a scheduled flow has an incremental data transfer or a complete data transfer for each flow run.
- FirstExecutionFrom float64
- Specifies the date range for the records to import from the connector in the first flow run.
- FlowErrorDeactivationThreshold int
- Defines how many times a scheduled flow fails consecutively before Amazon AppFlow deactivates it.
- ScheduleEndTime float64
- The time at which the scheduled flow ends. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-27T13:00:00-07:00.
- ScheduleOffset float64
- Specifies the optional offset that is added to the time interval for a schedule-triggered flow.
- ScheduleStartTime float64
- The time at which the scheduled flow starts. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-26T13:00:00-07:00.
- TimeZone string
- Specifies the time zone used when referring to the dates and times of a scheduled flow, such as America/New_York. This time zone is only a descriptive label. It doesn't affect how Amazon AppFlow interprets the timestamps that you specify to schedule the flow. If you want to schedule a flow by using times in a particular time zone, indicate the time zone as a UTC offset in your timestamps. For example, the UTC offsets for the America/New_York time zone are -04:00 EDT and -05:00 EST.
- scheduleExpression String
- The scheduling expression that determines the rate at which the schedule will run, for example rate(5minutes).
- dataPullMode FlowScheduledTriggerPropertiesDataPullMode
- Specifies whether a scheduled flow has an incremental data transfer or a complete data transfer for each flow run.
- firstExecutionFrom Double
- Specifies the date range for the records to import from the connector in the first flow run.
- flowErrorDeactivationThreshold Integer
- Defines how many times a scheduled flow fails consecutively before Amazon AppFlow deactivates it.
- scheduleEndTime Double
- The time at which the scheduled flow ends. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-27T13:00:00-07:00.
- scheduleOffset Double
- Specifies the optional offset that is added to the time interval for a schedule-triggered flow.
- scheduleStartTime Double
- The time at which the scheduled flow starts. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-26T13:00:00-07:00.
- timeZone String
- Specifies the time zone used when referring to the dates and times of a scheduled flow, such as America/New_York. This time zone is only a descriptive label. It doesn't affect how Amazon AppFlow interprets the timestamps that you specify to schedule the flow. If you want to schedule a flow by using times in a particular time zone, indicate the time zone as a UTC offset in your timestamps. For example, the UTC offsets for the America/New_York time zone are -04:00 EDT and -05:00 EST.
- scheduleExpression string
- The scheduling expression that determines the rate at which the schedule will run, for example rate(5minutes).
- dataPullMode FlowScheduledTriggerPropertiesDataPullMode
- Specifies whether a scheduled flow has an incremental data transfer or a complete data transfer for each flow run.
- firstExecutionFrom number
- Specifies the date range for the records to import from the connector in the first flow run.
- flowErrorDeactivationThreshold number
- Defines how many times a scheduled flow fails consecutively before Amazon AppFlow deactivates it.
- scheduleEndTime number
- The time at which the scheduled flow ends. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-27T13:00:00-07:00.
- scheduleOffset number
- Specifies the optional offset that is added to the time interval for a schedule-triggered flow.
- scheduleStartTime number
- The time at which the scheduled flow starts. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-26T13:00:00-07:00.
- timeZone string
- Specifies the time zone used when referring to the dates and times of a scheduled flow, such as America/New_York. This time zone is only a descriptive label. It doesn't affect how Amazon AppFlow interprets the timestamps that you specify to schedule the flow. If you want to schedule a flow by using times in a particular time zone, indicate the time zone as a UTC offset in your timestamps. For example, the UTC offsets for the America/New_York time zone are -04:00 EDT and -05:00 EST.
- schedule_expression str
- The scheduling expression that determines the rate at which the schedule will run, for example rate(5minutes).
- data_pull_mode FlowScheduledTriggerPropertiesDataPullMode
- Specifies whether a scheduled flow has an incremental data transfer or a complete data transfer for each flow run.
- first_execution_from float
- Specifies the date range for the records to import from the connector in the first flow run.
- flow_error_deactivation_threshold int
- Defines how many times a scheduled flow fails consecutively before Amazon AppFlow deactivates it.
- schedule_end_time float
- The time at which the scheduled flow ends. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-27T13:00:00-07:00.
- schedule_offset float
- Specifies the optional offset that is added to the time interval for a schedule-triggered flow.
- schedule_start_time float
- The time at which the scheduled flow starts. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-26T13:00:00-07:00.
- time_zone str
- Specifies the time zone used when referring to the dates and times of a scheduled flow, such as America/New_York. This time zone is only a descriptive label. It doesn't affect how Amazon AppFlow interprets the timestamps that you specify to schedule the flow. If you want to schedule a flow by using times in a particular time zone, indicate the time zone as a UTC offset in your timestamps. For example, the UTC offsets for the America/New_York time zone are -04:00 EDT and -05:00 EST.
- scheduleExpression String
- The scheduling expression that determines the rate at which the schedule will run, for example rate(5minutes).
- dataPullMode "Incremental" | "Complete"
- Specifies whether a scheduled flow has an incremental data transfer or a complete data transfer for each flow run.
- firstExecutionFrom Number
- Specifies the date range for the records to import from the connector in the first flow run.
- flowErrorDeactivationThreshold Number
- Defines how many times a scheduled flow fails consecutively before Amazon AppFlow deactivates it.
- scheduleEndTime Number
- The time at which the scheduled flow ends. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-27T13:00:00-07:00.
- scheduleOffset Number
- Specifies the optional offset that is added to the time interval for a schedule-triggered flow.
- scheduleStartTime Number
- The time at which the scheduled flow starts. The time is formatted as a timestamp that follows the ISO 8601 standard, such as 2022-04-26T13:00:00-07:00.
- timeZone String
- Specifies the time zone used when referring to the dates and times of a scheduled flow, such as America/New_York. This time zone is only a descriptive label. It doesn't affect how Amazon AppFlow interprets the timestamps that you specify to schedule the flow. If you want to schedule a flow by using times in a particular time zone, indicate the time zone as a UTC offset in your timestamps. For example, the UTC offsets for the America/New_York time zone are -04:00 EDT and -05:00 EST.
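As a rough TypeScript sketch, a scheduled-trigger properties object might look as follows. The schema types the start and end times as numbers, so epoch seconds are assumed here; the expression is the example from the description above and every other value is a placeholder.

const scheduledTriggerProperties = {
    scheduleExpression: "rate(5minutes)",                                // example expression from the description above
    dataPullMode: "Incremental",                                         // or "Complete"
    scheduleStartTime: Date.parse("2022-04-26T13:00:00-07:00") / 1000,   // assumed epoch seconds
    scheduleEndTime: Date.parse("2022-04-27T13:00:00-07:00") / 1000,     // assumed epoch seconds
    flowErrorDeactivationThreshold: 3,                                   // placeholder consecutive-failure limit
    timeZone: "America/New_York",                                        // descriptive label only; offsets go in the timestamps
};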
FlowScheduledTriggerPropertiesDataPullMode      
FlowServiceNowConnectorOperator    
FlowServiceNowSourceProperties    
- Object string
- The object specified in the ServiceNow flow source.
- Object string
- The object specified in the ServiceNow flow source.
- object String
- The object specified in the ServiceNow flow source.
- object string
- The object specified in the ServiceNow flow source.
- object str
- The object specified in the ServiceNow flow source.
- object String
- The object specified in the ServiceNow flow source.
FlowSingularConnectorOperator   
FlowSingularSourceProperties   
- Object string
- The object specified in the Singular flow source.
- Object string
- The object specified in the Singular flow source.
- object String
- The object specified in the Singular flow source.
- object string
- The object specified in the Singular flow source.
- object str
- The object specified in the Singular flow source.
- object String
- The object specified in the Singular flow source.
FlowSlackConnectorOperator   
FlowSlackSourceProperties   
- Object string
- The object specified in the Slack flow source.
- Object string
- The object specified in the Slack flow source.
- object String
- The object specified in the Slack flow source.
- object string
- The object specified in the Slack flow source.
- object str
- The object specified in the Slack flow source.
- object String
- The object specified in the Slack flow source.
FlowSnowflakeDestinationProperties   
- IntermediateBucketName string
- The intermediate bucket that Amazon AppFlow uses when moving data into Snowflake.
- Object string
- The object specified in the Snowflake flow destination.
- BucketPrefix string
- The object key for the destination bucket in which Amazon AppFlow places the files.
- ErrorHandlingConfig Pulumi.AwsNative.AppFlow.Inputs.FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the Snowflake destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- IntermediateBucketName string
- The intermediate bucket that Amazon AppFlow uses when moving data into Snowflake.
- Object string
- The object specified in the Snowflake flow destination.
- BucketPrefix string
- The object key for the destination bucket in which Amazon AppFlow places the files.
- ErrorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the Snowflake destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- intermediateBucketName String
- The intermediate bucket that Amazon AppFlow uses when moving data into Snowflake.
- object String
- The object specified in the Snowflake flow destination.
- bucketPrefix String
- The object key for the destination bucket in which Amazon AppFlow places the files.
- errorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the Snowflake destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- intermediateBucketName string
- The intermediate bucket that Amazon AppFlow uses when moving data into Snowflake.
- object string
- The object specified in the Snowflake flow destination.
- bucketPrefix string
- The object key for the destination bucket in which Amazon AppFlow places the files.
- errorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the Snowflake destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- intermediate_bucket_name str
- The intermediate bucket that Amazon AppFlow uses when moving data into Snowflake.
- object str
- The object specified in the Snowflake flow destination.
- bucket_prefix str
- The object key for the destination bucket in which Amazon AppFlow places the files.
- error_handling_config FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the Snowflake destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- intermediateBucketName String
- The intermediate bucket that Amazon AppFlow uses when moving data into Snowflake.
- object String
- The object specified in the Snowflake flow destination.
- bucketPrefix String
- The object key for the destination bucket in which Amazon AppFlow places the files.
- errorHandlingConfig Property Map
- The settings that determine how Amazon AppFlow handles an error when placing data in the Snowflake destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
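A minimal TypeScript sketch of the Snowflake destination properties above; the object, bucket, and prefix are placeholders.

const snowflakeDestinationProperties = {
    object: "MYSCHEMA.MYTABLE",                   // placeholder Snowflake object
    intermediateBucketName: "my-staging-bucket",  // placeholder S3 bucket AppFlow stages data in
    bucketPrefix: "snowflake/",                   // optional object key prefix (placeholder)
};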
FlowSourceConnectorProperties   
- Amplitude
Pulumi.Aws Native. App Flow. Inputs. Flow Amplitude Source Properties 
- Specifies the information that is required for querying Amplitude.
- CustomConnector Pulumi.Aws Native. App Flow. Inputs. Flow Custom Connector Source Properties 
- The properties that are applied when the custom connector is being used as a source.
- Datadog
Pulumi.Aws Native. App Flow. Inputs. Flow Datadog Source Properties 
- Specifies the information that is required for querying Datadog.
- Dynatrace
Pulumi.Aws Native. App Flow. Inputs. Flow Dynatrace Source Properties 
- Specifies the information that is required for querying Dynatrace.
- GoogleAnalytics Pulumi.Aws Native. App Flow. Inputs. Flow Google Analytics Source Properties 
- Specifies the information that is required for querying Google Analytics.
- InforNexus Pulumi.Aws Native. App Flow. Inputs. Flow Infor Nexus Source Properties 
- Specifies the information that is required for querying Infor Nexus.
- Marketo
Pulumi.Aws Native. App Flow. Inputs. Flow Marketo Source Properties 
- Specifies the information that is required for querying Marketo.
- Pardot
Pulumi.Aws Native. App Flow. Inputs. Flow Pardot Source Properties 
- Specifies the information that is required for querying Salesforce Pardot.
- S3
Pulumi.Aws Native. App Flow. Inputs. Flow S3Source Properties 
- Specifies the information that is required for querying Amazon S3.
- Salesforce
Pulumi.Aws Native. App Flow. Inputs. Flow Salesforce Source Properties 
- Specifies the information that is required for querying Salesforce.
- SapoData Pulumi.Aws Native. App Flow. Inputs. Flow Sapo Data Source Properties 
- The properties that are applied when using SAPOData as a flow source.
- ServiceNow Pulumi.Aws Native. App Flow. Inputs. Flow Service Now Source Properties 
- Specifies the information that is required for querying ServiceNow.
- Singular
Pulumi.Aws Native. App Flow. Inputs. Flow Singular Source Properties 
- Specifies the information that is required for querying Singular.
- Slack
Pulumi.Aws Native. App Flow. Inputs. Flow Slack Source Properties 
- Specifies the information that is required for querying Slack.
- Trendmicro
Pulumi.Aws Native. App Flow. Inputs. Flow Trendmicro Source Properties 
- Specifies the information that is required for querying Trend Micro.
- Veeva
Pulumi.Aws Native. App Flow. Inputs. Flow Veeva Source Properties 
- Specifies the information that is required for querying Veeva.
- Zendesk
Pulumi.Aws Native. App Flow. Inputs. Flow Zendesk Source Properties 
- Specifies the information that is required for querying Zendesk.
- Amplitude
FlowAmplitude Source Properties 
- Specifies the information that is required for querying Amplitude.
- CustomConnector FlowCustom Connector Source Properties 
- The properties that are applied when the custom connector is being used as a source.
- Datadog
FlowDatadog Source Properties 
- Specifies the information that is required for querying Datadog.
- Dynatrace
FlowDynatrace Source Properties 
- Specifies the information that is required for querying Dynatrace.
- GoogleAnalytics FlowGoogle Analytics Source Properties 
- Specifies the information that is required for querying Google Analytics.
- InforNexus FlowInfor Nexus Source Properties 
- Specifies the information that is required for querying Infor Nexus.
- Marketo
FlowMarketo Source Properties 
- Specifies the information that is required for querying Marketo.
- Pardot
FlowPardot Source Properties 
- Specifies the information that is required for querying Salesforce Pardot.
- S3
FlowS3Source Properties 
- Specifies the information that is required for querying Amazon S3.
- Salesforce
FlowSalesforce Source Properties 
- Specifies the information that is required for querying Salesforce.
- SapoData FlowSapo Data Source Properties 
- The properties that are applied when using SAPOData as a flow source.
- ServiceNow FlowService Now Source Properties 
- Specifies the information that is required for querying ServiceNow.
- Singular
FlowSingular Source Properties 
- Specifies the information that is required for querying Singular.
- Slack
FlowSlack Source Properties 
- Specifies the information that is required for querying Slack.
- Trendmicro
FlowTrendmicro Source Properties 
- Specifies the information that is required for querying Trend Micro.
- Veeva
FlowVeeva Source Properties 
- Specifies the information that is required for querying Veeva.
- Zendesk
FlowZendesk Source Properties 
- Specifies the information that is required for querying Zendesk.
- amplitude
FlowAmplitude Source Properties 
- Specifies the information that is required for querying Amplitude.
- customConnector FlowCustom Connector Source Properties 
- The properties that are applied when the custom connector is being used as a source.
- datadog
FlowDatadog Source Properties 
- Specifies the information that is required for querying Datadog.
- dynatrace
FlowDynatrace Source Properties 
- Specifies the information that is required for querying Dynatrace.
- googleAnalytics FlowGoogle Analytics Source Properties 
- Specifies the information that is required for querying Google Analytics.
- inforNexus FlowInfor Nexus Source Properties 
- Specifies the information that is required for querying Infor Nexus.
- marketo
FlowMarketo Source Properties 
- Specifies the information that is required for querying Marketo.
- pardot
FlowPardot Source Properties 
- Specifies the information that is required for querying Salesforce Pardot.
- s3
FlowS3Source Properties 
- Specifies the information that is required for querying Amazon S3.
- salesforce
FlowSalesforce Source Properties 
- Specifies the information that is required for querying Salesforce.
- sapoData FlowSapo Data Source Properties 
- The properties that are applied when using SAPOData as a flow source.
- serviceNow FlowService Now Source Properties 
- Specifies the information that is required for querying ServiceNow.
- singular
FlowSingular Source Properties 
- Specifies the information that is required for querying Singular.
- slack
FlowSlack Source Properties 
- Specifies the information that is required for querying Slack.
- trendmicro
FlowTrendmicro Source Properties 
- Specifies the information that is required for querying Trend Micro.
- veeva
FlowVeeva Source Properties 
- Specifies the information that is required for querying Veeva.
- zendesk
FlowZendesk Source Properties 
- Specifies the information that is required for querying Zendesk.
- amplitude
FlowAmplitude Source Properties 
- Specifies the information that is required for querying Amplitude.
- customConnector FlowCustom Connector Source Properties 
- The properties that are applied when the custom connector is being used as a source.
- datadog
FlowDatadog Source Properties 
- Specifies the information that is required for querying Datadog.
- dynatrace
FlowDynatrace Source Properties 
- Specifies the information that is required for querying Dynatrace.
- googleAnalytics FlowGoogle Analytics Source Properties 
- Specifies the information that is required for querying Google Analytics.
- inforNexus FlowInfor Nexus Source Properties 
- Specifies the information that is required for querying Infor Nexus.
- marketo
FlowMarketo Source Properties 
- Specifies the information that is required for querying Marketo.
- pardot
FlowPardot Source Properties 
- Specifies the information that is required for querying Salesforce Pardot.
- s3
FlowS3Source Properties 
- Specifies the information that is required for querying Amazon S3.
- salesforce
FlowSalesforce Source Properties 
- Specifies the information that is required for querying Salesforce.
- sapoData FlowSapo Data Source Properties 
- The properties that are applied when using SAPOData as a flow source.
- serviceNow FlowService Now Source Properties 
- Specifies the information that is required for querying ServiceNow.
- singular
FlowSingular Source Properties 
- Specifies the information that is required for querying Singular.
- slack
FlowSlack Source Properties 
- Specifies the information that is required for querying Slack.
- trendmicro
FlowTrendmicro Source Properties 
- Specifies the information that is required for querying Trend Micro.
- veeva
FlowVeeva Source Properties 
- Specifies the information that is required for querying Veeva.
- zendesk
FlowZendesk Source Properties 
- Specifies the information that is required for querying Zendesk.
- amplitude
FlowAmplitude Source Properties 
- Specifies the information that is required for querying Amplitude.
- custom_connector FlowCustom Connector Source Properties 
- The properties that are applied when the custom connector is being used as a source.
- datadog
FlowDatadog Source Properties 
- Specifies the information that is required for querying Datadog.
- dynatrace
FlowDynatrace Source Properties 
- Specifies the information that is required for querying Dynatrace.
- google_analytics FlowGoogle Analytics Source Properties 
- Specifies the information that is required for querying Google Analytics.
- infor_nexus FlowInfor Nexus Source Properties 
- Specifies the information that is required for querying Infor Nexus.
- marketo
FlowMarketo Source Properties 
- Specifies the information that is required for querying Marketo.
- pardot
FlowPardot Source Properties 
- Specifies the information that is required for querying Salesforce Pardot.
- s3
FlowS3Source Properties 
- Specifies the information that is required for querying Amazon S3.
- salesforce
FlowSalesforce Source Properties 
- Specifies the information that is required for querying Salesforce.
- sapo_data FlowSapo Data Source Properties 
- The properties that are applied when using SAPOData as a flow source.
- service_now FlowService Now Source Properties 
- Specifies the information that is required for querying ServiceNow.
- singular
FlowSingular Source Properties 
- Specifies the information that is required for querying Singular.
- slack
FlowSlack Source Properties 
- Specifies the information that is required for querying Slack.
- trendmicro
FlowTrendmicro Source Properties 
- Specifies the information that is required for querying Trend Micro.
- veeva
FlowVeeva Source Properties 
- Specifies the information that is required for querying Veeva.
- zendesk
FlowZendesk Source Properties 
- Specifies the information that is required for querying Zendesk.
- amplitude Property Map
- Specifies the information that is required for querying Amplitude.
- customConnector Property Map
- The properties that are applied when the custom connector is being used as a source.
- datadog Property Map
- Specifies the information that is required for querying Datadog.
- dynatrace Property Map
- Specifies the information that is required for querying Dynatrace.
- googleAnalytics Property Map
- Specifies the information that is required for querying Google Analytics.
- inforNexus Property Map
- Specifies the information that is required for querying Infor Nexus.
- marketo Property Map
- Specifies the information that is required for querying Marketo.
- pardot Property Map
- Specifies the information that is required for querying Salesforce Pardot.
- s3 Property Map
- Specifies the information that is required for querying Amazon S3.
- salesforce Property Map
- Specifies the information that is required for querying Salesforce.
- sapoData Property Map
- The properties that are applied when using SAPOData as a flow source.
- serviceNow Property Map
- Specifies the information that is required for querying ServiceNow.
- singular Property Map
- Specifies the information that is required for querying Singular.
- slack Property Map
- Specifies the information that is required for querying Slack.
- trendmicro Property Map
- Specifies the information that is required for querying Trend Micro.
- veeva Property Map
- Specifies the information that is required for querying Veeva.
- zendesk Property Map
- Specifies the information that is required for querying Zendesk.
FlowSourceFlowConfig   
- ConnectorType Pulumi.AwsNative.AppFlow.FlowConnectorType
- Type of source connector
- SourceConnectorProperties Pulumi.AwsNative.AppFlow.Inputs.FlowSourceConnectorProperties
- Source connector details required to query a connector
- ApiVersion string
- The API version that the connector uses.
- ConnectorProfileName string
- Name of source connector profile
- IncrementalPullConfig Pulumi.AwsNative.AppFlow.Inputs.FlowIncrementalPullConfig
- Configuration for scheduled incremental data pull
- ConnectorType FlowConnectorType
- Type of source connector
- SourceConnectorProperties FlowSourceConnectorProperties
- Source connector details required to query a connector
- ApiVersion string
- The API version that the connector uses.
- ConnectorProfileName string
- Name of source connector profile
- IncrementalPullConfig FlowIncrementalPullConfig
- Configuration for scheduled incremental data pull
- connectorType FlowConnectorType
- Type of source connector
- sourceConnectorProperties FlowSourceConnectorProperties
- Source connector details required to query a connector
- apiVersion String
- The API version that the connector uses.
- connectorProfileName String
- Name of source connector profile
- incrementalPullConfig FlowIncrementalPullConfig
- Configuration for scheduled incremental data pull
- connectorType FlowConnectorType
- Type of source connector
- sourceConnectorProperties FlowSourceConnectorProperties
- Source connector details required to query a connector
- apiVersion string
- The API version that the connector uses.
- connectorProfileName string
- Name of source connector profile
- incrementalPullConfig FlowIncrementalPullConfig
- Configuration for scheduled incremental data pull
- connector_type FlowConnectorType
- Type of source connector
- source_connector_properties FlowSourceConnectorProperties
- Source connector details required to query a connector
- api_version str
- The API version that the connector uses.
- connector_profile_name str
- Name of source connector profile
- incremental_pull_config FlowIncrementalPullConfig
- Configuration for scheduled incremental data pull
- connectorType "SAPOData" | "Salesforce" | "Pardot" | "Singular" | "Slack" | "Redshift" | "S3" | "Marketo" | "Googleanalytics" | "Zendesk" | "Servicenow" | "Datadog" | "Trendmicro" | "Snowflake" | "Dynatrace" | "Infornexus" | "Amplitude" | "Veeva" | "CustomConnector" | "EventBridge" | "Upsolver" | "LookoutMetrics"
- Type of source connector
- sourceConnectorProperties Property Map
- Source connector details required to query a connector
- apiVersion String
- The API version that the connector uses.
- connectorProfileName String
- Name of source connector profile
- incrementalPullConfig Property Map
- Configuration for scheduled incremental data pull
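Tying this back to the invoke itself, here is a short TypeScript sketch that reads the source flow configuration from an existing flow; the flow name is a placeholder.

import * as aws_native from "@pulumi/aws-native";

// Look up an existing flow (output form) and inspect where it pulls data from.
const flow = aws_native.appflow.getFlowOutput({ flowName: "my-flow" });  // placeholder flow name
export const sourceConnectorType = flow.apply(f => f.sourceFlowConfig?.connectorType);
export const sourceProfileName = flow.apply(f => f.sourceFlowConfig?.connectorProfileName);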
FlowStatus 
FlowSuccessResponseHandlingConfig    
- BucketName string
- The name of the Amazon S3 bucket.
- BucketPrefix string
- The Amazon S3 bucket prefix.
- BucketName string
- The name of the Amazon S3 bucket.
- BucketPrefix string
- The Amazon S3 bucket prefix.
- bucketName String
- The name of the Amazon S3 bucket.
- bucketPrefix String
- The Amazon S3 bucket prefix.
- bucketName string
- The name of the Amazon S3 bucket.
- bucketPrefix string
- The Amazon S3 bucket prefix.
- bucket_name str
- The name of the Amazon S3 bucket.
- bucket_prefix str
- The Amazon S3 bucket prefix.
- bucketName String
- The name of the Amazon S3 bucket.
- bucketPrefix String
- The Amazon S3 bucket prefix.
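As a small TypeScript sketch, the success-response handling settings amount to an S3 location; both values below are placeholders.

const successResponseHandlingConfig = {
    bucketName: "my-appflow-responses",   // placeholder S3 bucket for connector success responses
    bucketPrefix: "sapodata/",            // placeholder object key prefix
};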
FlowTask 
- SourceFields List<string>
- Source fields on which particular task will be applied
- TaskType Pulumi.Aws Native. App Flow. Flow Task Type 
- Type of task
- ConnectorOperator Pulumi.Aws Native. App Flow. Inputs. Flow Connector Operator 
- Operation to be performed on provided source fields
- DestinationField string
- A field value on which source field should be validated
- TaskProperties List<Pulumi.Aws Native. App Flow. Inputs. Flow Task Properties Object> 
- A Map used to store task related info
- SourceFields []string
- Source fields on which particular task will be applied
- TaskType FlowTask Type 
- Type of task
- ConnectorOperator FlowConnector Operator 
- Operation to be performed on provided source fields
- DestinationField string
- A field value on which source field should be validated
- TaskProperties []FlowTask Properties Object 
- A Map used to store task related info
- sourceFields List<String>
- Source fields on which particular task will be applied
- taskType FlowTask Type 
- Type of task
- connectorOperator FlowConnector Operator 
- Operation to be performed on provided source fields
- destinationField String
- A field value on which source field should be validated
- taskProperties List<FlowTask Properties Object> 
- A Map used to store task related info
- sourceFields string[]
- Source fields on which particular task will be applied
- taskType FlowTask Type 
- Type of task
- connectorOperator FlowConnector Operator 
- Operation to be performed on provided source fields
- destinationField string
- A field value on which source field should be validated
- taskProperties FlowTask Properties Object[] 
- A Map used to store task related info
- source_fields Sequence[str]
- Source fields on which particular task will be applied
- task_type FlowTask Type 
- Type of task
- connector_operator FlowConnector Operator 
- Operation to be performed on provided source fields
- destination_field str
- A field value on which source field should be validated
- task_properties Sequence[FlowTask Properties Object] 
- A Map used to store task related info
- sourceFields List<String>
- Source fields on which particular task will be applied
- taskType "Arithmetic" | "Filter" | "Map" | "Map_all" | "Mask" | "Merge" | "Passthrough" | "Truncate" | "Validate" | "Partition" 
- Type of task
- connectorOperator Property Map
- Operation to be performed on provided source fields
- destinationField String
- A field value on which source field should be validated
- taskProperties List<Property Map>
- A Map used to store task related info
FlowTaskPropertiesObject   
- Key
Pulumi.Aws Native. App Flow. Flow Operator Properties Keys 
- The task property key.
- Value string
- The task property value.
- Key
FlowOperator Properties Keys 
- The task property key.
- Value string
- The task property value.
- key
FlowOperator Properties Keys 
- The task property key.
- value String
- The task property value.
- key
FlowOperator Properties Keys 
- The task property key.
- value string
- The task property value.
- key
FlowOperator Properties Keys 
- The task property key.
- value str
- The task property value.
- key "VALUE" | "VALUES" | "DATA_TYPE" | "UPPER_BOUND" | "LOWER_BOUND" | "SOURCE_DATA_TYPE" | "DESTINATION_DATA_TYPE" | "VALIDATION_ACTION" | "MASK_VALUE" | "MASK_LENGTH" | "TRUNCATE_LENGTH" | "MATH_OPERATION_FIELDS_ORDER" | "CONCAT_FORMAT" | "SUBFIELD_CATEGORY_MAP" | "EXCLUDE_SOURCE_FIELDS_LIST" | "INCLUDE_NEW_FIELDS" | "ORDERED_PARTITION_KEYS_LIST"
- The task property key.
- value String
- The task property value.
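To make the task shape concrete, here is a hedged TypeScript sketch of a single mapping task. It assumes a Salesforce source; the connectorOperator key and operator value, the field names, and the data types are illustrative, not taken from this reference.

const mapIdTask = {
    taskType: "Map",                              // one of the task types listed in FlowTask above
    sourceFields: ["Id"],                         // placeholder source field
    destinationField: "Id",                       // placeholder destination field
    connectorOperator: { salesforce: "NO_OP" },   // assumes a Salesforce source connector
    taskProperties: [
        { key: "SOURCE_DATA_TYPE", value: "id" },       // placeholder type metadata
        { key: "DESTINATION_DATA_TYPE", value: "id" },  // placeholder type metadata
    ],
};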
FlowTaskType  
FlowTrendmicroConnectorOperator   
FlowTrendmicroSourceProperties   
- Object string
- The object specified in the Trend Micro flow source.
- Object string
- The object specified in the Trend Micro flow source.
- object String
- The object specified in the Trend Micro flow source.
- object string
- The object specified in the Trend Micro flow source.
- object str
- The object specified in the Trend Micro flow source.
- object String
- The object specified in the Trend Micro flow source.
FlowTriggerConfig  
- TriggerType Pulumi.Aws Native. App Flow. Flow Trigger Type 
- Trigger type of the flow
- TriggerProperties Pulumi.Aws Native. App Flow. Inputs. Flow Scheduled Trigger Properties 
- Details required based on the type of trigger
- TriggerType FlowTrigger Type 
- Trigger type of the flow
- TriggerProperties FlowScheduled Trigger Properties 
- Details required based on the type of trigger
- triggerType FlowTrigger Type 
- Trigger type of the flow
- triggerProperties FlowScheduled Trigger Properties 
- Details required based on the type of trigger
- triggerType FlowTrigger Type 
- Trigger type of the flow
- triggerProperties FlowScheduled Trigger Properties 
- Details required based on the type of trigger
- trigger_type FlowTrigger Type 
- Trigger type of the flow
- trigger_properties FlowScheduled Trigger Properties 
- Details required based on the type of trigger
- triggerType "Scheduled" | "Event" | "OnDemand" 
- Trigger type of the flow
- triggerProperties Property Map
- Details required based on the type of trigger
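A minimal TypeScript sketch of a trigger configuration; the schedule expression reuses the example from FlowScheduledTriggerProperties above, and triggerProperties is only meaningful for scheduled flows.

const triggerConfig = {
    triggerType: "Scheduled",                                      // "Scheduled" | "Event" | "OnDemand"
    triggerProperties: { scheduleExpression: "rate(5minutes)" },   // see FlowScheduledTriggerProperties above
};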
FlowTriggerType  
FlowUpsolverDestinationProperties   
- BucketName string
- The Upsolver Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- S3OutputFormatConfig Pulumi.AwsNative.AppFlow.Inputs.FlowUpsolverS3OutputFormatConfig
- The configuration that determines how data is formatted when Upsolver is used as the flow destination.
- BucketPrefix string
- The object key for the destination Upsolver Amazon S3 bucket in which Amazon AppFlow places the files.
- BucketName string
- The Upsolver Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- S3OutputFormatConfig FlowUpsolverS3OutputFormatConfig
- The configuration that determines how data is formatted when Upsolver is used as the flow destination.
- BucketPrefix string
- The object key for the destination Upsolver Amazon S3 bucket in which Amazon AppFlow places the files.
- bucketName String
- The Upsolver Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- s3OutputFormatConfig FlowUpsolverS3OutputFormatConfig
- The configuration that determines how data is formatted when Upsolver is used as the flow destination.
- bucketPrefix String
- The object key for the destination Upsolver Amazon S3 bucket in which Amazon AppFlow places the files.
- bucketName string
- The Upsolver Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- s3OutputFormatConfig FlowUpsolverS3OutputFormatConfig
- The configuration that determines how data is formatted when Upsolver is used as the flow destination.
- bucketPrefix string
- The object key for the destination Upsolver Amazon S3 bucket in which Amazon AppFlow places the files.
- bucket_name str
- The Upsolver Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- s3_output_format_config FlowUpsolverS3OutputFormatConfig
- The configuration that determines how data is formatted when Upsolver is used as the flow destination.
- bucket_prefix str
- The object key for the destination Upsolver Amazon S3 bucket in which Amazon AppFlow places the files.
- bucketName String
- The Upsolver Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
- s3OutputFormatConfig Property Map
- The configuration that determines how data is formatted when Upsolver is used as the flow destination.
- bucketPrefix String
- The object key for the destination Upsolver Amazon S3 bucket in which Amazon AppFlow places the files.
FlowUpsolverS3OutputFormatConfig    
- PrefixConfig Pulumi.Aws Native. App Flow. Inputs. Flow Prefix Config 
- Specifies elements that Amazon AppFlow includes in the file and folder names in the flow destination.
- AggregationConfig Pulumi.Aws Native. App Flow. Inputs. Flow Aggregation Config 
- The aggregation settings that you can use to customize the output format of your flow data.
- FileType Pulumi.Aws Native. App Flow. Flow File Type 
- Indicates the file type that Amazon AppFlow places in the Upsolver Amazon S3 bucket.
- PrefixConfig FlowPrefix Config 
- Specifies elements that Amazon AppFlow includes in the file and folder names in the flow destination.
- AggregationConfig FlowAggregation Config 
- The aggregation settings that you can use to customize the output format of your flow data.
- FileType FlowFile Type 
- Indicates the file type that Amazon AppFlow places in the Upsolver Amazon S3 bucket.
- prefixConfig FlowPrefix Config 
- Specifies elements that Amazon AppFlow includes in the file and folder names in the flow destination.
- aggregationConfig FlowAggregation Config 
- The aggregation settings that you can use to customize the output format of your flow data.
- fileType FlowFile Type 
- Indicates the file type that Amazon AppFlow places in the Upsolver Amazon S3 bucket.
- prefixConfig FlowPrefix Config 
- Specifies elements that Amazon AppFlow includes in the file and folder names in the flow destination.
- aggregationConfig FlowAggregation Config 
- The aggregation settings that you can use to customize the output format of your flow data.
- fileType FlowFile Type 
- Indicates the file type that Amazon AppFlow places in the Upsolver Amazon S3 bucket.
- prefix_config FlowPrefix Config 
- Specifies elements that Amazon AppFlow includes in the file and folder names in the flow destination.
- aggregation_config FlowAggregation Config 
- The aggregation settings that you can use to customize the output format of your flow data.
- file_type FlowFile Type 
- Indicates the file type that Amazon AppFlow places in the Upsolver Amazon S3 bucket.
- prefixConfig Property Map
- Specifies elements that Amazon AppFlow includes in the file and folder names in the flow destination.
- aggregationConfig Property Map
- The aggregation settings that you can use to customize the output format of your flow data.
- fileType "CSV" | "JSON" | "PARQUET"
- Indicates the file type that Amazon AppFlow places in the Upsolver Amazon S3 bucket.
FlowVeevaConnectorOperator   
FlowVeevaSourceProperties   
- Object string
- The object specified in the Veeva flow source.
- DocumentType string
- The document type specified in the Veeva document extract flow.
- IncludeAllVersions bool
- Boolean value to include All Versions of files in Veeva document extract flow.
- IncludeRenditions bool
- Boolean value to include file renditions in Veeva document extract flow.
- IncludeSourceFiles bool
- Boolean value to include source files in Veeva document extract flow.
- Object string
- The object specified in the Veeva flow source.
- DocumentType string
- The document type specified in the Veeva document extract flow.
- IncludeAllVersions bool
- Boolean value to include All Versions of files in Veeva document extract flow.
- IncludeRenditions bool
- Boolean value to include file renditions in Veeva document extract flow.
- IncludeSourceFiles bool
- Boolean value to include source files in Veeva document extract flow.
- object String
- The object specified in the Veeva flow source.
- documentType String
- The document type specified in the Veeva document extract flow.
- includeAllVersions Boolean
- Boolean value to include All Versions of files in Veeva document extract flow.
- includeRenditions Boolean
- Boolean value to include file renditions in Veeva document extract flow.
- includeSourceFiles Boolean
- Boolean value to include source files in Veeva document extract flow.
- object string
- The object specified in the Veeva flow source.
- documentType string
- The document type specified in the Veeva document extract flow.
- includeAllVersions boolean
- Boolean value to include All Versions of files in Veeva document extract flow.
- includeRenditions boolean
- Boolean value to include file renditions in Veeva document extract flow.
- includeSourceFiles boolean
- Boolean value to include source files in Veeva document extract flow.
- object str
- The object specified in the Veeva flow source.
- document_type str
- The document type specified in the Veeva document extract flow.
- include_all_versions bool
- Boolean value to include All Versions of files in Veeva document extract flow.
- include_renditions bool
- Boolean value to include file renditions in Veeva document extract flow.
- include_source_files bool
- Boolean value to include source files in Veeva document extract flow.
- object String
- The object specified in the Veeva flow source.
- documentType String
- The document type specified in the Veeva document extract flow.
- includeAllVersions Boolean
- Boolean value to include All Versions of files in Veeva document extract flow.
- includeRenditions Boolean
- Boolean value to include file renditions in Veeva document extract flow.
- includeSourceFiles Boolean
- Boolean value to include source files in Veeva document extract flow.
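A brief TypeScript sketch of the Veeva source properties above; the object and document type are placeholders.

const veevaSourceProperties = {
    object: "documents",           // placeholder Veeva object
    documentType: "Promotional",   // placeholder document type for a document extract flow
    includeAllVersions: false,     // skip earlier file versions
    includeRenditions: false,      // skip file renditions
    includeSourceFiles: true,      // include the source files themselves
};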
FlowWriteOperationType   
FlowZendeskConnectorOperator   
FlowZendeskDestinationProperties   
- Object string
- The object specified in the Zendesk flow destination.
- ErrorHandlingConfig Pulumi.AwsNative.AppFlow.Inputs.FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- IdFieldNames List<string>
- List of fields used as ID when performing a write operation.
- WriteOperationType Pulumi.AwsNative.AppFlow.FlowWriteOperationType
- The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- Object string
- The object specified in the Zendesk flow destination.
- ErrorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- IdFieldNames []string
- List of fields used as ID when performing a write operation.
- WriteOperationType FlowWriteOperationType
- The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- object String
- The object specified in the Zendesk flow destination.
- errorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames List<String>
- List of fields used as ID when performing a write operation.
- writeOperationType FlowWriteOperationType
- The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- object string
- The object specified in the Zendesk flow destination.
- errorHandlingConfig FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames string[]
- List of fields used as ID when performing a write operation.
- writeOperationType FlowWriteOperationType
- The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- object str
- The object specified in the Zendesk flow destination.
- error_handling_config FlowErrorHandlingConfig
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- id_field_names Sequence[str]
- List of fields used as ID when performing a write operation.
- write_operation_type FlowWriteOperationType
- The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
- object String
- The object specified in the Zendesk flow destination.
- errorHandlingConfig Property Map
- The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details.
- idFieldNames List<String>
- List of fields used as ID when performing a write operation.
- writeOperationType "INSERT" | "UPSERT" | "UPDATE" | "DELETE"
- The possible write operations in the destination connector. When this value is not provided, this defaults to the INSERT operation.
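As a final sketch, a Zendesk destination-properties object in TypeScript that upserts on an ID field; the object and field names are placeholders.

const zendeskDestinationProperties = {
    object: "tickets",               // placeholder Zendesk object
    writeOperationType: "UPSERT",    // defaults to INSERT when omitted
    idFieldNames: ["external_id"],   // placeholder ID field used to match records for the upsert
};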
FlowZendeskSourceProperties   
- Object string
- The object specified in the Zendesk flow source.
- Object string
- The object specified in the Zendesk flow source.
- object String
- The object specified in the Zendesk flow source.
- object string
- The object specified in the Zendesk flow source.
- object str
- The object specified in the Zendesk flow source.
- object String
- The object specified in the Zendesk flow source.
Tag
Package Details
- Repository
- AWS Native pulumi/pulumi-aws-native
- License
- Apache-2.0
