We recommend new projects start with resources from the AWS provider.
aws-native.mwaa.Environment
Resource schema for AWS::MWAA::Environment
Create Environment Resource
Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.
Constructor syntax
new Environment(name: string, args?: EnvironmentArgs, opts?: CustomResourceOptions);
@overload
def Environment(resource_name: str,
                args: Optional[EnvironmentArgs] = None,
                opts: Optional[ResourceOptions] = None)
@overload
def Environment(resource_name: str,
                opts: Optional[ResourceOptions] = None,
                airflow_configuration_options: Optional[Any] = None,
                airflow_version: Optional[str] = None,
                dag_s3_path: Optional[str] = None,
                endpoint_management: Optional[EnvironmentEndpointManagement] = None,
                environment_class: Optional[str] = None,
                execution_role_arn: Optional[str] = None,
                kms_key: Optional[str] = None,
                logging_configuration: Optional[EnvironmentLoggingConfigurationArgs] = None,
                max_webservers: Optional[int] = None,
                max_workers: Optional[int] = None,
                min_webservers: Optional[int] = None,
                min_workers: Optional[int] = None,
                name: Optional[str] = None,
                network_configuration: Optional[EnvironmentNetworkConfigurationArgs] = None,
                plugins_s3_object_version: Optional[str] = None,
                plugins_s3_path: Optional[str] = None,
                requirements_s3_object_version: Optional[str] = None,
                requirements_s3_path: Optional[str] = None,
                schedulers: Optional[int] = None,
                source_bucket_arn: Optional[str] = None,
                startup_script_s3_object_version: Optional[str] = None,
                startup_script_s3_path: Optional[str] = None,
                tags: Optional[Any] = None,
                webserver_access_mode: Optional[EnvironmentWebserverAccessMode] = None,
                weekly_maintenance_window_start: Optional[str] = None)
func NewEnvironment(ctx *Context, name string, args *EnvironmentArgs, opts ...ResourceOption) (*Environment, error)
public Environment(string name, EnvironmentArgs? args = null, CustomResourceOptions? opts = null)
public Environment(String name, EnvironmentArgs args)
public Environment(String name, EnvironmentArgs args, CustomResourceOptions options)
type: aws-native:mwaa:Environment
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
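For orientation, the following is a minimal, hedged TypeScript sketch of calling the constructor. The bucket ARN, role ARN, security group, and subnet IDs are placeholders, and the full set of accepted inputs is documented under Inputs below.

import * as awsNative from "@pulumi/aws-native";

// Minimal sketch of constructing an MWAA environment.
// All ARNs and IDs below are placeholders, not real resources.
const example = new awsNative.mwaa.Environment("example-environment", {
    sourceBucketArn: "arn:aws:s3:::my-airflow-bucket-unique-name",
    dagS3Path: "dags",
    executionRoleArn: "arn:aws:iam::123456789:role/my-execution-role",
    networkConfiguration: {
        securityGroupIds: ["sg-0123456789abcdef0"],
        subnetIds: ["subnet-0123456789abcdef0", "subnet-0fedcba9876543210"],
    },
});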
Parameters
- name string
- The unique name of the resource.
- args EnvironmentArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- resource_name str
- The unique name of the resource.
- args EnvironmentArgs
- The arguments to resource properties.
- opts ResourceOptions
- Bag of options to control resource's behavior.
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args EnvironmentArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args EnvironmentArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- name String
- The unique name of the resource.
- args EnvironmentArgs
- The arguments to resource properties.
- options CustomResourceOptions
- Bag of options to control resource's behavior.
Environment Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.
Inputs
In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.
The Environment resource accepts the following input properties:
- AirflowConfigurationOptions object
- Key/value pairs representing Airflow configuration variables. Keys are prefixed by their section, so [core] dags_folder={AIRFLOW_HOME}/dags would be represented as "core.dags_folder": "{AIRFLOW_HOME}/dags". Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- AirflowVersion string
- The version of Apache Airflow to use for the environment. If no value is specified, defaults to the latest version. If you specify a newer version number for an existing environment, the version update requires some service interruption before taking effect. Allowed values: 1.10.12 | 2.0.2 | 2.2.2 | 2.4.3 | 2.5.1 | 2.6.3 | 2.7.2 | 2.8.1 | 2.9.2 (latest).
- DagS3Path string
- The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. To learn more, see Adding or updating DAGs.
- EndpointManagement Pulumi.AwsNative.Mwaa.EnvironmentEndpointManagement
- Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA will create and manage the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints in your VPC.
- EnvironmentClass string
- The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. To learn more, see Amazon MWAA environment class.
- ExecutionRoleArn string
- The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access AWS resources in your environment. For example, arn:aws:iam::123456789:role/my-execution-role. To learn more, see Amazon MWAA Execution role.
- KmsKey string
- The AWS Key Management Service (KMS) key to encrypt and decrypt the data in your environment. You can use an AWS KMS key managed by MWAA, or a customer-managed KMS key (advanced).
- LoggingConfiguration Pulumi.AwsNative.Mwaa.Inputs.EnvironmentLoggingConfiguration
- The Apache Airflow logs being sent to CloudWatch Logs: DagProcessingLogs, SchedulerLogs, TaskLogs, WebserverLogs, WorkerLogs.
- MaxWebservers int
- The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- MaxWorkers int
- The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
- MinWebservers int
- The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- MinWorkers int
- The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.
- Name string
- The name of your Amazon MWAA environment.
- NetworkConfiguration Pulumi.AwsNative.Mwaa.Inputs.EnvironmentNetworkConfiguration
- The VPC networking components used to secure and enable network traffic between the AWS resources for your environment. To learn more, see About networking on Amazon MWAA.
- PluginsS3ObjectVersion string
- The version of the plugins.zip file on your Amazon S3 bucket. To learn more, see Installing custom plugins.
- PluginsS3Path string
- The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. To learn more, see Installing custom plugins.
- RequirementsS3ObjectVersion string
- The version of the requirements.txt file on your Amazon S3 bucket. To learn more, see Installing Python dependencies.
- RequirementsS3Path string
- The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. To learn more, see Installing Python dependencies.
- Schedulers int
- The number of schedulers that you want to run in your environment. Valid values: v2 - Accepts between 2 and 5, defaults to 2; v1 - Accepts 1.
- SourceBucketArn string
- The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.
- StartupScriptS3ObjectVersion string
- The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo. For more information, see Using a startup script.
- StartupScriptS3Path string
- The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh. Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
- Tags object
- A map of tags for the environment. Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- WebserverAccessMode Pulumi.AwsNative.Mwaa.EnvironmentWebserverAccessMode
- The Apache Airflow Web server access mode. To learn more, see Apache Airflow access modes. Valid values: PRIVATE_ONLY or PUBLIC_ONLY.
- WeeklyMaintenanceWindowStart string
- The day and time of the week to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only. Supported input includes the following: MON|TUE|WED|THU|FRI|SAT|SUN:([01]\d|2[0-3]):(00|30).
- AirflowConfigurationOptions interface{}
- Key/value pairs representing Airflow configuration variables. Keys are prefixed by their section, so [core] dags_folder={AIRFLOW_HOME}/dags would be represented as "core.dags_folder": "{AIRFLOW_HOME}/dags". Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- AirflowVersion string
- The version of Apache Airflow to use for the environment. If no value is specified, defaults to the latest version. If you specify a newer version number for an existing environment, the version update requires some service interruption before taking effect. Allowed values: 1.10.12 | 2.0.2 | 2.2.2 | 2.4.3 | 2.5.1 | 2.6.3 | 2.7.2 | 2.8.1 | 2.9.2 (latest).
- DagS3Path string
- The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. To learn more, see Adding or updating DAGs.
- EndpointManagement EnvironmentEndpointManagement
- Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA will create and manage the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints in your VPC.
- EnvironmentClass string
- The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. To learn more, see Amazon MWAA environment class.
- ExecutionRoleArn string
- The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access AWS resources in your environment. For example, arn:aws:iam::123456789:role/my-execution-role. To learn more, see Amazon MWAA Execution role.
- KmsKey string
- The AWS Key Management Service (KMS) key to encrypt and decrypt the data in your environment. You can use an AWS KMS key managed by MWAA, or a customer-managed KMS key (advanced).
- LoggingConfiguration EnvironmentLoggingConfigurationArgs
- The Apache Airflow logs being sent to CloudWatch Logs: DagProcessingLogs, SchedulerLogs, TaskLogs, WebserverLogs, WorkerLogs.
- MaxWebservers int
- The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- MaxWorkers int
- The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
- MinWebservers int
- The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- MinWorkers int
- The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.
- Name string
- The name of your Amazon MWAA environment.
- NetworkConfiguration EnvironmentNetworkConfigurationArgs
- The VPC networking components used to secure and enable network traffic between the AWS resources for your environment. To learn more, see About networking on Amazon MWAA.
- PluginsS3ObjectVersion string
- The version of the plugins.zip file on your Amazon S3 bucket. To learn more, see Installing custom plugins.
- PluginsS3Path string
- The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. To learn more, see Installing custom plugins.
- RequirementsS3ObjectVersion string
- The version of the requirements.txt file on your Amazon S3 bucket. To learn more, see Installing Python dependencies.
- RequirementsS3Path string
- The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. To learn more, see Installing Python dependencies.
- Schedulers int
- The number of schedulers that you want to run in your environment. Valid values: v2 - Accepts between 2 and 5, defaults to 2; v1 - Accepts 1.
- SourceBucketArn string
- The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.
- StartupScriptS3ObjectVersion string
- The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo. For more information, see Using a startup script.
- StartupScriptS3Path string
- The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh. Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
- Tags interface{}
- A map of tags for the environment. Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- WebserverAccessMode EnvironmentWebserverAccessMode
- The Apache Airflow Web server access mode. To learn more, see Apache Airflow access modes. Valid values: PRIVATE_ONLY or PUBLIC_ONLY.
- WeeklyMaintenanceWindowStart string
- The day and time of the week to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only. Supported input includes the following: MON|TUE|WED|THU|FRI|SAT|SUN:([01]\d|2[0-3]):(00|30).
- airflowConfigurationOptions Object
- Key/value pairs representing Airflow configuration variables. Keys are prefixed by their section, so [core] dags_folder={AIRFLOW_HOME}/dags would be represented as "core.dags_folder": "{AIRFLOW_HOME}/dags". Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- airflowVersion String
- The version of Apache Airflow to use for the environment. If no value is specified, defaults to the latest version. If you specify a newer version number for an existing environment, the version update requires some service interruption before taking effect. Allowed values: 1.10.12 | 2.0.2 | 2.2.2 | 2.4.3 | 2.5.1 | 2.6.3 | 2.7.2 | 2.8.1 | 2.9.2 (latest).
- dagS3Path String
- The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. To learn more, see Adding or updating DAGs.
- endpointManagement EnvironmentEndpointManagement
- Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA will create and manage the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints in your VPC.
- environmentClass String
- The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. To learn more, see Amazon MWAA environment class.
- executionRoleArn String
- The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access AWS resources in your environment. For example, arn:aws:iam::123456789:role/my-execution-role. To learn more, see Amazon MWAA Execution role.
- kmsKey String
- The AWS Key Management Service (KMS) key to encrypt and decrypt the data in your environment. You can use an AWS KMS key managed by MWAA, or a customer-managed KMS key (advanced).
- loggingConfiguration EnvironmentLoggingConfiguration
- The Apache Airflow logs being sent to CloudWatch Logs: DagProcessingLogs, SchedulerLogs, TaskLogs, WebserverLogs, WorkerLogs.
- maxWebservers Integer
- The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- maxWorkers Integer
- The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
- minWebservers Integer
- The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- minWorkers Integer
- The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.
- name String
- The name of your Amazon MWAA environment.
- networkConfiguration EnvironmentNetworkConfiguration
- The VPC networking components used to secure and enable network traffic between the AWS resources for your environment. To learn more, see About networking on Amazon MWAA.
- pluginsS3ObjectVersion String
- The version of the plugins.zip file on your Amazon S3 bucket. To learn more, see Installing custom plugins.
- pluginsS3Path String
- The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. To learn more, see Installing custom plugins.
- requirementsS3ObjectVersion String
- The version of the requirements.txt file on your Amazon S3 bucket. To learn more, see Installing Python dependencies.
- requirementsS3Path String
- The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. To learn more, see Installing Python dependencies.
- schedulers Integer
- The number of schedulers that you want to run in your environment. Valid values: v2 - Accepts between 2 and 5, defaults to 2; v1 - Accepts 1.
- sourceBucketArn String
- The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.
- startupScriptS3ObjectVersion String
- The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo. For more information, see Using a startup script.
- startupScriptS3Path String
- The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh. Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
- tags Object
- A map of tags for the environment. Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- webserverAccessMode EnvironmentWebserverAccessMode
- The Apache Airflow Web server access mode. To learn more, see Apache Airflow access modes. Valid values: PRIVATE_ONLY or PUBLIC_ONLY.
- weeklyMaintenanceWindowStart String
- The day and time of the week to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only. Supported input includes the following: MON|TUE|WED|THU|FRI|SAT|SUN:([01]\d|2[0-3]):(00|30).
- airflowConfigurationOptions any
- Key/value pairs representing Airflow configuration variables. Keys are prefixed by their section, so [core] dags_folder={AIRFLOW_HOME}/dags would be represented as "core.dags_folder": "{AIRFLOW_HOME}/dags". Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- airflowVersion string
- The version of Apache Airflow to use for the environment. If no value is specified, defaults to the latest version. If you specify a newer version number for an existing environment, the version update requires some service interruption before taking effect. Allowed values: 1.10.12 | 2.0.2 | 2.2.2 | 2.4.3 | 2.5.1 | 2.6.3 | 2.7.2 | 2.8.1 | 2.9.2 (latest).
- dagS3Path string
- The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. To learn more, see Adding or updating DAGs.
- endpointManagement EnvironmentEndpointManagement
- Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA will create and manage the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints in your VPC.
- environmentClass string
- The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. To learn more, see Amazon MWAA environment class.
- executionRoleArn string
- The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access AWS resources in your environment. For example, arn:aws:iam::123456789:role/my-execution-role. To learn more, see Amazon MWAA Execution role.
- kmsKey string
- The AWS Key Management Service (KMS) key to encrypt and decrypt the data in your environment. You can use an AWS KMS key managed by MWAA, or a customer-managed KMS key (advanced).
- loggingConfiguration EnvironmentLoggingConfiguration
- The Apache Airflow logs being sent to CloudWatch Logs: DagProcessingLogs, SchedulerLogs, TaskLogs, WebserverLogs, WorkerLogs.
- maxWebservers number
- The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- maxWorkers number
- The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
- minWebservers number
- The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- minWorkers number
- The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.
- name string
- The name of your Amazon MWAA environment.
- networkConfiguration EnvironmentNetworkConfiguration
- The VPC networking components used to secure and enable network traffic between the AWS resources for your environment. To learn more, see About networking on Amazon MWAA.
- pluginsS3ObjectVersion string
- The version of the plugins.zip file on your Amazon S3 bucket. To learn more, see Installing custom plugins.
- pluginsS3Path string
- The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. To learn more, see Installing custom plugins.
- requirementsS3ObjectVersion string
- The version of the requirements.txt file on your Amazon S3 bucket. To learn more, see Installing Python dependencies.
- requirementsS3Path string
- The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. To learn more, see Installing Python dependencies.
- schedulers number
- The number of schedulers that you want to run in your environment. Valid values: v2 - Accepts between 2 and 5, defaults to 2; v1 - Accepts 1.
- sourceBucketArn string
- The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.
- startupScriptS3ObjectVersion string
- The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo. For more information, see Using a startup script.
- startupScriptS3Path string
- The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh. Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
- tags any
- A map of tags for the environment. Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- webserverAccessMode EnvironmentWebserverAccessMode
- The Apache Airflow Web server access mode. To learn more, see Apache Airflow access modes. Valid values: PRIVATE_ONLY or PUBLIC_ONLY.
- weeklyMaintenanceWindowStart string
- The day and time of the week to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only. Supported input includes the following: MON|TUE|WED|THU|FRI|SAT|SUN:([01]\d|2[0-3]):(00|30).
- airflow_configuration_options Any
- Key/value pairs representing Airflow configuration variables. Keys are prefixed by their section, so [core] dags_folder={AIRFLOW_HOME}/dags would be represented as "core.dags_folder": "{AIRFLOW_HOME}/dags". Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- airflow_version str
- The version of Apache Airflow to use for the environment. If no value is specified, defaults to the latest version. If you specify a newer version number for an existing environment, the version update requires some service interruption before taking effect. Allowed values: 1.10.12 | 2.0.2 | 2.2.2 | 2.4.3 | 2.5.1 | 2.6.3 | 2.7.2 | 2.8.1 | 2.9.2 (latest).
- dag_s3_path str
- The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. To learn more, see Adding or updating DAGs.
- endpoint_management EnvironmentEndpointManagement
- Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA will create and manage the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints in your VPC.
- environment_class str
- The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. To learn more, see Amazon MWAA environment class.
- execution_role_arn str
- The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access AWS resources in your environment. For example, arn:aws:iam::123456789:role/my-execution-role. To learn more, see Amazon MWAA Execution role.
- kms_key str
- The AWS Key Management Service (KMS) key to encrypt and decrypt the data in your environment. You can use an AWS KMS key managed by MWAA, or a customer-managed KMS key (advanced).
- logging_configuration EnvironmentLoggingConfigurationArgs
- The Apache Airflow logs being sent to CloudWatch Logs: DagProcessingLogs, SchedulerLogs, TaskLogs, WebserverLogs, WorkerLogs.
- max_webservers int
- The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- max_workers int
- The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
- min_webservers int
- The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- min_workers int
- The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.
- name str
- The name of your Amazon MWAA environment.
- network_configuration EnvironmentNetworkConfigurationArgs
- The VPC networking components used to secure and enable network traffic between the AWS resources for your environment. To learn more, see About networking on Amazon MWAA.
- plugins_s3_object_version str
- The version of the plugins.zip file on your Amazon S3 bucket. To learn more, see Installing custom plugins.
- plugins_s3_path str
- The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. To learn more, see Installing custom plugins.
- requirements_s3_object_version str
- The version of the requirements.txt file on your Amazon S3 bucket. To learn more, see Installing Python dependencies.
- requirements_s3_path str
- The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. To learn more, see Installing Python dependencies.
- schedulers int
- The number of schedulers that you want to run in your environment. Valid values: v2 - Accepts between 2 and 5, defaults to 2; v1 - Accepts 1.
- source_bucket_arn str
- The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.
- startup_script_s3_object_version str
- The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo. For more information, see Using a startup script.
- startup_script_s3_path str
- The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh. Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
- tags Any
- A map of tags for the environment. Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- webserver_access_mode EnvironmentWebserverAccessMode
- The Apache Airflow Web server access mode. To learn more, see Apache Airflow access modes. Valid values: PRIVATE_ONLY or PUBLIC_ONLY.
- weekly_maintenance_window_start str
- The day and time of the week to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only. Supported input includes the following: MON|TUE|WED|THU|FRI|SAT|SUN:([01]\d|2[0-3]):(00|30).
- airflowConfigurationOptions Any
- Key/value pairs representing Airflow configuration variables. Keys are prefixed by their section, so [core] dags_folder={AIRFLOW_HOME}/dags would be represented as "core.dags_folder": "{AIRFLOW_HOME}/dags". Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- airflowVersion String
- The version of Apache Airflow to use for the environment. If no value is specified, defaults to the latest version. If you specify a newer version number for an existing environment, the version update requires some service interruption before taking effect. Allowed values: 1.10.12 | 2.0.2 | 2.2.2 | 2.4.3 | 2.5.1 | 2.6.3 | 2.7.2 | 2.8.1 | 2.9.2 (latest).
- dagS3Path String
- The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. To learn more, see Adding or updating DAGs.
- endpointManagement "CUSTOMER" | "SERVICE"
- Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA will create and manage the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints in your VPC.
- environmentClass String
- The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. To learn more, see Amazon MWAA environment class.
- executionRoleArn String
- The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access AWS resources in your environment. For example, arn:aws:iam::123456789:role/my-execution-role. To learn more, see Amazon MWAA Execution role.
- kmsKey String
- The AWS Key Management Service (KMS) key to encrypt and decrypt the data in your environment. You can use an AWS KMS key managed by MWAA, or a customer-managed KMS key (advanced).
- loggingConfiguration Property Map
- The Apache Airflow logs being sent to CloudWatch Logs: DagProcessingLogs, SchedulerLogs, TaskLogs, WebserverLogs, WorkerLogs.
- maxWebservers Number
- The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- maxWorkers Number
- The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.
- minWebservers Number
- The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers. Valid values: For environments larger than mw1.micro, accepts values from 2 to 5. Defaults to 2 for all environment sizes except mw1.micro, which defaults to 1.
- minWorkers Number
- The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.
- name String
- The name of your Amazon MWAA environment.
- networkConfiguration Property Map
- The VPC networking components used to secure and enable network traffic between the AWS resources for your environment. To learn more, see About networking on Amazon MWAA.
- pluginsS3ObjectVersion String
- The version of the plugins.zip file on your Amazon S3 bucket. To learn more, see Installing custom plugins.
- pluginsS3Path String
- The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. To learn more, see Installing custom plugins.
- requirementsS3ObjectVersion String
- The version of the requirements.txt file on your Amazon S3 bucket. To learn more, see Installing Python dependencies.
- requirementsS3Path String
- The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. To learn more, see Installing Python dependencies.
- schedulers Number
- The number of schedulers that you want to run in your environment. Valid values: v2 - Accepts between 2 and 5, defaults to 2; v1 - Accepts 1.
- sourceBucketArn String
- The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.
- startupScriptS3ObjectVersion String
- The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script. Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo. For more information, see Using a startup script.
- startupScriptS3Path String
- The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh. Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
- tags Any
- A map of tags for the environment. Search the CloudFormation User Guide for AWS::MWAA::Environment for more information about the expected schema for this property.
- webserverAccessMode "PRIVATE_ONLY" | "PUBLIC_ONLY"
- The Apache Airflow Web server access mode. To learn more, see Apache Airflow access modes. Valid values: PRIVATE_ONLY or PUBLIC_ONLY.
- weeklyMaintenanceWindowStart String
- The day and time of the week to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30 minute increments only. Supported input includes the following: MON|TUE|WED|THU|FRI|SAT|SUN:([01]\d|2[0-3]):(00|30).
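To show how the inputs above combine in practice, here is a hedged TypeScript sketch that sets versioning, scaling, logging, Airflow configuration options, and tags. Every value is an illustrative placeholder rather than a recommendation, and the enum helpers (EnvironmentWebserverAccessMode, EnvironmentLoggingLevel) are assumed to be exposed by the SDK as listed under Supporting Types.

import * as awsNative from "@pulumi/aws-native";

// Illustrative sketch only: every ARN, ID, and value below is a placeholder.
const env = new awsNative.mwaa.Environment("analytics-airflow", {
    airflowVersion: "2.9.2",
    environmentClass: "mw1.medium",
    sourceBucketArn: "arn:aws:s3:::my-airflow-bucket-unique-name",
    dagS3Path: "dags",
    requirementsS3Path: "requirements.txt",
    executionRoleArn: "arn:aws:iam::123456789:role/my-execution-role",
    minWorkers: 2,
    maxWorkers: 10,
    schedulers: 2,
    webserverAccessMode: awsNative.mwaa.EnvironmentWebserverAccessMode.PrivateOnly,
    weeklyMaintenanceWindowStart: "TUE:03:30",
    airflowConfigurationOptions: {
        "core.dags_folder": "{AIRFLOW_HOME}/dags",
    },
    loggingConfiguration: {
        taskLogs: { enabled: true, logLevel: awsNative.mwaa.EnvironmentLoggingLevel.Info },
        workerLogs: { enabled: true, logLevel: awsNative.mwaa.EnvironmentLoggingLevel.Warning },
    },
    networkConfiguration: {
        securityGroupIds: ["sg-0123456789abcdef0"],
        subnetIds: ["subnet-0123456789abcdef0", "subnet-0fedcba9876543210"],
    },
    tags: { team: "data-platform" },
});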
Outputs
All input properties are implicitly available as output properties. Additionally, the Environment resource produces the following output properties:
- Arn string
- The ARN for the Amazon MWAA environment.
- CeleryExecutorQueue string
- The queue ARN for the environment's Celery Executor. Amazon MWAA uses a Celery Executor to distribute tasks across multiple workers. When you create an environment in a shared VPC, you must provide access to the Celery Executor queue from your VPC.
- DatabaseVpcEndpointService string
- The VPC endpoint for the environment's Amazon RDS database.
- Id string
- The provider-assigned unique ID for this managed resource.
- WebserverUrl string
- The URL of your Apache Airflow UI.
- WebserverVpcEndpointService string
- The VPC endpoint for the environment's web server.
- Arn string
- The ARN for the Amazon MWAA environment.
- CeleryExecutorQueue string
- The queue ARN for the environment's Celery Executor. Amazon MWAA uses a Celery Executor to distribute tasks across multiple workers. When you create an environment in a shared VPC, you must provide access to the Celery Executor queue from your VPC.
- DatabaseVpcEndpointService string
- The VPC endpoint for the environment's Amazon RDS database.
- Id string
- The provider-assigned unique ID for this managed resource.
- WebserverUrl string
- The URL of your Apache Airflow UI.
- WebserverVpcEndpointService string
- The VPC endpoint for the environment's web server.
- arn String
- The ARN for the Amazon MWAA environment.
- celeryExecutorQueue String
- The queue ARN for the environment's Celery Executor. Amazon MWAA uses a Celery Executor to distribute tasks across multiple workers. When you create an environment in a shared VPC, you must provide access to the Celery Executor queue from your VPC.
- databaseVpcEndpointService String
- The VPC endpoint for the environment's Amazon RDS database.
- id String
- The provider-assigned unique ID for this managed resource.
- webserverUrl String
- The URL of your Apache Airflow UI.
- webserverVpcEndpointService String
- The VPC endpoint for the environment's web server.
- arn string
- The ARN for the Amazon MWAA environment.
- celeryExecutorQueue string
- The queue ARN for the environment's Celery Executor. Amazon MWAA uses a Celery Executor to distribute tasks across multiple workers. When you create an environment in a shared VPC, you must provide access to the Celery Executor queue from your VPC.
- databaseVpcEndpointService string
- The VPC endpoint for the environment's Amazon RDS database.
- id string
- The provider-assigned unique ID for this managed resource.
- webserverUrl string
- The URL of your Apache Airflow UI.
- webserverVpcEndpointService string
- The VPC endpoint for the environment's web server.
- arn str
- The ARN for the Amazon MWAA environment.
- celery_executor_queue str
- The queue ARN for the environment's Celery Executor. Amazon MWAA uses a Celery Executor to distribute tasks across multiple workers. When you create an environment in a shared VPC, you must provide access to the Celery Executor queue from your VPC.
- database_vpc_endpoint_service str
- The VPC endpoint for the environment's Amazon RDS database.
- id str
- The provider-assigned unique ID for this managed resource.
- webserver_url str
- The URL of your Apache Airflow UI.
- webserver_vpc_endpoint_service str
- The VPC endpoint for the environment's web server.
- arn String
- The ARN for the Amazon MWAA environment.
- celeryExecutorQueue String
- The queue ARN for the environment's Celery Executor. Amazon MWAA uses a Celery Executor to distribute tasks across multiple workers. When you create an environment in a shared VPC, you must provide access to the Celery Executor queue from your VPC.
- databaseVpcEndpointService String
- The VPC endpoint for the environment's Amazon RDS database.
- id String
- The provider-assigned unique ID for this managed resource.
- webserverUrl String
- The URL of your Apache Airflow UI.
- webserverVpcEndpointService String
- The VPC endpoint for the environment's web server.
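As with any Pulumi resource, these output properties can be exported from a program. A short sketch, assuming the env resource created in the earlier TypeScript example:

// Assuming `env` is the Environment constructed in the sketch above.
export const airflowUiUrl = env.webserverUrl;           // URL of the Apache Airflow UI
export const environmentArn = env.arn;                  // ARN of the MWAA environment
export const celeryQueueArn = env.celeryExecutorQueue;  // Celery Executor queue ARN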
Supporting Types
EnvironmentEndpointManagement, EnvironmentEndpointManagementArgs
- Customer
- CUSTOMER
- Service
- SERVICE
- EnvironmentEndpointManagementCustomer
- CUSTOMER
- EnvironmentEndpointManagementService
- SERVICE
- Customer
- CUSTOMER
- Service
- SERVICE
- Customer
- CUSTOMER
- Service
- SERVICE
- CUSTOMER
- CUSTOMER
- SERVICE
- SERVICE
- "CUSTOMER"
- CUSTOMER
- "SERVICE"
- SERVICE
EnvironmentLoggingConfiguration, EnvironmentLoggingConfigurationArgs
- DagProcessingLogs Pulumi.AwsNative.Mwaa.Inputs.EnvironmentModuleLoggingConfiguration
- Defines the processing logs sent to CloudWatch Logs and the logging level to send.
- SchedulerLogs Pulumi.AwsNative.Mwaa.Inputs.EnvironmentModuleLoggingConfiguration
- Defines the scheduler logs sent to CloudWatch Logs and the logging level to send.
- TaskLogs Pulumi.AwsNative.Mwaa.Inputs.EnvironmentModuleLoggingConfiguration
- Defines the task logs sent to CloudWatch Logs and the logging level to send.
- WebserverLogs Pulumi.AwsNative.Mwaa.Inputs.EnvironmentModuleLoggingConfiguration
- Defines the web server logs sent to CloudWatch Logs and the logging level to send.
- WorkerLogs Pulumi.AwsNative.Mwaa.Inputs.EnvironmentModuleLoggingConfiguration
- Defines the worker logs sent to CloudWatch Logs and the logging level to send.
- DagProcessingLogs EnvironmentModuleLoggingConfiguration
- Defines the processing logs sent to CloudWatch Logs and the logging level to send.
- SchedulerLogs EnvironmentModuleLoggingConfiguration
- Defines the scheduler logs sent to CloudWatch Logs and the logging level to send.
- TaskLogs EnvironmentModuleLoggingConfiguration
- Defines the task logs sent to CloudWatch Logs and the logging level to send.
- WebserverLogs EnvironmentModuleLoggingConfiguration
- Defines the web server logs sent to CloudWatch Logs and the logging level to send.
- WorkerLogs EnvironmentModuleLoggingConfiguration
- Defines the worker logs sent to CloudWatch Logs and the logging level to send.
- dagProcessingLogs EnvironmentModuleLoggingConfiguration
- Defines the processing logs sent to CloudWatch Logs and the logging level to send.
- schedulerLogs EnvironmentModuleLoggingConfiguration
- Defines the scheduler logs sent to CloudWatch Logs and the logging level to send.
- taskLogs EnvironmentModuleLoggingConfiguration
- Defines the task logs sent to CloudWatch Logs and the logging level to send.
- webserverLogs EnvironmentModuleLoggingConfiguration
- Defines the web server logs sent to CloudWatch Logs and the logging level to send.
- workerLogs EnvironmentModuleLoggingConfiguration
- Defines the worker logs sent to CloudWatch Logs and the logging level to send.
- dagProcessingLogs EnvironmentModuleLoggingConfiguration
- Defines the processing logs sent to CloudWatch Logs and the logging level to send.
- schedulerLogs EnvironmentModuleLoggingConfiguration
- Defines the scheduler logs sent to CloudWatch Logs and the logging level to send.
- taskLogs EnvironmentModuleLoggingConfiguration
- Defines the task logs sent to CloudWatch Logs and the logging level to send.
- webserverLogs EnvironmentModuleLoggingConfiguration
- Defines the web server logs sent to CloudWatch Logs and the logging level to send.
- workerLogs EnvironmentModuleLoggingConfiguration
- Defines the worker logs sent to CloudWatch Logs and the logging level to send.
- dag_processing_logs EnvironmentModuleLoggingConfiguration
- Defines the processing logs sent to CloudWatch Logs and the logging level to send.
- scheduler_logs EnvironmentModuleLoggingConfiguration
- Defines the scheduler logs sent to CloudWatch Logs and the logging level to send.
- task_logs EnvironmentModuleLoggingConfiguration
- Defines the task logs sent to CloudWatch Logs and the logging level to send.
- webserver_logs EnvironmentModuleLoggingConfiguration
- Defines the web server logs sent to CloudWatch Logs and the logging level to send.
- worker_logs EnvironmentModuleLoggingConfiguration
- Defines the worker logs sent to CloudWatch Logs and the logging level to send.
- dagProcessingLogs Property Map
- Defines the processing logs sent to CloudWatch Logs and the logging level to send.
- schedulerLogs Property Map
- Defines the scheduler logs sent to CloudWatch Logs and the logging level to send.
- taskLogs Property Map
- Defines the task logs sent to CloudWatch Logs and the logging level to send.
- webserverLogs Property Map
- Defines the web server logs sent to CloudWatch Logs and the logging level to send.
- workerLogs Property Map
- Defines the worker logs sent to CloudWatch Logs and the logging level to send.
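In other words, the logging configuration is an object with one EnvironmentModuleLoggingConfiguration entry per Airflow log type. A hedged TypeScript sketch follows; the types.input path reflects the usual layout of Pulumi SDKs and may differ slightly by SDK version.

import * as awsNative from "@pulumi/aws-native";

// Sketch of a logging configuration: one module entry per Airflow log type.
const loggingConfiguration: awsNative.types.input.mwaa.EnvironmentLoggingConfigurationArgs = {
    dagProcessingLogs: { enabled: true, logLevel: "INFO" },
    schedulerLogs: { enabled: true, logLevel: "WARNING" },
    taskLogs: { enabled: true, logLevel: "INFO" },
    webserverLogs: { enabled: false },
    workerLogs: { enabled: true, logLevel: "ERROR" },
};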
EnvironmentLoggingLevel, EnvironmentLoggingLevelArgs
- Critical
- CRITICAL
- Error
- ERROR
- Warning
- WARNING
- Info
- INFO
- Debug
- DEBUG
- EnvironmentLoggingLevelCritical
- CRITICAL
- EnvironmentLoggingLevelError
- ERROR
- EnvironmentLoggingLevelWarning
- WARNING
- EnvironmentLoggingLevelInfo
- INFO
- EnvironmentLoggingLevelDebug
- DEBUG
- Critical
- CRITICAL
- Error
- ERROR
- Warning
- WARNING
- Info
- INFO
- Debug
- DEBUG
- Critical
- CRITICAL
- Error
- ERROR
- Warning
- WARNING
- Info
- INFO
- Debug
- DEBUG
- CRITICAL
- CRITICAL
- ERROR
- ERROR
- WARNING
- WARNING
- INFO
- INFO
- DEBUG
- DEBUG
- "CRITICAL"
- CRITICAL
- "ERROR"
- ERROR
- "WARNING"
- WARNING
- "INFO"
- INFO
- "DEBUG"
- DEBUG
EnvironmentModuleLoggingConfiguration, EnvironmentModuleLoggingConfigurationArgs
- CloudWatchLogGroupArn string
- The ARN of the CloudWatch Logs log group for each type of Apache Airflow log type that you have enabled. CloudWatchLogGroupArn is available only as a return value, accessible when specified as an attribute in the Fn::GetAtt intrinsic function. Any value you provide for CloudWatchLogGroupArn is discarded by Amazon MWAA.
- Enabled bool
- Indicates whether to enable the Apache Airflow log type (e.g. DagProcessingLogs) in CloudWatch Logs.
- LogLevel Pulumi.AwsNative.Mwaa.EnvironmentLoggingLevel
- Defines the Apache Airflow logs to send for the log type (e.g. DagProcessingLogs) to CloudWatch Logs. Valid values: CRITICAL, ERROR, WARNING, INFO.
- CloudWatchLogGroupArn string
- The ARN of the CloudWatch Logs log group for each type of Apache Airflow log type that you have enabled. CloudWatchLogGroupArn is available only as a return value, accessible when specified as an attribute in the Fn::GetAtt intrinsic function. Any value you provide for CloudWatchLogGroupArn is discarded by Amazon MWAA.
- Enabled bool
- Indicates whether to enable the Apache Airflow log type (e.g. DagProcessingLogs) in CloudWatch Logs.
- LogLevel EnvironmentLoggingLevel
- Defines the Apache Airflow logs to send for the log type (e.g. DagProcessingLogs) to CloudWatch Logs. Valid values: CRITICAL, ERROR, WARNING, INFO.
- cloudWatchLogGroupArn String
- The ARN of the CloudWatch Logs log group for each type of Apache Airflow log type that you have enabled. CloudWatchLogGroupArn is available only as a return value, accessible when specified as an attribute in the Fn::GetAtt intrinsic function. Any value you provide for CloudWatchLogGroupArn is discarded by Amazon MWAA.
- enabled Boolean
- Indicates whether to enable the Apache Airflow log type (e.g. DagProcessingLogs) in CloudWatch Logs.
- logLevel EnvironmentLoggingLevel
- Defines the Apache Airflow logs to send for the log type (e.g. DagProcessingLogs) to CloudWatch Logs. Valid values: CRITICAL, ERROR, WARNING, INFO.
- cloudWatchLogGroupArn string
- The ARN of the CloudWatch Logs log group for each type of Apache Airflow log type that you have enabled. CloudWatchLogGroupArn is available only as a return value, accessible when specified as an attribute in the Fn::GetAtt intrinsic function. Any value you provide for CloudWatchLogGroupArn is discarded by Amazon MWAA.
- enabled boolean
- Indicates whether to enable the Apache Airflow log type (e.g. DagProcessingLogs) in CloudWatch Logs.
- logLevel EnvironmentLoggingLevel
- Defines the Apache Airflow logs to send for the log type (e.g. DagProcessingLogs) to CloudWatch Logs. Valid values: CRITICAL, ERROR, WARNING, INFO.
- cloud_watch_log_group_arn str
- The ARN of the CloudWatch Logs log group for each type of Apache Airflow log type that you have enabled. CloudWatchLogGroupArn is available only as a return value, accessible when specified as an attribute in the Fn::GetAtt intrinsic function. Any value you provide for CloudWatchLogGroupArn is discarded by Amazon MWAA.
- enabled bool
- Indicates whether to enable the Apache Airflow log type (e.g. DagProcessingLogs) in CloudWatch Logs.
- log_level EnvironmentLoggingLevel
- Defines the Apache Airflow logs to send for the log type (e.g. DagProcessingLogs) to CloudWatch Logs. Valid values: CRITICAL, ERROR, WARNING, INFO.
- cloudWatchLogGroupArn String
- The ARN of the CloudWatch Logs log group for each type of Apache Airflow log type that you have enabled. CloudWatchLogGroupArn is available only as a return value, accessible when specified as an attribute in the Fn::GetAtt intrinsic function. Any value you provide for CloudWatchLogGroupArn is discarded by Amazon MWAA.
- enabled Boolean
- Indicates whether to enable the Apache Airflow log type (e.g. DagProcessingLogs) in CloudWatch Logs.
- logLevel "CRITICAL" | "ERROR" | "WARNING" | "INFO" | "DEBUG"
- Defines the Apache Airflow logs to send for the log type (e.g. DagProcessingLogs) to CloudWatch Logs. Valid values: CRITICAL, ERROR, WARNING, INFO.
EnvironmentNetworkConfiguration, EnvironmentNetworkConfigurationArgs
- SecurityGroupIds List<string>
- A list of security groups to use for the environment.
- SubnetIds List<string>
- A list of subnets to use for the environment. These must be private subnets, in the same VPC, in two different availability zones.
- SecurityGroupIds []string
- A list of security groups to use for the environment.
- SubnetIds []string
- A list of subnets to use for the environment. These must be private subnets, in the same VPC, in two different availability zones.
- securityGroupIds List<String>
- A list of security groups to use for the environment.
- subnetIds List<String>
- A list of subnets to use for the environment. These must be private subnets, in the same VPC, in two different availability zones.
- securityGroupIds string[]
- A list of security groups to use for the environment.
- subnetIds string[]
- A list of subnets to use for the environment. These must be private subnets, in the same VPC, in two different availability zones.
- security_group_ids Sequence[str]
- A list of security groups to use for the environment.
- subnet_ids Sequence[str]
- A list of subnets to use for the environment. These must be private subnets, in the same VPC, in two different availability zones.
- securityGroupIds List<String>
- A list of security groups to use for the environment.
- subnetIds List<String>
- A list of subnets to use for the environment. These must be private subnets, in the same VPC, in two different availability zones.
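A brief TypeScript sketch of a network configuration input follows, with placeholder identifiers; as noted above, MWAA expects two private subnets in different availability zones of the same VPC. The types.input path is assumed from the usual Pulumi SDK layout.

import * as awsNative from "@pulumi/aws-native";

// Placeholder security group and private subnet IDs from the same VPC.
const networkConfiguration: awsNative.types.input.mwaa.EnvironmentNetworkConfigurationArgs = {
    securityGroupIds: ["sg-0123456789abcdef0"],
    subnetIds: ["subnet-0123456789abcdef0", "subnet-0fedcba9876543210"],
};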
EnvironmentWebserverAccessMode, EnvironmentWebserverAccessModeArgs
- PrivateOnly
- PRIVATE_ONLY
- PublicOnly
- PUBLIC_ONLY
- EnvironmentWebserverAccessModePrivateOnly
- PRIVATE_ONLY
- EnvironmentWebserverAccessModePublicOnly
- PUBLIC_ONLY
- PrivateOnly
- PRIVATE_ONLY
- PublicOnly
- PUBLIC_ONLY
- PrivateOnly
- PRIVATE_ONLY
- PublicOnly
- PUBLIC_ONLY
- PRIVATE_ONLY
- PRIVATE_ONLY
- PUBLIC_ONLY
- PUBLIC_ONLY
- "PRIVATE_ONLY"
- PRIVATE_ONLY
- "PUBLIC_ONLY"
- PUBLIC_ONLY
Package Details
- Repository
- AWS Native pulumi/pulumi-aws-native
- License
- Apache-2.0