mongodbatlas.getDataLake
mongodbatlas.getDataLake describes a Data Lake.
NOTE: Groups and projects are synonymous terms. You may find group_id in the official documentation.
Using getDataLake
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
function getDataLake(args: GetDataLakeArgs, opts?: InvokeOptions): Promise<GetDataLakeResult>
function getDataLakeOutput(args: GetDataLakeOutputArgs, opts?: InvokeOptions): Output<GetDataLakeResult>
def get_data_lake(name: Optional[str] = None,
project_id: Optional[str] = None,
opts: Optional[InvokeOptions] = None) -> GetDataLakeResult
def get_data_lake_output(name: Optional[pulumi.Input[str]] = None,
project_id: Optional[pulumi.Input[str]] = None,
opts: Optional[InvokeOptions] = None) -> Output[GetDataLakeResult]
func LookupDataLake(ctx *Context, args *LookupDataLakeArgs, opts ...InvokeOption) (*LookupDataLakeResult, error)
func LookupDataLakeOutput(ctx *Context, args *LookupDataLakeOutputArgs, opts ...InvokeOption) LookupDataLakeResultOutput
> Note: This function is named LookupDataLake in the Go SDK.
public static class GetDataLake
{
public static Task<GetDataLakeResult> InvokeAsync(GetDataLakeArgs args, InvokeOptions? opts = null)
public static Output<GetDataLakeResult> Invoke(GetDataLakeInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetDataLakeResult> getDataLake(GetDataLakeArgs args, InvokeOptions options)
// Output-based functions aren't available in Java yet
fn::invoke:
function: mongodbatlas:index/getDataLake:getDataLake
arguments:
# arguments dictionary
The following arguments are supported:
- name (String)
  Name of the data lake.
- project_id (String)
  The unique ID for the project to create a data lake.
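For illustration, a minimal TypeScript sketch of both invocation forms. The project ID and data lake name are placeholders, not values from this page:

```typescript
import * as mongodbatlas from "@pulumi/mongodbatlas";

// Placeholder identifiers; substitute your own project ID and data lake name.
const projectId = "<PROJECT-ID>";
const lakeName = "my-data-lake";

// Direct form: returns a Promise-wrapped result.
const lake = mongodbatlas.getDataLake({
    name: lakeName,
    projectId: projectId,
});

// Output form: accepts Input-wrapped arguments and returns an Output-wrapped result.
const lakeOutput = mongodbatlas.getDataLakeOutput({
    name: lakeName,
    projectId: projectId,
});

// Export a few of the documented result properties.
export const dataLakeState = lake.then(l => l.state);
export const dataLakeHostnames = lakeOutput.hostnames;
```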
getDataLake Result
The following output properties are available:
- aws (List<GetDataLakeAw>)
  AWS provider of the cloud service where Data Lake can access the S3 Bucket.
  - aws.0.role_id - Unique identifier of the role that Data Lake can use to access the data stores.
  - aws.0.test_s3_bucket - Name of the S3 data bucket that the provided role ID is authorized to access.
  - aws.0.iam_assumed_role_arn - Amazon Resource Name (ARN) of the IAM Role that Data Lake assumes when accessing S3 Bucket data stores.
- data_process_regions (List<GetDataLakeDataProcessRegion>)
  The cloud provider region to which Atlas Data Lake routes client connections for data processing.
  - data_process_region.0.cloud_provider - Name of the cloud service provider.
  - data_process_region.0.region - Name of the region to which Data Lake routes client connections for data processing.
- hostnames (List<String>)
  The list of hostnames assigned to the Atlas Data Lake. Each string in the array is a hostname assigned to the Atlas Data Lake.
- id (String)
  The provider-assigned unique ID for this managed resource.
- name (String)
- project_id (String)
- state (String)
  Current state of the Atlas Data Lake.
- storage_databases (List<GetDataLakeStorageDatabase>)
  Configuration details for mapping each data store to queryable databases and collections.
  - storage_databases.#.name - Name of the database to which Data Lake maps the data contained in the data store.
  - storage_databases.#.collections - Array of objects where each object represents a collection and data sources that map to a stores data store.
  - storage_databases.#.collections.#.name - Name of the collection.
  - storage_databases.#.collections.#.data_sources - Array of objects where each object represents a stores data store to map with the collection.
  - storage_databases.#.collections.#.data_sources.#.store_name - Name of a data store to map to the <collection>.
  - storage_databases.#.collections.#.data_sources.#.default_format - Default format that Data Lake assumes if it encounters a file without an extension while searching the storeName.
  - storage_databases.#.collections.#.data_sources.#.path - Controls how Atlas Data Lake searches for and parses files in the storeName before mapping them to the <collection>.
  - storage_databases.#.views - Array of objects where each object represents an aggregation pipeline on a collection.
  - storage_databases.#.views.#.name - Name of the view.
  - storage_databases.#.views.#.source - Name of the source collection for the view.
  - storage_databases.#.views.#.pipeline - Aggregation pipeline stage(s) to apply to the source collection.
- storage_stores (List<GetDataLakeStorageStore>)
  Each object in the array represents a data store. Data Lake uses the storage.databases configuration details to map data in each data store to queryable databases and collections.
  - storage_stores.#.name - Name of the data store.
  - storage_stores.#.provider - Defines where the data is stored.
  - storage_stores.#.region - Name of the AWS region in which the S3 bucket is hosted.
  - storage_stores.#.bucket - Name of the AWS S3 bucket.
  - storage_stores.#.prefix - Prefix Data Lake applies when searching for files in the S3 bucket.
  - storage_stores.#.delimiter - The delimiter that separates storage_databases.#.collections.#.data_sources.#.path segments in the data store.
  - storage_stores.#.include_tags - Determines whether or not to use S3 tags on the files in the given path as additional partition attributes.
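As a sketch of how the nested result properties above might be consumed, assuming the TypeScript SDK's usual camelCase property naming (the lookup arguments are placeholders):

```typescript
import * as mongodbatlas from "@pulumi/mongodbatlas";

const lake = mongodbatlas.getDataLakeOutput({
    name: "my-data-lake",      // placeholder data lake name
    projectId: "<PROJECT-ID>", // placeholder project ID
});

// Flatten storage databases into "database.collection <- store" mappings.
export const collectionMappings = lake.storageDatabases.apply(dbs => {
    const mappings: string[] = [];
    for (const db of dbs) {
        for (const coll of db.collections ?? []) {
            for (const src of coll.dataSources ?? []) {
                mappings.push(`${db.name}.${coll.name} <- ${src.storeName}`);
            }
        }
    }
    return mappings;
});

// First hostname assigned to the data lake, if any.
export const firstHostname = lake.hostnames.apply(hosts => hosts[0]);
```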
Supporting Types
GetDataLakeAw
- external_id (String)
- iam_assumed_role_arn (String)
- iam_user_arn (String)
- role_id (String)
- test_s3_bucket (String)
GetDataLakeDataProcessRegion
- cloud_provider (String)
- region (String)
GetDataLakeStorageDatabase
- collections (List<GetDataLakeStorageDatabaseCollection>)
- max_wildcard_collections (Integer)
- name (String)
  Name of the data lake.
- views (List<GetDataLakeStorageDatabaseView>)
GetDataLakeStorageDatabaseCollection
- data_sources (List<GetDataLakeStorageDatabaseCollectionDataSource>)
- name (String)
  Name of the data lake.
GetDataLakeStorageDatabaseCollectionDataSource
- default_format (String)
- path (String)
- store_name (String)
GetDataLakeStorageDatabaseView
Fields correspond to the storage_databases.#.views attributes described above:
- name (String)
- pipeline (String)
- source (String)
GetDataLakeStorageStore
Fields correspond to the storage_stores attributes described above:
- bucket (String)
- delimiter (String)
- include_tags (Boolean)
- name (String)
- prefix (String)
- provider (String)
- region (String)
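The supporting types above surface as generated interfaces in the language SDKs. A hedged TypeScript sketch follows, assuming the installed SDK exposes them under the mongodbatlas.types.output namespace as shown; the lookup arguments are placeholders:

```typescript
import * as mongodbatlas from "@pulumi/mongodbatlas";

// Hypothetical helper that summarizes one storage store from the lookup result.
function describeStore(store: mongodbatlas.types.output.GetDataLakeStorageStore): string {
    return `${store.name}: s3://${store.bucket}/${store.prefix ?? ""} (provider: ${store.provider}, region: ${store.region})`;
}

// Example use against the output form of the lookup.
const lake = mongodbatlas.getDataLakeOutput({
    name: "my-data-lake",
    projectId: "<PROJECT-ID>",
});
export const storeSummaries = lake.storageStores.apply(stores => stores.map(describeStore));
```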
Package Details
- Repository: MongoDB Atlas (pulumi/pulumi-mongodbatlas)
- License: Apache-2.0
- Notes: This Pulumi package is based on the mongodbatlas Terraform Provider.