MongoDB Atlas

Pulumi Official
Package maintained by Pulumi
v3.5.0 published on Wednesday, Jul 20, 2022 by Pulumi

getDataLake

mongodbatlas.DataLake describes a Data Lake.

NOTE: Groups and projects are synonymous terms. You may find group_id in the official documentation.

Using getDataLake

Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.

function getDataLake(args: GetDataLakeArgs, opts?: InvokeOptions): Promise<GetDataLakeResult>
function getDataLakeOutput(args: GetDataLakeOutputArgs, opts?: InvokeOptions): Output<GetDataLakeResult>
def get_data_lake(name: Optional[str] = None,
                  project_id: Optional[str] = None,
                  opts: Optional[InvokeOptions] = None) -> GetDataLakeResult
def get_data_lake_output(name: Optional[pulumi.Input[str]] = None,
                  project_id: Optional[pulumi.Input[str]] = None,
                  opts: Optional[InvokeOptions] = None) -> Output[GetDataLakeResult]
func LookupDataLake(ctx *Context, args *LookupDataLakeArgs, opts ...InvokeOption) (*LookupDataLakeResult, error)
func LookupDataLakeOutput(ctx *Context, args *LookupDataLakeOutputArgs, opts ...InvokeOption) LookupDataLakeResultOutput

> Note: This function is named LookupDataLake in the Go SDK.

public static class GetDataLake 
{
    public static Task<GetDataLakeResult> InvokeAsync(GetDataLakeArgs args, InvokeOptions? opts = null)
    public static Output<GetDataLakeResult> Invoke(GetDataLakeInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetDataLakeResult> getDataLake(GetDataLakeArgs args, InvokeOptions options)
// Output-based functions aren't available in Java yet
Fn::Invoke:
  Function: mongodbatlas:index/getDataLake:getDataLake
  Arguments:
    # Arguments dictionary
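
A filled-in sketch of the YAML form may help; the variable name, data lake name, and project ID below are hypothetical placeholders:

```yaml
variables:
  exampleDataLake:
    Fn::Invoke:
      Function: mongodbatlas:index/getDataLake:getDataLake
      Arguments:
        name: my-data-lake                    # hypothetical data lake name
        projectId: 5e2211c17a3e5a48f5497de3   # hypothetical project ID
outputs:
  # Hostnames assigned to the data lake, exported as a stack output.
  hostnames: ${exampleDataLake.hostnames}
```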

The following arguments are supported:

Name string

Name of the data lake.

ProjectId string

The unique ID for the project in which to create the data lake.

Name string

Name of the data lake.

ProjectId string

The unique ID for the project in which to create the data lake.

name String

Name of the data lake.

projectId String

The unique ID for the project in which to create the data lake.

name string

Name of the data lake.

projectId string

The unique ID for the project in which to create the data lake.

name str

Name of the data lake.

project_id str

The unique ID for the project in which to create the data lake.

name String

Name of the data lake.

projectId String

The unique ID for the project in which to create the data lake.

getDataLake Result

The following output properties are available:

Aws List<GetDataLakeAw>

AWS provider of the cloud service where Data Lake can access the S3 Bucket.

  • aws.0.role_id - Unique identifier of the role that Data Lake can use to access the data stores.
  • aws.0.test_s3_bucket - Name of the S3 data bucket that the provided role ID is authorized to access.
  • aws.0.iam_assumed_role_arn - Amazon Resource Name (ARN) of the IAM Role that Data Lake assumes when accessing S3 Bucket data stores.
DataProcessRegions List<GetDataLakeDataProcessRegion>

The cloud provider region to which Atlas Data Lake routes client connections for data processing.

  • data_process_region.0.cloud_provider - Name of the cloud service provider.
  • data_process_region.0.region - Name of the region to which Data Lake routes client connections for data processing.
Hostnames List<string>

The list of hostnames assigned to the Atlas Data Lake.

Id string

The provider-assigned unique ID for this managed resource.

Name string
ProjectId string
State string

Current state of the Atlas Data Lake.

StorageDatabases List<GetDataLakeStorageDatabase>

Configuration details for mapping each data store to queryable databases and collections.

  • storage_databases.#.name - Name of the database to which Data Lake maps the data contained in the data store.
  • storage_databases.#.collections - Array of objects where each object represents a collection and the data sources that map to a data store.
  • storage_databases.#.collections.#.name - Name of the collection.
  • storage_databases.#.collections.#.data_sources - Array of objects where each object represents a data store to map to the collection.
  • storage_databases.#.collections.#.data_sources.#.store_name - Name of a data store to map to the <collection>.
  • storage_databases.#.collections.#.data_sources.#.default_format - Default format that Data Lake assumes if it encounters a file without an extension while searching the storeName.
  • storage_databases.#.collections.#.data_sources.#.path - Controls how Atlas Data Lake searches for and parses files in the storeName before mapping them to the <collection>.
  • storage_databases.#.views - Array of objects where each object represents an aggregation pipeline on a collection.
  • storage_databases.#.views.#.name - Name of the view.
  • storage_databases.#.views.#.source - Name of the source collection for the view.
  • storage_databases.#.views.#.pipeline - Aggregation pipeline stage(s) to apply to the source collection.
StorageStores List<GetDataLakeStorageStore>

Each object in the array represents a data store. Data Lake uses the storage.databases configuration details to map data in each data store to queryable databases and collections.

  • storage_stores.#.name - Name of the data store.
  • storage_stores.#.provider - Defines where the data is stored.
  • storage_stores.#.region - Name of the AWS region in which the S3 bucket is hosted.
  • storage_stores.#.bucket - Name of the AWS S3 bucket.
  • storage_stores.#.prefix - Prefix Data Lake applies when searching for files in the S3 bucket.
  • storage_stores.#.delimiter - The delimiter that separates storage_databases.#.collections.#.data_sources.#.path segments in the data store.
  • storage_stores.#.include_tags - Determines whether or not to use S3 tags on the files in the given path as additional partition attributes.
Aws []GetDataLakeAw

AWS provider of the cloud service where Data Lake can access the S3 Bucket.

  • aws.0.role_id - Unique identifier of the role that Data Lake can use to access the data stores.
  • aws.0.test_s3_bucket - Name of the S3 data bucket that the provided role ID is authorized to access.
  • aws.0.iam_assumed_role_arn - Amazon Resource Name (ARN) of the IAM Role that Data Lake assumes when accessing S3 Bucket data stores.
DataProcessRegions []GetDataLakeDataProcessRegion

The cloud provider region to which Atlas Data Lake routes client connections for data processing.

  • data_process_region.0.cloud_provider - Name of the cloud service provider.
  • data_process_region.0.region - Name of the region to which Data Lake routes client connections for data processing.
Hostnames []string

The list of hostnames assigned to the Atlas Data Lake.

Id string

The provider-assigned unique ID for this managed resource.

Name string
ProjectId string
State string

Current state of the Atlas Data Lake.

StorageDatabases []GetDataLakeStorageDatabase

Configuration details for mapping each data store to queryable databases and collections.

  • storage_databases.#.name - Name of the database to which Data Lake maps the data contained in the data store.
  • storage_databases.#.collections - Array of objects where each object represents a collection and the data sources that map to a data store.
  • storage_databases.#.collections.#.name - Name of the collection.
  • storage_databases.#.collections.#.data_sources - Array of objects where each object represents a data store to map to the collection.
  • storage_databases.#.collections.#.data_sources.#.store_name - Name of a data store to map to the <collection>.
  • storage_databases.#.collections.#.data_sources.#.default_format - Default format that Data Lake assumes if it encounters a file without an extension while searching the storeName.
  • storage_databases.#.collections.#.data_sources.#.path - Controls how Atlas Data Lake searches for and parses files in the storeName before mapping them to the <collection>.
  • storage_databases.#.views - Array of objects where each object represents an aggregation pipeline on a collection.
  • storage_databases.#.views.#.name - Name of the view.
  • storage_databases.#.views.#.source - Name of the source collection for the view.
  • storage_databases.#.views.#.pipeline - Aggregation pipeline stage(s) to apply to the source collection.
StorageStores []GetDataLakeStorageStore

Each object in the array represents a data store. Data Lake uses the storage.databases configuration details to map data in each data store to queryable databases and collections.

  • storage_stores.#.name - Name of the data store.
  • storage_stores.#.provider - Defines where the data is stored.
  • storage_stores.#.region - Name of the AWS region in which the S3 bucket is hosted.
  • storage_stores.#.bucket - Name of the AWS S3 bucket.
  • storage_stores.#.prefix - Prefix Data Lake applies when searching for files in the S3 bucket.
  • storage_stores.#.delimiter - The delimiter that separates storage_databases.#.collections.#.data_sources.#.path segments in the data store.
  • storage_stores.#.include_tags - Determines whether or not to use S3 tags on the files in the given path as additional partition attributes.
aws List<GetDataLakeAw>

AWS provider of the cloud service where Data Lake can access the S3 Bucket.

  • aws.0.role_id - Unique identifier of the role that Data Lake can use to access the data stores.
  • aws.0.test_s3_bucket - Name of the S3 data bucket that the provided role ID is authorized to access.
  • aws.0.iam_assumed_role_arn - Amazon Resource Name (ARN) of the IAM Role that Data Lake assumes when accessing S3 Bucket data stores.
dataProcessRegions List<GetDataLakeDataProcessRegion>

The cloud provider region to which Atlas Data Lake routes client connections for data processing.

  • data_process_region.0.cloud_provider - Name of the cloud service provider.
  • data_process_region.0.region - Name of the region to which Data Lake routes client connections for data processing.
hostnames List<String>

The list of hostnames assigned to the Atlas Data Lake.

id String

The provider-assigned unique ID for this managed resource.

name String
projectId String
state String

Current state of the Atlas Data Lake.

storageDatabases List<GetDataLakeStorageDatabase>

Configuration details for mapping each data store to queryable databases and collections.

  • storage_databases.#.name - Name of the database to which Data Lake maps the data contained in the data store.
  • storage_databases.#.collections - Array of objects where each object represents a collection and the data sources that map to a data store.
  • storage_databases.#.collections.#.name - Name of the collection.
  • storage_databases.#.collections.#.data_sources - Array of objects where each object represents a data store to map to the collection.
  • storage_databases.#.collections.#.data_sources.#.store_name - Name of a data store to map to the <collection>.
  • storage_databases.#.collections.#.data_sources.#.default_format - Default format that Data Lake assumes if it encounters a file without an extension while searching the storeName.
  • storage_databases.#.collections.#.data_sources.#.path - Controls how Atlas Data Lake searches for and parses files in the storeName before mapping them to the <collection>.
  • storage_databases.#.views - Array of objects where each object represents an aggregation pipeline on a collection.
  • storage_databases.#.views.#.name - Name of the view.
  • storage_databases.#.views.#.source - Name of the source collection for the view.
  • storage_databases.#.views.#.pipeline - Aggregation pipeline stage(s) to apply to the source collection.
storageStores List<GetDataLakeStorageStore>

Each object in the array represents a data store. Data Lake uses the storage.databases configuration details to map data in each data store to queryable databases and collections.

  • storage_stores.#.name - Name of the data store.
  • storage_stores.#.provider - Defines where the data is stored.
  • storage_stores.#.region - Name of the AWS region in which the S3 bucket is hosted.
  • storage_stores.#.bucket - Name of the AWS S3 bucket.
  • storage_stores.#.prefix - Prefix Data Lake applies when searching for files in the S3 bucket.
  • storage_stores.#.delimiter - The delimiter that separates storage_databases.#.collections.#.data_sources.#.path segments in the data store.
  • storage_stores.#.include_tags - Determines whether or not to use S3 tags on the files in the given path as additional partition attributes.
aws GetDataLakeAw[]

AWS provider of the cloud service where Data Lake can access the S3 Bucket.

  • aws.0.role_id - Unique identifier of the role that Data Lake can use to access the data stores.
  • aws.0.test_s3_bucket - Name of the S3 data bucket that the provided role ID is authorized to access.
  • aws.0.iam_assumed_role_arn - Amazon Resource Name (ARN) of the IAM Role that Data Lake assumes when accessing S3 Bucket data stores.
dataProcessRegions GetDataLakeDataProcessRegion[]

The cloud provider region to which Atlas Data Lake routes client connections for data processing.

  • data_process_region.0.cloud_provider - Name of the cloud service provider.
  • data_process_region.0.region - Name of the region to which Data Lake routes client connections for data processing.
hostnames string[]

The list of hostnames assigned to the Atlas Data Lake.

id string

The provider-assigned unique ID for this managed resource.

name string
projectId string
state string

Current state of the Atlas Data Lake.

storageDatabases GetDataLakeStorageDatabase[]

Configuration details for mapping each data store to queryable databases and collections.

  • storage_databases.#.name - Name of the database to which Data Lake maps the data contained in the data store.
  • storage_databases.#.collections - Array of objects where each object represents a collection and the data sources that map to a data store.
  • storage_databases.#.collections.#.name - Name of the collection.
  • storage_databases.#.collections.#.data_sources - Array of objects where each object represents a data store to map to the collection.
  • storage_databases.#.collections.#.data_sources.#.store_name - Name of a data store to map to the <collection>.
  • storage_databases.#.collections.#.data_sources.#.default_format - Default format that Data Lake assumes if it encounters a file without an extension while searching the storeName.
  • storage_databases.#.collections.#.data_sources.#.path - Controls how Atlas Data Lake searches for and parses files in the storeName before mapping them to the <collection>.
  • storage_databases.#.views - Array of objects where each object represents an aggregation pipeline on a collection.
  • storage_databases.#.views.#.name - Name of the view.
  • storage_databases.#.views.#.source - Name of the source collection for the view.
  • storage_databases.#.views.#.pipeline - Aggregation pipeline stage(s) to apply to the source collection.
storageStores GetDataLakeStorageStore[]

Each object in the array represents a data store. Data Lake uses the storage.databases configuration details to map data in each data store to queryable databases and collections.

  • storage_stores.#.name - Name of the data store.
  • storage_stores.#.provider - Defines where the data is stored.
  • storage_stores.#.region - Name of the AWS region in which the S3 bucket is hosted.
  • storage_stores.#.bucket - Name of the AWS S3 bucket.
  • storage_stores.#.prefix - Prefix Data Lake applies when searching for files in the S3 bucket.
  • storage_stores.#.delimiter - The delimiter that separates storage_databases.#.collections.#.data_sources.#.path segments in the data store.
  • storage_stores.#.include_tags - Determines whether or not to use S3 tags on the files in the given path as additional partition attributes.
aws Sequence[GetDataLakeAw]

AWS provider of the cloud service where Data Lake can access the S3 Bucket.

  • aws.0.role_id - Unique identifier of the role that Data Lake can use to access the data stores.
  • aws.0.test_s3_bucket - Name of the S3 data bucket that the provided role ID is authorized to access.
  • aws.0.iam_assumed_role_arn - Amazon Resource Name (ARN) of the IAM Role that Data Lake assumes when accessing S3 Bucket data stores.
data_process_regions Sequence[GetDataLakeDataProcessRegion]

The cloud provider region to which Atlas Data Lake routes client connections for data processing.

  • data_process_region.0.cloud_provider - Name of the cloud service provider.
  • data_process_region.0.region - Name of the region to which Data Lake routes client connections for data processing.
hostnames Sequence[str]

The list of hostnames assigned to the Atlas Data Lake.

id str

The provider-assigned unique ID for this managed resource.

name str
project_id str
state str

Current state of the Atlas Data Lake.

storage_databases Sequence[GetDataLakeStorageDatabase]

Configuration details for mapping each data store to queryable databases and collections.

  • storage_databases.#.name - Name of the database to which Data Lake maps the data contained in the data store.
  • storage_databases.#.collections - Array of objects where each object represents a collection and the data sources that map to a data store.
  • storage_databases.#.collections.#.name - Name of the collection.
  • storage_databases.#.collections.#.data_sources - Array of objects where each object represents a data store to map to the collection.
  • storage_databases.#.collections.#.data_sources.#.store_name - Name of a data store to map to the <collection>.
  • storage_databases.#.collections.#.data_sources.#.default_format - Default format that Data Lake assumes if it encounters a file without an extension while searching the storeName.
  • storage_databases.#.collections.#.data_sources.#.path - Controls how Atlas Data Lake searches for and parses files in the storeName before mapping them to the <collection>.
  • storage_databases.#.views - Array of objects where each object represents an aggregation pipeline on a collection.
  • storage_databases.#.views.#.name - Name of the view.
  • storage_databases.#.views.#.source - Name of the source collection for the view.
  • storage_databases.#.views.#.pipeline - Aggregation pipeline stage(s) to apply to the source collection.
storage_stores Sequence[GetDataLakeStorageStore]

Each object in the array represents a data store. Data Lake uses the storage.databases configuration details to map data in each data store to queryable databases and collections.

  • storage_stores.#.name - Name of the data store.
  • storage_stores.#.provider - Defines where the data is stored.
  • storage_stores.#.region - Name of the AWS region in which the S3 bucket is hosted.
  • storage_stores.#.bucket - Name of the AWS S3 bucket.
  • storage_stores.#.prefix - Prefix Data Lake applies when searching for files in the S3 bucket.
  • storage_stores.#.delimiter - The delimiter that separates storage_databases.#.collections.#.data_sources.#.path segments in the data store.
  • storage_stores.#.include_tags - Determines whether or not to use S3 tags on the files in the given path as additional partition attributes.
aws List<Property Map>

AWS provider of the cloud service where Data Lake can access the S3 Bucket.

  • aws.0.role_id - Unique identifier of the role that Data Lake can use to access the data stores.
  • aws.0.test_s3_bucket - Name of the S3 data bucket that the provided role ID is authorized to access.
  • aws.0.iam_assumed_role_arn - Amazon Resource Name (ARN) of the IAM Role that Data Lake assumes when accessing S3 Bucket data stores.
dataProcessRegions List<Property Map>

The cloud provider region to which Atlas Data Lake routes client connections for data processing.

  • data_process_region.0.cloud_provider - Name of the cloud service provider.
  • data_process_region.0.region - Name of the region to which Data Lake routes client connections for data processing.
hostnames List<String>

The list of hostnames assigned to the Atlas Data Lake.

id String

The provider-assigned unique ID for this managed resource.

name String
projectId String
state String

Current state of the Atlas Data Lake.

storageDatabases List<Property Map>

Configuration details for mapping each data store to queryable databases and collections.

  • storage_databases.#.name - Name of the database to which Data Lake maps the data contained in the data store.
  • storage_databases.#.collections - Array of objects where each object represents a collection and the data sources that map to a data store.
  • storage_databases.#.collections.#.name - Name of the collection.
  • storage_databases.#.collections.#.data_sources - Array of objects where each object represents a data store to map to the collection.
  • storage_databases.#.collections.#.data_sources.#.store_name - Name of a data store to map to the <collection>.
  • storage_databases.#.collections.#.data_sources.#.default_format - Default format that Data Lake assumes if it encounters a file without an extension while searching the storeName.
  • storage_databases.#.collections.#.data_sources.#.path - Controls how Atlas Data Lake searches for and parses files in the storeName before mapping them to the <collection>.
  • storage_databases.#.views - Array of objects where each object represents an aggregation pipeline on a collection.
  • storage_databases.#.views.#.name - Name of the view.
  • storage_databases.#.views.#.source - Name of the source collection for the view.
  • storage_databases.#.views.#.pipeline - Aggregation pipeline stage(s) to apply to the source collection.
storageStores List<Property Map>

Each object in the array represents a data store. Data Lake uses the storage.databases configuration details to map data in each data store to queryable databases and collections.

  • storage_stores.#.name - Name of the data store.
  • storage_stores.#.provider - Defines where the data is stored.
  • storage_stores.#.region - Name of the AWS region in which the S3 bucket is hosted.
  • storage_stores.#.bucket - Name of the AWS S3 bucket.
  • storage_stores.#.prefix - Prefix Data Lake applies when searching for files in the S3 bucket.
  • storage_stores.#.delimiter - The delimiter that separates storage_databases.#.collections.#.data_sources.#.path segments in the data store.
  • storage_stores.#.include_tags - Determines whether or not to use S3 tags on the files in the given path as additional partition attributes.
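
In the flattened attribute paths above, `#` stands for a list index. As a minimal sketch in plain Python (dictionaries with hypothetical values standing in for the SDK's typed result objects), the nesting those paths describe looks like this:

```python
# Illustrative only: a plain-dict stand-in for a GetDataLakeResult, with
# hypothetical values; the real SDK returns typed objects with these fields.
result = {
    "storage_databases": [
        {
            "name": "sampleDb",
            "collections": [
                {
                    "name": "orders",
                    "data_sources": [
                        # storage_databases.#.collections.#.data_sources.#.store_name
                        {"store_name": "s3store", "default_format": ".json", "path": "/orders/*"},
                    ],
                },
            ],
        },
    ],
}

# The path storage_databases.#.collections.#.data_sources.#.store_name
# corresponds to nested list indexing in the returned structure:
store_names = [
    src["store_name"]
    for db in result["storage_databases"]
    for coll in db["collections"]
    for src in coll["data_sources"]
]
print(store_names)  # prints: ['s3store']
```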

Supporting Types

GetDataLakeAw

GetDataLakeDataProcessRegion

CloudProvider string
Region string
CloudProvider string
Region string
cloudProvider String
region String
cloudProvider string
region string
cloudProvider String
region String

GetDataLakeStorageDatabase

GetDataLakeStorageDatabaseCollection

dataSources List<Property Map>
name String

Name of the collection.

GetDataLakeStorageDatabaseCollectionDataSource

DefaultFormat string
Path string
StoreName string
DefaultFormat string
Path string
StoreName string
defaultFormat String
path String
storeName String
defaultFormat string
path string
storeName string
defaultFormat String
path String
storeName String

GetDataLakeStorageDatabaseView

Name string

Name of the view.

Pipeline string
Source string
Name string

Name of the view.

Pipeline string
Source string
name String

Name of the view.

pipeline String
source String
name string

Name of the view.

pipeline string
source string
name str

Name of the view.

pipeline str
source str
name String

Name of the view.

pipeline String
source String

GetDataLakeStorageStore

AdditionalStorageClasses List<string>
Bucket string
Delimiter string
IncludeTags bool
Name string

Name of the data store.

Prefix string
Provider string
Region string
AdditionalStorageClasses []string
Bucket string
Delimiter string
IncludeTags bool
Name string

Name of the data store.

Prefix string
Provider string
Region string
additionalStorageClasses List<String>
bucket String
delimiter String
includeTags Boolean
name String

Name of the data store.

prefix String
provider String
region String
additionalStorageClasses string[]
bucket string
delimiter string
includeTags boolean
name string

Name of the data store.

prefix string
provider string
region string
additional_storage_classes Sequence[str]
bucket str
delimiter str
include_tags bool
name str

Name of the data store.

prefix str
provider str
region str
additionalStorageClasses List<String>
bucket String
delimiter String
includeTags Boolean
name String

Name of the data store.

prefix String
provider String
region String

Package Details

Repository
https://github.com/pulumi/pulumi-mongodbatlas
License
Apache-2.0
Notes

This Pulumi package is based on the mongodbatlas Terraform Provider.