mongodbatlas.getDataLakes

mongodbatlas.getDataLakes describes all Data Lakes in the specified project.
NOTE: Groups and projects are synonymous terms. You may find groupId in the official documentation.
Example Usage
using System.Collections.Generic;
using Pulumi;
using Mongodbatlas = Pulumi.Mongodbatlas;
return await Deployment.RunAsync(() =>
{
    var test = Mongodbatlas.GetDataLakes.Invoke(new()
    {
        ProjectId = "PROJECT ID",
    });
});
package main

import (
	"github.com/pulumi/pulumi-mongodbatlas/sdk/v3/go/mongodbatlas"
	"github.com/pulumi/pulumi/sdk/v3/go/pulumi"
)

func main() {
	pulumi.Run(func(ctx *pulumi.Context) error {
		_, err := mongodbatlas.LookupDataLakes(ctx, &mongodbatlas.LookupDataLakesArgs{
			ProjectId: "PROJECT ID",
		}, nil)
		if err != nil {
			return err
		}
		return nil
	})
}
package generated_program;

import com.pulumi.Context;
import com.pulumi.Pulumi;
import com.pulumi.mongodbatlas.MongodbatlasFunctions;
import com.pulumi.mongodbatlas.inputs.GetDataLakesArgs;

public class App {
    public static void main(String[] args) {
        Pulumi.run(App::stack);
    }

    public static void stack(Context ctx) {
        final var test = MongodbatlasFunctions.getDataLakes(GetDataLakesArgs.builder()
            .projectId("PROJECT ID")
            .build());
    }
}
import pulumi
import pulumi_mongodbatlas as mongodbatlas
test = mongodbatlas.get_data_lakes(project_id="PROJECT ID")
import * as pulumi from "@pulumi/pulumi";
import * as mongodbatlas from "@pulumi/mongodbatlas";
const test = mongodbatlas.getDataLakes({
    projectId: "PROJECT ID",
});
variables:
  test:
    fn::invoke:
      Function: mongodbatlas:getDataLakes
      Arguments:
        projectId: PROJECT ID
Using getDataLakes
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
function getDataLakes(args: GetDataLakesArgs, opts?: InvokeOptions): Promise<GetDataLakesResult>
function getDataLakesOutput(args: GetDataLakesOutputArgs, opts?: InvokeOptions): Output<GetDataLakesResult>
def get_data_lakes(project_id: Optional[str] = None,
                   opts: Optional[InvokeOptions] = None) -> GetDataLakesResult
def get_data_lakes_output(project_id: Optional[pulumi.Input[str]] = None,
                          opts: Optional[InvokeOptions] = None) -> Output[GetDataLakesResult]
func LookupDataLakes(ctx *Context, args *LookupDataLakesArgs, opts ...InvokeOption) (*LookupDataLakesResult, error)
func LookupDataLakesOutput(ctx *Context, args *LookupDataLakesOutputArgs, opts ...InvokeOption) LookupDataLakesResultOutput
> Note: This function is named LookupDataLakes in the Go SDK.
public static class GetDataLakes
{
public static Task<GetDataLakesResult> InvokeAsync(GetDataLakesArgs args, InvokeOptions? opts = null)
public static Output<GetDataLakesResult> Invoke(GetDataLakesInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetDataLakesResult> getDataLakes(GetDataLakesArgs args, InvokeOptions options)
// Output-based functions aren't available in Java yet
fn::invoke:
  function: mongodbatlas:index/getDataLakes:getDataLakes
  arguments:
    # arguments dictionary
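The difference between the two invocation forms can be sketched with a standard-library analogy. This is illustrative only, not the Pulumi SDK: the helper names merely echo the signatures above, and a concurrent.futures.Future stands in for Pulumi's Output.

```python
# Hypothetical stand-ins for the two invocation forms. A Future plays the
# role of Pulumi's Output wrapper; this is an analogy, not the real SDK.
from concurrent.futures import Future, ThreadPoolExecutor


def get_data_lakes(project_id: str) -> dict:
    """Direct form: blocks until the plain result is available."""
    return {"project_id": project_id, "results": []}


def get_data_lakes_output(project_id: str) -> Future:
    """Output form: returns a wrapper that resolves to the result later."""
    executor = ThreadPoolExecutor(max_workers=1)
    return executor.submit(get_data_lakes, project_id)


direct = get_data_lakes("PROJECT ID")          # plain dict, available now
wrapped = get_data_lakes_output("PROJECT ID")  # Future; unwrap with .result()
print(direct["project_id"], wrapped.result()["project_id"])
```

In a real program the output form composes with other not-yet-known values, which is why it accepts Input-wrapped arguments.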
The following arguments are supported:
- projectId (string) - The unique ID for the project to get all data lakes.
getDataLakes Result
The following output properties are available:
- id (string) - The provider-assigned unique ID for this managed resource.
- projectId (string)
- results (List&lt;GetDataLakesResult&gt;) - A list where each element represents a Data Lake.
Supporting Types
GetDataLakesResult
- aws (List&lt;GetDataLakesResultAw&gt;)
- dataProcessRegions (List&lt;GetDataLakesResultDataProcessRegion&gt;) - The cloud provider region to which Atlas Data Lake routes client connections for data processing.
  - data_process_region.0.cloud_provider - Name of the cloud service provider.
  - data_process_region.0.region - Name of the region to which Data Lake routes client connections for data processing.
- hostnames (List&lt;string&gt;) - The list of hostnames assigned to the Atlas Data Lake. Each string in the array is a hostname assigned to the Atlas Data Lake.
- name (string)
- projectId (string) - The unique ID for the project to get all data lakes.
- state (string) - Current state of the Atlas Data Lake.
- storageDatabases (List&lt;GetDataLakesResultStorageDatabase&gt;) - Configuration details for mapping each data store to queryable databases and collections.
  - storage_databases.#.name - Name of the database to which Data Lake maps the data contained in the data store.
  - storage_databases.#.collections - Array of objects where each object represents a collection and data sources that map to a stores data store.
  - storage_databases.#.collections.#.name - Name of the collection.
  - storage_databases.#.collections.#.data_sources - Array of objects where each object represents a stores data store to map with the collection.
  - storage_databases.#.collections.#.data_sources.#.store_name - Name of a data store to map to the &lt;collection&gt;.
  - storage_databases.#.collections.#.data_sources.#.default_format - Default format that Data Lake assumes if it encounters a file without an extension while searching the storeName.
  - storage_databases.#.collections.#.data_sources.#.path - Controls how Atlas Data Lake searches for and parses files in the storeName before mapping them to the &lt;collection&gt;.
  - storage_databases.#.views - Array of objects where each object represents an aggregation pipeline on a collection.
  - storage_databases.#.views.#.name - Name of the view.
  - storage_databases.#.views.#.source - Name of the source collection for the view.
  - storage_databases.#.views.#.pipeline - Aggregation pipeline stage(s) to apply to the source collection.
- storageStores (List&lt;GetDataLakesResultStorageStore&gt;) - Each object in the array represents a data store. Data Lake uses the storage.databases configuration details to map data in each data store to queryable databases and collections.
  - storage_stores.#.name - Name of the data store.
  - storage_stores.#.provider - Defines where the data is stored.
  - storage_stores.#.region - Name of the AWS region in which the S3 bucket is hosted.
  - storage_stores.#.bucket - Name of the AWS S3 bucket.
  - storage_stores.#.prefix - Prefix Data Lake applies when searching for files in the S3 bucket.
  - storage_stores.#.delimiter - The delimiter that separates storage_databases.#.collections.#.data_sources.#.path segments in the data store.
  - storage_stores.#.include_tags - Determines whether or not to use S3 tags on the files in the given path as additional partition attributes.
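The storage_databases.#... attribute notation above uses # as an array-index wildcard. A minimal stdlib-Python resolver makes the convention concrete; the sample document below is hypothetical, shaped like the fields documented in this section.

```python
# Resolve dotted attribute paths where `#` matches every index of a list,
# mirroring the storage_databases.#... notation used in these docs.


def resolve(doc, path):
    """Yield every value reached by `path`; `#` fans out over list items."""
    parts = path.split(".")

    def walk(node, i):
        if i == len(parts):
            yield node
            return
        key = parts[i]
        if key == "#":
            for item in node:          # wildcard: visit every element
                yield from walk(item, i + 1)
        else:
            yield from walk(node[key], i + 1)

    yield from walk(doc, 0)


# Hypothetical result fragment shaped like the documented attributes.
sample = {
    "storage_databases": [
        {
            "name": "sampledb",
            "collections": [
                {
                    "name": "orders",
                    "data_sources": [
                        {"store_name": "s3store", "default_format": ".json"}
                    ],
                }
            ],
        }
    ]
}

print(list(resolve(sample, "storage_databases.#.collections.#.data_sources.#.store_name")))
# -> ['s3store']
```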
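Since every collection's data_sources.#.store_name must name a data store declared in storage_stores, a small consistency check illustrates the mapping described above. The plain-dict shape and sample values here are hypothetical; field names follow the documented attributes.

```python
# Cross-check that every store referenced by storage_databases is declared
# in storage_stores, per the storage.databases -> stores mapping above.


def undefined_store_refs(storage_databases, storage_stores):
    """Return store names referenced by collections but never declared."""
    declared = {store["name"] for store in storage_stores}
    referenced = {
        source["store_name"]
        for db in storage_databases
        for coll in db.get("collections", [])
        for source in coll.get("data_sources", [])
    }
    return sorted(referenced - declared)


stores = [{"name": "s3store", "provider": "S3", "region": "US_EAST_1", "bucket": "my-bucket"}]
databases = [
    {
        "name": "sampledb",
        "collections": [
            {"name": "orders", "data_sources": [{"store_name": "s3store", "path": "/orders/*"}]},
            {"name": "stale", "data_sources": [{"store_name": "missing", "path": "/x"}]},
        ],
    }
]

print(undefined_store_refs(databases, stores))  # -> ['missing']
```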
GetDataLakesResultAw
- externalId (string)
- iamAssumedRoleArn (string)
- iamUserArn (string)
- roleId (string)
- testS3Bucket (string)
GetDataLakesResultDataProcessRegion
- cloudProvider (string)
- region (string)
GetDataLakesResultStorageDatabase
GetDataLakesResultStorageDatabaseCollection
GetDataLakesResultStorageDatabaseCollectionDataSource
- defaultFormat (string)
- path (string)
- storeName (string)
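As documented above, defaultFormat is the format Data Lake assumes for a file that has no extension. A hedged stdlib-Python sketch of that fallback; the helper name is hypothetical.

```python
# Illustrate the default_format fallback: use the file's extension when
# present, otherwise assume the store's configured default format.
import os


def effective_format(filename: str, default_format: str) -> str:
    """Return the file extension, or default_format when there is none."""
    ext = os.path.splitext(filename)[1]
    return ext if ext else default_format


print(effective_format("events.parquet", ".json"))  # -> .parquet
print(effective_format("events", ".json"))          # -> .json
```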
GetDataLakesResultStorageDatabaseView
GetDataLakesResultStorageStore
Package Details
- Repository: MongoDB Atlas pulumi/pulumi-mongodbatlas
- License: Apache-2.0
- Notes: This Pulumi package is based on the mongodbatlas Terraform Provider.