databricks.getCluster

Databricks v1.56.1 published on Monday, Dec 2, 2024 by Pulumi

    Note: If you have a fully automated setup with workspaces created by databricks.MwsWorkspaces or azurerm_databricks_workspace, make sure to add the depends_on attribute to prevent default auth: cannot configure default credentials errors.
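
    In a Pulumi program, one way to express that dependency is to route the lookup through an explicit databricks.Provider whose configuration consumes an output of the workspace. A minimal TypeScript sketch, where the workspace resource and the cluster name "my-cluster" are placeholders:

    import * as databricks from "@pulumi/databricks";
    
    // Hypothetical workspace created elsewhere in the program.
    declare const workspace: databricks.MwsWorkspaces;
    
    // An explicit provider whose host comes from a workspace output, so any
    // lookup routed through it waits for the workspace to exist.
    const provider = new databricks.Provider("workspace-provider", {
        host: workspace.workspaceUrl,
    });
    
    // Pass the provider via invoke options instead of relying on default auth.
    const cluster = databricks.getClusterOutput(
        { clusterName: "my-cluster" },
        { provider },
    );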

    Retrieves information about a databricks.Cluster using its id. The id can be obtained programmatically with the databricks.getClusters data source.

    Example Usage

    Retrieve attributes of each cluster in a workspace

    import * as pulumi from "@pulumi/pulumi";
    import * as databricks from "@pulumi/databricks";
    
    const all = databricks.getClusters({});
    const allGetCluster = all.then(all =>
        Object.fromEntries(all.ids.map(id => [id, databricks.getCluster({
            clusterId: id,
        })] as const)));
    
    import pulumi
    import pulumi_databricks as databricks
    
    all = databricks.get_clusters()
    all_get_cluster = {cluster_id: databricks.get_cluster(cluster_id=cluster_id) for cluster_id in all.ids}
    
    Coming soon!
    
    using System.Collections.Generic;
    using System.Linq;
    using Pulumi;
    using Databricks = Pulumi.Databricks;
    
    return await Deployment.RunAsync(() => 
    {
        var all = Databricks.GetClusters.Invoke();
    
        var allGetCluster = all.Apply(result => result.Ids.ToDictionary(
            id => id,
            id => Databricks.GetCluster.Invoke(new()
            {
                ClusterId = id,
            })));
    
    });
    
    Coming soon!
    
    Coming soon!
    
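    A cluster can also be located by its exact name rather than its id. A minimal sketch of the direct (Promise-wrapped) form, assuming the workspace contains a cluster named "Shared Autoscaling" (a placeholder):

    import * as databricks from "@pulumi/databricks";
    
    // Look up a single cluster by its exact name.
    const shared = databricks.getCluster({
        clusterName: "Shared Autoscaling",
    });
    
    // The direct form returns a Promise; resolve result fields with then.
    export const sharedClusterId = shared.then(c => c.clusterId);
    export const sharedSparkVersion = shared.then(c => c.clusterInfo.sparkVersion);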

    Related Resources

    The following resources are often used in the same context:

    • End to end workspace management guide.
    • databricks.Cluster to create Databricks Clusters.
    • databricks.ClusterPolicy to create a databricks.Cluster policy, which limits the ability to create clusters based on a set of rules.
    • databricks.InstancePool to manage instance pools to reduce cluster start and auto-scaling times by maintaining a set of idle, ready-to-use instances.
    • databricks.Job to manage Databricks Jobs, which run non-interactive code on a databricks_cluster.
    • databricks.Library to install a library on a databricks_cluster.
    • databricks.Pipeline to deploy Delta Live Tables.

    Using getCluster

    Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.

    function getCluster(args: GetClusterArgs, opts?: InvokeOptions): Promise<GetClusterResult>
    function getClusterOutput(args: GetClusterOutputArgs, opts?: InvokeOptions): Output<GetClusterResult>
    def get_cluster(cluster_id: Optional[str] = None,
                    cluster_info: Optional[GetClusterClusterInfo] = None,
                    cluster_name: Optional[str] = None,
                    id: Optional[str] = None,
                    opts: Optional[InvokeOptions] = None) -> GetClusterResult
    def get_cluster_output(cluster_id: Optional[pulumi.Input[str]] = None,
                    cluster_info: Optional[pulumi.Input[GetClusterClusterInfoArgs]] = None,
                    cluster_name: Optional[pulumi.Input[str]] = None,
                    id: Optional[pulumi.Input[str]] = None,
                    opts: Optional[InvokeOptions] = None) -> Output[GetClusterResult]
    func LookupCluster(ctx *Context, args *LookupClusterArgs, opts ...InvokeOption) (*LookupClusterResult, error)
    func LookupClusterOutput(ctx *Context, args *LookupClusterOutputArgs, opts ...InvokeOption) LookupClusterResultOutput

    > Note: This function is named LookupCluster in the Go SDK.

    public static class GetCluster 
    {
        public static Task<GetClusterResult> InvokeAsync(GetClusterArgs args, InvokeOptions? opts = null)
        public static Output<GetClusterResult> Invoke(GetClusterInvokeArgs args, InvokeOptions? opts = null)
    }
    public static CompletableFuture<GetClusterResult> getCluster(GetClusterArgs args, InvokeOptions options)
    // Output-based functions aren't available in Java yet
    
    fn::invoke:
      function: databricks:index/getCluster:getCluster
      arguments:
        # arguments dictionary

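    As an illustration of the output form, the sketch below feeds a stack configuration value into getClusterOutput and unwraps one field of the Output-wrapped result (the config key clusterId is a placeholder):

    import * as pulumi from "@pulumi/pulumi";
    import * as databricks from "@pulumi/databricks";
    
    const config = new pulumi.Config();
    
    // The output form accepts Input-wrapped arguments, so the id may come from
    // configuration or from another resource's output.
    const lookedUp = databricks.getClusterOutput({
        clusterId: config.require("clusterId"),
    });
    
    // Result properties are Output-wrapped; unwrap them with apply.
    export const clusterState = lookedUp.clusterInfo.apply(info => info.state);
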
    The following arguments are supported:

    ClusterId string
    The id of the cluster
    ClusterInfo GetClusterClusterInfo
    Block consisting of the following fields:
    ClusterName string
    The exact name of the cluster to search
    Id string
    cluster ID
    ClusterId string
    The id of the cluster
    ClusterInfo GetClusterClusterInfo
    Block consisting of the following fields:
    ClusterName string
    The exact name of the cluster to search
    Id string
    cluster ID
    clusterId String
    The id of the cluster
    clusterInfo GetClusterClusterInfo
    Block consisting of the following fields:
    clusterName String
    The exact name of the cluster to search
    id String
    cluster ID
    clusterId string
    The id of the cluster
    clusterInfo GetClusterClusterInfo
    Block consisting of the following fields:
    clusterName string
    The exact name of the cluster to search
    id string
    cluster ID
    cluster_id str
    The id of the cluster
    cluster_info GetClusterClusterInfo
    Block consisting of the following fields:
    cluster_name str
    The exact name of the cluster to search
    id str
    cluster ID
    clusterId String
    The id of the cluster
    clusterInfo Property Map
    Block consisting of the following fields:
    clusterName String
    The exact name of the cluster to search
    id String
    cluster ID

    getCluster Result

    The following output properties are available:

    ClusterId string
    ClusterInfo GetClusterClusterInfo
    Block consisting of the following fields:
    ClusterName string
    Cluster name, which doesn’t have to be unique.
    Id string
    cluster ID
    ClusterId string
    ClusterInfo GetClusterClusterInfo
    Block consisting of the following fields:
    ClusterName string
    Cluster name, which doesn’t have to be unique.
    Id string
    cluster ID
    clusterId String
    clusterInfo GetClusterClusterInfo
    Block consisting of the following fields:
    clusterName String
    Cluster name, which doesn’t have to be unique.
    id String
    cluster ID
    clusterId string
    clusterInfo GetClusterClusterInfo
    Block consisting of the following fields:
    clusterName string
    Cluster name, which doesn’t have to be unique.
    id string
    cluster ID
    cluster_id str
    cluster_info GetClusterClusterInfo
    Block consisting of the following fields:
    cluster_name str
    Cluster name, which doesn’t have to be unique.
    id str
    cluster ID
    clusterId String
    clusterInfo Property Map
    Block consisting of the following fields:
    clusterName String
    Cluster name, which doesn’t have to be unique.
    id String
    cluster ID

    Supporting Types

    GetClusterClusterInfo

    Autoscale GetClusterClusterInfoAutoscale
    AutoterminationMinutes int
    Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
    AwsAttributes GetClusterClusterInfoAwsAttributes
    AzureAttributes GetClusterClusterInfoAzureAttributes
    ClusterCores double
    ClusterId string
    The id of the cluster
    ClusterLogConf GetClusterClusterInfoClusterLogConf
    ClusterLogStatus GetClusterClusterInfoClusterLogStatus
    ClusterMemoryMb int
    ClusterName string
    The exact name of the cluster to search
    ClusterSource string
    CreatorUserName string
    CustomTags Dictionary<string, string>
    Additional tags for cluster resources.
    DataSecurityMode string
    Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
    DefaultTags Dictionary<string, string>
    DockerImage GetClusterClusterInfoDockerImage
    Driver GetClusterClusterInfoDriver
    DriverInstancePoolId string
    Similar to instance_pool_id, but for the driver node.
    DriverNodeTypeId string
    The node type of the Spark driver.
    EnableElasticDisk bool
    Use autoscaling local storage.
    EnableLocalDiskEncryption bool
    Enable local disk encryption.
    Executors List<GetClusterClusterInfoExecutor>
    GcpAttributes GetClusterClusterInfoGcpAttributes
    InitScripts List<GetClusterClusterInfoInitScript>
    InstancePoolId string
    The pool of idle instances the cluster is attached to.
    JdbcPort int
    LastRestartedTime int
    LastStateLossTime int
    NodeTypeId string
    Any supported databricks.getNodeType id.
    NumWorkers int
    PolicyId string
    Identifier of Cluster Policy to validate cluster and preset certain defaults.
    RuntimeEngine string
    The type of runtime engine of the cluster.
    SingleUserName string
    The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
    SparkConf Dictionary<string, string>
    Map with key-value pairs to fine-tune Spark clusters.
    SparkContextId int
    SparkEnvVars Dictionary<string, string>
    Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
    SparkVersion string
    Runtime version of the cluster.
    Spec GetClusterClusterInfoSpec
    SshPublicKeys List<string>
    SSH public key contents that will be added to each Spark node in this cluster.
    StartTime int
    State string
    StateMessage string
    TerminatedTime int
    TerminationReason GetClusterClusterInfoTerminationReason
    WorkloadType GetClusterClusterInfoWorkloadType
    Autoscale GetClusterClusterInfoAutoscale
    AutoterminationMinutes int
    Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
    AwsAttributes GetClusterClusterInfoAwsAttributes
    AzureAttributes GetClusterClusterInfoAzureAttributes
    ClusterCores float64
    ClusterId string
    The id of the cluster
    ClusterLogConf GetClusterClusterInfoClusterLogConf
    ClusterLogStatus GetClusterClusterInfoClusterLogStatus
    ClusterMemoryMb int
    ClusterName string
    The exact name of the cluster to search
    ClusterSource string
    CreatorUserName string
    CustomTags map[string]string
    Additional tags for cluster resources.
    DataSecurityMode string
    Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
    DefaultTags map[string]string
    DockerImage GetClusterClusterInfoDockerImage
    Driver GetClusterClusterInfoDriver
    DriverInstancePoolId string
    Similar to instance_pool_id, but for the driver node.
    DriverNodeTypeId string
    The node type of the Spark driver.
    EnableElasticDisk bool
    Use autoscaling local storage.
    EnableLocalDiskEncryption bool
    Enable local disk encryption.
    Executors []GetClusterClusterInfoExecutor
    GcpAttributes GetClusterClusterInfoGcpAttributes
    InitScripts []GetClusterClusterInfoInitScript
    InstancePoolId string
    The pool of idle instances the cluster is attached to.
    JdbcPort int
    LastRestartedTime int
    LastStateLossTime int
    NodeTypeId string
    Any supported databricks.getNodeType id.
    NumWorkers int
    PolicyId string
    Identifier of Cluster Policy to validate cluster and preset certain defaults.
    RuntimeEngine string
    The type of runtime engine of the cluster.
    SingleUserName string
    The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
    SparkConf map[string]string
    Map with key-value pairs to fine-tune Spark clusters.
    SparkContextId int
    SparkEnvVars map[string]string
    Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
    SparkVersion string
    Runtime version of the cluster.
    Spec GetClusterClusterInfoSpec
    SshPublicKeys []string
    SSH public key contents that will be added to each Spark node in this cluster.
    StartTime int
    State string
    StateMessage string
    TerminatedTime int
    TerminationReason GetClusterClusterInfoTerminationReason
    WorkloadType GetClusterClusterInfoWorkloadType
    autoscale GetClusterClusterInfoAutoscale
    autoterminationMinutes Integer
    Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
    awsAttributes GetClusterClusterInfoAwsAttributes
    azureAttributes GetClusterClusterInfoAzureAttributes
    clusterCores Double
    clusterId String
    The id of the cluster
    clusterLogConf GetClusterClusterInfoClusterLogConf
    clusterLogStatus GetClusterClusterInfoClusterLogStatus
    clusterMemoryMb Integer
    clusterName String
    The exact name of the cluster to search
    clusterSource String
    creatorUserName String
    customTags Map<String,String>
    Additional tags for cluster resources.
    dataSecurityMode String
    Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
    defaultTags Map<String,String>
    dockerImage GetClusterClusterInfoDockerImage
    driver GetClusterClusterInfoDriver
    driverInstancePoolId String
    Similar to instance_pool_id, but for the driver node.
    driverNodeTypeId String
    The node type of the Spark driver.
    enableElasticDisk Boolean
    Use autoscaling local storage.
    enableLocalDiskEncryption Boolean
    Enable local disk encryption.
    executors List<GetClusterClusterInfoExecutor>
    gcpAttributes GetClusterClusterInfoGcpAttributes
    initScripts List<GetClusterClusterInfoInitScript>
    instancePoolId String
    The pool of idle instances the cluster is attached to.
    jdbcPort Integer
    lastRestartedTime Integer
    lastStateLossTime Integer
    nodeTypeId String
    Any supported databricks.getNodeType id.
    numWorkers Integer
    policyId String
    Identifier of Cluster Policy to validate cluster and preset certain defaults.
    runtimeEngine String
    The type of runtime engine of the cluster.
    singleUserName String
    The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
    sparkConf Map<String,String>
    Map with key-value pairs to fine-tune Spark clusters.
    sparkContextId Integer
    sparkEnvVars Map<String,String>
    Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
    sparkVersion String
    Runtime version of the cluster.
    spec GetClusterClusterInfoSpec
    sshPublicKeys List<String>
    SSH public key contents that will be added to each Spark node in this cluster.
    startTime Integer
    state String
    stateMessage String
    terminatedTime Integer
    terminationReason GetClusterClusterInfoTerminationReason
    workloadType GetClusterClusterInfoWorkloadType
    autoscale GetClusterClusterInfoAutoscale
    autoterminationMinutes number
    Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
    awsAttributes GetClusterClusterInfoAwsAttributes
    azureAttributes GetClusterClusterInfoAzureAttributes
    clusterCores number
    clusterId string
    The id of the cluster
    clusterLogConf GetClusterClusterInfoClusterLogConf
    clusterLogStatus GetClusterClusterInfoClusterLogStatus
    clusterMemoryMb number
    clusterName string
    The exact name of the cluster to search
    clusterSource string
    creatorUserName string
    customTags {[key: string]: string}
    Additional tags for cluster resources.
    dataSecurityMode string
    Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
    defaultTags {[key: string]: string}
    dockerImage GetClusterClusterInfoDockerImage
    driver GetClusterClusterInfoDriver
    driverInstancePoolId string
    Similar to instance_pool_id, but for the driver node.
    driverNodeTypeId string
    The node type of the Spark driver.
    enableElasticDisk boolean
    Use autoscaling local storage.
    enableLocalDiskEncryption boolean
    Enable local disk encryption.
    executors GetClusterClusterInfoExecutor[]
    gcpAttributes GetClusterClusterInfoGcpAttributes
    initScripts GetClusterClusterInfoInitScript[]
    instancePoolId string
    The pool of idle instances the cluster is attached to.
    jdbcPort number
    lastRestartedTime number
    lastStateLossTime number
    nodeTypeId string
    Any supported databricks.getNodeType id.
    numWorkers number
    policyId string
    Identifier of Cluster Policy to validate cluster and preset certain defaults.
    runtimeEngine string
    The type of runtime engine of the cluster.
    singleUserName string
    The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
    sparkConf {[key: string]: string}
    Map with key-value pairs to fine-tune Spark clusters.
    sparkContextId number
    sparkEnvVars {[key: string]: string}
    Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
    sparkVersion string
    Runtime version of the cluster.
    spec GetClusterClusterInfoSpec
    sshPublicKeys string[]
    SSH public key contents that will be added to each Spark node in this cluster.
    startTime number
    state string
    stateMessage string
    terminatedTime number
    terminationReason GetClusterClusterInfoTerminationReason
    workloadType GetClusterClusterInfoWorkloadType
    autoscale GetClusterClusterInfoAutoscale
    autotermination_minutes int
    Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
    aws_attributes GetClusterClusterInfoAwsAttributes
    azure_attributes GetClusterClusterInfoAzureAttributes
    cluster_cores float
    cluster_id str
    The id of the cluster
    cluster_log_conf GetClusterClusterInfoClusterLogConf
    cluster_log_status GetClusterClusterInfoClusterLogStatus
    cluster_memory_mb int
    cluster_name str
    The exact name of the cluster to search
    cluster_source str
    creator_user_name str
    custom_tags Mapping[str, str]
    Additional tags for cluster resources.
    data_security_mode str
    Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
    default_tags Mapping[str, str]
    docker_image GetClusterClusterInfoDockerImage
    driver GetClusterClusterInfoDriver
    driver_instance_pool_id str
    Similar to instance_pool_id, but for the driver node.
    driver_node_type_id str
    The node type of the Spark driver.
    enable_elastic_disk bool
    Use autoscaling local storage.
    enable_local_disk_encryption bool
    Enable local disk encryption.
    executors Sequence[GetClusterClusterInfoExecutor]
    gcp_attributes GetClusterClusterInfoGcpAttributes
    init_scripts Sequence[GetClusterClusterInfoInitScript]
    instance_pool_id str
    The pool of idle instances the cluster is attached to.
    jdbc_port int
    last_restarted_time int
    last_state_loss_time int
    node_type_id str
    Any supported databricks.getNodeType id.
    num_workers int
    policy_id str
    Identifier of Cluster Policy to validate cluster and preset certain defaults.
    runtime_engine str
    The type of runtime engine of the cluster.
    single_user_name str
    The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
    spark_conf Mapping[str, str]
    Map with key-value pairs to fine-tune Spark clusters.
    spark_context_id int
    spark_env_vars Mapping[str, str]
    Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
    spark_version str
    Runtime version of the cluster.
    spec GetClusterClusterInfoSpec
    ssh_public_keys Sequence[str]
    SSH public key contents that will be added to each Spark node in this cluster.
    start_time int
    state str
    state_message str
    terminated_time int
    termination_reason GetClusterClusterInfoTerminationReason
    workload_type GetClusterClusterInfoWorkloadType
    autoscale Property Map
    autoterminationMinutes Number
    Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
    awsAttributes Property Map
    azureAttributes Property Map
    clusterCores Number
    clusterId String
    The id of the cluster
    clusterLogConf Property Map
    clusterLogStatus Property Map
    clusterMemoryMb Number
    clusterName String
    The exact name of the cluster to search
    clusterSource String
    creatorUserName String
    customTags Map<String>
    Additional tags for cluster resources.
    dataSecurityMode String
    Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
    defaultTags Map<String>
    dockerImage Property Map
    driver Property Map
    driverInstancePoolId String
    Similar to instance_pool_id, but for the driver node.
    driverNodeTypeId String
    The node type of the Spark driver.
    enableElasticDisk Boolean
    Use autoscaling local storage.
    enableLocalDiskEncryption Boolean
    Enable local disk encryption.
    executors List<Property Map>
    gcpAttributes Property Map
    initScripts List<Property Map>
    instancePoolId String
    The pool of idle instances the cluster is attached to.
    jdbcPort Number
    lastRestartedTime Number
    lastStateLossTime Number
    nodeTypeId String
    Any supported databricks.getNodeType id.
    numWorkers Number
    policyId String
    Identifier of Cluster Policy to validate cluster and preset certain defaults.
    runtimeEngine String
    The type of runtime engine of the cluster.
    singleUserName String
    The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
    sparkConf Map<String>
    Map with key-value pairs to fine-tune Spark clusters.
    sparkContextId Number
    sparkEnvVars Map<String>
    Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
    sparkVersion String
    Runtime version of the cluster.
    spec Property Map
    sshPublicKeys List<String>
    SSH public key contents that will be added to each Spark node in this cluster.
    startTime Number
    state String
    stateMessage String
    terminatedTime Number
    terminationReason Property Map
    workloadType Property Map

    GetClusterClusterInfoAutoscale

    maxWorkers Integer
    minWorkers Integer
    maxWorkers number
    minWorkers number
    maxWorkers Number
    minWorkers Number

    GetClusterClusterInfoAwsAttributes

    GetClusterClusterInfoAzureAttributes

    GetClusterClusterInfoAzureAttributesLogAnalyticsInfo

    GetClusterClusterInfoClusterLogConf

    GetClusterClusterInfoClusterLogConfDbfs

    GetClusterClusterInfoClusterLogConfS3

    Destination string
    CannedAcl string
    EnableEncryption bool
    EncryptionType string
    Endpoint string
    KmsKey string
    Region string
    Destination string
    CannedAcl string
    EnableEncryption bool
    EncryptionType string
    Endpoint string
    KmsKey string
    Region string
    destination String
    cannedAcl String
    enableEncryption Boolean
    encryptionType String
    endpoint String
    kmsKey String
    region String
    destination string
    cannedAcl string
    enableEncryption boolean
    encryptionType string
    endpoint string
    kmsKey string
    region string
    destination String
    cannedAcl String
    enableEncryption Boolean
    encryptionType String
    endpoint String
    kmsKey String
    region String

    GetClusterClusterInfoClusterLogStatus

    GetClusterClusterInfoDockerImage

    GetClusterClusterInfoDockerImageBasicAuth

    Password string
    Username string
    Password string
    Username string
    password String
    username String
    password string
    username string
    password String
    username String

    GetClusterClusterInfoDriver

    GetClusterClusterInfoDriverNodeAwsAttributes

    IsSpot bool
    IsSpot bool
    isSpot Boolean
    isSpot boolean
    is_spot bool
    isSpot Boolean

    GetClusterClusterInfoExecutor

    GetClusterClusterInfoExecutorNodeAwsAttributes

    IsSpot bool
    IsSpot bool
    isSpot Boolean
    isSpot boolean
    is_spot bool
    isSpot Boolean

    GetClusterClusterInfoGcpAttributes

    GetClusterClusterInfoInitScript

    GetClusterClusterInfoInitScriptAbfss

    GetClusterClusterInfoInitScriptDbfs

    GetClusterClusterInfoInitScriptFile

    GetClusterClusterInfoInitScriptGcs

    GetClusterClusterInfoInitScriptS3

    Destination string
    CannedAcl string
    EnableEncryption bool
    EncryptionType string
    Endpoint string
    KmsKey string
    Region string
    Destination string
    CannedAcl string
    EnableEncryption bool
    EncryptionType string
    Endpoint string
    KmsKey string
    Region string
    destination String
    cannedAcl String
    enableEncryption Boolean
    encryptionType String
    endpoint String
    kmsKey String
    region String
    destination string
    cannedAcl string
    enableEncryption boolean
    encryptionType string
    endpoint string
    kmsKey string
    region string
    destination String
    cannedAcl String
    enableEncryption Boolean
    encryptionType String
    endpoint String
    kmsKey String
    region String

    GetClusterClusterInfoInitScriptVolumes

    GetClusterClusterInfoInitScriptWorkspace

    GetClusterClusterInfoSpec

    ClusterId string
    The id of the cluster
    DriverInstancePoolId string
    Similar to instance_pool_id, but for the driver node.
    DriverNodeTypeId string
    The node type of the Spark driver.
    EnableElasticDisk bool
    Use autoscaling local storage.
    EnableLocalDiskEncryption bool
    Enable local disk encryption.
    NodeTypeId string
    Any supported databricks.getNodeType id.
    SparkVersion string
    Runtime version of the cluster.
    ApplyPolicyDefaultValues bool
    Autoscale GetClusterClusterInfoSpecAutoscale
    AwsAttributes GetClusterClusterInfoSpecAwsAttributes
    AzureAttributes GetClusterClusterInfoSpecAzureAttributes
    ClusterLogConf GetClusterClusterInfoSpecClusterLogConf
    ClusterMountInfos List<GetClusterClusterInfoSpecClusterMountInfo>
    ClusterName string
    The exact name of the cluster to search
    CustomTags Dictionary<string, string>
    Additional tags for cluster resources.
    DataSecurityMode string
    Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
    DockerImage GetClusterClusterInfoSpecDockerImage
    GcpAttributes GetClusterClusterInfoSpecGcpAttributes
    IdempotencyToken string
    An optional token to guarantee the idempotency of cluster creation requests.
    InitScripts List<GetClusterClusterInfoSpecInitScript>
    InstancePoolId string
    The pool of idle instances the cluster is attached to.
    Libraries List<GetClusterClusterInfoSpecLibrary>
    NumWorkers int
    PolicyId string
    Identifier of Cluster Policy to validate cluster and preset certain defaults.
    RuntimeEngine string
    The type of runtime engine of the cluster.
    SingleUserName string
    The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
    SparkConf Dictionary<string, string>
    Map with key-value pairs to fine-tune Spark clusters.
    SparkEnvVars Dictionary<string, string>
    Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
    SshPublicKeys List<string>
    SSH public key contents that will be added to each Spark node in this cluster.
    WorkloadType GetClusterClusterInfoSpecWorkloadType
    ClusterId string
    The id of the cluster
    DriverInstancePoolId string
    Similar to instance_pool_id, but for the driver node.
    DriverNodeTypeId string
    The node type of the Spark driver.
    EnableElasticDisk bool
    Use autoscaling local storage.
    EnableLocalDiskEncryption bool
    Enable local disk encryption.
    NodeTypeId string
    Any supported databricks.getNodeType id.
    SparkVersion string
    Runtime version of the cluster.
    ApplyPolicyDefaultValues bool
    Autoscale GetClusterClusterInfoSpecAutoscale
    AwsAttributes GetClusterClusterInfoSpecAwsAttributes
    AzureAttributes GetClusterClusterInfoSpecAzureAttributes
    ClusterLogConf GetClusterClusterInfoSpecClusterLogConf
    ClusterMountInfos []GetClusterClusterInfoSpecClusterMountInfo
    ClusterName string
    The exact name of the cluster to search
    CustomTags map[string]string
    Additional tags for cluster resources.
    DataSecurityMode string
    Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
    DockerImage GetClusterClusterInfoSpecDockerImage
    GcpAttributes GetClusterClusterInfoSpecGcpAttributes
    IdempotencyToken string
    An optional token to guarantee the idempotency of cluster creation requests.
    InitScripts []GetClusterClusterInfoSpecInitScript
    InstancePoolId string
    The pool of idle instances the cluster is attached to.
    Libraries []GetClusterClusterInfoSpecLibrary
    NumWorkers int
    PolicyId string
    Identifier of Cluster Policy to validate cluster and preset certain defaults.
    RuntimeEngine string
    The type of runtime engine of the cluster.
    SingleUserName string
    The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
    SparkConf map[string]string
    Map with key-value pairs to fine-tune Spark clusters.
    SparkEnvVars map[string]string
    Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
    SshPublicKeys []string
    SSH public key contents that will be added to each Spark node in this cluster.
    WorkloadType GetClusterClusterInfoSpecWorkloadType
    clusterId String
    The id of the cluster
    driverInstancePoolId String
    Similar to instance_pool_id, but for the driver node.
    driverNodeTypeId String
    The node type of the Spark driver.
    enableElasticDisk Boolean
    Use autoscaling local storage.
    enableLocalDiskEncryption Boolean
    Enable local disk encryption.
    nodeTypeId String
    Any supported databricks.getNodeType id.
    sparkVersion String
    Runtime version of the cluster.
    applyPolicyDefaultValues Boolean
    autoscale GetClusterClusterInfoSpecAutoscale
    awsAttributes GetClusterClusterInfoSpecAwsAttributes
    azureAttributes GetClusterClusterInfoSpecAzureAttributes
    clusterLogConf GetClusterClusterInfoSpecClusterLogConf
    clusterMountInfos List<GetClusterClusterInfoSpecClusterMountInfo>
    clusterName String
    The exact name of the cluster to search
    customTags Map<String,String>
    Additional tags for cluster resources.
    dataSecurityMode String
    Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
    dockerImage GetClusterClusterInfoSpecDockerImage
    gcpAttributes GetClusterClusterInfoSpecGcpAttributes
    idempotencyToken String
    An optional token to guarantee the idempotency of cluster creation requests.
    initScripts List<GetClusterClusterInfoSpecInitScript>
    instancePoolId String
    The pool of idle instances the cluster is attached to.
    libraries List<GetClusterClusterInfoSpecLibrary>
    numWorkers Integer
    policyId String
    Identifier of Cluster Policy to validate cluster and preset certain defaults.
    runtimeEngine String
    The type of runtime engine of the cluster.
    singleUserName String
    The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
    sparkConf Map<String,String>
    Map with key-value pairs to fine-tune Spark clusters.
    sparkEnvVars Map<String,String>
    Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
    sshPublicKeys List<String>
    SSH public key contents that will be added to each Spark node in this cluster.
    workloadType GetClusterClusterInfoSpecWorkloadType
    clusterId string
    The id of the cluster
    driverInstancePoolId string
    Similar to instance_pool_id, but for the driver node.
    driverNodeTypeId string
    The node type of the Spark driver.
    enableElasticDisk boolean
    Use autoscaling local storage.
    enableLocalDiskEncryption boolean
    Enable local disk encryption.
    nodeTypeId string
    Any supported databricks.getNodeType id.
    sparkVersion string
    Runtime version of the cluster.
    applyPolicyDefaultValues boolean
    autoscale GetClusterClusterInfoSpecAutoscale
    awsAttributes GetClusterClusterInfoSpecAwsAttributes
    azureAttributes GetClusterClusterInfoSpecAzureAttributes
    clusterLogConf GetClusterClusterInfoSpecClusterLogConf
    clusterMountInfos GetClusterClusterInfoSpecClusterMountInfo[]
    clusterName string
    The exact name of the cluster to search
    customTags {[key: string]: string}
    Additional tags for cluster resources.
    dataSecurityMode string
    Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
    dockerImage GetClusterClusterInfoSpecDockerImage
    gcpAttributes GetClusterClusterInfoSpecGcpAttributes
    idempotencyToken string
    An optional token to guarantee the idempotency of cluster creation requests.
    initScripts GetClusterClusterInfoSpecInitScript[]
    instancePoolId string
    The pool of idle instances the cluster is attached to.
    libraries GetClusterClusterInfoSpecLibrary[]
    numWorkers number
    policyId string
    Identifier of Cluster Policy to validate cluster and preset certain defaults.
    runtimeEngine string
    The type of runtime engine of the cluster.
    singleUserName string
    The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
    sparkConf {[key: string]: string}
    Map with key-value pairs to fine-tune Spark clusters.
    sparkEnvVars {[key: string]: string}
    Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
    sshPublicKeys string[]
    SSH public key contents that will be added to each Spark node in this cluster.
    workloadType GetClusterClusterInfoSpecWorkloadType
    cluster_id str
    The id of the cluster
    driver_instance_pool_id str
    Similar to instance_pool_id, but for the driver node.
    driver_node_type_id str
    The node type of the Spark driver.
    enable_elastic_disk bool
    Use autoscaling local storage.
    enable_local_disk_encryption bool
    Enable local disk encryption.
    node_type_id str
    Any supported databricks.getNodeType id.
    spark_version str
    Runtime version of the cluster.
    apply_policy_default_values bool
    autoscale GetClusterClusterInfoSpecAutoscale
    aws_attributes GetClusterClusterInfoSpecAwsAttributes
    azure_attributes GetClusterClusterInfoSpecAzureAttributes
    cluster_log_conf GetClusterClusterInfoSpecClusterLogConf
    cluster_mount_infos Sequence[GetClusterClusterInfoSpecClusterMountInfo]
    cluster_name str
    The exact name of the cluster to search
    custom_tags Mapping[str, str]
    Additional tags for cluster resources.
    data_security_mode str
    Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
    docker_image GetClusterClusterInfoSpecDockerImage
    gcp_attributes GetClusterClusterInfoSpecGcpAttributes
    idempotency_token str
    An optional token to guarantee the idempotency of cluster creation requests.
    init_scripts Sequence[GetClusterClusterInfoSpecInitScript]
    instance_pool_id str
    The pool of idle instances the cluster is attached to.
    libraries Sequence[GetClusterClusterInfoSpecLibrary]
    num_workers int
    policy_id str
    Identifier of Cluster Policy to validate cluster and preset certain defaults.
    runtime_engine str
    The type of runtime engine of the cluster.
    single_user_name str
    The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
    spark_conf Mapping[str, str]
    Map with key-value pairs to fine-tune Spark clusters.
    spark_env_vars Mapping[str, str]
    Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
    ssh_public_keys Sequence[str]
    SSH public key contents that will be added to each Spark node in this cluster.
    workload_type GetClusterClusterInfoSpecWorkloadType
    clusterId String
    The id of the cluster
    driverInstancePoolId String
    Similar to instance_pool_id, but for the driver node.
    driverNodeTypeId String
    The node type of the Spark driver.
    enableElasticDisk Boolean
    Use autoscaling local storage.
    enableLocalDiskEncryption Boolean
    Enable local disk encryption.
    nodeTypeId String
    Any supported databricks.getNodeType id.
    sparkVersion String
    Runtime version of the cluster.
    applyPolicyDefaultValues Boolean
    autoscale Property Map
    awsAttributes Property Map
    azureAttributes Property Map
    clusterLogConf Property Map
    clusterMountInfos List<Property Map>
    clusterName String
    The exact name of the cluster to search
    customTags Map<String>
    Additional tags for cluster resources.
    dataSecurityMode String
    Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
    dockerImage Property Map
    gcpAttributes Property Map
    idempotencyToken String
    An optional token to guarantee the idempotency of cluster creation requests.
    initScripts List<Property Map>
    instancePoolId String
    The pool of idle instances the cluster is attached to.
    libraries List<Property Map>
    numWorkers Number
    policyId String
    Identifier of Cluster Policy to validate cluster and preset certain defaults.
    runtimeEngine String
    The type of runtime engine of the cluster.
    singleUserName String
    The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
    sparkConf Map<String>
    Map with key-value pairs to fine-tune Spark clusters.
    sparkEnvVars Map<String>
    Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
    sshPublicKeys List<String>
    SSH public key contents that will be added to each Spark node in this cluster.
    workloadType Property Map

    GetClusterClusterInfoSpecAutoscale

    maxWorkers Integer
    minWorkers Integer
    maxWorkers number
    minWorkers number
    maxWorkers Number
    minWorkers Number

    GetClusterClusterInfoSpecAwsAttributes

    GetClusterClusterInfoSpecAzureAttributes

    GetClusterClusterInfoSpecAzureAttributesLogAnalyticsInfo

    GetClusterClusterInfoSpecClusterLogConf

    GetClusterClusterInfoSpecClusterLogConfDbfs

    GetClusterClusterInfoSpecClusterLogConfS3

    Destination string
    CannedAcl string
    EnableEncryption bool
    EncryptionType string
    Endpoint string
    KmsKey string
    Region string
    Destination string
    CannedAcl string
    EnableEncryption bool
    EncryptionType string
    Endpoint string
    KmsKey string
    Region string
    destination String
    cannedAcl String
    enableEncryption Boolean
    encryptionType String
    endpoint String
    kmsKey String
    region String
    destination string
    cannedAcl string
    enableEncryption boolean
    encryptionType string
    endpoint string
    kmsKey string
    region string
    destination String
    cannedAcl String
    enableEncryption Boolean
    encryptionType String
    endpoint String
    kmsKey String
    region String

    GetClusterClusterInfoSpecClusterMountInfo

    GetClusterClusterInfoSpecClusterMountInfoNetworkFilesystemInfo

    GetClusterClusterInfoSpecDockerImage

    GetClusterClusterInfoSpecDockerImageBasicAuth

    Password string
    Username string
    Password string
    Username string
    password String
    username String
    password string
    username string
    password String
    username String

    GetClusterClusterInfoSpecGcpAttributes

    GetClusterClusterInfoSpecInitScript

    abfss Property Map
    dbfs Property Map

    Deprecated: For init scripts use 'volumes', 'workspace' or cloud storage location instead of 'dbfs'.

    file Property Map
    gcs Property Map
    s3 Property Map
    volumes Property Map
    workspace Property Map

    GetClusterClusterInfoSpecInitScriptAbfss

    GetClusterClusterInfoSpecInitScriptDbfs

    GetClusterClusterInfoSpecInitScriptFile

    GetClusterClusterInfoSpecInitScriptGcs

    GetClusterClusterInfoSpecInitScriptS3

    Destination string
    CannedAcl string
    EnableEncryption bool
    EncryptionType string
    Endpoint string
    KmsKey string
    Region string
    Destination string
    CannedAcl string
    EnableEncryption bool
    EncryptionType string
    Endpoint string
    KmsKey string
    Region string
    destination String
    cannedAcl String
    enableEncryption Boolean
    encryptionType String
    endpoint String
    kmsKey String
    region String
    destination string
    cannedAcl string
    enableEncryption boolean
    encryptionType string
    endpoint string
    kmsKey string
    region string
    destination String
    cannedAcl String
    enableEncryption Boolean
    encryptionType String
    endpoint String
    kmsKey String
    region String

    GetClusterClusterInfoSpecInitScriptVolumes

    GetClusterClusterInfoSpecInitScriptWorkspace

    GetClusterClusterInfoSpecLibrary

    GetClusterClusterInfoSpecLibraryCran

    Package string
    Repo string
    Package string
    Repo string
    package_ String
    repo String
    package string
    repo string
    package str
    repo str
    package String
    repo String

    GetClusterClusterInfoSpecLibraryMaven

    Coordinates string
    Exclusions List<string>
    Repo string
    Coordinates string
    Exclusions []string
    Repo string
    coordinates String
    exclusions List<String>
    repo String
    coordinates string
    exclusions string[]
    repo string
    coordinates str
    exclusions Sequence[str]
    repo str
    coordinates String
    exclusions List<String>
    repo String

    GetClusterClusterInfoSpecLibraryPypi

    Package string
    Repo string
    Package string
    Repo string
    package_ String
    repo String
    package string
    repo string
    package str
    repo str
    package String
    repo String

    GetClusterClusterInfoSpecWorkloadType

    GetClusterClusterInfoSpecWorkloadTypeClients

    Jobs bool
    Notebooks bool
    Jobs bool
    Notebooks bool
    jobs Boolean
    notebooks Boolean
    jobs boolean
    notebooks boolean
    jobs bool
    notebooks bool
    jobs Boolean
    notebooks Boolean

    GetClusterClusterInfoTerminationReason

    Code string
    Parameters Dictionary<string, string>
    Type string
    Code string
    Parameters map[string]string
    Type string
    code String
    parameters Map<String,String>
    type String
    code string
    parameters {[key: string]: string}
    type string
    code str
    parameters Mapping[str, str]
    type str
    code String
    parameters Map<String>
    type String

    GetClusterClusterInfoWorkloadType

    GetClusterClusterInfoWorkloadTypeClients

    Jobs bool
    Notebooks bool
    Jobs bool
    Notebooks bool
    jobs Boolean
    notebooks Boolean
    jobs boolean
    notebooks boolean
    jobs bool
    notebooks bool
    jobs Boolean
    notebooks Boolean

    Package Details

    Repository
    databricks pulumi/pulumi-databricks
    License
    Apache-2.0
    Notes
    This Pulumi package is based on the databricks Terraform Provider.