Databricks v1.10.0 (Mar 15, 2023)

databricks.getCluster

The following resources are often used in the same context:

  • End to end workspace management guide.
  • databricks.Cluster to create Databricks Clusters.
  • databricks.ClusterPolicy to create a databricks.Cluster policy, which limits the ability to create clusters based on a set of rules.
  • databricks.InstancePool to manage instance pools to reduce cluster start and auto-scaling times by maintaining a set of idle, ready-to-use instances.
  • databricks.Job to manage Databricks Jobs to run non-interactive code in a databricks_cluster.
  • databricks.Library to install a library on a databricks_cluster.
  • databricks.Pipeline to deploy Delta Live Tables.

Example Usage

Retrieve attributes of each cluster in a workspace


import pulumi
import pulumi_databricks as databricks

all_clusters = databricks.get_clusters()
all_cluster = [databricks.get_cluster(cluster_id=cluster_id)
               for cluster_id in all_clusters.ids]
import * as pulumi from "@pulumi/pulumi";
import * as databricks from "@pulumi/databricks";

const allClusters = databricks.getClusters({});
const allCluster = allClusters.then(clusters =>
    clusters.ids.map(clusterId => databricks.getCluster({
        clusterId: clusterId,
    })));


Using getCluster

Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result. A short Python sketch contrasting the two forms follows the signatures below.

function getCluster(args: GetClusterArgs, opts?: InvokeOptions): Promise<GetClusterResult>
function getClusterOutput(args: GetClusterOutputArgs, opts?: InvokeOptions): Output<GetClusterResult>
def get_cluster(cluster_id: Optional[str] = None,
                cluster_info: Optional[GetClusterClusterInfo] = None,
                cluster_name: Optional[str] = None,
                id: Optional[str] = None,
                opts: Optional[InvokeOptions] = None) -> GetClusterResult
def get_cluster_output(cluster_id: Optional[pulumi.Input[str]] = None,
                       cluster_info: Optional[pulumi.Input[GetClusterClusterInfoArgs]] = None,
                       cluster_name: Optional[pulumi.Input[str]] = None,
                       id: Optional[pulumi.Input[str]] = None,
                       opts: Optional[InvokeOptions] = None) -> Output[GetClusterResult]
func LookupCluster(ctx *Context, args *LookupClusterArgs, opts ...InvokeOption) (*LookupClusterResult, error)
func LookupClusterOutput(ctx *Context, args *LookupClusterOutputArgs, opts ...InvokeOption) LookupClusterResultOutput

> Note: This function is named LookupCluster in the Go SDK.

public static class GetCluster 
{
    public static Task<GetClusterResult> InvokeAsync(GetClusterArgs args, InvokeOptions? opts = null)
    public static Output<GetClusterResult> Invoke(GetClusterInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetClusterResult> getCluster(GetClusterArgs args, InvokeOptions options)
// Output-based functions aren't available in Java yet
fn::invoke:
  function: databricks:index/getCluster:getCluster
  arguments:
    # arguments dictionary
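
As a minimal Python sketch of the two forms (the cluster name shared-autoscaling is a hypothetical placeholder, not taken from this page):

import pulumi
import pulumi_databricks as databricks

# Direct form: blocks until the lookup completes and returns a plain result.
direct = databricks.get_cluster(cluster_name="shared-autoscaling")  # hypothetical name
pulumi.export("sparkVersion", direct.cluster_info.spark_version)

# Output form: accepts Inputs and returns an Output-wrapped result, so it
# composes with values that are not known until deployment.
wrapped = databricks.get_cluster_output(cluster_name="shared-autoscaling")
pulumi.export("clusterId", wrapped.cluster_id)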

The following arguments are supported; a short sketch of the two selectors follows the list:

ClusterId string

The id of the cluster

ClusterInfo GetClusterClusterInfo

Block consisting of the following fields:

ClusterName string

The exact name of the cluster to search

Id string

cluster ID

ClusterId string

The id of the cluster

ClusterInfo GetClusterClusterInfo

Block consisting of the following fields:

ClusterName string

The exact name of the cluster to search

Id string

cluster ID

clusterId String

The id of the cluster

clusterInfo GetClusterClusterInfo

Block consisting of the following fields:

clusterName String

The exact name of the cluster to search

id String

cluster ID

clusterId string

The id of the cluster

clusterInfo GetClusterClusterInfo

Block consisting of the following fields:

clusterName string

The exact name of the cluster to search

id string

cluster ID

cluster_id str

The id of the cluster

cluster_info GetClusterClusterInfo

Block consisting of the following fields:

cluster_name str

The exact name of the cluster to search

id str

cluster ID

clusterId String

The id of the cluster

clusterInfo Property Map

Block consisting of the following fields:

clusterName String

The exact name of the cluster to search

id String

cluster ID
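
As referenced above, a sketch of the two selectors in Python; pass either cluster_id or cluster_name (the name shared-autoscaling is a hypothetical placeholder):

import pulumi_databricks as databricks

# Select by exact name...
by_name = databricks.get_cluster(cluster_name="shared-autoscaling")  # hypothetical name

# ...or by ID; both selectors resolve to the same cluster here.
by_id = databricks.get_cluster(cluster_id=by_name.cluster_id)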

getCluster Result

The following output properties are available; a short consuming sketch follows the list:

ClusterId string
ClusterInfo GetClusterClusterInfo

Block consisting of the following fields:

ClusterName string

Cluster name, which doesn’t have to be unique.

Id string

cluster ID

ClusterId string
ClusterInfo GetClusterClusterInfo

Block consisting of the following fields:

ClusterName string

Cluster name, which doesn’t have to be unique.

Id string

cluster ID

clusterId String
clusterInfo GetClusterClusterInfo

Block consisting of the following fields:

clusterName String

Cluster name, which doesn’t have to be unique.

id String

cluster ID

clusterId string
clusterInfo GetClusterClusterInfo

Block consisting of the following fields:

clusterName string

Cluster name, which doesn’t have to be unique.

id string

cluster ID

cluster_id str
cluster_info GetClusterClusterInfo

Block consisting of the following fields:

cluster_name str

Cluster name, which doesn’t have to be unique.

id str

cluster ID

clusterId String
clusterInfo Property Map

Block consisting of the following fields:

clusterName String

Cluster name, which doesn’t have to be unique.

id String

cluster ID
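
As referenced above, a sketch of consuming these outputs in Python (the cluster name is a hypothetical placeholder):

import pulumi
import pulumi_databricks as databricks

this = databricks.get_cluster(cluster_name="shared-autoscaling")  # hypothetical name

# The result carries the matched cluster's ID plus the full cluster_info block.
pulumi.export("clusterId", this.cluster_id)
pulumi.export("state", this.cluster_info.state)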

Supporting Types

GetClusterClusterInfo

DefaultTags Dictionary<string, object>
DriverInstancePoolId string

Similar to instance_pool_id, but for the driver node.

SparkVersion string

Runtime version of the cluster.

State string
Autoscale GetClusterClusterInfoAutoscale
AutoterminationMinutes int

Automatically terminate the cluster after it has been inactive for this many minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.

AwsAttributes GetClusterClusterInfoAwsAttributes
AzureAttributes GetClusterClusterInfoAzureAttributes
ClusterCores double
ClusterId string

The id of the cluster

ClusterLogConf GetClusterClusterInfoClusterLogConf
ClusterLogStatus GetClusterClusterInfoClusterLogStatus
ClusterMemoryMb int
ClusterName string

The exact name of the cluster to search

ClusterSource string
CreatorUserName string
CustomTags Dictionary<string, object>

Additional tags for cluster resources.

DataSecurityMode string

Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e., no security features are enabled.

DockerImage GetClusterClusterInfoDockerImage
Driver GetClusterClusterInfoDriver
DriverNodeTypeId string

The node type of the Spark driver.

EnableElasticDisk bool

Use autoscaling local storage.

EnableLocalDiskEncryption bool

Enable local disk encryption.

Executors List<GetClusterClusterInfoExecutor>
GcpAttributes GetClusterClusterInfoGcpAttributes
InitScripts List<GetClusterClusterInfoInitScript>
InstancePoolId string

The pool of idle instances the cluster is attached to.

JdbcPort int
LastActivityTime int
LastStateLossTime int
NodeTypeId string

Any supported databricks.getNodeType id.

NumWorkers int
PolicyId string

Identifier of the cluster policy used to validate the cluster and preset certain defaults.

RuntimeEngine string

The type of runtime engine the cluster uses.

SingleUserName string

The optional user name to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not a high-concurrency cluster).

SparkConf Dictionary<string, object>

Map with key-value pairs to fine-tune Spark clusters.

SparkContextId int
SparkEnvVars Dictionary<string, object>

Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.

SshPublicKeys List<string>

SSH public key contents that will be added to each Spark node in this cluster.

StartTime int
StateMessage string
TerminateTime int
TerminationReason GetClusterClusterInfoTerminationReason
DefaultTags map[string]interface{}
DriverInstancePoolId string

Similar to instance_pool_id, but for the driver node.

SparkVersion string

Runtime version of the cluster.

State string
Autoscale GetClusterClusterInfoAutoscale
AutoterminationMinutes int

Automatically terminate the cluster after it has been inactive for this many minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.

AwsAttributes GetClusterClusterInfoAwsAttributes
AzureAttributes GetClusterClusterInfoAzureAttributes
ClusterCores float64
ClusterId string

The id of the cluster

ClusterLogConf GetClusterClusterInfoClusterLogConf
ClusterLogStatus GetClusterClusterInfoClusterLogStatus
ClusterMemoryMb int
ClusterName string

The exact name of the cluster to search

ClusterSource string
CreatorUserName string
CustomTags map[string]interface{}

Additional tags for cluster resources.

DataSecurityMode string

Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e., no security features are enabled.

DockerImage GetClusterClusterInfoDockerImage
Driver GetClusterClusterInfoDriver
DriverNodeTypeId string

The node type of the Spark driver.

EnableElasticDisk bool

Use autoscaling local storage.

EnableLocalDiskEncryption bool

Enable local disk encryption.

Executors []GetClusterClusterInfoExecutor
GcpAttributes GetClusterClusterInfoGcpAttributes
InitScripts []GetClusterClusterInfoInitScript
InstancePoolId string

The pool of idle instances the cluster is attached to.

JdbcPort int
LastActivityTime int
LastStateLossTime int
NodeTypeId string

Any supported databricks.getNodeType id.

NumWorkers int
PolicyId string

Identifier of the cluster policy used to validate the cluster and preset certain defaults.

RuntimeEngine string

The type of runtime engine the cluster uses.

SingleUserName string

The optional user name to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not a high-concurrency cluster).

SparkConf map[string]interface{}

Map with key-value pairs to fine-tune Spark clusters.

SparkContextId int
SparkEnvVars map[string]interface{}

Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.

SshPublicKeys []string

SSH public key contents that will be added to each Spark node in this cluster.

StartTime int
StateMessage string
TerminateTime int
TerminationReason GetClusterClusterInfoTerminationReason
defaultTags Map<String,Object>
driverInstancePoolId String

Similar to instance_pool_id, but for the driver node.

sparkVersion String

Runtime version of the cluster.

state String
autoscale GetClusterClusterInfoAutoscale
autoterminationMinutes Integer

Automatically terminate the cluster after it has been inactive for this many minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.

awsAttributes GetClusterClusterInfoAwsAttributes
azureAttributes GetClusterClusterInfoAzureAttributes
clusterCores Double
clusterId String

The id of the cluster

clusterLogConf GetClusterClusterInfoClusterLogConf
clusterLogStatus GetClusterClusterInfoClusterLogStatus
clusterMemoryMb Integer
clusterName String

The exact name of the cluster to search

clusterSource String
creatorUserName String
customTags Map<String,Object>

Additional tags for cluster resources.

dataSecurityMode String

Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e., no security features are enabled.

dockerImage GetClusterClusterInfoDockerImage
driver GetClusterClusterInfoDriver
driverNodeTypeId String

The node type of the Spark driver.

enableElasticDisk Boolean

Use autoscaling local storage.

enableLocalDiskEncryption Boolean

Enable local disk encryption.

executors List<GetClusterClusterInfoExecutor>
gcpAttributes GetClusterClusterInfoGcpAttributes
initScripts List<GetClusterClusterInfoInitScript>
instancePoolId String

The pool of idle instances the cluster is attached to.

jdbcPort Integer
lastActivityTime Integer
lastStateLossTime Integer
nodeTypeId String

Any supported databricks.getNodeType id.

numWorkers Integer
policyId String

Identifier of the cluster policy used to validate the cluster and preset certain defaults.

runtimeEngine String

The type of runtime engine the cluster uses.

singleUserName String

The optional user name to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not a high-concurrency cluster).

sparkConf Map<String,Object>

Map with key-value pairs to fine-tune Spark clusters.

sparkContextId Integer
sparkEnvVars Map<String,Object>

Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.

sshPublicKeys List<String>

SSH public key contents that will be added to each Spark node in this cluster.

startTime Integer
stateMessage String
terminateTime Integer
terminationReason GetClusterClusterInfoTerminationReason
defaultTags {[key: string]: any}
driverInstancePoolId string

Similar to instance_pool_id, but for the driver node.

sparkVersion string

Runtime version of the cluster.

state string
autoscale GetClusterClusterInfoAutoscale
autoterminationMinutes number

Automatically terminate the cluster after it has been inactive for this many minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.

awsAttributes GetClusterClusterInfoAwsAttributes
azureAttributes GetClusterClusterInfoAzureAttributes
clusterCores number
clusterId string

The id of the cluster

clusterLogConf GetClusterClusterInfoClusterLogConf
clusterLogStatus GetClusterClusterInfoClusterLogStatus
clusterMemoryMb number
clusterName string

The exact name of the cluster to search

clusterSource string
creatorUserName string
customTags {[key: string]: any}

Additional tags for cluster resources.

dataSecurityMode string

Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e., no security features are enabled.

dockerImage GetClusterClusterInfoDockerImage
driver GetClusterClusterInfoDriver
driverNodeTypeId string

The node type of the Spark driver.

enableElasticDisk boolean

Use autoscaling local storage.

enableLocalDiskEncryption boolean

Enable local disk encryption.

executors GetClusterClusterInfoExecutor[]
gcpAttributes GetClusterClusterInfoGcpAttributes
initScripts GetClusterClusterInfoInitScript[]
instancePoolId string

The pool of idle instances the cluster is attached to.

jdbcPort number
lastActivityTime number
lastStateLossTime number
nodeTypeId string

Any supported databricks.getNodeType id.

numWorkers number
policyId string

Identifier of the cluster policy used to validate the cluster and preset certain defaults.

runtimeEngine string

The type of runtime engine the cluster uses.

singleUserName string

The optional user name to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not a high-concurrency cluster).

sparkConf {[key: string]: any}

Map with key-value pairs to fine-tune Spark clusters.

sparkContextId number
sparkEnvVars {[key: string]: any}

Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.

sshPublicKeys string[]

SSH public key contents that will be added to each Spark node in this cluster.

startTime number
stateMessage string
terminateTime number
terminationReason GetClusterClusterInfoTerminationReason
default_tags Mapping[str, Any]
driver_instance_pool_id str

Similar to instance_pool_id, but for the driver node.

spark_version str

Runtime version of the cluster.

state str
autoscale GetClusterClusterInfoAutoscale
autotermination_minutes int

Automatically terminate the cluster after it has been inactive for this many minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.

aws_attributes GetClusterClusterInfoAwsAttributes
azure_attributes GetClusterClusterInfoAzureAttributes
cluster_cores float
cluster_id str

The id of the cluster

cluster_log_conf GetClusterClusterInfoClusterLogConf
cluster_log_status GetClusterClusterInfoClusterLogStatus
cluster_memory_mb int
cluster_name str

The exact name of the cluster to search

cluster_source str
creator_user_name str
custom_tags Mapping[str, Any]

Additional tags for cluster resources.

data_security_mode str

Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e., no security features are enabled.

docker_image GetClusterClusterInfoDockerImage
driver GetClusterClusterInfoDriver
driver_node_type_id str

The node type of the Spark driver.

enable_elastic_disk bool

Use autoscaling local storage.

enable_local_disk_encryption bool

Enable local disk encryption.

executors Sequence[GetClusterClusterInfoExecutor]
gcp_attributes GetClusterClusterInfoGcpAttributes
init_scripts Sequence[GetClusterClusterInfoInitScript]
instance_pool_id str

The pool of idle instances the cluster is attached to.

jdbc_port int
last_activity_time int
last_state_loss_time int
node_type_id str

Any supported databricks.getNodeType id.

num_workers int
policy_id str

Identifier of the cluster policy used to validate the cluster and preset certain defaults.

runtime_engine str

The type of runtime engine the cluster uses.

single_user_name str

The optional user name to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not a high-concurrency cluster).

spark_conf Mapping[str, Any]

Map with key-value pairs to fine-tune Spark clusters.

spark_context_id int
spark_env_vars Mapping[str, Any]

Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.

ssh_public_keys Sequence[str]

SSH public key contents that will be added to each Spark node in this cluster.

start_time int
state_message str
terminate_time int
termination_reason GetClusterClusterInfoTerminationReason
defaultTags Map<Any>
driverInstancePoolId String

Similar to instance_pool_id, but for the driver node.

sparkVersion String

Runtime version of the cluster.

state String
autoscale Property Map
autoterminationMinutes Number

Automatically terminate the cluster after it has been inactive for this many minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.

awsAttributes Property Map
azureAttributes Property Map
clusterCores Number
clusterId String

The id of the cluster

clusterLogConf Property Map
clusterLogStatus Property Map
clusterMemoryMb Number
clusterName String

The exact name of the cluster to search

clusterSource String
creatorUserName String
customTags Map<Any>

Additional tags for cluster resources.

dataSecurityMode String

Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e., no security features are enabled.

dockerImage Property Map
driver Property Map
driverNodeTypeId String

The node type of the Spark driver.

enableElasticDisk Boolean

Use autoscaling local storage.

enableLocalDiskEncryption Boolean

Enable local disk encryption.

executors List<Property Map>
gcpAttributes Property Map
initScripts List<Property Map>
instancePoolId String

The pool of idle instances the cluster is attached to.

jdbcPort Number
lastActivityTime Number
lastStateLossTime Number
nodeTypeId String

Any supported databricks.getNodeType id.

numWorkers Number
policyId String

Identifier of the cluster policy used to validate the cluster and preset certain defaults.

runtimeEngine String

The type of runtime engine the cluster uses.

singleUserName String

The optional user name to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not a high-concurrency cluster).

sparkConf Map<Any>

Map with key-value pairs to fine-tune Spark clusters.

sparkContextId Number
sparkEnvVars Map<Any>

Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.

sshPublicKeys List<String>

SSH public key contents that will be added to each Spark node in this cluster.

startTime Number
stateMessage String
terminateTime Number
terminationReason Property Map
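
A short Python sketch reading a few of these nested fields (the cluster name is a hypothetical placeholder; optional blocks such as autoscale may be unset):

import pulumi
import pulumi_databricks as databricks

info = databricks.get_cluster(cluster_name="shared-autoscaling").cluster_info

# Optional nested blocks come back as None when unset, so guard before use.
if info.autoscale is not None:
    pulumi.export("maxWorkers", info.autoscale.max_workers)
pulumi.export("autoterminationMinutes", info.autotermination_minutes)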

GetClusterClusterInfoAutoscale

maxWorkers Integer
minWorkers Integer
maxWorkers number
minWorkers number
maxWorkers Number
minWorkers Number

GetClusterClusterInfoAwsAttributes

GetClusterClusterInfoAzureAttributes

GetClusterClusterInfoClusterLogConf

GetClusterClusterInfoClusterLogConfDbfs

GetClusterClusterInfoClusterLogConfS3

Destination string
CannedAcl string
EnableEncryption bool
EncryptionType string
Endpoint string
KmsKey string
Region string
Destination string
CannedAcl string
EnableEncryption bool
EncryptionType string
Endpoint string
KmsKey string
Region string
destination String
cannedAcl String
enableEncryption Boolean
encryptionType String
endpoint String
kmsKey String
region String
destination string
cannedAcl string
enableEncryption boolean
encryptionType string
endpoint string
kmsKey string
region string
destination String
cannedAcl String
enableEncryption Boolean
encryptionType String
endpoint String
kmsKey String
region String

GetClusterClusterInfoClusterLogStatus

GetClusterClusterInfoDockerImage

GetClusterClusterInfoDockerImageBasicAuth

Password string
Username string
Password string
Username string
password String
username String
password string
username string
password String
username String

GetClusterClusterInfoDriver

GetClusterClusterInfoDriverNodeAwsAttributes

IsSpot bool
IsSpot bool
isSpot Boolean
isSpot boolean
is_spot bool
isSpot Boolean

GetClusterClusterInfoExecutor

GetClusterClusterInfoExecutorNodeAwsAttributes

IsSpot bool
IsSpot bool
isSpot Boolean
isSpot boolean
is_spot bool
isSpot Boolean

GetClusterClusterInfoGcpAttributes

GetClusterClusterInfoInitScript

GetClusterClusterInfoInitScriptAbfss

GetClusterClusterInfoInitScriptDbfs

GetClusterClusterInfoInitScriptFile

GetClusterClusterInfoInitScriptGcs

GetClusterClusterInfoInitScriptS3

Destination string
CannedAcl string
EnableEncryption bool
EncryptionType string
Endpoint string
KmsKey string
Region string
Destination string
CannedAcl string
EnableEncryption bool
EncryptionType string
Endpoint string
KmsKey string
Region string
destination String
cannedAcl String
enableEncryption Boolean
encryptionType String
endpoint String
kmsKey String
region String
destination string
cannedAcl string
enableEncryption boolean
encryptionType string
endpoint string
kmsKey string
region string
destination String
cannedAcl String
enableEncryption Boolean
encryptionType String
endpoint String
kmsKey String
region String

GetClusterClusterInfoTerminationReason

Code string
Parameters Dictionary<string, object>
Type string
Code string
Parameters map[string]interface{}
Type string
code String
parameters Map<String,Object>
type String
code string
parameters {[key: string]: any}
type string
code str
parameters Mapping[str, Any]
type str
code String
parameters Map<Any>
type String
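
A sketch of inspecting this block in Python (hypothetical cluster name; the termination reason is only populated once a cluster has terminated):

import pulumi
import pulumi_databricks as databricks

info = databricks.get_cluster(cluster_name="shared-autoscaling").cluster_info

# Guard against clusters that are still running.
if info.termination_reason is not None:
    pulumi.export("terminationCode", info.termination_reason.code)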

Package Details

Repository
databricks pulumi/pulumi-databricks
License
Apache-2.0
Notes

This Pulumi package is based on the databricks Terraform Provider.