databricks.getCluster
Note: If you have a fully automated setup with workspaces created by databricks_mws_workspaces, please make sure to add the depends_on attribute in order to prevent "default auth: cannot configure default credentials" errors.
Retrieves information about a databricks.Cluster using its id. The id can be retrieved programmatically using the databricks.getClusters data source.
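For the fully automated setup mentioned in the note above, a minimal TypeScript sketch of the depends_on wiring could look like this (the workspace variable, provider name, and cluster name are illustrative assumptions, not part of this data source's API):

import * as pulumi from "@pulumi/pulumi";
import * as databricks from "@pulumi/databricks";

// Assume a workspace created elsewhere in the program by databricks_mws_workspaces.
declare const workspace: databricks.MwsWorkspaces;

// Instantiate the workspace-level provider only after the workspace exists, so
// lookups such as getCluster do not fail with "default auth: cannot configure
// default credentials" while the workspace is still being created.
const workspaceProvider = new databricks.Provider("workspace", {
    host: workspace.workspaceUrl,
}, { dependsOn: [workspace] });

// Route the lookup through the dependent provider.
const cluster = databricks.getClusterOutput({
    clusterName: "shared-autoscaling",
}, { provider: workspaceProvider });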
Related Resources
The following resources are often used in the same context:
- End-to-end workspace management guide.
- databricks.Cluster to create Databricks Clusters.
- databricks.ClusterPolicy to create a databricks.Cluster policy, which limits the ability to create clusters based on a set of rules.
- databricks.InstancePool to manage instance pools to reduce cluster start and auto-scaling times by maintaining a set of idle, ready-to-use instances.
- databricks.Job to manage Databricks Jobs to run non-interactive code in a databricks_cluster.
- databricks.Library to install a library on a databricks_cluster.
- databricks.Pipeline to deploy Delta Live Tables.
Example Usage
Retrieve attributes of each cluster in a workspace
using System.Collections.Generic;
using System.Linq;
using Pulumi;
using Databricks = Pulumi.Databricks;

return await Deployment.RunAsync(() =>
{
    // List the ids of all clusters in the workspace.
    var allClusters = Databricks.GetClusters.Invoke();

    // Look up each cluster by its id.
    var allCluster = allClusters.Apply(clusters => clusters.Ids.Select(clusterId =>
        Databricks.GetCluster.Invoke(new()
        {
            ClusterId = clusterId,
        })).ToList());
});
package main

import (
	"github.com/pulumi/pulumi-databricks/sdk/go/databricks"
	"github.com/pulumi/pulumi/sdk/v3/go/pulumi"
)

func main() {
	pulumi.Run(func(ctx *pulumi.Context) error {
		// List the ids of all clusters in the workspace.
		allClusters, err := databricks.GetClusters(ctx, nil, nil)
		if err != nil {
			return err
		}
		// Look up each cluster by its id.
		for _, clusterId := range allClusters.Ids {
			_, err := databricks.LookupCluster(ctx, &databricks.LookupClusterArgs{
				ClusterId: pulumi.StringRef(clusterId),
			}, nil)
			if err != nil {
				return err
			}
		}
		return nil
	})
}
Java example coming soon!
import pulumi
import pulumi_databricks as databricks

# List the ids of all clusters in the workspace.
all_clusters = databricks.get_clusters()
# Look up each cluster by its id.
all_cluster = [databricks.get_cluster(cluster_id=cluster_id) for cluster_id in all_clusters.ids]
import * as pulumi from "@pulumi/pulumi";
import * as databricks from "@pulumi/databricks";

// List the ids of all clusters in the workspace.
const allClusters = databricks.getClusters({});
// Look up each cluster by its id.
const allCluster = allClusters.then(clusters => clusters.ids.map(clusterId => databricks.getCluster({
    clusterId: clusterId,
})));
YAML example coming soon!
Using getCluster
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
function getCluster(args: GetClusterArgs, opts?: InvokeOptions): Promise<GetClusterResult>
function getClusterOutput(args: GetClusterOutputArgs, opts?: InvokeOptions): Output<GetClusterResult>
def get_cluster(cluster_id: Optional[str] = None,
cluster_info: Optional[GetClusterClusterInfo] = None,
cluster_name: Optional[str] = None,
id: Optional[str] = None,
opts: Optional[InvokeOptions] = None) -> GetClusterResult
def get_cluster_output(cluster_id: Optional[pulumi.Input[str]] = None,
cluster_info: Optional[pulumi.Input[GetClusterClusterInfoArgs]] = None,
cluster_name: Optional[pulumi.Input[str]] = None,
id: Optional[pulumi.Input[str]] = None,
opts: Optional[InvokeOptions] = None) -> Output[GetClusterResult]
func LookupCluster(ctx *Context, args *LookupClusterArgs, opts ...InvokeOption) (*LookupClusterResult, error)
func LookupClusterOutput(ctx *Context, args *LookupClusterOutputArgs, opts ...InvokeOption) LookupClusterResultOutput
> Note: This function is named LookupCluster in the Go SDK.
public static class GetCluster
{
public static Task<GetClusterResult> InvokeAsync(GetClusterArgs args, InvokeOptions? opts = null)
public static Output<GetClusterResult> Invoke(GetClusterInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetClusterResult> getCluster(GetClusterArgs args, InvokeOptions options)
// Output-based functions aren't available in Java yet
fn::invoke:
function: databricks:index/getCluster:getCluster
arguments:
# arguments dictionary
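For example, in TypeScript (the cluster name is a placeholder):

import * as databricks from "@pulumi/databricks";

// Direct form: plain arguments, Promise-wrapped result.
const byName = databricks.getCluster({ clusterName: "shared-autoscaling" });

// Output form: Input-wrapped arguments, Output-wrapped result; useful when the
// arguments come from other resources' outputs and are not yet known.
const byNameOutput = databricks.getClusterOutput({ clusterName: "shared-autoscaling" });
export const clusterId = byNameOutput.clusterId;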
The following arguments are supported:
C#:
- ClusterId (string): The id of the cluster.
- ClusterInfo (GetClusterClusterInfo): block consisting of the fields documented in GetClusterClusterInfo below.
- ClusterName (string): The exact name of the cluster to search.
- Id (string): cluster ID.
Go:
- ClusterId (string): The id of the cluster.
- ClusterInfo (GetClusterClusterInfo): block consisting of the fields documented in GetClusterClusterInfo below.
- ClusterName (string): The exact name of the cluster to search.
- Id (string): cluster ID.
Java:
- clusterId (String): The id of the cluster.
- clusterInfo (GetClusterClusterInfo): block consisting of the fields documented in GetClusterClusterInfo below.
- clusterName (String): The exact name of the cluster to search.
- id (String): cluster ID.
TypeScript:
- clusterId (string): The id of the cluster.
- clusterInfo (GetClusterClusterInfo): block consisting of the fields documented in GetClusterClusterInfo below.
- clusterName (string): The exact name of the cluster to search.
- id (string): cluster ID.
Python:
- cluster_id (str): The id of the cluster.
- cluster_info (GetClusterClusterInfo): block consisting of the fields documented in GetClusterClusterInfo below.
- cluster_name (str): The exact name of the cluster to search.
- id (str): cluster ID.
YAML:
- clusterId (String): The id of the cluster.
- clusterInfo (Property Map): block consisting of the fields documented in GetClusterClusterInfo below.
- clusterName (String): The exact name of the cluster to search.
- id (String): cluster ID.
getCluster Result
The following output properties are available:
C#:
- ClusterId (string)
- ClusterInfo (GetClusterClusterInfo): block consisting of the fields documented in GetClusterClusterInfo below.
- ClusterName (string): Cluster name, which doesn't have to be unique.
- Id (string): cluster ID.
Go:
- ClusterId (string)
- ClusterInfo (GetClusterClusterInfo): block consisting of the fields documented in GetClusterClusterInfo below.
- ClusterName (string): Cluster name, which doesn't have to be unique.
- Id (string): cluster ID.
Java:
- clusterId (String)
- clusterInfo (GetClusterClusterInfo): block consisting of the fields documented in GetClusterClusterInfo below.
- clusterName (String): Cluster name, which doesn't have to be unique.
- id (String): cluster ID.
TypeScript:
- clusterId (string)
- clusterInfo (GetClusterClusterInfo): block consisting of the fields documented in GetClusterClusterInfo below.
- clusterName (string): Cluster name, which doesn't have to be unique.
- id (string): cluster ID.
Python:
- cluster_id (str)
- cluster_info (GetClusterClusterInfo): block consisting of the fields documented in GetClusterClusterInfo below.
- cluster_name (str): Cluster name, which doesn't have to be unique.
- id (str): cluster ID.
YAML:
- clusterId (String)
- clusterInfo (Property Map): block consisting of the fields documented in GetClusterClusterInfo below.
- clusterName (String): Cluster name, which doesn't have to be unique.
- id (String): cluster ID.
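As a short TypeScript illustration, nested attributes can be read from the clusterInfo output property, whose fields are documented under Supporting Types below (the cluster name is a placeholder):

import * as databricks from "@pulumi/databricks";

const cluster = databricks.getClusterOutput({ clusterName: "shared-autoscaling" });

// clusterInfo carries the cluster's runtime details.
export const sparkVersion = cluster.clusterInfo.apply(info => info.sparkVersion);
export const numWorkers = cluster.clusterInfo.apply(info => info.numWorkers);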
Supporting Types
GetClusterClusterInfo
C#:
- DefaultTags (Dictionary<string, object>)
- DriverInstancePoolId (string): similar to instance_pool_id, but for driver node.
- SparkVersion (string): Runtime version of the cluster.
- State (string)
- Autoscale (GetClusterClusterInfoAutoscale)
- AutoterminationMinutes (int): Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- AwsAttributes (GetClusterClusterInfoAwsAttributes)
- AzureAttributes (GetClusterClusterInfoAzureAttributes)
- ClusterCores (double)
- ClusterId (string): The id of the cluster.
- ClusterLogConf (GetClusterClusterInfoClusterLogConf)
- ClusterLogStatus (GetClusterClusterInfoClusterLogStatus)
- ClusterMemoryMb (int)
- ClusterName (string): The exact name of the cluster to search.
- ClusterSource (string)
- CreatorUserName (string)
- CustomTags (Dictionary<string, object>): Additional tags for cluster resources.
- DataSecurityMode (string): Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Defaults to NONE, i.e. no security feature enabled.
- DockerImage (GetClusterClusterInfoDockerImage)
- Driver (GetClusterClusterInfoDriver)
- DriverNodeTypeId (string): The node type of the Spark driver.
- EnableElasticDisk (bool): Use autoscaling local storage.
- EnableLocalDiskEncryption (bool): Enable local disk encryption.
- Executors (List<GetClusterClusterInfoExecutor>)
- GcpAttributes (GetClusterClusterInfoGcpAttributes)
- InitScripts (List<GetClusterClusterInfoInitScript>)
- InstancePoolId (string): The pool of idle instances the cluster is attached to.
- JdbcPort (int)
- LastActivityTime (int)
- LastStateLossTime (int)
- NodeTypeId (string): Any supported databricks.getNodeType id.
- NumWorkers (int)
- PolicyId (string): Identifier of Cluster Policy to validate cluster and preset certain defaults.
- RuntimeEngine (string): The type of runtime of the cluster.
- SingleUserName (string): The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- SparkConf (Dictionary<string, object>): Map with key-value pairs to fine-tune Spark clusters.
- SparkContextId (int)
- SparkEnvVars (Dictionary<string, object>): Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- SshPublicKeys (List<string>): SSH public key contents that will be added to each Spark node in this cluster.
- StartTime (int)
- StateMessage (string)
- TerminateTime (int)
- TerminationReason (GetClusterClusterInfoTerminationReason)
Go:
- DefaultTags (map[string]interface{})
- DriverInstancePoolId (string): similar to instance_pool_id, but for driver node.
- SparkVersion (string): Runtime version of the cluster.
- State (string)
- Autoscale (GetClusterClusterInfoAutoscale)
- AutoterminationMinutes (int): Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- AwsAttributes (GetClusterClusterInfoAwsAttributes)
- AzureAttributes (GetClusterClusterInfoAzureAttributes)
- ClusterCores (float64)
- ClusterId (string): The id of the cluster.
- ClusterLogConf (GetClusterClusterInfoClusterLogConf)
- ClusterLogStatus (GetClusterClusterInfoClusterLogStatus)
- ClusterMemoryMb (int)
- ClusterName (string): The exact name of the cluster to search.
- ClusterSource (string)
- CreatorUserName (string)
- CustomTags (map[string]interface{}): Additional tags for cluster resources.
- DataSecurityMode (string): Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Defaults to NONE, i.e. no security feature enabled.
- DockerImage (GetClusterClusterInfoDockerImage)
- Driver (GetClusterClusterInfoDriver)
- DriverNodeTypeId (string): The node type of the Spark driver.
- EnableElasticDisk (bool): Use autoscaling local storage.
- EnableLocalDiskEncryption (bool): Enable local disk encryption.
- Executors ([]GetClusterClusterInfoExecutor)
- GcpAttributes (GetClusterClusterInfoGcpAttributes)
- InitScripts ([]GetClusterClusterInfoInitScript)
- InstancePoolId (string): The pool of idle instances the cluster is attached to.
- JdbcPort (int)
- LastActivityTime (int)
- LastStateLossTime (int)
- NodeTypeId (string): Any supported databricks.getNodeType id.
- NumWorkers (int)
- PolicyId (string): Identifier of Cluster Policy to validate cluster and preset certain defaults.
- RuntimeEngine (string): The type of runtime of the cluster.
- SingleUserName (string): The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- SparkConf (map[string]interface{}): Map with key-value pairs to fine-tune Spark clusters.
- SparkContextId (int)
- SparkEnvVars (map[string]interface{}): Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- SshPublicKeys ([]string): SSH public key contents that will be added to each Spark node in this cluster.
- StartTime (int)
- StateMessage (string)
- TerminateTime (int)
- TerminationReason (GetClusterClusterInfoTerminationReason)
Java:
- defaultTags (Map<String,Object>)
- driverInstancePoolId (String): similar to instance_pool_id, but for driver node.
- sparkVersion (String): Runtime version of the cluster.
- state (String)
- autoscale (GetClusterClusterInfoAutoscale)
- autoterminationMinutes (Integer): Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- awsAttributes (GetClusterClusterInfoAwsAttributes)
- azureAttributes (GetClusterClusterInfoAzureAttributes)
- clusterCores (Double)
- clusterId (String): The id of the cluster.
- clusterLogConf (GetClusterClusterInfoClusterLogConf)
- clusterLogStatus (GetClusterClusterInfoClusterLogStatus)
- clusterMemoryMb (Integer)
- clusterName (String): The exact name of the cluster to search.
- clusterSource (String)
- creatorUserName (String)
- customTags (Map<String,Object>): Additional tags for cluster resources.
- dataSecurityMode (String): Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Defaults to NONE, i.e. no security feature enabled.
- dockerImage (GetClusterClusterInfoDockerImage)
- driver (GetClusterClusterInfoDriver)
- driverNodeTypeId (String): The node type of the Spark driver.
- enableElasticDisk (Boolean): Use autoscaling local storage.
- enableLocalDiskEncryption (Boolean): Enable local disk encryption.
- executors (List<GetClusterClusterInfoExecutor>)
- gcpAttributes (GetClusterClusterInfoGcpAttributes)
- initScripts (List<GetClusterClusterInfoInitScript>)
- instancePoolId (String): The pool of idle instances the cluster is attached to.
- jdbcPort (Integer)
- lastActivityTime (Integer)
- lastStateLossTime (Integer)
- nodeTypeId (String): Any supported databricks.getNodeType id.
- numWorkers (Integer)
- policyId (String): Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtimeEngine (String): The type of runtime of the cluster.
- singleUserName (String): The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf (Map<String,Object>): Map with key-value pairs to fine-tune Spark clusters.
- sparkContextId (Integer)
- sparkEnvVars (Map<String,Object>): Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sshPublicKeys (List<String>): SSH public key contents that will be added to each Spark node in this cluster.
- startTime (Integer)
- stateMessage (String)
- terminateTime (Integer)
- terminationReason (GetClusterClusterInfoTerminationReason)
TypeScript:
- defaultTags ({[key: string]: any})
- driverInstancePoolId (string): similar to instance_pool_id, but for driver node.
- sparkVersion (string): Runtime version of the cluster.
- state (string)
- autoscale (GetClusterClusterInfoAutoscale)
- autoterminationMinutes (number): Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- awsAttributes (GetClusterClusterInfoAwsAttributes)
- azureAttributes (GetClusterClusterInfoAzureAttributes)
- clusterCores (number)
- clusterId (string): The id of the cluster.
- clusterLogConf (GetClusterClusterInfoClusterLogConf)
- clusterLogStatus (GetClusterClusterInfoClusterLogStatus)
- clusterMemoryMb (number)
- clusterName (string): The exact name of the cluster to search.
- clusterSource (string)
- creatorUserName (string)
- customTags ({[key: string]: any}): Additional tags for cluster resources.
- dataSecurityMode (string): Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Defaults to NONE, i.e. no security feature enabled.
- dockerImage (GetClusterClusterInfoDockerImage)
- driver (GetClusterClusterInfoDriver)
- driverNodeTypeId (string): The node type of the Spark driver.
- enableElasticDisk (boolean): Use autoscaling local storage.
- enableLocalDiskEncryption (boolean): Enable local disk encryption.
- executors (GetClusterClusterInfoExecutor[])
- gcpAttributes (GetClusterClusterInfoGcpAttributes)
- initScripts (GetClusterClusterInfoInitScript[])
- instancePoolId (string): The pool of idle instances the cluster is attached to.
- jdbcPort (number)
- lastActivityTime (number)
- lastStateLossTime (number)
- nodeTypeId (string): Any supported databricks.getNodeType id.
- numWorkers (number)
- policyId (string): Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtimeEngine (string): The type of runtime of the cluster.
- singleUserName (string): The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf ({[key: string]: any}): Map with key-value pairs to fine-tune Spark clusters.
- sparkContextId (number)
- sparkEnvVars ({[key: string]: any}): Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sshPublicKeys (string[]): SSH public key contents that will be added to each Spark node in this cluster.
- startTime (number)
- stateMessage (string)
- terminateTime (number)
- terminationReason (GetClusterClusterInfoTerminationReason)
Python:
- default_tags (Mapping[str, Any])
- driver_instance_pool_id (str): similar to instance_pool_id, but for driver node.
- spark_version (str): Runtime version of the cluster.
- state (str)
- autoscale (GetClusterClusterInfoAutoscale)
- autotermination_minutes (int): Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- aws_attributes (GetClusterClusterInfoAwsAttributes)
- azure_attributes (GetClusterClusterInfoAzureAttributes)
- cluster_cores (float)
- cluster_id (str): The id of the cluster.
- cluster_log_conf (GetClusterClusterInfoClusterLogConf)
- cluster_log_status (GetClusterClusterInfoClusterLogStatus)
- cluster_memory_mb (int)
- cluster_name (str): The exact name of the cluster to search.
- cluster_source (str)
- creator_user_name (str)
- custom_tags (Mapping[str, Any]): Additional tags for cluster resources.
- data_security_mode (str): Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Defaults to NONE, i.e. no security feature enabled.
- docker_image (GetClusterClusterInfoDockerImage)
- driver (GetClusterClusterInfoDriver)
- driver_node_type_id (str): The node type of the Spark driver.
- enable_elastic_disk (bool): Use autoscaling local storage.
- enable_local_disk_encryption (bool): Enable local disk encryption.
- executors (Sequence[GetClusterClusterInfoExecutor])
- gcp_attributes (GetClusterClusterInfoGcpAttributes)
- init_scripts (Sequence[GetClusterClusterInfoInitScript])
- instance_pool_id (str): The pool of idle instances the cluster is attached to.
- jdbc_port (int)
- last_activity_time (int)
- last_state_loss_time (int)
- node_type_id (str): Any supported databricks.getNodeType id.
- num_workers (int)
- policy_id (str): Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtime_engine (str): The type of runtime of the cluster.
- single_user_name (str): The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- spark_conf (Mapping[str, Any]): Map with key-value pairs to fine-tune Spark clusters.
- spark_context_id (int)
- spark_env_vars (Mapping[str, Any]): Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- ssh_public_keys (Sequence[str]): SSH public key contents that will be added to each Spark node in this cluster.
- start_time (int)
- state_message (str)
- terminate_time (int)
- termination_reason (GetClusterClusterInfoTerminationReason)
YAML:
- defaultTags (Map<Any>)
- driverInstancePoolId (String): similar to instance_pool_id, but for driver node.
- sparkVersion (String): Runtime version of the cluster.
- state (String)
- autoscale (Property Map)
- autoterminationMinutes (Number): Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
- awsAttributes (Property Map)
- azureAttributes (Property Map)
- clusterCores (Number)
- clusterId (String): The id of the cluster.
- clusterLogConf (Property Map)
- clusterLogStatus (Property Map)
- clusterMemoryMb (Number)
- clusterName (String): The exact name of the cluster to search.
- clusterSource (String)
- creatorUserName (String)
- customTags (Map<Any>): Additional tags for cluster resources.
- dataSecurityMode (String): Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH for passthrough cluster and LEGACY_TABLE_ACL for Table ACL cluster. Defaults to NONE, i.e. no security feature enabled.
- dockerImage (Property Map)
- driver (Property Map)
- driverNodeTypeId (String): The node type of the Spark driver.
- enableElasticDisk (Boolean): Use autoscaling local storage.
- enableLocalDiskEncryption (Boolean): Enable local disk encryption.
- executors (List<Property Map>)
- gcpAttributes (Property Map)
- initScripts (List<Property Map>)
- instancePoolId (String): The pool of idle instances the cluster is attached to.
- jdbcPort (Number)
- lastActivityTime (Number)
- lastStateLossTime (Number)
- nodeTypeId (String): Any supported databricks.getNodeType id.
- numWorkers (Number)
- policyId (String): Identifier of Cluster Policy to validate cluster and preset certain defaults.
- runtimeEngine (String): The type of runtime of the cluster.
- singleUserName (String): The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
- sparkConf (Map<Any>): Map with key-value pairs to fine-tune Spark clusters.
- sparkContextId (Number)
- sparkEnvVars (Map<Any>): Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
- sshPublicKeys (List<String>): SSH public key contents that will be added to each Spark node in this cluster.
- startTime (Number)
- stateMessage (String)
- terminateTime (Number)
- terminationReason (Property Map)
GetClusterClusterInfoAutoscale
C#:
- MaxWorkers (int)
- MinWorkers (int)
Go:
- MaxWorkers (int)
- MinWorkers (int)
Java:
- maxWorkers (Integer)
- minWorkers (Integer)
TypeScript:
- maxWorkers (number)
- minWorkers (number)
Python:
- max_workers (int)
- min_workers (int)
YAML:
- maxWorkers (Number)
- minWorkers (Number)
GetClusterClusterInfoAwsAttributes
C#:
- Availability (string)
- EbsVolumeCount (int)
- EbsVolumeSize (int)
- EbsVolumeType (string)
- FirstOnDemand (int)
- InstanceProfileArn (string)
- SpotBidPricePercent (int)
- ZoneId (string)
Go:
- Availability (string)
- EbsVolumeCount (int)
- EbsVolumeSize (int)
- EbsVolumeType (string)
- FirstOnDemand (int)
- InstanceProfileArn (string)
- SpotBidPricePercent (int)
- ZoneId (string)
Java:
- availability (String)
- ebsVolumeCount (Integer)
- ebsVolumeSize (Integer)
- ebsVolumeType (String)
- firstOnDemand (Integer)
- instanceProfileArn (String)
- spotBidPricePercent (Integer)
- zoneId (String)
TypeScript:
- availability (string)
- ebsVolumeCount (number)
- ebsVolumeSize (number)
- ebsVolumeType (string)
- firstOnDemand (number)
- instanceProfileArn (string)
- spotBidPricePercent (number)
- zoneId (string)
Python:
- availability (str)
- ebs_volume_count (int)
- ebs_volume_size (int)
- ebs_volume_type (str)
- first_on_demand (int)
- instance_profile_arn (str)
- spot_bid_price_percent (int)
- zone_id (str)
YAML:
- availability (String)
- ebsVolumeCount (Number)
- ebsVolumeSize (Number)
- ebsVolumeType (String)
- firstOnDemand (Number)
- instanceProfileArn (String)
- spotBidPricePercent (Number)
- zoneId (String)
GetClusterClusterInfoAzureAttributes
C#:
- Availability (string)
- FirstOnDemand (int)
- SpotBidMaxPrice (double)
Go:
- Availability (string)
- FirstOnDemand (int)
- SpotBidMaxPrice (float64)
Java:
- availability (String)
- firstOnDemand (Integer)
- spotBidMaxPrice (Double)
TypeScript:
- availability (string)
- firstOnDemand (number)
- spotBidMaxPrice (number)
Python:
- availability (str)
- first_on_demand (int)
- spot_bid_max_price (float)
YAML:
- availability (String)
- firstOnDemand (Number)
- spotBidMaxPrice (Number)
GetClusterClusterInfoClusterLogConf
GetClusterClusterInfoClusterLogConfDbfs
C#:
- Destination (string)
Go:
- Destination (string)
Java:
- destination (String)
TypeScript:
- destination (string)
Python:
- destination (str)
YAML:
- destination (String)
GetClusterClusterInfoClusterLogConfS3
C#:
- Destination (string)
- CannedAcl (string)
- EnableEncryption (bool)
- EncryptionType (string)
- Endpoint (string)
- KmsKey (string)
- Region (string)
Go:
- Destination (string)
- CannedAcl (string)
- EnableEncryption (bool)
- EncryptionType (string)
- Endpoint (string)
- KmsKey (string)
- Region (string)
Java:
- destination (String)
- cannedAcl (String)
- enableEncryption (Boolean)
- encryptionType (String)
- endpoint (String)
- kmsKey (String)
- region (String)
TypeScript:
- destination (string)
- cannedAcl (string)
- enableEncryption (boolean)
- encryptionType (string)
- endpoint (string)
- kmsKey (string)
- region (string)
Python:
- destination (str)
- canned_acl (str)
- enable_encryption (bool)
- encryption_type (str)
- endpoint (str)
- kms_key (str)
- region (str)
YAML:
- destination (String)
- cannedAcl (String)
- enableEncryption (Boolean)
- encryptionType (String)
- endpoint (String)
- kmsKey (String)
- region (String)
GetClusterClusterInfoClusterLogStatus
C#:
- LastAttempted (int)
- LastException (string)
Go:
- LastAttempted (int)
- LastException (string)
Java:
- lastAttempted (Integer)
- lastException (String)
TypeScript:
- lastAttempted (number)
- lastException (string)
Python:
- last_attempted (int)
- last_exception (str)
YAML:
- lastAttempted (Number)
- lastException (String)
GetClusterClusterInfoDockerImage
YAML:
- url (String)
- basicAuth (Property Map)
GetClusterClusterInfoDockerImageBasicAuth
GetClusterClusterInfoDriver
C#:
- HostPrivateIp (string)
- InstanceId (string)
- NodeAwsAttributes (GetClusterClusterInfoDriverNodeAwsAttributes)
- NodeId (string)
- PrivateIp (string)
- PublicDns (string)
- StartTimestamp (int)
Go:
- HostPrivateIp (string)
- InstanceId (string)
- NodeAwsAttributes (GetClusterClusterInfoDriverNodeAwsAttributes)
- NodeId (string)
- PrivateIp (string)
- PublicDns (string)
- StartTimestamp (int)
Java:
- hostPrivateIp (String)
- instanceId (String)
- nodeAwsAttributes (GetClusterClusterInfoDriverNodeAwsAttributes)
- nodeId (String)
- privateIp (String)
- publicDns (String)
- startTimestamp (Integer)
TypeScript:
- hostPrivateIp (string)
- instanceId (string)
- nodeAwsAttributes (GetClusterClusterInfoDriverNodeAwsAttributes)
- nodeId (string)
- privateIp (string)
- publicDns (string)
- startTimestamp (number)
YAML:
- hostPrivateIp (String)
- instanceId (String)
- nodeAwsAttributes (Property Map)
- nodeId (String)
- privateIp (String)
- publicDns (String)
- startTimestamp (Number)
GetClusterClusterInfoDriverNodeAwsAttributes
C#:
- IsSpot (bool)
Go:
- IsSpot (bool)
Java:
- isSpot (Boolean)
TypeScript:
- isSpot (boolean)
Python:
- is_spot (bool)
YAML:
- isSpot (Boolean)
GetClusterClusterInfoExecutor
C#:
- HostPrivateIp (string)
- InstanceId (string)
- NodeAwsAttributes (GetClusterClusterInfoExecutorNodeAwsAttributes)
- NodeId (string)
- PrivateIp (string)
- PublicDns (string)
- StartTimestamp (int)
Go:
- HostPrivateIp (string)
- InstanceId (string)
- NodeAwsAttributes (GetClusterClusterInfoExecutorNodeAwsAttributes)
- NodeId (string)
- PrivateIp (string)
- PublicDns (string)
- StartTimestamp (int)
Java:
- hostPrivateIp (String)
- instanceId (String)
- nodeAwsAttributes (GetClusterClusterInfoExecutorNodeAwsAttributes)
- nodeId (String)
- privateIp (String)
- publicDns (String)
- startTimestamp (Integer)
TypeScript:
- hostPrivateIp (string)
- instanceId (string)
- nodeAwsAttributes (GetClusterClusterInfoExecutorNodeAwsAttributes)
- nodeId (string)
- privateIp (string)
- publicDns (string)
- startTimestamp (number)
YAML:
- hostPrivateIp (String)
- instanceId (String)
- nodeAwsAttributes (Property Map)
- nodeId (String)
- privateIp (String)
- publicDns (String)
- startTimestamp (Number)
GetClusterClusterInfoExecutorNodeAwsAttributes
C#:
- IsSpot (bool)
Go:
- IsSpot (bool)
Java:
- isSpot (Boolean)
TypeScript:
- isSpot (boolean)
Python:
- is_spot (bool)
YAML:
- isSpot (Boolean)
GetClusterClusterInfoGcpAttributes
C#:
- Availability (string)
- BootDiskSize (int)
- GoogleServiceAccount (string)
- LocalSsdCount (int)
- UsePreemptibleExecutors (bool)
- ZoneId (string)
Go:
- Availability (string)
- BootDiskSize (int)
- GoogleServiceAccount (string)
- LocalSsdCount (int)
- UsePreemptibleExecutors (bool)
- ZoneId (string)
Java:
- availability (String)
- bootDiskSize (Integer)
- googleServiceAccount (String)
- localSsdCount (Integer)
- usePreemptibleExecutors (Boolean)
- zoneId (String)
TypeScript:
- availability (string)
- bootDiskSize (number)
- googleServiceAccount (string)
- localSsdCount (number)
- usePreemptibleExecutors (boolean)
- zoneId (string)
Python:
- availability (str)
- boot_disk_size (int)
- google_service_account (str)
- local_ssd_count (int)
- use_preemptible_executors (bool)
- zone_id (str)
YAML:
- availability (String)
- bootDiskSize (Number)
- googleServiceAccount (String)
- localSsdCount (Number)
- usePreemptibleExecutors (Boolean)
- zoneId (String)
GetClusterClusterInfoInitScript
GetClusterClusterInfoInitScriptAbfss
C#:
- Destination (string)
Go:
- Destination (string)
Java:
- destination (String)
TypeScript:
- destination (string)
Python:
- destination (str)
YAML:
- destination (String)
GetClusterClusterInfoInitScriptDbfs
C#:
- Destination (string)
Go:
- Destination (string)
Java:
- destination (String)
TypeScript:
- destination (string)
Python:
- destination (str)
YAML:
- destination (String)
GetClusterClusterInfoInitScriptFile
C#:
- Destination (string)
Go:
- Destination (string)
Java:
- destination (String)
TypeScript:
- destination (string)
Python:
- destination (str)
YAML:
- destination (String)
GetClusterClusterInfoInitScriptGcs
C#:
- Destination (string)
Go:
- Destination (string)
Java:
- destination (String)
TypeScript:
- destination (string)
Python:
- destination (str)
YAML:
- destination (String)
GetClusterClusterInfoInitScriptS3
C#:
- Destination (string)
- CannedAcl (string)
- EnableEncryption (bool)
- EncryptionType (string)
- Endpoint (string)
- KmsKey (string)
- Region (string)
Go:
- Destination (string)
- CannedAcl (string)
- EnableEncryption (bool)
- EncryptionType (string)
- Endpoint (string)
- KmsKey (string)
- Region (string)
Java:
- destination (String)
- cannedAcl (String)
- enableEncryption (Boolean)
- encryptionType (String)
- endpoint (String)
- kmsKey (String)
- region (String)
TypeScript:
- destination (string)
- cannedAcl (string)
- enableEncryption (boolean)
- encryptionType (string)
- endpoint (string)
- kmsKey (string)
- region (string)
Python:
- destination (str)
- canned_acl (str)
- enable_encryption (bool)
- encryption_type (str)
- endpoint (str)
- kms_key (str)
- region (str)
YAML:
- destination (String)
- cannedAcl (String)
- enableEncryption (Boolean)
- encryptionType (String)
- endpoint (String)
- kmsKey (String)
- region (String)
GetClusterClusterInfoInitScriptVolumes
C#:
- Destination (string)
Go:
- Destination (string)
Java:
- destination (String)
TypeScript:
- destination (string)
Python:
- destination (str)
YAML:
- destination (String)
GetClusterClusterInfoInitScriptWorkspace
C#:
- Destination (string)
Go:
- Destination (string)
Java:
- destination (String)
TypeScript:
- destination (string)
Python:
- destination (str)
YAML:
- destination (String)
GetClusterClusterInfoTerminationReason
C#:
- Code (string)
- Parameters (Dictionary<string, object>)
- Type (string)
Go:
- Code (string)
- Parameters (map[string]interface{})
- Type (string)
Java:
- code (String)
- parameters (Map<String,Object>)
- type (String)
TypeScript:
- code (string)
- parameters ({[key: string]: any})
- type (string)
Python:
- code (str)
- parameters (Mapping[str, Any])
- type (str)
YAML:
- code (String)
- parameters (Map<Any>)
- type (String)
Package Details
- Repository: databricks pulumi/pulumi-databricks
- License: Apache-2.0
- Notes: This Pulumi package is based on the databricks Terraform Provider.