Oracle Cloud Infrastructure

Pulumi Official
Package maintained by Pulumi
v0.1.1 published on Tuesday, May 3, 2022 by Pulumi

getApplication

This data source provides details about a specific Application resource in Oracle Cloud Infrastructure Data Flow service.

Retrieves an application using an applicationId.

Example Usage

using Pulumi;
using Oci = Pulumi.Oci;

class MyStack : Stack
{
    public MyStack()
    {
        var testApplication = Output.Create(Oci.DataFlow.GetApplication.InvokeAsync(new Oci.DataFlow.GetApplicationArgs
        {
            ApplicationId = oci_dataflow_application.Test_application.Id,
        }));
    }

}
package main

import (
	"github.com/pulumi/pulumi-oci/sdk/go/oci/DataFlow"
	"github.com/pulumi/pulumi/sdk/v3/go/pulumi"
)

func main() {
	pulumi.Run(func(ctx *pulumi.Context) error {
		_, err := DataFlow.GetApplication(ctx, &DataFlow.GetApplicationArgs{
			ApplicationId: oci_dataflow_application.Test_application.Id,
		}, nil)
		if err != nil {
			return err
		}
		return nil
	})
}

Java example coming soon!

import pulumi
import pulumi_oci as oci

test_application = oci.DataFlow.get_application(application_id=oci_dataflow_application["test_application"]["id"])
import * as pulumi from "@pulumi/pulumi";
import * as oci from "@pulumi/oci";

const testApplication = oci.DataFlow.getApplication({
    applicationId: oci_dataflow_application.test_application.id,
});

YAML example coming soon!

Using getApplication

Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.

function getApplication(args: GetApplicationArgs, opts?: InvokeOptions): Promise<GetApplicationResult>
function getApplicationOutput(args: GetApplicationOutputArgs, opts?: InvokeOptions): Output<GetApplicationResult>
def get_application(application_id: Optional[str] = None,
                    opts: Optional[InvokeOptions] = None) -> GetApplicationResult
def get_application_output(application_id: Optional[pulumi.Input[str]] = None,
                    opts: Optional[InvokeOptions] = None) -> Output[GetApplicationResult]
func GetApplication(ctx *Context, args *GetApplicationArgs, opts ...InvokeOption) (*GetApplicationResult, error)
func GetApplicationOutput(ctx *Context, args *GetApplicationOutputArgs, opts ...InvokeOption) GetApplicationResultOutput

> Note: This function is named GetApplication in the Go SDK.

public static class GetApplication 
{
    public static Task<GetApplicationResult> InvokeAsync(GetApplicationArgs args, InvokeOptions? opts = null)
    public static Output<GetApplicationResult> Invoke(GetApplicationInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetApplicationResult> getApplication(GetApplicationArgs args, InvokeOptions options)
// Output-based functions aren't available in Java yet
Fn::Invoke:
  Function: oci:DataFlow/getApplication:getApplication
  Arguments:
    # Arguments dictionary
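
As a minimal TypeScript sketch of the two invocation forms described above (the config key applicationId is a hypothetical source for the application OCID):

import * as pulumi from "@pulumi/pulumi";
import * as oci from "@pulumi/oci";

// Hypothetical config value holding a Data Flow application OCID.
const config = new pulumi.Config();
const appId = config.require("applicationId");

// Direct form: plain arguments in, Promise-wrapped result out.
const direct = oci.DataFlow.getApplication({ applicationId: appId });
export const directDisplayName = direct.then(app => app.displayName);

// Output form: Input-wrapped arguments in, Output-wrapped result out;
// result properties are lifted onto the returned Output.
const lifted = oci.DataFlow.getApplicationOutput({ applicationId: appId });
export const liftedDisplayName = lifted.displayName;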

The following arguments are supported:

ApplicationId string

The unique ID for an application.

ApplicationId string

The unique ID for an application.

applicationId String

The unique ID for an application.

applicationId string

The unique ID for an application.

application_id str

The unique ID for an application.

applicationId String

The unique ID for an application.

getApplication Result

The following output properties are available:

ApplicationId string
ArchiveUri string

An Oracle Cloud Infrastructure URI of an archive.zip file containing custom dependencies that may be used to support the execution of a Python, Java, or Scala application. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

Arguments List<string>

The arguments passed to the running application as command line arguments. An argument is either plain text or a placeholder. Placeholders are replaced using values from the parameters map. Each placeholder specified must be represented in the parameters map, or else the request (POST or PUT) will fail with an HTTP 400 status code. Placeholders are specified as ${name}, where name is the name of the parameter. Example: [ "--input", "${input_file}", "--name", "John Doe" ] If "input_file" has a value of "mydata.xml", then the value above will be translated to --input mydata.xml --name "John Doe"

ClassName string

The class for the application.

CompartmentId string

The OCID of a compartment.

Configuration Dictionary<string, object>

The Spark configuration passed to the running process. See https://spark.apache.org/docs/latest/configuration.html#available-properties. Example: { "spark.app.name" : "My App Name", "spark.shuffle.io.maxRetries" : "4" } Note: Not all Spark properties are permitted to be set. Attempting to set a property that is not allowed to be overwritten will cause a 400 status to be returned.

DefinedTags Dictionary<string, object>

Defined tags for this resource. Each key is predefined and scoped to a namespace. For more information, see Resource Tags. Example: {"Operations.CostCenter": "42"}

Description string

A user-friendly description.

DisplayName string

A user-friendly name. This name is not necessarily unique.

DriverShape string

The VM shape for the driver. Sets the driver cores and memory.

Execute string

The input used for the spark-submit command. For more details see https://spark.apache.org/docs/latest/submitting-applications.html#launching-applications-with-spark-submit. Supported options include --class, --file, --jars, --conf, --py-files, and the main application file with arguments. Example: --jars oci://path/to/a.jar,oci://path/to/b.jar --files oci://path/to/a.json,oci://path/to/b.csv --py-files oci://path/to/a.py,oci://path/to/b.py --conf spark.sql.crossJoin.enabled=true --class org.apache.spark.examples.SparkPi oci://path/to/main.jar 10 Note: If execute is specified together with applicationId, className, configuration, fileUri, language, arguments, or parameters during an application create/update or a run create/submit, the Data Flow service will use only the information derived from the execute input.

ExecutorShape string

The VM shape for the executors. Sets the executor cores and memory.

FileUri string

An Oracle Cloud Infrastructure URI of the file containing the application to execute. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

FreeformTags Dictionary<string, object>

Free-form tags for this resource. Each tag is a simple key-value pair with no predefined name, type, or namespace. For more information, see Resource Tags. Example: {"Department": "Finance"}

Id string

The application ID.

Language string

The Spark language.

LogsBucketUri string

An Oracle Cloud Infrastructure URI of the bucket where the Spark job logs are to be uploaded. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

MetastoreId string

The OCID of Oracle Cloud Infrastructure Hive Metastore.

NumExecutors int

The number of executor VMs requested.

OwnerPrincipalId string

The OCID of the user who created the resource.

OwnerUserName string

The username of the user who created the resource. If the username of the owner does not exist, null will be returned and the caller should refer to the ownerPrincipalId value instead.

Parameters List<GetApplicationParameter>

An array of name/value pairs used to fill placeholders found in properties like Application.arguments. The name must be a string of one or more word characters (a-z, A-Z, 0-9, _). The value can be a string of 0 or more characters of any kind. Example: [ { name: "iterations", value: "10"}, { name: "input_file", value: "mydata.xml" }, { name: "variable_x", value: "${x}"} ]

PrivateEndpointId string

The OCID of a private endpoint.

SparkVersion string

The Spark version utilized to run the application.

State string

The current state of this application.

TimeCreated string

The date and time an application was created, expressed in RFC 3339 timestamp format. Example: 2018-04-03T21:10:29.600Z

TimeUpdated string

The date and time an application was updated, expressed in RFC 3339 timestamp format. Example: 2018-04-03T21:10:29.600Z

Type string

The Spark application processing type.

WarehouseBucketUri string

An Oracle Cloud Infrastructure URI of the bucket to be used as default warehouse directory for BATCH SQL runs. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

ApplicationId string
ArchiveUri string

An Oracle Cloud Infrastructure URI of an archive.zip file containing custom dependencies that may be used to support the execution of a Python, Java, or Scala application. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

Arguments []string

The arguments passed to the running application as command line arguments. An argument is either plain text or a placeholder. Placeholders are replaced using values from the parameters map. Each placeholder specified must be represented in the parameters map, or else the request (POST or PUT) will fail with an HTTP 400 status code. Placeholders are specified as ${name}, where name is the name of the parameter. Example: [ "--input", "${input_file}", "--name", "John Doe" ] If "input_file" has a value of "mydata.xml", then the value above will be translated to --input mydata.xml --name "John Doe"

ClassName string

The class for the application.

CompartmentId string

The OCID of a compartment.

Configuration map[string]interface{}

The Spark configuration passed to the running process. See https://spark.apache.org/docs/latest/configuration.html#available-properties. Example: { "spark.app.name" : "My App Name", "spark.shuffle.io.maxRetries" : "4" } Note: Not all Spark properties are permitted to be set. Attempting to set a property that is not allowed to be overwritten will cause a 400 status to be returned.

DefinedTags map[string]interface{}

Defined tags for this resource. Each key is predefined and scoped to a namespace. For more information, see Resource Tags. Example: {"Operations.CostCenter": "42"}

Description string

A user-friendly description.

DisplayName string

A user-friendly name. This name is not necessarily unique.

DriverShape string

The VM shape for the driver. Sets the driver cores and memory.

Execute string

The input used for the spark-submit command. For more details see https://spark.apache.org/docs/latest/submitting-applications.html#launching-applications-with-spark-submit. Supported options include --class, --file, --jars, --conf, --py-files, and the main application file with arguments. Example: --jars oci://path/to/a.jar,oci://path/to/b.jar --files oci://path/to/a.json,oci://path/to/b.csv --py-files oci://path/to/a.py,oci://path/to/b.py --conf spark.sql.crossJoin.enabled=true --class org.apache.spark.examples.SparkPi oci://path/to/main.jar 10 Note: If execute is specified together with applicationId, className, configuration, fileUri, language, arguments, or parameters during an application create/update or a run create/submit, the Data Flow service will use only the information derived from the execute input.

ExecutorShape string

The VM shape for the executors. Sets the executor cores and memory.

FileUri string

An Oracle Cloud Infrastructure URI of the file containing the application to execute. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

FreeformTags map[string]interface{}

Free-form tags for this resource. Each tag is a simple key-value pair with no predefined name, type, or namespace. For more information, see Resource Tags. Example: {"Department": "Finance"}

Id string

The application ID.

Language string

The Spark language.

LogsBucketUri string

An Oracle Cloud Infrastructure URI of the bucket where the Spark job logs are to be uploaded. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

MetastoreId string

The OCID of Oracle Cloud Infrastructure Hive Metastore.

NumExecutors int

The number of executor VMs requested.

OwnerPrincipalId string

The OCID of the user who created the resource.

OwnerUserName string

The username of the user who created the resource. If the username of the owner does not exist, null will be returned and the caller should refer to the ownerPrincipalId value instead.

Parameters []GetApplicationParameter

An array of name/value pairs used to fill placeholders found in properties like Application.arguments. The name must be a string of one or more word characters (a-z, A-Z, 0-9, _). The value can be a string of 0 or more characters of any kind. Example: [ { name: "iterations", value: "10"}, { name: "input_file", value: "mydata.xml" }, { name: "variable_x", value: "${x}"} ]

PrivateEndpointId string

The OCID of a private endpoint.

SparkVersion string

The Spark version utilized to run the application.

State string

The current state of this application.

TimeCreated string

The date and time an application was created, expressed in RFC 3339 timestamp format. Example: 2018-04-03T21:10:29.600Z

TimeUpdated string

The date and time an application was updated, expressed in RFC 3339 timestamp format. Example: 2018-04-03T21:10:29.600Z

Type string

The Spark application processing type.

WarehouseBucketUri string

An Oracle Cloud Infrastructure URI of the bucket to be used as default warehouse directory for BATCH SQL runs. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

applicationId String
archiveUri String

An Oracle Cloud Infrastructure URI of an archive.zip file containing custom dependencies that may be used to support the execution of a Python, Java, or Scala application. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

arguments List<String>

The arguments passed to the running application as command line arguments. An argument is either plain text or a placeholder. Placeholders are replaced using values from the parameters map. Each placeholder specified must be represented in the parameters map, or else the request (POST or PUT) will fail with an HTTP 400 status code. Placeholders are specified as ${name}, where name is the name of the parameter. Example: [ "--input", "${input_file}", "--name", "John Doe" ] If "input_file" has a value of "mydata.xml", then the value above will be translated to --input mydata.xml --name "John Doe"

className String

The class for the application.

compartmentId String

The OCID of a compartment.

configuration Map<String,Object>

The Spark configuration passed to the running process. See https://spark.apache.org/docs/latest/configuration.html#available-properties. Example: { "spark.app.name" : "My App Name", "spark.shuffle.io.maxRetries" : "4" } Note: Not all Spark properties are permitted to be set. Attempting to set a property that is not allowed to be overwritten will cause a 400 status to be returned.

definedTags Map<String,Object>

Defined tags for this resource. Each key is predefined and scoped to a namespace. For more information, see Resource Tags. Example: {"Operations.CostCenter": "42"}

description String

A user-friendly description.

displayName String

A user-friendly name. This name is not necessarily unique.

driverShape String

The VM shape for the driver. Sets the driver cores and memory.

execute String

The input used for the spark-submit command. For more details see https://spark.apache.org/docs/latest/submitting-applications.html#launching-applications-with-spark-submit. Supported options include --class, --file, --jars, --conf, --py-files, and the main application file with arguments. Example: --jars oci://path/to/a.jar,oci://path/to/b.jar --files oci://path/to/a.json,oci://path/to/b.csv --py-files oci://path/to/a.py,oci://path/to/b.py --conf spark.sql.crossJoin.enabled=true --class org.apache.spark.examples.SparkPi oci://path/to/main.jar 10 Note: If execute is specified together with applicationId, className, configuration, fileUri, language, arguments, or parameters during an application create/update or a run create/submit, the Data Flow service will use only the information derived from the execute input.

executorShape String

The VM shape for the executors. Sets the executor cores and memory.

fileUri String

An Oracle Cloud Infrastructure URI of the file containing the application to execute. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

freeformTags Map<String,Object>

Free-form tags for this resource. Each tag is a simple key-value pair with no predefined name, type, or namespace. For more information, see Resource Tags. Example: {"Department": "Finance"}

id String

The application ID.

language String

The Spark language.

logsBucketUri String

An Oracle Cloud Infrastructure URI of the bucket where the Spark job logs are to be uploaded. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

metastoreId String

The OCID of Oracle Cloud Infrastructure Hive Metastore.

numExecutors Integer

The number of executor VMs requested.

ownerPrincipalId String

The OCID of the user who created the resource.

ownerUserName String

The username of the user who created the resource. If the username of the owner does not exist, null will be returned and the caller should refer to the ownerPrincipalId value instead.

parameters List<GetApplicationParameter>

An array of name/value pairs used to fill placeholders found in properties like Application.arguments. The name must be a string of one or more word characters (a-z, A-Z, 0-9, _). The value can be a string of 0 or more characters of any kind. Example: [ { name: "iterations", value: "10"}, { name: "input_file", value: "mydata.xml" }, { name: "variable_x", value: "${x}"} ]

privateEndpointId String

The OCID of a private endpoint.

sparkVersion String

The Spark version utilized to run the application.

state String

The current state of this application.

timeCreated String

The date and time an application was created, expressed in RFC 3339 timestamp format. Example: 2018-04-03T21:10:29.600Z

timeUpdated String

The date and time an application was updated, expressed in RFC 3339 timestamp format. Example: 2018-04-03T21:10:29.600Z

type String

The Spark application processing type.

warehouseBucketUri String

An Oracle Cloud Infrastructure URI of the bucket to be used as default warehouse directory for BATCH SQL runs. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

applicationId string
archiveUri string

An Oracle Cloud Infrastructure URI of an archive.zip file containing custom dependencies that may be used to support the execution of a Python, Java, or Scala application. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

arguments string[]

The arguments passed to the running application as command line arguments. An argument is either plain text or a placeholder. Placeholders are replaced using values from the parameters map. Each placeholder specified must be represented in the parameters map, or else the request (POST or PUT) will fail with an HTTP 400 status code. Placeholders are specified as ${name}, where name is the name of the parameter. Example: [ "--input", "${input_file}", "--name", "John Doe" ] If "input_file" has a value of "mydata.xml", then the value above will be translated to --input mydata.xml --name "John Doe"

className string

The class for the application.

compartmentId string

The OCID of a compartment.

configuration {[key: string]: any}

The Spark configuration passed to the running process. See https://spark.apache.org/docs/latest/configuration.html#available-properties. Example: { "spark.app.name" : "My App Name", "spark.shuffle.io.maxRetries" : "4" } Note: Not all Spark properties are permitted to be set. Attempting to set a property that is not allowed to be overwritten will cause a 400 status to be returned.

definedTags {[key: string]: any}

Defined tags for this resource. Each key is predefined and scoped to a namespace. For more information, see Resource Tags. Example: {"Operations.CostCenter": "42"}

description string

A user-friendly description.

displayName string

A user-friendly name. This name is not necessarily unique.

driverShape string

The VM shape for the driver. Sets the driver cores and memory.

execute string

The input used for the spark-submit command. For more details see https://spark.apache.org/docs/latest/submitting-applications.html#launching-applications-with-spark-submit. Supported options include --class, --file, --jars, --conf, --py-files, and the main application file with arguments. Example: --jars oci://path/to/a.jar,oci://path/to/b.jar --files oci://path/to/a.json,oci://path/to/b.csv --py-files oci://path/to/a.py,oci://path/to/b.py --conf spark.sql.crossJoin.enabled=true --class org.apache.spark.examples.SparkPi oci://path/to/main.jar 10 Note: If execute is specified together with applicationId, className, configuration, fileUri, language, arguments, or parameters during an application create/update or a run create/submit, the Data Flow service will use only the information derived from the execute input.

executorShape string

The VM shape for the executors. Sets the executor cores and memory.

fileUri string

An Oracle Cloud Infrastructure URI of the file containing the application to execute. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

freeformTags {[key: string]: any}

Free-form tags for this resource. Each tag is a simple key-value pair with no predefined name, type, or namespace. For more information, see Resource Tags. Example: {"Department": "Finance"}

id string

The application ID.

language string

The Spark language.

logsBucketUri string

An Oracle Cloud Infrastructure URI of the bucket where the Spark job logs are to be uploaded. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

metastoreId string

The OCID of Oracle Cloud Infrastructure Hive Metastore.

numExecutors number

The number of executor VMs requested.

ownerPrincipalId string

The OCID of the user who created the resource.

ownerUserName string

The username of the user who created the resource. If the username of the owner does not exist, null will be returned and the caller should refer to the ownerPrincipalId value instead.

parameters GetApplicationParameter[]

An array of name/value pairs used to fill placeholders found in properties like Application.arguments. The name must be a string of one or more word characters (a-z, A-Z, 0-9, _). The value can be a string of 0 or more characters of any kind. Example: [ { name: "iterations", value: "10"}, { name: "input_file", value: "mydata.xml" }, { name: "variable_x", value: "${x}"} ]

privateEndpointId string

The OCID of a private endpoint.

sparkVersion string

The Spark version utilized to run the application.

state string

The current state of this application.

timeCreated string

The date and time an application was created, expressed in RFC 3339 timestamp format. Example: 2018-04-03T21:10:29.600Z

timeUpdated string

The date and time an application was updated, expressed in RFC 3339 timestamp format. Example: 2018-04-03T21:10:29.600Z

type string

The Spark application processing type.

warehouseBucketUri string

An Oracle Cloud Infrastructure URI of the bucket to be used as default warehouse directory for BATCH SQL runs. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

application_id str
archive_uri str

An Oracle Cloud Infrastructure URI of an archive.zip file containing custom dependencies that may be used to support the execution of a Python, Java, or Scala application. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

arguments Sequence[str]

The arguments passed to the running application as command line arguments. An argument is either plain text or a placeholder. Placeholders are replaced using values from the parameters map. Each placeholder specified must be represented in the parameters map, or else the request (POST or PUT) will fail with an HTTP 400 status code. Placeholders are specified as ${name}, where name is the name of the parameter. Example: [ "--input", "${input_file}", "--name", "John Doe" ] If "input_file" has a value of "mydata.xml", then the value above will be translated to --input mydata.xml --name "John Doe"

class_name str

The class for the application.

compartment_id str

The OCID of a compartment.

configuration Mapping[str, Any]

The Spark configuration passed to the running process. See https://spark.apache.org/docs/latest/configuration.html#available-properties. Example: { "spark.app.name" : "My App Name", "spark.shuffle.io.maxRetries" : "4" } Note: Not all Spark properties are permitted to be set. Attempting to set a property that is not allowed to be overwritten will cause a 400 status to be returned.

defined_tags Mapping[str, Any]

Defined tags for this resource. Each key is predefined and scoped to a namespace. For more information, see Resource Tags. Example: {"Operations.CostCenter": "42"}

description str

A user-friendly description.

display_name str

A user-friendly name. This name is not necessarily unique.

driver_shape str

The VM shape for the driver. Sets the driver cores and memory.

execute str

The input used for the spark-submit command. For more details see https://spark.apache.org/docs/latest/submitting-applications.html#launching-applications-with-spark-submit. Supported options include --class, --file, --jars, --conf, --py-files, and the main application file with arguments. Example: --jars oci://path/to/a.jar,oci://path/to/b.jar --files oci://path/to/a.json,oci://path/to/b.csv --py-files oci://path/to/a.py,oci://path/to/b.py --conf spark.sql.crossJoin.enabled=true --class org.apache.spark.examples.SparkPi oci://path/to/main.jar 10 Note: If execute is specified together with applicationId, className, configuration, fileUri, language, arguments, or parameters during an application create/update or a run create/submit, the Data Flow service will use only the information derived from the execute input.

executor_shape str

The VM shape for the executors. Sets the executor cores and memory.

file_uri str

An Oracle Cloud Infrastructure URI of the file containing the application to execute. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

freeform_tags Mapping[str, Any]

Free-form tags for this resource. Each tag is a simple key-value pair with no predefined name, type, or namespace. For more information, see Resource Tags. Example: {"Department": "Finance"}

id str

The application ID.

language str

The Spark language.

logs_bucket_uri str

An Oracle Cloud Infrastructure URI of the bucket where the Spark job logs are to be uploaded. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

metastore_id str

The OCID of Oracle Cloud Infrastructure Hive Metastore.

num_executors int

The number of executor VMs requested.

owner_principal_id str

The OCID of the user who created the resource.

owner_user_name str

The username of the user who created the resource. If the username of the owner does not exist, null will be returned and the caller should refer to the ownerPrincipalId value instead.

parameters Sequence[GetApplicationParameter]

An array of name/value pairs used to fill placeholders found in properties like Application.arguments. The name must be a string of one or more word characters (a-z, A-Z, 0-9, _). The value can be a string of 0 or more characters of any kind. Example: [ { name: "iterations", value: "10"}, { name: "input_file", value: "mydata.xml" }, { name: "variable_x", value: "${x}"} ]

private_endpoint_id str

The OCID of a private endpoint.

spark_version str

The Spark version utilized to run the application.

state str

The current state of this application.

time_created str

The date and time an application was created, expressed in RFC 3339 timestamp format. Example: 2018-04-03T21:10:29.600Z

time_updated str

The date and time an application was updated, expressed in RFC 3339 timestamp format. Example: 2018-04-03T21:10:29.600Z

type str

The Spark application processing type.

warehouse_bucket_uri str

An Oracle Cloud Infrastructure URI of the bucket to be used as default warehouse directory for BATCH SQL runs. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

applicationId String
archiveUri String

An Oracle Cloud Infrastructure URI of an archive.zip file containing custom dependencies that may be used to support the execution of a Python, Java, or Scala application. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

arguments List<String>

The arguments passed to the running application as command line arguments. An argument is either plain text or a placeholder. Placeholders are replaced using values from the parameters map. Each placeholder specified must be represented in the parameters map, or else the request (POST or PUT) will fail with an HTTP 400 status code. Placeholders are specified as ${name}, where name is the name of the parameter. Example: [ "--input", "${input_file}", "--name", "John Doe" ] If "input_file" has a value of "mydata.xml", then the value above will be translated to --input mydata.xml --name "John Doe"

className String

The class for the application.

compartmentId String

The OCID of a compartment.

configuration Map<Any>

The Spark configuration passed to the running process. See https://spark.apache.org/docs/latest/configuration.html#available-properties. Example: { "spark.app.name" : "My App Name", "spark.shuffle.io.maxRetries" : "4" } Note: Not all Spark properties are permitted to be set. Attempting to set a property that is not allowed to be overwritten will cause a 400 status to be returned.

definedTags Map<Any>

Defined tags for this resource. Each key is predefined and scoped to a namespace. For more information, see Resource Tags. Example: {"Operations.CostCenter": "42"}

description String

A user-friendly description.

displayName String

A user-friendly name. This name is not necessarily unique.

driverShape String

The VM shape for the driver. Sets the driver cores and memory.

execute String

The input used for the spark-submit command. For more details see https://spark.apache.org/docs/latest/submitting-applications.html#launching-applications-with-spark-submit. Supported options include --class, --file, --jars, --conf, --py-files, and the main application file with arguments. Example: --jars oci://path/to/a.jar,oci://path/to/b.jar --files oci://path/to/a.json,oci://path/to/b.csv --py-files oci://path/to/a.py,oci://path/to/b.py --conf spark.sql.crossJoin.enabled=true --class org.apache.spark.examples.SparkPi oci://path/to/main.jar 10 Note: If execute is specified together with applicationId, className, configuration, fileUri, language, arguments, or parameters during an application create/update or a run create/submit, the Data Flow service will use only the information derived from the execute input.

executorShape String

The VM shape for the executors. Sets the executor cores and memory.

fileUri String

An Oracle Cloud Infrastructure URI of the file containing the application to execute. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

freeformTags Map<Any>

Free-form tags for this resource. Each tag is a simple key-value pair with no predefined name, type, or namespace. For more information, see Resource Tags. Example: {"Department": "Finance"}

id String

The application ID.

language String

The Spark language.

logsBucketUri String

An Oracle Cloud Infrastructure URI of the bucket where the Spark job logs are to be uploaded. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.

metastoreId String

The OCID of Oracle Cloud Infrastructure Hive Metastore.

numExecutors Number

The number of executor VMs requested.

ownerPrincipalId String

The OCID of the user who created the resource.

ownerUserName String

The username of the user who created the resource. If the username of the owner does not exist, null will be returned and the caller should refer to the ownerPrincipalId value instead.

parameters List<Property Map>

An array of name/value pairs used to fill placeholders found in properties like Application.arguments. The name must be a string of one or more word characters (a-z, A-Z, 0-9, _). The value can be a string of 0 or more characters of any kind. Example: [ { name: "iterations", value: "10"}, { name: "input_file", value: "mydata.xml" }, { name: "variable_x", value: "${x}"} ]

privateEndpointId String

The OCID of a private endpoint.

sparkVersion String

The Spark version utilized to run the application.

state String

The current state of this application.

timeCreated String

The date and time an application was created, expressed in RFC 3339 timestamp format. Example: 2018-04-03T21:10:29.600Z

timeUpdated String

The date and time an application was updated, expressed in RFC 3339 timestamp format. Example: 2018-04-03T21:10:29.600Z

type String

The Spark application processing type.

warehouseBucketUri String

An Oracle Cloud Infrastructure URI of the bucket to be used as default warehouse directory for BATCH SQL runs. See https://docs.cloud.oracle.com/iaas/Content/API/SDKDocs/hdfsconnector.htm#uriformat.
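
As an illustrative TypeScript sketch of consuming these output properties (the OCID below is a placeholder, not a real application):

import * as oci from "@pulumi/oci";

// Placeholder OCID; substitute the OCID of an existing Data Flow application.
const app = oci.DataFlow.getApplicationOutput({
    applicationId: "ocid1.dataflowapplication.oc1..exampleuniqueID",
});

// Scalar properties are available directly as Outputs.
export const sparkVersion = app.sparkVersion;
export const numExecutors = app.numExecutors;

// Structured properties such as parameters can be reshaped with apply.
export const parameterNames = app.parameters.apply(params => params.map(p => p.name));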

Supporting Types

GetApplicationParameter

Name string

The name of the parameter. It must be a string of one or more word characters (a-z, A-Z, 0-9, _). Examples: "iterations", "input_file"

Value string

The value of the parameter. It must be a string of 0 or more characters of any kind. Examples: "" (empty string), "10", "mydata.xml", "${x}"

Name string

The name of the parameter. It must be a string of one or more word characters (a-z, A-Z, 0-9, _). Examples: "iterations", "input_file"

Value string

The value of the parameter. It must be a string of 0 or more characters of any kind. Examples: "" (empty string), "10", "mydata.xml", "${x}"

name String

The name of the parameter. It must be a string of one or more word characters (a-z, A-Z, 0-9, _). Examples: "iterations", "input_file"

value String

The value of the parameter. It must be a string of 0 or more characters of any kind. Examples: "" (empty string), "10", "mydata.xml", "${x}"

name string

The name of the parameter. It must be a string of one or more word characters (a-z, A-Z, 0-9, _). Examples: "iterations", "input_file"

value string

The value of the parameter. It must be a string of 0 or more characters of any kind. Examples: "" (empty string), "10", "mydata.xml", "${x}"

name str

The name of the parameter. It must be a string of one or more word characters (a-z, A-Z, 0-9, _). Examples: "iterations", "input_file"

value str

The value of the parameter. It must be a string of 0 or more characters of any kind. Examples: "" (empty string), "10", "mydata.xml", "${x}"

name String

The name of the parameter. It must be a string of one or more word characters (a-z, A-Z, 0-9, _). Examples: "iterations", "input_file"

value String

The value of the parameter. It must be a string of 0 or more characters of any kind. Examples: "" (empty string), "10", "mydata.xml", "${x}"
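
The following TypeScript snippet is a local illustration (not a provider API) of the substitution rule described above: each ${name} placeholder in the application's arguments is replaced with the value of the matching parameter.

interface Parameter {
    name: string;
    value: string;
}

// Illustration of the documented service-side behaviour: replace ${name}
// placeholders in the arguments list with values from the parameters list.
function resolveArguments(args: string[], parameters: Parameter[]): string[] {
    const values = new Map(parameters.map(p => [p.name, p.value] as [string, string]));
    // Unresolved placeholders are left untouched here; the service would
    // instead reject the request with an HTTP 400.
    return args.map(arg => arg.replace(/\$\{(\w+)\}/g, (match, name) => values.get(name) ?? match));
}

// resolveArguments(["--input", "${input_file}", "--name", "John Doe"],
//                  [{ name: "input_file", value: "mydata.xml" }])
// yields ["--input", "mydata.xml", "--name", "John Doe"].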

Package Details

Repository
https://github.com/pulumi/pulumi-oci
License
Apache-2.0
Notes

This Pulumi package is based on the oci Terraform Provider.