Akamai

v3.1.0 published on Tuesday, Oct 4, 2022 by Pulumi

Datastream

Import

Basic usage:

    resource "akamai_datastream" "example" {
        # (resource arguments)
    }

You can import your Akamai DataStream configuration using a stream version ID. For example:

    $ pulumi import akamai:index/datastream:Datastream example 1234

IMPORTANT: For security reasons, this command doesn't import any secrets you specify for your connector. To make sure the state file includes complete data, use this resource to manually add the arguments marked Secret.

Create a Datastream Resource

new Datastream(name: string, args: DatastreamArgs, opts?: CustomResourceOptions);
@overload
def Datastream(resource_name: str,
               opts: Optional[ResourceOptions] = None,
               active: Optional[bool] = None,
               azure_connector: Optional[DatastreamAzureConnectorArgs] = None,
               config: Optional[DatastreamConfigArgs] = None,
               contract_id: Optional[str] = None,
               datadog_connector: Optional[DatastreamDatadogConnectorArgs] = None,
               dataset_fields_ids: Optional[Sequence[int]] = None,
               email_ids: Optional[Sequence[str]] = None,
               gcs_connector: Optional[DatastreamGcsConnectorArgs] = None,
               group_id: Optional[str] = None,
               https_connector: Optional[DatastreamHttpsConnectorArgs] = None,
               oracle_connector: Optional[DatastreamOracleConnectorArgs] = None,
               property_ids: Optional[Sequence[str]] = None,
               s3_connector: Optional[DatastreamS3ConnectorArgs] = None,
               splunk_connector: Optional[DatastreamSplunkConnectorArgs] = None,
               stream_name: Optional[str] = None,
               stream_type: Optional[str] = None,
               sumologic_connector: Optional[DatastreamSumologicConnectorArgs] = None,
               template_name: Optional[str] = None)
@overload
def Datastream(resource_name: str,
               args: DatastreamArgs,
               opts: Optional[ResourceOptions] = None)
func NewDatastream(ctx *Context, name string, args DatastreamArgs, opts ...ResourceOption) (*Datastream, error)
public Datastream(string name, DatastreamArgs args, CustomResourceOptions? opts = null)
public Datastream(String name, DatastreamArgs args)
public Datastream(String name, DatastreamArgs args, CustomResourceOptions options)
type: akamai:Datastream
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.

name string
The unique name of the resource.
args DatastreamArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
resource_name str
The unique name of the resource.
args DatastreamArgs
The arguments to resource properties.
opts ResourceOptions
Bag of options to control resource's behavior.
ctx Context
Context object for the current deployment.
name string
The unique name of the resource.
args DatastreamArgs
The arguments to resource properties.
opts ResourceOption
Bag of options to control resource's behavior.
name string
The unique name of the resource.
args DatastreamArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
name String
The unique name of the resource.
args DatastreamArgs
The arguments to resource properties.
options CustomResourceOptions
Bag of options to control resource's behavior.

Datastream Resource Properties

To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

Inputs

The Datastream resource accepts the following input properties:

Active bool

Whether to activate the stream when applying the resource. Set true to activate the stream once the request is sent, or false to leave the stream inactive after the request.

Config DatastreamConfigArgs

Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:

ContractId string

Identifies the contract that has access to the product.

DatasetFieldsIds List<int>

Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers defines how the values for these fields appear in the log lines. See Data set parameters.

GroupId string

Identifies the group that has access to the product and this stream configuration.

PropertyIds List<string>

Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.

StreamName string

The name of the stream.

StreamType string

The type of stream that you want to create. Currently, RAW_LOGS is the only possible stream type.

TemplateName string

The name of the data set template available for the product that you want to use in the stream. Currently, EDGE_LOGS is the only data set template available.

AzureConnector DatastreamAzureConnectorArgs

Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:

DatadogConnector DatastreamDatadogConnectorArgs

Specify details about the Datadog connector in a stream, including:

EmailIds List<string>

A list of email addresses you want to notify about activations and deactivations of the stream.

GcsConnector DatastreamGcsConnectorArgs

Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an Akamai_access_verification_<timestamp>.txt object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:

HttpsConnector DatastreamHttpsConnectorArgs

Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:

OracleConnector DatastreamOracleConnectorArgs

Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and tries to save an Akamai_access_verification_<timestamp>.txt file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you’re trying to send logs to.

S3Connector DatastreamS3ConnectorArgs

Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and saves an akamai_write_test_2147483647.txt file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to. The argument includes these sub-arguments:

SplunkConnector DatastreamSplunkConnectorArgs

Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with collector/raw. The argument includes these sub-arguments:

SumologicConnector DatastreamSumologicConnectorArgs

Specify details about the Sumo Logic connector in a stream, including:

Active bool

Whether to activate the stream when applying the resource. Set true to activate the stream once the request is sent, or false to leave the stream inactive after the request.

Config DatastreamConfigArgs

Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:

ContractId string

Identifies the contract that has access to the product.

DatasetFieldsIds []int

Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers defines how the values for these fields appear in the log lines. See Data set parameters.

GroupId string

Identifies the group that has access to the product and this stream configuration.

PropertyIds []string

Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.

StreamName string

The name of the stream.

StreamType string

The type of stream that you want to create. Currently, RAW_LOGS is the only possible stream type.

TemplateName string

The name of the data set template available for the product that you want to use in the stream. Currently, EDGE_LOGS is the only data set template available.

AzureConnector DatastreamAzureConnectorArgs

Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:

DatadogConnector DatastreamDatadogConnectorArgs

Specify details about the Datadog connector in a stream, including:

EmailIds []string

A list of email addresses you want to notify about activations and deactivations of the stream.

GcsConnector DatastreamGcsConnectorArgs

Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an Akamai_access_verification_<timestamp>.txt object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:

HttpsConnector DatastreamHttpsConnectorArgs

Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:

OracleConnector DatastreamOracleConnectorArgs

Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and tries to save an Akamai_access_verification_<timestamp>.txt file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you’re trying to send logs to.

S3Connector DatastreamS3ConnectorArgs

Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and saves an akamai_write_test_2147483647.txt file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to. The argument includes these sub-arguments:

SplunkConnector DatastreamSplunkConnectorArgs

Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with collector/raw. The argument includes these sub-arguments:

SumologicConnector DatastreamSumologicConnectorArgs

Specify details about the Sumo Logic connector in a stream, including:

active Boolean

Whether to activate the stream when applying the resource. Set true to activate the stream once the request is sent, or false to leave the stream inactive after the request.

config DatastreamConfigArgs

Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:

contractId String

Identifies the contract that has access to the product.

datasetFieldsIds List<Integer>

Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers defines how the values for these fields appear in the log lines. See Data set parameters.

groupId String

Identifies the group that has access to the product and this stream configuration.

propertyIds List<String>

Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.

streamName String

The name of the stream.

streamType String

The type of stream that you want to create. Currently, RAW_LOGS is the only possible stream type.

templateName String

The name of the data set template available for the product that you want to use in the stream. Currently, EDGE_LOGS is the only data set template available.

azureConnector DatastreamAzureConnectorArgs

Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:

datadogConnector DatastreamDatadogConnectorArgs

Specify details about the Datadog connector in a stream, including:

emailIds List<String>

A list of email addresses you want to notify about activations and deactivations of the stream.

gcsConnector DatastreamGcsConnectorArgs

Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an Akamai_access_verification_<timestamp>.txt object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:

httpsConnector DatastreamHttpsConnectorArgs

Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:

oracleConnector DatastreamOracleConnectorArgs

Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and tries to save an Akamai_access_verification_<timestamp>.txt file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you’re trying to send logs to.

s3Connector DatastreamS3ConnectorArgs

Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and saves an akamai_write_test_2147483647.txt file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to. The argument includes these sub-arguments:

splunkConnector DatastreamSplunkConnectorArgs

Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with collector/raw. The argument includes these sub-arguments:

sumologicConnector DatastreamSumologicConnectorArgs

Specify details about the Sumo Logic connector in a stream, including:

active boolean

Whether to activate the stream when applying the resource. Set true to activate the stream once the request is sent, or false to leave the stream inactive after the request.

config DatastreamConfigArgs

Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:

contractId string

Identifies the contract that has access to the product.

datasetFieldsIds number[]

Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers defines how the values for these fields appear in the log lines. See Data set parameters.

groupId string

Identifies the group that has access to the product and this stream configuration.

propertyIds string[]

Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.

streamName string

The name of the stream.

streamType string

The type of stream that you want to create. Currently, RAW_LOGS is the only possible stream type.

templateName string

The name of the data set template available for the product that you want to use in the stream. Currently, EDGE_LOGS is the only data set template available.

azureConnector DatastreamAzureConnectorArgs

Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:

datadogConnector DatastreamDatadogConnectorArgs

Specify details about the Datadog connector in a stream, including:

emailIds string[]

A list of email addresses you want to notify about activations and deactivations of the stream.

gcsConnector DatastreamGcsConnectorArgs

Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an Akamai_access_verification_<timestamp>.txt object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:

httpsConnector DatastreamHttpsConnectorArgs

Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:

oracleConnector DatastreamOracleConnectorArgs

Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and tries to save an Akamai_access_verification_<timestamp>.txt file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you’re trying to send logs to.

s3Connector DatastreamS3ConnectorArgs

Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and saves an akamai_write_test_2147483647.txt file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to. The argument includes these sub-arguments:

splunkConnector DatastreamSplunkConnectorArgs

Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with collector/raw. The argument includes these sub-arguments:

sumologicConnector DatastreamSumologicConnectorArgs

Specify details about the Sumo Logic connector in a stream, including:

active bool

Whether to activate the stream when applying the resource. Set true to activate the stream once the request is sent, or false to leave the stream inactive after the request.

config DatastreamConfigArgs

Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:

contract_id str

Identifies the contract that has access to the product.

dataset_fields_ids Sequence[int]

Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers defines how the values for these fields appear in the log lines. See Data set parameters.

group_id str

Identifies the group that has access to the product and this stream configuration.

property_ids Sequence[str]

Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.

stream_name str

The name of the stream.

stream_type str

The type of stream that you want to create. Currently, RAW_LOGS is the only possible stream type.

template_name str

The name of the data set template available for the product that you want to use in the stream. Currently, EDGE_LOGS is the only data set template available.

azure_connector DatastreamAzureConnectorArgs

Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:

datadog_connector DatastreamDatadogConnectorArgs

Specify details about the Datadog connector in a stream, including:

email_ids Sequence[str]

A list of email addresses you want to notify about activations and deactivations of the stream.

gcs_connector DatastreamGcsConnectorArgs

Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an Akamai_access_verification_<timestamp>.txt object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:

https_connector DatastreamHttpsConnectorArgs

Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:

oracle_connector DatastreamOracleConnectorArgs

Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and tries to save an Akamai_access_verification_<timestamp>.txt file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you’re trying to send logs to.

s3_connector DatastreamS3ConnectorArgs

Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and saves an akamai_write_test_2147483647.txt file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to. The argument includes these sub-arguments:

splunk_connector DatastreamSplunkConnectorArgs

Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with collector/raw. The argument includes these sub-arguments:

sumologic_connector DatastreamSumologicConnectorArgs

Specify details about the Sumo Logic connector in a stream, including:

active Boolean

Whether to activate the stream when applying the resource. Set true to activate the stream once the request is sent, or false to leave the stream inactive after the request.

config Property Map

Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:

contractId String

Identifies the contract that has access to the product.

datasetFieldsIds List<Number>

Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers defines how the values for these fields appear in the log lines. See Data set parameters.

groupId String

Identifies the group that has access to the product and this stream configuration.

propertyIds List<String>

Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.

streamName String

The name of the stream.

streamType String

The type of stream that you want to create. Currently, RAW_LOGS is the only possible stream type.

templateName String

The name of the data set template available for the product that you want to use in the stream. Currently, EDGE_LOGS is the only data set template available.

azureConnector Property Map

Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:

datadogConnector Property Map

Specify details about the Datadog connector in a stream, including:

emailIds List<String>

A list of email addresses you want to notify about activations and deactivations of the stream.

gcsConnector Property Map

Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an Akamai_access_verification_<timestamp>.txt object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:

httpsConnector Property Map

Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:

oracleConnector Property Map

Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and tries to save an Akamai_access_verification_<timestamp>.txt file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you’re trying to send logs to.

s3Connector Property Map

Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and saves an akamai_write_test_2147483647.txt file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to. The argument includes these sub-arguments:

splunkConnector Property Map

Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with collector/raw. The argument includes these sub-arguments:

sumologicConnector Property Map

Specify details about the Sumo Logic connector in a stream, including:
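
To see how these inputs fit together, here is a minimal TypeScript sketch that creates a stream shipping edge logs to Amazon S3. The top-level arguments follow the inputs listed above; all ID values are placeholders, and the config and s3Connector sub-arguments are assumptions drawn from the provider's schema rather than from this section, so verify them against the Supporting Types reference before use.

    import * as pulumi from "@pulumi/pulumi";
    import * as akamai from "@pulumi/akamai";

    const cfg = new pulumi.Config();

    const stream = new akamai.Datastream("example", {
        active: false,                        // leave the stream inactive after creation
        contractId: "C-0N7RAC7",              // hypothetical contract ID
        groupId: "grp_12345",                 // hypothetical group ID
        propertyIds: ["prp_98765"],           // properties to monitor; must be active
        streamName: "example-stream",
        streamType: "RAW_LOGS",               // currently the only stream type
        templateName: "EDGE_LOGS",            // currently the only data set template
        datasetFieldsIds: [1000, 1002, 1023], // order controls field order in log lines
        emailIds: ["ops@example.com"],
        config: {                             // assumed sub-arguments; see Supporting Types
            delimiter: "SPACE",
            format: "STRUCTURED",
            frequency: { timeInSec: 30 },
            uploadFilePrefix: "ak",
            uploadFileSuffix: "ds",
        },
        s3Connector: {                        // assumed sub-arguments; see Supporting Types
            accessKey: cfg.requireSecret("awsAccessKey"),
            secretAccessKey: cfg.requireSecret("awsSecretKey"),
            bucket: "example-logs-bucket",
            connectorName: "s3-connector",
            path: "logs",
            region: "us-east-1",
        },
    });

Keeping the secret values in Pulumi config (requireSecret) rather than in code also sidesteps the import caveat noted above, since secrets are never written in plain text.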

Outputs

All input properties are implicitly available as output properties. Additionally, the Datastream resource produces the following output properties:

CreatedBy string

The username who created the stream

CreatedDate string

The date and time when the stream was created

GroupName string

The name of the user group for which the stream was created

Id string

The provider-assigned unique ID for this managed resource.

ModifiedBy string

The username who modified the stream

ModifiedDate string

The date and time when the stream was modified

PapiJson string

The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior

ProductId string

The ID of the product for which the stream was created

ProductName string

The name of the product for which the stream was created

StreamVersionId int

Identifies the configuration version of the stream

CreatedBy string

The username who created the stream

CreatedDate string

The date and time when the stream was created

GroupName string

The name of the user group for which the stream was created

Id string

The provider-assigned unique ID for this managed resource.

ModifiedBy string

The username who modified the stream

ModifiedDate string

The date and time when the stream was modified

PapiJson string

The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior

ProductId string

The ID of the product for which the stream was created

ProductName string

The name of the product for which the stream was created

StreamVersionId int

Identifies the configuration version of the stream

createdBy String

The username who created the stream

createdDate String

The date and time when the stream was created

groupName String

The name of the user group for which the stream was created

id String

The provider-assigned unique ID for this managed resource.

modifiedBy String

The username who modified the stream

modifiedDate String

The date and time when the stream was modified

papiJson String

The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior

productId String

The ID of the product for which the stream was created

productName String

The name of the product for which the stream was created

streamVersionId Integer

Identifies the configuration version of the stream

createdBy string

The username who created the stream

createdDate string

The date and time when the stream was created

groupName string

The name of the user group for which the stream was created

id string

The provider-assigned unique ID for this managed resource.

modifiedBy string

The username who modified the stream

modifiedDate string

The date and time when the stream was modified

papiJson string

The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior

productId string

The ID of the product for which the stream was created

productName string

The name of the product for which the stream was created

streamVersionId number

Identifies the configuration version of the stream

created_by str

The username who created the stream

created_date str

The date and time when the stream was created

group_name str

The name of the user group for which the stream was created

id str

The provider-assigned unique ID for this managed resource.

modified_by str

The username who modified the stream

modified_date str

The date and time when the stream was modified

papi_json str

The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior

product_id str

The ID of the product for which the stream was created

product_name str

The name of the product for which the stream was created

stream_version_id int

Identifies the configuration version of the stream

createdBy String

The username who created the stream

createdDate String

The date and time when the stream was created

groupName String

The name of the user group for which the stream was created

id String

The provider-assigned unique ID for this managed resource.

modifiedBy String

The username who modified the stream

modifiedDate String

The date and time when the stream was modified

papiJson String

The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior

productId String

The ID of the product for which the stream was created

productName String

The name of the product for which the stream was created

streamVersionId Number

Identifies the configuration version of the stream
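
Continuing the creation sketch above, the provider-computed outputs can be read straight off the resource once it has been created; for example, exporting the stream's version and the PAPI JSON snippet (property names as listed above):

    // Continuing the sketch above: read outputs computed after creation.
    export const streamVersionId = stream.streamVersionId; // configuration version of the stream
    export const papiJson = stream.papiJson;               // JSON to copy into a PAPI configuration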

Look up an Existing Datastream Resource

Get an existing Datastream resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.

public static get(name: string, id: Input<ID>, state?: DatastreamState, opts?: CustomResourceOptions): Datastream
@staticmethod
def get(resource_name: str,
        id: str,
        opts: Optional[ResourceOptions] = None,
        active: Optional[bool] = None,
        azure_connector: Optional[DatastreamAzureConnectorArgs] = None,
        config: Optional[DatastreamConfigArgs] = None,
        contract_id: Optional[str] = None,
        created_by: Optional[str] = None,
        created_date: Optional[str] = None,
        datadog_connector: Optional[DatastreamDatadogConnectorArgs] = None,
        dataset_fields_ids: Optional[Sequence[int]] = None,
        email_ids: Optional[Sequence[str]] = None,
        gcs_connector: Optional[DatastreamGcsConnectorArgs] = None,
        group_id: Optional[str] = None,
        group_name: Optional[str] = None,
        https_connector: Optional[DatastreamHttpsConnectorArgs] = None,
        modified_by: Optional[str] = None,
        modified_date: Optional[str] = None,
        oracle_connector: Optional[DatastreamOracleConnectorArgs] = None,
        papi_json: Optional[str] = None,
        product_id: Optional[str] = None,
        product_name: Optional[str] = None,
        property_ids: Optional[Sequence[str]] = None,
        s3_connector: Optional[DatastreamS3ConnectorArgs] = None,
        splunk_connector: Optional[DatastreamSplunkConnectorArgs] = None,
        stream_name: Optional[str] = None,
        stream_type: Optional[str] = None,
        stream_version_id: Optional[int] = None,
        sumologic_connector: Optional[DatastreamSumologicConnectorArgs] = None,
        template_name: Optional[str] = None) -> Datastream
func GetDatastream(ctx *Context, name string, id IDInput, state *DatastreamState, opts ...ResourceOption) (*Datastream, error)
public static Datastream Get(string name, Input<string> id, DatastreamState? state, CustomResourceOptions? opts = null)
public static Datastream get(String name, Output<String> id, DatastreamState state, CustomResourceOptions options)
Resource lookup is not supported in YAML
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
resource_name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
name
The unique name of the resulting resource.
id
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
The following state arguments are supported:
Active bool

Whether to activate the stream when applying the resource. Set true to activate the stream once the request is sent, or false to leave the stream inactive after the request.

AzureConnector DatastreamAzureConnectorArgs

Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:

Config DatastreamConfigArgs

Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:

ContractId string

Identifies the contract that has access to the product.

CreatedBy string

The username who created the stream

CreatedDate string

The date and time when the stream was created

DatadogConnector DatastreamDatadogConnectorArgs

Specify details about the Datadog connector in a stream, including:

DatasetFieldsIds List<int>

Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers defines how the values for these fields appear in the log lines. See Data set parameters.

EmailIds List<string>

A list of email addresses you want to notify about activations and deactivations of the stream.

GcsConnector DatastreamGcsConnectorArgs

Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an Akamai_access_verification_<timestamp>.txt object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:

GroupId string

Identifies the group that has access to the product and this stream configuration.

GroupName string

The name of the user group for which the stream was created

HttpsConnector DatastreamHttpsConnectorArgs

Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:

ModifiedBy string

The username who modified the stream

ModifiedDate string

The date and time when the stream was modified

OracleConnector DatastreamOracleConnectorArgs

Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and tries to save an Akamai_access_verification_<timestamp>.txt file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you’re trying to send logs to.

PapiJson string

The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior

ProductId string

The ID of the product for which the stream was created

ProductName string

The name of the product for which the stream was created

PropertyIds List<string>

Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.

S3Connector DatastreamS3ConnectorArgs

Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and saves an akamai_write_test_2147483647.txt file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to. The argument includes these sub-arguments:

SplunkConnector DatastreamSplunkConnectorArgs

Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with collector/raw. The argument includes these sub-arguments:

StreamName string

The name of the stream.

StreamType string

The type of stream that you want to create. Currently, RAW_LOGS is the only possible stream type.

StreamVersionId int

Identifies the configuration version of the stream

SumologicConnector DatastreamSumologicConnectorArgs

Specify details about the Sumo Logic connector in a stream, including:

TemplateName string

The name of the data set template available for the product that you want to use in the stream. Currently, EDGE_LOGS is the only data set template available.

Active bool

Whether to activate the stream when applying the resource. Set true to activate the stream once the request is sent, or false to leave the stream inactive after the request.

AzureConnector DatastreamAzureConnectorArgs

Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:

Config DatastreamConfigArgs

Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:

ContractId string

Identifies the contract that has access to the product.

CreatedBy string

The username who created the stream

CreatedDate string

The date and time when the stream was created

DatadogConnector DatastreamDatadogConnectorArgs

Specify details about the Datadog connector in a stream, including:

DatasetFieldsIds []int

Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers defines how the values for these fields appear in the log lines. See Data set parameters.

EmailIds []string

A list of email addresses you want to notify about activations and deactivations of the stream.

GcsConnector DatastreamGcsConnectorArgs

Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an Akamai_access_verification_<timestamp>.txt object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:

GroupId string

Identifies the group that has access to the product and this stream configuration.

GroupName string

The name of the user group for which the stream was created

HttpsConnector DatastreamHttpsConnectorArgs

Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:

ModifiedBy string

The username who modified the stream

ModifiedDate string

The date and time when the stream was modified

OracleConnector DatastreamOracleConnectorArgs

Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and tries to save an Akamai_access_verification_<timestamp>.txt file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you’re trying to send logs to.

PapiJson string

The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior

ProductId string

The ID of the product for which the stream was created

ProductName string

The name of the product for which the stream was created

PropertyIds []string

Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.

S3Connector DatastreamS3ConnectorArgs

Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and saves an akamai_write_test_2147483647.txt file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to. The argument includes these sub-arguments:

SplunkConnector DatastreamSplunkConnectorArgs

Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with collector/raw. The argument includes these sub-arguments:

StreamName string

The name of the stream.

StreamType string

The type of stream that you want to create. Currently, RAW_LOGS is the only possible stream type.

StreamVersionId int

Identifies the configuration version of the stream

SumologicConnector DatastreamSumologicConnectorArgs

Specify details about the Sumo Logic connector in a stream, including:

TemplateName string

The name of the data set template available for the product that you want to use in the stream. Currently, EDGE_LOGS is the only data set template available.

active Boolean

Whether to activate the stream when applying the resource. Set true to activate the stream once the request is sent, or false to leave the stream inactive after the request.

azureConnector DatastreamAzureConnectorArgs

Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:

config DatastreamConfigArgs

Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:

contractId String

Identifies the contract that has access to the product.

createdBy String

The username who created the stream

createdDate String

The date and time when the stream was created

datadogConnector DatastreamDatadogConnectorArgs

Specify details about the Datadog connector in a stream, including:

datasetFieldsIds List<Integer>

Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers defines how the values for these fields appear in the log lines. See Data set parameters.

emailIds List<String>

A list of email addresses you want to notify about activations and deactivations of the stream.

gcsConnector DatastreamGcsConnectorArgs

Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an Akamai_access_verification_<timestamp>.txt object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:

groupId String

Identifies the group that has access to the product and this stream configuration.

groupName String

The name of the user group for which the stream was created

httpsConnector DatastreamHttpsConnectorArgs

Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:

modifiedBy String

The username who modified the stream

modifiedDate String

The date and time when the stream was modified

oracleConnector DatastreamOracleConnectorArgs

Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and tries to save an Akamai_access_verification_<timestamp>.txt file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you’re trying to send logs to.

papiJson String

The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior

productId String

The ID of the product for which the stream was created

productName String

The name of the product for which the stream was created

propertyIds List<String>

Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.

s3Connector DatastreamS3ConnectorArgs

Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and saves an akamai_write_test_2147483647.txt file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to. The argument includes these sub-arguments:

splunkConnector DatastreamSplunkConnectorArgs

Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with collector/raw. The argument includes these sub-arguments:

streamName String

The name of the stream.

streamType String

The type of stream that you want to create. Currently, RAW_LOGS is the only possible stream type.

streamVersionId Integer

Identifies the configuration version of the stream

sumologicConnector DatastreamSumologicConnectorArgs

Specify details about the Sumo Logic connector in a stream, including:

templateName String

The name of the data set template available for the product that you want to use in the stream. Currently, EDGE_LOGS is the only data set template available.

active boolean

Whether to activate the stream when applying the resource. Set true to activate the stream once the request is sent, or false to leave the stream inactive after the request.

azureConnector DatastreamAzureConnectorArgs

Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:

config DatastreamConfigArgs

Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:

contractId string

Identifies the contract that has access to the product.

createdBy string

The username who created the stream

createdDate string

The date and time when the stream was created

datadogConnector DatastreamDatadogConnectorArgs

Specify details about the Datadog connector in a stream, including:

datasetFieldsIds number[]

Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers defines how the values for these fields appear in the log lines. See Data set parameters.

emailIds string[]

A list of email addresses you want to notify about activations and deactivations of the stream.

gcsConnector DatastreamGcsConnectorArgs

Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an Akamai_access_verification_<timestamp>.txt object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:

groupId string

Identifies the group that has access to the product and this stream configuration.

groupName string

The name of the user group for which the stream was created

httpsConnector DatastreamHttpsConnectorArgs

Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:

modifiedBy string

The username who modified the stream

modifiedDate string

The date and time when the stream was modified

oracleConnector DatastreamOracleConnectorArgs

Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and tries to save an Akamai_access_verification_<timestamp>.txt file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you’re trying to send logs to.

papiJson string

The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior

productId string

The ID of the product for which the stream was created

productName string

The name of the product for which the stream was created

propertyIds string[]

Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.

s3Connector DatastreamS3ConnectorArgs

Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and saves an akamai_write_test_2147483647.txt file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to. The argument includes these sub-arguments:

splunkConnector DatastreamSplunkConnectorArgs

Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with collector/raw. The argument includes these sub-arguments:

streamName string

The name of the stream.

streamType string

The type of stream that you want to create. Currently, RAW_LOGS is the only possible stream type.

streamVersionId number

Identifies the configuration version of the stream

sumologicConnector DatastreamSumologicConnectorArgs

Specify details about the Sumo Logic connector in a stream, including:

templateName string

The name of the data set template available for the product that you want to use in the stream. Currently, EDGE_LOGS is the only data set template available.

active bool

Whether to activate the stream when applying the resource. Set true to activate the stream once the request is sent, or false to leave the stream inactive after the request.

azure_connector DatastreamAzureConnectorArgs

Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:

config DatastreamConfigArgs

Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:

contract_id str

Identifies the contract that has access to the product.

created_by str

The username who created the stream

created_date str

The date and time when the stream was created

datadog_connector DatastreamDatadogConnectorArgs

Specify details about the Datadog connector in a stream, including:

dataset_fields_ids Sequence[int]

Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers defines how the values for these fields appear in the log lines. See Data set parameters.

email_ids Sequence[str]

A list of email addresses you want to notify about activations and deactivations of the stream.

gcs_connector DatastreamGcsConnectorArgs

Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an Akamai_access_verification_<timestamp>.txt object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:

group_id str

Identifies the group that has access to the product and this stream configuration.

group_name str

The name of the user group for which the stream was created

https_connector DatastreamHttpsConnectorArgs

Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:

modified_by str

The username who modified the stream

modified_date str

The date and time when the stream was modified

oracle_connector DatastreamOracleConnectorArgs

Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and tries to save an Akamai_access_verification_<timestamp>.txt file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you’re trying to send logs to.

papi_json str

The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior

product_id str

The ID of the product for which the stream was created

product_name str

The name of the product for which the stream was created

property_ids Sequence[str]

Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.

s3_connector DatastreamS3ConnectorArgs

Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and saves an akamai_write_test_2147483647.txt file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to. The argument includes these sub-arguments:

splunk_connector DatastreamSplunkConnectorArgs

Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with collector/raw. The argument includes these sub-arguments:

stream_name str

The name of the stream.

stream_type str

The type of stream that you want to create. Currently, RAW_LOGS is the only possible stream type.

stream_version_id int

Identifies the configuration version of the stream

sumologic_connector DatastreamSumologicConnectorArgs

Specify details about the Sumo Logic connector in a stream, including:

template_name str

The name of the data set template available for the product that you want to use in the stream. Currently, EDGE_LOGS is the only data set template available.

active Boolean

Whether to activate the stream when applying the resource. Set true to activate the stream once the request is sent, or false to leave the stream inactive after the request.

azureConnector Property Map

Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:

config Property Map

Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:

contractId String

Identifies the contract that has access to the product.

createdBy String

The username who created the stream

createdDate String

The date and time when the stream was created

datadogConnector Property Map

Specify details about the Datadog connector in a stream, including:

datasetFieldsIds List<Number>

Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers defines how the values for these fields appear in the log lines. See Data set parameters.

emailIds List<String>

A list of email addresses you want to notify about activations and deactivations of the stream.

gcsConnector Property Map

Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an Akamai_access_verification_<timestamp>.txt object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:

groupId String

Identifies the group that has access to the product and this stream configuration.

groupName String

The name of the user group for which the stream was created

httpsConnector Property Map

Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:

modifiedBy String

The username who modified the stream

modifiedDate String

The date and time when the stream was modified

oracleConnector Property Map

Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and tries to save an Akamai_access_verification_<timestamp>.txt file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you’re trying to send logs to.

papiJson String

The stream configuration in JSON format that you can copy into your Property Manager API (PAPI) configuration to enable the datastream behavior

productId String

The ID of the product for which the stream was created

productName String

The name of the product for which the stream was created

propertyIds List<String>

Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.

s3Connector Property Map

Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and saves an akamai_write_test_2147483647.txt file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to. The argument includes these sub-arguments:

splunkConnector Property Map

Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with collector/raw. The argument includes these sub-arguments:

streamName String

The name of the stream.

streamType String

The type of stream that you want to create. Currently, RAW_LOGS is the only possible stream type.

streamVersionId Number

Identifies the configuration version of the stream

sumologicConnector Property Map

Specify details about the Sumo Logic connector in a stream, including:

templateName String

The name of the data set template available for the product that you want to use in the stream. Currently, EDGE_LOGS is the only data set template available.
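
For orientation, here is a minimal Python sketch that ties these arguments together. Every identifier and credential below is a placeholder, not a working value; swap in the group, contract, property, and connector details from your own account.

import pulumi_akamai as akamai

# Minimal sketch: an inactive stream that ships structured edge logs to S3.
# All IDs and credentials are placeholders.
stream = akamai.Datastream("example",
    active=False,                          # leave inactive until validated
    stream_name="example-stream",
    stream_type="RAW_LOGS",                # currently the only stream type
    template_name="EDGE_LOGS",             # currently the only template
    group_id="grp_12345",
    contract_id="ctr_1-ABC123",
    property_ids=["prp_123456"],
    dataset_fields_ids=[1000, 1002, 1023], # order controls log line layout
    email_ids=["ops@example.com"],
    config=akamai.DatastreamConfigArgs(
        format="STRUCTURED",
        delimiter="SPACE",                 # required for STRUCTURED
        frequency=akamai.DatastreamConfigFrequencyArgs(time_in_sec=30),
    ),
    s3_connector=akamai.DatastreamS3ConnectorArgs(
        connector_name="s3-logs",
        bucket="example-log-bucket",
        path="logs/akamai",
        region="us-east-1",
        access_key="AKIA_PLACEHOLDER",         # Secret
        secret_access_key="SECRET_PLACEHOLDER",# Secret
    ))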

Supporting Types

DatastreamAzureConnector

AccessKey string

Secret. The access key that you use to authenticate requests to your Azure Storage account.

AccountName string

Specifies the Azure Storage account name.

ConnectorName string

The name of the connector.

ContainerName string

Specifies the Azure Storage container name.

Path string

The path to the folder within your Azure Storage container where you want to store your logs.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

ConnectorId int

AccessKey string

Secret. The access key that you use to authenticate requests to your Azure Storage account.

AccountName string

Specifies the Azure Storage account name.

ConnectorName string

The name of the connector.

ContainerName string

Specifies the Azure Storage container name.

Path string

The path to the folder within your Azure Storage container where you want to store your logs.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

ConnectorId int

accessKey String

Secret. The access key that you use to authenticate requests to your Azure Storage account.

accountName String

Specifies the Azure Storage account name.

connectorName String

The name of the connector.

containerName String

Specifies the Azure Storage container name.

path String

The path to the folder within your Azure Storage container where you want to store your logs.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId Integer

accessKey string

Secret. The access key that you use to authenticate requests to your Azure Storage account.

accountName string

Specifies the Azure Storage account name.

connectorName string

The name of the connector.

containerName string

Specifies the Azure Storage container name.

path string

The path to the folder within your Azure Storage container where you want to store your logs.

compressLogs boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId number

access_key str

Secret. The access key that you use to authenticate requests to your Azure Storage account.

account_name str

Specifies the Azure Storage account name.

connector_name str

The name of the connector.

container_name str

Specifies the Azure Storage container name.

path str

The path to the folder within your Azure Storage container where you want to store your logs.

compress_logs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connector_id int

accessKey String

Secret. The access key that you use to authenticate requests to your Azure Storage account.

accountName String

Specifies the Azure Storage account name.

connectorName String

The name of the connector.

containerName String

Specifies the Azure Storage container name.

path String

The path to the folder within your Azure Storage container where you want to store your logs.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId Number
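
For illustration, a minimal Python sketch of this connector block, with placeholder account, container, and key values:

import pulumi_akamai as akamai

# Placeholder values; supply your own Azure Storage credentials.
azure = akamai.DatastreamAzureConnectorArgs(
    connector_name="azure-logs",
    account_name="examplestorageacct",
    container_name="akamai-logs",
    path="logs/akamai",
    access_key="AZURE_ACCESS_KEY_PLACEHOLDER",  # marked Secret above
)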

DatastreamConfig

Format string

The format in which you want to receive log files, either STRUCTURED or JSON. When delimiter is present in the request, STRUCTURED is the mandatory format.

Frequency DatastreamConfigFrequency

How often you want to collect logs from each uploader and send them to a destination.

Delimiter string

A delimiter that you want to use to separate data set fields in the log lines. Currently, SPACE is the only available delimiter. This field is required for the STRUCTURED log file format.

UploadFilePrefix string

The prefix of the log file that you want to send to a destination. It’s a string of at most 200 characters. If unspecified, defaults to ak.

UploadFileSuffix string

The suffix of the log file that you want to send to a destination. It’s a static string of at most 10 characters. If unspecified, defaults to ds.

Format string

The format in which you want to receive log files, either STRUCTURED or JSON. When delimiter is present in the request, STRUCTURED is the mandatory format.

Frequency DatastreamConfigFrequency

How often you want to collect logs from each uploader and send them to a destination.

Delimiter string

A delimiter that you want to use to separate data set fields in the log lines. Currently, SPACE is the only available delimiter. This field is required for the STRUCTURED log file format.

UploadFilePrefix string

The prefix of the log file that you want to send to a destination. It’s a string of at most 200 characters. If unspecified, defaults to ak.

UploadFileSuffix string

The suffix of the log file that you want to send to a destination. It’s a static string of at most 10 characters. If unspecified, defaults to ds.

format String

The format in which you want to receive log files, either STRUCTURED or JSON. When delimiter is present in the request, STRUCTURED is the mandatory format.

frequency DatastreamConfigFrequency

How often you want to collect logs from each uploader and send them to a destination.

delimiter String

A delimiter that you want to use to separate data set fields in the log lines. Currently, SPACE is the only available delimiter. This field is required for the STRUCTURED log file format.

uploadFilePrefix String

The prefix of the log file that you want to send to a destination. It’s a string of at most 200 characters. If unspecified, defaults to ak.

uploadFileSuffix String

The suffix of the log file that you want to send to a destination. It’s a static string of at most 10 characters. If unspecified, defaults to ds.

format string

The format in which you want to receive log files, either STRUCTURED or JSON. When delimiter is present in the request, STRUCTURED is the mandatory format.

frequency DatastreamConfigFrequency

How often you want to collect logs from each uploader and send them to a destination.

delimiter string

A delimiter that you want to use to separate data set fields in the log lines. Currently, SPACE is the only available delimiter. This field is required for the STRUCTURED log file format.

uploadFilePrefix string

The prefix of the log file that you want to send to a destination. It’s a string of at most 200 characters. If unspecified, defaults to ak.

uploadFileSuffix string

The suffix of the log file that you want to send to a destination. It’s a static string of at most 10 characters. If unspecified, defaults to ds.

format str

The format in which you want to receive log files, either STRUCTURED or JSON. When delimiter is present in the request, STRUCTURED is the mandatory format.

frequency DatastreamConfigFrequency

How often you want to collect logs from each uploader and send them to a destination.

delimiter str

A delimiter that you want to use to separate data set fields in the log lines. Currently, SPACE is the only available delimiter. This field is required for the STRUCTURED log file format.

upload_file_prefix str

The prefix of the log file that you want to send to a destination. It’s a string of at most 200 characters. If unspecified, defaults to ak.

upload_file_suffix str

The suffix of the log file that you want to send to a destination. It’s a static string of at most 10 characters. If unspecified, defaults to ds.

format String

The format in which you want to receive log files, either STRUCTURED or JSON. When delimiter is present in the request, STRUCTURED is the mandatory format.

frequency Property Map

How often you want to collect logs from each uploader and send them to a destination.

delimiter String

A delimiter that you want to use to separate data set fields in the log lines. Currently, SPACE is the only available delimiter. This field is required for the STRUCTURED log file format.

uploadFilePrefix String

The prefix of the log file that you want to send to a destination. It’s a string of at most 200 characters. If unspecified, defaults to ak.

uploadFileSuffix String

The suffix of the log file that you want to send to a destination. It’s a static string of at most 10 characters. If unspecified, defaults to ds.

DatastreamConfigFrequency

TimeInSec int

The time in seconds after which the system bundles log lines into a file and sends it to a destination. 30 or 60 are the possible values.

TimeInSec int

The time in seconds after which the system bundles log lines into a file and sends it to a destination. 30 or 60 are the possible values.

timeInSec Integer

The time in seconds after which the system bundles log lines into a file and sends it to a destination. 30 or 60 are the possible values.

timeInSec number

The time in seconds after which the system bundles log lines into a file and sends it to a destination. 30 or 60 are the possible values.

time_in_sec int

The time in seconds after which the system bundles log lines into a file and sends it to a destination. 30 or 60 are the possible values.

timeInSec Number

The time in seconds after which the system bundles log lines into a file and sends it to a destination. 30 or 60 are the possible values.
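
Taken together, a minimal Python sketch of the config block with its nested frequency; the values shown are one valid combination, and the prefix and suffix simply restate the defaults:

import pulumi_akamai as akamai

# STRUCTURED output with the SPACE delimiter, bundled and sent every 30 seconds.
config = akamai.DatastreamConfigArgs(
    format="STRUCTURED",
    delimiter="SPACE",          # required when format is STRUCTURED
    frequency=akamai.DatastreamConfigFrequencyArgs(
        time_in_sec=30,         # 30 or 60 are the possible values
    ),
    upload_file_prefix="ak",    # optional; these are the defaults
    upload_file_suffix="ds",
)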

DatastreamDatadogConnector

AuthToken string

Secret. The API key associated with your Datadog account. See View API keys in Datadog.

ConnectorName string

The name of the connector.

Url string

Enter the secure URL where you want to send and store your logs.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. For the Datadog connector, this defaults to false if unspecified.

ConnectorId int

Service string

The service of the Datadog connector. A service groups together endpoints, queries, or jobs for the purposes of scaling instances. See View Datadog reserved attribute list.

Source string

The source of the Datadog connector. See View Datadog reserved attribute list.

Tags string

The tags of the Datadog connector. See View Datadog tags.

AuthToken string

Secret. The API key associated with your Datadog account. See View API keys in Datadog.

ConnectorName string

The name of the connector.

Url string

Enter the secure URL where you want to send and store your logs.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. For the Datadog connector, this defaults to false if unspecified.

ConnectorId int

Service string

The service of the Datadog connector. A service groups together endpoints, queries, or jobs for the purposes of scaling instances. See View Datadog reserved attribute list.

Source string

The source of the Datadog connector. See View Datadog reserved attribute list.

Tags string

The tags of the Datadog connector. See View Datadog tags.

authToken String

Secret. The API key associated with your Datadog account. See View API keys in Datadog.

connectorName String

The name of the connector.

url String

Enter the secure URL where you want to send and store your logs.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. For the Datadog connector, this defaults to false if unspecified.

connectorId Integer

service String

The service of the Datadog connector. A service groups together endpoints, queries, or jobs for the purposes of scaling instances. See View Datadog reserved attribute list.

source String

The source of the Datadog connector. See View Datadog reserved attribute list.

tags String

The tags of the Datadog connector. See View Datadog tags.

authToken string

Secret. The API key associated with your Datadog account. See View API keys in Datadog.

connectorName string

The name of the connector.

url string

Enter the secure URL where you want to send and store your logs.

compressLogs boolean

Enables GZIP compression for a log file sent to a destination. For the Datadog connector, this defaults to false if unspecified.

connectorId number

service string

The service of the Datadog connector. A service groups together endpoints, queries, or jobs for the purposes of scaling instances. See View Datadog reserved attribute list.

source string

The source of the Datadog connector. See View Datadog reserved attribute list.

tags string

The tags of the Datadog connector. See View Datadog tags.

auth_token str

Secret. The API key associated with your Datadog account. See View API keys in Datadog.

connector_name str

The name of the connector.

url str

Enter the secure URL where you want to send and store your logs.

compress_logs bool

Enables GZIP compression for a log file sent to a destination. For the Datadog connector, this defaults to false if unspecified.

connector_id int

service str

The service of the Datadog connector. A service groups together endpoints, queries, or jobs for the purposes of scaling instances. See View Datadog reserved attribute list.

source str

The source of the Datadog connector. See View Datadog reserved attribute list.

tags str

The tags of the Datadog connector. See View Datadog tags.

authToken String

Secret. The API key associated with your Datadog account. See View API keys in Datadog.

connectorName String

The name of the connector.

url String

Enter the secure URL where you want to send and store your logs.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. For the Datadog connector, this defaults to false if unspecified.

connectorId Number

service String

The service of the Datadog connector. A service groups together endpoints, queries, or jobs for the purposes of scaling instances. See View Datadog reserved attribute list.

source String

The source of the Datadog connector. See View Datadog reserved attribute list.

tags String

The tags of the Datadog connector. See View Datadog tags.
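
For illustration, a minimal Python sketch of this connector block. The intake URL shown is a hypothetical example, and the token is a placeholder:

import pulumi_akamai as akamai

# Placeholder values; auth_token is the Datadog API key (Secret above).
datadog = akamai.DatastreamDatadogConnectorArgs(
    connector_name="datadog-logs",
    url="https://http-intake.logs.datadoghq.com/v1/input",  # hypothetical intake URL
    auth_token="DATADOG_API_KEY_PLACEHOLDER",
    service="akamai-logs",     # optional Datadog attributes
    source="akamai",
    tags="env:prod",
    compress_logs=False,       # optional; defaults to false for Datadog
)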

DatastreamGcsConnector

Bucket string

The name of the Google Cloud Storage bucket where you want to store your logs.

ConnectorName string

The name of the connector.

PrivateKey string

Secret. The contents of the JSON private key you generated and downloaded in your Google Cloud Storage account.

ProjectId string

The unique ID of your Google Cloud project.

ServiceAccountName string

The name of the service account with the storage.object.create permission or Storage Object Creator role.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

ConnectorId int

Path string

The path to the folder within your Google Cloud Storage bucket where you want to store your logs.

Bucket string

The name of the Google Cloud Storage bucket where you want to store your logs.

ConnectorName string

The name of the connector.

PrivateKey string

Secret. The contents of the JSON private key you generated and downloaded in your Google Cloud Storage account.

ProjectId string

The unique ID of your Google Cloud project.

ServiceAccountName string

The name of the service account with the storage.object.create permission or Storage Object Creator role.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

ConnectorId int

Path string

The path to the folder within your Google Cloud Storage bucket where you want to store your logs.

bucket String

The name of the Google Cloud Storage bucket where you want to store your logs.

connectorName String

The name of the connector.

privateKey String

Secret. The contents of the JSON private key you generated and downloaded in your Google Cloud Storage account.

projectId String

The unique ID of your Google Cloud project.

serviceAccountName String

The name of the service account with the storage.object.create permission or Storage Object Creator role.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId Integer

path String

The path to the folder within your Google Cloud Storage bucket where you want to store your logs.

bucket string

The name of the Google Cloud Storage bucket where you want to store your logs.

connectorName string

The name of the connector.

privateKey string

Secret. The contents of the JSON private key you generated and downloaded in your Google Cloud Storage account.

projectId string

The unique ID of your Google Cloud project.

serviceAccountName string

The name of the service account with the storage.object.create permission or Storage Object Creator role.

compressLogs boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId number

path string

The path to the folder within your Google Cloud Storage bucket where you want to store your logs.

bucket str

The name of the Google Cloud Storage bucket where you want to store your logs.

connector_name str

The name of the connector.

private_key str

Secret. The contents of the JSON private key you generated and downloaded in your Google Cloud Storage account.

project_id str

The unique ID of your Google Cloud project.

service_account_name str

The name of the service account with the storage.object.create permission or Storage Object Creator role.

compress_logs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connector_id int

path str

The path to the folder within your Google Cloud Storage bucket where you want to store your logs.

bucket String

The name of the Google Cloud Storage bucket where you want to store your logs.

connectorName String

The name of the connector.

privateKey String

Secret. The contents of the JSON private key you generated and downloaded in your Google Cloud Storage account.

projectId String

The unique ID of your Google Cloud project.

serviceAccountName String

The name of the service account with the storage.object.create permission or Storage Object Creator role.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId Number

path String

The path to the folder within your Google Cloud Storage bucket where you want to store your logs.
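
For illustration, a minimal Python sketch of this connector block, with placeholder project, bucket, and key values:

import pulumi_akamai as akamai

# Placeholder values; private_key is the JSON key contents (Secret above).
gcs = akamai.DatastreamGcsConnectorArgs(
    connector_name="gcs-logs",
    bucket="example-log-bucket",
    path="logs/akamai",
    project_id="example-project-id",
    service_account_name="log-writer@example-project-id.iam.gserviceaccount.com",
    private_key="-----BEGIN PRIVATE KEY-----\nPLACEHOLDER\n-----END PRIVATE KEY-----\n",
)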

DatastreamHttpsConnector

AuthenticationType string

Either NONE for no authentication, or BASIC. For basic authentication, provide the user_name and password you set in your custom HTTPS endpoint.

ConnectorName string

The name of the connector.

Url string

Enter the secure URL where you want to send and store your logs.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

ConnectorId int

Password string

Secret. Enter the password you set in your custom HTTPS endpoint for authentication.

UserName string

Secret. Enter the valid username you set in your custom HTTPS endpoint for authentication.

AuthenticationType string

Either NONE for no authentication, or BASIC. For basic authentication, provide the user_name and password you set in your custom HTTPS endpoint.

ConnectorName string

The name of the connector.

Url string

Enter the secure URL where you want to send and store your logs.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

ConnectorId int

Password string

Secret. Enter the password you set in your custom HTTPS endpoint for authentication.

UserName string

Secret. Enter the valid username you set in your custom HTTPS endpoint for authentication.

authenticationType String

Either NONE for no authentication, or BASIC. For basic authentication, provide the user_name and password you set in your custom HTTPS endpoint.

connectorName String

The name of the connector.

url String

Enter the secure URL where you want to send and store your logs.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId Integer

password String

Secret. Enter the password you set in your custom HTTPS endpoint for authentication.

userName String

Secret. Enter the valid username you set in your custom HTTPS endpoint for authentication.

authenticationType string

Either NONE for no authentication, or BASIC. For basic authentication, provide the user_name and password you set in your custom HTTPS endpoint.

connectorName string

The name of the connector.

url string

Enter the secure URL where you want to send and store your logs.

compressLogs boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId number

password string

Secret. Enter the password you set in your custom HTTPS endpoint for authentication.

userName string

Secret. Enter the valid username you set in your custom HTTPS endpoint for authentication.

authentication_type str

Either NONE for no authentication, or BASIC. For basic authentication, provide the user_name and password you set in your custom HTTPS endpoint.

connector_name str

The name of the connector.

url str

Enter the secure URL where you want to send and store your logs.

compress_logs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connector_id int

password str

Secret. Enter the password you set in your custom HTTPS endpoint for authentication.

user_name str

Secret. Enter the valid username you set in your custom HTTPS endpoint for authentication.

authenticationType String

Either NONE for no authentication, or BASIC. For basic authentication, provide the user_name and password you set in your custom HTTPS endpoint.

connectorName String

The name of the connector.

url String

Enter the secure URL where you want to send and store your logs.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId Number

password String

Secret. Enter the password you set in your custom HTTPS endpoint for authentication.

userName String

Secret. Enter the valid username you set in your custom HTTPS endpoint for authentication.
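
For illustration, a minimal Python sketch of the BASIC-authentication variant of this connector; the endpoint URL and credentials are placeholders:

import pulumi_akamai as akamai

# Placeholder values; user_name and password are the endpoint's own credentials.
https = akamai.DatastreamHttpsConnectorArgs(
    connector_name="https-logs",
    url="https://logs.example.com/ingest",   # hypothetical endpoint
    authentication_type="BASIC",             # or NONE to skip user_name/password
    user_name="USER_PLACEHOLDER",            # Secret
    password="PASSWORD_PLACEHOLDER",         # Secret
)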

DatastreamOracleConnector

AccessKey string

Secret. The access key identifier that you use to authenticate requests to your Oracle Cloud account. See Managing user credentials in OCS.

Bucket string

The name of the Oracle Cloud Storage bucket. See Working with Oracle Cloud Storage buckets.

ConnectorName string

The name of the connector.

Namespace string

The namespace of your Oracle Cloud Storage account. See Understanding Object Storage namespaces.

Path string

The path to the folder within your Oracle Cloud Storage bucket where you want to store your logs.

Region string

The Oracle Cloud Storage region where your bucket resides. See Regions and availability domains in OCS.

SecretAccessKey string

Secret. The secret access key identifier that you use to authenticate requests to your Oracle Cloud account.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

ConnectorId int

AccessKey string

Secret. The access key identifier that you use to authenticate requests to your Oracle Cloud account. See Managing user credentials in OCS.

Bucket string

The name of the Oracle Cloud Storage bucket. See Working with Oracle Cloud Storage buckets.

ConnectorName string

The name of the connector.

Namespace string

The namespace of your Oracle Cloud Storage account. See Understanding Object Storage namespaces.

Path string

The path to the folder within your Oracle Cloud Storage bucket where you want to store your logs.

Region string

The Oracle Cloud Storage region where your bucket resides. See Regions and availability domains in OCS.

SecretAccessKey string

Secret. The secret access key identifier that you use to authenticate requests to your Oracle Cloud account.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

ConnectorId int

accessKey String

Secret. The access key identifier that you use to authenticate requests to your Oracle Cloud account. See Managing user credentials in OCS.

bucket String

The name of the Oracle Cloud Storage bucket. See Working with Oracle Cloud Storage buckets.

connectorName String

The name of the connector.

namespace String

The namespace of your Oracle Cloud Storage account. See Understanding Object Storage namespaces.

path String

The path to the folder within your Oracle Cloud Storage bucket where you want to store your logs.

region String

The Oracle Cloud Storage region where your bucket resides. See Regions and availability domains in OCS.

secretAccessKey String

Secret. The secret access key identifier that you use to authenticate requests to your Oracle Cloud account.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId Integer

accessKey string

Secret. The access key identifier that you use to authenticate requests to your Oracle Cloud account. See Managing user credentials in OCS.

bucket string

The name of the Oracle Cloud Storage bucket. See Working with Oracle Cloud Storage buckets.

connectorName string

The name of the connector.

namespace string

The namespace of your Oracle Cloud Storage account. See Understanding Object Storage namespaces.

path string

The path to the folder within your Oracle Cloud Storage bucket where you want to store your logs.

region string

The Oracle Cloud Storage region where your bucket resides. See Regions and availability domains in OCS.

secretAccessKey string

Secret. The secret access key identifier that you use to authenticate requests to your Oracle Cloud account.

compressLogs boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId number

access_key str

Secret. The access key identifier that you use to authenticate requests to your Oracle Cloud account. See Managing user credentials in OCS.

bucket str

The name of the Oracle Cloud Storage bucket. See Working with Oracle Cloud Storage buckets.

connector_name str

The name of the connector.

namespace str

The namespace of your Oracle Cloud Storage account. See Understanding Object Storage namespaces.

path str

The path to the folder within your Oracle Cloud Storage bucket where you want to store your logs.

region str

The Oracle Cloud Storage region where your bucket resides. See Regions and availability domains in OCS.

secret_access_key str

Secret. The secret access key identifier that you use to authenticate requests to your Oracle Cloud account.

compress_logs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connector_id int

accessKey String

Secret. The access key identifier that you use to authenticate requests to your Oracle Cloud account. See Managing user credentials in OCS.

bucket String

The name of the Oracle Cloud Storage bucket. See Working with Oracle Cloud Storage buckets.

connectorName String

The name of the connector.

namespace String

The namespace of your Oracle Cloud Storage account. See Understanding Object Storage namespaces.

path String

The path to the folder within your Oracle Cloud Storage bucket where you want to store your logs.

region String

The Oracle Cloud Storage region where your bucket resides. See Regions and availability domains in OCS.

secretAccessKey String

Secret. The secret access key identifier that you use to authenticate requests to your Oracle Cloud account.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId Number
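
For illustration, a minimal Python sketch of this connector block, with placeholder bucket, namespace, and key values:

import pulumi_akamai as akamai

# Placeholder values; both keys are marked Secret above.
oracle = akamai.DatastreamOracleConnectorArgs(
    connector_name="ocs-logs",
    bucket="example-log-bucket",
    namespace="example-namespace",
    path="logs/akamai",
    region="us-ashburn-1",
    access_key="OCS_ACCESS_KEY_PLACEHOLDER",
    secret_access_key="OCS_SECRET_KEY_PLACEHOLDER",
)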

DatastreamS3Connector

AccessKey string

Secret. The access key identifier that you use to authenticate requests to your Amazon S3 account. See Managing access keys in AWS.

Bucket string

The name of the Amazon S3 bucket where you want to store your logs. See Working with Amazon S3 buckets.

ConnectorName string

The name of the connector.

Path string

The path to the folder within your Amazon S3 bucket where you want to store your logs.

Region string

The AWS region where your Amazon S3 bucket resides. See Regions and Zones in AWS.

SecretAccessKey string

Secret. The secret access key that you use to authenticate requests to your Amazon S3 account.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

ConnectorId int

AccessKey string

Secret. The access key identifier that you use to authenticate requests to your Amazon S3 account. See Managing access keys in AWS.

Bucket string

The name of the Amazon S3 bucket where you want to store your logs. See Working with Amazon S3 buckets.

ConnectorName string

The name of the connector.

Path string

The path to the folder within your Amazon S3 bucket where you want to store your logs.

Region string

The AWS region where your Amazon S3 bucket resides. See Regions and Zones in AWS.

SecretAccessKey string

Secret. The secret access key that you use to authenticate requests to your Amazon S3 account.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

ConnectorId int

accessKey String

Secret. The access key identifier that you use to authenticate requests to your Amazon S3 account. See Managing access keys in AWS.

bucket String

The name of the Amazon S3 bucket where you want to store your logs. See Working with Amazon S3 buckets.

connectorName String

The name of the connector.

path String

The path to the folder within your Amazon S3 bucket where you want to store your logs.

region String

The AWS region where your Amazon S3 bucket resides. See Regions and Zones in AWS.

secretAccessKey String

Secret. The secret access key that you use to authenticate requests to your Amazon S3 account.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId Integer

accessKey string

Secret. The access key identifier that you use to authenticate requests to your Amazon S3 account. See Managing access keys in AWS.

bucket string

The name of the Amazon S3 bucket where you want to store your logs. See Working with Amazon S3 buckets.

connectorName string

The name of the connector.

path string

The path to the folder within your Amazon S3 bucket where you want to store your logs.

region string

The AWS region where your Amazon S3 bucket resides. See Regions and Zones in AWS.

secretAccessKey string

Secret. The secret access key that you use to authenticate requests to your Amazon S3 account.

compressLogs boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId number

access_key str

Secret. The access key identifier that you use to authenticate requests to your Amazon S3 account. See Managing access keys in AWS.

bucket str

The name of the Amazon S3 bucket where you want to store your logs. See Working with Amazon S3 buckets.

connector_name str

The name of the connector.

path str

The path to the folder within your Amazon S3 bucket where you want to store your logs.

region str

The AWS region where your Amazon S3 bucket resides. See Regions and Zones in AWS.

secret_access_key str

Secret. The secret access key that you use to authenticate requests to your Amazon S3 account.

compress_logs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connector_id int

accessKey String

Secret. The access key identifier that you use to authenticate requests to your Amazon S3 account. See Managing access keys in AWS.

bucket String

The name of the Amazon S3 bucket where you want to store your logs. See Working with Amazon S3 buckets.

connectorName String

The name of the connector.

path String

The path to the folder within your Amazon S3 bucket where you want to store your logs.

region String

The AWS region where your Amazon S3 bucket resides. See Regions and Zones in AWS.

secretAccessKey String

Secret. The secret access key that you use to authenticate requests to your Amazon S3 account.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId Number
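
For illustration, a minimal Python sketch of this connector block, with placeholder bucket and key values:

import pulumi_akamai as akamai

# Placeholder values; both keys are marked Secret above.
s3 = akamai.DatastreamS3ConnectorArgs(
    connector_name="s3-logs",
    bucket="example-log-bucket",
    path="logs/akamai",
    region="us-east-1",
    access_key="AKIA_PLACEHOLDER",
    secret_access_key="AWS_SECRET_KEY_PLACEHOLDER",
)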

DatastreamSplunkConnector

ConnectorName string

The name of the connector.

EventCollectorToken string

Secret. The Event Collector token associated with your Splunk account. See View usage of Event Collector token in Splunk.

Url string

Enter the secure URL where you want to send and store your logs.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

ConnectorId int

ConnectorName string

The name of the connector.

EventCollectorToken string

Secret. The Event Collector token associated with your Splunk account. See View usage of Event Collector token in Splunk.

Url string

Enter the secure URL where you want to send and store your logs.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

ConnectorId int

connectorName String

The name of the connector.

eventCollectorToken String

Secret. The Event Collector token associated with your Splunk account. See View usage of Event Collector token in Splunk.

url String

Enter the secure URL where you want to send and store your logs.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId Integer

connectorName string

The name of the connector.

eventCollectorToken string

Secret. The Event Collector token associated with your Splunk account. See View usage of Event Collector token in Splunk.

url string

Enter the secure URL where you want to send and store your logs.

compressLogs boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId number

connector_name str

The name of the connector.

event_collector_token str

Secret. The Event Collector token associated with your Splunk account. See View usage of Event Collector token in Splunk.

url str

Enter the secure URL where you want to send and store your logs.

compress_logs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connector_id int

connectorName String

The name of the connector.

eventCollectorToken String

Secret. The Event Collector token associated with your Splunk account. See View usage of Event Collector token in Splunk.

url String

Enter the secure URL where you want to send and store your logs.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId Number
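
For illustration, a minimal Python sketch of this connector block; note the URL ends with collector/raw as required, and the host and token are placeholders:

import pulumi_akamai as akamai

# Placeholder values; event_collector_token is marked Secret above.
splunk = akamai.DatastreamSplunkConnectorArgs(
    connector_name="splunk-logs",
    url="https://splunk.example.com:8088/services/collector/raw",
    event_collector_token="SPLUNK_HEC_TOKEN_PLACEHOLDER",
)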

DatastreamSumologicConnector

CollectorCode string

Secret. The unique HTTP collector code of your Sumo Logic endpoint.

ConnectorName string

The name of the connector.

Endpoint string

The Sumo Logic collection endpoint where you want to send your logs. You should follow the https://<SumoEndpoint>/receiver/v1/http format and pass the collector code in the collectorCode argument.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

ConnectorId int

CollectorCode string

Secret. The unique HTTP collector code of your Sumo Logic endpoint.

ConnectorName string

The name of the connector.

Endpoint string

The Sumo Logic collection endpoint where you want to send your logs. You should follow the https://<SumoEndpoint>/receiver/v1/http format and pass the collector code in the collectorCode argument.

CompressLogs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

ConnectorId int

collectorCode String

Secret. The unique HTTP collector code of your Sumo Logic endpoint.

connectorName String

The name of the connector.

endpoint String

The Sumo Logic collection endpoint where you want to send your logs. You should follow the https://<SumoEndpoint>/receiver/v1/http format and pass the collector code in the collectorCode argument.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId Integer

collectorCode string

Secret. The unique HTTP collector code of your Sumo Logic endpoint.

connectorName string

The name of the connector.

endpoint string

The Sumo Logic collection endpoint where you want to send your logs. You should follow the https://<SumoEndpoint>/receiver/v1/http format and pass the collector code in the collectorCode argument.

compressLogs boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId number

collector_code str

Secret. The unique HTTP collector code of your Sumo Logic endpoint.

connector_name str

The name of the connector.

endpoint str

The Sumo Logic collection endpoint where you want to send your logs. You should follow the https://<SumoEndpoint>/receiver/v1/http format and pass the collector code in the collectorCode argument.

compress_logs bool

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connector_id int

collectorCode String

Secret. The unique HTTP collector code of your Sumo Logic endpoint.

connectorName String

The name of the connector.

endpoint String

The Sumo Logic collection endpoint where you want to send your logs. You should follow the https://<SumoEndpoint>/receiver/v1/http format and pass the collector code in the collectorCode argument.

compressLogs Boolean

Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.

connectorId Number
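
For illustration, a minimal Python sketch of this connector block; the endpoint host is hypothetical and the collector code is a placeholder:

import pulumi_akamai as akamai

# The endpoint follows the https://<SumoEndpoint>/receiver/v1/http format;
# the collector code goes in collector_code (Secret above), not in the URL.
sumo = akamai.DatastreamSumologicConnectorArgs(
    connector_name="sumologic-logs",
    endpoint="https://collectors.sumologic.com/receiver/v1/http",  # hypothetical host
    collector_code="SUMO_COLLECTOR_CODE_PLACEHOLDER",
)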

Package Details

Repository
https://github.com/pulumi/pulumi-akamai
License
Apache-2.0
Notes

This Pulumi package is based on the akamai Terraform Provider.