akamai.Datastream
Import
Basic usage:
resource "akamai_datastream" "example" {
  # (resource arguments)
}
You can import your Akamai DataStream configuration using a stream version ID. For example:
$ pulumi import akamai:index/datastream:Datastream example 1234
~> IMPORTANT: For security reasons, this command doesn’t import any secrets you specify for your connector. To make sure the state file includes complete data, use this resource to manually add the arguments marked Secret above.
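As the note above says, connector secrets have to be added in code even after an import. Below is a minimal TypeScript sketch of a complete Datastream definition that reads the connector credentials from Pulumi config secrets so they stay encrypted in state. The top-level properties match the Inputs documented below; the nested config and s3Connector sub-argument names (delimiter, frequency, bucket, accessKey, and so on) are illustrative assumptions and are not defined on this page.
import * as pulumi from "@pulumi/pulumi";
import * as akamai from "@pulumi/akamai";

const cfg = new pulumi.Config();

const stream = new akamai.Datastream("example", {
    active: false,                        // leave the stream inactive after creation
    streamName: "example-stream",
    streamType: "RAW_LOGS",               // currently the only supported stream type
    templateName: "EDGE_LOGS",            // currently the only available data set template
    contractId: "C-1234567",
    groupId: "grp_12345",
    propertyIds: ["prp_12345"],           // only active properties produce log data
    datasetFieldsIds: [1000, 1002, 1102],
    emailIds: ["ops@example.com"],
    // Assumed sub-argument names for the log file configuration.
    config: {
        delimiter: "SPACE",
        format: "STRUCTURED",
        frequency: { timeInSec: 30 },
        uploadFilePrefix: "ak",
        uploadFileSuffix: "ds",
    },
    // Assumed sub-argument names for the S3 destination; credentials come
    // from Pulumi config secrets rather than plain text.
    s3Connector: {
        connectorName: "example-s3",
        bucket: "example-logs-bucket",
        path: "datastream/logs",
        region: "us-east-1",
        accessKey: cfg.requireSecret("awsAccessKey"),
        secretAccessKey: cfg.requireSecret("awsSecretAccessKey"),
        compressLogs: true,
    },
});

export const streamVersionId = stream.streamVersionId;
Set the two secrets with pulumi config set --secret awsAccessKey ... (and likewise for awsSecretAccessKey) before running pulumi up.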
Create Datastream Resource
new Datastream(name: string, args: DatastreamArgs, opts?: CustomResourceOptions);
@overload
def Datastream(resource_name: str,
opts: Optional[ResourceOptions] = None,
active: Optional[bool] = None,
azure_connector: Optional[DatastreamAzureConnectorArgs] = None,
config: Optional[DatastreamConfigArgs] = None,
contract_id: Optional[str] = None,
datadog_connector: Optional[DatastreamDatadogConnectorArgs] = None,
dataset_fields_ids: Optional[Sequence[int]] = None,
elasticsearch_connector: Optional[DatastreamElasticsearchConnectorArgs] = None,
email_ids: Optional[Sequence[str]] = None,
gcs_connector: Optional[DatastreamGcsConnectorArgs] = None,
group_id: Optional[str] = None,
https_connector: Optional[DatastreamHttpsConnectorArgs] = None,
loggly_connector: Optional[DatastreamLogglyConnectorArgs] = None,
new_relic_connector: Optional[DatastreamNewRelicConnectorArgs] = None,
oracle_connector: Optional[DatastreamOracleConnectorArgs] = None,
property_ids: Optional[Sequence[str]] = None,
s3_connector: Optional[DatastreamS3ConnectorArgs] = None,
splunk_connector: Optional[DatastreamSplunkConnectorArgs] = None,
stream_name: Optional[str] = None,
stream_type: Optional[str] = None,
sumologic_connector: Optional[DatastreamSumologicConnectorArgs] = None,
template_name: Optional[str] = None)
@overload
def Datastream(resource_name: str,
args: DatastreamArgs,
opts: Optional[ResourceOptions] = None)
func NewDatastream(ctx *Context, name string, args DatastreamArgs, opts ...ResourceOption) (*Datastream, error)
public Datastream(string name, DatastreamArgs args, CustomResourceOptions? opts = null)
public Datastream(String name, DatastreamArgs args)
public Datastream(String name, DatastreamArgs args, CustomResourceOptions options)
type: akamai:Datastream
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args DatastreamArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- resource_name str
- The unique name of the resource.
- args DatastreamArgs
- The arguments to resource properties.
- opts ResourceOptions
- Bag of options to control resource's behavior.
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args DatastreamArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args DatastreamArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- name String
- The unique name of the resource.
- args DatastreamArgs
- The arguments to resource properties.
- options CustomResourceOptions
- Bag of options to control resource's behavior.
Datastream Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.
Inputs
The Datastream resource accepts the following input properties:
(Property names and types below follow the .NET SDK. The other language SDKs expose the same properties with their idiomatic casing and collection types; for example, DatasetFieldsIds List<int> here corresponds to datasetFieldsIds: number[] in TypeScript/JavaScript, DatasetFieldsIds []int in Go, datasetFieldsIds List<Integer> in Java, dataset_fields_ids: Sequence[int] in Python, and datasetFieldsIds List<Number> with Property Map connector blocks in YAML.)
- Active bool
  Whether you want to start activating the stream when applying the resource. Either true for activating the stream upon sending the request or false for leaving the stream inactive after the request. (A sketch of driving this flag from stack configuration follows this list.)
- Config DatastreamConfigArgs
  Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:
- ContractId string
  Identifies the contract that has access to the product.
- DatasetFieldsIds List<int>
  Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers defines how the values for these fields appear in the log lines. See Data set parameters.
- GroupId string
  Identifies the group that has access to the product and this stream configuration.
- PropertyIds List<string>
  Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.
- StreamName string
  The name of the stream.
- StreamType string
  The type of stream that you want to create. Currently, RAW_LOGS is the only possible stream type.
- TemplateName string
  The name of the data set template available for the product that you want to use in the stream. Currently, EDGE_LOGS is the only data set template available.
- AzureConnector DatastreamAzureConnectorArgs
  Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:
- DatadogConnector DatastreamDatadogConnectorArgs
  Specify details about the Datadog connector in a stream, including:
- ElasticsearchConnector DatastreamElasticsearchConnectorArgs
  Specify details about the Elasticsearch connector you can use in a stream, including:
- EmailIds List<string>
  A list of email addresses you want to notify about activations and deactivations of the stream.
- GcsConnector DatastreamGcsConnectorArgs
  Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an Akamai_access_verification_<timestamp>.txt object file in your GCS bucket. You can only see this file if the validation process is successful and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:
- HttpsConnector DatastreamHttpsConnectorArgs
  Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:
- LogglyConnector DatastreamLogglyConnectorArgs
  Specify details about the Loggly connector you can use in a stream, including:
- NewRelicConnector DatastreamNewRelicConnectorArgs
  Specify details about the New Relic connector you can use in a stream, including:
- OracleConnector DatastreamOracleConnectorArgs
  Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and tries to save an Akamai_access_verification_<timestamp>.txt file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful and you have access to the Oracle Cloud Storage bucket and folder that you're trying to send logs to.
- S3Connector DatastreamS3ConnectorArgs
  Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and saves an akamai_write_test_2147483647.txt file in your Amazon S3 folder. You can only see this file if validation succeeds and you have access to the Amazon S3 bucket and folder that you're trying to send logs to. The argument includes these sub-arguments:
- SplunkConnector DatastreamSplunkConnectorArgs
  Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with collector/raw. The argument includes these sub-arguments:
- SumologicConnector DatastreamSumologicConnectorArgs
  Specify details about the Sumo Logic connector in a stream, including:
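Because activation is controlled by the active input, one common pattern is to compute that flag from stack configuration so development stacks keep the stream inactive while production activates it. The sketch below only shows how the flag might be derived; activateStream is an assumed config key, and the remaining constructor arguments are the inputs documented above.
import * as pulumi from "@pulumi/pulumi";
import * as akamai from "@pulumi/akamai";

const cfg = new pulumi.Config();

// Assumed config key; defaults to an inactive stream when unset.
const activateStream = cfg.getBoolean("activateStream") ?? false;

// Helper that merges the computed flag into otherwise ordinary arguments.
function withActivation(args: Omit<akamai.DatastreamArgs, "active">): akamai.DatastreamArgs {
    return { ...args, active: activateStream };
}

// Usage (arguments elided; see the full example in the Import section):
// const stream = new akamai.Datastream("example", withActivation({ /* ...inputs... */ }));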
Outputs
All input properties are implicitly available as output properties. Additionally, the Datastream resource produces the following output properties:
(Names and types below follow the .NET SDK, as in the Inputs section above.)
- CreatedBy string
  The username who created the stream.
- CreatedDate string
  The date and time when the stream was created.
- GroupName string
  The name of the user group for which the stream was created.
- Id string
  The provider-assigned unique ID for this managed resource.
- ModifiedBy string
  The username who modified the stream.
- ModifiedDate string
  The date and time when the stream was modified.
- PapiJson string
  The configuration in JSON format that can be copy-pasted into the PAPI configuration to enable the datastream behavior. (A sketch of consuming this output follows this list.)
- ProductId string
  The ID of the product for which the stream was created.
- ProductName string
  The name of the product for which the stream was created.
- StreamVersionId int
  Identifies the configuration version of the stream.
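The papiJson output is a JSON string, so downstream code usually parses it before use. A small TypeScript sketch, using Datastream.get (described in the next section) with a placeholder stream version ID of 1234 to obtain a handle to an existing stream:
import * as pulumi from "@pulumi/pulumi";
import * as akamai from "@pulumi/akamai";

// Placeholder ID; use your stream's version ID.
const stream = akamai.Datastream.get("logs-stream", "1234");

// Parse the PAPI snippet so individual fields can be inspected or reused,
// for example when assembling a property rule tree elsewhere in the program.
export const datastreamBehavior = stream.papiJson.apply(json => JSON.parse(json));
export const streamVersionId = stream.streamVersionId;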
Look up Existing Datastream Resource
Get an existing Datastream resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.
public static get(name: string, id: Input<ID>, state?: DatastreamState, opts?: CustomResourceOptions): Datastream
@staticmethod
def get(resource_name: str,
id: str,
opts: Optional[ResourceOptions] = None,
active: Optional[bool] = None,
azure_connector: Optional[DatastreamAzureConnectorArgs] = None,
config: Optional[DatastreamConfigArgs] = None,
contract_id: Optional[str] = None,
created_by: Optional[str] = None,
created_date: Optional[str] = None,
datadog_connector: Optional[DatastreamDatadogConnectorArgs] = None,
dataset_fields_ids: Optional[Sequence[int]] = None,
elasticsearch_connector: Optional[DatastreamElasticsearchConnectorArgs] = None,
email_ids: Optional[Sequence[str]] = None,
gcs_connector: Optional[DatastreamGcsConnectorArgs] = None,
group_id: Optional[str] = None,
group_name: Optional[str] = None,
https_connector: Optional[DatastreamHttpsConnectorArgs] = None,
loggly_connector: Optional[DatastreamLogglyConnectorArgs] = None,
modified_by: Optional[str] = None,
modified_date: Optional[str] = None,
new_relic_connector: Optional[DatastreamNewRelicConnectorArgs] = None,
oracle_connector: Optional[DatastreamOracleConnectorArgs] = None,
papi_json: Optional[str] = None,
product_id: Optional[str] = None,
product_name: Optional[str] = None,
property_ids: Optional[Sequence[str]] = None,
s3_connector: Optional[DatastreamS3ConnectorArgs] = None,
splunk_connector: Optional[DatastreamSplunkConnectorArgs] = None,
stream_name: Optional[str] = None,
stream_type: Optional[str] = None,
stream_version_id: Optional[int] = None,
sumologic_connector: Optional[DatastreamSumologicConnectorArgs] = None,
template_name: Optional[str] = None) -> Datastream
func GetDatastream(ctx *Context, name string, id IDInput, state *DatastreamState, opts ...ResourceOption) (*Datastream, error)
public static Datastream Get(string name, Input<string> id, DatastreamState? state, CustomResourceOptions? opts = null)
public static Datastream get(String name, Output<String> id, DatastreamState state, CustomResourceOptions options)
Resource lookup is not supported in YAML
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- resource_name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
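For example, a minimal TypeScript lookup of an existing stream by its ID (1234 is a placeholder), after which the resource can be referenced like any other:
import * as akamai from "@pulumi/akamai";

// Adopt a read-only handle to an existing stream. The first argument is the
// logical name used in this program, the second is the stream's ID in Akamai;
// no state overrides or options are passed here.
const existing = akamai.Datastream.get("shared-stream", "1234");

// Inputs are available as outputs on the looked-up resource, so identifiers
// such as the contract and group can be reused by other resources.
export const contractId = existing.contractId;
export const groupId = existing.groupId;
export const streamIsActive = existing.active;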
The following state arguments are supported. Names and types follow the .NET SDK; each argument has the same meaning as the corresponding input or output property described above.
- Active bool
- AzureConnector DatastreamAzureConnectorArgs
- Config DatastreamConfigArgs
- ContractId string
- CreatedBy string
- CreatedDate string
- DatadogConnector DatastreamDatadogConnectorArgs
- DatasetFieldsIds List<int>
- ElasticsearchConnector DatastreamElasticsearchConnectorArgs
- EmailIds List<string>
- GcsConnector DatastreamGcsConnectorArgs
- GroupId string
- GroupName string
- HttpsConnector DatastreamHttpsConnectorArgs
- LogglyConnector DatastreamLogglyConnectorArgs
- ModifiedBy string
- ModifiedDate string
- NewRelicConnector DatastreamNewRelicConnectorArgs
- OracleConnector DatastreamOracleConnectorArgs
- PapiJson string
- ProductId string
- ProductName string
- PropertyIds List<string>
- S3Connector DatastreamS3ConnectorArgs
- SplunkConnector DatastreamSplunkConnectorArgs
- StreamName string
- StreamType string
- StreamVersionId int
- SumologicConnector DatastreamSumologicConnectorArgs
- TemplateName string
- Active bool
Whether you want to start activating the stream when applying the resource. Either
true
for activating the stream upon sending the request orfalse
for leaving the stream inactive after the request.- Azure
Connector DatastreamAzure Connector Args Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:
- Config
Datastream
Config Args Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:
- Contract
Id string Identifies the contract that has access to the product.
- Created
By string The username who created the stream
- Created
Date string The date and time when the stream was created
- Datadog
Connector DatastreamDatadog Connector Args Specify details about the Datadog connector in a stream, including:
- Dataset
Fields []intIds Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers define how the value for these fields appears in the log lines. See Data set parameters.
- Elasticsearch
Connector DatastreamElasticsearch Connector Args Specify details about the Elasticsearch connector you can use in a stream, including:
- Email
Ids []string A list of email addresses you want to notify about activations and deactivations of the stream.
- Gcs
Connector DatastreamGcs Connector Args Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an
Akamai_access_verification_<timestamp>.txt
object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:- Group
Id string Identifies the group that has access to the product and this stream configuration.
- Group
Name string The name of the user group for which the stream was created
- Https
Connector DatastreamHttps Connector Args Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:
- Loggly
Connector DatastreamLoggly Connector Args Specify details about the Loggly connector you can use in a stream, including:
- Modified
By string The username who modified the stream
- Modified
Date string The date and time when the stream was modified
- New
Relic DatastreamConnector New Relic Connector Args Specify details about the New Relic connector you can use in a stream, including:
- Oracle
Connector DatastreamOracle Connector Args Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided
access_key
andsecret_access_key
values and tries to save anAkamai_access_verification_<timestamp>.txt
file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you’re trying to send logs to.- Papi
Json string The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- Product
Id string The ID of the product for which the stream was created
- Product
Name string The name of the product for which the stream was created
- Property
Ids []string Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.
- S3Connector
Datastream
S3Connector Args Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided
access_key
andsecret_access_key
values and saves anakamai_write_test_2147483647.txt
file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to. The argument includes these sub-arguments:- Splunk
Connector DatastreamSplunk Connector Args Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with
collector/raw
. The argument includes these sub-arguments:- Stream
Name string The name of the stream.
- Stream
Type string The type of stream that you want to create. Currently,
RAW_LOGS
is the only possible stream type.- Stream
Version intId Identifies the configuration version of the stream
- Sumologic
Connector DatastreamSumologic Connector Args Specify details about the Sumo Logic connector in a stream, including:
- Template
Name string The name of the data set template available for the product that you want to use in the stream. Currently,
EDGE_LOGS
is the only data set template available.
- active Boolean
Whether you want to start activating the stream when applying the resource. Either
true
for activating the stream upon sending the request orfalse
for leaving the stream inactive after the request.- azure
Connector DatastreamAzure Connector Args Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:
- config
Datastream
Config Args Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:
- contract
Id String Identifies the contract that has access to the product.
- created
By String The username who created the stream
- created
Date String The date and time when the stream was created
- datadog
Connector DatastreamDatadog Connector Args Specify details about the Datadog connector in a stream, including:
- dataset
Fields List<Integer>Ids Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers define how the value for these fields appears in the log lines. See Data set parameters.
- elasticsearch
Connector DatastreamElasticsearch Connector Args Specify details about the Elasticsearch connector you can use in a stream, including:
- email
Ids List<String> A list of email addresses you want to notify about activations and deactivations of the stream.
- gcs
Connector DatastreamGcs Connector Args Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an
Akamai_access_verification_<timestamp>.txt
object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:- group
Id String Identifies the group that has access to the product and this stream configuration.
- group
Name String The name of the user group for which the stream was created
- https
Connector DatastreamHttps Connector Args Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:
- loggly
Connector DatastreamLoggly Connector Args Specify details about the Loggly connector you can use in a stream, including:
- modified
By String The username who modified the stream
- modified
Date String The date and time when the stream was modified
- new
Relic DatastreamConnector New Relic Connector Args Specify details about the New Relic connector you can use in a stream, including:
- oracle
Connector DatastreamOracle Connector Args Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided
access_key
andsecret_access_key
values and tries to save anAkamai_access_verification_<timestamp>.txt
file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you’re trying to send logs to.- papi
Json String The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- product
Id String The ID of the product for which the stream was created
- product
Name String The name of the product for which the stream was created
- property
Ids List<String> Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.
- s3Connector
Datastream
S3Connector Args Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided
access_key
andsecret_access_key
values and saves anakamai_write_test_2147483647.txt
file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to. The argument includes these sub-arguments:- splunk
Connector DatastreamSplunk Connector Args Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with
collector/raw
. The argument includes these sub-arguments:- stream
Name String The name of the stream.
- stream
Type String The type of stream that you want to create. Currently,
RAW_LOGS
is the only possible stream type.- stream
Version IntegerId Identifies the configuration version of the stream
- sumologic
Connector DatastreamSumologic Connector Args Specify details about the Sumo Logic connector in a stream, including:
- template
Name String The name of the data set template available for the product that you want to use in the stream. Currently,
EDGE_LOGS
is the only data set template available.
- active boolean
Whether you want to start activating the stream when applying the resource. Either
true
for activating the stream upon sending the request orfalse
for leaving the stream inactive after the request.- azure
Connector DatastreamAzure Connector Args Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:
- config
Datastream
Config Args Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:
- contract
Id string Identifies the contract that has access to the product.
- created
By string The username who created the stream
- created
Date string The date and time when the stream was created
- datadog
Connector DatastreamDatadog Connector Args Specify details about the Datadog connector in a stream, including:
- dataset
Fields number[]Ids Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers define how the value for these fields appears in the log lines. See Data set parameters.
- elasticsearch
Connector DatastreamElasticsearch Connector Args Specify details about the Elasticsearch connector you can use in a stream, including:
- email
Ids string[] A list of email addresses you want to notify about activations and deactivations of the stream.
- gcs
Connector DatastreamGcs Connector Args Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an
Akamai_access_verification_<timestamp>.txt
object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:- group
Id string Identifies the group that has access to the product and this stream configuration.
- group
Name string The name of the user group for which the stream was created
- https
Connector DatastreamHttps Connector Args Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:
- loggly
Connector DatastreamLoggly Connector Args Specify details about the Loggly connector you can use in a stream, including:
- modified
By string The username who modified the stream
- modified
Date string The date and time when the stream was modified
- new
Relic DatastreamConnector New Relic Connector Args Specify details about the New Relic connector you can use in a stream, including:
- oracle
Connector DatastreamOracle Connector Args Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided
access_key
andsecret_access_key
values and tries to save anAkamai_access_verification_<timestamp>.txt
file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you’re trying to send logs to.- papi
Json string The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- product
Id string The ID of the product for which the stream was created
- product
Name string The name of the product for which the stream was created
- property
Ids string[] Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.
- s3Connector
Datastream
S3Connector Args Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided
access_key
andsecret_access_key
values and saves anakamai_write_test_2147483647.txt
file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you’re trying to send logs to. The argument includes these sub-arguments:- splunk
Connector DatastreamSplunk Connector Args Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with
collector/raw
. The argument includes these sub-arguments:- stream
Name string The name of the stream.
- stream
Type string The type of stream that you want to create. Currently,
RAW_LOGS
is the only possible stream type.- stream
Version numberId Identifies the configuration version of the stream
- sumologic
Connector DatastreamSumologic Connector Args Specify details about the Sumo Logic connector in a stream, including:
- template
Name string The name of the data set template available for the product that you want to use in the stream. Currently,
EDGE_LOGS
is the only data set template available.
- active bool
Whether you want to start activating the stream when applying the resource. Either true for activating the stream upon sending the request or false for leaving the stream inactive after the request.
- azure_connector DatastreamAzureConnectorArgs
Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:
- config DatastreamConfigArgs
Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:
- contract_id str
Identifies the contract that has access to the product.
- created_by str
The username who created the stream
- created_date str
The date and time when the stream was created
- datadog_connector DatastreamDatadogConnectorArgs
Specify details about the Datadog connector in a stream, including:
- dataset_fields_ids Sequence[int]
Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers define how the value for these fields appears in the log lines. See Data set parameters.
- elasticsearch_connector DatastreamElasticsearchConnectorArgs
Specify details about the Elasticsearch connector you can use in a stream, including:
- email_ids Sequence[str]
A list of email addresses you want to notify about activations and deactivations of the stream.
- gcs_connector DatastreamGcsConnectorArgs
Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an Akamai_access_verification_<timestamp>.txt object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:
- group_id str
Identifies the group that has access to the product and this stream configuration.
- group_name str
The name of the user group for which the stream was created
- https_connector DatastreamHttpsConnectorArgs
Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:
- loggly_connector DatastreamLogglyConnectorArgs
Specify details about the Loggly connector you can use in a stream, including:
- modified_by str
The username who modified the stream
- modified_date str
The date and time when the stream was modified
- new_relic_connector DatastreamNewRelicConnectorArgs
Specify details about the New Relic connector you can use in a stream, including:
- oracle_connector DatastreamOracleConnectorArgs
Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and tries to save an Akamai_access_verification_<timestamp>.txt file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you're trying to send logs to.
- papi_json str
The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- product_id str
The ID of the product for which the stream was created
- product_name str
The name of the product for which the stream was created
- property_ids Sequence[str]
Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.
- s3_connector DatastreamS3ConnectorArgs
Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and saves an akamai_write_test_2147483647.txt file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you're trying to send logs to. The argument includes these sub-arguments:
- splunk_connector DatastreamSplunkConnectorArgs
Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with collector/raw. The argument includes these sub-arguments:
- stream_name str
The name of the stream.
- stream_type str
The type of stream that you want to create. Currently, RAW_LOGS is the only possible stream type.
- stream_version_id int
Identifies the configuration version of the stream
- sumologic_connector DatastreamSumologicConnectorArgs
Specify details about the Sumo Logic connector in a stream, including:
- template_name str
The name of the data set template available for the product that you want to use in the stream. Currently, EDGE_LOGS is the only data set template available.
- active Boolean
Whether you want to start activating the stream when applying the resource. Either true for activating the stream upon sending the request or false for leaving the stream inactive after the request.
- azureConnector Property Map
Specify details about the Azure Storage connector configuration in a data stream. Note that currently DataStream supports only streaming data to block objects. The argument includes these sub-arguments:
- config Property Map
Provides information about the log line configuration, log file format, names of log files sent, and file delivery. The argument includes these sub-arguments:
- contractId String
Identifies the contract that has access to the product.
- createdBy String
The username who created the stream
- createdDate String
The date and time when the stream was created
- datadogConnector Property Map
Specify details about the Datadog connector in a stream, including:
- datasetFieldsIds List<Number>
Identifiers of the data set fields within the template that you want to receive in logs. The order of the identifiers define how the value for these fields appears in the log lines. See Data set parameters.
- elasticsearchConnector Property Map
Specify details about the Elasticsearch connector you can use in a stream, including:
- emailIds List<String>
A list of email addresses you want to notify about activations and deactivations of the stream.
- gcsConnector Property Map
Specify details about the Google Cloud Storage connector you can use in a stream. When validating this connector, DataStream uses the private access key to create an Akamai_access_verification_<timestamp>.txt object file in your GCS bucket. You can only see this file if the validation process is successful, and you have access to the Google Cloud Storage bucket where you are trying to send logs. The argument includes these sub-arguments:
- groupId String
Identifies the group that has access to the product and this stream configuration.
- groupName String
The name of the user group for which the stream was created
- httpsConnector Property Map
Specify details about the custom HTTPS endpoint you can use as a connector for a stream, including:
- logglyConnector Property Map
Specify details about the Loggly connector you can use in a stream, including:
- modifiedBy String
The username who modified the stream
- modifiedDate String
The date and time when the stream was modified
- newRelicConnector Property Map
Specify details about the New Relic connector you can use in a stream, including:
- oracleConnector Property Map
Specify details about the Oracle Cloud Storage connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and tries to save an Akamai_access_verification_<timestamp>.txt file in your Oracle Cloud Storage folder. You can only see this file if the validation process is successful, and you have access to the Oracle Cloud Storage bucket and folder that you're trying to send logs to.
- papiJson String
The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- productId String
The ID of the product for which the stream was created
- productName String
The name of the product for which the stream was created
- propertyIds List<String>
Identifies the properties that you want to monitor in the stream. Note that a stream can only log data for active properties.
- s3Connector Property Map
Specify details about the Amazon S3 connector in a stream. When validating this connector, DataStream uses the provided access_key and secret_access_key values and saves an akamai_write_test_2147483647.txt file in your Amazon S3 folder. You can only see this file if validation succeeds, and you have access to the Amazon S3 bucket and folder that you're trying to send logs to. The argument includes these sub-arguments:
- splunkConnector Property Map
Specify details about the Splunk connector in your stream. Note that currently DataStream supports only endpoint URLs ending with collector/raw. The argument includes these sub-arguments:
- streamName String
The name of the stream.
- streamType String
The type of stream that you want to create. Currently, RAW_LOGS is the only possible stream type.
- streamVersionId Number
Identifies the configuration version of the stream
- sumologicConnector Property Map
Specify details about the Sumo Logic connector in a stream, including:
- templateName String
The name of the data set template available for the product that you want to use in the stream. Currently, EDGE_LOGS is the only data set template available.
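Before the supporting types, a minimal TypeScript sketch may help tie the arguments above together. It creates an inactive stream that sends STRUCTURED edge logs to Amazon S3. The contract, group, property, and data set field IDs, bucket name, and credential keys are placeholders, and the s3Connector sub-arguments (including region) are assumed to follow the DatastreamS3Connector supporting type rather than being copied from this page.

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as akamai from "@pulumi/akamai";

const cfg = new pulumi.Config();

// Placeholder contract, group, property, and data set field IDs; replace with your own.
const stream = new akamai.Datastream("example", {
    active: false,                        // leave the stream inactive after creation
    contractId: "C-0N7RAC7",
    groupId: "grp_12345",
    propertyIds: ["prp_12345"],
    streamName: "example-edge-logs",
    streamType: "RAW_LOGS",               // currently the only possible stream type
    templateName: "EDGE_LOGS",            // currently the only data set template
    datasetFieldsIds: [1002, 1005, 1006], // hypothetical data set field identifiers
    emailIds: ["ops@example.com"],
    config: {
        format: "STRUCTURED",
        delimiter: "SPACE",               // required for the STRUCTURED format
        frequency: { timeInSec: 30 },
        uploadFilePrefix: "ak",
        uploadFileSuffix: "ds",
    },
    s3Connector: {
        connectorName: "s3-edge-logs",
        bucket: "example-datastream-logs",
        path: "logs/edge",
        region: "us-east-1",              // assumed sub-argument of the S3 connector
        accessKey: cfg.requireSecret("awsAccessKey"),
        secretAccessKey: cfg.requireSecret("awsSecretAccessKey"),
    },
});

// The configuration version assigned by DataStream, exported for reference.
export const streamVersionId = stream.streamVersionId;
```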
Supporting Types
DatastreamAzureConnector
- AccessKey string
Secret. The access key associated with your Azure Storage account that you use to authenticate requests to Azure Storage.
- Account
Name string Specifies the Azure Storage account name.
- Connector
Name string The name of the connector.
- Container
Name string Specifies the Azure Storage container name.
- Path string
The path to the folder within your Azure Storage container where you want to store your logs.
- Compress
Logs bool Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- Connector
Id int
- AccessKey string
Secret. The access key associated with your Azure Storage account that you use to authenticate requests to Azure Storage.
- Account
Name string Specifies the Azure Storage account name.
- Connector
Name string The name of the connector.
- Container
Name string Specifies the Azure Storage container name.
- Path string
The path to the folder within your Azure Storage container where you want to store your logs.
- Compress
Logs bool Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- Connector
Id int
- accessKey String
Secret. The access key associated with your Azure Storage account that you use to authenticate requests to Azure Storage.
- account
Name String Specifies the Azure Storage account name.
- connector
Name String The name of the connector.
- container
Name String Specifies the Azure Storage container name.
- path String
The path to the folder within your Azure Storage container where you want to store your logs.
- compress
Logs Boolean Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector
Id Integer
- accessKey string
Secret. The access key associated with your Azure Storage account that you use to authenticate requests to Azure Storage.
- account
Name string Specifies the Azure Storage account name.
- connector
Name string The name of the connector.
- container
Name string Specifies the Azure Storage container name.
- path string
The path to the folder within your Azure Storage container where you want to store your logs.
- compress
Logs boolean Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector
Id number
- access_key str
Secret. The access key associated with your Azure Storage account that you use to authenticate requests to Azure Storage.
- account_
name str Specifies the Azure Storage account name.
- connector_
name str The name of the connector.
- container_
name str Specifies the Azure Storage container name.
- path str
The path to the folder within your Azure Storage container where you want to store your logs.
- compress_
logs bool Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector_
id int
- accessKey String
Secret. The access key associated with your Azure Storage account that you use to authenticate requests to Azure Storage.
- account
Name String Specifies the Azure Storage account name.
- connector
Name String The name of the connector.
- container
Name String Specifies the Azure Storage container name.
- path String
The path to the folder within your Azure Storage container where you want to store your logs.
- compress
Logs Boolean Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector
Id Number
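A minimal TypeScript sketch of this connector block, assuming the akamai.types.input module path generated by the Pulumi SDK; the account, container, path, and config-secret names are placeholders.

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as akamai from "@pulumi/akamai";

const cfg = new pulumi.Config();

// Account, container, and path are placeholders; the access key is the
// Azure Storage account key, read from Pulumi config as a secret.
const azureConnector: akamai.types.input.DatastreamAzureConnector = {
    connectorName: "azure-blob-logs",
    accountName: "examplestorageaccount",
    containerName: "datastream-logs",
    path: "logs/edge",
    accessKey: cfg.requireSecret("azureStorageAccessKey"),
    compressLogs: true, // defaults to true when omitted
};
// Pass this object as the azureConnector argument of akamai.Datastream.
```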
DatastreamConfig
- Format string
The format in which you want to receive log files, either STRUCTURED or JSON. When delimiter is present in the request, STRUCTURED is the mandatory format.
- Frequency DatastreamConfigFrequency
How often you want to collect logs from each uploader and send them to a destination.
- Delimiter string
A delimiter that you want to use to separate data set fields in the log lines. Currently, SPACE is the only available delimiter. This field is required for the STRUCTURED log file format.
- UploadFilePrefix string
The prefix of the log file that you want to send to a destination. It's a string of at most 200 characters. If unspecified, defaults to ak.
- UploadFileSuffix string
The suffix of the log file that you want to send to a destination. It's a static string of at most 10 characters. If unspecified, defaults to ds.
- Format string
The format in which you want to receive log files, either STRUCTURED or JSON. When delimiter is present in the request, STRUCTURED is the mandatory format.
- Frequency DatastreamConfigFrequency
How often you want to collect logs from each uploader and send them to a destination.
- Delimiter string
A delimiter that you want to use to separate data set fields in the log lines. Currently, SPACE is the only available delimiter. This field is required for the STRUCTURED log file format.
- UploadFilePrefix string
The prefix of the log file that you want to send to a destination. It's a string of at most 200 characters. If unspecified, defaults to ak.
- UploadFileSuffix string
The suffix of the log file that you want to send to a destination. It's a static string of at most 10 characters. If unspecified, defaults to ds.
- format String
The format in which you want to receive log files, either STRUCTURED or JSON. When delimiter is present in the request, STRUCTURED is the mandatory format.
- frequency DatastreamConfigFrequency
How often you want to collect logs from each uploader and send them to a destination.
- delimiter String
A delimiter that you want to use to separate data set fields in the log lines. Currently, SPACE is the only available delimiter. This field is required for the STRUCTURED log file format.
- uploadFilePrefix String
The prefix of the log file that you want to send to a destination. It's a string of at most 200 characters. If unspecified, defaults to ak.
- uploadFileSuffix String
The suffix of the log file that you want to send to a destination. It's a static string of at most 10 characters. If unspecified, defaults to ds.
- format string
The format in which you want to receive log files, either STRUCTURED or JSON. When delimiter is present in the request, STRUCTURED is the mandatory format.
- frequency DatastreamConfigFrequency
How often you want to collect logs from each uploader and send them to a destination.
- delimiter string
A delimiter that you want to use to separate data set fields in the log lines. Currently, SPACE is the only available delimiter. This field is required for the STRUCTURED log file format.
- uploadFilePrefix string
The prefix of the log file that you want to send to a destination. It's a string of at most 200 characters. If unspecified, defaults to ak.
- uploadFileSuffix string
The suffix of the log file that you want to send to a destination. It's a static string of at most 10 characters. If unspecified, defaults to ds.
- format str
The format in which you want to receive log files, either STRUCTURED or JSON. When delimiter is present in the request, STRUCTURED is the mandatory format.
- frequency DatastreamConfigFrequency
How often you want to collect logs from each uploader and send them to a destination.
- delimiter str
A delimiter that you want to use to separate data set fields in the log lines. Currently, SPACE is the only available delimiter. This field is required for the STRUCTURED log file format.
- upload_file_prefix str
The prefix of the log file that you want to send to a destination. It's a string of at most 200 characters. If unspecified, defaults to ak.
- upload_file_suffix str
The suffix of the log file that you want to send to a destination. It's a static string of at most 10 characters. If unspecified, defaults to ds.
- format String
The format in which you want to receive log files, either STRUCTURED or JSON. When delimiter is present in the request, STRUCTURED is the mandatory format.
- frequency Property Map
How often you want to collect logs from each uploader and send them to a destination.
- delimiter String
A delimiter that you want to use to separate data set fields in the log lines. Currently, SPACE is the only available delimiter. This field is required for the STRUCTURED log file format.
- uploadFilePrefix String
The prefix of the log file that you want to send to a destination. It's a string of at most 200 characters. If unspecified, defaults to ak.
- uploadFileSuffix String
The suffix of the log file that you want to send to a destination. It's a static string of at most 10 characters. If unspecified, defaults to ds.
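A small TypeScript sketch of this configuration, assuming the akamai.types.input module path generated by the Pulumi SDK; it bundles space-delimited STRUCTURED log lines every 30 seconds and spells out the default file prefix and suffix for clarity.

```typescript
import * as akamai from "@pulumi/akamai";

// Space-delimited STRUCTURED log lines, bundled and sent every 30 seconds.
const config: akamai.types.input.DatastreamConfig = {
    format: "STRUCTURED",         // JSON is the alternative; delimiter requires STRUCTURED
    delimiter: "SPACE",           // currently the only available delimiter
    frequency: { timeInSec: 30 }, // 30 or 60 seconds
    uploadFilePrefix: "ak",       // optional, defaults to "ak"
    uploadFileSuffix: "ds",       // optional, defaults to "ds"
};
// Pass this object as the config argument of akamai.Datastream.
```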
DatastreamConfigFrequency
- TimeInSec int
The time in seconds after which the system bundles log lines into a file and sends it to a destination. 30 or 60 are the possible values.
- TimeInSec int
The time in seconds after which the system bundles log lines into a file and sends it to a destination. 30 or 60 are the possible values.
- timeInSec Integer
The time in seconds after which the system bundles log lines into a file and sends it to a destination. 30 or 60 are the possible values.
- timeInSec number
The time in seconds after which the system bundles log lines into a file and sends it to a destination. 30 or 60 are the possible values.
- time_in_sec int
The time in seconds after which the system bundles log lines into a file and sends it to a destination. 30 or 60 are the possible values.
- timeInSec Number
The time in seconds after which the system bundles log lines into a file and sends it to a destination. 30 or 60 are the possible values.
DatastreamDatadogConnector
- AuthToken string
Secret. The API key associated with your Datadog account.
- Connector
Name string The name of the connector.
- Url string
Enter the secure URL where you want to send and store your logs.
- Compress
Logs bool Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- Connector
Id int - Service string
The service of the Datadog connector. A service groups together endpoints, queries, or jobs for the purposes of scaling instances. See View Datadog reserved attribute list.
- Source string
The source of the Datadog connector. See View Datadog reserved attribute list.
- string
The tags you can use to segment and filter log events in Datadog.
- AuthToken string
Secret. The API key associated with your Datadog account.
- Connector
Name string The name of the connector.
- Url string
Enter the secure URL where you want to send and store your logs.
- Compress
Logs bool Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- Connector
Id int - Service string
The service of the Datadog connector. A service groups together endpoints, queries, or jobs for the purposes of scaling instances. See View Datadog reserved attribute list.
- Source string
The source of the Datadog connector. See View Datadog reserved attribute list.
- string
The tags you can use to segment and filter log events in Datadog.
- authToken String
Secret. The API key associated with your Datadog account.
- connector
Name String The name of the connector.
- url String
Enter the secure URL where you want to send and store your logs.
- compress
Logs Boolean Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector
Id Integer - service String
The service of the Datadog connector. A service groups together endpoints, queries, or jobs for the purposes of scaling instances. See View Datadog reserved attribute list.
- source String
The source of the Datadog connector. See View Datadog reserved attribute list.
- String
The tags you can use to segment and filter log events in Datadog.
- authToken string
Secret. The API key associated with your Datadog account.
- connector
Name string The name of the connector.
- url string
Enter the secure URL where you want to send and store your logs.
- compress
Logs boolean Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector
Id number - service string
The service of the Datadog connector. A service groups together endpoints, queries, or jobs for the purposes of scaling instances. See View Datadog reserved attribute list.
- source string
The source of the Datadog connector. See View Datadog reserved attribute list.
- string
The tags you can use to segment and filter log events in Datadog.
- auth_token str
Secret. The API key associated with your Datadog account.
- connector_
name str The name of the connector.
- url str
Enter the secure URL where you want to send and store your logs.
- compress_
logs bool Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector_
id int - service str
The service of the Datadog connector. A service groups together endpoints, queries, or jobs for the purposes of scaling instances. See View Datadog reserved attribute list.
- source str
The source of the Datadog connector. See View Datadog reserved attribute list.
- str
The tags you can use to segment and filter log events in Datadog.
- authToken String
Secret. The API key associated with your Datadog account.
- connector
Name String The name of the connector.
- url String
Enter the secure URL where you want to send and store your logs.
- compress
Logs Boolean Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector
Id Number - service String
The service of the Datadog connector. A service groups together endpoints, queries, or jobs for the purposes of scaling instances. See View Datadog reserved attribute list.
- source String
The source of the Datadog connector. See View Datadog reserved attribute list.
- String
The tags you can use to segment and filter log events in Datadog.
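A minimal TypeScript sketch of a Datadog connector block, assuming the akamai.types.input module path generated by the Pulumi SDK; the intake URL, service, and source values are placeholders, and the API key is read from Pulumi config as a secret.

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as akamai from "@pulumi/akamai";

const cfg = new pulumi.Config();

// The intake URL, service, and source values are placeholders for illustration.
const datadogConnector: akamai.types.input.DatastreamDatadogConnector = {
    connectorName: "datadog-logs",
    url: "https://http-intake.logs.datadoghq.com/v1/input", // example intake URL
    authToken: cfg.requireSecret("datadogApiKey"),           // Secret
    service: "datastream",
    source: "akamai",
    compressLogs: true,
};
// Pass this object as the datadogConnector argument of akamai.Datastream.
```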
DatastreamElasticsearchConnector
- Connector
Name string The name of the connector.
- Endpoint string
The Elasticsearch bulk endpoint URL in the format:
https://<hostname>.elastic-cloud.com:9243/_bulk/
. Setindex_name
in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.- Index
Name string Secret. The index name of the Elastic cloud where you want to store log files.
- Password string
Secret. The Elasticsearch basic access authentication password.
- User
Name string Secret. The Elasticsearch basic access authentication username.
- Ca
Cert string Secret. The certification authority (CA) certificate used to verify the origin server's certificate. It's needed if the certificate stored in
client_cert
is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.- Client
Cert string Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- Client
Key string Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- Content
Type string Content type to pass in the log file header.
- Custom
Header stringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- Custom
Header stringValue The custom header's contents passed with the request that contains information about the client connection.
- MTls bool
- Tls
Hostname string The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- Connector
Name string The name of the connector.
- Endpoint string
The Elasticsearch bulk endpoint URL in the format:
https://<hostname>.elastic-cloud.com:9243/_bulk/
. Setindex_name
in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.- Index
Name string Secret. The index name of the Elastic cloud where you want to store log files.
- Password string
Secret. The Elasticsearch basic access authentication password.
- User
Name string Secret. The Elasticsearch basic access authentication username.
- Ca
Cert string Secret. The certification authority (CA) certificate used to verify the origin server's certificate. It's needed if the certificate stored in
client_cert
is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.- Client
Cert string Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- Client
Key string Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- Content
Type string Content type to pass in the log file header.
- Custom
Header stringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- Custom
Header stringValue The custom header's contents passed with the request that contains information about the client connection.
- MTls bool
- Tls
Hostname string The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- connector
Name String The name of the connector.
- endpoint String
The Elasticsearch bulk endpoint URL in the format:
https://<hostname>.elastic-cloud.com:9243/_bulk/
. Setindex_name
in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.- index
Name String Secret. The index name of the Elastic cloud where you want to store log files.
- password String
Secret. The Elasticsearch basic access authentication password.
- user
Name String Secret. The Elasticsearch basic access authentication username.
- ca
Cert String Secret. The certification authority (CA) certificate used to verify the origin server's certificate. It's needed if the certificate stored in
client_cert
is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.- client
Cert String Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- client
Key String Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- content
Type String Content type to pass in the log file header.
- custom
Header StringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom
Header StringValue The custom header's contents passed with the request that contains information about the client connection.
- m
Tls Boolean - tls
Hostname String The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- connector
Name string The name of the connector.
- endpoint string
The Elasticsearch bulk endpoint URL in the format:
https://<hostname>.elastic-cloud.com:9243/_bulk/
. Setindex_name
in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.- index
Name string Secret. The index name of the Elastic cloud where you want to store log files.
- password string
Secret. The Elasticsearch basic access authentication password.
- user
Name string Secret. The Elasticsearch basic access authentication username.
- ca
Cert string Secret. The certification authority (CA) certificate used to verify the origin server's certificate. It's needed if the certificate stored in
client_cert
is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.- client
Cert string Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- client
Key string Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- content
Type string Content type to pass in the log file header.
- custom
Header stringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom
Header stringValue The custom header's contents passed with the request that contains information about the client connection.
- m
Tls boolean - tls
Hostname string The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- connector_
name str The name of the connector.
- endpoint str
The Elasticsearch bulk endpoint URL in the format:
https://<hostname>.elastic-cloud.com:9243/_bulk/
. Setindex_name
in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.- index_
name str Secret. The index name of the Elastic cloud where you want to store log files.
- password str
Secret. The Elasticsearch basic access authentication password.
- user_
name str Secret. The Elasticsearch basic access authentication username.
- ca_
cert str Secret. The certification authority (CA) certificate used to verify the origin server's certificate. It's needed if the certificate stored in
client_cert
is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.- client_
cert str Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- client_
key str Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- content_
type str Content type to pass in the log file header.
- custom_
header_ strname A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom_
header_ strvalue The custom header's contents passed with the request that contains information about the client connection.
- m_
tls bool - tls_
hostname str The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- connector
Name String The name of the connector.
- endpoint String
The Elasticsearch bulk endpoint URL in the format:
https://<hostname>.elastic-cloud.com:9243/_bulk/
. Setindex_name
in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.- index
Name String Secret. The index name of the Elastic cloud where you want to store log files.
- password String
Secret. The Elasticsearch basic access authentication password.
- user
Name String Secret. The Elasticsearch basic access authentication username.
- ca
Cert String Secret. The certification authority (CA) certificate used to verify the origin server's certificate. It's needed if the certificate stored in
client_cert
is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.- client
Cert String Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- client
Key String Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- content
Type String Content type to pass in the log file header.
- custom
Header StringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom
Header StringValue The custom header's contents passed with the request that contains information about the client connection.
- m
Tls Boolean - tls
Hostname String The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
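A minimal TypeScript sketch of an Elasticsearch connector block with basic authentication, assuming the akamai.types.input module path generated by the Pulumi SDK; the endpoint hostname, index name, and credentials are placeholders.

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as akamai from "@pulumi/akamai";

const cfg = new pulumi.Config();

// Hostname, index name, and credentials are placeholders.
const elasticsearchConnector: akamai.types.input.DatastreamElasticsearchConnector = {
    connectorName: "elastic-logs",
    endpoint: "https://example.elastic-cloud.com:9243/_bulk/",
    indexName: "akamai-edge-logs",
    userName: cfg.requireSecret("elasticUser"),
    password: cfg.requireSecret("elasticPassword"),
    contentType: "application/json",
    tlsHostname: "example.elastic-cloud.com", // optional; otherwise taken from the endpoint URL
};
// Pass this object as the elasticsearchConnector argument of akamai.Datastream.
```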
DatastreamGcsConnector
- Bucket string
The name of the Google Cloud Storage bucket where you want to store your logs.
- Connector
Name string The name of the connector.
- Private
Key string Secret. The contents of the JSON private key you generated and downloaded in your Google Cloud Storage account.
- Project
Id string The unique ID of your Google Cloud project.
- Service
Account stringName The name of the service account with the storage.object.create permission or Storage Object Creator role.
- Compress
Logs bool Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- Connector
Id int - Path string
The path to the folder within your Google Cloud Storage bucket where you want to store your logs.
- Bucket string
The name of the Google Cloud Storage bucket where you want to store your logs.
- Connector
Name string The name of the connector.
- Private
Key string Secret. The contents of the JSON private key you generated and downloaded in your Google Cloud Storage account.
- Project
Id string The unique ID of your Google Cloud project.
- Service
Account stringName The name of the service account with the storage.object.create permission or Storage Object Creator role.
- Compress
Logs bool Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- Connector
Id int - Path string
The path to the folder within your Google Cloud Storage bucket where you want to store your logs.
- bucket String
The name of the Google Cloud Storage bucket where you want to store your logs.
- connector
Name String The name of the connector.
- private
Key String Secret. The contents of the JSON private key you generated and downloaded in your Google Cloud Storage account.
- project
Id String The unique ID of your Google Cloud project.
- service
Account StringName The name of the service account with the storage.object.create permission or Storage Object Creator role.
- compress
Logs Boolean Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector
Id Integer - path String
The path to the folder within your Google Cloud Storage bucket where you want to store your logs.
- bucket string
The name of the Google Cloud Storage bucket where you want to store your logs.
- connector
Name string The name of the connector.
- private
Key string Secret. The contents of the JSON private key you generated and downloaded in your Google Cloud Storage account.
- project
Id string The unique ID of your Google Cloud project.
- service
Account stringName The name of the service account with the storage.object.create permission or Storage Object Creator role.
- compress
Logs boolean Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector
Id number - path string
The path to the folder within your Google Cloud Storage bucket where you want to store your logs.
- bucket str
The name of the Google Cloud Storage bucket where you want to store your logs.
- connector_
name str The name of the connector.
- private_
key str Secret. The contents of the JSON private key you generated and downloaded in your Google Cloud Storage account.
- project_
id str The unique ID of your Google Cloud project.
- service_
account_ strname The name of the service account with the storage.object.create permission or Storage Object Creator role.
- compress_
logs bool Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector_
id int - path str
The path to the folder within your Google Cloud Storage bucket where you want to store your logs.
- bucket String
The name of the Google Cloud Storage bucket where you want to store your logs.
- connector
Name String The name of the connector.
- private
Key String Secret. The contents of the JSON private key you generated and downloaded in your Google Cloud Storage account.
- project
Id String The unique ID of your Google Cloud project.
- service
Account StringName The name of the service account with the storage.object.create permission or Storage Object Creator role.
- compress
Logs Boolean Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector
Id Number - path String
The path to the folder within your Google Cloud Storage bucket where you want to store your logs.
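A minimal TypeScript sketch of a Google Cloud Storage connector block, assuming the akamai.types.input module path generated by the Pulumi SDK; the bucket, project, and service account names are placeholders, and the JSON private key is supplied as a Pulumi secret.

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as akamai from "@pulumi/akamai";

const cfg = new pulumi.Config();

// Bucket, project, and service account names are placeholders; the private key
// is the downloaded JSON key for the service account, stored as a secret.
const gcsConnector: akamai.types.input.DatastreamGcsConnector = {
    connectorName: "gcs-logs",
    bucket: "example-datastream-logs",
    projectId: "example-project-123456",
    serviceAccountName: "datastream-writer",
    privateKey: cfg.requireSecret("gcsPrivateKey"),
    path: "logs/edge",
};
// Pass this object as the gcsConnector argument of akamai.Datastream.
```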
DatastreamHttpsConnector
- Authentication
Type string Either
NONE
for no authentication, orBASIC
. For basic authentication, provide theuser_name
andpassword
you set in your custom HTTPS endpoint.- Connector
Name string The name of the connector.
- Url string
Enter the secure URL where you want to send and store your logs.
- Ca
Cert string Secret. The certification authority (CA) certificate used to verify the origin server's certificate. It's needed if the certificate stored in
client_cert
is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.- Client
Cert string Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- Client
Key string Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- Compress
Logs bool Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- Connector
Id int - Content
Type string Content type to pass in the log file header.
- Custom
Header stringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- Custom
Header stringValue The custom header's contents passed with the request that contains information about the client connection.
- MTls bool
- Password string
Secret. The password you set in your custom HTTPS endpoint for basic authentication.
- Tls
Hostname string The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- UserName string
Secret. The username you set in your custom HTTPS endpoint for basic authentication.
- Authentication
Type string Either
NONE
for no authentication, orBASIC
. For basic authentication, provide theuser_name
andpassword
you set in your custom HTTPS endpoint.- Connector
Name string The name of the connector.
- Url string
Enter the secure URL where you want to send and store your logs.
- Ca
Cert string Secret. The certification authority (CA) certificate used to verify the origin server's certificate. It's needed if the certificate stored in
client_cert
is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.- Client
Cert string Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- Client
Key string Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- Compress
Logs bool Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- Connector
Id int - Content
Type string Content type to pass in the log file header.
- Custom
Header stringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- Custom
Header stringValue The custom header's contents passed with the request that contains information about the client connection.
- MTls bool
- Password string
Secret. The password you set in your custom HTTPS endpoint for basic authentication.
- Tls
Hostname string The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- UserName string
Secret. The username you set in your custom HTTPS endpoint for basic authentication.
- authentication
Type String Either
NONE
for no authentication, orBASIC
. For basic authentication, provide theuser_name
andpassword
you set in your custom HTTPS endpoint.- connector
Name String The name of the connector.
- url String
Enter the secure URL where you want to send and store your logs.
- ca
Cert String Secret. The certification authority (CA) certificate used to verify the origin server's certificate. It's needed if the certificate stored in
client_cert
is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.- client
Cert String Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- client
Key String Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compress
Logs Boolean Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector
Id Integer - content
Type String Content type to pass in the log file header.
- custom
Header StringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom
Header StringValue The custom header's contents passed with the request that contains information about the client connection.
- m
Tls Boolean - password String
Secret. The password you set in your custom HTTPS endpoint for basic authentication.
- tls
Hostname String The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- userName String
Secret. The username you set in your custom HTTPS endpoint for basic authentication.
- authentication
Type string Either
NONE
for no authentication, orBASIC
. For basic authentication, provide theuser_name
andpassword
you set in your custom HTTPS endpoint.- connector
Name string The name of the connector.
- url string
Enter the secure URL where you want to send and store your logs.
- ca
Cert string Secret. The certification authority (CA) certificate used to verify the origin server's certificate. It's needed if the certificate stored in
client_cert
is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.- client
Cert string Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- client
Key string Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compress
Logs boolean Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector
Id number - content
Type string Content type to pass in the log file header.
- custom
Header stringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom
Header stringValue The custom header's contents passed with the request that contains information about the client connection.
- m
Tls boolean - password string
Secret. The password you set in your custom HTTPS endpoint for basic authentication.
- tls
Hostname string The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- userName string
Secret. The username you set in your custom HTTPS endpoint for basic authentication.
- authentication_
type str Either
NONE
for no authentication, orBASIC
. For basic authentication, provide theuser_name
andpassword
you set in your custom HTTPS endpoint.- connector_
name str The name of the connector.
- url str
Enter the secure URL where you want to send and store your logs.
- ca_
cert str Secret. The certification authority (CA) certificate used to verify the origin server's certificate. It's needed if the certificate stored in
client_cert
is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.- client_
cert str Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- client_
key str Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compress_
logs bool Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector_
id int - content_
type str Content type to pass in the log file header.
- custom_
header_ strname A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom_
header_ strvalue The custom header's contents passed with the request that contains information about the client connection.
- m_
tls bool - password str
Secret. The password you set in your custom HTTPS endpoint for basic authentication.
- tls_
hostname str The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- user_name str
Secret. The username you set in your custom HTTPS endpoint for basic authentication.
- authentication
Type String Either
NONE
for no authentication, orBASIC
. For basic authentication, provide theuser_name
andpassword
you set in your custom HTTPS endpoint.- connector
Name String The name of the connector.
- url String
Enter the secure URL where you want to send and store your logs.
- ca
Cert String Secret. The certification authority (CA) certificate used to verify the origin server's certificate. It's needed if the certificate stored in
client_cert
is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.- client
Cert String Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- client
Key String Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compress
Logs Boolean Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to
true
.- connector
Id Number - content
Type String Content type to pass in the log file header.
- custom
Header StringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom
Header StringValue The custom header's contents passed with the request that contains information about the client connection.
- m
Tls Boolean - password String
Secret. The password you set in your custom HTTPS endpoint for basic authentication.
- tls
Hostname String The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- userName String
Secret. The username you set in your custom HTTPS endpoint for basic authentication.
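A minimal TypeScript sketch of a custom HTTPS endpoint connector with basic authentication, assuming the akamai.types.input module path generated by the Pulumi SDK; the URL, custom header, and credential values are placeholders.

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as akamai from "@pulumi/akamai";

const cfg = new pulumi.Config();

// The URL, custom header, and credentials are placeholders.
const httpsConnector: akamai.types.input.DatastreamHttpsConnector = {
    connectorName: "https-logs",
    url: "https://logs.example.com/ingest",
    authenticationType: "BASIC",              // or "NONE"
    userName: cfg.requireSecret("httpsUser"),
    password: cfg.requireSecret("httpsPassword"),
    contentType: "application/json",
    customHeaderName: "x-client-id",          // optional custom header
    customHeaderValue: "datastream",
    compressLogs: true,
};
// Pass this object as the httpsConnector argument of akamai.Datastream.
```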
DatastreamLogglyConnector
- AuthToken string
Secret. The authentication token associated with your Loggly account.
- Connector
Name string The name of the connector.
- Endpoint string
The Loggly bulk endpoint URL where you want to send your logs.
- Content
Type string Content type to pass in the log file header.
- Custom
Header stringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- Custom
Header stringValue The custom header's contents passed with the request that contains information about the client connection.
- string
The tags you can use to segment and filter log events in Loggly. Learn more about Tags.
- AuthToken string
Secret. The authentication token associated with your Loggly account.
- Connector
Name string The name of the connector.
- Endpoint string
The Loggly bulk endpoint URL where you want to send your logs.
- Content
Type string Content type to pass in the log file header.
- Custom
Header stringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- Custom
Header stringValue The custom header's contents passed with the request that contains information about the client connection.
- string
The tags you can use to segment and filter log events in Loggly. Learn more about Tags.
- authToken String
Secret. The authentication token associated with your Loggly account.
- connector
Name String The name of the connector.
- endpoint String
The Loggly bulk endpoint URL where you want to send your logs.
- content
Type String Content type to pass in the log file header.
- custom
Header StringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom
Header StringValue The custom header's contents passed with the request that contains information about the client connection.
- String
The tags you can use to segment and filter log events in Loggly. Learn more about Tags.
- authToken string
Secret. The authentication token associated with your Loggly account.
- connector
Name string The name of the connector.
- endpoint string
The Loggly bulk endpoint URL where you want to send your logs.
- content
Type string Content type to pass in the log file header.
- custom
Header stringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom
Header stringValue The custom header's contents passed with the request that contains information about the client connection.
- string
The tags you can use to segment and filter log events in Loggly. Learn more about Tags.
- auth_token str
Secret. The authentication token associated with your Loggly account.
- connector_
name str The name of the connector.
- endpoint str
The Loggly bulk endpoint URL where you want to send your logs.
- content_
type str Content type to pass in the log file header.
- custom_
header_ strname A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom_
header_ strvalue The custom header's contents passed with the request that contains information about the client connection.
- str
The tags you can use to segment and filter log events in Loggly. Learn more about Tags.
- authToken String
Secret. The authentication token associated with your Loggly account.
- connector
Name String The name of the connector.
- endpoint String
The Loggly bulk endpoint URL where you want to send your logs.
- content
Type String Content type to pass in the log file header.
- custom
Header StringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom
Header StringValue The custom header's contents passed with the request that contains information about the client connection.
- String
The tags you can use to segment and filter log events in Loggly. Learn more about Tags.
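A minimal TypeScript sketch of a Loggly connector block, assuming the akamai.types.input module path generated by the Pulumi SDK; the endpoint URL and tag values are placeholders, and the tags field name is inferred from the description above rather than copied from this page.

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as akamai from "@pulumi/akamai";

const cfg = new pulumi.Config();

// The endpoint URL and tags are placeholders; the tags field name is inferred
// from the description above and should be checked against the SDK.
const logglyConnector: akamai.types.input.DatastreamLogglyConnector = {
    connectorName: "loggly-logs",
    endpoint: "https://logs-01.loggly.com/bulk/",
    authToken: cfg.requireSecret("logglyToken"),
    tags: "akamai,edge-logs",
    contentType: "application/json",
};
// Pass this object as the logglyConnector argument of akamai.Datastream.
```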
DatastreamNewRelicConnector
- Auth
Token string Secret. Your Log API token for your account in New Relic.
- Connector
Name string The name of the connector.
- Endpoint string
The New Relic endpoint URL where you want to send your logs.
- Content
Type string Content type to pass in the log file header.
- Custom
Header stringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- Custom
Header stringValue The custom header's contents passed with the request that contains information about the client connection.
- Auth
Token string Secret. Your Log API token for your account in New Relic.
- Connector
Name string The name of the connector.
- Endpoint string
The Elasticsearch bulk endpoint URL in the format:
https://<hostname>.elastic-cloud.com:9243/_bulk/
. Setindex_name
in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.- Content
Type string Content type to pass in the log file header.
- Custom
Header stringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- Custom
Header stringValue The custom header's contents passed with the request that contains information about the client connection.
- auth
Token String Secret. Your Log API token for your account in New Relic.
- connector
Name String The name of the connector.
- endpoint String
The Elasticsearch bulk endpoint URL in the format:
https://<hostname>.elastic-cloud.com:9243/_bulk/
. Setindex_name
in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.- content
Type String Content type to pass in the log file header.
- custom
Header StringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom
Header StringValue The custom header's contents passed with the request that contains information about the client connection.
- auth
Token string Secret. Your Log API token for your account in New Relic.
- connector
Name string The name of the connector.
- endpoint string
The Elasticsearch bulk endpoint URL in the format:
https://<hostname>.elastic-cloud.com:9243/_bulk/
. Setindex_name
in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.- content
Type string Content type to pass in the log file header.
- custom
Header stringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom
Header stringValue The custom header's contents passed with the request that contains information about the client connection.
- auth_
token str Secret. Your Log API token for your account in New Relic.
- connector_
name str The name of the connector.
- endpoint str
The Elasticsearch bulk endpoint URL in the format:
https://<hostname>.elastic-cloud.com:9243/_bulk/
. Setindex_name
in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.- content_
type str Content type to pass in the log file header.
- custom_
header_ strname A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom_
header_ strvalue The custom header's contents passed with the request that contains information about the client connection.
- auth
Token String Secret. Your Log API token for your account in New Relic.
- connector
Name String The name of the connector.
- endpoint String
The Elasticsearch bulk endpoint URL in the format:
https://<hostname>.elastic-cloud.com:9243/_bulk/
. Setindex_name
in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.- content
Type String Content type to pass in the log file header.
- custom
Header StringName A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom
Header StringValue The custom header's contents passed with the request that contains information about the client connection.
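For orientation, here is a hedged TypeScript sketch of a newRelicConnector block built from the fields above; the endpoint URL and token are placeholders for your own New Relic values.

```typescript
import * as pulumi from "@pulumi/pulumi";

// Hypothetical values only; the real endpoint and license key come from
// your New Relic account. authToken is marked Secret in the listing above.
const newRelicConnector = {
    connectorName: "new-relic-logs",
    endpoint: "https://log-api.newrelic.com/log/v1", // placeholder endpoint URL
    authToken: pulumi.secret("new-relic-license-key"),
    contentType: "application/json",
};
```

Supply this object as the newRelicConnector argument of akamai.Datastream along with the stream's other required inputs.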
DatastreamOracleConnector
- AccessKey string
- Secret. The access key identifier that you use to authenticate requests to your Oracle Cloud account. See Managing user credentials in OCS.
- Bucket string
- The name of the Oracle Cloud Storage bucket. See Working with Oracle Cloud Storage buckets.
- ConnectorName string
- The name of the connector.
- Namespace string
- The namespace of your Oracle Cloud Storage account. See Understanding Object Storage namespaces.
- Path string
- The path to the folder within your Oracle Cloud Storage bucket where you want to store your logs.
- Region string
- The Oracle Cloud Storage region where your bucket resides. See Regions and availability domains in OCS.
- SecretAccessKey string
- Secret. The secret access key identifier that you use to authenticate requests to your Oracle Cloud account.
- CompressLogs bool
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- ConnectorId int
- AccessKey string
- Secret. The access key identifier that you use to authenticate requests to your Oracle Cloud account. See Managing user credentials in OCS.
- Bucket string
- The name of the Oracle Cloud Storage bucket. See Working with Oracle Cloud Storage buckets.
- ConnectorName string
- The name of the connector.
- Namespace string
- The namespace of your Oracle Cloud Storage account. See Understanding Object Storage namespaces.
- Path string
- The path to the folder within your Oracle Cloud Storage bucket where you want to store your logs.
- Region string
- The Oracle Cloud Storage region where your bucket resides. See Regions and availability domains in OCS.
- SecretAccessKey string
- Secret. The secret access key identifier that you use to authenticate requests to your Oracle Cloud account.
- CompressLogs bool
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- ConnectorId int
- accessKey String
- Secret. The access key identifier that you use to authenticate requests to your Oracle Cloud account. See Managing user credentials in OCS.
- bucket String
- The name of the Oracle Cloud Storage bucket. See Working with Oracle Cloud Storage buckets.
- connectorName String
- The name of the connector.
- namespace String
- The namespace of your Oracle Cloud Storage account. See Understanding Object Storage namespaces.
- path String
- The path to the folder within your Oracle Cloud Storage bucket where you want to store your logs.
- region String
- The Oracle Cloud Storage region where your bucket resides. See Regions and availability domains in OCS.
- secretAccessKey String
- Secret. The secret access key identifier that you use to authenticate requests to your Oracle Cloud account.
- compressLogs Boolean
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connectorId Integer
- accessKey string
- Secret. The access key identifier that you use to authenticate requests to your Oracle Cloud account. See Managing user credentials in OCS.
- bucket string
- The name of the Oracle Cloud Storage bucket. See Working with Oracle Cloud Storage buckets.
- connectorName string
- The name of the connector.
- namespace string
- The namespace of your Oracle Cloud Storage account. See Understanding Object Storage namespaces.
- path string
- The path to the folder within your Oracle Cloud Storage bucket where you want to store your logs.
- region string
- The Oracle Cloud Storage region where your bucket resides. See Regions and availability domains in OCS.
- secretAccessKey string
- Secret. The secret access key identifier that you use to authenticate requests to your Oracle Cloud account.
- compressLogs boolean
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connectorId number
- access_key str
- Secret. The access key identifier that you use to authenticate requests to your Oracle Cloud account. See Managing user credentials in OCS.
- bucket str
- The name of the Oracle Cloud Storage bucket. See Working with Oracle Cloud Storage buckets.
- connector_name str
- The name of the connector.
- namespace str
- The namespace of your Oracle Cloud Storage account. See Understanding Object Storage namespaces.
- path str
- The path to the folder within your Oracle Cloud Storage bucket where you want to store your logs.
- region str
- The Oracle Cloud Storage region where your bucket resides. See Regions and availability domains in OCS.
- secret_access_key str
- Secret. The secret access key identifier that you use to authenticate requests to your Oracle Cloud account.
- compress_logs bool
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connector_id int
- accessKey String
- Secret. The access key identifier that you use to authenticate requests to your Oracle Cloud account. See Managing user credentials in OCS.
- bucket String
- The name of the Oracle Cloud Storage bucket. See Working with Oracle Cloud Storage buckets.
- connectorName String
- The name of the connector.
- namespace String
- The namespace of your Oracle Cloud Storage account. See Understanding Object Storage namespaces.
- path String
- The path to the folder within your Oracle Cloud Storage bucket where you want to store your logs.
- region String
- The Oracle Cloud Storage region where your bucket resides. See Regions and availability domains in OCS.
- secretAccessKey String
- Secret. The secret access key identifier that you use to authenticate requests to your Oracle Cloud account.
- compressLogs Boolean
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connectorId Number
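A hedged TypeScript sketch of an oracleConnector block using the fields above; the namespace, bucket, region, path, and keys are all placeholders for your own Oracle Cloud Storage setup.

```typescript
import * as pulumi from "@pulumi/pulumi";

// All values are hypothetical placeholders. accessKey and secretAccessKey
// are marked Secret in the listing above, so wrap them with pulumi.secret().
const oracleConnector = {
    connectorName: "oci-object-storage",
    namespace: "example-namespace",
    bucket: "datastream-logs",
    path: "logs/edge",
    region: "us-ashburn-1",
    accessKey: pulumi.secret("oci-access-key-id"),
    secretAccessKey: pulumi.secret("oci-secret-access-key"),
    compressLogs: true, // optional; defaults to true
};
```

Pass this object as the oracleConnector argument of akamai.Datastream together with the stream's other required inputs.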
DatastreamS3Connector
- AccessKey string
- Secret. The access key identifier that you use to authenticate requests to your Amazon S3 account.
- Bucket string
- The name of the Amazon S3 bucket.
- ConnectorName string
- The name of the connector.
- Path string
- The path to the folder within your Amazon S3 bucket where you want to store your logs.
- Region string
- The AWS region where your Amazon S3 bucket resides.
- SecretAccessKey string
- Secret. The secret access key identifier that you use to authenticate requests to your Amazon S3 account.
- CompressLogs bool
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- ConnectorId int
- AccessKey string
- Secret. The access key identifier that you use to authenticate requests to your Amazon S3 account.
- Bucket string
- The name of the Amazon S3 bucket.
- ConnectorName string
- The name of the connector.
- Path string
- The path to the folder within your Amazon S3 bucket where you want to store your logs.
- Region string
- The AWS region where your Amazon S3 bucket resides.
- SecretAccessKey string
- Secret. The secret access key identifier that you use to authenticate requests to your Amazon S3 account.
- CompressLogs bool
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- ConnectorId int
- accessKey String
- Secret. The access key identifier that you use to authenticate requests to your Amazon S3 account.
- bucket String
- The name of the Amazon S3 bucket.
- connectorName String
- The name of the connector.
- path String
- The path to the folder within your Amazon S3 bucket where you want to store your logs.
- region String
- The AWS region where your Amazon S3 bucket resides.
- secretAccessKey String
- Secret. The secret access key identifier that you use to authenticate requests to your Amazon S3 account.
- compressLogs Boolean
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connectorId Integer
- accessKey string
- Secret. The access key identifier that you use to authenticate requests to your Amazon S3 account.
- bucket string
- The name of the Amazon S3 bucket.
- connectorName string
- The name of the connector.
- path string
- The path to the folder within your Amazon S3 bucket where you want to store your logs.
- region string
- The AWS region where your Amazon S3 bucket resides.
- secretAccessKey string
- Secret. The secret access key identifier that you use to authenticate requests to your Amazon S3 account.
- compressLogs boolean
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connectorId number
- access_key str
- Secret. The access key identifier that you use to authenticate requests to your Amazon S3 account.
- bucket str
- The name of the Amazon S3 bucket.
- connector_name str
- The name of the connector.
- path str
- The path to the folder within your Amazon S3 bucket where you want to store your logs.
- region str
- The AWS region where your Amazon S3 bucket resides.
- secret_access_key str
- Secret. The secret access key identifier that you use to authenticate requests to your Amazon S3 account.
- compress_logs bool
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connector_id int
- accessKey String
- Secret. The access key identifier that you use to authenticate requests to your Amazon S3 account.
- bucket String
- The name of the Amazon S3 bucket.
- connectorName String
- The name of the connector.
- path String
- The path to the folder within your Amazon S3 bucket where you want to store your logs.
- region String
- The AWS region where your Amazon S3 bucket resides.
- secretAccessKey String
- Secret. The secret access key identifier that you use to authenticate requests to your Amazon S3 account.
- compressLogs Boolean
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connectorId Number
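A hedged TypeScript sketch of an s3Connector block with the fields above; the bucket, region, and credentials are placeholders.

```typescript
import * as pulumi from "@pulumi/pulumi";

// Placeholder values for an Amazon S3 destination. accessKey and
// secretAccessKey are marked Secret in the listing above.
const s3Connector = {
    connectorName: "s3-log-archive",
    bucket: "datastream-logs-example",
    path: "logs/edge",
    region: "us-east-1",
    accessKey: pulumi.secret("aws-access-key-id"),
    secretAccessKey: pulumi.secret("aws-secret-access-key"),
    compressLogs: true, // optional; defaults to true
};
```

Supply this object as the s3Connector argument of akamai.Datastream along with the stream's other required inputs.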
DatastreamSplunkConnector
- ConnectorName string
- The name of the connector.
- EventCollectorToken string
- Secret. The Event Collector token associated with your Splunk account. See View usage of Event Collector token in Splunk.
- Url string
- The secure URL where you want to send and store your logs.
- CaCert string
- Secret. The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate stored in client_cert isn't signed by a well-known certification authority, enter the CA certificate in PEM format for verification.
- ClientCert string
- Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- ClientKey string
- Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- CompressLogs bool
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- ConnectorId int
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection.
- MTls bool
- TlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- ConnectorName string
- The name of the connector.
- EventCollectorToken string
- Secret. The Event Collector token associated with your Splunk account. See View usage of Event Collector token in Splunk.
- Url string
- The secure URL where you want to send and store your logs.
- CaCert string
- Secret. The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate stored in client_cert isn't signed by a well-known certification authority, enter the CA certificate in PEM format for verification.
- ClientCert string
- Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- ClientKey string
- Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- CompressLogs bool
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- ConnectorId int
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection.
- MTls bool
- TlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- connectorName String
- The name of the connector.
- eventCollectorToken String
- Secret. The Event Collector token associated with your Splunk account. See View usage of Event Collector token in Splunk.
- url String
- The secure URL where you want to send and store your logs.
- caCert String
- Secret. The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate stored in client_cert isn't signed by a well-known certification authority, enter the CA certificate in PEM format for verification.
- clientCert String
- Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- clientKey String
- Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compressLogs Boolean
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connectorId Integer
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection.
- mTls Boolean
- tlsHostname String
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- connectorName string
- The name of the connector.
- eventCollectorToken string
- Secret. The Event Collector token associated with your Splunk account. See View usage of Event Collector token in Splunk.
- url string
- The secure URL where you want to send and store your logs.
- caCert string
- Secret. The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate stored in client_cert isn't signed by a well-known certification authority, enter the CA certificate in PEM format for verification.
- clientCert string
- Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- clientKey string
- Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compressLogs boolean
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connectorId number
- customHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- customHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection.
- mTls boolean
- tlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- connector_name str
- The name of the connector.
- event_collector_token str
- Secret. The Event Collector token associated with your Splunk account. See View usage of Event Collector token in Splunk.
- url str
- The secure URL where you want to send and store your logs.
- ca_cert str
- Secret. The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate stored in client_cert isn't signed by a well-known certification authority, enter the CA certificate in PEM format for verification.
- client_cert str
- Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- client_key str
- Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compress_logs bool
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connector_id int
- custom_header_name str
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom_header_value str
- The custom header's contents passed with the request that contains information about the client connection.
- m_tls bool
- tls_hostname str
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- connectorName String
- The name of the connector.
- eventCollectorToken String
- Secret. The Event Collector token associated with your Splunk account. See View usage of Event Collector token in Splunk.
- url String
- The secure URL where you want to send and store your logs.
- caCert String
- Secret. The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate stored in client_cert isn't signed by a well-known certification authority, enter the CA certificate in PEM format for verification.
- clientCert String
- Secret. The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- clientKey String
- Secret. The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compressLogs Boolean
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connectorId Number
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection.
- mTls Boolean
- tlsHostname String
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
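A hedged TypeScript sketch of a splunkConnector block using the fields above; the HTTP Event Collector URL and token are placeholders, and the optional mutual-TLS fields are only noted in comments.

```typescript
import * as pulumi from "@pulumi/pulumi";

// Placeholder Splunk HTTP Event Collector (HEC) settings. eventCollectorToken,
// caCert, clientCert, and clientKey are all marked Secret in the listing above.
const splunkConnector = {
    connectorName: "splunk-logs",
    url: "https://splunk.example.com:8088/services/collector/raw", // placeholder URL
    eventCollectorToken: pulumi.secret("splunk-hec-token"),
    compressLogs: true, // optional; defaults to true
    // For mutual authentication, also supply clientCert and clientKey
    // (plus caCert if the certificate isn't signed by a well-known CA),
    // and tlsHostname if the URL doesn't match the certificate's SANs.
};
```

Pass this object as the splunkConnector argument of akamai.Datastream together with the stream's other required inputs.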
DatastreamSumologicConnector
- CollectorCode string
- Secret. The unique HTTP collector code of your Sumo Logic endpoint.
- ConnectorName string
- The name of the connector.
- Endpoint string
- The Elasticsearch bulk endpoint URL in the format https://<hostname>.elastic-cloud.com:9243/_bulk/. Set index_name in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.
- CompressLogs bool
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- ConnectorId int
- ContentType string
- Content type to pass in the log file header.
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection.
- CollectorCode string
- Secret. The unique HTTP collector code of your Sumo Logic endpoint.
- ConnectorName string
- The name of the connector.
- Endpoint string
- The Elasticsearch bulk endpoint URL in the format https://<hostname>.elastic-cloud.com:9243/_bulk/. Set index_name in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.
- CompressLogs bool
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- ConnectorId int
- ContentType string
- Content type to pass in the log file header.
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection.
- collectorCode String
- Secret. The unique HTTP collector code of your Sumo Logic endpoint.
- connectorName String
- The name of the connector.
- endpoint String
- The Elasticsearch bulk endpoint URL in the format https://<hostname>.elastic-cloud.com:9243/_bulk/. Set index_name in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.
- compressLogs Boolean
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connectorId Integer
- contentType String
- Content type to pass in the log file header.
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection.
- collectorCode string
- Secret. The unique HTTP collector code of your Sumo Logic endpoint.
- connectorName string
- The name of the connector.
- endpoint string
- The Elasticsearch bulk endpoint URL in the format https://<hostname>.elastic-cloud.com:9243/_bulk/. Set index_name in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.
- compressLogs boolean
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connectorId number
- contentType string
- Content type to pass in the log file header.
- customHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- customHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection.
- collector_code str
- Secret. The unique HTTP collector code of your Sumo Logic endpoint.
- connector_name str
- The name of the connector.
- endpoint str
- The Elasticsearch bulk endpoint URL in the format https://<hostname>.elastic-cloud.com:9243/_bulk/. Set index_name in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.
- compress_logs bool
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connector_id int
- content_type str
- Content type to pass in the log file header.
- custom_header_name str
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- custom_header_value str
- The custom header's contents passed with the request that contains information about the client connection.
- collectorCode String
- Secret. The unique HTTP collector code of your Sumo Logic endpoint.
- connectorName String
- The name of the connector.
- endpoint String
- The Elasticsearch bulk endpoint URL in the format https://<hostname>.elastic-cloud.com:9243/_bulk/. Set index_name in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. Learn more about how to Stream logs to Elasticsearch.
- compressLogs Boolean
- Enables GZIP compression for a log file sent to a destination. If unspecified, this defaults to true.
- connectorId Number
- contentType String
- Content type to pass in the log file header.
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection.
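Finally, a hedged TypeScript sketch of a sumologicConnector block built from the fields above; the endpoint and collector code are placeholders for your own Sumo Logic collector.

```typescript
import * as pulumi from "@pulumi/pulumi";

// Placeholder Sumo Logic collector settings. collectorCode is marked Secret
// in the listing above.
const sumologicConnector = {
    connectorName: "sumologic-logs",
    endpoint: "https://collectors.sumologic.com/receiver/v1/http/", // placeholder
    collectorCode: pulumi.secret("sumo-collector-code"),
    contentType: "application/json",
    compressLogs: true, // optional; defaults to true
};
```

Supply this object as the sumologicConnector argument of akamai.Datastream along with the stream's other required inputs.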
Package Details
- Repository
- Akamai pulumi/pulumi-akamai
- License
- Apache-2.0
- Notes
This Pulumi package is based on the akamai Terraform Provider.