Akamai v7.5.0 published on Friday, Oct 11, 2024 by Pulumi
akamai.Datastream
Create Datastream Resource
Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.
Constructor syntax
new Datastream(name: string, args: DatastreamArgs, opts?: CustomResourceOptions);
@overload
def Datastream(resource_name: str,
               args: DatastreamArgs,
               opts: Optional[ResourceOptions] = None)
@overload
def Datastream(resource_name: str,
               opts: Optional[ResourceOptions] = None,
               properties: Optional[Sequence[str]] = None,
               active: Optional[bool] = None,
               stream_name: Optional[str] = None,
               contract_id: Optional[str] = None,
               group_id: Optional[str] = None,
               dataset_fields: Optional[Sequence[int]] = None,
               delivery_configuration: Optional[DatastreamDeliveryConfigurationArgs] = None,
               splunk_connector: Optional[DatastreamSplunkConnectorArgs] = None,
               sumologic_connector: Optional[DatastreamSumologicConnectorArgs] = None,
               azure_connector: Optional[DatastreamAzureConnectorArgs] = None,
               elasticsearch_connector: Optional[DatastreamElasticsearchConnectorArgs] = None,
               loggly_connector: Optional[DatastreamLogglyConnectorArgs] = None,
               new_relic_connector: Optional[DatastreamNewRelicConnectorArgs] = None,
               notification_emails: Optional[Sequence[str]] = None,
               oracle_connector: Optional[DatastreamOracleConnectorArgs] = None,
               https_connector: Optional[DatastreamHttpsConnectorArgs] = None,
               s3_connector: Optional[DatastreamS3ConnectorArgs] = None,
               datadog_connector: Optional[DatastreamDatadogConnectorArgs] = None,
               collect_midgress: Optional[bool] = None,
               gcs_connector: Optional[DatastreamGcsConnectorArgs] = None)
func NewDatastream(ctx *Context, name string, args DatastreamArgs, opts ...ResourceOption) (*Datastream, error)
public Datastream(string name, DatastreamArgs args, CustomResourceOptions? opts = null)
public Datastream(String name, DatastreamArgs args)
public Datastream(String name, DatastreamArgs args, CustomResourceOptions options)
type: akamai:Datastream
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
Parameters
- name string
- The unique name of the resource.
- args DatastreamArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- resource_name str
- The unique name of the resource.
- args DatastreamArgs
- The arguments to resource properties.
- opts ResourceOptions
- Bag of options to control resource's behavior.
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args DatastreamArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args DatastreamArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- name String
- The unique name of the resource.
- args DatastreamArgs
- The arguments to resource properties.
- options CustomResourceOptions
- Bag of options to control resource's behavior.
Constructor example
The following reference example uses placeholder values for all input properties.
var datastreamResource = new Akamai.Datastream("datastreamResource", new()
{
    Properties = new[]
    {
        "string",
    },
    Active = false,
    StreamName = "string",
    ContractId = "string",
    GroupId = "string",
    DatasetFields = new[]
    {
        0,
    },
    DeliveryConfiguration = new Akamai.Inputs.DatastreamDeliveryConfigurationArgs
    {
        Format = "string",
        Frequency = new Akamai.Inputs.DatastreamDeliveryConfigurationFrequencyArgs
        {
            IntervalInSecs = 0,
        },
        FieldDelimiter = "string",
        UploadFilePrefix = "string",
        UploadFileSuffix = "string",
    },
    SplunkConnector = new Akamai.Inputs.DatastreamSplunkConnectorArgs
    {
        DisplayName = "string",
        Endpoint = "string",
        EventCollectorToken = "string",
        CaCert = "string",
        ClientCert = "string",
        ClientKey = "string",
        CompressLogs = false,
        CustomHeaderName = "string",
        CustomHeaderValue = "string",
        MTls = false,
        TlsHostname = "string",
    },
    SumologicConnector = new Akamai.Inputs.DatastreamSumologicConnectorArgs
    {
        CollectorCode = "string",
        DisplayName = "string",
        Endpoint = "string",
        CompressLogs = false,
        ContentType = "string",
        CustomHeaderName = "string",
        CustomHeaderValue = "string",
    },
    AzureConnector = new Akamai.Inputs.DatastreamAzureConnectorArgs
    {
        AccessKey = "string",
        AccountName = "string",
        ContainerName = "string",
        DisplayName = "string",
        Path = "string",
        CompressLogs = false,
    },
    ElasticsearchConnector = new Akamai.Inputs.DatastreamElasticsearchConnectorArgs
    {
        DisplayName = "string",
        UserName = "string",
        Password = "string",
        IndexName = "string",
        Endpoint = "string",
        ContentType = "string",
        CustomHeaderValue = "string",
        CustomHeaderName = "string",
        CaCert = "string",
        MTls = false,
        ClientKey = "string",
        TlsHostname = "string",
        ClientCert = "string",
    },
    LogglyConnector = new Akamai.Inputs.DatastreamLogglyConnectorArgs
    {
        AuthToken = "string",
        DisplayName = "string",
        Endpoint = "string",
        ContentType = "string",
        CustomHeaderName = "string",
        CustomHeaderValue = "string",
        Tags = "string",
    },
    NewRelicConnector = new Akamai.Inputs.DatastreamNewRelicConnectorArgs
    {
        AuthToken = "string",
        DisplayName = "string",
        Endpoint = "string",
        ContentType = "string",
        CustomHeaderName = "string",
        CustomHeaderValue = "string",
    },
    NotificationEmails = new[]
    {
        "string",
    },
    OracleConnector = new Akamai.Inputs.DatastreamOracleConnectorArgs
    {
        AccessKey = "string",
        Bucket = "string",
        DisplayName = "string",
        Namespace = "string",
        Path = "string",
        Region = "string",
        SecretAccessKey = "string",
        CompressLogs = false,
    },
    HttpsConnector = new Akamai.Inputs.DatastreamHttpsConnectorArgs
    {
        AuthenticationType = "string",
        Endpoint = "string",
        DisplayName = "string",
        ClientKey = "string",
        CompressLogs = false,
        ContentType = "string",
        CustomHeaderName = "string",
        CustomHeaderValue = "string",
        ClientCert = "string",
        CaCert = "string",
        MTls = false,
        Password = "string",
        TlsHostname = "string",
        UserName = "string",
    },
    S3Connector = new Akamai.Inputs.DatastreamS3ConnectorArgs
    {
        AccessKey = "string",
        Bucket = "string",
        DisplayName = "string",
        Path = "string",
        Region = "string",
        SecretAccessKey = "string",
        CompressLogs = false,
    },
    DatadogConnector = new Akamai.Inputs.DatastreamDatadogConnectorArgs
    {
        AuthToken = "string",
        DisplayName = "string",
        Endpoint = "string",
        CompressLogs = false,
        Service = "string",
        Source = "string",
        Tags = "string",
    },
    CollectMidgress = false,
    GcsConnector = new Akamai.Inputs.DatastreamGcsConnectorArgs
    {
        Bucket = "string",
        DisplayName = "string",
        PrivateKey = "string",
        ProjectId = "string",
        ServiceAccountName = "string",
        CompressLogs = false,
        Path = "string",
    },
});
example, err := akamai.NewDatastream(ctx, "datastreamResource", &akamai.DatastreamArgs{
	Properties: pulumi.StringArray{
		pulumi.String("string"),
	},
	Active: pulumi.Bool(false),
	StreamName: pulumi.String("string"),
	ContractId: pulumi.String("string"),
	GroupId: pulumi.String("string"),
	DatasetFields: pulumi.IntArray{
		pulumi.Int(0),
	},
	DeliveryConfiguration: &akamai.DatastreamDeliveryConfigurationArgs{
		Format: pulumi.String("string"),
		Frequency: &akamai.DatastreamDeliveryConfigurationFrequencyArgs{
			IntervalInSecs: pulumi.Int(0),
		},
		FieldDelimiter: pulumi.String("string"),
		UploadFilePrefix: pulumi.String("string"),
		UploadFileSuffix: pulumi.String("string"),
	},
	SplunkConnector: &akamai.DatastreamSplunkConnectorArgs{
		DisplayName: pulumi.String("string"),
		Endpoint: pulumi.String("string"),
		EventCollectorToken: pulumi.String("string"),
		CaCert: pulumi.String("string"),
		ClientCert: pulumi.String("string"),
		ClientKey: pulumi.String("string"),
		CompressLogs: pulumi.Bool(false),
		CustomHeaderName: pulumi.String("string"),
		CustomHeaderValue: pulumi.String("string"),
		MTls: pulumi.Bool(false),
		TlsHostname: pulumi.String("string"),
	},
	SumologicConnector: &akamai.DatastreamSumologicConnectorArgs{
		CollectorCode: pulumi.String("string"),
		DisplayName: pulumi.String("string"),
		Endpoint: pulumi.String("string"),
		CompressLogs: pulumi.Bool(false),
		ContentType: pulumi.String("string"),
		CustomHeaderName: pulumi.String("string"),
		CustomHeaderValue: pulumi.String("string"),
	},
	AzureConnector: &akamai.DatastreamAzureConnectorArgs{
		AccessKey: pulumi.String("string"),
		AccountName: pulumi.String("string"),
		ContainerName: pulumi.String("string"),
		DisplayName: pulumi.String("string"),
		Path: pulumi.String("string"),
		CompressLogs: pulumi.Bool(false),
	},
	ElasticsearchConnector: &akamai.DatastreamElasticsearchConnectorArgs{
		DisplayName: pulumi.String("string"),
		UserName: pulumi.String("string"),
		Password: pulumi.String("string"),
		IndexName: pulumi.String("string"),
		Endpoint: pulumi.String("string"),
		ContentType: pulumi.String("string"),
		CustomHeaderValue: pulumi.String("string"),
		CustomHeaderName: pulumi.String("string"),
		CaCert: pulumi.String("string"),
		MTls: pulumi.Bool(false),
		ClientKey: pulumi.String("string"),
		TlsHostname: pulumi.String("string"),
		ClientCert: pulumi.String("string"),
	},
	LogglyConnector: &akamai.DatastreamLogglyConnectorArgs{
		AuthToken: pulumi.String("string"),
		DisplayName: pulumi.String("string"),
		Endpoint: pulumi.String("string"),
		ContentType: pulumi.String("string"),
		CustomHeaderName: pulumi.String("string"),
		CustomHeaderValue: pulumi.String("string"),
		Tags: pulumi.String("string"),
	},
	NewRelicConnector: &akamai.DatastreamNewRelicConnectorArgs{
		AuthToken: pulumi.String("string"),
		DisplayName: pulumi.String("string"),
		Endpoint: pulumi.String("string"),
		ContentType: pulumi.String("string"),
		CustomHeaderName: pulumi.String("string"),
		CustomHeaderValue: pulumi.String("string"),
	},
	NotificationEmails: pulumi.StringArray{
		pulumi.String("string"),
	},
	OracleConnector: &akamai.DatastreamOracleConnectorArgs{
		AccessKey: pulumi.String("string"),
		Bucket: pulumi.String("string"),
		DisplayName: pulumi.String("string"),
		Namespace: pulumi.String("string"),
		Path: pulumi.String("string"),
		Region: pulumi.String("string"),
		SecretAccessKey: pulumi.String("string"),
		CompressLogs: pulumi.Bool(false),
	},
	HttpsConnector: &akamai.DatastreamHttpsConnectorArgs{
		AuthenticationType: pulumi.String("string"),
		Endpoint: pulumi.String("string"),
		DisplayName: pulumi.String("string"),
		ClientKey: pulumi.String("string"),
		CompressLogs: pulumi.Bool(false),
		ContentType: pulumi.String("string"),
		CustomHeaderName: pulumi.String("string"),
		CustomHeaderValue: pulumi.String("string"),
		ClientCert: pulumi.String("string"),
		CaCert: pulumi.String("string"),
		MTls: pulumi.Bool(false),
		Password: pulumi.String("string"),
		TlsHostname: pulumi.String("string"),
		UserName: pulumi.String("string"),
	},
	S3Connector: &akamai.DatastreamS3ConnectorArgs{
		AccessKey: pulumi.String("string"),
		Bucket: pulumi.String("string"),
		DisplayName: pulumi.String("string"),
		Path: pulumi.String("string"),
		Region: pulumi.String("string"),
		SecretAccessKey: pulumi.String("string"),
		CompressLogs: pulumi.Bool(false),
	},
	DatadogConnector: &akamai.DatastreamDatadogConnectorArgs{
		AuthToken: pulumi.String("string"),
		DisplayName: pulumi.String("string"),
		Endpoint: pulumi.String("string"),
		CompressLogs: pulumi.Bool(false),
		Service: pulumi.String("string"),
		Source: pulumi.String("string"),
		Tags: pulumi.String("string"),
	},
	CollectMidgress: pulumi.Bool(false),
	GcsConnector: &akamai.DatastreamGcsConnectorArgs{
		Bucket: pulumi.String("string"),
		DisplayName: pulumi.String("string"),
		PrivateKey: pulumi.String("string"),
		ProjectId: pulumi.String("string"),
		ServiceAccountName: pulumi.String("string"),
		CompressLogs: pulumi.Bool(false),
		Path: pulumi.String("string"),
	},
})
var datastreamResource = new Datastream("datastreamResource", DatastreamArgs.builder()
    .properties("string")
    .active(false)
    .streamName("string")
    .contractId("string")
    .groupId("string")
    .datasetFields(0)
    .deliveryConfiguration(DatastreamDeliveryConfigurationArgs.builder()
        .format("string")
        .frequency(DatastreamDeliveryConfigurationFrequencyArgs.builder()
            .intervalInSecs(0)
            .build())
        .fieldDelimiter("string")
        .uploadFilePrefix("string")
        .uploadFileSuffix("string")
        .build())
    .splunkConnector(DatastreamSplunkConnectorArgs.builder()
        .displayName("string")
        .endpoint("string")
        .eventCollectorToken("string")
        .caCert("string")
        .clientCert("string")
        .clientKey("string")
        .compressLogs(false)
        .customHeaderName("string")
        .customHeaderValue("string")
        .mTls(false)
        .tlsHostname("string")
        .build())
    .sumologicConnector(DatastreamSumologicConnectorArgs.builder()
        .collectorCode("string")
        .displayName("string")
        .endpoint("string")
        .compressLogs(false)
        .contentType("string")
        .customHeaderName("string")
        .customHeaderValue("string")
        .build())
    .azureConnector(DatastreamAzureConnectorArgs.builder()
        .accessKey("string")
        .accountName("string")
        .containerName("string")
        .displayName("string")
        .path("string")
        .compressLogs(false)
        .build())
    .elasticsearchConnector(DatastreamElasticsearchConnectorArgs.builder()
        .displayName("string")
        .userName("string")
        .password("string")
        .indexName("string")
        .endpoint("string")
        .contentType("string")
        .customHeaderValue("string")
        .customHeaderName("string")
        .caCert("string")
        .mTls(false)
        .clientKey("string")
        .tlsHostname("string")
        .clientCert("string")
        .build())
    .logglyConnector(DatastreamLogglyConnectorArgs.builder()
        .authToken("string")
        .displayName("string")
        .endpoint("string")
        .contentType("string")
        .customHeaderName("string")
        .customHeaderValue("string")
        .tags("string")
        .build())
    .newRelicConnector(DatastreamNewRelicConnectorArgs.builder()
        .authToken("string")
        .displayName("string")
        .endpoint("string")
        .contentType("string")
        .customHeaderName("string")
        .customHeaderValue("string")
        .build())
    .notificationEmails("string")
    .oracleConnector(DatastreamOracleConnectorArgs.builder()
        .accessKey("string")
        .bucket("string")
        .displayName("string")
        .namespace("string")
        .path("string")
        .region("string")
        .secretAccessKey("string")
        .compressLogs(false)
        .build())
    .httpsConnector(DatastreamHttpsConnectorArgs.builder()
        .authenticationType("string")
        .endpoint("string")
        .displayName("string")
        .clientKey("string")
        .compressLogs(false)
        .contentType("string")
        .customHeaderName("string")
        .customHeaderValue("string")
        .clientCert("string")
        .caCert("string")
        .mTls(false)
        .password("string")
        .tlsHostname("string")
        .userName("string")
        .build())
    .s3Connector(DatastreamS3ConnectorArgs.builder()
        .accessKey("string")
        .bucket("string")
        .displayName("string")
        .path("string")
        .region("string")
        .secretAccessKey("string")
        .compressLogs(false)
        .build())
    .datadogConnector(DatastreamDatadogConnectorArgs.builder()
        .authToken("string")
        .displayName("string")
        .endpoint("string")
        .compressLogs(false)
        .service("string")
        .source("string")
        .tags("string")
        .build())
    .collectMidgress(false)
    .gcsConnector(DatastreamGcsConnectorArgs.builder()
        .bucket("string")
        .displayName("string")
        .privateKey("string")
        .projectId("string")
        .serviceAccountName("string")
        .compressLogs(false)
        .path("string")
        .build())
    .build());
datastream_resource = akamai.Datastream("datastreamResource",
    properties=["string"],
    active=False,
    stream_name="string",
    contract_id="string",
    group_id="string",
    dataset_fields=[0],
    delivery_configuration=akamai.DatastreamDeliveryConfigurationArgs(
        format="string",
        frequency=akamai.DatastreamDeliveryConfigurationFrequencyArgs(
            interval_in_secs=0,
        ),
        field_delimiter="string",
        upload_file_prefix="string",
        upload_file_suffix="string",
    ),
    splunk_connector=akamai.DatastreamSplunkConnectorArgs(
        display_name="string",
        endpoint="string",
        event_collector_token="string",
        ca_cert="string",
        client_cert="string",
        client_key="string",
        compress_logs=False,
        custom_header_name="string",
        custom_header_value="string",
        m_tls=False,
        tls_hostname="string",
    ),
    sumologic_connector=akamai.DatastreamSumologicConnectorArgs(
        collector_code="string",
        display_name="string",
        endpoint="string",
        compress_logs=False,
        content_type="string",
        custom_header_name="string",
        custom_header_value="string",
    ),
    azure_connector=akamai.DatastreamAzureConnectorArgs(
        access_key="string",
        account_name="string",
        container_name="string",
        display_name="string",
        path="string",
        compress_logs=False,
    ),
    elasticsearch_connector=akamai.DatastreamElasticsearchConnectorArgs(
        display_name="string",
        user_name="string",
        password="string",
        index_name="string",
        endpoint="string",
        content_type="string",
        custom_header_value="string",
        custom_header_name="string",
        ca_cert="string",
        m_tls=False,
        client_key="string",
        tls_hostname="string",
        client_cert="string",
    ),
    loggly_connector=akamai.DatastreamLogglyConnectorArgs(
        auth_token="string",
        display_name="string",
        endpoint="string",
        content_type="string",
        custom_header_name="string",
        custom_header_value="string",
        tags="string",
    ),
    new_relic_connector=akamai.DatastreamNewRelicConnectorArgs(
        auth_token="string",
        display_name="string",
        endpoint="string",
        content_type="string",
        custom_header_name="string",
        custom_header_value="string",
    ),
    notification_emails=["string"],
    oracle_connector=akamai.DatastreamOracleConnectorArgs(
        access_key="string",
        bucket="string",
        display_name="string",
        namespace="string",
        path="string",
        region="string",
        secret_access_key="string",
        compress_logs=False,
    ),
    https_connector=akamai.DatastreamHttpsConnectorArgs(
        authentication_type="string",
        endpoint="string",
        display_name="string",
        client_key="string",
        compress_logs=False,
        content_type="string",
        custom_header_name="string",
        custom_header_value="string",
        client_cert="string",
        ca_cert="string",
        m_tls=False,
        password="string",
        tls_hostname="string",
        user_name="string",
    ),
    s3_connector=akamai.DatastreamS3ConnectorArgs(
        access_key="string",
        bucket="string",
        display_name="string",
        path="string",
        region="string",
        secret_access_key="string",
        compress_logs=False,
    ),
    datadog_connector=akamai.DatastreamDatadogConnectorArgs(
        auth_token="string",
        display_name="string",
        endpoint="string",
        compress_logs=False,
        service="string",
        source="string",
        tags="string",
    ),
    collect_midgress=False,
    gcs_connector=akamai.DatastreamGcsConnectorArgs(
        bucket="string",
        display_name="string",
        private_key="string",
        project_id="string",
        service_account_name="string",
        compress_logs=False,
        path="string",
    ))
const datastreamResource = new akamai.Datastream("datastreamResource", {
    properties: ["string"],
    active: false,
    streamName: "string",
    contractId: "string",
    groupId: "string",
    datasetFields: [0],
    deliveryConfiguration: {
        format: "string",
        frequency: {
            intervalInSecs: 0,
        },
        fieldDelimiter: "string",
        uploadFilePrefix: "string",
        uploadFileSuffix: "string",
    },
    splunkConnector: {
        displayName: "string",
        endpoint: "string",
        eventCollectorToken: "string",
        caCert: "string",
        clientCert: "string",
        clientKey: "string",
        compressLogs: false,
        customHeaderName: "string",
        customHeaderValue: "string",
        mTls: false,
        tlsHostname: "string",
    },
    sumologicConnector: {
        collectorCode: "string",
        displayName: "string",
        endpoint: "string",
        compressLogs: false,
        contentType: "string",
        customHeaderName: "string",
        customHeaderValue: "string",
    },
    azureConnector: {
        accessKey: "string",
        accountName: "string",
        containerName: "string",
        displayName: "string",
        path: "string",
        compressLogs: false,
    },
    elasticsearchConnector: {
        displayName: "string",
        userName: "string",
        password: "string",
        indexName: "string",
        endpoint: "string",
        contentType: "string",
        customHeaderValue: "string",
        customHeaderName: "string",
        caCert: "string",
        mTls: false,
        clientKey: "string",
        tlsHostname: "string",
        clientCert: "string",
    },
    logglyConnector: {
        authToken: "string",
        displayName: "string",
        endpoint: "string",
        contentType: "string",
        customHeaderName: "string",
        customHeaderValue: "string",
        tags: "string",
    },
    newRelicConnector: {
        authToken: "string",
        displayName: "string",
        endpoint: "string",
        contentType: "string",
        customHeaderName: "string",
        customHeaderValue: "string",
    },
    notificationEmails: ["string"],
    oracleConnector: {
        accessKey: "string",
        bucket: "string",
        displayName: "string",
        namespace: "string",
        path: "string",
        region: "string",
        secretAccessKey: "string",
        compressLogs: false,
    },
    httpsConnector: {
        authenticationType: "string",
        endpoint: "string",
        displayName: "string",
        clientKey: "string",
        compressLogs: false,
        contentType: "string",
        customHeaderName: "string",
        customHeaderValue: "string",
        clientCert: "string",
        caCert: "string",
        mTls: false,
        password: "string",
        tlsHostname: "string",
        userName: "string",
    },
    s3Connector: {
        accessKey: "string",
        bucket: "string",
        displayName: "string",
        path: "string",
        region: "string",
        secretAccessKey: "string",
        compressLogs: false,
    },
    datadogConnector: {
        authToken: "string",
        displayName: "string",
        endpoint: "string",
        compressLogs: false,
        service: "string",
        source: "string",
        tags: "string",
    },
    collectMidgress: false,
    gcsConnector: {
        bucket: "string",
        displayName: "string",
        privateKey: "string",
        projectId: "string",
        serviceAccountName: "string",
        compressLogs: false,
        path: "string",
    },
});
type: akamai:Datastream
properties:
    active: false
    azureConnector:
        accessKey: string
        accountName: string
        compressLogs: false
        containerName: string
        displayName: string
        path: string
    collectMidgress: false
    contractId: string
    datadogConnector:
        authToken: string
        compressLogs: false
        displayName: string
        endpoint: string
        service: string
        source: string
        tags: string
    datasetFields:
        - 0
    deliveryConfiguration:
        fieldDelimiter: string
        format: string
        frequency:
            intervalInSecs: 0
        uploadFilePrefix: string
        uploadFileSuffix: string
    elasticsearchConnector:
        caCert: string
        clientCert: string
        clientKey: string
        contentType: string
        customHeaderName: string
        customHeaderValue: string
        displayName: string
        endpoint: string
        indexName: string
        mTls: false
        password: string
        tlsHostname: string
        userName: string
    gcsConnector:
        bucket: string
        compressLogs: false
        displayName: string
        path: string
        privateKey: string
        projectId: string
        serviceAccountName: string
    groupId: string
    httpsConnector:
        authenticationType: string
        caCert: string
        clientCert: string
        clientKey: string
        compressLogs: false
        contentType: string
        customHeaderName: string
        customHeaderValue: string
        displayName: string
        endpoint: string
        mTls: false
        password: string
        tlsHostname: string
        userName: string
    logglyConnector:
        authToken: string
        contentType: string
        customHeaderName: string
        customHeaderValue: string
        displayName: string
        endpoint: string
        tags: string
    newRelicConnector:
        authToken: string
        contentType: string
        customHeaderName: string
        customHeaderValue: string
        displayName: string
        endpoint: string
    notificationEmails:
        - string
    oracleConnector:
        accessKey: string
        bucket: string
        compressLogs: false
        displayName: string
        namespace: string
        path: string
        region: string
        secretAccessKey: string
    properties:
        - string
    s3Connector:
        accessKey: string
        bucket: string
        compressLogs: false
        displayName: string
        path: string
        region: string
        secretAccessKey: string
    splunkConnector:
        caCert: string
        clientCert: string
        clientKey: string
        compressLogs: false
        customHeaderName: string
        customHeaderValue: string
        displayName: string
        endpoint: string
        eventCollectorToken: string
        mTls: false
        tlsHostname: string
    streamName: string
    sumologicConnector:
        collectorCode: string
        compressLogs: false
        contentType: string
        customHeaderName: string
        customHeaderValue: string
        displayName: string
        endpoint: string
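The reference example above uses placeholders for every input, including all of the destination connectors at once; in practice a stream typically targets a single destination. The sketch below (TypeScript) wires one property to an S3 bucket. Every ID, name, and credential is a placeholder to replace with values from your own Akamai and AWS accounts, and the format, fieldDelimiter, and intervalInSecs values shown are common DataStream 2 settings rather than the full set of options.

import * as akamai from "@pulumi/akamai";

// A minimal sketch: one property streaming structured logs to S3.
// All IDs, names, and credentials below are placeholders.
const logsToS3 = new akamai.Datastream("logsToS3", {
    active: false,                     // activate later, once validated
    streamName: "example-logs",
    contractId: "C-0N7RAC7",           // placeholder contract ID
    groupId: "12345",                  // placeholder group ID
    properties: ["1234567"],           // placeholder property ID
    datasetFields: [1000, 1002, 1005], // placeholder data set field IDs
    deliveryConfiguration: {
        format: "STRUCTURED",
        fieldDelimiter: "SPACE",
        frequency: { intervalInSecs: 30 },
    },
    s3Connector: {
        displayName: "S3 destination",
        bucket: "my-datastream-logs",  // placeholder bucket name
        region: "us-east-1",
        path: "logs",
        accessKey: "AKIA-PLACEHOLDER",      // prefer Pulumi config secrets
        secretAccessKey: "SECRET-PLACEHOLDER",
        compressLogs: true,
    },
});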
Datastream Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.
Inputs
The Datastream resource accepts the following input properties:
- Active bool
- Defines whether the stream should be active
- ContractId string
- Identifies the contract that has access to the product
- DatasetFields List<int>
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- DeliveryConfiguration DatastreamDeliveryConfiguration
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- GroupId string
- Identifies the group that has access to the product and for which the stream configuration was created
- Properties List<string>
- Identifies the properties monitored in the stream
- StreamName string
- The name of the stream
- AzureConnector DatastreamAzureConnector
- CollectMidgress bool
- Indicates whether the stream should collect midgress data
- DatadogConnector DatastreamDatadogConnector
- ElasticsearchConnector DatastreamElasticsearchConnector
- GcsConnector DatastreamGcsConnector
- HttpsConnector DatastreamHttpsConnector
- LogglyConnector DatastreamLogglyConnector
- NewRelicConnector DatastreamNewRelicConnector
- NotificationEmails List<string>
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- OracleConnector DatastreamOracleConnector
- S3Connector DatastreamS3Connector
- SplunkConnector DatastreamSplunkConnector
- SumologicConnector DatastreamSumologicConnector
- Active bool
- Defines whether the stream should be active
- ContractId string
- Identifies the contract that has access to the product
- DatasetFields []int
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- DeliveryConfiguration DatastreamDeliveryConfigurationArgs
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- GroupId string
- Identifies the group that has access to the product and for which the stream configuration was created
- Properties []string
- Identifies the properties monitored in the stream
- StreamName string
- The name of the stream
- AzureConnector DatastreamAzureConnectorArgs
- CollectMidgress bool
- Indicates whether the stream should collect midgress data
- DatadogConnector DatastreamDatadogConnectorArgs
- ElasticsearchConnector DatastreamElasticsearchConnectorArgs
- GcsConnector DatastreamGcsConnectorArgs
- HttpsConnector DatastreamHttpsConnectorArgs
- LogglyConnector DatastreamLogglyConnectorArgs
- NewRelicConnector DatastreamNewRelicConnectorArgs
- NotificationEmails []string
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- OracleConnector DatastreamOracleConnectorArgs
- S3Connector DatastreamS3ConnectorArgs
- SplunkConnector DatastreamSplunkConnectorArgs
- SumologicConnector DatastreamSumologicConnectorArgs
- active Boolean
- Defines whether the stream should be active
- contractId String
- Identifies the contract that has access to the product
- datasetFields List<Integer>
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- deliveryConfiguration DatastreamDeliveryConfiguration
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- groupId String
- Identifies the group that has access to the product and for which the stream configuration was created
- properties List<String>
- Identifies the properties monitored in the stream
- streamName String
- The name of the stream
- azureConnector DatastreamAzureConnector
- collectMidgress Boolean
- Indicates whether the stream should collect midgress data
- datadogConnector DatastreamDatadogConnector
- elasticsearchConnector DatastreamElasticsearchConnector
- gcsConnector DatastreamGcsConnector
- httpsConnector DatastreamHttpsConnector
- logglyConnector DatastreamLogglyConnector
- newRelicConnector DatastreamNewRelicConnector
- notificationEmails List<String>
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracleConnector DatastreamOracleConnector
- s3Connector DatastreamS3Connector
- splunkConnector DatastreamSplunkConnector
- sumologicConnector DatastreamSumologicConnector
- active boolean
- Defines whether the stream should be active
- contractId string
- Identifies the contract that has access to the product
- datasetFields number[]
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- deliveryConfiguration DatastreamDeliveryConfiguration
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- groupId string
- Identifies the group that has access to the product and for which the stream configuration was created
- properties string[]
- Identifies the properties monitored in the stream
- streamName string
- The name of the stream
- azureConnector DatastreamAzureConnector
- collectMidgress boolean
- Indicates whether the stream should collect midgress data
- datadogConnector DatastreamDatadogConnector
- elasticsearchConnector DatastreamElasticsearchConnector
- gcsConnector DatastreamGcsConnector
- httpsConnector DatastreamHttpsConnector
- logglyConnector DatastreamLogglyConnector
- newRelicConnector DatastreamNewRelicConnector
- notificationEmails string[]
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracleConnector DatastreamOracleConnector
- s3Connector DatastreamS3Connector
- splunkConnector DatastreamSplunkConnector
- sumologicConnector DatastreamSumologicConnector
- active bool
- Defines whether the stream should be active
- contract_id str
- Identifies the contract that has access to the product
- dataset_fields Sequence[int]
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- delivery_configuration DatastreamDeliveryConfigurationArgs
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- group_id str
- Identifies the group that has access to the product and for which the stream configuration was created
- properties Sequence[str]
- Identifies the properties monitored in the stream
- stream_name str
- The name of the stream
- azure_connector DatastreamAzureConnectorArgs
- collect_midgress bool
- Indicates whether the stream should collect midgress data
- datadog_connector DatastreamDatadogConnectorArgs
- elasticsearch_connector DatastreamElasticsearchConnectorArgs
- gcs_connector DatastreamGcsConnectorArgs
- https_connector DatastreamHttpsConnectorArgs
- loggly_connector DatastreamLogglyConnectorArgs
- new_relic_connector DatastreamNewRelicConnectorArgs
- notification_emails Sequence[str]
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracle_connector DatastreamOracleConnectorArgs
- s3_connector DatastreamS3ConnectorArgs
- splunk_connector DatastreamSplunkConnectorArgs
- sumologic_connector DatastreamSumologicConnectorArgs
- active Boolean
- Defines whether the stream should be active
- contractId String
- Identifies the contract that has access to the product
- datasetFields List<Number>
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- deliveryConfiguration Property Map
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- groupId String
- Identifies the group that has access to the product and for which the stream configuration was created
- properties List<String>
- Identifies the properties monitored in the stream
- streamName String
- The name of the stream
- azureConnector Property Map
- collectMidgress Boolean
- Indicates whether the stream should collect midgress data
- datadogConnector Property Map
- elasticsearchConnector Property Map
- gcsConnector Property Map
- httpsConnector Property Map
- logglyConnector Property Map
- newRelicConnector Property Map
- notificationEmails List<String>
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracleConnector Property Map
- s3Connector Property Map
- splunkConnector Property Map
- sumologicConnector Property Map
Outputs
All input properties are implicitly available as output properties. Additionally, the Datastream resource produces the following output properties:
- CreatedBy string
- The username who created the stream
- CreatedDate string
- The date and time when the stream was created
- Id string
- The provider-assigned unique ID for this managed resource.
- LatestVersion int
- Identifies the latest active configuration version of the stream
- ModifiedBy string
- The username who modified the stream
- ModifiedDate string
- The date and time when the stream was modified
- PapiJson string
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- ProductId string
- The ID of the product for which the stream was created
- StreamVersion int
- Identifies the configuration version of the stream
- CreatedBy string
- The username who created the stream
- CreatedDate string
- The date and time when the stream was created
- Id string
- The provider-assigned unique ID for this managed resource.
- LatestVersion int
- Identifies the latest active configuration version of the stream
- ModifiedBy string
- The username who modified the stream
- ModifiedDate string
- The date and time when the stream was modified
- PapiJson string
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- ProductId string
- The ID of the product for which the stream was created
- StreamVersion int
- Identifies the configuration version of the stream
- createdBy String
- The username who created the stream
- createdDate String
- The date and time when the stream was created
- id String
- The provider-assigned unique ID for this managed resource.
- latestVersion Integer
- Identifies the latest active configuration version of the stream
- modifiedBy String
- The username who modified the stream
- modifiedDate String
- The date and time when the stream was modified
- papiJson String
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- productId String
- The ID of the product for which the stream was created
- streamVersion Integer
- Identifies the configuration version of the stream
- createdBy string
- The username who created the stream
- createdDate string
- The date and time when the stream was created
- id string
- The provider-assigned unique ID for this managed resource.
- latestVersion number
- Identifies the latest active configuration version of the stream
- modifiedBy string
- The username who modified the stream
- modifiedDate string
- The date and time when the stream was modified
- papiJson string
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- productId string
- The ID of the product for which the stream was created
- streamVersion number
- Identifies the configuration version of the stream
- created_by str
- The username who created the stream
- created_date str
- The date and time when the stream was created
- id str
- The provider-assigned unique ID for this managed resource.
- latest_version int
- Identifies the latest active configuration version of the stream
- modified_by str
- The username who modified the stream
- modified_date str
- The date and time when the stream was modified
- papi_json str
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- product_id str
- The ID of the product for which the stream was created
- stream_version int
- Identifies the configuration version of the stream
- createdBy String
- The username who created the stream
- createdDate String
- The date and time when the stream was created
- id String
- The provider-assigned unique ID for this managed resource.
- latestVersion Number
- Identifies the latest active configuration version of the stream
- modifiedBy String
- The username who modified the stream
- modifiedDate String
- The date and time when the stream was modified
- papiJson String
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- productId String
- The ID of the product for which the stream was created
- streamVersion Number
- Identifies the configuration version of the stream
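Like any Pulumi outputs, these can be exported from the stack or passed to other resources. As a sketch in TypeScript, assuming the logsToS3 stream object from the earlier example:

// Export the PAPI-ready JSON and the stream version for use elsewhere.
export const papiJson = logsToS3.papiJson;
export const streamVersion = logsToS3.streamVersion;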
Look up Existing Datastream Resource
Get an existing Datastream resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.
public static get(name: string, id: Input<ID>, state?: DatastreamState, opts?: CustomResourceOptions): Datastream
@staticmethod
def get(resource_name: str,
        id: str,
        opts: Optional[ResourceOptions] = None,
        active: Optional[bool] = None,
        azure_connector: Optional[DatastreamAzureConnectorArgs] = None,
        collect_midgress: Optional[bool] = None,
        contract_id: Optional[str] = None,
        created_by: Optional[str] = None,
        created_date: Optional[str] = None,
        datadog_connector: Optional[DatastreamDatadogConnectorArgs] = None,
        dataset_fields: Optional[Sequence[int]] = None,
        delivery_configuration: Optional[DatastreamDeliveryConfigurationArgs] = None,
        elasticsearch_connector: Optional[DatastreamElasticsearchConnectorArgs] = None,
        gcs_connector: Optional[DatastreamGcsConnectorArgs] = None,
        group_id: Optional[str] = None,
        https_connector: Optional[DatastreamHttpsConnectorArgs] = None,
        latest_version: Optional[int] = None,
        loggly_connector: Optional[DatastreamLogglyConnectorArgs] = None,
        modified_by: Optional[str] = None,
        modified_date: Optional[str] = None,
        new_relic_connector: Optional[DatastreamNewRelicConnectorArgs] = None,
        notification_emails: Optional[Sequence[str]] = None,
        oracle_connector: Optional[DatastreamOracleConnectorArgs] = None,
        papi_json: Optional[str] = None,
        product_id: Optional[str] = None,
        properties: Optional[Sequence[str]] = None,
        s3_connector: Optional[DatastreamS3ConnectorArgs] = None,
        splunk_connector: Optional[DatastreamSplunkConnectorArgs] = None,
        stream_name: Optional[str] = None,
        stream_version: Optional[int] = None,
        sumologic_connector: Optional[DatastreamSumologicConnectorArgs] = None) -> Datastream
func GetDatastream(ctx *Context, name string, id IDInput, state *DatastreamState, opts ...ResourceOption) (*Datastream, error)
public static Datastream Get(string name, Input<string> id, DatastreamState? state, CustomResourceOptions? opts = null)
public static Datastream get(String name, Output<String> id, DatastreamState state, CustomResourceOptions options)
Resource lookup is not supported in YAML
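As a sketch, rehydrating an existing stream in TypeScript looks like this; the stream ID is a placeholder value:

import * as akamai from "@pulumi/akamai";

// Look up an existing stream by its ID (placeholder) without
// re-creating or managing it in this program.
const existing = akamai.Datastream.get("existing-stream", "7050");

// State properties are then available as outputs, for example:
export const existingLatestVersion = existing.latestVersion;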
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- resource_name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- Active bool
- Defines whether the stream should be active
- AzureConnector DatastreamAzureConnector
- CollectMidgress bool
- Indicates whether the stream should collect midgress data
- ContractId string
- Identifies the contract that has access to the product
- CreatedBy string
- The username who created the stream
- CreatedDate string
- The date and time when the stream was created
- DatadogConnector DatastreamDatadogConnector
- DatasetFields List<int>
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- DeliveryConfiguration DatastreamDeliveryConfiguration
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- ElasticsearchConnector DatastreamElasticsearchConnector
- GcsConnector DatastreamGcsConnector
- GroupId string
- Identifies the group that has access to the product and for which the stream configuration was created
- HttpsConnector DatastreamHttpsConnector
- LatestVersion int
- Identifies the latest active configuration version of the stream
- LogglyConnector DatastreamLogglyConnector
- ModifiedBy string
- The username who modified the stream
- ModifiedDate string
- The date and time when the stream was modified
- NewRelicConnector DatastreamNewRelicConnector
- NotificationEmails List<string>
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- OracleConnector DatastreamOracleConnector
- PapiJson string
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- ProductId string
- The ID of the product for which the stream was created
- Properties List<string>
- Identifies the properties monitored in the stream
- S3Connector DatastreamS3Connector
- SplunkConnector DatastreamSplunkConnector
- StreamName string
- The name of the stream
- StreamVersion int
- Identifies the configuration version of the stream
- SumologicConnector DatastreamSumologicConnector
- Active bool
- Defines whether the stream should be active
- AzureConnector DatastreamAzureConnectorArgs
- CollectMidgress bool
- Indicates whether the stream should collect midgress data
- ContractId string
- Identifies the contract that has access to the product
- CreatedBy string
- The username who created the stream
- CreatedDate string
- The date and time when the stream was created
- DatadogConnector DatastreamDatadogConnectorArgs
- DatasetFields []int
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- DeliveryConfiguration DatastreamDeliveryConfigurationArgs
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- ElasticsearchConnector DatastreamElasticsearchConnectorArgs
- GcsConnector DatastreamGcsConnectorArgs
- GroupId string
- Identifies the group that has access to the product and for which the stream configuration was created
- HttpsConnector DatastreamHttpsConnectorArgs
- LatestVersion int
- Identifies the latest active configuration version of the stream
- LogglyConnector DatastreamLogglyConnectorArgs
- ModifiedBy string
- The username who modified the stream
- ModifiedDate string
- The date and time when the stream was modified
- NewRelicConnector DatastreamNewRelicConnectorArgs
- NotificationEmails []string
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- OracleConnector DatastreamOracleConnectorArgs
- PapiJson string
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- ProductId string
- The ID of the product for which the stream was created
- Properties []string
- Identifies the properties monitored in the stream
- S3Connector DatastreamS3ConnectorArgs
- SplunkConnector DatastreamSplunkConnectorArgs
- StreamName string
- The name of the stream
- StreamVersion int
- Identifies the configuration version of the stream
- SumologicConnector DatastreamSumologicConnectorArgs
- active Boolean
- Defines whether the stream should be active
- azureConnector DatastreamAzureConnector
- collectMidgress Boolean
- Indicates whether the stream should collect midgress data
- contractId String
- Identifies the contract that has access to the product
- createdBy String
- The username who created the stream
- createdDate String
- The date and time when the stream was created
- datadogConnector DatastreamDatadogConnector
- datasetFields List<Integer>
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- deliveryConfiguration DatastreamDeliveryConfiguration
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- elasticsearchConnector DatastreamElasticsearchConnector
- gcsConnector DatastreamGcsConnector
- groupId String
- Identifies the group that has access to the product and for which the stream configuration was created
- httpsConnector DatastreamHttpsConnector
- latestVersion Integer
- Identifies the latest active configuration version of the stream
- logglyConnector DatastreamLogglyConnector
- modifiedBy String
- The username who modified the stream
- modifiedDate String
- The date and time when the stream was modified
- newRelicConnector DatastreamNewRelicConnector
- notificationEmails List<String>
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracleConnector DatastreamOracleConnector
- papiJson String
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- productId String
- The ID of the product for which the stream was created
- properties List<String>
- Identifies the properties monitored in the stream
- s3Connector DatastreamS3Connector
- splunkConnector DatastreamSplunkConnector
- streamName String
- The name of the stream
- streamVersion Integer
- Identifies the configuration version of the stream
- sumologicConnector DatastreamSumologicConnector
- active boolean
- Defines whether the stream should be active
- azureConnector DatastreamAzureConnector
- collectMidgress boolean
- Indicates whether the stream should collect midgress data
- contractId string
- Identifies the contract that has access to the product
- createdBy string
- The username who created the stream
- createdDate string
- The date and time when the stream was created
- datadogConnector DatastreamDatadogConnector
- datasetFields number[]
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- deliveryConfiguration DatastreamDeliveryConfiguration
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- elasticsearchConnector DatastreamElasticsearchConnector
- gcsConnector DatastreamGcsConnector
- groupId string
- Identifies the group that has access to the product and for which the stream configuration was created
- httpsConnector DatastreamHttpsConnector
- latestVersion number
- Identifies the latest active configuration version of the stream
- logglyConnector DatastreamLogglyConnector
- modifiedBy string
- The username who modified the stream
- modifiedDate string
- The date and time when the stream was modified
- newRelicConnector DatastreamNewRelicConnector
- notificationEmails string[]
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracleConnector DatastreamOracleConnector
- papiJson string
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- productId string
- The ID of the product for which the stream was created
- properties string[]
- Identifies the properties monitored in the stream
- s3Connector DatastreamS3Connector
- splunkConnector DatastreamSplunk Connector
- streamName string
- The name of the stream
- streamVersion number
- Identifies the configuration version of the stream
- sumologicConnector DatastreamSumologicConnector
- active bool
- Defines whether the stream should be active
- azure_connector DatastreamAzureConnectorArgs
- collect_midgress bool
- Indicates whether the stream should collect midgress data
- contract_id str
- Identifies the contract that has access to the product
- created_by str
- The username who created the stream
- created_date str
- The date and time when the stream was created
- datadog_connector DatastreamDatadogConnectorArgs
- dataset_fields Sequence[int]
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- delivery_configuration DatastreamDeliveryConfigurationArgs
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- elasticsearch_connector DatastreamElasticsearchConnectorArgs
- gcs_connector DatastreamGcsConnectorArgs
- group_id str
- Identifies the group that has access to the product and for which the stream configuration was created
- https_connector DatastreamHttpsConnectorArgs
- latest_version int
- Identifies the latest active configuration version of the stream
- loggly_connector DatastreamLogglyConnectorArgs
- modified_by str
- The username who modified the stream
- modified_date str
- The date and time when the stream was modified
- new_relic_connector DatastreamNewRelicConnectorArgs
- notification_emails Sequence[str]
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracle_connector DatastreamOracleConnectorArgs
- papi_json str
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- product_id str
- The ID of the product for which the stream was created
- properties Sequence[str]
- Identifies the properties monitored in the stream
- s3_connector DatastreamS3ConnectorArgs
- splunk_connector DatastreamSplunkConnectorArgs
- stream_name str
- The name of the stream
- stream_version int
- Identifies the configuration version of the stream
- sumologic_connector DatastreamSumologicConnectorArgs
- active Boolean
- Defines whether the stream should be active
- azureConnector Property Map
- collectMidgress Boolean
- Indicates whether the stream should collect midgress data
- contractId String
- Identifies the contract that has access to the product
- createdBy String
- The username who created the stream
- createdDate String
- The date and time when the stream was created
- datadogConnector Property Map
- datasetFields List<Number>
- A list of data set fields selected from the associated template that the stream monitors in logs. The order of the identifiers defines how the values for these fields appear in the log lines
- deliveryConfiguration Property Map
- Provides information about the configuration related to logs (format, file names, delivery frequency)
- elasticsearchConnector Property Map
- gcsConnector Property Map
- groupId String
- Identifies the group that has access to the product and for which the stream configuration was created
- httpsConnector Property Map
- latestVersion Number
- Identifies the latest active configuration version of the stream
- logglyConnector Property Map
- modifiedBy String
- The username who modified the stream
- modifiedDate String
- The date and time when the stream was modified
- newRelicConnector Property Map
- notificationEmails List<String>
- List of email addresses where the system sends notifications about activations and deactivations of the stream
- oracleConnector Property Map
- papiJson String
- The configuration in JSON format that can be copy-pasted into PAPI configuration to enable datastream behavior
- productId String
- The ID of the product for which the stream was created
- properties List<String>
- Identifies the properties monitored in the stream
- s3Connector Property Map
- splunkConnector Property Map
- streamName String
- The name of the stream
- streamVersion Number
- Identifies the configuration version of the stream
- sumologicConnector Property Map
Supporting Types
DatastreamAzureConnector, DatastreamAzureConnectorArgs
- AccessKey string
- Access keys associated with Azure Storage account
- AccountName string
- Specifies the Azure Storage account name
- ContainerName string
- Specifies the Azure Storage container name
- DisplayName string
- The name of the connector
- Path string
- The path to the folder within Azure Storage container where logs will be stored
- CompressLogs bool
- Indicates whether the logs should be compressed
- AccessKey string
- Access keys associated with Azure Storage account
- AccountName string
- Specifies the Azure Storage account name
- ContainerName string
- Specifies the Azure Storage container name
- DisplayName string
- The name of the connector
- Path string
- The path to the folder within Azure Storage container where logs will be stored
- CompressLogs bool
- Indicates whether the logs should be compressed
- accessKey String
- Access keys associated with Azure Storage account
- accountName String
- Specifies the Azure Storage account name
- containerName String
- Specifies the Azure Storage container name
- displayName String
- The name of the connector
- path String
- The path to the folder within Azure Storage container where logs will be stored
- compressLogs Boolean
- Indicates whether the logs should be compressed
- accessKey string
- Access keys associated with Azure Storage account
- accountName string
- Specifies the Azure Storage account name
- containerName string
- Specifies the Azure Storage container name
- displayName string
- The name of the connector
- path string
- The path to the folder within Azure Storage container where logs will be stored
- compressLogs boolean
- Indicates whether the logs should be compressed
- access_key str
- Access keys associated with Azure Storage account
- account_name str
- Specifies the Azure Storage account name
- container_name str
- Specifies the Azure Storage container name
- display_name str
- The name of the connector
- path str
- The path to the folder within Azure Storage container where logs will be stored
- compress_logs bool
- Indicates whether the logs should be compressed
- accessKey String
- Access keys associated with Azure Storage account
- accountName String
- Specifies the Azure Storage account name
- containerName String
- Specifies the Azure Storage container name
- displayName String
- The name of the connector
- path String
- The path to the folder within Azure Storage container where logs will be stored
- compressLogs Boolean
- Indicates whether the logs should be compressed
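As a sketch, the same connector expressed in TypeScript; every value is a placeholder, and the akamai.types.input module path follows the provider SDK's usual layout:

import * as akamai from "@pulumi/akamai";

// Placeholder Azure Blob Storage destination; pass this object as the
// `azureConnector` argument when constructing akamai.Datastream.
const azureConnector: akamai.types.input.DatastreamAzureConnector = {
    displayName: "Azure destination",
    accountName: "mystorageaccount",  // placeholder storage account
    accessKey: "base64-encoded-key",  // placeholder; treat as a secret
    containerName: "datastream-logs",
    path: "logs",
    compressLogs: true,
};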
DatastreamDatadogConnector, DatastreamDatadogConnectorArgs
- AuthToken string
- The API key associated with the Datadog account
- DisplayName string
- The name of the connector
- Endpoint string
- The Datadog endpoint where logs will be stored
- CompressLogs bool
- Indicates whether the logs should be compressed
- Service string
- The service of the Datadog connector
- Source string
- The source of the Datadog connector
- Tags string
- The tags of the Datadog connector
- AuthToken string
- The API key associated with the Datadog account
- DisplayName string
- The name of the connector
- Endpoint string
- The Datadog endpoint where logs will be stored
- CompressLogs bool
- Indicates whether the logs should be compressed
- Service string
- The service of the Datadog connector
- Source string
- The source of the Datadog connector
- Tags string
- The tags of the Datadog connector
- authToken String
- The API key associated with the Datadog account
- displayName String
- The name of the connector
- endpoint String
- The Datadog endpoint where logs will be stored
- compressLogs Boolean
- Indicates whether the logs should be compressed
- service String
- The service of the Datadog connector
- source String
- The source of the Datadog connector
- tags String
- The tags of the Datadog connector
- authToken string
- The API key associated with the Datadog account
- displayName string
- The name of the connector
- endpoint string
- The Datadog endpoint where logs will be stored
- compressLogs boolean
- Indicates whether the logs should be compressed
- service string
- The service of the Datadog connector
- source string
- The source of the Datadog connector
- tags string
- The tags of the Datadog connector
- auth_token str
- The API key associated with the Datadog account
- display_name str
- The name of the connector
- endpoint str
- The Datadog endpoint where logs will be stored
- compress_logs bool
- Indicates whether the logs should be compressed
- service str
- The service of the Datadog connector
- source str
- The source of the Datadog connector
- tags str
- The tags of the Datadog connector
- authToken String
- The API key associated with the Datadog account
- displayName String
- The name of the connector
- endpoint String
- The Datadog endpoint where logs will be stored
- compressLogs Boolean
- Indicates whether the logs should be compressed
- service String
- The service of the Datadog connector
- source String
- The source of the Datadog connector
- tags String
- The tags of the Datadog connector
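A comparable TypeScript sketch for a Datadog destination follows; the endpoint URL and token are placeholders you would replace with your own intake endpoint and API key.

import * as akamai from "@pulumi/akamai";

// Placeholder values throughout; service, source, and tags are optional.
const datadogConnector: akamai.types.input.DatastreamDatadogConnector = {
    displayName: "example-datadog-destination",
    authToken: "datadog-api-key",                        // Datadog API key (placeholder)
    endpoint: "https://example.datadoghq.com/v1/input",  // placeholder intake URL
    service: "cdn",                                      // optional Datadog service
    source: "akamai",                                    // optional Datadog source
    tags: "env:prod,team:web",                           // optional comma-separated tags
    compressLogs: false,
};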
DatastreamDeliveryConfiguration, DatastreamDeliveryConfigurationArgs
- Format string
- The format in which logs will be received
- Frequency DatastreamDeliveryConfigurationFrequency
- The frequency of collecting logs from each uploader and sending these logs to a destination
- FieldDelimiter string
- A delimiter that you use to separate data set fields in log lines
- UploadFilePrefix string
- The prefix of the log file that will be sent to a destination
- UploadFileSuffix string
- The suffix of the log file that will be sent to a destination
- Format string
- The format in which logs will be received
- Frequency DatastreamDeliveryConfigurationFrequency
- The frequency of collecting logs from each uploader and sending these logs to a destination
- FieldDelimiter string
- A delimiter that you use to separate data set fields in log lines
- UploadFilePrefix string
- The prefix of the log file that will be sent to a destination
- UploadFileSuffix string
- The suffix of the log file that will be sent to a destination
- format String
- The format in which logs will be received
- frequency DatastreamDeliveryConfigurationFrequency
- The frequency of collecting logs from each uploader and sending these logs to a destination
- fieldDelimiter String
- A delimiter that you use to separate data set fields in log lines
- uploadFilePrefix String
- The prefix of the log file that will be sent to a destination
- uploadFileSuffix String
- The suffix of the log file that will be sent to a destination
- format string
- The format in which logs will be received
- frequency DatastreamDeliveryConfigurationFrequency
- The frequency of collecting logs from each uploader and sending these logs to a destination
- fieldDelimiter string
- A delimiter that you use to separate data set fields in log lines
- uploadFilePrefix string
- The prefix of the log file that will be sent to a destination
- uploadFileSuffix string
- The suffix of the log file that will be sent to a destination
- format str
- The format in which logs will be received
- frequency DatastreamDeliveryConfigurationFrequency
- The frequency of collecting logs from each uploader and sending these logs to a destination
- field_delimiter str
- A delimiter that you use to separate data set fields in log lines
- upload_file_prefix str
- The prefix of the log file that will be sent to a destination
- upload_file_suffix str
- The suffix of the log file that will be sent to a destination
- format String
- The format in which logs will be received
- frequency Property Map
- The frequency of collecting logs from each uploader and sending these logs to a destination
- fieldDelimiter String
- A delimiter that you use to separate data set fields in log lines
- uploadFilePrefix String
- The prefix of the log file that will be sent to a destination
- uploadFileSuffix String
- The suffix of the log file that will be sent to a destination
DatastreamDeliveryConfigurationFrequency, DatastreamDeliveryConfigurationFrequencyArgs
- IntervalInSecs int
- The time in seconds after which the system bundles log lines into a file and sends it to a destination
- IntervalInSecs int
- The time in seconds after which the system bundles log lines into a file and sends it to a destination
- intervalInSecs Integer
- The time in seconds after which the system bundles log lines into a file and sends it to a destination
- intervalInSecs number
- The time in seconds after which the system bundles log lines into a file and sends it to a destination
- interval_in_secs int
- The time in seconds after which the system bundles log lines into a file and sends it to a destination
- intervalInSecs Number
- The time in seconds after which the system bundles log lines into a file and sends it to a destination
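Putting the two types above together, a TypeScript sketch of a delivery configuration might look as follows. The format, delimiter, and interval values are assumptions based on common DataStream settings, not values taken from this reference.

import * as akamai from "@pulumi/akamai";

// Assumed values: "STRUCTURED"/"JSON" formats and a 30-second interval are
// typical DataStream settings, but verify them against your account.
const deliveryConfiguration: akamai.types.input.DatastreamDeliveryConfiguration = {
    format: "STRUCTURED",
    fieldDelimiter: "SPACE",          // only used with structured logs
    frequency: {
        intervalInSecs: 30,           // bundle and send log lines every 30 seconds
    },
    uploadFilePrefix: "ak",           // placeholder file-name prefix
    uploadFileSuffix: "ds",           // placeholder file-name suffix
};

This object is then passed as the deliveryConfiguration argument of the Datastream resource.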
DatastreamElasticsearchConnector, DatastreamElasticsearchConnectorArgs
- DisplayName string
- The name of the connector.
- Endpoint string
- The Elasticsearch bulk endpoint URL in the https://hostname.elastic-cloud.com:9243/_bulk/ format. Set indexName in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Elasticsearch.
- IndexName string
- The index name of the Elastic cloud where you want to store log files.
- Password string
- The Elasticsearch basic access authentication password.
- UserName string
- The Elasticsearch basic access authentication username.
- CaCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- ClientCert string
- The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- ClientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- ContentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- MTls bool
- Indicates whether mTLS is enabled or not.
- TlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- DisplayName string
- The name of the connector.
- Endpoint string
- The Elasticsearch bulk endpoint URL in the https://hostname.elastic-cloud.com:9243/_bulk/ format. Set indexName in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Elasticsearch.
- IndexName string
- The index name of the Elastic cloud where you want to store log files.
- Password string
- The Elasticsearch basic access authentication password.
- UserName string
- The Elasticsearch basic access authentication username.
- CaCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- ClientCert string
- The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- ClientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- ContentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- MTls bool
- Indicates whether mTLS is enabled or not.
- TlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- displayName String
- The name of the connector.
- endpoint String
- The Elasticsearch bulk endpoint URL in the https://hostname.elastic-cloud.com:9243/_bulk/ format. Set indexName in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Elasticsearch.
- indexName String
- The index name of the Elastic cloud where you want to store log files.
- password String
- The Elasticsearch basic access authentication password.
- userName String
- The Elasticsearch basic access authentication username.
- caCert String
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert String
- The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- clientKey String
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- contentType String
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- mTls Boolean
- Indicates whether mTLS is enabled or not.
- tlsHostname String
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- displayName string
- The name of the connector.
- endpoint string
- The Elasticsearch bulk endpoint URL in the https://hostname.elastic-cloud.com:9243/_bulk/ format. Set indexName in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Elasticsearch.
- indexName string
- The index name of the Elastic cloud where you want to store log files.
- password string
- The Elasticsearch basic access authentication password.
- userName string
- The Elasticsearch basic access authentication username.
- caCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert string
- The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- clientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- contentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- mTls boolean
- Indicates whether mTLS is enabled or not.
- tlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- display_name str
- The name of the connector.
- endpoint str
- The Elasticsearch bulk endpoint URL in the https://hostname.elastic-cloud.com:9243/_bulk/ format. Set indexName in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Elasticsearch.
- index_name str
- The index name of the Elastic cloud where you want to store log files.
- password str
- The Elasticsearch basic access authentication password.
- user_name str
- The Elasticsearch basic access authentication username.
- ca_cert str
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- client_cert str
- The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- client_key str
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- content_type str
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- custom_header_name str
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- custom_header_value str
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- m_tls bool
- Indicates whether mTLS is enabled or not.
- tls_hostname str
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- displayName String
- The name of the connector.
- endpoint String
- The Elasticsearch bulk endpoint URL in the https://hostname.elastic-cloud.com:9243/_bulk/ format. Set indexName in the appropriate field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Elasticsearch.
- indexName String
- The index name of the Elastic cloud where you want to store log files.
- password String
- The Elasticsearch basic access authentication password.
- userName String
- The Elasticsearch basic access authentication username.
- caCert String
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert String
- The PEM-formatted digital certificate you want to authenticate requests to your destination with. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- clientKey String
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the backend server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- contentType String
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- mTls Boolean
- Indicates whether mTLS is enabled or not.
- tlsHostname String
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
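As a sketch, an Elasticsearch destination in TypeScript could be declared like this; the endpoint follows the bulk-URL format described above, and the index name and credentials are placeholders.

import * as akamai from "@pulumi/akamai";

// Placeholder credentials; enable mTls and add clientCert/clientKey for mutual TLS.
const elasticsearchConnector: akamai.types.input.DatastreamElasticsearchConnector = {
    displayName: "example-elasticsearch-destination",
    endpoint: "https://hostname.elastic-cloud.com:9243/_bulk/",
    indexName: "akamai-logs",        // index that will receive the log files
    userName: "elastic",             // basic-auth username (placeholder)
    password: "changeme",            // basic-auth password (placeholder)
    mTls: false,
};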
DatastreamGcsConnector, DatastreamGcsConnectorArgs
- Bucket string
- The name of the storage bucket created in the Google Cloud account
- DisplayName string
- The name of the connector
- PrivateKey string
- The contents of the JSON private key generated and downloaded in the Google Cloud Storage account
- ProjectId string
- The unique ID of the Google Cloud project
- ServiceAccountName string
- The name of the service account with the storage.object.create permission or Storage Object Creator role
- CompressLogs bool
- Indicates whether the logs should be compressed
- Path string
- The path to the folder within the Google Cloud bucket where logs will be stored
- Bucket string
- The name of the storage bucket created in the Google Cloud account
- DisplayName string
- The name of the connector
- PrivateKey string
- The contents of the JSON private key generated and downloaded in the Google Cloud Storage account
- ProjectId string
- The unique ID of the Google Cloud project
- ServiceAccountName string
- The name of the service account with the storage.object.create permission or Storage Object Creator role
- CompressLogs bool
- Indicates whether the logs should be compressed
- Path string
- The path to the folder within the Google Cloud bucket where logs will be stored
- bucket String
- The name of the storage bucket created in the Google Cloud account
- displayName String
- The name of the connector
- privateKey String
- The contents of the JSON private key generated and downloaded in the Google Cloud Storage account
- projectId String
- The unique ID of the Google Cloud project
- serviceAccountName String
- The name of the service account with the storage.object.create permission or Storage Object Creator role
- compressLogs Boolean
- Indicates whether the logs should be compressed
- path String
- The path to the folder within the Google Cloud bucket where logs will be stored
- bucket string
- The name of the storage bucket created in the Google Cloud account
- displayName string
- The name of the connector
- privateKey string
- The contents of the JSON private key generated and downloaded in the Google Cloud Storage account
- projectId string
- The unique ID of the Google Cloud project
- serviceAccountName string
- The name of the service account with the storage.object.create permission or Storage Object Creator role
- compressLogs boolean
- Indicates whether the logs should be compressed
- path string
- The path to the folder within the Google Cloud bucket where logs will be stored
- bucket str
- The name of the storage bucket created in the Google Cloud account
- display_name str
- The name of the connector
- private_key str
- The contents of the JSON private key generated and downloaded in the Google Cloud Storage account
- project_id str
- The unique ID of the Google Cloud project
- service_account_name str
- The name of the service account with the storage.object.create permission or Storage Object Creator role
- compress_logs bool
- Indicates whether the logs should be compressed
- path str
- The path to the folder within the Google Cloud bucket where logs will be stored
- bucket String
- The name of the storage bucket created in the Google Cloud account
- displayName String
- The name of the connector
- privateKey String
- The contents of the JSON private key generated and downloaded in the Google Cloud Storage account
- projectId String
- The unique ID of the Google Cloud project
- serviceAccountName String
- The name of the service account with the storage.object.create permission or Storage Object Creator role
- compressLogs Boolean
- Indicates whether the logs should be compressed
- path String
- The path to the folder within the Google Cloud bucket where logs will be stored
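A TypeScript sketch of a Google Cloud Storage destination, with placeholder project and key values:

import * as akamai from "@pulumi/akamai";

// The service account needs the storage.object.create permission or the
// Storage Object Creator role; all values here are placeholders.
const gcsConnector: akamai.types.input.DatastreamGcsConnector = {
    displayName: "example-gcs-destination",
    bucket: "my-datastream-logs",                  // storage bucket name
    projectId: "my-gcp-project",                   // Google Cloud project ID
    serviceAccountName: "datastream-writer",       // writing service account
    privateKey: "<contents of the JSON key file>", // JSON private key contents
    path: "logs/akamai",                           // optional folder in the bucket
    compressLogs: true,
};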
DatastreamHttpsConnector, DatastreamHttpsConnectorArgs
- AuthenticationType string
- Either NONE for no authentication, or BASIC for username and password authentication
- DisplayName string
- The name of the connector
- Endpoint string
- The URL where logs will be stored
- CaCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- ClientCert string
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- ClientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- CompressLogs bool
- Indicates whether the logs should be compressed
- ContentType string
- The content type to pass in the log file header
- CustomHeaderName string
- The name of the custom header passed with the request to the destination
- CustomHeaderValue string
- The custom header's contents passed with the request to the destination
- MTls bool
- Indicates whether mTLS is enabled or not.
- Password string
- The password set for the custom HTTPS endpoint for authentication
- TlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- UserName string
- The username used for authentication
- AuthenticationType string
- Either NONE for no authentication, or BASIC for username and password authentication
- DisplayName string
- The name of the connector
- Endpoint string
- The URL where logs will be stored
- CaCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- ClientCert string
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- ClientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- CompressLogs bool
- Indicates whether the logs should be compressed
- ContentType string
- The content type to pass in the log file header
- CustomHeaderName string
- The name of the custom header passed with the request to the destination
- CustomHeaderValue string
- The custom header's contents passed with the request to the destination
- MTls bool
- Indicates whether mTLS is enabled or not.
- Password string
- The password set for the custom HTTPS endpoint for authentication
- TlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- UserName string
- The username used for authentication
- authenticationType String
- Either NONE for no authentication, or BASIC for username and password authentication
- displayName String
- The name of the connector
- endpoint String
- The URL where logs will be stored
- caCert String
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert String
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- clientKey String
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compressLogs Boolean
- Indicates whether the logs should be compressed
- contentType String
- The content type to pass in the log file header
- customHeaderName String
- The name of the custom header passed with the request to the destination
- customHeaderValue String
- The custom header's contents passed with the request to the destination
- mTls Boolean
- Indicates whether mTLS is enabled or not.
- password String
- The password set for the custom HTTPS endpoint for authentication
- tlsHostname String
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- userName String
- The username used for authentication
- authenticationType string
- Either NONE for no authentication, or BASIC for username and password authentication
- displayName string
- The name of the connector
- endpoint string
- The URL where logs will be stored
- caCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert string
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- clientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compressLogs boolean
- Indicates whether the logs should be compressed
- contentType string
- The content type to pass in the log file header
- customHeaderName string
- The name of the custom header passed with the request to the destination
- customHeaderValue string
- The custom header's contents passed with the request to the destination
- mTls boolean
- Indicates whether mTLS is enabled or not.
- password string
- The password set for the custom HTTPS endpoint for authentication
- tlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- userName string
- The username used for authentication
- authentication_type str
- Either NONE for no authentication, or BASIC for username and password authentication
- display_name str
- The name of the connector
- endpoint str
- The URL where logs will be stored
- ca_cert str
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- client_cert str
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- client_key str
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compress_logs bool
- Indicates whether the logs should be compressed
- content_type str
- The content type to pass in the log file header
- custom_header_name str
- The name of the custom header passed with the request to the destination
- custom_header_value str
- The custom header's contents passed with the request to the destination
- m_tls bool
- Indicates whether mTLS is enabled or not.
- password str
- The password set for the custom HTTPS endpoint for authentication
- tls_hostname str
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- user_name str
- The username used for authentication
- authenticationType String
- Either NONE for no authentication, or BASIC for username and password authentication
- displayName String
- The name of the connector
- endpoint String
- The URL where logs will be stored
- caCert String
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert String
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- clientKey String
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compressLogs Boolean
- Indicates whether the logs should be compressed
- contentType String
- The content type to pass in the log file header
- customHeaderName String
- The name of the custom header passed with the request to the destination
- customHeaderValue String
- The custom header's contents passed with the request to the destination
- mTls Boolean
- Indicates whether mTLS is enabled or not.
- password String
- The password set for the custom HTTPS endpoint for authentication
- tlsHostname String
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- userName String
- The username used for authentication
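For a custom HTTPS endpoint, a minimal TypeScript sketch with BASIC authentication might look like this; the endpoint, credentials, and content type are placeholders.

import * as akamai from "@pulumi/akamai";

// authenticationType accepts NONE or BASIC, per the reference above.
const httpsConnector: akamai.types.input.DatastreamHttpsConnector = {
    displayName: "example-https-destination",
    endpoint: "https://logs.example.com/ingest",  // placeholder endpoint URL
    authenticationType: "BASIC",
    userName: "ingest-user",                      // only needed with BASIC
    password: "ingest-password",                  // only needed with BASIC
    contentType: "application/json",              // placeholder content type
    compressLogs: true,
};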
DatastreamLogglyConnector, DatastreamLogglyConnectorArgs
- AuthToken string
- The unique HTTP code for your Loggly bulk endpoint.
- DisplayName string
- The name of the connector.
- Endpoint string
- The Loggly bulk endpoint URL in the https://hostname.loggly.com/bulk/ format. Set the endpoint code in the authToken field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Loggly.
- ContentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- Tags string
- The tags you can use to segment and filter log events in Loggly. See Tags in the Loggly documentation.
- AuthToken string
- The unique HTTP code for your Loggly bulk endpoint.
- DisplayName string
- The name of the connector.
- Endpoint string
- The Loggly bulk endpoint URL in the https://hostname.loggly.com/bulk/ format. Set the endpoint code in the authToken field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Loggly.
- ContentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- Tags string
- The tags you can use to segment and filter log events in Loggly. See Tags in the Loggly documentation.
- authToken String
- The unique HTTP code for your Loggly bulk endpoint.
- displayName String
- The name of the connector.
- endpoint String
- The Loggly bulk endpoint URL in the https://hostname.loggly.com/bulk/ format. Set the endpoint code in the authToken field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Loggly.
- contentType String
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- tags String
- The tags you can use to segment and filter log events in Loggly. See Tags in the Loggly documentation.
- authToken string
- The unique HTTP code for your Loggly bulk endpoint.
- displayName string
- The name of the connector.
- endpoint string
- The Loggly bulk endpoint URL in the https://hostname.loggly.com/bulk/ format. Set the endpoint code in the authToken field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Loggly.
- contentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- tags string
- The tags you can use to segment and filter log events in Loggly. See Tags in the Loggly documentation.
- auth_token str
- The unique HTTP code for your Loggly bulk endpoint.
- display_name str
- The name of the connector.
- endpoint str
- The Loggly bulk endpoint URL in the https://hostname.loggly.com/bulk/ format. Set the endpoint code in the authToken field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Loggly.
- content_type str
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- custom_header_name str
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- custom_header_value str
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- tags str
- The tags you can use to segment and filter log events in Loggly. See Tags in the Loggly documentation.
- authToken String
- The unique HTTP code for your Loggly bulk endpoint.
- displayName String
- The name of the connector.
- endpoint String
- The Loggly bulk endpoint URL in the https://hostname.loggly.com/bulk/ format. Set the endpoint code in the authToken field instead of providing it in the URL. You can use Akamaized property hostnames as endpoint URLs. See Stream logs to Loggly.
- contentType String
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- tags String
- The tags you can use to segment and filter log events in Loggly. See Tags in the Loggly documentation.
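A TypeScript sketch for Loggly follows; per the reference above, the endpoint code goes in authToken rather than in the URL. All values are placeholders.

import * as akamai from "@pulumi/akamai";

// Placeholder values; tags are optional.
const logglyConnector: akamai.types.input.DatastreamLogglyConnector = {
    displayName: "example-loggly-destination",
    endpoint: "https://hostname.loggly.com/bulk/",  // bulk endpoint format
    authToken: "loggly-endpoint-code",              // endpoint code (placeholder)
    tags: "akamai,cdn",                             // optional Loggly tags
};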
DatastreamNewRelicConnector, DatastreamNewRelicConnectorArgs
- AuthToken string
- Your Log API token for your account in New Relic.
- DisplayName string
- The name of the connector.
- Endpoint string
- A New Relic endpoint URL you want to send your logs to. The endpoint URL should follow the https://<newrelic.com>/log/v1/ format. See Introduction to the Log API (https://docs.newrelic.com/docs/logs/log-api/introduction-log-api/) if you want to retrieve your New Relic endpoint URL.
- ContentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- AuthToken string
- Your Log API token for your account in New Relic.
- DisplayName string
- The name of the connector.
- Endpoint string
- A New Relic endpoint URL you want to send your logs to. The endpoint URL should follow the https://<newrelic.com>/log/v1/ format. See Introduction to the Log API (https://docs.newrelic.com/docs/logs/log-api/introduction-log-api/) if you want to retrieve your New Relic endpoint URL.
- ContentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- CustomHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- CustomHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- authToken String
- Your Log API token for your account in New Relic.
- displayName String
- The name of the connector.
- endpoint String
- A New Relic endpoint URL you want to send your logs to. The endpoint URL should follow the https://<newrelic.com>/log/v1/ format. See Introduction to the Log API (https://docs.newrelic.com/docs/logs/log-api/introduction-log-api/) if you want to retrieve your New Relic endpoint URL.
- contentType String
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- authToken string
- Your Log API token for your account in New Relic.
- displayName string
- The name of the connector.
- endpoint string
- A New Relic endpoint URL you want to send your logs to. The endpoint URL should follow the https://<newrelic.com>/log/v1/ format. See Introduction to the Log API (https://docs.newrelic.com/docs/logs/log-api/introduction-log-api/) if you want to retrieve your New Relic endpoint URL.
- contentType string
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName string
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue string
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- auth_token str
- Your Log API token for your account in New Relic.
- display_name str
- The name of the connector.
- endpoint str
- A New Relic endpoint URL you want to send your logs to. The endpoint URL should follow the https://<newrelic.com>/log/v1/ format. See Introduction to the Log API (https://docs.newrelic.com/docs/logs/log-api/introduction-log-api/) if you want to retrieve your New Relic endpoint URL.
- content_type str
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- custom_header_name str
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- custom_header_value str
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
- authToken String
- Your Log API token for your account in New Relic.
- displayName String
- The name of the connector.
- endpoint String
- A New Relic endpoint URL you want to send your logs to. The endpoint URL should follow the https://<newrelic.com>/log/v1/ format. See Introduction to the Log API (https://docs.newrelic.com/docs/logs/log-api/introduction-log-api/) if you want to retrieve your New Relic endpoint URL.
- contentType String
- The type of the resource passed in the request's custom header. For details, see Additional options in the DataStream user guide.
- customHeaderName String
- A human-readable name for the request's custom header, containing only alphanumeric, dash, and underscore characters. For details, see Additional options in the DataStream user guide.
- customHeaderValue String
- The custom header's contents passed with the request that contains information about the client connection. For details, see Additional options in the DataStream user guide.
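A TypeScript sketch for New Relic; the endpoint host and token below are placeholders you would take from your New Relic account, following the URL format described above.

import * as akamai from "@pulumi/akamai";

// Placeholder endpoint and token; see the Log API link above for the real URL.
const newRelicConnector: akamai.types.input.DatastreamNewRelicConnector = {
    displayName: "example-newrelic-destination",
    endpoint: "https://example.newrelic.com/log/v1/", // placeholder Log API URL
    authToken: "new-relic-log-api-token",             // placeholder token
};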
DatastreamOracleConnector, DatastreamOracleConnectorArgs
- AccessKey string
- The access key identifier used to authenticate requests to the Oracle Cloud account
- Bucket string
- The name of the Oracle Cloud Storage bucket
- DisplayName string
- The name of the connector
- Namespace string
- The namespace of the Oracle Cloud Storage account
- Path string
- The path to the folder within your Oracle Cloud Storage bucket where logs will be stored
- Region string
- The Oracle Cloud Storage region where the bucket resides
- SecretAccessKey string
- The secret access key identifier used to authenticate requests to the Oracle Cloud account
- CompressLogs bool
- Indicates whether the logs should be compressed
- AccessKey string
- The access key identifier used to authenticate requests to the Oracle Cloud account
- Bucket string
- The name of the Oracle Cloud Storage bucket
- DisplayName string
- The name of the connector
- Namespace string
- The namespace of the Oracle Cloud Storage account
- Path string
- The path to the folder within your Oracle Cloud Storage bucket where logs will be stored
- Region string
- The Oracle Cloud Storage region where the bucket resides
- SecretAccessKey string
- The secret access key identifier used to authenticate requests to the Oracle Cloud account
- CompressLogs bool
- Indicates whether the logs should be compressed
- accessKey String
- The access key identifier used to authenticate requests to the Oracle Cloud account
- bucket String
- The name of the Oracle Cloud Storage bucket
- displayName String
- The name of the connector
- namespace String
- The namespace of the Oracle Cloud Storage account
- path String
- The path to the folder within your Oracle Cloud Storage bucket where logs will be stored
- region String
- The Oracle Cloud Storage region where the bucket resides
- secretAccessKey String
- The secret access key identifier used to authenticate requests to the Oracle Cloud account
- compressLogs Boolean
- Indicates whether the logs should be compressed
- accessKey string
- The access key identifier used to authenticate requests to the Oracle Cloud account
- bucket string
- The name of the Oracle Cloud Storage bucket
- displayName string
- The name of the connector
- namespace string
- The namespace of the Oracle Cloud Storage account
- path string
- The path to the folder within your Oracle Cloud Storage bucket where logs will be stored
- region string
- The Oracle Cloud Storage region where the bucket resides
- secretAccessKey string
- The secret access key identifier used to authenticate requests to the Oracle Cloud account
- compressLogs boolean
- Indicates whether the logs should be compressed
- access_key str
- The access key identifier used to authenticate requests to the Oracle Cloud account
- bucket str
- The name of the Oracle Cloud Storage bucket
- display_name str
- The name of the connector
- namespace str
- The namespace of the Oracle Cloud Storage account
- path str
- The path to the folder within your Oracle Cloud Storage bucket where logs will be stored
- region str
- The Oracle Cloud Storage region where the bucket resides
- secret_access_key str
- The secret access key identifier used to authenticate requests to the Oracle Cloud account
- compress_logs bool
- Indicates whether the logs should be compressed
- accessKey String
- The access key identifier used to authenticate requests to the Oracle Cloud account
- bucket String
- The name of the Oracle Cloud Storage bucket
- displayName String
- The name of the connector
- namespace String
- The namespace of the Oracle Cloud Storage account
- path String
- The path to the folder within your Oracle Cloud Storage bucket where logs will be stored
- region String
- The Oracle Cloud Storage region where the bucket resides
- secretAccessKey String
- The secret access key identifier used to authenticate requests to the Oracle Cloud account
- compressLogs Boolean
- Indicates whether the logs should be compressed
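An Oracle Cloud Storage destination sketch in TypeScript, with placeholder namespace, region, and keys:

import * as akamai from "@pulumi/akamai";

// Placeholder values throughout.
const oracleConnector: akamai.types.input.DatastreamOracleConnector = {
    displayName: "example-oracle-destination",
    bucket: "my-datastream-logs",         // Oracle Cloud Storage bucket
    namespace: "my-namespace",            // Object Storage namespace (placeholder)
    region: "us-ashburn-1",               // region where the bucket resides (placeholder)
    accessKey: "oracle-access-key",       // placeholder credentials
    secretAccessKey: "oracle-secret-key", // placeholder credentials
    path: "logs/akamai",                  // optional folder in the bucket
    compressLogs: true,
};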
DatastreamS3Connector, DatastreamS3ConnectorArgs
- AccessKey string
- The access key identifier used to authenticate requests to the Amazon S3 account
- Bucket string
- The name of the Amazon S3 bucket
- DisplayName string
- The name of the connector
- Path string
- The path to the folder within the Amazon S3 bucket where logs will be stored
- Region string
- The AWS region where the Amazon S3 bucket resides
- SecretAccessKey string
- The secret access key identifier used to authenticate requests to the Amazon S3 account
- CompressLogs bool
- Indicates whether the logs should be compressed
- AccessKey string
- The access key identifier used to authenticate requests to the Amazon S3 account
- Bucket string
- The name of the Amazon S3 bucket
- DisplayName string
- The name of the connector
- Path string
- The path to the folder within the Amazon S3 bucket where logs will be stored
- Region string
- The AWS region where the Amazon S3 bucket resides
- SecretAccessKey string
- The secret access key identifier used to authenticate requests to the Amazon S3 account
- CompressLogs bool
- Indicates whether the logs should be compressed
- accessKey String
- The access key identifier used to authenticate requests to the Amazon S3 account
- bucket String
- The name of the Amazon S3 bucket
- displayName String
- The name of the connector
- path String
- The path to the folder within the Amazon S3 bucket where logs will be stored
- region String
- The AWS region where the Amazon S3 bucket resides
- secretAccessKey String
- The secret access key identifier used to authenticate requests to the Amazon S3 account
- compressLogs Boolean
- Indicates whether the logs should be compressed
- accessKey string
- The access key identifier used to authenticate requests to the Amazon S3 account
- bucket string
- The name of the Amazon S3 bucket
- displayName string
- The name of the connector
- path string
- The path to the folder within the Amazon S3 bucket where logs will be stored
- region string
- The AWS region where the Amazon S3 bucket resides
- secretAccessKey string
- The secret access key identifier used to authenticate requests to the Amazon S3 account
- compressLogs boolean
- Indicates whether the logs should be compressed
- access_key str
- The access key identifier used to authenticate requests to the Amazon S3 account
- bucket str
- The name of the Amazon S3 bucket
- display_name str
- The name of the connector
- path str
- The path to the folder within the Amazon S3 bucket where logs will be stored
- region str
- The AWS region where the Amazon S3 bucket resides
- secret_access_key str
- The secret access key identifier used to authenticate requests to the Amazon S3 account
- compress_logs bool
- Indicates whether the logs should be compressed
- accessKey String
- The access key identifier used to authenticate requests to the Amazon S3 account
- bucket String
- The name of the Amazon S3 bucket
- displayName String
- The name of the connector
- path String
- The path to the folder within the Amazon S3 bucket where logs will be stored
- region String
- The AWS region where the Amazon S3 bucket resides
- secretAccessKey String
- The secret access key identifier used to authenticate requests to the Amazon S3 account
- compressLogs Boolean
- Indicates whether the logs should be compressed
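To show how a connector fits into the whole resource, here is a TypeScript sketch of a complete stream with an Amazon S3 destination. All IDs, names, and keys are placeholders, and the dataset field numbers and format value are illustrative assumptions only.

import * as akamai from "@pulumi/akamai";

// Every ID and credential below is a placeholder.
const stream = new akamai.Datastream("example-s3-stream", {
    active: false,
    streamName: "example-stream",
    contractId: "C-0N7RAC7",              // placeholder contract ID
    groupId: "12345",                     // placeholder group ID
    properties: ["123456"],               // placeholder property IDs
    datasetFields: [1000, 1002],          // placeholder dataset field IDs
    deliveryConfiguration: {
        format: "STRUCTURED",             // assumed format value
        frequency: { intervalInSecs: 30 },
    },
    s3Connector: {
        displayName: "example-s3-destination",
        bucket: "my-datastream-logs",     // placeholder bucket
        region: "us-east-1",              // placeholder AWS region
        path: "logs/akamai",              // optional folder in the bucket
        accessKey: "placeholder-access-key",
        secretAccessKey: "placeholder-secret-key",
        compressLogs: true,
    },
});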
DatastreamSplunkConnector, DatastreamSplunkConnectorArgs
- Display
Name string - The name of the connector
- Endpoint string
- The raw event Splunk URL where logs will be stored
- Event
Collector stringToken - The Event Collector token associated with Splunk account
- Ca
Cert string - The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- Client
Cert string - The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- Client
Key string - The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- Compress
Logs bool - Indicates whether the logs should be compressed
- Custom
Header stringName - The name of custom header passed with the request to the destination
- Custom
Header stringValue - The custom header's contents passed with the request to the destination
- MTls bool
- Indicates whether mTLS is enabled or not.
- Tls
Hostname string - The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- Display
Name string - The name of the connector
- Endpoint string
- The raw event Splunk URL where logs will be stored
- Event
Collector stringToken - The Event Collector token associated with Splunk account
- Ca
Cert string - The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- Client
Cert string - The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- Client
Key string - The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- Compress
Logs bool - Indicates whether the logs should be compressed
- Custom
Header stringName - The name of custom header passed with the request to the destination
- Custom
Header stringValue - The custom header's contents passed with the request to the destination
- MTls bool
- Indicates whether mTLS is enabled or not.
- Tls
Hostname string - The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- display
Name String - The name of the connector
- endpoint String
- The raw event Splunk URL where logs will be stored
- event
Collector StringToken - The Event Collector token associated with Splunk account
- ca
Cert String - The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- client
Cert String - The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- client
Key String - The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compress
Logs Boolean - Indicates whether the logs should be compressed
- custom
Header StringName - The name of custom header passed with the request to the destination
- custom
Header StringValue - The custom header's contents passed with the request to the destination
- m
Tls Boolean - Indicates whether mTLS is enabled or not.
- tls
Hostname String - The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- displayName string
- The name of the connector
- endpoint string
- The raw event Splunk URL where logs will be stored
- eventCollectorToken string
- The Event Collector token associated with Splunk account
- caCert string
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert string
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- clientKey string
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compressLogs boolean
- Indicates whether the logs should be compressed
- customHeaderName string
- The name of custom header passed with the request to the destination
- customHeaderValue string
- The custom header's contents passed with the request to the destination
- mTls boolean
- Indicates whether mTLS is enabled or not.
- tlsHostname string
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- display_name str
- The name of the connector
- endpoint str
- The raw event Splunk URL where logs will be stored
- event_collector_token str
- The Event Collector token associated with Splunk account
- ca_cert str
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- client_cert str
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- client_key str
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compress_logs bool
- Indicates whether the logs should be compressed
- custom_header_name str
- The name of custom header passed with the request to the destination
- custom_header_value str
- The custom header's contents passed with the request to the destination
- m_tls bool
- Indicates whether mTLS is enabled or not.
- tls_hostname str
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
- displayName String
- The name of the connector
- endpoint String
- The raw event Splunk URL where logs will be stored
- eventCollectorToken String
- The Event Collector token associated with Splunk account
- caCert String
- The certification authority (CA) certificate used to verify the origin server's certificate. If the certificate is not signed by a well-known certification authority, enter the CA certificate in the PEM format for verification.
- clientCert String
- The digital certificate in the PEM format you want to use to authenticate requests to your destination. If you want to use mutual authentication, you need to provide both the client certificate and the client key (in the PEM format).
- clientKey String
- The private key in the non-encrypted PKCS8 format you want to use to authenticate with the back-end server. If you want to use mutual authentication, you need to provide both the client certificate and the client key.
- compressLogs Boolean
- Indicates whether the logs should be compressed
- customHeaderName String
- The name of custom header passed with the request to the destination
- customHeaderValue String
- The custom header's contents passed with the request to the destination
- mTls Boolean
- Indicates whether mTLS is enabled or not.
- tlsHostname String
- The hostname that verifies the server's certificate and matches the Subject Alternative Names (SANs) in the certificate. If not provided, DataStream fetches the hostname from the endpoint URL.
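A minimal TypeScript sketch of these Splunk connector fields in use follows. All IDs, the endpoint, the Event Collector token, and the certificate file paths are placeholder values, and the surrounding Datastream arguments (including the delivery configuration shape) are abbreviated assumptions rather than a complete, verified configuration.
import * as fs from "fs";
import * as akamai from "@pulumi/akamai";

// Placeholder PEM material for optional mutual TLS; both the client
// certificate and the client key must be supplied together.
const clientCertPem = fs.readFileSync("client-cert.pem", "utf8");
const clientKeyPem = fs.readFileSync("client-key.pem", "utf8");

const splunkStream = new akamai.Datastream("splunkStream", {
    active: false,
    contractId: "C-0N7RAC7",      // placeholder contract ID
    groupId: "12345",             // placeholder group ID
    streamName: "splunk-stream",
    properties: ["12345"],        // placeholder property ID
    datasetFields: [1000, 1002],  // placeholder dataset field IDs
    deliveryConfiguration: {
        format: "JSON",
        frequency: { intervalInSecs: 30 },
    },
    splunkConnector: {
        displayName: "splunk-connector",
        endpoint: "https://splunk.example.com:8088/services/collector/raw",
        eventCollectorToken: "event-collector-token",  // placeholder token
        compressLogs: true,
        // Optional mTLS: enable the flag and pass both PEM values.
        mTls: true,
        clientCert: clientCertPem,
        clientKey: clientKeyPem,
        // Optional custom header sent with each request to the destination.
        customHeaderName: "custom-header",
        customHeaderValue: "custom-header-value",
    },
});
Because tlsHostname is omitted here, DataStream would fall back to the hostname in the endpoint URL when verifying the server's certificate.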
DatastreamSumologicConnector, DatastreamSumologicConnectorArgs
- CollectorCode string
- The unique HTTP collector code of Sumo Logic endpoint
- DisplayName string
- The name of the connector
- Endpoint string
- The Sumo Logic collection endpoint where logs will be stored
- CompressLogs bool
- Indicates whether the logs should be compressed
- ContentType string
- Content type to pass in the log file header
- CustomHeaderName string
- The name of custom header passed with the request to the destination
- CustomHeaderValue string
- The custom header's contents passed with the request to the destination
- CollectorCode string
- The unique HTTP collector code of Sumo Logic endpoint
- DisplayName string
- The name of the connector
- Endpoint string
- The Sumo Logic collection endpoint where logs will be stored
- CompressLogs bool
- Indicates whether the logs should be compressed
- ContentType string
- Content type to pass in the log file header
- CustomHeaderName string
- The name of custom header passed with the request to the destination
- CustomHeaderValue string
- The custom header's contents passed with the request to the destination
- collectorCode String
- The unique HTTP collector code of Sumo Logic endpoint
- displayName String
- The name of the connector
- endpoint String
- The Sumo Logic collection endpoint where logs will be stored
- compressLogs Boolean
- Indicates whether the logs should be compressed
- contentType String
- Content type to pass in the log file header
- customHeaderName String
- The name of custom header passed with the request to the destination
- customHeaderValue String
- The custom header's contents passed with the request to the destination
- collectorCode string
- The unique HTTP collector code of Sumo Logic endpoint
- displayName string
- The name of the connector
- endpoint string
- The Sumo Logic collection endpoint where logs will be stored
- compressLogs boolean
- Indicates whether the logs should be compressed
- contentType string
- Content type to pass in the log file header
- customHeaderName string
- The name of custom header passed with the request to the destination
- customHeaderValue string
- The custom header's contents passed with the request to the destination
- collector_code str
- The unique HTTP collector code of Sumo Logic endpoint
- display_name str
- The name of the connector
- endpoint str
- The Sumo Logic collection endpoint where logs will be stored
- compress_logs bool
- Indicates whether the logs should be compressed
- content_type str
- Content type to pass in the log file header
- custom_header_name str
- The name of custom header passed with the request to the destination
- custom_header_value str
- The custom header's contents passed with the request to the destination
- collectorCode String
- The unique HTTP collector code of Sumo Logic endpoint
- displayName String
- The name of the connector
- endpoint String
- The Sumo Logic collection endpoint where logs will be stored
- compressLogs Boolean
- Indicates whether the logs should be compressed
- contentType String
- Content type to pass in the log file header
- customHeaderName String
- The name of custom header passed with the request to the destination
- customHeaderValue String
- The custom header's contents passed with the request to the destination
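Likewise, a hedged TypeScript sketch of a sumologicConnector block follows. The collector code, endpoint, and all surrounding stream arguments are placeholders under the same assumptions as the Splunk example above.
import * as akamai from "@pulumi/akamai";

const sumologicStream = new akamai.Datastream("sumologicStream", {
    active: false,
    contractId: "C-0N7RAC7",      // placeholder contract ID
    groupId: "12345",             // placeholder group ID
    streamName: "sumologic-stream",
    properties: ["12345"],        // placeholder property ID
    datasetFields: [1000, 1002],  // placeholder dataset field IDs
    deliveryConfiguration: {
        format: "JSON",
        frequency: { intervalInSecs: 30 },
    },
    sumologicConnector: {
        displayName: "sumologic-connector",
        // The unique HTTP collector code from your Sumo Logic account
        // (placeholder value here).
        collectorCode: "collector-code",
        endpoint: "https://collectors.sumologic.com/receiver/v1/http",
        compressLogs: true,
        // Content type to pass in the log file header.
        contentType: "application/json",
        // Optional custom header sent with each request to the destination.
        customHeaderName: "custom-header",
        customHeaderValue: "custom-header-value",
    },
});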
Package Details
- Repository
- Akamai pulumi/pulumi-akamai
- License
- Apache-2.0
- Notes
- This Pulumi package is based on the akamai Terraform Provider.