published on Thursday, Apr 23, 2026 by Volcengine
Log Service supports data import, allowing you to structure data stored in sources such as TOS and Kafka and save it to Log Service.
Example Usage
Example coming soon!
resources:
  tLSImportTaskDemo:
    type: volcenginecc:tls:ImportTask
    name: TLSImportTaskDemo
    properties:
      description: ccapi-test-kafka
      importSourceInfo:
        kafkaSourceInfo:
          host: kafka-cnngsl83xxxxx.kafka.cn-beijing.ivolces.com:9092
          group: group1
          topic: topic1
          encode: UTF-8
          password: ""
          protocol: ""
          username: ""
          mechanism: ""
          instanceId: kafka-cnngsl83e6xxxxx
          isNeedAuth: false
          initialOffset: 0
          timeSourceDefault: 1
      projectId: 4f1af9e7-34af-4ce5-866e-xxxxxxx
      sourceType: kafka
      targetInfo:
        region: cn-beijing
        logType: json_log
        logSample: ""
        extractRule:
          beginRegex: ""
          delimiter: ""
          logRegex: ""
          logTemplate:
            format: ""
            type: ""
          quote: ""
          timeFormat: '%Y-%m-%d %H:%M:%S,%f'
          timeKey: time
          timeSample: ""
          unMatchLogKey: LogParseFailed
          unMatchUpLoadSwitch: true
          filterKeyRegex:
            - key: user_id
              regex: ^[0-9]+$
          skipLineCount: 0
          timeExtractRegex: '[0-9]{0,2}\/[0-9a-zA-Z]+\/[0-9:,]+'
          timeZone: Asia/Shanghai
      taskName: ccapi-test-kafka-1001
      topicId: b75fffd8-1986-460c-9cca-xxxxxxx
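The extractRule values in the example can be sanity-checked locally before creating the task. A minimal sketch using only the Python standard library (the sample timestamp is illustrative; the patterns are copied from the example above):

```python
import re
from datetime import datetime

# timeFormat from the example: parses timestamps such as "2026-04-23 10:15:30,123".
time_format = "%Y-%m-%d %H:%M:%S,%f"
parsed = datetime.strptime("2026-04-23 10:15:30,123", time_format)

# filterKeyRegex from the example: only records whose user_id is all digits pass the filter.
user_id_pattern = re.compile(r"^[0-9]+$")

# timeExtractRegex from the example: pulls a raw timestamp out of a log line.
time_extract = re.compile(r"[0-9]{0,2}\/[0-9a-zA-Z]+\/[0-9:,]+")

print(parsed.year)                           # 2026
print(bool(user_id_pattern.match("12345")))  # True
print(bool(user_id_pattern.match("user-1"))) # False
print(time_extract.search("23/Apr/2026:10:15:30,123").group(0))
```

Testing the patterns this way catches escaping mistakes before they surface as LogParseFailed records in the topic.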
Create ImportTask Resource
Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.
Constructor syntax
new ImportTask(name: string, args: ImportTaskArgs, opts?: CustomResourceOptions);

@overload
def ImportTask(resource_name: str,
               args: ImportTaskArgs,
               opts: Optional[ResourceOptions] = None)
@overload
def ImportTask(resource_name: str,
               opts: Optional[ResourceOptions] = None,
               import_source_info: Optional[ImportTaskImportSourceInfoArgs] = None,
               source_type: Optional[str] = None,
               target_info: Optional[ImportTaskTargetInfoArgs] = None,
               task_name: Optional[str] = None,
               topic_id: Optional[str] = None,
               description: Optional[str] = None,
               project_id: Optional[str] = None,
               status: Optional[int] = None)

func NewImportTask(ctx *Context, name string, args ImportTaskArgs, opts ...ResourceOption) (*ImportTask, error)

public ImportTask(string name, ImportTaskArgs args, CustomResourceOptions? opts = null)

public ImportTask(String name, ImportTaskArgs args)
public ImportTask(String name, ImportTaskArgs args, CustomResourceOptions options)
type: volcenginecc:tls:ImportTask
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
Parameters
- name string
- The unique name of the resource.
- args ImportTaskArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- resource_name str
- The unique name of the resource.
- args ImportTaskArgs
- The arguments to resource properties.
- opts ResourceOptions
- Bag of options to control resource's behavior.
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args ImportTaskArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args ImportTaskArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- name String
- The unique name of the resource.
- args ImportTaskArgs
- The arguments to resource properties.
- options CustomResourceOptions
- Bag of options to control resource's behavior.
ImportTask Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.
Inputs
In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.
The ImportTask resource accepts the following input properties:
- ImportSourceInfo Volcengine.ImportTaskImportSourceInfo - Import data source information.
- SourceType string - Data source type. Options: tos, kafka.
- TargetInfo Volcengine.ImportTaskTargetInfo - Output information for the data import task.
- TaskName string - Import task name.
- TopicId string - Log topic ID used to store data.
- Description string - Task description.
- ProjectId string - Log project ID for storing data.
- Status int - Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting.
- ImportSourceInfo ImportTaskImportSourceInfoArgs - Import data source information.
- SourceType string - Data source type. Options: tos, kafka.
- TargetInfo ImportTaskTargetInfoArgs - Output information for the data import task.
- TaskName string - Import task name.
- TopicId string - Log topic ID used to store data.
- Description string - Task description.
- ProjectId string - Log project ID for storing data.
- Status int - Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting.
- importSourceInfo ImportTaskImportSourceInfo - Import data source information.
- sourceType String - Data source type. Options: tos, kafka.
- targetInfo ImportTaskTargetInfo - Output information for the data import task.
- taskName String - Import task name.
- topicId String - Log topic ID used to store data.
- description String - Task description.
- projectId String - Log project ID for storing data.
- status Integer - Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting.
- importSourceInfo ImportTaskImportSourceInfo - Import data source information.
- sourceType string - Data source type. Options: tos, kafka.
- targetInfo ImportTaskTargetInfo - Output information for the data import task.
- taskName string - Import task name.
- topicId string - Log topic ID used to store data.
- description string - Task description.
- projectId string - Log project ID for storing data.
- status number - Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting.
- import_source_info ImportTaskImportSourceInfoArgs - Import data source information.
- source_type str - Data source type. Options: tos, kafka.
- target_info ImportTaskTargetInfoArgs - Output information for the data import task.
- task_name str - Import task name.
- topic_id str - Log topic ID used to store data.
- description str - Task description.
- project_id str - Log project ID for storing data.
- status int - Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting.
- importSourceInfo Property Map - Import data source information.
- sourceType String - Data source type. Options: tos, kafka.
- targetInfo Property Map - Output information for the data import task.
- taskName String - Import task name.
- topicId String - Log topic ID used to store data.
- description String - Task description.
- projectId String - Log project ID for storing data.
- status Number - Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting.
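The integer status codes above can be decoded with a small helper. This is an illustrative sketch, not part of the provider SDK (the name `describe_status` is an assumption):

```python
# Illustrative mapping of the status property's integer codes to task states,
# as documented for the ImportTask resource.
STATUS_NAMES = {
    0: "Importing",
    1: "Import completed",
    2: "Import error",
    3: "Stopping",
    4: "Stopped",
    5: "Restarting",
}

def describe_status(code: int) -> str:
    """Return the human-readable name for a data import task status code."""
    return STATUS_NAMES.get(code, f"Unknown status ({code})")

print(describe_status(1))  # Import completed
print(describe_status(9))  # Unknown status (9)
```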
Outputs
All input properties are implicitly available as output properties. Additionally, the ImportTask resource produces the following output properties:
- CreateTime string - Creation time.
- Id string - The provider-assigned unique ID for this managed resource.
- ProjectName string - Log project name.
- TaskId string - Import task ID.
- TaskStatistics Volcengine.ImportTaskTaskStatistics - Progress of the data import task.
- TopicName string - Log topic name.
- CreateTime string - Creation time.
- Id string - The provider-assigned unique ID for this managed resource.
- ProjectName string - Log project name.
- TaskId string - Import task ID.
- TaskStatistics ImportTaskTaskStatistics - Progress of the data import task.
- TopicName string - Log topic name.
- createTime String - Creation time.
- id String - The provider-assigned unique ID for this managed resource.
- projectName String - Log project name.
- taskId String - Import task ID.
- taskStatistics ImportTaskTaskStatistics - Progress of the data import task.
- topicName String - Log topic name.
- createTime string - Creation time.
- id string - The provider-assigned unique ID for this managed resource.
- projectName string - Log project name.
- taskId string - Import task ID.
- taskStatistics ImportTaskTaskStatistics - Progress of the data import task.
- topicName string - Log topic name.
- create_time str - Creation time.
- id str - The provider-assigned unique ID for this managed resource.
- project_name str - Log project name.
- task_id str - Import task ID.
- task_statistics ImportTaskTaskStatistics - Progress of the data import task.
- topic_name str - Log topic name.
- createTime String - Creation time.
- id String - The provider-assigned unique ID for this managed resource.
- projectName String - Log project name.
- taskId String - Import task ID.
- taskStatistics Property Map - Progress of the data import task.
- topicName String - Log topic name.
Look up Existing ImportTask Resource
Get an existing ImportTask resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.
public static get(name: string, id: Input<ID>, state?: ImportTaskState, opts?: CustomResourceOptions): ImportTask

@staticmethod
def get(resource_name: str,
        id: str,
        opts: Optional[ResourceOptions] = None,
        create_time: Optional[str] = None,
        description: Optional[str] = None,
        import_source_info: Optional[ImportTaskImportSourceInfoArgs] = None,
        project_id: Optional[str] = None,
        project_name: Optional[str] = None,
        source_type: Optional[str] = None,
        status: Optional[int] = None,
        target_info: Optional[ImportTaskTargetInfoArgs] = None,
        task_id: Optional[str] = None,
        task_name: Optional[str] = None,
        task_statistics: Optional[ImportTaskTaskStatisticsArgs] = None,
        topic_id: Optional[str] = None,
        topic_name: Optional[str] = None) -> ImportTask

func GetImportTask(ctx *Context, name string, id IDInput, state *ImportTaskState, opts ...ResourceOption) (*ImportTask, error)

public static ImportTask Get(string name, Input<string> id, ImportTaskState? state, CustomResourceOptions? opts = null)

public static ImportTask get(String name, Output<String> id, ImportTaskState state, CustomResourceOptions options)

resources:
  _:
    type: volcenginecc:tls:ImportTask
    get:
      id: ${id}
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- resource_name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- CreateTime string - Creation time.
- Description string - Task description.
- ImportSourceInfo Volcengine.ImportTaskImportSourceInfo - Import data source information.
- ProjectId string - Log project ID for storing data.
- ProjectName string - Log project name.
- SourceType string - Data source type. Options: tos, kafka.
- Status int - Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting.
- TargetInfo Volcengine.ImportTaskTargetInfo - Output information for the data import task.
- TaskId string - Import task ID.
- TaskName string - Import task name.
- TaskStatistics Volcengine.ImportTaskTaskStatistics - Progress of the data import task.
- TopicId string - Log topic ID used to store data.
- TopicName string - Log topic name.
- CreateTime string - Creation time.
- Description string - Task description.
- ImportSourceInfo ImportTaskImportSourceInfoArgs - Import data source information.
- ProjectId string - Log project ID for storing data.
- ProjectName string - Log project name.
- SourceType string - Data source type. Options: tos, kafka.
- Status int - Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting.
- TargetInfo ImportTaskTargetInfoArgs - Output information for the data import task.
- TaskId string - Import task ID.
- TaskName string - Import task name.
- TaskStatistics ImportTaskTaskStatisticsArgs - Progress of the data import task.
- TopicId string - Log topic ID used to store data.
- TopicName string - Log topic name.
- createTime String - Creation time.
- description String - Task description.
- importSourceInfo ImportTaskImportSourceInfo - Import data source information.
- projectId String - Log project ID for storing data.
- projectName String - Log project name.
- sourceType String - Data source type. Options: tos, kafka.
- status Integer - Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting.
- targetInfo ImportTaskTargetInfo - Output information for the data import task.
- taskId String - Import task ID.
- taskName String - Import task name.
- taskStatistics ImportTaskTaskStatistics - Progress of the data import task.
- topicId String - Log topic ID used to store data.
- topicName String - Log topic name.
- createTime string - Creation time.
- description string - Task description.
- importSourceInfo ImportTaskImportSourceInfo - Import data source information.
- projectId string - Log project ID for storing data.
- projectName string - Log project name.
- sourceType string - Data source type. Options: tos, kafka.
- status number - Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting.
- targetInfo ImportTaskTargetInfo - Output information for the data import task.
- taskId string - Import task ID.
- taskName string - Import task name.
- taskStatistics ImportTaskTaskStatistics - Progress of the data import task.
- topicId string - Log topic ID used to store data.
- topicName string - Log topic name.
- create_time str - Creation time.
- description str - Task description.
- import_source_info ImportTaskImportSourceInfoArgs - Import data source information.
- project_id str - Log project ID for storing data.
- project_name str - Log project name.
- source_type str - Data source type. Options: tos, kafka.
- status int - Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting.
- target_info ImportTaskTargetInfoArgs - Output information for the data import task.
- task_id str - Import task ID.
- task_name str - Import task name.
- task_statistics ImportTaskTaskStatisticsArgs - Progress of the data import task.
- topic_id str - Log topic ID used to store data.
- topic_name str - Log topic name.
- createTime String - Creation time.
- description String - Task description.
- importSourceInfo Property Map - Import data source information.
- projectId String - Log project ID for storing data.
- projectName String - Log project name.
- sourceType String - Data source type. Options: tos, kafka.
- status Number - Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting.
- targetInfo Property Map - Output information for the data import task.
- taskId String - Import task ID.
- taskName String - Import task name.
- taskStatistics Property Map - Progress of the data import task.
- topicId String - Log topic ID used to store data.
- topicName String - Log topic name.
Supporting Types
ImportTaskImportSourceInfo, ImportTaskImportSourceInfoArgs
- KafkaSourceInfo Volcengine.ImportTaskImportSourceInfoKafkaSourceInfo - Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required.
- TosSourceInfo Volcengine.ImportTaskImportSourceInfoTosSourceInfo - TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
- KafkaSourceInfo ImportTaskImportSourceInfoKafkaSourceInfo - Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required.
- TosSourceInfo ImportTaskImportSourceInfoTosSourceInfo - TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
- kafkaSourceInfo ImportTaskImportSourceInfoKafkaSourceInfo - Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required.
- tosSourceInfo ImportTaskImportSourceInfoTosSourceInfo - TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
- kafkaSourceInfo ImportTaskImportSourceInfoKafkaSourceInfo - Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required.
- tosSourceInfo ImportTaskImportSourceInfoTosSourceInfo - TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
- kafka_source_info ImportTaskImportSourceInfoKafkaSourceInfo - Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required.
- tos_source_info ImportTaskImportSourceInfoTosSourceInfo - TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
- kafkaSourceInfo Property Map - Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required.
- tosSourceInfo Property Map - TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
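Since the populated block must match sourceType, a hedged pre-flight check might look like this (the function `check_source_info` is illustrative, not a provider API):

```python
def check_source_info(source_type: str, has_kafka: bool, has_tos: bool) -> bool:
    """Return True when the populated source-info block matches sourceType.

    sourceType kafka requires kafkaSourceInfo; sourceType tos requires tosSourceInfo.
    """
    if source_type == "kafka":
        return has_kafka
    if source_type == "tos":
        return has_tos
    return False  # unknown source type

print(check_source_info("kafka", has_kafka=True, has_tos=False))   # True
print(check_source_info("tos", has_kafka=True, has_tos=False))     # False
```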
ImportTaskImportSourceInfoKafkaSourceInfo, ImportTaskImportSourceInfoKafkaSourceInfoArgs
- Encode string - Data encoding format. Available options: UTF-8, GBK.
- Group string - Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
- Host string - The service addresses for different types of Kafka clusters vary. Message Queue Kafka Edition: use the access point of the Kafka instance (for more information, see Access Point). If the Kafka instance and the Log Service project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: use the IP address and port number, or the domain name and port number, of the Kafka broker; only public network access is supported. Separate multiple service addresses with commas (,).
- InitialOffset int - Starting position for data import. Options: 0: Earliest, start importing from the first record in the specified Kafka topic. 1: Latest, start importing from the most recently generated record in the specified Kafka topic.
- InstanceId string - If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
- IsNeedAuth bool - Whether to enable authentication. If you use a public service address, enabling authentication is recommended.
- Mechanism string - Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
- Password string - Kafka SASL user password for authentication.
- Protocol string - Secure transmission protocol. Options: plaintext, sasl_ssl, ssl, and sasl_plaintext.
- TimeSourceDefault int - Specifies the log time. Options: 0: Use the Kafka message timestamp. 1: Use the current system time.
- Topic string - Kafka topic name. Separate multiple Kafka topics with commas (,).
- Username string - Kafka SASL username for authentication.
- Encode string - Data encoding format. Available options: UTF-8, GBK.
- Group string - Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
- Host string - The service addresses for different types of Kafka clusters vary. Message Queue Kafka Edition: use the access point of the Kafka instance (for more information, see Access Point). If the Kafka instance and the Log Service project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: use the IP address and port number, or the domain name and port number, of the Kafka broker; only public network access is supported. Separate multiple service addresses with commas (,).
- InitialOffset int - Starting position for data import. Options: 0: Earliest, start importing from the first record in the specified Kafka topic. 1: Latest, start importing from the most recently generated record in the specified Kafka topic.
- InstanceId string - If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
- IsNeedAuth bool - Whether to enable authentication. If you use a public service address, enabling authentication is recommended.
- Mechanism string - Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
- Password string - Kafka SASL user password for authentication.
- Protocol string - Secure transmission protocol. Options: plaintext, sasl_ssl, ssl, and sasl_plaintext.
- TimeSourceDefault int - Specifies the log time. Options: 0: Use the Kafka message timestamp. 1: Use the current system time.
- Topic string - Kafka topic name. Separate multiple Kafka topics with commas (,).
- Username string - Kafka SASL username for authentication.
- encode String - Data encoding format. Available options: UTF-8, GBK.
- group String - Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
- host String - The service addresses for different types of Kafka clusters vary. Message Queue Kafka Edition: use the access point of the Kafka instance (for more information, see Access Point). If the Kafka instance and the Log Service project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: use the IP address and port number, or the domain name and port number, of the Kafka broker; only public network access is supported. Separate multiple service addresses with commas (,).
- initialOffset Integer - Starting position for data import. Options: 0: Earliest, start importing from the first record in the specified Kafka topic. 1: Latest, start importing from the most recently generated record in the specified Kafka topic.
- instanceId String - If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
- isNeedAuth Boolean - Whether to enable authentication. If you use a public service address, enabling authentication is recommended.
- mechanism String - Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
- password String - Kafka SASL user password for authentication.
- protocol String - Secure transmission protocol. Options: plaintext, sasl_ssl, ssl, and sasl_plaintext.
- timeSourceDefault Integer - Specifies the log time. Options: 0: Use the Kafka message timestamp. 1: Use the current system time.
- topic String - Kafka topic name. Separate multiple Kafka topics with commas (,).
- username String - Kafka SASL username for authentication.
- encode string - Data encoding format. Available options: UTF-8, GBK.
- group string - Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
- host string - The service addresses for different types of Kafka clusters vary. Message Queue Kafka Edition: use the access point of the Kafka instance (for more information, see Access Point). If the Kafka instance and the Log Service project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: use the IP address and port number, or the domain name and port number, of the Kafka broker; only public network access is supported. Separate multiple service addresses with commas (,).
- initialOffset number - Starting position for data import. Options: 0: Earliest, start importing from the first record in the specified Kafka topic. 1: Latest, start importing from the most recently generated record in the specified Kafka topic.
- instanceId string - If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
- isNeedAuth boolean - Whether to enable authentication. If you use a public service address, enabling authentication is recommended.
- mechanism string - Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
- password string - Kafka SASL user password for authentication.
- protocol string - Secure transmission protocol. Options: plaintext, sasl_ssl, ssl, and sasl_plaintext.
- timeSourceDefault number - Specifies the log time. Options: 0: Use the Kafka message timestamp. 1: Use the current system time.
- topic string - Kafka topic name. Separate multiple Kafka topics with commas (,).
- username string - Kafka SASL username for authentication.
- encode str - Data encoding format. Available options: UTF-8, GBK.
- group str - Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
- host str - The service addresses for different types of Kafka clusters vary. Message Queue Kafka Edition: use the access point of the Kafka instance (for more information, see Access Point). If the Kafka instance and the Log Service project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: use the IP address and port number, or the domain name and port number, of the Kafka broker; only public network access is supported. Separate multiple service addresses with commas (,).
- initial_offset int - Starting position for data import. Options: 0: Earliest, start importing from the first record in the specified Kafka topic. 1: Latest, start importing from the most recently generated record in the specified Kafka topic.
- instance_id str - If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
- is_need_auth bool - Whether to enable authentication. If you use a public service address, enabling authentication is recommended.
- mechanism str - Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
- password str - Kafka SASL user password for authentication.
- protocol str - Secure transmission protocol. Options: plaintext, sasl_ssl, ssl, and sasl_plaintext.
- time_source_default int - Specifies the log time. Options: 0: Use the Kafka message timestamp. 1: Use the current system time.
- topic str - Kafka topic name. Separate multiple Kafka topics with commas (,).
- username str - Kafka SASL username for authentication.
- encode String - Data encoding format. Available options: UTF-8, GBK.
- group String - Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
- host String - The service addresses for different types of Kafka clusters vary. Message Queue Kafka Edition: use the access point of the Kafka instance (for more information, see Access Point). If the Kafka instance and the Log Service project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: use the IP address and port number, or the domain name and port number, of the Kafka broker; only public network access is supported. Separate multiple service addresses with commas (,).
- initialOffset Number - Starting position for data import. Options: 0: Earliest, start importing from the first record in the specified Kafka topic. 1: Latest, start importing from the most recently generated record in the specified Kafka topic.
- instanceId String - If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
- isNeedAuth Boolean - Whether to enable authentication. If you use a public service address, enabling authentication is recommended.
- mechanism String - Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
- password String - Kafka SASL user password for authentication.
- protocol String - Secure transmission protocol. Options: plaintext, sasl_ssl, ssl, and sasl_plaintext.
- timeSourceDefault Number - Specifies the log time. Options: 0: Use the Kafka message timestamp. 1: Use the current system time.
- topic String - Kafka topic name. Separate multiple Kafka topics with commas (,).
- username String - Kafka SASL username for authentication.
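Because host accepts multiple comma-separated address:port pairs, a small sketch for splitting and sanity-checking the value before submitting it (the helper `split_hosts` and the second broker address are assumptions for illustration):

```python
def split_hosts(host: str) -> list[str]:
    """Split the comma-separated host string into individual address:port pairs."""
    return [h.strip() for h in host.split(",") if h.strip()]

# The first address is taken from the example above; 10.0.0.12:9092 is hypothetical.
hosts = split_hosts(
    "kafka-cnngsl83xxxxx.kafka.cn-beijing.ivolces.com:9092, 10.0.0.12:9092"
)
# Every entry should carry an explicit port.
assert all(":" in h for h in hosts)
print(hosts)
```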
ImportTaskImportSourceInfoTosSourceInfo, ImportTaskImportSourceInfoTosSourceInfoArgs
- Bucket string - TOS bucket name.
- CompressType string - Compression mode for data in the TOS bucket. none: no compression. snappy: compress using Snappy. gzip: compress using gzip. lz4: compress using LZ4.
- Prefix string - Path of the file to be imported in the TOS bucket.
- Region string - Region where the TOS bucket is located. Cross-region data import is supported.
- Bucket string - TOS bucket name.
- CompressType string - Compression mode for data in the TOS bucket. none: no compression. snappy: compress using Snappy. gzip: compress using gzip. lz4: compress using LZ4.
- Prefix string - Path of the file to be imported in the TOS bucket.
- Region string - Region where the TOS bucket is located. Cross-region data import is supported.
- bucket String - TOS bucket name.
- compressType String - Compression mode for data in the TOS bucket. none: no compression. snappy: compress using Snappy. gzip: compress using gzip. lz4: compress using LZ4.
- prefix String - Path of the file to be imported in the TOS bucket.
- region String - Region where the TOS bucket is located. Cross-region data import is supported.
- bucket string - TOS bucket name.
- compressType string - Compression mode for data in the TOS bucket. none: no compression. snappy: compress using Snappy. gzip: compress using gzip. lz4: compress using LZ4.
- prefix string - Path of the file to be imported in the TOS bucket.
- region string - Region where the TOS bucket is located. Cross-region data import is supported.
- bucket str - TOS bucket name.
- compress_type str - Compression mode for data in the TOS bucket. none: no compression. snappy: compress using Snappy. gzip: compress using gzip. lz4: compress using LZ4.
- prefix str - Path of the file to be imported in the TOS bucket.
- region str - Region where the TOS bucket is located. Cross-region data import is supported.
- bucket String - TOS bucket name.
- compressType String - Compression mode for data in the TOS bucket. none: no compression. snappy: compress using Snappy. gzip: compress using gzip. lz4: compress using LZ4.
- prefix String - Path of the file to be imported in the TOS bucket.
- region String - Region where the TOS bucket is located. Cross-region data import is supported.
ImportTaskTargetInfo, ImportTaskTargetInfoArgs
- LogType string - Specify the log parsing type during import. delimiter_log: CSV type. multiline_log: multiline full text type. minimalist_log: single-line full text type. json_log: JSON type.
- Region string - Region.
- ExtractRule Volcengine.ImportTaskTargetInfoExtractRule - Log extraction rule.
- LogSample string - Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples so the regular expression can be verified against the first line of each log. Use real samples from the production environment.
- LogType string - Specify the log parsing type during import. delimiter_log: CSV type. multiline_log: multiline full text type. minimalist_log: single-line full text type. json_log: JSON type.
- Region string - Region.
- ExtractRule ImportTaskTargetInfoExtractRule - Log extraction rule.
- LogSample string - Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples so the regular expression can be verified against the first line of each log. Use real samples from the production environment.
- logType String - Specify the log parsing type during import. delimiter_log: CSV type. multiline_log: multiline full text type. minimalist_log: single-line full text type. json_log: JSON type.
- region String - Region.
- extractRule ImportTaskTargetInfoExtractRule - Log extraction rule.
- logSample String - Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples so the regular expression can be verified against the first line of each log. Use real samples from the production environment.
- logType string - Specify the log parsing type during import. delimiter_log: CSV type. multiline_log: multiline full text type. minimalist_log: single-line full text type. json_log: JSON type.
- region string - Region.
- extractRule ImportTaskTargetInfoExtractRule - Log extraction rule.
- logSample string - Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples so the regular expression can be verified against the first line of each log. Use real samples from the production environment.
- log_type str - Specify the log parsing type during import. delimiter_log: CSV type. multiline_log: multiline full text type. minimalist_log: single-line full text type. json_log: JSON type.
- region str - Region.
- extract_rule ImportTaskTargetInfoExtractRule - Log extraction rule.
- log_sample str - Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples so the regular expression can be verified against the first line of each log. Use real samples from the production environment.
- logType String - Specify the log parsing type during import. delimiter_log: CSV type. multiline_log: multiline full text type. minimalist_log: single-line full text type. json_log: JSON type.
- region String - Region.
- extractRule Property Map - Log extraction rule.
- logSample String - Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples so the regular expression can be verified against the first line of each log. Use real samples from the production environment.
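Because multiline_log targets require a non-empty log sample, that constraint can be checked before submitting the task. A hypothetical pre-flight validator (the function name and dict shape are illustrative, not SDK API):

```python
# Valid log_type values documented above.
VALID_LOG_TYPES = {"delimiter_log", "multiline_log", "minimalist_log", "json_log"}

def validate_target_info(target_info: dict) -> None:
    # Illustrative check mirroring the documented constraints; not an API call.
    log_type = target_info.get("log_type")
    if log_type not in VALID_LOG_TYPES:
        raise ValueError(f"unknown log_type: {log_type!r}")
    if log_type == "multiline_log" and not target_info.get("log_sample"):
        # Multiline parsing needs real samples so BeginRegex can be verified.
        raise ValueError("log_sample is required when log_type is multiline_log")

validate_target_info({"log_type": "json_log", "region": "cn-beijing"})  # passes
```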
ImportTaskTargetInfoExtractRule, ImportTaskTargetInfoExtractRuleArgs
- ExtractRule Volcengine.ImportTaskTargetInfoExtractRuleExtractRule - Basic content of the log extraction rules.
- SkipLineCount int - Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
- TimeExtractRegex string - Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time.
- TimeZone string - Time zone. Supports machine time zone (default) and custom time zone. Custom time zones support GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
- ExtractRule ImportTaskTargetInfoExtractRuleExtractRule - Basic content of the log extraction rules.
- SkipLineCount int - Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
- TimeExtractRegex string - Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time.
- TimeZone string - Time zone. Supports machine time zone (default) and custom time zone. Custom time zones support GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
- extractRule ImportTaskTargetInfoExtractRuleExtractRule - Basic content of the log extraction rules.
- skipLineCount Integer - Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
- timeExtractRegex String - Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time.
- timeZone String - Time zone. Supports machine time zone (default) and custom time zone. Custom time zones support GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
- extractRule ImportTaskTargetInfoExtractRuleExtractRule - Basic content of the log extraction rules.
- skipLineCount number - Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
- timeExtractRegex string - Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time.
- timeZone string - Time zone. Supports machine time zone (default) and custom time zone. Custom time zones support GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
- extract_rule ImportTaskTargetInfoExtractRuleExtractRule - Basic content of the log extraction rules.
- skip_line_count int - Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
- time_extract_regex str - Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time.
- time_zone str - Time zone. Supports machine time zone (default) and custom time zone. Custom time zones support GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
- extractRule Property Map - Basic content of the log extraction rules.
- skipLineCount Number - Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
- timeExtractRegex String - Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time.
- timeZone String - Time zone. Supports machine time zone (default) and custom time zone. Custom time zones support GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
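The time extraction regex pulls the timestamp substring out of the TimeKey field before it is parsed. Using the regex from the YAML example at the top of this page, a quick local check against an Apache-style field value (the sample value itself is illustrative):

```python
import re

# timeExtractRegex taken from the YAML example above.
time_extract_regex = r"[0-9]{0,2}\/[0-9a-zA-Z]+\/[0-9:,]+"
field_value = "[23/Apr/2026:10:15:30,123 +0800]"  # illustrative TimeKey value

# The first match is the substring handed to the time parser.
m = re.search(time_extract_regex, field_value)
extracted = m.group(0) if m else None
# extracted -> "23/Apr/2026:10:15:30,123"
```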
ImportTaskTargetInfoExtractRuleExtractRule, ImportTaskTargetInfoExtractRuleExtractRuleArgs
- BeginRegex string - Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample.
- Delimiter string - Delimiter. Only valid when LogType is delimiter_log.
- EnableNanosecond bool - Enable nanoseconds.
- FilterKeyRegexes List<Volcengine.ImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex>
- Keys List<string> - List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and field names cannot all be left blank.
- LogRegex string - Log regular expression.
- LogTemplate Volcengine.ImportTaskTargetInfoExtractRuleExtractRuleLogTemplate - Log template.
- Quote string - Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
- TimeFormat string - Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in both TimeKey and TimeFormat; they must be paired. For configuration details, see time format.
- TimeKey string - Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat; they must appear in pairs.
- TimeSample string - Time sample, used to verify whether the entered time parsing format is correct.
- UnMatchLogKey string - Key name under which logs that failed to parse are uploaded. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
- UnMatchUpLoadSwitch bool - Whether to upload logs that failed to parse. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired.
- BeginRegex string - Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample.
- Delimiter string - Delimiter. Only valid when LogType is delimiter_log.
- EnableNanosecond bool - Enable nanoseconds.
- FilterKeyRegexes []ImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex
- Keys []string - List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and field names cannot all be left blank.
- LogRegex string - Log regular expression.
- LogTemplate ImportTaskTargetInfoExtractRuleExtractRuleLogTemplate - Log template.
- Quote string - Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
- TimeFormat string - Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in both TimeKey and TimeFormat; they must be paired. For configuration details, see time format.
- TimeKey string - Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat; they must appear in pairs.
- TimeSample string - Time sample, used to verify whether the entered time parsing format is correct.
- UnMatchLogKey string - Key name under which logs that failed to parse are uploaded. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
- UnMatchUpLoadSwitch bool - Whether to upload logs that failed to parse. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired.
- beginRegex String - Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample.
- delimiter String - Delimiter. Only valid when LogType is delimiter_log.
- enableNanosecond Boolean - Enable nanoseconds.
- filterKeyRegexes List<ImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex>
- keys List<String> - List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and field names cannot all be left blank.
- logRegex String - Log regular expression.
- logTemplate ImportTaskTargetInfoExtractRuleExtractRuleLogTemplate - Log template.
- quote String - Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
- timeFormat String - Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in both TimeKey and TimeFormat; they must be paired. For configuration details, see time format.
- timeKey String - Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat; they must appear in pairs.
- timeSample String - Time sample, used to verify whether the entered time parsing format is correct.
- unMatchLogKey String - Key name under which logs that failed to parse are uploaded. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
- unMatchUpLoadSwitch Boolean - Whether to upload logs that failed to parse. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired.
- beginRegex string - Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample.
- delimiter string - Delimiter. Only valid when LogType is delimiter_log.
- enableNanosecond boolean - Enable nanoseconds.
- filterKeyRegexes ImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex[]
- keys string[] - List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and field names cannot all be left blank.
- logRegex string - Log regular expression.
- logTemplate ImportTaskTargetInfoExtractRuleExtractRuleLogTemplate - Log template.
- quote string - Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
- timeFormat string - Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in both TimeKey and TimeFormat; they must be paired. For configuration details, see time format.
- timeKey string - Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat; they must appear in pairs.
- timeSample string - Time sample, used to verify whether the entered time parsing format is correct.
- unMatchLogKey string - Key name under which logs that failed to parse are uploaded. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
- unMatchUpLoadSwitch boolean - Whether to upload logs that failed to parse. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired.
- begin_regex str - Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample.
- delimiter str - Delimiter. Only valid when LogType is delimiter_log.
- enable_nanosecond bool - Enable nanoseconds.
- filter_key_regexes Sequence[ImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex]
- keys Sequence[str] - List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and field names cannot all be left blank.
- log_regex str - Log regular expression.
- log_template ImportTaskTargetInfoExtractRuleExtractRuleLogTemplate - Log template.
- quote str - Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
- time_format str - Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in both TimeKey and TimeFormat; they must be paired. For configuration details, see time format.
- time_key str - Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat; they must appear in pairs.
- time_sample str - Time sample, used to verify whether the entered time parsing format is correct.
- un_match_log_key str - Key name under which logs that failed to parse are uploaded. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
- un_match_up_load_switch bool - Whether to upload logs that failed to parse. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired.
- beginRegex String - Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample.
- delimiter String - Delimiter. Only valid when LogType is delimiter_log.
- enableNanosecond Boolean - Enable nanoseconds.
- filterKeyRegexes List<Property Map>
- keys List<String> - List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and field names cannot all be left blank.
- logRegex String - Log regular expression.
- logTemplate Property Map - Log template.
- quote String - Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
- timeFormat String - Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in both TimeKey and TimeFormat; they must be paired. For configuration details, see time format.
- timeKey String - Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat; they must appear in pairs.
- timeSample String - Time sample, used to verify whether the entered time parsing format is correct.
- unMatchLogKey String - Key name under which logs that failed to parse are uploaded. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
- unMatchUpLoadSwitch Boolean - Whether to upload logs that failed to parse. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired.
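TimeSample exists precisely so the TimeFormat string can be verified. The same check can be done locally with Python's strptime, using the TimeFormat from the YAML example at the top of this page; the sample timestamp below is illustrative:

```python
from datetime import datetime

time_format = "%Y-%m-%d %H:%M:%S,%f"    # TimeFormat from the YAML example
time_sample = "2026-04-23 10:15:30,123"  # illustrative TimeSample value

# strptime raises ValueError when the sample does not match the format,
# which is the same pass/fail signal the TimeSample field provides.
parsed = datetime.strptime(time_sample, time_format)
```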
ImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex, ImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegexArgs
ImportTaskTargetInfoExtractRuleExtractRuleLogTemplate, ImportTaskTargetInfoExtractRuleExtractRuleLogTemplateArgs
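These two types carry the key/regex and format/type pairs shown in the YAML example (e.g. filtering on user_id with ^[0-9]+$). A purely illustrative sketch of how such key/regex filters behave (the helper is not SDK API):

```python
import re

# key/regex pair taken from the filterKeyRegex entry in the YAML example.
filters = [{"key": "user_id", "regex": r"^[0-9]+$"}]

def passes_filters(log: dict, rules=filters) -> bool:
    # Illustrative: keep a log entry only when every filtered key
    # matches its regular expression.
    return all(re.search(r["regex"], str(log.get(r["key"], ""))) for r in rules)

passes_filters({"user_id": "10086"})  # numeric id matches the filter
passes_filters({"user_id": "abc"})    # non-numeric id is filtered out
```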
ImportTaskTaskStatistics, ImportTaskTaskStatisticsArgs
- BytesTotal int - Total number of resource bytes enumerated.
- BytesTransferred int - Bytes transferred.
- Failed int - Number of resources that failed to import.
- NotExist int - Number of resources not found.
- Skipped int - Number of resources skipped during import.
- TaskStatus string - Status of the import task. Preparing: Preparing for import. Importing: Importing data. Success: Import completed successfully. Failed: Import failed. Stopped: Import paused.
- Total int - Total number of resources enumerated.
- Transferred int - Number of records transferred.
- BytesTotal int - Total number of resource bytes enumerated.
- BytesTransferred int - Bytes transferred.
- Failed int - Number of resources that failed to import.
- NotExist int - Number of resources not found.
- Skipped int - Number of resources skipped during import.
- TaskStatus string - Status of the import task. Preparing: Preparing for import. Importing: Importing data. Success: Import completed successfully. Failed: Import failed. Stopped: Import paused.
- Total int - Total number of resources enumerated.
- Transferred int - Number of records transferred.
- bytesTotal Integer - Total number of resource bytes enumerated.
- bytesTransferred Integer - Bytes transferred.
- failed Integer - Number of resources that failed to import.
- notExist Integer - Number of resources not found.
- skipped Integer - Number of resources skipped during import.
- taskStatus String - Status of the import task. Preparing: Preparing for import. Importing: Importing data. Success: Import completed successfully. Failed: Import failed. Stopped: Import paused.
- total Integer - Total number of resources enumerated.
- transferred Integer - Number of records transferred.
- bytesTotal number - Total number of resource bytes enumerated.
- bytesTransferred number - Bytes transferred.
- failed number - Number of resources that failed to import.
- notExist number - Number of resources not found.
- skipped number - Number of resources skipped during import.
- taskStatus string - Status of the import task. Preparing: Preparing for import. Importing: Importing data. Success: Import completed successfully. Failed: Import failed. Stopped: Import paused.
- total number - Total number of resources enumerated.
- transferred number - Number of records transferred.
- bytes_total int - Total number of resource bytes enumerated.
- bytes_transferred int - Bytes transferred.
- failed int - Number of resources that failed to import.
- not_exist int - Number of resources not found.
- skipped int - Number of resources skipped during import.
- task_status str - Status of the import task. Preparing: Preparing for import. Importing: Importing data. Success: Import completed successfully. Failed: Import failed. Stopped: Import paused.
- total int - Total number of resources enumerated.
- transferred int - Number of records transferred.
- bytesTotal Number - Total number of resource bytes enumerated.
- bytesTransferred Number - Bytes transferred.
- failed Number - Number of resources that failed to import.
- notExist Number - Number of resources not found.
- skipped Number - Number of resources skipped during import.
- taskStatus String - Status of the import task. Preparing: Preparing for import. Importing: Importing data. Success: Import completed successfully. Failed: Import failed. Stopped: Import paused.
- total Number - Total number of resources enumerated.
- transferred Number - Number of records transferred.
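The statistics fields above are plain counters, so task progress can be derived client-side. A hypothetical sketch (the stats dict mirrors the documented Python property names; it is not a real API response):

```python
def import_progress(stats: dict) -> float:
    # Fraction of enumerated bytes already transferred; 0.0 while nothing
    # has been enumerated yet. Field names follow the documented
    # Python-style properties (bytes_total, bytes_transferred).
    total = stats.get("bytes_total", 0)
    return stats.get("bytes_transferred", 0) / total if total else 0.0

stats = {"bytes_total": 2048, "bytes_transferred": 512,
         "failed": 0, "skipped": 1, "task_status": "Importing"}
import_progress(stats)  # -> 0.25
```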
Import
$ pulumi import volcenginecc:tls/importTask:ImportTask example "task_id"
To learn more about importing existing cloud resources, see Importing resources.
Package Details
- Repository
- volcenginecc volcengine/pulumi-volcenginecc
- License
- MPL-2.0
- Notes
- This Pulumi package is based on the volcenginecc Terraform Provider.
