Viewing docs for volcenginecc v0.0.32
published on Thursday, Apr 23, 2026 by Volcengine
Data Source schema for Volcengine::TLS::ImportTask
Using getImportTask
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
TypeScript:

function getImportTask(args: GetImportTaskArgs, opts?: InvokeOptions): Promise<GetImportTaskResult>
function getImportTaskOutput(args: GetImportTaskOutputArgs, opts?: InvokeOptions): Output<GetImportTaskResult>

Python:

def get_import_task(id: Optional[str] = None,
                    opts: Optional[InvokeOptions] = None) -> GetImportTaskResult
def get_import_task_output(id: Optional[pulumi.Input[str]] = None,
                           opts: Optional[InvokeOptions] = None) -> Output[GetImportTaskResult]

Go:

func LookupImportTask(ctx *Context, args *LookupImportTaskArgs, opts ...InvokeOption) (*LookupImportTaskResult, error)
func LookupImportTaskOutput(ctx *Context, args *LookupImportTaskOutputArgs, opts ...InvokeOption) LookupImportTaskResultOutput

> Note: This function is named LookupImportTask in the Go SDK.

C#:

public static class GetImportTask
{
    public static Task<GetImportTaskResult> InvokeAsync(GetImportTaskArgs args, InvokeOptions? opts = null)
    public static Output<GetImportTaskResult> Invoke(GetImportTaskInvokeArgs args, InvokeOptions? opts = null)
}

Java:

public static CompletableFuture<GetImportTaskResult> getImportTask(GetImportTaskArgs args, InvokeOptions options)
public static Output<GetImportTaskResult> getImportTask(GetImportTaskArgs args, InvokeOptions options)
YAML:

fn::invoke:
  function: volcenginecc:tls/getImportTask:getImportTask
  arguments:
    # arguments dictionary

The following arguments are supported:
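As a sketch of how the invoke form above is used in practice, a Pulumi YAML program can call the function in a variable and feed its result into an output. The task ID below is a hypothetical placeholder, not a real resource ID:

```yaml
variables:
  importTask:
    fn::invoke:
      function: volcenginecc:tls/getImportTask:getImportTask
      arguments:
        # hypothetical ID; replace with a real Volcengine::TLS::ImportTask ID
        id: "task-0123456789abcdef"
outputs:
  importStatus: ${importTask.status}
```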
- Id string
- Uniquely identifies the resource.
getImportTask Result
The following output properties are available:
- CreateTime string
- Creation time.
- Description string
- Task description.
- Id string
- Uniquely identifies the resource.
- ImportSourceInfo Volcengine.GetImportTaskImportSourceInfo
- Import data source information.
- ProjectId string
- Log project ID for storing data.
- ProjectName string
- Log project name.
- SourceType string
- Data source type. Options: tos, kafka.
- Status int
- Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting.
- TargetInfo Volcengine.GetImportTaskTargetInfo
- Output information for the data import task.
- TaskId string
- Import task ID.
- TaskName string
- Import task name.
- TaskStatistics Volcengine.GetImportTaskTaskStatistics
- Progress of the data import task.
- TopicId string
- Log topic ID used to store data.
- TopicName string
- Log topic name.
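The numeric Status property above encodes the task state. As a minimal Python sketch (the label strings are our own; only the code-to-meaning pairs come from the schema above), it can be decoded like this:

```python
# Map the documented numeric status codes of a TLS import task
# to human-readable labels.
IMPORT_TASK_STATUS = {
    0: "importing",
    1: "import completed",
    2: "import error",
    3: "stopping",
    4: "stopped",
    5: "restarting",
}

def describe_status(code: int) -> str:
    """Return a readable label for a getImportTask `status` value."""
    return IMPORT_TASK_STATUS.get(code, f"unknown status ({code})")
```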
Supporting Types
GetImportTaskImportSourceInfo
- KafkaSourceInfo Volcengine.GetImportTaskImportSourceInfoKafkaSourceInfo
- Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required.
- TosSourceInfo Volcengine.GetImportTaskImportSourceInfoTosSourceInfo
- TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
GetImportTaskImportSourceInfoKafkaSourceInfo
- Encode string
- Data encoding format. Available options: UTF-8, GBK.
- Group string
- Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
- Host string
- The service address depends on the type of Kafka cluster. Message Queue Kafka Edition: use the access point of the Kafka instance (see Access Point). If the Kafka instance and the Log Service project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: use the IP address and port, or the domain name and port, of the Kafka broker; only public network access is supported. Separate multiple service addresses with commas (,).
- InitialOffset int
- Starting position for data import. Options: 0: Earliest time, start importing from the first record in the specified Kafka Topic. 1: Latest time, start importing from the most recently generated record in the specified Kafka Topic.
- InstanceId string
- If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
- IsNeedAuth bool
- Whether to enable authentication. If you use a public service address, enabling authentication is recommended.
- Mechanism string
- Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
- Password string
- Kafka SASL user password for authentication.
- Protocol string
- Secure transmission protocol. Options include plaintext, saslssl, ssl, and saslplaintext.
- TimeSourceDefault int
- Specify the log time. Options: 0: Use the Kafka message timestamp. 1: Use the current system time.
- Topic string
- Kafka Topic name. Separate multiple Kafka Topics with commas (,).
- Username string
- Kafka SASL username for authentication.
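Since Host accepts multiple comma-separated broker addresses and Mechanism is limited to three documented values, a small validation sketch may help when building a KafkaSourceInfo block. The helper names here are hypothetical, not part of the provider:

```python
# Mechanisms documented for KafkaSourceInfo above.
VALID_MECHANISMS = {"PLAIN", "SCRAM-SHA-256", "SCRAM-SHA-512"}

def split_hosts(host: str) -> list[str]:
    """Split the comma-separated Host field into individual broker addresses."""
    return [h.strip() for h in host.split(",") if h.strip()]

def check_mechanism(mechanism: str) -> bool:
    """True if the SASL mechanism is one documented for KafkaSourceInfo."""
    return mechanism in VALID_MECHANISMS
```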
GetImportTaskImportSourceInfoTosSourceInfo
- Bucket string
- TOS bucket name.
- CompressType string
- Compression mode for data in the TOS bucket. none: No compression. snappy: Compress using snappy. gzip: Compress using gzip. lz4: Compress using lz4.
- Prefix string
- Path of the file to be imported in the TOS bucket.
- Region string
- Region where the TOS bucket is located. Cross-region data import is supported.
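To illustrate the CompressType values above, here is a minimal Python sketch that picks a decompressor for a downloaded TOS object. It is an assumption-laden example: only none and gzip are handled with the standard library, since snappy and lz4 would need third-party packages:

```python
import gzip

def decompress_tos_object(data: bytes, compress_type: str) -> bytes:
    """Decompress a TOS object according to its CompressType.

    `none` and `gzip` are handled here; `snappy` and `lz4` would require
    the third-party python-snappy / lz4 packages, so we raise instead.
    """
    if compress_type == "none":
        return data
    if compress_type == "gzip":
        return gzip.decompress(data)
    raise NotImplementedError(f"install a codec for {compress_type!r}")
```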
GetImportTaskTargetInfo
- ExtractRule Volcengine.GetImportTaskTargetInfoExtractRule
- Log extraction rule.
- LogSample string
- Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples to ensure the regular expression matches the first line of each log. Use real samples from the production environment.
- LogType string
- Log parsing type used during import. delimiter_log: CSV type. multiline_log: multiline full text type. minimalist_log: single line full text type. json_log: JSON type.
- Region string
- Region.
GetImportTaskTargetInfoExtractRule
- ExtractRule Volcengine.GetImportTaskTargetInfoExtractRuleExtractRule
- Basic content of log extraction rules.
- SkipLineCount int
- Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
- TimeExtractRegex string
- Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time.
- TimeZone string
- Time zone. Supports machine time zone (default) and custom time zone. Custom time zone supports GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
GetImportTaskTargetInfoExtractRuleExtractRule
- BeginRegex string
- Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample.
- Delimiter string
- Delimiter. Only valid when LogType is delimiter_log.
- EnableNanosecond bool
- Enable nanoseconds.
- FilterKeyRegexes List<Volcengine.GetImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex>
- Filter key regular expressions.
- Keys List<string>
- List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and all field names cannot be left blank.
- LogRegex string
- Log regular expression.
- LogTemplate Volcengine.GetImportTaskTargetInfoExtractRuleExtractRuleLogTemplate
- Log template.
- Quote string
- Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
- TimeFormat string
- Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in both TimeKey and TimeFormat; they must be paired. For configuration details, see time format.
- TimeKey string
- Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat; they must appear in pairs.
- TimeSample string
- Time sample. Used to verify whether the entered time parsing format is correct.
- UnMatchLogKey string
- When uploading logs that failed to parse, specify the key name for the failed logs. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
- UnMatchUpLoadSwitch bool
- Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: upload logs that failed to parse. false: do not upload them.
- BeginRegex string - Regular expression identifying the first line of each log; the matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample.
- Delimiter string - Delimiter. Only valid when LogType is delimiter_log.
- EnableNanosecond bool - Enable nanoseconds.
- FilterKeyRegexes []GetImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex - Filter key regular expressions.
- Keys []string - List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and no field name may be left blank.
- LogRegex string - Log regular expression.
- LogTemplate GetImportTaskTargetInfoExtractRuleExtractRuleLogTemplate - Log template.
- Quote string - Quotation mark. Content enclosed by the quotation mark is not separated and is parsed as a single complete field. Only valid when LogType is delimiter_log.
- TimeFormat string - Parsing format for the time field. If you use a specific time field in the log as the log timestamp, you must fill in both TimeKey and TimeFormat; they must be paired. For configuration details, see time format.
- TimeKey string - Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat; they must appear in pairs.
- TimeSample string - Time sample. Used to verify that the entered time parsing format is correct.
- UnMatchLogKey string - Key name under which logs that failed to parse are uploaded. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
- UnMatchUpLoadSwitch bool - Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: upload logs that failed to parse; false: do not upload them.
- beginRegex String - Regular expression identifying the first line of each log; the matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample.
- delimiter String - Delimiter. Only valid when LogType is delimiter_log.
- enableNanosecond Boolean - Enable nanoseconds.
- filterKeyRegexes List<GetImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex> - Filter key regular expressions.
- keys List<String> - List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and no field name may be left blank.
- logRegex String - Log regular expression.
- logTemplate GetImportTaskTargetInfoExtractRuleExtractRuleLogTemplate - Log template.
- quote String - Quotation mark. Content enclosed by the quotation mark is not separated and is parsed as a single complete field. Only valid when LogType is delimiter_log.
- timeFormat String - Parsing format for the time field. If you use a specific time field in the log as the log timestamp, you must fill in both TimeKey and TimeFormat; they must be paired. For configuration details, see time format.
- timeKey String - Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat; they must appear in pairs.
- timeSample String - Time sample. Used to verify that the entered time parsing format is correct.
- unMatchLogKey String - Key name under which logs that failed to parse are uploaded. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
- unMatchUpLoadSwitch Boolean - Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: upload logs that failed to parse; false: do not upload them.
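For multiline logs, BeginRegex is what decides where one entry ends and the next begins: a line matching the expression starts a new log, and non-matching lines are appended to the entry in progress. The sketch below illustrates that documented behavior; the function name `split_multiline` is hypothetical and this is not the collector's actual implementation.

```python
import re

def split_multiline(lines, begin_regex):
    """Group raw lines into log entries: a line matching begin_regex
    starts a new entry; any other line continues the current entry."""
    pattern = re.compile(begin_regex)
    entries = []
    for line in lines:
        if pattern.match(line) or not entries:
            entries.append(line)
        else:
            entries[-1] += "\n" + line
    return entries

# A stack trace whose entries each begin with a timestamp.
raw = [
    "2024-01-01 12:00:00 ERROR something failed",
    "  at com.example.Main.run(Main.java:10)",
    "2024-01-01 12:00:01 INFO recovered",
]
print(split_multiline(raw, r"\d{4}-\d{2}-\d{2} "))
```

The indented stack-trace line does not match the timestamp pattern, so it is folded into the preceding entry, yielding two log entries rather than three.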
- beginRegex string - Regular expression identifying the first line of each log; the matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample.
- delimiter string - Delimiter. Only valid when LogType is delimiter_log.
- enableNanosecond boolean - Enable nanoseconds.
- filterKeyRegexes GetImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex[] - Filter key regular expressions.
- keys string[] - List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and no field name may be left blank.
- logRegex string - Log regular expression.
- logTemplate GetImportTaskTargetInfoExtractRuleExtractRuleLogTemplate - Log template.
- quote string - Quotation mark. Content enclosed by the quotation mark is not separated and is parsed as a single complete field. Only valid when LogType is delimiter_log.
- timeFormat string - Parsing format for the time field. If you use a specific time field in the log as the log timestamp, you must fill in both TimeKey and TimeFormat; they must be paired. For configuration details, see time format.
- timeKey string - Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat; they must appear in pairs.
- timeSample string - Time sample. Used to verify that the entered time parsing format is correct.
- unMatchLogKey string - Key name under which logs that failed to parse are uploaded. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
- unMatchUpLoadSwitch boolean - Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: upload logs that failed to parse; false: do not upload them.
- begin_regex str - Regular expression identifying the first line of each log; the matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample.
- delimiter str - Delimiter. Only valid when LogType is delimiter_log.
- enable_nanosecond bool - Enable nanoseconds.
- filter_key_regexes Sequence[GetImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex] - Filter key regular expressions.
- keys Sequence[str] - List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and no field name may be left blank.
- log_regex str - Log regular expression.
- log_template GetImportTaskTargetInfoExtractRuleExtractRuleLogTemplate - Log template.
- quote str - Quotation mark. Content enclosed by the quotation mark is not separated and is parsed as a single complete field. Only valid when LogType is delimiter_log.
- time_format str - Parsing format for the time field. If you use a specific time field in the log as the log timestamp, you must fill in both TimeKey and TimeFormat; they must be paired. For configuration details, see time format.
- time_key str - Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat; they must appear in pairs.
- time_sample str - Time sample. Used to verify that the entered time parsing format is correct.
- un_match_log_key str - Key name under which logs that failed to parse are uploaded. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
- un_match_up_load_switch bool - Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: upload logs that failed to parse; false: do not upload them.
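TimeSample exists so the service can confirm that TimeFormat actually parses a representative timestamp before the task runs. A minimal sketch of that check, assuming a strptime-style format string (the function name `check_time_sample` is hypothetical; consult the time format documentation for the exact tokens TLS accepts):

```python
from datetime import datetime

def check_time_sample(time_sample, time_format):
    """Return True if the TimeSample value parses under TimeFormat,
    mirroring the sample-verification step described for TimeSample."""
    try:
        datetime.strptime(time_sample, time_format)
        return True
    except ValueError:
        return False

print(check_time_sample("2024-01-01 12:00:00", "%Y-%m-%d %H:%M:%S"))  # True
print(check_time_sample("2024-13-01", "%Y-%m-%d"))                    # False (month 13)
```

A failing check here is exactly the situation UnMatchUpLoadSwitch/UnMatchLogKey exist for at ingest time: logs whose time field cannot be parsed.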
- beginRegex String - Regular expression identifying the first line of each log; the matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample.
- delimiter String - Delimiter. Only valid when LogType is delimiter_log.
- enableNanosecond Boolean - Enable nanoseconds.
- filterKeyRegexes List<Property Map> - Filter key regular expressions.
- keys List<String> - List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and no field name may be left blank.
- logRegex String - Log regular expression.
- logTemplate Property Map - Log template.
- quote String - Quotation mark. Content enclosed by the quotation mark is not separated and is parsed as a single complete field. Only valid when LogType is delimiter_log.
- timeFormat String - Parsing format for the time field. If you use a specific time field in the log as the log timestamp, you must fill in both TimeKey and TimeFormat; they must be paired. For configuration details, see time format.
- timeKey String - Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat; they must appear in pairs.
- timeSample String - Time sample. Used to verify that the entered time parsing format is correct.
- unMatchLogKey String - Key name under which logs that failed to parse are uploaded. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
- unMatchUpLoadSwitch Boolean - Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: upload logs that failed to parse; false: do not upload them.
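For delimiter_log, three fields work together: Delimiter splits the line, Quote keeps enclosed content intact as one field, and Keys names the resulting columns in order. The sketch below illustrates those documented semantics using Python's csv module (the function name `parse_delimiter_log` is hypothetical, not part of any SDK):

```python
import csv
import io

def parse_delimiter_log(line, keys, delimiter, quote):
    """Split one delimiter_log line into fields, keeping content inside
    the Quote character intact, and map the fields onto Keys in order."""
    reader = csv.reader(io.StringIO(line), delimiter=delimiter, quotechar=quote)
    fields = next(reader)
    return dict(zip(keys, fields))

# The quoted middle field contains the delimiter but stays whole.
line = '2024-01-01|"api|gateway"|200'
print(parse_delimiter_log(line, ["time", "service", "status"], "|", '"'))
# {'time': '2024-01-01', 'service': 'api|gateway', 'status': '200'}
```

Note how `api|gateway` survives as a single field: this is exactly the behavior the Quote description promises, and why Quote only makes sense when LogType is delimiter_log.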
GetImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex
GetImportTaskTargetInfoExtractRuleExtractRuleLogTemplate
GetImportTaskTaskStatistics
- BytesTotal int - Total bytes of resources enumerated.
- BytesTransferred int - Bytes transferred.
- Failed int - Number of resources that failed to import.
- NotExist int - Number of resources not found.
- Skipped int - Number of resources skipped during import.
- TaskStatus string - Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
- Total int - Total number of resources enumerated.
- Transferred int - Number of records transferred.
- BytesTotal int - Total bytes of resources enumerated.
- BytesTransferred int - Bytes transferred.
- Failed int - Number of resources that failed to import.
- NotExist int - Number of resources not found.
- Skipped int - Number of resources skipped during import.
- TaskStatus string - Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
- Total int - Total number of resources enumerated.
- Transferred int - Number of records transferred.
- bytesTotal Integer - Total bytes of resources enumerated.
- bytesTransferred Integer - Bytes transferred.
- failed Integer - Number of resources that failed to import.
- notExist Integer - Number of resources not found.
- skipped Integer - Number of resources skipped during import.
- taskStatus String - Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
- total Integer - Total number of resources enumerated.
- transferred Integer - Number of records transferred.
- bytesTotal number - Total bytes of resources enumerated.
- bytesTransferred number - Bytes transferred.
- failed number - Number of resources that failed to import.
- notExist number - Number of resources not found.
- skipped number - Number of resources skipped during import.
- taskStatus string - Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
- total number - Total number of resources enumerated.
- transferred number - Number of records transferred.
- bytes_total int - Total bytes of resources enumerated.
- bytes_transferred int - Bytes transferred.
- failed int - Number of resources that failed to import.
- not_exist int - Number of resources not found.
- skipped int - Number of resources skipped during import.
- task_status str - Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
- total int - Total number of resources enumerated.
- transferred int - Number of records transferred.
- bytesTotal Number - Total bytes of resources enumerated.
- bytesTransferred Number - Bytes transferred.
- failed Number - Number of resources that failed to import.
- notExist Number - Number of resources not found.
- skipped Number - Number of resources skipped during import.
- taskStatus String - Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
- total Number - Total number of resources enumerated.
- transferred Number - Number of records transferred.
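The TaskStatistics counters can be combined into a rough completion ratio while TaskStatus is still Importing: transferred, failed, skipped, and not-found records have all reached a terminal state, so their sum over Total approximates progress. This is one reasonable reading of the counters, not an official formula, and the function name `import_progress` is hypothetical:

```python
def import_progress(stats):
    """Derive a completion ratio from TaskStatistics counters: records
    in a terminal state (transferred, failed, skipped, not found)
    divided by the total number of resources enumerated."""
    done = (stats["transferred"] + stats["failed"]
            + stats["skipped"] + stats["not_exist"])
    return done / stats["total"] if stats["total"] else 0.0

stats = {"total": 200, "transferred": 150, "failed": 10,
         "skipped": 5, "not_exist": 5}
print(import_progress(stats))  # 0.85
```

For byte-level progress, the same ratio can be taken over BytesTransferred and BytesTotal instead.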
Package Details
- Repository
- volcenginecc volcengine/pulumi-volcenginecc
- License
- MPL-2.0
- Notes
- This Pulumi package is based on the volcenginecc Terraform Provider.