Viewing docs for volcenginecc v0.0.32
published on Thursday, Apr 23, 2026 by Volcengine

    Data Source schema for Volcengine::TLS::ImportTask

    Using getImportTask

    Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.

    function getImportTask(args: GetImportTaskArgs, opts?: InvokeOptions): Promise<GetImportTaskResult>
    function getImportTaskOutput(args: GetImportTaskOutputArgs, opts?: InvokeOptions): Output<GetImportTaskResult>
    def get_import_task(id: Optional[str] = None,
                        opts: Optional[InvokeOptions] = None) -> GetImportTaskResult
    def get_import_task_output(id: Optional[pulumi.Input[str]] = None,
                               opts: Optional[InvokeOptions] = None) -> Output[GetImportTaskResult]
    func LookupImportTask(ctx *Context, args *LookupImportTaskArgs, opts ...InvokeOption) (*LookupImportTaskResult, error)
    func LookupImportTaskOutput(ctx *Context, args *LookupImportTaskOutputArgs, opts ...InvokeOption) LookupImportTaskResultOutput

    > Note: This function is named LookupImportTask in the Go SDK.

    public static class GetImportTask 
    {
        public static Task<GetImportTaskResult> InvokeAsync(GetImportTaskArgs args, InvokeOptions? opts = null)
        public static Output<GetImportTaskResult> Invoke(GetImportTaskInvokeArgs args, InvokeOptions? opts = null)
    }
    public static CompletableFuture<GetImportTaskResult> getImportTask(GetImportTaskArgs args, InvokeOptions options)
    public static Output<GetImportTaskResult> getImportTask(GetImportTaskArgs args, InvokeOptions options)
    
    fn::invoke:
      function: volcenginecc:tls/getImportTask:getImportTask
      arguments:
        # arguments dictionary
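    For example, a filled-in YAML invocation might look like this (the task ID below is a hypothetical placeholder):

```yaml
variables:
  importTask:
    fn::invoke:
      function: volcenginecc:tls/getImportTask:getImportTask
      arguments:
        id: example-import-task-id   # placeholder; use a real import task ID
```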

    The following arguments are supported:

    Id string
    Uniquely identifies the resource.
    Id string
    Uniquely identifies the resource.
    id String
    Uniquely identifies the resource.
    id string
    Uniquely identifies the resource.
    id str
    Uniquely identifies the resource.
    id String
    Uniquely identifies the resource.

    getImportTask Result

    The following output properties are available:

    CreateTime string
    Creation time.
    Description string
    Task description.
    Id string
    Uniquely identifies the resource.
    ImportSourceInfo Volcengine.GetImportTaskImportSourceInfo
    Import data source information
    ProjectId string
    Log project ID for storing data.
    ProjectName string
    Log project name.
    SourceType string
    Data source type. Options: tos, kafka.
    Status int
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    TargetInfo Volcengine.GetImportTaskTargetInfo
    Output information for the data import task.
    TaskId string
    Import task ID.
    TaskName string
    Import task name
    TaskStatistics Volcengine.GetImportTaskTaskStatistics
    Progress of the data import task.
    TopicId string
    Log topic ID used to store data
    TopicName string
    Log topic name.
    CreateTime string
    Creation time.
    Description string
    Task description.
    Id string
    Uniquely identifies the resource.
    ImportSourceInfo GetImportTaskImportSourceInfo
    Import data source information
    ProjectId string
    Log project ID for storing data.
    ProjectName string
    Log project name.
    SourceType string
    Data source type. Options: tos, kafka.
    Status int
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    TargetInfo GetImportTaskTargetInfo
    Output information for the data import task.
    TaskId string
    Import task ID.
    TaskName string
    Import task name
    TaskStatistics GetImportTaskTaskStatistics
    Progress of the data import task.
    TopicId string
    Log topic ID used to store data
    TopicName string
    Log topic name.
    createTime String
    Creation time.
    description String
    Task description.
    id String
    Uniquely identifies the resource.
    importSourceInfo GetImportTaskImportSourceInfo
    Import data source information
    projectId String
    Log project ID for storing data.
    projectName String
    Log project name.
    sourceType String
    Data source type. Options: tos, kafka.
    status Integer
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    targetInfo GetImportTaskTargetInfo
    Output information for the data import task.
    taskId String
    Import task ID.
    taskName String
    Import task name
    taskStatistics GetImportTaskTaskStatistics
    Progress of the data import task.
    topicId String
    Log topic ID used to store data
    topicName String
    Log topic name.
    createTime string
    Creation time.
    description string
    Task description.
    id string
    Uniquely identifies the resource.
    importSourceInfo GetImportTaskImportSourceInfo
    Import data source information
    projectId string
    Log project ID for storing data.
    projectName string
    Log project name.
    sourceType string
    Data source type. Options: tos, kafka.
    status number
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    targetInfo GetImportTaskTargetInfo
    Output information for the data import task.
    taskId string
    Import task ID.
    taskName string
    Import task name
    taskStatistics GetImportTaskTaskStatistics
    Progress of the data import task.
    topicId string
    Log topic ID used to store data
    topicName string
    Log topic name.
    create_time str
    Creation time.
    description str
    Task description.
    id str
    Uniquely identifies the resource.
    import_source_info GetImportTaskImportSourceInfo
    Import data source information
    project_id str
    Log project ID for storing data.
    project_name str
    Log project name.
    source_type str
    Data source type. Options: tos, kafka.
    status int
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    target_info GetImportTaskTargetInfo
    Output information for the data import task.
    task_id str
    Import task ID.
    task_name str
    Import task name
    task_statistics GetImportTaskTaskStatistics
    Progress of the data import task.
    topic_id str
    Log topic ID used to store data
    topic_name str
    Log topic name.
    createTime String
    Creation time.
    description String
    Task description.
    id String
    Uniquely identifies the resource.
    importSourceInfo Property Map
    Import data source information
    projectId String
    Log project ID for storing data.
    projectName String
    Log project name.
    sourceType String
    Data source type. Options: tos, kafka.
    status Number
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    targetInfo Property Map
    Output information for the data import task.
    taskId String
    Import task ID.
    taskName String
    Import task name
    taskStatistics Property Map
    Progress of the data import task.
    topicId String
    Log topic ID used to store data
    topicName String
    Log topic name.
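    The numeric status codes above can be mapped to readable labels in consuming code. A minimal sketch (the map and function names are illustrative, not part of the SDK):

```typescript
// Map the documented Status codes to human-readable labels.
// The labels mirror the schema text above.
const IMPORT_TASK_STATUS: Record<number, string> = {
  0: "Importing",
  1: "Import completed",
  2: "Import error",
  3: "Stopping",
  4: "Stopped",
  5: "Restarting",
};

function describeStatus(status: number): string {
  return IMPORT_TASK_STATUS[status] ?? `Unknown status (${status})`;
}

console.log(describeStatus(1)); // "Import completed"
```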

    Supporting Types

    GetImportTaskImportSourceInfo

    KafkaSourceInfo Volcengine.GetImportTaskImportSourceInfoKafkaSourceInfo
    Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required
    TosSourceInfo Volcengine.GetImportTaskImportSourceInfoTosSourceInfo
    TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
    KafkaSourceInfo GetImportTaskImportSourceInfoKafkaSourceInfo
    Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required
    TosSourceInfo GetImportTaskImportSourceInfoTosSourceInfo
    TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
    kafkaSourceInfo GetImportTaskImportSourceInfoKafkaSourceInfo
    Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required
    tosSourceInfo GetImportTaskImportSourceInfoTosSourceInfo
    TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
    kafkaSourceInfo GetImportTaskImportSourceInfoKafkaSourceInfo
    Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required
    tosSourceInfo GetImportTaskImportSourceInfoTosSourceInfo
    TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
    kafka_source_info GetImportTaskImportSourceInfoKafkaSourceInfo
    Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required
    tos_source_info GetImportTaskImportSourceInfoTosSourceInfo
    TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
    kafkaSourceInfo Property Map
    Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required
    tosSourceInfo Property Map
    TOS data source information. When sourceType is tos, the TosSourceInfo field is required.

    GetImportTaskImportSourceInfoKafkaSourceInfo

    Encode string
    Data encoding format. Available options: UTF-8, GBK.
    Group string
    Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
    Host string
    The service addresses for different types of Kafka clusters vary. Details are as follows: Message Queue Kafka Edition: Use the access point of the Kafka instance. For more information, see Access Point. If the Kafka instance and the Log Service Project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: Use the IP address and port number or the domain name and port number of the Kafka Broker. Only public network access is supported. Separate multiple service addresses with a comma (,).
    InitialOffset int
    Starting position for data import. Options: 0: Earliest time, start importing from the first record in the specified Kafka Topic. 1: Latest time, start importing from the most recently generated record in the specified Kafka Topic.
    InstanceId string
    If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
    IsNeedAuth bool
    Whether to enable authentication. If you use a public service address, it is recommended to enable authentication.
    Mechanism string
    Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
    Password string
    Kafka SASL user password for authentication.
    Protocol string
    Secure transmission protocol. Options include plaintext, sasl_ssl, ssl, and sasl_plaintext.
    TimeSourceDefault int
    Specify log time. Options: 0: Use Kafka message timestamp. 1: Use current system time.
    Topic string
    Kafka Topic name. Separate multiple Kafka Topics with commas (,).
    Username string
    Kafka SASL username for authentication.
    Encode string
    Data encoding format. Available options: UTF-8, GBK.
    Group string
    Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
    Host string
    The service addresses for different types of Kafka clusters vary. Details are as follows: Message Queue Kafka Edition: Use the access point of the Kafka instance. For more information, see Access Point. If the Kafka instance and the Log Service Project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: Use the IP address and port number or the domain name and port number of the Kafka Broker. Only public network access is supported. Separate multiple service addresses with a comma (,).
    InitialOffset int
    Starting position for data import. Options: 0: Earliest time, start importing from the first record in the specified Kafka Topic. 1: Latest time, start importing from the most recently generated record in the specified Kafka Topic.
    InstanceId string
    If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
    IsNeedAuth bool
    Whether to enable authentication. If you use a public service address, it is recommended to enable authentication.
    Mechanism string
    Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
    Password string
    Kafka SASL user password for authentication.
    Protocol string
    Secure transmission protocol. Options include plaintext, sasl_ssl, ssl, and sasl_plaintext.
    TimeSourceDefault int
    Specify log time. Options: 0: Use Kafka message timestamp. 1: Use current system time.
    Topic string
    Kafka Topic name. Separate multiple Kafka Topics with commas (,).
    Username string
    Kafka SASL username for authentication.
    encode String
    Data encoding format. Available options: UTF-8, GBK.
    group String
    Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
    host String
    The service addresses for different types of Kafka clusters vary. Details are as follows: Message Queue Kafka Edition: Use the access point of the Kafka instance. For more information, see Access Point. If the Kafka instance and the Log Service Project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: Use the IP address and port number or the domain name and port number of the Kafka Broker. Only public network access is supported. Separate multiple service addresses with a comma (,).
    initialOffset Integer
    Starting position for data import. Options: 0: Earliest time, start importing from the first record in the specified Kafka Topic. 1: Latest time, start importing from the most recently generated record in the specified Kafka Topic.
    instanceId String
    If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
    isNeedAuth Boolean
    Whether to enable authentication. If you use a public service address, it is recommended to enable authentication.
    mechanism String
    Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
    password String
    Kafka SASL user password for authentication.
    protocol String
    Secure transmission protocol. Options include plaintext, sasl_ssl, ssl, and sasl_plaintext.
    timeSourceDefault Integer
    Specify log time. Options: 0: Use Kafka message timestamp. 1: Use current system time.
    topic String
    Kafka Topic name. Separate multiple Kafka Topics with commas (,).
    username String
    Kafka SASL username for authentication.
    encode string
    Data encoding format. Available options: UTF-8, GBK.
    group string
    Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
    host string
    The service addresses for different types of Kafka clusters vary. Details are as follows: Message Queue Kafka Edition: Use the access point of the Kafka instance. For more information, see Access Point. If the Kafka instance and the Log Service Project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: Use the IP address and port number or the domain name and port number of the Kafka Broker. Only public network access is supported. Separate multiple service addresses with a comma (,).
    initialOffset number
    Starting position for data import. Options: 0: Earliest time, start importing from the first record in the specified Kafka Topic. 1: Latest time, start importing from the most recently generated record in the specified Kafka Topic.
    instanceId string
    If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
    isNeedAuth boolean
    Whether to enable authentication. If you use a public service address, it is recommended to enable authentication.
    mechanism string
    Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
    password string
    Kafka SASL user password for authentication.
    protocol string
    Secure transmission protocol. Options include plaintext, sasl_ssl, ssl, and sasl_plaintext.
    timeSourceDefault number
    Specify log time. Options: 0: Use Kafka message timestamp. 1: Use current system time.
    topic string
    Kafka Topic name. Separate multiple Kafka Topics with commas (,).
    username string
    Kafka SASL username for authentication.
    encode str
    Data encoding format. Available options: UTF-8, GBK.
    group str
    Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
    host str
    The service addresses for different types of Kafka clusters vary. Details are as follows: Message Queue Kafka Edition: Use the access point of the Kafka instance. For more information, see Access Point. If the Kafka instance and the Log Service Project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: Use the IP address and port number or the domain name and port number of the Kafka Broker. Only public network access is supported. Separate multiple service addresses with a comma (,).
    initial_offset int
    Starting position for data import. Options: 0: Earliest time, start importing from the first record in the specified Kafka Topic. 1: Latest time, start importing from the most recently generated record in the specified Kafka Topic.
    instance_id str
    If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
    is_need_auth bool
    Whether to enable authentication. If you use a public service address, it is recommended to enable authentication.
    mechanism str
    Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
    password str
    Kafka SASL user password for authentication.
    protocol str
    Secure transmission protocol. Options include plaintext, sasl_ssl, ssl, and sasl_plaintext.
    time_source_default int
    Specify log time. Options: 0: Use Kafka message timestamp. 1: Use current system time.
    topic str
    Kafka Topic name. Separate multiple Kafka Topics with commas (,).
    username str
    Kafka SASL username for authentication.
    encode String
    Data encoding format. Available options: UTF-8, GBK.
    group String
    Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
    host String
    The service addresses for different types of Kafka clusters vary. Details are as follows: Message Queue Kafka Edition: Use the access point of the Kafka instance. For more information, see Access Point. If the Kafka instance and the Log Service Project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: Use the IP address and port number or the domain name and port number of the Kafka Broker. Only public network access is supported. Separate multiple service addresses with a comma (,).
    initialOffset Number
    Starting position for data import. Options: 0: Earliest time, start importing from the first record in the specified Kafka Topic. 1: Latest time, start importing from the most recently generated record in the specified Kafka Topic.
    instanceId String
    If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
    isNeedAuth Boolean
    Whether to enable authentication. If you use a public service address, it is recommended to enable authentication.
    mechanism String
    Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
    password String
    Kafka SASL user password for authentication.
    protocol String
    Secure transmission protocol. Options include plaintext, sasl_ssl, ssl, and sasl_plaintext.
    timeSourceDefault Number
    Specify log time. Options: 0: Use Kafka message timestamp. 1: Use current system time.
    topic String
    Kafka Topic name. Separate multiple Kafka Topics with commas (,).
    username String
    Kafka SASL username for authentication.
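    Since the host field packs multiple broker addresses into one comma-separated string, a consumer will typically split it back apart. A small illustrative helper (the function name is my own):

```typescript
// Split a Kafka `host` value into individual broker addresses.
// The schema says multiple service addresses are comma-separated;
// this helper trims whitespace and drops empty entries.
function splitKafkaHosts(host: string): string[] {
  return host
    .split(",")
    .map((h) => h.trim())
    .filter((h) => h.length > 0);
}

console.log(splitKafkaHosts("broker-1.example.com:9092, broker-2.example.com:9092"));
// ["broker-1.example.com:9092", "broker-2.example.com:9092"]
```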

    GetImportTaskImportSourceInfoTosSourceInfo

    Bucket string
    TOS bucket name
    CompressType string
    Compression mode for data in the TOS bucket. none: No compression. snappy: Compress using snappy. gzip: Compress using gzip. lz4: Compress using lz4.
    Prefix string
    Path of the file to be imported in the TOS bucket.
    Region string
    Region where the TOS bucket is located. Cross-region data import is supported
    Bucket string
    TOS bucket name
    CompressType string
    Compression mode for data in the TOS bucket. none: No compression. snappy: Compress using snappy. gzip: Compress using gzip. lz4: Compress using lz4.
    Prefix string
    Path of the file to be imported in the TOS bucket.
    Region string
    Region where the TOS bucket is located. Cross-region data import is supported
    bucket String
    TOS bucket name
    compressType String
    Compression mode for data in the TOS bucket. none: No compression. snappy: Compress using snappy. gzip: Compress using gzip. lz4: Compress using lz4.
    prefix String
    Path of the file to be imported in the TOS bucket.
    region String
    Region where the TOS bucket is located. Cross-region data import is supported
    bucket string
    TOS bucket name
    compressType string
    Compression mode for data in the TOS bucket. none: No compression. snappy: Compress using snappy. gzip: Compress using gzip. lz4: Compress using lz4.
    prefix string
    Path of the file to be imported in the TOS bucket.
    region string
    Region where the TOS bucket is located. Cross-region data import is supported
    bucket str
    TOS bucket name
    compress_type str
    Compression mode for data in the TOS bucket. none: No compression. snappy: Compress using snappy. gzip: Compress using gzip. lz4: Compress using lz4.
    prefix str
    Path of the file to be imported in the TOS bucket.
    region str
    Region where the TOS bucket is located. Cross-region data import is supported
    bucket String
    TOS bucket name
    compressType String
    Compression mode for data in the TOS bucket. none: No compression. snappy: Compress using snappy. gzip: Compress using gzip. lz4: Compress using lz4.
    prefix String
    Path of the file to be imported in the TOS bucket.
    region String
    Region where the TOS bucket is located. Cross-region data import is supported
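    The prefix field selects which objects in the bucket are imported. Assuming the usual object-storage semantics (a key matches when it starts with the prefix), the matching can be sketched as:

```typescript
// Illustrative sketch of prefix matching: an object key is selected
// when it begins with the configured `prefix`. This assumes standard
// object-storage prefix semantics; the service's exact behavior is
// defined by TLS, not by this helper.
function matchesPrefix(objectKey: string, prefix: string): boolean {
  return objectKey.startsWith(prefix);
}

console.log(matchesPrefix("logs/2024/app.log.gz", "logs/2024/")); // true
```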

    GetImportTaskTargetInfo

    ExtractRule Volcengine.GetImportTaskTargetInfoExtractRule
    Log extraction rule.
    LogSample string
    Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples to ensure the regular expression matches the first line of each log. Use real samples from the production environment.
    LogType string
    Specify log parsing type during import. delimiter_log: CSV type. multiline_log: multiline full text type. minimalist_log: single line full text type. json_log: JSON type.
    Region string
    Region.
    ExtractRule GetImportTaskTargetInfoExtractRule
    Log extraction rule.
    LogSample string
    Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples to ensure the regular expression matches the first line of each log. Use real samples from the production environment.
    LogType string
    Specify log parsing type during import. delimiter_log: CSV type. multiline_log: multiline full text type. minimalist_log: single line full text type. json_log: JSON type.
    Region string
    Region.
    extractRule GetImportTaskTargetInfoExtractRule
    Log extraction rule.
    logSample String
    Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples to ensure the regular expression matches the first line of each log. Use real samples from the production environment.
    logType String
    Specify log parsing type during import. delimiter_log: CSV type. multiline_log: multiline full text type. minimalist_log: single line full text type. json_log: JSON type.
    region String
    Region.
    extractRule GetImportTaskTargetInfoExtractRule
    Log extraction rule.
    logSample string
    Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples to ensure the regular expression matches the first line of each log. Use real samples from the production environment.
    logType string
    Specify log parsing type during import. delimiter_log: CSV type. multiline_log: multiline full text type. minimalist_log: single line full text type. json_log: JSON type.
    region string
    Region.
    extract_rule GetImportTaskTargetInfoExtractRule
    Log extraction rule.
    log_sample str
    Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples to ensure the regular expression matches the first line of each log. Use real samples from the production environment.
    log_type str
    Specify log parsing type during import. delimiter_log: CSV type. multiline_log: multiline full text type. minimalist_log: single line full text type. json_log: JSON type.
    region str
    Region.
    extractRule Property Map
    Log extraction rule.
    logSample String
    Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples to ensure the regular expression matches the first line of each log. Use real samples from the production environment.
    logType String
    Specify log parsing type during import. delimiter_log: CSV type. multiline_log: multiline full text type. minimalist_log: single line full text type. json_log: JSON type.
    region String
    Region.

    GetImportTaskTargetInfoExtractRule

    ExtractRule Volcengine.GetImportTaskTargetInfoExtractRuleExtractRule
    Basic content of log extraction rules.
    SkipLineCount int
    Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
    TimeExtractRegex string
    Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time
    TimeZone string
    Time zone. Supports machine time zone (default) and custom time zone. Custom time zone supports GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
    ExtractRule GetImportTaskTargetInfoExtractRuleExtractRule
    Basic content of log extraction rules.
    SkipLineCount int
    Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
    TimeExtractRegex string
    Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time
    TimeZone string
    Time zone. Supports machine time zone (default) and custom time zone. Custom time zone supports GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
    extractRule GetImportTaskTargetInfoExtractRuleExtractRule
    Basic content of log extraction rules.
    skipLineCount Integer
    Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
    timeExtractRegex String
    Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time
    timeZone String
    Time zone. Supports machine time zone (default) and custom time zone. Custom time zone supports GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
    extractRule GetImportTaskTargetInfoExtractRuleExtractRule
    Basic content of log extraction rules.
    skipLineCount number
    Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
    timeExtractRegex string
    Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time
    timeZone string
    Time zone. Supports machine time zone (default) and custom time zone. Custom time zone supports GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
    extract_rule GetImportTaskTargetInfoExtractRuleExtractRule
    Basic content of log extraction rules.
    skip_line_count int
    Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
    time_extract_regex str
    Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time
    time_zone str
    Time zone. Supports machine time zone (default) and custom time zone. Custom time zone supports GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
    extractRule Property Map
    Basic content of log extraction rules.
    skipLineCount Number
    Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
    timeExtractRegex String
    Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time
    timeZone String
    Time zone. Supports machine time zone (default) and custom time zone. Custom time zone supports GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
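    The TimeExtractRegex behavior described above can be illustrated with a short sketch. Whether the service uses a capture group or the whole match is not specified here, so this sketch takes the first capture group when present and otherwise the whole match (an assumption on my part):

```typescript
// Illustrative sketch: apply a time-extraction regex to the raw value
// of the time field and pull out the timestamp string. The fallback
// from capture group to whole match is an assumption, not documented
// service behavior.
function extractTime(raw: string, timeExtractRegex: string): string | null {
  const m = raw.match(new RegExp(timeExtractRegex));
  if (!m) return null;
  return m[1] ?? m[0];
}

console.log(extractTime("level=INFO time=2024-05-01T12:00:00Z msg=ok", "time=(\\S+)"));
// "2024-05-01T12:00:00Z"
```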

    GetImportTaskTargetInfoExtractRuleExtractRule

    BeginRegex string
    Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample
    Delimiter string
    Delimiter. Only valid when LogType is delimiter_log.
    EnableNanosecond bool
    Enable nanoseconds.
    FilterKeyRegexes List<Volcengine.GetImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex>
    Filter key regular expression.
    Keys List<string>
    List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and field names cannot be left blank.
    LogRegex string
    Log regular expression
    LogTemplate Volcengine.GetImportTaskTargetInfoExtractRuleExtractRuleLogTemplate
    Log template.
    Quote string
    Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
    TimeFormat string
    Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in TimeKey and TimeFormat. TimeKey and TimeFormat must be paired. For configuration details, see time format.
    TimeKey string
    Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat. TimeKey and TimeFormat must appear in pairs
    TimeSample string
    Time sample. Used to verify whether the entered time parsing format is correct
    UnMatchLogKey string
    When uploading logs that failed to parse, specify the key name for the failed logs. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
    UnMatchUpLoadSwitch bool
    Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse.
    BeginRegex string
    Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample
    Delimiter string
    Delimiter. Only valid when LogType is delimiter_log.
    EnableNanosecond bool
    Enable nanoseconds.
    FilterKeyRegexes []GetImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex
    Filter key regular expression.
    Keys []string
    List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and all field names cannot be left blank
    LogRegex string
    Log regular expression
    LogTemplate GetImportTaskTargetInfoExtractRuleExtractRuleLogTemplate
    Log template.
    Quote string
    Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
    TimeFormat string
    Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in TimeKey and TimeFormat. TimeKey and TimeFormat must be paired. For configuration details, see time format.
    TimeKey string
    Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat. TimeKey and TimeFormat must appear in pairs
    TimeSample string
    Time sample. Used to verify whether the entered time parsing format is correct
    UnMatchLogKey string
    When uploading logs that failed to parse, specify the key name for the failed logs. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
    UnMatchUpLoadSwitch bool
    Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse.
    beginRegex String
    Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample
    delimiter String
    Delimiter. Only valid when LogType is delimiter_log.
    enableNanosecond Boolean
    Enable nanoseconds.
    filterKeyRegexes List<GetImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex>
    Filter key regular expression.
    keys List<String>
    List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and all field names cannot be left blank
    logRegex String
    Log regular expression
    logTemplate GetImportTaskTargetInfoExtractRuleExtractRuleLogTemplate
    Log template.
    quote String
    Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
    timeFormat String
    Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in TimeKey and TimeFormat. TimeKey and TimeFormat must be paired. For configuration details, see time format.
    timeKey String
    Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat. TimeKey and TimeFormat must appear in pairs
    timeSample String
    Time sample. Used to verify whether the entered time parsing format is correct
    unMatchLogKey String
    When uploading logs that failed to parse, specify the key name for the failed logs. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
    unMatchUpLoadSwitch Boolean
    Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse.
    beginRegex string
    Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample
    delimiter string
    Delimiter. Only valid when LogType is delimiter_log.
    enableNanosecond boolean
    Enable nanoseconds.
    filterKeyRegexes GetImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex[]
    Filter key regular expression.
    keys string[]
    List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and all field names cannot be left blank
    logRegex string
    Log regular expression
    logTemplate GetImportTaskTargetInfoExtractRuleExtractRuleLogTemplate
    Log template.
    quote string
    Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
    timeFormat string
    Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in TimeKey and TimeFormat. TimeKey and TimeFormat must be paired. For configuration details, see time format.
    timeKey string
    Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat. TimeKey and TimeFormat must appear in pairs
    timeSample string
    Time sample. Used to verify whether the entered time parsing format is correct
    unMatchLogKey string
    When uploading logs that failed to parse, specify the key name for the failed logs. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
    unMatchUpLoadSwitch boolean
    Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse.
    begin_regex str
    Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample
    delimiter str
    Delimiter. Only valid when LogType is delimiter_log.
    enable_nanosecond bool
    Enable nanoseconds.
    filter_key_regexes Sequence[GetImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex]
    Filter key regular expression.
    keys Sequence[str]
    List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and all field names cannot be left blank
    log_regex str
    Log regular expression
    log_template GetImportTaskTargetInfoExtractRuleExtractRuleLogTemplate
    Log template.
    quote str
    Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
    time_format str
    Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in TimeKey and TimeFormat. TimeKey and TimeFormat must be paired. For configuration details, see time format.
    time_key str
    Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat. TimeKey and TimeFormat must appear in pairs
    time_sample str
    Time sample. Used to verify whether the entered time parsing format is correct
    un_match_log_key str
    When uploading logs that failed to parse, specify the key name for the failed logs. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
    un_match_up_load_switch bool
    Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse.
    beginRegex String
    Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample
    delimiter String
    Delimiter. Only valid when LogType is delimiter_log.
    enableNanosecond Boolean
    Enable nanoseconds.
    filterKeyRegexes List<Property Map>
    Filter key regular expression.
    keys List<String>
    List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and all field names cannot be left blank
    logRegex String
    Log regular expression
    logTemplate Property Map
    Log template.
    quote String
    Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
    timeFormat String
    Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in TimeKey and TimeFormat. TimeKey and TimeFormat must be paired. For configuration details, see time format.
    timeKey String
    Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat. TimeKey and TimeFormat must appear in pairs
    timeSample String
    Time sample. Used to verify whether the entered time parsing format is correct
    unMatchLogKey String
    When uploading logs that failed to parse, specify the key name for the failed logs. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
    unMatchUpLoadSwitch Boolean
    Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse.
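
    The interplay of the ExtractRule fields above can be sketched conceptually in Python. All sample values (the keys, the log lines, the regex) are hypothetical; a delimiter_log rule with a Quote is modeled here as CSV-style parsing, and BeginRegex as a first-line matcher:

    ```python
    import csv
    import io
    import re
    from datetime import datetime

    # --- delimiter_log: Delimiter + Quote + Keys ---
    # Content enclosed by Quote is kept as one field, mirroring CSV quoting.
    keys = ["time", "level", "msg"]                      # the rule's Keys list
    line = '2024-06-01 12:00:00,INFO,"hello, world"'
    fields = next(csv.reader(io.StringIO(line), delimiter=",", quotechar='"'))
    record = dict(zip(keys, fields))                     # msg stays "hello, world"

    # --- multiline_log: BeginRegex marks the first line of each log ---
    begin_regex = re.compile(r"^\d{4}-\d{2}-\d{2}")      # hypothetical first-line pattern
    raw = ["2024-06-01 ERROR boom", "  at frame 1", "2024-06-01 INFO ok"]
    logs, current = [], []
    for l in raw:
        if begin_regex.match(l) and current:             # a new log starts: flush
            logs.append("\n".join(current))
            current = []
        current.append(l)
    logs.append("\n".join(current))                      # flush the last log

    # --- TimeKey + TimeFormat: they must be configured as a pair ---
    time_key, time_format = "time", "%Y-%m-%d %H:%M:%S"
    log_time = datetime.strptime(record[time_key], time_format)
    ```

    The stack trace line that does not match BeginRegex is folded into the preceding log, which is exactly what the multiline_log mode is for.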

    GetImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex

    Key string
    Key.
    Regex string
    Regular expression.
    Key string
    Key.
    Regex string
    Regular expression.
    key String
    Key.
    regex String
    Regular expression.
    key string
    Key.
    regex string
    Regular expression.
    key str
    Key.
    regex str
    Regular expression.
    key String
    Key.
    regex String
    Regular expression.
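
    A FilterKeyRegex entry pairs a field name (key) with a regular expression (regex). As a rough sketch, assuming a record is kept only when every listed key's value matches its regex (the filter entries and records below are made up):

    ```python
    import re

    # Hypothetical filter list: keep only WARN/ERROR logs mentioning a timeout.
    filters = [("level", r"^(ERROR|WARN)$"), ("msg", r"timeout")]

    def keep(record: dict) -> bool:
        """True when every filter's regex matches the value of its key."""
        return all(re.search(rx, record.get(k, "")) for k, rx in filters)

    print(keep({"level": "ERROR", "msg": "request timeout"}))  # True
    print(keep({"level": "INFO",  "msg": "request timeout"}))  # False
    ```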

    GetImportTaskTargetInfoExtractRuleExtractRuleLogTemplate

    Format string
    Format.
    Type string
    Type
    Format string
    Format.
    Type string
    Type
    format String
    Format.
    type String
    Type
    format string
    Format.
    type string
    Type
    format str
    Format.
    type str
    Type
    format String
    Format.
    type String
    Type

    GetImportTaskTaskStatistics

    BytesTotal int
    Total resource bytes enumerated
    BytesTransferred int
    Bytes transferred.
    Failed int
    Number of resources that failed to import.
    NotExist int
    Number of resources not found.
    Skipped int
    Number of resources skipped during the import.
    TaskStatus string
    Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
    Total int
    Total number of resources enumerated.
    Transferred int
    Number of records transferred.
    BytesTotal int
    Total resource bytes enumerated
    BytesTransferred int
    Bytes transferred.
    Failed int
    Number of resources that failed to import.
    NotExist int
    Number of resources not found.
    Skipped int
    Number of resources skipped during the import.
    TaskStatus string
    Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
    Total int
    Total number of resources enumerated.
    Transferred int
    Number of records transferred.
    bytesTotal Integer
    Total resource bytes enumerated
    bytesTransferred Integer
    Bytes transferred.
    failed Integer
    Number of resources that failed to import.
    notExist Integer
    Number of resources not found.
    skipped Integer
    Number of resources skipped during the import.
    taskStatus String
    Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
    total Integer
    Total number of resources enumerated.
    transferred Integer
    Number of records transferred.
    bytesTotal number
    Total resource bytes enumerated
    bytesTransferred number
    Bytes transferred.
    failed number
    Number of resources that failed to import.
    notExist number
    Number of resources not found.
    skipped number
    Number of resources skipped during the import.
    taskStatus string
    Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
    total number
    Total number of resources enumerated.
    transferred number
    Number of records transferred.
    bytes_total int
    Total resource bytes enumerated
    bytes_transferred int
    Bytes transferred.
    failed int
    Number of resources that failed to import.
    not_exist int
    Number of resources not found.
    skipped int
    Number of resources skipped during the import.
    task_status str
    Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
    total int
    Total number of resources enumerated.
    transferred int
    Number of records transferred.
    bytesTotal Number
    Total resource bytes enumerated
    bytesTransferred Number
    Bytes transferred.
    failed Number
    Number of resources that failed to import.
    notExist Number
    Number of resources not found.
    skipped Number
    Number of resources skipped during the import.
    taskStatus String
    Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
    total Number
    Total number of resources enumerated.
    transferred Number
    Number of records transferred.
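
    The statistics fields above can be combined into simple progress figures. The field names mirror GetImportTaskTaskStatistics, but the values below are made up, and the assumption that transferred + failed + skipped + not_exist accounts for all processed resources is illustrative, not guaranteed by the API:

    ```python
    # Hypothetical snapshot of an import task's statistics.
    stats = {
        "total": 1000,            # resources enumerated
        "transferred": 900,       # records transferred
        "failed": 40,
        "skipped": 50,
        "not_exist": 10,
        "bytes_total": 1_048_576,
        "bytes_transferred": 943_718,
    }

    # Record-level and byte-level progress, as percentages.
    processed = (stats["transferred"] + stats["failed"]
                 + stats["skipped"] + stats["not_exist"])
    progress_pct = 100 * processed / stats["total"]
    byte_pct = 100 * stats["bytes_transferred"] / stats["bytes_total"]
    print(f"{progress_pct:.1f}% of records, {byte_pct:.1f}% of bytes")
    ```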

    Package Details

    Repository
    volcenginecc volcengine/pulumi-volcenginecc
    License
    MPL-2.0
    Notes
    This Pulumi package is based on the volcenginecc Terraform Provider.