volcenginecc.tls.ImportTask
Viewing docs for volcenginecc v0.0.32
published on Thursday, Apr 23, 2026 by Volcengine

    Log Service supports data import, allowing you to structure data stored in sources such as TOS and Kafka and save it in Log Service.

    Example Usage

    Example coming soon!
    
    
    resources:
      tLSImportTaskDemo:
        type: volcenginecc:tls:ImportTask
        name: TLSImportTaskDemo
        properties:
          description: ccapi-test-kafka
          importSourceInfo:
            kafka_source_info:
              host: kafka-cnngsl83xxxxx.kafka.cn-beijing.ivolces.com:9092
              group: group1
              topic: topic1
              encode: UTF-8
              password: ""
              protocol: ""
              username: ""
              mechanism: ""
              instanceId: kafka-cnngsl83e6xxxxx
              isNeedAuth: false
              initialOffset: 0
              timeSourceDefault: 1
          projectId: 4f1af9e7-34af-4ce5-866e-xxxxxxx
          sourceType: kafka
          targetInfo:
            region: cn-beijing
            log_type: json_log
            log_sample: ""
            extract_rule:
              extractRule:
                beginRegex: ""
                delimiter: ""
                logRegex: ""
                logTemplate:
                  format: ""
                  type: ""
                quote: ""
                timeFormat: '%Y-%m-%d %H:%M:%S,%f'
                timeKey: time
                timeSample: ""
                unMatchLogKey: LogParseFailed
                unMatchUpLoadSwitch: true
                filterKeyRegex:
                  - key: user_id
                    regex: ^[0-9]+$
              skipLineCount: 0
              timeExtractRegex: '[0-9]{0,2}\/[0-9a-zA-Z]+\/[0-9:,]+'
              timeZone: Asia/Shanghai
          taskName: ccapi-test-kafka-1001
          topicId: b75fffd8-1986-460c-9cca-xxxxxxx
    
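The extract rule in the example above combines three pattern fields: timeFormat (a strftime/strptime-style pattern), timeExtractRegex, and filterKeyRegex. A standalone sketch in plain Python (standard library only, no provider SDK involved) showing how each pattern from the example behaves against made-up sample data:

```python
import re
from datetime import datetime

# Values copied from the extract rule in the example above.
time_format = "%Y-%m-%d %H:%M:%S,%f"
time_extract_regex = r"[0-9]{0,2}\/[0-9a-zA-Z]+\/[0-9:,]+"
filter_key_regex = r"^[0-9]+$"

# timeFormat: strptime-style; ",%f" captures the fractional part after a comma,
# as emitted by log4j-style loggers.
parsed = datetime.strptime("2024-05-01 12:34:56,789", time_format)
assert parsed.microsecond == 789000

# timeExtractRegex: matches day/Month/time fragments (e.g. access-log timestamps).
assert re.search(time_extract_regex, "01/May/12:34:56,789")

# filterKeyRegex on user_id: only records whose user_id is all digits pass.
assert re.fullmatch(filter_key_regex, "12345")
assert not re.fullmatch(filter_key_regex, "user-123")
```

Checking the patterns locally like this before creating the task avoids a create/stop/edit cycle when a pattern silently matches nothing.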

    Create ImportTask Resource

    Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.

    Constructor syntax

    new ImportTask(name: string, args: ImportTaskArgs, opts?: CustomResourceOptions);
    @overload
    def ImportTask(resource_name: str,
                   args: ImportTaskArgs,
                   opts: Optional[ResourceOptions] = None)
    
    @overload
    def ImportTask(resource_name: str,
                   opts: Optional[ResourceOptions] = None,
                   import_source_info: Optional[ImportTaskImportSourceInfoArgs] = None,
                   source_type: Optional[str] = None,
                   target_info: Optional[ImportTaskTargetInfoArgs] = None,
                   task_name: Optional[str] = None,
                   topic_id: Optional[str] = None,
                   description: Optional[str] = None,
                   project_id: Optional[str] = None,
                   status: Optional[int] = None)
    func NewImportTask(ctx *Context, name string, args ImportTaskArgs, opts ...ResourceOption) (*ImportTask, error)
    public ImportTask(string name, ImportTaskArgs args, CustomResourceOptions? opts = null)
    public ImportTask(String name, ImportTaskArgs args)
    public ImportTask(String name, ImportTaskArgs args, CustomResourceOptions options)
    
    type: volcenginecc:tls:ImportTask
    properties: # The arguments to resource properties.
    options: # Bag of options to control resource's behavior.
    
    

    Parameters

    name string
    The unique name of the resource.
    args ImportTaskArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    resource_name str
    The unique name of the resource.
    args ImportTaskArgs
    The arguments to resource properties.
    opts ResourceOptions
    Bag of options to control resource's behavior.
    ctx Context
    Context object for the current deployment.
    name string
    The unique name of the resource.
    args ImportTaskArgs
    The arguments to resource properties.
    opts ResourceOption
    Bag of options to control resource's behavior.
    name string
    The unique name of the resource.
    args ImportTaskArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    name String
    The unique name of the resource.
    args ImportTaskArgs
    The arguments to resource properties.
    options CustomResourceOptions
    Bag of options to control resource's behavior.

    ImportTask Resource Properties

    To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

    Inputs

    In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.

    The ImportTask resource accepts the following input properties:

    ImportSourceInfo Volcengine.ImportTaskImportSourceInfo
    Import data source information
    SourceType string
    Data source type. Options: tos, kafka.
    TargetInfo Volcengine.ImportTaskTargetInfo
    Output information for the data import task.
    TaskName string
    Import task name
    TopicId string
    Log topic ID used to store data
    Description string
    Task description.
    ProjectId string
    Log project ID for storing data.
    Status int
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    ImportSourceInfo ImportTaskImportSourceInfoArgs
    Import data source information
    SourceType string
    Data source type. Options: tos, kafka.
    TargetInfo ImportTaskTargetInfoArgs
    Output information for the data import task.
    TaskName string
    Import task name
    TopicId string
    Log topic ID used to store data
    Description string
    Task description.
    ProjectId string
    Log project ID for storing data.
    Status int
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    importSourceInfo ImportTaskImportSourceInfo
    Import data source information
    sourceType String
    Data source type. Options: tos, kafka.
    targetInfo ImportTaskTargetInfo
    Output information for the data import task.
    taskName String
    Import task name
    topicId String
    Log topic ID used to store data
    description String
    Task description.
    projectId String
    Log project ID for storing data.
    status Integer
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    importSourceInfo ImportTaskImportSourceInfo
    Import data source information
    sourceType string
    Data source type. Options: tos, kafka.
    targetInfo ImportTaskTargetInfo
    Output information for the data import task.
    taskName string
    Import task name
    topicId string
    Log topic ID used to store data
    description string
    Task description.
    projectId string
    Log project ID for storing data.
    status number
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    import_source_info ImportTaskImportSourceInfoArgs
    Import data source information
    source_type str
    Data source type. Options: tos, kafka.
    target_info ImportTaskTargetInfoArgs
    Output information for the data import task.
    task_name str
    Import task name
    topic_id str
    Log topic ID used to store data
    description str
    Task description.
    project_id str
    Log project ID for storing data.
    status int
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    importSourceInfo Property Map
    Import data source information
    sourceType String
    Data source type. Options: tos, kafka.
    targetInfo Property Map
    Output information for the data import task.
    taskName String
    Import task name
    topicId String
    Log topic ID used to store data
    description String
    Task description.
    projectId String
    Log project ID for storing data.
    status Number
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
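The status property is a numeric code. When polling tasks programmatically, a lookup table keeps log output readable; a minimal Python helper (the dict and function names here are illustrative, not part of the SDK):

```python
# Status codes for the data import task, as documented above.
IMPORT_TASK_STATUS = {
    0: "Importing",
    1: "Import completed",
    2: "Import error",
    3: "Stopping",
    4: "Stopped",
    5: "Restarting",
}

def describe_status(status: int) -> str:
    """Translate a numeric task status into its documented label."""
    return IMPORT_TASK_STATUS.get(status, f"Unknown status: {status}")

print(describe_status(1))  # Import completed
```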

    Outputs

    All input properties are implicitly available as output properties. Additionally, the ImportTask resource produces the following output properties:

    CreateTime string
    Creation time.
    Id string
    The provider-assigned unique ID for this managed resource.
    ProjectName string
    Log project name.
    TaskId string
    Import task ID.
    TaskStatistics Volcengine.ImportTaskTaskStatistics
    Progress of the data import task.
    TopicName string
    Log topic name.
    CreateTime string
    Creation time.
    Id string
    The provider-assigned unique ID for this managed resource.
    ProjectName string
    Log project name.
    TaskId string
    Import task ID.
    TaskStatistics ImportTaskTaskStatistics
    Progress of the data import task.
    TopicName string
    Log topic name.
    createTime String
    Creation time.
    id String
    The provider-assigned unique ID for this managed resource.
    projectName String
    Log project name.
    taskId String
    Import task ID.
    taskStatistics ImportTaskTaskStatistics
    Progress of the data import task.
    topicName String
    Log topic name.
    createTime string
    Creation time.
    id string
    The provider-assigned unique ID for this managed resource.
    projectName string
    Log project name.
    taskId string
    Import task ID.
    taskStatistics ImportTaskTaskStatistics
    Progress of the data import task.
    topicName string
    Log topic name.
    create_time str
    Creation time.
    id str
    The provider-assigned unique ID for this managed resource.
    project_name str
    Log project name.
    task_id str
    Import task ID.
    task_statistics ImportTaskTaskStatistics
    Progress of the data import task.
    topic_name str
    Log topic name.
    createTime String
    Creation time.
    id String
    The provider-assigned unique ID for this managed resource.
    projectName String
    Log project name.
    taskId String
    Import task ID.
    taskStatistics Property Map
    Progress of the data import task.
    topicName String
    Log topic name.


    Look up Existing ImportTask Resource

    Get an existing ImportTask resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.

    public static get(name: string, id: Input<ID>, state?: ImportTaskState, opts?: CustomResourceOptions): ImportTask
    @staticmethod
    def get(resource_name: str,
            id: str,
            opts: Optional[ResourceOptions] = None,
            create_time: Optional[str] = None,
            description: Optional[str] = None,
            import_source_info: Optional[ImportTaskImportSourceInfoArgs] = None,
            project_id: Optional[str] = None,
            project_name: Optional[str] = None,
            source_type: Optional[str] = None,
            status: Optional[int] = None,
            target_info: Optional[ImportTaskTargetInfoArgs] = None,
            task_id: Optional[str] = None,
            task_name: Optional[str] = None,
            task_statistics: Optional[ImportTaskTaskStatisticsArgs] = None,
            topic_id: Optional[str] = None,
            topic_name: Optional[str] = None) -> ImportTask
    func GetImportTask(ctx *Context, name string, id IDInput, state *ImportTaskState, opts ...ResourceOption) (*ImportTask, error)
    public static ImportTask Get(string name, Input<string> id, ImportTaskState? state, CustomResourceOptions? opts = null)
    public static ImportTask get(String name, Output<String> id, ImportTaskState state, CustomResourceOptions options)
    resources:
      _:
        type: volcenginecc:tls:ImportTask
        get:
          id: ${id}
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
    resource_name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
    name
    The unique name of the resulting resource.
    id
    The unique provider ID of the resource to lookup.
    state
    Any extra arguments used during the lookup.
    opts
    A bag of options that control this resource's behavior.
    The following state arguments are supported:
    CreateTime string
    Creation time.
    Description string
    Task description.
    ImportSourceInfo Volcengine.ImportTaskImportSourceInfo
    Import data source information
    ProjectId string
    Log project ID for storing data.
    ProjectName string
    Log project name.
    SourceType string
    Data source type. Options: tos, kafka.
    Status int
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    TargetInfo Volcengine.ImportTaskTargetInfo
    Output information for the data import task.
    TaskId string
    Import task ID.
    TaskName string
    Import task name
    TaskStatistics Volcengine.ImportTaskTaskStatistics
    Progress of the data import task.
    TopicId string
    Log topic ID used to store data
    TopicName string
    Log topic name.
    CreateTime string
    Creation time.
    Description string
    Task description.
    ImportSourceInfo ImportTaskImportSourceInfoArgs
    Import data source information
    ProjectId string
    Log project ID for storing data.
    ProjectName string
    Log project name.
    SourceType string
    Data source type. Options: tos, kafka.
    Status int
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    TargetInfo ImportTaskTargetInfoArgs
    Output information for the data import task.
    TaskId string
    Import task ID.
    TaskName string
    Import task name
    TaskStatistics ImportTaskTaskStatisticsArgs
    Progress of the data import task.
    TopicId string
    Log topic ID used to store data
    TopicName string
    Log topic name.
    createTime String
    Creation time.
    description String
    Task description.
    importSourceInfo ImportTaskImportSourceInfo
    Import data source information
    projectId String
    Log project ID for storing data.
    projectName String
    Log project name.
    sourceType String
    Data source type. Options: tos, kafka.
    status Integer
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    targetInfo ImportTaskTargetInfo
    Output information for the data import task.
    taskId String
    Import task ID.
    taskName String
    Import task name
    taskStatistics ImportTaskTaskStatistics
    Progress of the data import task.
    topicId String
    Log topic ID used to store data
    topicName String
    Log topic name.
    createTime string
    Creation time.
    description string
    Task description.
    importSourceInfo ImportTaskImportSourceInfo
    Import data source information
    projectId string
    Log project ID for storing data.
    projectName string
    Log project name.
    sourceType string
    Data source type. Options: tos, kafka.
    status number
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    targetInfo ImportTaskTargetInfo
    Output information for the data import task.
    taskId string
    Import task ID.
    taskName string
    Import task name
    taskStatistics ImportTaskTaskStatistics
    Progress of the data import task.
    topicId string
    Log topic ID used to store data
    topicName string
    Log topic name.
    create_time str
    Creation time.
    description str
    Task description.
    import_source_info ImportTaskImportSourceInfoArgs
    Import data source information
    project_id str
    Log project ID for storing data.
    project_name str
    Log project name.
    source_type str
    Data source type. Options: tos, kafka.
    status int
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    target_info ImportTaskTargetInfoArgs
    Output information for the data import task.
    task_id str
    Import task ID.
    task_name str
    Import task name
    task_statistics ImportTaskTaskStatisticsArgs
    Progress of the data import task.
    topic_id str
    Log topic ID used to store data
    topic_name str
    Log topic name.
    createTime String
    Creation time.
    description String
    Task description.
    importSourceInfo Property Map
    Import data source information
    projectId String
    Log project ID for storing data.
    projectName String
    Log project name.
    sourceType String
    Data source type. Options: tos, kafka.
    status Number
    Status of the data import task. 0: Importing. 1: Import completed. 2: Import error. 3: Stopping. 4: Stopped. 5: Restarting
    targetInfo Property Map
    Output information for the data import task.
    taskId String
    Import task ID.
    taskName String
    Import task name
    taskStatistics Property Map
    Progress of the data import task.
    topicId String
    Log topic ID used to store data
    topicName String
    Log topic name.

    Supporting Types

    ImportTaskImportSourceInfo, ImportTaskImportSourceInfoArgs

    KafkaSourceInfo Volcengine.ImportTaskImportSourceInfoKafkaSourceInfo
    Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required
    TosSourceInfo Volcengine.ImportTaskImportSourceInfoTosSourceInfo
    TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
    KafkaSourceInfo ImportTaskImportSourceInfoKafkaSourceInfo
    Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required
    TosSourceInfo ImportTaskImportSourceInfoTosSourceInfo
    TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
    kafkaSourceInfo ImportTaskImportSourceInfoKafkaSourceInfo
    Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required
    tosSourceInfo ImportTaskImportSourceInfoTosSourceInfo
    TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
    kafkaSourceInfo ImportTaskImportSourceInfoKafkaSourceInfo
    Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required
    tosSourceInfo ImportTaskImportSourceInfoTosSourceInfo
    TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
    kafka_source_info ImportTaskImportSourceInfoKafkaSourceInfo
    Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required
    tos_source_info ImportTaskImportSourceInfoTosSourceInfo
    TOS data source information. When sourceType is tos, the TosSourceInfo field is required.
    kafkaSourceInfo Property Map
    Kafka data source information. When sourceType is kafka, the KafkaSourceInfo field is required
    tosSourceInfo Property Map
    TOS data source information. When sourceType is tos, the TosSourceInfo field is required.

    ImportTaskImportSourceInfoKafkaSourceInfo, ImportTaskImportSourceInfoKafkaSourceInfoArgs

    Encode string
    Data encoding format. Available options: UTF-8, GBK.
    Group string
    Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
    Host string
    The service addresses for different types of Kafka clusters vary. Details are as follows: Message Queue Kafka Edition: Use the access point of the Kafka instance. For more information, see Access Point. If the Kafka instance and the Log Service Project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: Use the IP address and port number or the domain name and port number of the Kafka Broker. Only public network access is supported. Separate multiple service addresses with a comma (,).
    InitialOffset int
    Starting position for data import. Options: 0: Earliest time, start importing from the first record in the specified Kafka Topic. 1: Latest time, start importing from the most recently generated record in the specified Kafka Topic.
    InstanceId string
    If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
    IsNeedAuth bool
    Whether to enable authentication. If you use a public service address, it is recommended to enable authentication.
    Mechanism string
    Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
    Password string
    Kafka SASL user password for authentication.
    Protocol string
    Secure transmission protocol. Options include plaintext, saslssl, ssl, and saslplaintext
    TimeSourceDefault int
    Specify log time. Options: 0: Use Kafka message timestamp. 1: Use current system time.
    Topic string
    Kafka Topic name. Separate multiple Kafka Topics with commas (,).
    Username string
    Kafka SASL username for authentication.
    Encode string
    Data encoding format. Available options: UTF-8, GBK.
    Group string
    Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
    Host string
    The service addresses for different types of Kafka clusters vary. Details are as follows: Message Queue Kafka Edition: Use the access point of the Kafka instance. For more information, see Access Point. If the Kafka instance and the Log Service Project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: Use the IP address and port number or the domain name and port number of the Kafka Broker. Only public network access is supported. Separate multiple service addresses with a comma (,).
    InitialOffset int
    Starting position for data import. Options: 0: Earliest time, start importing from the first record in the specified Kafka Topic. 1: Latest time, start importing from the most recently generated record in the specified Kafka Topic.
    InstanceId string
    If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
    IsNeedAuth bool
    Whether to enable authentication. If you use a public service address, it is recommended to enable authentication.
    Mechanism string
    Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
    Password string
    Kafka SASL user password for authentication.
    Protocol string
    Secure transmission protocol. Options include plaintext, saslssl, ssl, and saslplaintext
    TimeSourceDefault int
    Specify log time. Options: 0: Use Kafka message timestamp. 1: Use current system time.
    Topic string
    Kafka Topic name. Separate multiple Kafka Topics with commas (,).
    Username string
    Kafka SASL username for authentication.
    encode String
    Data encoding format. Available options: UTF-8, GBK.
    group String
    Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
    host String
    The service addresses for different types of Kafka clusters vary. Details are as follows: Message Queue Kafka Edition: Use the access point of the Kafka instance. For more information, see Access Point. If the Kafka instance and the Log Service Project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: Use the IP address and port number or the domain name and port number of the Kafka Broker. Only public network access is supported. Separate multiple service addresses with a comma (,).
    initialOffset Integer
    Starting position for data import. Options: 0: Earliest time, start importing from the first record in the specified Kafka Topic. 1: Latest time, start importing from the most recently generated record in the specified Kafka Topic.
    instanceId String
    If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
    isNeedAuth Boolean
    Whether to enable authentication. If you use a public service address, it is recommended to enable authentication.
    mechanism String
    Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
    password String
    Kafka SASL user password for authentication.
    protocol String
    Secure transmission protocol. Options include plaintext, saslssl, ssl, and saslplaintext
    timeSourceDefault Integer
    Specify log time. Options: 0: Use Kafka message timestamp. 1: Use current system time.
    topic String
    Kafka Topic name. Separate multiple Kafka Topics with commas (,).
    username String
    Kafka SASL username for authentication.
    encode string
    Data encoding format. Available options: UTF-8, GBK.
    group string
    Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
    host string
    The service addresses for different types of Kafka clusters vary. Details are as follows: Message Queue Kafka Edition: Use the access point of the Kafka instance. For more information, see Access Point. If the Kafka instance and the Log Service Project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: Use the IP address and port number or the domain name and port number of the Kafka Broker. Only public network access is supported. Separate multiple service addresses with a comma (,).
    initialOffset number
    Starting position for data import. Options: 0: Earliest time, start importing from the first record in the specified Kafka Topic. 1: Latest time, start importing from the most recently generated record in the specified Kafka Topic.
    instanceId string
    If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
    isNeedAuth boolean
    Whether to enable authentication. If you use a public service address, it is recommended to enable authentication.
    mechanism string
    Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
    password string
    Kafka SASL user password for authentication.
    protocol string
    Secure transmission protocol. Options include plaintext, saslssl, ssl, and saslplaintext
    timeSourceDefault number
    Specify log time. Options: 0: Use Kafka message timestamp. 1: Use current system time.
    topic string
    Kafka Topic name. Separate multiple Kafka Topics with commas (,).
    username string
    Kafka SASL username for authentication.
    encode str
    Data encoding format. Available options: UTF-8, GBK.
    group str
    Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
    host str
    The service addresses for different types of Kafka clusters vary. Details are as follows: Message Queue Kafka Edition: Use the access point of the Kafka instance. For more information, see Access Point. If the Kafka instance and the Log Service Project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: Use the IP address and port number or the domain name and port number of the Kafka Broker. Only public network access is supported. Separate multiple service addresses with a comma (,).
    initial_offset int
    Starting position for data import. Options: 0: Earliest time, start importing from the first record in the specified Kafka Topic. 1: Latest time, start importing from the most recently generated record in the specified Kafka Topic.
    instance_id str
    If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
    is_need_auth bool
    Whether to enable authentication. If you use a public service address, it is recommended to enable authentication.
    mechanism str
    Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
    password str
    Kafka SASL user password for authentication.
    protocol str
    Secure transmission protocol. Options include plaintext, saslssl, ssl, and saslplaintext
    time_source_default int
    Specify log time. Options: 0: Use Kafka message timestamp. 1: Use current system time.
    topic str
    Kafka Topic name. Separate multiple Kafka Topics with commas (,).
    username str
    Kafka SASL username for authentication.
    encode String
    Data encoding format. Available options: UTF-8, GBK.
    group String
    Kafka consumer group. If not specified, the system will automatically create a Kafka consumer group.
    host String
    The service addresses for different types of Kafka clusters vary. Details are as follows: Message Queue Kafka Edition: Use the access point of the Kafka instance. For more information, see Access Point. If the Kafka instance and the Log Service Project are in the same region, you can use private network access; otherwise, use public network access. Self-hosted Kafka clusters: Use the IP address and port number or the domain name and port number of the Kafka Broker. Only public network access is supported. Separate multiple service addresses with a comma (,).
    initialOffset Number
    Starting position for data import. Options: 0: Earliest time, start importing from the first record in the specified Kafka Topic. 1: Latest time, start importing from the most recently generated record in the specified Kafka Topic.
    instanceId String
    If you are using Message Queue Kafka Edition, set this to the Kafka instance ID.
    isNeedAuth Boolean
    Whether to enable authentication. If you use a public service address, it is recommended to enable authentication.
    mechanism String
    Password authentication mechanism. Available options: PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512.
    password String
    Kafka SASL user password for authentication.
    protocol String
    Secure transmission protocol. Options include plaintext, saslssl, ssl, and saslplaintext
    timeSourceDefault Number
    Specify log time. Options: 0: Use Kafka message timestamp. 1: Use current system time.
    topic String
    Kafka Topic name. Separate multiple Kafka Topics with commas (,).
    username String
    Kafka SASL username for authentication.
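Since host and topic both accept comma-separated lists, it can be worth validating the strings before creating the task. A hypothetical pre-flight check in plain Python (the helper name and the host:port rule are assumptions for illustration, not provider behavior):

```python
def split_endpoints(host: str) -> list[str]:
    """Split a comma-separated Kafka host string and check host:port shape."""
    endpoints = [h.strip() for h in host.split(",") if h.strip()]
    for ep in endpoints:
        name, sep, port = ep.rpartition(":")
        if not sep or not name or not port.isdigit():
            raise ValueError(f"invalid endpoint: {ep!r}")
    return endpoints

# Multiple brokers, as the host field allows (addresses are made up):
print(split_endpoints(
    "kafka-a.example.ivolces.com:9092, kafka-b.example.ivolces.com:9092"
))
```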

    ImportTaskImportSourceInfoTosSourceInfo, ImportTaskImportSourceInfoTosSourceInfoArgs

    Bucket string
    TOS bucket name
    CompressType string
    Compression mode for data in the TOS bucket. none: No compression. snappy: Compress using snappy. gzip: Compress using gzip. lz4: Compress using lz4.
    Prefix string
    Path of the file to be imported in the TOS bucket.
    Region string
    Region where the TOS bucket is located. Cross-region data import is supported
    Bucket string
    TOS bucket name
    CompressType string
    Compression mode for data in the TOS bucket. none: No compression. snappy: Compress using snappy. gzip: Compress using gzip. lz4: Compress using lz4.
    Prefix string
    Path of the file to be imported in the TOS bucket.
    Region string
    Region where the TOS bucket is located. Cross-region data import is supported
    bucket String
    TOS bucket name
    compressType String
    Compression mode for data in the TOS bucket. none: No compression. snappy: Compress using snappy. gzip: Compress using gzip. lz4: Compress using lz4.
    prefix String
    Path of the file to be imported in the TOS bucket.
    region String
    Region where the TOS bucket is located. Cross-region data import is supported
    bucket string
    TOS bucket name
    compressType string
    Compression mode for data in the TOS bucket. none: No compression. snappy: Compress using snappy. gzip: Compress using gzip. lz4: Compress using lz4.
    prefix string
    Path of the file to be imported in the TOS bucket.
    region string
    Region where the TOS bucket is located. Cross-region data import is supported
    bucket str
    TOS bucket name
    compress_type str
    Compression mode for data in the TOS bucket. none: No compression. snappy: Compress using snappy. gzip: Compress using gzip. lz4: Compress using lz4.
    prefix str
    Path of the file to be imported in the TOS bucket.
    region str
    Region where the TOS bucket is located. Cross-region data import is supported
    bucket String
    TOS bucket name
    compressType String
    Compression mode for data in the TOS bucket. none: No compression. snappy: Compress using snappy. gzip: Compress using gzip. lz4: Compress using lz4.
    prefix String
    Path of the file to be imported in the TOS bucket.
    region String
    Region where the TOS bucket is located. Cross-region data import is supported
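
    For a TOS source, the schema above suggests a fragment along these lines. The bucket name and prefix are placeholders, and the `tos_source_info` key name is an assumption modeled on the snake_case `kafka_source_info` key in the example at the top of this page:

    ```yaml
    importSourceInfo:
      tos_source_info:
        bucket: my-log-bucket   # placeholder TOS bucket name
        prefix: logs/app1/      # path of the files to import
        region: cn-beijing      # cross-region import is supported
        compress_type: gzip     # none | snappy | gzip | lz4
    sourceType: tos
    ```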

    ImportTaskTargetInfo, ImportTaskTargetInfoArgs

    LogType string
    Log parsing type used during import. delimiter_log: delimited (CSV) logs. multiline_log: multi-line full-text logs. minimalist_log: single-line full-text logs. json_log: JSON logs.
    Region string
    Region.
    ExtractRule Volcengine.ImportTaskTargetInfoExtractRule
    Log extraction rule.
    LogSample string
    Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples to ensure the regular expression matches the first line of each log. Use real samples from the production environment.
    LogType string
    Log parsing type used during import. delimiter_log: delimited (CSV) logs. multiline_log: multi-line full-text logs. minimalist_log: single-line full-text logs. json_log: JSON logs.
    Region string
    Region.
    ExtractRule ImportTaskTargetInfoExtractRule
    Log extraction rule.
    LogSample string
    Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples to ensure the regular expression matches the first line of each log. Use real samples from the production environment.
    logType String
    Log parsing type used during import. delimiter_log: delimited (CSV) logs. multiline_log: multi-line full-text logs. minimalist_log: single-line full-text logs. json_log: JSON logs.
    region String
    Region.
    extractRule ImportTaskTargetInfoExtractRule
    Log extraction rule.
    logSample String
    Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples to ensure the regular expression matches the first line of each log. Use real samples from the production environment.
    logType string
    Log parsing type used during import. delimiter_log: delimited (CSV) logs. multiline_log: multi-line full-text logs. minimalist_log: single-line full-text logs. json_log: JSON logs.
    region string
    Region.
    extractRule ImportTaskTargetInfoExtractRule
    Log extraction rule.
    logSample string
    Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples to ensure the regular expression matches the first line of each log. Use real samples from the production environment.
    log_type str
    Log parsing type used during import. delimiter_log: delimited (CSV) logs. multiline_log: multi-line full-text logs. minimalist_log: single-line full-text logs. json_log: JSON logs.
    region str
    Region.
    extract_rule ImportTaskTargetInfoExtractRule
    Log extraction rule.
    log_sample str
    Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples to ensure the regular expression matches the first line of each log. Use real samples from the production environment.
    logType String
    Log parsing type used during import. delimiter_log: delimited (CSV) logs. multiline_log: multi-line full-text logs. minimalist_log: single-line full-text logs. json_log: JSON logs.
    region String
    Region.
    extractRule Property Map
    Log extraction rule.
    logSample String
    Log sample. When LogType is set to multiline_log, you must configure log samples. It is recommended to provide more than two log entries as examples to ensure the regular expression matches the first line of each log. Use real samples from the production environment.

    ImportTaskTargetInfoExtractRule, ImportTaskTargetInfoExtractRuleArgs

    ExtractRule Volcengine.ImportTaskTargetInfoExtractRuleExtractRule
    Basic content of log extraction rules.
    SkipLineCount int
    Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
    TimeExtractRegex string
    Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time
    TimeZone string
    Time zone. Supports machine time zone (default) and custom time zone. Custom time zone supports GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
    ExtractRule ImportTaskTargetInfoExtractRuleExtractRule
    Basic content of log extraction rules.
    SkipLineCount int
    Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
    TimeExtractRegex string
    Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time
    TimeZone string
    Time zone. Supports machine time zone (default) and custom time zone. Custom time zone supports GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
    extractRule ImportTaskTargetInfoExtractRuleExtractRule
    Basic content of log extraction rules.
    skipLineCount Integer
    Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
    timeExtractRegex String
    Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time
    timeZone String
    Time zone. Supports machine time zone (default) and custom time zone. Custom time zone supports GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
    extractRule ImportTaskTargetInfoExtractRuleExtractRule
    Basic content of log extraction rules.
    skipLineCount number
    Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
    timeExtractRegex string
    Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time
    timeZone string
    Time zone. Supports machine time zone (default) and custom time zone. Custom time zone supports GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
    extract_rule ImportTaskTargetInfoExtractRuleExtractRule
    Basic content of log extraction rules.
    skip_line_count int
    Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
    time_extract_regex str
    Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time
    time_zone str
    Time zone. Supports machine time zone (default) and custom time zone. Custom time zone supports GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.
    extractRule Property Map
    Basic content of log extraction rules.
    skipLineCount Number
    Number of skipped rows. Only valid when the log type is delimiter_log and the import type is tos.
    timeExtractRegex String
    Time extraction regular expression, used to extract the time value from the TimeKey field and parse it as the collection time
    timeZone String
    Time zone. Supports machine time zone (default) and custom time zone. Custom time zone supports GMT and UTC. GMT format: GMT+08:00. UTC format: Asia/Shanghai.

    ImportTaskTargetInfoExtractRuleExtractRule, ImportTaskTargetInfoExtractRuleExtractRuleArgs

    BeginRegex string
    Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample
    Delimiter string
    Delimiter. Only valid when LogType is delimiter_log.
    EnableNanosecond bool
    Enable nanoseconds.
    FilterKeyRegexes List<Volcengine.ImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex>
    Keys List<string>
    List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and all field names cannot be left blank
    LogRegex string
    Log regular expression
    LogTemplate Volcengine.ImportTaskTargetInfoExtractRuleExtractRuleLogTemplate
    Log template.
    Quote string
    Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
    TimeFormat string
    Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in TimeKey and TimeFormat. TimeKey and TimeFormat must be paired. For configuration details, see time format.
    TimeKey string
    Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat. TimeKey and TimeFormat must appear in pairs
    TimeSample string
    Time sample. Used to verify whether the entered time parsing format is correct
    UnMatchLogKey string
    When uploading logs that failed to parse, specify the key name for the failed logs. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
    UnMatchUpLoadSwitch bool
    Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse.
    BeginRegex string
    Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample
    Delimiter string
    Delimiter. Only valid when LogType is delimiter_log.
    EnableNanosecond bool
    Enable nanoseconds.
    FilterKeyRegexes []ImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex
    Keys []string
    List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and all field names cannot be left blank
    LogRegex string
    Log regular expression
    LogTemplate ImportTaskTargetInfoExtractRuleExtractRuleLogTemplate
    Log template.
    Quote string
    Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
    TimeFormat string
    Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in TimeKey and TimeFormat. TimeKey and TimeFormat must be paired. For configuration details, see time format.
    TimeKey string
    Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat. TimeKey and TimeFormat must appear in pairs
    TimeSample string
    Time sample. Used to verify whether the entered time parsing format is correct
    UnMatchLogKey string
    When uploading logs that failed to parse, specify the key name for the failed logs. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
    UnMatchUpLoadSwitch bool
    Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse.
    beginRegex String
    Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample
    delimiter String
    Delimiter. Only valid when LogType is delimiter_log.
    enableNanosecond Boolean
    Enable nanoseconds.
    filterKeyRegexes List<ImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex>
    keys List<String>
    List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and all field names cannot be left blank
    logRegex String
    Log regular expression
    logTemplate ImportTaskTargetInfoExtractRuleExtractRuleLogTemplate
    Log template.
    quote String
    Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
    timeFormat String
    Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in TimeKey and TimeFormat. TimeKey and TimeFormat must be paired. For configuration details, see time format.
    timeKey String
    Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat. TimeKey and TimeFormat must appear in pairs
    timeSample String
    Time sample. Used to verify whether the entered time parsing format is correct
    unMatchLogKey String
    When uploading logs that failed to parse, specify the key name for the failed logs. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
    unMatchUpLoadSwitch Boolean
    Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse.
    beginRegex string
    Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample
    delimiter string
    Delimiter. Only valid when LogType is delimiter_log.
    enableNanosecond boolean
    Enable nanoseconds.
    filterKeyRegexes ImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex[]
    keys string[]
    List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and all field names cannot be left blank
    logRegex string
    Log regular expression
    logTemplate ImportTaskTargetInfoExtractRuleExtractRuleLogTemplate
    Log template.
    quote string
    Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
    timeFormat string
    Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in TimeKey and TimeFormat. TimeKey and TimeFormat must be paired. For configuration details, see time format.
    timeKey string
    Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat. TimeKey and TimeFormat must appear in pairs
    timeSample string
    Time sample. Used to verify whether the entered time parsing format is correct
    unMatchLogKey string
    When uploading logs that failed to parse, specify the key name for the failed logs. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
    unMatchUpLoadSwitch boolean
    Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse.
    begin_regex str
    Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample
    delimiter str
    Delimiter. Only valid when LogType is delimiter_log.
    enable_nanosecond bool
    Enable nanoseconds.
    filter_key_regexes Sequence[ImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex]
    keys Sequence[str]
    List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and all field names cannot be left blank
    log_regex str
    Log regular expression
    log_template ImportTaskTargetInfoExtractRuleExtractRuleLogTemplate
    Log template.
    quote str
    Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
    time_format str
    Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in TimeKey and TimeFormat. TimeKey and TimeFormat must be paired. For configuration details, see time format.
    time_key str
    Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat. TimeKey and TimeFormat must appear in pairs
    time_sample str
    Time sample. Used to verify whether the entered time parsing format is correct
    un_match_log_key str
    When uploading logs that failed to parse, specify the key name for the failed logs. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
    un_match_up_load_switch bool
    Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse.
    beginRegex String
    Regular expression for identifying the first line of each log. The matched part is treated as the start of the log. When LogType is set to multiline_log, you must configure a log sample
    delimiter String
    Delimiter. Only valid when LogType is delimiter_log.
    enableNanosecond Boolean
    Enable nanoseconds.
    filterKeyRegexes List<Property Map>
    keys List<String>
    List of log field names (Key). Valid only when LogType is delimiter_log. Supports up to 100 field names. Duplicate field names are not allowed, and all field names cannot be left blank
    logRegex String
    Log regular expression
    logTemplate Property Map
    Log template.
    quote String
    Quotation mark. Content enclosed by the quotation mark will not be separated and will be parsed as a complete field. Only valid when LogType is delimiter_log.
    timeFormat String
    Parsing format for the time field. If you use a specified time field in the log as the log timestamp, you must fill in TimeKey and TimeFormat. TimeKey and TimeFormat must be paired. For configuration details, see time format.
    timeKey String
    Name of the log time field. If you use a specific time field in the log as the log timestamp, you must provide both TimeKey and TimeFormat. TimeKey and TimeFormat must appear in pairs
    timeSample String
    Time sample. Used to verify whether the entered time parsing format is correct
    unMatchLogKey String
    When uploading logs that failed to parse, specify the key name for the failed logs. UnMatchUpLoadSwitch=true and UnMatchLogKey must be used together.
    unMatchUpLoadSwitch Boolean
    Whether to upload logs that failed to parse. UnMatchUpLoadSwitch=true and UnMatchLogKey must be paired. true: Upload logs that failed to parse. false: Do not upload logs that failed to parse.
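
    TimeSample exists to verify that TimeFormat actually parses your timestamps. You can run the same check locally with Python's `datetime.strptime`, since directives like the `%Y-%m-%d %H:%M:%S,%f` format in the example at the top of this page follow strftime conventions. This is a local sketch for sanity-checking a format/sample pair, not part of the provider:

    ```python
    from datetime import datetime

    # TimeFormat / TimeSample pair modeled on the example at the top of this page.
    time_format = "%Y-%m-%d %H:%M:%S,%f"
    time_sample = "2024-05-01 12:30:45,123"

    # strptime raises ValueError if the format does not match the sample,
    # which is exactly the mismatch TimeSample is meant to catch.
    parsed = datetime.strptime(time_sample, time_format)
    print(parsed.isoformat())
    ```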

    ImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegex, ImportTaskTargetInfoExtractRuleExtractRuleFilterKeyRegexArgs

    Key string
    Key.
    Regex string
    Regular expression.
    Key string
    Key.
    Regex string
    Regular expression.
    key String
    Key.
    regex String
    Regular expression.
    key string
    Key.
    regex string
    Regular expression.
    key str
    Key.
    regex str
    Regular expression.
    key String
    Key.
    regex String
    Regular expression.

    ImportTaskTargetInfoExtractRuleExtractRuleLogTemplate, ImportTaskTargetInfoExtractRuleExtractRuleLogTemplateArgs

    Format string
    Format.
    Type string
    Type
    Format string
    Format.
    Type string
    Type
    format String
    Format.
    type String
    Type
    format string
    Format.
    type string
    Type
    format str
    Format.
    type str
    Type
    format String
    Format.
    type String
    Type

    ImportTaskTaskStatistics, ImportTaskTaskStatisticsArgs

    BytesTotal int
    Total resource bytes enumerated
    BytesTransferred int
    Bytes transferred.
    Failed int
    Number of resources failed to import.
    NotExist int
    Number of resources not found.
    Skipped int
    Number of resources skipped during import
    TaskStatus string
    Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
    Total int
    Total number of resources enumerated.
    Transferred int
    Number of records transferred.
    BytesTotal int
    Total resource bytes enumerated
    BytesTransferred int
    Bytes transferred.
    Failed int
    Number of resources failed to import.
    NotExist int
    Number of resources not found.
    Skipped int
    Number of resources skipped during import
    TaskStatus string
    Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
    Total int
    Total number of resources enumerated.
    Transferred int
    Number of records transferred.
    bytesTotal Integer
    Total resource bytes enumerated
    bytesTransferred Integer
    Bytes transferred.
    failed Integer
    Number of resources failed to import.
    notExist Integer
    Number of resources not found.
    skipped Integer
    Number of resources skipped during import
    taskStatus String
    Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
    total Integer
    Total number of resources enumerated.
    transferred Integer
    Number of records transferred.
    bytesTotal number
    Total resource bytes enumerated
    bytesTransferred number
    Bytes transferred.
    failed number
    Number of resources failed to import.
    notExist number
    Number of resources not found.
    skipped number
    Number of resources skipped during import
    taskStatus string
    Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
    total number
    Total number of resources enumerated.
    transferred number
    Number of records transferred.
    bytes_total int
    Total resource bytes enumerated
    bytes_transferred int
    Bytes transferred.
    failed int
    Number of resources failed to import.
    not_exist int
    Number of resources not found.
    skipped int
    Number of resources skipped during import
    task_status str
    Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
    total int
    Total number of resources enumerated.
    transferred int
    Number of records transferred.
    bytesTotal Number
    Total resource bytes enumerated
    bytesTransferred Number
    Bytes transferred.
    failed Number
    Number of resources failed to import.
    notExist Number
    Number of resources not found.
    skipped Number
    Number of resources skipped during import
    taskStatus String
    Status of the import task. Preparing: preparing for import. Importing: importing data. Success: import completed successfully. Failed: import failed. Stopped: import paused.
    total Number
    Total number of resources enumerated.
    transferred Number
    Number of records transferred.

    Import

    $ pulumi import volcenginecc:tls/importTask:ImportTask example "task_id"
    

    To learn more about importing existing cloud resources, see Importing resources.

    Package Details

    Repository
    volcenginecc volcengine/pulumi-volcenginecc
    License
    MPL-2.0
    Notes
    This Pulumi package is based on the volcenginecc Terraform Provider.