Google Cloud Native v0.28.0, Feb 2, 2023

google-native.notebooks/v1.getExecution

Gets details of an execution.

Using getExecution

Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.

function getExecution(args: GetExecutionArgs, opts?: InvokeOptions): Promise<GetExecutionResult>
function getExecutionOutput(args: GetExecutionOutputArgs, opts?: InvokeOptions): Output<GetExecutionResult>
def get_execution(execution_id: Optional[str] = None,
                  location: Optional[str] = None,
                  project: Optional[str] = None,
                  opts: Optional[InvokeOptions] = None) -> GetExecutionResult
def get_execution_output(execution_id: Optional[pulumi.Input[str]] = None,
                  location: Optional[pulumi.Input[str]] = None,
                  project: Optional[pulumi.Input[str]] = None,
                  opts: Optional[InvokeOptions] = None) -> Output[GetExecutionResult]
func LookupExecution(ctx *Context, args *LookupExecutionArgs, opts ...InvokeOption) (*LookupExecutionResult, error)
func LookupExecutionOutput(ctx *Context, args *LookupExecutionOutputArgs, opts ...InvokeOption) LookupExecutionResultOutput

> Note: This function is named LookupExecution in the Go SDK.

public static class GetExecution 
{
    public static Task<GetExecutionResult> InvokeAsync(GetExecutionArgs args, InvokeOptions? opts = null)
    public static Output<GetExecutionResult> Invoke(GetExecutionInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetExecutionResult> getExecution(GetExecutionArgs args, InvokeOptions options)
// Output-based functions aren't available in Java yet
fn::invoke:
  function: google-native:notebooks/v1:getExecution
  arguments:
    # arguments dictionary
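For example, the YAML form above can be invoked with concrete arguments; the values below are hypothetical placeholders, not real resource IDs:

```yaml
# Look up a single notebook execution (all three values are illustrative).
fn::invoke:
  function: google-native:notebooks/v1:getExecution
  arguments:
    executionId: my-execution   # hypothetical execution ID
    location: us-central1       # region hosting the execution
    project: my-project         # hypothetical project ID
```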

The following arguments are supported:

ExecutionId string
Location string
Project string
ExecutionId string
Location string
Project string
executionId String
location String
project String
executionId string
location string
project string
executionId String
location String
project String

getExecution Result

The following output properties are available:

CreateTime string

Time the Execution was instantiated.

Description string

A brief description of this execution.

DisplayName string

Name used for UI purposes. Name can only contain alphanumeric characters and underscores '_'.

ExecutionTemplate Pulumi.GoogleNative.Notebooks.V1.Outputs.ExecutionTemplateResponse

Execution metadata, including name, hardware spec, region, labels, etc.

JobUri string

The URI of the external job used to execute the notebook.

Name string

The resource name of the execution. Format: projects/{project_id}/locations/{location}/executions/{execution_id}

OutputNotebookFile string

Output notebook file generated by this execution.

State string

State of the underlying AI Platform job.

UpdateTime string

Time the Execution was last updated.

CreateTime string

Time the Execution was instantiated.

Description string

A brief description of this execution.

DisplayName string

Name used for UI purposes. Name can only contain alphanumeric characters and underscores '_'.

ExecutionTemplate ExecutionTemplateResponse

Execution metadata, including name, hardware spec, region, labels, etc.

JobUri string

The URI of the external job used to execute the notebook.

Name string

The resource name of the execution. Format: projects/{project_id}/locations/{location}/executions/{execution_id}

OutputNotebookFile string

Output notebook file generated by this execution.

State string

State of the underlying AI Platform job.

UpdateTime string

Time the Execution was last updated.

createTime String

Time the Execution was instantiated.

description String

A brief description of this execution.

displayName String

Name used for UI purposes. Name can only contain alphanumeric characters and underscores '_'.

executionTemplate ExecutionTemplateResponse

Execution metadata, including name, hardware spec, region, labels, etc.

jobUri String

The URI of the external job used to execute the notebook.

name String

The resource name of the execution. Format: projects/{project_id}/locations/{location}/executions/{execution_id}

outputNotebookFile String

Output notebook file generated by this execution.

state String

State of the underlying AI Platform job.

updateTime String

Time the Execution was last updated.

createTime string

Time the Execution was instantiated.

description string

A brief description of this execution.

displayName string

Name used for UI purposes. Name can only contain alphanumeric characters and underscores '_'.

executionTemplate ExecutionTemplateResponse

Execution metadata, including name, hardware spec, region, labels, etc.

jobUri string

The URI of the external job used to execute the notebook.

name string

The resource name of the execution. Format: projects/{project_id}/locations/{location}/executions/{execution_id}

outputNotebookFile string

Output notebook file generated by this execution.

state string

State of the underlying AI Platform job.

updateTime string

Time the Execution was last updated.

create_time str

Time the Execution was instantiated.

description str

A brief description of this execution.

display_name str

Name used for UI purposes. Name can only contain alphanumeric characters and underscores '_'.

execution_template ExecutionTemplateResponse

Execution metadata, including name, hardware spec, region, labels, etc.

job_uri str

The URI of the external job used to execute the notebook.

name str

The resource name of the execution. Format: projects/{project_id}/locations/{location}/executions/{execution_id}

output_notebook_file str

Output notebook file generated by this execution.

state str

State of the underlying AI Platform job.

update_time str

Time the Execution was last updated.

createTime String

Time the Execution was instantiated.

description String

A brief description of this execution.

displayName String

Name used for UI purposes. Name can only contain alphanumeric characters and underscores '_'.

executionTemplate Property Map

Execution metadata, including name, hardware spec, region, labels, etc.

jobUri String

The URI of the external job used to execute the notebook.

name String

The resource name of the execution. Format: projects/{project_id}/locations/{location}/executions/{execution_id}

outputNotebookFile String

Output notebook file generated by this execution.

state String

State of the underlying AI Platform job.

updateTime String

Time the Execution was last updated.
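The `name` output property follows the documented format exactly, so it can be assembled mechanically from the three lookup arguments. A minimal sketch (a hypothetical helper, not part of the SDK):

```python
def execution_name(project_id: str, location: str, execution_id: str) -> str:
    """Build the documented resource name:
    projects/{project_id}/locations/{location}/executions/{execution_id}
    """
    return f"projects/{project_id}/locations/{location}/executions/{execution_id}"

# Example (hypothetical IDs):
# execution_name("my-proj", "us-central1", "exec-1")
```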

Supporting Types

DataprocParametersResponse

Cluster string

URI for cluster used to run Dataproc execution. Format: projects/{PROJECT_ID}/regions/{REGION}/clusters/{CLUSTER_NAME}

Cluster string

URI for cluster used to run Dataproc execution. Format: projects/{PROJECT_ID}/regions/{REGION}/clusters/{CLUSTER_NAME}

cluster String

URI for cluster used to run Dataproc execution. Format: projects/{PROJECT_ID}/regions/{REGION}/clusters/{CLUSTER_NAME}

cluster string

URI for cluster used to run Dataproc execution. Format: projects/{PROJECT_ID}/regions/{REGION}/clusters/{CLUSTER_NAME}

cluster str

URI for cluster used to run Dataproc execution. Format: projects/{PROJECT_ID}/regions/{REGION}/clusters/{CLUSTER_NAME}

cluster String

URI for cluster used to run Dataproc execution. Format: projects/{PROJECT_ID}/regions/{REGION}/clusters/{CLUSTER_NAME}

ExecutionTemplateResponse

AcceleratorConfig Pulumi.GoogleNative.Notebooks.V1.Inputs.SchedulerAcceleratorConfigResponse

Configuration (count and accelerator type) for hardware running notebook execution.

ContainerImageUri string

Container image URI to a DLVM. Example: 'gcr.io/deeplearning-platform-release/base-cu100'. More examples can be found at https://cloud.google.com/ai-platform/deep-learning-containers/docs/choosing-container

DataprocParameters Pulumi.GoogleNative.Notebooks.V1.Inputs.DataprocParametersResponse

Parameters used in Dataproc JobType executions.

InputNotebookFile string

Path to the notebook file to execute. Must be in a Google Cloud Storage bucket. Format: gs://{bucket_name}/{folder}/{notebook_file_name} Ex: gs://notebook_user/scheduled_notebooks/sentiment_notebook.ipynb

JobType string

The type of Job to be used on this execution.

KernelSpec string

Name of the kernel spec to use. This must be specified if the kernel spec name on the execution target does not match the name in the input notebook file.

Labels Dictionary<string, string>

Labels for execution. If the execution is scheduled, it will include the 'nbs-scheduled' label; otherwise it is an immediate execution and will include the 'nbs-immediate' label. Use these labels to efficiently index between the various types of executions.

MasterType string

Specifies the type of virtual machine to use for your training job's master worker. You must specify this field when scaleTier is set to CUSTOM. You can use certain Compute Engine machine types directly in this field. The following types are supported: - n1-standard-4 - n1-standard-8 - n1-standard-16 - n1-standard-32 - n1-standard-64 - n1-standard-96 - n1-highmem-2 - n1-highmem-4 - n1-highmem-8 - n1-highmem-16 - n1-highmem-32 - n1-highmem-64 - n1-highmem-96 - n1-highcpu-16 - n1-highcpu-32 - n1-highcpu-64 - n1-highcpu-96 Alternatively, you can use the following legacy machine types: - standard - large_model - complex_model_s - complex_model_m - complex_model_l - standard_gpu - complex_model_m_gpu - complex_model_l_gpu - standard_p100 - complex_model_m_p100 - standard_v100 - large_model_v100 - complex_model_m_v100 - complex_model_l_v100 Finally, if you want to use a TPU for training, specify cloud_tpu in this field. Learn more about the special configuration options for training with TPU.

OutputNotebookFolder string

Path to the notebook folder to write to. Must be in a Google Cloud Storage bucket path. Format: gs://{bucket_name}/{folder} Ex: gs://notebook_user/scheduled_notebooks

Parameters string

Parameters used within the 'input_notebook_file' notebook.

ParamsYamlFile string

Parameters to be overridden in the notebook during execution. See https://papermill.readthedocs.io/en/latest/usage-parameterize.html for how to specify parameters in the input notebook and pass them here in a YAML file. Ex: gs://notebook_user/scheduled_notebooks/sentiment_notebook_params.yaml

ScaleTier string

Scale tier of the hardware used for notebook execution. Deprecated: this field will be discontinued; currently only CUSTOM is supported.

Deprecated:

Required. Scale tier of the hardware used for notebook execution. This field will be discontinued; currently only CUSTOM is supported.

ServiceAccount string

The email address of a service account to use when running the execution. You must have the iam.serviceAccounts.actAs permission for the specified service account.

Tensorboard string

The name of a Vertex AI [Tensorboard] resource to which this execution will upload Tensorboard logs. Format: projects/{project}/locations/{location}/tensorboards/{tensorboard}

VertexAiParameters Pulumi.GoogleNative.Notebooks.V1.Inputs.VertexAIParametersResponse

Parameters used in Vertex AI JobType executions.

AcceleratorConfig SchedulerAcceleratorConfigResponse

Configuration (count and accelerator type) for hardware running notebook execution.

ContainerImageUri string

Container image URI to a DLVM. Example: 'gcr.io/deeplearning-platform-release/base-cu100'. More examples can be found at https://cloud.google.com/ai-platform/deep-learning-containers/docs/choosing-container

DataprocParameters DataprocParametersResponse

Parameters used in Dataproc JobType executions.

InputNotebookFile string

Path to the notebook file to execute. Must be in a Google Cloud Storage bucket. Format: gs://{bucket_name}/{folder}/{notebook_file_name} Ex: gs://notebook_user/scheduled_notebooks/sentiment_notebook.ipynb

JobType string

The type of Job to be used on this execution.

KernelSpec string

Name of the kernel spec to use. This must be specified if the kernel spec name on the execution target does not match the name in the input notebook file.

Labels map[string]string

Labels for execution. If the execution is scheduled, it will include the 'nbs-scheduled' label; otherwise it is an immediate execution and will include the 'nbs-immediate' label. Use these labels to efficiently index between the various types of executions.

MasterType string

Specifies the type of virtual machine to use for your training job's master worker. You must specify this field when scaleTier is set to CUSTOM. You can use certain Compute Engine machine types directly in this field. The following types are supported: - n1-standard-4 - n1-standard-8 - n1-standard-16 - n1-standard-32 - n1-standard-64 - n1-standard-96 - n1-highmem-2 - n1-highmem-4 - n1-highmem-8 - n1-highmem-16 - n1-highmem-32 - n1-highmem-64 - n1-highmem-96 - n1-highcpu-16 - n1-highcpu-32 - n1-highcpu-64 - n1-highcpu-96 Alternatively, you can use the following legacy machine types: - standard - large_model - complex_model_s - complex_model_m - complex_model_l - standard_gpu - complex_model_m_gpu - complex_model_l_gpu - standard_p100 - complex_model_m_p100 - standard_v100 - large_model_v100 - complex_model_m_v100 - complex_model_l_v100 Finally, if you want to use a TPU for training, specify cloud_tpu in this field. Learn more about the special configuration options for training with TPU.

OutputNotebookFolder string

Path to the notebook folder to write to. Must be in a Google Cloud Storage bucket path. Format: gs://{bucket_name}/{folder} Ex: gs://notebook_user/scheduled_notebooks

Parameters string

Parameters used within the 'input_notebook_file' notebook.

ParamsYamlFile string

Parameters to be overridden in the notebook during execution. See https://papermill.readthedocs.io/en/latest/usage-parameterize.html for how to specify parameters in the input notebook and pass them here in a YAML file. Ex: gs://notebook_user/scheduled_notebooks/sentiment_notebook_params.yaml

ScaleTier string

Scale tier of the hardware used for notebook execution. Deprecated: this field will be discontinued; currently only CUSTOM is supported.

Deprecated:

Required. Scale tier of the hardware used for notebook execution. This field will be discontinued; currently only CUSTOM is supported.

ServiceAccount string

The email address of a service account to use when running the execution. You must have the iam.serviceAccounts.actAs permission for the specified service account.

Tensorboard string

The name of a Vertex AI [Tensorboard] resource to which this execution will upload Tensorboard logs. Format: projects/{project}/locations/{location}/tensorboards/{tensorboard}

VertexAiParameters VertexAIParametersResponse

Parameters used in Vertex AI JobType executions.

acceleratorConfig SchedulerAcceleratorConfigResponse

Configuration (count and accelerator type) for hardware running notebook execution.

containerImageUri String

Container image URI to a DLVM. Example: 'gcr.io/deeplearning-platform-release/base-cu100'. More examples can be found at https://cloud.google.com/ai-platform/deep-learning-containers/docs/choosing-container

dataprocParameters DataprocParametersResponse

Parameters used in Dataproc JobType executions.

inputNotebookFile String

Path to the notebook file to execute. Must be in a Google Cloud Storage bucket. Format: gs://{bucket_name}/{folder}/{notebook_file_name} Ex: gs://notebook_user/scheduled_notebooks/sentiment_notebook.ipynb

jobType String

The type of Job to be used on this execution.

kernelSpec String

Name of the kernel spec to use. This must be specified if the kernel spec name on the execution target does not match the name in the input notebook file.

labels Map<String,String>

Labels for execution. If the execution is scheduled, it will include the 'nbs-scheduled' label; otherwise it is an immediate execution and will include the 'nbs-immediate' label. Use these labels to efficiently index between the various types of executions.

masterType String

Specifies the type of virtual machine to use for your training job's master worker. You must specify this field when scaleTier is set to CUSTOM. You can use certain Compute Engine machine types directly in this field. The following types are supported: - n1-standard-4 - n1-standard-8 - n1-standard-16 - n1-standard-32 - n1-standard-64 - n1-standard-96 - n1-highmem-2 - n1-highmem-4 - n1-highmem-8 - n1-highmem-16 - n1-highmem-32 - n1-highmem-64 - n1-highmem-96 - n1-highcpu-16 - n1-highcpu-32 - n1-highcpu-64 - n1-highcpu-96 Alternatively, you can use the following legacy machine types: - standard - large_model - complex_model_s - complex_model_m - complex_model_l - standard_gpu - complex_model_m_gpu - complex_model_l_gpu - standard_p100 - complex_model_m_p100 - standard_v100 - large_model_v100 - complex_model_m_v100 - complex_model_l_v100 Finally, if you want to use a TPU for training, specify cloud_tpu in this field. Learn more about the special configuration options for training with TPU.

outputNotebookFolder String

Path to the notebook folder to write to. Must be in a Google Cloud Storage bucket path. Format: gs://{bucket_name}/{folder} Ex: gs://notebook_user/scheduled_notebooks

parameters String

Parameters used within the 'input_notebook_file' notebook.

paramsYamlFile String

Parameters to be overridden in the notebook during execution. See https://papermill.readthedocs.io/en/latest/usage-parameterize.html for how to specify parameters in the input notebook and pass them here in a YAML file. Ex: gs://notebook_user/scheduled_notebooks/sentiment_notebook_params.yaml

scaleTier String

Scale tier of the hardware used for notebook execution. Deprecated: this field will be discontinued; currently only CUSTOM is supported.

Deprecated:

Required. Scale tier of the hardware used for notebook execution. This field will be discontinued; currently only CUSTOM is supported.

serviceAccount String

The email address of a service account to use when running the execution. You must have the iam.serviceAccounts.actAs permission for the specified service account.

tensorboard String

The name of a Vertex AI [Tensorboard] resource to which this execution will upload Tensorboard logs. Format: projects/{project}/locations/{location}/tensorboards/{tensorboard}

vertexAiParameters VertexAIParametersResponse

Parameters used in Vertex AI JobType executions.

acceleratorConfig SchedulerAcceleratorConfigResponse

Configuration (count and accelerator type) for hardware running notebook execution.

containerImageUri string

Container image URI to a DLVM. Example: 'gcr.io/deeplearning-platform-release/base-cu100'. More examples can be found at https://cloud.google.com/ai-platform/deep-learning-containers/docs/choosing-container

dataprocParameters DataprocParametersResponse

Parameters used in Dataproc JobType executions.

inputNotebookFile string

Path to the notebook file to execute. Must be in a Google Cloud Storage bucket. Format: gs://{bucket_name}/{folder}/{notebook_file_name} Ex: gs://notebook_user/scheduled_notebooks/sentiment_notebook.ipynb

jobType string

The type of Job to be used on this execution.

kernelSpec string

Name of the kernel spec to use. This must be specified if the kernel spec name on the execution target does not match the name in the input notebook file.

labels {[key: string]: string}

Labels for execution. If the execution is scheduled, it will include the 'nbs-scheduled' label; otherwise it is an immediate execution and will include the 'nbs-immediate' label. Use these labels to efficiently index between the various types of executions.

masterType string

Specifies the type of virtual machine to use for your training job's master worker. You must specify this field when scaleTier is set to CUSTOM. You can use certain Compute Engine machine types directly in this field. The following types are supported: - n1-standard-4 - n1-standard-8 - n1-standard-16 - n1-standard-32 - n1-standard-64 - n1-standard-96 - n1-highmem-2 - n1-highmem-4 - n1-highmem-8 - n1-highmem-16 - n1-highmem-32 - n1-highmem-64 - n1-highmem-96 - n1-highcpu-16 - n1-highcpu-32 - n1-highcpu-64 - n1-highcpu-96 Alternatively, you can use the following legacy machine types: - standard - large_model - complex_model_s - complex_model_m - complex_model_l - standard_gpu - complex_model_m_gpu - complex_model_l_gpu - standard_p100 - complex_model_m_p100 - standard_v100 - large_model_v100 - complex_model_m_v100 - complex_model_l_v100 Finally, if you want to use a TPU for training, specify cloud_tpu in this field. Learn more about the special configuration options for training with TPU.

outputNotebookFolder string

Path to the notebook folder to write to. Must be in a Google Cloud Storage bucket path. Format: gs://{bucket_name}/{folder} Ex: gs://notebook_user/scheduled_notebooks

parameters string

Parameters used within the 'input_notebook_file' notebook.

paramsYamlFile string

Parameters to be overridden in the notebook during execution. See https://papermill.readthedocs.io/en/latest/usage-parameterize.html for how to specify parameters in the input notebook and pass them here in a YAML file. Ex: gs://notebook_user/scheduled_notebooks/sentiment_notebook_params.yaml

scaleTier string

Scale tier of the hardware used for notebook execution. Deprecated: this field will be discontinued; currently only CUSTOM is supported.

Deprecated:

Required. Scale tier of the hardware used for notebook execution. This field will be discontinued; currently only CUSTOM is supported.

serviceAccount string

The email address of a service account to use when running the execution. You must have the iam.serviceAccounts.actAs permission for the specified service account.

tensorboard string

The name of a Vertex AI [Tensorboard] resource to which this execution will upload Tensorboard logs. Format: projects/{project}/locations/{location}/tensorboards/{tensorboard}

vertexAiParameters VertexAIParametersResponse

Parameters used in Vertex AI JobType executions.

accelerator_config SchedulerAcceleratorConfigResponse

Configuration (count and accelerator type) for hardware running notebook execution.

container_image_uri str

Container image URI to a DLVM. Example: 'gcr.io/deeplearning-platform-release/base-cu100'. More examples can be found at https://cloud.google.com/ai-platform/deep-learning-containers/docs/choosing-container

dataproc_parameters DataprocParametersResponse

Parameters used in Dataproc JobType executions.

input_notebook_file str

Path to the notebook file to execute. Must be in a Google Cloud Storage bucket. Format: gs://{bucket_name}/{folder}/{notebook_file_name} Ex: gs://notebook_user/scheduled_notebooks/sentiment_notebook.ipynb

job_type str

The type of Job to be used on this execution.

kernel_spec str

Name of the kernel spec to use. This must be specified if the kernel spec name on the execution target does not match the name in the input notebook file.

labels Mapping[str, str]

Labels for execution. If the execution is scheduled, it will include the 'nbs-scheduled' label; otherwise it is an immediate execution and will include the 'nbs-immediate' label. Use these labels to efficiently index between the various types of executions.

master_type str

Specifies the type of virtual machine to use for your training job's master worker. You must specify this field when scaleTier is set to CUSTOM. You can use certain Compute Engine machine types directly in this field. The following types are supported: - n1-standard-4 - n1-standard-8 - n1-standard-16 - n1-standard-32 - n1-standard-64 - n1-standard-96 - n1-highmem-2 - n1-highmem-4 - n1-highmem-8 - n1-highmem-16 - n1-highmem-32 - n1-highmem-64 - n1-highmem-96 - n1-highcpu-16 - n1-highcpu-32 - n1-highcpu-64 - n1-highcpu-96 Alternatively, you can use the following legacy machine types: - standard - large_model - complex_model_s - complex_model_m - complex_model_l - standard_gpu - complex_model_m_gpu - complex_model_l_gpu - standard_p100 - complex_model_m_p100 - standard_v100 - large_model_v100 - complex_model_m_v100 - complex_model_l_v100 Finally, if you want to use a TPU for training, specify cloud_tpu in this field. Learn more about the special configuration options for training with TPU.

output_notebook_folder str

Path to the notebook folder to write to. Must be in a Google Cloud Storage bucket path. Format: gs://{bucket_name}/{folder} Ex: gs://notebook_user/scheduled_notebooks

parameters str

Parameters used within the 'input_notebook_file' notebook.

params_yaml_file str

Parameters to be overridden in the notebook during execution. See https://papermill.readthedocs.io/en/latest/usage-parameterize.html for how to specify parameters in the input notebook and pass them here in a YAML file. Ex: gs://notebook_user/scheduled_notebooks/sentiment_notebook_params.yaml

scale_tier str

Scale tier of the hardware used for notebook execution. Deprecated: this field will be discontinued; currently only CUSTOM is supported.

Deprecated:

Required. Scale tier of the hardware used for notebook execution. This field will be discontinued; currently only CUSTOM is supported.

service_account str

The email address of a service account to use when running the execution. You must have the iam.serviceAccounts.actAs permission for the specified service account.

tensorboard str

The name of a Vertex AI [Tensorboard] resource to which this execution will upload Tensorboard logs. Format: projects/{project}/locations/{location}/tensorboards/{tensorboard}

vertex_ai_parameters VertexAIParametersResponse

Parameters used in Vertex AI JobType executions.

acceleratorConfig Property Map

Configuration (count and accelerator type) for hardware running notebook execution.

containerImageUri String

Container image URI to a DLVM. Example: 'gcr.io/deeplearning-platform-release/base-cu100'. More examples can be found at https://cloud.google.com/ai-platform/deep-learning-containers/docs/choosing-container

dataprocParameters Property Map

Parameters used in Dataproc JobType executions.

inputNotebookFile String

Path to the notebook file to execute. Must be in a Google Cloud Storage bucket. Format: gs://{bucket_name}/{folder}/{notebook_file_name} Ex: gs://notebook_user/scheduled_notebooks/sentiment_notebook.ipynb

jobType String

The type of Job to be used on this execution.

kernelSpec String

Name of the kernel spec to use. This must be specified if the kernel spec name on the execution target does not match the name in the input notebook file.

labels Map<String>

Labels for execution. If the execution is scheduled, it will include the 'nbs-scheduled' label; otherwise it is an immediate execution and will include the 'nbs-immediate' label. Use these labels to efficiently index between the various types of executions.

masterType String

Specifies the type of virtual machine to use for your training job's master worker. You must specify this field when scaleTier is set to CUSTOM. You can use certain Compute Engine machine types directly in this field. The following types are supported: - n1-standard-4 - n1-standard-8 - n1-standard-16 - n1-standard-32 - n1-standard-64 - n1-standard-96 - n1-highmem-2 - n1-highmem-4 - n1-highmem-8 - n1-highmem-16 - n1-highmem-32 - n1-highmem-64 - n1-highmem-96 - n1-highcpu-16 - n1-highcpu-32 - n1-highcpu-64 - n1-highcpu-96 Alternatively, you can use the following legacy machine types: - standard - large_model - complex_model_s - complex_model_m - complex_model_l - standard_gpu - complex_model_m_gpu - complex_model_l_gpu - standard_p100 - complex_model_m_p100 - standard_v100 - large_model_v100 - complex_model_m_v100 - complex_model_l_v100 Finally, if you want to use a TPU for training, specify cloud_tpu in this field. Learn more about the special configuration options for training with TPU.

outputNotebookFolder String

Path to the notebook folder to write to. Must be in a Google Cloud Storage bucket path. Format: gs://{bucket_name}/{folder} Ex: gs://notebook_user/scheduled_notebooks

parameters String

Parameters used within the 'input_notebook_file' notebook.

paramsYamlFile String

Parameters to be overridden in the notebook during execution. See https://papermill.readthedocs.io/en/latest/usage-parameterize.html for how to specify parameters in the input notebook and pass them here in a YAML file. Ex: gs://notebook_user/scheduled_notebooks/sentiment_notebook_params.yaml

scaleTier String

Scale tier of the hardware used for notebook execution. Deprecated: this field will be discontinued; currently only CUSTOM is supported.

Deprecated:

Required. Scale tier of the hardware used for notebook execution. This field will be discontinued; currently only CUSTOM is supported.

serviceAccount String

The email address of a service account to use when running the execution. You must have the iam.serviceAccounts.actAs permission for the specified service account.

tensorboard String

The name of a Vertex AI [Tensorboard] resource to which this execution will upload Tensorboard logs. Format: projects/{project}/locations/{location}/tensorboards/{tensorboard}

vertexAiParameters Property Map

Parameters used in Vertex AI JobType executions.
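Two constraints in the fields above interact: `masterType` must be specified when the scale tier is CUSTOM, and CUSTOM is currently the only supported tier. A hedged validation sketch (the function name is illustrative, not an SDK API):

```python
from typing import Optional


def validate_template(scale_tier: str, master_type: Optional[str]) -> None:
    # Per the field docs: CUSTOM is currently the only supported scale tier,
    # and masterType is required whenever the tier is CUSTOM.
    if scale_tier != "CUSTOM":
        raise ValueError("only the CUSTOM scale tier is currently supported")
    if not master_type:
        raise ValueError("masterType must be set when scaleTier is CUSTOM")
```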

SchedulerAcceleratorConfigResponse

CoreCount string

Count of cores of this accelerator.

Type string

Type of this accelerator.

CoreCount string

Count of cores of this accelerator.

Type string

Type of this accelerator.

coreCount String

Count of cores of this accelerator.

type String

Type of this accelerator.

coreCount string

Count of cores of this accelerator.

type string

Type of this accelerator.

core_count str

Count of cores of this accelerator.

type str

Type of this accelerator.

coreCount String

Count of cores of this accelerator.

type String

Type of this accelerator.

VertexAIParametersResponse

Env Dictionary<string, string>

Environment variables. At most 100 environment variables can be specified, and each must be unique. Example: GCP_BUCKET=gs://my-bucket/samples/

Network string

The full name of the Compute Engine network to which the Job should be peered, for example projects/12345/global/networks/myVPC. The format is projects/{project}/global/networks/{network}, where {project} is a project number (such as 12345) and {network} is a network name. Private services access must already be configured for the network. If left unspecified, the job is not peered with any network.

Env map[string]string

Environment variables. At most 100 environment variables can be specified, and each must be unique. Example: GCP_BUCKET=gs://my-bucket/samples/

Network string

The full name of the Compute Engine network to which the Job should be peered, for example projects/12345/global/networks/myVPC. The format is projects/{project}/global/networks/{network}, where {project} is a project number (such as 12345) and {network} is a network name. Private services access must already be configured for the network. If left unspecified, the job is not peered with any network.

env Map<String,String>

Environment variables. At most 100 environment variables can be specified, and each must be unique. Example: GCP_BUCKET=gs://my-bucket/samples/

network String

The full name of the Compute Engine network to which the Job should be peered, for example projects/12345/global/networks/myVPC. The format is projects/{project}/global/networks/{network}, where {project} is a project number (such as 12345) and {network} is a network name. Private services access must already be configured for the network. If left unspecified, the job is not peered with any network.

env {[key: string]: string}

Environment variables. At most 100 environment variables can be specified, and each must be unique. Example: GCP_BUCKET=gs://my-bucket/samples/

network string

The full name of the Compute Engine network to which the Job should be peered, for example projects/12345/global/networks/myVPC. The format is projects/{project}/global/networks/{network}, where {project} is a project number (such as 12345) and {network} is a network name. Private services access must already be configured for the network. If left unspecified, the job is not peered with any network.

env Mapping[str, str]

Environment variables. At most 100 environment variables can be specified, and each must be unique. Example: GCP_BUCKET=gs://my-bucket/samples/

network str

The full name of the Compute Engine network to which the Job should be peered, for example projects/12345/global/networks/myVPC. The format is projects/{project}/global/networks/{network}, where {project} is a project number (such as 12345) and {network} is a network name. Private services access must already be configured for the network. If left unspecified, the job is not peered with any network.

env Map<String>

Environment variables. At most 100 environment variables can be specified, and each must be unique. Example: GCP_BUCKET=gs://my-bucket/samples/

network String

The full name of the Compute Engine network to which the Job should be peered, for example projects/12345/global/networks/myVPC. The format is projects/{project}/global/networks/{network}, where {project} is a project number (such as 12345) and {network} is a network name. Private services access must already be configured for the network. If left unspecified, the job is not peered with any network.
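The `env` field caps Vertex AI executions at 100 environment variables. A small client-side check can catch this before the API rejects the request; this is a hypothetical helper, not part of the SDK:

```python
def validate_env(env: dict) -> None:
    # The docs allow at most 100 environment variables per execution;
    # using a dict makes the uniqueness requirement hold by construction.
    if len(env) > 100:
        raise ValueError(f"too many environment variables: {len(env)} > 100")
```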

Package Details

Repository
Google Cloud Native pulumi/pulumi-google-native
License
Apache-2.0