We recommend new projects start with resources from the AWS provider.
AWS Cloud Control v1.37.0 published on Wednesday, Oct 15, 2025 by Pulumi
aws-native.bedrock.getApplicationInferenceProfile
Definition of AWS::Bedrock::ApplicationInferenceProfile Resource Type
Using getApplicationInferenceProfile
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
function getApplicationInferenceProfile(args: GetApplicationInferenceProfileArgs, opts?: InvokeOptions): Promise<GetApplicationInferenceProfileResult>
function getApplicationInferenceProfileOutput(args: GetApplicationInferenceProfileOutputArgs, opts?: InvokeOptions): Output<GetApplicationInferenceProfileResult>
def get_application_inference_profile(inference_profile_identifier: Optional[str] = None,
opts: Optional[InvokeOptions] = None) -> GetApplicationInferenceProfileResult
def get_application_inference_profile_output(inference_profile_identifier: Optional[pulumi.Input[str]] = None,
opts: Optional[InvokeOptions] = None) -> Output[GetApplicationInferenceProfileResult]
func LookupApplicationInferenceProfile(ctx *Context, args *LookupApplicationInferenceProfileArgs, opts ...InvokeOption) (*LookupApplicationInferenceProfileResult, error)
func LookupApplicationInferenceProfileOutput(ctx *Context, args *LookupApplicationInferenceProfileOutputArgs, opts ...InvokeOption) LookupApplicationInferenceProfileResultOutput
> Note: This function is named LookupApplicationInferenceProfile in the Go SDK.
public static class GetApplicationInferenceProfile
{
public static Task<GetApplicationInferenceProfileResult> InvokeAsync(GetApplicationInferenceProfileArgs args, InvokeOptions? opts = null)
public static Output<GetApplicationInferenceProfileResult> Invoke(GetApplicationInferenceProfileInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetApplicationInferenceProfileResult> getApplicationInferenceProfile(GetApplicationInferenceProfileArgs args, InvokeOptions options)
public static Output<GetApplicationInferenceProfileResult> getApplicationInferenceProfile(GetApplicationInferenceProfileArgs args, InvokeOptions options)
fn::invoke:
function: aws-native:bedrock:getApplicationInferenceProfile
arguments:
# arguments dictionary
The following arguments are supported:
- InferenceProfileIdentifier string - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- InferenceProfileIdentifier string - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- inferenceProfileIdentifier String - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- inferenceProfileIdentifier string - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- inference_profile_identifier str - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- inferenceProfileIdentifier String - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
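Because the single argument accepts either a system-defined inference profile ID or a full inference profile ARN, a caller may want to check which form it holds before invoking the function. A minimal sketch (the helper name and the sample ARN are illustrative, not part of the SDK):

```python
def classify_inference_profile_identifier(identifier: str) -> str:
    """Return "arn" if the identifier looks like an ARN, otherwise
    assume it is a system-defined inference profile ID."""
    if identifier.startswith("arn:"):
        return "arn"
    return "id"

# Both forms are accepted by getApplicationInferenceProfile:
print(classify_inference_profile_identifier(
    "arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/abc123"))  # arn
print(classify_inference_profile_identifier("abc123"))  # id
```

Either value can be passed as-is; the distinction only matters if your own code needs to construct or log the ARN separately.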
getApplicationInferenceProfile Result
The following output properties are available:
- CreatedAt string - Timestamp.
- InferenceProfileArn string - The Amazon Resource Name (ARN) of the inference profile.
- InferenceProfileId string - The unique identifier of the inference profile.
- InferenceProfileIdentifier string - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- Models List<Pulumi.AwsNative.Bedrock.Outputs.ApplicationInferenceProfileInferenceProfileModel> - List of model configurations.
- Status Pulumi.AwsNative.Bedrock.ApplicationInferenceProfileInferenceProfileStatus - The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- Tags List<Pulumi.AwsNative.Outputs.Tag> - List of tags.
- Type Pulumi.AwsNative.Bedrock.ApplicationInferenceProfileInferenceProfileType - The type of the inference profile. The following types are possible: SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles. APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it, and may route requests to one or multiple regions.
- UpdatedAt string - Timestamp.
- CreatedAt string - Timestamp.
- InferenceProfileArn string - The Amazon Resource Name (ARN) of the inference profile.
- InferenceProfileId string - The unique identifier of the inference profile.
- InferenceProfileIdentifier string - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- Models []ApplicationInferenceProfileInferenceProfileModel - List of model configurations.
- Status ApplicationInferenceProfileInferenceProfileStatus - The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- Tags []Tag - List of tags.
- Type ApplicationInferenceProfileInferenceProfileType - The type of the inference profile. The following types are possible: SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles. APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it, and may route requests to one or multiple regions.
- UpdatedAt string - Timestamp.
- createdAt String - Timestamp.
- inferenceProfileArn String - The Amazon Resource Name (ARN) of the inference profile.
- inferenceProfileId String - The unique identifier of the inference profile.
- inferenceProfileIdentifier String - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- models List<ApplicationInferenceProfileInferenceProfileModel> - List of model configurations.
- status ApplicationInferenceProfileInferenceProfileStatus - The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- tags List<Tag> - List of tags.
- type ApplicationInferenceProfileInferenceProfileType - The type of the inference profile. The following types are possible: SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles. APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it, and may route requests to one or multiple regions.
- updatedAt String - Timestamp.
- createdAt string - Timestamp.
- inferenceProfileArn string - The Amazon Resource Name (ARN) of the inference profile.
- inferenceProfileId string - The unique identifier of the inference profile.
- inferenceProfileIdentifier string - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- models ApplicationInferenceProfileInferenceProfileModel[] - List of model configurations.
- status ApplicationInferenceProfileInferenceProfileStatus - The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- tags Tag[] - List of tags.
- type ApplicationInferenceProfileInferenceProfileType - The type of the inference profile. The following types are possible: SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles. APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it, and may route requests to one or multiple regions.
- updatedAt string - Timestamp.
- created_at str - Timestamp.
- inference_profile_arn str - The Amazon Resource Name (ARN) of the inference profile.
- inference_profile_id str - The unique identifier of the inference profile.
- inference_profile_identifier str - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- models Sequence[ApplicationInferenceProfileInferenceProfileModel] - List of model configurations.
- status ApplicationInferenceProfileInferenceProfileStatus - The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- tags Sequence[root_Tag] - List of tags.
- type ApplicationInferenceProfileInferenceProfileType - The type of the inference profile. The following types are possible: SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles. APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it, and may route requests to one or multiple regions.
- updated_at str - Timestamp.
- createdAt String - Timestamp.
- inferenceProfileArn String - The Amazon Resource Name (ARN) of the inference profile.
- inferenceProfileId String - The unique identifier of the inference profile.
- inferenceProfileIdentifier String - Inference profile identifier. Supports both system-defined inference profile IDs and inference profile ARNs.
- models List<Property Map> - List of model configurations.
- status "ACTIVE" - The status of the inference profile. ACTIVE means that the inference profile is ready to be used.
- tags List<Property Map> - List of tags.
- type "APPLICATION" | "SYSTEM_DEFINED" - The type of the inference profile. The following types are possible: SYSTEM_DEFINED – The inference profile is defined by Amazon Bedrock. You can route inference requests across regions with these inference profiles. APPLICATION – The inference profile was created by a user. This type of inference profile can track metrics and costs when invoking the model in it, and may route requests to one or multiple regions.
- updatedAt String - Timestamp.
Supporting Types
ApplicationInferenceProfileInferenceProfileModel
- ModelArn string - ARN for foundation models in Bedrock. These models can be used as base models for model customization jobs.
- ModelArn string - ARN for foundation models in Bedrock. These models can be used as base models for model customization jobs.
- modelArn String - ARN for foundation models in Bedrock. These models can be used as base models for model customization jobs.
- modelArn string - ARN for foundation models in Bedrock. These models can be used as base models for model customization jobs.
- model_arn str - ARN for foundation models in Bedrock. These models can be used as base models for model customization jobs.
- modelArn String - ARN for foundation models in Bedrock. These models can be used as base models for model customization jobs.
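Bedrock foundation-model ARNs commonly take the shape `arn:aws:bedrock:<region>::foundation-model/<model-id>`, with an empty account field. A hedged parsing sketch under that format assumption (the helper name and sample ARN are ours, not the provider's):

```python
def parse_foundation_model_arn(arn: str) -> dict:
    """Split an ARN of the assumed form
    arn:aws:bedrock:<region>::foundation-model/<model-id>
    into its region and model ID. Raises ValueError on a mismatch."""
    parts = arn.split(":", 5)
    if len(parts) != 6 or parts[0] != "arn" or parts[2] != "bedrock":
        raise ValueError(f"not a Bedrock ARN: {arn}")
    prefix, _, model_id = parts[5].partition("/")
    if prefix != "foundation-model" or not model_id:
        raise ValueError(f"not a foundation-model ARN: {arn}")
    return {"region": parts[3], "model_id": model_id}

print(parse_foundation_model_arn(
    "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2"))
# {'region': 'us-east-1', 'model_id': 'anthropic.claude-v2'}
```

This is only useful for logging or display; the ModelArn value itself should be passed through unmodified.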
ApplicationInferenceProfileInferenceProfileStatus
ApplicationInferenceProfileInferenceProfileType
Tag
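The status and type supporting types are string-valued enums. Going only by the values listed on this page (ACTIVE, SYSTEM_DEFINED, APPLICATION — the real SDK enums may carry more members), they can be sketched as:

```python
from enum import Enum

class ApplicationInferenceProfileInferenceProfileStatus(str, Enum):
    """ACTIVE means the inference profile is ready to be used."""
    ACTIVE = "ACTIVE"

class ApplicationInferenceProfileInferenceProfileType(str, Enum):
    """SYSTEM_DEFINED profiles are defined by Amazon Bedrock;
    APPLICATION profiles are created by a user."""
    SYSTEM_DEFINED = "SYSTEM_DEFINED"
    APPLICATION = "APPLICATION"

# Raw strings from the invoke result round-trip into the enum:
print(ApplicationInferenceProfileInferenceProfileType("APPLICATION").name)  # APPLICATION
```

Using `str`-backed enums keeps comparisons against raw result strings (e.g. `type == "APPLICATION"`) working unchanged.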
Package Details
- Repository
- AWS Native pulumi/pulumi-aws-native
- License
- Apache-2.0