POST /prompt-templates/{identifier}

Retrieve a prompt template using either the prompt_name or prompt_id. Optionally, specify version (version number) or label (release label like “prod”) to retrieve a specific version. If not specified, the latest version is returned.

PromptLayer will try to read the model provider from the parameters you attached to the prompt template. You can optionally pass in a provider to override the one set in the Prompt Registry. This will return LLM-specific arguments that can be passed directly into your LLM client. To format the template with input variables, use input_variables.
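The request above can be sketched with Python's standard library. This builds (but does not send) the POST request; the base URL and the helper name are assumptions for illustration, and the API key is a placeholder:

```python
import json
import urllib.request

API_BASE = "https://api.promptlayer.com"  # assumed base URL; confirm in your account


def build_get_template_request(identifier, api_key, *, label=None, version=None,
                               provider=None, input_variables=None):
    """Build the POST /prompt-templates/{identifier} request without sending it."""
    body = {}
    if label is not None:
        body["label"] = label                      # release label, e.g. "prod"
    if version is not None:
        body["version"] = version                  # specific version number (> 0)
    if provider is not None:
        body["provider"] = provider                # "openai" or "anthropic"
    if input_variables is not None:
        body["input_variables"] = input_variables  # substituted into the template
    return urllib.request.Request(
        f"{API_BASE}/prompt-templates/{identifier}",
        data=json.dumps(body).encode(),
        headers={"X-API-KEY": api_key, "Content-Type": "application/json"},
        method="POST",
    )


req = build_get_template_request("welcome-email", "pl_your_api_key",
                                 label="prod",
                                 input_variables={"user_name": "Ada"})
# urllib.request.urlopen(req) would send it; the JSON response carries
# prompt_template, llm_kwargs, version, and the other fields described below.
```

Omitting both `version` and `label`, as the description notes, returns the latest version.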

Headers

X-API-KEY
string
required

Path Parameters

identifier
string
required

The identifier can be either the prompt name or the prompt id.

Body

application/json
version
integer | null
Required range: x > 0
workspace_id
integer | null
label
string | null
provider
enum<string> | null
Available options:
openai,
anthropic
input_variables
object | null
metadata_filters
object | null

Optional dictionary of key-value pairs used for A/B release labels.
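Putting the body fields together, a payload might look like the following. The values are invented for illustration; include only the fields you need:

```python
# Illustrative request body for POST /prompt-templates/{identifier}.
body = {
    "version": 3,                                     # specific version number, must be > 0
    "provider": "anthropic",                          # "openai" or "anthropic"; overrides the registry setting
    "input_variables": {"customer_name": "Ada"},      # substituted into the template
    "metadata_filters": {"experiment": "variant-b"},  # key-value pairs for A/B release labels
}
```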

Response

200 - application/json
id
integer
required
prompt_name
string
required
prompt_template
object
required
metadata
object | null
commit_message
string | null
llm_kwargs
object | null

If you specify provider in the request body, llm_kwargs is returned for that provider, and these kwargs can be passed directly to the provider's API.

version
integer
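A sketch of consuming the 200 response: the field names follow the schema above, but the values are invented, and the commented-out client call is an assumption about an OpenAI-shaped `llm_kwargs` payload, not real output:

```python
# Illustrative 200 response; values are invented for the example.
response = {
    "id": 42,
    "prompt_name": "welcome-email",
    "prompt_template": {"type": "chat", "messages": []},
    "metadata": None,
    "commit_message": "tighten greeting",
    "llm_kwargs": {
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": "Hi Ada"}],
    },
    "version": 7,
}

llm_kwargs = response.get("llm_kwargs")
if llm_kwargs is not None:
    # These kwargs are shaped for the provider you requested, so they can be
    # splatted straight into its client, e.g. (openai, not imported here):
    # completion = client.chat.completions.create(**llm_kwargs)
    pass
```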