POST /prompt-templates/{prompt_name}

Retrieve a prompt template by its prompt_name. Optionally, specify version (a version number) or label (a release label like “prod”) to retrieve a specific version. If neither is specified, the latest version is returned.

PromptLayer will try to read the model provider from the parameters attached to the prompt template. You can optionally pass a provider to override the one set in the Prompt Registry; the response will then include LLM-specific arguments (llm_kwargs) that can be passed directly into your LLM client. To format the template with input variables, use input_variables.
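As a sketch of a basic call, assuming the standard PromptLayer API base URL (https://api.promptlayer.com) and a hypothetical template named welcome-email:

```python
import requests

API_KEY = "pl_..."  # your PromptLayer API key
prompt_name = "welcome-email"  # hypothetical template name

# Fetch the version released under the "prod" label,
# formatted with the given input variables.
response = requests.post(
    f"https://api.promptlayer.com/prompt-templates/{prompt_name}",
    headers={"X-API-KEY": API_KEY},
    json={
        "label": "prod",
        "input_variables": {"user_name": "Ada"},  # hypothetical variable
    },
)
response.raise_for_status()
template = response.json()
print(template["version"], template["prompt_template"])
```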

Headers

X-API-KEY
string
required

Path Parameters

prompt_name
string
required

Body

application/json
version
integer | null
workspace_id
integer | null
label
string | null
provider
enum<string> | null
Available options: openai, anthropic
input_variables
object | null
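Putting the body fields together, a sketch of a request body that pins a specific version and overrides the provider (all values here are hypothetical):

```python
# Hypothetical request body: pin version 3 and override the provider.
# Use "label" instead of "version" to target a release label like "prod".
body = {
    "version": 3,
    "provider": "anthropic",
    "input_variables": {"user_name": "Ada"},
}
```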

Response

200 - application/json
id
integer
required
prompt_name
string
required
prompt_template
object (Completion Prompt Template)
required

metadata
object | null
commit_message
string | null
llm_kwargs
object | null

If you specify provider in the body, llm_kwargs is returned for that specific provider, and you can pass these kwargs directly to the provider's API (see the sketch after the response fields).

version
integer
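A minimal sketch of that flow, assuming provider was set to openai and that the returned llm_kwargs contain arguments accepted by the OpenAI chat completions endpoint (the template name and input variables are hypothetical):

```python
import requests
from openai import OpenAI

API_KEY = "pl_..."  # your PromptLayer API key

# Fetch the template with provider-specific kwargs for OpenAI.
resp = requests.post(
    "https://api.promptlayer.com/prompt-templates/welcome-email",  # hypothetical name
    headers={"X-API-KEY": API_KEY},
    json={"provider": "openai", "input_variables": {"user_name": "Ada"}},
)
resp.raise_for_status()
llm_kwargs = resp.json()["llm_kwargs"]

# Pass the returned kwargs straight through to the OpenAI client.
client = OpenAI()  # reads OPENAI_API_KEY from the environment
completion = client.chat.completions.create(**llm_kwargs)
print(completion.choices[0].message.content)
```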