POST /prompt-templates/{prompt_name}
curl --request POST \
  --url https://api.promptlayer.com/prompt-templates/{prompt_name} \
  --header 'Content-Type: application/json' \
  --header 'X-API-KEY: <x-api-key>' \
  --data '{
  "version": 1,
  "workspace_id": 123,
  "label": "<string>",
  "provider": "openai",
  "input_variables": {}
}'
{
  "id": 123,
  "prompt_name": "<string>",
  "prompt_template": {
    "content": [
      {
        "type": "<any>",
        "text": "<string>"
      }
    ],
    "input_variables": [
      "<string>"
    ],
    "template_format": "f-string",
    "type": "<any>"
  },
  "metadata": {
    "model": {
      "provider": "<string>",
      "name": "<string>",
      "parameters": {}
    },
    "customField": "<string>"
  },
  "commit_message": "<string>",
  "llm_kwargs": {},
  "version": 123
}

Retrieve a prompt template using its prompt_name. Optionally, specify version (a version number) or label (a release label such as “prod”) to retrieve a specific version. If neither is specified, the latest version is returned.

You can also specify a provider to return LLM-specific arguments that can be passed directly to your LLM client. To format the template with input variables, pass input_variables.
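The curl example above can also be issued from Python. The sketch below is a minimal illustration using only the standard library; the endpoint URL, headers, and body fields come from this page, while the function name `build_request` and its defaults are our own. Omitting a field from the body is assumed equivalent to sending null.

```python
import json
import urllib.request

API_BASE = "https://api.promptlayer.com"

def build_request(prompt_name, api_key, version=None, label=None,
                  provider=None, input_variables=None):
    """Build the POST request for /prompt-templates/{prompt_name}.

    Only the body fields documented above are included; None values
    are dropped rather than sent as null.
    """
    body = {
        "version": version,
        "label": label,
        "provider": provider,
        "input_variables": input_variables,
    }
    body = {k: v for k, v in body.items() if v is not None}
    return urllib.request.Request(
        url=f"{API_BASE}/prompt-templates/{prompt_name}",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json", "X-API-KEY": api_key},
        method="POST",
    )

# Sending the request requires a valid API key:
# with urllib.request.urlopen(build_request("my-prompt", "<x-api-key>", label="prod")) as resp:
#     template = json.load(resp)
```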

Headers

X-API-KEY
string
required

Path Parameters

prompt_name
string
required

Body

application/json
version
integer | null
workspace_id
integer | null
label
string | null
provider
enum<string> | null
Available options:
openai,
anthropic
input_variables
object | null

Response

200 - application/json
id
integer
required
prompt_name
string
required
prompt_template
object
required

Completion Prompt Template

metadata
object | null
commit_message
string | null
llm_kwargs
object | null

When you specify provider in the request body, llm_kwargs is returned for that provider, and these kwargs can be passed directly to the provider's API.
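As a sketch of that flow, the snippet below assumes a response fetched with provider set to "openai"; the exact keys inside llm_kwargs (model, messages, and so on) are illustrative assumptions, not guaranteed by this page.

```python
# Hypothetical response body from the endpoint above, with `provider`
# set to "openai" in the request; llm_kwargs holds provider-ready arguments.
response = {
    "prompt_name": "my-prompt",
    "version": 3,
    "llm_kwargs": {  # assumed shape for illustration
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": "Hello!"}],
        "temperature": 0.7,
    },
}

# Splat the kwargs into the provider's client call, optionally
# overriding individual parameters first:
kwargs = {**response["llm_kwargs"], "temperature": 0}

# from openai import OpenAI
# client = OpenAI()
# completion = client.chat.completions.create(**kwargs)
```

Because llm_kwargs is a plain object, merging it with local overrides (as with temperature above) is an easy way to reuse the stored template while tweaking sampling parameters per call.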

version
integer