Get Prompt Template Raw Data

GET /prompt-templates/{identifier}
curl --request GET \
  --url https://api.promptlayer.com/prompt-templates/{identifier} \
  --header 'X-API-KEY: <x-api-key>'
{
  "success": true,
  "id": 123,
  "prompt_name": "<string>",
  "version": 123,
  "workspace_id": 123,
  "prompt_template": {
    "content": [
      {
        "text": "<string>",
        "type": "text"
      }
    ],
    "input_variables": [],
    "template_format": "f-string",
    "type": "completion"
  },
  "snippets": [
    {
      "prompt_name": "<string>",
      "version": 123,
      "label": "<string>"
    }
  ],
  "metadata": {
    "model": {
      "provider": "<string>",
      "name": "<string>",
      "parameters": {}
    },
    "customField": "<string>"
  },
  "commit_message": "<string>",
  "tags": [
    "<string>"
  ],
  "created_at": "2023-11-07T05:31:56Z",
  "llm_kwargs": {}
}
Retrieve raw prompt template data without applying input variables. This endpoint is designed for:
  • GitHub sync: Use resolve_snippets=false to get the raw template with @@@snippet@@@ references intact
  • Local caching: Fetch resolved templates with optional llm_kwargs for offline use
  • Template inspection: View template structure and metadata without executing
Unlike the POST endpoint, this GET endpoint does not accept input_variables or provider in a request body. Instead, it returns the raw template data with placeholders preserved.
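As a sketch, the request above can be issued from Python using only the standard library. The parameter names and the version/label mutual exclusivity mirror the query parameters documented on this page; the helper names themselves are illustrative, not part of any official client.

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "https://api.promptlayer.com"

def build_request(identifier, api_key, version=None, label=None,
                  resolve_snippets=True, include_llm_kwargs=False):
    """Build a GET request for /prompt-templates/{identifier}."""
    if version is not None and label is not None:
        raise ValueError("version and label are mutually exclusive")
    params = {
        "resolve_snippets": "true" if resolve_snippets else "false",
        "include_llm_kwargs": "true" if include_llm_kwargs else "false",
    }
    if version is not None:
        params["version"] = str(version)
    if label is not None:
        params["label"] = label
    url = (f"{BASE_URL}/prompt-templates/{urllib.parse.quote(identifier)}"
           f"?{urllib.parse.urlencode(params)}")
    return urllib.request.Request(url, headers={"X-API-KEY": api_key})

def get_prompt_template(identifier, api_key, **kwargs):
    """Fetch and parse the raw template JSON."""
    req = build_request(identifier, api_key, **kwargs)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Because the GET endpoint takes no request body, everything is expressed through the path and query string.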

Query Parameters

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| version | integer | - | Specific version number. Mutually exclusive with label. |
| label | string | - | Release label name (e.g. prod). Mutually exclusive with version. |
| resolve_snippets | boolean | true | When true, snippets are expanded. When false, raw @@@snippet@@@ references are preserved. |
| include_llm_kwargs | boolean | false | When true, includes provider-specific LLM API format in the response. |

Caching

Responses are cached by default. To bypass the cache, send the Cache-Control: no-cache header.
Provider-Specific Schema Notice

The llm_kwargs field (when requested via include_llm_kwargs=true) is provider-specific, and its structure may change without notice as LLM providers update their APIs. For stable, provider-agnostic prompt data, use prompt_template instead of llm_kwargs.
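To illustrate the provider-agnostic path, here is a minimal helper that extracts the text from prompt_template in the response shape shown above, assuming a completion-style template with text content parts:

```python
def template_text(response):
    """Concatenate the text parts of a completion-style prompt_template.

    Reads the provider-agnostic prompt_template field rather than the
    provider-specific llm_kwargs, whose shape may change without notice.
    """
    content = response["prompt_template"]["content"]
    return "".join(p["text"] for p in content if p.get("type") == "text")
```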

Examples

# Default: resolved snippets, latest version
curl -H "X-API-KEY: your_api_key" \
  https://api.promptlayer.com/prompt-templates/my-prompt

# Raw template with snippet references (for GitHub sync)
curl -H "X-API-KEY: your_api_key" \
  "https://api.promptlayer.com/prompt-templates/my-prompt?resolve_snippets=false"

# With llm_kwargs (for local caching)
curl -H "X-API-KEY: your_api_key" \
  "https://api.promptlayer.com/prompt-templates/my-prompt?include_llm_kwargs=true"

# Specific version
curl -H "X-API-KEY: your_api_key" \
  "https://api.promptlayer.com/prompt-templates/my-prompt?version=2"

# By release label
curl -H "X-API-KEY: your_api_key" \
  "https://api.promptlayer.com/prompt-templates/my-prompt?label=prod"

Authentication

This endpoint requires API key authentication via the X-API-KEY header.

Headers

X-API-KEY
string
required

Path Parameters

identifier
string
required

The identifier can be either the prompt name or the prompt id.

Query Parameters

version
integer

Specific version number to retrieve. Mutually exclusive with label.

label
string

Release label name to retrieve (e.g. 'prod', 'staging'). Mutually exclusive with version.

resolve_snippets
boolean
default:true

When true (default), snippets are expanded in the returned prompt_template. When false, raw @@@snippet@@@ references are preserved.

include_llm_kwargs
boolean
default:false

When true, includes provider-specific llm_kwargs in the response. Requires model metadata to be set on the template.
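For the GitHub-sync workflow, a template fetched with resolve_snippets=false keeps its raw snippet references, which can be collected from the text parts. The @@@name@@@ pattern below is inferred from the placeholder syntax shown on this page; treat it as an assumption, not a documented grammar:

```python
import re

# Pattern inferred from the @@@snippet@@@ placeholder syntax;
# an assumption, not a documented grammar.
SNIPPET_REF = re.compile(r"@@@([^@]+)@@@")

def find_snippet_refs(prompt_template):
    """Collect raw snippet names from the text parts of an unresolved
    prompt_template (fetched with resolve_snippets=false)."""
    refs = []
    for part in prompt_template.get("content", []):
        if part.get("type") == "text":
            refs.extend(SNIPPET_REF.findall(part.get("text", "")))
    return refs
```

The names found this way can be cross-checked against the snippets array in the same response.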

Response

Successful Response

success
boolean
required
id
integer
required

The prompt template ID.

prompt_name
string
required

The name of the prompt template.

version
integer
required

The version number of the prompt template.

workspace_id
integer
required

The workspace this prompt template belongs to.

prompt_template
Completion Template · object
required

The prompt template content. When resolve_snippets is true (default), snippets are expanded. When false, raw @@@snippet@@@ references are preserved.

snippets
SnippetReference · object[]
required

List of snippet references used in this template.

metadata
Metadata · object

Model configuration including provider, model name, and parameters.

commit_message
string | null

The commit message for this version.

tags
string[]

Tags associated with the prompt template.

created_at
string<date-time> | null

Timestamp when this version was created.

llm_kwargs
LLM Kwargs · object

Provider-specific LLM arguments. Only present when include_llm_kwargs=true. Structure is provider-specific and may change without notice.