Retrieve raw prompt template data without applying input variables. Designed for GitHub sync, local caching, and template inspection. By default, snippets are resolved (expanded). Use resolve_snippets=false to get the raw template with snippet references intact.
Unlike rendered retrieval, this endpoint does not accept input_variables or a provider in a request body; it returns the raw template data with placeholders preserved. Pass include_llm_kwargs=true to also receive provider-specific llm_kwargs for offline use.
| Parameter | Type | Default | Description |
|---|---|---|---|
| version | integer | - | Specific version number. Mutually exclusive with label. |
| label | string | - | Release label name (e.g. prod). Mutually exclusive with version. |
| resolve_snippets | boolean | true | When true, snippets are expanded. When false, raw @@@snippet@@@ references are preserved. |
| include_llm_kwargs | boolean | false | When true, includes provider-specific LLM API format in the response. |
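A minimal sketch of assembling the query string from the parameters above, using only the standard library. The base URL and endpoint path are hypothetical placeholders; the parameter names and the version/label exclusivity rule come from the table.

```python
from typing import Optional
from urllib.parse import urlencode

# Hypothetical host and path -- substitute your actual API base URL.
BASE_URL = "https://api.example.com"

def build_template_url(identifier: str, *, version: Optional[int] = None,
                       label: Optional[str] = None,
                       resolve_snippets: bool = True,
                       include_llm_kwargs: bool = False) -> str:
    """Build the GET URL for retrieving raw prompt template data."""
    if version is not None and label is not None:
        raise ValueError("version and label are mutually exclusive")
    params = {}
    if version is not None:
        params["version"] = version
    if label is not None:
        params["label"] = label
    if not resolve_snippets:          # default is true, so only send when false
        params["resolve_snippets"] = "false"
    if include_llm_kwargs:            # default is false, so only send when true
        params["include_llm_kwargs"] = "true"
    query = urlencode(params)
    url = f"{BASE_URL}/prompt-templates/{identifier}"
    return f"{url}?{query}" if query else url
```

The defaults are omitted from the query string so the server's own defaults apply unchanged.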
To bypass cached responses, send a Cache-Control: no-cache header.
Authenticate with an X-API-KEY header. The identifier can be either the prompt name or the prompt ID.
Specific version number to retrieve. Mutually exclusive with label.
Release label name to retrieve (e.g. 'prod', 'staging'). Mutually exclusive with version.
When true (default), snippets are expanded in the returned prompt_template. When false, raw @@@snippet@@@ references are preserved.
When true, includes provider-specific llm_kwargs in the response. Requires model metadata to be set on the template.
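When fetching with resolve_snippets=false (e.g. for local caching or template inspection), you may want to list which snippets a raw template references. A small sketch, assuming snippet names are word-like identifiers inside the @@@snippet@@@ delimiters shown above:

```python
import re

# Assumption: snippet names are alphanumeric with underscores or hyphens.
SNIPPET_RE = re.compile(r"@@@([A-Za-z0-9_-]+)@@@")

def find_snippet_refs(raw_template: str) -> list:
    """Return snippet names referenced in a raw (unresolved) template."""
    return SNIPPET_RE.findall(raw_template)
```

This is only needed on raw templates; with the default resolve_snippets=true the references are already expanded away.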
Successful Response
The prompt template ID.
The name of the prompt template.
The version number of the prompt template.
The workspace this prompt template belongs to.
The prompt template content. When resolve_snippets is true (default), snippets are expanded. When false, raw @@@snippet@@@ references are preserved.
List of snippet references used in this template.
Model configuration including provider, model name, and parameters.
The commit message for this version.
Tags associated with the prompt template.
Timestamp when this version was created.
Provider-specific LLM arguments. Only present when include_llm_kwargs=true. Structure is provider-specific and may change without notice.
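For the GitHub-sync and local-caching use cases mentioned above, a retrieved response can be written to disk keyed by name and version. The field names prompt_name and version below are assumptions based on the response fields described in this section; adjust them to match your actual payload.

```python
import json
from pathlib import Path

def cache_template(response: dict, cache_dir: str = "prompt_cache") -> Path:
    """Write a retrieved template response to disk, keyed by name and version."""
    d = Path(cache_dir)
    d.mkdir(parents=True, exist_ok=True)
    # Assumed field names -- rename to match your response schema.
    path = d / f"{response['prompt_name']}.v{response['version']}.json"
    path.write_text(json.dumps(response, indent=2, sort_keys=True))
    return path
```

Writing one file per version keeps diffs clean when the cache directory is committed to a Git repository.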