Get Prompt Template

POST /prompt-templates/{prompt_name}
Retrieve a prompt template by its prompt_name. Optionally, specify version (a version number) or label (a release label like “prod”) to retrieve a specific version; if neither is specified, the latest version is returned. You can also specify a provider to receive LLM-specific arguments that can be passed directly into your LLM client. To format the template with input variables, use input_variables.
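A minimal client sketch for this endpoint, using only the Python standard library. The base URL is an assumption (substitute your API host); the header, path, and body fields follow the parameter list below. Unset (null) body fields are omitted from the payload:

```python
import json
import urllib.request

def build_payload(version=None, label=None, provider=None, input_variables=None):
    """Assemble the JSON body, dropping any fields left unset (null)."""
    body = {
        "version": version,
        "label": label,
        "provider": provider,
        "input_variables": input_variables,
    }
    return {k: v for k, v in body.items() if v is not None}

def get_prompt_template(api_key, prompt_name, **kwargs):
    # Base URL is an assumption -- substitute the actual API host.
    url = f"https://api.promptlayer.com/prompt-templates/{prompt_name}"
    req = urllib.request.Request(
        url,
        data=json.dumps(build_payload(**kwargs)).encode(),
        headers={"X-API-KEY": api_key, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

For example, `get_prompt_template(key, "my-template", label="prod", provider="openai")` fetches the “prod” release with OpenAI-ready arguments.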
Headers
  X-API-KEY        string               required

Path Parameters
  prompt_name      string               required

Body (application/json)
  version          integer | null
  workspace_id     integer | null
  label            string | null
  provider         enum<string> | null
                   Available options: openai, anthropic
  input_variables  object | null
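An illustrative request body combining the fields above (the values are examples only; the input_variables keys depend on your template’s placeholders):

```json
{
  "label": "prod",
  "provider": "openai",
  "input_variables": {"topic": "weather"}
}
```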
Response
200 - application/json
  id               integer              required
  prompt_name      string               required
  prompt_template  object               required
                   Completion Prompt Template
  metadata         object | null
  commit_message   string | null
  llm_kwargs       object | null
                   When you specify a provider in the request body, llm_kwargs
                   is returned for that provider, and you can pass these kwargs
                   directly to the provider's API.
  version          integer
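A sketch of consuming llm_kwargs from the response, assuming the request set provider to openai. The response dict and its contents here are stubs for illustration; the actual OpenAI call is shown commented out:

```python
def merge_llm_kwargs(response, overrides=None):
    """Combine the returned llm_kwargs with any local overrides.

    llm_kwargs may be null when no provider was requested, so fall
    back to an empty dict.
    """
    kwargs = dict(response.get("llm_kwargs") or {})
    kwargs.update(overrides or {})
    return kwargs

# Stub response -- in practice this is the parsed JSON from the endpoint.
response = {"llm_kwargs": {"model": "gpt-4o", "temperature": 0.7}}
kwargs = merge_llm_kwargs(response, {"temperature": 0.2})

# Pass straight through to the provider's client, e.g.:
# client = openai.OpenAI()
# client.chat.completions.create(**kwargs)
```

Local overrides take precedence, so you can pin a template’s arguments in one place while still adjusting, say, temperature per call.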