Get Prompt Template
Retrieve a prompt template using either the prompt_name or the prompt_id. Optionally, specify version (a version number) or label (a release label like "prod") to retrieve a specific version. If neither is specified, the latest version is returned.
PromptLayer will try to read the model provider from the parameters attached to the prompt template. You can optionally pass a provider to override the one set in the Prompt Registry. This returns LLM-specific arguments that can be passed directly into your LLM client. To format the template with input variables, use input_variables.
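The request described above can be sketched as a small helper that assembles the URL, headers, and JSON body. This is a hypothetical sketch: the base URL, the X-API-KEY header name, and the use of POST are assumptions, not confirmed by this page; consult the full API reference for exact values.

```python
from typing import Optional

# Assumed base URL for the endpoint (placeholder, verify against the API reference).
BASE_URL = "https://api.promptlayer.com/prompt-templates"

def build_request(identifier: str, api_key: str, *,
                  version: Optional[int] = None,
                  label: Optional[str] = None,
                  provider: Optional[str] = None,
                  input_variables: Optional[dict] = None):
    """Assemble the URL, headers, and JSON body; only set fields are sent."""
    if version is not None and version <= 0:
        raise ValueError("version must be greater than 0")
    body = {k: v for k, v in {
        "version": version,
        "label": label,
        "provider": provider,
        "input_variables": input_variables,
    }.items() if v is not None}
    # The identifier may be either the prompt name or the prompt id.
    return f"{BASE_URL}/{identifier}", {"X-API-KEY": api_key}, body

url, headers, body = build_request(
    "my-template", "pl_...",  # placeholder API key
    version=3, provider="openai",
)
# Send with, e.g.: requests.post(url, headers=headers, json=body)
```

Omitting version and label requests the latest version, since unset fields are dropped from the body.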
Headers
Path Parameters
The identifier can be either the prompt name or the prompt id.
Body
version — the version number of the template to retrieve; must be greater than 0.
label — a release label (e.g. "prod") identifying a specific version.
provider — one of openai, anthropic; overrides the provider set in the Prompt Registry.
input_variables — variables used to format the template.
Also accepted: an optional dictionary of key values used for A/B release labels.
Response
When you optionally specify provider in the body, llm_kwargs will be returned for that specific provider, and you can pass these kwargs directly to the provider's API.
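Passing the returned kwargs to the provider's client amounts to dictionary unpacking. The response shape and field contents below are illustrative assumptions (the page does not show a sample response), and the stand-in function models the call shape of a chat-completion API rather than any specific SDK.

```python
# Assumed response shape: llm_kwargs holds provider-ready arguments.
response_json = {
    "llm_kwargs": {
        "model": "gpt-4o",  # hypothetical values for illustration
        "messages": [{"role": "user", "content": "Tell me about databases"}],
        "temperature": 0.7,
    }
}

llm_kwargs = response_json["llm_kwargs"]

# With the OpenAI client this would be:
#   client.chat.completions.create(**llm_kwargs)
# Here we unpack into a stand-in function to show the call shape.
def create_completion(*, model, messages, **extra):
    return {"model": model, "n_messages": len(messages), "extra": extra}

result = create_completion(**llm_kwargs)
```

Because llm_kwargs is already provider-specific, no translation layer is needed between the registry response and the client call.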