Get Prompt Template
Retrieve a prompt template by its `prompt_name`. Optionally, specify `version` (a version number) or `label` (a release label like "prod") to retrieve a specific version. If neither is specified, the latest version is returned.

PromptLayer will try to read the model provider from the parameters attached to the prompt template. You can optionally pass a `provider` to override the one set in the Prompt Registry. This returns LLM-specific arguments that can be passed directly into your LLM client. To format the template with input variables, use `input_variables`.
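A minimal sketch of assembling the optional body fields described above. The field names (`version`, `label`, `provider`, `input_variables`) come from this page; the endpoint URL and auth header in the commented call are assumptions, so check your API reference before using them:

```python
# Hedged sketch: build the request body for "Get Prompt Template".
# Only the fields you set are included; everything unset is omitted,
# so the latest version is returned by default.

def build_body(version=None, label=None, provider=None, input_variables=None):
    """Assemble the optional body fields, omitting anything unset."""
    body = {}
    if version is not None:
        body["version"] = version            # specific version number
    if label is not None:
        body["label"] = label                # release label, e.g. "prod"
    if provider is not None:
        body["provider"] = provider          # overrides the Prompt Registry provider
    if input_variables is not None:
        body["input_variables"] = input_variables  # values used to format the template
    return body

# Example: request the "prod" release, formatted for OpenAI.
body = build_body(label="prod", provider="openai",
                  input_variables={"name": "Ada"})

# The actual call might look like this (endpoint URL and header name
# are assumptions, not confirmed by this page):
# import requests
# resp = requests.post(
#     "https://api.promptlayer.com/prompt-templates/welcome-prompt",
#     headers={"X-API-KEY": "pl_..."},
#     json=body,
# )
```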
Headers
Path Parameters
Body
`provider` (optional): one of `openai` or `anthropic`.
Response
Completion Prompt Template

When you optionally specify `provider` in the body, `llm_kwargs` will be returned for that specific provider, and you can pass these kwargs directly to the provider's API.
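A sketch of consuming `llm_kwargs` from the response. The response contents below are illustrative stand-in data, not actual output from this API, and the OpenAI client call is shown only as a commented example:

```python
# Stand-in for the JSON returned when `provider` is set in the body.
# The kwargs' contents are illustrative; the real values come from the
# parameters attached to your prompt template.
response_json = {
    "llm_kwargs": {
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": "Hello Ada"}],
        "temperature": 0.7,
    },
}

# Extract the provider-specific arguments...
llm_kwargs = response_json["llm_kwargs"]

# ...and pass them directly to the provider's client, e.g.:
# from openai import OpenAI
# completion = OpenAI().chat.completions.create(**llm_kwargs)
```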