Retrieve a prompt template by `prompt_name` or `prompt_id`. Optionally, specify `version` (a version number) or `label` (a release label like "prod") to retrieve a specific version. If neither is specified, the latest version is returned.

PromptLayer will try to read the model provider from the parameters you attached to the prompt template. You can optionally pass in a `provider` to override the one set in the Prompt Registry. This will return LLM-specific arguments that can be passed directly into your LLM client. To format the template with input variables, use `input_variables`.
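For example, a minimal request using Python's `requests` library might look like the sketch below. The base URL, endpoint path, and the placeholder template name `my_template` are assumptions drawn from PromptLayer's standard REST conventions; confirm them against the full API reference.

```python
import os
import requests

# Assumed endpoint: POST to the prompt-templates path, with the prompt
# name (or id) as the path identifier.
url = "https://api.promptlayer.com/prompt-templates/my_template"

response = requests.post(
    url,
    headers={"X-API-KEY": os.environ["PROMPTLAYER_API_KEY"]},
    json={
        "label": "prod",                       # or "version": 3 to pin a version
        "provider": "openai",                  # override the Prompt Registry provider
        "input_variables": {"topic": "cats"},  # fills the template's variables
    },
)
response.raise_for_status()
template = response.json()
```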
Headers

- `X-API-KEY` (string, required): Your PromptLayer API key.
Path Parameters
The identifier can be either the prompt name or the prompt ID.
Body
application/json

- `version` (integer, optional): The version number to retrieve. Required range: `x > 0`.
- `label` (string, optional): A release label, such as "prod", to retrieve.
- `provider` (enum, optional): Overrides the provider set in the Prompt Registry. Available options: `openai`, `anthropic`.
- `input_variables` (object, optional): Input variables used to format the template.
- Optional dictionary of key values used for A/B release labels.
- Optional model name used for returning default parameters with `llm_kwargs`.
- Optional dictionary of model parameter overrides to use with the prompt template. This will override the parameters at runtime for the specified model and will try to make sure the model supports these parameters. For example, if you supply `maxOutputTokens` for OpenAI, it will be converted to `max_completion_tokens`.
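As an illustration of the optional fields above, a fuller request body might look like the following sketch. The key names `metadata_filters`, `model_name`, and `model_parameter_overrides` are hypothetical stand-ins for the three unnamed fields in the list; verify them against the full API reference before use.

```python
# Hypothetical request body; the last three key names are assumed,
# since this page describes those fields without naming them.
body = {
    "version": 2,  # must be greater than 0
    "provider": "openai",
    "input_variables": {"topic": "cats"},
    "metadata_filters": {"plan": "beta"},  # A/B release labels (assumed key name)
    "model_name": "gpt-4o",                # default params via llm_kwargs (assumed key name)
    "model_parameter_overrides": {         # assumed key name
        # Google-style parameter; converted to max_completion_tokens for OpenAI
        "maxOutputTokens": 512,
    },
}
```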
Response
Successful Response

The returned template is one of two formats:

- Completion Template
- Chat Template
When you optionally specify `provider` in the body, `llm_kwargs` will be returned for that specific provider, and you can pass these kwargs to the provider's API directly.
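For instance, here is a sketch of forwarding the returned kwargs to the OpenAI Python client, assuming the parsed response exposes them under an `llm_kwargs` key that already includes the model and messages:

```python
from openai import OpenAI

# "template" is the parsed JSON from the request example above; the exact
# placement of llm_kwargs in the response body is an assumption.
llm_kwargs = template["llm_kwargs"]

client = OpenAI()  # reads OPENAI_API_KEY from the environment
completion = client.chat.completions.create(**llm_kwargs)
print(completion.choices[0].message.content)
```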