Templates can include variables (e.g., `This is a prompt by {author_name}`). Learn more about using template variables. Prompt templates can have tags and are uniquely named.
You can use this tool to programmatically retrieve and publish prompts, even at runtime, which makes it easy to start A/B testing your prompts. Viewed as a "Prompt Management System", the registry lets your organization centralize and organize the prompts that are currently dispersed throughout your codebase.
/prompt-templates/{prompt_name}
(read more).
Release labels such as `prod` and `staging` can optionally be applied to template versions and used to retrieve the template.
Use the `version` parameter to retrieve an older version of a prompt. By default, the newest version of a prompt is returned.
You can also retrieve a template's metadata; the returned template includes `provider` and `input_variables`.
Currently, we support a `provider` type of either `openai` or `anthropic`.
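A minimal sketch of that constraint, assuming an illustrative response shape (the field names here mirror the text above, not the exact API schema):

```python
# Providers the registry currently supports, per the docs above.
SUPPORTED_PROVIDERS = {"openai", "anthropic"}

def check_provider(template: dict) -> str:
    """Validate the `provider` field of a (hypothetical) template dict."""
    provider = template.get("provider")
    if provider not in SUPPORTED_PROVIDERS:
        raise ValueError(f"unsupported provider: {provider!r}")
    return provider

print(check_provider({"provider": "openai", "input_variables": ["author_name"]}))
```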
The response also includes `provider`, `input_variables`, and other parameters required by the LLM provider (e.g., `temperature` and `max_tokens` for OpenAI). Use the `llm_kwargs` as provided. If you need to override certain arguments, it is recommended to create a new version on PromptLayer; alternatively, you can override them on your end if necessary.
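A minimal sketch of the recommended usage, with a local override as the fallback. The response shape shown here is an assumption for illustration:

```python
# Illustrative template as it might come back from the registry.
template = {
    "llm_kwargs": {"model": "gpt-4o", "temperature": 0.7, "max_tokens": 256},
}

# Preferred: pass llm_kwargs through unchanged.
kwargs = dict(template["llm_kwargs"])

# Fallback: override locally. Later dicts win in a merge, so the
# override takes precedence while everything else is preserved.
overrides = {"temperature": 0.0}
kwargs = {**template["llm_kwargs"], **overrides}
print(kwargs)
```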
We support two template formats (`f-string` and `jinja2`) for declaring variables. `f-string` lets you declare variables using single curly brackets (`{variable_name}`), while `jinja2` uses double curly brackets (`{{variable_name}}`). For a more detailed explanation of both formats, see our Template Variables documentation.
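To illustrate the difference between the two delimiters, the snippet below emulates both formats with plain Python. The registry renders templates for you, so this is only a demonstration of the syntax:

```python
import re

fstring_template = "This is a prompt by {author_name}"
jinja2_template = "This is a prompt by {{ author_name }}"
variables = {"author_name": "Ada"}

# f-string format: single curly brackets, same shape as str.format.
rendered_f = fstring_template.format(**variables)

# jinja2 format: double curly brackets (whitespace-tolerant).
rendered_j = re.sub(
    r"\{\{\s*(\w+)\s*\}\}",
    lambda m: str(variables[m.group(1)]),
    jinja2_template,
)

print(rendered_f)
print(rendered_j)
```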
For example, you cannot set the release label `prod` on both version 1 and version 2 of a prompt template; each label points to exactly one version. This restriction is in place to prevent confusion when searching for prompt templates.
You can also set release labels via the SDK.
/prompt-templates/{identifier}
(read more)
/rest/prompt-templates
(read more)
The `model` attribute is reserved for model parameters, so avoid putting custom metadata there. For example, you might publish `model` metadata alongside a custom `category` metadata field.
/rest/prompt-templates
(read more).
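As a sketch, a publish payload might nest the reserved model parameters and a custom category side by side under `metadata`. Field names other than `metadata`, `model`, and `category` are illustrative, not the exact API schema:

```python
# Illustrative publish payload: reserved `model` metadata next to a
# custom `category` field. Values here are made up for the example.
payload = {
    "prompt_name": "my_template",
    "metadata": {
        "model": {  # reserved: model parameters only
            "provider": "openai",
            "name": "gpt-4o",
            "parameters": {"temperature": 0.7},
        },
        "category": "greetings",  # custom metadata lives alongside `model`
    },
}
print(payload["metadata"]["category"])
```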
You can also use LangChain to create a template, either by pulling it from LangchainHub, creating a custom template, or providing a Python dictionary directly.
/rest/track-prompt
(read more).
Learn more about tracking templates here.
You can also use the `get` method with a release label. To control how many prompt templates are returned, use the `per_page` argument. For example, set `per_page=100` to retrieve 100 prompts at a time.
To paginate through results, use the `page` argument. For example, pass `page=2` to retrieve the second page of prompts.
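A sketch of how the two pagination arguments combine into a query string. The parameter names match the text above, but no request is actually issued here:

```python
from urllib.parse import urlencode

def list_params(page: int = 1, per_page: int = 30) -> str:
    """Build the pagination query string for listing prompt templates."""
    return urlencode({"page": page, "per_page": per_page})

print(list_params(per_page=100))           # first 100 prompts
print(list_params(page=2, per_page=100))   # second page of prompts
```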
Each returned `prompt_template` represents the latest version of the prompt template.
Alternatively, use the REST API endpoint /prompt-templates
(read more).