Use `{topic}` as an input variable in your prompt template. This will be filled out by user input when interacting with the AI.
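As a concrete sketch, the template body for the haiku example used later in this guide might look like:

```
Write a haiku about {topic}.
```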
Replace `<your_openai_api_key>` and `<your_promptlayer_api_key>` in the code snippets below with your actual keys. If you prefer not to hard-code them, you can use a `.env` file instead.
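A minimal sketch of reading the keys from environment variables instead of hard-coding them (the variable names `OPENAI_API_KEY` and `PROMPTLAYER_API_KEY` are conventional choices, not mandated by the snippets above):

```python
import os

# Read the keys from the environment instead of hard-coding them.
# Set these in your shell (or a .env file) before running the script.
openai_api_key = os.environ.get("OPENAI_API_KEY", "<your_openai_api_key>")
promptlayer_api_key = os.environ.get("PROMPTLAYER_API_KEY", "<your_promptlayer_api_key>")
```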
Install `python-dotenv`, add your keys to a `.env` file, and load the environment variables at the top of `app.py`.

Remember the `{topic}` user input variable we made to specify the haiku topic in our prompt? PromptLayer can format the f-string or Jinja2 template using this variable. Learn more about template variables and how they work.
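Locally, the substitution PromptLayer performs is equivalent to an f-string-style format (a sketch; PromptLayer also supports Jinja2 templates, which use the same `{topic}` idea with Jinja2 syntax):

```python
# The prompt template as stored in the registry, with one input variable.
template = "Write a haiku about {topic}."

# PromptLayer fills this in for you at run time; locally it is equivalent to:
prompt = template.format(topic="autumn")
print(prompt)  # Write a haiku about autumn.
```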
We will use PromptLayer to fetch the latest version of the prompt, inject input variables, and run it.
`promptlayer_client.run` will automatically format your prompt template into the format needed by OpenAI, Anthropic, and all other major model providers.
The LLM request runs locally on your machine.
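Putting it together, the run step might look like the sketch below. The prompt name `haiku-prompt` is an assumption, and the actual PromptLayer call is commented out so the sketch stays runnable without keys:

```python
import os

# from promptlayer import PromptLayer
# promptlayer_client = PromptLayer(api_key=os.environ["PROMPTLAYER_API_KEY"])

run_kwargs = {
    "prompt_name": "haiku-prompt",           # assumed name in your prompt registry
    "input_variables": {"topic": "autumn"},  # fills {topic} in the template
}

# response = promptlayer_client.run(**run_kwargs)
# The response contains the provider's raw output; inspect it for the generated text.
```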
Save the code as `app.py` and run it from your terminal with `python app.py`. Make sure you are in the same directory as the `app.py` file when running this command.

By default, the latest version of the prompt is used. To run a specific release instead, pass `prompt_release_label` to the run call.
Alternatively, use `prompt_version` to specify a version number directly.
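A sketch of the two pinning options (the label name `prod` and version number `3` are placeholders):

```python
# Pin by release label (e.g. a label you created in the PromptLayer dashboard):
by_label = {"prompt_name": "haiku-prompt", "prompt_release_label": "prod"}

# Or pin by version number directly:
by_version = {"prompt_name": "haiku-prompt", "prompt_version": 3}

# Either dict would be passed as keyword arguments:
# promptlayer_client.run(**by_label)
```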
To enable tracing, initialize the PromptLayer client with `enable_tracing` set to `True`:
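Client initialization with tracing enabled might look like the following sketch (the env-var name is an assumption, and the constructor call is commented so the snippet runs without a key):

```python
import os

client_config = {
    "api_key": os.environ.get("PROMPTLAYER_API_KEY", ""),
    "enable_tracing": True,  # turn on request tracing
}

# from promptlayer import PromptLayer
# promptlayer_client = PromptLayer(**client_config)
```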
Use the `@promptlayer_client.traceable` decorator in Python or `wrapWithSpan` in JavaScript to trace custom functions.
Read more about tracing here.
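For example, a custom helper could be traced like this (a sketch; the helper `build_topic` is hypothetical, and the decorator line is commented so the snippet runs without a client):

```python
# @promptlayer_client.traceable  # records this call as a span when tracing is on
def build_topic(user_input: str) -> str:
    """Normalize raw user input before it is injected into the prompt."""
    return user_input.strip().lower()

print(build_topic("  Autumn "))  # autumn
```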