Prompts define how your AI interacts with users. They let you set precise instructions for content generation and workflow automation, and they help you ensure consistent results. Prompts are at the core of PromptLayer.

Understanding Prompts

A prompt is an instruction given to an AI model to guide its response. Well-structured prompts should:

  • Clearly define expected behavior to control outputs.
  • Use input variables (text, numbers, or other data) to ensure predictable responses.

Input Variables are the placeholders in your prompt left for user data. By passing in input variables like the date, a user query, or conversation memory data, your system is effectively filling in the blanks left in the prompt template.
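
For illustration, here is a minimal Python sketch (plain Python, not PromptLayer-specific) of how an input variable acts as a blank that is filled in at request time:

# A prompt template with one input variable left as a blank.
template = "Summarize the support tickets from {date} in one paragraph."

# At request time, the system fills in the blank with real data.
prompt = template.format(date="2024-05-01")
print(prompt)  # Summarize the support tickets from 2024-05-01 in one paragraph.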

Why Prompts Matter

Prompts shape how an AI responds. A well-crafted prompt ensures the model generates accurate, relevant, and useful results. Without clear instructions, the AI may produce vague, off-topic, or incorrect responses.

For example, a generic prompt like “Write poetry about the moon” might result in a broad and inconsistent response. However, a more specific prompt, such as “Write a haiku about the moon,” directs the AI toward a precise and structured output.

Breaking Down a Prompt

Let’s create a prompt that generates a haiku. To do this, we first need to understand its key components.

A prompt typically consists of a System message and a User message.

The System message sets the AI’s role and behavior, much like opening a recipe book that defines how a dish should be prepared.

In our example, the System message is designed to create a haiku:

You are a skilled poet specializing in haikus.
Your task is to write a haiku based on a topic provided by the user.
The haiku must have 17 syllables, structured in three lines of 5, 7, and 5.

The User message is often used to provide input variables, similar to selecting a recipe based on the available ingredients.

In our example, the user message is the topic on which the haiku will be written:

{topic}
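
Putting the two pieces together, the assembled prompt can be pictured as a list of chat messages, shown here in the OpenAI-style format purely as an illustration (PromptLayer builds and manages this structure for you):

topic = "the moon"  # the {topic} input variable supplied by the user

messages = [
    {
        "role": "system",
        "content": (
            "You are a skilled poet specializing in haikus. "
            "Your task is to write a haiku based on a topic provided by the user. "
            "The haiku must have 17 syllables, structured in three lines of 5, 7, and 5."
        ),
    },
    # The User message carries the input variable, already filled in.
    {"role": "user", "content": topic},
]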

Create a Prompt

  1. Log in to your PromptLayer account.
  2. Navigate to the Prompt Registry. (Read more)
  3. Click the Create Prompt button to create a new prompt.
  4. Give the prompt a title such as ai-poet.
  5. Write your instructions in the System message (like we did above).
  6. Click New Message to add a User message. Add your input variable here (like {topic} above).
  7. Click on Parameters and choose an LLM. We recommend starting with OpenAI.
  8. Save your prompt by clicking on Create Prompt.

By default, PromptLayer uses f-string syntax to parse input variables. This means all input variables should be wrapped in curly braces.

My name is {name}.

PromptLayer also supports jinja2 syntax. Learn more about Template Variables and how to use them effectively.
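
As a side-by-side illustration of the two syntaxes (using plain Python and the jinja2 library, independent of PromptLayer itself):

from jinja2 import Template

# f-string style: single curly braces, filled here with str.format for illustration.
print("My name is {name}.".format(name="Ada"))

# jinja2 style: double curly braces, rendered with the jinja2 library.
print(Template("My name is {{ name }}.").render(name="Ada"))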

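Once saved, the prompt can also be fetched programmatically by name. The sketch below assumes the PromptLayer Python SDK (the promptlayer package) and its templates.get method; treat the exact call shape as an assumption and check the SDK reference for your version:

from promptlayer import PromptLayer  # assumes the promptlayer Python package is installed

promptlayer_client = PromptLayer(api_key="pl_your_api_key")  # placeholder key

# Fetch the prompt template created above by its name.
template = promptlayer_client.templates.get("ai-poet")
print(template)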

Run the Prompt in Playground

Testing prompts in Playground provides instant feedback, showing how the AI interprets your instructions and allowing for quick adjustments to improve results.

  1. Navigate to the Prompt Registry and open the prompt you just created.

  2. Click the Playground button next to Edit.

  3. Optionally, modify the prompt if you want to test new changes.

  4. Execute the prompt by clicking the Run button.

  5. Observe the output in real time and adjust your prompt as needed.
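
You can also run the same prompt from code instead of the Playground. The sketch below is a hedged example assuming the PromptLayer Python SDK's run method accepts a prompt_name and input_variables; consult the SDK documentation for the exact signature and return shape:

from promptlayer import PromptLayer  # assumes the promptlayer Python package is installed

promptlayer_client = PromptLayer(api_key="pl_your_api_key")  # placeholder key

# Execute the "ai-poet" prompt with the {topic} input variable filled in.
response = promptlayer_client.run(
    prompt_name="ai-poet",
    input_variables={"topic": "the moon"},
)
print(response)  # inspect the model output and request metadata

Each run like this is also recorded as a request, which is what the next section reviews.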

Review Prompt Logs

Reviewing prompt logs helps you track old requests. PromptLayer stores historical outputs, execution time, cost, token usage, and model parameters.

  1. Open the sidebar on the left (if not already visible).
  2. Click on the first log entry to view detailed information, such as execution time, cost, token usage, and model parameters.
  3. Examine performance metrics or even click Open in Playground to experiment with the old prompt.

You can search for a term or use filters to locate a specific prompt log. (Read more)
