Observability
Learn how to use logging to monitor performance and optimize your prompts.
This module requires an existing prompt in your PromptLayer account. Please follow the Getting Started guide to create one if needed.
When building a prompt, observability becomes critical. For example, the ai-poet prompt generates a creative haiku based on a given topic, and enabling logging helps you monitor important performance details. Logging can reveal:
- Execution Issues: Did the prompt return a reasonable output as expected?
- Execution Time: How quickly the prompt is executed.
- Token Usage: The number of tokens used during execution, which directly impacts cost.
- Cost Metrics: Whether the prompt runs efficiently within your budget.
By reviewing these logs, you can determine if your ai-poet prompt is performing as expected and make adjustments if necessary—ensuring that your creative content is generated both efficiently and effectively.
Create an API Key
Before you can enable logging, you need to authenticate your PromptLayer client with an API key.
- Go to your PromptLayer Settings.
- Click on Create an API key to generate a new API key.
- Copy the API key for later use; a sketch of one common way to store it follows this list. (Read more)
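A common pattern, sketched below, is to keep the key out of source code by exporting it as an environment variable and reading it at runtime. The variable name `PROMPTLAYER_API_KEY` is the convention used in this guide, not a requirement.

```python
import os

# Assumes the key was exported in your shell beforehand, e.g.
#   export PROMPTLAYER_API_KEY="pl_..."
# Reading it from the environment keeps the key out of source control.
api_key = os.environ["PROMPTLAYER_API_KEY"]
```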
Enable Logging
Set up logging and tracing within your SDK to capture execution data. This enables you to monitor latency, track errors, and record metadata.
- Install the PromptLayer SDK.
- Import the PromptLayer client.
- Initialize the PromptLayer client with your API key and logging enabled.
- Run the “ai-poet” prompt using the `pl_client.run` method, providing an input variable such as `{topic: "The Ocean"}` (a Python sketch follows this list).
- Review the generated logs to analyze metrics like execution time, token usage, and cost, then use these insights to fine-tune your prompt.
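The snippet below is a minimal sketch of these steps in Python. It assumes the PromptLayer Python SDK is installed (`pip install promptlayer`) and that your API key is available in the `PROMPTLAYER_API_KEY` environment variable; check the SDK reference for the exact constructor and `run` parameters.

```python
import os

from promptlayer import PromptLayer

# Initialize the client; requests made through it are logged to your
# PromptLayer dashboard.
pl_client = PromptLayer(api_key=os.environ["PROMPTLAYER_API_KEY"])

# Run the "ai-poet" prompt, passing the topic as an input variable.
response = pl_client.run(
    prompt_name="ai-poet",
    input_variables={"topic": "The Ocean"},
)

# The response includes the model output along with identifiers that
# link the call back to its log entry in PromptLayer.
print(response)
```

After the call completes, the request should appear in the Requests view described in the next section.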
To read more about logging, check out the Logging Metadata section of the Quickstart guide.
Run and View Logs
Review your logs to troubleshoot issues and gather performance metrics.
- Execute your prompt (via SDK or code).
- Open the sidebar on the left and click the Requests tab to view log entries.
- Click on the log entry to see execution time, cost, token usage, and more.
- Use these insights to refine and optimize your prompt.
Use filters to search for specific requests, such as filtering by tags. In this guide, we added the tag `onboarding_guide` to the request (see the sketch below).
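Continuing the earlier sketch (same `pl_client`), the tag can be attached when the prompt is run; this assumes the `run` method accepts a `tags` parameter, as shown in the PromptLayer docs.

```python
# Tag the request so it is easy to filter in the Requests view.
response = pl_client.run(
    prompt_name="ai-poet",
    input_variables={"topic": "The Ocean"},
    tags=["onboarding_guide"],
)
```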
You can also open these logs in the Playground, share them with your team, and add them to a dataset to use them for refining and testing.
Additional Resources:
- For more on running prompts, visit the Running Requests guide.
- To learn more about SDKs, check out our Python and JavaScript guides.
- For more on Logging, check out our Advanced Logging guide.
- To learn more about filtering logs, check out the Advanced Search section of the Quickstart guide.