Python
To get started, create an account by clicking “Log in” on PromptLayer. Once logged in, click the button to create an API key and save this in a secure location (Guide to Using Env Vars).
Once you have that all set up, install PromptLayer using pip.
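For example, assuming the package is published on PyPI under the name promptlayer:

```bash
pip install promptlayer
```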
The PromptLayer Python library supports both OpenAI and Anthropic LLMs!
Set up a PromptLayer client in your Python file. Optionally, you can specify the API key in the client.
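A minimal sketch of the setup, assuming the SDK exposes a PromptLayer class and falls back to a PROMPTLAYER_API_KEY environment variable when no key is passed:

```python
from promptlayer import PromptLayer

# Pass the key explicitly, or omit api_key to fall back to the
# PROMPTLAYER_API_KEY environment variable (assumed convention)
promptlayer_client = PromptLayer(api_key="pl_your_api_key_here")
```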
OpenAI
In the Python file where you use OpenAI APIs, add the following. This allows us to keep track of your requests without needing any other code changes.
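A sketch of that change, assuming the client exposes an OpenAI-compatible proxy at promptlayer_client.openai:

```python
# Get the OpenAI client class through PromptLayer so every request is logged
OpenAI = promptlayer_client.openai.OpenAI

client = OpenAI()  # accepts the same arguments as the regular OpenAI client
```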
You can then use openai as you would if you had imported it directly.
There is only one difference… PromptLayer allows you to add tags through the pl_tags argument. This allows you to track and group requests in the dashboard.
Tags are not required but we recommend them!
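As a sketch, tags ride alongside the normal OpenAI arguments (the model name and tag values below are placeholders):

```python
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": "Say hello!"}],
    pl_tags=["getting-started"],  # PromptLayer-specific; used to group requests in the dashboard
)
```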
After making your first few requests, you should be able to see them in the PromptLayer dashboard!
Here is a complete code snippet:
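The sketch below puts the pieces together; the model name and tags are placeholders:

```python
from promptlayer import PromptLayer

promptlayer_client = PromptLayer()  # assumes PROMPTLAYER_API_KEY is set in the environment

# Route OpenAI through PromptLayer so every request is logged
OpenAI = promptlayer_client.openai.OpenAI
client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write a one-line poem about logging."},
    ],
    pl_tags=["getting-started"],
)
print(response.choices[0].message.content)
```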
Anthropic
Using Anthropic with PromptLayer is very similar to how one would use OpenAI.
Below is an example code snippet of the one line replacement:
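A sketch of that swap, assuming an Anthropic proxy at promptlayer_client.anthropic (the model name is a placeholder):

```python
# One-line swap: get the Anthropic client class through PromptLayer
Anthropic = promptlayer_client.anthropic.Anthropic
client = Anthropic()

response = client.messages.create(
    model="claude-3-haiku-20240307",  # placeholder model
    max_tokens=100,
    messages=[{"role": "user", "content": "Say hello!"}],
    pl_tags=["anthropic-test"],
)
print(response.content[0].text)
```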
Here is how it would look on the dashboard:
Async Support
PromptLayer supports asynchronous operations, ideal for managing concurrent tasks in non-blocking environments like web servers, microservices, or Jupyter notebooks.
Initializing the Async Client
To use asynchronous non-blocking methods, initialize AsyncPromptLayer as shown:
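For example:

```python
from promptlayer import AsyncPromptLayer

# As with the synchronous client, the key can presumably be omitted if
# PROMPTLAYER_API_KEY is set in the environment
async_promptlayer_client = AsyncPromptLayer(api_key="pl_your_api_key_here")
```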
Async Usage Examples
The asynchronous client functions similarly to the synchronous version, but allows for non-blocking execution with asyncio. Below are example uses.
Example 1: Async Template Management
Use asynchronous methods to manage templates:
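A minimal sketch, assuming templates.get() and templates.all() mirror the synchronous signatures ("TestPrompt" is a placeholder name):

```python
import asyncio

from promptlayer import AsyncPromptLayer

async_promptlayer_client = AsyncPromptLayer()

async def main():
    # Fetch one template by name, then list all templates
    template = await async_promptlayer_client.templates.get("TestPrompt")
    all_templates = await async_promptlayer_client.templates.all()
    print(template)
    print(len(all_templates))

asyncio.run(main())
```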
Example 2: Async Workflow Execution
Run workflows asynchronously for better efficiency:
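A sketch, assuming run_workflow() accepts workflow_name and input_variables keyword arguments (both values are placeholders, and the client comes from the initialization step above):

```python
import asyncio

async def main():
    result = await async_promptlayer_client.run_workflow(
        workflow_name="example-workflow",            # placeholder workflow name
        input_variables={"topic": "observability"},  # inputs your workflow expects
    )
    print(result)

asyncio.run(main())
```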
Example 3: Async Tracking and Logging
Track and log requests asynchronously:
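A sketch using the track methods listed in the table below, assuming they take a request_id plus a metadata dict or a numeric score (the request id and values are placeholders):

```python
import asyncio

async def main(pl_request_id):
    # Attach metadata to a previously logged request
    await async_promptlayer_client.track.metadata(
        request_id=pl_request_id,
        metadata={"user_id": "abc123"},  # placeholder metadata
    )
    # Attach a quality score to the same request
    await async_promptlayer_client.track.score(
        request_id=pl_request_id,
        score=100,
    )

asyncio.run(main(pl_request_id=12345))  # use a real request id from a logged call
```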
For more information on custom logging, please visit our Custom Logging Documentation.
Example 4: Asynchronous Prompt Execution with the run Method
You can execute prompt templates asynchronously using the run method. This allows you to run a prompt template by name with given input variables.
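A sketch, assuming run() accepts prompt_name and input_variables keyword arguments:

```python
import asyncio

async def main():
    response = await async_promptlayer_client.run(
        prompt_name="TestPrompt",
        input_variables={"topic": "observability"},  # variables your template expects
    )
    print(response)

asyncio.run(main())
```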
In this example, replace “TestPrompt” with the name of your prompt template, and provide any required input variables.
Supported Methods: Synchronous vs. Asynchronous
The following table provides an overview of the methods currently available in both synchronous and asynchronous versions of the PromptLayer client:
| Method | Description | Synchronous Version | Asynchronous Version |
|---|---|---|---|
| templates.get() | Retrieves a template by name. | promptlayer_client.templates.get() | async_promptlayer_client.templates.get() |
| templates.all() | Retrieves all templates. | promptlayer_client.templates.all() | async_promptlayer_client.templates.all() |
| run() | Executes a prompt template. | promptlayer_client.run() | async_promptlayer_client.run() |
| run_workflow() | Executes a workflow. | promptlayer_client.run_workflow() | async_promptlayer_client.run_workflow() |
| track.metadata() | Tracks metadata. | promptlayer_client.track.metadata() | async_promptlayer_client.track.metadata() |
| track.group() | Tracks a group. | promptlayer_client.track.group() | async_promptlayer_client.track.group() |
| track.prompt() | Tracks a prompt. | promptlayer_client.track.prompt() | async_promptlayer_client.track.prompt() |
| track.score() | Tracks a score. | promptlayer_client.track.score() | async_promptlayer_client.track.score() |
| group.create() | Creates a new group. | promptlayer_client.group.create() | async_promptlayer_client.group.create() |
| log_request() | Logs a request. | promptlayer_client.log_request() | async_promptlayer_client.log_request() |
Note: All asynchronous methods require an active event loop. Use them within an async function and run the function using asyncio.run() or another method suitable for managing event loops (e.g., await in Jupyter notebooks).
Want to say hi 👋, submit a feature request, or report a bug? ✉️ Contact us