Python
To get started, create an account by clicking “Log in” on PromptLayer. Once logged in, click the button to create an API key and save it in a secure location (see the Guide to Using Env Vars).
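For example, a common pattern (a minimal sketch, not specific to PromptLayer) is to export the key in your shell and read it from the environment rather than hard-coding it. The variable name `PROMPTLAYER_API_KEY` is what the client is assumed to pick up by default:

```python
import os

# In your shell: export PROMPTLAYER_API_KEY="pl_****"
# setdefault is used here only so this sketch runs standalone;
# in real code the variable would already be set in your environment.
os.environ.setdefault("PROMPTLAYER_API_KEY", "pl_placeholder")

api_key = os.environ["PROMPTLAYER_API_KEY"]
print(api_key)
```

This keeps the key out of source control; the same value can then be passed to the client explicitly if you prefer.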
Once you have that all set up, install PromptLayer using pip.
pip install promptlayer
The PromptLayer Python library supports both OpenAI and Anthropic LLMs!
Set up a PromptLayer client in your Python file.
from promptlayer import PromptLayer
promptlayer_client = PromptLayer()
Optionally, you can specify the API key in the client.
promptlayer_client = PromptLayer(api_key="pl_****")
OpenAI
In the Python file where you use OpenAI APIs, add the following. This allows us to keep track of your requests without needing any other code changes.
OpenAI = promptlayer_client.openai.OpenAI
client = OpenAI()
You can then use openai as you would if you had imported it directly.
There is only one difference: PromptLayer lets you add tags through the pl_tags argument, which allows you to track and group requests in the dashboard.
Tags are not required but we recommend them!
completion = client.chat.completions.create(
model="gpt-3.5-turbo",
messages=[
{"role": "system", "content": "You are an AI."},
{"role": "user", "content": "Compose a poem please."}
],
pl_tags=["getting-started"]
)
After making your first few requests, you should be able to see them in the PromptLayer dashboard!
Here is a complete code snippet:
from promptlayer import PromptLayer
promptlayer_client = PromptLayer()
# Swap out 'from openai import OpenAI'
OpenAI = promptlayer_client.openai.OpenAI
client = OpenAI()
completion = client.chat.completions.create(
model="gpt-3.5-turbo",
messages=[
{"role": "system", "content": "You are an AI."},
{"role": "user", "content": "Compose a poem please."}
],
pl_tags=["getting-started"]
)
print(completion.choices[0].message)
Anthropic
Using Anthropic with PromptLayer is very similar to how one would use OpenAI.
Below is an example code snippet of the one-line replacement:
from promptlayer import PromptLayer
promptlayer_client = PromptLayer()
# Swap out 'from anthropic import Anthropic'
anthropic = promptlayer_client.anthropic
client = anthropic.Anthropic()
completion = client.completions.create(
prompt=f'{anthropic.HUMAN_PROMPT} How many toes do dogs have? {anthropic.AI_PROMPT}',
stop_sequences=[anthropic.HUMAN_PROMPT],
model='claude-v1-100k',
max_tokens_to_sample=100,
pl_tags=['animal-toes']
)
print(completion)
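As a side note, HUMAN_PROMPT and AI_PROMPT in the snippet above are just string markers that the legacy Anthropic completions format expects around each turn. A minimal sketch of how the prompt string is assembled (the constant values are redefined locally so the sketch runs standalone; in real code use anthropic.HUMAN_PROMPT and anthropic.AI_PROMPT):

```python
# The anthropic SDK's legacy prompt markers ("\n\nHuman:" / "\n\nAssistant:"),
# redefined here so this sketch has no dependencies.
HUMAN_PROMPT = "\n\nHuman:"
AI_PROMPT = "\n\nAssistant:"

# The prompt must start with a Human turn and end with the Assistant marker,
# which is why the f-string wraps the question between the two constants.
prompt = f"{HUMAN_PROMPT} How many toes do dogs have? {AI_PROMPT}"
print(repr(prompt))
```

Passing HUMAN_PROMPT as a stop sequence, as the snippet above does, stops generation if the model starts writing a new Human turn.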
Here is how it would look on the dashboard:
Want to say hi 👋 , submit a feature request, or report a bug? ✉️ Contact us