# Advanced Logging

## Custom Logging

### When to Use Custom Logging
Use the `log_request` method when:

- You’re not using PromptLayer’s proxied versions of the OpenAI or Anthropic clients
- You’re not using `pl_client.run()` for executing prompts
- You need more flexibility (e.g., background processing, custom models)
- You want to track requests made outside the PromptLayer SDK
While custom logging requires more manual work, it offers greater control over the logging process and supports any LLM provider.
### API Reference

For complete documentation on the `log_request` API, see the Log Request API Reference.
### Request Parameters

When logging a custom request, you can use the following parameters (see the API Reference for details):

- `provider` (required): The LLM provider name (e.g., “openai”, “anthropic”)
- `model` (required): The specific model used (e.g., “gpt-4o”, “claude-3-7-sonnet-20250219”)
- `input` (required): The input prompt in Prompt Blueprint format
- `output` (required): The model response in Prompt Blueprint format
- `request_start_time`: Timestamp when the request started
- `request_end_time`: Timestamp when the response was received
- `prompt_name`: Name of the prompt template, if using one from PromptLayer
- `prompt_version_number`: Version number of the prompt template
- `prompt_input_variables`: Variables used in the prompt template
- `input_tokens`: Number of input tokens used
- `output_tokens`: Number of output tokens generated
- `tags`: Array of strings for categorizing requests
- `metadata`: Custom JSON object for searching and filtering requests later
### Basic Usage

The `input` and `output` must be in Prompt Blueprint format:
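A minimal sketch, assuming the Python SDK (`promptlayer` package). The chat-style Prompt Blueprint shown here — a `type` plus a list of messages whose `content` is a list of typed parts — follows the format described above; check the Log Request API Reference for the authoritative schema.

```python
import time

# Prompt Blueprint chat format: "type" plus a list of messages whose
# content is a list of typed parts (here, plain text).
input_blueprint = {
    "type": "chat",
    "messages": [
        {"role": "user", "content": [{"type": "text", "text": "What is the capital of Japan?"}]},
    ],
}
output_blueprint = {
    "type": "chat",
    "messages": [
        {"role": "assistant", "content": [{"type": "text", "text": "The capital of Japan is Tokyo."}]},
    ],
}

start_time = time.time()
# ... your actual model call happens between the two timestamps ...
end_time = time.time()

# With a configured client (pl_client = PromptLayer(api_key=...)):
# pl_client.log_request(
#     provider="openai",
#     model="gpt-4o",
#     input=input_blueprint,
#     output=output_blueprint,
#     request_start_time=start_time,
#     request_end_time=end_time,
#     tags=["example"],
#     metadata={"user_id": "123"},
# )
```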
### Provider Conversion Helpers
#### OpenAI Format Converter
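A sketch of what such a converter might look like, assuming the OpenAI chat completion response is available as a plain dict (e.g., via `response.model_dump()` from the official SDK). The blueprint field names follow the chat format above, not a schema confirmed by this page.

```python
def openai_response_to_blueprint(response: dict) -> dict:
    """Convert an OpenAI chat completion response (as a dict) into
    Prompt Blueprint output format."""
    message = response["choices"][0]["message"]
    content = []
    if message.get("content"):
        # OpenAI returns the reply as a single string; wrap it as a text part.
        content.append({"type": "text", "text": message["content"]})
    return {
        "type": "chat",
        "messages": [{"role": message.get("role", "assistant"), "content": content}],
    }

# Example with a canned response dict standing in for a real API call:
sample = {"choices": [{"message": {"role": "assistant", "content": "Tokyo."}}]}
blueprint = openai_response_to_blueprint(sample)
```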
#### Anthropic Format Converter
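A corresponding sketch for Anthropic, assuming a Messages API response as a plain dict. Anthropic already returns content as a list of typed blocks, so text blocks map almost directly onto blueprint parts.

```python
def anthropic_response_to_blueprint(response: dict) -> dict:
    """Convert an Anthropic Messages API response (as a dict) into
    Prompt Blueprint output format."""
    content = [
        {"type": "text", "text": block["text"]}
        for block in response.get("content", [])
        if block.get("type") == "text"
    ]
    return {
        "type": "chat",
        "messages": [{"role": response.get("role", "assistant"), "content": content}],
    }

# Example with a canned response dict standing in for a real API call:
sample = {"role": "assistant", "content": [{"type": "text", "text": "Tokyo."}]}
blueprint = anthropic_response_to_blueprint(sample)
```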
#### OpenAI Example
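An end-to-end sketch for OpenAI. The response dict here is a stand-in mirroring the shape of `client.chat.completions.create(...).model_dump()`; in real use the actual API call sits between the two timestamps, and the final `log_request` call (commented out) requires a configured PromptLayer client.

```python
import time

start_time = time.time()
# Stand-in for a real OpenAI call; mirrors the shape of
# client.chat.completions.create(...).model_dump().
response = {
    "model": "gpt-4o",
    "choices": [{"message": {"role": "assistant", "content": "The capital of Japan is Tokyo."}}],
    "usage": {"prompt_tokens": 14, "completion_tokens": 8},
}
end_time = time.time()

payload = {
    "provider": "openai",
    "model": response["model"],
    "input": {
        "type": "chat",
        "messages": [
            {"role": "user", "content": [{"type": "text", "text": "What is the capital of Japan?"}]},
        ],
    },
    "output": {
        "type": "chat",
        "messages": [
            {
                "role": "assistant",
                "content": [{"type": "text", "text": response["choices"][0]["message"]["content"]}],
            },
        ],
    },
    "request_start_time": start_time,
    "request_end_time": end_time,
    "input_tokens": response["usage"]["prompt_tokens"],
    "output_tokens": response["usage"]["completion_tokens"],
}
# pl_client.log_request(**payload)  # with a configured PromptLayer client
```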
#### Anthropic Example
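The same flow for Anthropic, again with a stand-in response dict mirroring `client.messages.create(...).model_dump()`. Note that Anthropic's text blocks already match the blueprint's `{"type": "text", "text": ...}` part shape, so the output content can pass through unchanged.

```python
import time

start_time = time.time()
# Stand-in for a real Anthropic call; mirrors the shape of
# client.messages.create(...).model_dump().
response = {
    "model": "claude-3-7-sonnet-20250219",
    "role": "assistant",
    "content": [{"type": "text", "text": "The capital of Japan is Tokyo."}],
    "usage": {"input_tokens": 14, "output_tokens": 8},
}
end_time = time.time()

payload = {
    "provider": "anthropic",
    "model": response["model"],
    "input": {
        "type": "chat",
        "messages": [
            {"role": "user", "content": [{"type": "text", "text": "What is the capital of Japan?"}]},
        ],
    },
    "output": {
        # Anthropic text blocks already use {"type": "text", "text": ...}.
        "type": "chat",
        "messages": [{"role": response["role"], "content": response["content"]}],
    },
    "request_start_time": start_time,
    "request_end_time": end_time,
    "input_tokens": response["usage"]["input_tokens"],
    "output_tokens": response["usage"]["output_tokens"],
}
# pl_client.log_request(**payload)  # with a configured PromptLayer client
```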
### Working with Tools and Function Calls
For OpenAI/Anthropic function calling or tool use:
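A sketch of one possible approach, using an OpenAI-style response dict: carry the model's `tool_calls` into the blueprint output message alongside any text content. Passing `tool_calls` through unchanged is an assumption, not a documented schema — consult the Log Request API Reference for the exact shape PromptLayer expects.

```python
def openai_tool_calls_to_blueprint(response: dict) -> dict:
    """Convert an OpenAI chat completion that requested tool calls into a
    Prompt Blueprint-style output message. The pass-through of `tool_calls`
    is an assumption; check the Log Request API Reference for the exact
    schema PromptLayer expects."""
    message = response["choices"][0]["message"]
    content = []
    if message.get("content"):
        content.append({"type": "text", "text": message["content"]})
    blueprint_message = {"role": message.get("role", "assistant"), "content": content}
    if message.get("tool_calls"):
        blueprint_message["tool_calls"] = message["tool_calls"]  # assumed pass-through
    return {"type": "chat", "messages": [blueprint_message]}

# Canned response: the model answered with a tool call and no text.
sample = {
    "choices": [{
        "message": {
            "role": "assistant",
            "content": None,
            "tool_calls": [{
                "id": "call_1",
                "type": "function",
                "function": {"name": "get_weather", "arguments": '{"city": "Tokyo"}'},
            }],
        }
    }]
}
blueprint = openai_tool_calls_to_blueprint(sample)
```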