PromptLayer natively supports OpenTelemetry (OTEL), the industry-standard observability framework. You can send traces from any OpenTelemetry-compatible SDK or Collector directly to PromptLayer — no PromptLayer SDK required. This is ideal when:
  • Your framework isn’t listed on the Integrations page
  • You already have an OpenTelemetry pipeline and want to add PromptLayer as a destination
  • You want vendor-neutral instrumentation
If you’re using a supported framework like the Vercel AI SDK, OpenAI Agents SDK, or Claude Code, see the Integrations page for framework-specific setup — those integrations handle the OTEL configuration for you.

How It Works

PromptLayer exposes an OTLP/HTTP endpoint at:
https://api.promptlayer.com/v1/traces
Any OpenTelemetry SDK or Collector can export traces to this endpoint. Spans that include GenAI semantic convention attributes are automatically converted into PromptLayer request logs.

Setup

Configure your OpenTelemetry SDK to export traces to PromptLayer using the OTLP/HTTP exporter.
# Install required packages:
# pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.sdk.resources import Resource
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

exporter = OTLPSpanExporter(
    endpoint="https://api.promptlayer.com/v1/traces",
    headers={"X-API-KEY": "your-promptlayer-api-key"},
)

provider = TracerProvider(
    resource=Resource.create({"service.name": "my-llm-app"})
)
provider.add_span_processor(BatchSpanProcessor(exporter))

# Register the provider globally so trace.get_tracer() works anywhere in your app
trace.set_tracer_provider(provider)

# Use the tracer to create spans
tracer = trace.get_tracer("my-llm-app")

GenAI Semantic Conventions

Spans that use GenAI semantic conventions are automatically parsed into PromptLayer request logs. Add these attributes to your LLM call spans:
| Attribute | Description |
| --- | --- |
| gen_ai.request.model | Model name (e.g. gpt-4, claude-sonnet-4-20250514) |
| gen_ai.provider.name | Provider (e.g. openai, anthropic) |
| gen_ai.operation.name | Operation type (chat, text_completion, embeddings) |
| gen_ai.usage.input_tokens | Input token count |
| gen_ai.usage.output_tokens | Output token count |
| gen_ai.input.messages | Request messages |
| gen_ai.output.messages | Response messages |
| gen_ai.request.temperature | Temperature parameter |
| gen_ai.request.max_tokens | Max tokens parameter |
| gen_ai.response.finish_reasons | Finish reasons |
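As a sketch of how these attributes fit together for a single chat call (the values are illustrative, not real telemetry): OpenTelemetry attribute values must be primitives or arrays of primitives, so structured message payloads are typically passed as JSON-encoded strings.

```python
import json

# Illustrative attribute set for one chat completion span,
# following the GenAI semantic conventions table above.
genai_attributes = {
    "gen_ai.request.model": "gpt-4",
    "gen_ai.provider.name": "openai",
    "gen_ai.operation.name": "chat",
    "gen_ai.usage.input_tokens": 12,
    "gen_ai.usage.output_tokens": 48,
    # Messages are JSON-serialized, since OTEL attributes can't hold nested objects
    "gen_ai.input.messages": json.dumps([{"role": "user", "content": "Hello"}]),
    "gen_ai.output.messages": json.dumps([{"role": "assistant", "content": "Hi there!"}]),
    "gen_ai.request.temperature": 0.7,
    "gen_ai.request.max_tokens": 256,
}

# Applied to a span, e.g.:
# with tracer.start_as_current_span("llm-call") as span:
#     span.set_attributes(genai_attributes)
```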

Event-Based Conventions

PromptLayer also supports the newer event-based GenAI semantic conventions where message content is sent as span events rather than span attributes. This format is used by frameworks like LiveKit and newer versions of OpenTelemetry GenAI instrumentation. The following event types are recognized:
| Event Name | Description |
| --- | --- |
| gen_ai.system.message | System message |
| gen_ai.user.message | User message |
| gen_ai.assistant.message | Assistant message (including tool calls) |
| gen_ai.tool.message | Tool/function result message |
| gen_ai.choice | Model response/choice |
Event attributes like gen_ai.system.message.content, gen_ai.user.message.content, and tool call data are automatically extracted and mapped to PromptLayer request logs.
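To illustrate the mapping, here is a minimal sketch (not PromptLayer's actual parser) of how these events could be folded into a flat message list. The event names come from the table above; the ".content" attribute-key pattern follows the examples just mentioned, and the role assignments are assumptions for illustration.

```python
# Roles assumed for each recognized GenAI message event (illustrative only)
EVENT_ROLES = {
    "gen_ai.system.message": "system",
    "gen_ai.user.message": "user",
    "gen_ai.assistant.message": "assistant",
    "gen_ai.tool.message": "tool",
    "gen_ai.choice": "assistant",  # model response/choice
}

def events_to_messages(events):
    """events: iterable of (event_name, attributes_dict) pairs from a span."""
    messages = []
    for name, attrs in events:
        role = EVENT_ROLES.get(name)
        if role is None:
            continue  # not a recognized GenAI message event
        # Content lives under "<event_name>.content", e.g. gen_ai.user.message.content
        content = attrs.get(f"{name}.content", "")
        messages.append({"role": role, "content": content})
    return messages
```

A real parser would also extract tool-call payloads from assistant and tool events, as noted in the table.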
When both attribute-based messages (gen_ai.input.messages) and event-based messages are present on the same span, attribute-based messages take priority.

Linking to Prompt Templates

You can associate OTEL spans with prompt templates in your PromptLayer workspace by setting custom span attributes:
| Attribute | Type | Description |
| --- | --- | --- |
| promptlayer.prompt.name | string | Name of the prompt template |
| promptlayer.prompt.id | integer | ID of the prompt template (alternative to name) |
| promptlayer.prompt.version | integer | Specific version number (optional) |
| promptlayer.prompt.label | string | Label to resolve version (e.g. production) |
from opentelemetry import trace

tracer = trace.get_tracer("my-llm-app")

with tracer.start_as_current_span("llm-call") as span:
    # Link this span to a prompt template
    span.set_attribute("promptlayer.prompt.name", "my-prompt")
    span.set_attribute("promptlayer.prompt.label", "production")

    # Add GenAI attributes
    span.set_attribute("gen_ai.request.model", "gpt-4")
    span.set_attribute("gen_ai.provider.name", "openai")

    # ... make your LLM call ...

Using an OpenTelemetry Collector

If you’re already running an OpenTelemetry Collector, you can add PromptLayer as an additional exporter in your Collector config:
exporters:
  otlphttp/promptlayer:
    # The otlphttp exporter appends /v1/traces to the endpoint automatically,
    # so configure the base URL here
    endpoint: "https://api.promptlayer.com"
    headers:
      X-API-KEY: "${PROMPTLAYER_API_KEY}"

service:
  pipelines:
    traces:
      # Add otlphttp/promptlayer alongside any exporters you already use
      exporters: [otlphttp/promptlayer]
This lets you fan out traces to PromptLayer alongside your existing observability backends (Datadog, New Relic, Jaeger, etc.) without changing your application code.

Content Types

The endpoint accepts both binary protobuf (application/x-protobuf, recommended) and JSON (application/json) encodings. Both support Content-Encoding: gzip.
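For example, a hand-rolled OTLP/JSON export would gzip the request body and set the headers as shown below. The field names follow the OTLP/JSON encoding (resourceSpans, scopeSpans, and hex-encoded trace/span IDs); the IDs, timestamps, and API key here are placeholder values.

```python
import gzip
import json

# Minimal OTLP/JSON payload sketch with one span (placeholder IDs and times)
payload = {
    "resourceSpans": [{
        "resource": {"attributes": [
            {"key": "service.name", "value": {"stringValue": "my-llm-app"}}
        ]},
        "scopeSpans": [{
            "scope": {"name": "my-llm-app"},
            "spans": [{
                "traceId": "5b8efff798038103d269b633813fc60c",  # 16-byte hex
                "spanId": "eee19b7ec3c1b174",                    # 8-byte hex
                "name": "llm-call",
                "kind": 3,  # SPAN_KIND_CLIENT
                "startTimeUnixNano": "1700000000000000000",
                "endTimeUnixNano": "1700000001000000000",
                "attributes": [
                    {"key": "gen_ai.request.model",
                     "value": {"stringValue": "gpt-4"}}
                ],
            }]
        }]
    }]
}

body = gzip.compress(json.dumps(payload).encode("utf-8"))

# POST body to https://api.promptlayer.com/v1/traces with headers:
#   Content-Type: application/json
#   Content-Encoding: gzip
#   X-API-KEY: your-promptlayer-api-key
```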

Next Steps