Ingest Traces (OTLP)
POST /v1/traces
curl --request POST \
  --url https://api.promptlayer.com/v1/traces \
  --header 'Content-Type: <content-type>' \
  --header 'X-API-KEY: <x-api-key>' \
  --data '
{
  "resourceSpans": [
    {
      "resource": {
        "attributes": [
          {
            "key": "<string>",
            "value": {
              "stringValue": "<string>",
              "intValue": "<string>",
              "doubleValue": 123,
              "boolValue": true,
              "arrayValue": {
                "values": [
                  {}
                ]
              }
            }
          }
        ]
      },
      "scopeSpans": [
        {
          "scope": {
            "name": "<string>",
            "version": "<string>"
          },
          "spans": [
            {
              "traceId": "<string>",
              "spanId": "<string>",
              "name": "<string>",
              "startTimeUnixNano": "<string>",
              "parentSpanId": "<string>",
              "kind": 0,
              "endTimeUnixNano": "<string>",
              "attributes": [
                {
                  "key": "<string>",
                  "value": {
                    "stringValue": "<string>",
                    "intValue": "<string>",
                    "doubleValue": 123,
                    "boolValue": true,
                    "arrayValue": {
                      "values": [
                        {}
                      ]
                    }
                  }
                }
              ],
              "status": {
                "code": 0,
                "message": "<string>"
              },
              "events": [
                {
                  "timeUnixNano": "<string>",
                  "name": "<string>",
                  "attributes": [
                    {
                      "key": "<string>",
                      "value": {
                        "stringValue": "<string>",
                        "intValue": "<string>",
                        "doubleValue": 123,
                        "boolValue": true,
                        "arrayValue": {
                          "values": [
                            {}
                          ]
                        }
                      }
                    }
                  ]
                }
              ],
              "links": [
                {
                  "traceId": "<string>",
                  "spanId": "<string>",
                  "attributes": [
                    {
                      "key": "<string>",
                      "value": {
                        "stringValue": "<string>",
                        "intValue": "<string>",
                        "doubleValue": 123,
                        "boolValue": true,
                        "arrayValue": {
                          "values": [
                            {}
                          ]
                        }
                      }
                    }
                  ]
                }
              ]
            }
          ]
        }
      ]
    }
  ]
}
'
Example response:
{
  "partialSuccess": {
    "rejectedSpans": 123,
    "errorMessage": "<string>"
  }
}
Ingest OpenTelemetry traces using the standard OTLP/HTTP protocol. This endpoint is compatible with any OpenTelemetry SDK or Collector configured to export over HTTP. Spans that carry GenAI semantic convention attributes are automatically converted into PromptLayer request logs, giving you full observability without any PromptLayer-specific instrumentation.

Content Types

Content-Type | Description
application/x-protobuf | Binary protobuf encoding (recommended)
application/json | JSON encoding
Both formats support Content-Encoding: gzip.
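As a minimal sketch of a gzip-compressed JSON export using only the Python standard library (the payload and API key are placeholders, and the actual POST is left commented so the snippet stays self-contained):

```python
import gzip
import json

# Placeholder payload; a real export carries resourceSpans as shown above.
payload = {"resourceSpans": []}

# Gzip-compress the JSON body and declare it via Content-Encoding.
body = gzip.compress(json.dumps(payload).encode("utf-8"))
headers = {
    "Content-Type": "application/json",
    "Content-Encoding": "gzip",
    "X-API-KEY": "your-api-key",  # placeholder
}

# The actual request, e.g. with urllib:
# import urllib.request
# req = urllib.request.Request("https://api.promptlayer.com/v1/traces",
#                              data=body, headers=headers, method="POST")
# urllib.request.urlopen(req)
```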

GenAI Semantic Conventions

Spans with GenAI semantic convention attributes are automatically recognized and converted into request logs. The following attributes are extracted:
Attribute | Maps to
gen_ai.request.model | Model / engine
gen_ai.provider.name | Provider type
gen_ai.operation.name | Operation name (e.g. chat, text_completion, embeddings)
gen_ai.usage.input_tokens | Input token count
gen_ai.usage.output_tokens | Output token count
gen_ai.input.messages | Request input messages
gen_ai.output.messages | Response output messages
gen_ai.request.temperature | Temperature parameter
gen_ai.request.max_tokens | Max tokens parameter
gen_ai.request.top_p | Top-p parameter
gen_ai.response.finish_reasons | Finish reasons
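A sketch of building these attributes in OTLP JSON form (the helper name is hypothetical; note that OTLP JSON encodes 64-bit intValue fields as decimal strings, matching the examples in this document):

```python
def genai_chat_attributes(model, provider, operation, input_tokens, output_tokens):
    """Build OTLP-JSON attribute entries following the GenAI conventions above.

    Hypothetical helper for illustration; intValue is a decimal string
    per the OTLP JSON encoding.
    """
    return [
        {"key": "gen_ai.request.model", "value": {"stringValue": model}},
        {"key": "gen_ai.provider.name", "value": {"stringValue": provider}},
        {"key": "gen_ai.operation.name", "value": {"stringValue": operation}},
        {"key": "gen_ai.usage.input_tokens", "value": {"intValue": str(input_tokens)}},
        {"key": "gen_ai.usage.output_tokens", "value": {"intValue": str(output_tokens)}},
    ]

attrs = genai_chat_attributes("gpt-4", "openai", "chat", 25, 120)
```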

Linking to Prompt Templates

You can link ingested spans to existing prompt templates in your workspace by adding custom span attributes:
Attribute | Type | Description
promptlayer.prompt.name | string | Name of the prompt template to link
promptlayer.prompt.id | integer | ID of the prompt template to link (alternative to name)
promptlayer.prompt.version | integer | Specific version number (optional)
promptlayer.prompt.label | string | Label to resolve to a version number (e.g. production). Used when version is not set.
Either name or id must be provided to identify the prompt template. If the prompt is not found in the workspace, the span is still ingested but no request log link is created.
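The rules above can be sketched as a small helper that builds the span attributes (the helper and the example template name are hypothetical):

```python
def prompt_link_attributes(name=None, prompt_id=None, version=None, label=None):
    """Hypothetical helper building promptlayer.* span attributes.

    Either name or prompt_id is required; label is only applied when
    version is not set, per the table above.
    """
    if name is None and prompt_id is None:
        raise ValueError("either name or prompt_id must be provided")
    attrs = {}
    if name is not None:
        attrs["promptlayer.prompt.name"] = name
    if prompt_id is not None:
        attrs["promptlayer.prompt.id"] = prompt_id
    if version is not None:
        attrs["promptlayer.prompt.version"] = version
    elif label is not None:
        attrs["promptlayer.prompt.label"] = label
    return attrs

# Link spans to the "production" version of a hypothetical template.
attrs = prompt_link_attributes(name="welcome-email", label="production")
```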

Validation

  • traceId must be exactly 32 hex characters (16 bytes)
  • spanId must be exactly 16 hex characters (8 bytes)
  • parentSpanId, if provided, must be exactly 16 hex characters
Invalid spans are rejected and reported in the partialSuccess field of the response. Valid spans in the same request are still accepted.
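A sketch of a client-side pre-flight check for these ID rules (the function is hypothetical; the server performs the authoritative validation):

```python
import re

# 16 bytes -> 32 hex characters; 8 bytes -> 16 hex characters.
TRACE_ID_RE = re.compile(r"^[0-9a-f]{32}$", re.IGNORECASE)
SPAN_ID_RE = re.compile(r"^[0-9a-f]{16}$", re.IGNORECASE)

def span_ids_valid(trace_id, span_id, parent_span_id=None):
    """Check the traceId/spanId/parentSpanId rules listed above."""
    if not TRACE_ID_RE.match(trace_id):
        return False
    if not SPAN_ID_RE.match(span_id):
        return False
    if parent_span_id is not None and not SPAN_ID_RE.match(parent_span_id):
        return False
    return True
```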

Example: Configuring an OpenTelemetry SDK

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Export spans to PromptLayer over OTLP/HTTP, authenticating with your API key.
exporter = OTLPSpanExporter(
    endpoint="https://api.promptlayer.com/v1/traces",
    headers={"X-API-KEY": "your-api-key"},
)

# Batch spans before export and register the provider globally.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

Example: JSON Request

{
  "resourceSpans": [
    {
      "resource": {
        "attributes": [
          { "key": "service.name", "value": { "stringValue": "my-llm-app" } }
        ]
      },
      "scopeSpans": [
        {
          "scope": { "name": "openai.instrumentation", "version": "1.0.0" },
          "spans": [
            {
              "traceId": "a1b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6",
              "spanId": "1a2b3c4d5e6f7a8b",
              "name": "chat gpt-4",
              "kind": 3,
              "startTimeUnixNano": "1700000000000000000",
              "endTimeUnixNano": "1700000001500000000",
              "attributes": [
                { "key": "gen_ai.request.model", "value": { "stringValue": "gpt-4" } },
                { "key": "gen_ai.provider.name", "value": { "stringValue": "openai" } },
                { "key": "gen_ai.operation.name", "value": { "stringValue": "chat" } },
                { "key": "gen_ai.usage.input_tokens", "value": { "intValue": "25" } },
                { "key": "gen_ai.usage.output_tokens", "value": { "intValue": "120" } }
              ],
              "status": { "code": 1 }
            }
          ]
        }
      ]
    }
  ]
}

Headers

X-API-KEY
string
required

API key to authorize the operation.

Content-Type
enum<string>
required

The encoding of the request body. Use application/x-protobuf for binary protobuf or application/json for JSON.

Available options:
application/x-protobuf,
application/json
Content-Encoding
enum<string>

Set to gzip if the request body is gzip-compressed.

Available options:
gzip

Body

An OTLP ExportTraceServiceRequest in JSON encoding. See the OTLP specification for the full schema.

resourceSpans
object[]

An array of ResourceSpans. Each element describes spans from a single instrumented resource.

Response

Successful Response. When using protobuf, the response is a binary ExportTraceServiceResponse. When using JSON, the response is a JSON object.

Response to an OTLP trace export request.

partialSuccess
object

Present only when some spans were rejected. Null when all spans were accepted.
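A sketch of reading this response shape on the client (the helper name is hypothetical):

```python
def rejected_span_count(response_json):
    """Return the number of rejected spans from an export response.

    Hypothetical helper: partialSuccess is absent or null when all spans
    were accepted, per the response description above.
    """
    partial = response_json.get("partialSuccess")
    if not partial:
        return 0
    return int(partial.get("rejectedSpans", 0))
```

For example, a fully successful export yields a count of 0, while a partial failure reports how many spans were dropped alongside an error message.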