Custom providers let you connect to additional LLM providers beyond the built-in options, including DeepSeek, Grok, and more!

Setting Up a Custom Provider

To add a custom provider to your workspace:
  1. Navigate to Settings → Custom Providers and Models
  2. Click the Add Custom Provider button
  3. Configure the provider with the following details:
    • Name: A descriptive name for your provider (e.g., “DeepSeek”)
    • Client: Select the appropriate client type for your provider’s base URL
    • Base URL: The endpoint URL for your custom provider
    • API Key: The key used to authenticate requests to the provider
[Screenshot: Add Custom Provider modal]
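The Client and Base URL fields work together: the client type determines the request format and endpoint paths, while the base URL supplies the host those paths are appended to. A minimal sketch of that pairing for an OpenAI-compatible client (the helper name is ours, and the DeepSeek URL is an illustrative example, not something PromptLayer fills in for you):

```python
# Sketch: how an OpenAI-compatible client derives its chat endpoint
# from a provider base URL. `chat_endpoint` is a hypothetical helper
# for illustration, not a PromptLayer or OpenAI SDK function.

def chat_endpoint(base_url: str) -> str:
    """Join the provider base URL with the OpenAI-style chat completions path."""
    return base_url.rstrip("/") + "/chat/completions"

# Example: an OpenAI-compatible provider base URL
print(chat_endpoint("https://api.deepseek.com"))
# https://api.deepseek.com/chat/completions
```

This is why picking the right client type matters: a provider whose API is OpenAI-compatible should use the OpenAI client so requests are shaped and routed correctly.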

Creating Custom Models

Once your provider is configured, you can define models for it:
  1. In Settings → Custom Providers and Models, click on your custom provider row to expand it
  2. Click Create Custom Model
  3. Fill in the model configuration:
    • Provider: Select the custom provider you created earlier
    • Model Name: Choose from known models or enter a custom identifier
    • Display Name: A friendly name that appears in the prompt playground
    • Model Type: Specify whether this is a Chat or Completion model
[Screenshot: Create Custom Model form]
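The form fields above map onto a small record; as a sketch (field names mirror the form, values are illustrative examples only):

```python
# Sketch: the custom model configuration fields as a plain dict.
# Values are examples; "model_type" reflects the Chat/Completion choice.
custom_model = {
    "provider": "DeepSeek",           # the custom provider created earlier
    "model_name": "deepseek-chat",    # known model or custom identifier
    "display_name": "DeepSeek Chat",  # shown in the prompt playground
    "model_type": "chat",             # "chat" or "completion"
}
print(custom_model["display_name"])
# DeepSeek Chat
```

The Model Name must match the identifier the provider's API expects, while the Display Name is purely cosmetic, so you can keep it human-friendly.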

Using Custom Models

After setup, your custom models seamlessly integrate with PromptLayer’s features. You can:
  • Select them in the Playground alongside standard models
  • Use them in the Prompt Editor for template creation
  • Track requests and analyze performance just like any other model
[Screenshot: using a custom model in the Playground]

Custom providers give you complete control over your model infrastructure while maintaining all the benefits of PromptLayer’s prompt management and observability features.

Examples

OpenRouter

OpenRouter provides access to a wide variety of cutting-edge models through a unified API, including models like DeepSeek, Claude, GPT-4, and many others that may not be available through standard providers.

Configuration Instructions:
  1. Get an OpenRouter API Key: Sign up at OpenRouter and obtain your API key from their dashboard.
  2. Add OpenRouter as a Custom Provider in PromptLayer:
    • Navigate to Settings → Custom Providers and Models
    • Click Add Custom Provider
    • Configure with these settings:
      • Name: OpenRouter
      • Client: OpenAI (OpenRouter uses OpenAI-compatible endpoints)
      • Base URL: https://openrouter.ai/api/v1
      • API Key: Your OpenRouter API key
  3. Save Custom OpenRouter Models (Recommended):
    • In Settings → Custom Providers and Models, find your OpenRouter provider in the list
    • Click on the OpenRouter row to expand it
    • Click Create Custom Model in the expanded section
    • Configure each model:
      • Model Name: Enter the OpenRouter model identifier (e.g., deepseek/deepseek-chat, anthropic/claude-3.5-sonnet)
      • Display Name: A friendly name like “DeepSeek Chat” or “Claude 3.5 Sonnet”
      • Model Type: Choose “Chat” for most models
    • Repeat for each model you want to use
    • The full list of available models can be found in OpenRouter’s documentation
  4. Use Your Models:
    • Your custom models will now appear in the model dropdown in the Playground
    • Select your models directly from the dropdown; no manual typing needed
    • All requests will be routed through OpenRouter
OpenRouter automatically handles rate limiting and failover between providers, making it an excellent choice for accessing multiple models through a single integration.
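Because OpenRouter exposes OpenAI-compatible endpoints, the base URL and model identifier saved above can be exercised directly. A minimal stdlib sketch that builds (but does not send) such a request; it assumes your key is in an `OPENROUTER_API_KEY` environment variable, with a placeholder used otherwise:

```python
import json
import os
import urllib.request

# Sketch: an OpenAI-compatible chat completions request against OpenRouter.
# The model identifier matches what you saved as a custom model above.
BASE_URL = "https://openrouter.ai/api/v1"
payload = {
    "model": "deepseek/deepseek-chat",
    "messages": [{"role": "user", "content": "Hello from a custom provider!"}],
}
req = urllib.request.Request(
    BASE_URL + "/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        # Placeholder key used only when the env var is unset
        "Authorization": "Bearer " + os.environ.get("OPENROUTER_API_KEY", "sk-or-placeholder"),
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.full_url)
# https://openrouter.ai/api/v1/chat/completions
# urllib.request.urlopen(req)  # uncomment to actually send the request
```

In practice you would rarely call the endpoint by hand like this; the point is that the same base URL, key, and model identifier you entered in the PromptLayer settings are all that an OpenAI-compatible request needs.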