xAI provides the Grok family of large language models that can be integrated with PromptLayer through custom providers. Grok models offer advanced reasoning capabilities and real-time knowledge through X (Twitter) integration.

Setting Up xAI as a Custom Provider

To use Grok models in PromptLayer:
  1. Navigate to Settings → Custom Providers and Models in your PromptLayer dashboard
  2. Click Create Custom Provider
  3. Configure the provider with the following details:
    • Name: xAI (or your preferred name)
    • Client: OpenAI
    • Base URL: https://api.x.ai/v1
    • API Key: Your xAI API key (get one at x.ai)
xAI uses OpenAI-compatible endpoints, which is why we select OpenAI as the client type.
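Because the endpoint is OpenAI-compatible, any client that can target `https://api.x.ai/v1` will work. The sketch below is illustrative only — it builds a raw chat-completion request against that base URL using the standard OpenAI-style payload shape (the key placeholder and model name are assumptions; substitute your own):

```python
import json

XAI_BASE_URL = "https://api.x.ai/v1"  # the Base URL configured in PromptLayer

def build_grok_request(api_key: str, model: str, messages: list) -> dict:
    """Assemble an OpenAI-compatible chat completion request for xAI's API.

    Returns the URL, headers, and JSON body you would POST (e.g. with
    requests or urllib); no network call is made here.
    """
    return {
        "url": f"{XAI_BASE_URL}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",  # your xAI API key
            "Content-Type": "application/json",
        },
        "body": json.dumps({"model": model, "messages": messages}),
    }

# Example: the same request PromptLayer issues on your behalf
req = build_grok_request(
    api_key="xai-****",          # placeholder key
    model="grok-3",              # any Grok model identifier
    messages=[{"role": "user", "content": "Hello, Grok!"}],
)
print(req["url"])
```

This is the same request shape PromptLayer sends when you select the OpenAI client type, which is why no custom adapter is needed.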
For easier model selection in the Playground and Prompt Registry, you can create custom models:
  1. In Settings → Custom Providers and Models, find your xAI provider in the list
  2. Click on the xAI row to expand it
  3. Click Create Custom Model
  4. Configure each model:
    • Provider: Select the xAI provider you created
    • Model Name: Enter the Grok model identifier (e.g., grok-4-fast-reasoning, grok-3)
    • Display Name: A friendly name like “Grok 4 Fast Reasoning” or “Grok 3”
    • Model Type: Chat
  5. Repeat for each model you want to use
This allows you to select Grok models directly from the dropdown instead of typing them manually.

Available Models

xAI regularly updates their model offerings. Example models include:
  • grok-4-fast-reasoning: Latest Grok 4 with fast reasoning (2M context)
  • grok-4-fast-non-reasoning: Grok 4 optimized for speed (2M context)
  • grok-code-fast-1: Specialized for code generation tasks
  • grok-3: Grok 3 with advanced reasoning capabilities
  • grok-2-vision-1212: Grok 2 with vision capabilities
For the complete and up-to-date list of available models and their capabilities, visit xAI’s official model documentation.

Using Grok in PromptLayer

In the Playground

After setup, you can use Grok models in the PromptLayer Playground:
  1. Open the Playground
  2. Select your xAI provider from the provider dropdown
  3. Choose your desired Grok model
  4. Start querying with your prompts

In the Prompt Registry

Grok models work seamlessly with PromptLayer’s Prompt Registry:
  • Select Grok models when creating or editing prompt templates
  • Use templates with Grok models in evaluations
  • Track and analyze xAI API usage alongside other providers

Parameter Compatibility

Some OpenAI parameters may not be compatible with Grok models. If you encounter errors:
  • Remove unsupported parameters like seed
  • Check xAI’s documentation for supported parameters
  • Use the Playground to test parameter compatibility before deploying
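If you are calling the API directly rather than through a template, one defensive pattern is to filter out known-incompatible parameters before sending the request. The snippet below is a minimal sketch of that idea — the set of unsupported parameters is an assumption seeded only with `seed` from the note above; extend it as you discover others in xAI's documentation:

```python
# Assumed deny-list: `seed` is mentioned above; add others as you find them.
UNSUPPORTED_PARAMS = {"seed"}

def strip_unsupported(params: dict) -> dict:
    """Return a copy of the request parameters without keys Grok rejects."""
    return {k: v for k, v in params.items() if k not in UNSUPPORTED_PARAMS}

request_params = {
    "model": "grok-3",
    "temperature": 0.7,
    "seed": 42,  # supported by OpenAI models, but may error with Grok
}
safe_params = strip_unsupported(request_params)
print(safe_params)  # the `seed` key is dropped; everything else passes through
```

When you run requests through PromptLayer templates instead, the template configuration governs which parameters are sent, so this kind of manual filtering is only needed for direct API calls.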

SDK Usage

Once you’ve set up your xAI custom provider and created a prompt template in the dashboard, you can run it programmatically with the PromptLayer SDK:
from promptlayer import PromptLayer

promptlayer = PromptLayer(api_key="pl_****")

# Run a prompt template that uses your xAI custom provider
# (Your template should be configured to use a Grok model like grok-4-fast-reasoning)
response = promptlayer.run(
    prompt_name="your-grok-prompt",
    input_variables={"topic": "quantum computing"}
)

# Access the response
print(response["raw_response"].choices[0].message.content)

# The request is automatically logged with request_id
print(f"Request ID: {response['request_id']}")
Using promptlayer.run() ensures your requests are properly logged to PromptLayer and leverages your prompt templates from the Prompt Registry. This is the recommended approach for production use, and it handles any parameter compatibility differences automatically based on your template configuration.