Concentrate AI provides access to a wide variety of models from major authors through a single unified API, including closed-source models like GPT-5.5, Claude Opus 4.7, and Gemini 3.1 Pro, alongside open-source options.

Documentation Index
Fetch the complete documentation index at: https://docs.promptlayer.com/llms.txt
Use this file to discover all available pages before exploring further.
Setting Up Concentrate as a Custom Provider
To use Concentrate models in PromptLayer:
- Get a Concentrate API Key: Sign up at Concentrate AI and obtain your API key from their dashboard
- Navigate to Settings → Custom Providers and Models in your PromptLayer dashboard
- Click Create Custom Provider
- Configure the provider with the following details:
- Name: Concentrate
- Client: OpenAI (Concentrate uses OpenAI-compatible endpoints)
- Base URL: https://api.concentrate.ai/v1
- API Key: Your Concentrate API key
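To see what this configuration amounts to on the wire, here is a minimal sketch of assembling an OpenAI-style `/v1/responses` request against the Concentrate base URL. It assumes a standard bearer-token header; the API key, model slug, and prompt are placeholders:

```python
import json

BASE_URL = "https://api.concentrate.ai/v1"  # the Base URL entered above

def build_responses_request(api_key: str, model: str, prompt: str):
    """Assemble the URL, headers, and JSON body for an OpenAI-compatible
    /v1/responses call to the Concentrate endpoint."""
    url = f"{BASE_URL}/responses"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "input": prompt})
    return url, headers, body

url, headers, body = build_responses_request("ck_test", "gpt-5.5", "Hello")
print(url)  # https://api.concentrate.ai/v1/responses
```

Any HTTP client (or the OpenAI SDK pointed at this base URL) can send the resulting request.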
Concentrate exposes an OpenAI-compatible /v1/responses endpoint, which is why we select OpenAI as the client type.

Creating Custom Models (Recommended)
For easier model selection in the Playground and Prompt Registry, you can save specific Concentrate models:
- Navigate to the Custom Providers and Models page
- Find the Concentrate row and click the three-dot menu on that row
- Click Add model
- Enter the model details:
- Model Name: Paste the model slug copied from Concentrate’s models page (e.g., gpt-5.5, claude-opus-4-7, or anthropic/claude-opus-4-7)
- Display Name: A friendly name like “GPT 5.5” or “Claude Opus 4.7”
- Optionally, customize parameters on the next page
- Repeat for each model you want to use
Available Models
Concentrate provides access to a vast catalog of models. You can use canonical names for automatic routing, or provider-prefixed names to pin a specific provider. Example models include:
- gpt-5.5: OpenAI’s GPT-5.5 model
- claude-opus-4-7: Anthropic’s Claude Opus 4.7 model
- gemini-3-1-pro-preview: Google’s Gemini 3.1 Pro Preview model
- anthropic/claude-opus-4-7: Claude Opus 4.7 pinned specifically to the Anthropic provider
- auto: Let Concentrate automatically route to the best model based on cost, performance, or latency
Using Concentrate in PromptLayer
In the Playground
After setup, you can use Concentrate models in the PromptLayer Playground:
- Open the Playground
- At the bottom of the screen (next to the tools and output controls), open the provider menu and select Concentrate as the LLM provider
- Pick any model you’ve added to the Concentrate provider
- Select the Responses API as the request format
- Start querying with your prompts
In the Prompt Registry
Concentrate models work seamlessly with PromptLayer’s Prompt Registry:
- Select Concentrate models when creating or editing prompt templates
- Use templates with Concentrate models in evaluations
- Track and analyze Concentrate API usage alongside other providers
Key Benefits
Concentrate provides:
- Unified API: One OpenAI-compatible endpoint across every major provider
- Automatic failover: Requests retry across backup providers when one is unavailable
- Spend management: Budget limits per API key, project, or org, with anomaly alerts
- PII redaction and zero data retention: Configurable per API key for sensitive workloads
- Unified audit logs: Consistent usage logs and analytics across every provider in one dashboard
SDK Usage
Once you’ve set up your Concentrate custom provider and created a prompt template in the dashboard, you can run it programmatically with the PromptLayer SDK.

Using promptlayer.run() ensures your requests are properly logged to PromptLayer and leverages your prompt templates from the Prompt Registry. This is the recommended approach for production use.
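A minimal sketch of such a call with the `promptlayer` Python SDK's `PromptLayer` client; the template name and input variables here are hypothetical examples:

```python
input_variables = {"topic": "unified model routing"}  # hypothetical template variables

def run_template(api_key: str, template_name: str):
    from promptlayer import PromptLayer  # imported lazily; requires `pip install promptlayer`
    pl = PromptLayer(api_key=api_key)
    # run() fetches the template from the Prompt Registry, executes it against
    # the configured provider, and logs the request to PromptLayer.
    return pl.run(prompt_name=template_name, input_variables=input_variables)

# Example (requires a real PromptLayer API key and network access):
# result = run_template("pl_...your key...", "concentrate-demo")
# print(result)
```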
