# Testing Instructions for Documentation Examples
This document provides instructions for testing code examples in the PromptLayer documentation.

## Test Workspace

We maintain a dedicated test workspace for documentation examples with pre-configured prompts and settings.

### Workspace Details
- Purpose: Testing documentation code examples
- API Key: Available in secure storage (not committed to repo)
- Workspace ID: Contact team for access
## Testing Multi-Turn Chat Examples

### Required Prompts
To test the multi-turn chat examples from the documentation, you only need TWO prompts set up in your workspace:

1. `multi-turn-assistant`
   - Type: Chat prompt template
   - Required placeholders:
     - `{{chat_history}}` - Message placeholder for conversation history
     - `{{user_question}}` - Text variable for current user input
     - `{{ai_in_progress}}` - Array variable for tool call tracking (can be an empty list if not using tools)
   - Example system message: "You are a helpful assistant. Use the conversation history to maintain context."
   - Used for: Basic multi-turn conversations, context retention tests, and examples without tools
2. `multi-turn-assistant-with-tools` (optional - only needed if testing tool functionality)
   - Type: Chat prompt template
   - Required placeholders (in this order):
     - `{{chat_history}}` - Message placeholder for conversation history
     - `{{user_question}}` - Text variable for current user input
     - `{{ai_in_progress}}` - Message placeholder for tool interaction messages (MUST come AFTER `user_question`)
     - `{{user_context}}` - Object variable for additional context (optional)
   - Important: The order matters! `ai_in_progress` must come after `user_question` because it contains the AI's tool interactions in response to the user's question
   - Tool definitions (if testing tools):
     - `search_kb` - Search knowledge base
     - `create_ticket` - Create support ticket
     - `escalate` - Escalate to human agent
     - `end_conversation` - End the conversation
   - Used for: Examples demonstrating tool usage in multi-turn conversations
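As a concrete illustration of the placeholder contract above, the per-turn input variables can be assembled like this. This is a sketch, not code from the test scripts: `build_input_variables` is a hypothetical helper, and the `client.run` call shown in the comment assumes the PromptLayer Python SDK.

```python
def build_input_variables(chat_history, user_question, ai_in_progress=None):
    """Assemble input variables matching the prompt placeholders.

    chat_history: list of {"role": ..., "content": ...} message dicts
    user_question: the current user input (text variable)
    ai_in_progress: tool-interaction messages; empty list when no tools ran
    """
    return {
        "chat_history": chat_history,
        "user_question": user_question,
        # ai_in_progress holds the AI's tool interactions for THIS question,
        # which is why the placeholder must come after user_question.
        "ai_in_progress": ai_in_progress or [],
    }


history = []
variables = build_input_variables(history, "What are your support hours?")
# With the real SDK this would look something like (illustrative only):
#   response = client.run(prompt_name="multi-turn-assistant",
#                         input_variables=variables)
# After each turn, append both messages so context is retained:
history.append({"role": "user", "content": "What are your support hours?"})
history.append({"role": "assistant", "content": "We're available 9am-5pm."})
```

Rebuilding the dict each turn from the growing `history` list is what keeps `{{chat_history}}` in sync with the conversation.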
## Running Tests
1. Install the dependencies for the test scripts.
2. Set your API key:
   - Copy `.env.example` to `.env`
   - Add your PromptLayer API key to the `.env` file, e.g. `PROMPTLAYER_API_KEY=your_api_key_here`
3. Run the test scripts from the `testing/` folder.
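Step 2 can be sketched as follows. In practice you would typically use the `python-dotenv` package; this hand-rolled parser just shows what the test scripts need to have available, and the `.env` contents shown are illustrative.

```python
import os


def load_env(path=".env"):
    """Parse simple KEY=value lines from a .env file into os.environ."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blanks and comment lines
            key, _, value = line.partition("=")
            # setdefault: a key already exported in the shell wins
            os.environ.setdefault(key.strip(), value.strip())


# Usage, assuming .env contains a line like PROMPTLAYER_API_KEY=...:
# load_env()
# api_key = os.environ["PROMPTLAYER_API_KEY"]
```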
### Test Script Locations
All test scripts are located in the `testing/` folder:

- `test_multi_turn_chat.py` - Main test suite with all examples from the documentation
- `test_multi_turn_chat_simple.py` - Quick sanity checks without loops
- `test_multi_turn_loop.py` - Tests multiple conversation turns (3+) with context retention
- `test_multi_turn_with_tools.py` - Demonstrates proper tool handling with `ai_in_progress`
- `test_tool_debug.py` - Debug script to understand tool conversation flow
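The shape of the loop test can be sketched as below. The assistant here is a stand-in stub so the sketch runs offline; the real `test_multi_turn_loop.py` would call the `multi-turn-assistant` prompt via PromptLayer instead.

```python
def stub_assistant(chat_history, user_question):
    """Stand-in for the model: reports how many user turns it has seen."""
    prior_user_turns = sum(1 for m in chat_history if m["role"] == "user")
    return f"Turn {prior_user_turns + 1}: you asked '{user_question}'"


def run_conversation(questions):
    history = []
    for question in questions:
        answer = stub_assistant(history, question)
        # Append both sides of the exchange so later turns keep context.
        history.append({"role": "user", "content": question})
        history.append({"role": "assistant", "content": answer})
    return history


history = run_conversation(["Hi", "What's my name?", "Thanks"])
assert len(history) == 6  # 3 turns x (user + assistant)
assert "Turn 3" in history[-1]["content"]  # context accumulated across turns
```

The final assertions are the essence of a context-retention check: the last reply can only mention "Turn 3" if the first two exchanges actually made it into `chat_history`.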
## Adding New Test Prompts

When adding new documentation examples that require testing:

- Create the prompt template in the test workspace
- Document the required placeholders and configuration here
- Add corresponding test cases to the test script
- Verify all examples work before publishing documentation
## Troubleshooting

### Common Issues
- Missing placeholders: Ensure all required placeholders are defined in your prompt template
- API key errors: Verify your API key has access to the test workspace
- Tool call errors: Check that tool definitions match the expected format in the documentation
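A quick way to catch the "missing placeholders" issue before calling the API is to extract the `{{...}}` placeholders from the template text and compare them against the required set. The template string here is illustrative, not pulled from the workspace.

```python
import re

REQUIRED = {"chat_history", "user_question", "ai_in_progress"}


def missing_placeholders(template_text, required=REQUIRED):
    """Return the required placeholder names absent from template_text."""
    found = set(re.findall(r"\{\{\s*(\w+)\s*\}\}", template_text))
    return required - found


template = "History: {{chat_history}}\nUser: {{user_question}}"
print(missing_placeholders(template))  # {'ai_in_progress'}
```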

