The Prompt Registry supports all major tool calling formats, including OpenAI tools, OpenAI functions, Anthropic tools, and Gemini tools.

You can create tool schemas interactively, and your prompt template will work seamlessly with any LLM. Tool calling in PromptLayer is model-agnostic.

Learn more about when you should use tools on our blog.

What is Tool Calling?

Tool calling (previously known as function calling) is a powerful feature that allows large language models (LLMs) to return structured data and invoke predefined functions with JSON arguments. This capability enables more complex interactions and structured outputs from LLMs.
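To make this concrete, here is a sketch of what a tool call looks like in the OpenAI-style response format (the `get_weather` tool and the call ID are hypothetical). The model does not execute anything itself; it returns the function name and JSON-encoded arguments for your code to run:

```python
import json

# Hypothetical tool call as it might appear in a model response.
tool_call = {
    "id": "call_abc123",  # example ID, assigned by the provider
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool name
        "arguments": '{"city": "Paris", "unit": "celsius"}',
    },
}

# The arguments arrive as a JSON string; parse them before use.
args = json.loads(tool_call["function"]["arguments"])
print(args["city"])  # -> Paris
```

Your application then runs the matching function with these arguments and (optionally) sends the result back to the model in a follow-up message.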

Key benefits of tool calling include:

  • Structured Outputs: Tool arguments are always in JSON format, enforced by JSONSchema at the API level.
  • Efficient Communication: Because tool calling is built into the model itself, it reduces token usage and improves the model's understanding of the schema.
  • Model Routing: Facilitates setting up modular prompts with specific responsibilities.
  • Prompt Injection Protection: Strict schema definitions at the model level make it harder to “jailbreak” the model.
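To illustrate the structured-output point above, here is a sketch of a tool definition in the OpenAI `tools` format (the `get_weather` tool is hypothetical). The `parameters` field is standard JSON Schema, which the provider enforces on the arguments the model produces:

```python
# Hypothetical tool schema; "parameters" is plain JSON Schema.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["city"],
        },
    },
}
```

Because `city` is marked `required` and `unit` is constrained to an enum, the model's arguments always arrive in a predictable shape.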

Creating Visually

Tools can be defined, edited, and tested visually through the Prompt Registry.

Publishing Programmatically

To publish a prompt template with tools programmatically, you can add the arguments tools and tool_choice to your prompt_template object. This is similar to how you would publish a regular prompt template.
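As a rough sketch, the `prompt_template` payload might look like the following. The template name, system message, and `get_weather` tool are hypothetical, and the exact payload shape may differ from the official SDK, so treat this as illustrative:

```python
# Hypothetical prompt_template payload with `tools` and `tool_choice`
# added alongside the usual chat messages (OpenAI tool format shown).
prompt_template = {
    "type": "chat",
    "messages": [
        {
            "role": "system",
            "content": [{"type": "text", "text": "You are a weather assistant."}],
        }
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Look up the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    "tool_choice": "auto",  # let the model decide when to call the tool
}

# Publishing would then look roughly like this (requires an API key;
# consult the SDK docs for the exact method and payload):
# from promptlayer import PromptLayer
# pl = PromptLayer(api_key="pl_...")
# pl.templates.publish({
#     "prompt_name": "weather-bot",  # hypothetical template name
#     "prompt_template": prompt_template,
# })
```

Because the tool schema lives in the template itself, every version of the prompt carries its tools along with it.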