LangChain
PromptLayer works seamlessly with LangChain. LangChain is a popular Python library for building LLM applications, providing helpful features like chains, agents, and memory.
Using PromptLayer with LangChain is simple. See the LangChain docs below:
There are two main ways to use LangChain with PromptLayer. The recommended way (as of LangChain version 0.0.221) is to use the PromptLayerCallbackHandler. Alternatively, you can use a PromptLayer-specific LLM or Chat model.
Using Callbacks
This is the recommended way to use LangChain with PromptLayer. It is simpler and more extensible than the alternative described below.
Every LLM supported by LangChain works with PromptLayer's callback.
Start by importing PromptLayerCallbackHandler. This callback will log your request after each LLM response.
Now, when instantiating a model, just include the PromptLayerCallbackHandler in the callbacks.
🎉 That's it! 🎉
Full Examples
Below are some full examples using PromptLayer with various LLMs through LangChain.
OpenAI
GPT4All
HuggingFace Hub
You can use the HuggingFace wrapper to try out many different LLMs.
Async Requests
PromptLayer Request ID
The PromptLayer request ID is used to tag requests with metadata, scores, associated prompt templates, and more.
PromptLayerCallbackHandler can optionally take its own callback function, which receives the request ID as an argument.
PromptLayer OpenAI Models
Alternatively, the older (but still supported) way to use LangChain with PromptLayer is through specific PromptLayer LLMs and Chat Models.
Please note: Do not use these models in addition to the callback. Use one or the other.
See below for examples: