PromptLayer works seamlessly with LangChain. LangChain is a popular Python library that assists in the development of LLM applications, providing helpful features like chains, agents, and memory. Using PromptLayer with LangChain is simple. See the LangChain docs below:
There are two main ways to use LangChain with PromptLayer. The recommended way (as of LangChain version 0.0.221) is to use the PromptLayerCallbackHandler. Alternatively, you can use a PromptLayer-specific LLM or Chat model.
Right now, callbacks only work with LangChain in Python.
This is the recommended way to use LangChain with PromptLayer. It is simpler and more extensible than the alternative method described below. Every LLM supported by LangChain works with PromptLayer's callback. Start by importing PromptLayerCallbackHandler. This callback will log your request after each LLM response.
```python
import promptlayer  # Don't forget this 🍰
from langchain.callbacks import PromptLayerCallbackHandler
```
Now, when instantiating a model, just include the PromptLayerCallbackHandler in the callbacks.
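For example, here is a minimal sketch using LangChain's `ChatOpenAI`; the model, message, and `pl_tags` values are illustrative, and any LangChain LLM or chat model can be wired up the same way:

```python
import promptlayer  # Don't forget this 🍰
from langchain.callbacks import PromptLayerCallbackHandler
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

# Pass the callback at instantiation so every request/response
# pair from this model is logged to PromptLayer.
chat = ChatOpenAI(
    temperature=0,
    callbacks=[PromptLayerCallbackHandler(pl_tags=["langchain"])],
)

chat([HumanMessage(content="What comes after 1, 2, 3?")])
```

After running this, the request should appear in your PromptLayer dashboard tagged with the `pl_tags` you provided.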
Alternatively, the older (but still supported) way to use LangChain with PromptLayer is through PromptLayer-specific LLMs and Chat Models. Please note: do not use these models in addition to the callback. Use one or the other. See below for examples:
```python
from langchain.chat_models import PromptLayerChatOpenAI
from langchain.schema import SystemMessage, HumanMessage

chat = PromptLayerChatOpenAI(pl_tags=["langchain"])
chat([
    SystemMessage(content="You are a helpful assistant that translates English to French."),
    HumanMessage(content="Translate this sentence from English to French. I love programming."),
])
```
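For a completion-style model, a similar sketch uses LangChain's `PromptLayerOpenAI` LLM (the prompt text here is just an example):

```python
from langchain.llms import PromptLayerOpenAI

# PromptLayer-specific LLM: requests are logged automatically,
# and pl_tags are attached to each request in the dashboard.
llm = PromptLayerOpenAI(pl_tags=["langchain"])
llm("Tell me a joke about pancakes.")
```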