How to print intermediate values in a LangChain chain

I don’t know why this isn’t documented anywhere.

Let’s say you have two prompts.

from langchain.prompts import PromptTemplate

synopsis_prompt = PromptTemplate.from_template(
    """You are a playwright. Given the title of play, it is your job to write a
       synopsis for that title.

       Title: {text}
       Playwright: This is a synopsis for the above play:""")

review_prompt = PromptTemplate.from_template(
    """You are a play critic from the New York Times. Given the synopsis of play, it is
       your job to write a review for that play.

       Play Synopsis:
       {text}
       Review from a New York Times play critic of the above play:""")

Then define a pass-through print function, wrap it in a RunnableLambda, and insert it between steps of the chain.

from langchain.chat_models import ChatOpenAI
from langchain.schema import StrOutputParser
from langchain.schema.runnable import RunnableLambda


def myprint(s):
    # s is the formatted prompt value; print its text, then return it
    # unchanged so the rest of the chain behaves as if this step weren't there.
    print(s.text)
    return s


printer = RunnableLambda(myprint)
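The trick here is nothing LangChain-specific: a step that prints its input and returns it unchanged can be spliced into any pipeline without affecting the result. A minimal sketch of the same idea in plain Python (the `tap` helper is illustrative, not part of LangChain):

```python
def tap(label):
    # Return a function that prints the value it receives
    # and passes it along unchanged.
    def _tap(value):
        print(f"{label}: {value!r}")
        return value
    return _tap

# Compose steps manually: the tap observes without altering the data.
steps = [str.upper, tap("after upper"), str.strip]
value = "  hello  "
for step in steps:
    value = step(value)
# value is now "HELLO": upper-cased, printed, then stripped
```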

llm = ChatOpenAI()
chain = (
    {"text": synopsis_prompt | printer | llm | StrOutputParser()}
    | review_prompt
    | printer
    | llm
    | StrOutputParser()
)
chain.invoke({"text": "Tragedy at sunset on the beach"})
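Because `printer` sits right after each prompt, this invocation prints the fully formatted synopsis prompt and then the fully formatted review prompt before each LLM call. LCEL's `|` operator is what lets a plain function slot in like this; a rough plain-Python analogy of that composition (the `Pipe` class is a sketch for intuition, not LangChain's implementation):

```python
class Pipe:
    """Wrap a function so pipelines can be built with the | operator."""
    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        # Chain this step into the next; accept bare functions too.
        other_fn = other.fn if isinstance(other, Pipe) else other
        return Pipe(lambda x: other_fn(self.fn(x)))

    def invoke(self, x):
        return self.fn(x)

def show(x):
    # Pass-through printer, like the RunnableLambda above.
    print("intermediate:", x)
    return x

chain = Pipe(lambda s: s + " world") | show | str.upper
print(chain.invoke("hello"))  # prints the intermediate value, then "HELLO WORLD"
```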