
fix: pass variables to langchain messages on prompt rendering #172

Open · wants to merge 1 commit into base: main

Conversation

asvishnyakov (Member) commented on May 7, 2025

This PR fixes an issue where any additional messages passed to a Literal AI prompt and converted into a LangChain prompt template via to_langchain_chat_prompt_template do not receive their variables:

import chainlit
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnableConfig
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_openai import ChatOpenAI
from literalai import AsyncLiteralClient

client = AsyncLiteralClient()
prompt = client.get_prompt("prompt name")
# Append two extra messages to the Literal AI prompt: a history placeholder
# and a human message using Literal AI's mustache ({{...}}) syntax.
prompt_template = prompt.to_langchain_chat_prompt_template(
    [("placeholder", "{history}"), ("human", "{{query}}")]
)
llm = ChatOpenAI(**prompt.settings)
chain = prompt_template | llm | StrOutputParser()
runnable = RunnableWithMessageHistory(
    chain,
    get_session_history=get_session_history,  # assumed to be defined elsewhere
    input_messages_key="query",
    history_messages_key="history",
)

@chainlit.on_message
async def on_message(message: chainlit.Message):
    answer = chainlit.Message(content="")

    async for token in runnable.astream(
        {"query": message.content},  # was bare `query`, which is undefined here
        config=RunnableConfig(callbacks=[chainlit.LangchainCallbackHandler()]),
    ):
        await answer.stream_token(token)

    await answer.send()

With the code above, the {history} placeholder is never rendered.
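For context, here is a minimal sketch (plain LangChain, no Literal AI) of the behavior the fix is meant to restore; the system message is just a stand-in for the prompt's own messages:

from langchain_core.prompts import ChatPromptTemplate

template = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),  # stands in for the Literal AI prompt's own messages
    ("placeholder", "{history}"),                # additional placeholder appended by the caller
    ("human", "{query}"),
])

messages = template.invoke({
    "query": "What did I just ask?",
    "history": [("human", "Hello"), ("ai", "Hi there!")],
}).to_messages()
# `messages` now includes the two history turns; with the bug, the
# placeholder never received the "history" variable.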

P.S. This code looks very suspicious to me. Rather than catching the exception, I would render the first N messages (the ones that come from the prompt) with the custom formatter and use the default formatter for the rest (the additional messages); a rough sketch follows. This PR, however, fixes the issue with minimal changes to the code.
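For what it's worth, that alternative might look roughly like this (mustache_to_fstring and build_chat_prompt are hypothetical names for illustration, not actual literalai internals):

import re
from langchain_core.prompts import ChatPromptTemplate

def mustache_to_fstring(content: str) -> str:
    # Hypothetical helper: rewrite Literal AI's {{var}} placeholders into
    # LangChain's {var} syntax so the default formatter can render them.
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", r"{\1}", content)

def build_chat_prompt(prompt_messages, additional_messages=()):
    # The first N messages come from the Literal AI prompt, so translate
    # their mustache placeholders; the caller-supplied additional messages
    # (including ("placeholder", "{history}")) pass through untouched and
    # keep LangChain's default variable handling.
    own = [(role, mustache_to_fstring(content)) for role, content in prompt_messages]
    return ChatPromptTemplate.from_messages(own + list(additional_messages))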

asvishnyakov (Member, Author) commented:

Fixes #171

asvishnyakov (Member, Author) commented:

@willydouhard Could you take a look?

asvishnyakov (Member, Author) commented:

@willydouhard Do you accept PRs to this project?
