langchain/docs/docs/how_to/assign.ipynb

{
"cells": [
{
"cell_type": "raw",
"metadata": {},
"source": [
"---\n",
"sidebar_position: 6\n",
"keywords: [RunnablePassthrough, assign, LCEL]\n",
"---"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# How to add values to a chain's state\n",
"\n",
"An alternate way of [passing data through](/docs/how_to/passthrough) steps of a chain is to leave the current values of the chain state unchanged while assigning a new value under a given key. The [`RunnablePassthrough.assign()`](https://api.python.langchain.com/en/latest/runnables/langchain_core.runnables.passthrough.RunnablePassthrough.html#langchain_core.runnables.passthrough.RunnablePassthrough.assign) static method takes an input value and adds the extra arguments passed to the assign function.\n",
"\n",
"This is useful in the common [LangChain Expression Language](/docs/concepts/#langchain-expression-language) pattern of additively creating a dictionary to use as input to a later step.\n",
"\n",
"```{=mdx}\n",
"import PrerequisiteLinks from \"@theme/PrerequisiteLinks\";\n",
"\n",
"<PrerequisiteLinks content={`\n",
"- [LangChain Expression Language (LCEL)](/docs/concepts/#langchain-expression-language)\n",
"- [Chaining runnables](/docs/how_to/sequence/)\n",
"- [Calling runnables in parallel](/docs/how_to/parallel/)\n",
"- [Custom functions](/docs/how_to/functions/)\n",
"- [Passing data through](/docs/how_to/passthrough)\n",
"`} />\n",
"```\n",
"\n",
"Here's an example:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%pip install --upgrade --quiet langchain langchain-openai\n",
"\n",
"import os\n",
"from getpass import getpass\n",
"\n",
"os.environ[\"OPENAI_API_KEY\"] = getpass()"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'extra': {'num': 1, 'mult': 3}, 'modified': 2}"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from langchain_core.runnables import RunnableParallel, RunnablePassthrough\n",
"\n",
"runnable = RunnableParallel(\n",
" extra=RunnablePassthrough.assign(mult=lambda x: x[\"num\"] * 3),\n",
" modified=lambda x: x[\"num\"] + 1,\n",
")\n",
"\n",
"runnable.invoke({\"num\": 1})"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Let's break down what's happening here.\n",
"\n",
"- The input to the chain is `{\"num\": 1}`. This is passed into a `RunnableParallel`, which invokes the runnables it is passed in parallel with that input.\n",
"- The value under the `extra` key is invoked. `RunnablePassthrough.assign()` keeps the original keys in the input dict (`{\"num\": 1}`), and assigns a new key called `mult`. The value is `lambda x: x[\"num\"] * 3)`, which is `3`. Thus, the result is `{\"num\": 1, \"mult\": 3}`.\n",
"- `{\"num\": 1, \"mult\": 3}` is returned to the `RunnableParallel` call, and is set as the value to the key `extra`.\n",
"- At the same time, the `modified` key is called. The result is `2`, since the lambda extracts a key called `\"num\"` from its input and adds one.\n",
"\n",
"Thus, the result is `{'extra': {'num': 1, 'mult': 3}, 'modified': 2}`.\n",
"\n",
"## Streaming\n",
"\n",
"One convenient feature of this method is that it allows values to pass through as soon as they are available. To show this off, we'll use `RunnablePassthrough.assign()` to immediately return source docs in a retrieval chain:"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"{'question': 'where did harrison work?'}\n",
"{'context': [Document(page_content='harrison worked at kensho')]}\n",
"{'output': ''}\n",
"{'output': 'H'}\n",
"{'output': 'arrison'}\n",
"{'output': ' worked'}\n",
"{'output': ' at'}\n",
"{'output': ' Kens'}\n",
"{'output': 'ho'}\n",
"{'output': '.'}\n",
"{'output': ''}\n"
]
}
],
"source": [
"from langchain_community.vectorstores import FAISS\n",
"from langchain_core.output_parsers import StrOutputParser\n",
"from langchain_core.prompts import ChatPromptTemplate\n",
"from langchain_core.runnables import RunnablePassthrough\n",
"from langchain_openai import ChatOpenAI, OpenAIEmbeddings\n",
"\n",
"vectorstore = FAISS.from_texts(\n",
" [\"harrison worked at kensho\"], embedding=OpenAIEmbeddings()\n",
")\n",
"retriever = vectorstore.as_retriever()\n",
"template = \"\"\"Answer the question based only on the following context:\n",
"{context}\n",
"\n",
"Question: {question}\n",
"\"\"\"\n",
"prompt = ChatPromptTemplate.from_template(template)\n",
"model = ChatOpenAI()\n",
"\n",
"generation_chain = prompt | model | StrOutputParser()\n",
"\n",
"retrieval_chain = {\n",
" \"context\": retriever,\n",
" \"question\": RunnablePassthrough(),\n",
"} | RunnablePassthrough.assign(output=generation_chain)\n",
"\n",
"stream = retrieval_chain.stream(\"where did harrison work?\")\n",
"\n",
"for chunk in stream:\n",
" print(chunk)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can see that the first chunk contains the original `\"question\"` since that is immediately available. The second chunk contains `\"context\"` since the retriever finishes second. Finally, the output from the `generation_chain` streams in chunks as soon as it is available.\n",
"\n",
"## Next steps\n",
"\n",
"Now you've learned how to pass data through your chains to help to help format the data flowing through your chains.\n",
"\n",
"To learn more, see the other how-to guides on runnables in this section."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.1"
}
},
"nbformat": 4,
"nbformat_minor": 2
}