Broader Runnable support, compositional pieces (#2129)
* Adds RunnableSequence

* Broader runnable support

* Adds guide page

* Tracing and serialization fixes

* Adds cookbook, string output parser
jacoblee93 authored Aug 1, 2023
1 parent cd4a807 commit de2d50c
Showing 43 changed files with 1,109 additions and 85 deletions.
2 changes: 2 additions & 0 deletions docs/extras/guides/_category_.yml
label: 'Guides'
position: 3
2 changes: 2 additions & 0 deletions docs/extras/guides/expression_language/_category_.yml
label: 'LangChain Expression Language'
position: 1
63 changes: 63 additions & 0 deletions docs/extras/guides/expression_language/cookbook.mdx
import CodeBlock from "@theme/CodeBlock";

# Cookbook

In this guide, we'll take a look at a few common types of sequences you can create.

## PromptTemplate + LLM

A `PromptTemplate` -> LLM sequence is a core chain used in most other, larger chains and systems.

import BasicExample from "@examples/guides/expression_language/cookbook_basic.ts";

<CodeBlock language="typescript">{BasicExample}</CodeBlock>

## PromptTemplate + LLM + OutputParser

We can also add in an output parser to easily transform the raw LLM/ChatModel output into a consistent string format:

import OutputParserExample from "@examples/guides/expression_language/cookbook_output_parser.ts";

<CodeBlock language="typescript">{OutputParserExample}</CodeBlock>

## Passthroughs

Oftentimes when constructing a chain, you'll want to pass the original input variables along to later steps in the chain. How exactly you do this depends on what the input is:

- If the original input was a string, you likely just want to pass that string along. This can be done with `RunnablePassthrough`. For an example, see the `LLMChain + Retriever` section below.
- If the original input was an object, you likely want to pass along specific keys. For this, you can use an arrow function that takes the object as input and extracts the desired key. For an example, see the `Mapping multiple input keys` section below.
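The second pattern can be seen in isolation with a framework-free sketch. This is not LangChain's actual implementation; `InputMap` and `applyMap` are hypothetical names used here only to illustrate how an object of extractor functions fans one input out into multiple keys:

```typescript
// Illustrative only: apply a map of extractor functions to a single input.
type InputMap<In> = Record<string, (input: In) => unknown>;

function applyMap<In>(map: InputMap<In>, input: In): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, fn] of Object.entries(map)) {
    out[key] = fn(input); // each extractor receives the full original input
  }
  return out;
}

type ChainInput = { question: string; language: string };

const mapped = applyMap<ChainInput>(
  {
    question: (input) => input.question,
    language: (input) => input.language,
  },
  { question: "What is the powerhouse of the cell?", language: "German" }
);

console.log(mapped);
// { question: "What is the powerhouse of the cell?", language: "German" }
```

Each key's function runs against the same original object, so later steps can receive exactly the shape of input they expect.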

## LLMChain + Retriever

Let's now look at adding in a retrieval step, which gives us a "retrieval-augmented generation" chain:

import RetrieverExample from "@examples/guides/expression_language/cookbook_retriever.ts";

<CodeBlock language="typescript">{RetrieverExample}</CodeBlock>

## Mapping multiple input keys

In the above example, we pass a string input directly into our chain. If we want our chain to take multiple inputs, we can pass a map of functions to parse the inputs:

import RetrieverMapExample from "@examples/guides/expression_language/cookbook_retriever_map.ts";

<CodeBlock language="typescript">{RetrieverMapExample}</CodeBlock>

## Conversational Retrieval Chain

Because `RunnableSequence.from` and `runnable.pipe` both accept runnable-like objects, including single-argument functions, we can add in conversation history via a formatting function.
This allows us to recreate the popular `ConversationalRetrievalQAChain` to "chat with data":

import ConversationalRetrievalExample from "@examples/guides/expression_language/cookbook_conversational_retrieval.ts";

<CodeBlock language="typescript">{ConversationalRetrievalExample}</CodeBlock>

Note that the individual chains we created are themselves `Runnables` and can therefore be piped into each other.

## Tools

You can use LangChain tools as well:

import ToolExample from "@examples/guides/expression_language/cookbook_tools.ts";

<CodeBlock language="typescript">{ToolExample}</CodeBlock>
39 changes: 39 additions & 0 deletions docs/extras/guides/expression_language/interface.mdx
import CodeBlock from "@theme/CodeBlock";

# Interface

In an effort to make it as easy as possible to create custom chains, we've implemented a ["Runnable"](/docs/api/schema_runnable/classes/Runnable) protocol that most components implement.
This standard interface exposes a few methods, making it easy to define custom chains and to invoke them in a consistent way. The standard interface includes:

- `stream`: stream back chunks of the response
- `invoke`: call the chain on an input
- `batch`: call the chain on a list of inputs

The input type varies by component: for a prompt it is an object, for a retriever a single string, and for a model either a single string, a list of chat messages, or a `PromptValue`.

The output type also varies by component: for an LLM it is a string, for a ChatModel a `ChatMessage`, for a prompt a `PromptValue`, and for a retriever a list of documents.

You can combine runnables (and runnable-like objects such as functions and objects whose values are all functions) into sequences in two ways:

- Call the `.pipe` instance method, which takes another runnable-like as an argument
- Use the `RunnableSequence.from([])` static method with an array of runnable-likes, which will run in sequence when invoked
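The core idea behind both methods can be sketched without the framework. This is not LangChain's actual implementation; `RunnableLike` and `SimpleRunnable` are made-up names illustrating how piping wraps two invocable pieces in a third:

```typescript
// Illustrative sketch of the Runnable idea: every piece exposes `invoke`,
// and `pipe` composes two pieces into a new invocable piece.
interface RunnableLike<In, Out> {
  invoke(input: In): Promise<Out>;
}

class SimpleRunnable<In, Out> implements RunnableLike<In, Out> {
  constructor(private fn: (input: In) => Out | Promise<Out>) {}

  async invoke(input: In): Promise<Out> {
    return this.fn(input);
  }

  // The output of `this` becomes the input of `next`.
  pipe<NewOut>(next: RunnableLike<Out, NewOut>): SimpleRunnable<In, NewOut> {
    return new SimpleRunnable(async (input: In) =>
      next.invoke(await this.invoke(input))
    );
  }
}

const uppercase = new SimpleRunnable((text: string) => text.toUpperCase());
const exclaim = new SimpleRunnable((text: string) => `${text}!`);

const chain = uppercase.pipe(exclaim);
chain.invoke("hello").then((result) => console.log(result)); // "HELLO!"
```

Because `pipe` returns another runnable, chains built this way can themselves be piped into further steps, which is what makes the pattern compositional.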

See below for examples of how this looks.

## Stream

import StreamExample from "@examples/guides/expression_language/interface_stream.ts";

<CodeBlock language="typescript">{StreamExample}</CodeBlock>

## Invoke

import InvokeExample from "@examples/guides/expression_language/interface_invoke.ts";

<CodeBlock language="typescript">{InvokeExample}</CodeBlock>

## Batch

import BatchExample from "@examples/guides/expression_language/interface_batch.ts";

<CodeBlock language="typescript">{BatchExample}</CodeBlock>
19 changes: 19 additions & 0 deletions examples/src/guides/expression_language/cookbook_basic.ts
import { PromptTemplate } from "langchain/prompts";
import { ChatOpenAI } from "langchain/chat_models/openai";

const model = new ChatOpenAI({});
const promptTemplate = PromptTemplate.fromTemplate(
  "Tell me a joke about {topic}"
);

const chain = promptTemplate.pipe(model);

const result = await chain.invoke({ topic: "bears" });

console.log(result);

/*
AIMessage {
  content: "Why don't bears wear shoes?\n\nBecause they have bear feet!",
}
*/
102 changes: 102 additions & 0 deletions examples/src/guides/expression_language/cookbook_conversational_retrieval.ts
import { PromptTemplate } from "langchain/prompts";
import {
  RunnableSequence,
  RunnablePassthrough,
} from "langchain/schema/runnable";
import { Document } from "langchain/document";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HNSWLib } from "langchain/vectorstores/hnswlib";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { StringOutputParser } from "langchain/schema/output_parser";

const model = new ChatOpenAI({});

const condenseQuestionTemplate = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question, in its original language.
Chat History:
{chat_history}
Follow Up Input: {question}
Standalone question:`;
const CONDENSE_QUESTION_PROMPT = PromptTemplate.fromTemplate(
  condenseQuestionTemplate
);

const answerTemplate = `Answer the question based only on the following context:
{context}
Question: {question}
`;
const ANSWER_PROMPT = PromptTemplate.fromTemplate(answerTemplate);

const combineDocumentsFn = (docs: Document[], separator = "\n\n") => {
  const serializedDocs = docs.map((doc) => doc.pageContent);
  return serializedDocs.join(separator);
};

const formatChatHistory = (chatHistory: [string, string][]) => {
  const formattedDialogueTurns = chatHistory.map(
    (dialogueTurn) => `Human: ${dialogueTurn[0]}\nAssistant: ${dialogueTurn[1]}`
  );
  return formattedDialogueTurns.join("\n");
};

const vectorStore = await HNSWLib.fromTexts(
  [
    "mitochondria is the powerhouse of the cell",
    "mitochondria is made of lipids",
  ],
  [{ id: 1 }, { id: 2 }],
  new OpenAIEmbeddings()
);
const retriever = vectorStore.asRetriever();

type ConversationalRetrievalQAChainInput = {
  question: string;
  chat_history: [string, string][];
};

const standaloneQuestionChain = RunnableSequence.from([
  {
    question: (input: ConversationalRetrievalQAChainInput) => input.question,
    chat_history: (input: ConversationalRetrievalQAChainInput) =>
      formatChatHistory(input.chat_history),
  },
  CONDENSE_QUESTION_PROMPT,
  model,
  new StringOutputParser(),
]);

const answerChain = RunnableSequence.from([
  {
    context: retriever.pipe(combineDocumentsFn),
    question: new RunnablePassthrough(),
  },
  ANSWER_PROMPT,
  model,
]);

const conversationalRetrievalQAChain =
  standaloneQuestionChain.pipe(answerChain);

const result1 = await conversationalRetrievalQAChain.invoke({
  question: "What is the powerhouse of the cell?",
  chat_history: [],
});
console.log(result1);
/*
AIMessage { content: "The powerhouse of the cell is the mitochondria." }
*/

const result2 = await conversationalRetrievalQAChain.invoke({
  question: "What are they made out of?",
  chat_history: [
    [
      "What is the powerhouse of the cell?",
      "The powerhouse of the cell is the mitochondria.",
    ],
  ],
});
console.log(result2);
/*
AIMessage { content: "Mitochondria are made out of lipids." }
*/
20 changes: 20 additions & 0 deletions examples/src/guides/expression_language/cookbook_output_parser.ts
import { PromptTemplate } from "langchain/prompts";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { RunnableSequence } from "langchain/schema/runnable";
import { StringOutputParser } from "langchain/schema/output_parser";

const model = new ChatOpenAI({});
const promptTemplate = PromptTemplate.fromTemplate(
  "Tell me a joke about {topic}"
);
const outputParser = new StringOutputParser();

const chain = RunnableSequence.from([promptTemplate, model, outputParser]);

const result = await chain.invoke({ topic: "bears" });

console.log(result);

/*
"Why don't bears wear shoes?\n\nBecause they have bear feet!"
*/
46 changes: 46 additions & 0 deletions examples/src/guides/expression_language/cookbook_retriever.ts
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HNSWLib } from "langchain/vectorstores/hnswlib";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { PromptTemplate } from "langchain/prompts";
import {
  RunnableSequence,
  RunnablePassthrough,
} from "langchain/schema/runnable";
import { StringOutputParser } from "langchain/schema/output_parser";
import { Document } from "langchain/document";

const model = new ChatOpenAI({});

const vectorStore = await HNSWLib.fromTexts(
  ["mitochondria is the powerhouse of the cell"],
  [{ id: 1 }],
  new OpenAIEmbeddings()
);
const retriever = vectorStore.asRetriever();

const prompt =
  PromptTemplate.fromTemplate(`Answer the question based only on the following context:
{context}
Question: {question}`);

const serializeDocs = (docs: Document[]) =>
  docs.map((doc) => doc.pageContent).join("\n");

const chain = RunnableSequence.from([
  {
    context: retriever.pipe(serializeDocs),
    question: new RunnablePassthrough(),
  },
  prompt,
  model,
  new StringOutputParser(),
]);

const result = await chain.invoke("What is the powerhouse of the cell?");

console.log(result);

/*
"The powerhouse of the cell is the mitochondria."
*/
55 changes: 55 additions & 0 deletions examples/src/guides/expression_language/cookbook_retriever_map.ts
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HNSWLib } from "langchain/vectorstores/hnswlib";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { PromptTemplate } from "langchain/prompts";
import { RunnableSequence } from "langchain/schema/runnable";
import { StringOutputParser } from "langchain/schema/output_parser";
import { Document } from "langchain/document";

const model = new ChatOpenAI({});

const vectorStore = await HNSWLib.fromTexts(
  ["mitochondria is the powerhouse of the cell"],
  [{ id: 1 }],
  new OpenAIEmbeddings()
);
const retriever = vectorStore.asRetriever();

const languagePrompt =
  PromptTemplate.fromTemplate(`Answer the question based only on the following context:
{context}
Question: {question}
Answer in the following language: {language}`);

type LanguageChainInput = {
  question: string;
  language: string;
};

const serializeDocs = (docs: Document[]) =>
  docs.map((doc) => doc.pageContent).join("\n");

const languageChain = RunnableSequence.from([
  {
    question: (input: LanguageChainInput) => input.question,
    language: (input: LanguageChainInput) => input.language,
    context: (input: LanguageChainInput) =>
      retriever.pipe(serializeDocs).invoke(input.question),
  },
  languagePrompt,
  model,
  new StringOutputParser(),
]);

const result2 = await languageChain.invoke({
  question: "What is the powerhouse of the cell?",
  language: "German",
});

console.log(result2);

/*
"Mitochondrien sind das Kraftwerk der Zelle."
*/
24 changes: 24 additions & 0 deletions examples/src/guides/expression_language/cookbook_tools.ts
import { SerpAPI } from "langchain/tools";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { PromptTemplate } from "langchain/prompts";
import { StringOutputParser } from "langchain/schema/output_parser";

const search = new SerpAPI();

const prompt =
  PromptTemplate.fromTemplate(`Turn the following user input into a search query for a search engine:
{input}`);

const model = new ChatOpenAI({});

const chain = prompt.pipe(model).pipe(new StringOutputParser()).pipe(search);

const result = await chain.invoke({
  input: "Who is the current prime minister of Malaysia?",
});

console.log(result);
/*
Anwar Ibrahim
*/