feat: add LangchainRouter (#66)
* ✨ add LangchainRouter

* ✅ add unit test

* 📝 update README

* 📝 update docs

* ⬆️ upgrade langchain dependency

* 💡 add docstrings
ajndkr authored May 28, 2023
1 parent 6508fc0 commit 99e036b
Showing 14 changed files with 369 additions and 60 deletions.
2 changes: 1 addition & 1 deletion Makefile
```diff
@@ -17,7 +17,7 @@ coverage: ## run unit tests with coverage
 pre-commit: ## run pre-commit hooks
 	poetry run pre-commit run --all-files

-build-docs: ## build documentation
+build-docs: clean-docs ## build documentation
 	poetry run sphinx-autobuild -b html --host 0.0.0.0 --port 8000 docs docs/_build/html

 clean-docs: ## clean documentation
```
28 changes: 14 additions & 14 deletions README.md
@@ -5,11 +5,11 @@
<h1> Lanarky </h1>

```diff
 [![stars](https://img.shields.io/github/stars/ajndkr/lanarky)](https://github.com/ajndkr/lanarky/stargazers)
-[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://github.com/ajndkr/lanarky/blob/main/LICENSE)
-[![Documentation](https://img.shields.io/badge/documentation-ReadTheDocs-blue.svg)](https://lanarky.readthedocs.io/en/latest/)
 [![PyPI version](https://badge.fury.io/py/lanarky.svg)](https://pypi.org/project/lanarky/)
 [![Python 3.9](https://img.shields.io/badge/python-3.9-blue.svg)](https://www.python.org/downloads/release/python-3916/)
 ![Supported Python Versions](https://img.shields.io/pypi/pyversions/lanarky.svg)
 [![Code Coverage](https://coveralls.io/repos/github/ajndkr/lanarky/badge.svg?branch=main)](https://coveralls.io/github/ajndkr/lanarky?branch=main)
+[![Documentation](https://img.shields.io/badge/documentation-ReadTheDocs-blue.svg)](https://lanarky.readthedocs.io/en/latest/)
+[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://github.com/ajndkr/lanarky/blob/main/LICENSE)
```

</div>

@@ -40,29 +40,29 @@ pip install lanarky

You can find the full documentation at [https://lanarky.readthedocs.io/en/latest/](https://lanarky.readthedocs.io/en/latest/).

````diff
-## 🔥 Deploy a simple Langchain application in under 20 lines of code
+## 🔥 Build your first Langchain app

 ```python
 from dotenv import load_dotenv
 from fastapi import FastAPI
 from langchain import ConversationChain
 from langchain.chat_models import ChatOpenAI
-from pydantic import BaseModel
-from lanarky.responses import StreamingResponse
+from lanarky.routing import LangchainRouter

 load_dotenv()
 app = FastAPI()

-class Request(BaseModel):
-    query: str
-
-@app.post("/chat")
-async def chat(request: Request) -> StreamingResponse:
-    chain = ConversationChain(llm=ChatOpenAI(temperature=0, streaming=True), verbose=True)
-    return StreamingResponse.from_chain(chain, request.query, media_type="text/event-stream")
+langchain_router = LangchainRouter(
+    langchain_object=ConversationChain(
+        llm=ChatOpenAI(temperature=0), verbose=True
+    )
+)
+
+app.include_router(langchain_router)
 ```
````

See [`examples/`](https://github.com/ajndkr/lanarky/blob/main/examples/README.md)
for a list of available demo examples.

Create a `.env` file using `.env.sample` and add your OpenAI API key to it
before running the examples.
Expand Down
36 changes: 0 additions & 36 deletions docs/basic/getting_started.rst

This file was deleted.

2 changes: 1 addition & 1 deletion docs/basic/features.rst → docs/features.rst
```diff
@@ -4,7 +4,7 @@
 Langchain Support
 -----------------

-- Supports output streaming over HTTP and Websocket
+- Supports multi-mode output streaming over HTTP and Websocket
 - Supports multiple Chains and Agents

 Gradio Testing
```
32 changes: 32 additions & 0 deletions docs/getting_started.rst
@@ -0,0 +1,32 @@
🔥 Getting Started
===================

.. note::
   Create a ``.env`` file using ``.env.sample`` and add your OpenAI API key to it before running the examples.

You can quickly get started with Lanarky and deploy your first Langchain app in just a few lines of code.

.. code-block:: python

   from dotenv import load_dotenv
   from fastapi import FastAPI
   from langchain import ConversationChain
   from langchain.chat_models import ChatOpenAI

   from lanarky.routing import LangchainRouter

   load_dotenv()

   app = FastAPI()

   langchain_router = LangchainRouter(
       langchain_object=ConversationChain(
           llm=ChatOpenAI(temperature=0), verbose=True
       )
   )

   app.include_router(langchain_router)

.. image:: https://raw.githubusercontent.com/ajndkr/lanarky/main/assets/demo.gif

.. seealso::
   You can find more Langchain demos in the `examples/ <https://github.com/ajndkr/lanarky/blob/main/examples/README.md>`_
   folder of the GitHub repository.
11 changes: 5 additions & 6 deletions docs/index.rst
```diff
@@ -9,19 +9,18 @@ Welcome to Lanarky
 .. toctree::
    :maxdepth: 1
    :name: basic
-   :caption: Basic
    :hidden:

-   basic/features
-   basic/getting_started
+   features
+   getting_started

 .. toctree::
    :maxdepth: 1
-   :name: advanced
-   :caption: Advanced
+   :name: frameworks
+   :caption: Supported Frameworks
    :hidden:

-   advanced/custom_callbacks
+   langchain/index

 .. toctree::
    :maxdepth: 2
```
```diff
@@ -1,4 +1,4 @@
-Langchain: Register Custom Callbacks
+Advanced: Register Custom Callbacks
 =====================================

 Lanarky auto-detects the required callback handler based on input chain type using a callback registry.
```
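The auto-detection described above can be illustrated with a minimal, self-contained sketch of a type-keyed callback registry. All names here (`CALLBACK_REGISTRY`, `register`, `get_handler`, the stand-in chain classes) are hypothetical, not Lanarky's actual API:

```python
# Minimal sketch of a type-keyed callback registry (hypothetical names;
# Lanarky's real registry lives inside the library).
from typing import Callable, Dict, Type


class Chain:  # stand-in for langchain.chains.base.Chain
    pass


class ConversationChain(Chain):
    pass


CALLBACK_REGISTRY: Dict[Type[Chain], Callable] = {}


def register(chain_type: Type[Chain]):
    """Decorator that maps a chain type to its callback handler."""
    def decorator(handler: Callable) -> Callable:
        CALLBACK_REGISTRY[chain_type] = handler
        return handler
    return decorator


@register(ConversationChain)
def conversation_handler(token: str) -> str:
    return f"token: {token}"


def get_handler(chain: Chain) -> Callable:
    """Auto-detect the handler from the chain's type, walking its MRO."""
    for cls in type(chain).__mro__:
        if cls in CALLBACK_REGISTRY:
            return CALLBACK_REGISTRY[cls]
    raise KeyError(f"no handler registered for {type(chain).__name__}")


handler = get_handler(ConversationChain())
print(handler("hello"))  # token: hello
```

Walking the MRO means a handler registered for a base chain class also covers its subclasses, which is one plausible reason a registry keyed by type is used here.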
63 changes: 63 additions & 0 deletions docs/langchain/deploy.rst
@@ -0,0 +1,63 @@
Deploying Langchain Applications
=================================

Lanarky offers a straightforward method for deploying your Langchain app using ``LangchainRouter``.

``LangchainRouter`` inherits from FastAPI's ``APIRouter`` class and creates an API endpoint using your target Langchain object.

To better understand ``LangchainRouter``, let's break down the example below:

.. code-block:: python

   from dotenv import load_dotenv
   from fastapi import FastAPI
   from langchain import ConversationChain
   from langchain.chat_models import ChatOpenAI

   from lanarky.routing import LangchainRouter

   load_dotenv()

   app = FastAPI()

   langchain_router = LangchainRouter(
       langchain_object=ConversationChain(
           llm=ChatOpenAI(temperature=0),
           verbose=True,
       )
   )

   app.include_router(langchain_router)

In the above example, ``langchain_router`` is an instance of the ``APIRouter`` class that creates a POST endpoint at ``/chat``.
This endpoint accepts JSON data as input and returns JSON data as output.

Now, let's explore an interesting feature of Lanarky: the ability to deploy your Langchain app with token streaming.
Here's an example:

.. code-block:: python

   from dotenv import load_dotenv
   from fastapi import FastAPI
   from langchain import ConversationChain
   from langchain.chat_models import ChatOpenAI

   from lanarky.routing import LangchainRouter

   load_dotenv()

   app = FastAPI()

   langchain_router = LangchainRouter(
       langchain_object=ConversationChain(
           llm=ChatOpenAI(temperature=0, streaming=True),
           verbose=True,
       ),
       streaming_mode=1,
   )

   app.include_router(langchain_router)

By including ``streaming_mode=1`` in the ``LangchainRouter`` initialization, your Langchain app is deployed
with token streaming enabled.

The ``LangchainRouter`` class uses the ``streaming_mode`` parameter to determine the token streaming behavior.
There are three available modes:

- ``streaming_mode=0``: No token streaming
- ``streaming_mode=1``: Token streaming as plain text
- ``streaming_mode=2``: Token streaming as JSON
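How a client consumes the response differs per mode. The sketch below simulates the three behaviors with the stdlib only; the chunk shapes and field names (``output``, ``token``) are illustrative assumptions, not Lanarky's exact wire format:

```python
import json
from typing import Iterable, List


def consume(chunks: Iterable[str], streaming_mode: int) -> List[str]:
    """Collect tokens from a (simulated) response stream, per streaming mode."""
    if streaming_mode == 0:
        # No streaming: the whole answer arrives as one JSON body.
        return [json.loads("".join(chunks))["output"]]
    if streaming_mode == 1:
        # Plain-text streaming: each chunk is a raw token.
        return list(chunks)
    if streaming_mode == 2:
        # JSON streaming: each chunk is a JSON object carrying a token.
        return [json.loads(c)["token"] for c in chunks]
    raise ValueError(f"invalid streaming mode: {streaming_mode}")


print(consume(['{"output": "hi there"}'], 0))                   # ['hi there']
print(consume(["hi", " there"], 1))                             # ['hi', ' there']
print(consume(['{"token": "hi"}', '{"token": " there"}'], 2))   # ['hi', ' there']
```

Mode 2 trades a little bandwidth for structure: each chunk is independently parseable, which makes client-side handling of tokens more robust than splitting raw text.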
15 changes: 15 additions & 0 deletions docs/langchain/index.rst
@@ -0,0 +1,15 @@
Langchain
=========

Welcome to the Lanarky documentation for the Langchain framework. Langchain is a popular framework for building
applications on top of available LLMs. While Langchain handles the core logic of the application, it lacks
the deployment tooling needed to make your application production-ready.

This is where Lanarky comes in. Built on top of FastAPI, Lanarky provides a flexible deployment solution for
your Langchain application.

.. toctree::
   :maxdepth: 1

   deploy
   custom_callbacks
3 changes: 3 additions & 0 deletions lanarky/routing/__init__.py
@@ -0,0 +1,3 @@
from .langchain import LangchainRouter

__all__ = ["LangchainRouter"]
90 changes: 90 additions & 0 deletions lanarky/routing/langchain.py
@@ -0,0 +1,90 @@
```python
from enum import IntEnum
from typing import Any, Type

from fastapi.routing import APIRouter
from langchain.chains.base import Chain

from .utils import (
    create_langchain_base_endpoint,
    create_langchain_dependency,
    create_langchain_streaming_endpoint,
    create_langchain_streaming_json_endpoint,
    create_request_from_langchain_dependency,
    create_response_model_from_langchain_dependency,
)


class StreamingMode(IntEnum):
    OFF = 0
    TEXT = 1
    JSON = 2


def create_langchain_endpoint(
    endpoint_request, langchain_dependency, response_model, streaming_mode
):
    """Creates a Langchain endpoint."""
    if streaming_mode == StreamingMode.OFF:
        endpoint = create_langchain_base_endpoint(
            endpoint_request, langchain_dependency, response_model
        )
    elif streaming_mode == StreamingMode.TEXT:
        endpoint = create_langchain_streaming_endpoint(
            endpoint_request, langchain_dependency
        )
    elif streaming_mode == StreamingMode.JSON:
        endpoint = create_langchain_streaming_json_endpoint(
            endpoint_request, langchain_dependency
        )
    else:
        raise ValueError(f"Invalid streaming mode: {streaming_mode}")

    return endpoint


class LangchainRouter(APIRouter):
    def __init__(
        self,
        *,
        langchain_url: str = "/chat",
        langchain_object: Type[Chain] = Chain,
        langchain_endpoint_kwargs: dict[str, Any] = None,
        streaming_mode: StreamingMode = StreamingMode.OFF,
        **kwargs,
    ):
        super().__init__(**kwargs)
        self.langchain_url = langchain_url
        self.langchain_object = langchain_object
        self.langchain_endpoint_kwargs = langchain_endpoint_kwargs or {}
        self.streaming_mode = streaming_mode

        self.langchain_dependency = create_langchain_dependency(langchain_object)

        self.setup()

    def setup(self) -> None:
        if self.langchain_url:
            endpoint_request = create_request_from_langchain_dependency(
                self.langchain_dependency
            )
            response_model = (
                create_response_model_from_langchain_dependency(
                    self.langchain_dependency
                )
                if self.streaming_mode == StreamingMode.OFF
                else None
            )
            endpoint = create_langchain_endpoint(
                endpoint_request,
                self.langchain_dependency,
                response_model,
                self.streaming_mode,
            )

            self.add_api_route(
                self.langchain_url,
                endpoint,
                response_model=response_model,
                methods=["POST"],
                **self.langchain_endpoint_kwargs,
            )
```
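A note on the design: because `StreamingMode` is an `IntEnum`, callers can pass plain integers (as the docs' `streaming_mode=1` examples do) and the equality checks in `create_langchain_endpoint` still match. A small stdlib demonstration of that behavior:

```python
from enum import IntEnum


class StreamingMode(IntEnum):
    OFF = 0
    TEXT = 1
    JSON = 2


# Plain ints convert to members, so streaming_mode=1 works as an argument...
assert StreamingMode(1) is StreamingMode.TEXT

# ...and IntEnum members compare equal to raw ints in either direction.
assert StreamingMode.JSON == 2
assert 0 == StreamingMode.OFF

# Invalid values fail fast, mirroring the ValueError branch in
# create_langchain_endpoint.
try:
    StreamingMode(7)
except ValueError as exc:
    print(exc)  # 7 is not a valid StreamingMode
```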