* ✨ add LangchainRouter
* ✅ add unit test
* 📝 update README
* 📝 update docs
* ⬆️ upgrade langchain dependency
* 💡 add docstrings
Showing 14 changed files with 369 additions and 60 deletions.
This file was deleted.
@@ -0,0 +1,32 @@

🔥 Getting Started
===================

.. note::
   Create a ``.env`` file using ``.env.sample`` and add your OpenAI API key to it before running the examples.

You can get started with Lanarky quickly and deploy your first Langchain app in just a few lines of code.

.. code-block:: python

   from dotenv import load_dotenv
   from fastapi import FastAPI
   from langchain import ConversationChain
   from langchain.chat_models import ChatOpenAI

   from lanarky.routing import LangchainRouter

   load_dotenv()

   app = FastAPI()

   langchain_router = LangchainRouter(
       langchain_object=ConversationChain(
           llm=ChatOpenAI(temperature=0), verbose=True
       )
   )

   app.include_router(langchain_router)

.. image:: https://raw.githubusercontent.com/ajndkr/lanarky/main/assets/demo.gif

.. seealso::
   You can find more Langchain demos in the `examples/ <https://github.com/ajndkr/lanarky/blob/main/examples/README.md>`_
   folder of the GitHub repository.
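Once the app is running (for example via ``uvicorn app:app``), the generated ``/chat`` endpoint can be exercised with any HTTP client. Below is a minimal sketch using only the standard library; the payload key is an assumption based on ``ConversationChain``, which takes a single ``input`` key by default:

```python
import json

# Request payload for the POST /chat endpoint created by LangchainRouter.
# ConversationChain expects a single "input" key; other chains may differ.
payload = {"input": "Hello, what can you do?"}
body = json.dumps(payload).encode("utf-8")

# Sending the request (assumes a server on localhost:8000; not run here):
#
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:8000/chat",
#     data=body,
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
```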
2 changes (1 addition & 1 deletion) in docs/advanced/custom_callbacks.rst → docs/langchain/custom_callbacks.rst
@@ -0,0 +1,63 @@

Deploying Langchain Applications
=================================

Lanarky offers a straightforward way to deploy your Langchain app using ``LangchainRouter``.

``LangchainRouter`` inherits from FastAPI's ``APIRouter`` class and creates an API endpoint for your target Langchain object.

To better understand ``LangchainRouter``, let's break down the example below:

.. code-block:: python

   from dotenv import load_dotenv
   from fastapi import FastAPI
   from langchain import ConversationChain
   from langchain.chat_models import ChatOpenAI

   from lanarky.routing import LangchainRouter

   load_dotenv()

   app = FastAPI()

   langchain_router = LangchainRouter(
       langchain_object=ConversationChain(
           llm=ChatOpenAI(temperature=0),
           verbose=True,
       )
   )

   app.include_router(langchain_router)

In the above example, ``langchain_router`` is an ``APIRouter`` instance exposing a POST endpoint at ``/chat``.
This endpoint accepts JSON data as input and returns JSON data as output.

Now, let's explore one of Lanarky's key features: deploying your Langchain app with token streaming.
Here's an example:

.. code-block:: python

   from dotenv import load_dotenv
   from fastapi import FastAPI
   from langchain import ConversationChain
   from langchain.chat_models import ChatOpenAI

   from lanarky.routing import LangchainRouter

   load_dotenv()

   app = FastAPI()

   langchain_router = LangchainRouter(
       langchain_object=ConversationChain(
           llm=ChatOpenAI(temperature=0, streaming=True),
           verbose=True,
       ),
       streaming_mode=1,
   )

   app.include_router(langchain_router)

By including ``streaming_mode=1`` in the ``LangchainRouter`` initialization, your Langchain app is deployed
with token streaming enabled.

The ``LangchainRouter`` class uses the ``streaming_mode`` parameter to determine the token streaming behavior.
There are three available modes:

- ``streaming_mode=0``: no token streaming
- ``streaming_mode=1``: token streaming as plain text
- ``streaming_mode=2``: token streaming as JSON
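These integer values mirror the ``StreamingMode`` ``IntEnum`` defined in the router module, so the raw integer and the enum member are interchangeable:

```python
from enum import IntEnum

# Mirrors StreamingMode from the lanarky routing module.
class StreamingMode(IntEnum):
    OFF = 0
    TEXT = 1
    JSON = 2

# IntEnum members compare equal to their underlying integers, which is why
# streaming_mode=1 behaves the same as streaming_mode=StreamingMode.TEXT.
assert StreamingMode(1) is StreamingMode.TEXT
assert StreamingMode.TEXT == 1
```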
@@ -0,0 +1,15 @@

Langchain
==================

Welcome to the Lanarky documentation for the Langchain framework. Langchain is a popular framework for building
applications on top of available LLMs. While Langchain handles the core logic of the application, it does not cover
the deployment requirements for making your application production-ready.

This is where Lanarky comes in. Built on top of FastAPI, Lanarky provides a flexible deployment solution for
your Langchain application.

.. toctree::
   :maxdepth: 1

   deploy
   custom_callbacks
@@ -0,0 +1,3 @@

from .langchain import LangchainRouter

__all__ = ["LangchainRouter"]
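This re-export keeps ``LangchainRouter`` importable from the package root, while ``__all__`` controls what a star-import exposes. A small stdlib-only demonstration (the module name here is a hypothetical stand-in, not the lanarky package):

```python
import sys
import types

# Build a throwaway module that mimics lanarky/routing/__init__.py:
# one public symbol listed in __all__, one helper that is not.
mod = types.ModuleType("routing_demo")
exec(
    "class LangchainRouter: pass\n"
    "helper = object()\n"
    "__all__ = ['LangchainRouter']\n",
    mod.__dict__,
)
sys.modules["routing_demo"] = mod

ns = {}
exec("from routing_demo import *", ns)

# Only names listed in __all__ survive a star-import.
assert "LangchainRouter" in ns
assert "helper" not in ns
```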
@@ -0,0 +1,90 @@

from enum import IntEnum
from typing import Any, Type

from fastapi.routing import APIRouter
from langchain.chains.base import Chain

from .utils import (
    create_langchain_base_endpoint,
    create_langchain_dependency,
    create_langchain_streaming_endpoint,
    create_langchain_streaming_json_endpoint,
    create_request_from_langchain_dependency,
    create_response_model_from_langchain_dependency,
)


class StreamingMode(IntEnum):
    OFF = 0
    TEXT = 1
    JSON = 2


def create_langchain_endpoint(
    endpoint_request, langchain_dependency, response_model, streaming_mode
):
    """Creates a Langchain endpoint."""
    if streaming_mode == StreamingMode.OFF:
        endpoint = create_langchain_base_endpoint(
            endpoint_request, langchain_dependency, response_model
        )
    elif streaming_mode == StreamingMode.TEXT:
        endpoint = create_langchain_streaming_endpoint(
            endpoint_request, langchain_dependency
        )
    elif streaming_mode == StreamingMode.JSON:
        endpoint = create_langchain_streaming_json_endpoint(
            endpoint_request, langchain_dependency
        )
    else:
        raise ValueError(f"Invalid streaming mode: {streaming_mode}")

    return endpoint
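The ``if``/``elif`` chain in ``create_langchain_endpoint`` is a mode-to-factory dispatch; the same logic can also be expressed as a lookup table. Here is a stdlib-only sketch with hypothetical stand-ins for the lanarky helper factories:

```python
from enum import IntEnum
from typing import Callable

class StreamingMode(IntEnum):
    OFF = 0
    TEXT = 1
    JSON = 2

# Hypothetical factories standing in for the lanarky endpoint helpers.
def make_base_endpoint() -> str:
    return "base"

def make_text_streaming_endpoint() -> str:
    return "streaming-text"

def make_json_streaming_endpoint() -> str:
    return "streaming-json"

FACTORIES: dict[StreamingMode, Callable[[], str]] = {
    StreamingMode.OFF: make_base_endpoint,
    StreamingMode.TEXT: make_text_streaming_endpoint,
    StreamingMode.JSON: make_json_streaming_endpoint,
}

def create_endpoint(mode: int) -> str:
    # StreamingMode(mode) raises ValueError for unknown integers,
    # matching the else branch in the original function.
    try:
        factory = FACTORIES[StreamingMode(mode)]
    except ValueError:
        raise ValueError(f"Invalid streaming mode: {mode}") from None
    return factory()
```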

class LangchainRouter(APIRouter):
    def __init__(
        self,
        *,
        langchain_url: str = "/chat",
        langchain_object: Type[Chain] = Chain,
        langchain_endpoint_kwargs: dict[str, Any] = None,
        streaming_mode: StreamingMode = StreamingMode.OFF,
        **kwargs,
    ):
        super().__init__(**kwargs)
        self.langchain_url = langchain_url
        self.langchain_object = langchain_object
        self.langchain_endpoint_kwargs = langchain_endpoint_kwargs or {}
        self.streaming_mode = streaming_mode

        self.langchain_dependency = create_langchain_dependency(langchain_object)

        self.setup()

    def setup(self) -> None:
        if self.langchain_url:
            endpoint_request = create_request_from_langchain_dependency(
                self.langchain_dependency
            )
            response_model = (
                create_response_model_from_langchain_dependency(
                    self.langchain_dependency
                )
                if self.streaming_mode == StreamingMode.OFF
                else None
            )
            endpoint = create_langchain_endpoint(
                endpoint_request,
                self.langchain_dependency,
                response_model,
                self.streaming_mode,
            )

            self.add_api_route(
                self.langchain_url,
                endpoint,
                response_model=response_model,
                methods=["POST"],
                **self.langchain_endpoint_kwargs,
            )
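The overall flow of ``LangchainRouter`` (build the endpoint during ``__init__`` via ``setup()``, then register it with ``add_api_route``) can be illustrated with a framework-free analogue. All names below are hypothetical stand-ins, not the lanarky or FastAPI API:

```python
from typing import Any, Callable

# A minimal router base that stores registered endpoints in a dict,
# mimicking the role APIRouter.add_api_route plays above.
class Router:
    def __init__(self) -> None:
        self.routes: dict[tuple[str, str], Callable[..., Any]] = {}

    def add_api_route(self, path: str, endpoint, methods=("POST",)) -> None:
        for method in methods:
            self.routes[(method, path)] = endpoint

# A subclass that, like LangchainRouter, builds and registers its
# endpoint during construction, so including the router is all the
# application has to do.
class ChainRouter(Router):
    def __init__(self, chain: Callable[[str], str], url: str = "/chat") -> None:
        super().__init__()
        self.chain = chain
        self.setup(url)

    def setup(self, url: str) -> None:
        def endpoint(payload: dict) -> dict:
            return {"output": self.chain(payload["input"])}

        self.add_api_route(url, endpoint, methods=("POST",))

router = ChainRouter(lambda text: text.upper())
```

Calling the registered endpoint directly shows the round trip: `router.routes[("POST", "/chat")]({"input": "hi"})` returns `{"output": "HI"}`.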