feat(demo): add examples (#159)
* 📝 update getting started

* 💡 improve docstrings

* ➕ add `mkdocstrings[python]` dependency

* 🐛 fix type annotations

* 📝 add api reference docs

* ⚡ minor changes

* ➕ add dependencies for examples

* 🔧 add social plugin

* ➕ add back gradio dependency

* 🔨 add chatgpt-clone example

* 🐛 fix json serialisation for langchain callbacks

* 🔨 add paulGPT example

* 📝 hide footer

* 📝 add discord invite link

* 👷 add back docs GA

* ✅ fix broken unit test

* ⚡ minor change
ajndkr authored Nov 28, 2023
1 parent 3d4e87d commit fb972a3
Showing 39 changed files with 2,228 additions and 247 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/docs.yaml
@@ -41,5 +41,5 @@ jobs:
- name: Deploy GitHub Pages
run: |
-pip install mkdocs-material mdx-include termynal
+pip install 'mkdocs-material[imaging]' mdx-include termynal 'mkdocstrings[python]'
mkdocs gh-deploy --force
1 change: 1 addition & 0 deletions README.md
@@ -8,6 +8,7 @@
[![Stars](https://img.shields.io/github/stars/ajndkr/lanarky)](https://github.com/ajndkr/lanarky/stargazers)
[![License](https://img.shields.io/badge/License-MIT-yellow.svg)](https://github.com/ajndkr/lanarky/blob/main/LICENSE)
[![Twitter](https://img.shields.io/twitter/follow/LanarkyAPI?style=social)](https://twitter.com/intent/follow?screen_name=LanarkyAPI)
[![Discord](https://img.shields.io/badge/join-Discord-7289da.svg)](https://discord.gg/6qUfrQAEeE)

[![Python](https://img.shields.io/pypi/pyversions/lanarky.svg)](https://pypi.org/project/lanarky/)
[![Coverage](https://coveralls.io/repos/github/ajndkr/lanarky/badge.svg?branch=main)](https://coveralls.io/github/ajndkr/lanarky?branch=main)
6 changes: 4 additions & 2 deletions docs/getting-started.md
@@ -1,12 +1,14 @@
---
hide:
- navigation
- footer
---

Let's build our first LLM microservice with Lanarky!

-We need to first install some extra dependencies as we will use OpenAI as the LLM
-provider.
+## Dependencies
+
+First, we will install Lanarky with the OpenAI adapter:

<!-- termynal -->

1 change: 1 addition & 0 deletions docs/index.md
@@ -2,6 +2,7 @@
hide:
- navigation
- toc
- footer
---

<style>
2 changes: 1 addition & 1 deletion docs/learn/adapters/langchain/fastapi.md
@@ -64,7 +64,7 @@ async def chat(
The `/chat` endpoint is similar to the one we created using `LangChainAPIRouter` in the
[LangChain API Router](./router.md) guide. Besides the `StreamingResponse` class, we also use
the `TokenStreamingCallbackHandler` callback handler to stream the intermediate tokens back to
-the client. Check out [Callbacks](../../callbacks.md) to learn more about the lanarky callback
+the client. Check out [Callbacks](./callbacks.md) to learn more about the lanarky callback
handlers.

!!! tip
9 changes: 6 additions & 3 deletions docs/learn/adapters/langchain/router.md
@@ -44,6 +44,8 @@ to the router to build a streaming endpoint. The additional parameters such as
To receive the events, we will use the following client script:

```python
import json

import click

from lanarky.clients import StreamingClient
@@ -60,7 +62,7 @@ def main(input: str, stream: bool):
params={"streaming": str(stream).lower()},
json={"input": input},
):
-print(f"{event.event}: {event.data}")
+print(f"{event.event}: {json.loads(event.data)['token']}", end="", flush=True)


if __name__ == "__main__":
@@ -81,7 +83,7 @@ Then run the client script:

```
$ python client.py --input "hi"
-completion: {'token': 'Hello! How can I assist you today?'}
+completion: Hello! How can I assist you today?
```
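As the updated client script shows, each `completion` event now carries a JSON payload whose `token` field is decoded and printed incrementally. A minimal standalone sketch of that reassembly step, using made-up payloads in place of a live server:

```python
import json

# Hypothetical SSE event payloads, mimicking what the streaming endpoint
# emits: each "completion" event carries a JSON string with a "token" field.
events = ['{"token": "Hello"}', '{"token": "! How can I"}', '{"token": " assist you today?"}']

# Decode each payload and concatenate the tokens to rebuild the full answer,
# just as the client script does with end="" and flush=True.
answer = "".join(json.loads(e)["token"] for e in events)
print(answer)
```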

## Websocket
@@ -116,6 +118,7 @@ function and send it to the router to build a websocket endpoint.
To communicate with the server, we will use the following client script:

```python
import json
from lanarky.clients import WebSocketClient


@@ -127,7 +130,7 @@ def main():
session.send(dict(input=user_input))
print("Received: ", end="")
for chunk in session.stream_response():
-print(chunk["data"]["token"], end="", flush=True)
+print(json.loads(chunk["data"])["token"], end="", flush=True)


if __name__ == "__main__":
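The updated line reflects the JSON serialisation fix for the langchain callbacks: the `data` field of each websocket chunk arrives as a JSON-encoded string rather than a dict, so it must be decoded before indexing into `token`. A standalone sketch of that decoding step, with a made-up chunk:

```python
import json

# A hypothetical message as received by the websocket client: the "data"
# field is a JSON-encoded string, which is why the client script calls
# json.loads before reading the "token" field.
chunk = {"event": "completion", "data": '{"token": "Hello!"}'}

token = json.loads(chunk["data"])["token"]
print(token)
```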
10 changes: 10 additions & 0 deletions docs/learn/index.md
@@ -1,6 +1,7 @@
---
hide:
- toc
- footer
---

# Learn
@@ -12,3 +13,12 @@ Here's a quick overview of what we will cover:
- [Streaming](./streaming.md): build streaming microservices using FastAPI routers
- [Websockets](./websockets.md): build websocket microservices using FastAPI routers
- [Adapters](./adapters/index.md): build microservices using popular LLM frameworks

## Examples

Lanarky comes with a set of examples to demonstrate its features:

- [ChatGPT-clone](https://github.com/ajndkr/lanarky/tree/main/examples/chatgpt-clone): a chatbot
application like [ChatGPT](https://chat.openai.com/)
- [PaulGPT](https://github.com/ajndkr/lanarky/tree/main/examples/paulGPT): a chatbot application
to answer questions about Paul Graham's essay, "[What I worked on](http://www.paulgraham.com/worked.html)"
9 changes: 9 additions & 0 deletions docs/reference/adapters/langchain.md
@@ -0,0 +1,9 @@
::: lanarky.adapters.langchain.routing

::: lanarky.adapters.langchain.responses

::: lanarky.adapters.langchain.callbacks

::: lanarky.adapters.langchain.dependencies

::: lanarky.adapters.langchain.utils
9 changes: 9 additions & 0 deletions docs/reference/adapters/openai.md
@@ -0,0 +1,9 @@
::: lanarky.adapters.openai.resources

::: lanarky.adapters.openai.routing

::: lanarky.adapters.openai.responses

::: lanarky.adapters.openai.dependencies

::: lanarky.adapters.openai.utils
30 changes: 30 additions & 0 deletions docs/reference/index.md
@@ -0,0 +1,30 @@
---
hide:
- toc
- footer
---

# API Reference

This is Lanarky's API reference documentation.

The API reference is split into multiple sections. First, we will cover the
Core API:

- [`Lanarky`](./lanarky.md) - The main application module
- [`StreamingResponse`](./streaming.md) - `Response` class for streaming
- [`WebSocketSession`](./websockets.md) - class for managing websocket sessions

!!! note

Lanarky also provides a collection of web clients for testing purposes.
See [Miscellaneous](./misc.md) for more information.

Next, we will cover the Adapter API:

- [OpenAI](./adapters/openai.md): Adapter module for
[OpenAI Python SDK](https://platform.openai.com/docs/api-reference?lang=python)
- [LangChain](./adapters/langchain.md): Adapter module for
[LangChain](https://www.langchain.com/)

You can find all other utility functions/classes in the [Miscellaneous](./misc.md) section.
5 changes: 5 additions & 0 deletions docs/reference/lanarky.md
@@ -0,0 +1,5 @@
# `Lanarky` class

::: lanarky.Lanarky
options:
members: []
5 changes: 5 additions & 0 deletions docs/reference/misc.md
@@ -0,0 +1,5 @@
Lanarky offers client classes to test streaming and websocket endpoints.

::: lanarky.clients

::: lanarky.logging
3 changes: 3 additions & 0 deletions docs/reference/streaming.md
@@ -0,0 +1,3 @@
# `StreamingResponse` class

::: lanarky.responses.StreamingResponse
3 changes: 3 additions & 0 deletions docs/reference/websockets.md
@@ -0,0 +1,3 @@
# `WebsocketSession` class

::: lanarky.websockets.WebsocketSession
34 changes: 34 additions & 0 deletions examples/chatgpt-clone/README.md
@@ -0,0 +1,34 @@
# ChatGPT-clone

A chatbot application like [ChatGPT](https://chat.openai.com/), built with Lanarky.

This example covers the following Lanarky features:

- OpenAI Adapter
- Streaming tokens via server-sent events

To learn more about Lanarky, check out Lanarky's [full documentation](https://lanarky.ajndkr.com/learn/).

## Setup

Install dependencies:

```
pip install 'lanarky[openai]' gradio
```

## Run

First we set the OpenAI API key:

```sh
export OPENAI_API_KEY=<your-key>
```

Then we run the server:

```sh
python app.py
```

Once the server is running, open http://localhost:8000/ in your browser.
81 changes: 81 additions & 0 deletions examples/chatgpt-clone/app.py
@@ -0,0 +1,81 @@
import gradio as gr

from lanarky import Lanarky
from lanarky.adapters.openai.resources import ChatCompletionResource
from lanarky.adapters.openai.routing import OpenAIAPIRouter
from lanarky.clients import StreamingClient

app = Lanarky()
router = OpenAIAPIRouter()


@router.post("/chat")
def chat() -> ChatCompletionResource:
system = "You are a sassy assistant"
return ChatCompletionResource(system=system, stream=True)


app.include_router(router)


def mount_playground(app: Lanarky) -> Lanarky:
blocks = gr.Blocks(
title="ChatGPT-clone",
theme=gr.themes.Default(
primary_hue=gr.themes.colors.teal, secondary_hue=gr.themes.colors.teal
),
css="footer {visibility: hidden}",
)

with blocks:
blocks.load(
None,
None,
js="""
() => {
document.body.className = "white";
}""",
)
gr.HTML(
"""<div align="center"><img src="https://lanarky.ajndkr.com/assets/logo-light-mode.png" width="350"></div>"""
)
chatbot = gr.Chatbot(height=500, show_label=False)
with gr.Row():
user_input = gr.Textbox(
show_label=False, placeholder="Type a message...", scale=5
)
clear_btn = gr.Button("Clear")

def chat(history):
messages = []
for human, assistant in history:
if human:
messages.append({"role": "user", "content": human})
if assistant:
messages.append({"role": "assistant", "content": assistant})

history[-1][1] = ""
for event in StreamingClient().stream_response(
"POST", "/chat", json={"messages": messages}
):
history[-1][1] += event.data
yield history

user_input.submit(
lambda user_input, chatbot: ("", chatbot + [[user_input, None]]),
[user_input, chatbot],
[user_input, chatbot],
queue=False,
).then(chat, chatbot, chatbot)
clear_btn.click(lambda: None, None, chatbot, queue=False)

return gr.mount_gradio_app(app, blocks.queue(), "/")


app = mount_playground(app)


if __name__ == "__main__":
import uvicorn

uvicorn.run(app)
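The inner `chat` function in `app.py` converts Gradio's `[human, assistant]` history pairs into OpenAI-style message dicts before streaming. That conversion can be sketched as a standalone helper (`history_to_messages` is a hypothetical name for illustration; in `app.py` the logic is inline):

```python
def history_to_messages(history):
    # Convert Gradio-style [human, assistant] pairs into OpenAI-style
    # role/content dicts, skipping empty turns (e.g. the pending reply slot).
    messages = []
    for human, assistant in history:
        if human:
            messages.append({"role": "user", "content": human})
        if assistant:
            messages.append({"role": "assistant", "content": assistant})
    return messages


print(history_to_messages([["hi", "hello!"], ["how are you?", None]]))
```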
35 changes: 35 additions & 0 deletions examples/paulGPT/README.md
@@ -0,0 +1,35 @@
# PaulGPT

A chatbot application to answer questions about Paul Graham's essay, "[What I worked on](http://www.paulgraham.com/worked.html)",
built with Lanarky.

This example covers the following Lanarky features:

- LangChain Adapter
- Streaming source documents via server-sent events and websockets

To learn more about Lanarky, check out Lanarky's [full documentation](https://lanarky.ajndkr.com/learn/).

## Setup

Install dependencies:

```
pip install 'lanarky[openai]' gradio faiss-cpu
```

## Run

First we set the OpenAI API key:

```sh
export OPENAI_API_KEY=<your-key>
```

Then we run the server:

```sh
python app.py
```

Once the server is running, open http://localhost:8000/ in your browser.