Fix name
jacoblee93 committed Jan 28, 2025
1 parent f05ca0d commit 8663686
Showing 9 changed files with 233 additions and 45 deletions.
24 changes: 13 additions & 11 deletions docs/core_docs/docs/integrations/chat/deepseek.ipynb
@@ -10,7 +10,7 @@
},
"source": [
"---\n",
"sidebar_label: Deepseek\n",
"sidebar_label: DeepSeek\n",
"---"
]
},
@@ -19,34 +19,36 @@
"id": "e49f1e0d",
"metadata": {},
"source": [
"# ChatDeepseek\n",
"# ChatDeepSeek\n",
"\n",
"This will help you getting started with Deepseek [chat models](/docs/concepts/#chat-models). For detailed documentation of all `ChatDeepseek` features and configurations head to the [API reference](https://api.js.langchain.com/classes/_langchain_deepseek.ChatDeepseek.html).\n",
"This will help you getting started with DeepSeek [chat models](/docs/concepts/#chat-models). For detailed documentation of all `ChatDeepSeek` features and configurations head to the [API reference](https://api.js.langchain.com/classes/_langchain_deepseek.ChatDeepSeek.html).\n",
"\n",
"## Overview\n",
"### Integration details\n",
"\n",
"| Class | Package | Local | Serializable | [PY support](https://python.langchain.com/docs/integrations/chat/deepseek) | Package downloads | Package latest |\n",
"| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n",
"| [`ChatDeepseek`](https://api.js.langchain.com/classes/_langchain_deepseek.ChatDeepseek.html) | [`@langchain/deepseek`](https://npmjs.com/@langchain/deepseek) | ❌ (see [Ollama](/docs/integrations/chat/ollama)) | beta | ✅ | ![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/deepseek?style=flat-square&label=%20&) | ![NPM - Version](https://img.shields.io/npm/v/@langchain/deepseek?style=flat-square&label=%20&) |\n",
"| [`DeepSeek`](https://api.js.langchain.com/classes/_langchain_deepseek.ChatDeepSeek.html) | [`@langchain/deepseek`](https://npmjs.com/@langchain/deepseek) | ❌ (see [Ollama](/docs/integrations/chat/ollama)) | beta | ✅ | ![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/deepseek?style=flat-square&label=%20&) | ![NPM - Version](https://img.shields.io/npm/v/@langchain/deepseek?style=flat-square&label=%20&) |\n",
"\n",
"### Model features\n",
"\n",
"See the links in the table headers below for guides on how to use specific features.\n",
"\n",
"| [Tool calling](/docs/how_to/tool_calling) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
"| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
"| ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ | | \n",
"| ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ | | \n",
"\n",
"Note that as of 1/27/25, tool calling and structured output are not currently supported for `deepseek-reasoner`.\n",
"\n",
"## Setup\n",
"\n",
"To access Deepseek models you'll need to create a/an Deepseek account, get an API key, and install the `@langchain/deepseek` integration package.\n",
"To access DeepSeek models you'll need to create a DeepSeek account, get an API key, and install the `@langchain/deepseek` integration package.\n",
"\n",
"You can also access the DeepSeek API through providers like [Together AI](/docs/integrations/chat/togetherai) or [Ollama](/docs/integrations/chat/ollama).\n",
"\n",
"### Credentials\n",
"\n",
"Head to https://deepseek.com/ to sign up to Deepseek and generate an API key. Once you've done this set the `DEEPSEEK_API_KEY` environment variable:\n",
"Head to https://deepseek.com/ to sign up to DeepSeek and generate an API key. Once you've done this set the `DEEPSEEK_API_KEY` environment variable:\n",
"\n",
"```bash\n",
"export DEEPSEEK_API_KEY=\"your-api-key\"\n",
@@ -61,7 +63,7 @@
"\n",
"### Installation\n",
"\n",
"The LangChain ChatDeepseek integration lives in the `@langchain/deepseek` package:\n",
"The LangChain ChatDeepSeek integration lives in the `@langchain/deepseek` package:\n",
"\n",
"```{=mdx}\n",
"import IntegrationInstallTooltip from \"@mdx_components/integration_install_tooltip.mdx\";\n",
@@ -97,9 +99,9 @@
},
"outputs": [],
"source": [
"import { ChatDeepseek } from \"@langchain/deepseek\";\n",
"import { ChatDeepSeek } from \"@langchain/DeepSeek\n",
"\n",
"const llm = new ChatDeepseek({\n",
"const llm = new ChatDeepSeek({\n",
" model: \"deepseek-reasoner\",\n",
" temperature: 0,\n",
" // other params...\n",
@@ -200,7 +202,7 @@
"source": [
"## API reference\n",
"\n",
"For detailed documentation of all ChatDeepseek features and configurations head to the API reference: https://api.js.langchain.com/classes/_langchain_deepseek.ChatDeepseek.html"
"For detailed documentation of all ChatDeepSeek features and configurations head to the API reference: https://api.js.langchain.com/classes/_langchain_deepseek.ChatDeepSeek.html"
]
}
],
2 changes: 1 addition & 1 deletion docs/core_docs/docs/integrations/chat/xai.ipynb
@@ -23,7 +23,7 @@
"\n",
"[xAI](https://x.ai/) is an artificial intelligence company that develops large language models (LLMs). Their flagship model, Grok, is trained on real-time X (formerly Twitter) data and aims to provide witty, personality-rich responses while maintaining high capability on technical tasks.\n",
"\n",
"This guide will help you getting started with `ChatXAI` [chat models](/docs/concepts/chat_models). For detailed documentation of all `ChatXAI` features and configurations head to the [API reference](https://api.js.langchain.com/classes/langchain_community_chat_models_fireworks.ChatXAI.html).\n",
"This guide will help you getting started with `ChatXAI` [chat models](/docs/concepts/chat_models). For detailed documentation of all `ChatXAI` features and configurations head to the [API reference](https://api.js.langchain.com/classes/_langchain_xai.ChatXAI.html).\n",
"\n",
"## Overview\n",
"### Integration details\n",
80 changes: 80 additions & 0 deletions libs/langchain-deepseek/README.md
@@ -0,0 +1,80 @@
# @langchain/deepseek

This package contains the LangChain.js integrations for DeepSeek.

## Installation

```bash npm2yarn
npm install @langchain/deepseek @langchain/core
```

## Chat models

This package adds support for DeepSeek's chat model inference.

Set the necessary environment variable (or pass it in via the constructor):

```bash
export DEEPSEEK_API_KEY=
```

```typescript
import { ChatDeepSeek } from "@langchain/deepseek";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatDeepSeek({
apiKey: process.env.DEEPSEEK_API_KEY, // Default value.
model: "<model_name>",
});

const res = await model.invoke([
{
role: "user",
    content: "Hello, world!",
},
]);
```
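
Because `ChatDeepSeek` follows the standard LangChain chat model interface, streaming should work the same way as with other chat models. A minimal sketch, assuming `deepseek-chat` as the model name and `DEEPSEEK_API_KEY` set in the environment:

```typescript
import { ChatDeepSeek } from "@langchain/deepseek";

const streamingModel = new ChatDeepSeek({
  model: "deepseek-chat",
});

// Stream the reply token by token instead of waiting for the full message.
const stream = await streamingModel.stream("Write a haiku about the sea.");

for await (const chunk of stream) {
  // Each chunk is an AIMessageChunk; `content` carries the incremental text.
  process.stdout.write(chunk.content as string);
}
```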

## Development

To develop the `@langchain/deepseek` package, you'll need to follow these instructions:

### Install dependencies

```bash
yarn install
```

### Build the package

```bash
yarn build
```

Or from the repo root:

```bash
yarn build --filter=@langchain/deepseek
```

### Run tests

Test files should live within a `tests/` folder in the `src/` folder. Unit tests should end in `.test.ts` and integration tests should
end in `.int.test.ts`:

```bash
$ yarn test
$ yarn test:int
```
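
For example, a hypothetical unit test following this convention (the file name and assertion are purely illustrative):

```typescript
// src/tests/chat_models.test.ts: a sketch of a unit test for this package.
import { test, expect } from "@jest/globals";
import { ChatDeepSeek } from "../chat_models.js";

test("instantiates with an explicit apiKey", () => {
  // No network call is made at construction time, so this runs as a unit test.
  const model = new ChatDeepSeek({ apiKey: "test-key", model: "deepseek-chat" });
  expect(model).toBeDefined();
});
```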

### Lint & Format

Run the linter & formatter to ensure your code is up to standard:

```bash
yarn lint && yarn format
```

### Adding new entrypoints

If you add a new file to be exported, either import & re-export from `src/index.ts`, or add it to the `entrypoints` field in the `config` variable located inside `langchain.config.js` and run `yarn build` to generate the new entrypoint.
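
As a purely illustrative sketch of the re-export approach (the file and export names below are hypothetical):

```typescript
// src/index.ts: re-export the new module so it ships from the package's main entrypoint.
export { MyNewHelper } from "./my_new_helper.js"; // hypothetical file added under src/
```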
1 change: 1 addition & 0 deletions libs/langchain-deepseek/package.json
@@ -41,6 +41,7 @@
"@jest/globals": "^29.5.0",
"@langchain/core": "workspace:*",
"@langchain/scripts": ">=0.1.0 <0.2.0",
"@langchain/standard-tests": "workspace:*",
"@swc/core": "^1.3.90",
"@swc/jest": "^0.2.29",
"@tsconfig/recommended": "^1.0.3",
130 changes: 117 additions & 13 deletions libs/langchain-deepseek/src/chat_models.ts
@@ -6,11 +6,11 @@ import {
OpenAIClient,
} from "@langchain/openai";

export interface ChatDeepseekCallOptions extends ChatOpenAICallOptions {
export interface ChatDeepSeekCallOptions extends ChatOpenAICallOptions {
headers?: Record<string, string>;
}

export interface ChatDeepseekInput extends ChatOpenAIFields {
export interface ChatDeepSeekInput extends ChatOpenAIFields {
/**
* The Deepseek API key to use for requests.
* @default process.env.DEEPSEEK_API_KEY
@@ -59,18 +59,27 @@ export interface ChatDeepseekInput extends ChatOpenAIFields {
* export DEEPSEEK_API_KEY="your-api-key"
* ```
*
* ## [Constructor args](https://api.js.langchain.com/classes/_langchain_deepseek.ChatDeepseek.html#constructor)
* ## [Constructor args](https://api.js.langchain.com/classes/_langchain_deepseek.ChatDeepSeek.html#constructor)
*
* ## [Runtime args](https://api.js.langchain.com/interfaces/_langchain_deepseek.ChatDeepseekCallOptions.html)
* ## [Runtime args](https://api.js.langchain.com/interfaces/_langchain_deepseek.ChatDeepSeekCallOptions.html)
*
 * Runtime args can be passed as the second argument to any of the base runnable methods `.invoke`, `.stream`, `.batch`, etc.
* They can also be passed via `.bind`, as shown in the examples below:
 * They can also be passed via `.bind`, or as the second arg in `.bindTools`, as shown in the examples below:
*
* ```typescript
* // When calling `.bind`, call options should be passed via the first argument
* const llmWithArgsBound = llm.bind({
* stop: ["\n"],
* tools: [...],
* });
*
* // When calling `.bindTools`, call options should be passed via the second argument
* const llmWithTools = llm.bindTools(
* [...],
* {
* tool_choice: "auto",
* }
* );
* ```
*
* ## Examples
@@ -79,9 +88,9 @@ export interface ChatDeepseekInput extends ChatOpenAIFields {
* <summary><strong>Instantiate</strong></summary>
*
* ```typescript
* import { ChatDeepseek } from '@langchain/deepseek';
* import { ChatDeepSeek } from '@langchain/deepseek';
*
* const llm = new ChatDeepseek({
* const llm = new ChatDeepSeek({
* model: "deepseek-reasoner",
* temperature: 0,
* // other params...
@@ -268,10 +277,104 @@ export interface ChatDeepseekInput extends ChatOpenAIFields {
* </details>
*
* <br />
*
* <details>
* <summary><strong>Bind tools</strong></summary>
*
* ```typescript
* import { z } from 'zod';
*
* const llmForToolCalling = new ChatDeepSeek({
* model: "deepseek-chat",
* temperature: 0,
* // other params...
* });
*
* const GetWeather = {
* name: "GetWeather",
* description: "Get the current weather in a given location",
* schema: z.object({
* location: z.string().describe("The city and state, e.g. San Francisco, CA")
* }),
* }
*
* const GetPopulation = {
* name: "GetPopulation",
* description: "Get the current population in a given location",
* schema: z.object({
* location: z.string().describe("The city and state, e.g. San Francisco, CA")
* }),
* }
*
* const llmWithTools = llmForToolCalling.bindTools([GetWeather, GetPopulation]);
* const aiMsg = await llmWithTools.invoke(
* "Which city is hotter today and which is bigger: LA or NY?"
* );
* console.log(aiMsg.tool_calls);
* ```
*
* ```txt
* [
* {
* name: 'GetWeather',
* args: { location: 'Los Angeles, CA' },
* type: 'tool_call',
* id: 'call_cd34'
* },
* {
* name: 'GetWeather',
* args: { location: 'New York, NY' },
* type: 'tool_call',
* id: 'call_68rf'
* },
* {
* name: 'GetPopulation',
* args: { location: 'Los Angeles, CA' },
* type: 'tool_call',
* id: 'call_f81z'
* },
* {
* name: 'GetPopulation',
* args: { location: 'New York, NY' },
* type: 'tool_call',
* id: 'call_8byt'
* }
* ]
* ```
* </details>
*
* <br />
*
* <details>
* <summary><strong>Structured Output</strong></summary>
*
* ```typescript
* import { z } from 'zod';
*
* const Joke = z.object({
* setup: z.string().describe("The setup of the joke"),
* punchline: z.string().describe("The punchline to the joke"),
* rating: z.number().optional().describe("How funny the joke is, from 1 to 10")
* }).describe('Joke to tell user.');
*
* const structuredLlm = llmForToolCalling.withStructuredOutput(Joke, { name: "Joke" });
* const jokeResult = await structuredLlm.invoke("Tell me a joke about cats");
* console.log(jokeResult);
* ```
*
* ```txt
* {
* setup: "Why don't cats play poker in the wild?",
* punchline: 'Because there are too many cheetahs.'
* }
* ```
* </details>
*
* <br />
*/
export class ChatDeepseek extends ChatOpenAI<ChatDeepseekCallOptions> {
export class ChatDeepSeek extends ChatOpenAI<ChatDeepSeekCallOptions> {
static lc_name() {
return "ChatDeepseek";
return "ChatDeepSeek";
}

_llmType() {
@@ -288,7 +391,7 @@ export class ChatDeepseek extends ChatOpenAI<ChatDeepseekCallOptions> {

lc_namespace = ["langchain", "chat_models", "deepseek"];

constructor(fields?: Partial<ChatDeepseekInput>) {
constructor(fields?: Partial<ChatDeepSeekInput>) {
const apiKey = fields?.apiKey || getEnvironmentVariable("DEEPSEEK_API_KEY");
if (!apiKey) {
throw new Error(
@@ -307,6 +410,7 @@ export class ChatDeepseek extends ChatOpenAI<ChatDeepseekCallOptions> {
}

protected override _convertOpenAIDeltaToBaseMessageChunk(
// eslint-disable-next-line @typescript-eslint/no-explicit-any
delta: Record<string, any>,
rawResponse: OpenAIClient.ChatCompletionChunk,
defaultRole?:
@@ -335,9 +439,9 @@ export class ChatDeepseek extends ChatOpenAI<ChatDeepseekCallOptions> {
message,
rawResponse
);
langChainMessage.additional_kwargs.reasoning_content = (
message as any
).reasoning_content;
langChainMessage.additional_kwargs.reasoning_content =
// eslint-disable-next-line @typescript-eslint/no-explicit-any
(message as any).reasoning_content;
return langChainMessage;
}
}
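
The conversion overrides above copy DeepSeek's `reasoning_content` field onto `additional_kwargs` of the returned messages. A minimal usage sketch (not part of this diff, assuming `DEEPSEEK_API_KEY` is set):

```typescript
import { ChatDeepSeek } from "@langchain/deepseek";

const reasoner = new ChatDeepSeek({ model: "deepseek-reasoner" });
const reply = await reasoner.invoke("What is 17 * 23?");

// The final answer is in `content`; the model's reasoning trace is surfaced separately.
console.log(reply.content);
console.log(reply.additional_kwargs.reasoning_content);
```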
4 changes: 2 additions & 2 deletions libs/langchain-deepseek/src/tests/chat_models.int.test.ts
@@ -1,10 +1,10 @@
/* eslint-disable no-process-env */
/* eslint-disable @typescript-eslint/no-explicit-any */
import { test, expect } from "@jest/globals";
import { ChatDeepseek } from "../chat_models.js";
import { ChatDeepSeek } from "../chat_models.js";

test("Can send deepseek-reasoner requests", async () => {
const llm = new ChatDeepseek({
const llm = new ChatDeepSeek({
model: "deepseek-reasoner",
});
const input = `Translate "I love programming" into French.`;