Commit a90287b (parent 867a51c): [WIP] Updating Extensions & Engines guide
11 changed files with 449 additions and 346 deletions.
---
title: Configure Extensions
description: Learn about Jan's default extensions and explore how to configure them.
keywords:
  [
    Jan,
    Customizable Intelligence, LLM,
    local AI,
    privacy focus,
    free and open source,
    private and offline,
    conversational AI,
    no-subscription fee,
    large language models,
    Jan Extensions,
    Extensions,
  ]
---

# Configure Extension Settings

To configure an extension's settings:

1. Navigate to `~/jan/data/extensions`.
2. Open the `extensions.json` file.
3. Edit the file. Available options include:

| Option           | Description                         |
| ---------------- | ----------------------------------- |
| `_active`        | Enable/disable the extension.       |
| `listeners`      | Default listener setting.           |
| `origin`         | Extension file path.                |
| `installOptions` | Version and metadata configuration. |
| `name`           | Extension name.                     |
| `productName`    | Extension display name.             |
| `version`        | Extension version.                  |
| `main`           | Main file path.                     |
| `description`    | Extension description.              |
| `url`            | Extension URL.                      |

```json title="~/jan/data/extensions/extensions.json"
{
  "@janhq/conversational-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-conversational-extension-1.0.0.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/conversational-extension",
    "productName": "Conversational",
    "version": "1.0.0",
    "main": "dist/index.js",
    "description": "This extension enables conversations and state persistence via your filesystem",
    "url": "extension://@janhq/conversational-extension/dist/index.js"
  },
  "@janhq/inference-anthropic-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-inference-anthropic-extension-1.0.2.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/inference-anthropic-extension",
    "productName": "Anthropic Inference Engine",
    "version": "1.0.2",
    "main": "dist/index.js",
    "description": "This extension enables Anthropic chat completion API calls",
    "url": "extension://@janhq/inference-anthropic-extension/dist/index.js"
  },
  "@janhq/inference-triton-trt-llm-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-inference-triton-trt-llm-extension-1.0.0.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/inference-triton-trt-llm-extension",
    "productName": "Triton-TRT-LLM Inference Engine",
    "version": "1.0.0",
    "main": "dist/index.js",
    "description": "This extension enables Nvidia's TensorRT-LLM as an inference engine option",
    "url": "extension://@janhq/inference-triton-trt-llm-extension/dist/index.js"
  },
  "@janhq/inference-mistral-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-inference-mistral-extension-1.0.1.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/inference-mistral-extension",
    "productName": "MistralAI Inference Engine",
    "version": "1.0.1",
    "main": "dist/index.js",
    "description": "This extension enables Mistral chat completion API calls",
    "url": "extension://@janhq/inference-mistral-extension/dist/index.js"
  },
  "@janhq/inference-martian-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-inference-martian-extension-1.0.1.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/inference-martian-extension",
    "productName": "Martian Inference Engine",
    "version": "1.0.1",
    "main": "dist/index.js",
    "description": "This extension enables Martian chat completion API calls",
    "url": "extension://@janhq/inference-martian-extension/dist/index.js"
  },
  "@janhq/inference-openrouter-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-inference-openrouter-extension-1.0.0.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/inference-openrouter-extension",
    "productName": "OpenRouter Inference Engine",
    "version": "1.0.0",
    "main": "dist/index.js",
    "description": "This extension enables Open Router chat completion API calls",
    "url": "extension://@janhq/inference-openrouter-extension/dist/index.js"
  },
  "@janhq/inference-nvidia-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-inference-nvidia-extension-1.0.1.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/inference-nvidia-extension",
    "productName": "NVIDIA NIM Inference Engine",
    "version": "1.0.1",
    "main": "dist/index.js",
    "description": "This extension enables NVIDIA chat completion API calls",
    "url": "extension://@janhq/inference-nvidia-extension/dist/index.js"
  },
  "@janhq/inference-groq-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-inference-groq-extension-1.0.1.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/inference-groq-extension",
    "productName": "Groq Inference Engine",
    "version": "1.0.1",
    "main": "dist/index.js",
    "description": "This extension enables fast Groq chat completion API calls",
    "url": "extension://@janhq/inference-groq-extension/dist/index.js"
  },
  "@janhq/inference-openai-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-inference-openai-extension-1.0.2.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/inference-openai-extension",
    "productName": "OpenAI Inference Engine",
    "version": "1.0.2",
    "main": "dist/index.js",
    "description": "This extension enables OpenAI chat completion API calls",
    "url": "extension://@janhq/inference-openai-extension/dist/index.js"
  },
  "@janhq/inference-cohere-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-inference-cohere-extension-1.0.0.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/inference-cohere-extension",
    "productName": "Cohere Inference Engine",
    "version": "1.0.0",
    "main": "dist/index.js",
    "description": "This extension enables Cohere chat completion API calls",
    "url": "extension://@janhq/inference-cohere-extension/dist/index.js"
  },
  "@janhq/model-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-model-extension-1.0.33.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/model-extension",
    "productName": "Model Management",
    "version": "1.0.33",
    "main": "dist/index.js",
    "description": "Model Management Extension provides model exploration and seamless downloads",
    "url": "extension://@janhq/model-extension/dist/index.js"
  },
  "@janhq/monitoring-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-monitoring-extension-1.0.10.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/monitoring-extension",
    "productName": "System Monitoring",
    "version": "1.0.10",
    "main": "dist/index.js",
    "description": "This extension provides system health and OS level data",
    "url": "extension://@janhq/monitoring-extension/dist/index.js"
  },
  "@janhq/assistant-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-assistant-extension-1.0.1.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/assistant-extension",
    "productName": "Jan Assistant",
    "version": "1.0.1",
    "main": "dist/index.js",
    "description": "This extension enables assistants, including Jan, a default assistant that can call all downloaded models",
    "url": "extension://@janhq/assistant-extension/dist/index.js"
  },
  "@janhq/tensorrt-llm-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-tensorrt-llm-extension-0.0.3.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/tensorrt-llm-extension",
    "productName": "TensorRT-LLM Inference Engine",
    "version": "0.0.3",
    "main": "dist/index.js",
    "description": "This extension enables Nvidia's TensorRT-LLM for the fastest GPU acceleration. See the [setup guide](https://jan.ai/guides/providers/tensorrt-llm/) for next steps.",
    "url": "extension://@janhq/tensorrt-llm-extension/dist/index.js"
  },
  "@janhq/inference-cortex-extension": {
    "_active": true,
    "listeners": {},
    "origin": "C:\\Users\\ACER\\AppData\\Local\\Programs\\jan\\resources\\app.asar.unpacked\\pre-install\\janhq-inference-cortex-extension-1.0.15.tgz",
    "installOptions": {
      "version": false,
      "fullMetadata": true
    },
    "name": "@janhq/inference-cortex-extension",
    "productName": "Cortex Inference Engine",
    "version": "1.0.15",
    "main": "dist/index.js",
    "description": "This extension embeds cortex.cpp, a lightweight inference engine written in C++. See https://nitro.jan.ai.\nAdditional dependencies could be installed to run without Cuda Toolkit installation.",
    "url": "extension://@janhq/inference-cortex-extension/dist/index.js"
  }
}
```
```json
{
  "cortex": {
    "title": "Cortex",
    "href": "/docs/extensions-settings/cortex"
  },
  "model-management": {
    "title": "Model Management",
    "href": "/docs/extensions-settings/model-management"
  },
  "system-monitoring": {
    "title": "System Monitoring",
    "href": "/docs/extensions-settings/system-monitoring"
  }
}
```
---
title: Cortex
description: Learn about Jan's default extensions and explore how to configure them.
keywords:
  [
    Jan,
    Customizable Intelligence, LLM,
    local AI,
    privacy focus,
    free and open source,
    private and offline,
    conversational AI,
    no-subscription fee,
    large language models,
    Jan Extensions,
    Extensions,
  ]
---
docs/src/pages/docs/extensions-settings/model-management.mdx (new file, 55 additions)
---
title: Model Management
description: Learn about Jan's default extensions and explore how to configure them.
keywords:
  [
    Jan,
    Customizable Intelligence, LLM,
    local AI,
    privacy focus,
    free and open source,
    private and offline,
    conversational AI,
    no-subscription fee,
    large language models,
    Jan Extensions,
    Extensions,
  ]
---

import { Callout } from 'nextra/components'
import { Settings, EllipsisVertical, Plus, FolderOpen, Pencil } from 'lucide-react'

# Model Management

Configure how Jan handles model downloads and management. Access these settings through **Settings** (<Settings width={16} height={16} style={{display:"inline"}}/>) > **Extensions** > **Model Management**:

## Hugging Face Access Token

Access tokens authenticate your identity to Hugging Face Hub for model downloads.

Enter your token in the format: `hf_************************`

<Callout type="info">
Get your Hugging Face token from [Hugging Face Settings](https://huggingface.co/settings/tokens).
</Callout>
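A quick sanity check can catch a mistyped token before it fails at download time. The sketch below is a hedged illustration, not Jan's actual validation logic; Hugging Face does not publicly guarantee the suffix length or alphabet, so the pattern is deliberately loose:

```python
import re

# Hugging Face user access tokens start with "hf_"; the suffix length and
# alphabet are assumptions here, so this is only a loose format check.
TOKEN_PATTERN = re.compile(r"^hf_[A-Za-z0-9]{20,}$")

def looks_like_hf_token(token: str) -> bool:
    """Return True if the string has the general shape of an HF access token."""
    return bool(TOKEN_PATTERN.fullmatch(token))
```

A check like this only verifies shape; whether the token is valid and has the right permissions is determined by Hugging Face Hub when the download is attempted.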

## Log Management

### Enable App Logs
Toggle to save logs locally on your computer for:
- Debugging model issues
- Crash reports
- Download troubleshooting

### Log Cleaning Interval
Set the automatic log deletion interval in milliseconds:
- Default: 120000 (2 minutes)
- Controls disk space usage
- Prevents log accumulation
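Because the setting takes milliseconds, it is easy to enter a value three orders of magnitude off. A tiny helper (hypothetical, not part of Jan) makes the conversion explicit:

```python
def minutes_to_ms(minutes: float) -> int:
    """Convert a duration in minutes to the millisecond value the setting expects."""
    return int(minutes * 60 * 1000)

# The default of 120000 ms corresponds to 2 minutes:
# minutes_to_ms(2)  -> 120000
# minutes_to_ms(0.5) -> 30000  (30 seconds)
```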

<Callout type="warning">
Keep your access tokens secure and never share them. Enable logs temporarily when needed for troubleshooting.
</Callout>