
Examples

Table of Contents

Using LLMInterface

The following examples focus on LLMInterface usage.

Basic Usage

  • Chat: Basic LLMInterface.sendMessage() chat usage. This example features an OpenAI-compatible message structure.
  • Prompt: Basic LLMInterface.sendMessage() prompt usage.
  • Streaming Mode: LLMInterface.sendMessage() prompt usage in streaming mode.
  • Set Multiple API Keys: LLMInterface.setApiKey() usage for setting more than one API key at once.

Embeddings

Caching

  • Simple Cache: Default file-based cache usage example.
  • Memory Cache: High-speed in-memory cache usage example.
  • Flat Cache: Usage example for flat-cache, an NPM package providing a JSON flat-file cache.
  • Cache Manager: Usage example for cache-manager, an NPM package providing an advanced caching system that supports multiple backends, including MongoDB, Memcache, Redis, and SQLite.

Interface Options

  • Auto Retry Failed Requests: Usage example controlling retries with interfaceOptions.retryAttempts and interfaceOptions.retryMultiplier.
  • Include Original Response: Usage example including the complete original response via interfaceOptions.includeOriginalResponse.
  • JSON Repair: Usage example repairing badly formed JSON with interfaceOptions.attemptJsonRepair.

JSON

  • JSON Output: Usage example requesting a JSON response through the prompt.
  • JSON Repair: Usage example repairing badly formed JSON with interfaceOptions.attemptJsonRepair.
  • Native JSON Output: Usage example requesting a JSON response with options.response_format.

What Can You Do with LLMInterface?

The following examples show some of what you can build with LLMInterface.

Retrieval-Augmented Generation (RAG) using Langchain.js

Mixture of Agents (MoA)

Miscellaneous