
Welcome to the HandyLLM wiki! Click on the pages panel on the right to jump to what you need!

Installation

pip3 install handyllm

Or install from the GitHub repo to get the latest updates:

pip3 install git+https://github.com/atomiechen/handyllm.git
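
To verify the installation, a quick import check is enough (a minimal sketch; it only confirms the package is importable):

# minimal sanity check: if this import succeeds, handyllm is installed
import handyllm
print("handyllm imported successfully")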

VSCode Editor Support

Install the HandyLLM extension from the VS Code marketplace. See the HandyLLM VSCode extension page for details.

Getting Started with hprompt

Create a text file named try.hprompt with the following content (replace <YOUR_OPENAI_API_KEY> with your own API key):

Caution

This is only a minimal working example, and we do NOT recommend storing your API key in the hprompt file. Save it to a separate credential file instead (see Credentials).

---
model: gpt-4o
temperature: 0.4
api_key: <YOUR_OPENAI_API_KEY>
---

$user$
How to speed up my prompt engineering iteration?

Now run it with the CLI:

handyllm hprompt try.hprompt

The result will be dumped to stderr, and you will see it in the same hprompt format.
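
For illustration only (the actual reply will differ, and the exact dump layout may vary between versions), the model's reply shows up as an $assistant$ section in the same hprompt format, roughly:

$assistant$
(the model's answer to your question appears here)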

You can also run it programmatically:

from handyllm import hprompt

# load the hprompt file, run it against the API, and dump the result
my_prompt = hprompt.load_from('try.hprompt')
result_prompt = my_prompt.run()
print(result_prompt.dumps())
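
If you want to keep the result, you can write the dumped text to a file yourself with plain Python (a minimal sketch using only standard file I/O; the output_path meta option shown below serves a similar purpose):

# write the dumped hprompt text to a file
with open('result.hprompt', 'w', encoding='utf-8') as f:
    f.write(result_prompt.dumps())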

Gain more control

You can specify more arguments in the frontmatter and add %-wrapped variables in the content, like this:

---
# frontmatter data
model: gpt-3.5-turbo
temperature: 0.5
meta:
  credential_path: .env
  var_map_path: substitute.txt
  output_path: out/%Y-%m-%d/result.%H-%M-%S.hprompt
---

$system$
You are a helpful assistant.

$user$
Your current context: 
%context%

Please follow my instructions:
%instructions%
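
Here, credential_path points to a separate .env file so the API key stays out of the prompt file (a minimal sketch, assuming the standard OPENAI_API_KEY variable name; adjust it to whatever your credential setup expects):

# .env -- keep this file out of version control
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>

The output_path value appears to use strftime-style placeholders, so each run is written to a timestamped file such as out/2024-05-22/result.10-30-05.hprompt.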

Check this page for details.
