
Releases: yusufcanb/tlm

1.2

11 Feb 23:07
72f5834

One-liner RAG has arrived in tlm v1.2! 🎉

Version 1.2 of tlm introduces one-liner Retrieval-Augmented Generation (RAG) with the new tlm ask command. This beta feature lets you ask questions and get contextually relevant answers drawn directly from your codebase and documentation.

Inspired by the Repomix project, tlm ask provides a similar context-gathering mechanism, implemented efficiently in Go. tlm goes a step further, however, by bridging context retrieval with local, open-source LLM prompting, giving you security and privacy from the comfort of your terminal.
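To give a feel for how such a context-gathering step can work, here is a minimal sketch in Go, assuming a hypothetical gatherContext helper that walks a directory and concatenates matching files into a single prompt context. It is illustrative only, not tlm's actual implementation.

package main

import (
	"fmt"
	"io/fs"
	"os"
	"path/filepath"
	"strings"
)

// gatherContext walks root and concatenates the contents of files whose base
// name matches includePattern, skipping those that match excludePattern.
// Hypothetical sketch; tlm's real implementation may differ.
func gatherContext(root, includePattern, excludePattern string) (string, error) {
	var sb strings.Builder
	err := filepath.WalkDir(root, func(path string, d fs.DirEntry, err error) error {
		if err != nil || d.IsDir() {
			return err
		}
		name := d.Name()
		if excludePattern != "" {
			if skip, _ := filepath.Match(excludePattern, name); skip {
				return nil
			}
		}
		if includePattern != "" {
			if ok, _ := filepath.Match(includePattern, name); !ok {
				return nil
			}
		}
		data, readErr := os.ReadFile(path)
		if readErr != nil {
			return readErr
		}
		// Prefix each file with its path so the model can reference sources.
		sb.WriteString(fmt.Sprintf("--- %s ---\n%s\n", path, data))
		return nil
	})
	return sb.String(), err
}

func main() {
	// Roughly analogous to: tlm ask --context . --include *.md --exclude README.md "<prompt>"
	ctx, err := gatherContext(".", "*.md", "README.md")
	if err != nil {
		panic(err)
	}
	fmt.Println(len(ctx), "bytes of context gathered")
}

The real command then feeds the gathered context to a local LLM along with your prompt; the sketch stops at building the context string.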

Key Features of tlm ask:

  • Instant Answers: Get quick answers to direct questions using tlm ask "<prompt>".
  • Contextual Understanding: Enhance answer accuracy by providing context. Use the --context flag and specify a directory for analysis, e.g., tlm ask --context . "<prompt>".
  • Granular Context Control: Further refine the context using --include and --exclude flags with file patterns. Target specific files or exclude irrelevant ones, e.g., tlm ask --context . --include *.md "<prompt>" or tlm ask --context . --exclude **/*_test.go "<prompt>".

Example Usage:

  • tlm ask "What is the main purpose of this function?"
  • tlm ask --context ./src --include *.go "How does authentication work?"
  • tlm ask --context ./docs --include *.md --exclude README.md "Summarize the key concepts."
  • tlm ask --interactive "What are the dependencies?"

[Demo: Ask]

1.2-pre

31 Jan 01:43

Use the model you like! πŸ₯³πŸŽ‰

Starting with version 1.2-pre, tlm deprecates the use of Modelfiles and can work with any base model without creating its own. This was the most requested change from earlier discussions. Initially, I wanted to abstract the user away from the underlying model so they could focus on getting good results. But with the boom of new open-source models, I've decided not to have an opinion on which model to use. Users can choose whichever one works best!

[Demo: Model Choice]

Changelog

  • Removal of the Modelfile approach. tlm now uses base models directly, without requiring custom model creation.
  • tlm config now lists all available Ollama models and lets you select a default model to work with (see the sketch after this changelog).
  • The default model is now qwen2.5-coder:3b, which is both accurate and blazing fast.
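For illustration, here is a minimal Go sketch of how a model list can be retrieved from Ollama's REST API (GET /api/tags on the default localhost:11434 endpoint). It is a simplified stand-in for the listing step, not tlm's actual config code.

package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

// tagsResponse mirrors the shape of Ollama's GET /api/tags reply,
// which lists the models available locally.
type tagsResponse struct {
	Models []struct {
		Name string `json:"name"`
	} `json:"models"`
}

func main() {
	// Ollama serves its REST API on localhost:11434 by default.
	resp, err := http.Get("http://localhost:11434/api/tags")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var tags tagsResponse
	if err := json.NewDecoder(resp.Body).Decode(&tags); err != nil {
		panic(err)
	}

	// A picker like `tlm config` could present these names as choices.
	for i, m := range tags.Models {
		fmt.Printf("%d) %s\n", i+1, m.Name)
	}
}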

Full Changelog: 1.1...1.2-pre

1.1

13 Mar 17:02

1.1 is out! πŸ₯³ πŸŽ‰

Thank you so much to everyone who showed interest in the initial release. Your support has been incredible! Within just two weeks, tlm rocketed from zero to 231 stars. This overwhelming response is truly humbling and inspiring.

It's because of this engagement that I'm thrilled to announce the release of version 1.1. This update aims to enhance the project's robustness and maintainability, laying the groundwork for continued growth and easier collaboration.

$ tlm s 'get me a cowsay to express excitement of tlm 1.1 release'

┃ > Thinking... (1.198s)
┃ > cowsay "tlm 1.1 is out! let's celebrate!"
┃ > Executing...

 ----------------------------------
< tlm 1.1 is out! let's celebrate! >
 ----------------------------------
        \   ^__^
         \  (oo)\_______
            (__)\       )\/\
                ||----w |
                ||     ||

Changelog

  • Ability to override automatic shell detection and generate suggestions for different shells.
  • Suggestion/Explanation preset optimizations (Precise/Balanced/Creative).
  • Non-interactive configuration.
  • E2E tests for acceptance.
  • Informs users about new releases.

Full Changelog: 1.0...1.1

Discussions

  • Integration with Homebrew, Scoop and Snap stores for easier distribution.
  • Code signing.

1.0

02 Mar 02:02
cca26d8

What's Changed

New Contributors

Full Changelog: 1.0-rc3...1.0

1.0-rc3

27 Feb 23:34
0a8ca82

What's Changed

New Contributors

Full Changelog: 1.0-rc2...1.0-rc3