Releases · amaiya/onprem
v0.9.0
0.9.0 (2025-02-26)
new:
- Support for using self-ask prompt strategy with RAG (#120)
- Improved table understanding when invoking `LLM.ask` (#124)
- Helpers for document metadata (#121)
changed:
- Added `k` and `score_threshold` arguments to `LLM.ask` (#122)
- Added `n_proc` parameter to control the number of CPUs used by `LLM.ingest` (ee09807)
- Upgrade version of chromadb (#125)
fixed:
- Ensure table-processing is sequential and not parallelized (#123)
- Fixes to support newer version of langchain_community (#125)
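The new `k` and `score_threshold` arguments to `LLM.ask` control how many retrieved passages reach the prompt and how similar they must be. Their semantics can be illustrated with a minimal, library-free sketch (the function and document structure here are hypothetical, not onprem's internals — the real filtering happens in the vector store):

```python
def select_sources(scored_docs, k=4, score_threshold=0.0):
    """Keep at most k documents whose similarity score meets the threshold.

    scored_docs: list of (text, score) pairs, higher score = more similar.
    Mimics the retrieval filtering that `k` and `score_threshold` configure.
    """
    # Drop passages below the similarity cutoff
    kept = [(text, score) for text, score in scored_docs if score >= score_threshold]
    # Keep only the k most similar of what remains
    kept.sort(key=lambda pair: pair[1], reverse=True)
    return kept[:k]

docs = [("intro", 0.91), ("appendix", 0.42), ("table 3", 0.77), ("footer", 0.12)]
print(select_sources(docs, k=2, score_threshold=0.5))
# → [('intro', 0.91), ('table 3', 0.77)]
```

Raising `score_threshold` trades recall for precision: weakly related passages never reach the LLM's context.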
v0.8.0
0.8.0 (2025-02-13)
new:
- Added `HFClassifier` to `pipelines.classifier` module (#119)
- Added `SKClassifier` to `pipelines.classifier` module (#118)
- `sk` "helper" module to fit simple scikit-learn text models (#117)
changed:
- Added `process_documents` function (#117)
fixed:
- Pass `autodetect_encoding` argument to `TextLoader` (#116)
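The `SKClassifier` and the `sk` helper module wrap simple scikit-learn text models. A rough sketch of the kind of pipeline they fit (the dataset and pipeline here are illustrative, not onprem's API):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative dataset: 1 = positive, 0 = negative
texts = [
    "I loved this movie",
    "great film, really enjoyed it",
    "terrible movie, hated it",
    "awful and boring film",
]
labels = [1, 1, 0, 0]

# TF-IDF features fed into a linear classifier -- the sort of simple
# scikit-learn text model the `sk` helper module is meant to fit.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["really enjoyed this great movie"])[0])
```

The appeal of such models is that they train in seconds on a CPU, which makes them a useful baseline next to the heavier `HFClassifier`.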
v0.7.1
0.7.1 (2024-12-18)
new:
changed:
fixed:
- Fix for HF chat template issue resulting in bad prompt format and bad output for HF transformer models (#113/#114)
v0.7.0
0.7.0 (2024-12-16)
new:
- Support for structured outputs (#110)
- Support for table extraction (#106, #107)
- Facilitate identifying tables extracted as HTML (#112)
changed:
- Remove dependency on deprecated RetrievalQA (#108)
- Refactored code base (#109)
- Use new JSON-safe formatting of prompt templates (#109)
fixed:
- Added `utils.format_string` function to help format template strings with embedded JSON (#105)
- Support stop strings with transformers (#111)
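Prompt templates that embed JSON examples clash with Python's `str.format`, since the JSON's literal braces look like placeholders. That is the problem `utils.format_string` addresses; a minimal sketch of the idea (this `format_string` is a stand-in, not onprem's actual implementation):

```python
import re

def format_string(template, **kwargs):
    """Substitute only the named placeholders, leaving all other braces
    (e.g., embedded JSON) untouched -- unlike str.format, which would
    raise KeyError on the JSON's braces."""
    def sub(match):
        name = match.group(1)
        # Replace {name} only when a value was supplied; otherwise
        # leave the original text (including JSON braces) intact.
        return str(kwargs[name]) if name in kwargs else match.group(0)
    return re.sub(r"\{(\w+)\}", sub, template)

template = 'Answer as JSON like {"answer": "..."}: {question}'
print(format_string(template, question="What is onprem?"))
# → Answer as JSON like {"answer": "..."}: What is onprem?
```

With plain `str.format`, the same template would fail unless every brace in the JSON were doubled (`{{` / `}}`).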
v0.6.1
0.6.1 (2024-12-04)
new:
changed:
- Changed `pdf_use_unstructured` to `pdf_unstructured` and `pdf2md` to `pdf_markdown` (#102)
fixed:
v0.6.0
0.6.0 (2024-12-03)
new:
- Improved PDF text extraction including optional markdown conversion, table inference, and OCR (#100)
changed:
fixed:
- Add support for HF training (#98)
- Default to localhost in Web app (#99)
v0.5.2
0.5.2 (2024-11-25)
new:
changed:
fixed:
- Allow all Hugging Face pipeline/model arguments to be supplied (#96)
v0.5.1
0.5.1 (2024-11-22)
new:
changed:
- Refactored Hugging Face transformers backend (#95)
fixed:
- Suppress swig deprecation warning (#93)
- Raise error if summarizers encounter bad document (#94)
v0.5.0
0.5.0 (2024-11-20)
new:
- Support for Hugging Face transformers as LLM engine instead of Llama.cpp
changed:
- `LLM.prompt` now accepts OpenAI-style messages in the form of a list of dictionaries
fixed:
- Remove unused imports (#92)
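`LLM.prompt` now accepts OpenAI-style messages (a list of role/content dictionaries) in addition to plain strings. How such a list gets flattened into a single prompt can be sketched roughly like this (the tag format below is purely illustrative — real backends apply each model's own chat template):

```python
def messages_to_prompt(messages):
    """Flatten OpenAI-style messages into one prompt string.

    A generic tagged format for illustration only; actual chat
    templates are model-specific.
    """
    lines = [f"<|{m['role']}|>\n{m['content']}" for m in messages]
    lines.append("<|assistant|>")  # cue the model to respond
    return "\n".join(lines)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "List three LLM use cases."},
]
print(messages_to_prompt(messages))
```

Accepting the message-list form lets callers reuse conversation histories written for OpenAI-style APIs without reformatting them by hand.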
v0.4.0
0.4.0 (2024-11-13)
new:
- Added `default_model` parameter to `LLM` to more easily use Llama-3.1-8B-Instruct.
changed:
fixed: