Document how to reduce the amount of dependencies needed #1580
You can reduce this by installing just the core package https://www.npmjs.com/package/@llamaindex/core and the LLM provider that you need (e.g. OpenAI: https://www.npmjs.com/package/@llamaindex/openai). How about adding a "Reduce Dependencies" section to the docs: https://ts.llamaindex.ai/docs/llamaindex ? "Guide" looks like a good place. Anyone want to help with that?
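For illustration, a minimal sketch of that setup (the model name and the exact imports are examples, not taken from this thread):

```ts
// Install only the core package and the single provider you need, e.g.:
//   npm install @llamaindex/core @llamaindex/openai
import { OpenAI } from "@llamaindex/openai";

// Use the provider directly instead of pulling in the full llamaindex meta-package.
const llm = new OpenAI({ model: "gpt-4o-mini" });

// complete() sends a single prompt and resolves with the generated text.
const { text } = await llm.complete({ prompt: "Say hello" });
console.log(text);
```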
Unfortunately, installing core alone is not enough to get this working. For example, you can't access vector stores; they seem to be part of the main package.
Yes, if you need vector stores, then please wait for #1587 - coming soon!
@marcusschiesser any ETA for this? I'm getting the following error at Vercel after bumping llamaindex from 0.6.9 to 0.8.31. I think that importing the
Link to Vercel docs: https://vercel.com/docs/functions/runtimes#bundle-size-limits
@AndreMaz which vector store are you using? We've already extracted postgres and azure. Using those with core should result in a small package size. onnx is used by huggingface and chromadb. We will optimize bundle sizes and import handling in the 0.9 release.
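A rough sketch of what combining core, one provider, and an extracted vector store package could look like (the package name @llamaindex/postgres and the constructor options shown are assumptions, not confirmed in this thread):

```ts
// Assumed packages (names and options are illustrative only):
//   npm install @llamaindex/core @llamaindex/openai @llamaindex/postgres
import { OpenAI } from "@llamaindex/openai";
import { PGVectorStore } from "@llamaindex/postgres";

const llm = new OpenAI({ model: "gpt-4o-mini" });

// Connection options here are a placeholder; check the package docs for the real shape.
const vectorStore = new PGVectorStore({
  clientConfig: { connectionString: process.env.PG_CONNECTION_STRING },
});
```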
Hey @marcusschiesser, thank you for replying and thank you guys for your work! I'm using
@AndreMaz just released the 0.9 package; for migration see: https://ts.llamaindex.ai/docs/llamaindex/migration/0.8-to-0.9
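As a hedged illustration of the kind of change the 0.8 to 0.9 migration involves (see the linked guide for the authoritative steps), provider classes move out of the main package into their own packages:

```ts
// Before (0.8): everything was imported from the llamaindex meta-package.
// import { OpenAI, OpenAIEmbedding } from "llamaindex";

// After (0.9): install and import only the provider packages you actually use, e.g.:
//   npm install @llamaindex/openai
import { OpenAI, OpenAIEmbedding } from "@llamaindex/openai";

const llm = new OpenAI({ model: "gpt-4o-mini" });
const embedModel = new OpenAIEmbedding({ model: "text-embedding-3-small" });
```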
Describe the bug
Installing llamaindex adds almost 900MB of packages to node_modules! Ironically, the workflow package is not included and has to be installed separately. Is there a way to reduce the total size of the dependencies, especially when just using OpenAI? I don't need Gemini etc., but it seems that every LLM provider is installed by default.
To Reproduce