
FEATURE REQUEST: Allow user selection of models at query time #1289

Open
ga-it opened this issue Mar 22, 2025 · 1 comment

Comments


ga-it commented Mar 22, 2025

Is your feature request related to a problem? Please describe.
Currently you can define three research options: FAST_LLM, SMART_LLM, and STRATEGIC_LLM. It would be exceptionally useful to be able to change these at query time.
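To make the request concrete, here is a minimal sketch of what a per-query override could look like. This assumes the three roles are read from environment variables today and that models are named in a `provider:model` format; both the defaults and the `resolve_llms` helper are illustrative, not existing gpt-researcher API.

```python
import os

# Assumed startup defaults for the three roles; names mirror the env vars
# above, model strings are placeholders for illustration only.
DEFAULTS = {
    "FAST_LLM": "openai:gpt-4o-mini",
    "SMART_LLM": "openai:gpt-4o",
    "STRATEGIC_LLM": "openai:o1",
}

def resolve_llms(overrides=None):
    """Resolve the three role->model mappings.

    A per-query `overrides` dict wins over the environment,
    which wins over the hard-coded defaults.
    """
    overrides = overrides or {}
    return {
        role: overrides.get(role) or os.environ.get(role) or default
        for role, default in DEFAULTS.items()
    }
```

A query that wants a different smart model would then call, e.g., `resolve_llms({"SMART_LLM": "anthropic:claude-3-5-sonnet"})` while leaving the other two roles at their configured values.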

Describe the solution you'd like
Open WebUI has probably got the functionality hierarchy right: you define connections (i.e. endpoints), grant group rights against models, and users in those groups can then select the model at query time.

Once endpoints are defined, models are retrieved in real time via the API's model-list functionality. New models then become available for administrators to assign to rights groups.

For gpt-researcher, a further layer could mimic the rights groups as research groups, e.g. pools of fast_llm, smart_llm and strategic_llm models from which the user can select which model to use for the task.

I think the layering is important. We host LiteLLM, behind which we have models hosted on Ollama plus paid API access to most providers.

Mostly our hosted models are sufficient, but it would be useful to include access to paid APIs on a permissioned basis.

The layering really helps, but it would demand a user management layer in gpt_researcher; I will put in a separate feature request for this.
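The layering described above (endpoints → models → rights groups → research-role pools) could be sketched as a small data model. Everything here is hypothetical naming for illustration; none of these classes exist in gpt-researcher today.

```python
from dataclasses import dataclass, field

@dataclass
class Endpoint:
    """A connection, e.g. a LiteLLM or Ollama base URL."""
    name: str
    base_url: str
    models: list  # in practice, populated via the endpoint's model-list API

@dataclass
class RightsGroup:
    """Which models a group of users is permitted to touch at all."""
    name: str
    allowed_models: set

@dataclass
class ResearchGroup:
    """Maps each LLM role to the pool of models users may pick at query time,
    filtered through the rights group's permissions."""
    rights: RightsGroup
    pools: dict = field(default_factory=dict)  # role -> [model, ...]

    def selectable(self, role):
        return [m for m in self.pools.get(role, [])
                if m in self.rights.allowed_models]
```

Under this sketch, adding a paid-API model to a rights group instantly makes it selectable in any research pool that lists it, which is the permissioned access pattern described above.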

Describe alternatives you've considered
It is possible to alias models in LiteLLM, so one could alias three models for gpt_researcher there and then change the underlying model in LiteLLM, but that does not give sufficient real-time flexibility.

@ga-it ga-it changed the title Allow user selection of models at query time FEATURE REQUEST: Allow user selection of models at query time Mar 22, 2025
ElishaKay (Collaborator) commented:

a) We're currently at the stage of considering these user stories:

Search Providers and Model Selection

  • For example: Users can choose from 10 different models, 10 different retrievers and hybrid
  • Implementation option: pass these parameters via the Headers object of the websocket request, and store the API keys on the server so that the relevant keys are used depending on the Search Provider or LLM the user passed to the Research Agent
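The header-based option above could look roughly like this on the server side. The header name (`x-smart-llm`) and the key registry are assumptions for the sketch, not an existing interface; the point is that API keys stay server-side and the client only names a model.

```python
# Server-side key registry (assumption): keys are configured once on the
# server and never sent to or received from the client.
API_KEYS = {
    "openai": "sk-placeholder-openai",
    "anthropic": "sk-placeholder-anthropic",
}

def llm_from_headers(headers):
    """Pick provider, model, and server-held API key from request headers.

    Falls back to a default model when the client sends no preference,
    and rejects providers the server has no key for.
    """
    choice = headers.get("x-smart-llm", "openai:gpt-4o")
    provider, _, model = choice.partition(":")
    key = API_KEYS.get(provider)
    if key is None:
        raise ValueError(f"provider {provider!r} is not configured on this server")
    return provider, model, key
```

A websocket handler (e.g. in a FastAPI backend) would call `llm_from_headers(websocket.headers)` before constructing the research agent, so the user's choice is honored per connection without exposing any credentials.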

b) Regarding User Management - see here
