Enhancement: Dedicated Reverse Proxy Endpoint #1344
Comments
https://docs.anyscale.com/endpoints/model-serving/openai-migration-guide It should already work if you set
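Since Anyscale exposes an OpenAI-compatible API, pointing at it is mostly a matter of swapping the base URL and API key while keeping the request shape identical. A minimal sketch, assuming Anyscale's documented base URL and an illustrative model name (check the provider's docs for the actual values):

```python
import json

def build_chat_request(base_url, api_key, model, messages):
    """Assemble an OpenAI-style chat completion request for any
    OpenAI-compatible provider; only base_url and api_key change
    between providers, the body format stays the same."""
    return {
        "url": f"{base_url}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"model": model, "messages": messages}),
    }

req = build_chat_request(
    "https://api.endpoints.anyscale.com/v1",   # assumed Anyscale base URL
    "ESECRET_placeholder",                     # placeholder key, not real
    "meta-llama/Llama-2-70b-chat-hf",          # example hosted model
    [{"role": "user", "content": "Hello"}],
)
```

This is why a reverse-proxy setting can redirect the existing OpenAI integration: the downstream provider accepts the same payload.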
That works, but it still says OpenAI in the UI, and once you have this set up you can't switch to an OpenAI model. BTW, I found a similar site, portkey.ai (https://portkey.ai/docs/welcome/what-is-portkey), that supports more models. For all but OpenAI, though, you have to use their SDK.
@eburnette @danny-avila I changed the title to better reflect the nature of the feature request, and re-opened it since a dedicated reverse proxy endpoint has been requested more than once.
Indeed, this is planned and should actually be pretty simple to implement once I put the time in. Thank you for clarifying, @fuegovic @eburnette.
I'm also thinking about how to allow multiple custom endpoints. For example, if I want OpenRouter, the Mistral API, and OpenAI available all at once.
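One way to picture the multiple-custom-endpoints idea is a registry keyed by endpoint name, each entry carrying its own base URL and credential. This is a hypothetical sketch, not an actual LibreChat config schema; the field names and env-var names are invented for illustration:

```python
# Hypothetical registry of OpenAI-compatible endpoints, keyed by a
# user-visible name. Each entry only needs a base URL and which
# environment variable holds its API key.
CUSTOM_ENDPOINTS = {
    "openrouter": {"base_url": "https://openrouter.ai/api/v1", "api_key_env": "OPENROUTER_KEY"},
    "mistral":    {"base_url": "https://api.mistral.ai/v1",    "api_key_env": "MISTRAL_API_KEY"},
    "openai":     {"base_url": "https://api.openai.com/v1",    "api_key_env": "OPENAI_API_KEY"},
}

def resolve_base_url(endpoint_name: str) -> str:
    """Look up the base URL for a named endpoint; raises KeyError
    if the endpoint was never configured."""
    return CUSTOM_ENDPOINTS[endpoint_name]["base_url"]
```

With a structure like this, switching providers in the UI would just mean selecting a different key from the registry rather than overwriting a single global reverse-proxy setting.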
Contact Details
Redacted
What features would you like to see added?
Support models served by Anyscale (https://app.endpoints.anyscale.com/).
More details
Anyscale serves a number of open-source LLMs as fully managed API endpoints. The first million tokens are free, so all you need to do is sign up and get an API key. They support the OpenAI API, including function calls.
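Because the provider follows the OpenAI request format, function calling uses the same payload shape a client would send to OpenAI. A sketch of such a request body, with an invented function schema and an example model name (not verified against Anyscale's current model list):

```python
# OpenAI-style function-calling request body. An OpenAI-compatible
# provider accepts the same structure; "get_weather" is an invented
# example schema, not a real tool.
payload = {
    "model": "meta-llama/Llama-2-70b-chat-hf",  # example model name
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"}
    ],
    "functions": [
        {
            "name": "get_weather",
            "description": "Look up current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        }
    ],
}
```

If the model decides to call the function, the response would carry the call in the assistant message, mirroring OpenAI's behavior.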
Which components are impacted by your request?
No response
Pictures
No response
Code of Conduct