Does the embedding model also go through xinference? And how do I configure a direct call to Spark (星火) LLM's OpenAI-style API? The tutorial doesn't make this very clear. #4906
Comments
Refer to the xinference configuration entries. As long as the endpoint is OpenAI-compatible, just set the api_url/key and fill in the model name correctly.
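For illustration, such a config entry might look roughly like the YAML fragment below. This is a hypothetical sketch only: the key names (`platform_type`, `api_base_url`, etc.) are placeholders in the spirit of the comment above, not the project's exact schema, so check the project's actual model-provider config file for the real field names.

```yaml
# Illustrative only: key names are placeholders, not the exact schema.
platform_name: spark
platform_type: openai                 # any OpenAI-compatible endpoint
api_base_url: https://example.com/v1  # the provider's OpenAI-style base URL
api_key: sk-xxxx                      # your API key
llm_models:
  - generalv3.5                       # model name, exactly as the provider expects it
```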
Do the api_url/key go into xinference itself, or into the project's config file?

What about endpoints that aren't compatible? Our company self-hosts a model, and its API wraps the request data in a custom format. How do we adapt something like that?

If it's not compatible there's no direct way: either change the endpoint to be compatible, or write your own adapter.

In the end, how did you get your company's self-hosted model working?

Write a small adapter service that exposes a compatible interface and forwards requests through it.
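The adapter-service idea above boils down to two translations: map an incoming OpenAI-style chat request onto the in-house API's custom envelope, and wrap the in-house response back into an OpenAI-style completion object. The sketch below shows those two translation functions; the envelope fields (`payload`, `text`, `output`) are illustrative assumptions about a hypothetical in-house schema, not any real vendor's format. In practice these functions would sit inside a small HTTP service (e.g. behind a `/v1/chat/completions` route) that forwards to the internal endpoint.

```python
# Hedged sketch: translate between OpenAI-style chat requests and a
# hypothetical in-house API that wraps data in a custom envelope.
# The envelope keys ("payload", "text", "output") are assumptions.
import time
import uuid


def to_upstream(openai_req: dict) -> dict:
    """Translate an OpenAI /v1/chat/completions body into the custom envelope."""
    # Flatten the chat messages into a single prompt string for the in-house API.
    prompt = "\n".join(m["content"] for m in openai_req.get("messages", []))
    return {"payload": {"text": prompt, "model": openai_req.get("model", "default")}}


def from_upstream(upstream_resp: dict, model: str) -> dict:
    """Wrap the custom response back into an OpenAI-style chat.completion object."""
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex[:12]}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": upstream_resp["output"]},
                "finish_reason": "stop",
            }
        ],
    }
```

With this shape in place, the project only ever sees an OpenAI-compatible interface, so the standard api_url/key configuration applies unchanged.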
No description provided.