Replies: 1 comment 1 reply
- Thanks a lot @ajndkr
- Related #164
While I work on integrating this feature (I need to create sync-mode callback handlers for all of them), here's a script to build a microservice with llama.cpp:
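Below is a minimal sketch, assuming llama-cpp-python and FastAPI are installed; the model path, route name, and request schema are illustrative placeholders.

```python
# minimal sketch: stream llama.cpp completions over HTTP with FastAPI
# (model path, route, and request schema are placeholders)
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from llama_cpp import Llama
from pydantic import BaseModel

app = FastAPI()

# load the model once at startup (path is a placeholder)
llm = Llama(model_path="./models/llama-7b.ggmlv3.q4_0.bin", n_ctx=2048)


class ChatRequest(BaseModel):
    query: str
    max_tokens: int = 256


def generate_tokens(prompt: str, max_tokens: int):
    # llama-cpp-python yields OpenAI-style chunks when stream=True
    for chunk in llm(prompt, max_tokens=max_tokens, stream=True):
        yield chunk["choices"][0]["text"]


@app.post("/chat")
def chat(request: ChatRequest):
    # stream tokens back to the client as plain text
    return StreamingResponse(
        generate_tokens(request.query, request.max_tokens),
        media_type="text/plain",
    )


if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
```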
To start the server, run:
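Assuming the sketch above is saved as `app.py`, something like:

```sh
uvicorn app:app --host 0.0.0.0 --port 8000
```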
To test:
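For example, with curl (the route and payload here match the sketch above):

```sh
curl -N -X POST http://localhost:8000/chat \
  -H "Content-Type: application/json" \
  -d '{"query": "What is the capital of France?"}'
```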
Hope this helps while I work on updating the library!