
dynamic logger wip #1430

Closed
wants to merge 1 commit into from
Conversation

apage43
Member

apage43 commented Sep 15, 2023

No description provided.

@AndriyMulyar
Contributor

Does this PR encompass turning off underlying llama.cpp print statements?

@apage43
Member Author

apage43 commented Sep 18, 2023

> Does this PR encompass turning off underlying llama.cpp print statements?

Assuming I can make this trick work on Windows (trying to figure out how to do that presently), the idea here is that this logging facility can be used inside dynamically loaded code (i.e. llama.cpp or the other model implementation libraries) while still being controlled from the code that loaded it (the gpt4all chat client or bindings).
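To make the intent concrete, here is a minimal sketch of that trick on Linux/macOS (hypothetical names such as `gpt4all_log`, `run_model`, and `libllamamodel.so`; not the actual code in this PR). The plugin references a logging function it never defines, and the dynamic loader resolves that reference against a definition exported by the host at `dlopen` time, so the host keeps control of whether anything gets printed:

```cpp
// ---- plugin.cpp (built as a shared library, e.g. libllamamodel.so) ----
// The dynamically loaded model code declares the logger but never defines it;
// the shared library is linked with the symbol left unresolved.
extern "C" void gpt4all_log(const char *fmt, ...);

extern "C" void run_model() {
    gpt4all_log("loading model...\n");  // whether this prints is up to the host
}

// ---- host.cpp (the chat client / bindings side) ----
#include <cstdarg>
#include <cstdio>
#include <dlfcn.h>

static bool g_logging_enabled = true;  // the host can flip this at runtime

// Exported from the host so the dynamic loader can resolve the plugin's
// undefined reference against it.
extern "C" void gpt4all_log(const char *fmt, ...) {
    if (!g_logging_enabled) return;
    va_list args;
    va_start(args, fmt);
    vfprintf(stderr, fmt, args);
    va_end(args);
}

int main() {
    void *handle = dlopen("./libllamamodel.so", RTLD_NOW);
    if (!handle) { fprintf(stderr, "%s\n", dlerror()); return 1; }
    auto run = reinterpret_cast<void (*)()>(dlsym(handle, "run_model"));
    if (run) run();
    dlclose(handle);
    return 0;
}
```

On Linux this relies on linking the host with `-rdynamic` (or `--export-dynamic`) and leaving the symbol undefined in the shared library; on macOS the plugin is linked with `-undefined dynamic_lookup`. It is exactly this "resolve back against already-loaded symbols" step that has no direct equivalent for Windows DLLs, which is the problem discussed below.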

@apage43
Member Author

apage43 commented Sep 18, 2023

Okay, it looks like this is pretty awkward to pull off on Windows: there's no facility for the linker to allow unresolved symbols in a DLL and then resolve them back against already-loaded symbols. https://github.com/ocaml/flexdll exists to enable this sort of thing to work on Windows, but it's a lot of code to import.

Will think about alternative approaches, and in the meantime may just make it possible to #ifdef out the existing prints.
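For reference, that #ifdef fallback could look something like this minimal sketch (hypothetical macro and flag names such as `MODEL_LOG` and `GPT4ALL_SILENCE_MODEL_LOGS`; not from this PR): the existing stderr prints get wrapped in a macro that a build flag compiles out entirely.

```cpp
#include <cstdio>

#ifdef GPT4ALL_SILENCE_MODEL_LOGS
#  define MODEL_LOG(...) ((void)0)                    // prints compiled out
#else
#  define MODEL_LOG(...) fprintf(stderr, __VA_ARGS__) // default: print to stderr
#endif

void load_model(const char *path) {
    MODEL_LOG("llama_model_load: loading model from '%s'\n", path);
}
```

Building with `-DGPT4ALL_SILENCE_MODEL_LOGS` silences the prints at compile time; the trade-off versus the symbol trick above is that logging can no longer be toggled at runtime by the loading code.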

apage43 closed this Sep 18, 2023