Add Azure OpenAI API support to OntoGPT #322

Merged
merged 21 commits into monarch-initiative:main on Mar 7, 2024

Conversation

k-bandi (Contributor) commented Feb 6, 2024

This PR adds support for using Azure OpenAI API endpoints, rather than the vanilla OpenAI endpoints, for the LLM component of OntoGPT.
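
For context, here is a minimal sketch of what the two paths look like with the openai Python package. This is illustrative only; the client wiring, placeholder values, and API version below are assumptions, not the exact code in this PR:

```python
# Illustrative sketch only, not OntoGPT's implementation.
# Azure OpenAI needs a resource endpoint and an API version in addition to a
# key, while the vanilla OpenAI API only needs the key.
from openai import AzureOpenAI, OpenAI  # openai >= 1.0

def build_client(use_azure: bool):
    if use_azure:
        return AzureOpenAI(
            api_key="<azure-api-key>",                               # placeholder
            azure_endpoint="https://<resource>.openai.azure.com/",   # placeholder
            api_version="2024-02-01",                                # example version
        )
    return OpenAI(api_key="<openai-api-key>")
```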

caufieldjh (Member)

Thanks @k-bandi!
I will make some updates to this to ensure we retain the base functionality, but Azure support is certainly appreciated!

k-bandi closed this Feb 7, 2024
k-bandi reopened this Feb 7, 2024
k-bandi (Contributor, Author) commented Feb 7, 2024

@caufieldjh - Thank you. Should I close this PR now?

caufieldjh (Member)

Please leave it open! I'm very happy to have your contribution and would like to integrate it.

k-bandi (Contributor, Author) commented Feb 7, 2024

Okay great. Thank you. Let me know if you need anything.

k-bandi and others added 6 commits February 7, 2024 15:19
Co-authored-by: Luc Cary <61755432+lucinvitae@users.noreply.github.com>
…zure-openai-api-support

# Conflicts:
#	src/settings.py
caufieldjh (Member)

Just getting back to this now - thanks for your patience @k-bandi. Should be able to get it working and merged today.

caufieldjh (Member)

OK, this should be ready to use! A few notes:

  • There is now a CLI flag named --azure-select. Use it with a command like extract to call an Azure endpoint; the configuration details are still expected to be defined in a local.toml (see the rough sketch after this list).
  • I'm going to be refactoring OntoGPT soon to reduce the number of model changes we have to track in this project (i.e., we'll let other packages like llm and litellm handle that), but I will try to keep that CLI flag since it's a useful shorthand. The process for storing model configurations may change as a result, however.
  • I don't have an Azure endpoint to test with, so I'm making some assumptions about how well this works. Please let me know if it is broken or doesn't work as expected!
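
Since the exact configuration keys aren't spelled out above, here is a rough sketch of how the Azure details might be pulled from local.toml; the [azure] table and key names are assumptions for illustration, so check src/settings.py for the names OntoGPT actually reads:

```python
# Rough sketch only: the [azure] table and key names below are assumptions,
# not necessarily what OntoGPT expects -- see src/settings.py for the real ones.
import tomllib  # standard library in Python 3.11+; use the tomli package on older versions

with open("local.toml", "rb") as f:
    config = tomllib.load(f)

azure_cfg = config["azure"]                # hypothetical table name
endpoint = azure_cfg["endpoint"]           # e.g. https://<resource>.openai.azure.com/
api_key = azure_cfg["api_key"]
api_version = azure_cfg["api_version"]     # e.g. 2024-02-01
# These are the values an Azure OpenAI client needs beyond a plain API key
# (see the sketch in the PR description above).
```

The point is simply that everything the Azure endpoint needs beyond a plain API key lives in local.toml rather than on the command line.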

caufieldjh merged commit 906baee into monarch-initiative:main on Mar 7, 2024
2 checks passed
k-bandi (Contributor, Author) commented Mar 7, 2024

Thanks @caufieldjh for looking into this.
