Generate multiple show notes with different LLM services #50

Open · ajcwebdev opened this issue Dec 4, 2024 · 0 comments
Labels: medium difficulty (larger in scope than easy difficulty but still fairly self-contained)

ajcwebdev (Owner) commented:
Right now, if someone wants to generate multiple show notes using different LLM services, they have to run the entire processing pipeline once per service.

This means the content is downloaded and transcribed multiple times. It should be possible to pass more than one LLM flag and reuse a single transcription.
Example command:

npm run as -- \
  --video "https://www.youtube.com/watch?v=MORMZXEaONk" \
  --chatgpt GPT_4o \
  --claude CLAUDE_3_HAIKU
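
One way this could work internally (a minimal sketch, not the project's actual implementation; every function and option name below is a hypothetical placeholder) is to run the download and transcription steps once, then loop over the requested LLM flags and reuse the same transcript for each service:

```ts
// Hypothetical sketch: download and transcribe once, then fan the same transcript
// out to every LLM service the user requested. The pipeline functions below are
// illustrative stubs standing in for the real download/transcription/LLM steps.
import { writeFile } from 'node:fs/promises'

type LLMService = 'chatgpt' | 'claude'

interface MultiLLMOptions {
  video: string
  // e.g. { chatgpt: 'GPT_4o', claude: 'CLAUDE_3_HAIKU' }
  llmServices: Partial<Record<LLMService, string>>
}

// Stand-ins for the existing pipeline steps.
async function downloadAudio(url: string): Promise<string> {
  return `/tmp/audio-${encodeURIComponent(url)}.wav`
}
async function transcribe(audioPath: string): Promise<string> {
  return `transcript of ${audioPath}`
}
async function runLLM(service: LLMService, model: string, transcript: string): Promise<string> {
  return `# Show notes (${service} / ${model})\n\n${transcript}`
}

export async function processWithMultipleLLMs(options: MultiLLMOptions): Promise<void> {
  // Download and transcribe exactly once.
  const audioPath = await downloadAudio(options.video)
  const transcript = await transcribe(audioPath)

  // Reuse the single transcript for each requested LLM service,
  // writing one show-notes file per service.
  for (const [service, model] of Object.entries(options.llmServices) as [LLMService, string][]) {
    const showNotes = await runLLM(service, model, transcript)
    await writeFile(`show-notes-${service}.md`, showNotes)
  }
}
```

With a structure like this, the example command above would trigger one download, one transcription, and two LLM calls, producing a separate show-notes file for ChatGPT and for Claude.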
ajcwebdev changed the title from "Allow generating multiple show note files with different LLM services" to "Generate multiple show notes with different LLM services" on Dec 4, 2024
ajcwebdev added the medium difficulty label on Dec 4, 2024