The name of this repo is a reference from here.
- Make a copy of `config_template.yaml` and name it `config.yaml`.
- Set the following variables to prevent any impact on the production environment:
  - `pubsub_subscription.launch: False`
  - `pubsub_publish.send_userID: test`
  - `chat_hist_pubsub.topic: "trash-collector"`
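The testing settings above would look roughly like this in `config.yaml` (the nesting is an assumption inferred from the dotted key names; check `config_template.yaml` for the actual layout):

```yaml
# Testing configuration -- keeps the bot away from production traffic.
pubsub_subscription:
  launch: False             # do not pull live events
pubsub_publish:
  send_userID: test         # publish under a throwaway user ID
chat_hist_pubsub:
  topic: "trash-collector"  # chat history goes to a scratch topic
```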
- Modify `memory_type`, `days_of_week`, `prompt`, and `violation` to suit your usage scenario.
- Modify `pach.sh` to suit your usage scenario.
- Run `sh pach.sh` to upload the model to your model registry.
- Deploy the endpoint on Vertex AI.
- Make a copy of `config_template.yaml` and name it `config.yaml`.
- Set the following variables to run AVLiver on the production environment:
  - `pubsub_subscription.launch: True`
  - `pubsub_subscription.topic: "media17-live-events"` or `"media17-live-events-test"` (the latter is for testing)
  - `pubsub_subscription.pull_userID: <AVLiver userID>`
  - `pubsub_publish.send_userID: <AVLiver userID>`
  - `tts_endpoint.speaker_name: <speaker name>`
  - `chat_hist_pubsub.topic: AI-Vliver-chat-hist`
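For production, the same fragment might look like this (nesting again assumed from the dotted key names; the angle-bracket placeholders must be filled in with your own values):

```yaml
# Production configuration -- connects to live events.
pubsub_subscription:
  launch: True
  topic: "media17-live-events"  # or "media17-live-events-test" for testing
  pull_userID: <AVLiver userID>
pubsub_publish:
  send_userID: <AVLiver userID>
tts_endpoint:
  speaker_name: <speaker name>
chat_hist_pubsub:
  topic: AI-Vliver-chat-hist
```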
- (Optional) Set `moderator.launch: True` to launch the moderator.
- Modify `memory_type`, `days_of_week`, `prompt`, and `violation` to suit your usage scenario.
- Modify `pach.sh` to suit your usage scenario.
- Run `sh pach.sh` to upload the model to your model registry.
- Deploy the endpoint on Vertex AI.
Airflow is used to feed AI-VLiver the daily information that you want it to know in the prompt.
- Develop your own `information_collector.py`, which collects information from the real world, summarizes it into a short message, and returns it from `information_collector`.
- Modify `daily_information_dag.py` to set up your AI-VLiver's endpoint.
- Put this code in the GCP Composer environment `ai-vliver-daily-information` to trigger Airflow.

Airflow will inject the message into the variable `daily_mail` in `prompt.py`.
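A minimal sketch of what `information_collector.py` could look like. The function name `information_collector` comes from the steps above; the hard-coded items and the `summarize` helper are placeholder assumptions standing in for your own collection logic (news APIs, weather, schedules, and so on):

```python
def summarize(items, limit=3):
    """Join the first few collected items into one short message."""
    return " / ".join(items[:limit])


def information_collector():
    """Collect real-world information and return it as a short message.

    The hard-coded items below are illustrative; replace them with
    whatever your collector actually scrapes or queries.
    """
    items = [
        "It is sunny in Taipei today.",
        "A new game tournament starts tonight.",
    ]
    return summarize(items)
```

The DAG in `daily_information_dag.py` would then call `information_collector()` on its daily schedule and push the returned message into the `daily_mail` variable.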
A Cloud Function is used to keep the long-term memory in Redis and inject it into the variable `long_term_memory` in `prompt.py`. The Cloud Function calls an LLM to extract useful information from each conversation, builds structured data (a dict) to save in Redis, and ultimately injects it into the prompt.
- Write your own LLM prompt in `prompt.py` (give few-shot examples to the LLM).
- Set up your AI-VLiver's endpoint ID.
- Set a new key name to replace `long_term_memory`.
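The injection step can be sketched in plain Python. The variable names `daily_mail` and `long_term_memory` come from the sections above; the template text, the `render_prompt` helper, and the example memory dict are illustrative assumptions, not the actual contents of `prompt.py`:

```python
# A prompt.py-style template with the two injected variables.
PROMPT_TEMPLATE = (
    "You are an AI VLiver.\n"
    "Today's news: {daily_mail}\n"
    "What you remember about this viewer: {long_term_memory}\n"
)


def render_prompt(daily_mail, memory):
    """Flatten the structured memory dict (as stored in Redis) and
    fill both variables into the prompt template."""
    long_term_memory = "; ".join(f"{k}={v}" for k, v in memory.items())
    return PROMPT_TEMPLATE.format(
        daily_mail=daily_mail,
        long_term_memory=long_term_memory,
    )
```

If you rename the `long_term_memory` key as the last step suggests, the placeholder in the template and the keyword argument must be renamed together.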