diff --git a/README.md b/README.md
index 79fabf4..cfd98ae 100644
--- a/README.md
+++ b/README.md
@@ -17,13 +17,14 @@ FATE-LLM is a framework to support federated learning for large language models(
 
 ### Standalone deployment
 Please refer to [FATE-Standalone deployment](https://github.com/FederatedAI/FATE#standalone-deployment).
-Deploy FATE-Standalone version with 1.11.3 <= version < 2.0, then copy directory `python/fate_llm` to `{fate_install}/fate/python/fate_llm`
+* To deploy FATE-LLM v2.0, deploy FATE-Standalone with version >= 2.1, then create a new directory `{fate_install}/fate_llm`, clone the code into it, install the Python requirements, and add `{fate_install}/fate_llm/python` to `PYTHONPATH`
+* To deploy FATE-LLM v1.x, deploy FATE-Standalone with 1.11.3 <= version < 2.0, then copy directory `python/fate_llm` to `{fate_install}/fate/python/fate_llm`
 
 ### Cluster deployment
 Use [FATE-LLM deployment packages](https://github.com/FederatedAI/FATE/wiki/Download#llm%E9%83%A8%E7%BD%B2%E5%8C%85) to deploy; refer to [FATE-Cluster deployment](https://github.com/FederatedAI/FATE#cluster-deployment) for more deployment details.
 
 ## Quick Start
 - [Federated ChatGLM3-6B Training](./doc/tutorial/parameter_efficient_llm/ChatGLM3-6B_ds.ipynb)
-- [Builtin Models In PELLM](./doc/tutorial/builtin_models.md)
+- [Builtin Models In PELLM](./doc/tutorial/builtin_pellm_models.md)
 - [Offsite Tuning Tutorial](./doc/tutorial/offsite_tuning/Offsite_tuning_tutorial.ipynb)
 - [FedKSeed](./doc/tutorial/fedkseed/fedkseed-example.ipynb)
\ No newline at end of file
diff --git a/doc/tutorial/builtin_pellm_models.md b/doc/tutorial/builtin_pellm_models.md
index 70c3f37..e2a3d49 100644
--- a/doc/tutorial/builtin_pellm_models.md
+++ b/doc/tutorial/builtin_pellm_models.md
@@ -6,13 +6,12 @@ After reading the training tutorial above, it's easy to use other models listed
 
 | Model          | ModuleName        | ClassName     | DataSetName     |
-| -------------- | ----------------- | --------------| --------------- | |
+| -------------- | ----------------- | --------------| --------------- |
 | Qwen2          | pellm.qwen        | Qwen          | prompt_dataset  |
 | Bloom-7B1      | pellm.bloom       | Bloom         | prompt_dataset  |
 | LLaMA-2-7B     | pellm.llama       | LLaMa         | prompt_dataset  |
 | LLaMA-7B       | pellm.llama       | LLaMa         | prompt_dataset  |
 | ChatGLM3-6B    | pellm.chatglm     | ChatGLM       | prompt_dataset  |
-| ChatGLM-6B     | pellm.chatglm     | ChatGLM       | prompt_dataset  |
 | GPT-2          | pellm.gpt2        | GPT2          | seq_cls_dataset |
 | ALBERT         | pellm.albert      | Albert        | seq_cls_dataset |
 | BART           | pellm.bart        | Bart          | seq_cls_dataset |
 
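The v2.0 standalone deployment steps added in the README hunk above can be sketched as a shell snippet. This is a minimal sketch, not the official installer: the `fate_install` location and the `python/requirements.txt` path are assumptions, so adjust both to your actual FATE-Standalone (>= 2.1) installation.

```shell
# Sketch of the FATE-LLM v2.0 standalone deployment steps.
# ASSUMPTION: fate_install points at your FATE-Standalone (>= 2.1) directory;
# a temporary directory is used here only for illustration.
fate_install="${fate_install:-$(mktemp -d)}"

# 1. Create the fate_llm directory and clone the FATE-LLM code into it.
mkdir -p "${fate_install}/fate_llm"
git clone https://github.com/FederatedAI/FATE-LLM.git "${fate_install}/fate_llm" \
    || mkdir -p "${fate_install}/fate_llm/python"   # offline fallback, illustration only

# 2. Install the Python requirements (requirements file path is an assumption).
# pip install -r "${fate_install}/fate_llm/python/requirements.txt"

# 3. Add the fate_llm python package directory to PYTHONPATH.
export PYTHONPATH="${fate_install}/fate_llm/python:${PYTHONPATH}"
echo "PYTHONPATH=${PYTHONPATH}"
```

After step 3, `import fate_llm` should resolve in any Python process started from that shell; making the `export` permanent (e.g. in `~/.bashrc`) is left to the operator.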