Commit

Removed phi-1_5 from model_config.json (nutanix#30)
saileshd1402 authored Dec 7, 2023
1 parent b1029bb commit cab08d0
Showing 3 changed files with 6 additions and 22 deletions.
2 changes: 1 addition & 1 deletion README.md
```diff
@@ -12,7 +12,7 @@ This new solution includes:
 - The management interface for enhanced terminal UI or standard CLI
 - Support for a curated set of LLMs including Llama2, Falcon and MPT
 
-Refer to the official [GPT-in-a-Box Documentation](https://opendocs.nutanix.com/gpt-in-a-box/vm/getting_started/) to deploy and validate the inference server on virtual machine
+Refer to the official [GPT-in-a-Box Documentation](https://opendocs.nutanix.com/gpt-in-a-box/overview/) to deploy and validate the inference server on virtual machine
 
 ### License
 All source code and other contents in this repository are covered by the Nutanix License and Services Agreement, which is located at https://www.nutanix.com/legal/eula
```
16 changes: 5 additions & 11 deletions llm/generate.py
```diff
@@ -195,7 +195,7 @@ def read_config_for_download(gen_model: GenerateDataModel) -> GenerateDataModel:
         )
         sys.exit(1)
 
-    else:  # Custom model case
+    else:  # Custom model and HuggingFace model case
         gen_model.is_custom_model = True
         if gen_model.skip_download:
             if check_if_folder_empty(gen_model.mar_utils.model_path):
@@ -208,21 +208,15 @@ def read_config_for_download(gen_model: GenerateDataModel) -> GenerateDataModel:
         else:
             if not gen_model.repo_info.repo_id:
                 print(
-                    "## If you want to create a model archive file with the supported models, "
+                    "## If you want to create a model archive file for supported models, "
                     "make sure you're model name is present in the below : "
                 )
                 print(list(models.keys()))
                 print(
                     "\nIf you want to create a model archive file for"
-                    " a custom model,there are two methods:\n"
-                    "1. If you have already downloaded the custom model"
-                    " files, please include"
-                    " the --skip_download flag and provide the model_path "
-                    "directory which contains the model files.\n"
-                    "2. If you need to download the model files, provide "
-                    "the HuggingFace repository ID using 'repo_id'"
-                    " along with an empty model_path driectory where the "
-                    "model files will be downloaded.\n"
+                    " either a Custom Model or other HuggingFace models, "
+                    "refer to the official GPT-in-a-Box documentation: "
+                    "https://opendocs.nutanix.com/gpt-in-a-box/overview/"
                 )
                 sys.exit(1)
 
```
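The `read_config_for_download` branch changed above treats any model name missing from `model_config.json` as a custom model and insists on a HuggingFace `repo_id` before proceeding. A minimal sketch of that decision, assuming the config is a JSON object keyed by model name (the helper names here are illustrative, not the project's API):

```python
import json
import sys


def load_supported_models(config_path: str) -> dict:
    # model_config.json maps supported model names to their settings.
    with open(config_path, "r", encoding="utf-8") as f:
        return json.load(f)


def resolve_model(model_name: str, models: dict, repo_id: str) -> str:
    # Supported models need no extra information.
    if model_name in models:
        return "supported"
    # Anything else is custom and requires a HuggingFace repo ID.
    if not repo_id:
        print("Model name not found in supported models:", list(models.keys()))
        print("For custom or other HuggingFace models, see: "
              "https://opendocs.nutanix.com/gpt-in-a-box/overview/")
        sys.exit(1)
    return "custom"
```

With a config containing only `gpt2`, `resolve_model("gpt2", models, "")` passes through as supported, while an unknown name is accepted as custom only when a `repo_id` is supplied.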
10 changes: 0 additions & 10 deletions llm/model_config.json
```diff
@@ -47,16 +47,6 @@
             "response_timeout": 2000
         }
     },
-    "phi-1_5": {
-        "handler": "handler.py",
-        "repo_id": "microsoft/phi-1_5",
-        "repo_version": "b6a7e2fe15c21f5847279f23e280cc5a0e7049ef",
-        "registration_params": {
-            "batch_size": 1,
-            "max_batch_delay": 200,
-            "response_timeout": 2000
-        }
-    },
     "gpt2": {
         "handler": "handler.py",
         "repo_id": "gpt2",
```
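Every entry visible in this `model_config.json` diff shares the same shape: a `handler`, a `repo_id`, a pinned `repo_version`, and `registration_params`. A hedged sketch of checking that shape before registering a model — the required-key sets are inferred from the entries shown here, not taken from the project's schema:

```python
# Keys inferred from the entries visible in the diff; assumptions, not a schema.
REQUIRED_KEYS = {"handler", "repo_id", "repo_version", "registration_params"}
REQUIRED_PARAMS = {"batch_size", "max_batch_delay", "response_timeout"}


def validate_entry(name: str, entry: dict) -> list:
    # Collect human-readable problems instead of failing on the first one.
    problems = []
    for key in sorted(REQUIRED_KEYS - entry.keys()):
        problems.append(f"{name}: missing key '{key}'")
    params = entry.get("registration_params", {})
    for key in sorted(REQUIRED_PARAMS - params.keys()):
        problems.append(f"{name}: missing registration_params '{key}'")
    return problems
```

A well-formed entry such as the remaining `gpt2` block yields an empty problem list; a truncated entry reports each missing field by name.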
