diff --git a/README.md b/README.md
index 8396e63..b5c8202 100644
--- a/README.md
+++ b/README.md
@@ -7,6 +7,8 @@
[APEX](https://github.com/deepmodeling/APEX): Alloy Property EXplorer is a component of the [AI Square](https://aissquare.com/) project that involves the restructuring of the [DP-GEN](https://github.com/deepmodeling/dpgen) `auto_test` module to develop a versatile and extensible Python package for general alloy property calculations. This package enables users to conveniently establish a wide range of cloud-native property-test workflows by utilizing various computational approaches, including LAMMPS, VASP, ABACUS, and others.
+![gif](./docs/images/apex_demo_high_0001.gif)
+
## v1.2 Main Features Update
* Add a `retrieve` sub-command to allow results to be retrieved independently and manually for multiple properties (Remove `Distributor` and `Collector` OP)
* Support common **dflow operations** with terminal commands
@@ -37,8 +39,14 @@ If you use APEX in your research, please cite the following paper for general pu
- [How to cite APEX](#how-to-cite-apex)
- [Table of Contents](#table-of-contents)
- [1. Overview](#1-overview)
- - [2. Easy Install](#2-easy-install)
- - [3. User Guide](#3-user-guide)
+ - [2. Quick Start](#2-quick-start)
+ - [2.1. Install APEX](#21-install-apex)
+ - [2.2. Install Local Argo (Optional)](#22-install-local-argo-optional)
+ - [2.3. Submission Examples](#23-submission-examples)
+ - [2.3.1. Submit to Local Argo Service](#231-submit-to-local-argo-service)
+ - [2.3.2. Submit Without Argo Service](#232-submit-without-argo-service)
+ - [2.3.3. Submit to the Bohrium](#233-submit-to-the-bohrium)
+ - [3. Documents \& User Guide](#3-documents--user-guide)
- [3.1. Before Submission](#31-before-submission)
- [3.1.1. Global Setting](#311-global-setting)
- [3.1.2. Calculation Parameters](#312-calculation-parameters)
@@ -57,10 +65,6 @@ If you use APEX in your research, please cite the following paper for general pu
- [3.3.1. Retrieve Results Manually](#331-retrieve-results-manually)
- [3.3.2. Archive Test Results](#332-archive-test-results)
- [3.3.3. Results Visualization Report](#333-results-visualization-report)
- - [4. Quick Start](#4-quick-start)
- - [4.1. In the Bohrium](#41-in-the-bohrium)
- - [4.2. In a Local Argo Service](#42-in-a-local-argo-service)
- - [4.3. In a Local Environment](#43-in-a-local-environment)
## 1. Overview
@@ -91,21 +95,154 @@ APEX currently offers calculation methods for the following alloy properties:
Moreover, APEX supports three types of calculators: **LAMMPS** for molecular dynamics simulations, and **VASP** and **ABACUS** for first-principles calculations.
-## 2. Easy Install
-Easy install by
+## 2. Quick Start
+### 2.1. Install APEX
+The latest version of APEX can be easily installed from PyPI with the following command:
```shell
pip install apex-flow
```
-You may also clone the package firstly by
+Alternatively, you can install APEX from the source code. First, clone the repository:
```shell
git clone https://github.com/deepmodeling/APEX.git
```
-then install APEX by
+then install APEX by:
```shell
cd APEX
pip install .
```
-## 3. User Guide
+
+### 2.2. Install Local Argo (Optional)
+The APEX workflow can be accelerated and better organized by the [Argo](https://argoproj.github.io/workflows/) workflow engine, which provides an intuitive process-monitoring UI and a number of user-friendly workflow-management functions.
+
+To enable this feature on your local computer, you can set up the dflow service by executing the [installation scripts](./docs/scripts/) prepared for Unix-like systems. For instance, to install on a Linux system without root access:
+```shell
+bash install-linux-cn.sh
+```
+This process will automatically configure the required local tools, including Docker, Minikube, and the Argo service, with the default port set to `127.0.0.1:2746`. To set up the service on Windows, please refer to the [dflow setup manual](https://github.com/deepmodeling/dflow/tree/master/tutorials) for more details.
+
+
+### 2.3. Submission Examples
+We present several case studies as introductory illustrations of APEX, tailored to distinct user scenarios. For our demonstration, we will utilize a [LAMMPS_example](./examples/lammps_demo) to compute the Equation of State (EOS) and elastic constants of molybdenum in both Body-Centered Cubic (BCC) and Face-Centered Cubic (FCC) phases. To begin, we will examine the files prepared within the working directory for this specific case.
+
+```
+lammps_demo
+├── confs
+│   ├── std-bcc
+│   │   └── POSCAR
+│   └── std-fcc
+│       └── POSCAR
+├── frozen_model.pb
+├── global_bohrium.json
+├── global_hpc.json
+├── param_joint.json
+├── param_props.json
+└── param_relax.json
+```
+There are three types of parameter files and two types of global config files, as well as a Deep Potential file of molybdenum, `frozen_model.pb`. Under the `confs` directory, a structure file `POSCAR` has been prepared for each phase.
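+
+For orientation, below is a minimal sketch of what the relaxation parameter file `param_relax.json` may look like for this demo. The key names follow the parameter conventions documented in Section 3, while the values here are purely illustrative; the authoritative files ship with the [example](./examples/lammps_demo) itself:
+```json
+{
+    "structures": ["confs/std-*"],
+    "interaction": {
+        "type": "deepmd",
+        "model": "frozen_model.pb",
+        "type_map": {"Mo": 0}
+    },
+    "relaxation": {
+        "cal_setting": {
+            "etol": 0,
+            "ftol": 1e-10
+        }
+    }
+}
+```
+The property parameter files (`param_props.json` and the combined `param_joint.json`) follow the same pattern, typically adding a `properties` list on top of the same `structures` and `interaction` blocks.
+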
+#### 2.3.1. Submit to Local Argo Service
+Before trying this subsection, make sure you have set up a local Argo service. If not, please follow the instructions in [2.2. Install Local Argo (Optional)](#22-install-local-argo-optional) first.
+
+The local Argo service listens on `127.0.0.1:2746` by default. To submit a workflow to it, one only needs to adapt the `global_hpc.json` file. Here is an example that runs LAMMPS in the local shell environment:
+```json
+{
+    "apex_image_name": "zhuoyli/apex_amd64",
+    "run_image_name": "zhuoyli/apex_amd64",
+    "run_command": "lmp -in in.lammps",
+    "batch_type": "Shell",
+    "context_type": "Local",
+    "local_root": "./",
+    "remote_root": "/some/path/not/under/pwd/"
+}
+```
+Another example submits jobs to a remote HPC cluster. In this case, we distribute tasks to a remote node managed by [Slurm](https://slurm.schedmd.com). Users can replace the relevant parameters within the `machine` dictionary or specify `resources` and `tasks` according to the [DPDispatcher](https://docs.deepmodeling.com/projects/dpdispatcher/en/latest/index.html) rules.
+Here is an example `global_hpc.json` file:
+```json
+{
+    "apex_image_name": "zhuoyli/apex_amd64",
+    "run_image_name": "zhuoyli/apex_amd64",
+    "run_command": "lmp -in in.lammps",
+    "context_type": "SSHContext",
+    "machine": {
+        "batch_type": "Slurm",
+        "context_type": "SSHContext",
+        "local_root": "./",
+        "remote_root": "/your/remote/tasks/path",
+        "clean_asynchronously": true,
+        "remote_profile": {
+            "hostname": "123.12.12.12",
+            "username": "USERNAME",
+            "password": "PASSWD",
+            "port": 22,
+            "timeout": 10
+        }
+    },
+    "resources": {
+        "number_node": 1,
+        "cpu_per_node": 4,
+        "gpu_per_node": 0,
+        "queue_name": "apex_test",
+        "group_size": 1,
+        "module_list": ["deepmd-kit/2.1.0/cpu_binary_release"],
+        "custom_flags": [
+            "#SBATCH --partition=xlong",
+            "#SBATCH --ntasks=4",
+            "#SBATCH --mem=10G",
+            "#SBATCH --nodes=1",
+            "#SBATCH --time=1-00:00:00"
+        ]
+    }
+}
+```
+
+Then, one can submit a relaxation workflow via:
+```shell
+apex submit param_relax.json -c global_hpc.json
+```
+
+Upon submission of the workflow, progress can be monitored locally at https://127.0.0.1:2746. If Argo is set up on a machine without a display, you can forward port `127.0.0.1:2746` to another PC by running the following command on that PC:
+```shell
+ssh -nNT -L 127.0.0.1:2746:127.0.0.1:2746 USERNAME@123.12.12.12
+```
+You can then monitor the UI through https://127.0.0.1:2746 on that PC.
+
+#### 2.3.2. Submit Without Argo Service
+If your local computer has difficulty connecting to the internet or installing cloud-native infrastructure such as Docker and Argo, APEX offers a **workflow local debug mode** that runs the flow in a basic `Python3` environment, independent of any Docker container. Users will **not** be able to monitor the workflow through the workflow UI; however, the workflow still runs automatically.
+
+To enable this mode, add the optional argument `-d` to the original submission command, as demonstrated below:
+
+```shell
+apex submit -d param_relax.json -c global_hpc.json
+```
+
+In this approach, users are not required to specify an image for executing APEX. Instead, APEX should be pre-installed in the default `Python3` environment to ensure proper functioning.
+
+#### 2.3.3. Submit to the Bohrium
+The most efficient way to submit an APEX workflow is through the pre-built execution environment of Argo on the [Bohrium cloud platform](https://bohrium.dp.tech). This is especially convenient and robust for massive, task-intensive workflows running concurrently. A **Bohrium account** is required before running. Below is an example global config file (`global_bohrium.json`) for this approach.
+
+```json
+{
+    "dflow_host": "https://workflows.deepmodeling.com",
+    "k8s_api_server": "https://workflows.deepmodeling.com",
+    "batch_type": "Bohrium",
+    "context_type": "Bohrium",
+    "email": "YOUR_EMAIL",
+    "password": "YOUR_PASSWD",
+    "program_id": 1234,
+    "apex_image_name": "registry.dp.tech/dptech/prod-11045/apex-dependency:1.2.0",
+    "lammps_image_name": "registry.dp.tech/dptech/prod-11045/deepmdkit-phonolammps:2.1.1",
+    "lammps_run_command": "lmp -in in.lammps",
+    "scass_type": "c8_m31_1 * NVIDIA T4"
+}
+```
+Then, one can submit a relaxation workflow via:
+```shell
+apex submit param_relax.json -c global_bohrium.json
+```
+Remember to replace `email`, `password`, and `program_id` with your own before submission. As for the images, you can either build your own, use public images from Bohrium, or pull them from Docker Hub.
+Once the workflow is submitted, one can monitor it at https://workflows.deepmodeling.com.
+
+You may also check out our online [hands-on Bohrium notebook tutorial](https://bohrium.dp.tech/notebooks/15413) for submission to Bohrium.
+
+## 3. Documents & User Guide
### 3.1. Before Submission
In APEX, there are **three essential components** required before submitting a workflow:
@@ -275,7 +412,7 @@ Below are three examples (for detailed explanations of each parameter, please re
| norm_deform | Float | 0.01 | The deformation in xx, yy, zz, defaul = 1e-2 |
| shear_deform | Float | 0.01 | The deformation in other directions, default = 1e-2 |
| conventional | Bool | False | Whether adopt conventional cell for deformation |
- | ieee | Bool | True | Whether rotate relaxed structure into IEEE-standard format before deformation ([ref](https://ieeexplore.ieee.org/document/26560)) |
+ | ieee | Bool | False | Whether rotate relaxed structure into IEEE-standard format before deformation ([ref](https://ieeexplore.ieee.org/document/26560)) |
##### 3.1.2.3. Surface
| Key words | Data structure | Example | Description |
@@ -546,107 +683,3 @@ Once the report app is opened (or manully via http://127.0.0.1:8050/), users can
Figure 3. Demonstration of APEX Results Visualization Report
- - - -## 4. Quick Start -We present several case studies as introductory illustrations of APEX, tailored to distinct user scenarios. For our demonstration, we will utilize a [LAMMPS_example](./examples/lammps_demo) to compute the Equation of State (EOS) and elastic constants of molybdenum in both Body-Centered Cubic (BCC) and Face-Centered Cubic (FCC) phases. To begin, we will examine the files prepared within the working directory for this specific case. - -``` -lammps_demo -├── confs -│ ├── std-bcc -│ │ └── POSCAR -│ └── std-fcc -│ └── POSCAR -├── frozen_model.pb -├── global_bohrium.json -├── global_hpc.json -├── param_joint.json -├── param_props.json -└── param_relax.json -``` -There are three types of parameter files and two types of global config files, as well as a Deep Potential file of molybdenum `frozen_model.pb`. Under the directory of `confs`, structure file `POSCAR` of both phases have been prepared respectively. - -### 4.1. In the Bohrium -The most efficient method for submitting an APEX workflow is through the preconfigured execution environment of dflow in the [Bohrium platform](https://bohrium.dp.tech). To do this, it may be necessary to create an account on Bohrium. Below is an example of a global.json file for this approach. - -```json -{ - "dflow_host": "https://workflows.deepmodeling.com", - "k8s_api_server": "https://workflows.deepmodeling.com", - "batch_type": "Bohrium", - "context_type": "Bohrium", - "email": "YOUR_EMAIL", - "password": "YOUR_PASSWD", - "program_id": 1234, - "apex_image_name":"registry.dp.tech/dptech/prod-11045/apex-dependency:1.2.0", - "lammps_image_name": "registry.dp.tech/dptech/prod-11045/deepmdkit-phonolammps:2.1.1", - "lammps_run_command":"lmp -in in.lammps", - "scass_type":"c8_m31_1 * NVIDIA T4" -} -``` -Then, one can submit a relaxation workflow via: -```shell -apex submit param_relax.json -c global_bohrium.json -``` -Remember to replace `email`, `password` and `program_id` of your own before submission. As for image, you can either build your own or use public images from Bohrium or pulling from the Docker Hub. Once the workflow is submitted, one can monitor it at https://workflows.deepmodeling.com. - -### 4.2. In a Local Argo Service -Additionally, a dflow environment can be installed in a local computer by executing [installation scripts](https://github.com/deepmodeling/dflow/tree/master/scripts) located in the dflow repository (users can also refer to the [dflow service setup manual](https://github.com/deepmodeling/dflow/tree/master/tutorials) for more details). For instance, to install on a Linux system without root access: -```shell -bash install-linux-cn.sh -``` -This process will automatically configure the required local tools, including Docker, Minikube, and Argo service, with the default port set to `127.0.0.1:2746`. Consequently, one can modify the `global_hpc.json` file to submit a workflow to this container without a Bohrium account. 
Here is an example: - -```json -{ - "apex_image_name":"zhuoyli/apex_amd64", - "run_image_name": "zhuoyli/apex_amd64", - "run_command":"lmp -in in.lammps", - "batch_type": "Slurm", - "context_type": "SSHContext", - "local_root" : "./", - "remote_root": "/hpc/home/zyl/Downloads/remote_tasks", - "remote_host": "123.12.12.12", - "remote_username": "USERNAME", - "remote_password": "PASSWD", - "resources":{ - "number_node": 1, - "cpu_per_node": 4, - "gpu_per_node": 0, - "queue_name": "apex_test", - "group_size": 1, - "module_list": ["deepmd-kit/2.1.0/cpu_binary_release"], - "custom_flags": [ - "#SBATCH --partition=xlong", - "#SBATCH --ntasks=4", - "#SBATCH --mem=10G", - "#SBATCH --nodes=1", - "#SBATCH --time=1-00:00:00" - ] - } -} - -``` -In this example, we attempt to distribute tasks to a remote node managed by [Slurm](https://slurm.schedmd.com). Users can replace the relevant parameters within the `machine` dictionary or specify `resources` and `tasks` according to [DPDispatcher](https://docs.deepmodeling.com/projects/dpdispatcher/en/latest/index.html) rules. - -For the APEX image, it is publicly available on [Docker Hub](https://hub.docker.com) and can be pulled automatically. Users may also choose to pull the image beforehand or create their own Docker image in the Minikube environment locally using a [Dockerfile](./docs/Dockerfile) (please refer to [Docker's documentation](https://docs.docker.com/engine/reference/commandline/build/) for building instructions) to expedite pod initialization. - -Then, one can submit a relaxation workflow via: -```shell -apex submit param_relax.json -c global_hpc.json -``` - -Upon submission of the workflow, progress can be monitored at https://127.0.0.1:2746. - -### 4.3. In a Local Environment -If your local computer experiences difficulties connecting to the internet, APEX offers a **workflow local debug mode** that allows the flow to operate in a basic `Python3` environment, independent of the Docker container. However, users will **not** be able to monitor the workflow through the Argo UI. - -To enable this feature, users can add an additional optional argument `-d` to the origin submission command, as demonstrated below: - -```shell -apex submit -d param_relax.json -c global_hpc.json -``` - -In this approach, uses are not required to specify an image for executing APEX. Rather, APEX should be pre-installed in the default `Python3` environment to ensure proper functioning. 
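Among the code changes below, `apex/core/calculator/Lammps.py` extends multi-file model handling to the GAP potential via `MULTI_MODELS_INTER_TYPE`, and `apex/core/calculator/lib/lammps_utils.py` generates the corresponding `pair_style quip` input, with optional `init_string` and `atomic_num_list` keys read from the interaction parameters. As a hedged sketch, a GAP `interaction` block might look like the following (the file names, XML label, and element are illustrative; both optional keys may be omitted, in which case the label is parsed from the XML file and the atomic numbers are derived from `type_map`):
```json
{
    "type": "gap",
    "model": ["gap_model.xml", "gap_model.xml.sparseX.GAP_2024_1"],
    "type_map": {"Mo": 0},
    "init_string": "GAP_2024_1",
    "atomic_num_list": [42]
}
```
With these inputs, `inter_gap()` writes a `pair_style quip` line followed by `pair_coeff * * gap_model.xml "Potential xml_label=GAP_2024_1" 42`, and `make_potential_files()` symlinks every file listed under `model` so that all of them travel with the generated tasks.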
diff --git a/apex/__init__.py b/apex/__init__.py index cb59c39..7990198 100644 --- a/apex/__init__.py +++ b/apex/__init__.py @@ -1,5 +1,5 @@ import os -__version__ = '1.2.6' +__version__ = '1.2.9' LOCAL_PATH = os.getcwd() diff --git a/apex/config.py b/apex/config.py index 8510c7f..585e3d5 100644 --- a/apex/config.py +++ b/apex/config.py @@ -59,6 +59,7 @@ class Config: group_size: int = None pool_size: int = None upload_python_packages: list = field(default_factory=list) + exclude_upload_files: list = field(default_factory=list) lammps_image_name: str = None lammps_run_command: str = None vasp_image_name: str = None diff --git a/apex/core/calculator/ABACUS.py b/apex/core/calculator/ABACUS.py index ccca5df..1faf13e 100644 --- a/apex/core/calculator/ABACUS.py +++ b/apex/core/calculator/ABACUS.py @@ -148,6 +148,9 @@ def make_input_file(self, output_dir, task_type, task_param): elif [relax_pos, relax_shape, relax_vol] == [False, True, True]: self.modify_input(incar, "calculation", "cell-relax") fix_atom = [True, True, True] + elif [relax_pos, relax_shape, relax_vol] == [True, False, True]: + self.modify_input(incar, "calculation", "cell-relax") + self.modify_input(incar, "fixed_axes", "shape") elif [relax_pos, relax_shape, relax_vol] == [False, False, True]: raise RuntimeError( "relax volume but fix shape is not supported for ABACUS" diff --git a/apex/core/calculator/Lammps.py b/apex/core/calculator/Lammps.py index ef0be29..cd979c2 100644 --- a/apex/core/calculator/Lammps.py +++ b/apex/core/calculator/Lammps.py @@ -22,7 +22,7 @@ upload_packages.append(__file__) # LAMMPS_INTER_TYPE = ['deepmd', 'eam_alloy', 'meam', 'eam_fs', 'meam_spline', 'snap', 'gap', 'rann', 'mace'] - +MULTI_MODELS_INTER_TYPE = ["meam", "snap", "gap"] class Lammps(Task): def __init__(self, inter_parameter, path_to_poscar): @@ -30,7 +30,7 @@ def __init__(self, inter_parameter, path_to_poscar): self.inter_type = inter_parameter["type"] self.type_map = inter_parameter["type_map"] self.in_lammps = inter_parameter.get("in_lammps", "auto") - if self.inter_type in ["meam", "snap"]: + if self.inter_type in MULTI_MODELS_INTER_TYPE: self.model = list(map(os.path.abspath, inter_parameter["model"])) else: self.model = os.path.abspath(inter_parameter["model"]) @@ -76,6 +76,16 @@ def set_model_param(self): "param_type": self.type_map, "deepmd_version": deepmd_version, } + elif self.inter_type == "gap": + model_name = list(map(os.path.basename, self.model)) + self.model_param = { + "type": self.inter_type, + "model_name": model_name, + "param_type": self.type_map, + "init_string": self.inter.get("init_string", None), + "atomic_num_list": self.inter.get("atomic_num_list", None), + "deepmd_version": deepmd_version, + } else: model_name = os.path.basename(self.model) self.model_param = { @@ -101,10 +111,10 @@ def symlink_force(self, target, link_name): def make_potential_files(self, output_dir): parent_dir = os.path.join(output_dir, "../../") - if self.inter_type in ["meam", "snap"]: - model_lib, model_file = map(os.path.basename, self.model[:2]) - targets = [self.model[0], self.model[1]] - link_names = [model_lib, model_file] + if self.inter_type in MULTI_MODELS_INTER_TYPE: + model_file = map(os.path.basename, self.model) + targets = self.model + link_names = list(model_file) else: model_file = os.path.basename(self.model) targets = [self.model] @@ -516,19 +526,19 @@ def _prepare_result_dict(self, atom_numbs, type_map_list, type_list, box, coord, return result_dict def forward_files(self, property_type="relaxation"): - if self.inter_type in 
["meam", "snap"]: + if self.inter_type in MULTI_MODELS_INTER_TYPE: return ["conf.lmp", "in.lammps"] + list(map(os.path.basename, self.model)) else: return ["conf.lmp", "in.lammps", os.path.basename(self.model)] def forward_common_files(self, property_type="relaxation"): if property_type not in ["eos"]: - if self.inter_type in ["meam", "snap"]: + if self.inter_type in MULTI_MODELS_INTER_TYPE: return ["in.lammps"] + list(map(os.path.basename, self.model)) else: return ["in.lammps", os.path.basename(self.model)] else: - if self.inter_type in ["meam", "snap"]: + if self.inter_type in MULTI_MODELS_INTER_TYPE: return list(map(os.path.basename, self.model)) else: return [os.path.basename(self.model)] diff --git a/apex/core/calculator/VASP.py b/apex/core/calculator/VASP.py index fb6c234..a19d190 100644 --- a/apex/core/calculator/VASP.py +++ b/apex/core/calculator/VASP.py @@ -78,8 +78,9 @@ def make_input_file(self, output_dir, task_type, task_param): # revise INCAR based on the INCAR provided in the "interaction" else: + approach = None if prop_type == "phonon": - approach = task_param.get("approach", "linear") + approach = task_param.get("approach") logging.info(f"No specification of INCAR for {prop_type} calculation, will auto-generate") if approach == "linear": incar = incar_upper(Incar.from_str( @@ -131,14 +132,15 @@ def make_input_file(self, output_dir, task_type, task_param): ) incar["ISIF"] = isif - elif cal_type == "static": + elif cal_type == "static" and not approach == "linear": nsw = 0 if not ("NSW" in incar and incar.get("NSW") == nsw): logging.info( "%s setting NSW to %d" % (self.make_input_file.__name__, nsw) ) incar["NSW"] = nsw - + elif cal_type == "static" and approach == "linear": + pass else: raise RuntimeError("not supported calculation type for VASP") diff --git a/apex/core/calculator/lib/abacus_utils.py b/apex/core/calculator/lib/abacus_utils.py index df14097..e1e30a1 100644 --- a/apex/core/calculator/lib/abacus_utils.py +++ b/apex/core/calculator/lib/abacus_utils.py @@ -1,6 +1,6 @@ #!/usr/bin/python3 import logging -import os, re +import os, re, glob import dpdata import numpy as np @@ -410,16 +410,23 @@ def final_stru(abacus_path): out_stru = bool(line.split()[1]) logf = os.path.join(abacus_path, "OUT.%s/running_%s.log" % (suffix, calculation)) if calculation in ["relax", "cell-relax"]: - if not out_stru: + if os.path.isfile(os.path.join(abacus_path, "OUT.%s/STRU_ION_D" % suffix)): return "OUT.%s/STRU_ION_D" % suffix else: - with open(logf) as f1: - lines = f1.readlines() - for i in range(1, len(lines)): - if lines[-i][36:41] == "istep": - max_step = int(lines[-i].split()[-1]) - break - return "OUT.%s/STRU_ION%d_D" % (suffix, max_step) + # find the final name by STRU_ION*_D, + # for abacus version < v3.2.2, there has no STRU_ION_D file but has STRU_ION0_D STRU_ION1_D ... STRU_ION10_D ... 
+ # so we need to find the last STRU_ION*_D file + stru_ions = glob.glob( + os.path.join(abacus_path, f"OUT.{suffix}/STRU_ION*_D") + ) + if len(stru_ions) > 0: + # sort the file name by the number in the file name + stru_ions.sort(key=lambda x: int(x.split("_")[-2][3:])) + final_stru_ion = os.path.basename(stru_ions[-1]) + return f"OUT.{suffix}/{final_stru_ion}" + else: + # if there has no STRU_ION_D, return the input STRU + return "STRU" elif calculation == "md": with open(logf) as f1: lines = f1.readlines() diff --git a/apex/core/calculator/lib/lammps_utils.py b/apex/core/calculator/lib/lammps_utils.py index 7bee6a3..f6baa57 100644 --- a/apex/core/calculator/lib/lammps_utils.py +++ b/apex/core/calculator/lib/lammps_utils.py @@ -1,15 +1,14 @@ #!/usr/bin/env python3 import os -import random -import subprocess as sp -import sys +import re import dpdata from dpdata.periodic_table import Element from packaging.version import Version from apex.core.lib import util +from apex.core.constants import PERIOD_ELEMENTS_BY_SYMBOL from dflow.python import upload_packages upload_packages.append(__file__) @@ -153,11 +152,20 @@ def inter_snap(param): def inter_gap(param): + init_string = param["init_string"] + atomic_num_list = param["atomic_num_list"] + if init_string is None: + with open(param["model_name"][0], "r") as fp: + xml_contents = fp.read() + init_string = re.search(r'label="([^"]*)"', xml_contents).group(1) + if atomic_num_list is None: + atomic_num_list = [PERIOD_ELEMENTS_BY_SYMBOL.index(e) + 1 for e in param["param_type"]] + ret = "" line = "pair_style quip \n" - line += "pair_coeff * * %s " % param["model_name"][0] - for ii in param["param_type"]: - line += ii + " " + line += f'pair_coeff * * {param["model_name"][0]} "Potential xml_label={init_string}" ' + for ii in atomic_num_list: + line += str(ii) + " " line += "\n" ret += line return ret diff --git a/apex/core/common_equi.py b/apex/core/common_equi.py index 4d672e2..fa6493c 100644 --- a/apex/core/common_equi.py +++ b/apex/core/common_equi.py @@ -85,9 +85,14 @@ def make_equi(confs, inter_param, relax_param): sys.to("abacus/stru", stru) else: raise FileNotFoundError("No file %s" % stru) + if not os.path.exists(os.path.join(ii, POSCAR)): + sys = dpdata.System(stru, fmt="abacus/stru") + sys.to("vasp/poscar", os.path.join(ii, POSCAR)) shutil.copyfile(stru, os.path.join(ii, "STRU.bk")) abacus_utils.modify_stru_path(stru, "pp_orb/", inter_param) + orig_poscar = poscar + orig_POSCAR = POSCAR poscar = os.path.abspath(stru) POSCAR = "STRU" if not os.path.exists(poscar): @@ -105,6 +110,8 @@ def make_equi(confs, inter_param, relax_param): if os.path.isfile(POSCAR): os.remove(POSCAR) os.symlink(os.path.relpath(poscar), POSCAR) + if inter_param["type"] == "abacus": + os.symlink(os.path.relpath(orig_poscar), orig_POSCAR) os.chdir(cwd) task_dirs.sort() # generate task files diff --git a/apex/core/constants.py b/apex/core/constants.py new file mode 100644 index 0000000..98abf14 --- /dev/null +++ b/apex/core/constants.py @@ -0,0 +1,14 @@ +PERIOD_ELEMENTS_BY_SYMBOL = [ + "H", "He", "Li", "Be", "B", "C", "N", "O", "F", "Ne", + "Na", "Mg", "Al", "Si", "P", "S", "Cl", "Ar", "K", "Ca", + "Sc", "Ti", "V", "Cr", "Mn", "Fe", "Co", "Ni", "Cu", "Zn", + "Ga", "Ge", "As", "Se", "Br", "Kr", "Rb", "Sr", "Y", "Zr", + "Nb", "Mo", "Tc", "Ru", "Rh", "Pd", "Ag", "Cd", "In", "Sn", + "Sb", "Te", "I", "Xe", "Cs", "Ba", "La", "Ce", "Pr", "Nd", + "Pm", "Sm", "Eu", "Gd", "Tb", "Dy", "Ho", "Er", "Tm", "Yb", + "Lu", "Hf", "Ta", "W", "Re", "Os", "Ir", "Pt", "Au", "Hg", + "Tl", 
"Pb", "Bi", "Po", "At", "Rn", "Fr", "Ra", "Ac", "Th", + "Pa", "U", "Np", "Pu", "Am", "Cm", "Bk", "Cf", "Es", "Fm", + "Md", "No", "Lr", "Rf", "Db", "Sg", "Bh", "Hs", "Mt", "Ds", + "Rg", "Cn", "Nh", "Fl", "Mc", "Lv", "Ts", "Og" +] diff --git a/apex/core/property/Elastic.py b/apex/core/property/Elastic.py index 531e556..1a59af5 100644 --- a/apex/core/property/Elastic.py +++ b/apex/core/property/Elastic.py @@ -33,7 +33,7 @@ def __init__(self, parameter, inter_param=None): self.shear_deform = parameter["shear_deform"] parameter.setdefault("conventional", False) self.conventional = parameter["conventional"] - parameter.setdefault("ieee", True) + parameter.setdefault("ieee", False) self.ieee = parameter["ieee"] parameter.setdefault("cal_type", "relaxation") self.cal_type = parameter["cal_type"] diff --git a/apex/core/property/Gamma.py b/apex/core/property/Gamma.py index c05ccbe..223743c 100644 --- a/apex/core/property/Gamma.py +++ b/apex/core/property/Gamma.py @@ -248,7 +248,7 @@ def make_confs(self, path_to_work, path_to_equi, refine=False): task_list.append(output_task) # print("# %03d generate " % ii, output_task) - logging.info(f"# {count} generate {output_task}, with{len(obtained_slab.sites)} atoms") + logging.info(f"# {count} generate {output_task}, with {len(obtained_slab.sites)} atoms") # make confs obtained_slab.to("POSCAR.tmp", "POSCAR") @@ -256,7 +256,7 @@ def make_confs(self, path_to_work, path_to_equi, refine=False): vasp_utils.sort_poscar("POSCAR", "POSCAR", ptypes) if self.inter_param["type"] == "abacus": abacus_utils.poscar2stru("POSCAR", self.inter_param, "STRU") - os.remove("POSCAR") + #os.remove("POSCAR") # vasp.perturb_xz('POSCAR', 'POSCAR', self.pert_xz) # record miller dumpfn(self.plane_miller, "miller.json") @@ -433,7 +433,7 @@ def __poscar_fix(self, poscar) -> None: def __stru_fix(self, stru) -> None: fix_dict = {"true": True, "false": False} - fix_xyz = [fix_dict[i] for i in self.addfix] + fix_xyz = [fix_dict[i] for i in self.add_fix] abacus_utils.stru_fix_atom(stru, fix_atom=fix_xyz) def __inLammpes_fix(self, inLammps) -> None: diff --git a/apex/core/property/Interstitial.py b/apex/core/property/Interstitial.py index 7bb8c50..1a5dcf9 100644 --- a/apex/core/property/Interstitial.py +++ b/apex/core/property/Interstitial.py @@ -318,7 +318,7 @@ def make_confs(self, path_to_work, path_to_equi, refine=False): bcc_interstital_dict = { 'tetrahedral': {chl: [0.25, 0.5, 0]}, 'octahedral': {chl: [0.5, 0.5, 0]}, - 'crowdion': {chl: [0.25, 0.25, 0]}, + 'crowdion': {chl: [0.25, 0.25, 0.25]}, '<111>dumbbell': {chl: [1 / 3, 1 / 3, 1 / 3], center: [2 / 3, 2 / 3, 2 / 3]}, '<110>dumbbell': {chl: [1 / 4, 3 / 4, 1 / 2], @@ -408,7 +408,7 @@ def make_confs(self, path_to_work, path_to_equi, refine=False): output_task = os.path.join(self.path_to_work, "task.%06d" % ii) os.chdir(output_task) abacus_utils.poscar2stru("POSCAR", self.inter_param, "STRU") - os.remove("POSCAR") + #os.remove("POSCAR") os.chdir(cwd) return self.task_list diff --git a/apex/core/property/Phonon.py b/apex/core/property/Phonon.py index 9d763ad..923d70f 100644 --- a/apex/core/property/Phonon.py +++ b/apex/core/property/Phonon.py @@ -55,7 +55,7 @@ def __init__(self, parameter, inter_param=None): self.BAND_POINTS = parameter["BAND_POINTS"] parameter["BAND_CONNECTION"] = parameter.get('BAND_CONNECTION', True) self.BAND_CONNECTION = parameter["BAND_CONNECTION"] - parameter["cal_type"] = parameter.get("cal_type", "relaxation") + parameter["cal_type"] = parameter.get("cal_type", "static") default_cal_setting = { "relax_pos": True, 
"relax_shape": False, @@ -485,6 +485,12 @@ def phonopy_band_string_2_band_list(band_str: str, band_label: str = None): # return type -> list[list[dict[Any, Any]]] return band_list + @staticmethod + def check_same_copy(src, dst): + if os.path.samefile(src, dst): + return + shutil.copyfile(src, dst) + def _compute_lower(self, output_file, all_tasks, all_res): cwd = Path.cwd() work_path = Path(output_file).parent.absolute() @@ -497,10 +503,9 @@ def _compute_lower(self, output_file, all_tasks, all_res): if not self.reprod: os.chdir(work_path) if self.inter_param["type"] == 'abacus': - shutil.copyfile("task.000000/band.conf", "band.conf") - shutil.copyfile("task.000000/STRU.ori", "STRU") - shutil.copyfile("task.000000/phonopy_disp.yaml", "phonopy_disp.yaml") - os.system('phonopy -f task.0*/OUT.ABACUS/running_scf.log') + self.check_same_copy("task.000000/band.conf", "band.conf") + self.check_same_copy("task.000000/STRU.ori", "STRU") + self.check_same_copy("task.000000/phonopy_disp.yaml", "phonopy_disp.yaml") os.system('phonopy -f task.0*/OUT.ABACUS/running_scf.log') if os.path.exists("FORCE_SETS"): print('FORCE_SETS is created') @@ -510,9 +515,8 @@ def _compute_lower(self, output_file, all_tasks, all_res): os.system('phonopy-bandplot --gnuplot band.yaml > band.dat') elif self.inter_param["type"] == 'vasp': - shutil.copyfile("task.000000/band.conf", "band.conf") - if not os.path.samefile("task.000000/POSCAR-unitcell", "POSCAR-unitcell"): - shutil.copyfile("task.000000/POSCAR-unitcell", "POSCAR-unitcell") + self.check_same_copy("task.000000/band.conf", "band.conf") + self.check_same_copy("task.000000/POSCAR-unitcell", "POSCAR-unitcell") if self.approach == "linear": os.chdir(all_tasks[0]) @@ -528,8 +532,8 @@ def _compute_lower(self, output_file, all_tasks, all_res): shutil.copyfile("band.dat", work_path/"band.dat") elif self.approach == "displacement": - shutil.copyfile("task.000000/band.conf", "band.conf") - shutil.copyfile("task.000000/phonopy_disp.yaml", "phonopy_disp.yaml") + self.check_same_copy("task.000000/band.conf", "band.conf") + self.check_same_copy("task.000000/phonopy_disp.yaml", "phonopy_disp.yaml") os.system('phonopy -f task.0*/vasprun.xml') if os.path.exists("FORCE_SETS"): print('FORCE_SETS is created') diff --git a/apex/core/property/Surface.py b/apex/core/property/Surface.py index 2ea3a5e..31420e9 100644 --- a/apex/core/property/Surface.py +++ b/apex/core/property/Surface.py @@ -184,7 +184,7 @@ def make_confs(self, path_to_work, path_to_equi, refine=False): vasp_utils.perturb_xz("POSCAR", "POSCAR", self.pert_xz) if self.inter_param["type"] == "abacus": abacus_utils.poscar2stru("POSCAR", self.inter_param, "STRU") - os.remove("POSCAR") + #os.remove("POSCAR") # record miller dumpfn(all_slabs[ii].miller_index, "miller.json") diff --git a/apex/core/property/Vacancy.py b/apex/core/property/Vacancy.py index d0a610e..7fbbc44 100644 --- a/apex/core/property/Vacancy.py +++ b/apex/core/property/Vacancy.py @@ -167,7 +167,7 @@ def make_confs(self, path_to_work, path_to_equi, refine=False): dss[ii].to("POSCAR", "POSCAR") if self.inter_param["type"] == "abacus": abacus_utils.poscar2stru("POSCAR", self.inter_param, "STRU") - os.remove("POSCAR") + #os.remove("POSCAR") # np.savetxt('supercell.out', self.supercell, fmt='%d') dumpfn(self.supercell, "supercell.json") os.chdir(cwd) diff --git a/apex/submit.py b/apex/submit.py index 9161d2b..3e2f935 100644 --- a/apex/submit.py +++ b/apex/submit.py @@ -32,7 +32,8 @@ def pack_upload_dir( upload_dir: os.PathLike, relax_param: dict, prop_param: dict, - 
flow_type: str + flow_type: str, + exclude_upload_files: List[str], ): """ Pack the necessary files and directories into temp dir and upload it to dflow @@ -75,9 +76,10 @@ def pack_upload_dir( backup_path(path_to_prop) """copy necessary files and directories into temp upload directory""" + exclude_upload_files.append("all_result.json") copy_all_other_files( work_dir, upload_dir, - exclude_files=["all_result.json"], + exclude_files=exclude_upload_files, include_dirs=list(include_dirs) ) for ii in conf_dirs: @@ -134,7 +136,8 @@ def submit( upload_dir=tmp_dir, relax_param=relax_param, prop_param=props_param, - flow_type=flow_type + flow_type=flow_type, + exclude_upload_files=wf_config.exclude_upload_files ) cwd = os.getcwd() @@ -201,6 +204,7 @@ def submit_workflow( tmp_work_dir = tempfile.TemporaryDirectory() config["mode"] = "debug" config["debug_workdir"] = config_dict.get("debug_workdir", tmp_work_dir.name) + logging.info(f'Debug mode activated, debug work directory: {config["debug_workdir"]}') s3_config["storage_client"] = None if flow_name: diff --git a/docs/images/apex_demo_high_0001.gif b/docs/images/apex_demo_high_0001.gif new file mode 100644 index 0000000..b105035 Binary files /dev/null and b/docs/images/apex_demo_high_0001.gif differ diff --git a/setup.py b/setup.py index ea57c55..7e5371a 100644 --- a/setup.py +++ b/setup.py @@ -5,7 +5,7 @@ setuptools.setup( name="apex-flow", - version="1.2.6", + version="1.2.9", author="Zhuoyuan Li, Tongqi Wen", author_email="zhuoyli@outlook.com", description="Alloy Properties EXplorer using simulations", diff --git a/tests/test_abacus_property.py b/tests/test_abacus_property.py index c2ff548..5f195c9 100644 --- a/tests/test_abacus_property.py +++ b/tests/test_abacus_property.py @@ -144,8 +144,8 @@ def test_make_property_elastic(self): os.remove( os.path.realpath(os.path.join(self.equi_path, "OUT.ABACUS", "STRU_ION_D")) ) - with self.assertRaises(RuntimeError): - elastic.make_confs(work_path, self.equi_path, refine=False) + #with self.assertRaises(RuntimeError): + # elastic.make_confs(work_path, self.equi_path, refine=False) def test_make_property_elastic_post_process(self): property = {"type": "elastic", "norm_deform": 1e-2, "shear_deform": 1e-2}
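The `exclude_upload_files` option introduced in `apex/config.py` and threaded through `pack_upload_dir()` in `apex/submit.py` above lets users keep selected files out of the temporary upload directory (APEX itself always appends `all_result.json` to the exclusion list). A hedged sketch of how it might be set in a global config file is given below; the file names are illustrative, and how entries are matched is determined by `copy_all_other_files()`, which is not shown in this diff:
```json
{
    "apex_image_name": "zhuoyli/apex_amd64",
    "run_image_name": "zhuoyli/apex_amd64",
    "run_command": "lmp -in in.lammps",
    "batch_type": "Shell",
    "context_type": "Local",
    "local_root": "./",
    "remote_root": "/some/path/not/under/pwd/",
    "exclude_upload_files": ["frozen_model_backup.pb", "notes.md"]
}
```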