Releases
v0.8.7
What's Changed
New Features
[CoreEngine/MLOps] Added support for LLM record logging.
[Serving] Made the DeepSpeed inference backend work.
[CoreEngine/DevOps] Enabled scheduling of the public cloud server onto specific nodes.
[DevOps] Added the fedml light Docker image and related documentation.
[DevOps] Built and pushed light Docker images and added the related pipelines.
[CoreEngine] Added a timestamp when reporting system metrics.
[DevOps] Made the serving k8s cluster work with the latest images and updated related chart files.
[CoreEngine] Added the skip_log_model_net option for LLM training.
[CoreEngine/CrossSilo] Supported customized hierarchical cross-silo.
[Serving] Created a default model config and README file when the user does not provide model config and README options while creating a model card.
[Serving] Allowed users to customize their token for endpoint and inference requests (see the request sketch after this list).
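With customizable tokens, an inference call against a deployed endpoint needs to carry the user-defined token. Below is a minimal sketch of such a request, assuming a bearer-token header; the endpoint URL, token value, and payload shape are placeholders, not the actual FedML API.

```python
import requests

# Placeholder endpoint URL, token, and payload shape; the real values come
# from the deployed endpoint's details.
ENDPOINT_URL = "https://example.com/inference/api/v1/predict"
CUSTOM_TOKEN = "my-custom-endpoint-token"

response = requests.post(
    ENDPOINT_URL,
    headers={"Authorization": f"Bearer {CUSTOM_TOKEN}"},
    json={"inputs": "What is federated learning?"},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```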
Bug Fixes
[CoreEngine] Fixed compatibility issues when opening subprocesses on Windows.
[CoreEngine] Fixed the issue where MPI mode does not have client rank -1.
[CoreEngine] Set the Python interpreter based on the currently running Python version.
[CoreEngine] Fixed the failure to verify pip's SSL certificate when checking OTA versions.
[CrossDevice] Fixed issues where test metrics were reported twice to MLOps and loss metrics were clipped to integers on the Beehive platform.
[App] Fixed issues when installing flamby on the heart-disease app.
[CoreEngine] Added a handler for cases where UTF-8 cannot decode the output and error strings (see the decode sketch after this list).
[App] Fixed scripts and requirements on the FedNLP app.
[CoreEngine] Fixed issues where FileExistsError was triggered by os.makedirs calls (see the makedirs sketch after this list).
[Serving] Changed the model url to open.fedml.ai.
[Serving] Fixed the OnnxExporterError issue and added ONNX as a default dependency when installing fedml.
[Serving] Fixed the issue where the local package name differed from the one shown in the MLOps UI.
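For the UTF-8 decode handler, the usual approach is a lossy fallback decode instead of raising UnicodeDecodeError. A minimal sketch of that pattern, assuming subprocess output captured as bytes; the run_and_decode helper is hypothetical, not the actual FedML code.

```python
import subprocess

def run_and_decode(cmd: list[str]) -> tuple[str, str]:
    # Capture raw bytes, then decode with errors="replace" so that non-UTF-8
    # sequences become U+FFFD instead of raising UnicodeDecodeError.
    proc = subprocess.run(cmd, capture_output=True)
    out = proc.stdout.decode("utf-8", errors="replace")
    err = proc.stderr.decode("utf-8", errors="replace")
    return out, err

stdout_text, stderr_text = run_and_decode(["python", "--version"])
```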
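The os.makedirs fix most plausibly uses the standard exist_ok=True idiom so repeated creation is idempotent. A minimal sketch; the ensure_dir helper and path are hypothetical.

```python
import os

def ensure_dir(path: str) -> None:
    # exist_ok=True suppresses FileExistsError when the directory has already
    # been created, e.g. by an earlier run or a concurrent process.
    os.makedirs(path, exist_ok=True)

ensure_dir("/tmp/fedml/logs")
```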
Enhancements
[Serving] Established the container based on the user's config and improved code readability.