Releases: simonprovost/Auto-Sklong

🎉 Minor Update with Package Updates & Stability Enhancements

15 Jan 20:43

[v0.0.4] - 2025-01-15 – Stability Enhancements & Package Updates

We’re excited to introduce Auto-Sklong v0.0.4! While this release is minor, it keeps our dependencies up to date, most notably by upgrading Scikit-Longitudinal to version 0.0.7. This upgrade brings compatibility improvements and prepares the foundation for better performance in longitudinal machine learning tasks.

Highlights

  1. Package Updates

    • Updated Scikit-Longitudinal to v0.0.7 to incorporate compatibility adjustments for the recent migration to uv (a quick version check is sketched after this list).
  2. Stability Improvements

    • Minor bug fixes and adjustments to ensure smoother workflows, especially with updated dependencies.
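
If you want to confirm the dependency bump locally, a quick check with the standard library is enough. This is only a sketch: the distribution names below are assumed to match the PyPI package names ("Auto-Sklong", "Scikit-longitudinal"), and the expected version strings are the ones announced in this release.

    # Quick sanity check after upgrading -- Python standard library only.
    # Distribution names are assumed to match the PyPI package names.
    from importlib.metadata import version

    print(version("Auto-Sklong"))          # expected: 0.0.4
    print(version("Scikit-longitudinal"))  # expected: 0.0.7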

As always, thank you for your support and contributions! Let’s keep pushing forward together! 🚀

Previously in v0.0.3

v0.0.3 migrated the development workflow from PDM to uv and shipped several documentation improvements; see the full v0.0.3 release notes below.

🎄 Migration to uv & Documentation Improvements

31 Dec 00:17

[v0.0.3] - 2024-12-31 – Migration to uv & Documentation Improvements

We are glad to present Auto-Sklong v0.0.3. This new release focuses on streamlining the development workflow by migrating from PDM to uv, which speeds up installation and reduces complexity. Our documentation has also been updated, with clarifications to the Quick Start and new sections for Apple Silicon Mac users. Most excitingly, our paper has been accepted to the IEEE BIBM 2024 conference: stay tuned for the BibTeX citation and additional publication details when the proceedings are posted!

Highlights

  1. Migration from PDM to uv

    • Far simpler commands and less setup configuration for the community.
    • Substantial speed improvements, as shown in this benchmark comparison of uv against Poetry, PDM, and pip-sync.
  2. Documentation Enhancements

    • Quick Start Fixes: Thanks to @anderdnavarro (in #4, #5, #6) for correcting parameter names in the Quick Start feature list examples.
    • Paper Acceptance: Our paper on Auto-Sklong has been accepted to the 2024 IEEE BIBM Conference. We will add the BibTeX reference once the proceedings are finalised.
    • Apple Silicon Installation Guide: The Quick Start now includes a dedicated section for installing Auto-Sklong on Apple Silicon-based Macs, making setup for M1/M2 systems more transparent and accessible. Thanks once more to @anderdnavarro for pointing that out!

Future Work

  • BibTeX Citation: We will add a citation reference for our BIBM 2024 paper as soon as the proceedings are publicly available.
  • Documentation: The experiments paper section will be simplified, and the redundant "Release History" tab in the documentation will be removed.
  • Examples: We aim to launch a comprehensive Jupyter notebook tutorial to demonstrate how to use Auto-Sklong.

As always, thank you for your continued support. Let’s keep exploring the boundaries of longitudinal machine learning!

Merry XMas! 🎄

Previously in v0.0.2

We are pleased to announce that Auto-Sklong is now available in its first public release under the tag 0.0.2, despite numerous PyPI misadventures (lesson learned, PyPI-Tests). 🎉

About Auto-Sklong

Auto-Sklong is built on @PGijsbers’ General Automated Machine Learning (AutoML) Assistant (GAMA) framework—a flexible AutoML framework for experimenting with different search strategies and a customisable search space. We began improving GAMA locally for our own goals of tackling longitudinal machine learning tasks, resulting in Auto-Sklong. While it remains an AutoML system, it offers new features such as:

  • A sequential search space via ConfigSpace.
  • Bayesian optimisation using SMAC3.
  • Additional built-in features inherited from GAMA.

Key Features in v0.0.2

  • New Search Space: ConfigSpace-supported search space.
  • New Search Method: Bayesian Optimisation via SMAC3.
  • Documentation: Comprehensive new docs (Material for MkDocs), including tutorials on longitudinal data and usage examples.
  • PyPI Availability: Auto-Sklong is now published on PyPI.
  • Continuous Integration: Streamlined CI pipeline for building, testing, and publishing.

Next Steps

  • Finalise PRs on GAMA to align with Auto-Sklong.
  • Add real-world examples and Jupyter notebooks to help users adopt the library.
  • Continue refining the library and documentation.

Note: No tag 0.0.1 will ever be available.

🎉 First Public Release on GitHub & PyPI

12 Jul 13:16

We are pleased to announce that Auto-Sklong is now available in its first public release under the tag 0.0.2, despite numerous PyPI misadventures (lesson learned, PyPI-Tests). 🎉

📽️ Auto-Sklong is built on @PGijsbers' General Automated Machine Learning (AutoML) Assistant (GAMA) framework, a flexible AutoML framework for experimenting with different search strategies and a customisable search space, among other features. We began using and improving GAMA locally for our own goal of tackling longitudinal machine learning tasks via AutoML, and Auto-Sklong grew out of that work. While Auto-Sklong remains an AutoML system, its goal differs from GAMA's; the improvements we made to GAMA along the way were generalised back to GAMA itself, and we have submitted three pull requests (see our README for details).

💡 Auto-Sklong introduces a completely new, sequential search space built on ConfigSpace, along with a new search method, Bayesian optimisation via SMAC3. It also retains all of GAMA's built-in features, such as its other search methods; see the Auto-Sklong and GAMA documentation for details. The end result is that Auto-Sklong can solve both (1) the longitudinal machine learning task, by modelling the temporal dependencies in the dataset through Sklong, and (2) Combined Algorithm Selection and Hyperparameter optimisation (CASH). A brief usage sketch follows.
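
To make the above concrete, here is a minimal usage sketch in the spirit of the Quick Start. It is illustrative only: the helper LongitudinalDataset, the estimator name GamaLongitudinalClassifier, and the parameters features_group, non_longitudinal_features, feature_list_names, and max_total_time are assumptions drawn from the Quick Start and GAMA conventions; the documentation remains the authoritative reference.

    # Minimal sketch (not the official Quick Start) of running Auto-Sklong on a
    # longitudinal classification task. Module, class, and parameter names are
    # assumptions; consult the Auto-Sklong documentation for the exact interface.
    from scikit_longitudinal.data_preparation import LongitudinalDataset  # assumed Sklong helper
    from gama import GamaLongitudinalClassifier  # assumed estimator exposed by Auto-Sklong

    dataset = LongitudinalDataset("./data/stroke_longitudinal.csv")  # hypothetical dataset path
    dataset.load_data_target_train_test_split(target_column="stroke_wave_2")  # hypothetical target
    dataset.setup_features_group("elsa")  # groups features by wave; the "elsa" preset is an assumption

    automl = GamaLongitudinalClassifier(
        features_group=dataset.feature_groups(),                        # temporal feature groups (waves)
        non_longitudinal_features=dataset.non_longitudinal_features(),  # static features (e.g. sex)
        feature_list_names=dataset.data.columns.tolist(),               # column names for the search space
        max_total_time=600,                                             # GAMA-style time budget, in seconds
    )
    automl.fit(dataset.X_train, dataset.y_train)
    print(automl.predict(dataset.X_test)[:5])  # first few predictions on the held-out split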

Our paper has been submitted to a conference; this note will be updated if it is accepted.

🫵 https://pypi.org/project/Auto-Sklong/0.0.2

[v0.0.2] - 2024-07-12 - First Public Release

Added

  • New Search Space: ConfigSpace-supported search space via GAMA; a pull request is ongoing on the original repository (see the sketch after this list).
  • New Search Method: Bayesian optimisation via SMAC3 is now available; a pull request is ongoing on the original GAMA repository.
  • Documentation: Comprehensive new documentation built with Material for MkDocs, including a detailed tutorial on understanding vectors of waves in longitudinal datasets, a contribution guide, an FAQ section, and complete API references that draw heavily on the Sklong and GAMA documentation to guide users.
  • PyPI Availability: Auto-Sklong is now available on PyPI.
  • Continuous Integration: Integrated unit testing, documentation, and PyPI publishing within the CI pipeline.
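
To give a flavour of what a ConfigSpace-backed, conditional ("sequential") search space looks like, here is a small generic sketch. It is not Auto-Sklong's actual search space: the hyperparameter names (classifier, n_estimators, max_depth) and the estimator choices are hypothetical, and only the ConfigSpace API calls themselves are real.

    # Generic ConfigSpace sketch of a conditional search space, where some
    # hyperparameters only become active once an earlier choice has been made.
    # This is NOT Auto-Sklong's real search space; names are illustrative only.
    from ConfigSpace import ConfigurationSpace
    from ConfigSpace.conditions import EqualsCondition
    from ConfigSpace.hyperparameters import (
        CategoricalHyperparameter,
        UniformIntegerHyperparameter,
    )

    cs = ConfigurationSpace(seed=42)
    classifier = CategoricalHyperparameter(
        "classifier", ["lexico_decision_tree", "lexico_random_forest"]  # hypothetical estimator names
    )
    n_estimators = UniformIntegerHyperparameter("n_estimators", 10, 500)
    max_depth = UniformIntegerHyperparameter("max_depth", 2, 30)
    cs.add_hyperparameters([classifier, n_estimators, max_depth])
    # n_estimators is only searched when the random-forest branch is selected,
    # which is what makes the space conditional rather than flat.
    cs.add_condition(EqualsCondition(n_estimators, classifier, "lexico_random_forest"))

    print(cs.sample_configuration())  # draws one configuration that respects the condition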

To-Do

  • Finalize PRs on GAMA: The ongoing pull requests on GAMA will align Auto-Sklong with GAMA's latest version. Once they are finished and merged, we can make the compatibility adjustments between the two libraries needed for Auto-Sklong's long-term goal of benefiting from any future GAMA features.
  • Future Enhancements: Ongoing improvements and new features as they are identified.
  • Documentation Examples: Add examples to the documentation to help users understand how to use the library with Jupyter notebooks.

Note: No tag 0.0.1 will ever be available.