
ci: 👷 release,test-release,dep actions and pyproject.toml added #43

Merged · 7 commits · Sep 16, 2024

Conversation

@onuralpszr (Collaborator) commented Sep 15, 2024

Description

  • pyproject.toml added (PyPA-recommended packaging, well tested)
  • Dependabot configuration added
  • Release action added
  • Test Release action added (rc, a, b pre-release tags)
  • tox.ini added, and tox listed as a development dependency
  • GitHub Action for macOS (pytest) added
  • Maestro version command added
  • pyproject.toml exclude list updated
  • License updated from MIT to Apache
  • Project URLs added
  • Keywords added
  • Excluded-files list updated
  • CHANGELOG.md added
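
The packaging changes above can be sketched as a minimal pyproject.toml fragment. This is illustrative only: the section layout follows the standard PyPA metadata spec, but the keywords, URLs, and exclude patterns below are placeholders, not the merged file's contents:

```toml
[build-system]
requires = ["setuptools>=64", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "maestro"
license = { file = "LICENSE" }            # Apache, after the MIT -> Apache switch
keywords = ["multimodal", "vision"]       # placeholder keywords

[project.urls]
Homepage = "https://github.com/roboflow/maestro"  # placeholder URL

[project.optional-dependencies]
dev = ["tox", "pytest"]                   # tox shipped as a development dependency

[tool.setuptools.packages.find]
exclude = ["test*", "docs*"]              # placeholder exclude list
```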

Signed-off-by: Onuralp SEZER <thunderbirdtr@gmail.com>
@CLAassistant commented Sep 15, 2024

CLA assistant check
All committers have signed the CLA.

@onuralpszr (Collaborator, Author):

Current test results

  py39: OK (66.17=setup[32.69]+cmd[33.48] seconds)
  py310: OK (57.87=setup[26.58]+cmd[31.29] seconds)
  py311: OK (55.35=setup[25.66]+cmd[29.69] seconds)
  py312: OK (61.36=setup[30.26]+cmd[31.10] seconds)
  congratulations :) (240.78 seconds)

It also confirms that installation works properly.
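
The matrix above corresponds to a tox.ini along these lines. The env list is inferred from the output; the deps and commands are assumptions, not the merged configuration:

```ini
[tox]
envlist = py39, py310, py311, py312

[testenv]
deps = pytest
commands = pytest
```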

@onuralpszr force-pushed the ci/pypa/dep branch 2 times, most recently from e38869b to 4110122 on September 15, 2024 18:22
@onuralpszr onuralpszr marked this pull request as ready for review September 15, 2024 18:30
@onuralpszr (Collaborator, Author) commented Sep 15, 2024

Hi @SkalskiP 👋,

I've completed most of the actions for Maestro and replaced the setup.py and requirements.txt files with a pyproject.toml, leveraging setuptools. Additionally, I wrote actions for both macOS and Ubuntu to ensure that all installations and tests pass.

I manually added a prebuilt FlashAttention wheel for a clean installation in the test CI, because the PyPI release of flash-attention only provides a source tarball that requires compilation. This complicates a one-liner installation (pip install maestro), since users would need to either set up a proper build environment or install flash-attention manually alongside Torch.

  • Do you think we should consider:
    • making FlashAttention an optional dependency?
    • creating Docker images for ease of use (pre-installed, pre-configured environments for prod/train/dev)?
  • Do we plan to implement versioning for the documentation?
  • Do we plan to add Windows to the CI tests?
  • I haven't touched pre-commit yet, but do we plan to adopt a ruff/pre-commit configuration similar to the one I made in supervision, with a longer line limit (I see 120), to align both projects? And should we apply it in the same PR?
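
One way to make FlashAttention optional, as floated above, is a pyproject extra. This is a sketch; the extra name "flash-attn" and the version bound are hypothetical, not an agreed design:

```toml
[project.optional-dependencies]
flash-attn = ["flash-attn>=2.6"]
```

Users who want the accelerated path would then run `pip install "maestro[flash-attn]"`, while the plain `pip install maestro` stays a one-liner with no compilation step.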

- 📝 Added a new CHANGELOG.md file to document changes.
- 📝 Updated README.md with new information or corrections.
- 🔧 Modified pyproject.toml to update project URLs and correct the exclude list
- 🔧 Package keywords added to pyproject.toml

Signed-off-by: Onuralp SEZER <thunderbirdtr@gmail.com>
@onuralpszr (Collaborator, Author):

Wheel package content list:

```text
adding 'maestro/__init__.py'
adding 'maestro/primitives.py'
adding 'maestro/visualizers.py'
adding 'maestro/cli/__init__.py'
adding 'maestro/cli/env.py'
adding 'maestro/cli/introspection.py'
adding 'maestro/cli/main.py'
adding 'maestro/cli/utils.py'
adding 'maestro/lmms/__init__.py'
adding 'maestro/lmms/gpt4.py'
adding 'maestro/markers/__init__.py'
adding 'maestro/markers/sam.py'
adding 'maestro/postprocessing/__init__.py'
adding 'maestro/postprocessing/mask.py'
adding 'maestro/postprocessing/text.py'
adding 'maestro/trainer/__init__.py'
adding 'maestro/trainer/common/__init__.py'
adding 'maestro/trainer/common/configuration/__init__.py'
adding 'maestro/trainer/common/configuration/env.py'
adding 'maestro/trainer/common/data_loaders/__init__.py'
adding 'maestro/trainer/common/data_loaders/datasets.py'
adding 'maestro/trainer/common/data_loaders/jsonl.py'
adding 'maestro/trainer/common/utils/__init__.py'
adding 'maestro/trainer/common/utils/file_system.py'
adding 'maestro/trainer/common/utils/leaderboard.py'
adding 'maestro/trainer/common/utils/metrics.py'
adding 'maestro/trainer/common/utils/reproducibility.py'
adding 'maestro/trainer/models/__init__.py'
adding 'maestro/trainer/models/florence_2/__init__.py'
adding 'maestro/trainer/models/florence_2/checkpoints.py'
adding 'maestro/trainer/models/florence_2/core.py'
adding 'maestro/trainer/models/florence_2/data_loading.py'
adding 'maestro/trainer/models/florence_2/entrypoint.py'
adding 'maestro/trainer/models/florence_2/metrics.py'
adding 'maestro/trainer/models/paligemma/__init__.py'
adding 'maestro/trainer/models/paligemma/entrypoint.py'
adding 'maestro/trainer/models/paligemma/training.py'
adding 'maestro-0.2.0rc3.dist-info/LICENSE'
adding 'maestro-0.2.0rc3.dist-info/METADATA'
adding 'maestro-0.2.0rc3.dist-info/WHEEL'
adding 'maestro-0.2.0rc3.dist-info/entry_points.txt'
adding 'maestro-0.2.0rc3.dist-info/top_level.txt'
adding 'maestro-0.2.0rc3.dist-info/RECORD'
```
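
Since a wheel is just a zip archive, the listing above can be reproduced programmatically. A minimal sketch (the helper name is mine, not part of maestro):

```python
import zipfile


def wheel_contents(path: str) -> list[str]:
    """Return the member names of a wheel; a .whl is a plain zip archive."""
    with zipfile.ZipFile(path) as whl:
        return whl.namelist()
```

Pointing this at the built maestro-0.2.0rc3 wheel yields the file list above; `python -m zipfile -l <wheel>` gives the same view from the shell.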

```shell
pip install torch
if [[ "${{ matrix.os }}" == "ubuntu-latest" ]]; then
  if [[ "${{ matrix.python-version }}" == "3.10" ]]; then
    pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu123torch2.4cxx11abiTRUE-cp310-cp310-linux_x86_64.whl
  fi
fi
```
Collaborator:
Will this break when torch is bumped to the next version, given that the flash-attn wheel is pinned to torch 2.4?

@onuralpszr (Author):

Yes, we would need to update this when torch is bumped; alternatively, I can pin PyTorch here and we update both together whenever we choose. Source installation was breaking in the action, and the flash-attention documentation requires additional packages and the CUDA toolkit to be present; the prebuilt wheels also carry their own version locks, as you can see in the wheel filename.

@SkalskiP SkalskiP merged commit 1f7c5af into develop Sep 16, 2024
9 checks passed
4 participants