Migrate build data to pyproject.toml #1078
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
pyproject.toml (Outdated)
    "Topic :: Scientific/Engineering :: Artificial Intelligence"
]
dependencies = [
    "torch",
Don't we need any version constraint at all? Maybe at least a minimum version?
I do think we should have at least minimums, but I'm not sure what they should be. I took these runtime dependencies out of setup.py just as they were.
If it were up to me, I would pin torch>=1.13.0 on the low end for all platforms/backends, assuming everything works. Why?
- Oldest release which did not ship any wheels with CUDA < 11
- Migrated to CUDA 11.6 and 11.7, which is right in line with where I think the bitsandbytes cutoff should be [1]
- Apple silicon support (MPS) was promoted from experimental/prototype to beta
- The 🤗 Accelerate docs "strongly recommend" torch>=1.13 for MPS
- AWS has HuggingFace containers, with the oldest having torch==1.13.1+cu117 on py3.9; in general, all of the Python 3.9+ containers with PyTorch appear to be on torch>=1.13.0
- Azure Container for PyTorch still has a 1.13 configuration
But it could also be a little more permissive for some environments; e.g. transformers recently adopted a constraint of torch>=1.11,!=1.12.0 in huggingface/transformers#28207, which was also adopted in UKPLab/sentence-transformers#2432.
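To make the proposal concrete, here is a minimal, hypothetical fragment of the [project] table in pyproject.toml with the floor suggested above. This is a sketch only: the exact bound was still under discussion in this thread, and numpy is left unpinned just as it was in setup.py.

```toml
# Hypothetical sketch of the proposed floor, not necessarily what this PR shipped.
[project]
dependencies = [
    "torch>=1.13.0",  # proposed minimum: first release shipping only CUDA >= 11 wheels, MPS in beta
    "numpy",          # carried over unpinned from the old install_requires
]
```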
IMHO (and I recognize this might be a bit controversial), we should set the minimum to what we test on, and then pin to those versions when we test. I don't think we should overstretch to keep supporting old versions of dependencies at the expense of stability or complexity. So ideally we set the minimum to what we currently build on, with either no upper bound or < the next major version.
As a user of any package, I prefer to know what works and what doesn't, rather than having to be the guinea pig.
But this is my personal opinion; not sure what the rest of you think.
Ideally we would have a CI setup that could test a wider matrix so we can be confident about how we pin these, for sure.
For me, part of the point of this bitsandbytes project is to try to make things accessible for a wide range of users: across hardware, software, and user expertise. Since many people, particularly in academia, don't have much control over things like OS, Python version, or even torch version, we've got to support a large range in order to make things easy for users. That obviously comes at the cost of maintenance; we need to find a good balance here.
So at least with this change it's being restricted to torch>=1.11,!=1.12.0, which is in line with transformers. I don't know that all of those versions of torch will work, but it's an incremental step forward compared to install_requires=["torch", "numpy"], and given the lack of issues around torch versions, it does seem likely that it will work.
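For clarity, the PEP 440 specifier adopted here reads as "any torch from 1.11 onward, except exactly 1.12.0", so 1.12.1 and later remain installable. As a pyproject.toml fragment it would look roughly like this (formatting is illustrative):

```toml
dependencies = [
    "torch>=1.11,!=1.12.0",  # floor of 1.11; only the 1.12.0 release itself is excluded
    "numpy",                 # still unpinned, as in the old setup.py
]
```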
Co-authored-by: Aarni Koskela <[email protected]>
Co-authored-by: Aarni Koskela <[email protected]>
This reverts commit d23be94.
Rebased
This is really weird; somehow the diff of the PR got all messed up now, which makes it really hard to review. It seems to me this PR is otherwise ready for review, right?
Not sure what happened there, but I think it should look more reasonable now. Apart from that, yes, it's ready.
Force-pushed from 9b72679 to 7800734
Superseded by #1373.
- pyproject.toml, conforming with PEP 517, PEP 518, and PEP 621 (see the sketch below)
- requirements.txt files (but not the conda environment.yml ones just yet)
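For reference, PEP 517/518 conformance means pyproject.toml also carries a [build-system] table. The actual backend and pins used by this PR are not shown in this thread, so the following is only an assumed sketch for a setuptools-based build (setuptools >= 61 is the first release supporting PEP 621 metadata):

```toml
# Assumed [build-system] table; the PR's real backend/requirements may differ.
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"
```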