LRU embedding net requires very new torch version #1537
Comments
One could add a `pytest.mark.skipif` which is true if the torch version is not >= 2.5, plus a user warning or error when the scan mode of the `LRUEmbedding` is used; the other mode should work as expected. I can't promise to be able to do this until next week. Please excuse the brevity, I'm on the phone.
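The version guard suggested above could be sketched like this (a hedged sketch: the helper name, test name, and import path are illustrative, not sbi's actual identifiers):

```python
# Minimal sketch of a torch-version gate for scan-mode LRU tests.
# `supports_associative_scan` and `test_lru_embedding_scan` are
# hypothetical names used for illustration only.
from packaging.version import Version


def supports_associative_scan(torch_version: str) -> bool:
    """True if this torch version is at least 2.5 (needed for scan mode)."""
    return Version(torch_version).release >= (2, 5)


# In the test module, one could then write something like:
# @pytest.mark.skipif(
#     not supports_associative_scan(torch.__version__),
#     reason="LRU scan mode requires torch>=2.5",
# )
# def test_lru_embedding_scan():
#     ...
```

Using `packaging.version.Version` avoids the pitfalls of comparing version strings lexicographically (e.g. `"2.10" < "2.5"` as strings).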
Some context: the LRU recurrence can be run in two modes, either using a for loop (the default) or using associative/parallel scans, which are potentially much faster for long time series because they scale as log(sequence length). However, even in for-loop mode the LRUs should be very useful for embedding time series (and probably still a lot faster than many other RNNs). The associative scan function is still in development in PyTorch; in fact, gradients are not yet supported (but will be very soon, potentially merged in the coming days). I think @famura's suggestion makes sense for now!
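The user-warning part of the suggestion could look roughly like the following (a hedged sketch, assuming a `mode` string argument and a fallback to the for-loop recurrence; the function and mode names are illustrative, not the actual `LRUEmbedding` API):

```python
# Hypothetical runtime guard: fall back to the for-loop recurrence with
# a warning when the installed torch is too old for the scan path.
import warnings

from packaging.version import Version


def resolve_lru_mode(mode: str, torch_version: str) -> str:
    """Return the LRU mode to actually use for the given torch version."""
    if mode == "scan" and Version(torch_version).release < (2, 5):
        warnings.warn(
            "Associative scan requires torch>=2.5; falling back to the "
            "for-loop recurrence.",
            UserWarning,
        )
        return "loop"
    return mode
```

Raising an error instead of warning-and-falling-back would be a one-line change; the trade-off is loud failure versus degraded-but-working embeddings.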
If we know the type of error that will be raised, it makes sense to use a […] so that we will get an […].
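One reading of this suggestion (the specific helper was lost in the page export) is to assert on the concrete error type, so the test documents the incompatibility instead of failing opaquely. A hedged sketch with illustrative names:

```python
# Hypothetical test asserting the specific error raised on old torch.
# `build_scan_embedding` stands in for constructing an LRUEmbedding in
# scan mode; the real error type and message may differ.
import pytest


def build_scan_embedding():
    """Stand-in that mimics scan-mode construction on torch < 2.5."""
    raise NotImplementedError("associative scan requires torch>=2.5")


def test_scan_mode_raises_on_old_torch():
    with pytest.raises(NotImplementedError, match="torch>=2.5"):
        build_scan_embedding()
```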
🐛 Bug Description
Our test suite fails for torch `v2.4.0`, because the LRU requires at least `v2.5.0`. I am not sure how to best handle this, as I would prefer not to pin torch to such a new version. Together with #1380, this would make `torch=2.5.0` the only compatible version.

To reproduce:
The error is
(and many other LRU tests with the same error message)
Tagging @famura @Matthijspals FYI.