Unable to get GPU solver working #381
Comments
Thanks for reporting this, @taDachs. This should be fixed. To use the GPU features, please try ExaModels. If you want to use GPU features with JuMP, one option is the experimental JuMP interface, but this might be less stable/efficient.
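For reference, a minimal sketch of the ExaModels route could look like the following (assuming MadNLPGPU, CUDA.jl, and a CUDA-capable device; the toy model here is purely illustrative and not the one from this issue):

```julia
# Minimal sketch: build an ExaModels model on the GPU and solve it with MadNLP.
# Assumes MadNLPGPU and CUDA.jl are installed and a CUDA device is available.
using ExaModels, MadNLP, MadNLPGPU, CUDA

# Passing a CUDA backend to ExaCore keeps the model data and derivative
# evaluations on the device.
c = ExaCore(; backend = CUDABackend())

# Illustrative toy problem (not from this issue): quadratic objective with
# simple equality constraints.
N = 10
x = variable(c, N; start = (0.5 for i in 1:N))
objective(c, (x[i] - 1)^2 for i in 1:N)
constraint(c, x[i] + x[i+1] - 1 for i in 1:N-1)   # x[i] + x[i+1] == 1

m = ExaModel(c)

# With MadNLPGPU loaded, madnlp uses GPU-capable linear solvers for device models.
result = madnlp(m)
```

Building the model through `ExaCore` with a GPU backend keeps the derivative evaluations on the device, which is what MadNLP's GPU linear solvers expect.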
@sshin23 can you elaborate on the scope of the fix? Will just the documentation be updated to state that ExaModels is needed for the GPU features, or will there be native support for JuMP so that JuMP computes the derivatives? JuMP can also use ASL for derivatives, as well as experimental symbolic derivatives. To my understanding, cuDSS is a linear solver, and the derivatives (Jacobian and Hessian) could be provided by tools other than ExaModels.
You may wrap the NLP model so that the AD takes place in host memory and the results are then sent to device memory. Please check https://github.com/exanauts/ExaModels.jl/blob/main/src/utils.jl#L5-L120
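A rough sketch of that wrapping pattern (illustrative only; `HostADWrapper` and its fields are made-up names, only the objective and gradient are shown, and the real implementation is the linked `utils.jl`) could look like:

```julia
# Sketch of a host-AD wrapper: the inner CPU model does the derivative
# evaluations on host buffers, and the results are copied into device arrays.
using NLPModels, CUDA

struct HostADWrapper{T, VT <: CuVector{T}, M <: AbstractNLPModel{T, Vector{T}}} <:
        AbstractNLPModel{T, VT}
    inner::M                  # CPU model that performs the AD
    meta::NLPModelMeta{T, VT} # metadata with device-typed vectors
    counters::Counters
    x_host::Vector{T}         # host buffer for the iterate
    g_host::Vector{T}         # host buffer for the gradient
end

function HostADWrapper(inner::AbstractNLPModel{T, Vector{T}}) where {T}
    im = inner.meta
    # Rebuild the metadata with device storage so the solver sees a GPU model.
    meta = NLPModelMeta(
        im.nvar;
        ncon = im.ncon,
        x0   = CuVector(im.x0),
        lvar = CuVector(im.lvar),
        uvar = CuVector(im.uvar),
        lcon = CuVector(im.lcon),
        ucon = CuVector(im.ucon),
        nnzj = im.nnzj,
        nnzh = im.nnzh,
        minimize = im.minimize,
    )
    return HostADWrapper(inner, meta, Counters(),
                         Vector{T}(undef, im.nvar), Vector{T}(undef, im.nvar))
end

function NLPModels.obj(m::HostADWrapper, x::CuVector)
    copyto!(m.x_host, x)             # device -> host
    return obj(m.inner, m.x_host)    # evaluate on the host
end

function NLPModels.grad!(m::HostADWrapper, x::CuVector, g::CuVector)
    copyto!(m.x_host, x)                 # device -> host
    grad!(m.inner, m.x_host, m.g_host)   # AD on the host
    copyto!(g, m.g_host)                 # host -> device
    return g
end

# Constraints, Jacobian, and Hessian follow the same copy-evaluate-copy pattern.
```

This leaves the host-side AD (JuMP, ASL, etc.) unchanged while still letting the solver run its linear algebra on the GPU, at the cost of host-device transfers at every evaluation.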
Hey, I tried following the quickstart for GPU solvers, but was unable to get the solver to run.
My code looks like this:
I get an error message which I don't really understand:
Is there a problem with how I set up my optimization problem? I couldn't find any documentation on the GPU solvers.