JIT Error encountered when optimizing GammaC
#1288
Error seems to occur when optimizing the GammaC objective on the gh/Gamma_c branch; it happens on the second optimization step and seems related to the JIT cache. The error also only occurs if attempting an optimization at a resolution that you have previously optimized at; changing the eq resolution between steps seems to avoid this issue, so I assume it is related to the caching.

MWE:

Error:

Comments
Here are the steps that should be taken to debug
Unjitting the compute function tends to help for debugging.
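A minimal sketch of what the unjitting trick can look like, using JAX's jax.disable_jit context manager; the compute function here is a hypothetical stand-in, not DESC's actual compute:

```python
import jax
import jax.numpy as jnp

@jax.jit
def compute(x):
    # hypothetical stand-in for an objective's compute function
    return jnp.sum(jnp.sin(x) * x**2)

# run eagerly: intermediate values are concrete, so prints and
# breakpoints inside compute work and errors point at real data
with jax.disable_jit():
    print(compute(jnp.arange(4.0)))
```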
It seems to be unique to the bounce integral objectives: if I comment out GammaC it works fine, and if I change to EffectiveRipple it still happens. Other things:
These probably aren't related to the error above, but might be another source of concern.
Ok, I ran optimizations leading up to ISHW, so commit 5cd7ebd should not have this issue. State of the branch at that commit: https://github.com/PlasmaControl/DESC/tree/5cd7ebde563258f754a0401d9da6aa143bc3376f
There is also a jit call wrapping the compute function in
Aren't these compiled once? The
Can memory usage affect this? Is this forward or reverse mode? I ran forward-mode optimizations before ISHW and did not see the optimizer exit.
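For context on the forward vs. reverse mode question, a toy sketch of the two Jacobian modes in JAX (the residual function is made up): forward mode builds the Jacobian column by column via JVPs, reverse mode row by row via VJPs, so their memory profiles differ, which matters if memory pressure is suspected.

```python
import jax
import jax.numpy as jnp

def residual(x):
    # toy stand-in for an objective residual
    return jnp.sin(x) * x**2

x = jnp.ones(3)
J_fwd = jax.jacfwd(residual)(x)  # forward mode: one JVP per input
J_rev = jax.jacrev(residual)(x)  # reverse mode: one VJP per output
```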
5cd7ebd...Gamma_c
I won't have time to debug tonight/tomorrow, but will look more this weekend. Thanks for starting to look into this so quickly though.
I think this is some jax issue, and the caching suggests it is problem-dependent. In any case, I suggest trying on #1290; if the issue disappears there, then this can be marked resolved. The objectives there use an optimization-step-independent transforms grid, so that might solve the caching issue you came across.
The same error occurs in #1290; once I find the specific cause I can commit a fix.
I accidentally ran the tutorial's optimization cell (block 6) another time after the optimization completed successfully, and I get the same error. JIT caching is not done there, and the block is self-contained, so the second run is a completely new optimization, not a second step; I am uncertain whether it is related to this issue. The error message suggests jax is getting an array with a different dimension than it expects, so flattening all inputs from tuples and higher-dimensional arrays to 1D arrays before they reach the objective function, in particular those in constants, might help (see the sketch below). The omnigenity objective also passes a 2D array in constants, so it might have the same issue; it could be worth looking into how 2D arrays are interpreted in the compute scaled error functions.
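A sketch of the flattening idea, assuming a constants dict with hypothetical keys and shapes; jax.tree_util.tree_map ravels every array leaf (including those nested inside tuples) so only 1D arrays reach the jitted objective:

```python
import jax
import jax.numpy as jnp

# hypothetical constants pytree; keys and shapes are made up
constants = {
    "quad": (jnp.ones((3, 4)), jnp.ones((3, 4))),  # tuple of 2D arrays
    "weights": jnp.ones((2, 5)),
}

# ravel every leaf so the objective only ever sees 1D arrays
flat_constants = jax.tree_util.tree_map(jnp.ravel, constants)
```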
Yep, the fix in #1229 is actually pretty simple; Greta basically changed the way
Hm, what is the error message actually? That seems different from what we get (ours is an np logical array, not quite related to shape mismatches, which is what yours sounds like); could this be a separate issue?
I think even if the block is self-contained, running it again will still find that there is a cached jitted version of the compute function.
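A small demonstration of that caching behavior (generic JAX, not DESC-specific): the Python print only fires while JAX traces, so a second call with the same shapes and dtypes hits the cache silently, while a new shape triggers a retrace and recompile.

```python
import jax
import jax.numpy as jnp

@jax.jit
def f(x):
    print("tracing for shape", x.shape)  # runs only during tracing
    return x.sum()

f(jnp.ones(3))  # traces and compiles
f(jnp.ones(3))  # cache hit: no trace message, reuses compiled code
f(jnp.ones(4))  # new shape -> retrace and recompile
```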
Check if @rahulgaur104's ballooning objective is also affected by this.
Fixed by storing rho, alpha, zeta in a LinearGrid instead of as separate arrays. Where rho, alpha, zeta, etc. were accessed, the values are now passed by indexing into the grid's array.
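A rough sketch of what that could look like, assuming LinearGrid accepts coordinate arrays via its rho/theta/zeta keywords as in DESC's grid API; the coordinate values are hypothetical, and storing alpha in the grid's theta slot is my assumption, not necessarily how the actual fix maps coordinates:

```python
import numpy as np
from desc.grid import LinearGrid

# hypothetical coordinate values
rho = np.array([0.3, 0.5, 0.7])
alpha = np.linspace(0, 2 * np.pi, 4, endpoint=False)  # assumed to live in the theta slot
zeta = np.linspace(0, 2 * np.pi, 8, endpoint=False)

# one grid object instead of three loose arrays
grid = LinearGrid(rho=rho, theta=alpha, zeta=zeta)

# downstream code recovers the coordinates by indexing grid.nodes
rho_vals = grid.nodes[:, 0]
alpha_vals = grid.nodes[:, 1]
zeta_vals = grid.nodes[:, 2]
```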