ydiago solver without mpi #134
Honestly, I do not see why.

On my machine with nvfortran and cuda support I've reproduced the same behavior, and with nvfortran on a couple of other machines as well. No idea why.
@muralidhar-nalabothula yambo can be:

a) compiled without MPI: with `--disable-mpi` at configure time, `-D_MPI` is not defined. Of course scalapack/blacs cannot be linked, and it probably does not make sense to link ELPA either. In such a case we can de-activate the ydiago implementation as well.
b) compiled with MPI but with MPI disabled at runtime using `yambo -nompi`. In such a case the code can be compiled with all the MPI libraries (including scalapack/blacs/etc.), but `MPI_Init` is not called. Moreover, all calls such as `call MPI_COMM_SIZE` are protected by something like `if(ncpu>1) return`; a minimal sketch of such a guard follows this list.
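For illustration, a minimal sketch of the kind of runtime protection described in (b). The wrapper name and interface are hypothetical (in the real code `ncpu` would come from yambo's parallel setup rather than an argument); the guard is written so that the MPI query is only issued when more than one task is in use:

```fortran
! Illustrative only, not the actual yambo wrapper: the MPI query is
! issued only when more than one task is in use, so a serial or
! "yambo -nompi" run never touches the uninitialized MPI layer.
subroutine get_comm_size(ncpu, comm, nprocs)
  use mpi
  implicit none
  integer, intent(in)  :: ncpu    ! number of tasks known to the code (1 with -nompi)
  integer, intent(in)  :: comm    ! MPI communicator handle
  integer, intent(out) :: nprocs  ! resulting number of processes
  integer :: ierr
  nprocs = 1
  if (ncpu > 1) call MPI_COMM_SIZE(comm, nprocs, ierr)
end subroutine get_comm_size
```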
For case (a), we can leave it as an open issue and work on it in the future. A possible compile-time guard is sketched below.
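As a rough illustration of the `-D_MPI` compile-time route, a sketch assuming a cpp-preprocessed source file; the subroutine name and the messages are made up, only the `_MPI` macro and the ydiago name come from the discussion above:

```fortran
! Sketch only: in a .F file passed through cpp, the distributed
! ydiago path simply disappears from a --disable-mpi build.
subroutine pick_bse_solver(neh)
  implicit none
  integer, intent(in) :: neh   ! BSE matrix dimension
#if defined _MPI
  ! MPI build: scalapack/blacs (and possibly ELPA) are linked, so the
  ! distributed ydiago solver can be offered.
  write(*,*) 'ydiago (distributed) diagonalization, size ', neh
#else
  ! Serial build (--disable-mpi): ydiago is de-activated at compile
  ! time and the plain LAPACK path is used instead.
  write(*,*) 'serial LAPACK diagonalization, size ', neh
#endif
end subroutine pick_bse_solver
```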
For case (b), it might be enough to put some `if(ncpu>1) then / else` inside `K_diagonalize` to avoid a seg-fault at runtime; a sketch of what that branch could look like follows.
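A minimal sketch of that branch, assuming a Hermitian BSE matrix and LAPACK's `zheev` as the serial fallback; the subroutine is a hypothetical stand-in, not the actual `K_diagonalize` interface:

```fortran
! Sketch only: a hypothetical stand-in for the relevant branch inside
! K_diagonalize. The parallel call is a placeholder; the point is the
! ncpu guard, which keeps a "yambo -nompi" run away from the
! ydiago/scalapack path even though those libraries are linked in.
subroutine K_diagonalize_sketch(ncpu, n, H, E)
  implicit none
  integer, intent(in)       :: ncpu    ! 1 when running with -nompi
  integer, intent(in)       :: n       ! BSE matrix size
  complex(8), intent(inout) :: H(n,n)  ! BSE Hamiltonian, overwritten by eigenvectors
  real(8),    intent(out)   :: E(n)    ! eigenvalues
  integer :: info, lwork
  complex(8), allocatable :: work(:)
  real(8),    allocatable :: rwork(:)
  if (ncpu > 1) then
    ! Parallel path: the distributed ydiago/scalapack diagonalization
    ! would go here (it requires an initialized MPI environment).
    continue
  else
    ! Serial path: plain LAPACK Hermitian diagonalization, safe even
    ! though MPI_Init has never been called.
    lwork = max(1, 2*n-1)
    allocate(work(lwork), rwork(max(1, 3*n-2)))
    call zheev('V', 'U', n, H, n, E, work, lwork, rwork, info)
    deallocate(work, rwork)
  end if
end subroutine K_diagonalize_sketch
```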