Re: [ESPResSo-users] [SPAM] Re: activate magnetostatics method DipolarDirectSumCpu
Sun, 12 May 2019 15:42:41 +0800
I have tried to run the script "run-full.sh" from the supplementary information with "mpirun -n 32" and got the following error:

13: misplaced part id 7647. 0x29a4fa0 != 0x29a4cd0

I have checked the documentation and searched the mailing list, but unfortunately I did not find an answer. Does anyone have a clue about this?
On 10 May 2019, at 22:32, Rudolf Weeber <address@hidden> wrote:
On Fri, May 10, 2019 at 09:05:12PM +0800, address@hidden wrote:
> I think DipolarDirectSumCpu is a long-range interaction, and the MPI version divides the whole box into local boxes so that each node only sees the particles in its own local box plus one layer of ghost cells. Because of that, DipolarDirectSumCpu cannot work properly. Is that why I could not activate the magnetostatics method DipolarDirectSumCpu when running under MPI? If so, how can I calculate the dipolar interaction with open boundaries?
The dipolar direct sum on the CPU is not parallelized; you can only use it on a single core.
For systems small enough for this method, the rest of the simulation typically also runs fine on a single CPU core.
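For illustration, here is a minimal pure-Python sketch of an open-boundary dipolar direct sum (not ESPResSo's actual implementation; the magnetic prefactor is set to 1). It shows why the method costs O(N^2): every unique pair of dipoles is visited once.

```python
def dipolar_direct_sum(positions, moments):
    """Total dipole-dipole energy over all unique pairs, open boundaries.

    U = sum_{i<j} [ (mu_i . mu_j) / r^3 - 3 (mu_i . r)(mu_j . r) / r^5 ]
    with the magnetic prefactor set to 1.  O(N^2) in the number of dipoles,
    which is why the serial CPU variant is practical only for small systems.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    energy = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r = [positions[j][k] - positions[i][k] for k in range(3)]
            d = dot(r, r) ** 0.5
            energy += (dot(moments[i], moments[j]) / d**3
                       - 3.0 * dot(moments[i], r) * dot(moments[j], r) / d**5)
    return energy

# Two head-to-tail dipoles at unit separation: U = 1 - 3 = -2
print(dipolar_direct_sum([(0, 0, 0), (0, 0, 1)],
                         [(0, 0, 1), (0, 0, 1)]))
```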
There is a GPU implementation (DipolarDirectSumGpu) optimized for systems with more than ~2000 magnetic particles.
For large systems (more than ~10000 dipoles), you can use the P2NFFT method.
It scales as N log N in the number of dipoles (as opposed to N^2 for direct summation). The supplementary information contains usage examples for ESPResSo.
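To put the scaling difference into perspective, here is a rough operation-count comparison (illustrative only: constant factors are ignored and differ between the real methods):

```python
import math

# Rough cost model for N dipoles: direct summation touches every pair
# (~N^2 operations), while an FFT-based method like P2NFFT scales as
# ~N log N.  The printed ratio shows how fast the gap grows with N.
for n in (2_000, 10_000, 100_000):
    pair_ops = n ** 2
    fft_ops = n * math.log2(n)
    print(f"N={n}: N^2 / (N log2 N) ~ {pair_ops / fft_ops:.0f}x")
```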
Hope that helps!
Re: [ESPResSo-users] activate magnetostatics method DipolarDirectSumCpu, Jean-Noël Grad, 2019/05/10