<table style="width: 99.8%; "><tbody><tr><td id="QQMAILSTATIONERY" style="background:url(https://rescdn.qqmail.com/zh_CN/htmledition/images/xinzhi/bg/b_01.jpg); min-height:550px; padding:100px 55px 200px; ">Hi,<br><br>Thank you very much! It really helps. I checked PAR(cr), and it turns out to be false. Also, the&nbsp;DOMAINDECOMP(cr) is also false. It seems to me that, in multisim case, dd_collect_state is not available anymore.<br><br>Does it show that we do not have domain decomposition with multisim? If so, how do I reach the local state, local index, mdatoms and all the things related to dd_partition_system?&nbsp;<br><br>Does it mean that we always share the storage without communication using MPI? To be more specific, how do we get the information of a certain atom, e.g. its position, velocity, etc......<br><br>These questions are probably stupid but I indeed found the parallelization method in multisim case confusing and any help will be appreciated...<br><br>Thank you so much!!!<br><br>Best regards,<br>Huan</td></tr></tbody></table><div><br></div><div><br></div><div style="font-size: 12px;font-family: Arial Narrow;padding:2px 0 2px 0;">------------------&nbsp;Original&nbsp;------------------</div><div style="font-size: 12px;background:#efefef;padding:8px;"><div><b>From:</b>&nbsp;"gromacs.org_gmx-developers-request"&lt;gromacs.org_gmx-developers-request@maillist.sys.kth.se&gt;;</div><div><b>Date:</b>&nbsp;Wed, Jun 5, 2019 06:00 PM</div><div><b>To:</b>&nbsp;"gromacs.org_gmx-developers"&lt;gromacs.org_gmx-developers@maillist.sys.kth.se&gt;;<wbr></div><div></div><div><b>Subject:</b>&nbsp;gromacs.org_gmx-developers Digest, Vol 182, Issue 2</div></div><div><br></div>Send gromacs.org_gmx-developers mailing list submissions to<br>        gromacs.org_gmx-developers@maillist.sys.kth.se<br><br>To subscribe or unsubscribe via the World Wide Web, visit<br>        https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-developers<br><br>or, via email, send a message with subject or body 'help' to<br>        gromacs.org_gmx-developers-request@maillist.sys.kth.se<br><br>You can reach the person managing the list at<br>        gromacs.org_gmx-developers-owner@maillist.sys.kth.se<br><br>When replying, please edit your Subject line so it is more specific<br>than "Re: Contents of gromacs.org_gmx-developers digest..."<br><br><br>Today's Topics:<br><br>&nbsp;&nbsp; 1. Re: mdrun_mpi not able to reach "rank" (Mark Abraham)<br>&nbsp;&nbsp; 2. Upcoming 2019.3 patch release (Paul bauer)<br><br><br>----------------------------------------------------------------------<br><br>Message: 1<br>Date: Wed, 5 Jun 2019 08:31:25 +0200<br>From: Mark Abraham &lt;mark.j.abraham@gmail.com&gt;<br>To: Discussion list for GROMACS development<br>        &lt;gmx-developers@gromacs.org&gt;<br>Cc: "gromacs.org_gmx-developers"<br>        &lt;gromacs.org_gmx-developers@maillist.sys.kth.se&gt;<br>Subject: Re: [gmx-developers] mdrun_mpi not able to reach "rank"<br>Message-ID:<br>        &lt;CAMNuMARw-p5QYtWuK_cbLnznYMKrr_3Y4-XAn8ijK0yz9bE=nw@mail.gmail.com&gt;<br>Content-Type: text/plain; charset="utf-8"<br><br>Hi,<br><br>In your two cases, the form of parallelism is different. In the latter, if<br>you are using two ranks with thread-MPI, then you cannot be using multisim,<br>so there is more than one rank for the single simulation in use.<br><br>The PAR(cr) macro (sadly, misnamed for historical reasons) reflects whether<br>there is more than one rank per simulation, so you should be check that,<br>before using e.g. 
On Wed, 5 Jun 2019 at 07:54, 1004753465 <1004753465@qq.com> wrote:

> Hi everyone,
>
> I am currently trying to run two GROMACS 2018 parallel processes by using
>
> mpirun -np 2 ...(some path)/mdrun_mpi -v -multidir sim[01]
>
> During the simulation, I need to collect some information onto the two
> master nodes, much as the function dd_gather does. Therefore, I need to
> reach cr->dd on each rank. However, whenever I try to print
> cr->dd->rank or cr->dd->nnodes or something like that, it just shows
>
> [c15:31936] *** Process received signal ***
> [c15:31936] Signal: Segmentation fault (11)
> [c15:31936] Signal code: Address not mapped (1)
> [c15:31936] Failing at address: 0x30
> [c15:31936] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0x10340) [0x7f7f9e374340]
> [c15:31936] [ 1] /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x468cfb]
> [c15:31936] [ 2] /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x40dd65]
> [c15:31936] [ 3] /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x42ca93]
> [c15:31936] [ 4] /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x416f7d]
> [c15:31936] [ 5] /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x41792c]
> [c15:31936] [ 6] /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x438756]
> [c15:31936] [ 7] /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x438b3e]
> [c15:31936] [ 8] /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x439a97]
> [c15:31936] [ 9] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf5) [0x7f7f9d591ec5]
> [c15:31936] [10] /home/hudan/wow/ngromacs-2018/gromacs-2018/build/bin/mdrun_mpi() [0x40b93e]
> [c15:31936] *** End of error message ***
> step 0
> [c15:31935] *** Process received signal ***
> [c15:31935] Signal: Segmentation fault (11)
> [c15:31935] Signal code: Address not mapped (1)
> [c15:31935] Failing at address: 0x30
> [c15:31935] [same backtrace as above, from rank 0 (PID 31935)]
> [c15:31935] *** End of error message ***
> --------------------------------------------------------------------------
> mpirun noticed that process rank 0 with PID 31935 on node c15.dynstar
> exited on signal 11 (Segmentation fault).
> --------------------------------------------------------------------------
>
> However, if I build the package without the flag -DGMX_MPI=on, the single
> program (mdrun) runs smoothly, and all the domain decomposition ranks can
> be printed out and used conveniently.
>
> It is pretty weird to me that, with mdrun_mpi, although domain
> decomposition can be done, the ranks can neither be printed out nor
> accessed through cr->dd. I wonder whether they are stored in some other
> form, but I do not know what it is.
>
> I will appreciate it if someone can help. Thank you very much!
> Best,
> Huan
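The trace above is consistent with a null-pointer dereference: with one rank per simulation there is no domain decomposition, cr->dd stays NULL, and cr->dd->rank reads a member at a small offset from address zero (hence "Failing at address: 0x30"). A minimal sketch of the guard, assuming GROMACS 2018's headers (where DOMAINDECOMP(cr) includes a NULL check on cr->dd; verify in your tree):

    #include <cstdio>

    #include "gromacs/domdec/domdec_struct.h"
    #include "gromacs/mdtypes/commrec.h"

    static void printRankInfo(const t_commrec *cr)
    {
        if (DOMAINDECOMP(cr)) /* true only when cr->dd != nullptr */
        {
            fprintf(stderr, "DD rank %d of %d\n",
                    cr->dd->rank, cr->dd->nnodes);
        }
        else
        {
            /* No DD: use the plain t_commrec fields instead. */
            fprintf(stderr, "no DD: rank %d of %d in this simulation\n",
                    cr->sim_nodeid, cr->nnodes);
        }
    }

The thread-MPI build appeared to work only because two ranks driving one simulation do set up domain decomposition; the multisim run has one rank per simulation and never creates it.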
------------------------------

Message: 2
Date: Wed, 5 Jun 2019 10:13:39 +0200
From: Paul Bauer <paul.bauer.q@gmail.com>
To: gromacs.org_gmx-developers@maillist.sys.kth.se
Subject: [gmx-developers] Upcoming 2019.3 patch release

Hello developers,

The next GROMACS patch release for the 2019 branch is planned for the end of next week, around June 14. The plan continues to be to make these patch releases every 2-3 months, so that users get their hands on the latest fixes quickly.

Please check https://redmine.gromacs.org/projects/gromacs/issues?fixed_version_id=89&set_filter=1&status_id=o to see if there is something you can help fix, or review at https://gerrit.gromacs.org/#/q/status:open+project:gromacs+branch:release-2019.

Cheers,

Paul

-- 
Paul Bauer, PhD
GROMACS Release Manager
KTH Stockholm, SciLifeLab
0046737308594

------------------------------

-- 
Gromacs Developers mailing list

* Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/GMX-developers_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-developers or send a mail to gmx-developers-request@gromacs.org.

End of gromacs.org_gmx-developers Digest, Vol 182, Issue 2
**********************************************************
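Postscript to the question at the top of this mail (how to reach a particular atom's position or velocity when DOMAINDECOMP(cr) is false): without DD, the single rank of each simulation holds the complete state, and global indices coincide with local ones, so t_state can be read directly. A sketch, assuming GROMACS 2018's state layout (state->x and state->v holding gmx::RVec entries); with DD one would instead collect onto the master with dd_collect_vec()/dd_collect_state():

    #include <cstdio>

    #include "gromacs/math/vectypes.h"
    #include "gromacs/mdtypes/commrec.h"
    #include "gromacs/mdtypes/state.h"

    static void printAtom(const t_commrec *cr, const t_state *state,
                          int globalIndex)
    {
        if (!DOMAINDECOMP(cr))
        {
            /* Every atom is local, so the global index can be used
             * directly to read the per-rank copy of the full state. */
            const gmx::RVec &x = state->x[globalIndex];
            const gmx::RVec &v = state->v[globalIndex];
            fprintf(stderr, "atom %d: x=(%g %g %g) v=(%g %g %g)\n",
                    globalIndex, x[XX], x[YY], x[ZZ], v[XX], v[YY], v[ZZ]);
        }
        /* With DD the atom may live on another rank, and the collective
         * routines in gromacs/domdec/ must be used on the master. */
    }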