<div><br></div>Hi, <div><br></div><div>You should start here:</div><div><br></div><div><a href="http://www.gromacs.org/Documentation/Errors#There_is_no_domain_decomposition_for_n_nodes_that_is_compatible_with_the_given_box_and_a_minimum_cell_size_of_x_nm">http://www.gromacs.org/Documentation/Errors#There_is_no_domain_decomposition_for_n_nodes_that_is_compatible_with_the_given_box_and_a_minimum_cell_size_of_x_nm</a></div>
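<div><br></div><div>The gist of the error: mdrun must split the simulation box into one domain-decomposition cell per particle-particle rank (192 here), and every cell has to be at least the reported minimum size (1.02425 nm). A rough sketch of that feasibility check in Python is below; the box dimensions are hypothetical examples, and the real check in mdrun also accounts for PME ranks, bonded interactions, and constraint lengths, so this is a simplification:</div>

```python
# Simplified sketch of the domain-decomposition feasibility check:
# the box must split into a 3D grid of exactly n_ranks cells, each at
# least min_cell wide along every dimension. Box sizes below are
# hypothetical, not taken from the original post.

def feasible_grids(n_ranks, box, min_cell):
    """All (nx, ny, nz) grids with nx * ny * nz == n_ranks whose
    cells are >= min_cell along each box dimension."""
    grids = []
    for nx in range(1, n_ranks + 1):
        if n_ranks % nx:
            continue
        rest = n_ranks // nx
        for ny in range(1, rest + 1):
            if rest % ny:
                continue
            nz = rest // ny
            if (box[0] / nx >= min_cell and
                    box[1] / ny >= min_cell and
                    box[2] / nz >= min_cell):
                grids.append((nx, ny, nz))
    return grids

# A 7 nm cubic box cannot host 192 cells of >= 1.02425 nm:
print(feasible_grids(192, (7.0, 7.0, 7.0), 1.02425))    # → []

# A 10 nm cubic box can, e.g. with an 8 x 6 x 4 grid:
print((8, 6, 4) in feasible_grids(192, (10.0, 10.0, 10.0), 1.02425))  # → True
```

<div>So the practical options are the ones the error message lists: use fewer PP ranks, shift the PP/PME split with -npme, or relax the minimum cell size via -rcon / -dds (with the caveats described on the Errors page).</div>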
<div><br></div><div>Terry</div><div><br><br><div class="gmail_quote">On Fri, Jan 6, 2012 at 12:49 PM, Albert <span dir="ltr"><<a href="mailto:mailmd2011@gmail.com">mailmd2011@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div bgcolor="#FFFFFF" text="#000000">
<font size="+1"><font face="WenQuanYi WenQuanYi Bitmap Song">Hello:<br>
I found that each time I try to increase the number of nodes for my
MD run, the job always fails. It says:<br>
<br>
Will use 192 particle-particle and 64 PME only nodes<br>
This is a guess, check the performance at the end of the log
file<br>
<br>
-------------------------------------------------------<br>
Program mdrun_mpi_bg, VERSION 4.5.5<br>
Source code file: ../../../src/mdlib/domdec.c, line: 6436<br>
<br>
Fatal error:<br>
There is no domain decomposition for 192 nodes that is
compatible with the given box and a minimum cell size of<br>
1.02425 nm<br>
Change the number of nodes or mdrun option -rcon or -dds or your
LINCS settings<br>
Look in the log file for details on the domain decomposition<br>
For more information and tips for troubleshooting, please check
the GROMACS<br>
website at <a href="http://www.gromacs.org/Documentation/Errors" target="_blank">http://www.gromacs.org/Documentation/Errors</a><br>
-------------------------------------------------------<br>
<br>
"Ohne Arbeit wird das Leben Oede" ["Without work, life becomes dull"] (Wir Sind Helden)<br>
<br>
<br>
Does anybody have any idea about this? Here is my script for
submitting jobs:<br>
# @ job_name = I213A<br>
# @ class = kdm-large<br>
# @ account_no = G07-13 <br>
# @ error = gromacs.err<br>
# @ output = gromacs.out<br>
# @ environment = COPY_ALL<br>
# @ wall_clock_limit = 12:00:00<br>
# @ notification = error<br>
# @ notify_user = <a href="mailto:albert@icm.edu.pl" target="_blank">albert@icm.edu.pl</a><br>
# @ job_type = bluegene<br>
# @ bg_size = 64<br>
# @ queue<br>
mpirun -exe /opt/gromacs/4.5.5/bin/mdrun_mpi_bg -args "-nosum
-dlb yes -v -s npt.tpr" -mode VN -np 256<br>
<br>
<br>
If I change bg_size to 32 and -np to 128, everything goes well.<br>
<br>
<br>
THX<br>
<br>
</font></font>
</div>
<br>--<br>
gmx-users mailing list <a href="mailto:gmx-users@gromacs.org">gmx-users@gromacs.org</a><br>
<a href="http://lists.gromacs.org/mailman/listinfo/gmx-users" target="_blank">http://lists.gromacs.org/mailman/listinfo/gmx-users</a><br>
Please search the archive at <a href="http://www.gromacs.org/Support/Mailing_Lists/Search" target="_blank">http://www.gromacs.org/Support/Mailing_Lists/Search</a> before posting!<br>
Please don't post (un)subscribe requests to the list. Use the<br>
www interface or send it to <a href="mailto:gmx-users-request@gromacs.org">gmx-users-request@gromacs.org</a>.<br>
Can't post? Read <a href="http://www.gromacs.org/Support/Mailing_Lists" target="_blank">http://www.gromacs.org/Support/Mailing_Lists</a><br></blockquote></div><br></div>