<div dir="ltr">Hi Carsten,<br>Thanks again for the reply, and my apologies for taking the question off topic.<br>Actually, I tried the same command with -np 24 and -np 64, and in the two cases I got different trajectories (when analyzing them using ngmx).<br>
Also, can you suggest a tutorial or reference with details on the scalability limitations of gromacs in a parallel environment?<br><br>With Thanks,<br>Vivek<br><br><div class="gmail_quote">2008/9/12 Carsten Kutzner <span dir="ltr"><<a href="mailto:ckutzne@gwdg.de">ckutzne@gwdg.de</a>></span><br>
<blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">Hi Vivek,<br>
<br>
I think I'm a bit lost now. We were originally talking about differences<br>
in trajectories but from the mail you just sent I can see that you have<br>
a segmentation fault, which is another problem.<br>
<br>
I can only suggest that if you want to make use of 128 processors you<br>
should download the CVS version of gromacs or wait until 4.0<br>
is out. Since in gromacs 3.3.x the protein has to reside as a whole<br>
on one of the processors, this very likely limits your scaling.<br>
<br>
Also, on 128 processors you will get a PME grid of 128x128xSomething<br>
(since nx and ny have to be divisible by the number of CPUs) which is<br>
probably way bigger than it needs to be (how big is it for a single<br>
CPU?). Together with a PME order of 6 this leads to a large overlap<br>
in the charge grid, which has to be communicated among the processors.<br>
PME order 4 will be better suited for such a high parallelization, but<br>
in general for Gromacs 3.x you should have at least a few thousand atoms<br>
per processor, less than 1000 won't give you decent scaling at all.<br>
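As a rough illustration of the grid constraint described above (my sketch, not part of the original mail: the box length is a hypothetical value, and real gromacs additionally prefers FFT-friendly grid sizes):

```python
import math

def pme_grid_dim(box_len_nm, spacing_nm, n_ranks):
    # Smallest grid dimension >= box/spacing that is divisible by the
    # number of processors (the gromacs 3.3.x constraint on nx and ny).
    n = math.ceil(box_len_nm / spacing_nm)
    return ((n + n_ranks - 1) // n_ranks) * n_ranks

# Hypothetical ~8 nm box with fourierspacing = 0.12 nm:
print(pme_grid_dim(8.0, 0.12, 1))    # 67 on a single CPU
print(pme_grid_dim(8.0, 0.12, 128))  # 128 once nx must be divisible by 128
```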
<br>
Carsten<br>
<br>
vivek sharma wrote:<br>
<blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
Hi Carsten,<br>
Thanks for your reply. Actually, I am running an MD simulation on a protein molecule with 270 residues (2687 atoms); after adding water it has 45599 atoms. I am using the recent version of the gromacs test set available from <a href="http://gromacs.org" target="_blank">gromacs.org</a> (gmxtest-3.3.3.tgz).<div>
<div></div><div class="Wj3C7c"><br>
Following are the entries from the .mdp file I am using:<br>
<br>
**********md.mdp<br>
title = trp_drg MD<br>
cpp = /lib/cpp ; location of cpp on SGI<br>
constraints = all-bonds<br>
integrator = md<br>
dt = 0.002 ; ps !<br>
nsteps = 25000 ; total 50 ps.<br>
nstcomm = 1<br>
nstxout = 2500 ; output coordinates every 5.0 ps<br>
nstvout = 0<br>
nstfout = 0<br>
nstlist = 5<br>
ns_type = grid<br>
rlist = 0.9<br>
coulombtype = PME<br>
rcoulomb = 0.9<br>
rvdw = 1.4<br>
fourierspacing = 0.12<br>
fourier_nx = 0<br>
fourier_ny = 0<br>
fourier_nz = 0<br>
pme_order = 6<br>
ewald_rtol = 1e-5<br>
optimize_fft = yes<br>
; Berendsen temperature coupling is on in three groups<br>
Tcoupl = berendsen<br>
tau_t = 0.1 0.1 0.1<br>
tc-grps = protein NDP sol<br>
ref_t = 300 300 300<br>
; Pressure coupling is on<br>
Pcoupl = berendsen<br>
pcoupltype = isotropic<br>
tau_p = 0.5<br>
compressibility = 4.5e-5<br>
ref_p = 1.0<br>
; Generate velocities is on at 300 K.<br>
gen_vel = yes<br>
gen_temp = 300.0<br>
gen_seed = 173529<br>
<br>
<br>
**************** And following are the commands I am using:<br>
grompp_d -np 128 -f md1.mdp -c 1XU9_A_b4em.gro -p 1XU9_A.top -o 1XU9_A_md1_np128.tpr<br>
<br>
submit<br>
mdrun_d<br>
///// arguments for mdrun_d<br>
-s 1XU9_A_md1_np128.tpr -o 1XU9_A_md1_np128.trr -c 1XU9_A_pmd1_np128.gro -g md_np128.log -e md_np128.edr -np 128<br>
<br>
*********** Following is the error I am getting:<br>
Reading file 1XU9_A_md1_np128.tpr, VERSION 3.3.3 (double precision)<br>
starting mdrun 'CORTICOSTEROID 11-BETA-DEHYDROGENASE, ISOZYME 1'<br>
25000 steps, 50.0 ps.<br>
<br>
srun: error: n141: task1: Segmentation fault<br>
srun: Terminating job<br>
<br>
****************************************************************<br>
Is this information helpful in figuring out the problem?<br>
Please advise.<br>
<br>
With Thanks,<br>
Vivek<br>
<br></div></div>
2008/9/11 Carsten Kutzner <<a href="mailto:ckutzne@gwdg.de" target="_blank">ckutzne@gwdg.de</a>><div><div></div><div class="Wj3C7c">
<br>
<br>
vivek sharma wrote:<br>
<br>
Hi There,<br>
I am running the parallel version of gromacs on a cluster, with different<br>
-np options.<br>
<br>
Hi,<br>
<br>
which version of gromacs exactly are you using?<br>
<br>
<br>
<br>
<br>
<br>
On analyzing the 5 ns trajectory using ngmx, I am finding<br>
differences in the trajectories of two similar runs (the only thing<br>
varying between the two runs is -np, i.e. 20 and 64), where the .mdp<br>
file and input files are the same in the two cases.<br>
I am wondering why I am getting this difference between the two<br>
trajectories.<br>
I am looking for advice on whether I did something wrong, or<br>
what may be the probable reason for this difference.<br>
<br>
<br>
There are many reasons why a parallel run does not yield binary<br>
identical results to a run with another number of processors, even if you<br>
start from the same tpr file. If you use PME, then FFTW could pick a<br>
slightly different algorithm (it will select the fastest one for that<br>
number of processors; you can turn this feature off by passing<br>
--disable-fftw-measure to the gromacs configure script). But you can still<br>
get results that are not binary identical if you do FFTs on a varying<br>
number of CPUs. Also, due to the limited accuracy which is inherent to any<br>
computer, additions need not be associative, which can show up in parallel<br>
additions.<br>
<br>
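A minimal illustration of that last point (my example, not part of the original mail): floating-point addition is not associative, so summing the same numbers in a different order, which is what happens when the work is split over a different number of processors, can give different results.

```python
a, b, c = 1e16, -1e16, 1.0

left = (a + b) + c   # the large terms cancel first, then 1.0 survives
right = a + (b + c)  # 1.0 is absorbed into 1e16 and lost to rounding

print(left)   # 1.0
print(right)  # 0.0
```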
Generally, if you run in double precision, these effects will be much<br>
smaller, but nevertheless you won't get binary identical results. This will<br>
in all cases lead to trajectories which slowly diverge from each other.<br>
However, in the first few hundred time steps you should not see any<br>
difference in the first couple of decimals of the variables (positions,<br>
velocities, energies ...).<br>
<br>
<br>
Also, I am not able to make gromacs run faster by increasing<br>
-np.<br>
<br>
<br>
Please provide the exact command line you used.<br>
<br>
<br>
Is there any maximum limit for scaling gromacs on a parallel cluster?<br>
<br>
<br>
Yes, depending on your MD system and on the cluster you use :)<br>
<br>
Carsten<br>
<br>
<br>
<br>
With Thanks,<br>
Vivek<br>
<br>
<br>
------------------------------------------------------------------------<br>
<br>
_______________________________________________<br>
gmx-users mailing list <a href="mailto:gmx-users@gromacs.org" target="_blank">gmx-users@gromacs.org</a><br></div></div>
<div class="Ih2E3d"><br>
<a href="http://www.gromacs.org/mailman/listinfo/gmx-users" target="_blank">http://www.gromacs.org/mailman/listinfo/gmx-users</a><br>
Please search the archive at <a href="http://www.gromacs.org/search" target="_blank">http://www.gromacs.org/search</a><br>
before posting!<br>
Please don't post (un)subscribe requests to the list. Use the<br>
www interface or send it to <a href="mailto:gmx-users-request@gromacs.org" target="_blank">gmx-users-request@gromacs.org</a><br></div>
<div class="Ih2E3d"><br>
Can't post? Read <a href="http://www.gromacs.org/mailing_lists/users.php" target="_blank">http://www.gromacs.org/mailing_lists/users.php</a><br>
<br>
<br>
-- Dr. Carsten Kutzner<br>
Max Planck Institute for Biophysical Chemistry<br>
Theoretical and Computational Biophysics Department<br>
Am Fassberg 11<br>
37077 Goettingen, Germany<br>
Tel. +49-551-2012313, Fax: +49-551-2012302<br>
<a href="http://www.mpibpc.mpg.de/research/dep/grubmueller/" target="_blank">http://www.mpibpc.mpg.de/research/dep/grubmueller/</a><br></div>
<a href="http://www.gwdg.de/%7Eckutzne" target="_blank">http://www.gwdg.de/~ckutzne</a><div class="Ih2E3d"><br>
</div><div class="Ih2E3d">
<br>
<br>
</div></blockquote><div><div></div><div class="Wj3C7c">
<br>
-- <br>
Dr. Carsten Kutzner<br>
Max Planck Institute for Biophysical Chemistry<br>
Theoretical and Computational Biophysics Department<br>
Am Fassberg 11<br>
37077 Goettingen, Germany<br>
Tel. +49-551-2012313, Fax: +49-551-2012302<br>
<a href="http://www.mpibpc.mpg.de/research/dep/grubmueller/" target="_blank">http://www.mpibpc.mpg.de/research/dep/grubmueller/</a><br>
<a href="http://www.gwdg.de/%7Eckutzne" target="_blank">http://www.gwdg.de/~ckutzne</a><br>
</div></div></blockquote></div><br></div>