I just realized that that was a very old mdp file. Here is an mdp file from my most recent run as well as what I think are the domain decomposition statistics.<div><br></div><div>mdp file:</div><div><div>title = BMIM+PF6</div>
<div>cpp = /lib/cpp</div><div>constraints = hbonds</div><div>integrator = md</div><div>dt = 0.002 ; ps !</div><div>nsteps = 4000000 ; total 8ns.</div>
<div>nstcomm = 1</div><div>nstxout = 50000</div><div>nstvout = 50000</div><div>nstfout = 0</div><div>nstlog = 5000</div><div>nstenergy = 5000</div>
<div>nstxtcout = 25000</div><div>nstlist = 10</div><div>ns_type = grid</div><div>pbc = xyz</div><div>coulombtype = PME</div><div>vdwtype = Cut-off</div>
<div>rlist = 1.2</div><div>rcoulomb = 1.2</div><div>rvdw = 1.2</div><div>fourierspacing = 0.12</div><div>pme_order = 4</div><div>ewald_rtol = 1e-5</div>
<div>; Berendsen temperature coupling is on in two groups</div><div>Tcoupl = berendsen</div><div>tc_grps = BMI PF6 </div><div>tau_t = 0.2 0.2</div><div>ref_t = 300 300</div>
<div>nsttcouple = 1</div><div>; Energy monitoring</div><div>energygrps = BMI PF6</div><div>; Isotropic pressure coupling is now on</div><div>Pcoupl = berendsen</div><div>pcoupltype = isotropic</div>
<div>;pc-grps = BMI PF6</div><div>tau_p = 1.0</div><div>ref_p = 1.0</div><div>compressibility = 4.5e-5</div><div><br></div><div>; Generate velocities at 300 K.</div>
<div>gen_vel = yes</div><div>gen_temp = 300.0</div><div>gen_seed = 100000</div><div><br></div><div>domain decomposition</div><div><div>There are: 12800 Atoms</div><div>Max number of connections per atom is 63</div>
<div>Total number of connections is 286400</div><div>Max number of graph edges per atom is 6</div><div>Total number of graph edges is 24800</div></div><br><div class="gmail_quote">On Thu, Jan 27, 2011 at 4:32 PM, Justin A. Lemkul <span dir="ltr"><<a href="mailto:jalemkul@vt.edu">jalemkul@vt.edu</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;"><div class="im"><br>
<br>
Denny Frost wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
about 12000 atoms, 8 nodes, CentOS 5.3/Linux, infiniband. Below is a copy of my mdp file.<br>
<br>
title = BMIM+PF6<br>
cpp = /lib/cpp<br>
constraints = all_bonds<br>
integrator = md<br>
dt = 0.004 ; ps !<br>
nsteps = 20000000 ; total 80 ns.<br>
nstcomm = 1<br>
nstxout = 50000<br>
nstvout = 50000<br>
nstfout = 0<br>
nstlog = 5000<br>
nstenergy = 5000<br>
nstxtcout = 25000<br>
nstlist = 10<br>
ns_type = grid<br>
pbc = xyz<br>
coulombtype = PME<br>
vdwtype = Shift<br>
rlist = 1.0<br>
rcoulomb = 1.0<br>
rvdw = 1.0<br>
fourierspacing = 0.6<br>
</blockquote>
<br></div>
This fourierspacing is 5-6 times larger than what is normally accepted as sufficiently accurate (0.10-0.12 nm). A sparser grid will make the PME algorithm faster, but at the expense of accuracy.<br>
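To get a feel for why the grid spacing affects speed so strongly, here is a rough sketch (the 5 nm box length is a hypothetical value for illustration; GROMACS also rounds grid dimensions up to FFT-friendly sizes):<br>

```python
import math

def pme_grid_points(box_nm, fourierspacing_nm):
    """Lower bound on the number of PME grid points per box dimension;
    fourierspacing sets the maximum allowed spacing between grid points."""
    return math.ceil(box_nm / fourierspacing_nm)

# Hypothetical 5 nm cubic box:
print(pme_grid_points(5.0, 0.12))  # 42 points per dimension at the usual spacing
print(pme_grid_points(5.0, 0.6))   # 9 points per dimension at the posted spacing
```

Since the reciprocal-space work grows with the total number of grid points, a 0.6 nm spacing yields a grid roughly 100x smaller (42^3 vs. 9^3 points) than the usual 0.12 nm: fast, but far too coarse for accurate forces.<br>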
<br>
Can you post the domain decomposition statistics from the .log file? They appear just above the energies from time 0. What did grompp tell you about the relative PME:PP load?<br>
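As a quick sanity check on the run lengths implied by the two mdp files in this thread (dt and nsteps taken from the files as posted):<br>

```python
def total_time_ns(dt_ps, nsteps):
    """Total simulated time in ns for a given timestep (in ps) and step count."""
    return dt_ps * nsteps / 1000.0

print(total_time_ns(0.002, 4_000_000))   # newer run: 8.0 ns
print(total_time_ns(0.004, 20_000_000))  # older run: 80.0 ns
```

Comparing wall-clock speed between the two GROMACS versions only makes sense in ns/day, since the two files simulate very different total times.<br>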
<br>
-Justin<br>
<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div class="im">
;pme_order = 4<br>
ewald_rtol = 1e-5<br>
; Berendsen temperature coupling is on in two groups<br>
Tcoupl = berendsen<br>
tc_grps = BMI PF6<br>
tau_t = 0.1 0.1<br>
ref_t = 300 300<br>
nsttcouple = 1<br>
; Energy monitoring<br>
energygrps = BMI PF6<br>
; Isotropic pressure coupling is now on<br>
Pcoupl = berendsen<br>
pcoupltype = isotropic<br>
;pc-grps = BMI PF6<br>
tau_p = 1.0<br>
ref_p = 1.0<br>
compressibility = 4.5e-5<br>
<br>
; Generate velocities at 300 K.<br>
gen_vel = yes<br>
gen_temp = 300.0<br>
gen_seed = 100000<br>
<br>
<br></div><div class="im">
On Thu, Jan 27, 2011 at 4:12 PM, Dallas Warren <<a href="mailto:Dallas.Warren@monash.edu" target="_blank">Dallas.Warren@monash.edu</a>> wrote:<br>
<br>
You will need to provide more details on the system. How many<br>
atoms, what sort of computer system is it being run on, how many<br>
nodes, copy of the mdp file etc.<br>
<br>
<br>
Catch ya,<br>
<br>
Dr. Dallas Warren<br>
<br>
Medicinal Chemistry and Drug Action<br>
<br>
Monash Institute of Pharmaceutical Sciences, Monash University<br>
381 Royal Parade, Parkville VIC 3010<br></div>
<a href="mailto:dallas.warren@monash.edu" target="_blank">dallas.warren@monash.edu</a><div class="im"><br>
<br>
+61 3 9903 9304<br>
---------------------------------<br>
When the only tool you own is a hammer, every problem begins to<br>
resemble a nail.<br>
<br>
<br>
*From:* <a href="mailto:gmx-users-bounces@gromacs.org" target="_blank">gmx-users-bounces@gromacs.org</a> *On Behalf Of *Denny Frost<br>
*Sent:* Friday, 28 January 2011 9:34 AM<br>
*To:* Discussion list for GROMACS users<br>
*Subject:* [gmx-users] Slow Runs<br>
<br>
<br>
I am taking over a project for a graduate student who did MD using<br>
Gromacs 3.3.3. I now run similar simulations with Gromacs 4.5.1 and<br>
find that they run only about 1/2 to 1/3 as fast as the previous<br>
runs done in Gromacs 3.3.3. The runs have about the same number of<br>
atoms and both use OPLS force fields. The mdp files are virtually<br>
the same (I copied them). The only major difference is that my runs<br>
have different species and thus different (although smaller)<br>
itp files. The runs are stable and give reasonable thermodynamic<br>
properties - they're just slow. Has anyone had any experience with<br>
something like this?<br>
<br>
<br>
--<br>
gmx-users mailing list <a href="mailto:gmx-users@gromacs.org" target="_blank">gmx-users@gromacs.org</a><br></div>
<div class="im"><br>
<a href="http://lists.gromacs.org/mailman/listinfo/gmx-users" target="_blank">http://lists.gromacs.org/mailman/listinfo/gmx-users</a><br>
Please search the archive at<br>
<a href="http://www.gromacs.org/Support/Mailing_Lists/Search" target="_blank">http://www.gromacs.org/Support/Mailing_Lists/Search</a> before posting!<br>
Please don't post (un)subscribe requests to the list. Use the<br>
www interface or send it to <a href="mailto:gmx-users-request@gromacs.org" target="_blank">gmx-users-request@gromacs.org</a>.<br></div>
<div class="im"><br>
Can't post? Read <a href="http://www.gromacs.org/Support/Mailing_Lists" target="_blank">http://www.gromacs.org/Support/Mailing_Lists</a><br>
<br>
<br>
</div></blockquote>
<br>
-- <br>
========================================<br>
<br>
Justin A. Lemkul<br>
Ph.D. Candidate<br>
ICTAS Doctoral Scholar<br>
MILES-IGERT Trainee<br>
Department of Biochemistry<br>
Virginia Tech<br>
Blacksburg, VA<br>
jalemkul[at]<a href="http://vt.edu" target="_blank">vt.edu</a> | (540) 231-9080<br>
<a href="http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin" target="_blank">http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin</a><br>
<br>
========================================<br><font color="#888888">
-- <br></font><div><div></div><div class="h5">
</div></div></blockquote></div><br></div>