Here's what I've got:<div><br></div><div><pre>
        M E G A - F L O P S   A C C O U N T I N G

   RF=Reaction-Field  FE=Free Energy  SCFE=Soft-Core/Free Energy
   T=Tabulated        W3=SPC/TIP3p    W4=TIP4p (single or pairs)
   NF=No Forces

 Computing:                      M-Number         M-Flops  % Flops
-----------------------------------------------------------------------------
 Coul(T) + VdW(T)          1219164.751609    82903203.109     80.6
 Outer nonbonded loop        25980.879385      259808.794      0.3
 Calc Weights                37138.271040     1336977.757      1.3
 Spread Q Bspline           792283.115520     1584566.231      1.5
 Gather F Bspline           792283.115520     4753698.693      4.6
 3D-FFT                     119163.856212      953310.850      0.9
 Solve PME                    2527.465668      161757.803      0.2
 NS-Pairs                    47774.705001     1003268.805      1.0
 Reset In Box                  371.386080        1114.158      0.0
 Shift-X                     24758.847360      148553.084      0.1
 CG-CoM                       1237.953600        3713.861      0.0
 Angles                      18569.135520     3119614.767      3.0
 Propers                     14855.308416     3401865.627      3.3
 Impropers                    3094.855920      643730.031      0.6
 Virial                       1242.417375       22363.513      0.0
 Stop-CM                      1237.953600       12379.536      0.0
 P-Coupling                  12379.423680       74276.542      0.1
 Calc-Ekin                   12379.436160      334244.776      0.3
 Lincs                       11760.476208      705628.572      0.7
 Lincs-Mat                  245113.083072      980452.332      1.0
 Constraint-V                23520.928704      188167.430      0.2
 Constraint-Vir              11760.452496      282250.860      0.3
-----------------------------------------------------------------------------
 Total                                      102874947.133    100.0
-----------------------------------------------------------------------------


     R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G

 Computing:          Nodes     Number     G-Cycles    Seconds     %
-----------------------------------------------------------------------
 Neighbor search         1      99195     8779.027     3300.3     3.8
 Force                   1     991941   188562.885    70886.8    81.7
 PME mesh                1     991941    18012.830     6771.6     7.8
 Write traj.             1         41       16.835        6.3     0.0
 Update                  1     991941     2272.379      854.3     1.0
 Constraints             1     991941    11121.146     4180.8     4.8
 Rest                    1               2162.628      813.0     0.9
-----------------------------------------------------------------------
 Total                   1             230927.730    86813.1   100.0
-----------------------------------------------------------------------
-----------------------------------------------------------------------
 PME spread/gather       1    1983882    17065.384     6415.4     7.4
 PME 3D-FFT              1    1983882      503.340      189.2     0.2
 PME solve               1     991941      427.136      160.6     0.2
</pre>
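<div><br></div><div>As a rough sanity check on the speed, those numbers work out to only about 2 ns/day; here is the back-of-the-envelope arithmetic as a small Python sketch (the step count and wall time come from the cycle-accounting table, and dt = 0.002 ps from my .mdp file further down the thread):</div>

```python
# Back-of-the-envelope throughput from the cycle accounting above.
dt_ps = 0.002            # timestep in ps, from the .mdp file
n_steps = 991941         # "Number" column of the Force/Update rows
wall_seconds = 86813.1   # "Seconds" column of the Total row

simulated_ns = n_steps * dt_ps / 1000.0            # ps -> ns
ns_per_day = simulated_ns * 86400.0 / wall_seconds # scale to one day
print(f"{ns_per_day:.2f} ns/day")                  # about 1.97 ns/day
```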
<div>-----------------------------------------------------------------------</div><div><br></div><div>Does that mean it's only using 1 node? That would explain the speed issues.</div><br><div class="gmail_quote">On Fri, Jan 28, 2011 at 12:46 PM, Justin A. Lemkul <span dir="ltr"><<a href="mailto:jalemkul@vt.edu">jalemkul@vt.edu</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;"><br>
<br>
Denny Frost wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
gromacs 4.5.1<br>
<br>
</blockquote>
<br>
Ah, what I posted was from 4.0.7. I wonder why that sort of output was eliminated in 4.5; it's quite useful. Sorry for leading you astray on that. No matter, the end of the .log file will still contain statistics about what's eating up all your simulation time.<br>
<br>
-Justin<br>
<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div><div></div><div class="h5">
On Fri, Jan 28, 2011 at 12:40 PM, Erik Marklund <<a href="mailto:erikm@xray.bmc.uu.se" target="_blank">erikm@xray.bmc.uu.se</a> <mailto:<a href="mailto:erikm@xray.bmc.uu.se" target="_blank">erikm@xray.bmc.uu.se</a>>> wrote:<br>
<br>
PME is still an Ewald sum.<br>
<br>
Erik<br>
<br>
Denny Frost skrev 2011-01-28 20.38:<br>
</div></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div><div></div><div class="h5">
I don't have any domain decomposition information like that in my<br>
log file. That's worrisome. The only other information I could<br>
find about PME and Ewald is this set of lines:<br>
<br>
Table routines are used for coulomb: TRUE<br>
Table routines are used for vdw: FALSE<br>
Will do PME sum in reciprocal space.<br>
<br>
++++ PLEASE READ AND CITE THE FOLLOWING REFERENCE ++++<br>
U. Essman, L. Perela, M. L. Berkowitz, T. Darden, H. Lee and L. G. Pedersen<br>
A smooth particle mesh Ewald method<br>
J. Chem. Phys. 103 (1995) pp. 8577-8592<br>
-------- -------- --- Thank You --- -------- --------<br>
<br>
Will do ordinary reciprocal space Ewald sum.<br>
Using a Gaussian width (1/beta) of 0.384195 nm for Ewald<br>
Cut-off's: NS: 1.2 Coulomb: 1.2 LJ: 1.2<br>
System total charge: 0.000<br>
Generated table with 4400 data points for Ewald.<br>
Tabscale = 2000 points/nm<br>
Generated table with 4400 data points for LJ6.<br>
Tabscale = 2000 points/nm<br>
Generated table with 4400 data points for LJ12.<br>
Tabscale = 2000 points/nm<br>
Configuring nonbonded kernels...<br>
Configuring standard C nonbonded kernels...<br>
Testing x86_64 SSE2 support... present.<br>
<br>
<br>
Why does it say it will do PME on one line, then ordinary Ewald later?<br>
<br>
On Fri, Jan 28, 2011 at 12:26 PM, Justin A. Lemkul<br></div></div><div><div></div><div class="h5">
<<a href="mailto:jalemkul@vt.edu" target="_blank">jalemkul@vt.edu</a> <mailto:<a href="mailto:jalemkul@vt.edu" target="_blank">jalemkul@vt.edu</a>>> wrote:<br>
<br>
<br>
<br>
Denny Frost wrote:<br>
<br>
I just realized that that was a very old mdp file. Here<br>
is an mdp file from my most recent run as well as what I<br>
think are the domain decomposition statistics.<br>
<br>
mdp file:<br>
title = BMIM+PF6<br>
cpp = /lib/cpp<br>
constraints = hbonds<br>
integrator = md<br>
dt = 0.002 ; ps !<br>
nsteps = 4000000 ; total 8ns.<br>
nstcomm = 1<br>
nstxout = 50000<br>
nstvout = 50000<br>
nstfout = 0<br>
nstlog = 5000<br>
nstenergy = 5000<br>
nstxtcout = 25000<br>
nstlist = 10<br>
ns_type = grid<br>
pbc = xyz<br>
coulombtype = PME<br>
vdwtype = Cut-off<br>
rlist = 1.2<br>
rcoulomb = 1.2<br>
rvdw = 1.2<br>
fourierspacing = 0.12<br>
pme_order = 4<br>
ewald_rtol = 1e-5<br>
; Berendsen temperature coupling is on in two groups<br>
Tcoupl = berendsen<br>
tc_grps = BMI PF6<br>
tau_t = 0.2 0.2<br>
ref_t = 300 300<br>
nsttcouple = 1<br>
; Energy monitoring<br>
energygrps = BMI PF6<br>
; Isotropic pressure coupling is now on<br>
Pcoupl = berendsen<br>
pcoupltype = isotropic<br>
;pc-grps = BMI PFF<br>
tau_p = 1.0<br>
ref_p = 1.0<br>
compressibility = 4.5e-5<br>
<br>
; Generate velocities at 300 K.<br>
gen_vel = yes<br>
gen_temp = 300.0<br>
gen_seed = 100000<br>
<br>
domain decomposition<br>
There are: 12800 Atoms<br>
Max number of connections per atom is 63<br>
Total number of connections is 286400<br>
Max number of graph edges per atom is 6<br>
Total number of graph edges is 24800<br>
<br>
<br>
More useful information is contained at the very top of the<br>
.log file, after the citations. An example from one of my own<br>
runs is:<br>
<br>
Linking all bonded interactions to atoms<br>
There are 2772 inter charge-group exclusions,<br>
will use an extra communication step for exclusion forces for PME<br>
<br>
The initial number of communication pulses is: X 2 Y 1<br>
The initial domain decomposition cell size is: X 1.05 nm Y 1.58 nm<br>
<br>
The maximum allowed distance for charge groups involved in<br>
interactions is:<br>
non-bonded interactions 1.400 nm<br>
(the following are initial values, they could change due to<br>
box deformation)<br>
two-body bonded interactions (-rdd) 1.400 nm<br>
multi-body bonded interactions (-rdd) 1.054 nm<br>
atoms separated by up to 5 constraints (-rcon) 1.054 nm<br>
<br>
When dynamic load balancing gets turned on, these settings<br>
will change to:<br>
The maximum number of communication pulses is: X 2 Y 2<br>
The minimum size for domain decomposition cells is 0.833 nm<br>
The requested allowed shrink of DD cells (option -dds) is: 0.80<br>
The allowed shrink of domain decomposition cells is: X 0.79 Y 0.53<br>
The maximum allowed distance for charge groups involved in<br>
interactions is:<br>
non-bonded interactions 1.400 nm<br>
two-body bonded interactions (-rdd) 1.400 nm<br>
multi-body bonded interactions (-rdd) 0.833 nm<br>
atoms separated by up to 5 constraints (-rcon) 0.833 nm<br>
<br>
<br>
Making 2D domain decomposition grid 9 x 6 x 1, home cell index<br>
0 0 0<br>
<br>
<br>
Also, the output under "DOMAIN DECOMPOSITION STATISTICS" (at<br>
the bottom of the file) would be useful. In addition, look for any<br>
notes about performance lost due to imbalance, waiting for<br>
PME, etc. These provide very detailed clues about how your<br>
system was treated.<br>
<br>
-Justin<br>
<br>
<br>
-- ========================================<br>
<br>
Justin A. Lemkul<br>
Ph.D. Candidate<br>
ICTAS Doctoral Scholar<br>
MILES-IGERT Trainee<br>
Department of Biochemistry<br>
Virginia Tech<br>
Blacksburg, VA<br></div></div>
jalemkul[at]<a href="http://vt.edu" target="_blank">vt.edu</a> <<a href="http://vt.edu" target="_blank">http://vt.edu</a>> | (540) 231-9080<div class="im"><br>
<a href="http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin" target="_blank">http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin</a><br>
<br>
========================================<br>
-- gmx-users mailing list <a href="mailto:gmx-users@gromacs.org" target="_blank">gmx-users@gromacs.org</a><br></div>
<mailto:<a href="mailto:gmx-users@gromacs.org" target="_blank">gmx-users@gromacs.org</a>><div class="im"><br>
<a href="http://lists.gromacs.org/mailman/listinfo/gmx-users" target="_blank">http://lists.gromacs.org/mailman/listinfo/gmx-users</a><br>
Please search the archive at<br>
<a href="http://www.gromacs.org/Support/Mailing_Lists/Search" target="_blank">http://www.gromacs.org/Support/Mailing_Lists/Search</a> before<br>
posting!<br>
Please don't post (un)subscribe requests to the list. Use the<br>
www interface or send it to <a href="mailto:gmx-users-request@gromacs.org" target="_blank">gmx-users-request@gromacs.org</a><br></div>
<mailto:<a href="mailto:gmx-users-request@gromacs.org" target="_blank">gmx-users-request@gromacs.org</a>>.<div class="im"><br>
Can't post? Read <a href="http://www.gromacs.org/Support/Mailing_Lists" target="_blank">http://www.gromacs.org/Support/Mailing_Lists</a><br>
<br>
<br>
</div></blockquote><div class="im">
<br>
<br>
-- -----------------------------------------------<br>
Erik Marklund, PhD student<br>
Dept. of Cell and Molecular Biology, Uppsala University.<br>
Husargatan 3, Box 596, 75124 Uppsala, Sweden<br>
phone: +46 18 471 4537 fax: +46 18 511 755<br></div>
<a href="mailto:erikm@xray.bmc.uu.se" target="_blank">erikm@xray.bmc.uu.se</a> <mailto:<a href="mailto:erikm@xray.bmc.uu.se" target="_blank">erikm@xray.bmc.uu.se</a>> <a href="http://folding.bmc.uu.se/" target="_blank">http://folding.bmc.uu.se/</a><div class="im">
<br>
<br>
<br>
</div><div class="im"></div><div class="im">
<br>
<br>
</div></blockquote><div><div></div><div class="h5">
<br>
</div></div></blockquote></div><br></div>