Here's my grompp command:<div><br></div><div>grompp_d -nice 0 -v -f md.mdp -c ReadyForMD.gro -o md.tpr -p top.top</div><div><br></div><div>and my mdrun command is this: </div><div><br></div><div>time mpiexec mdrun_mpi -np 8 -cpt 30000 -nice 0 -nt 1 -s $PBS_O_WORKDIR/md.tpr -o $PBS_O_WORKDIR/mdDone.trr -x $PBS_O_WORKDIR/mdDone.xtc -c $PBS_O_WORKDIR/mdDone.gro -e $PBS_O_WORKDIR/md.edr -g $PBS_O_WORKDIR/md.log 1> $PBS_JOBID.pgm.out 4> $PBS_JOBID.pgm.err</div>
<div><br></div><div>I set the -cpt option to 30000 because I don't want a checkpoint file: -cpt is the checkpoint interval in minutes, and every time mdrun tries to write a checkpoint it fails due to quota issues and kills the job. I'm not sure why that happens, but I think it's a separate issue to take up with my supercomputing facility.<br>
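<div><br></div><div>One more thought, though this is just a guess on my part and not something I've verified on our cluster: maybe the -np count belongs to mpiexec itself rather than to mdrun_mpi, and maybe the 4> in my redirection should be 2> so that stderr actually lands in the .err file (4> writes to file descriptor 4, which the shell doesn't normally have open). A minimal sketch of what I mean:</div><div><br></div>

```shell
# Guess: give the rank count to the MPI launcher, not to mdrun_mpi,
# and redirect stderr (fd 2), not fd 4, into the .err file, e.g.:
#   mpiexec -np 8 mdrun_mpi -s md.tpr ... 1> $PBS_JOBID.pgm.out 2> $PBS_JOBID.pgm.err

# The redirection part can be checked on its own:
sh -c 'echo to-stdout; echo to-stderr >&2' 1> pgm.out 2> pgm.err
cat pgm.out    # to-stdout
cat pgm.err    # to-stderr
```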
<br><div class="gmail_quote">On Fri, Jan 28, 2011 at 1:18 PM, Justin A. Lemkul <span dir="ltr"><<a href="mailto:jalemkul@vt.edu">jalemkul@vt.edu</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
<div class="im"><br>
<br>
Denny Frost wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
all 8 nodes are running at full capacity, though<br>
<br>
</blockquote>
<br></div>
What is your mdrun command line? How did you compile it? What can happen is that something goes wrong during installation, so you think you have an MPI-enabled binary when in fact it is simply executing 8 copies of the same job.<br>
<br>
-Justin<br>
<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div><div></div><div class="h5">
On Fri, Jan 28, 2011 at 1:13 PM, Justin A. Lemkul <<a href="mailto:jalemkul@vt.edu" target="_blank">jalemkul@vt.edu</a>> wrote:<br>
<br>
<br>
<br>
Denny Frost wrote:<br>
<br>
Here's what I've got:<br>
<br>
<pre>
 M E G A - F L O P S   A C C O U N T I N G

   RF=Reaction-Field  FE=Free Energy  SCFE=Soft-Core/Free Energy
   T=Tabulated        W3=SPC/TIP3p    W4=TIP4p (single or pairs)
   NF=No Forces

 Computing:                  M-Number         M-Flops   % Flops
-----------------------------------------------------------------------------
 Coul(T) + VdW(T)      1219164.751609   82903203.109      80.6
 Outer nonbonded loop    25980.879385     259808.794       0.3
 Calc Weights            37138.271040    1336977.757       1.3
 Spread Q Bspline       792283.115520    1584566.231       1.5
 Gather F Bspline       792283.115520    4753698.693       4.6
 3D-FFT                 119163.856212     953310.850       0.9
 Solve PME                2527.465668     161757.803       0.2
 NS-Pairs                47774.705001    1003268.805       1.0
 Reset In Box              371.386080       1114.158       0.0
 Shift-X                 24758.847360     148553.084       0.1
 CG-CoM                   1237.953600       3713.861       0.0
 Angles                  18569.135520    3119614.767       3.0
 Propers                 14855.308416    3401865.627       3.3
 Impropers                3094.855920     643730.031       0.6
 Virial                   1242.417375      22363.513       0.0
 Stop-CM                  1237.953600      12379.536       0.0
 P-Coupling              12379.423680      74276.542       0.1
 Calc-Ekin               12379.436160     334244.776       0.3
 Lincs                   11760.476208     705628.572       0.7
 Lincs-Mat              245113.083072     980452.332       1.0
 Constraint-V            23520.928704     188167.430       0.2
 Constraint-Vir          11760.452496     282250.860       0.3
-----------------------------------------------------------------------------
 Total                                102874947.133     100.0
-----------------------------------------------------------------------------
</pre>
<br>
<br>
<pre>
 R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G

 Computing:          Nodes     Number     G-Cycles    Seconds      %
-----------------------------------------------------------------------
 Neighbor search         1      99195     8779.027     3300.3    3.8
 Force                   1     991941   188562.885    70886.8   81.7
 PME mesh                1     991941    18012.830     6771.6    7.8
 Write traj.             1         41       16.835        6.3    0.0
 Update                  1     991941     2272.379      854.3    1.0
 Constraints             1     991941    11121.146     4180.8    4.8
 Rest                    1                2162.628      813.0    0.9
-----------------------------------------------------------------------
 Total                   1              230927.730    86813.1  100.0
-----------------------------------------------------------------------
-----------------------------------------------------------------------
 PME spread/gather       1    1983882    17065.384     6415.4    7.4
 PME 3D-FFT              1    1983882      503.340      189.2    0.2
 PME solve               1     991941      427.136      160.6    0.2
-----------------------------------------------------------------------
</pre>
<br>
Does that mean it's only using 1 node? That would explain the<br>
speed issues.<br>
<br>
<br>
That's what it looks like to me.<br>
<br>
<br>
-Justin<br>
<br>
-- <br>
========================================<br>
<br>
Justin A. Lemkul<br>
Ph.D. Candidate<br>
ICTAS Doctoral Scholar<br>
MILES-IGERT Trainee<br>
Department of Biochemistry<br>
Virginia Tech<br>
Blacksburg, VA<br></div></div>
jalemkul[at]<a href="http://vt.edu" target="_blank">vt.edu</a> | (540) 231-9080<div class="im"><br>
<a href="http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin" target="_blank">http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin</a><br>
<br>
========================================<br>
-- <br>
gmx-users mailing list <a href="mailto:gmx-users@gromacs.org" target="_blank">gmx-users@gromacs.org</a><br></div>
<div class="im"><br>
<a href="http://lists.gromacs.org/mailman/listinfo/gmx-users" target="_blank">http://lists.gromacs.org/mailman/listinfo/gmx-users</a><br>
Please search the archive at<br>
<a href="http://www.gromacs.org/Support/Mailing_Lists/Search" target="_blank">http://www.gromacs.org/Support/Mailing_Lists/Search</a> before posting!<br>
Please don't post (un)subscribe requests to the list. Use the www<br>
interface or send it to <a href="mailto:gmx-users-request@gromacs.org" target="_blank">gmx-users-request@gromacs.org</a>.<br>
<div class="im"><br>
Can't post? Read <a href="http://www.gromacs.org/Support/Mailing_Lists" target="_blank">http://www.gromacs.org/Support/Mailing_Lists</a><br>
<br>
<br>
</div></blockquote><div><div></div><div class="h5">
<br>
</div></div></blockquote></div><br></div>