I'm leaning toward the possibility that it is actually just running 8 copies of the same job on different processors. My question is: how does GROMACS 4.5 know how many processors it has available for parallelizing a job? Is that specified in grompp, or does it just detect it?
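A note on the mechanics, as I understand them: with an MPI build of GROMACS 4.x the processor count is not set by grompp at all. The MPI launcher starts N copies of mdrun_mpi, each copy asks the MPI library at runtime how many ranks the job has, and mdrun sets up the domain decomposition to match. A minimal sketch, assuming an MPI-linked binary named mdrun_mpi:

    mpiexec -np 8 mdrun_mpi -s md.tpr   # launcher starts 8 ranks; mdrun decomposes 8 ways
    mpiexec -np 1 mdrun_mpi -s md.tpr   # same binary and same .tpr, serial run

The grompp -np option from the 3.x series is gone in 4.x; the .tpr file is independent of the processor count.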
<br><div class="gmail_quote">On Fri, Jan 28, 2011 at 1:32 PM, Justin A. Lemkul <span dir="ltr">&lt;<a href="mailto:jalemkul@vt.edu">jalemkul@vt.edu</a>&gt;</span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
<div class="im"><br>
<br>
Denny Frost wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
>> Here's my grompp command:
>>
>> grompp_d -nice 0 -v -f md.mdp -c ReadyForMD.gro -o md.tpr -p top.top
>>
>> and my mdrun command is this:
>>
>> time mpiexec mdrun_mpi -np 8 -cpt 30000 -nice 0 -nt 1 -s $PBS_O_WORKDIR/md.tpr -o $PBS_O_WORKDIR/mdDone.trr -x $PBS_O_WORKDIR/mdDone.xtc -c $PBS_O_WORKDIR/mdDone.gro -e $PBS_O_WORKDIR/md.edr -g $PBS_O_WORKDIR/md.log 1> $PBS_JOBID.pgm.out 4> $PBS_JOBID.pgm.err
>
> The -np option of mdrun is nonexistent, but mdrun does not check for proper command-line arguments, so you won't get an error. But then you've said that all 8 processors are active, so I still suspect that mdrun was compiled incorrectly, or in such a way that it's incompatible with your system. The output from the .log file indicates that only one processor was used. Maybe your admins can help you with this one, if the jobs spit out any useful diagnostic information.
>
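A quick sanity check along those lines (a sketch, assuming a dynamically linked binary; the MPI library name varies by flavor):

    ldd $(which mdrun_mpi) | grep -i mpi   # should list libmpi or similar if the binary is really MPI-linked
    grep -i nodes md.log                   # the accounting tables at the end of the log report a Nodes count

If ldd turns up no MPI library, mpiexec is most likely launching 8 independent serial runs.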
> For our cluster, we use e.g.:
>
> mpirun -np 8 mdrun_mpi -deffnm md
>
> -Justin
>
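For a PBS job like the one in the original post, the same idea might look like this in a job script (a sketch only; the resource request and launcher syntax are assumptions, so adjust them to the cluster):

    #!/bin/bash
    #PBS -l nodes=1:ppn=8            # hypothetical request: 8 cores on one node
    cd $PBS_O_WORKDIR
    # the parallel width comes from the MPI launcher, not from mdrun's own options
    mpiexec -np 8 mdrun_mpi -deffnm md

It may also be worth dropping -nt 1 from the original command: if the binary was in fact built with thread parallelism rather than MPI, -nt 1 pins it to a single thread, which would match the single-node accounting below.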
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div class="im">
I know the -cpt option is 30000 because I don&#39;t want a checkpoint file because every time it tries to make it, it fails due to quota issues and kills the job.  I&#39;m not sure why this happens, but I think it&#39;s a separate issue to take up with my supercomputing facility.<br>

<br></div><div class="im">
>> On Fri, Jan 28, 2011 at 1:18 PM, Justin A. Lemkul <jalemkul@vt.edu> wrote:
>>
>>> Denny Frost wrote:
>>>
>>>> all 8 nodes are running at full capacity, though
>>>
>>> What is your mdrun command line? How did you compile it? What can happen is that something goes wrong during installation, so you think you have an MPI-enabled binary, but it is simply executing 8 copies of the same job.
>>>
>>> -Justin
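One outward sign that distinguishes the two cases (a sketch; file names are taken from the command quoted above): eight independent copies each believe they are the only process, so they all try to write md.log and friends in the same directory, and GROMACS then backs the clobbered files up as #md.log.1#, #md.log.2#, and so on, whereas a single 8-rank job writes each output exactly once.

    ls -l \#*\#                  # leftover backup files suggest several copies fought over the same outputs
    ps -u $USER | grep -c mdrun  # 8 processes appear in either case, so also check the Nodes count in the log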
>>>
>>>> On Fri, Jan 28, 2011 at 1:13 PM, Justin A. Lemkul <jalemkul@vt.edu> wrote:
>>>>
>>>>> Denny Frost wrote:
>>>>>
>>>>>> Here's what I've got:
>>>>>>
>>>>>> M E G A - F L O P S   A C C O U N T I N G
>>>>>>
>>>>>>    RF=Reaction-Field  FE=Free Energy  SCFE=Soft-Core/Free Energy
>>>>>>    T=Tabulated        W3=SPC/TIP3p    W4=TIP4p (single or pairs)
>>>>>>    NF=No Forces
>>>>>>
>>>>>>  Computing:                     M-Number          M-Flops  % Flops
>>>>>> -----------------------------------------------------------------------------
>>>>>>  Coul(T) + VdW(T)         1219164.751609    82903203.109     80.6
>>>>>>  Outer nonbonded loop       25980.879385      259808.794      0.3
>>>>>>  Calc Weights               37138.271040     1336977.757      1.3
>>>>>>  Spread Q Bspline          792283.115520     1584566.231      1.5
>>>>>>  Gather F Bspline          792283.115520     4753698.693      4.6
>>>>>>  3D-FFT                    119163.856212      953310.850      0.9
>>>>>>  Solve PME                   2527.465668      161757.803      0.2
>>>>>>  NS-Pairs                   47774.705001     1003268.805      1.0
>>>>>>  Reset In Box                 371.386080        1114.158      0.0
>>>>>>  Shift-X                    24758.847360      148553.084      0.1
>>>>>>  CG-CoM                      1237.953600        3713.861      0.0
>>>>>>  Angles                     18569.135520     3119614.767      3.0
>>>>>>  Propers                    14855.308416     3401865.627      3.3
>>>>>>  Impropers                   3094.855920      643730.031      0.6
>>>>>>  Virial                      1242.417375       22363.513      0.0
>>>>>>  Stop-CM                     1237.953600       12379.536      0.0
>>>>>>  P-Coupling                 12379.423680       74276.542      0.1
>>>>>>  Calc-Ekin                  12379.436160      334244.776      0.3
>>>>>>  Lincs                      11760.476208      705628.572      0.7
>>>>>>  Lincs-Mat                 245113.083072      980452.332      1.0
>>>>>>  Constraint-V               23520.928704      188167.430      0.2
>>>>>>  Constraint-Vir             11760.452496      282250.860      0.3
>>>>>> -----------------------------------------------------------------------------
>>>>>>  Total                                     102874947.133    100.0
>>>>>> -----------------------------------------------------------------------------
>>>>>>
>>>>>>  R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G
>>>>>>
>>>>>>  Computing:          Nodes     Number     G-Cycles    Seconds      %
>>>>>> -----------------------------------------------------------------------
>>>>>>  Neighbor search         1      99195     8779.027     3300.3    3.8
>>>>>>  Force                   1     991941   188562.885    70886.8   81.7
>>>>>>  PME mesh                1     991941    18012.830     6771.6    7.8
>>>>>>  Write traj.             1         41       16.835        6.3    0.0
>>>>>>  Update                  1     991941     2272.379      854.3    1.0
>>>>>>  Constraints             1     991941    11121.146     4180.8    4.8
>>>>>>  Rest                    1                2162.628      813.0    0.9
>>>>>> -----------------------------------------------------------------------
>>>>>>  Total                   1              230927.730    86813.1  100.0
>>>>>> -----------------------------------------------------------------------
>>>>>> -----------------------------------------------------------------------
>>>>>>  PME spread/gather       1    1983882    17065.384     6415.4    7.4
>>>>>>  PME 3D-FFT              1    1983882      503.340      189.2    0.2
>>>>>>  PME solve               1     991941      427.136      160.6    0.2
>>>>>> -----------------------------------------------------------------------
>>>>>>
>>>>>> Does that mean it's only using 1 node? That would explain the speed issues.
>>>>>
>>>>> That's what it looks like to me.
>>>>>
>>>>> -Justin
>
> --
> ========================================
>
> Justin A. Lemkul
> Ph.D. Candidate
> ICTAS Doctoral Scholar
> MILES-IGERT Trainee
> Department of Biochemistry
> Virginia Tech
> Blacksburg, VA
> jalemkul[at]vt.edu | (540) 231-9080
> http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>
> ========================================
>
> --
> gmx-users mailing list    gmx-users@gromacs.org
> http://lists.gromacs.org/mailman/listinfo/gmx-users
> Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
> Please don't post (un)subscribe requests to the list. Use the www interface or send it to gmx-users-request@gromacs.org.
> Can't post? Read http://www.gromacs.org/Support/Mailing_Lists