In the log file, when GROMACS specifies "Nodes," does it mean processors?

On Fri, Jan 28, 2011 at 1:44 PM, Justin A. Lemkul <jalemkul@vt.edu> wrote:
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;"><div class="im"><br>
<br>
Denny Frost wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
I&#39;m leaning toward the possibility that it is actually only running 8 copies of the same job on different processors.  My question is how does gromacs4.5 know how many processors it has available to parallelize a job?  Is it specified in grompp or does it just detect it? <br>

</blockquote>
<br></div>
If you&#39;re using MPI, it comes from mpiexec/mpirun/whatever.  Setting a proper flag there is what tells mdrun how many nodes to use.<br><font color="#888888">
<br>
-Justin<br>
<br>
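For illustration, the distinction looks roughly like this (a minimal sketch;
the launcher name and core count are placeholders for whatever your cluster
provides, not something prescribed by this thread):

    # The MPI launcher decides how many ranks mdrun_mpi gets:
    mpiexec -np 8 mdrun_mpi -deffnm md

    # Not what was intended: -np is not an mdrun option, and mdrun does not
    # complain about unrecognized flags, so the rank count is still whatever
    # the launcher (or lack of one) provides:
    mdrun_mpi -np 8 -deffnm md
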
On Fri, Jan 28, 2011 at 1:32 PM, Justin A. Lemkul <jalemkul@vt.edu> wrote:

    Denny Frost wrote:

        Here's my grompp command:

        grompp_d -nice 0 -v -f md.mdp -c ReadyForMD.gro -o md.tpr -p top.top

        and my mdrun command is this:

        time mpiexec mdrun_mpi -np 8 -cpt 30000 -nice 0 -nt 1 -s
        $PBS_O_WORKDIR/md.tpr -o $PBS_O_WORKDIR/mdDone.trr -x
        $PBS_O_WORKDIR/mdDone.xtc -c $PBS_O_WORKDIR/mdDone.gro -e
        $PBS_O_WORKDIR/md.edr -g $PBS_O_WORKDIR/md.log 1>
        $PBS_JOBID.pgm.out 4> $PBS_JOBID.pgm.err

    The -np option of mdrun is nonexistent, but mdrun does not check for
    proper command line arguments, so you won't get an error.  But then
    you've said that 8 processors are active, so I still suspect that mdrun
    was compiled incorrectly or in such a way that it's incompatible with
    your system.  The output from the .log file indicates that only one
    processor was used.  Maybe your admins can help you on this one, if the
    jobs spit out any useful diagnostic information.

    For our cluster, we use e.g.:

    mpirun -np 8 mdrun_mpi -deffnm md

    -Justin

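    Along those lines, a PBS submission script might look roughly like the
    sketch below.  This is only an illustration assembled from the commands
    quoted in this thread; the resource request, walltime, and file names
    are placeholders for whatever your cluster actually requires:

        #!/bin/bash
        #PBS -l nodes=1:ppn=8
        #PBS -l walltime=24:00:00

        cd $PBS_O_WORKDIR

        # The rank count goes to the MPI launcher; mdrun_mpi itself takes
        # no processor-count flag.
        mpiexec -np 8 mdrun_mpi -deffnm md \
            1> $PBS_JOBID.pgm.out 2> $PBS_JOBID.pgm.err
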
        I know the -cpt option is set to 30000; that's because I don't want
        a checkpoint file: every time mdrun tries to write one, it fails due
        to quota issues and kills the job.  I'm not sure why this happens,
        but I think it's a separate issue to take up with my supercomputing
        facility.

        On Fri, Jan 28, 2011 at 1:18 PM, Justin A. Lemkul
        <jalemkul@vt.edu> wrote:

           Denny Frost wrote:

               all 8 nodes are running at full capacity, though

           What is your mdrun command line?  How did you compile it?  What
           can happen is something went wrong during installation, so you
           think you have an MPI-enabled binary, but it is simply executing
           8 copies of the same job.

           -Justin

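           One quick way to test that suspicion (a sketch only, assuming a
           dynamically linked binary on a Linux cluster; a statically linked
           build will not show its MPI library this way):

               # Confirm which binary the job actually picks up, and
               # whether it is linked against an MPI library.
               which mdrun_mpi
               ldd $(which mdrun_mpi) | grep -i mpi
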
               On Fri, Jan 28, 2011 at 1:13 PM, Justin A. Lemkul
               <jalemkul@vt.edu> wrote:

                  Denny Frost wrote:

                     Here's what I've got:

                     M E G A - F L O P S   A C C O U N T I N G

                       RF=Reaction-Field  FE=Free Energy   SCFE=Soft-Core/Free Energy
                       T=Tabulated        W3=SPC/TIP3p     W4=TIP4p (single or pairs)
                       NF=No Forces

                     Computing:                   M-Number        M-Flops  % Flops
                     -------------------------------------------------------------
                     Coul(T) + VdW(T)       1219164.751609   82903203.109     80.6
                     Outer nonbonded loop     25980.879385     259808.794      0.3
                     Calc Weights             37138.271040    1336977.757      1.3
                     Spread Q Bspline        792283.115520    1584566.231      1.5
                     Gather F Bspline        792283.115520    4753698.693      4.6
                     3D-FFT                  119163.856212     953310.850      0.9
                     Solve PME                 2527.465668     161757.803      0.2
                     NS-Pairs                 47774.705001    1003268.805      1.0
                     Reset In Box               371.386080       1114.158      0.0
                     Shift-X                  24758.847360     148553.084      0.1
                     CG-CoM                    1237.953600       3713.861      0.0
                     Angles                   18569.135520    3119614.767      3.0
                     Propers                  14855.308416    3401865.627      3.3
                     Impropers                 3094.855920     643730.031      0.6
                     Virial                    1242.417375      22363.513      0.0
                     Stop-CM                   1237.953600      12379.536      0.0
                     P-Coupling               12379.423680      74276.542      0.1
                     Calc-Ekin                12379.436160     334244.776      0.3
                     Lincs                    11760.476208     705628.572      0.7
                     Lincs-Mat               245113.083072     980452.332      1.0
                     Constraint-V             23520.928704     188167.430      0.2
                     Constraint-Vir           11760.452496     282250.860      0.3
                     -------------------------------------------------------------
                     Total                                  102874947.133    100.0
                     -------------------------------------------------------------

                     R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G

                     Computing:          Nodes     Number    G-Cycles    Seconds      %
                     ------------------------------------------------------------------
                     Neighbor search         1      99195    8779.027     3300.3    3.8
                     Force                   1     991941  188562.885    70886.8   81.7
                     PME mesh                1     991941   18012.830     6771.6    7.8
                     Write traj.             1         41      16.835        6.3    0.0
                     Update                  1     991941    2272.379      854.3    1.0
                     Constraints             1     991941   11121.146     4180.8    4.8
                     Rest                    1               2162.628      813.0    0.9
                     ------------------------------------------------------------------
                     Total                   1             230927.730    86813.1  100.0
                     ------------------------------------------------------------------
                     ------------------------------------------------------------------
                     PME spread/gather       1    1983882   17065.384     6415.4    7.4
                     PME 3D-FFT              1    1983882     503.340      189.2    0.2
                     PME solve               1     991941     427.136      160.6    0.2
                     ------------------------------------------------------------------

                     Does that mean it's only using 1 node?  That would
                     explain the speed issues.

                  That's what it looks like to me.

                  -Justin


--
========================================

Justin A. Lemkul
Ph.D. Candidate
ICTAS Doctoral Scholar
MILES-IGERT Trainee
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin

========================================
--
gmx-users mailing list    gmx-users@gromacs.org
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
Please don't post (un)subscribe requests to the list. Use the www interface or send it to gmx-users-request@gromacs.org.
Can't post? Read http://www.gromacs.org/Support/Mailing_Lists