<html><head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8"></head><body>Oops, I meant /usr/local/gromacs/bin in PATH.<br>
<br>
Also, 4.5.4 supports threading if it was compiled that way. Most clusters don't use it because nodes in separate chassis usually don't share memory with each other. We see much worse performance using threading over ScaleMP than with the typical OpenMPI setup on identical hardware and infrastructure.<br>
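For comparison, the two launch styles look like this (thread/process counts and file names are illustrative; the threaded style only works within one shared-memory node):

```shell
# Threaded build: a single process spawns -nt threads; no mpirun needed,
# but all threads must live on one shared-memory node.
mdrun -nt 8 -s topol.tpr

# MPI build: mpirun launches 8 independent ranks that communicate over
# MPI, so they can span nodes that do not share memory.
mpirun -machinefile mymachines -np 8 mdrun -s topol.tpr
```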
-- <br>
Sent from my Android phone with K-9 Mail. Please excuse my brevity.<br><br><div class="gmail_quote">"Peter C. Lai" <pcl@uab.edu> wrote:<blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">
Possibly. Make sure PATH includes /usr/local/bin and LD_LIBRARY_PATH includes the path where libmd_mpi.so and libgmx_mpi.so are located.<br>
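As a sketch of that fix (the real prefix /usr/local/gromacs comes from the "which mdrun" output quoted below; the lib subdirectory is an assumption, and the /tmp mock binary only stands in for a real install so the PATH mechanics are visible):

```shell
#!/bin/sh
# Mock mdrun so the PATH mechanics can be demonstrated without a real
# GROMACS install; on a real system the directories would be something
# like /usr/local/gromacs/bin and /usr/local/gromacs/lib (assumed).
mkdir -p /tmp/gmxdemo/bin
printf '#!/bin/sh\necho "mock mdrun"\n' > /tmp/gmxdemo/bin/mdrun
chmod +x /tmp/gmxdemo/bin/mdrun

# The actual fix: prepend the GROMACS dirs (e.g. in ~/.bashrc) so that
# non-interactive shells started by mpirun inherit them too.
export PATH=/tmp/gmxdemo/bin:$PATH
export LD_LIBRARY_PATH=/tmp/gmxdemo/lib:$LD_LIBRARY_PATH

command -v mdrun   # prints /tmp/gmxdemo/bin/mdrun
mdrun              # prints: mock mdrun
```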
<br>
Try running mdrun -v and redirecting output to a log file after the mpirun line (e.g. mpirun blah -np 8 mdrun -v -deffnm etcetc > mpilog). The contents of mpilog will tell you what MPI is trying to do.<br>
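Spelled out (the machinefile and -deffnm names are placeholders), with stderr captured as well, since mdrun's -v progress output typically goes to stderr:

```shell
# Placeholders: mymachines / md are whatever your run actually uses.
mpirun -machinefile mymachines -np 8 mdrun -v -deffnm md > mpilog 2>&1

# Afterwards, inspect what MPI and mdrun reported:
less mpilog
```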
<br><div class="gmail_quote">Andrew DeYoung <adeyoung@andrew.cmu.edu> wrote:<blockquote class="gmail_quote" style="margin: 0pt 0pt 0pt 0.8ex; border-left: 1px solid rgb(204, 204, 204); padding-left: 1ex;">
<pre style="white-space: pre-wrap; word-wrap:break-word; font-family: monospace">Hi,<br><br>Typically, I use Gromacs 4.5.5 compiled with automatic threading. As you<br>know, automatic threading is awesome because it allows me to start parallel<br>runs without calling mpirun. So on version 4.5.5, I can start a job on eight<br>CPUs using simply the command:<br><br>mdrun -s topol.tpr -nt 8<br><br>However, now I am using a different node on my department's cluster, and<br>this node instead has Gromacs 4.5.4 (compiled without automatic threading).<br>So, I must use mpirun to start parallel runs. I have tried this command:<br><br>mpirun -machinefile mymachines -np 8 mdrun -s topol.tpr<br><br>where mymachines is an (extensionless) file containing only the text "c60<br>slots=8". (c60 is the name of the node that I am using.)<br><br>I get this error message:<br><br>"Missing: program name. Program mdrun either does not exist, is not<br>executable, or is an erroneous argument to mpirun."<br><br>This is strange, because mdrun is, I think, in my path. For example, if I<br>type "mdrun -h", I get the manual page for mdrun (version 4.5.4).<br><br>Then I tried the command "which mdrun", and it gave me this output:<br><br>/usr/local/gromacs/bin/mdrun<br><br>So, next I tried to call mdrun via mpirun using the specific path for mdrun:<br><br>mpirun -machinefile mymachines -np 8 /usr/local/gromacs/bin/mdrun -s<br>topol.tpr<br><br>This starts running my simulation, but when I look in "top", the simulation<br>is only running on a single CPU; there is only one entry for mdrun in "top",<br>and it has only %CPU=100 (not eight different entries for mdrun, nor one<br>entry with %CPU=800). Also, the simulation is going at the speed I would<br>expect for running on a single CPU -- it is very slow, so I am convinced<br>that, as "top" suggests, mdrun is running on only one CPU.<br><br>Strangely, my colleagues are able to run jobs in parallel using the exact<br>commands that I described above. So apparently something is wrong with my<br>user ID, although there are no error messages (except the error message<br>about "Missing: program name" that I described).<br><br>If you have time, do you have any suggestions for other things that I can<br>try? Do you think that something could be wrong with my bashrc file?<br><br>Thanks for your time!<br><br>Andrew DeYoung<br>Carnegie Mellon University<br><br>-- <br>gmx-users mailing list gmx-users@gromacs.org<br><a href="http://lists.gromacs.org/mailman/listinfo/gmx-users">http://lists.gromacs.org/mailman/listinfo/gmx-users</a><br>Please search the archive at <a href="http://www.gromacs.org/Support/Mailing_Lists/Search">http://www.gromacs.org/Support/Mailing_Lists/Search</a> before posting!<br>Please don't post (un)subscribe requests to the list. Use the <br>www interface or send it to gmx-users-request@gromacs.org.<br>Can't post? Read <a href="http://www.gromacs.org/Support/Mailing_Lists">http://www.gromacs.org/Support/Mailing_Lists</a><br></pre></blockquote></div></blockquote></div></body></html>