I believe that a single machine with multiple processors (like a
Dell server with two dual-core CPUs) should be very popular among end users,
because not everyone can afford a 32-node cluster or has access to a
supercomputer center. <br><br>Could we have a detailed tutorial on "running gromacs on a single machine with multiple processors"? Thanks.<br><br>Best,<br><br>Ocean<br><br><div><span class="gmail_quote">On 12/21/06, <b class="gmail_sendername">
David van der Spoel</b> <<a href="mailto:spoel@xray.bmc.uu.se">spoel@xray.bmc.uu.se</a>> wrote:</span><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
Yang Ye wrote:<br>> Normally we use the following commands when compiling gromacs:<br>> ./configure ... --program-suffix=_mpi --enable-mpi ...<br>><br>> So we will have mdrun_mpi in the end.<br>><br>> The usage of mdrun_mpi is the same as mdrun, but -np, -replex and other
<br>> switches are activated.<br>><br>> That statement in the user manual is to clarify that you don't need an<br>> MPI-enabled mdrun to run on a multiprocessor machine. But if you want to<br>> tap the capability of a multiprocessor machine, you need to use MPI
<br>> and use mdrun_mpi.<br>><br>As Erik stated, different MPI implementations have different conventions<br>for starting jobs. The comment in the manual is probably a remnant of<br>an old SGI MPI implementation. I will remove the comment.
<br><br>Otherwise, a little trial and error and man XXX usually does the job...<br><br>--<br>David.<br>________________________________________________________________________<br>David van der Spoel, PhD, Assoc. Prof., Molecular Biophysics group,
<br>Dept. of Cell and Molecular Biology, Uppsala University.<br>Husargatan 3, Box 596, 75124 Uppsala, Sweden<br>phone: 46 18 471 4205 fax: 46 18 511 755<br><a href="mailto:spoel@xray.bmc.uu.se">spoel@xray.bmc.uu.se
</a> <a href="mailto:spoel@gromacs.org">spoel@gromacs.org</a> <a href="http://folding.bmc.uu.se">http://folding.bmc.uu.se</a><br>++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++<br>_______________________________________________
<br>gmx-users mailing list <a href="mailto:gmx-users@gromacs.org">gmx-users@gromacs.org</a><br><a href="http://www.gromacs.org/mailman/listinfo/gmx-users">http://www.gromacs.org/mailman/listinfo/gmx-users</a><br>Please don't post (un)subscribe requests to the list. Use the
<br>www interface or send it to <a href="mailto:gmx-users-request@gromacs.org">gmx-users-request@gromacs.org</a>.<br>Can't post? Read <a href="http://www.gromacs.org/mailing_lists/users.php">http://www.gromacs.org/mailing_lists/users.php
</a><br></blockquote></div><br>
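<br>For reference, the recipe quoted above boils down to something like the following sketch. The configure flags come from Yang Ye's message; the install step, the mpirun launcher syntax, the processor count, and the input file name topol.tpr are assumptions that depend on your MPI implementation and setup, so adjust them for your system.<br>

```shell
# Build an MPI-enabled mdrun (configure flags as quoted above)
./configure --program-suffix=_mpi --enable-mpi
make
make install

# Run on a single machine with 4 processors.
# The launcher name and syntax vary between MPI implementations
# (mpirun shown here); topol.tpr is a placeholder run-input file.
mpirun -np 4 mdrun_mpi -np 4 -s topol.tpr
```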