For the latest versions of GROMACS, it is enough to specify the number of processors at the mpiexec level; you do not have to repeat it at the mdrun_mpi level. Try:<br>/usr/local/bin/mpiexec -n 4 -verbose /usr/local/applications/bioinformatics/gromacs-3.3/bin/mdrun_mpi <other mdrun parameters>.<br><br>Hope this helps.<br>Cheers,<br>Kota.<br><br><br><div><span class="gmail_quote">On 12/1/05, <b class="gmail_sendername">Wong, RYM (Richard)</b> <<a href="mailto:R.Y.M.Wong@rl.ac.uk">
R.Y.M.Wong@rl.ac.uk</a>> wrote:</span><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;"><br><br>-----Original Message-----<br>From: <a href="mailto:gmx-users-bounces@gromacs.org">
gmx-users-bounces@gromacs.org</a><br>[mailto:<a href="mailto:gmx-users-bounces@gromacs.org">gmx-users-bounces@gromacs.org</a>]On Behalf Of Florian Haberl<br>Sent: 01 December 2005 10:12<br>To: Discussion list for GROMACS users
<br>Subject: Re: [gmx-users] FW: Gromacs version 3.3<br><br><br>hi,<br><br>On Thursday 01 December 2005 10:19, Wong, RYM (Richard) wrote:<br>> -----Original Message-----<br>> From: <a href="mailto:gmx-users-bounces@gromacs.org">
gmx-users-bounces@gromacs.org</a><br>> [mailto:<a href="mailto:gmx-users-bounces@gromacs.org">gmx-users-bounces@gromacs.org</a>]On Behalf Of David van der Spoel<br>> Sent: 30 November 2005 20:01<br>> To: Discussion list for GROMACS users
<br>> Subject: Re: [gmx-users] FW: Gromacs version 3.3<br>><br>> Wong, RYM (Richard) wrote:<br>> >>Hi,<br>> >><br>> >>I have installed Gromacs version 3.3 on a machine running Linux<br>
> >> 2.4.21-37.ELsmp.<br>> ><br>> >The configuration command syntax I used to compile the MPI version of<br>> > Gromacs-3.3:<br>><br>> What kind of machine?<br>><br>> The system contains 20 dual Intel Xeon processors.
<br>> The version of MPI is MPICH-1.2.5..10 associated with GNU (i.e. mpicc<br>> invokes gcc-3.2.3). Multiprocessors are connected together via Myrinet GM<br>> version 2.0.8. Do you need any other information?<br>
><br>> >> ./configure --prefix=/usr/local/applications/bioinformatics/gromacs-3.3<br>> >> --exec-prefix=/usr/local/applications/bioinformatics/gromacs-3.3<br>> >> --enable-mpi --enable-shared --with-fft=fftw2 --program-suffix="_mpi"
<br>> >> CPPFLAGS=-I/usr/local/applications/libraries/numerical/fft/include<br>> >> LDFLAGS=-L/usr/local/applications/libraries/numerical/fft/lib CC=mpicc<br>> >><br>> >>Then executed "make"
<br>> >>Then executed "make install"<br>> >>Then I used the command "grompp -np 2" to create the input data files in<br>> >> the sample "water" provided by the Gromacs development team.
<br>> >><br>> >>When I run the command "mdrun_mpi" , I get 'segmentation fault' error.<br>> >><br>> >> Warning: main: task 0 died with signal 11 (Segmentation fault)<br>> >> Warning: main: task 1 died with signal 11 (Segmentation fault)
<br>><br>> Did you use mpirun?<br>> For security reasons, we use mpiexec version 0.75 instead of calling<br>> mpirun directly. We are using MPICH-1.2.5..10 and Myrinet GM-2.0.8.<br>> Do you need any other information?
<br><br>Did you use a command line like "mpiexec -np 2 /pathtomdrun/mdrun_mpi -np 2"?<br>You have to start the MPI process first with mpirun/mpiexec to launch<br>mdrun_mpi.<br><br> The command I use to execute mdrun_mpi:
<br> "/usr/local/bin/mpiexec -n 4 -verbose /usr/local/applications/bioinformatics/gromacs-3.3/bin/mdrun_mpi -np 4".<br><br>Error messages recorded in the log file:<br>********************************************
<br>[1] Malloc hook test failed: the malloc provided by MPICH-GM is not used, it may have been overloaded by a malloc provided by another library.<br><br>[0] Malloc hook test failed: the malloc provided by MPICH-GM is not used, it may have been overloaded by a malloc provided by another library.
<br><br>[2] Malloc hook test failed: the malloc provided by MPICH-GM is not used, it may have been overloaded by a malloc provided by another library.<br><br>[3] Malloc hook test failed: the malloc provided by MPICH-GM is not used, it may have been overloaded by a malloc provided by another library.
<br><br>mpiexec: Warning: accept_abort_conn: MPI_Abort from IP <a href="http://192.168.100.5">192.168.100.5</a>, killing all.<br>mpiexec: Warning: main: task 0 exited with status 255.<br>mpiexec: Warning: main: task 1 exited with status 255.
<br>mpiexec: Warning: main: task 2 exited with status 255.<br>**********************************************<br>Does it work without MPI, e.g. on only one CPU?<br><br>Yes. The serial version of mdrun works OK.<br><br>Are you using a scheduler/queuing system like Maui/Torque?
<br>No. I create a script containing the above command and submit it to PBS (Portable Batch System).<br><br><br>><br>> Thank you<br>> Richard<br>> _______________________________________________<br>> gmx-users mailing list
<br>> <a href="mailto:gmx-users@gromacs.org">gmx-users@gromacs.org</a><br>> <a href="http://www.gromacs.org/mailman/listinfo/gmx-users">http://www.gromacs.org/mailman/listinfo/gmx-users</a><br>> Please don't post (un)subscribe requests to the list. Use the
<br>> www interface or send it to <a href="mailto:gmx-users-request@gromacs.org">gmx-users-request@gromacs.org</a>.<br><br><br>Greetings,<br><br>Florian<br>--<br>---------------------------------------------------------------------------------------------------
<br> Florian Haberl Universitaet Erlangen/Nuernberg<br> Computer-Chemie-Centrum Naegelsbachstr. 25, D-91052 Erlangen<br><br> Mailto: florian.haberl AT <a href="http://chemie.uni-erlangen.de">
chemie.uni-erlangen.de</a><br><br>---------------------------------------------------------------------------------------------------<br></blockquote></div><br>
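The advice at the top of the thread (give the processor count once, at the mpiexec level, and do not repeat `-np` for mdrun_mpi) can be sketched as a minimal PBS job script. The mpiexec and mdrun_mpi paths are taken from the messages above; the `#PBS` resource line and the `CMD` variable are illustrative assumptions, not part of the original thread.

```shell
#!/bin/sh
# Minimal PBS job-script sketch for the invocation recommended above.
# The nodes/ppn request is an illustrative assumption for 4 processes
# on dual-CPU nodes; adjust to the actual cluster layout.
#PBS -l nodes=2:ppn=2

NPROCS=4
MPIEXEC=/usr/local/bin/mpiexec
MDRUN=/usr/local/applications/bioinformatics/gromacs-3.3/bin/mdrun_mpi

# The processor count is given once, to mpiexec; no separate -np flag
# is passed to mdrun_mpi.
CMD="$MPIEXEC -n $NPROCS -verbose $MDRUN"
echo "$CMD"
```

Submitting this with `qsub` should launch the four mdrun_mpi ranks via mpiexec alone, avoiding the doubled process-count flags discussed in the thread.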