Hi,

I'm having MPI compatibility issues while running Gromacs v4.0.4. I've tried using openmpi/bin/mpirun, openmpi-1.2.7-intel-mx2g/bin/mpirun, and openmpi-intel-mx2g/bin/mpirun, but none of them works - each one crashes my simulation with the error:

"Program mdrun_mpi, VERSION 3.3.1
Source code file: init.c, line: 69

Fatal error:
run input file md3_2.tpr was made for 4 nodes,
 while mdrun_mpi expected it to be for 1 nodes."
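In case it's relevant, my understanding is that with the 3.x-series tools the node count gets fixed into the .tpr when grompp is run with -np, and it then has to match the number of MPI processes mdrun_mpi is launched with - roughly like this (file names other than md3_2.tpr are just illustrative placeholders, not my exact inputs):

  grompp -np 4 -f md3_2.mdp -c conf.gro -p topol.top -o md3_2.tpr
  mpirun -np 4 mdrun_mpi -s md3_2.tpr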
Does anyone have insight on this problem? Thanks.

2009/8/2 Mark Abraham <Mark.Abraham@anu.edu.au>:
> rainy908@yahoo.com wrote:
>> So I've been able to get access to Gromacs v4.0.4 on another supercomputer
>> cluster. However, I've been told that there are compatibility issues
>> regarding the MPI with Gromacs v4. Also, I'm using the MARTINI force field
>> with Gromacs, but I'm not sure how well it has been tested with v4?
>
> Please start a new email with a new subject when you change topics. You
> should also try searching the archives for clues on these topics. I'm not
> aware of any general problems between modern MPI libraries and GROMACS 4.
> The use GROMACS makes of MPI functionality is quite undemanding.
>
> Mark