<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
<head>
<meta content="text/html; charset=GB2312" http-equiv="Content-Type">
<title></title>
</head>
<body bgcolor="#ffffff" text="#000000">
Hi all,<br>
<br>
I just recompiled GROMACS 4.0.7, and this error does not occur there. But
4.0.7 is about 30% slower than 4.5.3, so I would really appreciate it if
anyone could help me with this!<br>
<br>
best regards,<br>
Baofu Qiao<br>
<br>
<br>
On 2010-11-25 20:17, Baofu Qiao wrote:
<blockquote cite="mid:4CEEA833.80702@gmail.com" type="cite">
<meta http-equiv="content-type" content="text/html;
charset=GB2312">
Hi all,<br>
<br>
I got the following error message while extending the simulation with
this command:<br>
mpiexec -np 64 mdrun -deffnm pre -npme 32 -maxh 2 -table table
-cpi pre.cpt -append <br>
<br>
The previous run of this simulation succeeded. I wonder why pre.log is
locked, and what the strange warning "<big><b>Function not
implemented</b></big>" means.<br>
<br>
Any suggestion is appreciated!<br>
<br>
*********************************************************************<br>
Getting Loaded...<br>
Reading file pre.tpr, VERSION 4.5.3 (single precision)<br>
<br>
Reading checkpoint file pre.cpt generated: Thu Nov 25 19:43:25
2010<br>
<br>
-------------------------------------------------------<br>
Program mdrun, VERSION 4.5.3<br>
Source code file: checkpoint.c, line: 1750<br>
<br>
Fatal error:<br>
<big><b>Failed to lock: pre.log. Function not implemented.</b></big><br>
For more information and tips for troubleshooting, please check
the GROMACS<br>
website at <a moz-do-not-send="true"
class="moz-txt-link-freetext"
href="http://www.gromacs.org/Documentation/Errors">http://www.gromacs.org/Documentation/Errors</a><br>
-------------------------------------------------------<br>
<br>
"It Doesn't Have to Be Tip Top" (Pulp Fiction)<br>
<br>
Error on node 0, will try to stop all the nodes<br>
Halting parallel program mdrun on CPU 0 out of 64<br>
<br>
gcq#147: "It Doesn't Have to Be Tip Top" (Pulp Fiction)<br>
<br>
--------------------------------------------------------------------------<br>
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD<br>
with errorcode -1.<br>
<br>
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI
processes.<br>
You may or may not see output from other processes, depending on<br>
exactly when Open MPI kills them.<br>
--------------------------------------------------------------------------<br>
--------------------------------------------------------------------------<br>
mpiexec has exited due to process rank 0 with PID 32758 on<br>
<br>
</blockquote>
<br>
</body>
</html>