> From: zhao0139@ntu.edu.sg
> To: gmx-users@gromacs.org
> Date: Tue, 6 Apr 2010 20:31:18 +0800
> Subject: [gmx-users] RE: Re: load imbalance
>
> > > > On 6/04/2010 5:39 PM, lina wrote:
> > > > > Hi everyone,
> > > > >
> > > > > Here is the output of an mdrun performed on 16 CPUs. I am not
> > > > > clear about it. Was it caused by MPI, or by something else?
> > > > >
> > > > > Writing final coordinates.
> > > > > Average load imbalance: 1500.0 %
> > > > > Part of the total run time spent waiting due to load imbalance: 187.5 %
> > > > > Steps where the load balancing was limited by -rdd, -rcon and/or -dds:
> > > > > X 0 % Y 0 %
> > > > > NOTE: 187.5 % performance was lost due to load imbalance
> > > > > in the domain decomposition.
> > > >
> > > > You ran an inefficient but otherwise valid computation. Check out the
> > > > manual section on domain decomposition to learn why it was inefficient,
> > > > and whether you can do better.
> > > >
> > > > Mark
> > >
> > > I searched for the keyword "decomposition" in the Gromacs manual, but
> > > found no match. Are you sure about that? Thanks anyway, but could you
> > > make it more solution-oriented, so that I can understand it easily?
> > >
> > > Thanks and regards,
> > >
> > > lina
> >
> > This looks strange.
> > You have 1 core doing something and 15 cores doing nothing.
> > Do you only have one small molecule?
> > How many steps was this simulation?
> >
> > Berk
>
> 6,000,000 steps. When I used htop to check the 16 cores, I found they
> were all at 100%. I am surprised to be told that only 1 core is working.
> Below is part of the md.log:
>
> Initializing Domain Decomposition on 16 nodes
> Dynamic load balancing: auto
> Will sort the charge groups at every domain (re)decomposition
> Initial maximum inter charge-group distances:
> two-body bonded interactions: 0.574 nm, LJ-14, atoms 330 338
> multi-body bonded interactions: 0.572 nm, Proper Dih., atoms 934 942
> Minimum cell size due to bonded interactions: 0.629 nm
> Maximum distance for 5 constraints, at 120 deg. angles, all-trans: 0.876 nm
> Estimated maximum distance required for P-LINCS: 0.876 nm
> This distance will limit the DD cell size, you can override this with -rcon
> Scaling the initial minimum size with 1/0.8 (option -dds) = 1.25
> Optimizing the DD grid for 16 cells with a minimum initial size of 1.095 nm
> The maximum allowed number of cells is: X 5 Y 5 Z 4
> Domain decomposition grid 4 x 4 x 1, separate PME nodes 0
> Domain decomposition nodeid 0, coordinates 0 0 0
>
> Table routines are used for coulomb: FALSE
> Table routines are used for vdw: FALSE
> Cut-off's: NS: 0.9 Coulomb: 1.4 LJ: 1.4
> Reaction-Field:
> epsRF = 78, rc = 1.4, krf = 0.178734, crf = 1.0646, epsfac = 138.935
> The electrostatics potential has its minimum at r = 1.40903
> System total charge: 0.000
> Generated table with 4800 data points for 1-4 COUL.
> Tabscale = 2000 points/nm
> Generated table with 4800 data points for 1-4 LJ6.
> Tabscale = 2000 points/nm
> Generated table with 4800 data points for 1-4 LJ12.
> Tabscale = 2000 points/nm
>
> Enabling SPC water optimization for 7729 molecules.
>
> Configuring nonbonded kernels...
> Testing x86_64 SSE2 support... present.
>
> Initializing Parallel LINear Constraint Solver.
>
> Thanks in advance.
>
> lina

This all looks very normal.
I would guess the load-imbalance numbers printed in the output are incorrect.
Which version of Gromacs are you using?

You could run a short simulation (a minute or so) on 1 or 2 cores
and compare the ns/day reported at the end of its log file with this run.

Berk
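For reference, one way to do that comparison is sketched below. This is a
minimal sketch only, assuming an MPI-enabled GROMACS 4.x mdrun (the binary
is often installed as mdrun_mpi) and a run input file named md.tpr; the
file names bench2/bench16 are placeholders, so adjust everything to your
own setup:

  # Short benchmarks on 2 cores and on 16 cores; -maxh 0.02 caps each
  # run at roughly a minute of wall time regardless of nsteps:
  mpirun -np 2  mdrun -s md.tpr -deffnm bench2  -maxh 0.02
  mpirun -np 16 mdrun -s md.tpr -deffnm bench16 -maxh 0.02

  # Compare the ns/day in the performance summary at the end of each log
  # (-B1 also prints the column-header line above "Performance:"):
  grep -B1 "Performance:" bench2.log bench16.log

If the 16-core run is close to 8 times faster than the 2-core run, the
simulation is scaling fine and the 1500 % figure is probably just a
reporting problem in that Gromacs version; if it is not much faster, the
imbalance is real.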