<html>
<head>
<style>
.hmmessage P
{
margin:0px;
padding:0px
}
body.hmmessage
{
font-size: 10pt;
font-family:Verdana
}
</style>
</head>
<body class='hmmessage'>
Well, David is the person who should fix this.<br>
You can submit a bugzilla or bug him directly.<br>
<br>
Berk<br>
<br>
> Date: Thu, 28 May 2009 10:41:12 +0200<br>
> From: erikm@xray.bmc.uu.se<br>
> To: gmx-users@gromacs.org<br>
> Subject: Re: [gmx-users] Strange assignment of atoms to processors with pd<br>
> <br>
> Berk Hess wrote:<br>
> > Hi,<br>
> ><br>
> > This is plain 4.0 code, I presume?<br>
> > This problem should be fixed then.<br>
> Should I submit a bugzilla?<br>
> ><br>
> > But I have now also made vacuum without cut-off work with domain<br>
> > decomposition in CVS head.<br>
> > Compared to a well-balanced PD run (for instance only a protein, no<br>
> > water), DD is slightly slower.<br>
> > But DD will be faster than a badly balanced PD system.<br>
> ><br>
> > Berk<br>
> ><br>
> > > Date: Wed, 27 May 2009 11:04:49 +0200<br>
> > > From: spoel@xray.bmc.uu.se<br>
> > > To: gmx-users@gromacs.org<br>
> > > Subject: Re: [gmx-users] Strange assignment of atoms to processors with pd<br>
> > ><br>
> > > Erik Marklund wrote:<br>
> > > > David van der Spoel wrote:<br>
> > > >> Erik Marklund wrote:<br>
> > > >>> I should add that this problem only seems to arise when the<br>
> > > >>> analyte is covered with a thin sheet of water. When simulating a dry analyte I<br>
> > > >>> get good scaling. In the latter case the charges, and therefore the<br>
> > > >>> topology, are slightly different.<br>
> > > >> How about vsites? Did you happen to turn them off as well in the<br>
> > > >> vacuum case?<br>
> > > > Turned off in all cases. The VSites mentioned in the log file are the<br>
> > > > fourth particle on the tip4p water molecules.<br>
> > > OK. Did you try a one-step run with -debug?<br>
> > > It may give more info on the partitioning.<br>
> > ><br>
> > > >>><br>
> > > >>> /Erik<br>
> > > >>><br>
> > > >>> Erik Marklund wrote:<br>
> > > >>>> Hi,<br>
> > > >>>><br>
> > > >>>> I'm simulating non-periodic systems in vacuo, using constrained<br>
> > > >>>> H-bonds and particle decomposition.
For some of my simulations the<br>
> > > >>>> CPU usage seems far from optimal. The first CPU gets no atoms,<br>
> > > >>>> while the second one gets plenty and the remaining CPUs get less than I<br>
> > > >>>> expected. Is this a bug?<br>
> > > >>>><br>
> > > >>>><br>
> > > >>>> An excerpt from the log file:<br>
> > > >>>><br>
> > > >>>> There are: 2911 Atoms<br>
> > > >>>> There are: 317 VSites<br>
> > > >>>> splitting topology...<br>
> > > >>>> There are 999 charge group borders and 318 shake borders<br>
> > > >>>> There are 318 total borders<br>
> > > >>>> Division over nodes in atoms:<br>
> > > >>>> 0 1960 212 212 212 212 212 208<br>
> > > >>>> Walking down the molecule graph to make constraint-blocks<br>
> > > >>>> CPU= 0, lastcg= -1, targetcg= 499, myshift= 1<br>
> > > >>>> CPU= 1, lastcg= 681, targetcg= 182, myshift= 0<br>
> > > >>>> CPU= 2, lastcg= 734, targetcg= 235, myshift= 7<br>
> > > >>>> CPU= 3, lastcg= 787, targetcg= 288, myshift= 6<br>
> > > >>>> CPU= 4, lastcg= 840, targetcg= 341, myshift= 5<br>
> > > >>>> CPU= 5, lastcg= 893, targetcg= 394, myshift= 4<br>
> > > >>>> CPU= 6, lastcg= 946, targetcg= 447, myshift= 3<br>
> > > >>>> CPU= 7, lastcg= 998, targetcg= 499, myshift= 2<br>
> > > >>>> pd->shift = 7, pd->bshift= 0<br>
> > > >>>> Division of bonded forces over processors<br>
> > > >>>> CPU 0 1 2 3 4 5 6 7<br>
> > > >>>> Workload division<br>
> > > >>>> nnodes: 8<br>
> > > >>>> pd->shift: 7<br>
> > > >>>> pd->bshift: 0<br>
> > > >>>> Nodeid atom0 #atom cg0 #cg<br>
> > > >>>> 0 0 0 0 0<br>
> > > >>>> 1 0 1960 0 682<br>
> > > >>>> 2 1960 212 682 53<br>
> > > >>>> 3 2172 212 735 53<br>
> > > >>>> 4 2384 212 788 53<br>
> > > >>>> 5 2596 212 841 53<br>
> > > >>>> 6 2808 212 894 53<br>
> > > >>>> 7 3020 208 947 52<br>
> > > >>>><br>
> > > >>>> …<br>
> > > >>>> Total Scaling: 18% of max performance<br>
> > > >>>><br>
> > > >>><br>
> > > >><br>
> > > ><br>
> > ><br>
> > > --<br>
> > > David van der Spoel, Ph.D., Professor of Biology<br>
> > > Molec.
Biophys. group, Dept. of Cell & Molec. Biol., Uppsala University.<br>
> > > Box 596, 75124 Uppsala, Sweden. Phone: +46184714205. Fax: +4618511755.<br>
> > > spoel@xray.bmc.uu.se spoel@gromacs.org http://folding.bmc.uu.se<br>
> > > _______________________________________________<br>
> > > gmx-users mailing list gmx-users@gromacs.org<br>
> > > http://www.gromacs.org/mailman/listinfo/gmx-users<br>
> > > Please search the archive at http://www.gromacs.org/search before posting!<br>
> > > Please don't post (un)subscribe requests to the list. Use the<br>
> > > www interface or send it to gmx-users-request@gromacs.org.<br>
> > > Can't post? Read http://www.gromacs.org/mailing_lists/users.php<br>
> ><br>
> <br>
> -- <br>
> -----------------------------------------------<br>
> Erik Marklund, PhD student<br>
> Laboratory of Molecular Biophysics,<br>
> Dept.
of Cell and Molecular Biology, Uppsala University.<br>
> Husargatan 3, Box 596, 75124 Uppsala, Sweden<br>
> phone: +46 18 471 4537 fax: +46 18 511 755<br>
> erikm@xray.bmc.uu.se http://xray.bmc.uu.se/molbiophys<br>
</body>
</html>