Hi,<div><br></div><div>Last time I checked (this summer) I measured </div><div>40 bytes per atom </div><div>and 294 bytes per atom/core (reaction field with a 12 Å cut-off).</div><div><br></div><div>A 100M-atom system works with that cut-off on 128 nodes with 16 GB and 8 cores each. I haven't tried fewer than 128 nodes. (See <a href="http://cmb.ornl.gov/research/petascale-md">http://cmb.ornl.gov/research/petascale-md</a>)<br>
<br></div><div>We could fix the 40 bytes per atom relatively easily (no one has had time to work on it so far), but I don't think much can be done about the 294 bytes per atom/core.</div><div><br></div><div>On how many nodes do you want to simulate? That is, are you limited by the 40 bytes per atom or by the 294 bytes per atom/core?</div>
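<div><br></div><div>As a quick sanity check of those numbers (this is just a back-of-the-envelope sketch: it assumes the 40 bytes/atom term is paid once per node, the 294 bytes per atom/core term scales with the atoms each core owns, and the function name is only for illustration):</div>

```python
# Rough per-node memory estimate from the figures above.
# Assumptions (not a statement about GROMACS internals): the 40 B/atom
# term is replicated once per node; the 294 B per atom/core term scales
# with the number of atoms each core handles.

def mem_per_node_gb(n_atoms, n_nodes, cores_per_node,
                    global_b=40, local_b=294):
    cores = n_nodes * cores_per_node
    replicated = global_b * n_atoms                       # per-node copy
    local = local_b * (n_atoms / cores) * cores_per_node  # sum over node's cores
    return (replicated + local) / 1e9

# 100M atoms on 128 nodes with 8 cores each:
print(mem_per_node_gb(100e6, 128, 8))  # ≈ 4.2 GB, fits a 16 GB node
```

<div>Under these assumptions the replicated 40 B/atom term dominates, which is why it becomes the limit as the system grows regardless of how many nodes you add.</div>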
<div><br></div><div>Roland</div><div><br></div><div><br></div><div><br><div class="gmail_quote">On Tue, Mar 2, 2010 at 11:31 PM, Amit Choubey <span dir="ltr"><<a href="mailto:kgp.amit@gmail.com">kgp.amit@gmail.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">Hi Mark,<br><br>Yes, that's one way to go about it, but it would have been great if I could get a rough estimate first.<br>
<br>
Thank you.<br><font color="#888888"><br>amit</font><div><div></div><div class="h5"><br><br><br><div class="gmail_quote">On Tue, Mar 2, 2010 at 8:06 PM, Mark Abraham <span dir="ltr"><<a href="mailto:Mark.Abraham@anu.edu.au" target="_blank">Mark.Abraham@anu.edu.au</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="border-left:1px solid rgb(204, 204, 204);margin:0pt 0pt 0pt 0.8ex;padding-left:1ex"><div>On 3/03/2010 12:53 PM, Amit Choubey wrote:<br>
<blockquote class="gmail_quote" style="border-left:1px solid rgb(204, 204, 204);margin:0pt 0pt 0pt 0.8ex;padding-left:1ex">
Hi Mark,<br>
<br>
I quoted the memory usage requirements from a presentation by Berk<br>
Hess; the link to it follows:<br>
<br>
<br>
<a href="http://www.csc.fi/english/research/sciences/chemistry/courses/cg-2009/berk_csc.pdf" target="_blank">http://www.csc.fi/english/research/sciences/chemistry/courses/cg-2009/berk_csc.pdf</a><br>
<br>
In that presentation, on pp. 27-28, Berk does talk about memory<br>
usage, but I am not sure whether he was referring to anything more specific.<br>
<br>
My system only contains SPC water. I want Berendsen T coupling and<br>
Coulomb interaction with Reaction Field.<br>
<br>
I just want a rough estimate of how big of a system of water can be<br>
simulated on our super computers.<br>
</blockquote>
<br></div>
Try increasingly large systems until it runs out of memory. There's your answer.<br>
<br>
Mark<br>
<br>
<blockquote class="gmail_quote" style="border-left:1px solid rgb(204, 204, 204);margin:0pt 0pt 0pt 0.8ex;padding-left:1ex"><div>
On Fri, Feb 26, 2010 at 3:56 PM, Mark Abraham <<a href="mailto:mark.abraham@anu.edu.au" target="_blank">mark.abraham@anu.edu.au</a><br></div><div>
<mailto:<a href="mailto:mark.abraham@anu.edu.au" target="_blank">mark.abraham@anu.edu.au</a>>> wrote:<br>
<br>
----- Original Message -----<br></div><div>
From: Amit Choubey <<a href="mailto:kgp.amit@gmail.com" target="_blank">kgp.amit@gmail.com</a> <mailto:<a href="mailto:kgp.amit@gmail.com" target="_blank">kgp.amit@gmail.com</a>>><br>
Date: Saturday, February 27, 2010 10:17<br>
Subject: Re: [gmx-users] gromacs memory usage<br>
To: Discussion list for GROMACS users <<a href="mailto:gmx-users@gromacs.org" target="_blank">gmx-users@gromacs.org</a><br></div><div>
<mailto:<a href="mailto:gmx-users@gromacs.org" target="_blank">gmx-users@gromacs.org</a>>><br>
<br>
> Hi Mark,<br>
> We have a few nodes with 64 GB of memory and many others with 16 GB of<br>
memory. I am attempting a simulation of around 100M atoms.<br>
<br>
Well, try some smaller systems and work upwards to see if you have a<br>
limit in practice. 50K atoms can be run in less than 32GB over 64<br>
processors. You didn't say whether your simulation system can run on<br>
1 processor... if it does, then you can be sure the problem really<br>
is related to parallelism.<br>
<br>
> I did find some document which says one need (50bytes)*NATOMS on<br>
master node, also one needs<br>
> (100+4*(no. of atoms in cutoff)*(NATOMS/nprocs) for compute<br>
nodes. Is this true?<br>
<br>
In general, no. It will vary with the simulation algorithm you're<br>
using. Quoting such without attributing the source or describing the<br>
context is next to useless. You also dropped a parenthesis.<br>
<br>
Mark<br>
--<br>
gmx-users mailing list <a href="mailto:gmx-users@gromacs.org" target="_blank">gmx-users@gromacs.org</a><br></div>
<mailto:<a href="mailto:gmx-users@gromacs.org" target="_blank">gmx-users@gromacs.org</a>><div><br>
<a href="http://lists.gromacs.org/mailman/listinfo/gmx-users" target="_blank">http://lists.gromacs.org/mailman/listinfo/gmx-users</a><br>
Please search the archive at <a href="http://www.gromacs.org/search" target="_blank">http://www.gromacs.org/search</a> before<br>
posting!<br>
Please don't post (un)subscribe requests to the list. Use the<br>
www interface or send it to <a href="mailto:gmx-users-request@gromacs.org" target="_blank">gmx-users-request@gromacs.org</a><br></div>
<mailto:<a href="mailto:gmx-users-request@gromacs.org" target="_blank">gmx-users-request@gromacs.org</a>>.<div><br>
Can't post? Read <a href="http://www.gromacs.org/mailing_lists/users.php" target="_blank">http://www.gromacs.org/mailing_lists/users.php</a><br>
<br>
<br>
</div></blockquote>
<div><div></div><div>
</div></div></blockquote></div><br>
</div></div><br></blockquote></div><br><br clear="all"><br>-- <br>ORNL/UT Center for Molecular Biophysics <a href="http://cmb.ornl.gov">cmb.ornl.gov</a><br>
865-241-1537, ORNL PO BOX 2008 MS6309<br>
</div>