[gmx-users] MD workstation for Gromacs

Abhi Acharya abhi117acharya at gmail.com
Mon Oct 27 15:05:44 CET 2014


Hi,
We use systems with a GTX 780 Ti and 12 cores, and have had no problems running
GROMACS. They give a performance of ~100 ns/day for a system of 35,000 atoms.

Regards,
Abhishek

On Mon, Oct 27, 2014 at 7:05 PM, Adelman, Joshua Lev <jla65 at pitt.edu> wrote:

>
> On Oct 27, 2014, at 3:14 AM, Carsten Kutzner wrote:
>
> Hi,
>
> On 27 Oct 2014, at 07:11, Mohammad Hossein Borghei <mh.borghei at gmail.com
> <mailto:mh.borghei at gmail.com>> wrote:
>
> Thank you Szilárd,
>
> So I would be really thankful if you could tell me which configuration is
> the best:
>
> 2x GTX 980
> 2x GTX 780 Ti
> 2x GTX Titan Black
> I would choose between the 980 and the 780 Ti, which will give
> you about the same performance. Buy whichever card you can get
> at the cheaper price. The Titan Black will be too expensive
> relative to its performance.
>
> Carsten
>
>
>
> Just a note about the GTX 780 Ti: while I don't know whether people have had
> problems running it with GROMACS, the Amber developers are recommending
> against this card due to high failure rates:
>
> See the "Supported GPUs" section of:
> http://ambermd.org/gpus/
>
> and the following mailing list thread:
> http://archive.ambermd.org/201406/0289.html
>
> Josh
>
>
>
>
>
> On Mon, Oct 20, 2014 at 12:24 AM, Szilárd Páll <pall.szilard at gmail.com
> <mailto:pall.szilard at gmail.com>>
> wrote:
>
> Please send such questions/requests to the GROMACS users' list, I'm
> replying there.
>
> - For GROMACS, 2x GTX 980 will be faster than one TITAN Z.
> - Consider getting plain DIMMs instead of registered ECC ones;
> - If you care about memory bandwidth, AFAIK you need 8 memory modules;
> this will not matter for GROMACS simulations, but it could matter for
> analysis or other memory-intensive operations;
>
> --
> Szilárd
>
>
> ---------- Forwarded message ----------
> From: Mohammad Hossein Borghei <mh.borghei at gmail.com<mailto:
> mh.borghei at gmail.com>>
> Date: Sun, Oct 19, 2014 at 12:34 PM
> Subject: MD workstation for Gromacs
> To: pall.szilard at gmail.com<mailto:pall.szilard at gmail.com>
>
>
> Dear Mr. Szilárd
>
> I saw your comments on the GROMACS mailing list and thought you might be
> able to answer my question. I would be really thankful if you could tell me
> whether the attached configurations are appropriate for GPU computing in
> GROMACS. Which one is better? Can they be improved without any increase in
> price?
>
> Kind Regards,
>
> --
> Mohammad Hossein Borghei
>
>
>
>
>
> --
> Dr. Carsten Kutzner
> Max Planck Institute for Biophysical Chemistry
> Theoretical and Computational Biophysics
> Am Fassberg 11, 37077 Goettingen, Germany
> Tel. +49-551-2012313, Fax: +49-551-2012302
> http://www.mpibpc.mpg.de/grubmueller/kutzner
> http://www.mpibpc.mpg.de/grubmueller/sppexa
>
>
>
>
> --
> Gromacs Users mailing list
>
> * Please search the archive at
> http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before
> posting!
>
> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>
> * For (un)subscribe requests visit
> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or
> send a mail to gmx-users-request at gromacs.org.
>


More information about the gromacs.org_gmx-users mailing list