[gmx-users] MD workstation

Szilárd Páll pall.szilard at gmail.com
Fri Oct 17 17:30:00 CEST 2014


On Fri, Oct 17, 2014 at 3:47 AM, lloyd riggs <lloyd.riggs at gmx.ch> wrote:
>
> Is there any progress on the OpenCL version of GROMACS, as it is listed on the
> developer site?  Just asking.

Yes, there is some. However, I'm wondering how that is related to the
topic. The addition of OpenCL support will likely not change the
hardware options much, at least not in the short term.

>  One thing I ran across is that one can get
> integrated GPU arrays on a board, say Russian board designs sourced from
> China, for about the same price with 10x the computational speed, but those
> boards would be largely OpenCL-dependent.

OpenCL is not a silver bullet. Somebody will need to write code for
those boards.

Can we have a reference for the hardware you're talking about?

> Stephan Watkins
> Sent: Thursday, 16 October 2014 at 20:21
> From: "Szilárd Páll" <pall.szilard at gmail.com>
> To: "Discussion list for GROMACS users" <gmx-users at gromacs.org>
> Subject: Re: [gmx-users] MD workstation
> On Thu, Oct 16, 2014 at 3:35 PM, Hadházi Ádám <hadadam at gmail.com> wrote:
>> May I ask why your config is better than e.g.
>>
>> 2x Intel Xeon E5-2620 CPU (2x $405)
>> 4x GTX 970 (4x $330)
>> 1x Z9PE-D8 WS ($449)
>> 64 GB DDR3 ($600)
>> 1600 W PSU ($250)
>> standard 2TB 5400rpm drive ($85)
>> total: ~$3500
>
> Mirco's suggested setup will give much higher *aggregate* simulation
> throughput. GROMACS uses both CPUs and GPUs and requires a balanced
> resource mix to run efficiently (less so if you don't use PME). The
> E5-2620 is rather slow; it will be a good match for a single GTX
> 970, perhaps even a 980, but it will be the limiting factor with two
> GPUs per socket.
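>
> As a rough illustration of that balance (a hypothetical launch; the
> option names are from GROMACS 5.0 mdrun, and the rank/core counts
> below are assumptions for one six-core E5-2620 driving one GTX 970):
>
>   # offload short-range nonbondeds to the GPU, keep PME on the CPU cores
>   gmx mdrun -deffnm md -nb gpu -ntmpi 1 -ntomp 6 -gpu_id 0
>
> With two GPUs on the same socket you would have to split those six
> cores across two ranks (e.g. -ntmpi 2 -ntomp 3 -gpu_id 01), and the
> CPU side becomes the bottleneck.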
>
>> As for your setup...can I use those 4 nodes in parallel for 1 long
>> simulation or 1 FEP job?
>
> Not without a fast network.
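>
> (For context, a multi-node run needs an MPI-enabled build started
> through an MPI launcher; a sketch, assuming an Open MPI installation,
> the conventional gmx_mpi binary name, a hostfile listing the four
> machines, and hypothetical file names:
>
>   mpirun -np 4 --hostfile hosts gmx_mpi mdrun -deffnm md -gpu_id 0
>
> Over plain gigabit ethernet such a run is typically slower than
> staying on one node, hence the need for a fast interconnect.)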
>
>> What are the weak points of my workstation?
>
> The CPU. A desktop IVB-E or HSW-E chip (Ivy Bridge-E or Haswell-E,
> e.g. i7 49XX, 59XX) will give much better performance per dollar.
>
> Also note:
> * your smaller 25k-atom MD setup will not scale across multiple GPUs;
> * in FEP runs, by sharing a GPU between multiple runs you can
> increase the aggregate throughput by quite a lot (see the sketch
> below)!
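>
> A minimal sketch of such sharing (hypothetical file names; the
> -pinoffset values assume an 8-core desktop CPU):
>
>   # two lambda windows sharing GPU 0, pinned to separate cores
>   gmx mdrun -deffnm lambda_00 -gpu_id 0 -ntomp 4 -pin on -pinoffset 0 &
>   gmx mdrun -deffnm lambda_01 -gpu_id 0 -ntomp 4 -pin on -pinoffset 4 &
>   wait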
>
> Cheers,
> --
> Szilárd
>
>> Best,
>> Adam
>>
>>
>> 2014-10-16 23:00 GMT+10:00 Mirco Wahab
>> <mirco.wahab at chemie.tu-freiberg.de>:
>>
>>> On 16.10.2014 14:38, Hadházi Ádám wrote:
>>>
>>>> Dear GMX Staff and Users,
>>>>>> I am planning to buy a new MD workstation with 4 GPUs (GTX 780 or
>>>>>> 970) or 3 GPUs (GTX 980) for $4000.
>>>>>> Could you recommend a setup for this machine?
>>>>>> Are 1 or 2 CPUs necessary? 32/64 GB memory? Cooling? Power?
>>>>>>
>>>>>
>>>>> - What system (size, type, natoms) do you plan to simulate?
>>>>>
>>>>> - Do you have to run *only one single simulation* over a long time,
>>>>> or *several similar simulations* with similar parameters?
>>>>>
>>>>
>>>> The systems are kind of a mix:
>>>> MD:
>>>> smallest system: 25k atoms, spc/tip3p, 2fs/4fs, NPT, simulation time:
>>>> 500-1000ns
>>>> biggest system: 150k atoms, spc/tip3p, 2fs/4fs, NPT, simulation time:
>>>> 100-1000ns
>>>> FEP (free energy perturbation): ligand functional-group mutation,
>>>> 25k-150k atoms, in-complex and in-water simulations; production
>>>> simulation: 5 ns for each lambda window (number of windows: 12)
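>>>>
>>>> (One way to drive those 12 windows sequentially, a sketch assuming
>>>> per-window tpr files fep_00.tpr .. fep_11.tpr prepared with
>>>> init-lambda-state 0..11:
>>>>
>>>>   for i in $(seq -w 0 11); do
>>>>       gmx mdrun -deffnm fep_$i
>>>>   done
>>>> )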
>>>>
>>>
>>> In this situation, I'd probably use 4 machines for $1000 each,
>>> putting in each:
>>> - consumer i7/4790(K), $300
>>> - any 8GB DDR3, $75-$80
>>> - standard Z97 board, $100
>>> - standard PSU 450W, $40
>>> - standard 2TB 5400rpm drive, $85
>>>
>>> The rest of the money (4 x $395; the five components above come to
>>> roughly $605 per box) I'd use for 4 graphics cards, probably 3x
>>> GTX 970 ($330 each) and one GTX 980 ($550), depending on
>>> availability, the actual prices, and your detailed budget.
>>>
>>> YMMV,
>>>
>>>
>>> Regards
>>>
>>> M.
>>>