[gmx-users] Problem with OpenMP+MPI

Szilárd Páll szilard.pall at cbr.su.se
Wed Feb 27 18:27:46 CET 2013


Jesmin,

First of all, have you read the following pages?
http://www.gromacs.org/Documentation/Acceleration_and_parallelization
http://www.gromacs.org/Documentation/Cut-off_schemes

To summarize:
- OpenMP parallelization is supported with the Verlet scheme for the full
mdrun; with the group scheme it is supported only on separate PME nodes
(to improve scaling; see the example launch lines below);
- GB only works with the group scheme and is currently rather limited by
parallelization issues. Fixing this does not have very high priority,
both because the code lacks active maintainers with lots of free time
and because user feedback on (the need for) GB has been fairly limited.
Please express your interest in getting this fixed, and in OpenMP
support, by commenting on existing bug reports or filing new
bugs/feature requests on redmine.gromacs.org.
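
For reference, with the Verlet scheme a combined MPI+OpenMP run can be
launched along these lines (the exact mpirun invocation depends on your
MPI installation, and the binary and file names here are placeholders):

  mpirun -np 4 mdrun_mpi -ntomp 4 -deffnm topol

i.e. 4 MPI ranks with 4 OpenMP threads each. With the group scheme,
OpenMP threads can only be used on the separate PME nodes, e.g.:

  mpirun -np 8 mdrun_mpi -npme 2 -ntomp_pme 4 -deffnm topol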

Thanks,

--
Szilárd


On Wed, Feb 27, 2013 at 6:13 PM, jesmin jahan <shraban03 at gmail.com> wrote:

> Dear Justin,
>
> I want to compare the performance of the MPI+OpenMP implementation of
> GROMACS GB energy with other MPI+X implementations of GB energy.
>
> So, my question is:
> Is there any way to get a sensible result for an implicit solvent
> simulation using GROMACS that employs both MPI and OpenMP?
> If yes, what should the parameter set be?
>
> Thanks for your time.
>
> Best Regards,
> Jesmin
>
> On Wed, Feb 27, 2013 at 11:54 AM, Justin Lemkul <jalemkul at vt.edu> wrote:
> >
> >
> > On 2/27/13 11:27 AM, jesmin jahan wrote:
> >>
> >> Thanks again Justin,
> >>
> >> I added
> >>
> >> cutoff-scheme       = group
> >>
> >> But I am still getting
> >>
> >> OpenMP threads have been requested with cut-off scheme Group, but
> >> these are only supported with cut-off scheme Verlet
> >>
> >> which really says that OpenMP threads are supported only with Verlet!
> >>
> >
> > Then you need to run without OpenMP threads.
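> > For example, with an MPI-enabled build, something like the following
> > should do an MPI-only run; -ntomp 1 ensures no OpenMP threads are
> > started (launcher and file names are placeholders for your setup):
> >
> >   mpirun -np 2 mdrun_mpi -ntomp 1 -deffnm topol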
> >
> > -Justin
> >
> >
> >> constraints         =  none
> >> integrator          =  md
> >> cutoff-scheme       = group
> >>
> >> pbc                 =  no
> >> ;verlet-buffer-drift = -1
> >> dt                  =  0.001
> >> nsteps              =  0
> >> ns_type             = simple
> >> comm-mode           = angular
> >> rlist               = 0
> >> rcoulomb            = 0
> >> rvdw                = 0
> >> nstlist             = 0
> >> rgbradii            = 0
> >> nstgbradii          = 1
> >> coulombtype         = cutoff
> >> vdwtype             = cutoff
> >> implicit_solvent    =  GBSA
> >> gb_algorithm        =  HCT ;
> >> sa_algorithm        =  None
> >> gb_dielectric_offset    = 0.02
> >>
> >> optimize_fft             = yes
> >> energygrps               = protein
> >>
> >> Best Regards,
> >> Jesmin
> >>
> >>
> >> On Wed, Feb 27, 2013 at 10:57 AM, Justin Lemkul <jalemkul at vt.edu> wrote:
> >>>
> >>>
> >>>
> >>> On 2/27/13 10:54 AM, jesmin jahan wrote:
> >>>>
> >>>>
> >>>> Many thanks to you Justin for your help.
> >>>>
> >>>> Now my .mdp file looks like:
> >>>>
> >>>> constraints         =  none
> >>>> integrator          =  md
> >>>>
> >>>> pbc                 =  no
> >>>> verlet-buffer-drift = -1
> >>>> dt                  =  0.001
> >>>> nsteps              =  0
> >>>> ns_type             = simple
> >>>> comm-mode           = angular
> >>>> rlist               = 0
> >>>> rcoulomb            = 0
> >>>> rvdw                = 0
> >>>> nstlist             = 0
> >>>> rgbradii            = 0
> >>>> nstgbradii          = 1
> >>>> coulombtype         = cutoff
> >>>> vdwtype             = cutoff
> >>>> implicit_solvent    =  GBSA
> >>>> gb_algorithm        =  HCT ;
> >>>> sa_algorithm        =  None
> >>>> gb_dielectric_offset    = 0.02
> >>>>
> >>>> optimize_fft             = yes
> >>>> energygrps               = protein
> >>>>
> >>>> And when I run mdrun, it says
> >>>>
> >>>> "OpenMP threads have been requested with cut-off scheme Group, but
> >>>> these are only supported with cut-off scheme Verlet"
> >>>>
> >>>> Now if I add
> >>>>
> >>>> cutoff-scheme       = Verlet
> >>>>
> >>>> It says
> >>>>
> >>>>    With Verlet lists only full pbc or pbc=xy with walls is supported
> >>>>
> >>>>     With Verlet lists nstlist should be larger than 0
> >>>>
> >>>>
> >>>> So, what is the solution?
> >>>>
> >>>
> >>> Use the group cutoff scheme rather than Verlet, otherwise we wind up
> >>> right back where we were before.
> >>>
> >>>
> >>>> Also,  you told me
> >>>>
> >>>> "Note that implicit solvent calculations can be run on no more than 2
> >>>> processors.  You'll get a fatal error if you try to use more."
> >>>>
> >>>> Is there any documentation for this? Can I cite you if I want to add
> >>>> this information to an article?
> >>>>
> >>>
> >>> It's a bug, listed on Redmine somewhere.  I doubt that requires any
> >>> sort of citation, but if anyone asks, there are discussions in the
> >>> list archive and on redmine.gromacs.org.
> >>>
> >>>
> >>> -Justin
> >>>
> >>
> >>
> >>
> >>
> >
> > --
> > ========================================
> >
> > Justin A. Lemkul, Ph.D.
> > Research Scientist
> > Department of Biochemistry
> > Virginia Tech
> > Blacksburg, VA
> > jalemkul[at]vt.edu | (540) 231-9080
> > http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
> >
> > ========================================
>
>
>
> --
> Jesmin Jahan Tithi
> PhD Student, CS
> Stony Brook University, NY-11790.
>


