On Tue, Sep 30, 2014 at 4:48 PM, Kevin Chen <fch6699@gmail.com> wrote:
> Have you guys compared GROMACS performance between icc and gcc? Per
> Szilárd's note, it seems gcc is better; am I right?

Yeah, I'd take that one to the bank. Of course, we're always interested to
hear of observations (either way).

Mark
Cheers,

Kevin
-----Original Message-----
From: gromacs.org_gmx-developers-bounces@maillist.sys.kth.se
[mailto:gromacs.org_gmx-developers-bounces@maillist.sys.kth.se] On Behalf Of Szilárd Páll
Sent: Monday, September 29, 2014 3:50 PM
To: Discussion list for GROMACS development
Subject: Re: [gmx-developers] ICC 14 support
BTW: with gcc you'll have less trouble when combining with CUDA, as well as better performance!

Cheers,
--
Szilárd
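
(Concretely, a gcc-based CUDA build might be configured along these lines; a
minimal sketch, assuming gcc/g++ are the system compilers and that this
GROMACS version honors CUDA_HOST_COMPILER to pick nvcc's host compiler:)

    cmake .. -DCMAKE_C_COMPILER=gcc -DCMAKE_CXX_COMPILER=g++ \
             -DGMX_GPU=ON -DCUDA_HOST_COMPILER=$(which g++)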

On Mon, Sep 29, 2014 at 10:14 PM, Kevin Chen <fch6699@gmail.com> wrote:
> Hi Roland,
>
> Thanks for the reply! It looks like the error messages were generated by
> CUDA 6.0 (see errors below) rather than by GROMACS. Switching back to
> ICC 13.1 and turning off static libraries worked for us.
>
> Best,
>
> Kevin
>
> ======================================================================
>
> "In file included from /opt/apps/cuda/6.0/include/cuda_runtime.h(59),
>  from /admin/build/admin/rpms/stampede/BUILD/gromacs-5.0.1/src/gromacs/gmxlib/gpu_utils/gpu_utils.cu(0):
>
> /opt/apps/cuda/6.0/include/host_config.h(72): catastrophic error:
> #error directive: -- unsupported ICC configuration! Only ICC 13.1 on
> Linux x86_64 is supported!
>
>   #error -- unsupported ICC configuration! Only ICC 13.1 on Linux x86_64 is supported!
>   ^
>
> In file included from /opt/apps/cuda/6.0/include/cuda_runtime.h(59),
>  from /admin/build/admin/rpms/stampede/BUILD/gromacs-5.0.1/src/gromacs/gmxlib/cuda_tools/pmalloc_cuda.cu(0):
>
> /opt/apps/cuda/6.0/include/host_config.h(72): catastrophic error:
> #error directive: -- unsupported ICC configuration! Only ICC 13.1 on
> Linux x86_64 is supported!
>
>   #error -- unsupported ICC configuration! Only ICC 13.1 on Linux x86_64 is supported!
>   ^
> "
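
(The guard firing here is nvcc's own hard version check in CUDA 6.0's
host_config.h, not anything GROMACS controls, so the clean options are
pinning ICC 13.1 as Kevin did, or giving nvcc a different host compiler.
One combination sometimes used is icc for GROMACS itself with gcc handed
to nvcc; a sketch under that assumption, requiring g++ to be installed:)

    cmake .. -DCMAKE_C_COMPILER=icc -DCMAKE_CXX_COMPILER=icpc \
             -DGMX_GPU=ON -DCUDA_HOST_COMPILER=$(which g++)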
>
> From: gromacs.org_gmx-developers-bounces@maillist.sys.kth.se
> [mailto:gromacs.org_gmx-developers-bounces@maillist.sys.kth.se] On
> Behalf Of Roland Schulz
> Sent: Monday, September 29, 2014 11:16 AM
> To: gmx-developers@gromacs.org
> Subject: Re: [gmx-developers] ICC 14 support
>
> Hi,
>
> What problem do you have with ICC 14? Both ICC 14 and ICC 15 should
> work fine. There was an issue with ICC 14 + static libraries
> (9e8061e13f48, 4.6.7 and 5.0.1) and with ICC 14 + unit tests
> (b0e60e91add6, 5.0.1). Both are fixed in release-5-0 and will be
> included in 5.0.2. You can either use the release-5-0 branch, apply
> the patch, wait for 5.0.2 (which should be out soon), or build without
> static libraries (the default) and skip running the unit tests (the
> compiler issue isn't present in the main code, so even though the unit
> tests fail, the actual program is OK).
>
> Roland
>
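
(Spelled out as a configure sketch, that workaround amounts to something
like the line below; BUILD_SHARED_LIBS=ON is already the default and
avoids the static-libraries issue, and GMX_BUILD_UNITTESTS=OFF simply
skips building the affected tests:)

    cmake .. -DCMAKE_C_COMPILER=icc -DCMAKE_CXX_COMPILER=icpc \
             -DBUILD_SHARED_LIBS=ON -DGMX_BUILD_UNITTESTS=OFF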
> On Mon, Sep 29, 2014 at 10:35 AM, Kevin Chen <fch6699@gmail.com> wrote:
>
> Hi Guys,
>
> I was wondering if GROMACS will support ICC 14 in the near future?
>
> Kevin Chen, Ph.D.
> HPC Applications, TACC
>
>
> -----Original Message-----
> From: gromacs.org_gmx-developers-bounces@maillist.sys.kth.se
> [mailto:gromacs.org_gmx-developers-bounces@maillist.sys.kth.se] On
> Behalf Of Alexey Shvetsov
> Sent: Sunday, September 28, 2014 3:09 PM
> To: gmx-developers@gromacs.org
> Subject: Re: [gmx-developers] Possible bug in gmx
>
> Mark Abraham wrote on 28-09-2014 18:17:
>> How about a redmine issue - this thread's not about GROMACS
>> development, per se ;-)
>
> Sorry about that =D
>
> Redmine issue: http://redmine.gromacs.org/issues/1607
> with the relevant tpr file attached.
>>
>> Mark
>>
>> On Sun, Sep 28, 2014 at 3:18 PM, Alexey Shvetsov
>> <alexxy@omrb.pnpi.spb.ru> wrote:
>>
>>> Hi Berk!
>>>
>>> It's not a cut-and-paste error, and there are no PDB dumps.
>>> I have also seen this error before with other systems.
>>>
>>> I can provide the tpr file for that system:
>>>
>>> https://biod.pnpi.spb.ru/~alexxy/gmx/psa_pep_ctrl.md_npt.tpr
>>>
>>> Berk Hess wrote on 28-09-2014 16:37:
>>>
>>> Hi,
>>>
>>> I assume that your old and new coordinates being identical is correct
>>> and not a cut-and-paste error.
>>> This seems a bit strange; or do you freeze part of the system?
>>> The only things moving here are then the domain boundaries, and I
>>> don't see an issue there, since they only moved a little.
>>>
>>> Do you have any more output besides the error message? PDB dump
>>> files, maybe?
>>>
>>> Cheers,
>>>
>>> Berk
>>>
>>> On 09/28/2014 02:22 PM, Alexey Shvetsov wrote:
>>> Hi,
>>>
>>> Just want to add that this error seems to be reproducible even on a
>>> single node, and I get the same error for GPU runs.
>>> However, I don't see it in large systems (800k+ atoms) running on a
>>> large number of CPUs (512+).
>>>
>>> Alexey Shvetsov wrote on 28-09-2014 13:44:
>>> Hi,
>>>
>>> The DD grid is
>>>
>>>   Domain decomposition grid 4 x 1 x 1, separate PME ranks 0
>>>   PME domain decomposition: 4 x 1 x 1
>>>
>>> for the 4-node setup, and
>>>
>>>   Domain decomposition grid 4 x 2 x 1, separate PME ranks 0
>>>   PME domain decomposition: 4 x 2 x 1
>>>
>>> for the 8-node setup.
>>>
>>> It's reproducible with the 5.0 release and the latest git master.
>>> I'll try to check whether it's reproducible on 1 node. I can also
>>> provide the tpr file for this system.
>>>
>>> Mark Abraham wrote on 28-09-2014 13:28:
>>> Hi,
>>>
>>> It's hard to say from that information. There were some issues fixed
>>> in the lead-up to GROMACS 5 with DD not always working with 2 domains
>>> in a direction, but that's a pure guess. I'd assume you can reproduce
>>> this with the release-5-0 branch. Do you observe it with a single
>>> domain? If not, then it's surely a bug (and should be submitted to
>>> redmine).
>>>
>>> Mark
>>>
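
(For the single-domain test, something along these lines should do it; a
sketch using the mdrun options of that generation, reusing the tpr file
linked above:)

    # one MPI rank => no domain decomposition at all
    mpirun -np 1 mdrun_mpi -s psa_pep_ctrl.md_npt.tpr

    # or pin the suspect 4 x 2 x 1 grid explicitly on 8 ranks
    mpirun -np 8 mdrun_mpi -dd 4 2 1 -npme 0 -s psa_pep_ctrl.md_npt.tpr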
>>> On Sun, Sep 28, 2014 at 11:18 AM, Alexey Shvetsov
>>> <alexxy@omrb.pnpi.spb.ru> wrote:
>>>
>>> Hi all!
>>>
>>> I'm doing some tests with a small peptide and constantly getting this
>>> error. I get it with a few systems.
>>>
>>> System sizes are around 10k or 20k atoms.
>>> I run on 4 or 8 old nodes, each with two Xeon 54xx-series CPUs.
>>>
>>> starting mdrun '2ZCH_3 in water'
>>> 50000000 steps, 100000.0 ps (continuing from step 1881000, 3762.0 ps).
>>>
>>> Step 13514000:
>>> The charge group starting at atom 6608 moved more than the distance
>>> allowed by the domain decomposition (1.112924) in direction X
>>> distance out of cell -1.193103
>>> Old coordinates:  5.467  0.298  3.636
>>> New coordinates:  5.467  0.298  3.636
>>> Old cell boundaries in direction X:  4.037  5.382
>>> New cell boundaries in direction X:  4.089  5.452
>>>
>>> --------------------------------------------------------------------------
>>> MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD with
>>> errorcode 1.
>>>
>>> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>>> You may or may not see output from other processes, depending on
>>> exactly when Open MPI kills them.
>>> --------------------------------------------------------------------------
>>>
>>> -------------------------------------------------------
>>> Program mdrun_mpi, VERSION 5.1-dev-20140922-20c00a9-dirty-unknown
>>> Source code file:
>>> /var/tmp/alexxy/portage/sci-chemistry/gromacs-9999/work/gromacs-9999/src/gromacs/mdlib/domdec.cpp,
>>> line: 4388
>>>
>>> Fatal error:
>>> A charge group moved too far between two domain decomposition steps
>>> This usually means that your system is not well equilibrated
>>> For more information and tips for troubleshooting, please check the
>>> GROMACS website at http://www.gromacs.org/Documentation/Errors
>>> -------------------------------------------------------
>>>
>>> --
>>> Best Regards,
>>> Alexey 'Alexxy' Shvetsov, PhD
>>> Department of Molecular and Radiation Biophysics
>>> FSBI Petersburg Nuclear Physics Institute, NRC Kurchatov Institute,
>>> Leningrad region, Gatchina, Russia
>>> mailto:alexxyum@gmail.com
>>> mailto:alexxy@omrb.pnpi.spb.ru
>>
>> --
>> Best Regards,
>> Alexey 'Alexxy' Shvetsov, PhD
>> Department of Molecular and Radiation Biophysics
>> FSBI Petersburg Nuclear Physics Institute, NRC Kurchatov Institute,
>> Leningrad region, Gatchina, Russia
>> mailto:alexxyum@gmail.com
>> mailto:alexxy@omrb.pnpi.spb.ru
>
> --
> Best Regards,
> Alexey 'Alexxy' Shvetsov, PhD
> Department of Molecular and Radiation Biophysics
> FSBI Petersburg Nuclear Physics Institute, NRC Kurchatov Institute,
> Leningrad region, Gatchina, Russia
> mailto:alexxyum@gmail.com
> mailto:alexxy@omrb.pnpi.spb.ru
>
> --
> ORNL/UT Center for Molecular Biophysics, cmb.ornl.gov
> 865-241-1537, ORNL PO BOX 2008 MS6309

--
Gromacs Developers mailing list

* Please search the archive at
http://www.gromacs.org/Support/Mailing_Lists/GMX-developers_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-developers or
send a mail to gmx-developers-request@gromacs.org.