<div dir="ltr">Hi,<div><br></div><div>Before we can run those automatically, we first need a benchmark suite: <a href="http://redmine.gromacs.org/issues/1105">http://redmine.gromacs.org/issues/1105</a>. Someone needs to take the lead on it. I&#39;m sure others will help.</div><div><br></div><div>Roland</div></div><div class="gmail_extra"><br><div class="gmail_quote">On Tue, Sep 30, 2014 at 11:07 AM, Shirts, Michael R. (mrs5pt) <span dir="ltr">&lt;<a href="mailto:mrs5pt@eservices.virginia.edu" target="_blank">mrs5pt@eservices.virginia.edu</a>&gt;</span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">



<div style="word-wrap:break-word;color:rgb(0,0,0);font-size:14px;font-family:Calibri,sans-serif">
<div>
<div><br>
</div>
<div>Is there a (long-term) plan to do (essentially) automated performance tests, so that we can perform consistent(ish) checks on new code changes and then post the results in a way that is easy(ish) for others to interpret?</div>
<div>
<div><br>
</div>
<div>Best,</div>
~~~~~~~~~~~~<br>
Michael Shirts<br>
Associate Professor<br>
Department of Chemical Engineering<br>
University of Virginia<br>
<a href="mailto:michael.shirts@virginia.edu" target="_blank">michael.shirts@virginia.edu</a><br>
<a href="tel:%28434%29-243-1821" value="+14342431821" target="_blank">(434)-243-1821</a></div>
</div>
<div><br>
</div>
<span>
<div style="font-family:Calibri;font-size:14pt;text-align:left;color:black;BORDER-BOTTOM:medium none;BORDER-LEFT:medium none;PADDING-BOTTOM:0in;PADDING-LEFT:0in;PADDING-RIGHT:0in;BORDER-TOP:#b5c4df 1pt solid;BORDER-RIGHT:medium none;PADDING-TOP:3pt">
<span style="font-weight:bold">From: </span>Mark Abraham &lt;<a href="mailto:mark.j.abraham@gmail.com" target="_blank">mark.j.abraham@gmail.com</a>&gt;<br>
<span style="font-weight:bold">Reply-To: </span>&quot;<a href="mailto:gmx-developers@gromacs.org" target="_blank">gmx-developers@gromacs.org</a>&quot; &lt;<a href="mailto:gmx-developers@gromacs.org" target="_blank">gmx-developers@gromacs.org</a>&gt;<br>
<span style="font-weight:bold">Date: </span>Tuesday, September 30, 2014 at 10:59 AM<br>
<span style="font-weight:bold">To: </span>Discussion list for GROMACS development &lt;<a href="mailto:gmx-developers@gromacs.org" target="_blank">gmx-developers@gromacs.org</a>&gt;<span class=""><br>
<span style="font-weight:bold">Subject: </span>Re: [gmx-developers] ICC 14 support<br>
</span></div>
<div><br>
</div>
<div>
<div>
<div dir="ltr"><br>
<div class="gmail_extra"><br>
<div class="gmail_quote"><span class="">On Tue, Sep 30, 2014 at 4:48 PM, Kevin Chen <span dir="ltr">
&lt;<a href="mailto:fch6699@gmail.com" target="_blank">fch6699@gmail.com</a>&gt;</span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
Have you guys compared GROMACS performance between icc and gcc? Per Szilárd&#39;s note, it seems gcc is better; am I right?<br>
</blockquote>
<div><br>
</div>
<div>Yeah, I&#39;d take that one to the bank. Of course, we&#39;re always interested to hear of observations (either way).</div>
<div><br>
</div>
<div>Mark</div>
<div> </div>
</span><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
Cheers,<br>
<br>
Kevin<div><div class="h5"><br>
<span><br>
<br>
-----Original Message-----<br>
From: <a href="mailto:gromacs.org_gmx-developers-bounces@maillist.sys.kth.se" target="_blank">gromacs.org_gmx-developers-bounces@maillist.sys.kth.se</a> [mailto:<a href="mailto:gromacs.org_gmx-developers-bounces@maillist.sys.kth.se" target="_blank">gromacs.org_gmx-developers-bounces@maillist.sys.kth.se</a>]
 On Behalf Of Szilárd Páll<br>
Sent: Monday, September 29, 2014 3:50 PM<br>
To: Discussion list for GROMACS development<br>
Subject: Re: [gmx-developers] ICC 14 support<br>
<br>
</span>
</div></div><div>
<div><div><div class="h5">BTW: with gcc you&#39;ll have less trouble combining with CUDA, as well as better performance!<br>
<br>
Cheers,<br>
--<br>
Szilárd<br>
<br>
<br>
On Mon, Sep 29, 2014 at 10:14 PM, Kevin Chen &lt;<a href="mailto:fch6699@gmail.com" target="_blank">fch6699@gmail.com</a>&gt; wrote:<br>
&gt; Hi Roland,<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt; Thanks for the reply! It looks like the error messages were generated by<br>
&gt; CUDA 6.0 (see errors below) rather than GROMACS. Switching back to<br>
&gt; ICC 13.1 and turning off static libraries worked for us.<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt; Best,<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt; Kevin<br>
&gt;<br>
&gt; =====================================================================================<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt; “In file included from /opt/apps/cuda/6.0/include/cuda_runtime.h(59),<br>
&gt;<br>
&gt; /opt/apps/cuda/6.0/include/host_config.h(72): catastrophic error:<br>
&gt; #error<br>
&gt; directive: -- unsupported ICC configuration! Only ICC 13.1 on Linux<br>
&gt; x86_64 is supported!<br>
&gt;<br>
&gt;   #error -- unsupported ICC configuration! Only ICC 13.1 on Linux<br>
&gt; x86_64 is supported!<br>
&gt;<br>
&gt;    ^<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt; /opt/apps/cuda/6.0/include/host_config.h(72): catastrophic error:<br>
&gt; #error<br>
&gt; directive: -- unsupported ICC configuration! Only ICC 13.1 on Linux<br>
&gt; x86_64 is supported!<br>
&gt;<br>
&gt;   #error -- unsupported ICC configuration! Only ICC 13.1 on Linux<br>
&gt; x86_64 is supported!<br>
&gt;<br>
&gt;    ^<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt;                  from<br>
&gt; /admin/build/admin/rpms/stampede/BUILD/gromacs-5.0.1/src/gromacs/gmxlib/gpu_utils/<a href="http://gpu_utils.cu" target="_blank">gpu_utils.cu</a>(0):<br>
&gt;<br>
&gt; /opt/apps/cuda/6.0/include/host_config.h(72): catastrophic error:<br>
&gt; #error<br>
&gt; directive: -- unsupported ICC configuration! Only ICC 13.1 on Linux<br>
&gt; x86_64 is supported!<br>
&gt;<br>
&gt;   #error -- unsupported ICC configuration! Only ICC 13.1 on Linux<br>
&gt; x86_64 is supported!<br>
&gt;<br>
&gt;    ^<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt; In file included from /opt/apps/cuda/6.0/include/cuda_runtime.h(59),<br>
&gt;<br>
&gt;                  from<br>
&gt; /admin/build/admin/rpms/stampede/BUILD/gromacs-5.0.1/src/gromacs/gmxlib/cuda_tools/<a href="http://pmalloc_cuda.cu" target="_blank">pmalloc_cuda.cu</a>(0):<br>
&gt;<br>
&gt; /opt/apps/cuda/6.0/include/host_config.h(72): catastrophic error:<br>
&gt; #error<br>
&gt; directive: -- unsupported ICC configuration! Only ICC 13.1 on Linux<br>
&gt; x86_64 is supported!<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt; “<br>
&gt;<br>
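The errors quoted above come from a host-compiler version guard in CUDA 6.0&#39;s host_config.h, not from GROMACS itself. As a hedged sketch, a GPU build that stays inside what that guard accepts might be configured like this (the module names are hypothetical site-specific assumptions; GMX_GPU, BUILD_SHARED_LIBS, and the compiler variables are standard CMake options):

```shell
# Sketch only: CUDA 6.0's host_config.h rejects host ICC versions other than
# 13.1, so a GPU build must pair CUDA 6.0 with ICC 13.1 (or use gcc instead).
module load cuda/6.0 intel/13.1   # hypothetical module names; adjust to your site
cmake .. -DGMX_GPU=ON \
      -DCMAKE_C_COMPILER=icc -DCMAKE_CXX_COMPILER=icpc \
      -DBUILD_SHARED_LIBS=ON      # keep the default shared build
```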
&gt;<br></div></div><span class="">
&gt; From: <a href="mailto:gromacs.org_gmx-developers-bounces@maillist.sys.kth.se" target="_blank">gromacs.org_gmx-developers-bounces@maillist.sys.kth.se</a><br>
&gt; [mailto:<a href="mailto:gromacs.org_gmx-developers-bounces@maillist.sys.kth.se" target="_blank">gromacs.org_gmx-developers-bounces@maillist.sys.kth.se</a>] On<br></span><span class="">
&gt; Behalf Of Roland Schulz<br>
&gt; Sent: Monday, September 29, 2014 11:16 AM<br></span>
&gt; To: <a href="mailto:gmx-developers@gromacs.org" target="_blank">gmx-developers@gromacs.org</a><br>
&gt; Subject: Re: [gmx-developers] ICC 14 support<div><div class="h5"><br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt; Hi,<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt; What problem do you have with ICC 14? Both ICC 14 and ICC 15 should work fine.<br>
&gt; There was an issue with ICC 14 + static libraries (9e8061e13f48, 4.6.7 and<br>
&gt; 5.0.1) and ICC 14 + unit tests (b0e60e91add6, 5.0.1). Both are fixed in<br>
&gt; release-5-0 and will be included in 5.0.2. You can either use the<br>
&gt; release-5-0 branch, apply the patch yourself, wait until 5.0.2 (should be<br>
&gt; soon), or avoid static libraries (the default) and skip the unit tests<br>
&gt; (the compiler issue isn&#39;t present in the main code, so even though the<br>
&gt; unit tests fail, the actual program is OK).<br>
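At the command line, the options listed above might look like the following sketch; the GROMACS CMake option names are standard for the 5.0 era, but the paths, parallelism, and repository URL details should be treated as illustrative assumptions:

```shell
# Sketch of the workarounds described above (not a prescriptive recipe).

# Option A: keep ICC 14 but use the default shared-library build, and simply
# skip the unit tests, since only the tests trigger the compiler issue.
cmake .. -DCMAKE_C_COMPILER=icc -DCMAKE_CXX_COMPILER=icpc \
      -DBUILD_SHARED_LIBS=ON            # the default; avoids the static-libs bug
make -j8 && make install                # build/install without running 'make check'

# Option B: build from the release-5-0 branch, which already carries both fixes.
git clone git://git.gromacs.org/gromacs.git
cd gromacs && git checkout release-5-0
```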
&gt;<br>
&gt;<br>
&gt;<br>
&gt; Roland<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt; On Mon, Sep 29, 2014 at 10:35 AM, Kevin Chen &lt;<a href="mailto:fch6699@gmail.com" target="_blank">fch6699@gmail.com</a>&gt; wrote:<br>
&gt;<br>
&gt; Hi Guys,<br>
&gt;<br>
&gt; I was wondering whether GROMACS will support ICC 14 in the near future?<br>
&gt;<br>
&gt; Kevin Chen, Ph.D.<br>
&gt; HPC Applications, TACC<br>
&gt;<br>
&gt;<br>
&gt; -----Original Message-----<br>
&gt; From: <a href="mailto:gromacs.org_gmx-developers-bounces@maillist.sys.kth.se" target="_blank">gromacs.org_gmx-developers-bounces@maillist.sys.kth.se</a><br>
&gt; [mailto:<a href="mailto:gromacs.org_gmx-developers-bounces@maillist.sys.kth.se" target="_blank">gromacs.org_gmx-developers-bounces@maillist.sys.kth.se</a>] On<br>
&gt; Behalf Of Alexey Shvetsov<br>
&gt; Sent: Sunday, September 28, 2014 3:09 PM<br>
&gt; To: <a href="mailto:gmx-developers@gromacs.org" target="_blank">gmx-developers@gromacs.org</a><br>
&gt; Subject: Re: [gmx-developers] Possible bug in gmx<br>
&gt;<br>
&gt; Mark Abraham писал 28-09-2014 18:17:<br>
&gt;&gt; How about a redmine issue - this thread&#39;s not about GROMACS<br>
&gt;&gt; development, per se ;-)<br>
&gt;<br>
&gt; Sorry about that =D<br>
&gt;<br>
&gt; Redmine issue <a href="http://redmine.gromacs.org/issues/1607" target="_blank">
http://redmine.gromacs.org/issues/1607</a><br>
&gt; With relevant tpr file attached<br>
&gt;&gt;<br>
&gt;&gt; Mark<br>
&gt;&gt;<br>
&gt;&gt; On Sun, Sep 28, 2014 at 3:18 PM, Alexey Shvetsov<br>
&gt;&gt; &lt;<a href="mailto:alexxy@omrb.pnpi.spb.ru" target="_blank">alexxy@omrb.pnpi.spb.ru</a>&gt; wrote:<br>
&gt;&gt;<br>
&gt;&gt;&gt; Hi Berk!<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; It&#39;s not a cut-and-paste error, and there are no PDB dumps.<br>
&gt;&gt;&gt; I&#39;ve also seen this error before with other systems.<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; I can provide the tpr file for that system:<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; <a href="https://biod.pnpi.spb.ru/~alexxy/gmx/psa_pep_ctrl.md_npt.tpr" target="_blank">
https://biod.pnpi.spb.ru/~alexxy/gmx/psa_pep_ctrl.md_npt.tpr</a> [1]<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; Berk Hess писал 28-09-2014 16:37:<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; Hi,<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; I assume that your old and new coordinates being identical is correct<br>
&gt;&gt;&gt; and not a cut-and-paste error.<br>
&gt;&gt;&gt; That seems a bit strange; do you freeze part of the system?<br>
&gt;&gt;&gt; The only things moving here are then the domain boundaries, and I<br>
&gt;&gt;&gt; don&#39;t see an issue there, since they only moved a little.<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; Do you have any more output besides the error message? PDB dump<br>
&gt;&gt;&gt; files maybe?<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; Cheers,<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; Berk<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; On 09/28/2014 02:22 PM, Alexey Shvetsov wrote:<br>
&gt;&gt;&gt; Hi,<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; Just want to add that this error seems to be reproducible even on a<br>
&gt;&gt;&gt; single node. I also get the same error for GPU runs.<br>
&gt;&gt;&gt; However, I don&#39;t see it in large systems (800k+ atoms) running on a<br>
&gt;&gt;&gt; large number of CPUs (512+).<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; Alexey Shvetsov писал 28-09-2014 13:44:<br>
&gt;&gt;&gt; Hi,<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; DD grid is<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; Domain decomposition grid 4 x 1 x 1, separate PME ranks 0 PME domain<br>
&gt;&gt;&gt; decomposition: 4 x 1 x 1<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; for 4 node setup<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; and<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; Domain decomposition grid 4 x 2 x 1, separate PME ranks 0 PME domain<br>
&gt;&gt;&gt; decomposition: 4 x 2 x 1<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; for 8 node setup<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; It&#39;s reproducible with the 5.0 release and the latest git master. I&#39;ll<br>
&gt;&gt;&gt; try to check whether it&#39;s reproducible with 1 node. I can also provide<br>
&gt;&gt;&gt; the tpr file for this system.<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; Mark Abraham писал 28-09-2014 13:28:<br>
&gt;&gt;&gt; Hi,<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; It&#39;s hard to say from that information. There were some issues fixed<br>
&gt;&gt;&gt; in the lead-up to GROMACS 5 with DD not always working with 2<br>
&gt;&gt;&gt; domains in a direction, but that&#39;s a pure guess. I&#39;d assume you can<br>
&gt;&gt;&gt; reproduce this with the release-5-0 branch. Do you observe it with a single domain?<br>
&gt;&gt;&gt; If not, then it&#39;s surely a bug (and should be submitted to Redmine).<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; Mark<br>
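The single-domain check suggested above can be run directly; a sketch using the binary and tpr file names from the quoted output (the step count is arbitrary, and mpirun flags vary by MPI installation):

```shell
# One MPI rank => one DD domain. If the "charge group moved" error appears
# only with -np 4 or -np 8 but not with -np 1, that points at a DD bug
# rather than a poorly equilibrated system.
mpirun -np 1 mdrun_mpi -s psa_pep_ctrl.md_npt.tpr -nsteps 100000
mpirun -np 4 mdrun_mpi -s psa_pep_ctrl.md_npt.tpr -nsteps 100000
```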
&gt;&gt;&gt;<br>
&gt;&gt;&gt; On Sun, Sep 28, 2014 at 11:18 AM, Alexey Shvetsov<br>
&gt;&gt;&gt; &lt;<a href="mailto:alexxy@omrb.pnpi.spb.ru" target="_blank">alexxy@omrb.pnpi.spb.ru</a>&gt; wrote:<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; Hi all!<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; I&#39;m doing some tests with a small peptide and constantly getting this<br>
&gt;&gt;&gt; error. I get it with a few systems.<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; System sizes are around 10k or 20k atoms.<br>
&gt;&gt;&gt; I run it on 4 or 8 old nodes, each with two Xeon 54xx-series CPUs.<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; starting mdrun &#39;2ZCH_3 in water&#39;<br>
&gt;&gt;&gt; 50000000 steps, 100000.0 ps (continuing from step 1881000, 3762.0<br>
&gt;&gt;&gt; ps).<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; Step 13514000:<br>
&gt;&gt;&gt; The charge group starting at atom 6608 moved more than the distance<br>
&gt;&gt;&gt; allowed by the domain decomposition (1.112924) in direction X<br>
&gt;&gt;&gt; distance out of cell -1.193103 Old coordinates: 5.467 0.298 3.636<br>
&gt;&gt;&gt; New<br>
&gt;&gt;&gt; coordinates: 5.467 0.298 3.636 Old cell boundaries in direction X:<br>
&gt;&gt;&gt; 4.037 5.382 New cell boundaries in direction X: 4.089 5.452<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt;<br>
&gt;&gt; --------------------------------------------------------------------------<br>
&gt;&gt;&gt; MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD with<br>
&gt;&gt;&gt; errorcode 1.<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.<br>
&gt;&gt;&gt; You may or may not see output from other processes, depending on<br>
&gt;&gt;&gt; exactly when Open MPI kills them.<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt;<br>
&gt;&gt; --------------------------------------------------------------------------<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; -------------------------------------------------------<br>
&gt;&gt;&gt; Program mdrun_mpi, VERSION 5.1-dev-20140922-20c00a9-dirty-unknown<br>
&gt;&gt;&gt; Source code file:<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt;<br>
&gt;&gt; /var/tmp/alexxy/portage/sci-chemistry/gromacs-9999/work/gromacs-9999/src/gromacs/mdlib/domdec.cpp,<br>
&gt;&gt;&gt; line: 4388<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; Fatal error:<br>
&gt;&gt;&gt; A charge group moved too far between two domain decomposition steps<br>
&gt;&gt;&gt; This usually means that your system is not well equilibrated For<br>
&gt;&gt;&gt; more information and tips for troubleshooting, please check the<br>
&gt;&gt;&gt; GROMACS website at <a href="http://www.gromacs.org/Documentation/Errors" target="_blank">
http://www.gromacs.org/Documentation/Errors</a> [2]<br>
&gt;&gt;&gt; [1]<br>
&gt;&gt;&gt; -------------------------------------------------------<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; -- Best Regards,<br>
&gt;&gt;&gt; Alexey &#39;Alexxy&#39; Shvetsov, PhD<br>
&gt;&gt;&gt; Department of Molecular and Radiation Biophysics FSBI Petersburg<br>
&gt;&gt;&gt; Nuclear Physics Institute, NRC Kurchatov Institute, Leningrad<br>
&gt;&gt;&gt; region, Gatchina, Russia mailto:<a href="mailto:alexxyum@gmail.com" target="_blank">alexxyum@gmail.com</a><br>
&gt;&gt;&gt; mailto:<a href="mailto:alexxy@omrb.pnpi.spb.ru" target="_blank">alexxy@omrb.pnpi.spb.ru</a><br>
&gt;&gt;&gt; -- Gromacs Developers mailing list<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; * Please search the archive at<br>
&gt;&gt;&gt; <a href="http://www.gromacs.org/Support/Mailing_Lists/GMX-developers_List" target="_blank">
http://www.gromacs.org/Support/Mailing_Lists/GMX-developers_List</a><br>
&gt;&gt;&gt; [3] [2]<br>
&gt;&gt;&gt; before posting!<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; * Can&#39;t post? Read <a href="http://www.gromacs.org/Support/Mailing_Lists" target="_blank">
http://www.gromacs.org/Support/Mailing_Lists</a> [4]<br>
&gt;&gt;&gt; [3]<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; * For (un)subscribe requests visit<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt;<br>
&gt;&gt; <a href="https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-develope" target="_blank">
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-develope</a><br>
&gt;&gt; r<br>
&gt;&gt; s<br>
&gt;&gt;&gt; [5]<br>
&gt;&gt;&gt; [4] or send a mail to <a href="mailto:gmx-developers-request@gromacs.org" target="_blank">gmx-developers-request@gromacs.org</a>.<br>
&gt;&gt;&gt;<br>
&gt;&gt;&gt; Links:<br>
&gt;&gt;&gt; ------<br>
&gt;&gt;&gt; [1] <a href="http://www.gromacs.org/Documentation/Errors" target="_blank">http://www.gromacs.org/Documentation/Errors</a> [2] [2]<br>
&gt;&gt;&gt; <a href="http://www.gromacs.org/Support/Mailing_Lists/GMX-developers_List" target="_blank">
http://www.gromacs.org/Support/Mailing_Lists/GMX-developers_List</a> [3]<br>
&gt;&gt;&gt; [3] <a href="http://www.gromacs.org/Support/Mailing_Lists" target="_blank">http://www.gromacs.org/Support/Mailing_Lists</a> [4] [4]<br>
&gt;&gt;&gt;<br>
&gt;&gt; <a href="https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-develope" target="_blank">
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-develope</a><br>
&gt;&gt; r<br>
&gt;&gt; s<br>
&gt;&gt;&gt; [5]<br>
&gt;&gt;<br>
&gt;&gt;  -- Best Regards,<br>
&gt;&gt;  Alexey &#39;Alexxy&#39; Shvetsov, PhD<br>
&gt;&gt;  Department of Molecular and Radiation Biophysics  FSBI Petersburg<br>
&gt;&gt; Nuclear Physics Institute, NRC Kurchatov Institute,  Leningrad<br>
&gt;&gt; region, Gatchina, Russia  mailto:<a href="mailto:alexxyum@gmail.com" target="_blank">alexxyum@gmail.com</a><br>
&gt;&gt; mailto:<a href="mailto:alexxy@omrb.pnpi.spb.ru" target="_blank">alexxy@omrb.pnpi.spb.ru</a><br>
&gt;&gt;<br>
</div></div></div>
</div>
<div><div class="h5"><br>
<div>
<div>&gt;&gt;<br>
&gt;&gt;<br>
&gt;<br>
&gt; --<br>
&gt; Best Regards,<br>
&gt; Alexey &#39;Alexxy&#39; Shvetsov, PhD<br>
&gt; Department of Molecular and Radiation Biophysics FSBI Petersburg<br>
&gt; Nuclear Physics Institute, NRC Kurchatov Institute, Leningrad region,<br>
&gt; Gatchina, Russia mailto:<a href="mailto:alexxyum@gmail.com" target="_blank">alexxyum@gmail.com</a><br>
&gt; mailto:<a href="mailto:alexxy@omrb.pnpi.spb.ru" target="_blank">alexxy@omrb.pnpi.spb.ru</a><br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt;<br>
&gt; --<br>
&gt; ORNL/UT Center for Molecular Biophysics <a href="http://cmb.ornl.gov" target="_blank">
cmb.ornl.gov</a> <a href="tel:865-241-1537" value="+18652411537" target="_blank">865-241-1537</a>,<br>
&gt; ORNL PO BOX 2008 MS6309<br>
&gt;<br>
&gt;<br>
<br>
</div>
</div>
</div></div></blockquote>
</div>
<br>
</div>
</div>
</div>
</div>
</span>
</div>

</blockquote></div><br><br clear="all"><div><br></div>-- <br>ORNL/UT Center for Molecular Biophysics <a href="http://cmb.ornl.gov">cmb.ornl.gov</a><br>865-241-1537, ORNL PO BOX 2008 MS6309
</div>