All my domain runs failed on the first try but started up OK on the second and subsequent runs without changing anything. I suggest trying again.
I am experiencing the same thing, but I am currently still on 15.99.2. My runs fail, and I do not get any diagnostic messages. Is this problem solved in 15.99.5?
Changes needed for GFS Update 2017?
Re: Changes needed for GFS Update 2017?
-
- Posts: 59
- Joined: Tue Jun 05, 2012 5:25 pm
Re: Changes needed for GFS Update 2017?
Hi, I have the same problem as Meteo60 with temperature = dpt and rh = 100.
I just updated to 15.99.9.
The problem is in metgrid:
IV. Horizontal interpolation of the intermediate files to the computational domain
* Calculating mean surface temperatures for missing water temperature values - Moving On
* Interpolating fields from the initialization to the computational domain (13 CPUs)*** glibc detected *** /usr1/uems/util/mpich2/bin/mpiexec: free(): invalid pointer: 0x00000000004df9a0 ***
======= Backtrace: =========
[0x4f6c3a]
[0x4f98db]
[0x45d658]
[0x40c7b9]
[0x401f3f]
[0x4df20b]
[0x400489]
======= Memory map: ========
00400000-005bb000 r-xp 00000000 09:02 853156609 /usr1/uems/util/mpich2/bin/mpiexec
007bb000-00860000 rwxp 001bb000 09:02 853156609 /usr1/uems/util/mpich2/bin/mpiexec
00860000-0087b000 rwxp 00000000 00:00 0
019a6000-019c9000 rwxp 00000000 00:00 0 [heap]
2b8e96781000-2b8e96782000 rwxp 00000000 00:00 0
2b8e98000000-2b8e98027000 rwxp 00000000 00:00 0
2b8e98027000-2b8e9c000000 ---p 00000000 00:00 0
7fff80884000-7fff808a6000 rwxp 00000000 00:00 0 [stack]
7fff809ec000-7fff809ee000 r-xp 00000000 00:00 0 [vdso]
7fff809ee000-7fff809f0000 r--p 00000000 00:00 0 [vvar]
ffffffffff600000-ffffffffff601000 r-xp 00000000 00:00 0 [vsyscall]
- Failed (134)
! In metgrid (1) - This is not looking good!
Re: Changes needed for GFS Update 2017?
On 15.99.x this problem of RH and T°C over coastlines is solved for me. With 15.99.2 the run crashes at the beginning (segmentation fault). With 15.99.5 everything is OK.
Giancarlo Modugno wrote: ↑Mon Jul 24, 2017 10:45 am Hi, I have the same problem as Meteo60 with temperature = dpt and rh = 100.
I just updated to 15.99.9.
The problem is in metgrid (crash log as quoted above).
-
- Posts: 59
- Joined: Tue Jun 05, 2012 5:25 pm
Re: Changes needed for GFS Update 2017?
Thank you, Meteo60.
My mistake: my version is 15.99.5, not 15.99.9.
I get the errors with 0.5° GFS input data, but with 0.25° GFS it seems to work...
Starting UEMS Program ems_prep.pl (V15.99.5) on meteoweb at Tue Jul 25 14:54:12 2017 UTC
I. STRC EMS ems_prep Simulation Initialization Summary
Initialization Start Time : Tue Jul 25 00:00:00 2017 UTC
Initialization End Time : Tue Jul 25 06:00:00 2017 UTC
Boundary Condition Frequency : 180 Minutes
Initialization Data Set : gfs personal tile data set
Boundary Condition Data Set : gfs personal tile data set
Static Surface Data Sets : None
Land Surface Data Sets : None
II. Search out requested files for WRF model initialization
* Locating gfs files for model initial and boundary conditions
Areal coverage of your 0.5 degree GFS personal tile
Corner Lat-Lon points of the domain:
49.50, 2.00 49.50, 23.00
* *
* 41.83, 12.51
* *
34.00, 2.00 34.00, 23.00
Initiating HTTP connection to ems3.comet.ucar.edu
Making request #1 of 3 for personal tile data
-> Attempting to acquire 17072500.gfs.t00z.0p50.pgrb2f000 - Success (0.20 mb/s)
-> Attempting to acquire 17072500.gfs.t00z.0p50.pgrb2f003 - Success (0.20 mb/s)
-> Attempting to acquire 17072500.gfs.t00z.0p50.pgrb2f006 - Success (0.20 mb/s)
* All requested gfs files are available for model initialization
Excellent! - Your master plan is working!
III. Create the WPS ARW intermediate format files
* Processing gfs files for use as model initial and boundary conditions - Yatzee!!
Intermediate file processing completed in 3.44 seconds
IV. Horizontal interpolation of the intermediate files to the computational domain
* Calculating mean surface temperatures for missing water temperature values - Moving On
* Interpolating fields from the initialization to the computational domain (13 CPUs) - Success!
* Metgrid processed files are located in
/usr1/uems/runs/italia-new/wpsprd
Horizontal interpolation to computational domain completed in 3.02 seconds
Your awesome EMS Prep party is complete - Tue Jul 25 14:54:36 2017 UTC
Alphanumeric code 6EQUJ5 will someday be interpreted as: "Think Globally, Model Locally!"
Starting UEMS Program ems_run.pl (V15.99.5) on meteoweb at Tue Jul 25 14:54:36 2017 UTC
I. Preparing your EMS Run experience
* You are running the WRF ARW core. Hey Ho! Let's go! - model'n!
* Simulation start and end times:
Domain Start End
1 2017-07-25_00:00:00 2017-07-25_06:00:00
* Simulation length will be 6 hours
* A large timestep of 50 seconds will be used for this simulation
II. Creating the initial and boundary condition files for the user domain(s)
* The WRF REAL program shall be run on the following systems and processors:
14 processors on meteoweb (1 tile per processor)
* Creating WRF initial and boundary condition files
* Initial and boundary conditions created in 5 seconds
Moving on to bigger and better delusions of grandeur
III. Running ARW WRF while thinking happy thoughts
* The WRF ARW core shall be run on the following systems and processors:
14 processors on meteoweb (1 tile per processor)
* Run Output Frequency Primary wrfout Aux File 1
---------------------------------------------------
Domain 01 : 1 hour Off
* Running your simulation with enthusiasm!
You can sing along to the progress of the simulation while watching:
% tail -f /usr1/uems/runs/italia-new/rsl.out.0000
Unless you have something better to do with your time
! Possible problem as system return code was 139
! Your WRF simulation (PID 17755) returned an exit status of 139, which is never good.
System Signal Code (SN) : 11 (Invalid Memory Reference - Seg Fault)
! While perusing the log/run_wrfm.log file I determined the following:
It appears that your run failed due to a Segmentation Fault on your
system. This failure typically occurs when the EMS attempts to access a
region of memory that has not been allocated. Most often, segmentation
faults are due to an array bounds error or accessing memory through a NULL
pointer. Either way, this is an issue that needs to be corrected by the
developer.
So, if you want this problem fixed send your log files along with the
namelist.wrfm and namelist.wps files to Robert.Rozumalski@noaa.gov, just
because he cares.
! Here are the last few lines from the run_wrfm.log file:
starting wrf task 1 of 14
starting wrf task 10 of 14
starting wrf task 7 of 14
starting wrf task 5 of 14
starting wrf task 4 of 14
starting wrf task 2 of 14
starting wrf task 9 of 14
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 17764 RUNNING AT meteoweb
= EXIT CODE: 139
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
YOUR APPLICATION TERMINATED WITH THE EXIT STRING: Segmentation fault (signal 11)
This typically refers to a problem with your application.
Please see the FAQ page for debugging suggestions
* System information is available in static/ems_system.info
! Here's a little help from your friend at EMS world headquarters:
The log files from your premature termination have been neatly bundled in log/2017072514.wrfm_crash_logs.tgz
Feel free to send them to a person who cares should you need some
assistance in troubleshooting this problem.
* All available wrfout files have been moved to the wrfprd directory
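If you want to poke at the crash yourself before sending anything, here is a minimal sketch using the archive name from the log above. The full path to the bundle and the presence of the rsl.* files inside it are assumptions based on the run directory shown earlier; adjust to whatever your install actually produced.
% mkdir /tmp/wrfm_crash
% tar xzf /usr1/uems/runs/italia-new/log/2017072514.wrfm_crash_logs.tgz -C /tmp/wrfm_crash
% grep -riE "error|fatal" /tmp/wrfm_crash     # scan the bundled logs for the first real complaint before the seg fault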
-
- Posts: 59
- Joined: Tue Jun 05, 2012 5:25 pm
Re: Changes needed for GFS Update 2017?
No... The simulation with 0.25° is OK for 6 hours, but with 18, 24, or more hours I get the same error:
! Possible problem as system return code was 139
! Your WRF simulation (PID 29927) returned an exit status of 139, which is never good.
System Signal Code (SN) : 11 (Invalid Memory Reference - Seg Fault)
! While perusing the log/run_wrfm.log file I determined the following:
It appears that your run failed due to a Segmentation Fault on your
system. This failure typically occurs when the EMS attempts to access a
region of memory that has not been allocated. Most often, segmentation
faults are due to an array bounds error or accessing memory through a NULL
pointer. Either way, this is an issue that needs to be corrected by the
developer.
So, if you want this problem fixed send your log files along with the
namelist.wrfm and namelist.wps files to Robert.Rozumalski@noaa.gov, just
because he cares.
! Here are the last few lines from the run_wrfm.log file:
starting wrf task 0 of 14
starting wrf task 2 of 14
starting wrf task 3 of 14
starting wrf task 9 of 14
starting wrf task 1 of 14
starting wrf task 5 of 14
starting wrf task 8 of 14
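One more thing that might be worth checking for the longer runs, sketched as a guess: the boundary-condition frequency is 180 minutes (see the ems_prep summary above), so an 18 or 24 hour run needs tiles out to f018 or f024, and it may be worth confirming they were all acquired. The location under the run directory is an assumption:
% find /usr1/uems/runs/italia-new -name "*pgrb2f0*" | sort     # expect one tile every 3 forecast hours out to the run length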
-
- Posts: 59
- Joined: Tue Jun 05, 2012 5:25 pm
Re: Changes needed for GFS Update 2017?
OK... I confirm that the problem occurs only with GFS 0.50°...
Any idea?
-
- Posts: 1604
- Joined: Wed Aug 19, 2009 10:05 am
Re: Changes needed for GFS Update 2017?
Can you check the contents of these ptiles? They might not have the "landn" variable.
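One hedged way to make that check, assuming wgrib2 is installed and using the tile names from the ems_prep log above. As far as I understand the naming, the landn field in the metgrid tables is derived from the GRIB-level LAND record, so LAND is the record to look for in the tiles themselves:
% wgrib2 -s 17072500.gfs.t00z.0p50.pgrb2f000 | grep -i "land"     # no output here would mean the land mask is missing from the tile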
Re: Changes needed for GFS Update 2017?
Hi Ivan, is this needed to activate NMM? If so, where should these files be placed?
meteoadriatic wrote: ↑Thu Jul 20, 2017 4:54 pm OK, here are the changes for NMM
http://gamma.meteoadriatic.net/tmp/NMM/
This ungrib is from WPS 3.9 with the patch for the new GFS, compiled statically, so it should work on any system. The one from Robert's update should also work in NMM, but I haven't tested it myself. Vtable.GFS should be placed in data/tables/vtable and METGRID.TBL.NMM in data/tables/wps.
That should be all you need to change. Let me know if it works.
Ivan
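For reference, a minimal sketch of the placement Ivan describes above, assuming the UEMS tree is rooted at /usr1/uems as in the logs in this thread and that the files were downloaded from the URL above into /path/to/download (both the download path and the exact name/location of the ungrib binary in the bin directory are placeholders; back up the originals first):
% cd /usr1/uems
% cp data/tables/vtable/Vtable.GFS data/tables/vtable/Vtable.GFS.orig       # keep a copy of the current table
% cp /path/to/download/Vtable.GFS data/tables/vtable/
% cp data/tables/wps/METGRID.TBL.NMM data/tables/wps/METGRID.TBL.NMM.orig
% cp /path/to/download/METGRID.TBL.NMM data/tables/wps/
% cp bin/ungrib bin/ungrib.orig                                             # name and location of the ungrib executable assumed
% cp /path/to/download/ungrib bin/ && chmod +x bin/ungrib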
-
- Posts: 59
- Joined: Tue Jun 05, 2012 5:25 pm
Re: Changes needed for GFS Update 2017?
Sorry, I can't find the ptile files or the landn variable; I looked in conf/grib_info and also tried the "locate *ptile*" command in a terminal.
meteoadriatic wrote: ↑Tue Jul 25, 2017 4:50 pm Can you check the contents of these ptiles? They might not have the "landn" variable.
Can you point me to them, please?
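If it helps, a hedged guess at where to look: ems_prep normally keeps the acquired GRIB tiles somewhere under the run directory rather than under conf/ (conf/grib_info only holds the dataset definitions), and the tile names follow the pattern shown in the log above rather than containing "ptile", so find may work better than locate here:
% find /usr1/uems/runs/italia-new -name "*pgrb2f*" 2>/dev/null     # list the downloaded 0.5-degree tiles, if any
Whatever that returns can then be fed to the wgrib2 check sketched after Ivan's post above.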
-
- Posts: 1604
- Joined: Wed Aug 19, 2009 10:05 am
Re: Changes needed for GFS Update 2017?
This works just fine for my NMM. I wrote the paths for the tables above; the ungrib executable should go in place of the one you have in your bin directory.
dominic wrote: ↑Tue Jul 25, 2017 10:23 pm Hi Ivan, is this needed to activate NMM? If so, where should these files be placed?
meteoadriatic wrote: ↑Thu Jul 20, 2017 4:54 pm OK, here are the changes for NMM
http://gamma.meteoadriatic.net/tmp/NMM/
This ungrib is from WPS 3.9 with the patch for the new GFS, compiled statically, so it should work on any system. The one from Robert's update should also work in NMM, but I haven't tested it myself. Vtable.GFS should be placed in data/tables/vtable and METGRID.TBL.NMM in data/tables/wps.
That should be all you need to change. Let me know if it works.
Ivan