Using GFS 0.25° grid dataset

Compiling NMM/ARW code for the EMS system, upgrading WRF cores on your own, changing EMS scripts to suit your particular needs, and other modifications to the original EMS distribution go in this forum. These are officially unsupported actions.
meteoadriatic
Posts: 1512
Joined: Wed Aug 19, 2009 10:05 am

Using GFS 0.25° grid dataset

Post by meteoadriatic » Wed Oct 29, 2014 4:44 pm

Hello all,

this is very exciting news. The GFS parallel run, which will become the operational GFS at the next upgrade, is now available on the NCEP FTP server. The most important part of this update is the availability of a 0.25 degree output resolution grid.

So, here is a quick guide for you guys on how to use it with your WRF :mrgreen:

First, a new grib_info file for the GFS parallel run. Put this one into your $EMS/conf/grib_info directory with the file name gfs-para_gribinfo.conf:

Code:

#  GRIB information file for ems_prep.pl - RAR/NWS May 2008
#
#  This file contains information on data sets used with the ems_prep.pl routine for
#  initialization of the WRF EMS. Should you decide to mangle its contents or use
#  this file as a template, then please note that the following parameters are necessary
#  for your happiness and well-being.
#

#1/2. CATEGORY is used to categorize this data set for the purpose of better organizing
#     the available initialization data options when passing the --dslist flag to ems_prep.
#     The category may be anything you choose but it's recommended that you stick to a few
#     established conventions unless you have a good reason to create your own. The current
#     category list includes:
#
#         Category                    Description
#         ---------------------------------------------------------------------------------
#         Land Surface Model (LSM)  - Data sets containing LSM-related fields (--lsm)
#         Surface            (SFC)  - Data sets containing static surface fields (--sfc)
#         Forecast           (FCST) - Operational Forecast data sets
#         Analysis           (ANAL) - Operational Analysis data sets (--analysis)
#         Model Forecast     (MODL) - Data sets from Non-operational model runs
#         Historical         (REAN) - Historical or Reanalysis data sets
#
#     The following may be appended to the category to indicate a personal tile data set
#
#         Personal Tiles     (PTIL) - STRC Personal tile data sets
#
#     If you want something different just make up a category name and it will be handled
#     appropriately by ems_prep.
#
#     Leaving CATEGORY blank or undefined will result in the data set being placed in
#     the "Land of misfit data sets" category.
#
CATEGORY = Forecast

#  1. INFO provides some general information about the data set such as forecast frequency, 
#     vertical and horizontal resolution and coordinate system.
#
#     Example: INFO = .25 degree Global - Isobaric coordinate - 3hourly grib format
#
INFO = GFS Model 0.25 degree - 3hourly (>200MB per file)


#  2. VCOORD identifies the vertical coordinate of the data set. Typical values are
#     
#       VCOORD = press  : Isobaric Coordinate
#       VCOORD = hybrid : Hybrid Coordinate (such as RUC)
#       VCOORD = theta  : Isentropic Coordinate
#       VCOORD = height : Height Coordinate
#       VCOORD = sigma  : Sigma Coordinate (Native ARW or NMM)
#       VCOORD = none   : No vertical coordinate (Surface-based) data
#
VCOORD = press


#  3. INITFH is the initial (forecast) hour of the data set you wish to download.  It is the
#     default value but may be overridden from the command line or the CYCLES setting below.
#
#      Examples:
#
#        INITFH    =  00 
#
#      To download the 00 hour forecast from the data set as your 00 hour forecast.
#
#        INITFH    =  06
#
#      To download the 06 hour forecast from the data set as your 00 hour forecast.
#  
INITFH = 00

#  4. FINLFH is the final (forecast) hour of the GRIB file(s) you wish to download. Again,
#     this is the default value and may be overridden on the command line or with the
#     CYCLES parameter. 
# 
#       Example: FINLFH = 48
# 
#     To use the 48 hour forecast from the data set as your last time for your boundary
#     conditions.
# 
#     Note that FINLFH - INITFH defines the length of your run unless otherwise overridden.
#     See the CYCLES option below or the [--length ] ems_prep option for more details.
#
FINLFH = 24

#  5. FREQFH is the frequency, in hours, of the (forecast) files you wish to download
#     between INITFH and FINLFH. This serves as your boundary condition frequency and it is
#     suggested that you use the highest frequency available (lowest value), which is
#     usually 3-hourly (FREQFH = 03). Do not set this value lower than the frequency of the
#     available data because bad stuff will happen.
#
#     Example: FREQFH = 03
#
FREQFH = 03


#  6. CYCLES defines the cycle hours (UTC) for which forecast files are generated from the
#     model runs. The standard format for this parameter is:
# 
#       CYCLES = CYCLE 1, CYCLE 2, ..., CYCLE N, 
# 
#     where each cycle time (UTC) is separated by a comma (,). For example:
# 
#       Example: CYCLES = 00,06,12,18
# 
#     IMPORTANT: The times listed in CYCLE are critical to ems_prep.pl working correctly as
#     they identify the most recent data set available when executing real-time simulations.
#     For example, if you want to download a 12 UTC run but "12" is not listed in the CYCLES
#     setting, you will be out of luck (SOL). The ems_prep.pl routine will default to the 
#     most recent cycle time.
# 
#     Alternatively, if you include a cycle time for which no data set exists then you will
#     have problems with your real-time downloads. Just don't do it.
#               
#     There is a caveat though, please see the DELAY parameter below for more information.
#
#     Note that the CYCLES setting can be overridden with the --cycles command line option. 
#     See "ems_prep --guide" for the gory details. 
# 
# 
#     ADVANCED COMPLEX STUFF: 
# 
#     The CYCLES parameter may be used to override the default INITFH, FINLFH, and FREQFH 
#     values. If you do not want to use default settings for every model cycle, try using:
# 
#        CYCLES = CYCLE[:INITFH:FINLFH:FREQFH].  
# 
#     Example: CYCLES = 00:24:36:06,06,12,18:12:36
#  
#     Note that in the above example the individual cycle times are separated by a comma (,)
#     and the INITFH, FINLFH, and FREQFH values are separated by a colon (:).
# 
#     INTERPRETATION:
# 
#       From the 00Z Cycle run (00:24:36:06), obtain the 24 to 36 hour forecasts 
#       every 06 hours. Note these values override the INITFH, FINLFH, and FREQFH 
#       default values!
#  
#       From the 06Z Cycle run (06) use the default values of INITFH, FINLFH, and 
#       FREQFH specified above.
#  
#       From the 12Z Cycle run (12) use the default values of INITFH, FINLFH, and 
#       FREQFH specified above.
#  
#       From the 18Z Cycle run (18:12:36), obtain the 12 to 36 hour forecasts every
#       FREQFH hours.
# 
#       There are even a few other options when using the --cycle option in ems_prep. 
#       See the guide for more information.
# 
CYCLES = 00,06,12,18


#  7. DELAY represents the number of hours, following a cycle time, before the GRIB files
#     are available. In most cases, a lag exists from the time that  the operational model
#     is initialized to when the run is completed and the data are  post processed. For
#     example, if DELAY = 3, then ems_prep.pl will not look for the 12Z cycle run files
#     until after 15Z (12+3). The 06Z cycle would be used as the current cycle (default)
#     between 9 and 15Z. This behavior can be overridden with the  --nodelay option in 
#     ems_prep.pl.
# 
#     Note that if you set the value too low then you will be hitting the server for 
#     data when the files are not available, which is not good.
# 
#     Example: DELAY = 05
# 
DELAY = 04
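#     (Worked example, added for clarity: with DELAY = 04 the 12Z gfs-para run will not be
#      requested before 16Z; between 10Z and 16Z the 06Z cycle is treated as the current one.)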

#  8. TILES contains a list of NCEP model grib tiles separated by a comma (,). If the data 
#     set described by this file does not use NCEP tiles then TILES will be ignored. If the 
#     data set does consist of individual tiles that must be quilted together then be sure
#     to include the TT placeholder for tile number in the file naming convention for both
#     the remote and local filenames.
#
#     Example: TILES = 31,32,22,23
#
#     Note that gif images depicting the 32km 221 and 12km 218 tile locations can be found in
#     the wrfems/docs directory. 
#
#     NOTE:  It is up to the user to make sure that the list of tiles, when quilted together,
#            creates a rectangle AND that the quilted domain is sufficiently large to cover
#            the model computational domain.  Bad things happen when the model domain 
#            resides outside the areal coverage of the initialization data set. Not like
#            William Shatner sings "Lucy in the Sky With Diamonds" bad, but you still do
#            not want to go there.
#
TILES = 

#  9. SERVER-METHOD specifies the method used to download the data files, along with the
#     location of the files on the server and the filename convention, joined by a colon (:).
# 
#     SERVER-METHOD = SERVER ID:/directory location of data on server/filename convention
# 
#     IMPORTANT: The SERVER ID must have a corresponding IP/hostname defined in the SERVER
#     section of the wrfems/data/conf/config/ems_prep/prep_global.conf configuration file.
# 
#     Note that the following place holders will be replaced with the appropriate values 
#     in ems_prep.pl:
# 
#       YYYY  - 4 digit year
#       YY    - 2 digit year
#       MM    - 2 digit month
#       DD    - 2 digit day
#       CC    - Model cycle hour
#       FF    - Forecast hour [0-99]
#       FFF   - Forecast hour [100-999]
#       NN    - 2-digit Forecast minute
#       TT    - Tile number(s) for GRIB tiles.
#
#     The METHOD indicates the method to use to acquire the data. Currently ftp, http, and
#     nfs are supported, indicated by SERVER-FTP, SERVER-HTTP, and SERVER-NFS respectively.
#
#     Examples: SERVER-HTTP = STRC:/data/grib/YYYYMMDD/nam/grid212/grib.tCCz/nam.tCCz.awip3dFF.tm00.bz2
#               SERVER-FTP  = NCEP:/pub/data/nccf/com/gfs/prod/gfs.YYYYMMDD/gfs.tCCz.pgrb2f0FF.bz2
#               SERVER-NFS  = KIELBASA:/data/archive/YYYYMMDD/grib/212/grib.tCCz/nam.tCCz.awip3dFF.tm00
#
#     In the first example above, STRC is the ID of the http server and has a corresponding 
#     STRC = <hostname> entry in the prep_global.conf file. The files are located in the 
#     /data/grib/YYYYMMDD/nam/grid212/grib.tCCz directory on the server and 
#     /nam.tCCz.awip3dFF.tm00.bz2 is the naming convention, with place holders.
#
#     Note that ems_prep.pl will automatically unpack ".gz" and ".bz2" files. If you are 
#     using a data source that is packed then make sure you include the appropriate suffix.
#
#     In the second example above, NCEP is the ID of the ftp server and has a corresponding
#     NCEP = <hostname> entry in prep_global.conf. In order to use the SERVER-FTP (HTTP) option
#     the data files must be available via ftp (http).
#
#     * NFS USERS *
#
#     In the SERVER-NFS example above, KIELBASA is the server ID of the system where the 
#     data reside and there is a corresponding KIELBASA = <hostname> entry in the 
#     prep_global.conf file. Unlike the FTP and HTTP options, either SERVER ID or actual
#     hostname ([user@]<hostname>:) may be used to identify the server. 
#
#     If a SERVER ID is used, it must be in ALL CAPS in both the "SERVER-NFS =" line and the 
#     prep_global.conf file. For example:
#
#       SERVER-NFS  = KIELBASA:/data/grib/YYYYMMDD/gfs/grib.tCCz/YYMMDDCC.gfs.tCCz.pgrb2fFF
#     
#     Where in prep_global.conf: KIELBASA = roz@kielbasa (Note the all capital letters 
#                                                         for kielbasa)
#     And
#
#       SERVER-NFS  = roz@kielbasa:/data/grib/YYYYMMDD/gfs/grib.tCCz/YYMMDDCC.gfs.tCCz.pgrb2fFF
#
#     Are basically the same thing. So why then allow for both options?  Specifying a server
#     ID in the "SERVER-NFS =" line will allow you to specify a server when passing the 
#     --dset <dset set>:nfs:<server> flag to ems_prep.pl. So if you had:
#
#       SERVER-NFS  = SERVER_A:/data/archive/YYYYMMDD/grib/212/grib.tCCz/nam.tCCz.awip3dFF.tm00.gz
#       SERVER-NFS  = SERVER_B:/data/archive/YYYYMMDD/grib/212/grib.tCCz/nam.tCCz.awip3dFF.tm00.gz
#
#     With SERVER_A and SERVER_B defined in prep_global.conf, then you can specify a server
#     to access:
#
#       % ems_prep  [other options]  --dset <data set>:nfs:server_b (either upper or lower case)
#
#     The default behavior with just "ems_prep  --dset <data set>:nfs" will result in ems_prep.pl 
#     looping through each of the servers listed (first SERVER_A and then SERVER_B).
#
#     IMPORTANT! - The ems_prep.pl routine uses secure copy (scp) to access the data on those
#                  servers identified by either the SERVER ID or actual hostname 
#                  ([user@]<hostname>), so you MUST have passwordless ssh configured between
#                  the machine running ems_prep.pl and the server.
#
#     But what if your data reside on the same machine as ems_prep.pl and you don't want to
#     use scp? In that case set the SERVER ID to "LOCAL" or leave blank:
#
#       SERVER-NFS  = LOCAL:/data/archive/YYYYMMDD/grib/212/grib.tCCz/nam.tCCz.awip3dFF.tm00.gz
#     or
#       SERVER-NFS  = /data/archive/YYYYMMDD/grib/212/grib.tCCz/nam.tCCz.awip3dFF.tm00.gz
#
#     in which case ems_prep.pl will use the standard copy command (cp) to access the 
#     requested file from a locally-mounted partition.
#     
#     Finally, if there is more than one server listed below and you do not specify a server
#     or method, i.e., "% ems_prep --dset <data set>", then ems_prep.pl will attempt to connect
#     to each (ftp, http, nfs) server listed until all the requested files have been downloaded.
#
#     So in summary:
#
#         % ems_prep --dset <data set>:<method>:<server> - Attempt to get <data set> via <method> 
#                                                          from <server>
#
#          % ems_prep --dset <data set>:<method>  - Attempt to get <data set> from all the <method>
#                                                   servers listed in the <data set>_gribinfo.conf file.
#
#          % ems_prep --dset <data set> - Attempt to get <data set> via all the methods and servers
#                                         listed in the <data set>_gribinfo.conf file.
#
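#     For illustration only (an assumed expansion, not checked against the live server): with
#     the SERVER-FTP setting below, the 12Z cycle of 29 Oct 2014 at forecast hour 24 would
#     resolve to /pub/data/nccf/com/gfs/para/gfs.2014102912/gfs.t12z.pgrb2.0p25.f024
#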
SERVER-FTP  = NCEP:/pub/data/nccf/com/gfs/para/gfs.YYYYMMDDCC/gfs.tCCz.pgrb2.0p25.fFFF


# 10. LOCFIL is the file naming convention to be used on the LOCAL system.  This filename is
#     usually the same as that on the remote server; but hey, you have the power. The 
#     primary purpose for this parameter is so filenames on the local machine do not change 
#     when failing over to a different remote server, which may not use an identical naming 
#     convention for the same data set. The filename uses the same YYYY, MM, DD, CC, FF, 
#     and TT place holders listed in the SERVER-METHOD section.
#
#     GRIB 2 <-> 1 CONVERSION
#
#     The WRF EMS will automatically convert between GRIB formats if requested by the user.
#     This is necessary when using some data sets such as the NCEP tiles, which are available
#     from NCEP in GRIB 2 format but must be converted to GRIB 1 format before processing
#     into WRF. The ems_prep routine keys off the differences between the filenames on the
#     remote and local systems. If the filename on the remote server contains "grib2", "grb2",
#     or "gr2" but that string is missing from the local filename, then the file will be
#     converted to GRIB 1 format.  Conversely, if the remote file uses a GRIB 1 naming
#     convention but a GRIB 2 name is used locally, then a GRIB 1 -> 2 conversion will occur.
#     Example: LOCFIL = YYMMDDCC.gfs.tCCz.pgrb2f0FF
#
LOCFIL = YYMMDDCC.gfs.tCCz.pgrb2fFF
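#     Note (added for clarity): the local name above keeps "pgrb2", so per the GRIB 2 <-> 1
#     rules described earlier no format conversion should be triggered and the 0.25 degree
#     files remain in GRIB 2 format.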

# 11. VTABLE is the Vtable.<MODEL ID> to use when processing the GRIB grids into the WPS
#     intermediate format. All tables are located in the $EMS/data/conf/tables/vtables directory and
#     define what fields to pull from the GRIB file for processing and initialization in your
#     run. Note that Vtables are quasi-independent of the data set. The table just describes
#     the available fields and not the navigation information, so a single Vtable may
#     be used for multiple data sets.
#
#     LVTABLE is the Vtable to use should this data set be accessed with the --lsm <data set> 
#     option, in which case the user likely wants a subset (near surface fields) of the fields
#     available in the Vtable specified by VTABLE. So, LVTABLE should point to a file
#     that contains just the near surface fields. Both VTABLE and LVTABLE may be specified
#     below.
#
#     Examples: VTABLE  = Vtable.NAM
#               LVTABLE = Vtable.NAMLSM
#
VTABLE  = Vtable.GFSPARA
                               
# That's all there is for now.
#

Since the soil temperature field has changed in the new GRIB files, you will also need a new Vtable for ungrib. Here you are, folks! Put this into $EMS/data/tables/vtables with the filename Vtable.GFSPARA:

Code:

GRIB1| Level| From |  To  | metgrid  | metgrid | metgrid                                 |GRIB2|GRIB2|GRIB2|GRIB2|
Param| Type |Level1|Level2| Name     | Units   | Description                             |Discp|Catgy|Param|Level|
-----+------+------+------+----------+---------+-----------------------------------------+-----------------------+
  11 | 100  |   *  |      | TT       | K       | Temperature                             |  0  |  0  |  0  | 100 |
  33 | 100  |   *  |      | UU       | m s-1   | U                                       |  0  |  2  |  2  | 100 |
  34 | 100  |   *  |      | VV       | m s-1   | V                                       |  0  |  2  |  3  | 100 |
  52 | 100  |   *  |      | RH       | %       | Relative Humidity                       |  0  |  1  |  1  | 100 |
   7 | 100  |   *  |      | HGT      | m       | Height                                  |  0  |  3  |  5  | 100 |
  11 | 105  |   2  |      | TT       | K       | Temperature       at 2 m                |  0  |  0  |  0  | 103 |
  52 | 105  |   2  |      | RH       | %       | Relative Humidity at 2 m                |  0  |  1  |  1  | 103 |
  33 | 105  |  10  |      | UU       | m s-1   | U                 at 10 m               |  0  |  2  |  2  | 103 |
  34 | 105  |  10  |      | VV       | m s-1   | V                 at 10 m               |  0  |  2  |  3  | 103 |
   1 |   1  |   0  |      | PSFC     | Pa      | Surface Pressure                        |  0  |  3  |  0  |   1 |
   2 | 102  |   0  |      | PMSL     | Pa      | Sea-level Pressure                      |  0  |  3  |  1  | 101 |
 144 | 112  |   0  |  10  | SM000010 | fraction| Soil Moist 0-10 cm below grn layer (Up) |  2  |  0  | 192 | 106 |
 144 | 112  |  10  |  40  | SM010040 | fraction| Soil Moist 10-40 cm below grn layer     |  2  |  0  | 192 | 106 |
 144 | 112  |  40  | 100  | SM040100 | fraction| Soil Moist 40-100 cm below grn layer    |  2  |  0  | 192 | 106 |
 144 | 112  | 100  | 200  | SM100200 | fraction| Soil Moist 100-200 cm below gr layer    |  2  |  0  | 192 | 106 |
 144 | 112  |  10  | 200  | SM010200 | fraction| Soil Moist 10-200 cm below gr layer     |  2  |  0  | 192 | 106 |
   2 | 112  |   0  |  10  | ST000010 | K       | T 0-10 cm below ground layer (Upper)    |  2  |  0  |  2  | 106 |
   2 | 112  |  10  |  40  | ST010040 | K       | T 10-40 cm below ground layer (Upper)   |  2  |  0  |  2  | 106 |
   2 | 112  |  40  | 100  | ST040100 | K       | T 40-100 cm below ground layer (Upper)  |  2  |  0  |  2  | 106 |
   2 | 112  | 100  | 200  | ST100200 | K       | T 100-200 cm below ground layer (Bottom)|  2  |  0  |  2  | 106 |
   2 | 112  |  10  | 200  | ST010200 | K       | T 10-200 cm below ground layer (Bottom) |  2  |  0  |  2  | 106 |
  91 |   1  |   0  |      | SEAICE   | proprtn | Ice flag                                | 10  |  2  |  2  |   1 |
  81 |   1  |   0  |      | LANDSEA  | proprtn | Land/Sea flag (1=land, 0 or 2=sea)      |  2  |  0  |  0  |   1 |
   7 |   1  |   0  |      | SOILHGT  | m       | Terrain field of source analysis        |  0  |  3  |  5  |   1 |
  11 |   1  |   0  |      | SKINTEMP | K       | Skin temperature (can use for SST also) |  0  |  0  |  0  |   1 |
  65 |   1  |   0  |      | SNOW     | kg m-2  | Water equivalent snow depth             |  0  |  1  | 13  |   1 |
     |   1  |   0  |      | SNOWH    | m       | Physical Snow Depth                     |  0  |  1  |     |   1 |
-----+------+------+------+----------+---------+-----------------------------------------+-----------------------+
#
#
#
#  For SNOWH, NCEP starts with the AFWA snow depth analysis and converts it to a water-equivalent.
#  For some reason, NCEP uses a different ratio in the GFS/GDAS than in the NAM and that which is assumed in WRF.
#  Therefore, we need to adjust SNOW and compute SNOWH in ungrib. 
Disclaimer here: I haven't objectively tested this yet; I just fired it up and it's working, but there might be other changes in the GRIB fields. Use with caution and test thoroughly before production use! There might also be some new useful fields worth adding to the Vtable.
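If you want to see which soil and snow records the new 0.25 degree files actually contain before trusting the Vtable, a quick inventory check is one way to do it. This is just a sketch, assuming you have wgrib2 installed and a downloaded file named gfs.t00z.pgrb2.0p25.f000 (substitute whatever file you actually grabbed):

Code:

# wgrib2 prints one inventory line per GRIB record; grep for the soil/snow variables of interest
wgrib2 gfs.t00z.pgrb2.0p25.f000 | grep -E "TSOIL|SOILW|SNOD|WEASD"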

Now use "gfs-para" for your DSETS parameter and it is all!

Big warning: every downloaded file is over 230 MB!

Enjoy, and report back if you find any problems with the dataset! :D

oib
Posts: 117
Joined: Mon Jan 31, 2011 3:29 pm

Re: Using GFS 0.25° grid dataset

Post by oib » Thu Oct 30, 2014 10:32 am

Thanks for the gooood news :)

230 MB per file will take a lot of time to download, so at the moment I can't use the parallel run in my operational WRF runs.

To use it operationally I have to wait for the availability of the geographical tiling service from NCEP.
I hope the service will be available from December, but honestly I haven't heard any rumors about this.

But I will certainly do tests with the new GFS data, so as to be ready when the tiling service becomes available.

meteoadriatic
Posts: 1512
Joined: Wed Aug 19, 2009 10:05 am

Re: Using GFS 0.25° grid dataset

Post by meteoadriatic » Thu Oct 30, 2014 1:54 pm

oib wrote:230 MB per file will take a lot of time to download, so at the moment I can't use the parallel run in my operational WRF runs.
In fact, you can use it only for initialization, and then for the boundary input files use the normal gfs or gfsptile data sets. In the EMS environment, use something like DSETS=gfs-para%gfsptile to do that (see chapter 7 of the users guide).
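A minimal sketch of how that could look (assuming the DSETS parameter lives in your ems_autorun configuration; the data set before the % supplies the initial conditions and the one after it the boundary conditions):

Code:

DSETS = gfs-para%gfsptile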
oib wrote:To use it operationally I have to wait for the availability of the geographical tiling service from NCEP.
I hope the service will be available from December, but honestly I haven't heard any rumors about this.

But I will certainly do tests with the new GFS data, so as to be ready when the tiling service becomes available.
Robert promised 0.25deg ptiles once he finishes setting up the new server (in about a month from now) :)

oib
Posts: 117
Joined: Mon Jan 31, 2011 3:29 pm

Re: Using GFS 0.25° grid dataset

Post by oib » Thu Oct 30, 2014 5:24 pm

meteoadriatic wrote:In fact, you can use it only for initialization, and then for the boundary input files use the normal gfs or gfsptile data sets. In the EMS environment, use something like DSETS=gfs-para%gfsptile to do that (see chapter 7 of the users guide).
So, if I understand correctly, you suggest feeding WPS with:
+ the 0.25 deg analysis file from the GFS parallel run
+ the 0.50 deg forecast files from the regular GFS
:?:

In your tests with this kind of initialization, have you noticed a forecast improvement?

meteoadriatic
Posts: 1512
Joined: Wed Aug 19, 2009 10:05 am

Re: Using GFS 0.25° grid dataset

Post by meteoadriatic » Thu Oct 30, 2014 6:48 pm

oib wrote:+ the 0.25 deg analysis file from the GFS parallel run
+ the 0.50 deg forecast files from the regular GFS
:?:
Yes.
oib wrote:In your tests with this kind of initialization, have you noticed a forecast improvement?
No :mrgreen:

However:
- I have only done a few runs
- I didn't do side-by-side testing, so I can't objectively assess the difference

But:
- I would be very surprised if the results weren't any better
- The snow cover analysis seems much better in the new GFS

surgeon
Posts: 44
Joined: Wed Dec 08, 2010 1:34 pm
Location: Poland
Contact:

Re: Using GFS 0.25° grid dataset

Post by surgeon » Fri Oct 31, 2014 11:28 am

There is no SNODsfc in the 0.25 GFS. :( Or I can't find it.
The total cloud cover variable name has changed.
Last edited by surgeon on Fri Oct 31, 2014 1:29 pm, edited 2 times in total.

meteoadriatic
Posts: 1512
Joined: Wed Aug 19, 2009 10:05 am

Re: Using GFS 0.25° grid dataset

Post by meteoadriatic » Fri Oct 31, 2014 1:12 pm

It should be there, because snow on the ground is initialized absolutely correctly in WRF. I haven't yet tried looking at the 0.25deg data for display purposes.

Gippox
Posts: 59
Joined: Sat Mar 02, 2013 1:16 am

Re: Using GFS 0.25° grid dataset

Post by Gippox » Sat Nov 01, 2014 4:46 pm

Hey Ivan, this is good news! :D Finally GFS aligns with ECMWF...
I initialize the model with cycled 4D-Var, so I only need the boundary data... unfortunately I also have to wait for the NOMADS grib filter service, since my DSL is not fast enough... :roll:

Antonix
Posts: 256
Joined: Fri Oct 16, 2009 8:53 am

Re: Using GFS 0.25° grid dataset

Post by Antonix » Mon Nov 03, 2014 2:45 pm

Works well!
I tried a long run (168 hours) with nesting and it worked very well!!!

surgeon
Posts: 44
Joined: Wed Dec 08, 2010 1:34 pm
Location: Poland
Contact:

Re: Using GFS 0.25° grid dataset

Post by surgeon » Mon Nov 03, 2014 4:16 pm

Gippox wrote:...unfortunately I also have to wait for the NOMADS grib filter service, since my DSL is not fast enough... :roll:
The NOMADS grib filter has been offering GFS 0.25 for a few days now.
