Naive questions - input and output grib frequency

Forum dedicated to older versions of EMS package (WRFEMS v3.2, v3.1 or older). Support is user-to-user based, so please help others if you can.
himoon
Posts: 4
Joined: Wed Mar 02, 2011 9:51 am

Naive questions - input and output grib frequency

Post by himoon » Thu Mar 03, 2011 6:08 pm

I'm sorry, I'm very new to WRF; I've run little more than the standard example explained in the official on-line tutorial. Looking at that example I saw that they only took one day of initial values, I mean, five GRIB files. Is this standard? How many initial GRIB files do you usually need?

Where do you specify the time of the output? Can you get several outputs at different times? How many? Where do you specify it?

If I need to give a prediction at t = 6 or 12 hours from the initial values on 3 nested domains with sizes 500000 km², 100000 km² and 100 km² and resolutions 1º, 10' and 30'' respectively, how many initial GRIB files do you usually need?

Thank you very much!!!

meteoadriatic
Posts: 1567
Joined: Wed Aug 19, 2009 10:05 am

Re: Naive questions

Post by meteoadriatic » Sat Mar 05, 2011 6:20 pm

himoon wrote:I'm sorry, I'm very new to WRF; I've run little more than the standard example explained in the official on-line tutorial. Looking at that example I saw that they only took one day of initial values, I mean, five GRIB files. Is this standard? How many initial GRIB files do you usually need?
You will have to explain this better, because I can't understand what you asked, sorry. Please give us a link to that example.
himoon wrote:Where do you specify the time of the output? Can you get several outputs at different times? How many? Where do you specify it?
Also this one, I'm really not sure if I understood correctly. Output control is done in the wrfems/runs/<domainname>/conf/ems_post directory, which contains various configuration files, and also in the wrfems/runs/<domainname>/conf/ems_run/run_wrfout.conf file, where you can specify how often WRF EMS will output forecast files (the HISTORY_INTERVAL = ... field).
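For example, such an entry might look like this (an illustrative fragment only, not a complete file; the exact field names and defaults differ between EMS versions, so check the comments inside your own run_wrfout.conf):

```
# wrfems/runs/<domainname>/conf/ems_run/run_wrfout.conf (fragment)
# Write a forecast (history) file every 60 minutes of simulation time.
HISTORY_INTERVAL = 60
```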

himoon
Posts: 4
Joined: Wed Mar 02, 2011 9:51 am

Re: Naive questions

Post by himoon » Mon Mar 07, 2011 9:38 pm

Thank you very much, meteoadriatic.

Here is the link:
http://www.mmm.ucar.edu/wrf/OnLineTutor ... index.html

I don't know very well how WRF works. I guess it ingests GRIB files containing the predictions (outputs) of GFS and gives you as a result another prediction with smaller time steps within the original time interval. In that case, I suppose you need to input as many files as necessary to cover the same interval:

If you need to know the temperature at every hour during one day, I guess you need GRIB files corresponding to that day, don't you?

If you want to know the temperature every hour of the day after tomorrow, how much input data do you need?

Finally, can WRF predict anything more than GFS does, or can it only interpolate the predictions from GFS?

I hope this clarifies my question. Thanks.

meteoadriatic
Posts: 1567
Joined: Wed Aug 19, 2009 10:05 am

Re: Naive questions

Post by meteoadriatic » Tue Mar 08, 2011 10:58 am

himoon wrote:Thank you very much, meteoadriatic.

Here is the link:
http://www.mmm.ucar.edu/wrf/OnLineTutor ... index.html

I don't know very well how the wrf works. I guess it ingests GRIB files containing the predictions or outputs of GFS and gives you as result another prediction with smaller time steps inside of the original interval of time. In this case, I suppose you need to input so many files as necessary to cover the same interval:
Yes, something like that!
himoon wrote:If you need to know the temperature at every hour during one day, I guess you need GRIB files corresponding to that day, don't you?

If you want to know the temperature every hour of the day after tomorrow, how much input data do you need?
OK, now we are talking about input and output frequency. A regional model must be initialized with output from another model that has greater area coverage. These output files from the bigger model then become input files for the regional model. They serve a dual purpose: the first GRIB file (analysis time) is interpolated to the model grid resolution and the model is started from it; this is what we call model initialization. Then, during the model run, air masses enter the model domain at its borders. They have different properties (temperature, moisture, wind), and the model must know what enters its domain during the run. This data is fed into the model like all the other GRIB files, and we call it the (lateral) boundary conditions.

Now, the more often we feed the model with boundary conditions, the less error will be produced. It is not the same whether the model knows what's going on around the domain borders every hour or every six hours. In six hours a lot of data will be missed, and thus the model will make bigger forecast errors.

If you use GFS for model initialization and boundary conditions, you can't use a frequency better than 3 hours, because that is the best GFS gives out. If you use ems_autorun, this is the "BCFREQ" field in the ems_autorun.conf file.

WRF output frequency has nothing to do with BC input frequency and is totally up to you. It is set in the run_wrfout.conf file.
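To make the input side concrete: the number of GRIB files needed to cover a run is the forecast length divided by the BC interval, plus one for the analysis time. A quick arithmetic sketch (the function name is mine, this is not EMS code):

```python
def bc_file_count(forecast_hours, bc_interval_hours):
    """Number of input GRIB files needed to cover a forecast period:
    one at the analysis time plus one per boundary-condition interval."""
    if forecast_hours % bc_interval_hours != 0:
        raise ValueError("forecast length must be a multiple of the BC interval")
    return forecast_hours // bc_interval_hours + 1

# A 24-hour run fed with 3-hourly GFS data needs files for
# hours 0, 3, 6, ..., 24 — nine files in total.
print(bc_file_count(24, 3))  # -> 9
```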

I hope this helps at least a little, and sorry about my bad English.

himoon
Posts: 4
Joined: Wed Mar 02, 2011 9:51 am

Re: Naive questions

Post by himoon » Thu Mar 10, 2011 1:46 pm

Thank you very much!

pattim
Posts: 173
Joined: Sun Jun 24, 2012 8:42 pm
Location: Los Angeles, CA, USA

Re: Naive questions

Post by pattim » Sun Jul 01, 2012 4:31 pm

meteoadriatic wrote: Now, the more often we feed the model with boundary conditions, the less error will be produced. It is not the same whether the model knows what's going on around the domain borders every hour or every six hours. In six hours a lot of data will be missed, and thus the model will make bigger forecast errors.
...so, if I understand the implication, "error" here is defined as the difference between forecast results and later satellite (etc.) measurements? I think I saw in the docs that EMS linearly interpolates the coarse-resolution BC time steps to the time steps it needs. Overall it sounds like EMS is doing a kind of interpolation of coarser measurements to local weather, but is there a better word than "interpolate" for chaotic phenomena?

Some of the servers offer "forecast data" (as opposed to simply "data," e.g., GFS) more often, but I don't know the difference between these types of data (on the servers).

Is there any EMS practical wisdom about defining a global coarse domain (and nudging it) and then nesting down to high resolution for local runs? Does that work better or worse than simply getting ICs and BCs from a server? (Not knowing, I would think it might work well, but with numerical problems like this, the devil's in the details.)
meteoadriatic wrote: If you use GFS for model initialization and boundary conditions, you can't use a frequency better than 3 hours, because that is the best GFS gives out. If you use ems_autorun, this is the "BCFREQ" field in the ems_autorun.conf file.

WRF output frequency has nothing to do with BC input frequency and is totally up to you. It is set in the run_wrfout.conf file.

I hope this helps at least a little, and sorry about my bad English.
Thank you very much for the insights!

meteoadriatic
Posts: 1567
Joined: Wed Aug 19, 2009 10:05 am

Re: Naive questions

Post by meteoadriatic » Sun Jul 01, 2012 5:16 pm

pattim wrote:
meteoadriatic wrote: Now, the more often we feed the model with boundary conditions, the less error will be produced. It is not the same whether the model knows what's going on around the domain borders every hour or every six hours. In six hours a lot of data will be missed, and thus the model will make bigger forecast errors.
...so, if I understand the implication, "error" here is defined as the difference between forecast results and later satellite (etc.) measurements?
Yes, forecast reliability skill.
pattim wrote:I think I saw in the docs that EMS linearly interpolates the coarse-resolution BC time steps to the time steps it needs.
No. Interpolation is done to the domain's horizontal grid spacing and vertical levels; it is done in the metgrid phase.
pattim wrote:Some of the servers offer "forecast data" (as opposed to simply "data," e.g., GFS) more often, but I don't know the difference between these types of data (on the servers).
Please be more precise; give an example.
pattim wrote:Is there any EMS practical wisdom about defining a global coarse domain (and nudging it) and then nesting down to high resolution for local runs?
There is a kind of wisdom, indeed. :mrgreen:
pattim wrote:Does that work better or worse than simply getting ICs and BCs from a server? (Not knowing, I would think it might work well, but with numerical problems like this, the devil's in the details.)
What? You mean running, for example, a 1 km resolution domain and feeding it directly with GFS input data? It will work, but you won't be satisfied with the forecast skill of such an experiment. There is a recommendation not to use more than a 5x difference between domain resolution and BC data resolution if you want a reliable forecast. That's one of the reasons for using nests.
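As a rough illustration of that 5x guideline (the ~55 km figure for 0.5-degree GFS grid spacing is my assumption here, and the function name is mine):

```python
def finest_direct_resolution_km(bc_resolution_km, max_ratio=5.0):
    """Finest domain grid spacing you can reasonably drive directly
    from BC data of the given resolution, per the ~5x rule of thumb."""
    return bc_resolution_km / max_ratio

# From ~55 km GFS data, a directly fed domain should stay at ~11 km
# or coarser; jumping straight to 1 km would be a ~55x ratio, hence
# the need for intermediate nests.
print(finest_direct_resolution_km(55.0))  # -> 11.0
```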

pattim
Posts: 173
Joined: Sun Jun 24, 2012 8:42 pm
Location: Los Angeles, CA, USA

Re: Naive questions

Post by pattim » Sun Jul 01, 2012 5:40 pm

meteoadriatic wrote:
pattim wrote:Some of the servers offer "forecast data" (as opposed to simply "data," e.g., GFS) more often, but I don't know the difference between these types of data (on the servers).
Be more precise, give example.
Example: In the EMS documentation, Appendix E.2.1 says GFS operational real-time data are 6-hourly. Then it says "Forecast file time are 3-hourly." This sounds like the 6-hourly data are actual measurements (from, say, satellite products) and the 3-hourly data are model output based on the 6-hourly data (say, from a model like WRF run with the 6-hourly data).
http://strc.comet.ucar.edu/software/new ... endixE.pdf
meteoadriatic wrote:
pattim wrote:Is there any EMS practical wisdom about defining a global coarse domain (and nudging it) and then nesting down to high resolution for local runs?
There is a kind of wisdom, indeed. :mrgreen:
pattim wrote:Does that work better or worse than simply getting ICs and BCs from a server? (Not knowing, I would think it might work well, but with numerical problems like this, the devil's in the details.)
What? You mean running, for example, a 1 km resolution domain and feeding it directly with GFS input data? It will work, but you won't be satisfied with the forecast skill of such an experiment. There is a recommendation not to use more than a 5x difference between domain resolution and BC data resolution if you want a reliable forecast. That's one of the reasons for using nests.
Sorry I wasn't clearer - what I was thinking of was, for example, doing a global run at 1 degree resolution (~110 km) and then 4 nests down to a local resolution. Each nest has only a 3x decrease at its boundary, as suggested in the docs.

So: 110km -> 3x nest -> 37km -> 3x nest -> 12km -> 3x nest -> 4km -> 3x nest -> 1km.
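That chain is just repeated division by 3, rounded (a quick sanity check, not EMS code):

```python
# Grid spacings for a ~110 km parent domain with four 3:1 nests
spacings = [round(110 / 3 ** i) for i in range(5)]
print(spacings)  # -> [110, 37, 12, 4, 1]
```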

The primary domain therefore has no BCs, just ICs and nudging from a coarse global data set. I am not sure which server(s) have coarse data.

And just to be clear, the first nest might be 25 x 25 degrees, the next nest might be 5 x 5 degrees, etc...

I don't believe there is any point in doing this if one is doing only local real-time forecasting, but it might be useful for research purposes. It also depends on what the terms "operational data" vs. "forecast data" mean. (Example: 6-hour vs. 3-hour data in docs section E.2.1 http://strc.comet.ucar.edu/software/new ... endixE.pdf)

pattim
Posts: 173
Joined: Sun Jun 24, 2012 8:42 pm
Location: Los Angeles, CA, USA

Re: Naive questions

Post by pattim » Sun Jul 01, 2012 5:43 pm

meteoadriatic wrote:
pattim wrote:I think I saw in the docs that EMS linearly interpolates the coarse-resolution BC time steps to the time steps it needs.
No. Interpolation is done to the domain's horizontal grid spacing and vertical levels; it is done in the metgrid phase.
Where is the time interpolation of the BCs done?

meteoadriatic
Posts: 1567
Joined: Wed Aug 19, 2009 10:05 am

Re: Naive questions

Post by meteoadriatic » Sun Jul 01, 2012 5:51 pm

pattim wrote:Example: In the EMS documentation, Appendix E.2.1 says GFS operational real-time data are 6-hourly. Then it says "Forecast file time are 3-hourly." This sounds like the 6-hourly data are actual measurements (from, say, satellite products) and the 3-hourly data are model output based on the 6-hourly data (say, from a model like WRF run with the 6-hourly data).
http://strc.comet.ucar.edu/software/new ... endixE.pdf
That in Appendix E.2.1 means: the GFS model runs 4 times a day, at 00, 06, 12 and 18 UTC, and it gives its forecasts in 3-hour steps.


pattim wrote:Sorry I wasn't clearer - what I was thinking of was, for example, doing a global run at 1 degree resolution (~110 km) and then 4 nests down to a local resolution. Each nest has only a 3x decrease at its boundary, as suggested in the docs.

So: 110km -> 3x nest -> 37km -> 3x nest -> 12km -> 3x nest -> 4km -> 3x nest -> 1km.

The primary domain therefore has no BCs, just ICs and nudging from a coarse global data set. I am not sure which server(s) have coarse data.

And just to be clear, the first nest might be 25 x 25 degrees, the next nest might be 5 x 5 degrees, etc...

I don't believe there is any point in doing this if one is doing only local real-time forecasting, but it might be useful for research purposes. It also depends on what the terms "operational data" vs. "forecast data" mean. (Example: 6-hour vs. 3-hour data in docs section E.2.1 http://strc.comet.ucar.edu/software/new ... endixE.pdf)
Yes, you can do a global WRF run, but you will need plenty of computing resources. I don't see any practical use for that when GFS forecasts are freely available for feeding regional domains with BCs.

P.S. When the NMM-B model becomes ready for Linux/public use, global runs might become a bit more popular, but a lot of computing resources will still be needed.

pattim wrote:
meteoadriatic wrote:
pattim wrote:I think I saw in the docs that EMS linearly interpolates the coarse-resolution BC time steps to the time steps it needs.
No. Interpolation is done to the domain's horizontal grid spacing and vertical levels; it is done in the metgrid phase.
Where is the time interpolation of the BCs done?
Nowhere!

Post Reply