Into the future?

Forum dedicated to older versions of EMS package (WRFEMS v3.2, v3.1 or older). Support is user-to-user based, so please help others if you can.
pattim
Posts: 171
Joined: Sun Jun 24, 2012 8:42 pm
Location: Los Angeles, CA, USA

Into the future?

Post by pattim » Sat Feb 09, 2013 2:52 am

I don't know how to run a 'local' EMS run into the future - that is, running past the available GRIB files for BC data.

It does seem like, ultimately, that is the point, after all. Is there some setting I don't know about?

Thank You,
Patricia

PS: The only way I've found so far is to nest a global run, but EMS has trouble with more than one nest in a global run, so I am resolution limited (I can't make enough nests to get fine resolution):

Code:

patti@OS121:~/00_GCMs/EMS_WRF/wrfems/runs/Global005> csh
patti@OS121-> ls -l grib
total 50920
-rw-r--r-- 1 patti users 52140194 Feb  8 19:04 gfsanl_4_20120707_0000_000.grb2
patti@OS121-> source ../../EMS.cshrc
patti@OS121-> ems_prep --dset fnl --local --date 20120707 --cycle 00 --length 72 --analysis && ems_run

  WRF EMS Program ems_prep (V3.2.1.5.45.beta) started on OS121 at Sat Feb  9 03:07:05 2013 UTC

                       The WRF EMS Says: "Who's Awesome? You're Awesome!"

     I.  WRF EMS ems_prep Model Initialization Summary

           Initialization Start Time    : Sat Jul  7 00:00:00 2012 UTC
           Initialization End   Time    : Tue Jul 10 00:00:00 2012 UTC
           Initialization Data Set      : fnl
           Static Surface Data Sets     : None
           Land Surface Data Sets       : None

         This is a GLOBAL simulation - Going global!

    II.  Search out requested files for WRF model initialization

         *  Locating fnl files for model initial conditions
         *  All requested fnl files are available for model initialization

         Excellent! - Your master plan is working!
   III.  Create the WPS ARW intermediate format files
         *  Processing fnl files for use as model initial conditions                   - Excellent!!
         ARW core intermediate file processing completed in 12.5 seconds

    IV.  Horizontal interpolation of the intermediate files to the computational domain
         *  Metgrid processed files are located in       
            /home/patti/00_GCMs/EMS_WRF/wrfems/runs/Global005/wpsprd
         Horizontal interpolation to computational domain completed in 7.99 seconds
    WRF EMS Program ems_prep completed at Sat Feb  9 03:07:26 2013 UTC

Use of qw(...) as parentheses is deprecated at /home/patti/00_GCMs/EMS_WRF/wrfems/strc/ems_run/Run_cfgarw.pm line 431.
Use of qw(...) as parentheses is deprecated at /home/patti/00_GCMs/EMS_WRF/wrfems/strc/ems_run/Run_cfgnmm.pm line 350.

  WRF EMS Program ems_run (V3.2.1.5.45.beta) started on OS121 at Sat Feb  9 03:07:27 2013 UTC

                      The WRF EMS Says: "Who's Awesome? You're Awesome!"

     I.  Preforming configuration in preparation for your EMS experience
         *  You are running the WRF ARW core global domain. Hey Ho! Let's go - GLOBAL model'n!
         *  Simulation start and end times:
              Domain         Start                   End
                1     2012-07-07_00:00:00     2012-07-10_00:00:00      
         *  Simulation length will be 72 hours
         *  Large timestep to be used for this simulation is 196.3 seconds

    II.  Creating the initial and boundary condition files for the user domain(s)

         *  The WRF REAL program shall be run on the following systems and processors:
            6  processors on OS121     (8 tiles per processor)
         *  Creating the WRF global initial condition file
         *  WRF initial and boundary conditions successfully created in 32 seconds
         Moving on to bigger and better delusions of grandeur

   III.  Running ARW WRF while thinking happy thoughts

         *  The WRF ARW core shall be run on the following systems and processors:
            6  processors on OS121     (8 tiles per processor)
         *  Run Output Frequency   Primary wrfout   Aux File 1
            ---------------------------------------------------
              Domain 01          :   1 hour            Off      

         *  Runnning your simulation with enthusiasm!
              You can sing along to the progress of the simulation while watching:
                %  tail -f /home/patti/00_GCMs/EMS_WRF/wrfems/runs/Global005/rsl.out.0000
              Unless you have something better to do with your time

meteoadriatic
Posts: 1566
Joined: Wed Aug 19, 2009 10:05 am

Re: Into the future?

Post by meteoadriatic » Sat Feb 09, 2013 9:30 am

I don't think this is possible. However, you can run the global model for as long as you want, save the output GRIB files, and then do a normal regional run with those files as BCs.
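In outline, that would look something like the sketch below. The run directory names, the "mydset" dataset name, and the GRIB paths are placeholders; the exact ems_post options and the *_gribinfo.conf setup that ems_prep needs to ingest your own GRIB files depend on the EMS version, so check the ems_post and ems_prep documentation before relying on any of it.

Code:

# bash sketch - placeholder paths and dataset name, adjust for your setup

# 1) Long global run, carrying the forecast past the available BC data
cd ~/wrfems/runs/Global005
ems_prep --dset fnl --local --date 20120707 --cycle 00 --length 72 --analysis && ems_run

# 2) Post-process the global wrfout NetCDF output to GRIB with ems_post
#    (GRIB output is set up through the ems_post configuration; option names
#    differ between EMS versions, so check its documentation)
ems_post

# 3) Regional run: link the GRIB files written by ems_post into the regional
#    run's grib/ directory and point ems_prep at them with --local.
#    "mydset" is hypothetical - it needs a matching *_gribinfo.conf entry so
#    ems_prep knows the file naming and fields; add --date/--cycle/--length
#    as in the global run.
cd ~/wrfems/runs/MyRegional
ln -s /path/to/globalrun/post/grib/*.grb grib/
ems_prep --dset mydset --local && ems_run

The regional domain then takes the global WRF forecast as its "future" lateral boundary conditions, which is exactly the data that does not yet exist in any downloadable GRIB archive.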

pattim
Posts: 171
Joined: Sun Jun 24, 2012 8:42 pm
Location: Los Angeles, CA, USA

Re: Into the future?

Post by pattim » Sat Feb 09, 2013 7:37 pm

meteoadriatic wrote:I don't think this is possible. However, you can run the global model for as long as you want, save the output GRIB files, and then do a normal regional run with those files as BCs.
Hi, meteo! Thanks for the reply! Yes, I thought of that after I went to bed - but how do you save as GRIB files instead of the stock .nc files? If Robert fixes nesting within global simulations, that would fix the problem, since you wouldn't have to make separate global and local runs - ESPECIALLY if he also fixes nudging, because then you could use global-domain nudging to study errors. I'm hoping he will fix these in the next release. (Global runs would also be helped by fixing the ocean Tsurf exchange.)

I was reading Warner's Numerical Weather and Climate Prediction, and he points out that if you put the BCs a long way from the region of interest, their effect isn't noticed there - so I guess you could just do this (assuming you have one BC file - known_starting_BC_Gribfile.grb2 - representing "now"):

Code:

> # reuse the one analysis you have as every later boundary time
> ln -s known_starting_BC_Gribfile.grb2 phony_later_BC_Gribfiles_time1.grb2
> ln -s known_starting_BC_Gribfile.grb2 phony_later_BC_Gribfiles_time2.grb2
> ln -s known_starting_BC_Gribfile.grb2 phony_later_BC_Gribfiles_time3.grb2
> ln -s known_starting_BC_Gribfile.grb2 phony_later_BC_Gribfiles_time4.grb2
...and the later BC files would just be links to the one file (the file for *now*) that you do have. This would hold the BCs constant. That is unphysical, but if the boundaries are a LONG distance away from the region of interest, the error would not reach the region of interest during the time of interest.

So if the winds were ~10 km/h, then for a 24-hour forecast the boundary influence could travel roughly 240 km, so you would want the boundaries, say, 300 km (preferably more) distant from the nested "region of interest."
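As a rough back-of-the-envelope check (just the arithmetic behind the numbers above; the 10 km/h advection speed and 24-hour length are assumed values, nothing specific to EMS):

Code:

# bash: how far can boundary "information" travel toward the region of interest?
wind_speed_kmh=10          # assumed mean advection speed
forecast_hours=24          # forecast length
echo "$(( wind_speed_kmh * forecast_hours )) km"   # 240 km, so ~300 km or more of padding is sensible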

Warner was talking more about dynamical grid errors, but the same idea applies, I think.

meteoadriatic
Posts: 1566
Joined: Wed Aug 19, 2009 10:05 am

Re: Into the future?

Post by meteoadriatic » Sat Feb 09, 2013 7:44 pm

pattim wrote:but how do you save as GRIB files instead of the stock .nc files?
Doesn't wrfpost work with a global run? If not, then I don't have an idea right now.

pattim
Posts: 171
Joined: Sun Jun 24, 2012 8:42 pm
Location: Los Angeles, CA, USA

Re: Into the future?

Post by pattim » Sat Feb 09, 2013 8:36 pm

meteoadriatic wrote:
pattim wrote:but how do you save as GRIB files instead of the stock .nc files?
Doesn't wrfpost work with a global run? If not, then I don't have an idea right now.
OH, so you use wrfpost, then? I was looking for a config file setting to tell EMS to output GRIB files instead of NetCDF... :roll:

meteoadriatic
Posts: 1566
Joined: Wed Aug 19, 2009 10:05 am

Re: Into the future?

Post by meteoadriatic » Sun Feb 10, 2013 9:28 pm

Well, the standard procedure with ems_post... Try it; if it creates GRIB files, then great - you have your input conditions for a regional run!

smartie
Posts: 97
Joined: Sat May 21, 2011 7:34 am

Re: Into the future?

Post by smartie » Mon Feb 11, 2013 12:27 pm

I can confirm that the post routine will produce GRIB files from a global run, which can be used to initialise a local/regional WRF.

We have done some experimentation with Global WRF to see if it can provide better initial/boundary conditions for some difficult-to-model cases and have not found that it does. For the reanalysis cases we tried, we found no improvement over using IBCs/LBCs from the reanalysis data itself. That does not mean there cannot be exceptions where it might improve simulations. Of course, the problem with a global run is that (like all global models) it is highly sensitive to initial conditions.

On the question of boundaries, we've also done some experiments with reanalysis cases and found that, in general, you can initialise, say, a 15 km grid directly from the dataset, and provided the domain is large enough (e.g. it might cover the North Atlantic from the eastern seaboard into western Europe), the results are similar to nesting a smaller domain within a larger one, e.g. 45-15 km. You have to come to some judgement according to your computational resources and requirements. However, I wouldn't forget that some cases (e.g. explosive cyclone development) may be highly sensitive to upstream boundaries.

David

meteoadriatic
Posts: 1566
Joined: Wed Aug 19, 2009 10:05 am

Re: Into the future?

Post by meteoadriatic » Mon Feb 11, 2013 1:26 pm

smartie wrote:On the question of boundaries, we've also done some experiments with reanalysis cases and found that, in general, you can initialise, say, a 15 km grid directly from the dataset, and provided the domain is large enough (e.g. it might cover the North Atlantic from the eastern seaboard into western Europe), the results are similar to nesting a smaller domain within a larger one, e.g. 45-15 km.
True! In fact, if you run, for example, three telescopic domains (e.g. 45-15-5 km), the finest one is still initialized at the resolution of the initial/boundary conditions, just like all the coarser domains are. The coarse domains are there only to ensure better behaviour of systems and air masses entering the inner domains from outside. However, if one has enough resources to make a single but very big hi-res domain, so that the domain boundaries lie very far from the area of interest, the low resolution of the boundary conditions will not be much of an issue, if at all. As you said, it is mostly a question of available computer resources.
