MPI problem when using the domain wizard (dwiz)

Forum dedicated to older versions of EMS package (WRFEMS v3.2, v3.1 or older). Support is user-to-user based, so please help others if you can.
meso
Posts: 10
Joined: Fri Feb 22, 2013 10:30 pm

MPI problem when using the domain wizard (dwiz)

Post by meso » Wed Mar 06, 2013 7:37 pm

Hello,

I am using dwiz and got the following error when trying to "Localize my Domain" at Tab 3 - Run Preprocessors:

"Parallel compiled code should be submitted through job queuing software by setting a value for the job command'. Click the 'Job Command' button and type in a command. Fore example:
mpirun
Additional options are to either run the WPS executables outside of Domain Wizard, or recompile WPS in serial mode and then use Domain Wizard. Note: the compilation option for compiling WPS is independent of the compilation option used for WRF. So, you could compile WRF parallel and WPS serial. "

Note: I got this domain from another computer/run (I transferred the whole domain and its configuration to this computer). The previous computer had a similar setup (same OS, 64-bit system). I ran "yum install glibc.i686" to provide the 32-bit libraries needed by the Java interpreter.
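For context, the 'job command' the wizard asks for is just the MPI launcher line it prepends when running a parallel-compiled WPS executable. A minimal sketch of what that amounts to, assuming mpirun and a process count matching the 4 cores in the sysinfo output below (the executable name and count are illustrative, not dwiz internals):

```shell
#!/bin/sh
# Hypothetical sketch of the launch line built once a 'Job Command' of
# "mpirun -np 4" is set; geogrid.exe stands in for the parallel-compiled
# WPS localization program.
NPROCS=4                        # matches the 4 cores reported by sysinfo
JOB_CMD="mpirun -np ${NPROCS}"  # the value typed into 'Job Command'
WPS_EXE="geogrid.exe"           # parallel-compiled WPS executable
echo "${JOB_CMD} ${WPS_EXE}"    # → mpirun -np 4 geogrid.exe
```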

Here is my system info if that helps:

Code:

    WRF EMS Program sysinfo (V3.2.1.5.45.beta) 


    *  Gathering information for localhost *

          System Information for *
          
              System Date           : Wed Mar  6 16:11:59 2013 UTC
              System Hostname       : *
              System Address        : *
          
              System OS             : Linux
              OS Kernel             : 2.6.18-274.18.1.el5
              Kernel Type           : x86_64
              Linux Distribution    : CentOS release 5.8 (Final)
          
          Network Interface Information for *
          
              Network Interface     : eth0
              Interface Address     : *
              Address Resolves to   : *
              Interface State       : UP
          
              Network Interface     : eth1
              Interface State       : Inactive
          
              Network Interface     : eth2
              Interface State       : Inactive
          
              Network Interface     : eth3
              Interface State       : Inactive
          
              Network Interface     : lo
              Interface Address     : *
              Address Resolves to   : *
              Interface State       : UP
          
              Network Interface     : sit0
              Interface State       : Inactive
          
          Processor and Memory Information for *
          
              CPU Name              : Intel(R) Xeon(R) CPU E5504 @ 2.00GHz
              CPU Instructions      : nehalem
              CPU Type              : 64-bit
              CPU Speed             : 1995.04 MHz
          
              EMS Determined Processor Count
                  Physical CPUs     : 1
                  Cores per CPU     : 4
                  Total Processors  : 4
          
              EMS.cshrc Specified Processor Count
                  Physical CPUs     : 1 
                  Cores per CPU     : 4
                  Total Processors  : 4
          
              Hyper-Threading       : Off
                
              System Memory         : 7.7 Gbytes
          
          WRF EMS User Information for wrfg on *
          
              User  ID              : 16416
              Group ID              : 16416
              Home Directory        : /home/wrfg
              Home Directory Mount  : Local
              User Shell            : /bin/tcsh
              Shell Installed       : Yes
              Shell Login Files     : .cshrc
              EMS.cshrc Sourced     : .cshrc
              EMS.cshrc Port Range  : None Defined
          
          WRF EMS Installation Information for *
          
              EMS Release           : 3.2.1.5.45.beta
              EMS Home Directory    : /usr1/wrfems
              EMS Home Mount        : Local
              EMS User ID           : 16416
              EMS Group ID          : 16416
              EMS Binaries          : x64
          
              EMS Run Directory     : /usr1/wrfems/runs
              EMS Run Dir Mount     : Local
              EMS Run Dir User ID   : 16416
              EMS Run Dir Group ID  : 16416
          
              Run Dir Avail Space   : 9.24 Gb
              Run Dir Space Used    : 65%
          
              EMS Util Directory    : /usr1/wrfems/util
Update (13:45 CST 3/6)
I tried using ems_domain.pl and received the following message when trying to localize this existing domain:

Code:

      Attempting to localize domain WRF4G at your request

     !  The nltohash routine says "If in doubt, blame it on the tool!"        
        
        BUMMER: Namelist file problem (../runs/wrf4g/static/namelist.wps) - No such file or directory
Perhaps namelist.wps wasn't copied over? Hmm... Looking into that, I do see it there in ../wrf4g/static/.
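A "No such file or directory" error for a file that is visibly present often points at relative-path resolution: ../runs/wrf4g/... resolves against whatever directory the script is run from. A small sketch of that effect, using a throwaway directory tree as an illustrative stand-in for the real EMS installation:

```shell
#!/bin/sh
# A relative path resolves against the current working directory, so the
# same namelist.wps can look "missing" from one directory and present
# from another. The layout below is a stand-in, not the real EMS tree.
top=$(mktemp -d)
mkdir -p "$top/wrfems/runs/wrf4g/static" "$top/wrfems/util"
touch "$top/wrfems/runs/wrf4g/static/namelist.wps"

cd "$top/wrfems/util"                 # e.g. script launched from util/
r_util=$([ -e ../runs/wrf4g/static/namelist.wps ] && echo found || echo missing)

cd "$top/wrfems/runs/wrf4g/static"    # launched from inside the domain
r_static=$([ -e ../runs/wrf4g/static/namelist.wps ] && echo found || echo missing)

echo "from util: $r_util"             # → from util: found
echo "from static: $r_static"         # → from static: missing

cd /tmp && rm -rf "$top"
```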

meso
Posts: 10
Joined: Fri Feb 22, 2013 10:30 pm

Re: MPI problem when using the domain wizard (dwiz)

Post by meso » Thu Mar 21, 2013 3:03 pm

meteoadriatic wrote: if you want to transfer a whole domain and its configuration to another computer or new installation, just copy the whole contents of wrfems/runs/domainname to the same location in the new installation. Simple as that :)
Could the reason be that I transferred the domain and configuration (see below) to this computer from another? The domain and configuration came from a computer cluster, whereas this is now just a single computer.

meteoadriatic
Posts: 1566
Joined: Wed Aug 19, 2009 10:05 am

Re: MPI problem when using the domain wizard (dwiz)

Post by meteoadriatic » Thu Mar 21, 2013 5:15 pm

Yes, possibly, but I'm not sure. I personally don't use dwiz at all. Try the ems_domain.pl script instead; the official manual describes using ems_domain.pl in detail.

meso
Posts: 10
Joined: Fri Feb 22, 2013 10:30 pm

Re: MPI problem when using the domain wizard (dwiz)

Post by meso » Mon Apr 01, 2013 5:57 pm

Interesting - it did work with the script.

After taking a domain from another computer, all I have to do to use it is localize it, correct, since it is an existing domain?

meteoadriatic
Posts: 1566
Joined: Wed Aug 19, 2009 10:05 am

Re: MPI problem when using the domain wizard (dwiz)

Post by meteoadriatic » Mon Apr 01, 2013 6:24 pm

You don't have to do anything if you copy the whole contents of the domain's static directory; everything you need is there. If you copy only your namelist.wps file, then you need to localize the domain. Localizing a domain creates the .nc files that are your domain's static files; they contain geographic information such as the land mask, terrain height, land-use type, etc.
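The whole-directory option can be sketched with two local temporary directories standing in for the old and new machines (the paths and the geo_em.d01.nc file name are illustrative; a real transfer would use scp -r or similar):

```shell
#!/bin/sh
# Sketch of the whole-directory transfer: static files come along with
# the copy, so no re-localization is needed. src/dst are throwaway
# stand-ins for the old and new installations.
src=$(mktemp -d); dst=$(mktemp -d)
mkdir -p "$src/runs/mydomain/static" "$dst/runs"
touch "$src/runs/mydomain/static/namelist.wps"
touch "$src/runs/mydomain/static/geo_em.d01.nc"   # a localized static file

cp -r "$src/runs/mydomain" "$dst/runs/"           # the whole-directory copy

files=$(ls "$dst/runs/mydomain/static")
echo "$files"                                     # → geo_em.d01.nc
                                                  #   namelist.wps
rm -rf "$src" "$dst"
```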

meso
Posts: 10
Joined: Fri Feb 22, 2013 10:30 pm

Re: MPI problem when using the domain wizard (dwiz)

Post by meso » Tue Apr 02, 2013 9:31 pm

meteoadriatic wrote: You don't have to do anything if you copy the whole contents of the domain's static directory; everything you need is there. If you copy only your namelist.wps file, then you need to localize the domain. Localizing a domain creates the .nc files that are your domain's static files; they contain geographic information such as the land mask, terrain height, land-use type, etc.
Oh... Well I localized it anyway -- :/ assuming that won't lead to problems down the road.

Thanks for the help meteoadriatic!

meteoadriatic
Posts: 1566
Joined: Wed Aug 19, 2009 10:05 am

Re: MPI problem when using the domain wizard (dwiz)

Post by meteoadriatic » Tue Apr 02, 2013 9:34 pm

meso wrote:Oh... Well I localized it anyway -- :/ assuming that won't lead to problems down the road.
Of course not :)

