extract data from wrf-ems for weather forecast
Re: extract data from wrf-ems for weather forecast
I solved the problem.
Just add "-match_inv" to the command, and in the egrep pattern I changed "PRMSL:mean sea level" to "PRMSL:n=1".
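For reference, here is a sketch of how that change would look in one of the extraction lines of ZHB's script (the domain, forecast hour and paths below are just the example values used in the script; the rest of the line is unchanged):
Code: Select all
#!/usr/bin/env perl
# Sketch of the change described above, applied to one extraction line of
# ZHB's script: wgrib2 is called with -match_inv and the egrep pattern
# "PRMSL:mean sea level" is replaced by "PRMSL:n=1".
# The domain, forecast hour and output path are example values only.
my $domain    = '01';                    # domain number, as in @domains
my $i         = 0;                       # forecast hour
my $final_dir = '/var/www/share/grib';   # output directory, as in the script
`wgrib2 *d${domain}.grb2f0${i}0000 -match_inv | egrep "PRMSL:n=1" | wgrib2 -i *d${domain}.grb2f0${i}0000 -spread $final_dir/grib_data/${domain}/${domain}_PRMSL_${i}.txt`;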
www.meteo-sudouest.fr
Re: extract data from wrf-ems for weather forecast
Is it possible to extract data every 3 hours? I have a grib file every 3 hours... thanks
ZHB wrote:Hello,
If it can help you, I've made a Perl script to extract the desired data from grib2 files and then upload them to a MySQL database. To execute this script automatically, you have to edit conf/ems_post/post_grads.conf and add this script's path at the end of the file.
This script first extracts data from the grib2 files with wgrib2 and writes them into a txt file for each timestep. Then the parameters are merged into one file per timestep, so we can use LOAD DATA LOCAL INFILE to update our MySQL database.
I know my script isn't very well coded, but if you have questions I'm here.
P.S.
Code: Select all
#!/usr/bin/env perl
###########################################################################
# Description:  extract the meteorological parameters, insert them into a
#               database and create various maps.
# Instructions: edit the @tab_data array, adding the parameter names
#               IN THE ORDER in which they are stored in the database.
#             : set the number of forecast timesteps in $run_length
###########################################################################
########################## [edit below] ###################################
$wrfems_dir       = "/home/wrfnmm/wrfems";         # path to EMS folder
$run_name         = "nmm_francesuisse";            # run dir name
$ctl_file_name    = "ctlfile.ctl";                 # ctl file name
$run_length       = "130";                         # forecast length
$final_dir        = "/var/www/share/grib";         # path where data are created (txt files)
$last_update_file = "/var/www/share/update.txt";   # path to the last update file
$database         = "database_name";               # remote database name
$hostname         = "11.111.11.111";               # host
$login            = "user";                        # user
$mdp              = "password";                    # password
$tmpTable         = "t_fcst_data_tmp";  # temporary table to load data before updating the forecast table
$tableEchCol      = "echeance";         # first column name of the forecast table (timestep)
$tableLatCol      = "fcst_latitude";    # second column name of the forecast table
$tableLngCol      = "fcst_longitude";   # third column name of the forecast table
$tableFcstName    = "t_fcst_data";      # forecast table name, without the domain number part. Ex: t_fcst_data for t_fcst_data_01

# do the action or not
$extractData      = "1";
$insertDataIntoDB = "1";
$updateTimeFile   = "1";
$generate_maps    = "1";

# ftp upload for last update file
my $ftp_host       = '11.111.11.111';
my $ftp_user       = 'user';
my $ftp_pwd        = 'password';
my $ftp_upload_dir = '/remote/upload/path';

my @domains = ('01', '02');   # domain numbers. For domains < 10, add a leading 0

# database column names in the correct order, without the first three columns (echeance, fcst_latitude, fcst_longitude)
my @tab_data = (
    'PRMSL',    'TMP500',   'RH500',    'TMP700',   'RH700',    'TMP850',
    'RH850',    'TMP2m',    'DPT2m',    'RH2m',     'TMAX2m',   'TMIN2m',
    'UGRD10m',  'VGRD10m',  'PRATEsfc', 'CPRATsfc', 'APCPsfc',  'ACPCPsfc',
    'NCPCPsfc', 'CSNOWsfc', 'CICEPsfc', 'CFRZRsfc', 'CAPEsfc',  'CINsfc',
    'LCDC',     'MCDC',     'HCDC',     'CDCON',    'USTM6000', 'VSTM6000',
    'HGT0C',    'GUSTsfc',  'MX10U10m', 'MX10V10m', 'RH950',    'RH925',
    'RH900',    'RH800',    'TMP950',   'TMP925',   'TMP900',   'TMP800',
    'LFTXsfc',  'TMP600',   'TSOIL0001','SOILW0001','TSOIL0104','SOILW0104',
    'HGTsfc');

# array with the expressions for wgrib2 extraction.
# Expressions can be found like this: wgrib2 grib2file.grb2 -match_inv > output.txt
my @tab_extract = (
    '(:PRMSL:mean sea level:)',
    '(:TMP:500 mb:)',
    '(:RH:500 mb:)',
    '(:TMP:700 mb:)',
    '(:RH:700 mb:)',
    '(:TMP:850 mb:)',
    '(:RH:850 mb:)',
    '(:TMP:2 m above ground:)',
    '(:DPT:2 m above ground:)',
    '(:RH:2 m above ground:)',
    '(:TMAX:2 m above ground:)',
    '(:TMIN:2 m above ground:)',
    '(:UGRD:10 m above ground:)',
    '(:VGRD:10 m above ground:)',
    '(:PRATE:surface:)',
    '(:CPRAT:surface:)',
    '(:APCP:surface:)',
    '(:ACPCP:surface:)',
    '(:NCPCP:surface:)',
    '(:CSNOW:surface:)',
    '(:CICEP:surface:)',
    '(:CFRZR:surface:)',
    '(:CAPE:surface:)',
    '(:CIN:surface:)',
    '(:LCDC:low cloud layer:)',
    '(:MCDC:middle cloud layer:)',
    '(:HCDC:high cloud layer:)',
    '(:CDCON:)',
    '(:USTM:6000-0 m above ground:)',
    '(:VSTM:6000-0 m above ground:)',
    '(:HGT:0C isotherm:)',
    '(:GUST:surface:)',
    '(:MX10U:10 m above ground:)',
    '(:MX10V:10 m above ground:)',
    '(:RH:950 mb:)',
    '(:RH:925 mb:)',
    '(:RH:900 mb:)',
    '(:RH:800 mb:)',
    '(:TMP:950 mb:)',
    '(:TMP:925 mb:)',
    '(:TMP:900 mb:)',
    '(:TMP:800 mb:)',
    '(:LFTX:500-1000 mb:)',
    '(:TMP:600 mb:)',
    '(:TSOIL:0-0.1 m below ground:)',
    '(:SOILW:0-0.1 m below ground:)',
    '(:TSOIL:0.4-1 m below ground:)',
    '(:SOILW:0.4-1 m below ground:)',
    '(:HGT:surface:)');
########################## [edit above] ###################################
#////////////////////////////////////////////////////////////////////////#
##################### [do not edit below] #################################

# check that the grib files of the last domain are available;
# if the grib files of the last domain don't exist, none of the actions are performed
chdir("${wrfems_dir}/runs/${run_name}/emsprd/grib") or die "The working directory ${wrfems_dir}/runs/${run_name}/emsprd/grib you specified doesn't exist - check it and try again\n";

# check if grib2 files for the last domain (in the @domains array) exist. If they don't exist, don't execute this script
$lastDomain = $domains[-1];
if (glob("*d${lastDomain}.grb2*")) {

    if($extractData == 1) {
        #################### [data extraction] ####################
        chdir("${wrfems_dir}/runs/${run_name}/emsprd/grib") or die "The working directory ${wrfems_dir}/runs/${run_name}/emsprd/grib you specified doesn't exist - check it and try again\n";
        print "Extracting data from grib files\n";
        for($i = 0; $i < $run_length; $i++) {
            for($j = 0; $j < @tab_extract; $j++) {
                foreach $domain (@domains) {
                    if($i < 10) {
                        `wgrib2 *d${domain}.grb2f0${i}0000 | egrep \"${tab_extract[$j]}\" | wgrib2 -i *d${domain}.grb2f0${i}0000 -spread $final_dir/grib_data/${domain}/${domain}_${tab_data[$j]}_${i}.txt`;
                    } else {
                        `wgrib2 *d${domain}.grb2f${i}0000 | egrep \"${tab_extract[$j]}\" | wgrib2 -i *d${domain}.grb2f${i}0000 -spread $final_dir/grib_data/${domain}/${domain}_${tab_data[$j]}_${i}.txt`;
                    }
                }
            }
            print ".";
        }

        ################### [data concatenation] ####################
        print "\n\nMaking one data file from all parameter files\n";
        chdir($final_dir) or die "The working directory ${final_dir} you specified doesn't exist - check it and try again\n";
        $size = @tab_data;
        for($i = 0; $i < $run_length; $i++) {
            foreach $domain (@domains) {
                `cp ./grib_data/${domain}/${domain}_${tab_data[0]}_${i}.txt ${domain}_data_${i}.txt`;
                for($j = 1; $j < $size; $j++) {
                    `cut -d \",\" -f3 ./grib_data/${domain}/${domain}_${tab_data[$j]}_${i}.txt > ./tmp/${domain}_${j}.txt`;
                }
                for($j = 1; $j < $size; $j++) {
                    # create the temporary files if they don't exist
                    if(! -e "./tmp/${domain}_T${j}.txt") {
                        open (FILE, ">>./tmp/${domain}_T${j}.txt") or die ("An error occured when creating file\n");
                    }
                    $pred = $j - 1;
                    if($j == 1) {
                        `paste -d \",\" ${domain}_data_${i}.txt ./tmp/${domain}_${j}.txt > ./tmp/${domain}_T${j}.txt`;
                    } elsif($j < $size - 1) {
                        `paste -d \",\" ./tmp/${domain}_T${pred}.txt ./tmp/${domain}_${j}.txt > ./tmp/${domain}_T${j}.txt`;
                    } else {
                        `paste -d \",\" ./tmp/${domain}_T${pred}.txt ./tmp/${domain}_${j}.txt > ${domain}_data_${i}.txt`;
                    }
                }
                print ".";
            }
        }
    }

    ##################### [DB data insertion with Perl DBI] ####################
    if($insertDataIntoDB == 1) {
        print "\n\nInserting data into DB\n";
        # build the strings needed for the SQL queries
        my $req1 = "";
        my $req2 = "";
        foreach (@tab_data) {
            $req1 .= $_ . ", ";
            $req2 .= "f." . $_ . "=t." . $_ . ", ";
        }
        $str1 = substr $req1, 0, -2;
        $str2 = substr $req2, 0, -2;

        # insert the data into the database
        use DBI;
        use CGI;
        $co = new CGI;
        print $co->header;
        $dsn = "DBI:mysql:database=$database;host=$hostname";
        $dbh = DBI->connect($dsn, $login, $mdp) or die "Connection failed";
        for($i = 0; $i < $run_length; $i++) {
            foreach $domain (@domains) {
                # truncate the temporary table
                $requete = "TRUNCATE TABLE ${tmpTable}_${domain}";
                $sth = $dbh->prepare($requete);
                $sth->execute();
                $sth->finish;
                # load the data into the temporary table
                $requete = "LOAD DATA LOCAL INFILE '/var/www/share/grib/${domain}_data_${i}.txt' INTO TABLE ${tmpTable}_${domain} FIELDS TERMINATED BY ',' ENCLOSED BY '' LINES STARTING BY '' TERMINATED BY '\n' IGNORE 1 LINES (${tableLngCol}, ${tableLatCol}, ${str1}) SET ${tableEchCol} = ${i}";
                $sth = $dbh->prepare($requete);
                $sth->execute();
                $sth->finish;
                # the IF converts longitudes in 0 <-> 360 form to longitudes in -180 <-> 180 form
                $requete = "UPDATE ${tableFcstName}_${domain} f,${tmpTable}_${domain} t SET ${str2} WHERE IF(t.${tableLngCol} > 180, f.${tableLngCol}=t.${tableLngCol}-360, f.${tableLngCol}=t.${tableLngCol}) AND f.${tableLatCol} = t.${tableLatCol} AND f.${tableEchCol} = t.${tableEchCol}";
                $sth = $dbh->prepare($requete);
                $sth->execute();
                $sth->finish;
                print ".";
            }
        }
        $dbh->disconnect;
    }

    ##################### [update of the last-update file] ####################
    if($updateTimeFile == 1) {
        use Time::Local;
        use POSIX qw/strftime/;
        $now = time();
        $offset = -1;                 # offset in days
        $secs_in_day = (3600 * 24);   # seconds in an hour * 24
        $two_days_hence = $now + ($offset * $secs_in_day);
        $date = strftime('%Y-%m-%d %H:%M:%S', localtime($two_days_hence));
        open (UPDATE, ">$last_update_file");
        print UPDATE ($date);
        close(UPDATE);
        # upload the update file to the web server
        use Net::FTP;
        my $ftp = Net::FTP->new($ftp_host, Debug => 0) or die "Cannot connect to $ftp_host: $@";
        $ftp->login($ftp_user, $ftp_pwd) or die "Cannot login ", $ftp->message;
        $ftp->cwd($ftp_upload_dir) or die "Cannot cd to ", $ftp->message();
        $ftp->put($last_update_file) or die "put failed ", $ftp->message;
        $ftp->quit;
    }

    ##################### [map creation] ####################
    if($generate_maps == 1) {
        print "\n\nStarting images generation\n\n";
        # create the processing directory if it doesn't exist
        if(! -d "${wrfems_dir}/util/grads/products/${run_name}/processing") {
            print "\tmaking processing directory\n";
            mkdir("${wrfems_dir}/util/grads/products/${run_name}/processing") or die ("An error occured when creating new dir\n");
        }
        # clean up the processing directory
        chdir("${wrfems_dir}/util/grads/products/${run_name}/processing") or die "The working directory you specified doesn't exist - check it and try again\n";
        print "\tRemoving old ctl, idx and grb files\n";
        `rm *.ctl`;
        `rm *.idx`;
        `rm *.grb*`;
        chdir("${wrfems_dir}/runs/${run_name}/emsprd/grads") or die "The grib directory ${wrfems_dir}/runs/${run_name}/emsprd/grads you specified doesn't exist - check it and try again\n";
        print "\tCopying ctl, idx and grb files into processing directory\n";
        `cp *.* $wrfems_dir/util/grads/products/$run_name/processing`;
        ########### [rename the *.ctl file] ##########
        chdir("${wrfems_dir}/util/grads/products/${run_name}/processing") or die "The working directory you specified doesn't exist - check it and try again\n";
        print "\tRenaming original ctl file to ${ctl_file_name}\n";
        foreach $domain (@domains) {
            `mv *d${domain}.ctl ${domain}_${ctl_file_name}`;
        }
        # go back to the directory containing the scripts
        chdir("${wrfems_dir}/util/grads/products/${run_name}") or die "The script directory ${wrfems_dir}/util/grads/products/${run_name} you specified doesn't exist - check it and try again\n";
        print "\tProducing ${run_length} hours worth of images\n";
        #print DEBUG "Here's what I'm running: gradsc -l -b -c \"run model.gs.all $requested_domain $runtime $freq $timestep\"\n";
        foreach $domain (@domains) {
            print "${wrfems_dir}/util/grads/products/${run_name}/romandie3km.maps.gs ${wrfems_dir}/util/grads/products/${run_name}/processing/${domain}_${ctl_file_name} ${run_length} ${domain}";
            `${wrfems_dir}/util/grads/bin/opengrads -bp << EOF
run ${wrfems_dir}/util/grads/products/${run_name}/romandie3km.maps.gs ${wrfems_dir}/util/grads/products/${run_name}/processing/${domain}_${ctl_file_name} ${run_length} ${domain}
quit
EOF`;
        }
    }
}
I'm sorry for my English!
Re: extract data from wrf-ems for weather forecast
Yes, you can adapt the script by changing the lines
Code: Select all
for($i = 0; $i < $run_length; $i++)
to
Code: Select all
for($i = 0; $i < $run_length; $i+3)
Maybe you will have to do more adaptations, but I think it's ok like that.
Re: extract data from wrf-ems for weather forecast
Sorry, but where do I have to put the INSERT statement exactly?
Thanks
ZHB wrote:Yes, you have to create the data table and the temporary table. Here is my data table:
Code: Select all
CREATE TABLE IF NOT EXISTS `t_fcst_data_01` (
  `echeance` smallint(3) NOT NULL,
  `fcst_latitude` decimal(8,6) NOT NULL,
  `fcst_longitude` decimal(8,6) NOT NULL,
  `TMP500` decimal(6,3) NOT NULL,
  `RH500` decimal(5,2) NOT NULL,
  `TMP700` decimal(6,3) NOT NULL,
  `RH700` decimal(5,2) NOT NULL,
  `TMP850` decimal(6,3) NOT NULL,
  `RH850` decimal(5,2) NOT NULL,
  `TMP2m` decimal(6,3) NOT NULL,
  `RH2m` decimal(5,2) NOT NULL,
  `DPT2m` decimal(6,3) NOT NULL,
  `TMAX2m` decimal(6,3) NOT NULL,
  `TMIN2m` decimal(6,3) NOT NULL,
  `UGRD10m` decimal(4,1) NOT NULL,
  `VGRD10m` decimal(4,1) NOT NULL,
  `PRMSL` decimal(6,0) NOT NULL,
  `APCPsfc` decimal(4,1) NOT NULL,
  `ACPCPsfc` decimal(4,1) NOT NULL,
  `NCPCPsfc` decimal(4,1) NOT NULL,
  `CAPEsfc` decimal(8,4) NOT NULL,
  `CINsfc` decimal(7,4) NOT NULL,
  `LCDC` decimal(5,2) NOT NULL,
  `MCDC` decimal(5,2) NOT NULL,
  `HCDC` decimal(5,2) NOT NULL,
  `HGT0C` decimal(5,1) NOT NULL,
  `U10MAX10m` decimal(4,1) NOT NULL,
  `V10MAX10m` decimal(4,1) NOT NULL,
  `RH950` decimal(5,2) NOT NULL,
  `RH925` decimal(5,2) NOT NULL,
  `RH900` decimal(5,2) NOT NULL,
  `RH800` decimal(5,2) NOT NULL,
  `TMP950` decimal(6,3) NOT NULL,
  `TMP925` decimal(6,3) NOT NULL,
  `TMP900` decimal(6,3) NOT NULL,
  `TMP800` decimal(6,3) NOT NULL,
  `LFTXsfc` decimal(6,3) NOT NULL,
  `TMP600` decimal(6,3) NOT NULL,
  `TSOIL0001` decimal(6,3) NOT NULL,
  `TSOIL0104` decimal(6,3) NOT NULL,
  `SOILW0001` decimal(6,3) NOT NULL,
  `SOILW0104` decimal(6,3) NOT NULL,
  `HGTsfc` decimal(6,2) NOT NULL,
  `UGRD500` decimal(4,1) NOT NULL,
  `VGRD500` decimal(4,1) NOT NULL,
  `UGRD850` decimal(4,1) NOT NULL,
  `VGRD850` decimal(4,1) NOT NULL,
  `AZRAINaccsfc` decimal(3,1) NOT NULL,
  KEY `coordinate` (`fcst_latitude`,`fcst_longitude`),
  KEY `echeance` (`echeance`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
The temporary table has the same structure, so you only need to change the name.
Note that to insert data into the MySQL database, you have to insert all of the desired timestep values ("echeance" in my table) and all of the latitude and longitude values into the data table (not into the temporary table). This is needed for the UPDATE part of the script. In the data table, "echeance", "fcst_latitude" and "fcst_longitude" are always the same and won't be updated.
To insert those values, you can simply comment out the UPDATE part of the script and add an INSERT statement before it:
Code: Select all
INSERT INTO ${tableFcstName}_${domain} (SELECT * FROM ${tmpTable}_${domain})
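On the "where exactly" question: following the quoted advice, the statement would go inside the per-timestep, per-domain loop of the script's DB section, right after the LOAD DATA query and with the UPDATE query commented out, so that every echeance/latitude/longitude combination is inserted once. A sketch using the script's own variable names (this placement is my reading of the advice, not something spelled out in the thread; run it for one complete forecast, then remove it and restore the UPDATE):
Code: Select all
# Sketch only: lives inside the $insertDataIntoDB loop of ZHB's script,
# right after the LOAD DATA query has filled ${tmpTable}_${domain} for
# timestep $i, while the original UPDATE query is commented out.
$requete = "INSERT INTO ${tableFcstName}_${domain} (SELECT * FROM ${tmpTable}_${domain})";
$sth = $dbh->prepare($requete);
$sth->execute();
$sth->finish;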
Re: extract data from wrf-ems for weather forecast
No, with this change the script never ends.
ZHB wrote:Yes, you can adapt the script by changing the lines
Code: Select all
for($i = 0; $i < $run_length; $i++)
to
Code: Select all
for($i = 0; $i < $run_length; $i+3)
Maybe you will have to do more adaptations, but I think it's ok like that.
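A likely cause, not stated in the thread: in the suggested loop header, `$i+3` only computes a value and never assigns it back to `$i`, so the loop never advances past 0 and never ends. A minimal sketch of a loop that actually steps the forecast hour by 3:
Code: Select all
#!/usr/bin/env perl
# Minimal sketch: step the forecast-hour loop by 3 so only the 3-hourly
# grib files are processed. "$i += 3" updates $i on each pass, unlike "$i+3".
use strict;
use warnings;
my $run_length = 130;   # forecast length, as in the script
for (my $i = 0; $i < $run_length; $i += 3) {
    print "processing forecast hour $i\n";   # the wgrib2 extraction commands would go here
}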
Re: extract data from wrf-ems for weather forecast
Sorry, I already use POSTSCR for another thing ... how can I do it? Thanks
ZHB wrote:Hello,
To execute this script automatically, you have to edit conf/ems_post/post_grads.conf and add this script's path at the end of the file.
Re: extract data from wrf-ems for weather forecast
Code: Select all
my @tab_extract = ( '(:PRMSL:mean sea level:)',
'(:TMP:500 mb:)',
'(:RH:500 mb:)',
'(:TMP:700 mb:)',
'(:RH:700 mb:)',
'(:TMP:850 mb:)',
'(:RH:850 mb:)',
'(:TMP:2 m above ground:)',
'(:DPT:2 m above ground:)',
'(:RH:2 m above ground:)',
'(:TMAX:2 m above ground:)',
'(:TMIN:2 m above ground:)',
'(:UGRD:10 m above ground:)',
'(:VGRD:10 m above ground:)',
'(:PRATE:surface:)',
'(:CPRAT:surface:)',
'(:APCP:surface:)',
'(:ACPCP:surface:)',
'(:NCPCP:surface:)',
'(:CSNOW:surface:)',
'(:CICEP:surface:)',
'(:CFRZR:surface:)',
'(:CAPE:surface:)',
'(:CIN:surface:)',
'(:LCDC:low cloud layer:)',
'(:MCDC:middle cloud layer:)',
'(:HCDC:high cloud layer:)',
'(:CDCON:)',
'(:USTM:6000-0 m above ground:)',
'(:VSTM:6000-0 m above ground:)',
'(:HGT:0C isotherm:)',
'(:GUST:surface:)',
'(:MX10U:10 m above ground:)',
'(:MX10V:10 m above ground:)',
'(:RH:950 mb:)',
'(:RH:925 mb:)',
'(:RH:900 mb:)',
'(:RH:800 mb:)',
'(:TMP:950 mb:)',
'(:TMP:925 mb:)',
'(:TMP:900 mb:)',
'(:TMP:800 mb:)',
'(:LFTX:500-1000 mb:)',
'(:TMP:600 mb:)',
'(:TSOIL:0-0.1 m below ground:)',
'(:SOILW:0-0.1 m below ground:)',
'(:TSOIL:0.4-1 m below ground:)',
'(:SOILW:0.4-1 m below ground:)',
'(:HGT:surface:)');
Does anyone know how I can extract REFCMAXclm? I tried '(:REFCMAX:entire atmospere:)' but got no result.
Thanks
Re: extract data from wrf-ems for weather forecast
Do you have anything in that variable when using, for example, GrADS?
Re: extract data from wrf-ems for weather forecast
Yes, I have it... in the charts it works.

Re: extract data from wrf-ems for weather forecast
I don't have EMS 3.4 anymore, but if I list the content of the grib files of UEMS with wgrib2, I have for instance
Code: Select all
456:2929096:d=2017090406:REFC:entire atmosphere:9 hour fcst:
457:2929854:d=2017090406:REFC:entire atmosphere:8-9 hour max fcst:
with REFC as the parameter name.
Looking at the grads ctl file, I have
Code: Select all
REFCmaxclm 0,10,0 0,16,5,2 ** entire atmosphere Composite reflectivity [dB]
You may have to verify the name of the parameter as seen by wgrib2 by using
Code: Select all
wgrib2 mygribfile
where "mygribfile" is replaced by any of your grib files.
Regards
meteo-sciez.fr
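Based on that inventory (the composite reflectivity records are named REFC, and the hourly maximum shows up as an "8-9 hour max fcst" record), an untested guess at the matching entries for ZHB's @tab_data / @tab_extract arrays would be something like the lines below; the exact strings should be checked against the wgrib2 listing of your own files first.
Code: Select all
# Untested guess based on the wgrib2 inventory quoted above -- entries to
# append to ZHB's arrays; verify the strings with "wgrib2 mygribfile" first.
push @tab_data,    'REFCMAXclm';
push @tab_extract, '(REFC:entire atmosphere:.*max fcst)';   # hourly-maximum composite reflectivity record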