Wednesday, October 29, 2014
IAT GUI summary
1. Find a selected item.
private Choice theIAT_Choice = new Choice();
if (theIAT_Choice.getSelectedItem().equals("vsdb"))
System.out.println("vsdb !!");
2. Shortcut to println statement in eclipse.
// type "syso", then press Ctrl+Space.
System.out.println("Hello, World");
3. ItemEvent selection (not entirely sure about this)
public void itemStateChanged(ItemEvent e) {
    if (e.getStateChange() == ItemEvent.SELECTED) {
        fdLbl.setVisible(false);
        System.out.println("fd lbl hidden");
    } else {
        fdLbl.setVisible(true);
        System.out.println("fd lbl visible");
    }
}
4. Panels I have
emptyConfigPanel :
emptyLbl
emptyTextArea
radmonConfigPanel
vsdbConfigPanel
fdConfigPanel
fdConfigLabelArr
fdConfigTextAreaArr
fdConfigTextAreaStrArr
geConfigPanel
hitConfigPanel
move window via keyboard
Refer to the original post here.
Method #1
NOTE: Method #1 won't work with a maximized window.
- Alt-Tab to (or click on) the window
- Press Alt + Space
- Press M
- Use the arrow keys to move the window
- Press Enter to finish
Method #2
NOTE: Method #2 moves the window to the right or left half of the screen, the same as dragging it to the screen edge would.
- Press Windows Key + Right Arrow or Left Arrow
Method #3
NOTE: Method #3 moves the window one display to the right or left.
- Press Windows Key + Shift + Right Arrow or Left Arrow
- Kudos to Brink for that tip.
Tried Method #3; it did not work for me. However, pressing Windows Key + Shift + Up Arrow or Down Arrow gives a good move.
Tuesday, October 28, 2014
Java SpringLayout example
// ==================================================================
// Config layout for runPanel
// ==================================================================
// ----------------------------> X
// | (0,0)
// |
// |
// \/ Y
// 1. Add components into runPanel
runPanel.add(iatCheckBoxPanel);
runPanel.add(runButton);
runPanel.add(parButton);
SpringLayout runPanelLayout = new SpringLayout();
runPanel.setLayout(runPanelLayout);
int spacer = 5;
int xOrig = 110;
int xWidth = 150;
int yHeight = 30;
// 2. Use constraint to position each component within panel
// All these positions, starting from (0,0), are relative to this panel (local coordinates).
SpringLayout.Constraints iatCheckBoxPanelCons = runPanelLayout
.getConstraints(iatCheckBoxPanel);
iatCheckBoxPanelCons.setX(Spring.constant(0));
iatCheckBoxPanelCons.setY(Spring.constant(0));
iatCheckBoxPanelCons.setWidth(Spring.constant(100));
iatCheckBoxPanelCons.setHeight(Spring.constant(100));
SpringLayout.Constraints runButtonCons = runPanelLayout
.getConstraints(runButton);
runButtonCons.setX(Spring.constant(xOrig));
runButtonCons.setY(Spring.constant(70));
runButtonCons.setWidth(Spring.constant(xWidth));
runButtonCons.setHeight(Spring.constant(yHeight));
SpringLayout.Constraints parButtonCons = runPanelLayout
.getConstraints(parButton);
parButtonCons.setX(Spring.constant(xOrig + xWidth + spacer));
parButtonCons.setY(Spring.constant(70));
parButtonCons.setWidth(Spring.constant(xWidth));
parButtonCons.setHeight(Spring.constant(yHeight));
// ==================================================================
// End of runPanel
// ==================================================================
fcstDiff summary
1. Utility location:
cardinal:
export ndate_dir=/usr/local/jcsda/nwprod_gdas_2014/util/exec
export copygb_dir=/usr/local/jcsda/nwprod_gdas_2014/util/exec
export gribmap_dir=/opt/grads/2.0.2-intel-14.0-2/bin
zeus:
export ndate_dir=/scratch2/portfolios/NCEPDEV/global/save/glopara/nwpara/util/exec
export copygb_dir=/scratch2/portfolios/NCEPDEV/global/save/glopara/nwpara/util/exec
export gribmap_dir=/apps/grads/2.0.1a/bin
2. Modules required to run
zeus:
$ module load grads/2.0.1a # grads
$ module load intel/12-12.0.4.191 # copygb
Note: These modules must be loaded before running the fcstDiff package; otherwise you will see many scary error messages about shared libraries and missing data in the *.ctl and *.gs files.
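A small pre-flight guard makes those failures show up immediately instead of deep inside the *.ctl/*.gs processing. This is a sketch; require_tool and the tool list are my own, not part of the fcstDiff package:

```shell
#!/bin/sh
# Fail fast if required executables are missing from PATH.
require_tool() {
  for tool in "$@"; do
    if ! command -v "$tool" >/dev/null 2>&1; then
      echo "ERROR: '$tool' not found in PATH -- load its module first" >&2
      return 1
    fi
  done
  return 0
}

# e.g. before launching the package:
# require_tool grads gribmap copygb ndate || exit 1
```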
To be continued...
Thursday, October 23, 2014
How to permanently rotate a PDF 90 degrees?
See original post here.
Open the file you want rotated (even if it is 1000 pages all in the wrong direction). Go to Document > Rotate Pages... or use Ctrl+Shift+R; this opens the rotation menu.
You have several rotation options to rotate single pages, all pages, or a selection of pages. Choose what you need and select OK to proceed.
Now you have two options:
1. You will notice the save icon is no longer grayed out, so you can permanently save the file.
2. You can Save As... a new file, should you want to keep the original intact.
Both options permanently save your chosen rotation(s).
This has been built into the software for quite some time but is often overlooked - understandably so, as it's not immediately evident.
Monday, October 20, 2014
vsdb summary 1
1. Compile vsdb package
$ cd /data/users/dxu/vsdb_pkg/vsdb_v17
$ ./build.sh
2. Configure two shell scripts:
# setup_envs.sh
# vsdbjob_submit.sh
$ cd /data/users/dxu/vsdb_pkg/vsdb_v17
$ vi setup_envs.sh
$ vi vsdbjob_submit.sh
3. Run VSDB
$ cd /data/users/dxu/vsdb_pkg/vsdb_v17
$ ./vsdbjob_submit.sh
4. Input, output, log and run dir
steps | run dir
1 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy26072/stats
2 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy47090/acrms47090
3 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy19091/mkup_precip
4 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy22490/plot_pcp
5 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy32596/fit
6 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy20273/2dmaps
steps | log file
1 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy26072/vstep1.out
2 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy47090/vstep2.out
3 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy19091/mkup_rain_stat.out
4 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy22490/plot_pcp.out
5 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy32596/fit2obs.out
6 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy20273/2dmaps[1-4].out
steps | in dir
1 | /data/users/dxu/vsdb_workspace/data/input/fcst_data
2 | /data/users/dxu/vsdb_workspace/data/output/vsdb_data
3 | /data/users/dxu/vsdb_workspace/data/input/fcst_data
4 | /data/users/dxu/vsdb_workspace/data/intermediate/dxu/archive
5 | /data/users/dxu/vsdb_workspace/data/input/f2o
6 | /data/users/dxu/vsdb_workspace/data/input/fcst_data ; /data/users/dxu/vsdb_workspace/data/input/plot2d/obdata
steps | out dir
1 | /data/users/dxu/vsdb_workspace/data/output/vsdb_data
2 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy44333/acrms44333/G2/anom/HGT (score text files) ; /data/users/dxu/vsdb_workspace/data/stmp/dxu/web/allmodel (PNG)
3 | /data/users/dxu/vsdb_workspace/data/intermediate/dxu/archive
4 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/web/rain (PNG)
5 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/web/fits (GIF)
6 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/web/2D/d[1-4] (GIF)
5. script that contains sub_cardinal
export SUBJOB=$vsdbhome/bin/sub_cardinal
File 1 is : ./map_util/sfcfcst_1cyc.sh
File 2 is : ./map_util/allcenters_rmsmap.sh
File 3 is : ./map_util/allcenters_1cyc.sh
File 4 is : ./setup_envs.sh
File 5 is : ./vsdbjob.sh
File 6 is : ./precip/plot_pcp.sh
File 7 is : ./precip/precip_score_vsdb.sh
File 8 is : ./fit2obs/plotall.sh
File 9 is : ./fit2obs/fit2obs.sh
File 10 is : ./plot2d/maps2d_new.sh
File 11 is : ./vsdbjob_submit.sh
File 12 is : ./grid2obs/grid2obs_plot.sh
File 13 is : ./grid2obs/grid2obs_driver.sh
File 14 is : ./grid2obs/scripts/get_opsgfs_data.sh
File 15 is : ./grid2obs/scripts/get_paragfs_data.sh
File 16 is : ./grid2obs/scripts/g2o_sfcmap.sh
File 17 is : ./grid2obs/scripts/grid2obssfc.fits.sh
File 18 is : ./grid2obs/scripts/g2o_airmap.sh
File 19 is : ./grid2obs/grid2obs_opsdaily.sh
File 20 is : ./grid2obs/grid2obs.sh
File 21 is : ./verify_exp_step2.sh
6. all options used in SUBJOB
1) Flags used in SUBJOB, e.g.:
$SUBJOB -a $ACCOUNT -q $CUE2FTP -g $GROUP -p 1/1/S -t 1:00:00 -r 128/1 -j ftpcard -o ftpcard$$.out ${rundir}/ftpcard$$.sh
-e : ENV variable list, e.g.:
$SUBJOB -e $listvar -a $task -q $cue -g $GROUP -p 1/1/S -r 512/1 -t 3:00:00 -o ...
$SUBJOB -a $ACCOUNT -q $CUE2FTP -g $GROUP -p 1/1/S -r 256/1 -w +${waitfits} -t 1:00:00 -j ftpfits -o $mapdir/ftpfits.out ${mapdir}/ftpfits.sh
2) Original meaning of flags.
where the options are:
-a account account (default:none)
-e envars copy comma-separated environment variables
-g group group name
-j jobname specify jobname (default: executable basename)
-n write command file to stdout rather than submitting it
-o output specify output file (default: jobname.out)
-p procs[/nodes]
number of MPI tasks and number of nodes
-q queue queue name
-r nodetype node type (harp or neha)
-v verbose mode
-t timew wall time limit in [[hh:]mm:]ss format (default: 900)
-w when when to run, in yyyymmddhh[mm], +hh[mm], thh[mm], or
Thh[mm] (full, incremental, today or tomorrow) format
(default: now)
3) sub_badger is a wrapper that translates the options above into options the scheduler on badger recognizes.
on badger:
qsub -V : pass all the ENV variables.
4) sub_cardinal is a wrapper that translates the options above into options the scheduler on cardinal recognizes.
on cardinal:
sbatch --export=<environment variables | ALL | NONE>
Identify which environment variables are propagated to the batch job.
Multiple environment variable names should be comma separated. Environment
variable names may be specified to propagate the current value of
those variables (e.g. "--export=EDITOR") or specific values for the
variables may be exported (e.g. "--export=EDITOR=/bin/vi"). This
option is particularly important for jobs that are submitted on one cluster
and execute on a different cluster (e.g. with different paths). By
default all environment variables are propagated. If the argument is
NONE or specific environment variable names, then the --get-user-env
option will implicitly be set to load other environment variables based
upon the user's configuration on the cluster which executes the job.
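The translation these wrappers do can be sketched as a getopts loop that maps the generic flags onto an sbatch command line. Illustrative only: the real sub_cardinal handles more flags (-p, -r, -w, ...) and submits the job instead of printing the command:

```shell
#!/bin/sh
# Minimal sub_*-style wrapper sketch: translate the generic flags
# into an sbatch command line and print it.
build_sbatch_cmd() {
  OPTIND=1
  envars=ALL account="" queue="" jobname="" output="" timew=""
  while getopts "a:e:g:j:o:p:q:r:t:w:" opt; do
    case $opt in
      a) account=$OPTARG ;;
      e) envars=$OPTARG ;;   # comma-separated ENV variable list
      j) jobname=$OPTARG ;;
      o) output=$OPTARG ;;
      q) queue=$OPTARG ;;
      t) timew=$OPTARG ;;
      *) ;;                  # -g/-p/-r/-w ignored in this sketch
    esac
  done
  shift $((OPTIND - 1))
  echo "sbatch --export=$envars -A $account --partition=$queue -J $jobname -o $output -t $timew $*"
}

build_sbatch_cmd -a h-sandy -q batch -j ftpcard -o ftpcard.out -t 1:00:00 run.sh
```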
7. score card related
File 1 is : ./run_scorecard.sh
file3=${scoredir}/score_${stat}_conflimit_${namedaily}_${mdnamec2}_day${dd}
File 2 is : ./map_util/allcenters_rmsmap.sh
'${vsdbhome}/map_util/grads/fprintf.gs 'tv.n' score_${var}_conflimit_'
cp score_${var}_conflimit*.txt $scoredir
File 3 is : ./map_util/allcenters_1cyc.sh
'${vsdbhome}/map_util/grads/fprintf.gs 'tv.n' score_cor_conflimit_'${n.
1). Create score card text files:
./vsdbjob_submit.sh
--> ./verify_exp_step2.sh
--> ./map_util/allcenters_rmsmap.sh and ./map_util/allcenters_1cyc.sh
A) anomaly correlation on single pressure layer
$SUBJOB -e $listvar -a $ACCOUNT -q $CUE2RUN -g $GROUP -p $penode -t $cputime -j HGTanom${narea} -o $rundir/HGT_anom.out \
${sorcdir}/allcenters_1cyc.sh $edate $ndays $fdays
if [ $? -ne 0 ]; then ${sorcdir}/allcenters_1cyc.sh $edate $ndays $fdays ; fi
sleep 3
B) rms and bias
$SUBJOB -e $listvar -a $ACCOUNT -q $CUE2RUN -g $GROUP -p $penode -t $cputime -j HGTpres${narea} -o $rundir/HGT_pres.out \
${sorcdir}/allcenters_rmsmap.sh $edate $ndays $fdays
if [ $? -ne 0 ]; then ${sorcdir}/allcenters_rmsmap.sh $edate $ndays $fdays ; fi
sleep 3
2) Location of score card files?
Details of ./verify_exp_step2.sh
step 1: move vsdb status files to /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/vsdb_data
step 2 : 3 variable types: anom, pres, sfc
step 3: 5 regions: G2NHX, G2SHX, G2TRO, G2, G2PNA
Eg:
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2NHX
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2SHX
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2TRO
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2PNA
step 4 : Each region has two sub-types: pres & anom
eg: /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2NHX
├── anom
│   ├── HGT
│   ├── HGTWV0-3
│   ├── HGTWV10-20
│   ├── HGTWV4-9
│   ├── PMSL
│   ├── T
│   ├── U
│   ├── V
│   └── WIND
└── pres
    ├── HGT
    ├── O3
    ├── T
    ├── U
    ├── V
    └── WIND
A#) anomaly correlation on single pressure layer:
vtype=anom + map_util/allcenters_1cyc.sh
Level | Parameters
"P1000 P700 P500 P250" | "HGT HGT_WV1/0-3 HGT_WV1/4-9 HGT_WV1/10-20"
"P850 P500 P250" | "WIND U V T"
"MSL" | "PMSL"
log files:
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2NHX
PMSL_anom.out
UVT_anom.out # combines U, V and T
HGT_anom.out # combines all HGT, including layered HGT
B#) rms and bias:
vtype=pres + map_util/allcenters_rmsmap.sh
Level | Parameters
if maptop = "10" : "P1000 P925 P850 P700 P500 P400 P300 P250 P200 P150 P100 P50 P20 P10" | "HGT WIND U V T"
if maptop = "50" : "P1000 P925 P850 P700 P500 P400 P300 P250 P200 P150 P100 P50" | "HGT WIND U V T"
if maptop = "100" : "P1000 P925 P850 P700 P500 P400 P300 P250 P200 P150 P100" | "HGT WIND U V T"
"P100 P70 P50 P30 P20 P10" | "O3"
log files:
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2NHX
O3_pres.out
HGT_pres.out
WIND_pres.out
U_pres.out
V_pres.out
T_pres.out
step 5 : Process "sfc" separately.
vtype=sfc + map_util/sfcfcst_1cyc.sh
reglist="G2 G2/NHX G2/SHX G2/TRO G2/N60 G2/S60 G2/NPO G2/SPO G2/NAO G2/SAO G2/CAM G2/NSA"
Level | Parameters
"SL1L2" | "CAPE CWAT PWAT HGTTRP TMPTRP HPBL PSFC PSL RH2m SPFH2m T2m TOZNE TG U10m V10m WEASD TSOILT WSOILT"
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/sfc
└── sfc
    ├── CAPE
    ├── CWAT
    ├── HGTTRP
    ├── HPBL
    ├── PSFC
    ├── PSL
    ├── PWAT
    ├── RH2m
    ├── SPFH2m
    ├── T2m
    ├── TG
    ├── TMPTRP
    ├── TOZNE
    ├── TSOILT
    ├── U10m
    ├── V10m
    ├── WEASD
    └── WSOILT
3) Final html output files:
eg: output location: /scratch2/portfolios/NESDIS/h-sandy/noscrub/Deyong.Xu/vsdb_workspace/data/stmp/Deyong.Xu/nwpvrfy29853/acrms29853/score
scorecard.html
scorecard.css
mainindex.html
legend.html
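The score_${stat}_conflimit_${namedaily}_${model}_day${dd} pattern above can be enumerated with a small loop, which is handy for checking that a run produced the full set of files. A sketch; the helper name and arguments are mine:

```shell
#!/bin/sh
# Enumerate the expected score-card file names for one stat/var/level/
# region/model combination over N forecast days.
expected_scores() {
  stat=$1 var=$2 lev=$3 reg=$4 model=$5 maxday=$6
  dd=1
  while [ "$dd" -le "$maxday" ]; do
    echo "score_${stat}_conflimit_${var}_${lev}_${reg}_${model}_day${dd}.txt"
    dd=$((dd + 1))
  done
}

expected_scores cor HGT P250 G2NHX ECM 5
```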
8. How VSDB status files are used?
verify_exp_step2.sh
--> map_util/allcenters_1cyc.sh & map_util/allcenters_rmsmap.sh
--> map_util/gen_scal.sh
map_util/gen_scal_pres.sh
map_util/gen_wind.sh
map_util/gen_wind_pres.sh
map_util/gen_sfc.sh
Detail of "./gen_scal.sh"
1) convert vsdb file into txt file
while [ $cdate -le $edate ]; do
fhour=00; vhr=$cyc
while [ $fhour -le $vlength ]; do
datadir=${vsdb_data}/${vtype}/${vhr}Z/${model}
vsdbname=${datadir}/${model}_${cdate}.vsdb ### vsdb files
string=" $mdl $fhour ${cdate}${vhr} $mdl $reg SAL1L2 $vnam $lev "
mycheck=$( grep "$string" $vsdbname )
if [ $? -ne 0 ]; then
echo "missing" >>$outname.txt
else
grep "$string" $vsdbname | cat >>${outname}.txt ### grep data out and save it into a text file
fi
fhour=` expr $fhour + $fhout `
if [ $fhour -lt 10 ]; then fhour=0$fhour ; fi
vhr=` expr $vhr + $fhout `
if [ $vhr -ge 24 ]; then vhr=`expr $vhr - 24 `; fi
if [ $vhr -lt 10 ]; then vhr=0$vhr ; fi
done
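The hour bookkeeping in that loop (zero padding, wrapping vhr at 24) is the part most easily broken, so it can be isolated and checked on its own. A sketch using printf %02d instead of the expr/prefix dance; same result:

```shell
#!/bin/sh
# Advance a forecast hour and a 2-digit validation hour by $fhout,
# wrapping vhr at 24 and keeping 2-digit zero padding -- the same
# bookkeeping gen_scal.sh does with expr.
next_hours() {
  fhour=$1 vhr=$2 fhout=$3
  fhour=$(( ${fhour#0} + fhout ))       # strip one leading zero: "06" -> "6"
  vhr=$(( ( ${vhr#0} + fhout ) % 24 ))
  printf '%02d %02d\n' "$fhour" "$vhr"
}

next_hours 00 18 6   # -> "06 00": vhr wraps past midnight
```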
2) convert txt file into binary file via convert.f that is generated on the fly.
( within gen* script such as gen_scal.sh script )
open(9,file="modelname.txt",form="formatted",status="old")
open(10,file="${outname}.txt",form="formatted",status="old") ## text file generated above
open(11,file="tmp.txt",form="formatted",status="new")
open(20,file="${outname}.bin",form="unformatted",status="new") ## output binary file
$FC $FFLAG -o convert.x convert.f
./convert.x
meantxt=${vnam1}_${lev}_${reg1}_${yyyymm}
mv fort.13 meancor_${meantxt}.txt
mv fort.14 meanrms_${meantxt}.txt
mv fort.15 meanbias_${meantxt}.txt
3) create grads control file
cat >${outname}.ctl <<EOF1
dset ^${outname}.bin ## binary file generated above
undef -99.9
options big_endian sequential
title scores
xdef $nmdcyc linear 1 1
4) output ctl files
HGT_P1000_G2NHX_2014020120140228.ctl
HGT_P700_G2NHX_2014020120140228.ctl
HGT_P500_G2NHX_2014020120140228.ctl
HGT_P250_G2NHX_2014020120140228.ctl
Details of "allcenters_1cyc.sh"
0)# -- search data for all models; write out binary data, create grads control file
/gen_wind.sh
/gen_scal.sh
1) # ----- PLOT TYPE 1: time series of anomaly correlations ----
cat >acz_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname}.ctl'
mdc.1=${mdnamec[0]}
* Create verification scorecard text files ( score card files )
'${vsdbhome}/map_util/grads/fprintf.gs 'sc.i' score_cor_'${namedaily}'_'mdc.i'_day'%day'.txt %-7.6f' ### GOLD
...
$GRADSBIN/grads -bcp "run acz_${outname1}.gs"
2) # ----- PLOT TYPE 2: Die-off plot for mean correlation over $ndays days----
ndaysp1=`expr $ndays + 1`
fdaysp1=`expr $fdays + 1`
cat >cordieoff_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname}.ctl'
** Create verification scorecard text files ( conflimit files between models )
'${vsdbhome}/map_util/grads/fprintf.gs 'tv.n' score_cor_conflimit_'${namedaily}'_'mdc.i'_day'n-1'.txt %-7.6f' ## GOLD: generate score_cor_conflimit* files needed to run score card.
...
$GRADSBIN/grads -bcp "run cordieoff_${outname1}.gs"
3) # ----- PLOT TYPE 3: difference of AC, other models minus first model ----
cat >cordiff_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'run $sorcdir/grads/white.gs'
'open ${outname}.ctl'
...
$GRADSBIN/grads -bcp "run cordiff_${outname1}.gs"
4) # ----- PLOT TYPE 4: frequency distribution of anomaly correlations ----
ndayfq=$ndays
if [ $ndayfq -gt 20 ]; then ndayfq=20; fi
nday05=`expr $ndayfq \/ 2 `
cat >freq_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname}.ctl'
...
$GRADSBIN/grads -bcp "run freq_${outname1}.gs"
5) output grads script (.gs) files
freq_HGT_P700_G2NHX_2014020120140228.gs
freq_HGT_P250_G2NHX_2014020120140228.gs
freq_HGT_P500_G2NHX_2014020120140228.gs
freq_HGT_P1000_G2NHX_2014020120140228.gs
cordiff_HGT_P700_G2NHX_2014020120140228.gs
cordiff_HGT_P250_G2NHX_2014020120140228.gs
cordiff_HGT_P500_G2NHX_2014020120140228.gs
cordiff_HGT_P1000_G2NHX_2014020120140228.gs
acz_HGT_P700_G2NHX_2014020120140228.gs
acz_HGT_P250_G2NHX_2014020120140228.gs
acz_HGT_P500_G2NHX_2014020120140228.gs
acz_HGT_P1000_G2NHX_2014020120140228.gs
cordieoff_HGT_P700_G2NHX_2014020120140228.gs
cordieoff_HGT_P250_G2NHX_2014020120140228.gs
cordieoff_HGT_P500_G2NHX_2014020120140228.gs
cordieoff_HGT_P1000_G2NHX_2014020120140228.gs
6) Sample score files needed to run scorecard.
[dxu@s4-cardinal HGT]$ pwd
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy_deyong/acrms_deyong/G2NHX/anom/HGT
[dxu@s4-cardinal HGT]$ tree -f |grep score_cor |grep 250
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day1.txt # for 2nd model only
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day2.txt
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day3.txt
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day4.txt
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day5.txt
├── ./score_cor_HGT_P250_G2NHX_ECM_day1.txt
├── ./score_cor_HGT_P250_G2NHX_ECM_day3.txt
├── ./score_cor_HGT_P250_G2NHX_ECM_day5.txt
├── ./score_cor_HGT_P250_G2NHX_GFS_day1.txt # Reference model
├── ./score_cor_HGT_P250_G2NHX_GFS_day3.txt
├── ./score_cor_HGT_P250_G2NHX_GFS_day5.txt
Details of "allcenters_rmsmap.sh"
varlist="rms bias pcor emd epv rsd msess"
0) # -- search data for all models; write out binary data, create grads control file
gen_wind_pres.sh
gen_scal_pres.sh
# ----- PLOT TYPE 1: maps of $var as a function of calendar day and pressure for each forecast time ----
cat >${var}p_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'run $sorcdir/grads/white.gs'
'open ${outname1}.ctl'
...
EOF1
$GRADSBIN/grads -bcp "run ${var}p_${outname1}.gs"
eg:
rmsp_WIND_G2NHX_2014020120140228.gs
biasp_WIND_G2NHX_2014020120140228.gs
pcorp_WIND_G2NHX_2014020120140228.gs
emdp_WIND_G2NHX_2014020120140228.gs
epvp_WIND_G2NHX_2014020120140228.gs
rsdp_WIND_G2NHX_2014020120140228.gs
msessp_WIND_G2NHX_2014020120140228.gs
# ----- PLOT TYPE 2: maps of mean ${var} as a function of forecast time and pressure ----
ndaysp1=`expr $ndays + 1`
fdaysp1=`expr $fdays + 1`
cat >${var}pmean_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'run $sorcdir/grads/white.gs'
'open ${outname1}.ctl'
...
EOF1
$GRADSBIN/grads -bcp "run ${var}pmean_${outname1}.gs"
eg:
rmspmean_WIND_G2NHX_2014020120140228.gs
biaspmean_WIND_G2NHX_2014020120140228.gs
pcorpmean_WIND_G2NHX_2014020120140228.gs
emdpmean_WIND_G2NHX_2014020120140228.gs
epvpmean_WIND_G2NHX_2014020120140228.gs
rsdpmean_WIND_G2NHX_2014020120140228.gs
msesspmean_WIND_G2NHX_2014020120140228.gs
# ----- PLOT TYPE 3: time series of ${var} errors ----
cat >${var}_${outname1}${lev}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname1}.ctl'
'${vsdbhome}/map_util/grads/fprintf.gs 'sc.i' score_${var}_${namedaily1}_'mdc.i'_day'%day'.txt %-7.6f' ## GOLD: generate single score file for each model
...
EOF1
$GRADSBIN/grads -bcp "run ${var}_${outname1}${lev}.gs"
eg:
rsddieoff_WIND_G2NHX_2014020120140228P200.gs
rsddieoff_WIND_G2NHX_2014020120140228P10.gs
rsddieoff_WIND_G2NHX_2014020120140228P100.gs
# ----- PLOT TYPE 4: mean ${var} error growth curve over $ndays days ----
ndaysp1=`expr $ndays + 1`
fdaysp1=`expr $fdays + 1`
cat >${var}dieoff_${outname1}${lev}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname1}.ctl'
'${vsdbhome}/map_util/grads/fprintf.gs 'tv.n' score_${var}_conflimit_'${namedaily1}'_'mdc.i'_day'n-1'.txt %-7.6f' ## GOLD *conflimit*
...
EOF1
$GRADSBIN/grads -bcp "run ${var}dieoff_${outname1}${lev}.gs"
eg:
rsddieoff_WIND_G2NHX_2014020120140228P10.gs
rsddieoff_WIND_G2NHX_2014020120140228P100.gs
msessdieoff_WIND_G2NHX_2014020120140228P850.gs
msessdieoff_WIND_G2NHX_2014020120140228P700.gs
log files:
/scratch2/portfolios/NESDIS/h-sandy/noscrub/Deyong.Xu/vsdb_workspace/data/stmp/Deyong.Xu/nwpvrfy774/acrms774/G2NHX/pres/WIND
./score_rsd_WIND_P1000_G2NHX_GFS_day3.txt
./score_epv_WIND_P500_G2NHX_GFS_day3.txt
./score_pcor_WIND_P700_G2NHX_ECM_day5.txt
./score_pcor_WIND_P50_G2NHX_ECM_day5.txt
9. Region control / distribution
Go to /scratch2/portfolios/NESDIS/h-sandy/noscrub/Deyong.Xu/vsdb_pkg/vsdb_v17/exe
on zeus, run the following commands, and look for the lines shown below.
cntl_sfc.sh seems to be what you're trying to find.
$ vi cntl_sfc.sh
12 ${gd}
${gd}/NHX 0 20 360 80
${gd}/SHX 0 -80 360 -20
${gd}/TRO 0 -20 360 20
${gd}/N60 0 60 360 90
${gd}/S60 0 -90 360 -60
${gd}/NPO
${gd}/SPO
${gd}/NAO
${gd}/SAO
${gd}/CAM
${gd}/NSA
$ vi cntl_pres.sh
5 ${gd}
${gd}/NHX 0 20 360 80
${gd}/SHX 0 -80 360 -20
${gd}/TRO 0 -20 360 20
${gd}/PNA 180 20 320 75
$ vi cntl_anom.sh
5 ${gd}
${gd}/NHX 0 20 360 80
${gd}/SHX 0 -80 360 -20
${gd}/TRO 0 -20 360 20
${gd}/PNA 180 20 320 75
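Each region line above appears to be four numbers: west longitude, south latitude, east longitude, north latitude (my reading of the columns, e.g. PNA = 180 20 320 75). A quick membership check under that assumption:

```shell
#!/bin/sh
# Check whether a point (lon, lat) lies inside a region box given as
# "lon_w lat_s lon_e lat_n" -- the assumed meaning of the four columns
# in cntl_pres.sh / cntl_anom.sh.
in_region() {
  lon=$1 lat=$2 lon_w=$3 lat_s=$4 lon_e=$5 lat_n=$6
  [ "$lon" -ge "$lon_w" ] && [ "$lon" -le "$lon_e" ] &&
    [ "$lat" -ge "$lat_s" ] && [ "$lat" -le "$lat_n" ]
}

# PNA box is 180 20 320 75:
in_region 200 50 180 20 320 75 && echo "inside PNA"
```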
10. Account related issue:
On some platforms such as zeus, the account setting matters because not all accounts are allowed to submit jobs. The account is passed as one of the job-submission parameters.
So for zeus, I set it to "h-sandy":
export ACCOUNT=h-sandy
acz_HGT_P700_G2NHX_2014020120140228.gs
acz_HGT_P250_G2NHX_2014020120140228.gs
acz_HGT_P500_G2NHX_2014020120140228.gs
acz_HGT_P1000_G2NHX_2014020120140228.gs
cordieoff_HGT_P700_G2NHX_2014020120140228.gs
cordieoff_HGT_P250_G2NHX_2014020120140228.gs
cordieoff_HGT_P500_G2NHX_2014020120140228.gs
cordieoff_HGT_P1000_G2NHX_2014020120140228.gs
6) Sample score files needed to run scorecard.
[dxu@s4-cardinal HGT]$ pwd
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy_deyong/acrms_deyong/G2NHX/anom/HGT
[dxu@s4-cardinal HGT]$ tree -f |grep score_cor |grep 250
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day1.txt # for 2nd model only
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day2.txt
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day3.txt
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day4.txt
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day5.txt
├── ./score_cor_HGT_P250_G2NHX_ECM_day1.txt
├── ./score_cor_HGT_P250_G2NHX_ECM_day3.txt
├── ./score_cor_HGT_P250_G2NHX_ECM_day5.txt
├── ./score_cor_HGT_P250_G2NHX_GFS_day1.txt # Reference model
├── ./score_cor_HGT_P250_G2NHX_GFS_day3.txt
├── ./score_cor_HGT_P250_G2NHX_GFS_day5.txt
Details of "allcenters_rmsmap.sh"
varlist="rms bias pcor emd epv rsd msess"
0) # -- search data for all models; write out binary data, create grads control file
gen_wind_pres.sh
gen_scal_pres.sh
# ----- PLOT TYPE 1: maps of $var as a function of calendar day and pressure for each forecast time ----
cat >${var}p_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'run $sorcdir/grads/white.gs'
'open ${outname1}.ctl'
...
EOF1
$GRADSBIN/grads -bcp "run ${var}p_${outname1}.gs"
eg:
rmsp_WIND_G2NHX_2014020120140228.gs
biasp_WIND_G2NHX_2014020120140228.gs
pcorp_WIND_G2NHX_2014020120140228.gs
emdp_WIND_G2NHX_2014020120140228.gs
epvp_WIND_G2NHX_2014020120140228.gs
rsdp_WIND_G2NHX_2014020120140228.gs
msessp_WIND_G2NHX_2014020120140228.gs
# ----- PLOT TYPE 2: maps of mean ${var} as a function of forecast time and pressure ----
ndaysp1=`expr $ndays + 1`
fdaysp1=`expr $fdays + 1`
cat >${var}pmean_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'run $sorcdir/grads/white.gs'
'open ${outname1}.ctl'
...
EOF1
$GRADSBIN/grads -bcp "run ${var}pmean_${outname1}.gs"
eg:
rmspmean_WIND_G2NHX_2014020120140228.gs
biaspmean_WIND_G2NHX_2014020120140228.gs
pcorpmean_WIND_G2NHX_2014020120140228.gs
emdpmean_WIND_G2NHX_2014020120140228.gs
epvpmean_WIND_G2NHX_2014020120140228.gs
rsdpmean_WIND_G2NHX_2014020120140228.gs
msesspmean_WIND_G2NHX_2014020120140228.gs
# ----- PLOT TYPE 3: time series of ${var} errors ----
cat >${var}_${outname1}${lev}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname1}.ctl'
'${vsdbhome}/map_util/grads/fprintf.gs 'sc.i' score_${var}_${namedaily1}_'mdc.i'_day'%day'.txt %-7.6f' ## GOLD: generate single score file for each model
...
EOF1
$GRADSBIN/grads -bcp "run ${var}_${outname1}${lev}.gs"
eg:
rsddieoff_WIND_G2NHX_2014020120140228P200.gs
rsddieoff_WIND_G2NHX_2014020120140228P10.gs
rsddieoff_WIND_G2NHX_2014020120140228P100.gs
# ----- PLOT TYPE 4: mean ${var} error growth curve over $ndays days ----
ndaysp1=`expr $ndays + 1`
fdaysp1=`expr $fdays + 1`
cat >${var}dieoff_${outname1}${lev}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname1}.ctl'
'${vsdbhome}/map_util/grads/fprintf.gs 'tv.n' score_${var}_conflimit_'${namedaily1}'_'mdc.i'_day'n-1'.txt %-7.6f' ## GOLD *conflimit*
...
EOF1
$GRADSBIN/grads -bcp "run ${var}dieoff_${outname1}${lev}.gs"
eg:
rsddieoff_WIND_G2NHX_2014020120140228P10.gs
rsddieoff_WIND_G2NHX_2014020120140228P100.gs
msessdieoff_WIND_G2NHX_2014020120140228P850.gs
msessdieoff_WIND_G2NHX_2014020120140228P700.gs
log files:
/scratch2/portfolios/NESDIS/h-sandy/noscrub/Deyong.Xu/vsdb_workspace/data/stmp/Deyong.Xu/nwpvrfy774/acrms774/G2NHX/pres/WIND
./score_rsd_WIND_P1000_G2NHX_GFS_day3.txt
./score_epv_WIND_P500_G2NHX_GFS_day3.txt
./score_pcor_WIND_P700_G2NHX_ECM_day5.txt
./score_pcor_WIND_P50_G2NHX_ECM_day5.txt
8. Region control / distribution
Go to /scratch2/portfolios/NESDIS/h-sandy/noscrub/Deyong.Xu/vsdb_pkg/vsdb_v17/exe
on zeus, run the following commands, and find the lines shown below.
cntl_sfc.sh seems to be what you're trying to find.
$ vi cntl_sfc.sh
12 ${gd}
${gd}/NHX 0 20 360 80
${gd}/SHX 0 -80 360 -20
${gd}/TRO 0 -20 360 20
${gd}/N60 0 60 360 90
${gd}/S60 0 -90 360 -60
${gd}/NPO
${gd}/SPO
${gd}/NAO
${gd}/SAO
${gd}/CAM
${gd}/NSA
$ vi cntl_pres.sh
5 ${gd}
${gd}/NHX 0 20 360 80
${gd}/SHX 0 -80 360 -20
${gd}/TRO 0 -20 360 20
${gd}/PNA 180 20 320 75
$ vi cntl_anom.sh
5 ${gd}
${gd}/NHX 0 20 360 80
${gd}/SHX 0 -80 360 -20
${gd}/TRO 0 -20 360 20
${gd}/PNA 180 20 320 75
9. Account related issue:
On some platforms such as zeus, the account setting matters because not all accounts are allowed to submit jobs there. The account is passed as one of the job-submission parameters.
So for zeus, I set it to "h-sandy":
export ACCOUNT=h-sandy
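On Slurm-based systems the account can also be passed explicitly at submission time via sbatch's -A/--account flag. A sketch (the job name, log paths, and abc1.sh are illustrative; the command is echoed here as a dry run rather than actually submitted):

```shell
export ACCOUNT=h-sandy
# -A/--account tells Slurm which account to charge the job to.
subcmd="sbatch -A $ACCOUNT -J myjob -o $HOME/myjob.log -e $HOME/myjob.err $HOME/abc1.sh"
echo "$subcmd"   # dry run: print the submission command instead of running it
```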
Friday, October 17, 2014
sbatch
Submit job with sbatch
$SUB -J ${jobname} -s -o ${logfile} -e ${logfile} $cmdfile
eg:
main.sh
--------------
#!/bin/bash
export TT=999
sbatch -s -J "myjob " -o ~/aa.log -e ~/aa.err ~/abc1.sh
sbatch -s -J "myjob " -o ~/aa2.log -e ~/aa2.err ~/abc2.sh
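One detail worth noting in main.sh above: sbatch propagates the submitting shell's exported environment to the job by default (--export=ALL), which is why a variable like TT=999 set in main.sh would be visible inside abc1.sh. A minimal sketch of that behavior, run inline rather than through Slurm:

```shell
#!/bin/bash
# Mimic main.sh: export a variable, then run what would be the job script's
# body. Under sbatch, the exported TT reaches abc1.sh the same way via the
# default --export=ALL behavior.
export TT=999
echo "TT is: ${TT:-unset}"   # stands in for abc1.sh's body
```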
32-bit IEEE floating-point number
What is 32-bit IEEE floating-point number?
Refer to here to see the original post.
There is not much material on the cray_32bit_ieee option. The GrADS documentation describes it as: "Indicates the data file contains 32-bit IEEE floats created on a cray. May be used with gridded or station data."
A 32-bit IEEE floating-point number has 1 bit for the sign, 8 bits for the exponent, and 23 bits for the mantissa.
The following description is a useful reference for reading this kind of data:
We're looking at single precision floating point numbers here. Double precision uses the same scheme, just more bits. Here's what the output looks like :
0 00000000 0 00000000 00000000000000000000000
1 3F800000 0 01111111 00000000000000000000000
2 40000000 0 10000000 00000000000000000000000
4 40800000 0 10000001 00000000000000000000000
8 41000000 0 10000010 00000000000000000000000
16 41800000 0 10000011 00000000000000000000000
32 42000000 0 10000100 00000000000000000000000
64 42800000 0 10000101 00000000000000000000000
128 43000000 0 10000110 00000000000000000000000
256 43800000 0 10000111 00000000000000000000000
512 44000000 0 10001000 00000000000000000000000
1024 44800000 0 10001001 00000000000000000000000
2048 45000000 0 10001010 00000000000000000000000
4096 45800000 0 10001011 00000000000000000000000
8192 46000000 0 10001100 00000000000000000000000
5.75 40B80000 0 10000001 01110000000000000000000
-.1 BDCCCCCD 1 01111011 10011001100110011001101
The first column is what the stored format looks like in hex. After that come the actual bits; I've separated them in this odd way for a very good reason (which will become clear later). The value "5.75" is stored as "01000000101110000000000000000000" or "40B80000" (hex).
You might easily guess that the first bit is the sign bit. I think that's what I first grokked back in 1983 too. The next 8 bits are used for the exponent, and the last 23 are the value. As you will no doubt notice, the value bits from 0 to 8192 are all empty, so I must be crazy and there's no point in reading this trash any farther.
Well, actually there is. There's a hidden bit there that isn't stored but is always assumed. If you are really compulsive and counted the bits, you see that only 23 bits are there. The hidden bit makes it 24, and it is always 1. So, if we add the hidden bit, the bits would look like:
0 0 00000000 100000000000000000000000
1 0 01111111 100000000000000000000000
2 0 10000000 100000000000000000000000
4 0 10000001 100000000000000000000000
8 0 10000010 100000000000000000000000
16 0 10000011 100000000000000000000000
32 0 10000100 100000000000000000000000
64 0 10000101 100000000000000000000000
128 0 10000110 100000000000000000000000
256 0 10000111 100000000000000000000000
512 0 10001000 100000000000000000000000
1024 0 10001001 100000000000000000000000
2048 0 10001010 100000000000000000000000
4096 0 10001011 100000000000000000000000
8192 0 10001100 100000000000000000000000
5.75 0 10000001 101110000000000000000000
-.1 1 01111011 110011001100110011001101
But remember, it's what I showed above that is really there.
One more thing: there's an implied decimal point after that hidden number. To get the value of bits after the decimal point, start dividing by two: so the first bit after the (implied) decimal point is .5, the next is .25 and so on. We don't have to worry about any of that for the powers of two, because obviously those are whole numbers and the bits will be all 0. But down at the 5.75 we see that at work:
First, looking at the exponent for 5.75, we see that it is 129. Subtracting 127 gives us 2. So 1.0111 times 2^2 becomes 101.11 (simply shift 2 places to the right to multiply by 4). So now we have 101 binary, which is 5, plus .5 plus .25 (.11) or 5.75 in total. Too quick?
Taking it in detail:
Exponent: 10000001, which is 129 (use the Javascript Bit Twiddler if you like). Subtract 127 leaves us with 2.
Mantissa: 01110000000000000000000
Add in the implied bit and we have 101110000000000000000000, with implied decimal point that's 1.01110000000000000000000
Multiply that by 2^2 to get 101.110000000000000000000
That is 4 + 1 + .5 + .25 or 5.75
Look at 2048. The exponent is 128 + 8 + 2 or 138, subtract 127 we get 11. Use the Bit Twiddle if you don't see that. The mantissa is all 0's, which with the implied bit makes this all 1.00000000000000000000000 times 2^11. What's 2^11? It's 2048, of course.
Now the -.1. This actually can't be stored precisely, but the method is still the same. The exponent is 64 + 32 + 16 + 8 + 2 + 1 or 123. Subtract 127 and we get -4, which means the decimal point moves 4 places to the left, making our value .000110011001100110011001101. Now you understand why it's stored after adding 127 - it's so we can end up with negative exponents. If we calculate out the binary, that's .0625 + .03125 + .00390625 and on to ever smaller numbers which get us very, very close to .1 (but off slightly). The sign bit was set, so it's a -.1
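The stored hex values above can be checked directly from the shell by writing the raw bytes and letting od decode them as a 32-bit float (this assumes GNU od, whose --endian option appeared in coreutils 8.23, and bash's printf \xNN escapes):

```shell
# 40B80000 decoded as a big-endian IEEE single is 5.75:
printf '\x40\xb8\x00\x00' | od -An -t f4 --endian=big
# BDCCCCCD decodes to (approximately) -0.1:
printf '\xbd\xcc\xcc\xcd' | od -An -t f4 --endian=big
```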
gsi diagnostic data information
==============================
diagnostic data info:
==============================
module read_diag
subroutine read_radiag_header
subroutine read_radiag_data
### 1.1) header fix ####
type diag_header_fix_list
character(len=20) :: isis ! sat and sensor type
character(len=10) :: id ! sat type
character(len=10) :: obstype ! observation type
integer(i_kind) :: jiter ! outer loop counter
integer(i_kind) :: nchan ! number of channels in the sensor
integer(i_kind) :: npred ! number of updating bias correction predictors
integer(i_kind) :: idate ! time (yyyymmddhh)
integer(i_kind) :: ireal ! # of real elements in the fix part of a data record
integer(i_kind) :: ipchan ! # of elements for each channel except for bias correction terms
integer(i_kind) :: iextra ! # of extra elements for each channel
integer(i_kind) :: jextra ! # of extra elements
integer(i_kind) :: idiag ! first dimension of diag_data_chan
integer(i_kind) :: angord ! order of polynomial for adp_anglebc option
integer(i_kind) :: iversion ! radiance diagnostic file version number
integer(i_kind) :: inewpc ! indicator of newpc4pred (1 on, 0 off)
end type diag_header_fix_list
### 1.2) header chan ####
type diag_header_chan_list
real(r_single) :: freq ! frequency (Hz)
real(r_single) :: polar ! polarization
real(r_single) :: wave ! wave number (cm^-1)
real(r_single) :: varch ! error variance (or SD error?)
real(r_single) :: tlapmean ! mean lapse rate
integer(i_kind):: iuse ! use flag
integer(i_kind):: nuchan ! sensor relative channel number
integer(i_kind):: iochan ! satinfo relative channel number
end type diag_header_chan_list
### 2.1) data name ####
type diag_data_name_list
character(len=10),dimension(ireal_radiag) :: fix
character(len=10),dimension(:),allocatable :: chn
end type diag_data_name_list
### 2.2) data fix ####
type diag_data_fix_list
real(r_single) :: lat ! latitude (deg)
real(r_single) :: lon ! longitude (deg)
real(r_single) :: zsges ! guess elevation at obs location (m)
real(r_single) :: obstime ! observation time relative to analysis
real(r_single) :: senscn_pos ! sensor scan position (integer(i_kind))
real(r_single) :: satzen_ang ! satellite zenith angle (deg)
real(r_single) :: satazm_ang ! satellite azimuth angle (deg)
real(r_single) :: solzen_ang ! solar zenith angle (deg)
real(r_single) :: solazm_ang ! solar azimuth angle (deg)
real(r_single) :: sungln_ang ! sun glint angle (deg)
real(r_single) :: water_frac ! fractional coverage by water
real(r_single) :: land_frac ! fractional coverage by land
real(r_single) :: ice_frac ! fractional coverage by ice
real(r_single) :: snow_frac ! fractional coverage by snow
real(r_single) :: water_temp ! surface temperature over water (K)
real(r_single) :: land_temp ! surface temperature over land (K)
real(r_single) :: ice_temp ! surface temperature over ice (K)
real(r_single) :: snow_temp ! surface temperature over snow (K)
real(r_single) :: soil_temp ! soil temperature (K)
real(r_single) :: soil_mois ! soil moisture
real(r_single) :: land_type ! land type (integer(i_kind))
real(r_single) :: veg_frac ! vegetation fraction
real(r_single) :: snow_depth ! snow depth
real(r_single) :: sfc_wndspd ! surface wind speed
real(r_single) :: qcdiag1 ! ir=cloud fraction, mw=cloud liquid water
real(r_single) :: qcdiag2 ! ir=cloud top pressure, mw=total column water
real(r_single) :: tref ! reference temperature (Tr) in NSST
real(r_single) :: dtw ! dt_warm at zob
real(r_single) :: dtc ! dt_cool at zob
real(r_single) :: tz_tr ! d(Tz)/d(Tr)
end type diag_data_fix_list
### 2.3) data chan ####
type diag_data_chan_list
real(r_single) :: tbobs ! Tb (obs) (K)
real(r_single) :: omgbc ! Tb_(obs) - Tb_(simulated w/ bc) (K)
real(r_single) :: omgnbc ! Tb_(obs) - Tb_(simulated_w/o bc) (K)
real(r_single) :: errinv ! inverse error (K**(-1))
real(r_single) :: qcmark ! quality control mark
real(r_single) :: emiss ! surface emissivity
real(r_single) :: tlap ! temperature lapse rate
real(r_single) :: tb_tz ! d(Tb)/d(Tz)
real(r_single) :: bicons ! constant bias correction term
real(r_single) :: biang ! scan angle bias correction term
real(r_single) :: biclw ! CLW bias correction term
real(r_single) :: bilap2 ! square lapse rate bias correction term
real(r_single) :: bilap ! lapse rate bias correction term
real(r_single) :: bicos ! node*cos(lat) bias correction term
real(r_single) :: bisin ! sin(lat) bias correction term
real(r_single) :: biemis ! emissivity sensitivity bias correction term
real(r_single),dimension(:),allocatable :: bifix ! angle dependent bias
real(r_single) :: bisst ! SST bias correction term
end type diag_data_chan_list
### 2.4) data extra ####
type diag_data_extra_list
real(r_single) :: extra ! extra information
end type diag_data_extra_list
Thursday, October 16, 2014
radmon summary
1. radmon step 1: data_extract
It will create output: angle, bcoef, bcor and time for all cycles in a day.
./data_extract/ush/VrfyRad_glbl.sh # wrapper
export biascr=$DATDIR/biascr.gdas.${PDATE}
export radstat=$DATDIR/radstat.gdas.${PDATE}
\/
./nwprod/jobs/JGDAS_VRFYRAD.sms.prod
||
\/
./nwprod/scripts/exgdas_vrfyrad.sh.sms
cd ${DATA}
$NCP $biascr ./biascr.$PDATE # input 1
$NCP $radstat ./radstat.$PDATE # input 2
tar -xvf radstat.$PDATE
rm radstat.$PDATE
mv diag_${type}_ges.${PDATE}.${Z} ${type}.${Z} # diag_*_anl* still there
${UNCOMPRESS} ./${type}.${Z}
||
\/
./nwprod/ush/radmon_verf_angle.sh ==> $TIMEX ./${angle_exec} < input > ${stdout_file}
./nwprod/ush/radmon_verf_bcoef.sh ==> $TIMEX ./${bcoef_exec} < input > stdout.$type
./nwprod/ush/radmon_verf_bcor.sh ==> $TIMEX ./${bcor_exec} < input > stdout.$type
./nwprod/ush/radmon_verf_time.sh ==> $TIMEX ./${time_exec} < input > ${stdout_file}
Also move data, control, and stdout files to $TANKverf_rad and compress.
output:
/data/users/dxu/radmon_workspace/data/output/radmon_tank/stats/myolddata/radmon.20130731
angle.sndrd2_g15.2013073100.ieee_d.gz # intermediate file
angle.sndrd2_g15.2013073106.ieee_d.gz
angle.sndrd2_g15.2013073112.ieee_d.gz
angle.sndrd2_g15.2013073118.ieee_d.gz
angle.sndrd2_g15.ctl.gz // ctl file created by f90 code.
angle.stdout.sndrd2_g15.gz // log file for one of the cycles above.
bcoef.sndrd2_g15.2013073100.ieee_d.gz
bcoef.sndrd2_g15.2013073106.ieee_d.gz
bcoef.sndrd2_g15.2013073112.ieee_d.gz
bcoef.sndrd2_g15.2013073118.ieee_d.gz
bcoef.sndrd2_g15.ctl.gz
bcoef.stdout.sndrd2_g15.gz
bcor.sndrd2_g15.2013073100.ieee_d.gz
bcor.sndrd2_g15.2013073106.ieee_d.gz
bcor.sndrd2_g15.2013073112.ieee_d.gz
bcor.sndrd2_g15.2013073118.ieee_d.gz
bcor.sndrd2_g15.ctl.gz
bcor.stdout.sndrd2_g15.gz
time.sndrd2_g15.2013073100.ieee_d.gz
time.sndrd2_g15.2013073106.ieee_d.gz
time.sndrd2_g15.2013073112.ieee_d.gz
time.sndrd2_g15.2013073118.ieee_d.gz
time.sndrd2_g15.ctl.gz
time.stdout.sndrd2_g15.gz
2. how endianness is set:
1) "./nwprod/ush/radmon_verf_time.sh"
LITTLE_ENDIAN=${LITTLE_ENDIAN:-0} # default value is 0: big_endian
cat << EOF > input # create namelist file that is input to f90 code.
&INPUT
satname='${type}',
iyy=${iyy},
imm=${imm},
idd=${idd},
ihh=${ihh},
idhh=-720
incr=6
nchanl=${nchanl},
suffix='${SUFFIX}',
imkctl=${MAKE_CTL},
imkdata=${MAKE_DATA},
gesanl='${dtype}',
little_endian=${LITTLE_ENDIAN},
rad_area='${RAD_AREA}',
/
EOF
$TIMEX ./${time_exec} < input > ${stdout_file} # namelist
2) ./nwprod/sorc/verf_radbcoef.fd/bcoef.f90:
integer :: little_endian = 1                                 ! default is 1
namelist /input/ satname,npredr,nchanl,iyy,imm,idd,ihh,idhh,&
incr,suffix,imkctl,imkdata,retrieval,gesanl,little_endian    ! read from namelist
read(luname,input)
call create_ctl_bcoef(ntype,ftype,n_chan,iyy,imm,idd,ihh,idhh,&
incr,ctl_file,lunctl,rmiss,mod_satname,satype,dplat,&
nu_chan,use,penalty,frequency,wavenumbr,little_endian)
3) ./nwprod/sorc/verf_radbcoef.fd/create_ctl_bcoef.f90:
Purpose: this sub creates ctl file such as "bcor.sndrd2_g15.ctl"
subroutine create_ctl_bcoef(ntype,ftype,n_chan,iyy,imm,idd,ihh,idhh,&
incr,ctl_file,lunctl,rmiss,satname,satype,dplat,&
nu_chan,use,ratio,frequency,wavenumbr,little_endian)
if ( little_endian == 1 ) then
write(lunctl,112)    ! 1 means little-endian
else
write(lunctl,110)    ! 0 means big_endian
endif
110 format('options template big_endian cray_32bit_ieee sequential')
112 format('options template little_endian sequential')
radmon little endian version:
SVN: https://svnemc.ncep.noaa.gov/projects/gsi/branches/NESDIS-JCSDA/users/dxu/radmon_addition/radmon_little_endian/
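In shell terms, the Fortran branch above boils down to the following sketch (not the package's actual code; the default matches the LITTLE_ENDIAN=0 big-endian default set in radmon_verf_time.sh):

```shell
LITTLE_ENDIAN=${LITTLE_ENDIAN:-0}   # same default as radmon_verf_time.sh
if [ "$LITTLE_ENDIAN" -eq 1 ]; then
  echo 'options template little_endian sequential'
else
  echo 'options template big_endian cray_32bit_ieee sequential'
fi
```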
3. step 2 : generate images
1) code tracing :
./image_gen/ush/mk_bcoef_plots.sh
$SUB -J ${jobname} -s -o ${logfile} -e ${logfile} $SCRIPTS/plot_bcoef.sh
./ush/plot_bcoef.sh
plot_bcoef=plot_bcoef.gs # fixed grads script
cat << EOF > ${type}_${var}.gs # Create wrapper gs file on the fly
'open ${type}.ctl' # ctl file created by f90 above.
'run ${GSCRIPTS}/${plot_bcoef} ${type} ${var} x1100 y850'
'quit'
EOF
$GRADS -bpc "run ${tmpdir}/${type}_${var}.gs"
2) Wrapper grads script : ${type}_${var}.gs
'open amsua_metop-a.ctl'
'run plot_bcor_sep.glb.gs amsua_metop-a total 1 x1100 y850'
'quit'
http://www.iges.org/grads/gadoc/descriptorfile.html
4. How to install radmon
5. How to run radmon
1) data_extract:
How:
$ cd /data/users/dxu/radmon_pkg/radmon/util/Radiance_Monitor/data_extract/ush
$ cat run_radmon_kgarrett |head -3
./VrfyRad_glbl.sh kgarrett_radmon 2014051500 # kgarrett_radmon is ID
./VrfyRad_glbl.sh kgarrett_radmon 2014051506
./VrfyRad_glbl.sh kgarrett_radmon 2014051512
$ . run_radmon_kgarrett # runs cycle by cycle
input dir:
/data/users/dxu/radmon_workspace/data/input/radmon_input_for_kevin
File:
radstat.gdas.2014061000
biascr.gdas.2014061000
run dir:
/data/users/dxu/radmon_workspace/run/dxu/gdas_vrfyrad_2014060218.4056
File:
./amsua_metop-a
./diag_amsua_metop-a_anl.2014060218.gz
./amsua_metop-a.ctl
./stdout.amsua_metop-a
./amsua_metop-a.deyong
output dir:
/data/users/dxu/radmon_workspace/data/output/radmon_tank/stats/kgarrett_radmon/radmon.20140603
File:
./bcoef.amsua_metop-a.2014060300.ieee_d.gz
./angle.amsua_metop-a.2014060300.ieee_d.gz
./time.amsua_metop-a.2014060300.ieee_d.gz
./bcor.amsua_metop-a.2014060300.ieee_d.gz
./bcoef.amsua_metop-a.ctl.gz
./time.amsua_metop-a.ctl.gz
./angle.amsua_metop-a.ctl.gz
./bcor.amsua_metop-a.ctl.gz
2) image_gen
How:
$ cd /data/users/dxu/radmon_for_kg/util/Radiance_Monitor/image_gen/ush
$ ./CkPlt_glbl.sh run_radmon_kgarrett # runs in one shot; no need to specify a cycle.
input dir:
/data/users/dxu/radmon_workspace/data/output/radmon_tank/stats/kgarrett_radmon/radmon.20140603
File: ./bcoef.amsua_metop-a.2014060300.ieee_d.gz
./angle.amsua_metop-a.2014060300.ieee_d.gz
./time.amsua_metop-a.2014060300.ieee_d.gz
./bcor.amsua_metop-a.2014060300.ieee_d.gz
./bcoef.amsua_metop-a.ctl.gz
./time.amsua_metop-a.ctl.gz
./angle.amsua_metop-a.ctl.gz
./bcor.amsua_metop-a.ctl.gz
run dir:
/data/users/dxu/radmon_workspace/run/dxu/plotjobs_kgarrett_radmon
/data/users/dxu/radmon_workspace/run/dxu/plot_summary_kgarrett_radmon.20140
/data/users/dxu/radmon_workspace/run/dxu/horiz_kgarrett_radmon.2014061018
File: omitted here because too many.
output dir:
/data/users/dxu/radmon_workspace/data/output/radmon_tank/imgn/kgarrett_radmon/pngs
File:
bcor/iasi_metop-a.lapse_region4_fr88.png
time/iasi_metop-a.omgnbc_region3_fr93.png
bcoef/iasi_metop-a.mean_fr128.png
angle/iasi_metop-a.omgnbc_region2_fr61.png
horiz/hirs4_metop-b.obs_10.png
summary/atms_npp.summary.png
3) job submission setting in RadMon_config
elif [[ $MY_MACHINE = "cardinal" ]]; then
shell=sh
export SUB=/usr/bin/sbatch # submit job to job scheduler slurm
export NWPROD=/usr/local/jcsda/nwprod_gdas_2014
export COMPRESS=gzip
export UNCOMPRESS="gunzip -f"
export TIMEX=
export UTILS_BIN=
6. "RadMon_install.pl" updates configuration files.
The user configures "RadMon_install.pl", which then updates two configuration files in the "parm" sub-directory:
"./parm/RadMon_config"
"./parm/RadMon_user_settings"
1) When it updates "parm/RadMon_config", it updates the following fields:
if( $_ =~ "MY_RADMON=" ) {
elsif( $_ =~ "MY_TANKDIR=" ) {
elsif( $_ =~ "WEB_SVR=" ) {
elsif( $_ =~ "WEB_USER=" ) {
elsif( $_ =~ "WEBDIR=" ) {
elsif( $_ =~ "LITTLE_ENDIAN=" ) {
elsif( $_ =~ "MY_MACHINE=" ) {
elsif( $_ =~ "PTMP=" ) {
elsif( $_ =~ "STMP=" ) {
2) When it updates "./parm/RadMon_user_settings", it updates the following fields:
if ($line =~ m/export ACCOUNT/) {
elsif( $line =~ m/export PROJECT/ ){
elsif( $line =~ m/export JOB_QUEUE/ ){
elsif( $line =~ m/export HPSS_DIR/ ){
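The installer does these updates in Perl by matching the lines shown above and rewriting them. The same idea can be sketched in shell with sed; this runs against a throwaway copy rather than the real parm files, and the MY_MACHINE value is just an example:

```shell
#!/bin/bash
# Sketch (not RadMon_install.pl's actual code): rewrite one field of a
# config file in place, demonstrated on a temporary copy.
tmpcfg=$(mktemp)
echo 'export MY_MACHINE=zeus' > "$tmpcfg"
sed -i 's/^\(export MY_MACHINE=\).*/\1cardinal/' "$tmpcfg"
cat "$tmpcfg"        # the line now reads: export MY_MACHINE=cardinal
rm -f "$tmpcfg"
```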
7. Platform-related files
Note: These files contain platform names such as "zeus", "jibb", etc. They need to be updated when a new platform is added.
It will create output: angle, bcoef, bcor and time for all cycles in a day.
./data_extract/ush/VrfyRad_glbl.sh # wrapper
export biascr=$DATDIR/biascr.gdas.${PDATE}||
export radstat=$DATDIR/radstat.gdas.${PDATE}
\/
./nwprod/jobs/JGDAS_VRFYRAD.sms.prod
||
\/
./nwprod/scripts/exgdas_vrfyrad.sh.sms
cd ${DATA}
$NCP $biascr ./biascr.$PDATE # input 1
$NCP $radstat ./radstat.$PDATE # input 2
tar -xvf radstat.$PDATE
rm radstat.$PDATE
mv diag_${type}_ges.${PDATE}.${Z} ${type}.${Z} # diag_*_anl* still there
${UNCOMPRESS} ./${type}.${Z}
||
\/
./nwprod/ush/radmon_verf_angle.sh ==> $TIMEX ./${angle_exec} < input > ${stdout_file}
./nwprod/ush/radmon_verf_bcoef.sh ==> $TIMEX ./${bcoef_exec} < input > stdout.$type
./nwprod/ush/radmon_verf_bcor.sh ==> $TIMEX ./${bcor_exec} < input > stdout.$type
./nwprod/ush/radmon_verf_time.sh ==> $TIMEX ./${time_exec} < input > ${stdout_file}
Also move data, control, and stdout files to $TANKverf_rad and compress.
output:
/data/users/dxu/radmon_workspace/data/output/radmon_tank/stats/myolddata/radmon.20130731
angle.sndrd2_g15.2013073100.ieee_d.gz # intermediate file
angle.sndrd2_g15.2013073106.ieee_d.gz
angle.sndrd2_g15.2013073112.ieee_d.gz
angle.sndrd2_g15.2013073118.ieee_d.gz
angle.sndrd2_g15.ctl.gz // ctl file created by f90 code.
angle.stdout.sndrd2_g15.gz // log file for one of the cycles above.
bcoef.sndrd2_g15.2013073100.ieee_d.gz
bcoef.sndrd2_g15.2013073106.ieee_d.gz
bcoef.sndrd2_g15.2013073112.ieee_d.gz
bcoef.sndrd2_g15.2013073118.ieee_d.gz
bcoef.sndrd2_g15.ctl.gz
bcoef.stdout.sndrd2_g15.gz
bcor.sndrd2_g15.2013073100.ieee_d.gz
bcor.sndrd2_g15.2013073106.ieee_d.gz
bcor.sndrd2_g15.2013073112.ieee_d.gz
bcor.sndrd2_g15.2013073118.ieee_d.gz
bcor.sndrd2_g15.ctl.gz
bcor.stdout.sndrd2_g15.gz
time.sndrd2_g15.2013073100.ieee_d.gz
time.sndrd2_g15.2013073106.ieee_d.gz
time.sndrd2_g15.2013073112.ieee_d.gz
time.sndrd2_g15.2013073118.ieee_d.gz
time.sndrd2_g15.ctl.gz
time.stdout.sndrd2_g15.gz
2. how endianness is set:
1) "./nwprod/ush/radmon_verf_time.sh"
LITTLE_ENDIAN=${LITTLE_ENDIAN:-0} # default value is 0: big_endian
cat << EOF > input # create namelist file that is input to f90 code.
&INPUT
satname='${type}',
iyy=${iyy},
imm=${imm},
idd=${idd},
ihh=${ihh},
idhh=-720
incr=6
nchanl=${nchanl},
suffix='${SUFFIX}',
imkctl=${MAKE_CTL},
imkdata=${MAKE_DATA},
gesanl='${dtype}',
little_endian=${LITTLE_ENDIAN},
rad_area='${RAD_AREA}',
/
EOF
$TIMEX ./${time_exec} < input > ${stdout_file} # namelist
2) ./nwprod/sorc/verf_radbcoef.fd/bcoef.f90:
integer :: little_endian = 1 # default is 1.
namelist /input/ satname,npredr,nchanl,iyy,imm,idd,ihh,idhh,&
incr,suffix,imkctl,imkdata,retrieval,gesanl,little_endian # read from namelist
read(luname,input)
call create_ctl_bcoef(ntype,ftype,n_chan,iyy,imm,idd,ihh,idhh,&
incr,ctl_file,lunctl,rmiss,mod_satname,satype,dplat,&
nu_chan,use,penalty,frequency,wavenumbr,little_endian)
3) ./nwprod/sorc/verf_radbcoef.fd/create_ctl_bcoef.f90:
Purpose: this subroutine creates the ctl file, e.g. "bcoef.sndrd2_g15.ctl"
subroutine create_ctl_bcoef(ntype,ftype,n_chan,iyy,imm,idd,ihh,idhh,&
incr,ctl_file,lunctl,rmiss,satname,satype,dplat,&
nu_chan,use,ratio,frequency,wavenumbr,little_endian)
if ( little_endian == 1 ) then
write(lunctl,112) # 1 means little-endian
else
write(lunctl,110) # 0 means big_endian
endif
110 format('options template big_endian cray_32bit_ieee sequential')
112 format('options template little_endian sequential')
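The flow is: the shell script sets LITTLE_ENDIAN in the namelist, the f90 code reads it, and create_ctl_bcoef picks the OPTIONS line accordingly. A shell paraphrase of that Fortran branch (illustration only, not the actual code):

```shell
# Shell paraphrase of the if/else in create_ctl_bcoef.f90: the
# little_endian namelist flag selects the OPTIONS line written to the ctl.
little_endian=1   # as read from the &INPUT namelist
if [ "$little_endian" -eq 1 ]; then
  opts='options template little_endian sequential'              # format 112
else
  opts='options template big_endian cray_32bit_ieee sequential' # format 110
fi
echo "$opts"
```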
radmon little endian version:
SVN: https://svnemc.ncep.noaa.gov/projects/gsi/branches/NESDIS-JCSDA/users/dxu/radmon_addition/radmon_little_endian/
3. Step 2: generate images
1) code tracing :
./image_gen/ush/mk_bcoef_plots.sh
$SUB -J ${jobname} -s -o ${logfile} -e ${logfile} $SCRIPTS/plot_bcoef.sh
./ush/plot_bcoef.sh
plot_bcoef=plot_bcoef.gs # fixed grads script
cat << EOF > ${type}_${var}.gs # Create wrapper gs file on the fly
'open ${type}.ctl' # ctl file created by f90 above.
'run ${GSCRIPTS}/${plot_bcoef} ${type} ${var} x1100 y850'
'quit'
EOF
$GRADS -bpc "run ${tmpdir}/${type}_${var}.gs"
2) Wrapper grads script : ${type}_${var}.gs
'open amsua_metop-a.ctl'
'run plot_bcor_sep.glb.gs amsua_metop-a total 1 x1100 y850'
'quit'
http://www.iges.org/grads/gadoc/descriptorfile.html
The best way to ensure hardware independence for gridded data is to specify the data's source platform. This facilitates moving data files and their descriptor files between machines; the data may be used on any type of hardware without having to worry about byte ordering. The following three OPTIONS keywords describe the byte ordering of a gridded or station data file:
big_endian      : the data file contains 32-bit IEEE floats created on a big-endian platform (e.g., sun, sgi)
little_endian   : the data file contains 32-bit IEEE floats created on a little-endian platform (e.g., iX86, dec)
cray_32bit_ieee : the data file contains 32-bit IEEE floats created on a cray
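For reference, a minimal GrADS descriptor using one of these keywords might look like the fragment below. The file name, grid, and variable are made up for illustration; only the options line mirrors what the f90 code writes.

```
dset ^demo.%y4%m2%d2%h2.ieee_d
options template little_endian sequential
undef -999.0
xdef 10 linear 0 1
ydef 1 linear 0 1
zdef 1 linear 1 1
tdef 4 linear 00z31jul2013 6hr
vars 1
count 0 0 demo variable
endvars
```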
4. How to install radmon
$ cd /data/users/dxu/radmon_pkg/radmon/util/Radiance_Monitor
$ vi ./RadMon_install.pl
$ vi parm/RadMon_config
$ ./RadMon_install.pl
Note: Settings in RadMon_install.pl overwrite both parm/RadMon_config and parm/RadMon_user_settings, so the configuration in RadMon_install.pl takes precedence.
5. How to run radmon
1) data_extract:
How:
$ cd /data/users/dxu/radmon_pkg/radmon/util/Radiance_Monitor/data_extract/ush
$ cat run_radmon_kgarrett |head -3
./VrfyRad_glbl.sh kgarrett_radmon 2014051500 # kgarrett_radmon is ID
./VrfyRad_glbl.sh kgarrett_radmon 2014051506
./VrfyRad_glbl.sh kgarrett_radmon 2014051512
$ . run_radmon_kgarrett # run by each cycle
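The cycle list in run_radmon_kgarrett simply advances by 6 hours. A bash sketch that generates the same three commands (the ID and script name are taken from the notes above; bash arithmetic assumed):

```shell
# Generate 6-hourly VrfyRad_glbl.sh calls; 10# prevents octal parsing of "08".
ymd=20140515
hh=00
for i in 1 2 3; do
  echo "./VrfyRad_glbl.sh kgarrett_radmon ${ymd}${hh}"
  hh=$(printf '%02d' $(( (10#$hh + 6) % 24 )))   # next 6-hourly cycle
done
```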
input dir:
/data/users/dxu/radmon_workspace/data/input/radmon_input_for_kevin
File:
radstat.gdas.2014061000
biascr.gdas.2014061000
run dir:
/data/users/dxu/radmon_workspace/run/dxu/gdas_vrfyrad_2014060218.4056
File:
./amsua_metop-a
./diag_amsua_metop-a_anl.2014060218.gz
./amsua_metop-a.ctl
./stdout.amsua_metop-a
./amsua_metop-a.deyong
output dir:
/data/users/dxu/radmon_workspace/data/output/radmon_tank/stats/kgarrett_radmon/radmon.20140603
File:
./bcoef.amsua_metop-a.2014060300.ieee_d.gz
./angle.amsua_metop-a.2014060300.ieee_d.gz
./time.amsua_metop-a.2014060300.ieee_d.gz
./bcor.amsua_metop-a.2014060300.ieee_d.gz
./bcoef.amsua_metop-a.ctl.gz
./time.amsua_metop-a.ctl.gz
./angle.amsua_metop-a.ctl.gz
./bcor.amsua_metop-a.ctl.gz
2) image_gen
How:
$ cd /data/users/dxu/radmon_for_kg/util/Radiance_Monitor/image_gen/ush
$ ./CkPlt_glbl.sh run_radmon_kgarrett # runs in one shot; no need to specify a cycle
input dir:
/data/users/dxu/radmon_workspace/data/output/radmon_tank/stats/kgarrett_radmon/radmon.20140603
File: ./bcoef.amsua_metop-a.2014060300.ieee_d.gz
./angle.amsua_metop-a.2014060300.ieee_d.gz
./time.amsua_metop-a.2014060300.ieee_d.gz
./bcor.amsua_metop-a.2014060300.ieee_d.gz
./bcoef.amsua_metop-a.ctl.gz
./time.amsua_metop-a.ctl.gz
./angle.amsua_metop-a.ctl.gz
./bcor.amsua_metop-a.ctl.gz
run dir:
/data/users/dxu/radmon_workspace/run/dxu/plotjobs_kgarrett_radmon
/data/users/dxu/radmon_workspace/run/dxu/plot_summary_kgarrett_radmon.20140
/data/users/dxu/radmon_workspace/run/dxu/horiz_kgarrett_radmon.2014061018
File: omitted here; too many to list.
output dir:
/data/users/dxu/radmon_workspace/data/output/radmon_tank/imgn/kgarrett_radmon/pngs
File:
bcor/iasi_metop-a.lapse_region4_fr88.png
time/iasi_metop-a.omgnbc_region3_fr93.png
bcoef/iasi_metop-a.mean_fr128.png
angle/iasi_metop-a.omgnbc_region2_fr61.png
horiz/hirs4_metop-b.obs_10.png
summary/atms_npp.summary.png
3) job submission setting in RadMon_config
elif [[ $MY_MACHINE = "cardinal" ]]; then
shell=sh
export SUB=/usr/bin/sbatch # submit job to job scheduler slurm
export NWPROD=/usr/local/jcsda/nwprod_gdas_2014
export COMPRESS=gzip
export UNCOMPRESS="gunzip -f"
export TIMEX=
export UTILS_BIN=
6. "RadMon_install.pl" updates the configuration files.
The user configures "RadMon_install.pl", which then updates two configuration files in the "parm" sub-directory:
"./parm/RadMon_config"
"./parm/RadMon_user_settings"
1) When it updates "parm/RadMon_config", it updates the following fields:
if( $_ =~ "MY_RADMON=" ) {
elsif( $_ =~ "MY_TANKDIR=" ) {
elsif( $_ =~ "WEB_SVR=" ) {
elsif( $_ =~ "WEB_USER=" ) {
elsif( $_ =~ "WEBDIR=" ) {
elsif( $_ =~ "LITTLE_ENDIAN=" ) {
elsif( $_ =~ "MY_MACHINE=" ) {
elsif( $_ =~ "PTMP=" ) {
elsif( $_ =~ "STMP=" ) {
2) When it updates "./parm/RadMon_user_settings", it updates the following fields:
if ($line =~ m/export ACCOUNT/) {
elsif( $line =~ m/export PROJECT/ ){
elsif( $line =~ m/export JOB_QUEUE/ ){
elsif( $line =~ m/export HPSS_DIR/ ){
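The Perl matches above amount to: find a KEY= line in the config file and rewrite its value. A shell equivalent with sed, using a demo file and the MY_TANKDIR field from the list above (GNU sed -i assumed; this is a sketch, not what RadMon_install.pl literally runs):

```shell
# Rewrite the value of a KEY= field in place, sed equivalent of the Perl.
cfg=$(mktemp)
printf 'export MY_TANKDIR=/old/tank\nexport MY_MACHINE=zeus\n' > "$cfg"
sed -i 's|MY_TANKDIR=.*|MY_TANKDIR=/new/tank|' "$cfg"
grep MY_TANKDIR "$cfg"
```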
7. Platform-related files
Note: These files contain platform names such as "zeus", "jibb", etc. They need to be updated whenever a new platform is added.
makeall.sh
get_hostname.pl
RadMon_install.pl
data_extract/ush/VrfyRad_glbl.sh
image_gen/ush/CkPlt_glbl.sh
image_gen/parm/plot_rad_conf