Here is how I shop for a cellphone.
1. Pre-filter phones with good ratings.
Go to a site like Walmart: http://www.walmart.com/browse/cell-phones/unlocked-phones/1105910_1073085?sort=rating_high
and find the highest-rated phones with around 100 reviews each. The more reviewers, the better.
Then pick a few that fit your budget.
2. Check cellphone specs and read customer reviews related to technical issues.
Check cellphone specifications at both sites:
site 1: http://www.gsmarena.com/blu_studio_5_0_c_hd-6565.php
site 2: http://www.phonearena.com/phones/BLU-Studio-5.0-C-HD_id8805
Note:
1) Click "read all opinions" at site 1.
2) Site 2 gives a more visual summary that shows where your phone stands in each category.
3. Make sure the cellphone will work with your intended wireless carrier
Go to http://www.bhphotovideo.com/ and locate the same phone.
http://www.bhphotovideo.com/c/product/1076548-REG/blu_d534u_blue_studio_5_0c_hd_d534u.html
Then click the following links to make sure the cellphone you're choosing will work with the carrier you're going to use:
North American Carriers
South American Carriers
Worldwide Carriers
Sunday, December 14, 2014
El Nino and La Nina
El Niño is the warm phase of the El Niño Southern Oscillation (commonly called ENSO) and is associated with a band of warm ocean water that develops in the central and east-central equatorial Pacific (between approximately the International Date Line and 120°W), including off the Pacific coast of South America. El Niño Southern Oscillation refers to the cycle of warm and cold temperatures, as measured by sea surface temperature (SST), of the tropical central and eastern Pacific Ocean.
El Niño is accompanied by high air pressure in the western Pacific and low air pressure in the eastern Pacific.
The cool phase of ENSO is called "La Niña", with SST in the eastern Pacific below average and air pressure high in the eastern and low in the western Pacific.
The ENSO cycle, both El Niño and La Niña, causes global changes of both temperatures and rainfall.[2][3] Mechanisms that cause the oscillation remain under study.
Developing countries dependent upon agriculture and fishing, particularly those bordering the Pacific Ocean, are the most affected.
In Spanish, the capitalized term "El Niño" refers to the Christ child, Jesus (literal translation: "The (male) Child"). La Niña, chosen as the "opposite" of El Niño, literally means "The (female) Child".
El Niño was so named because periodic warming in the Pacific near South America is often noticed around Christmas.[4]
Monday, December 1, 2014
Kill task in windows from command line
1. Open a command-line terminal
cmd
2. Useful "tasklist" commands
tasklist /?
tasklist /fo list
tasklist /fo table
tasklist /fo csv
tasklist /fi "imagename eq firefox.exe"
taskkill /f /fi "imagename eq firefox.exe" ## force-kill matching tasks (note: taskkill, not tasklist)
3. Useful help page for "taskkill"
taskkill /?
4. Examples
4.1) Use filter to find the application you want to terminate.
C:\Users\dxu>tasklist /fi "imagename eq firefox.exe"
Image Name PID Session Name Session# Mem Usage
========================= ======== ================ =========== ============
firefox.exe 7804 Console 1 135,052 K
4.2) Kill the application by PID
C:\Users\dxu>taskkill /pid 7804
SUCCESS: Sent termination signal to the process with PID 7804.
C:\Users\dxu>tasklist /fi "imagename eq firefox.exe"
Image Name PID Session Name Session# Mem Usage
========================= ======== ================ =========== ============
firefox.exe 11584 Console 1 115,612 K
4.3) Force-kill an application with "/f"
C:\Users\dxu>taskkill /f /im firefox ## Must specify full imagename.
ERROR: The process "firefox" not found.
C:\Users\dxu>taskkill /f /im firefox.exe
SUCCESS: The process "firefox.exe" with PID 11584 has been terminated.
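Putting tasklist and taskkill together, here is a minimal batch sketch (the image name is just an example) that force-kills a process only if it is actually running:
@echo off
rem Minimal sketch: force-kill %IMAGE% only if it is currently running.
set IMAGE=firefox.exe
rem tasklist prints an INFO message (not an error) when nothing matches,
rem so pipe through "find" to get a usable errorlevel.
tasklist /fi "imagename eq %IMAGE%" | find /i "%IMAGE%" >nul
if %errorlevel%==0 (
    taskkill /f /im %IMAGE%
) else (
    echo %IMAGE% is not running.
)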
Thursday, November 13, 2014
hit summary
1. Main scripts
East Pacific:
trackep2014_ep.sh
trackep2013_ep.sh
trackep2012_ep.sh
trackep2011_ep.sh
Atlantic:
trackat2014_at.sh
trackat2013_at.sh
trackat2012_at.sh
trackat2011_at.sh
Track mean for both East Pacific and Atlantic:
track_mean.sh
2. File needed for a storm
For instance: storm Amanda
"trackep2014_ep.sh"
Amanda) code1=ep012014.dat; DATEST=20140522; DATEND=20140529;;
ep012014.dat is used to locate two files, aep012014.dat and bep012014.dat, in the hit package's
archive data location: hit_pkg/hit/tpctrack
Note:
a) Use NHC's website "http://www.nhc.noaa.gov/archive/2014/" to verify the storm ID and storm basin.
b) If the storm entry (e.g. "Amanda) code1=ep012014.dat; DATEST=20140522; DATEND=20140529;;") is not set correctly in the script (e.g. "trackep2014_ep.sh"), everything downstream will go wrong.
ep012014.dat : storm #1 in East Pacific
aep012014.dat : contains all the forecast information from the various storm forecast centers.
bep012014.dat : contains the BEST track/intensity information for the storm.
3. Input file
A user's input file, such as "atcfunix.gdas.2011102406", is appended to
aep012014.dat to make a new forecast information file containing your model's forecast plus the models that are already in it, as sketched below.
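A minimal sketch of that append step, using the file names from the example above (the backup name is my own convention):
# Append the user's ATCF output to the a-deck forecast file.
cp aep012014.dat aep012014.dat.orig      # hypothetical backup, not part of the package
cat atcfunix.gdas.2011102406 >> aep012014.dat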
4. To be continued ....
Thursday, November 6, 2014
vsdb summary 2
1. Nested for loops in script "run_scorecard.sh"
for stat in $statlist ; do           # Loop over cor, rms, bias
  for vnam in $vnamlist ; do         # Loop over HGT, T, U, V, WIND
    for reg in $reglist ; do         # Loop over G2, NH, SH
      for dd in $day ; do            # Loop over day 1,3,5,6,8,10
        file1=${scoredir}/score_${stat}_${namedaily}_${mdnamec1}_day${dd}.txt
        file2=${scoredir}/score_${stat}_${namedaily}_${mdnamec2}_day${dd}.txt
        file3=${scoredir}/score_${stat}_conflimit_${namedaily}_${mdnamec2}_day${dd}.txt
        # Originally: if any of the three files is missing, exit 88.
        # I changed it so it continues instead.
        if [[ ! -s "$file1" || ! -s "$file2" || ! -s "$file3" ]] ; then
          #dxu exit 88
          continue   # dxu: skip that day and move on to the next
        fi
      done
    done
  done
done
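To test that skip logic outside the package, here is a tiny standalone sketch (the directory and example values are assumptions, not values from run_scorecard.sh):
scoredir=/path/to/score        # assumption: wherever run_scorecard.sh keeps its text files
stat=cor; namedaily=HGT_P500_G2NHX; mdnamec1=GFS; dd=5
file1=${scoredir}/score_${stat}_${namedaily}_${mdnamec1}_day${dd}.txt
[ -s "$file1" ] && echo "found $file1" || echo "skipping day $dd"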
2. Two ways to run script "run_scorecard.sh"
Monday, November 3, 2014
Grid models vs. spectral models
The three dimensions of space can be accounted for in various ways in numerical weather or climate prediction models. Most models are grid models, in which variables are computed at discrete grid points in the horizontal and vertical directions. The model resolution refers to the (horizontal) spacing between grid points. The grid spacing is not necessarily equidistant. For instance, some models use a fixed longitude difference as the zonal grid spacing, so the physical zonal grid spacing shrinks toward zero near the poles. In the vertical direction the spacing is usually variable; the model resolution is typically highest just above sea level.
Other models, in particular those whose domain is global, are spectral models (Note 15.H): these transform the variation of some variable (e.g. temperature) with latitude and longitude into a series of waves; the highest wave number retained in the Fourier transform is a measure of the model resolution.
Numerical prediction models are based on the equations of motion (Note 15.G), and these involve many partial derivatives in space. Partial derivatives of wave fields (as used in spectral models) can be calculated exactly, rather than by means of the finite-difference approach used in grid models. This is the main advantage of spectral models. Of course, the wave form is converted back into a spatial form after the calculations in order to analyze the forecasts.
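A one-line worked example of that advantage: for a single Fourier mode the spatial derivative is exact, while a centered finite difference only approximates it.
u(x) = \hat{u}_k e^{ikx}
\quad\Rightarrow\quad
\frac{\partial u}{\partial x} = ik\,\hat{u}_k e^{ikx} \qquad \text{(spectral: exact)}
\frac{u(x+\Delta x) - u(x-\Delta x)}{2\Delta x}
= \frac{\sin(k\Delta x)}{\Delta x}\, i\,\hat{u}_k e^{ikx}
\qquad \text{(grid: accurate only when } k\Delta x \ll 1\text{)}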
Wednesday, October 29, 2014
IAT GUI summary
1. Find a selected item.
private Choice theIAT_Choice = new Choice();
if (theIAT_Choice.getSelectedItem().equals("vsdb"))
System.out.println("vsdb !!");
2. Shortcut to println statement in eclipse.
// Type "syso", then press CTRL + SPACE.
System.out.println("Hello, World");
3. ItemEvent selection (I'm not really sure about this one)
public void itemStateChanged(ItemEvent e) {
    if (e.getStateChange() == ItemEvent.SELECTED) {
        fdLbl.setVisible(false);
        System.out.println("fd lbl selected");
    } else {
        fdLbl.setVisible(true);
        System.out.println("fd lbl invisible");
    }
}
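For itemStateChanged to fire, the listener has to be registered on the component; a minimal sketch, assuming the enclosing class implements ItemListener (that wiring is not shown in the original post):
// Assumed wiring: register this class as the listener on the Choice.
theIAT_Choice.addItemListener(this);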
4. Panels I have
emptyConfigPanel :
emptyLbl
emptyTextArea
radmonConfigPanel
vsdbConfigPanel
fdConfigPanel
fdConfigLabelArr
fdConfigTextAreaArr
fdConfigTextAreaStrArr
geConfigPanel
hitConfigPanel
move window via keyboard
Refer to the original post here.
Method #1
NOTE: Method #1 won't work with a maximized Window.
- Alt-Tab or Click On the Window
- Press "Alt & Space"
- Press "M"
- Use your arrow keys to move the Window
- Press Enter to exit
Method #2
NOTE: Method #2 will move your Window to the right or left half of the screen, in the same manner as dragging a window to the right or left edge of the screen.
- Press the Windows Key & Right Arrow or Left Arrow
Method #3
NOTE: Method #3 will move your Window one display to the right or left.
- Kudos to Brink for that tip.
- Press the Windows Key & Shift & Right Arrow or Left Arrow
I tried Method #3, and it did not work for me. However, pressing the Windows Key & Shift & Up Arrow or Down Arrow does move the window nicely.
Tuesday, October 28, 2014
Java SpringLayout example
// ==================================================================
// Config layout for runPanel
// ==================================================================
// ----------------------------> X
// | (0,0)
// |
// |
// \/ Y
// 1. Add components into runPanel
runPanel.add(iatCheckBoxPanel);
runPanel.add(runButton);
runPanel.add(parButton);
SpringLayout runPanelLayout = new SpringLayout();
runPanel.setLayout(runPanelLayout);
int spacer = 5;
int xOrig = 110;
int xWidth = 150;
int yHeight = 30;
// 2. Use constraint to position each component within panel
// All these positions starting from (0,) are RELATIVE to this panel LOCALLY.
SpringLayout.Constraints iatCheckBoxPanelCons = runPanelLayout
.getConstraints(iatCheckBoxPanel);
iatCheckBoxPanelCons.setX(Spring.constant(0));
iatCheckBoxPanelCons.setY(Spring.constant(0));
iatCheckBoxPanelCons.setWidth(Spring.constant(100));
iatCheckBoxPanelCons.setHeight(Spring.constant(100));
SpringLayout.Constraints runButtonCons = runPanelLayout
.getConstraints(runButton);
runButtonCons.setX(Spring.constant(xOrig));
runButtonCons.setY(Spring.constant(70));
runButtonCons.setWidth(Spring.constant(xWidth));
runButtonCons.setHeight(Spring.constant(yHeight));
SpringLayout.Constraints parButtonCons = runPanelLayout
.getConstraints(parButton);
parButtonCons.setX(Spring.constant(xOrig + xWidth + spacer));
parButtonCons.setY(Spring.constant(70));
parButtonCons.setWidth(Spring.constant(xWidth));
parButtonCons.setHeight(Spring.constant(yHeight));
// ==================================================================
// End of runPanel
// ==================================================================
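As a side note, the same layout can also be expressed with relative constraints via SpringLayout.putConstraint, which keeps components attached to their neighbors instead of hard-coding x/y constants; a minimal sketch using the components above:
// Sketch: runButton hangs off iatCheckBoxPanel; parButton hangs off runButton.
runPanelLayout.putConstraint(SpringLayout.WEST, runButton,
        spacer, SpringLayout.EAST, iatCheckBoxPanel);
runPanelLayout.putConstraint(SpringLayout.NORTH, runButton,
        spacer, SpringLayout.NORTH, runPanel);
runPanelLayout.putConstraint(SpringLayout.WEST, parButton,
        spacer, SpringLayout.EAST, runButton);
runPanelLayout.putConstraint(SpringLayout.NORTH, parButton,
        spacer, SpringLayout.NORTH, runPanel);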
fcstDiff summary
1. Utility location:
cardinal:
export ndate_dir=/usr/local/jcsda/nwprod_gdas_2014/util/exec
export copygb_dir=/usr/local/jcsda/nwprod_gdas_2014/util/exec
export gribmap_dir=/opt/grads/2.0.2-intel-14.0-2/bin
zeus:
export ndate_dir=/scratch2/portfolios/NCEPDEV/global/save/glopara/nwpara/util/exec
export copygb_dir=/scratch2/portfolios/NCEPDEV/global/save/glopara/nwpara/util/exec
export gribmap_dir=/apps/grads/2.0.1a/bin
2. Modules required to run
zeus:
$ module load grads/2.0.1a # grads
$ module load intel/12-12.0.4.191 # copygb
Note: These modules must be loaded before running the fcstDiff package; otherwise you are
going to see many scary error messages about shared libraries and about missing data in the *.ctl and *.gs files.
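A small pre-flight sketch along those lines; the binary names ndate, copygb and gribmap under the directories above are my assumption:
module load grads/2.0.1a
module load intel/12-12.0.4.191
# Fail early if any required utility is missing or not executable.
for exe in "$ndate_dir/ndate" "$copygb_dir/copygb" "$gribmap_dir/gribmap"; do
  [ -x "$exe" ] || { echo "missing utility: $exe" >&2; exit 1; }
done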
To be continued ...
Thursday, October 23, 2014
How to permanently rotate pdf 90 deg?
See original post here.
Open the file you want rotated (even if it is 1000 pages, all in the wrong direction). Go to Document / Rotate Pages... or use Ctrl+Shift+R; this opens the rotation menu.
You have several rotation options to rotate single pages, all pages, or a selection of pages. Choose what you need and select OK to proceed.
Now you have two options:
1. You will notice the save icon is no longer grayed out, so you can permanently save the file.
2. You can Save As... a new file, should you want to keep the original intact.
Both options permanently save your chosen rotation(s).
This has been built into the software for quite some time but is often overlooked, understandably so, as it's not immediately evident.
Monday, October 20, 2014
vsdb summary 1
1. Compile vsdb package
$ cd /data/users/dxu/vsdb_pkg/vsdb_v17
$ ./build.sh
2. Configure two shell scripts:
# setup_envs.sh
# vsdbjob_submit.sh
$ cd /data/users/dxu/vsdb_pkg/vsdb_v17
$ vi setup_envs.sh
$ vi vsdbjob_submit.sh
3. Run VSDB
$ cd /data/users/dxu/vsdb_pkg/vsdb_v17
$ ./vsdbjob_submit.sh
4. Input, output, log and run dir
steps | run dir
1     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy26072/stats
2     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy47090/acrms47090
3     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy19091/mkup_precip
4     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy22490/plot_pcp
5     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy32596/fit
6     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy20273/2dmaps
steps | log file
1     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy26072/vstep1.out
2     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy47090/vstep2.out
3     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy19091/mkup_rain_stat.out
4     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy22490/plot_pcp.out
5     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy32596/fit2obs.out
6     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy20273/2dmaps[1-4].out
steps | in dir
1     | /data/users/dxu/vsdb_workspace/data/input/fcst_data
2     | /data/users/dxu/vsdb_workspace/data/output/vsdb_data
3     | /data/users/dxu/vsdb_workspace/data/input/fcst_data
4     | /data/users/dxu/vsdb_workspace/data/intermediate/dxu/archive
5     | /data/users/dxu/vsdb_workspace/data/input/f2o
6     | /data/users/dxu/vsdb_workspace/data/input/fcst_data
      | /data/users/dxu/vsdb_workspace/data/input/plot2d/obdata
steps | out dir
1     | /data/users/dxu/vsdb_workspace/data/output/vsdb_data
2     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy44333/acrms44333/G2/anom/HGT (score text files); /data/users/dxu/vsdb_workspace/data/stmp/dxu/web/allmodel (PNG)
3     | /data/users/dxu/vsdb_workspace/data/intermediate/dxu/archive
4     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/web/rain (PNG)
5     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/web/fits (GIF)
6     | /data/users/dxu/vsdb_workspace/data/stmp/dxu/web/2D/d[1-4] (GIF)
5. Scripts that contain sub_cardinal
export SUBJOB=$vsdbhome/bin/sub_cardinal
File 1 is : ./map_util/sfcfcst_1cyc.sh
File 2 is : ./map_util/allcenters_rmsmap.sh
File 3 is : ./map_util/allcenters_1cyc.sh
File 4 is : ./setup_envs.sh
File 5 is : ./vsdbjob.sh
File 6 is : ./precip/plot_pcp.sh
File 7 is : ./precip/precip_score_vsdb.sh
File 8 is : ./fit2obs/plotall.sh
File 9 is : ./fit2obs/fit2obs.sh
File 10 is : ./plot2d/maps2d_new.sh
File 11 is : ./vsdbjob_submit.sh
File 12 is : ./grid2obs/grid2obs_plot.sh
File 13 is : ./grid2obs/grid2obs_driver.sh
File 14 is : ./grid2obs/scripts/get_opsgfs_data.sh
File 15 is : ./grid2obs/scripts/get_paragfs_data.sh
File 16 is : ./grid2obs/scripts/g2o_sfcmap.sh
File 17 is : ./grid2obs/scripts/grid2obssfc.fits.sh
File 18 is : ./grid2obs/scripts/g2o_airmap.sh
File 19 is : ./grid2obs/grid2obs_opsdaily.sh
File 20 is : ./grid2obs/grid2obs.sh
File 21 is : ./verify_exp_step2.sh
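The list above reads like the output of a recursive grep; a hypothetical one-liner to regenerate it from the vsdb_v17 root (the exact command is my assumption):
grep -rl 'SUBJOB' --include='*.sh' . | awk '{printf "File %d is : %s\n", NR, $0}'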
6. All options used in SUBJOB
1) Flags used in SUBJOB
$SUBJOB -a $ACCOUNT -q $CUE2FTP -g $GROUP -p 1/1/S -t 1:00:00 -r 128/1 -j ftpcard -o ftpcard$$.out ${rundir}/ftpcard$$.sh
-e : ENV variable list
$SUBJOB -e $listvar -a $task -q $cue -g $GROUP -p 1/1/S -r 512/1 -t 3:00:00 -o ...
$SUBJOB -a $ACCOUNT -q $CUE2FTP -g $GROUP -p 1/1/S -r 256/1 -w +${waitfits} -t 1:00:00 -j ftpfits -o $mapdir/ftpfits.out ${mapdir}/ftpfits.sh
2) Original meaning of the flags.
where the options are:
-a account    account (default: none)
-e envars     copy comma-separated environment variables
-g group      group name
-j jobname    specify jobname (default: executable basename)
-n            write command file to stdout rather than submitting it
-o output     specify output file (default: jobname.out)
-p procs[/nodes]
              number of MPI tasks and number of nodes
-q queue      queue name
-r nodetype   node type (harp or neha)
-v            verbose mode
-t timew      wall time limit in [[hh:]mm:]ss format (default: 900)
-w when       when to run, in yyyymmddhh[mm], +hh[mm], thh[mm], or
              Thh[mm] (full, incremental, today or tomorrow) format
              (default: now)
3) sub_badger is a wrapper that translates the options above into options the scheduler on badger recognizes.
on badger:
qsub -V : pass all the ENV variables.
4) sub_cardinal is a wrapper that translates the options above into options the scheduler on cardinal recognizes.
on cardinal:
sbatch --export=<environment variables | ALL | NONE>
Identify which environment variables are propagated to the batch job.
Multiple environment variable names should be comma separated. Environment
variable names may be specified to propagate the current value of those
variables (e.g. "--export=EDITOR") or specific values for the variables may
be exported (e.g. "--export=EDITOR=/bin/vi"). This option is particularly
important for jobs that are submitted on one cluster and execute on a
different cluster (e.g. with different paths). By default all environment
variables are propagated. If the argument is NONE or specific environment
variable names, then the --get-user-env option will implicitly be set to
load other environment variables based upon the user's configuration on
the cluster which executes the job.
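To make the translation concrete, here is a rough sketch of what such a wrapper does; the flag handling and defaults are my assumptions, not the real sub_cardinal:
#!/bin/bash
# Hypothetical sub_* wrapper sketch: map the generic flags above onto sbatch.
listvar=""; account=""; queue=""; jobname="job"; output=""
while getopts "e:a:q:g:p:r:t:j:o:w:" opt; do
  case $opt in
    e) listvar=$OPTARG ;;
    a) account=$OPTARG ;;
    q) queue=$OPTARG ;;
    j) jobname=$OPTARG ;;
    o) output=$OPTARG ;;
    *) : ;;   # -g/-p/-r/-t/-w handling omitted in this sketch
  esac
done
shift $((OPTIND-1))
sbatch --export="${listvar:-ALL}" -A "$account" -p "$queue" \
       -J "$jobname" -o "${output:-$jobname.out}" "$@"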
7. score card related
File 1 is : ./run_scorecard.sh
file3=${scoredir}/score_${stat}_conflimit_${namedaily}_${mdnamec2}_day${dd}
File 2 is : ./map_util/allcenters_rmsmap.sh
'${vsdbhome}/map_util/grads/fprintf.gs 'tv.n' score_${var}_conflimit_'
cp score_${var}_conflimit*.txt $scoredir
File 3 is : ./map_util/allcenters_1cyc.sh
'${vsdbhome}/map_util/grads/fprintf.gs 'tv.n' score_cor_conflimit_'${n.
1) Create scorecard text files:
./vsdbjob_submit.sh
--> ./verify_exp_step2.sh
--> ./map_util/allcenters_rmsmap.sh and ./map_util/allcenters_1cyc.sh
A) anomaly correlation on single pressure layer
$SUBJOB -e $listvar -a $ACCOUNT -q $CUE2RUN -g $GROUP -p $penode -t $cputime -j HGTanom${narea} -o $rundir/HGT_anom.out \
${sorcdir}/allcenters_1cyc.sh $edate $ndays $fdays
if [ $? -ne 0 ]; then ${sorcdir}/allcenters_1cyc.sh $edate $ndays $fdays ; fi
sleep 3
B) rms and bias
$SUBJOB -e $listvar -a $ACCOUNT -q $CUE2RUN -g $GROUP -p $penode -t $cputime -j HGTpres${narea} -o $rundir/HGT_pres.out \
${sorcdir}/allcenters_rmsmap.sh $edate $ndays $fdays
if [ $? -ne 0 ]; then ${sorcdir}/allcenters_rmsmap.sh $edate $ndays $fdays ; fi
sleep 3
2) Location of the scorecard files
Details of ./verify_exp_step2.sh
step 1: move vsdb status files to /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/vsdb_data
step 2: 3 variable types: anom, pres, sfc
step 3: 5 regions: G2NHX, G2SHX, G2TRO, G2, G2PNA
Eg:
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2NHX
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2SHX
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2TRO
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2PNA
step 4: Each region has two types: pres & anom
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2NHX
├── anom
│   ├── HGT
│   ├── HGTWV0-3
│   ├── HGTWV10-20
│   ├── HGTWV4-9
│   ├── PMSL
│   ├── T
│   ├── U
│   ├── V
│   └── WIND
└── pres
    ├── HGT
    ├── O3
    ├── T
    ├── U
    ├── V
    └── WIND
A#) anomaly correlation on single pressure layer:
vtype=anom + map_util/allcenters_1cyc.sh
Level                    Parameters
-----                    ----------
"P1000 P700 P500 P250"   "HGT HGT_WV1/0-3 HGT_WV1/4-9 HGT_WV1/10-20"
"P850 P500 P250"         "WIND U V T"
"MSL"                    "PMSL"
log files:
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2NHX
PMSL_anom.out
UVT_anom.out   # combines U, V and T
HGT_anom.out   # combines all HGT, including layered HGT
B#) rms and bias:
vtype=pres + map_util/allcenters_rmsmap.sh
Level                                    Parameters
-----                                    ----------
if maptop = "10":
"P1000 P925 P850 P700 P500 P400
 P300 P250 P200 P150 P100 P50            "HGT WIND U V T"
 P20 P10"
if maptop = "50":
"P1000 P925 P850 P700 P500 P400          "HGT WIND U V T"
 P300 P250 P200 P150 P100 P50"
if maptop = "100":
"P1000 P925 P850 P700 P500 P400          "HGT WIND U V T"
 P300 P250 P200 P150 P100"
"P100 P70 P50 P30 P20 P10"               "O3"
log files:
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2NHX
O3_pres.out
HGT_pres.out
WIND_pres.out
U_pres.out
V_pres.out
T_pres.out
step 5: Process "sfc" separately.
vtype=sfc + map_util/sfcfcst_1cyc.sh
reglist="G2 G2/NHX G2/SHX G2/TRO G2/N60 G2/S60 G2/NPO G2/SPO G2/NAO G2/SAO G2/CAM G2/NSA"
Level       Parameters
-----       ----------
"SL1L2"     "CAPE CWAT PWAT HGTTRP"
            "TMPTRP HPBL PSFC PSL"
            "RH2m SPFH2m T2m TOZNE TG"
            "U10m V10m WEASD TSOILT WSOILT"
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/sfc
└── sfc
    ├── CAPE
    ├── CWAT
    ├── HGTTRP
    ├── HPBL
    ├── PSFC
    ├── PSL
    ├── PWAT
    ├── RH2m
    ├── SPFH2m
    ├── T2m
    ├── TG
    ├── TMPTRP
    ├── TOZNE
    ├── TSOILT
    ├── U10m
    ├── V10m
    ├── WEASD
    └── WSOILT
3) Final html output files
eg: output location: /scratch2/portfolios/NESDIS/h-sandy/noscrub/Deyong.Xu/vsdb_workspace/data/stmp/Deyong.Xu/nwpvrfy29853/acrms29853/score
scorecard.html
scorecard.css
mainindex.html
legend.html
8. How VSDB status files are used?
verify_exp_step2.sh
--> map_util/allcenters_1cyc.sh & map_util/allcenters_rmsmap.sh
--> map_util/gen_scal.sh
map_util/gen_scal_pres.sh
map_util/gen_wind.sh
map_util/gen_wind_pres.sh
map_util/gen_sfc.sh
Details of "./gen_scal.sh"
1) convert vsdb file into txt file
while [ $cdate -le $edate ]; do
  fhour=00; vhr=$cyc
  while [ $fhour -le $vlength ]; do
    datadir=${vsdb_data}/${vtype}/${vhr}Z/${model}
    vsdbname=${datadir}/${model}_${cdate}.vsdb           ### vsdb files
    string=" $mdl $fhour ${cdate}${vhr} $mdl $reg SAL1L2 $vnam $lev "
    mycheck=$( grep "$string" $vsdbname )
    if [ $? -ne 0 ]; then
      echo "missing" >>${outname}.txt
    else
      grep "$string" $vsdbname >>${outname}.txt          ### grep the data out and append it to the text file
    fi
    fhour=` expr $fhour + $fhout `
    if [ $fhour -lt 10 ]; then fhour=0$fhour ; fi
    vhr=` expr $vhr + $fhout `
    if [ $vhr -ge 24 ]; then vhr=`expr $vhr - 24 `; fi
    if [ $vhr -lt 10 ]; then vhr=0$vhr ; fi
  done
2) convert txt file into binary file via convert.f, which is generated on the fly
(within a gen* script such as gen_scal.sh)
open(9,file="modelname.txt",form="formatted",status="old")
open(10,file="${outname}.txt",form="formatted",status="old") ## text file generated above
open(11,file="tmp.txt",form="formatted",status="new")
open(20,file="${outname}.bin",form="unformatted",status="new") ## output binary file
$FC $FFLAG -o convert.x convert.f
./convert.x
meantxt=${vnam1}_${lev}_${reg1}_${yyyymm}
mv fort.13 meancor_${meantxt}.txt
mv fort.14 meanrms_${meantxt}.txt
mv fort.15 meanbias_${meantxt}.txt
3) create grads control file
cat >${outname}.ctl <<EOF1
dset ^${outname}.bin ## binary file generated above
undef -99.9
options big_endian sequential
title scores
xdef $nmdcyc linear 1 1
4) output ctl files
HGT_P1000_G2NHX_2014020120140228.ctl
HGT_P700_G2NHX_2014020120140228.ctl
HGT_P500_G2NHX_2014020120140228.ctl
HGT_P250_G2NHX_2014020120140228.ctl
Details of "allcenters_1cyc.sh"
0) # -- search data for all models; write out binary data, create grads control file
gen_wind.sh
gen_scal.sh
1) # ----- PLOT TYPE 1: time series of anomaly correlations ----
cat >acz_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname}.ctl'
mdc.1=${mdnamec[0]}
* Create verification scorecard text files ( score card files )
'${vsdbhome}/map_util/grads/fprintf.gs 'sc.i' score_cor_'${namedaily}'_'mdc.i'_day'%day'.txt %-7.6f' ### GOLD
...
$GRADSBIN/grads -bcp "run acz_${outname1}.gs"
2) # ----- PLOT TYPE 2: Die-off plot for mean correlation over $ndays days----
ndaysp1=`expr $ndays + 1`
fdaysp1=`expr $fdays + 1`
cat >cordieoff_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname}.ctl'
** Create verification scorecard text files ( conflimit files between models )
'${vsdbhome}/map_util/grads/fprintf.gs 'tv.n' score_cor_conflimit_'${namedaily}'_'mdc.i'_day'n-1'.txt %-7.6f' ## GOLD: generate score_cor_conflimit* files needed to run score card.
...
$GRADSBIN/grads -bcp "run cordieoff_${outname1}.gs"
3) # ----- PLOT TYPE 3: difference of AC, other models minus first model ----
cat >cordiff_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'run $sorcdir/grads/white.gs'
'open ${outname}.ctl'
...
$GRADSBIN/grads -bcp "run cordiff_${outname1}.gs"
4) # ----- PLOT TYPE 4: frequency distribution of anomaly correlations ----
ndayfq=$ndays
if [ $ndayfq -gt 20 ]; then ndayfq=20; fi
nday05=`expr $ndayfq \/ 2 `
cat >freq_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname}.ctl'
...
$GRADSBIN/grads -bcp "run freq_${outname1}.gs"
5) output grads script (.gs) files
freq_HGT_P700_G2NHX_2014020120140228.gs
freq_HGT_P250_G2NHX_2014020120140228.gs
freq_HGT_P500_G2NHX_2014020120140228.gs
freq_HGT_P1000_G2NHX_2014020120140228.gs
cordiff_HGT_P700_G2NHX_2014020120140228.gs
cordiff_HGT_P250_G2NHX_2014020120140228.gs
cordiff_HGT_P500_G2NHX_2014020120140228.gs
cordiff_HGT_P1000_G2NHX_2014020120140228.gs
acz_HGT_P700_G2NHX_2014020120140228.gs
acz_HGT_P250_G2NHX_2014020120140228.gs
acz_HGT_P500_G2NHX_2014020120140228.gs
acz_HGT_P1000_G2NHX_2014020120140228.gs
cordieoff_HGT_P700_G2NHX_2014020120140228.gs
cordieoff_HGT_P250_G2NHX_2014020120140228.gs
cordieoff_HGT_P500_G2NHX_2014020120140228.gs
cordieoff_HGT_P1000_G2NHX_2014020120140228.gs
6) Sample score files needed to run scorecard.
[dxu@s4-cardinal HGT]$ pwd
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy_deyong/acrms_deyong/G2NHX/anom/HGT
[dxu@s4-cardinal HGT]$ tree -f | grep score_cor | grep 250
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day1.txt # for 2nd model only
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day2.txt
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day3.txt
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day4.txt
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day5.txt
├── ./score_cor_HGT_P250_G2NHX_ECM_day1.txt
├── ./score_cor_HGT_P250_G2NHX_ECM_day3.txt
├── ./score_cor_HGT_P250_G2NHX_ECM_day5.txt
├── ./score_cor_HGT_P250_G2NHX_GFS_day1.txt # Reference model
├── ./score_cor_HGT_P250_G2NHX_GFS_day3.txt
├── ./score_cor_HGT_P250_G2NHX_GFS_day5.txt
Details of "allcenters_rmsmap.sh"
varlist="rms bias pcor emd epv rsd msess"
0) # -- search data for all models; write out binary data, create grads control file
gen_wind_pres.sh
gen_scal_pres.sh
# ----- PLOT TYPE 1: maps of $var as a function of calendar day and pressure for each forecast time ----
cat >${var}p_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'run $sorcdir/grads/white.gs'
'open ${outname1}.ctl'
...
EOF1
$GRADSBIN/grads -bcp "run ${var}p_${outname1}.gs"
eg:
rmsp_WIND_G2NHX_2014020120140228.gs
biasp_WIND_G2NHX_2014020120140228.gs
pcorp_WIND_G2NHX_2014020120140228.gs
emdp_WIND_G2NHX_2014020120140228.gs
epvp_WIND_G2NHX_2014020120140228.gs
rsdp_WIND_G2NHX_2014020120140228.gs
msessp_WIND_G2NHX_2014020120140228.gs
# ----- PLOT TYPE 2: maps of mean ${var} as a function of forecast time and pressure ----
ndaysp1=`expr $ndays + 1`
fdaysp1=`expr $fdays + 1`
cat >${var}pmean_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'run $sorcdir/grads/white.gs'
'open ${outname1}.ctl'
...
EOF1
$GRADSBIN/grads -bcp "run ${var}pmean_${outname1}.gs"
eg:
rmspmean_WIND_G2NHX_2014020120140228.gs
biaspmean_WIND_G2NHX_2014020120140228.gs
pcorpmean_WIND_G2NHX_2014020120140228.gs
emdpmean_WIND_G2NHX_2014020120140228.gs
epvpmean_WIND_G2NHX_2014020120140228.gs
rsdpmean_WIND_G2NHX_2014020120140228.gs
msesspmean_WIND_G2NHX_2014020120140228.gs
# ----- PLOT TYPE 3: time series of ${var} errors ----
cat >${var}_${outname1}${lev}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname1}.ctl'
'${vsdbhome}/map_util/grads/fprintf.gs 'sc.i' score_${var}_${namedaily1}_'mdc.i'_day'%day'.txt %-7.6f' ## GOLD: generate single score file for each model
...
EOF1
$GRADSBIN/grads -bcp "run ${var}_${outname1}${lev}.gs"
eg:
rsddieoff_WIND_G2NHX_2014020120140228P200.gs
rsddieoff_WIND_G2NHX_2014020120140228P10.gs
rsddieoff_WIND_G2NHX_2014020120140228P100.gs
# ----- PLOT TYPE 4: mean ${var} error growth curve over $ndays days ----
ndaysp1=`expr $ndays + 1`
fdaysp1=`expr $fdays + 1`
cat >${var}dieoff_${outname1}${lev}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname1}.ctl'
'${vsdbhome}/map_util/grads/fprintf.gs 'tv.n' score_${var}_conflimit_'${namedaily1}'_'mdc.i'_day'n-1'.txt %-7.6f' ## GOLD *conflimit*
...
EOF1
$GRADSBIN/grads -bcp "run ${var}dieoff_${outname1}${lev}.gs"
eg:
rsddieoff_WIND_G2NHX_2014020120140228P10.gs
rsddieoff_WIND_G2NHX_2014020120140228P100.gs
msessdieoff_WIND_G2NHX_2014020120140228P850.gs
msessdieoff_WIND_G2NHX_2014020120140228P700.gs
log files:
/scratch2/portfolios/NESDIS/h-sandy/noscrub/Deyong.Xu/vsdb_workspace/data/stmp/Deyong.Xu/nwpvrfy774/acrms774/G2NHX/pres/WIND
./score_rsd_WIND_P1000_G2NHX_GFS_day3.txt
./score_epv_WIND_P500_G2NHX_GFS_day3.txt
./score_pcor_WIND_P700_G2NHX_ECM_day5.txt
./score_pcor_WIND_P50_G2NHX_ECM_day5.txt
9. Region control / distribution
Go to /scratch2/portfolios/NESDIS/h-sandy/noscrub/Deyong.Xu/vsdb_pkg/vsdb_v17/exe
on zeus, run the following commands, and look for the lines attached below.
cntl_sfc.sh seems to be what you're trying to find.
$ vi cntl_sfc.sh
12 ${gd}
(the four numbers below appear to be region bounds: west lon, south lat, east lon, north lat)
${gd}/NHX 0 20 360 80
${gd}/SHX 0 -80 360 -20
${gd}/TRO 0 -20 360 20
${gd}/N60 0 60 360 90
${gd}/S60 0 -90 360 -60
${gd}/NPO
${gd}/SPO
${gd}/NAO
${gd}/SAO
${gd}/CAM
${gd}/NSA
$ vi cntl_pres.sh
5 ${gd}
${gd}/NHX 0 20 360 80
${gd}/SHX 0 -80 360 -20
${gd}/TRO 0 -20 360 20
${gd}/PNA 180 20 320 75
$ vi cntl_anom.sh
5 ${gd}
${gd}/NHX 0 20 360 80
${gd}/SHX 0 -80 360 -20
${gd}/TRO 0 -20 360 20
${gd}/PNA 180 20 320 75
10. Account related issue
On some platforms, such as zeus, the account setting matters because not all accounts are allowed to submit jobs. The account is passed as one of the job-submission parameters.
For zeus, I set it to "h-sandy":
export ACCOUNT=h-sandy