$ cd /data/users/dxu/vsdb_pkg/vsdb_v17
$ ./build.sh
2. Configure two shell scripts:
# setup_envs.sh
# vsdbjob_submit.sh
$ cd /data/users/dxu/vsdb_pkg/vsdb_v17
$ vi setup_envs.sh
$ vi vsdbjob_submit.sh
3. Run VSDB
$ cd /data/users/dxu/vsdb_pkg/vsdb_v17
$ ./vsdbjob_submit.sh
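The driver can take a while to finish. One option (not part of the package's instructions, just a convenience) is to launch it in the background and follow its log:
$ cd /data/users/dxu/vsdb_pkg/vsdb_v17
$ nohup ./vsdbjob_submit.sh > vsdbjob_submit.log 2>&1 &
$ tail -f vsdbjob_submit.log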
4. Input, output, log and run directories

steps | run dir
1 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy26072/stats
2 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy47090/acrms47090
3 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy19091/mkup_precip
4 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy22490/plot_pcp
5 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy32596/fit
6 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy20273/2dmaps

steps | log file
1 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy26072/vstep1.out
2 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy47090/vstep2.out
3 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy19091/mkup_rain_stat.out
4 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy22490/plot_pcp.out
5 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy32596/fit2obs.out
6 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy20273/2dmaps[1-4].out

steps | in dir
1 | /data/users/dxu/vsdb_workspace/data/input/fcst_data
2 | /data/users/dxu/vsdb_workspace/data/output/vsdb_data
3 | /data/users/dxu/vsdb_workspace/data/input/fcst_data
4 | /data/users/dxu/vsdb_workspace/data/intermediate/dxu/archive
5 | /data/users/dxu/vsdb_workspace/data/input/f2o
6 | /data/users/dxu/vsdb_workspace/data/input/fcst_data and /data/users/dxu/vsdb_workspace/data/input/plot2d/obdata

steps | out dir
1 | /data/users/dxu/vsdb_workspace/data/output/vsdb_data
2 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy44333/acrms44333/G2/anom/HGT (score text files) and /data/users/dxu/vsdb_workspace/data/stmp/dxu/web/allmodel (PNG)
3 | /data/users/dxu/vsdb_workspace/data/intermediate/dxu/archive
4 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/web/rain (PNG)
5 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/web/fits (GIF)
6 | /data/users/dxu/vsdb_workspace/data/stmp/dxu/web/2D/d[1-4] (GIF)
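A quick sanity check (a sketch, using the paths in the tables above) to confirm each step produced a log and to scan it for obvious failures:
# Run IDs such as nwpvrfy26072 change from run to run, hence the wildcard.
for log in /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy*/*.out; do
  echo "== $log =="
  grep -iE 'error|fail' "$log" | tail -n 5
done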
5. Scripts that contain sub_cardinal (see the sketch after this list for a way to regenerate it)
export SUBJOB=$vsdbhome/bin/sub_cardinal
File 1 is : ./map_util/sfcfcst_1cyc.sh
File 2 is : ./map_util/allcenters_rmsmap.sh
File 3 is : ./map_util/allcenters_1cyc.sh
File 4 is : ./setup_envs.sh
File 5 is : ./vsdbjob.sh
File 6 is : ./precip/plot_pcp.sh
File 7 is : ./precip/precip_score_vsdb.sh
File 8 is : ./fit2obs/plotall.sh
File 9 is : ./fit2obs/fit2obs.sh
File 10 is : ./plot2d/maps2d_new.sh
File 11 is : ./vsdbjob_submit.sh
File 12 is : ./grid2obs/grid2obs_plot.sh
File 13 is : ./grid2obs/grid2obs_driver.sh
File 14 is : ./grid2obs/scripts/get_opsgfs_data.sh
File 15 is : ./grid2obs/scripts/get_paragfs_data.sh
File 16 is : ./grid2obs/scripts/g2o_sfcmap.sh
File 17 is : ./grid2obs/scripts/grid2obssfc.fits.sh
File 18 is : ./grid2obs/scripts/g2o_airmap.sh
File 19 is : ./grid2obs/grid2obs_opsdaily.sh
File 20 is : ./grid2obs/grid2obs.sh
File 21 is : ./verify_exp_step2.sh
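The list above can be regenerated with grep; a minimal sketch (this loop is not part of the package):
cd /data/users/dxu/vsdb_pkg/vsdb_v17
i=1
for f in $(grep -rl 'SUBJOB' --include='*.sh' .); do
  echo "File $i is : $f"
  i=$((i + 1))
done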
6. All options used in SUBJOB
1) Flags used in SUBJOB
$SUBJOB -a $ACCOUNT -q $CUE2FTP -g $GROUP -p 1/1/S -t 1:00:00 -r 128/1 -j ftpcard -o ftpcard$$.out ${rundir}/ftpcard$$.sh
-e : ENV variable list
$SUBJOB -e $listvar -a $task -q $cue -g $GROUP -p 1/1/S -r 512/1 -t 3:00:00 -o ...
$SUBJOB -a $ACCOUNT -q $CUE2FTP -g $GROUP -p 1/1/S -r 256/1 -w +${waitfits} -t 1:00:00 -j ftpfits -o $mapdir/ftpfits.out ${mapdir}/ftpfits.sh
2) Original meaning of flags.
where the options are:
-a account account (default:none)
-e envars copy comma-separated environment variables
-g group group name
-j jobname specify jobname (default: executable basename)
-n write command file to stdout rather than submitting it
-o output specify output file (default: jobname.out)
-p procs[/nodes]
number of MPI tasks and number of nodes
-q queue queue name
-r nodetype node type (harp or neha)
-v verbose mode
-t timew wall time limit in [[hh:]mm:]ss format (default: 900)
-w when when to run, in yyyymmddhh[mm], +hh[mm], thh[mm], or
Thh[mm] (full, incremental, today or tomorrow) format
(default: now)
3) sub_badger is a wrapper that translates the options above into options the scheduler on badger can recognize.
on badger:
qsub -V : pass all the ENV variables.
4) sub_cardinal is a wrapper that translates the options above into options the scheduler on cardinal can recognize.
on cardinal:
sbatch --export=<environment variables | ALL | NONE>
Identify which environment variables are propagated to the batch job. Multiple environment variable names should be comma separated. Environment variable names may be specified to propagate the current value of those variables (e.g. "--export=EDITOR") or specific values for the variables may be exported (e.g. "--export=EDITOR=/bin/vi"). This option is particularly important for jobs that are submitted on one cluster and execute on a different cluster (e.g. with different paths). By default all environment variables are propagated. If the argument is NONE or specific environment variable names, then the --get-user-env option will implicitly be set to load other environment variables based upon the user's configuration on the cluster which executes the job.
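As a rough illustration of what such a wrapper does, here is a heavily simplified, hypothetical sketch that maps the generic options above onto sbatch options; it is not the actual sub_cardinal shipped with VSDB (-g, -n, -p, -r and -v are ignored, and the -w time formats above differ from sbatch's --begin syntax):
#!/bin/bash
# Hypothetical, simplified stand-in for sub_cardinal: translate the generic
# options into an sbatch command line.
envars=ALL; account=; queue=; jobname=; output=; walltime=; when=
while getopts a:e:g:j:no:p:q:r:t:vw: opt; do
  case $opt in
    a) account=$OPTARG ;;     # -a account  -> sbatch -A
    e) envars=$OPTARG ;;      # -e envars   -> sbatch --export=
    j) jobname=$OPTARG ;;     # -j jobname  -> sbatch -J
    o) output=$OPTARG ;;      # -o output   -> sbatch -o
    q) queue=$OPTARG ;;       # -q queue    -> sbatch -p (partition)
    t) walltime=$OPTARG ;;    # -t timew    -> sbatch -t
    w) when=$OPTARG ;;        # -w when     -> sbatch --begin=
    *) ;;                     # -g, -n, -p, -r, -v ignored in this sketch
  esac
done
shift $((OPTIND - 1))
sbatch --export="$envars" \
       ${account:+-A "$account"} ${queue:+-p "$queue"} \
       ${jobname:+-J "$jobname"} ${output:+-o "$output"} \
       ${walltime:+-t "$walltime"} ${when:+--begin="$when"} \
       "$@"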
7. Scorecard-related files
File 1 is : ./run_scorecard.sh
file3=${scoredir}/score_${stat}_conflimit_${namedaily}_${mdnamec2}_day${dd}
File 2 is : ./map_util/allcenters_rmsmap.sh
'${vsdbhome}/map_util/grads/fprintf.gs 'tv.n' score_${var}_conflimit_'
cp score_${var}_conflimit*.txt $scoredir
File 3 is : ./map_util/allcenters_1cyc.sh
'${vsdbhome}/map_util/grads/fprintf.gs 'tv.n' score_cor_conflimit_'${n.
1) Create scorecard text files:
./vsdbjob_submit.sh
--> ./verify_exp_step2.sh
--> ./map_util/allcenters_rmsmap.sh and ./map_util/allcenters_1cyc.sh
A) anomaly correlation on single pressure layer
$SUBJOB -e $listvar -a $ACCOUNT -q $CUE2RUN -g $GROUP -p $penode -t $cputime -j HGTanom${narea} -o $rundir/HGT_anom.out \
${sorcdir}/allcenters_1cyc.sh $edate $ndays $fdays
if [ $? -ne 0 ]; then ${sorcdir}/allcenters_1cyc.sh $edate $ndays $fdays ; fi
sleep 3
B) rms and bias
$SUBJOB -e $listvar -a $ACCOUNT -q $CUE2RUN -g $GROUP -p $penode -t $cputime -j HGTpres${narea} -o $rundir/HGT_pres.out \
${sorcdir}/allcenters_rmsmap.sh $edate $ndays $fdays
if [ $? -ne 0 ]; then ${sorcdir}/allcenters_rmsmap.sh $edate $ndays $fdays ; fi
sleep 3
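Both submissions above follow the same pattern: submit with $SUBJOB, and if the submit command itself returns non-zero, run the script directly in the foreground. A minimal generalized sketch (the submit_or_run helper is hypothetical, not part of VSDB; it reuses the variables shown above):
# Hypothetical helper illustrating the submit-then-fall-back pattern above.
submit_or_run() {
  # $1 = job name, $2 = log file, remaining args = script and its arguments
  local jobname=$1 logfile=$2; shift 2
  $SUBJOB -e $listvar -a $ACCOUNT -q $CUE2RUN -g $GROUP -p $penode \
          -t $cputime -j "$jobname" -o "$logfile" "$@"
  if [ $? -ne 0 ]; then
    "$@"      # submission failed; run the script inline instead
  fi
  sleep 3
}
# e.g. (same arguments as case A above):
# submit_or_run HGTanom${narea} $rundir/HGT_anom.out ${sorcdir}/allcenters_1cyc.sh $edate $ndays $fdays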
2) Location of scorecard files
Details of ./verify_exp_step2.sh
step 1: move VSDB status files to /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/vsdb_data
step 2: 3 variable types: anom, pres, sfc
step 3: 5 regions: G2NHX, G2SHX, G2TRO, G2, G2PNA
Eg:
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2NHX
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2SHX
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2TRO
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2PNA
step 4: Each region has two types: pres & anom
e.g. /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2NHX
├── anom
│   ├── HGT
│   ├── HGTWV0-3
│   ├── HGTWV10-20
│   ├── HGTWV4-9
│   ├── PMSL
│   ├── T
│   ├── U
│   ├── V
│   └── WIND
└── pres
    ├── HGT
    ├── O3
    ├── T
    ├── U
    ├── V
    └── WIND
A#) anomaly correlation on single pressure layer:
vtype=anom + map_util/allcenters_1cyc.sh
------- ------------
Level Parameters
------- ------------
"P1000 P700 P500 P250" "HGT HGT_WV1/0-3 HGT_WV1/4-9 HGT_WV1/10-20"
"P850 P500 P250" "WIND U V T"
"MSL" "PMSL"
log files:
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2NHX
PMSL_anom.out
UVT_anom.out # combine U, V and T
HGT_anom.out # combine all HGT including layered HGT.
B#) rms and bias :
vtype=pres + map_util/allcenters_rmsmap.sh
------- ------------
Level Parameters
------- ------------
if maptop = "10"
"P1000 P925 P850 P700 P500 P400 P300 P250 P200 P150 P100 P50 P20 P10" "HGT WIND U V T"
if maptop = "50"
"P1000 P925 P850 P700 P500 P400 P300 P250 P200 P150 P100 P50" "HGT WIND U V T"
if maptop = "100"
"P1000 P925 P850 P700 P500 P400 P300 P250 P200 P150 P100" "HGT WIND U V T"
"P100 P70 P50 P30 P20 P10" : "O3"
log files:
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/G2NHX
O3_pres.out
HGT_pres.out
WIND_pres.out
U_pres.out
V_pres.out
T_pres.out
step 5: Process "sfc" separately.
vtype=sfc + map_util/sfcfcst_1cyc.sh
reglist="G2 G2/NHX G2/SHX G2/TRO G2/N60 G2/S60 G2/NPO G2/SPO G2/NAO G2/SAO G2/CAM G2/NSA"
------- ------------
Level Parameters
------- ------------
"SL1L2" "CAPE CWAT PWAT HGTTRP"
"TMPTRP HPBL PSFC PSL"
"RH2m SPFH2m T2m TOZNE TG"
"U10m V10m WEASD TSOILT WSOILT"
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy9457/acrms9457/sfc
└── sfc
├── CAPE
├── CWAT
├── HGTTRP
├── HPBL
├── PSFC
├── PSL
├── PWAT
├── RH2m
├── SPFH2m
├── T2m
├── TG
├── TMPTRP
├── TOZNE
├── TSOILT
├── U10m
├── V10m
├── WEASD
└── WSOILT
3) Final HTML output files
e.g. output location: /scratch2/portfolios/NESDIS/h-sandy/noscrub/Deyong.Xu/vsdb_workspace/data/stmp/Deyong.Xu/nwpvrfy29853/acrms29853/score
scorecard.html
scorecard.css
mainindex.html
legend.html
8. How are VSDB status files used?
verify_exp_step2.sh
--> map_util/allcenters_1cyc.sh & map_util/allcenters_rmsmap.sh
--> map_util/gen_scal.sh
map_util/gen_scal_pres.sh
map_util/gen_wind.sh
map_util/gen_wind_pres.sh
map_util/gen_sfc.sh
Details of "./gen_scal.sh"
1) convert vsdb file into txt file
while [ $cdate -le $edate ]; do
fhour=00; vhr=$cyc
while [ $fhour -le $vlength ]; do
datadir=${vsdb_data}/${vtype}/${vhr}Z/${model}
vsdbname=${datadir}/${model}_${cdate}.vsdb ### vsdb files
string=" $mdl $fhour ${cdate}${vhr} $mdl $reg SAL1L2 $vnam $lev "
mycheck=$( grep "$string" $vsdbname )
if [ $? -ne 0 ]; then
echo "missing" >>$outname.txt
else
grep "$string" $vsdbname |cat >>$${outname}.txt ### grep data out and save it into text file
fi
fhour=` expr $fhour + $fhout `
if [ $fhour -lt 10 ]; then fhour=0$fhour ; fi
vhr=` expr $vhr + $fhout `
if [ $vhr -ge 24 ]; then vhr=`expr $vhr - 24 `; fi
if [ $vhr -lt 10 ]; then vhr=0$vhr ; fi
done
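The zero-padding and 24-hour rollover in the loop above can also be written with printf and modulo arithmetic; a small standalone illustration (fhout and vlength are hard-coded here only for the example):
fhout=6; vlength=24
fhour=00; vhr=00
while [ $fhour -le $vlength ]; do
  echo "fhour=$fhour vhr=$vhr"
  fhour=$(printf '%02d' $((10#$fhour + fhout)))       # e.g. 00 -> 06 -> 12 ...
  vhr=$(printf '%02d' $(( (10#$vhr + fhout) % 24 )))  # valid hour wraps at 24
done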
2) convert txt file into binary file via convert.f that is generated on the fly.
( within gen* script such as gen_scal.sh script )
open(9,file="modelname.txt",form="formatted",status="old")
open(10,file="${outname}.txt",form="formatted",status="old") ## text file generated above
open(11,file="tmp.txt",form="formatted",status="new")
open(20,file="${outname}.bin",form="unformatted",status="new") ## output binary file
$FC $FFLAG -o convert.x convert.f
./convert.x
meantxt=${vnam1}_${lev}_${reg1}_${yyyymm}
mv fort.13 meancor_${meantxt}.txt
mv fort.14 meanrms_${meantxt}.txt
mv fort.15 meanbias_${meantxt}.txt
3) create grads control file
cat >${outname}.ctl <<EOF1
dset ^${outname}.bin ## binary file generated above
undef -99.9
options big_endian sequential
title scores
xdef $nmdcyc linear 1 1
4) output ctl files
HGT_P1000_G2NHX_2014020120140228.ctl
HGT_P700_G2NHX_2014020120140228.ctl
HGT_P500_G2NHX_2014020120140228.ctl
HGT_P250_G2NHX_2014020120140228.ctl
Details of "allcenters_1cyc.sh"
0)# -- search data for all models; write out binary data, create grads control file
/gen_wind.sh
/gen_scal.sh
1) # ----- PLOT TYPE 1: time series of anomaly correlations ----
cat >acz_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname}.ctl'
mdc.1=${mdnamec[0]}
* Create verification scorecard text files ( score card files )
'${vsdbhome}/map_util/grads/fprintf.gs 'sc.i' score_cor_'${namedaily}'_'mdc.i'_day'%day'.txt %-7.6f' ### GOLD
...
$GRADSBIN/grads -bcp "run acz_${outname1}.gs"
2) # ----- PLOT TYPE 2: Die-off plot for mean correlation over $ndays days----
ndaysp1=`expr $ndays + 1`
fdaysp1=`expr $fdays + 1`
cat >cordieoff_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname}.ctl'
** Create verification scorecard text files ( conflimit files between models )
'${vsdbhome}/map_util/grads/fprintf.gs 'tv.n' score_cor_conflimit_'${namedaily}'_'mdc.i'_day'n-1'.txt %-7.6f' ## GOLD: generate score_cor_conflimit* files needed to run score card.
...
$GRADSBIN/grads -bcp "run cordieoff_${outname1}.gs"
3) # ----- PLOT TYPE 3: difference of AC, other models minus first model ----
cat >cordiff_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'run $sorcdir/grads/white.gs'
'open ${outname}.ctl'
...
$GRADSBIN/grads -bcp "run cordiff_${outname1}.gs"
4) # ----- PLOT TYPE 4: frequency distribution of anomaly correlations ----
ndayfq=$ndays
if [ $ndayfq -gt 20 ]; then ndayfq=20; fi
nday05=`expr $ndayfq \/ 2 `
cat >freq_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname}.ctl'
...
$GRADSBIN/grads -bcp "run freq_${outname1}.gs"
5) output grads script (.gs) files
freq_HGT_P700_G2NHX_2014020120140228.gs
freq_HGT_P250_G2NHX_2014020120140228.gs
freq_HGT_P500_G2NHX_2014020120140228.gs
freq_HGT_P1000_G2NHX_2014020120140228.gs
cordiff_HGT_P700_G2NHX_2014020120140228.gs
cordiff_HGT_P250_G2NHX_2014020120140228.gs
cordiff_HGT_P500_G2NHX_2014020120140228.gs
cordiff_HGT_P1000_G2NHX_2014020120140228.gs
acz_HGT_P700_G2NHX_2014020120140228.gs
acz_HGT_P250_G2NHX_2014020120140228.gs
acz_HGT_P500_G2NHX_2014020120140228.gs
acz_HGT_P1000_G2NHX_2014020120140228.gs
cordieoff_HGT_P700_G2NHX_2014020120140228.gs
cordieoff_HGT_P250_G2NHX_2014020120140228.gs
cordieoff_HGT_P500_G2NHX_2014020120140228.gs
cordieoff_HGT_P1000_G2NHX_2014020120140228.gs
6) Sample score files needed to run scorecard.
[dxu@s4-cardinal HGT]$ pwd
/data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy_deyong/acrms_deyong/G2NHX/anom/HGT
[dxu@s4-cardinal HGT]$ tree -f |grep score_cor |grep 250
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day1.txt # for 2nd model only
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day2.txt
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day3.txt
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day4.txt
├── ./score_cor_conflimit_HGT_P250_G2NHX_ECM_day5.txt
├── ./score_cor_HGT_P250_G2NHX_ECM_day1.txt
├── ./score_cor_HGT_P250_G2NHX_ECM_day3.txt
├── ./score_cor_HGT_P250_G2NHX_ECM_day5.txt
├── ./score_cor_HGT_P250_G2NHX_GFS_day1.txt # Reference model
├── ./score_cor_HGT_P250_G2NHX_GFS_day3.txt
├── ./score_cor_HGT_P250_G2NHX_GFS_day5.txt
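Before running run_scorecard.sh it can be useful to confirm the expected score files exist; a sketch for the HGT P250 example above (adjust the run-specific nwpvrfy/acrms directory names for your own run):
cd /data/users/dxu/vsdb_workspace/data/stmp/dxu/nwpvrfy_deyong/acrms_deyong/G2NHX/anom/HGT
ls score_cor_HGT_P250_G2NHX_*_day*.txt | wc -l            # per-model correlation scores
ls score_cor_conflimit_HGT_P250_G2NHX_*_day*.txt | wc -l  # confidence limits (non-reference models only)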
Details of "allcenters_rmsmap.sh"
varlist="rms bias pcor emd epv rsd msess"
0) # -- search data for all models; write out binary data, create grads control file
gen_wind_pres.sh
gen_scal_pres.sh
# ----- PLOT TYPE 1: maps of $var as a function of calendar day and pressure for each forecast time ----
cat >${var}p_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'run $sorcdir/grads/white.gs'
'open ${outname1}.ctl'
...
EOF1
$GRADSBIN/grads -bcp "run ${var}p_${outname1}.gs"
eg:
rmsp_WIND_G2NHX_2014020120140228.gs
biasp_WIND_G2NHX_2014020120140228.gs
pcorp_WIND_G2NHX_2014020120140228.gs
emdp_WIND_G2NHX_2014020120140228.gs
epvp_WIND_G2NHX_2014020120140228.gs
rsdp_WIND_G2NHX_2014020120140228.gs
msessp_WIND_G2NHX_2014020120140228.gs
# ----- PLOT TYPE 2: maps of mean ${var} as a function of forecast time and pressure ----
ndaysp1=`expr $ndays + 1`
fdaysp1=`expr $fdays + 1`
cat >${var}pmean_${outname1}.gs <<EOF1
'reinit'; 'set font 1'
'run $sorcdir/grads/white.gs'
'open ${outname1}.ctl'
...
EOF1
$GRADSBIN/grads -bcp "run ${var}pmean_${outname1}.gs"
eg:
rmspmean_WIND_G2NHX_2014020120140228.gs
biaspmean_WIND_G2NHX_2014020120140228.gs
pcorpmean_WIND_G2NHX_2014020120140228.gs
emdpmean_WIND_G2NHX_2014020120140228.gs
epvpmean_WIND_G2NHX_2014020120140228.gs
rsdpmean_WIND_G2NHX_2014020120140228.gs
msesspmean_WIND_G2NHX_2014020120140228.gs
# ----- PLOT TYPE 3: time series of ${var} Errors----
cat >${var}_${outname1}${lev}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname1}.ctl'
'${vsdbhome}/map_util/grads/fprintf.gs 'sc.i' score_${var}_${namedaily1}_'mdc.i'_day'%day'.txt %-7.6f' ## GOLD: generate single score file for each model
...
EOF1
$GRADSBIN/grads -bcp "run ${var}_${outname1}${lev}.gs"
eg:
rsddieoff_WIND_G2NHX_2014020120140228P200.gs
rsddieoff_WIND_G2NHX_2014020120140228P10.gs
rsddieoff_WIND_G2NHX_2014020120140228P100.gs
# ----- PLOT TYPE 4: mean ${var} error growth curve over $ndays days----
ndaysp1=`expr $ndays + 1`
fdaysp1=`expr $fdays + 1`
cat >${var}dieoff_${outname1}${lev}.gs <<EOF1
'reinit'; 'set font 1'
'open ${outname1}.ctl'
'${vsdbhome}/map_util/grads/fprintf.gs 'tv.n' score_${var}_conflimit_'${namedaily1}'_'mdc.i'_day'n-1'.txt %-7.6f' ## GOLD *conflimit*
...
EOF1
$GRADSBIN/grads -bcp "run ${var}dieoff_${outname1}${lev}.gs"
eg:
rsddieoff_WIND_G2NHX_2014020120140228P10.gs
rsddieoff_WIND_G2NHX_2014020120140228P100.gs
msessdieoff_WIND_G2NHX_2014020120140228P850.gs
msessdieoff_WIND_G2NHX_2014020120140228P700.gs
sample score files:
/scratch2/portfolios/NESDIS/h-sandy/noscrub/Deyong.Xu/vsdb_workspace/data/stmp/Deyong.Xu/nwpvrfy774/acrms774/G2NHX/pres/WIND
./score_rsd_WIND_P1000_G2NHX_GFS_day3.txt
./score_epv_WIND_P500_G2NHX_GFS_day3.txt
./score_pcor_WIND_P700_G2NHX_ECM_day5.txt
./score_pcor_WIND_P50_G2NHX_ECM_day5.txt
9. Region control / distribution
Go to /scratch2/portfolios/NESDIS/h-sandy/noscrub/Deyong.Xu/vsdb_pkg/vsdb_v17/exe
on zeus and run the following commands to find the lines shown below.
cntl_sfc.sh seems to be the file you are looking for.
$ vi cntl_sfc.sh
12 ${gd}
${gd}/NHX 0 20 360 80
${gd}/SHX 0 -80 360 -20
${gd}/TRO 0 -20 360 20
${gd}/N60 0 60 360 90
${gd}/S60 0 -90 360 -60
${gd}/NPO
${gd}/SPO
${gd}/NAO
${gd}/SAO
${gd}/CAM
${gd}/NSA
$ vi cntl_pres.sh
5 ${gd}
${gd}/NHX 0 20 360 80
${gd}/SHX 0 -80 360 -20
${gd}/TRO 0 -20 360 20
${gd}/PNA 180 20 320 75
$ vi cntl_anom.sh
5 ${gd}
${gd}/NHX 0 20 360 80
${gd}/SHX 0 -80 360 -20
${gd}/TRO 0 -20 360 20
${gd}/PNA 180 20 320 75
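The leading number in each listing appears to be the region count, and the four numbers after each sub-region look like bounding-box coordinates (lon1 lat1 lon2 lat2). A small sketch (the awk filter is an assumption about the file layout, not part of VSDB) to print them from one of these files:
# Print region name and apparent lon/lat bounds from a cntl_*.sh listing.
awk 'index($1, "{gd}/") && NF >= 5 {printf "%-12s lon %s-%s  lat %s-%s\n", $1, $2, $4, $3, $5}' cntl_pres.sh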
10. Account-related issue:
On some platforms such as zeus, the account setting matters because not all accounts are allowed to submit jobs there. The account is passed as one of the job-submission parameters.
For zeus, I set it to "h-sandy":
export ACCOUNT=h-sandy