Details of bfc.kumac
Offline computing tutorial, maintained by K. Turner

Some information about $STAR/kumacs/chain/bfc.kumac
(condensed from the $STAR/kumacs/chain/README file).
An example invocation is given after the input dictionary below.

Basic Steps in BFC Chain Kumac

-- DICTIONARY OF INPUTS TO BFC.KUMAC --

  1. TOP - top level directory to search for kumacs
    - change if you copy stuff to your area!
    - files must be in standard directory structure format:
      ~workdir/kumacs/domain
      ~workdir/pams/domain
  2. calib_dir - for ebye analysis
  3. tpc_sector_first - first sector # of tpc to simulate using tss
  4. tpc_sector_last - last sector # of tpc to simulate using tss
  5. ebye_prior_switch - for ebye analysis
  6. ebye_ensemble_switch - for ebye analysis
  7. ebye_analysis_switch - for ebye analysis
  8. gstar_settings - settings for loading gstar libraries
    - when running gstar, set 'complete 4th_off hadr_on'
      (4th_off = svt 4th layer off)
    - !!! when not running gstar, use 'field_only' !!!
    - *** 'help' can be given as the value of gstar_settings ***
    - See $STAR/pams/geometry/geometry/geometry.g for the list
      of other available settings
  9. domain - list of domains to load [domain='tpc svt global']
  10. chain - list of modules to run **** HAVE TO BE IN ORDER ****
    - See Available List Below
  11. skip - number of events to skip before processing starts
  12. no_events - number of events to process
  13. input_data_file - directory + input file name
  14. output_file - output file name for use with chain=fzout or evout
  15. log - log file
  16. nunit - output ntuple file unit #
  17. nfile - output ntuple file name
  18. nid - output ntuple i.d.
  19. ntable - output ntuple table - starts from event/data/ (i.e. list your table from under this directory)
  20. nid2 - output ntuple i.d. #2 (same ntuple file as above!)
  21. ntable2 - output ntuple table #2
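
For example (a sketch only; check the macro definition in bfc.kumac
itself for the real argument names and defaults), these inputs are
passed as named arguments when the kumac is executed from a PAW
session:

  PAW > exec bfc top=/star/mydir gstar_settings='field_only' domain='geometry tpc' chain='tpg tfs tpt' skip=0 no_events=5 log=bfc.log

Any input not given on the command line keeps its default value from
the macro definition. (gstar_settings='field_only' because this chain
does not run gstar; see item 8 above.)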

-- DOMAINS AVAILABLE --

  1. gen
  2. geometry
  3. sim/gstar
  4. sim/g2t
  5. tpc
  6. emc
  7. ctf
  8. mwc
  9. svt
  10. ftpc
  11. global
  12. ebye
  13. strange

* !! In order to run tpc code, you must load BOTH geometry and tpc
*     e.g. domain='geometry tpc'
* !! In order to run global code, you must load svt, tpc & global
*     e.g. domain='tpc svt global'
* !! In order to run some DST packages, you must load ctf
* !! In order to load emc, you must also load geometry
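
Putting these rules together, a combined TPC+SVT+global tracking pass
would load (an illustration based on the rules above):

  domain='geometry tpc svt global'

(geometry satisfies the tpc requirement, and tpc + svt satisfy the
global requirement).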

-- CHAIN VARIABLES --

* rhij       - run hijing - uses hijev.inp in top directory as input !!!
* mdin       - read in mini-daq data
* evgin      - read in event generator file for running gstar
* g2tin      - read G2T xdf file (post gstar)
* evin       - read in simulated-data file (post gstar + more) 
* fzin       - read in fz file (post gstar, g2t not yet run)
* gtxin      - read in txt file for running gstar
* dstin      - read in dst file (only dst directories on file)
* obin       - read from objectivity database
* calin      - find event time and read in correct calibration with cal2mem
* rgst       - run gstar from evgin file
* rg2t       - run g2t on the rgst (gstar) output
* svg        - SVT geometry initialization
* tpg        - TPC geometry initialization
* srs        - SVT resolution simulator (fast)
* sss        - SVT slow simulator - not implemented yet
* tss        - TPC slow simulator
* tfc        - unpacks tpc raw data into adcxyz table for diagnostics
* tfc_r      - unpacks minidaq tpc data and reformats
* tcl        - TPC clustering - slow sim only !
* tcl_e      - TPC cluster evaluation - slow sim only
* tfs        - TPC fast simulator
* stk        - SVT tracker
* sgr        - grouping SVT hits into tracks 
* spr        - SVT dE/dX
* tpt        - TPC track finding
* tpt_r      - TPC track residuals
* tte_t      - TPC tte tracker
* tte_e      - TPC track evaluator
* tid        - TPC dE/dx pid
* ffs        - ftpc fast sim
* fss        - ftpc slow sim
* fcl        - ftpc clustering
* fpt        - ftpc tracker
* fte        - ftpc track eval
* svm        - tpc and svt track matching
* evr        - primary vertex reconstruction
* egr        - refit global tracks
* ev0        - Strange particle reconstruction (v0's)
* ev02       - V0 reco with separate params for SVT & TPC tracks; should replace ev0
* ev0_e      - v0 evaluation
* ev0d       - creates dst_v0_vertex table - call AFTER ev02! and before smdst!
* ctf        - runs cts, ctu
* mwc        - runs mws, mwu
* trg        - runs l3cl, ftf_tpc - not implemented yet
* ems        - runs ems_interface2 for em calorimeter simulation
* emc        - convert em calorimeter adc counts to energy
* jet        - runs emc_egrid, emc_erj
* ddedx      - dst dedx filler
* dpnt       - dst point filler
* dmsft      - dst monitor software filler
* dmhrd      - dst monitor hardware filler
* desum      - dst event summary
* drsum      - dst run summary
* dehd       - dst event header
* drhd       - dst run header
* ebye1      - run ebye sca analysis code
* ev0j       - join ev0out and vertex tables and make new ev0joined table
* sdst       - strange micro dst analysis
* sdst2      - strange micro dst analysis - runs after ev02 and ev0d !!
* dchkt      - dumps from dst files
* user       - user kumacs called here: init_user, run_user, finish_user
* ntup       - create output ntuple file
* ntup2      - add 2nd ntuple to output ntuple file (must also use ntup chain!)
* fzout      - write out fz file from gstar (only works if you run gstar!)
* evout      - write event & run directories into xdf file
* dout       - write run/dst and event/data/global/dst directories into xdf file
* obout      - write out to objectivity database
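
As a worked example (illustrative only; the module list for a real
production pass should come from the production scripts), a simple
TPC-only reconstruction pass over a gstar fz file could chain:

  chain='fzin rg2t tpg tfs tpt tid evr desum dehd evout'

i.e. read the fz file, run g2t, initialize the TPC geometry,
fast-simulate the TPC response, find tracks, do dE/dx pid,
reconstruct the primary vertex, fill the dst event summary and
header, and write the event & run directories to an xdf file.
Run it with a matching domain list, e.g. domain='geometry tpc svt
global', and remember the chain modules have to be given in order.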