Workflow
SLAC accounts are individual user accounts. To get a SLAC computing account, go to
https://confluence.slac.stanford.edu/display/LSSTDESC/Home
For Confluence access, ask confluence-admin@slac.stanford.edu. Once the Unix account is created, it will automatically be used to connect to Confluence.
- The connection between the SLAC portal and the CC is operated by a daemon running on the VM:
  /home/descprod/Pipeline2/bsub/srs-sub-dev
- Connect to the VM from ccage through ssh (ccosvms0136).
- There is no password for the descprod account. From the VM, do:
  sudo -u descprod -i
Here is an outline of what has been done so far and should be re-tested.
- The script might need to be changed to avoid errors when files already exist. See http://srs.slac.stanford.edu/Pipeline-II/exp/SRS/log.jsp?pi=41888307
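One way to make the script tolerant of files left behind by a previous stream is to guard each creation step. This is only a sketch of the pattern; `OUTDIR` and `REGISTRY` are hypothetical names, not the actual paths used by the workflow script.

```shell
#!/bin/sh
# Sketch of idempotent output handling for a batch step.
# OUTDIR and REGISTRY are hypothetical, illustrative names.
set -e

OUTDIR="${OUTDIR:-./output}"
REGISTRY="$OUTDIR/registry.sqlite3"

# mkdir -p succeeds even if the directory already exists
mkdir -p "$OUTDIR"

# only create the registry if a previous stream did not leave one behind
if [ ! -e "$REGISTRY" ]; then
  touch "$REGISTRY"
  echo "created $REGISTRY"
else
  echo "reusing existing $REGISTRY"
fi
```

Running the same step twice then takes the second branch instead of failing, which is the behaviour the log linked above seems to call for.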
- Jobs must be launched from the web interface (login needed):
  http://srs.slac.stanford.edu/Pipeline-II/exp/LSST-DESC/admin.jsp
  or via the command line at SLAC (not documented yet).
- Scripts to execute must be located under
  /sps/lsst/dev/lsstprod/clusters/workflows/scripts/
- The pipeline launches a stream of the ClustersDM task, defined by the ClustersDM.xml file. This task contains a process called 'ingest', which executes the script given by BATCH_NAME=ClustersBatch.sh.
  All scripts are currently saved under workflows/CFHT/clusters:
  http://srs.slac.stanford.edu/Pipeline-II/exp/LSST-DESC/task.jsp?task=41814431
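The 'ingest' process presumably resolves BATCH_NAME to a file under the scripts directory and executes it. The sketch below illustrates that dispatch with a hypothetical `run_batch` helper; the actual resolution is done inside the SRS pipeline, not by a local script.

```shell
#!/bin/sh
# Hypothetical dispatch helper: resolve a batch script by name under a
# scripts directory and execute it. Illustrates the description above;
# the real mechanism lives in the SRS pipeline.
run_batch() {
  script_dir="$1"   # e.g. /sps/lsst/dev/lsstprod/clusters/workflows/scripts
  batch_name="$2"   # e.g. ClustersBatch.sh (the BATCH_NAME value above)
  batch_script="$script_dir/$batch_name"
  if [ -x "$batch_script" ]; then
    "$batch_script"
  else
    echo "batch script not found or not executable: $batch_script" >&2
    return 1
  fi
}
```

Usage would look like `run_batch /sps/lsst/dev/lsstprod/clusters/workflows/scripts ClustersBatch.sh`; a missing or non-executable script fails loudly instead of silently doing nothing.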
Open questions for the reprocessing campaign:
- Objectives: deliverables, calendar, success criteria?
- Data: input and output data directories?
- Stack: which version of the stack do we want to use?
- Is other software needed?
- What else needs to be set up at CC-IN2P3?
- Who is in charge of the coordination?