Creating an Anaconda Python Installation at SLAC
These instructions describe how to create your own anaconda python installation at SLAC to run fermipy and the Fermi STs. These instructions assume you are running a bash shell. If your default shell is C shell, you should launch a bash shell before running the following commands.
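If your login shell is csh or tcsh, you can start a bash session like this (a minimal example; any way of launching bash works):
$ bash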
First grab the installation and setup scripts from the fermipy github repository:
$ curl -OL https://raw.githubusercontent.com/fermiPy/fermipy/master/condainstall.sh
$ curl -OL https://raw.githubusercontent.com/fermiPy/fermipy/master/slacsetup.sh
Now choose an installation path. This should be a new directory (e.g. $HOME/anaconda) with at least 2-4 GB of free space. We will assign this location to the CONDABASE environment variable, which is used by the setup script to find your python installation. To avoid setting this every time you log in, it's recommended to set CONDABASE in your .bashrc file.
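For example, assuming you chose $HOME/anaconda as the installation directory, the corresponding line in your .bashrc might look like this:
# in ~/.bashrc (adjust the path to your chosen install directory)
export CONDABASE=$HOME/anaconda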
Now run the following commands to install anaconda and fermipy. This will take about 5-10 minutes.
$ export CONDABASE=<path to install directory>
$ bash condainstall.sh $CONDABASE
Once anaconda is installed, initialize your python and ST environment by running the slacsetup function defined in slacsetup.sh. This function sets the environment variables needed to run the STs and python.
$ source slacsetup.sh
$ slacsetup
For convenience you can also copy this function into your .bashrc file so that it is automatically available when you launch a new shell session (a sketch follows the examples below). By default the function will set up your environment to point to a recent version of the STs and the python installation in CONDABASE. If CONDABASE is not defined, it will instead use the python installation packaged with the given ST release. The slacsetup function takes two optional arguments that can be used to override the ST version or the python installation path.
# Use ST 10-00-05
$ slacsetup 10-00-05
# Use ST 11-01-01 and python distribution located at <PATH>
$ slacsetup 11-01-01 <PATH>
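A rough sketch of the corresponding .bashrc additions might look like the following; the path to slacsetup.sh is an assumption, so adjust it to wherever you saved the script:
# in ~/.bashrc, after the CONDABASE export described above
source $HOME/slacsetup.sh
# slacsetup   # optionally initialize the environment on every login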
The installation script only installs packages that are required by fermipy and the STs. Once you've initialized your shell environment, you are free to install additional python packages with the conda package manager via conda install <package name>. Packages that are not available through conda can also be installed with pip.
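For example (the package names here are only illustrations):
$ conda install seaborn
$ pip install naima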
conda can also be used to upgrade packages. For instance, you can upgrade fermipy to the newest version with the conda update command:
$ conda update fermipy
The conda package manager has a system for creating environments that contain different versions of python or specific python packages. By default your anaconda installation comes with a single root environment. To create a new environment that is a clone of the root environment, run the conda create command followed by the name you want to assign to the environment:
$ conda create -n my-env --clone=root
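If you want to see which environments exist in your installation (the currently active one is marked with an asterisk), conda provides the info command:
$ conda info --envs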
To activate this environment use:
$ source activate my-env
Once you've activated an environment, any conda installation commands you run will apply only to that environment. This allows you, for instance, to install a development version of a package in an environment without changing the installation of that package in the root environment.
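As a sketch, installing an editable development copy of fermipy into the cloned environment (the checkout location here is an assumption) might look like:
$ source activate my-env
$ git clone https://github.com/fermiPy/fermipy.git
$ cd fermipy
$ pip install -e .
The editable install only affects my-env; the fermipy installation in the root environment is left untouched.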
To switch back to the root environment, use the source deactivate command:
$ source deactivate my-env
More information about using environments is available in the conda documentation.
Running a Jupyter Notebook at SLAC
First ssh into one of the SLAC machines (e.g. rhel6-64.slac.stanford.edu). If you are using one of the pool names to connect to SLAC (rhel6-64, ki-ls), make note of the actual hostname of the machine that appears when you log in (e.g. rhel6-64c.slac.stanford.edu). Now set up your ST and python environment, e.g.
$ slacsetup
Navigate to the directory where you plan to perform your analysis and launch the jupyter notebook server:
$ jupyter notebook --port=8888 --no-browser
After starting the notebook server you will see the message:
Copy/paste this URL into your browser when you connect for the first time,
to login with a token:
followed by a URL containing a unique token. Save this URL to your clipboard.
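The URL will look something like the following, where the token value is unique to your session (the one shown here is just a placeholder):
http://localhost:8888/?token=<your-unique-token>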
On your local machine, set up an SSH tunnel connecting port 8888 on your local machine to the same port on the SLAC machine where you started the notebook server:
$ ssh -N -L localhost:8888:localhost:8888 user@remote_host
Now connect to the remote server by pasting the URL that you saved earlier into your browser.
When you are finished, be sure to close your SSH tunnel. Note that closing the SSH tunnel will not close the notebook server; you are free to leave the notebook server running and reconnect to it at a later point. To launch a notebook server and detach it from the current shell (so that it doesn't close when you log out), run the disown command immediately after starting the notebook server:
$ jupyter notebook --port=8888 --no-browser &
$ disown $!
Be sure to make note of the hostname of the machine where you left the notebook server running as you will need to know this to restart the SSH tunnel.
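To reconnect later, restart the tunnel against that specific host. For example, using the hostname noted earlier (rhel6-64c.slac.stanford.edu is only an illustration):
$ ssh -N -L localhost:8888:localhost:8888 user@rhel6-64c.slac.stanford.edu
Then paste the notebook URL into your browser again.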
Fermipy comes with a set of jupyter notebook tutorials. To run these at SLAC, clone the fermipy-extra repository and then launch a jupyter notebook server in the same directory:
$ git clone https://github.com/fermiPy/fermipy-extra.git
$ jupyter notebook --port=8888 --no-browser &
After connecting to the notebook server using the instructions above navigate to the notebooks directory inside the fermipy-extra repo. Click on any of the notebooks in that directory to begin.
Quick Analysis Script
fermipy-quick-analysis is a script for running a very simple analysis of an ROI:
- fitting all source parameters
- generating an SED and localizing a source
- generating TS and residual maps
- generating standard diagnostic plots
First download the script:
$ curl -OL https://raw.githubusercontent.com/fermiPy/fermipy/master/fermipy/scripts/quickanalysis.py
Now copy the following into a YAML file called base_config.yaml:
data:
  evfile: /u/gl/mdwood/ki20/mdwood/fermi/data/P8_SOURCE_V6_HE/P8_SOURCE_V6_239557414_476239414_z100_r180_ft1.lst
  ltcube: /nfs/slac/g/ki/ki20/cta/mdwood/fermi/data/P8_SOURCE_V6_HE/P8_SOURCE_V6_239557414_476239414_z100_r180_gti_ft1_gtltcube_z100.fits
  scfile: /nfs/slac/g/ki/ki20/cta/mdwood/fermi/data/P8_P302_BASE/P8_P302_SOURCE_239557414_476239414_ft2.fits

selection:
  logemin : 3.0
  logemax : 5.5
  tmin : 239557414
  tmax : 476239414
Now execute the script with the path to the new configuration file and the name of the source that you would like to analyze:
$ python quickanalysis.py mkn501/config.yaml --config=base_config.yaml --target=mkn501
You can also choose to analyze an arbitrary region of the sky with the ra and dec parameters:
$ python quickanalysis.py ra_70.0_dec_30.0/config.yaml --config=base_config.yaml --ra=70.0 --dec=30.0
Alternatively, you can analyze a new source by adding it to the model in base_config.yaml:
data:
  evfile: /u/gl/mdwood/ki20/mdwood/fermi/data/P8_SOURCE_V6_HE/P8_SOURCE_V6_239557414_476239414_z100_r180_ft1.lst
  ltcube: /nfs/slac/g/ki/ki20/cta/mdwood/fermi/data/P8_SOURCE_V6_HE/P8_SOURCE_V6_239557414_476239414_z100_r180_gti_ft1_gtltcube_z100.fits
  scfile: /nfs/slac/g/ki/ki20/cta/mdwood/fermi/data/P8_P302_BASE/P8_P302_SOURCE_239557414_476239414_ft2.fits

selection:
  logemin : 3.0
  logemax : 5.5
  tmin : 239557414
  tmax : 476239414

model:
  sources:
    - {name : 'mynewsource', ra : 70.0, dec : 30.0 }
$ python quickanalysis.py mynewsource/config.yaml --config=base_config.yaml --target=mynewsource