
A .bashrc File for Running Slurm, Python, and Spark

Below are the contents of an example .bashrc file that can be used to run Slurm and Python, and to run Spark in local mode.

You can copy this entire .bashrc file to your home directory by logging in to Discovery and typing:

cp /gss_gpfs_scratch/EECE5698/.bashrc ./

You then need to log out and log back in to the cluster for the file to take effect, or source the file as shown below. Alternatively, if you already have your own .bashrc file, you can copy the contents listed below into it; you can open it for editing with:

cd ~
vi .bashrc
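
Instead of logging out and back in, you can apply the changes to your current shell immediately by sourcing the file:

source ~/.bashrc

The full contents of the example .bashrc file are listed below.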
# .bashrc

# Source global definitions
if [ -f /etc/bashrc ]; then
	. /etc/bashrc
fi


# Modules required by Spark and its prerequisites
module load gnu-4.4-compilers
module load fftw-3.3.3
module load platform-mpi
module load gnu-4.8.1-compilers
module load oracle_java_1.7u40       # Java runtime required by Hadoop and Spark
module load hadoop-2.4.1
module load python-2.7.5
module load spark-1.4.1_hadoop_2.4   # Spark 1.4.1 built against Hadoop 2.4




# SLURM needs the following modules to be loaded as prerequisites
module load perl-5.20.0
module load slurm-14.11.8       # loads the environment for the SLURM 14.11.8 (http://slurm.schedmd.com/) executables, libraries, and include files


# Set up Spark environment variables (spark-config.sh is presumably placed on the PATH by the spark module)
source spark-config.sh

# Uncomment the following line to run Spark in standalone mode
#source /gss_gpfs_scratch/$USER/spark/conf/spark-env.sh

# Useful SLURM aliases
alias sq="squeue -u $USER"          # list your jobs in the queue
alias sc="scancel"                  # cancel a job: sc <jobid>
alias sb="sbatch"                   # submit a batch script: sb <script>
alias sa="salloc -N 1 --exclusive"  # allocate one node interactively, for exclusive use
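
With these aliases in place, a typical workflow looks like the following (the script name myjob.sh and job ID 12345 are hypothetical):

sb myjob.sh    # same as: sbatch myjob.sh  (submit a batch script)
sq             # same as: squeue -u $USER  (list your jobs)
sc 12345       # same as: scancel 12345    (cancel job 12345)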
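Finally, once the modules above are loaded, you can sanity-check the local-mode Spark installation. A minimal example, assuming a hypothetical script myscript.py (local[4] tells Spark to use four worker threads on the current machine):

spark-submit --master local[4] myscript.py

You can also open an interactive PySpark shell in local mode with pyspark --master local[4].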