
bashrc spark localmode


A .bashrc File for Running Slurm, Python, and Spark

These are the contents of an example .bashrc file that can be used to run Slurm, Python, and Spark in local mode.

You can copy this entire .bashrc file to your home directory by logging in to Discovery and typing:

cp /scratch/Discovery/.bashrc ~

You then need to log out and log back in to the cluster for the file to take effect. Alternatively, if you already have your own .bashrc file, you can append the contents below to it. To launch a Spark standalone cluster, you need to modify this .bashrc file as discussed here.

# .bashrc

# Source global definitions
if [ -f /etc/bashrc ]; then
    . /etc/bashrc
fi

module load spark/2.3.2-hadoop2.7
module load python/2.7.15

# Uncomment this line to run Spark in standalone mode
#source /scratch/$USER/spark/conf/spark-env.sh

# Useful SLURM aliases
alias sq="squeue -u $USER"   # list your queued and running jobs
alias sc="scancel"           # cancel a job by ID
alias sb="sbatch"            # submit a batch script
alias sr="srun --pty --export=ALL --tasks-per-node 1 --nodes 1 --mem=10Gb /bin/bash"   # open an interactive shell on a compute node
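
After logging back in, you can sanity-check the setup and exercise the aliases as follows. This is a minimal sketch: it assumes the spark module puts spark-submit on your PATH and sets SPARK_HOME, and the script name and job ID are purely illustrative.

python --version                 # should report Python 2.7.15
spark-submit --version           # should report Spark 2.3.2

sq                               # list your jobs (squeue -u $USER)
sb myjob.sh                      # submit a batch script (illustrative name)
sc 12345                         # cancel a job (illustrative job ID)

# Open an interactive shell on a compute node, then run Spark in local
# mode with 4 worker threads, using the Pi example bundled with Spark:
sr
spark-submit --master "local[4]" "$SPARK_HOME/examples/src/main/python/pi.py" 100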

If you want to edit your .bashrc file, you can use an editor such as vim or emacs, e.g., as follows:

cd ~
vim .bashrc
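
After saving your changes, you can apply them to your current session without logging out by re-sourcing the file:

source ~/.bashrc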