Python Jupyter notebooks are usually started on localhost: a local web server is launched and you interact with the notebook through a web browser.
On Ares we cannot easily expose that web socket to the outside world, because calculations run on a computing node that is not visible from the Internet.
The trick is to start Jupyter via a job submitted to a computing node and to create an SSH tunnel to access it from your local PC.
Create the following file:
#!/bin/bash
#SBATCH --partition plgrid
#SBATCH --nodes 1
#SBATCH --ntasks-per-node 6
#SBATCH --time 0:30:00
#SBATCH --job-name jupyter-notebook-tunnel
#SBATCH --output jupyter-log-%J.txt

## get tunneling info
XDG_RUNTIME_DIR=""
ipnport=$(shuf -i8000-9999 -n1)
ipnip=$(hostname -i)
user=$USER

## print tunneling instructions to jupyter-log-{jobid}.txt
echo -e "
Copy/Paste this in your local terminal to ssh tunnel with remote
-----------------------------------------------------------------
ssh -o ServerAliveInterval=300 -N -L $ipnport:$ipnip:$ipnport ${user}@ares.cyfronet.pl
-----------------------------------------------------------------

Then open a browser on your local machine to the following address
------------------------------------------------------------------
localhost:$ipnport (prefix w/ https:// if using password)
------------------------------------------------------------------
"

module load jupyterlab/3.1.6-gcccore-11.2.0 scipy-bundle/2021.10-intel-2021b

## start the jupyter server
jupyter-notebook --no-browser --port=$ipnport --ip=$ipnip
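A note on the tunneling variables: `shuf -i8000-9999 -n1` draws one random port from the 8000-9999 range, so several users on the same cluster are unlikely to collide. The pick can be sanity-checked locally:

```shell
# Draw a random port the same way the batch script does and
# verify it falls inside the inclusive 8000-9999 range.
ipnport=$(shuf -i8000-9999 -n1)
if [ "$ipnport" -ge 8000 ] && [ "$ipnport" -le 9999 ]; then
    echo "picked port $ipnport"
fi
```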
JupyterLab
In case you want to start JupyterLab, just change the last line of the python-notebook.slurm script above from

jupyter-notebook --no-browser --port=$ipnport --ip=$ipnip

to

jupyter-lab --no-browser --port=$ipnport --ip=$ipnip
GPU in Jupyter
To use GPUs in your Jupyter job, simply add the proper flag to the job requirements:

#SBATCH --gpus=<number-of-gpus>

or

#SBATCH --gres=gpu:<number-of-gpus>
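For example, the header of the script from above extended to request a single GPU (the partition is kept as in the original script; on your system the GPU-enabled partition may have a different name, so check the Ares documentation):

```shell
#!/bin/bash
#SBATCH --partition plgrid
#SBATCH --nodes 1
#SBATCH --ntasks-per-node 6
#SBATCH --gpus=1
#SBATCH --time 0:30:00
#SBATCH --job-name jupyter-notebook-tunnel
#SBATCH --output jupyter-log-%J.txt
```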
Save it as python-notebook.slurm.
Submit the job to the queue using the sbatch command on an Ares login node:

sbatch python-notebook.slurm
Wait until your job enters the running state.
To check the status of a submitted job, use the squeue command:

squeue -j <JobID>

or, for all jobs of the user:

squeue -u $USER

which lists all of the current user's jobs submitted to the queue ($USER is an environment variable).
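If you prefer not to re-run squeue by hand, the waiting step can be scripted. This is a sketch of a hypothetical helper (the squeue flags -h, suppress the header, and -o "%T", print only the job state, are standard Slurm options):

```shell
# Poll squeue until the given job reports the RUNNING state.
# wait_for_running <JobID> [poll-interval-seconds]
wait_for_running() {
    jobid=$1
    interval=${2:-10}
    while true; do
        # -h: no header line, -o "%T": print only the state column
        state=$(squeue -j "$jobid" -h -o "%T")
        if [ "$state" = "RUNNING" ]; then
            echo "job $jobid is running"
            return 0
        fi
        sleep "$interval"
    done
}
# Usage: wait_for_running 7123485
```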
Common states of jobs:

PD - PENDING - Job is awaiting resource allocation.
R - RUNNING - Job currently has an allocation and is running.
CF - CONFIGURING - Job has been allocated resources, but is waiting for them to become ready for use (e.g. booting). On Ares the CF state can last up to 8 minutes when the nodes have been in power-save mode.
CG - COMPLETING - Job is in the process of completing. Some processes on some nodes may still be active.

In your directory, cat the jupyter log file:
cat jupyter-log-XXXXXXX.txt

where XXXXXXX is your sbatch job id, which is displayed after you run sbatch, e.g. cat jupyter-log-7123485.txt.
It will show you something like this:
Copy/Paste this in your local terminal to ssh tunnel with remote
-----------------------------------------------------------------
ssh -o ServerAliveInterval=300 -N -L 8511:172.20.68.193:8511 plgusername@ares.cyfronet.pl
-----------------------------------------------------------------

Then open a browser on your local machine to the following address
------------------------------------------------------------------
localhost:8511 (prefix w/ https:// if using password)
------------------------------------------------------------------
Execute the given command in another shell on your local computer to make the tunnel:

ssh -o ServerAliveInterval=300 -N -L 8511:172.20.68.193:8511 plgusername@ares.cyfronet.pl

Then open localhost:8511 in your browser.
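Instead of copy/pasting, the tunnel command can be pulled straight out of the log file. A convenience sketch (it assumes, as in the script above, that the printed command line starts with "ssh"):

```shell
# Print the first line of the log that starts with "ssh " --
# i.e. the ready-made tunnel command written by the batch script.
tunnel_cmd() {
    grep '^ssh ' "$1" | head -n1
}
# Usage: tunnel_cmd jupyter-log-7123485.txt
```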
If you need the jupyter token, cat jupyter-log-XXXXXXX.txt once again and copy your token from there.
All information from jupyter will be stored in this log file.
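The token can also be extracted mechanically. A hypothetical helper, assuming jupyter writes its usual access URL of the form http://...:port/?token=<hex digits> into the log:

```shell
# Print the first jupyter token found in a log file.
# Access URLs in the log look like: http://127.0.0.1:8511/?token=<hex>
jupyter_token() {
    grep -o 'token=[0-9a-f]*' "$1" | head -n1 | cut -d= -f2
}
# Usage: jupyter_token jupyter-log-7123485.txt
```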
If you wish to end your sbatch job, use the scancel <JOBID> command, where JOBID is the id of your tunnel job; you can look it up with the hpc-jobs or squeue -u $USER commands.

scancel <JOBID>
To check submitted and running jobs, use hpc-jobs.
To check information about finished and historic jobs, use the hpc-jobs-history command. For example, with the option "-d 30" the command shows all of the user's jobs from the last 30 days. More info in hpc-jobs-history -h.

hpc-jobs-history -d 30