COMP 7330/7336 Advanced Parallel and Distributed Computing: Setting Up Your Programming Environment
Dr. Xiao Qin, Auburn University
http://www.eng.auburn.edu/~xqin@auburn.edu
Login
• The DMC, SGI Ultraviolet, and SGI Altix can be accessed using secure shell.
• Windows machines: use PuTTY.
• Secure shell is installed on many Linux and Unix machines. Command line:
ssh user_name@dmc.asc.edu
ssh user_name@uv.asc.edu
ssh user_name@altix.asc.edu
Running MPI Interactively on DMC
Option 1) Use the login node
• Work run on the login node is limited to 10 minutes of CPU time and a small amount of memory.
• You can run MPI jobs like this: mpirun -np 4 myprogram
• The login node has 16 cores.
• If 20 students try to run 4-core jobs at the same time, the node will bog down to the point of being unusable for everyone.
Running MPI Interactively on DMC
Option 2) Use a queue
• Open an interactive session through the queue:
qsub -I -q small-parallel -r n -l nodes=4,mem=2gb,partition=dmc
• The terminal session hangs until the job starts on the compute nodes.
• It may wait anywhere from a minute to a number of hours.
• ASC provides a "class" queue and will reserve some processors for it, so you don't have to wait on research work.
Running MPI Interactively on DMC
Option 2) Use a queue (cont.)
• Once the interactive session has started on the compute nodes, you may have to load the module for the MPI implementation you are using again.
• Then run the MPI job like this: mpiexec myprogram
Compile and Run an MPI Application
• Load the module for the MPI implementation: module load openmpi
• Compile the program: mpicc myprogram.cpp -o myprogram
• Run the program: mpirun -np 4 myprogram
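The slides never show the source of "myprogram". As a point of reference, a minimal MPI program that could be compiled and run with the commands above might look like the following sketch (the file name hello_mpi.c and the printed message are illustrative, not from the slides; only the standard MPI C API is used):

```c
/* hello_mpi.c -- a minimal MPI example (illustrative; the slides do not
 * show the actual program source).
 *
 * Compile: mpicc hello_mpi.c -o myprogram
 * Run:     mpirun -np 4 myprogram
 */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, size;

    MPI_Init(&argc, &argv);                 /* start the MPI runtime        */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's rank (id)     */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total number of processes    */

    printf("Hello from rank %d of %d\n", rank, size);

    MPI_Finalize();                         /* shut down the MPI runtime    */
    return 0;
}
```

With -np 4, each of the four processes prints one line (in nondeterministic order), e.g. "Hello from rank 2 of 4".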
HPC User Manual: http://www.asc.edu/html/man.pdf