Slurm Migration

Access the New Slurm Cluster

1. Remove Known Host Entry

  • For Mac/Linux, run the following command (see also the sketch after this list):

ssh-keygen -R "ssh.ccv.brown.edu"
  • For Windows

    • PuTTY: when PuTTY displays the warning message, click 'Yes' to update PuTTY's cache with the new RSA key

    • Windows Terminal: enter 'yes' when you are asked "Are you sure you want to continue connecting (yes/no)?"
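
After the stale entry is removed, your next connection will offer the cluster's new host key. A minimal sketch of the full sequence on Mac/Linux is shown below; the username is a placeholder for your own Brown account.

# Remove the cached host key for the old cluster
ssh-keygen -R "ssh.ccv.brown.edu"

# Reconnect and accept the new host key when prompted
ssh [email protected]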

2. Brown AD Password and DUO for Login

Users can log into Oscar using their Brown AD password. Local Oscar passwords no longer work.

You will be prompted for DUO two-factor authentication after entering your password.

You can change your Brown AD password at https://myaccount.brown.edu.

3. File Ownership and Access Issues

Due to UID changes on Oscar, some users may have trouble accessing their files or directories. Please contact [email protected] if you run into this issue.
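
A quick way to check whether files were left with a stale numeric owner is to compare their numeric owner ID against your current UID. This is a minimal sketch using standard commands; the path is a placeholder for your own files.

# Show your current numeric UID on the new cluster
id -u

# List files with numeric owner/group IDs; entries whose owner does not
# match your UID may be affected by the UID change
ls -ln /path/to/your/files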

Submit Jobs with MPI Applications on the New Slurm Cluster

The new Slurm 20.02.6 is built with PMIx, so all MPI applications should be launched with the following command:

srun --mpi=pmix <mpi_application>
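
The same launcher flag applies inside batch scripts. Below is a minimal sketch of an sbatch script, assuming one of the new MPI modules from the migration table below; the resource values and executable name are placeholders.

#!/bin/bash
#SBATCH -N 2                 # number of nodes
#SBATCH -n 8                 # total number of MPI tasks
#SBATCH -t 01:00:00          # walltime
#SBATCH -J mpi_test          # job name

# Load one of the new Slurm-20 MPI modules (see the table below)
module load mpi/hpcx_2.7.0_gcc_10.2_slurm20

# Launch the MPI application through Slurm's PMIx plugin
srun --mpi=pmix ./my_mpi_application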

MPI Applications to Migrate

| Software | Current Module Name | New Module Name |
| --- | --- | --- |
| cesm | 1.2.1 | planned |
| cesm | 1.2.2 | planned |
| cesm | 2.1.1 | planned |
| dedalus | 2.1905 | dedalus/2.1905_openmpi_4.05_gcc_10.2_slurm20 |
| fenics | 2018.1.0 | planned |
| global_arrays | n/a | 5.8_openmpi_4.0.5_gcc_10.2_slurm20 |
| gpaw | n/a | 20.10.0_hpcx_2.7.0_intel_2020.2_slurm20 |
| gpaw | n/a | 20.10_hpcx_2.7.0_intel_2020.2_slurm20 |
| gromacs | n/a | gromacs/2020.4_gpu_hpcx_2.7.0_gcc_10.2_slurm20 |
| gromacs | n/a | gromacs/2020.4_hpcx_2.7.0_gcc_10.2_slurm20 |
| gromacs | 2020.1 | gromacs/2020.4_hpcx_2.7.0_gcc_10.2_slurm20 |
| gromacs | 2018.2 | gromacs/2018.2_hpcx_2.7.0_gcc_10.2_slurm20 |
| hdf5 | 1.10.1_parallel | 1.10.1_mvapich2-2.3.5_gcc_10.2_slurm20 |
| hdf5 | 1.10.5_parallel | 1.10.5_mvapich2-2.3.5_intel_2020.2_slurm20 |
| hdf5 | 1.10.5_openmpi_4.0.0_gcc | 1.10.5_openmpi_4.0.5_gcc_10.2_slurm20 |
| hdf5 | 1.10.5_openmpi_3.1.3_gcc | 1.10.5_openmpi_4.0.5_gcc_10.2_slurm20 |
| hdf5 | 1.10.5_fortran | 1.10.5_openmpi_4.0.5_gcc_10.2_slurm20 |
| hdf5 | n/a | 1.10.7_openmpi_4.0.5_gcc_10.2_slurm20 |
| hdf5 | n/a | 1.10.7_openmpi_4.0.5_intel_2020.2_slurm20 |
| hdf5 | n/a | 1.12.0_openmpi_4.0.5_intel_2020.2_slurm20 |
| lammps | n/a | 29Oct20_openmpi_4.0.5_gcc_10.2_slurm20 |
| meme | 5.0.5 | planned |
| meshlab | 20190129_qt59 | planned |
| Molpro | n/a | 2020.1_openmpi_4.0.5_gcc_10.2_slurm20 |
| mpi | n/a | hpcx_2.7.0_gcc_10.2_slurm20 |
| mpi | n/a | hpcx_2.7.0_intel_2020.2_slurm20 |
| mpi | n/a | mvapich2-2.3.5_gcc_10.2_slurm20 |
| mpi | n/a | mvapich2-2.3.5_intel_2020.2_slurm20 |
| mpi | openmpi_2.0.3_intel | openmpi_2.0.3_intel_2020.2_slurm20 |
| mpi | openmpi_3.1.6 | openmpi_3.1.6_gcc_10.2_slurm20 |
| mpi | openmpi_4.0.5_gcc | openmpi_4.0.5_gcc_10.2_slurm20 |
| mpi | openmpi_4.0.5_icc | openmpi_4.0.5_intel_2020.2_slurm20 |
| n2p2 | 1.0.0 | planned |
| nwchem | 7.0 | 7.0.2_openmpi_4.0.5_intel_2020.2_slurm20 |
| openfoam | 4.1 | 4.1-openmpi_3.1.6_gcc_10.2_slurm20 |
| openfoam | 4.1a | |
| openfoam | 7 | |
| osu-mpi | 5.3.2 | planned |
| osu-mpi | 5.6.2_mvapich2-2.3a_gcc | planned |
| paraview | n/a | 5.8.1_openmpi_4.0.5_intel_2020.2_slurm20 |
| petsc | n/a | 3.14.2_hpcx_2.7.0_intel_2020.2_slurm20 |
| polychord | 2 | planned |
| prophet | augustegm_1.2 | planned |
| qmcpack | n/a | 3.10.0_hpcx_2.7.0_intel_2020.2_slurm20 |
| quantumespresso | 6.6 | 6.6_openmpi_4.0.5_intel_2020.2_slurm20 |
| quantumespresso | 6.4 | 6.4_openmpi_4.0.5_intel_slurm20 |
| quantumespresso | 6.5 | 6.5_openmpi_4.0.5_intel_slurm20 |
| siesta | 3.2 | planned |
| siesta | 4.1 | planned |
| su2 | 7.0.1 | planned |
| su2 | 7.0.2 | planned |
| vasp | 5.4.1 | 5.4.1_mvapich2-2.3.5_intel_2020.2_slurm20 |
| vasp | 5.4.4 | 5.4.4_openmpi_4.0.5_gcc_10.2_slurm20 |
| vasp | 5.4.4_intel | 5.4.4_mvapich2-2.3.5_intel_2020.2_slurm20 |
| vasp | 6.1.1_ompi405_yqi27 | 6.1.1_openmpi_4.0.5_intel_2020.2_yqi27_slurm20 |
| wrf | n/a | 4.2.1_hpcx_2.7.0_intel_2020.2_slurm20 |
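
When updating existing job scripts, replace the old module name with its Slurm-20 counterpart from the table above. A minimal sketch using the mpi/openmpi_4.0.5_gcc row as an example (the old line is shown commented out):

# Old module name (pre-migration)
# module load mpi/openmpi_4.0.5_gcc

# New module name on the Slurm 20.02.6 cluster
module load mpi/openmpi_4.0.5_gcc_10.2_slurm20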

Deprecated MPI Applications

  • global_arrays/5.6.1_openmpi_2.0.3

  • gpaw/1.2.0_hpcx_2.7.0_gcc

  • gpaw/1.2.0_mvapich2-2.3a_gcc

  • gromacs/2016.6

  • gromacs/2018.2

  • gromacs/2018.2_gpu

  • lammps/11Aug17

  • lammps/11Aug17_serial

  • lammps/16Mar18

  • lammps/17Nov16

  • lammps/22Aug18

  • Molpro/2012.1.15

  • Molpro/2015_gcc

  • Molpro/2015_serial

  • Molpro/2018.2_ga

  • Molpro/2019.2

  • Molpro/2019.2_ga

  • Molpro/2020.1

  • Molpro/2020.1_ga

  • mpi/cave_mvapich2_2.3rc2_gcc

  • mpi/cave_mvapich2_2.3b_gcc

  • mpi/cave_mvapich2_2.3b_intel

  • mpi/mvapich2-2.3a_pgi

  • mpi/mvapich2-2.3a_gcc

  • mpi/mvapich2-2.3a_intel

  • mpi/mvapich2-2.3b_gcc

  • mpi/mvapich2-2.3.1_gcc

  • mpi/openmpi_2.1.1_gcc

  • mpi/openmpi_2.0.3_gcc

  • mpi/openmpi_3.1.3_gcc

  • mpi/openmpi_4.0.0_gcc

  • mpi/openmpi_4.0.0_gcc_i8

  • mpi/openmpi_4.0.1_gcc

  • mpi/openmpi_4.0.3_gcc

  • mpi/openmpi_4.0.4_gcc

  • mpi/5.6.1_openmpi_2.0.3

  • nwchem/6.8-openmpi

  • paraview/5.1.0

  • paraview/5.1.0_yurt

  • paraview/5.4.1

  • paraview/5.6.0_no_scalable

  • paraview/5.6.0_yurt

  • paraview/5.8.0

  • paraview/5.8.0_mesa

  • paraview/5.8.0_release

  • petsc/3.7.5

  • petsc/3.7.7

  • petsc/3.8.3

  • qmcpack/3.7.0

  • qmcpack/3.9.1

  • qmcpack/3.9.1_openmpi_3.1.6

  • qmcpack/3.9.2_intel_2020

  • qmcpack/3.9.2_openmpi_4.0.1_gcc

  • qmcpack/3.9.2_openmpi_4.0.4_gcc

  • quantumespresso/6.1

  • quantumespresso/6.3

  • quantumespresso/6.4

  • quantumespresso/6.4.1

  • quantumespresso/6.5

  • quantumespresso/6.6

  • wrf/3.6.1
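
To find job scripts that still reference one of the deprecated modules listed above, a plain text search is usually enough. A minimal sketch, assuming your batch scripts live under a hypothetical ~/jobs directory and using one of the deprecated ParaView builds as the example:

# List scripts that still load a deprecated module
grep -rl "module load paraview/5.8.0" ~/jobs

# Check which modules are loaded in the current session
module list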