Wednesday, 7 June
Research Computing at Brown
Today's tutorials will occur along three tracks running concurrently. Each tab below corresponds to one of these tracks. The tutorials associated with each track are listed on the relevant tab.
A primer on submitting jobs to the job scheduler on Oscar. Some basic familiarity with Unix/Linux systems is assumed. Topics covered include: an overview of the use of Slurm for resource allocation, submitting jobs to Slurm, and using Bash scripts to configure and submit jobs to Slurm.
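To give a flavor of what the workshop covers, here is a minimal sketch of a Bash batch script for Slurm. The job name, resource values, and program name are illustrative placeholders, not Oscar-specific defaults:

```shell
#!/bin/bash
# Minimal Slurm batch script (submit with: sbatch submit.sh)
#SBATCH --job-name=my-job        # name shown in the queue
#SBATCH --time=01:00:00          # wall-clock limit (HH:MM:SS)
#SBATCH --ntasks=1               # number of tasks
#SBATCH --cpus-per-task=4       # cores allocated to the task
#SBATCH --mem=8G                 # memory for the job

# Commands below run on the allocated compute node
echo "Running on $(hostname)"
./my_program                     # placeholder for your actual program
```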
This workshop is for people who are already familiar with Slurm, but would like to use Slurm's more powerful features. Topics covered include: dependencies for conditional execution of jobs, job arrays for parameter sweeps, dealing with hundreds or thousands of small tasks, how to limit the number of jobs running at once, and how to cancel multiple jobs.
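A few of these features can be sketched from the command line. The script names below are placeholders:

```shell
# Dependencies: submit a job, then a second job that runs only if the first succeeds
jobid=$(sbatch --parsable preprocess.sh)          # --parsable prints just the job ID
sbatch --dependency=afterok:"$jobid" analyze.sh   # starts only after $jobid exits successfully

# Job array of 100 tasks for a parameter sweep, with at most 10 running
# at once (the %10 throttle); each task reads its index from $SLURM_ARRAY_TASK_ID
sbatch --array=0-99%10 sweep.sh

# Cancel all of your queued and running jobs at once
scancel --user="$USER"
```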
This workshop will introduce users to checkpointing in HPC workloads. Checkpointing allows users to periodically save the state of a serial or distributed computation to disk, so that a job can be restarted from a checkpoint file after a node or job failure. This workshop will include a hands-on demonstration of using DMTCP to checkpoint batch jobs, job arrays, multithreaded programs, and MPI applications.
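The basic DMTCP pattern looks roughly like the sketch below; the checkpoint interval and program name are illustrative:

```shell
# Launch a program under DMTCP, writing a checkpoint every 3600 seconds
dmtcp_launch --interval 3600 ./my_program

# After a node or job failure, resume from the restart script
# that DMTCP generates alongside the checkpoint files
./dmtcp_restart_script.sh
```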
A general introduction to GPU architectures available on Oscar, using NGC container images to leverage RT cores on higher-end GPUs, and optimizing GPU jobs for better filesystem I/O.
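As one possible taste of the container workflow, NGC images can be pulled and run with a container runtime such as Apptainer/Singularity. The image tag below is illustrative, and the exact command name on Oscar may differ:

```shell
# Pull an NGC container image (tag is illustrative)
apptainer pull tensorflow.sif docker://nvcr.io/nvidia/tensorflow:24.03-tf2-py3

# Run it with GPU support; --nv binds the host's NVIDIA driver into the container
apptainer exec --nv tensorflow.sif python -c "print('hello from the container')"
```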
An introduction to the Unity platform with specific aims for the development of 3D Virtual Reality applications. This workshop tours an example app provided by Unity and dives into the tools and structure required for an app deployed in Virtual Reality.
Get to know this new service that uses Oscar to provide you with your own 3D render server to visualize datasets that will not fit on your desktop or laptop. We will use ParaView as a tool to interact with and analyze your data. No previous experience with Oscar or ParaView is needed.
An overview of methods for moving files across the network. Topics covered include: Linux command line tools for file transfer (scp, rsync, sftp), GUI-based file transfer applications, mounting remote filesystems using CIFS, and using Globus.
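For reference, the command-line tools look roughly like this; the filenames, username, and hostname are placeholders:

```shell
# Copy a single file to a remote system
scp results.csv username@ssh.ccv.brown.edu:~/data/

# Synchronize a directory; -a preserves attributes, -v is verbose, -z compresses
rsync -avz ./project/ username@ssh.ccv.brown.edu:~/project/

# Start an interactive file-transfer session
sftp username@ssh.ccv.brown.edu
```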
In this hands-on workshop we will cover source control: what it is, why every coder needs it, and how to use the popular source control tool git for solo and collaborative coding.
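A minimal solo workflow with git looks like the sketch below; the file name, commit message, and remote URL are placeholders:

```shell
git init                        # create a new repository in the current directory
git add analysis.py             # stage a file
git commit -m "Add analysis"    # record the staged changes
git log --oneline               # view the history

# Collaborative workflow (remote URL is a placeholder)
git clone https://github.com/user/repo.git
git pull                        # fetch and merge upstream changes
git push                        # publish local commits
```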
Foundations for Bioinformatics Data Analyses | 10:30 - 12:00 EDT
In this session we will discuss the most common file formats used in bioinformatics, some resources for accessing public datasets (like the SRA), and a few helpful bioinformatics quality-control (QC) tools.
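As one example of accessing a public dataset, the SRA Toolkit can fetch and convert a run; the accession number and tool choices below are illustrative:

```shell
# Download a public dataset from the SRA (accession is illustrative)
prefetch SRR000001              # fetch the .sra archive
fasterq-dump SRR000001          # convert it to FASTQ files

# Quick quality check with FastQC, a common QC tool
fastqc SRR000001_1.fastq SRR000001_2.fastq
```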
Basic Bioinformatics Workflows & Workflow Management on Oscar including Nextflow/Snakemake | 1:30 - 3:00 EDT
This session will address how you can use Nextflow and Snakemake workflow tools to run publicly available bioinformatic analysis pipelines on Oscar. We will also briefly offer tips and suggestions on specific configurations for modifying these workflows.
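As a rough sketch of what running such a workflow looks like, assuming the pipeline name, profile, and resource placeholders below rather than any Oscar-specific configuration:

```shell
# Run a community nf-core pipeline with Nextflow (pipeline and profile are illustrative)
nextflow run nf-core/rnaseq -profile singularity --input samplesheet.csv --outdir results

# Or run a Snakemake workflow, submitting each rule as a Slurm job
snakemake --jobs 50 --cluster "sbatch --time={resources.time} --mem={resources.mem}"
```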
This session will briefly cover different approaches to using R on Oscar.