Installing TensorFlow

Setting up a GPU-accelerated environment can be challenging due to driver dependencies, version conflicts, and other complexities. Apptainer simplifies this process by encapsulating all of these details inside a single container image.

Using NGC Containers with Apptainer (Our #1 Recommendation)

There are multiple ways to install and run TensorFlow. Our recommended approach is via NGC containers, which are available from the NGC Registry. In this example we will pull the TensorFlow NGC container.

  1. Build the container:

apptainer build tensorflow-24.03-tf2-py3.simg docker://nvcr.io/nvidia/tensorflow:24.03-tf2-py3

This will take some time, and once it completes you should see a .simg file.


For your convenience, pre-built container images are located in the directory:

/oscar/runtime/software/external/ngc-containers/tensorflow.d/x86_64/

You can choose either to build your own or use one of the pre-downloaded images.
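For example, to see which pre-built TensorFlow images are available on Oscar:

```shell
# List the pre-downloaded TensorFlow container images
ls /oscar/runtime/software/external/ngc-containers/tensorflow.d/x86_64/
```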

  2. Once the container is ready, request an interactive session with a GPU:

interact -q gpu -g 1 -f ampere -m 20g -n 4
  3. Run the container with GPU support:

export APPTAINER_BINDPATH="/oscar/home/$USER,/oscar/scratch/$USER,/oscar/data"
# Run a container with GPU support
apptainer run --nv tensorflow-24.03-tf2-py3.simg
  4. Or, if you're executing a specific command inside the container:
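For example, with apptainer exec (the script name train.py is a placeholder for your own code):

```shell
# Run a single command inside the container instead of an interactive shell.
# --nv enables NVIDIA GPU support; train.py is a placeholder script name.
apptainer exec --nv tensorflow-24.03-tf2-py3.simg python train.py
```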

  5. Make sure your TensorFlow image is able to detect GPUs:
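One quick check is to ask TensorFlow inside the container to list its visible GPUs; on a working setup the list is non-empty:

```shell
# Should print a non-empty list, e.g. [PhysicalDevice(name='/physical_device:GPU:0', ...)]
apptainer exec --nv tensorflow-24.03-tf2-py3.simg \
    python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
```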

  6. If you need to install additional custom packages: the container itself is non-writable, but you can use pip's --user flag to install packages into .local. Example:
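A minimal example (scikit-learn here is just an illustrative package name; substitute the package you need):

```shell
# The container image is read-only, but --user installs into ~/.local
# in your bind-mounted home directory, so the package persists across runs.
apptainer exec tensorflow-24.03-tf2-py3.simg pip install --user scikit-learn
```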

Slurm Script:

You can also run your container from a SLURM job script by launching it with the srun command. Here is a basic example:
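A minimal sketch of such a script, using the same image and bind paths as above. The job name, time limit, and train.py script are placeholders to adapt, and the partition/GPU options follow common Slurm conventions; check your cluster's settings:

```shell
#!/bin/bash
#SBATCH -J tensorflow-job           # placeholder job name
#SBATCH -p gpu                      # GPU partition (cluster-specific)
#SBATCH --gres=gpu:1                # request one GPU
#SBATCH -n 4                        # 4 cores, matching the interact example
#SBATCH --mem=20g
#SBATCH -t 01:00:00                 # placeholder time limit
#SBATCH -o tensorflow-%j.out

# Make home, scratch, and data directories visible inside the container
export APPTAINER_BINDPATH="/oscar/home/$USER,/oscar/scratch/$USER,/oscar/data"

# Run the training script inside the container with GPU support
srun apptainer exec --nv tensorflow-24.03-tf2-py3.simg python train.py
```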
