
Setup/Installation Instructions

Prerequisites

Alternatively, you can run GaNDLF via Docker, which has different prerequisites. See the Docker Installation section below for more information.

Optional Requirements

  • GPU compute (usually needed for faster training):
    • Install the appropriate drivers for your GPU vendor (NVIDIA or AMD); a quick check that the drivers are visible is shown after this list.
    • Compute toolkit appropriate for your hardware:
      • NVIDIA: CUDA and a compatible cuDNN installed system-wide
      • AMD: ROCm
  • Windows: Microsoft Visual C++ 14.0 or greater. This is required for PyTorch to work on Windows. If you are using conda, you can install it using the following command for your virtual environment: conda install -c anaconda m2w64-toolchain.
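
As a quick sanity check that your GPU drivers and compute stack are visible to the system, you can use the vendor tools below. These commands are not part of GaNDLF; they ship with the NVIDIA driver and ROCm installations respectively, so use whichever matches your hardware.

(base) $> nvidia-smi # NVIDIA: should list your GPU along with the driver and CUDA versions
(base) $> rocm-smi   # AMD: should list your GPU under ROCm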

Installation

Install PyTorch

GaNDLF's primary computational foundation is built on PyTorch, and as such it supports all hardware types that PyTorch supports. Please install PyTorch for your hardware type before installing GaNDLF. See the PyTorch installation instructions for more details.

First, instantiate your environment:

(base) $> conda create -n venv_gandlf python=3.9 -y
(base) $> conda activate venv_gandlf
(venv_gandlf) $> ### subsequent commands go here

You may install PyTorch to be compatible with CUDA, ROCm, or CPU-only. An exhaustive list of PyTorch installations for the specific version compatible with GaNDLF can be found here: https://pytorch.org/get-started/previous-versions/#v231. Use one of the following depending on your needs:

  • CUDA 12.1

(venv_gandlf) $> pip install torch==2.3.1 torchvision==0.18.1 torchaudio==2.3.1 --index-url https://download.pytorch.org/whl/cu121
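
If you need a ROCm or CPU-only build instead, the corresponding commands from the same PyTorch previous-versions page look like the following. These are shown as a convenience; double-check the index URL against the linked page, since it depends on the ROCm version you have installed.

  • ROCm 6.0

(venv_gandlf) $> pip install torch==2.3.1 torchvision==0.18.1 torchaudio==2.3.1 --index-url https://download.pytorch.org/whl/rocm6.0

  • CPU-only

(venv_gandlf) $> pip install torch==2.3.1 torchvision==0.18.1 torchaudio==2.3.1 --index-url https://download.pytorch.org/whl/cpu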

Optional Dependencies

The following dependencies are optional, and are only needed to access specific features of GaNDLF.

(venv_gandlf) $> pip install openvino-dev==2023.0.1 # [OPTIONAL] to generate post-training optimized models for inference
(venv_gandlf) $> pip install mlcube_docker # [OPTIONAL] to deploy GaNDLF models as MLCube-compliant Docker containers

Install from Package Managers

This option is recommended for most users, and allows for the quickest way to get started with GaNDLF.

(venv_gandlf) $> pip install gandlf # this will give you the latest stable release

You can also use conda:

(venv_gandlf) $> conda install -c conda-forge gandlf -y

If you are interested in running the latest version of GaNDLF, you can install the nightly build by running the following command:

(venv_gandlf) $> pip install --pre gandlf

You can also use conda:

(venv_gandlf) $> conda install -c conda-forge/label/gandlf_dev -c conda-forge gandlf -y

Install from Sources

Use this option if you want to contribute to GaNDLF, or if you are interested in making other code-level changes for your own use.

(venv_gandlf) $> git clone https://github.com/mlcommons/GaNDLF.git
(venv_gandlf) $> cd GaNDLF
(venv_gandlf) $> pip install -e .

Test your installation:

(venv_gandlf) $> gandlf verify-install
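
If you installed a GPU-enabled PyTorch build, you can additionally confirm that PyTorch detects your GPU from the same environment. This is a generic PyTorch check, not a GaNDLF command:

(venv_gandlf) $> python -c "import torch; print(torch.cuda.is_available())" # should print True on a working NVIDIA or ROCm setup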

Docker Installation

We provide containerized versions of GaNDLF, which allow you to run GaNDLF without worrying about installation steps or dependencies.

Steps to run the Docker version of GaNDLF

  1. Install the Docker Engine for your platform.
  2. GaNDLF is available from GitHub Package Registry. Several platform versions are available, including support for CUDA, ROCm, and CPU-only. Choose the one that best matches your system and drivers. For example, if you want to get the bleeding-edge GaNDLF version, and you have CUDA Toolkit v11.6, run the following command:
(base) $> docker pull ghcr.io/mlcommons/gandlf:latest-cuda116

This will download the GaNDLF image onto your machine. See the usage page for details on how to run GaNDLF in this "dockerized" form.
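
Once the image is pulled, a minimal sketch of invoking it is shown below. This assumes the gandlf CLI is available inside the container; the image's actual entrypoint and the full set of run options (data mounts, output directories, etc.) are described on the usage page.

(base) $> docker run -it --rm --entrypoint gandlf ghcr.io/mlcommons/gandlf:latest-cuda116 verify-install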

Enable GPU usage from Docker (optional, Linux only)

In order for "dockerized" GaNDLF to use your GPU, several steps are needed:

  1. Ensure that you have the correct NVIDIA drivers installed for your GPU.
  2. Then, on Linux, follow the instructions to set up the NVIDIA Container Toolkit (a minimal run example follows this list).
  3. For AMD GPUs, this can be replicated with ROCm by following the instructions to set up the ROCm Container Toolkit.
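
With the NVIDIA Container Toolkit in place, GPU access is typically requested via Docker's --gpus flag. A minimal sketch follows; the image tag is the one pulled above, and you would append your usual GaNDLF arguments after the image name.

(base) $> docker run -it --rm --gpus all ghcr.io/mlcommons/gandlf:latest-cuda116 # add your GaNDLF arguments after the image name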

On Windows

On Windows, GPU and CUDA support requires either Windows 11 or, on Windows 10, registration in the Windows Insider program. If you meet those requirements and have current NVIDIA drivers, GPU support for Docker should work automatically. Otherwise, please try updating your Docker Desktop version.

Note: We cannot provide support for the Windows Insider program or for Docker Desktop itself.

Building your own GaNDLF Docker Image

You may also build a Docker image of GaNDLF from the source repository. Just specify the Dockerfile for your preferred GPU-compute platform (or CPU):

(base) $> git clone https://github.com/mlcommons/GaNDLF.git
(base) $> cd GaNDLF
(base) $> docker build -t gandlf:${mytagname} -f Dockerfile-${target_platform} . # change ${mytagname} and ${target_platform} as needed
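
The available target platforms correspond to the Dockerfile-* files in the repository root; the exact names may change between releases, so list them before building. For example (the CPU Dockerfile name below is illustrative; use whatever the listing shows):

(base) $> ls Dockerfile-* # list the available target platforms
(base) $> docker build -t gandlf:local-cpu -f Dockerfile-CPU . # example: build a CPU-only image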