System Requirements: Segment-anything model
General information
Segment-anything model (SAM, SAM2) is developed by Meta AI Research (FAIR). It can be used to segment individual objects or the whole image with a single or a few mouse clicks.
SAM2 is the updated version of the model, offering faster and better segmentation results.

Details of the research are available here:
https://ai.meta.com/research/publications/sam-2-segment-anything-in-images-and-videos/

SAM is implemented in MIB via an external Python interpreter; please follow the steps below to configure a local Python environment to work with MIB.
Important!
Even though SAM can run on a CPU, a GPU is highly recommended as it is 30-60 times faster.

Reference
  • Nikhila Ravi, Valentin Gabeur, Yuan-Ting Hu, Ronghang Hu, Chaitanya Ryali, Tengyu Ma, Haitham Khedr, Roman Rädle, Chloe Rolland, Laura Gustafson, Eric Mintun, Junting Pan, Kalyan Vasudev Alwala, Nicolas Carion, Chao-Yuan Wu, Ross Girshick, Piotr Dollar, Christoph Feichtenhofer
    SAM 2: Segment Anything in Images and Videos
    arXiv:2408.00714, https://arxiv.org/abs/2408.00714

Requirements
See below for step-by-step installation instructions.

  • MATLAB R2022b or newer (tested on R2024a);
    see the list of Python versions compatible with various MATLAB releases
  • Python 3.10 or newer (tested on Python 3.11, Miniconda 24.5.0)
  • Python environment with torch>=2.3.1 and torchvision>=0.18.1
  • CUDA-compatible GPU is highly recommended; a CPU can also be used but it is significantly slower
If you want to install SAM-2 for use without MIB or MATLAB, start by installing the following software packages (installation instructions are at the bottom of this page) and then continue with the Python installation:
  • Microsoft Visual Studio 2022 (tested on 17.10.5 Community) (see below)
  • NVidia CUDA toolkit (tested on 12.1.0.531.14) (see below)

Installation on Linux
Installation on Linux has not yet been tested, but following the general logic of installing SAM-2 for Windows should also work on Linux.
Instructions for installation of the previous version, SAM-1, are here:
https://mib.helsinki.fi/downloads_systemreq_sam_linux.html

------------- Installation instructions -------------

Installation and usage:
  • YouTube tutorial (SAM1 version)

CUDA toolkit version
  • For the MATLAB version of MIB
    • In the MATLAB command prompt type:
      >> gpuDevice
    • Check the output of the command to find out the CUDA Toolkit version
      [screenshot: MATLAB, CUDA Toolkit version]
  • For the standalone version of MIB
    • Start MIB
    • Start DeepMIB (Menu->Tools->Deep Learning Segmentation)
    • Press the "?" button in the Network panel
    • Check the output of the command to find out the CUDA Toolkit version
      [screenshot: MIB, CUDA Toolkit version]


Python
  • Install Miniconda
    (tested on Python 3.11, Miniconda version 24.5.0, Miniconda3-py311_24.5.0-0-Windows-x86_64.exe)
    Archive of Miniconda releases: https://repo.anaconda.com/miniconda/
    The commands below assume that Miniconda was installed to
    D:\Python\Miniconda311\

    Adjust the path to the actual location of Miniconda on your system.
    If you have an admin account, you can install Python for all users; otherwise it is possible to install Python only for the current user.

  • With admin rights
    Example of the installation directory: D:\Python\Miniconda311\
    [screenshots: installation for all users; Miniconda configuration]

    Optional note!
    If the installation was done for All Users, change the permissions of Miniconda's envs directory (e.g. d:\Python\Miniconda311\envs\) or the whole Miniconda311 directory to be accessible to all users.
    This keeps things a bit more organized; otherwise the Python environment will be created in C:\Users\[USERNAME]\.conda\envs\
    [screenshots: installation for all users; set permissions to the Python directory]

  • Without admin rights
    Example of the installation directory: C:\Users\[USERNAME]\AppData\Local\miniconda311\
    [screenshots: installation for the current user; Miniconda installation directory; Miniconda configuration]
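
To confirm that the Miniconda installation works, you can run a short Python check from the newly opened "Anaconda Prompt" (a minimal sketch; check_python.py is a hypothetical name, run it with python check_python.py):

    # check_python.py - verify which Python interpreter is active
    import sys

    print(sys.version)      # should report the Python version bundled with Miniconda (e.g. 3.11.x)
    print(sys.executable)   # should point inside the Miniconda installation directory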


Segment-anything model 2
If you have Git, segment-anything-2 can also be downloaded using the following git command:
  • Start "Anaconda Prompt"
    Start->Miniconda3->Anaconda prompt (Miniconda311)
  • Change directory to the location where you want to download segment-anything-2, e.g. D:\Python\Examples\
  • Type:
    git clone https://github.com/Ajaxels/segment-anything-2.git
    or
    git clone https://github.com/facebookresearch/segment-anything-2.git
    to download SAM-2;
    the final destination will be D:\Python\Examples\segment-anything-2\
    [screenshot: usage of Git to download SAM-2]


Required python packages
  • Create a new environment for Python with SAM
    • Start "Anaconda Prompt"
      Start->Miniconda3->Anaconda prompt (Miniconda311)
      If you have a local-admin account, use it to start "Anaconda Prompt" and add the new environment;
      otherwise, the installed environment may not be accessible to other users.
    • Create a new environment, specifying the location for the environment and the version of Python, e.g.:
      >> conda create --prefix d:\Python\Miniconda311\envs\sam4mib python=3.11

  • Activate the environment:
    >> activate sam4mib
    For installation of SAM-2 without MIB or MATLAB, follow the sections at the bottom of this page.

  • Install PyTorch and torchvision
    (torch>=2.3.1, torchvision>=0.18.1)
    following the instructions here to install both the PyTorch and TorchVision dependencies:
    https://pytorch.org/get-started/locally

    • Using the available options, configure the command to install the packages (use the CUDA version that matches the one detected in the CUDA toolkit version section)
      [screenshot: configure PyTorch]
    • In the command window, type the generated command to install PyTorch; the tested command:
      >> pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121
    The CUDA version used by PyTorch may be lower than the installed CUDA toolkit version (a verification sketch is shown after the next step).
  • Install additional packages:
    • Install hydra-core:
      >> pip3 install hydra-core
    • Install iopath:
      >> pip3 install iopath
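
To verify that PyTorch and torchvision were installed with GPU support, you can run a short Python check inside the activated sam4mib environment (a minimal sketch; the version numbers reflect the requirements listed above):

    # check_pytorch.py - verify PyTorch/torchvision versions and CUDA availability
    import torch
    import torchvision

    print("torch:", torch.__version__)              # expected >= 2.3.1
    print("torchvision:", torchvision.__version__)  # expected >= 0.18.1
    print("CUDA available:", torch.cuda.is_available())
    print("CUDA build:", torch.version.cuda)        # e.g. 12.1 for the cu121 wheels
    if torch.cuda.is_available():
        print("GPU:", torch.cuda.get_device_name(0))

If "CUDA available" prints False, SAM will fall back to the much slower CPU execution.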


MIB configuration
  • Start MIB
  • Open MIB preferences:
    Menu->File->Preferences
  • Define the path to python.exe installed in the specified environment (sam4mib):
    External directories->Python installation path
    For example:
    - D:\Python\Miniconda311\envs\sam4mib\python.exe
    - C:\Users\[USERNAME]\.conda\envs\sam4mib\python.exe
  • Define the directory to store network architectures for DeepMIB;
    this location will be used to download checkpoints
    [screenshot: MIB preferences]

  • Select "Segment-anything model" tool in the Segmentation panel:
    SAM tool
  • Check the "SAM2" checkbox to enable the SAM2 version and press the Settings button:
    SAM tool
  • Open SAM settings:
    SAM Settings
    SAM Settings
  • Select the backbone
    speed performance is about the same, but the largest model (sam2_hiera_l) gives the best results:
    - sam2_hiera_t (0.15Gb)
    - sam2_hiera_s (0.18Gb)
    - sam2_hiera_b_plus (0.32Gb)
    - sam2_hiera_l (0.90Gb) - recommended for best predictions

  • Define location where segment-anything package was unzipped:
    if you check Check to select path to segment-anything a directory selection dialog will be shown
  • Set correct execution environment, please note that CPU is 30-60 times slower than CUDA
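
Conceptually, the execution environment setting corresponds to the standard PyTorch device selection (an illustration only, not MIB's actual code):

    # Illustration of the CUDA-vs-CPU choice behind the execution environment setting
    import torch

    device = "cuda" if torch.cuda.is_available() else "cpu"
    print("SAM predictions will run on:", device)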


How to use
Documentation on segment-anything is available:
  • YouTube tutorial (SAM1 version)
  • Documentation page is available in MIB Help:
    User guide->
    ...GUI Overview, Panels->
    ......Segmentation panel->
    ..........Segmentation tools->Segment-anything model


JSON file for custom networks
All trained SAM networks are automatically installed upon the user's demand. Links to these networks are specified in the sam2_links.json file located in the Resources subfolder of the MIB installation directory.
This JSON file can be used to specify locations of custom networks; those networks will be automatically linked and become available in the backbone section of the SAM configuration window.

An example of a section within the JSON file that specifies the location of the "Hiera tiny model"; use it as a template if you need to add a custom-trained network:
{
  "info": "Hiera tiny model",
  "name": "sam2_hiera_t (0.15Gb)",
  "backbone": "sam2_hiera_t",
  "checkpointFilename": "sam2_hiera_tiny.pt",
  "onnxFilename": "",
  "checkpointLink_url_1": "https://dl.fbaipublicfiles.com/segment_anything_2/072824/sam2_hiera_tiny.pt",
  "checkpointLink_url_2": "http://mib.helsinki.fi/web-update/sam/sam2_hiera_tiny.pt",
  "modelCfgLink_url_1": "https://raw.githubusercontent.com/facebookresearch/segment-anything-2/main/sam2_configs/sam2_hiera_t.yaml",
  "modelCfgLink_url_2": "http://mib.helsinki.fi/web-update/sam/sam2_hiera_t.yaml",
  "onnxLink": "http://mib.helsinki.fi/web-update/sam/vit_h.zip"
}
Description of fields:
  • info: information about the SAM trained network
  • name: the name as it will be shown in the SAM configuration
  • backbone: backbone name
  • checkpointFilename: filename of the network file
  • onnxFilename: [only for SAM1] onnx filename used with SAM version 1; not used for SAM2
  • checkpointLink_url_1: a URL from which the PyTorch file will be downloaded
  • checkpointLink_url_2: an alternative URL, in case the first one does not work
  • modelCfgLink_url_1: [only for SAM2] a URL to the config file, typically located under the sam2_configs directory of SAM2
  • modelCfgLink_url_2: [only for SAM2] an alternative link to the config file
  • onnxLink: [only for SAM1] a URL to a generated onnx file
It is also possible to keep this file in any other location; in that case, use the segment-anything settings in MIB to provide the new location.
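
As a quick sanity check of a customized file, the JSON can be loaded and its entries listed with a short Python script (a minimal sketch; it assumes the file is a JSON array of entries such as the example above, and the path is hypothetical and must be adjusted to your MIB installation):

    # list_sam2_links.py - list the networks defined in sam2_links.json
    import json
    from pathlib import Path

    links_file = Path(r"C:\MIB\Resources\sam2_links.json")  # hypothetical path, adjust as needed
    entries = json.loads(links_file.read_text())
    for entry in entries:
        print(entry["name"], "->", entry["checkpointFilename"])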

Addition of SAM-1
If needed, SAM-1 can be added to the SAM-2 installation; in this case, these few steps should be done:
  • Activate the environment (if it is not yet activated):
    >> activate sam4mib
  • Install opencv, onnxruntime and onnx:
    >> pip3 install opencv-python onnxruntime onnx
  • Install pycocotools:
    >> pip3 install pycocotools
    if there is an error, see below
  • Install "onnxruntime-gpu" to run predictions on the GPU:
    >> pip3 install onnxruntime-gpu==1.19.2
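
To confirm that onnxruntime-gpu can actually use the GPU, you can list the available execution providers from the sam4mib environment (a minimal sketch):

    # check_onnxruntime.py - verify GPU support in onnxruntime
    import onnxruntime

    print("onnxruntime:", onnxruntime.__version__)
    # 'CUDAExecutionProvider' should appear in the list for GPU predictions
    print(onnxruntime.get_available_providers())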


Remove sam4mib environment
If you no longer need the sam4mib environment, follow these steps to uninstall it from your system.
  • Start "Anaconda Prompt"
    Start->Miniconda3->Anaconda prompt (Miniconda311)
  • List the environments installed on your system by running the command:
    >> conda env list
  • Remove sam4mib environment:
    >> conda remove --name sam4mib --all


------------- Optional instructions for SAM-2 without MIB -------------

These steps are only needed if you plan to use SAM-2 without MIB or MATLAB.

Microsoft Visual Studio C++ 2022
Start the procedure by installing Microsoft Visual Studio 2022 Community Edition with the C++ compiler.
  • Download Microsoft Visual Studio 2022 Community Edition:
    https://visualstudio.microsoft.com/downloads
    [screenshot: download Microsoft Visual Studio 2022]
  • Start the installation of Microsoft Visual Studio
  • Select Python and Desktop C++ in the Workloads
    [screenshot: select Python and Desktop C++ in the Workloads]
  • Update the destination location to wherever you want to install it
    [screenshot: select destination]

NVidia CUDA toolkit
The CUDA toolkit version should match the CUDA version of your PyTorch installation; this is typically CUDA 12.1 if you follow the default installation command. A quick Python cross-check of the configuration is shown after the steps below.
  • Navigate to the CUDA Toolkit Archive page and download the suitable package
    (e.g. CUDA Toolkit 12.1.0 (February 2023))
  • Start the installation and select the suitable configuration of options
    [screenshot: select suitable CUDA configuration]
  • Define the CUDA_HOME Windows environment variable:
    • In the Windows start menu type: "Edit the system environment variables"
      [screenshot: edit the system environment variables]
    • Edit the system environment variables
      [screenshot: edit the system environment variables]
    • Add a new system environment variable
      [screenshot: add a new system environment variable]
    • Specify the location where the CUDA Toolkit was installed
      [screenshot: specify location where CUDA Toolkit was installed]
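
The configuration can be cross-checked from Python as well (a minimal sketch; it only inspects the environment and does not modify anything):

    # check_cuda_toolkit.py - verify CUDA_HOME and the installed nvcc compiler
    import os
    import shutil
    import subprocess

    print("CUDA_HOME:", os.environ.get("CUDA_HOME", "<not set>"))
    nvcc = shutil.which("nvcc")
    if nvcc:
        # the output reports the toolkit version, e.g. "release 12.1"
        print(subprocess.run([nvcc, "--version"], capture_output=True, text=True).stdout)
    else:
        print("nvcc not found on PATH; the CUDA toolkit may not be installed")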

Additional steps in Python
These steps are performed after the "Activate the environment" step in the Required python packages section.

  • Do a system check
    • Check that the proper version of CUDA is installed and visible, type:
      >> nvcc --version

    • Check that the CUDA_HOME variable is available, type:
      >> echo %CUDA_HOME%

  • Install setuptools:
    >> conda install conda-forge::setuptools

  • Install PyTorch and torchvision, as described above
  • Compile required packages:
    • Change directory to the location of segment-anything-2:
      >> cd D:\Python\Examples\segment-anything-2\
    • Start compiling the CUDA extensions:
      >> pip install --no-build-isolation -e .
      It should finish with a successful installation of the packages
      [screenshot: CUDA compiled]
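
A short import test can confirm that the compiled package is usable (a minimal sketch; the module layout follows the facebookresearch/segment-anything-2 repository):

    # check_sam2.py - smoke test of the compiled SAM-2 installation
    import torch
    from sam2.build_sam import build_sam2
    from sam2.sam2_image_predictor import SAM2ImagePredictor

    print("sam2 imported successfully; CUDA available:", torch.cuda.is_available())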

