Deep MIB - Predict tab

This tab contains parameters used for efficient prediction (inference) of images and generation of semantic segmentation models.


Contents

  • How to start the prediction (inference) process

How to start the prediction (inference) process

Prediction (inference) requires a pretrained network; if you do not have one, you need to train it first.
Pretrained networks can be loaded into DeepMIB and used for prediction of new datasets.

To start the prediction:

 

Settings section

The Settings section is used to specify the main parameters of the prediction (inference) process.

Overlapping vs non-overlapping mode

Networks that use same padding may produce vertical and horizontal artefacts in the non-overlapping mode; these artefacts are eliminated when the overlapping mode is used.


Dynamic masking settings and preview

  • Press the button to specify parameters for dynamic masking:
    • Masking method ▼, specifies whether to keep blocks with average intensity below or above the value specified in the Intensity threshold value... editbox
    • Intensity threshold value... specifies the threshold value; patches with average intensity above or below this value will be predicted
    • Inclusion threshold (0-1)... specifies the fraction of pixels that should be above or below the threshold value to keep the tile for prediction (see the sketch below)
  • The Eye button, press it to preview the effect of the specified dynamic masking settings on the portion of the image that is currently shown in the Image View panel of MIB
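
The logic behind these settings can be illustrated with a short MATLAB sketch. It is illustrative only and not the internal DeepMIB implementation; the file name, the threshold values and the "keep below" masking method are assumptions:

    % illustrative sketch of the dynamic masking decision for a single tile;
    % the values and the "keep below" masking method are assumptions
    patch = imread('tile.tif');                 % one grayscale tile of the image
    intensityThreshold = 70;                    % "Intensity threshold value..."
    inclusionThreshold = 0.1;                   % "Inclusion threshold (0-1)..."

    % fraction of pixels below the intensity threshold in this tile
    fractionBelow = nnz(patch < intensityThreshold) / numel(patch);

    if fractionBelow >= inclusionThreshold
        disp('tile is kept and sent for prediction');
    else
        disp('tile is skipped (masked out)');
    end
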
Example of patch-wise segmentation with dynamic masking

Snapshot showing result of 2D patch-wise segmentation of nuclei.

  • Green patches indicate predicted locations of nuclei
  • Red patches indicate predicted locations of background
  • Uncolored areas indicate patches that were skipped due to dynamic masking



List of available image formats for the model files

  • MIB Model format ▼, the standard format for models in MIB. The model files have the *.model extension and can be read directly into MATLAB using the model = load('filename.model', '-mat'); command
  • TIF compressed format ▼, a standard LZW-compressed TIF file, where each pixel encodes the predicted class as 1, 2, 3, etc.
  • TIF uncompressed format ▼, a standard uncompressed TIF file, where each pixel encodes the predicted class as 1, 2, 3, etc.
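
Either format can be inspected quickly outside of MIB. A minimal MATLAB sketch follows; the file names 'prediction.model' and 'prediction.tif' are hypothetical, and the variables stored inside a *.model file depend on how the model was saved:

    % load a model saved in the MIB Model format (*.model is a MAT-file)
    res = load('prediction.model', '-mat');  % structure with the stored variables
    disp(fieldnames(res));                   % inspect which variables the file contains

    % load a model saved as TIF; pixel values encode the class indices
    labels = imread('prediction.tif');
    classIndices = unique(labels(:))         % predicted class indices, e.g. 1, 2, 3
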

List of available image formats for the score files

  • Do not generate ▼, skip generation of score files, improving performance and minimizing disk usage
  • Use AM format ▼, AmiraMesh format, compatible with MIB, Fiji or Amira
  • Use Matlab non-compressed format ▼, the score files are generated in the MATLAB uncompressed format with the *.mibImg extension. The score files can be loaded into MIB or into MATLAB using the model = load('filename.mibImg', '-mat'); command
  • Use Matlab compressed format ▼, the score files are generated in the MATLAB compressed format with the *.mibImg extension. The score files can be loaded into MIB or into MATLAB using the model = load('filename.mibImg', '-mat'); command
  • Use Matlab non-compressed format (range 0-1) ▼, the score files are generated in the MATLAB non-compressed format without scaling, i.e. in the range from 0 to 1. The file extension is *.mibImg and these files cannot be opened in MIB. The score files can be loaded into MATLAB using the model = load('filename.mibImg', '-mat'); command
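
The score files in the MATLAB formats are plain MAT-files. A minimal sketch with the hypothetical file name 'scores.mibImg'; the name of the variable stored in the file is not fixed, so it is looked up with fieldnames:

    % load a score file saved in one of the MATLAB formats
    res = load('scores.mibImg', '-mat');    % structure with the stored score matrix
    fn = fieldnames(res);
    scores = res.(fn{1});                   % [height x width x numberOfClasses]
    % the class with the highest score in each pixel gives the predicted label
    [~, predictedLabels] = max(scores, [], 3);
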

 

Explore activations

The Activations explorer brings the possibility of a detailed evaluation of the network.


Here is the description of the options:
Snapshot with the generated collage image
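
The generated collage presumably corresponds to the feature maps (activations) of a selected layer. A minimal sketch using the standard MATLAB Deep Learning Toolbox activations function; net, im and the layer name are placeholder assumptions, and DeepMIB exposes this functionality through the Activations explorer GUI rather than through code:

    % minimal sketch: compute and display activations of one layer for one image
    % net is a trained network, im is an input patch, the layer name is an assumption
    act = activations(net, im, 'Encoder-Stage-1-Conv-1');
    act = rescale(act);                                    % scale feature maps to [0 1]
    act = reshape(act, size(act,1), size(act,2), 1, []);   % one frame per feature map
    montage(act);                                          % collage of the feature maps
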


 

Preview results section

Details of the Evaluate segmentation operation

  • Press the button to calculate various precision metrics
  • As a result of the evaluation, a table with the confusion matrix is shown. The confusion matrix displays how well the predicted classes match the classes defined in the ground truth labels. The values are scaled from 0 (bad) to 100 (excellent).
    In addition, the calculated class metrics (Accuracy, IoU, MeanBFScore) as well as the global dataset metrics are shown.
  • In addition, it is possible to calculate the occurrence of labels and the Sørensen-Dice similarity coefficient for the generated and ground truth labels. These options are available from a dropdown located in the bottom-right corner of the Evaluation results window.
  • The evaluation results can be exported to MATLAB or saved in MATLAB, Excel or CSV format to the 3_Results\PredictionImages\ResultsModels directory; see more in the Directories and Preprocessing section.
    For details of the metrics, refer to the MATLAB documentation for the evaluateSemanticSegmentation function (see also the sketch after this list).
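
For reference, the same kind of evaluation can be reproduced directly in MATLAB. A minimal sketch follows; the directory names, class names and pixel values are assumptions chosen for illustration:

    % minimal sketch: evaluate predicted labels against ground truth labels
    % directory names, class names and pixel IDs below are assumptions
    classNames = ["Background", "Nuclei"];
    pixelIds   = [1, 2];                    % pixel values encoding each class

    pxdsTruth   = pixelLabelDatastore('GroundTruthLabels', classNames, pixelIds);
    pxdsResults = pixelLabelDatastore('PredictionImages',  classNames, pixelIds);

    metrics = evaluateSemanticSegmentation(pxdsResults, pxdsTruth);
    disp(metrics.ConfusionMatrix);          % confusion matrix, as shown in MIB
    disp(metrics.ClassMetrics);             % Accuracy, IoU, MeanBFScore per class
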
