In order to use the package you need to have a properly prepared environment. It includes, among others:
* Tensorflow
* Pytorch
and many more, which are listed in the 'requirements.txt' files (each architecture has its own list of necessary dependencies).
You can install all dependencies by:
```bash
conda install --file requirements.txt
```
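If you prefer to keep the dependencies isolated, a dedicated conda environment can be created first. This is only a sketch: the environment name and Python version are placeholders, and each architecture ships its own requirements.txt, as noted above.
```bash
# create and activate a fresh environment before installing the dependencies
conda create -n mlsp2019 python=3.6    # name and Python version are assumptions
conda activate mlsp2019
conda install --file requirements.txt  # repeat per architecture's requirements.txt
```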
### Download Package
To download the package you can simply clone this GitLab repository by using the following command:
```bash
$ git clone https://lab.compute.dtu.dk/lutobi/mlsp2019_software_package/
```
All the contents of the repository can also be downloaded from the GitLab site by using the "Download ZIP" button.
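After downloading, move into the repository and have a look at the available branches before continuing. The branch that reproduces the paper is not named here, so this is only a generic git sketch; the directory name follows the repository URL.
```bash
cd mlsp2019_software_package
git branch -a   # list local and remote branches before picking the right one
```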
### Run everything
In order to run the end-to-end experiment that generates all tables and figures from the paper, including the following activities:
* Mask R-CNN part:
    * training 5 separate models
    * generating 5 predictions for each sample, based on 5 different models
    * carrying out post processing of the samples and calculating final scores
    * generating a table of scores
* SA-FCN part:
    * training 5 separate models
    * generating 5 predictions for each sample, based on 5 different models
    * carrying out post processing of the samples, saving the post-processed samples and calculating final scores
    * generating a table of scores
* generating Figure 4, Figure 5 and Figure 6 from the paper
* generating Table 1 and Table 2 from the paper
please follow these steps:
1. Log in to DTU Compute cluster via ThinLinc.
2. Open gterm terminal.
3. Log in to one of the available GPU machines, for instance:
```
ssh titan11
```
4. Activate your environment, for instance:
```
conda activate lutobi
```
5. Check which GPUs are available:
```
gpustat
```
6. Select the available GPUs, for instance:
```
export CUDA_VISIBLE_DEVICES="0,1"
```
7. Go to the directory of the downloaded repo, for instance:
```
cd /dtu-compute/s162377/mlsp2019_software_package/
```
8. Open the file 'mask_rcnn/run_maskrcnn.sh' and check if the dataset path is properly defined.
9. Check if you are on the correct branch of the repo (optional sanity checks for steps 6, 8 and 9 are sketched after this list).
10. Run the bash script by calling:
```
./run_all.sh
```
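Before launching the script in step 10, a few optional sanity checks can save a wasted run. This is only a sketch: it assumes PyTorch is installed in the active environment, and the grep pattern is a guess at how the dataset path might be spelled inside the run script.
```bash
# confirm that only the GPUs selected in step 6 are visible to PyTorch
python -c "import torch; print(torch.cuda.device_count())"   # expect 2 for "0,1"

# confirm the current branch (step 9)
git branch --show-current

# locate the dataset path definition inside the run script (step 8);
# adjust the search pattern to the actual variable name used there
grep -in "data" mask_rcnn/run_maskrcnn.sh
```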
### Run Mask R-CNN
In order to run the end-to-end experiment for Mask R-CNN, which consists of:
* training 5 separate models
* generating 5 predictions for each sample, based on 5 different models
* carrying out post processing of the samples and calculating final scores
* generating a table of scores
please follow these steps:
1. Log in to DTU Compute cluster via ThinLinc.
2. Open gterm terminal.
3. Log in to one of the available GPU machines, for instance `ssh titan11`.
4. Activate your environment.
5. Check which GPUs are available with `gpustat`.
6. Select the available GPUs, for instance:
```
export CUDA_VISIBLE_DEVICES="0,1"
```
7. Go to the directory of the downloaded repo, for instance:
```
cd /dtu-compute/s162377/mlsp2019_software_package/mask_rcnn_git/
```
8. Open the file run_maskrcnn.sh and check if the dataset path is properly defined.
9. Check if you are on the correct branch of the repo.
10. Run the bash script by calling:
```
./run_maskrcnn.sh
```
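Training five separate models end to end can take a long time. If you want the run to survive a closed terminal or a dropped ThinLinc session, a standard nohup wrapper along these lines can be used; the log file name is only an example.
```bash
# keep the run alive after the terminal closes and capture the console output
nohup ./run_maskrcnn.sh > run_maskrcnn.log 2>&1 &
tail -f run_maskrcnn.log   # follow progress; Ctrl-C stops tail, not the training
```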
### Run SA-FCN
In order to run the end-to-end experiment for SA-FCN, which consists of:
* training 5 separate models
* generating 5 predictions for each sample, based on 5 different models
* carrying out post processing of the samples, saving the post-processed samples and calculating final scores
* generating a table of scores
please follow these steps:
1. Log in to DTU Compute cluster via ThinLinc.
2. Open gterm terminal.
3. Log in to one of the available GPU machines, for instance:
```
ssh titan11
```
4. Activate your environment, for instance:
```
source activate s162377
```
5. Check which GPUs are available:
```
gpustat
```
6. Select the available GPU, for instance:
```
export CUDA_VISIBLE_DEVICES="1"
```
7. Go to the directory of the downloaded repo, for instance:
```
cd /dtu-compute/s162377/mlsp2019_software_package/sa_fcn_thesis/
```
8. Check if you are on the correct branch of the repo.
9. Run the bash script by calling:
```
./run_safcn.sh
```
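Since this section uses a single GPU ("1"), it can be worth double-checking that this device is actually idle before starting; the query below uses standard NVIDIA tooling and is not part of the package itself.
```bash
# utilisation and memory of GPU 1 only; expect close to 0 % and low memory use
nvidia-smi --id=1 --query-gpu=index,utilization.gpu,memory.used,memory.total --format=csv
```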
More information on the package content can be found in the documentation [file](Documentation.md).
## Documentation
Please find the documentation [here](documentation.md).
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
## Dataset
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
All analysed models in this thesis are evaluated on the dataset provided by the MICCAI 2015 Gland Segmentation Challenge Contest [article](https://arxiv.org/abs/1603.00275).
The dataset consists of 165 labelled colorectal cancer histological images. 85 images belong to the training set and 80 images are part of the test set, which is divided
into two subsets. Set A contains 60 images, while set B contains 20 images. The training set consists of 37 benign sections and 48 malignant areas. Test A set contains
33 benign sections and 27 malignant areas. Test B set has 4 benign sections and 16 malignant areas. Due to the characteristics of the SA-FCN architecture, its authors have prepared their own version of the labelled images. This is mainly aimed at adding information about the contours of the glands, which is necessary to carry out the training of the model.
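For convenience, the split described above can be summarised as follows:

| Subset   | Images | Benign | Malignant |
|----------|--------|--------|-----------|
| Training | 85     | 37     | 48        |
| Test A   | 60     | 33     | 27        |
| Test B   | 20     | 4      | 16        |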
The figure below shows a few examples of samples from the dataset, together with the corresponding original labelling and the SA-FCN labelling version.
![example_dataset](/images/example_dataset.PNG)
More information about the dataset, as well as a description of the scores, can be found on the contest [website](https://warwick.ac.uk/fac/sci/dcs/research/tia/glascontest/).
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
## Expected output
Apart from the output information printed to the console, you can also check the correctness of the experiment by inspecting the output files. Below we list all produced files as well as examples of them. Please note that all output files from this reproducibility package correspond to figures and tables from the paper.
*Table 1. Classification (F1 Score) and segmentation (Dice Index) scores using three different post-processing methods. Scores are represented by the mean value from five trainings, followed by the standard error of the mean.*
Table 1 is represented by three separate tables, describing each post-processing method separately.
![table_1_original](/images/table_1_original.pdf)
![table_1_dcan](/images/table_1_dcan.pdf)
![table_1_our](/images/table_1_our.pdf)
*Table 2. Comparison of the Mask R-CNN and the SA-FCN models' performance in terms of classification (F1 Score) and segmentation (Dice Index). Scores are represented by the mean value from five trainings, followed by the standard error of the mean.*
Table 2 is represented by two separate tables, describing each model's scores separately.
![table2_mask_rcnn](/images/table2_mask_rcnn.pdf)
![table_2_sa_fcn](/images/table_2_sa_fcn.pdf)
*Fig. 4. Visualisation of three post-processing methods on the example of one sample. Image headers describe the post-processing actions applied to the sample.*
![figure_4](/images/figure_4.pdf)
*Fig. 5. Visualisation of the contour prediction of the sample, presenting the misalignment problem. The bottom right image shows the superposed ground truth and prediction labelling. Yellow indicates ground truth pixels not overlapping with the prediction, orange indicates prediction pixels not overlapping with the ground truth, and black is used to mark correctly predicted pixels.*
![figure_5](/images/figure_5.pdf)
*Fig. 6. Visualisation of the same sample prediction before and after post-processing for the Mask R-CNN and SA-FCN models.*
![figure_6](/images/figure_6.pdf)
<!--
## Step by Step Detection Mask R-CNN
In this section we will provide a few tips on how to run each part of the experiment separately. We will also provide examples of outputs.
### Training
### Predicting
### Post Processing
### Scores
### Outputs
## Step by Step Detection SA-FCN
### Training
### Predicting
### Post Processing
### Scores
### Outputs
-->
## Authors
<sup>1</sup> Lukasz T. Bienias lutobi@dtu.dk