# **Insights into the behavior of multi-task deep neural networks for medical image segmentation - Reproducibility Package**
This repository is a reproducibility package for the paper "Insights into the behavior of multi-task deep neural networks for medical image segmentation", published in the 2019 IEEE 29th International Workshop on Machine Learning for Signal Processing (MLSP). The paper can be found [here](https://ieeexplore.ieee.org/document/8918753).
The reproducibility package consists of two main parts: the first is devoted to the Mask R-CNN architecture, while the second is devoted to the SA-FCN architecture.
The provided code allows you to reproduce all tables and figures in the paper, as well as to experiment with the training, prediction and post-processing methods.
- [Introduction](#introduction)
- [Mask R-CNN](#mask-r-cnn)
- [SA-FCN](#sa-fcn)
- [Cite Paper](#cite-paper)
- [Installation, tutorials and documentation](#installation-tutorials-and-documentation)
  - [Installation requirements](#installation-requirements)
  - [Download Package](#download-package)
  - [Run Mask R-CNN](#run-mask-r-cnn)
  - [Run SA-FCN](#run-sa-fcn)
  - [Reproducing figures](#reproducing-figures)
- [Documentation](#documentation)
- [Dataset](#dataset)
- [Step by Step Detection Mask R-CNN](#step-by-step-detection-mask-r-cnn)
  - [Training](#training)
  - [Predicting](#predicting)
  - [Post Processing](#post-processing)
  - [Scores](#scores)
  - [Outputs](#outputs)
- [Step by Step Detection SA-FCN](#step-by-step-detection-sa-fcn)
  - [Training](#training-1)
  - [Predicting](#predicting-1)
  - [Post Processing](#post-processing-1)
  - [Scores](#scores-1)
  - [Outputs](#outputs-1)
- [Authors](#authors)
- [License](#license)
- [Acknowledgments](#acknowledgments)
<!-- /TOC -->
## Introduction
Glandular morphology is used by pathologists to assess the malignancy of different adenocarcinomas, a process that requires segmenting individual glands. The common
approach in specialised domains, such as medical imaging, is to design complex architectures in a multi-task learning setup. Generally, these approaches rely on
substantial post-processing effort. Moreover, a predominant notion is that general-purpose models are not suitable for gland instance segmentation. We analyse the
behaviour of two architectures: SA-FCN and Mask R-CNN. We compare the impact of post-processing on the final predictive results and the performance of generic and
specific models for the gland segmentation problem. Our results highlight how strongly the tailored model depends on post-processing, as well as comparable results when
using a generic model. Thus, in the interest of time, it is worth considering using and improving generic models rather than designing complex architectures when
tackling new domains.
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
## Cite Paper
If you use the package, please cite the following work:
```
L. T. Bienias, J. R. Guillamón, L. H. Nielsen and T. S. Alstrøm, "Insights Into The Behaviour Of Multi-Task Deep Neural Networks For Medical Image Segmentation," 2019 IEEE 29th International Workshop on Machine Learning for Signal Processing (MLSP), Pittsburgh, PA, USA, 2019, pp. 1-6.
```
Bibtex entry:
```
@inproceedings{bienias2019insights,
title={Insights into the behaviour of multi-task deep neural networks for medical image segmentation},
author={Bienias, Lukasz T and Nielsen, Line H and Alstr{\o}m, Tommy S and others},
booktitle={2019 IEEE 29th International Workshop on Machine Learning for Signal Processing (MLSP)},
pages={1--6},
year={2019},
organization={IEEE}
}
```
For more information about our research group please visit the [Section for Cognitive Systems website](https://www.compute.dtu.dk/english/research/research-sections/cogsys)
at the [Technical University of Denmark](https://www.dtu.dk//) (Denmark).
We are interested in feedback and error reporting. Please contact us via email or open an issue in the repository if you have any problem, comment or suggestion, or if
you find a mistake.
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
## Mask R-CNN
The Mask R-CNN architecture is described in the [article](https://arxiv.org/abs/1703.06870).
For the paper we used an implementation written in the TensorFlow framework, which comes from this [repository](https://github.com/matterport/Mask_RCNN). The code has
been adjusted to our purposes.
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
## SA-FCN
The SA-FCN architecture is described in the [article](https://arxiv.org/abs/1706.04737).
The package contains an implementation written in the PyTorch framework, following the information in the article as well as the Lua implementation
provided by the original article authors.
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
## Installation, tutorials and documentation
This is a quick installation guide.
### Installation requirements
In order to use the package you need a properly prepared environment. It is necessary to have the following packages installed:
* Python 3
* CUDA
* TensorFlow
* PyTorch ...
You can install all dependencies by: ...
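The exact dependency list and versions are not spelled out above. Purely as an illustration, an environment could be prepared along the following lines; the environment name, package names and versions are assumptions and should be adjusted to match the configuration used for the experiments:
```bash
# Hypothetical environment setup -- names and versions below are assumptions,
# not the exact configuration used for the paper.
conda create -n glands python=3.6
conda activate glands
pip install tensorflow-gpu keras          # Mask R-CNN (Matterport implementation)
pip install torch torchvision             # SA-FCN (PyTorch implementation)
pip install numpy scipy scikit-image matplotlib
```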
### Download Package
To download the package you can simply clone this GitLab repository by using the following command:
```bash
$ git clone https://lab.compute.dtu.dk/lutobi/mask_rcnn_git/
```
All the contents of the repository can also be downloaded from the repository web page by using the "Download ZIP" button.
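After cloning, the package contents end up in a local directory named after the repository, for instance:
```bash
cd mask_rcnn_git      # enter the cloned repository
ls                    # inspect the package contents
```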
### Run Mask R-CNN
In order to run the end-to-end experiment for Mask R-CNN, which consists of:
* training 5 separate models
* generating 5 predictions for each sample, one from each of the 5 models
* carrying out post-processing of the samples and calculating the final scores
* generating a table of the scores

please follow these steps:
1. Log in to the DTU Compute cluster via ThinLinc.
2. Open a gterm terminal.
3. Log in to one of the available GPU nodes, for instance:
```
ssh titan11
```
4. Activate your environment, for instance:
```
conda activate lutobi
```
5. Check which GPUs are available:
```
gpustat
```
6. Select the available GPUs, for instance:
```
export CUDA_VISIBLE_DEVICES="0,1"
```
7. Go to the directory of the downloaded repository, for instance:
```
cd /dtu-compute/s162377/mask_rcnn_git/
```
8. Open the file run_it.sh and check that the dataset path is properly defined (a rough sketch of such a script is shown after this list).
9. Check that you are on the correct branch of the repository.
10. Run the bash script by calling:
```
./run_it.sh
```
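The run_it.sh shipped with the repository is authoritative; the following is only a rough sketch of what such an end-to-end script could look like, under the assumption that it chains the training, prediction, post-processing and scoring stages listed above. The stage script names and the dataset path are hypothetical:
```bash
#!/bin/bash
# Hypothetical outline of the end-to-end Mask R-CNN run -- the stage script names
# and the dataset path are assumptions, not the actual contents of run_it.sh.
DATASET_PATH=/path/to/gland/dataset     # step 8: make sure this points to the data

for RUN in 1 2 3 4 5; do
    python train.py   --run "$RUN" --data "$DATASET_PATH"   # train model number RUN
    python predict.py --run "$RUN" --data "$DATASET_PATH"   # predictions from model RUN
done

python postprocess.py                   # post-process all predictions
python scores.py                        # compute final scores and generate the table
```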
### Run SA-FCN
In order to run the end-to-end experiment for SA-FCN, which consists of:
* training 5 separate models
* generating 5 predictions for each sample, one from each of the 5 models
* carrying out post-processing of the samples, saving the post-processed samples and calculating the final scores
* generating a table of the scores

please follow these steps:
1. Log in to the DTU Compute cluster via ThinLinc.
2. Open a gterm terminal.
3. Log in to one of the available GPU nodes, for instance:
```
ssh titan11
```
4. Activate your environment, for instance:
```
conda activate lutobi
```
5. Check which GPUs are available:
```
gpustat
```
6. Select the available GPUs, for instance:
```
export CUDA_VISIBLE_DEVICES="0,1"
```
7. Go to the directory of the downloaded repository, for instance:
```
cd /dtu-compute/s162377/sa_fcn_thesis/python
```
8. Check that you are on the correct branch of the repository.
9. Run the bash script by calling:
```
./run_it.sh
```
More information on the package content can be found in the documentation [file](Documentation.md).
## Documentation
Please find the documentation [here](documentation.md).
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
## Dataset
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
## Step by Step Detection Mask R-CNN
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
### Training
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
### Predicting
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
### Post Processing
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
### Scores
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
### Outputs
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
## Step by Step Detection SA-FCN
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
### Training
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
### Predicting
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
### Post Processing
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
### Scores
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
### Outputs
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
## Authors
<sup>1</sup> Lukasz T. Bienias lutobi@dtu.dk
<sup>1</sup> Juanjo R. Guillamon jugu@dtu.dk
<sup>2</sup> Line H. Nielsen lihan@dtu.dk
<sup>1</sup> Tommy S. Alstrøm tsal@dtu.dk
<sup>1</sup> Department of Applied Mathematics and Computer Science
Technical University of Denmark, Richard Petersens Plads 324, 2800 Kgs. Lyngby, Denmark
<sup>2</sup> Department of Health Technology
Technical University of Denmark, Ørsteds Plads 345C, 2800, Kgs. Lyngby, Denmark
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
## License
Copyright 2020 Lukasz Tomasz Bienias
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
<!-- ----------------------------------------------------------------------------------------------------------------------------------------------- -->
## Acknowledgments
This research was funded by the IDUN Center of Excellence supported by the Danish National Research Foundation (DNRF122) and the Velux Foundations
(Grant No. 9301). We also thank the NVIDIA Corporation for the GPU donation.