Adaptive Decoders for FLIM Networks Applied to Salient Object Detection

This repository contains the source code for the experiments on Feature Learning from Image Markers with Adaptive Decoders (FLIM-AD). It enables the reproduction of results presented in Gilson Junior Soares’s Master’s thesis, Adaptive Decoders for FLIM Networks Applied to Salient Object Detection, developed at the Institute of Computing, UNICAMP, under the supervision of Prof. Alexandre Xavier Falcão. The repository also supports the findings reported in the article FLIM-Based Salient Object Detection Networks with Adaptive Decoders.

Code organization

FLIM_AD/
├── data/                                   #data files
│   ├── schisto/                            #data files for schisto dataset
│   │   ├── user_A/                         #annotation by user A
│   │   │   ├── split1/          
│   │   │   │   ├── markers/                #folder with marker files                
│   │   │   │   │   ├── 000479-seeds.txt    #marker file for one image
│   │   │   │   │   └── ...     
│   │   │   │   ├── arch2D.json             #FLIM architecture file 
│   │   │   │   ├── test1.csv               #list of test images
│   │   │   │   ├── train1.csv              #list of train images
│   │   │   │   ├── val1.csv                #list of validation images
│   │   │   └── ...     
│   │   └── ...
│   ├── brats/
│   └── ...
├── libs/                                   
│   ├── flim-python/                        #flim-python library
│   └── ift/                                #ift library
├── scripts/  
│   ├── brats/                              #scripts for the BraTS dataset
│   ├── schisto/                            #scripts for the schistosoma dataset
│   └── conf                                #configuration scripts
└── src/                                    #source code and scripts

Datasets

This repository includes a dataset folder containing only the training images required for training the FLIM encoders. To execute the full pipeline and reproduce the reported results, please download and organize the datasets from the sources listed below.

Parasites

This dataset, developed at the LIDS Laboratory, contains images of parasite eggs and is available at https://github.com/LIDS-Datasets/schistossoma-eggs. Although the repository provides a predefined 70–30 split, this work does not use it. Instead, the experiments use the images and masks located in the orig and label folders, following a 50–50 split.
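As an illustration, here is a minimal sketch of building such a 50–50 split. The orig and label folder names come from the dataset repository, while the one-image-name-per-line CSV layout and the output location under data/ are assumptions that may need adapting to the format the scripts actually expect:

```python
# Sketch: build a reproducible 50-50 train/test split of the Parasites images.
# ASSUMPTIONS: images live in orig/ with matching masks in label/, and the
# scripts accept CSVs with one image name per line (verify against data/).
import random
from pathlib import Path

def make_split(orig_dir: str, out_dir: str, split_id: int = 1, seed: int = 0) -> None:
    names = sorted(p.name for p in Path(orig_dir).iterdir() if p.is_file())
    random.Random(seed).shuffle(names)  # fixed seed keeps the split reproducible
    half = len(names) // 2
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    (out / f"train{split_id}.csv").write_text("\n".join(names[:half]) + "\n")
    (out / f"test{split_id}.csv").write_text("\n".join(names[half:]) + "\n")

if __name__ == "__main__":
    make_split("schistossoma-eggs/orig", "data/schisto/user_A/split1")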

BraTS

BraTS is a publicly available dataset (see the BraTS dataset page). It is necessary to download and process the 3D dataset, generating a 2D dataset in which each file is a slice along the axial axis. Using the train{1,2,3}.csv and test{1,2,3}.csv files in data/, the correct axial slices can be selected by taking the slice number from the last part of the file name: images/BraTS2021_00000_a1_s0072.png -> slice s0072.
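A minimal conversion sketch, assuming nibabel and Pillow are installed; the a1 tag and the per-volume min–max scaling are illustrative assumptions, with only the sNNNN slice-index suffix taken from the file-name pattern above:

```python
# Sketch: turn a 3D BraTS volume into axial 2D PNG slices named
# <case>_<tag>_s<NNNN>.png (e.g. BraTS2021_00000_a1_s0072.png).
# ASSUMPTIONS: nibabel/Pillow available; min-max scaling to uint8 is a
# placeholder for whatever preprocessing the experiments actually used.
from pathlib import Path

import nibabel as nib
import numpy as np
from PIL import Image

def slice_volume(nii_path: str, out_dir: str, case_id: str, tag: str = "a1") -> None:
    vol = nib.load(nii_path).get_fdata()                  # BraTS volumes: (H, W, D)
    vol = (255 * (vol - vol.min()) / (np.ptp(vol) + 1e-8)).astype(np.uint8)
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for s in range(vol.shape[2]):                         # iterate over the axial axis
        Image.fromarray(vol[:, :, s]).save(out / f"{case_id}_{tag}_s{s:04d}.png")

def slice_number(png_name: str) -> int:
    """images/BraTS2021_00000_a1_s0072.png -> 72 (matches the CSV entries)."""
    return int(Path(png_name).stem.split("_")[-1].lstrip("s"))
```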

Library installation

FLIM Python (flim-python)

To install flim-python, run pip install -e libs/flim-python/. This installs the library in the current Python environment.

IFT

This library is necessary for executing the Dynamic Trees algorithm on the saliency maps. First, export the environment variable with export NEWIFT_DIR=libs/ift, then run make inside libs/ift/. After that, run bash compile.sh 0 inside libs/ift/demo/FLIM/auxiliary_operations/.

Usage

1. Train FLIM encoders

The first step consists of training the FLIM encoders with the images and markers selected by the user. This can be done with the script train_flim_encoders.sh, which trains the encoders for users A and B on three splits. The script uses the cuda:0 GPU by default; if you are not using a GPU, or are using another device, change it to cpu or cuda:<your_device_number>.
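If you are unsure which device string to use, the following sketch prints a sensible default; it assumes the scripts forward a PyTorch-style device identifier, which is an assumption about the training code rather than a documented interface:

```python
# Sketch: choose a device string for the training scripts.
# ASSUMPTION: the scripts forward a PyTorch-style identifier (cpu, cuda:0, ...);
# check the script itself before relying on this.
import torch

device = f"cuda:{torch.cuda.current_device()}" if torch.cuda.is_available() else "cpu"
print(device)  # e.g. "cuda:0" on a single-GPU machine, "cpu" otherwise
```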

1.1 Train backprop decoder

The training of the FLIM Backprop Decoder is done with the script train_backprop_decoder.sh.

1.2 U-NetFLIM

For training and testing U-NetFLIM, use train_flim_unet.sh and test_flim_unet.sh. The trained models are saved in out/models, while the saliency maps are saved in out/saliencies_unet.

2. Run FLIM on validation

Use the script val_run_flim_decoders.sh to run the trained encoders on the validation set for users A and B. By default, this script runs all adaptive decoders and saves the output in out/saliencies.

3. Run delineation method (only for Parasites)

To run the delineation method, execute val_run_delineation.sh. This script runs the Dynamic Trees algorithm on the validation images.

4. Get the optimized architectures

Use the script val_get_best_layers.sh to select the best architecture for each decoder. This script produces a file output/best_layers_X.json, where X is the dataset name. The file contains the best layer to use for each user, decoder, and split.
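The schema of this JSON file is not documented here; as an illustration only, a nested user → decoder → split layout could be consumed like this (all keys below are hypothetical):

```python
# Sketch: load the per-user/decoder/split layer selection.
# The file name substitutes X with a dataset name ("schisto" here), and the
# nested keys are HYPOTHETICAL; inspect the generated file for the real schema.
import json

with open("output/best_layers_schisto.json") as f:
    best = json.load(f)

layer = best["user_A"]["adaptive"]["split1"]  # hypothetical key layout
print(f"user_A / adaptive / split1 -> layer {layer}")
```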

5. Run test

Lastly, use test_flim_decoders.sh to run the FLIM + Adaptive Decoders pipeline on the test set.

Citation

@inproceedings{soares2024adaptive,
  title={Adaptive decoders for flim-based salient object detection networks},
  author={Soares, Gilson Junior and Cerqueira, Matheus Abrantes and Guimaraes, Silvio Jamil F and Gomes, Jancarlo F and Falc{\~a}o, Alexandre X},
  booktitle={2024 37th SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI)},
  pages={1--6},
  year={2024},
  organization={IEEE}
}
@article{soares2025flim,
  title={FLIM-based Salient Object Detection Networks with Adaptive Decoders},
  author={Soares, Gilson Junior and Cerqueira, Matheus Abrantes and Gomes, Jancarlo F and Najman, Laurent and Guimar{\~a}es, Silvio Jamil F and Falc{\~a}o, Alexandre Xavier},
  journal={arXiv preprint arXiv:2504.20872},
  year={2025}
}
