
High-Resolution Underwater Creature Segmentation

Authors: Huiyang Wu, Qiuping Jiang, Zongwei Wu, Runmin Cong, Cedric Demonceaux, Yi Yang and Xiangyang Ji.

1. Preface

  • This repository provides the code for "High-Resolution Underwater Creature Segmentation" (TIP 2025). Arxiv Page
  • Created by Huiyang Wu, email: 2311100185@nbu.edu.cn

2. High-Resolution Underwater Creature Segmentation Dataset UCS4K

Baidu Netdisk: UCS4K (fetch code: 1390) | Google Drive: UCS4K

UCS4K is the first large-scale dataset for High-Resolution Underwater Creature Segmentation (UCS). It is free for academic research, not for any commercial purposes.

3. Directory

The directory should be like this:

-- model (saved model)
-- pre (pretrained model)
-- result (saliency maps)
-- data (train dataset and test dataset)
   -- UCS4K
      |-- train
      |   |-- Imgs
      |   |-- GT
      |   |-- Edge_gt
      |-- val
      |   |-- Imgs
      |   |-- GT
      |   |-- Edge_gt
      |-- test
      |   |-- Imgs
      |   |-- GT
      |   |-- Edge_gt
   ...
   
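Before training, it can help to sanity-check this layout. The snippet below is a hypothetical helper (not part of the repository) that assumes the ./data/UCS4K root shown above:

import os

# Hypothetical helper (not shipped with the repository): verify the UCS4K
# directory layout sketched above before starting training.
def check_ucs4k_layout(root="./data/UCS4K"):
    missing = []
    for split in ("train", "val", "test"):
        for sub in ("Imgs", "GT", "Edge_gt"):
            path = os.path.join(root, split, sub)
            if not os.path.isdir(path):
                missing.append(path)
    if missing:
        raise FileNotFoundError("Missing dataset folders:\n" + "\n".join(missing))
    print("UCS4K layout looks good.")

if __name__ == "__main__":
    check_ucs4k_layout()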

4. Proposed Baseline

4.1. Training/Testing

Requirements

  • Python 3.8
  • PyTorch 1.7.1
  • OpenCV
  • NumPy
  • Apex
  • timm
  • tqdm

The training and testing experiments are conducted in PyTorch on a single NVIDIA TITAN GPU with 24 GB of memory.

  1. Configuring your environment (Prerequisites):

    • Create a virtual environment in the terminal: conda create -n RADAR python=3.8.

    • Install the necessary packages: pip install -r requirements.txt.

  2. Downloading necessary data:

    • Download the testing dataset and move it into ./data/test/.

    • Download the training dataset and move it into ./data/train/.

    • Download the pretrained weights and move them to ./checkpoints/best/RADAR.pth.

    • Download ResNet-18 and Swin-B-224 as backbone networks and save them in the pre folder.

  3. Training Configuration:

    • Assign your customized paths, such as --train_save and --train_path, in etrain.py (see the launcher sketch after this list).

  4. Testing Configuration:

    • After downloading the pre-trained model and the testing dataset, set --pth_path to your trained model directory and run etest.py to generate the final prediction maps.
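As a sketch only (the dataset and checkpoint paths below are placeholder assumptions; the flag names are the ones mentioned in steps 3 and 4), the two scripts could be driven from Python like this:

import subprocess

# Hypothetical launcher: train RADAR, then produce prediction maps with the
# trained weights. Paths are placeholders; adjust them to your setup.
subprocess.run(
    ["python", "etrain.py",
     "--train_path", "./data/UCS4K/train",  # assumed dataset location
     "--train_save", "./model/RADAR"],      # assumed save folder
    check=True,
)
subprocess.run(
    ["python", "etest.py",
     "--pth_path", "./checkpoints/best/RADAR.pth"],  # weights path from step 2
    check=True,
)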

4.2. Evaluating Your Trained Model

One-key evaluation is written in MATLAB (revised from link); follow the instructions in ./eval/main.m and run it to generate the evaluation results.

If you want to speed up the evaluation on GPU, use the efficient pysodmetrics tool: pip install pysodmetrics.

Assign your customized paths (method, mask_root, and pred_root) in eval.py.

Then just run eval.py to evaluate the trained model (see the sketch below).
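As a minimal sketch of the Python-based evaluation (assuming the py_sod_metrics import name of the pysodmetrics package, and placeholder mask_root/pred_root paths), the common metrics can also be computed directly:

import os
import cv2
from py_sod_metrics import MAE, Smeasure, Emeasure, WeightedFmeasure

# Minimal sketch: accumulate common SOD metrics over a folder of predictions.
# mask_root / pred_root are placeholder paths; adjust them to your layout.
mask_root = "./data/UCS4K/test/GT"
pred_root = "./result/RADAR"

mae, sm, em, wfm = MAE(), Smeasure(), Emeasure(), WeightedFmeasure()
for name in sorted(os.listdir(mask_root)):
    gt = cv2.imread(os.path.join(mask_root, name), cv2.IMREAD_GRAYSCALE)
    pred = cv2.imread(os.path.join(pred_root, name), cv2.IMREAD_GRAYSCALE)
    for metric in (mae, sm, em, wfm):
        metric.step(pred=pred, gt=gt)

print("MAE:", mae.get_results()["mae"])
print("S-measure:", sm.get_results()["sm"])
print("E-measure (adaptive):", em.get_results()["em"]["adp"])
print("Weighted F-measure:", wfm.get_results()["wfm"])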

Pre-computed maps of RADAR can be found at the download link (Google Drive).

Pre-computed maps of other comparison methods can be found at the download link (Baidu Pan) with code: yxy9.

5. Other datasets

(1) USOD10K (2) MAS3K (3) RMAS (4) SUIM (5) NAUTEC

6. Citation

Please cite our paper if you find the work useful:

@inproceedings{sun2022bgnet,
  title={Boundary-Guided Camouflaged Object Detection},
  author={Sun, Yujia and Wang, Shuo and Chen, Chenglizhao and Xiang, Tian-Zhu},
  booktitle={IJCAI},
  pages={1335--1341},
  year={2022}
}
