
Quality-aware Spatio-temporal Transformer Network for RGBT Tracking

Our work has been accepted by TIP!

Our project page is now available at: https://zhaodongah.github.io/QSTNet

Installation

Create and activate a conda environment:

conda create -n qstnet python=3.7
conda activate qstnet

Install the required packages:

bash install_qstnet.sh

Data Preparation

Download the training datasets. The directory should be organized as follows:

$<PATH_of_Datasets>
    -- LasHeR/TrainingSet
        |-- 1boygo
        |-- 1handsth
        ...
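As a quick sanity check of the layout above, a short script like the following (a hypothetical helper, not part of this repo) counts the sequence folders under `LasHeR/TrainingSet`:

```python
from pathlib import Path

def count_lasher_sequences(dataset_root: str) -> int:
    """Count sequence folders (e.g. 1boygo, 1handsth) under LasHeR/TrainingSet."""
    training_set = Path(dataset_root) / "LasHeR" / "TrainingSet"
    if not training_set.is_dir():
        raise FileNotFoundError(f"Expected directory not found: {training_set}")
    # Each subdirectory is one tracking sequence.
    return sum(1 for p in training_set.iterdir() if p.is_dir())
```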

Path Setting

Run the following command to set paths:

cd <PATH_of_QSTNet>
python tracking/create_default_local_file.py --workspace_dir . --data_dir <PATH_of_Datasets> --save_dir ./output

You can also modify the paths in these two files:

./lib/train/admin/local.py  # paths for training
./lib/test/evaluation/local.py  # paths for testing
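In PyTracking-style codebases, which this project appears to follow, `lib/train/admin/local.py` typically defines an `EnvironmentSettings` class holding the paths. The sketch below is an assumption for illustration; the exact attribute names may differ, so check the file generated by `create_default_local_file.py`:

```python
class EnvironmentSettings:
    """Training-side path settings (illustrative sketch only; verify the
    attribute names against the file generated by create_default_local_file.py)."""

    def __init__(self):
        self.workspace_dir = './output'                 # checkpoints and logs
        self.tensorboard_dir = './output/tensorboard'   # training curves
        self.pretrained_networks = './pretrained'       # foundation model weights
        self.lasher_dir = '/path/to/datasets/LasHeR/TrainingSet'
```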

Training

Download the pretrained foundation models (OSTrack and DropTrack) and put them under ./pretrained/.

bash train_qstnet.sh

You can train models with various modalities and variants by modifying train_qstnet.sh.

Testing

For RGB-T benchmarks

[LasHeR & RGBT234]
Modify the <DATASET_PATH> and <SAVE_PATH> in ./RGBT_workspace/test_rgbt_mgpus.py, then run:

bash eval_rgbt.sh

We recommend using the LasHeR Toolkit for LasHeR evaluation and the MPR_MSR_Evaluation toolkit for RGBT234 evaluation. You can also use eval_lasher.py to evaluate results on the LasHeR dataset.
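For intuition, the success metric reported by these evaluation toolkits is built on the overlap (IoU) between predicted and ground-truth boxes per frame. A minimal sketch of that computation (not the official toolkit code, and names here are illustrative):

```python
def iou(box_a, box_b):
    """IoU of two boxes given as (x, y, w, h)."""
    xa, ya = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    xb = min(box_a[0] + box_a[2], box_b[0] + box_b[2])
    yb = min(box_a[1] + box_a[3], box_b[1] + box_b[3])
    inter = max(0.0, xb - xa) * max(0.0, yb - ya)
    union = box_a[2] * box_a[3] + box_b[2] * box_b[3] - inter
    return inter / union if union > 0 else 0.0

def success_rate(pred_boxes, gt_boxes, threshold=0.5):
    """Fraction of frames whose predicted box overlaps ground truth above threshold."""
    overlaps = [iou(p, g) for p, g in zip(pred_boxes, gt_boxes)]
    return sum(o > threshold for o in overlaps) / len(overlaps)
```

The official toolkits sweep the threshold over [0, 1] and report the area under the resulting success curve, along with precision metrics based on center-location error.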

Acknowledgment

  • This repo is based on BAT, an excellent work that helped us quickly implement our ideas.
  • Thanks to the OSTrack and PyTracking libraries.
