Our project page is now available at: https://zhaodongah.github.io/QSTNet
Create and activate a conda environment:
conda create -n qstnet python=3.7
conda activate qstnet
Install the required packages:
bash install_qstnet.sh
Download the training datasets. The directory should look like:
$<PATH_of_Datasets>
-- LasHeR/TrainingSet
|-- 1boygo
|-- 1handsth
...
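As a quick sanity check that the layout above is in place, you can list the training sequences (a minimal sketch; the helper name is ours, not part of the repo):

```python
import os

def list_lasher_sequences(data_dir):
    # data_dir is your <PATH_of_Datasets>; sequence folders such as
    # 1boygo and 1handsth should appear under LasHeR/TrainingSet.
    seq_root = os.path.join(data_dir, "LasHeR", "TrainingSet")
    return sorted(os.listdir(seq_root))
```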
Run the following command to set paths:
cd <PATH_of_QSTNet>
python tracking/create_default_local_file.py --workspace_dir . --data_dir <PATH_of_Datasets> --save_dir ./output
You can also modify the paths directly in these two files:
./lib/train/admin/local.py # paths for training
./lib/test/evaluation/local.py # paths for testing
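For reference, a hypothetical excerpt of ./lib/train/admin/local.py is shown below. The class and field names are assumptions based on the PyTracking-style convention this codebase follows; verify them against the file generated by create_default_local_file.py.

```python
# Hypothetical excerpt of ./lib/train/admin/local.py -- field names are
# assumptions; check the generated file for the exact attributes.
class EnvironmentSettings:
    def __init__(self):
        self.workspace_dir = '/path/to/QSTNet'                    # base directory
        self.lasher_dir = '/path/to/Datasets/LasHeR/TrainingSet'  # training data
        self.save_dir = '/path/to/QSTNet/output'                  # checkpoints and logs
```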
Download the pretrained foundation models (OSTrack and DropTrack) and put them under ./pretrained/.
bash train_qstnet.sh
You can train models with various modalities and variants by modifying train_qstnet.sh.
[LasHeR & RGBT234]
Modify the <DATASET_PATH> and <SAVE_PATH> in ./RGBT_workspace/test_rgbt_mgpus.py, then run:
bash eval_rgbt.sh
We recommend using the LasHeR Toolkit for LasHeR evaluation
and MPR_MSR_Evaluation for RGBT234 evaluation.
You can also use eval_lasher.py to evaluate the results on the LasHeR dataset.
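LasHeR's success metric is the area under the overlap-success curve. The sketch below (our own helper names, not the API of eval_lasher.py or the toolkits above) shows the per-frame IoU and success-AUC computation this kind of evaluation is based on:

```python
import numpy as np

def iou(box_a, box_b):
    # Boxes are [x, y, w, h], as in typical tracking result files.
    xa, ya = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    xb = min(box_a[0] + box_a[2], box_b[0] + box_b[2])
    yb = min(box_a[1] + box_a[3], box_b[1] + box_b[3])
    inter = max(0.0, xb - xa) * max(0.0, yb - ya)
    union = box_a[2] * box_a[3] + box_b[2] * box_b[3] - inter
    return inter / union if union > 0 else 0.0

def success_auc(pred, gt, thresholds=np.linspace(0, 1, 21)):
    # Fraction of frames whose IoU exceeds each overlap threshold,
    # averaged over thresholds (area under the success curve).
    ious = np.array([iou(p, g) for p, g in zip(pred, gt)])
    return float(np.mean([(ious > t).mean() for t in thresholds]))
```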
- This repo is based on BAT, an excellent work that helped us implement our ideas quickly.
- Thanks to the OSTrack and PyTracking libraries.