UniRTL: A Universal RGBT and Low-Light Benchmark for Object Tracking
Lian Zhang, Lingxue Wang, Yuzhen Wu, Mingkun Chen, Dezhi Zheng, Liangcai Cao, Bangze Zeng, and Yi Cai
Published in *Pattern Recognition*, Volume 158, Article 110984 (2025)
Single and multiple object tracking (SOT and MOT) in RGBT (RGB and thermal) and low-light conditions poses significant challenges due to varying illumination and modality differences. Unismot introduces a Universal RGBT and Low-light Benchmark (UniRTL), comprising 3 × 626 videos for SOT and 3 × 50 videos for MOT, totaling over 158K frame triplets. The dataset is categorized into low, medium, and high illuminance based on scene illuminance measurements. We propose a unified tracking-with-detection framework that integrates a detector, a first-frame target prior (FTP), and a data associator, effectively addressing both SOT and MOT tasks. Enhancements such as a ReID long-term matching module and the reuse of low-score bounding boxes further improve tracking performance. Extensive experiments demonstrate that Unismot outperforms existing methods on established RGBT tracking datasets, promoting robust multimodal tracking across varying lighting conditions.
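The reuse of low-score bounding boxes follows the two-stage association idea popularized by ByteTrack: match tracks to high-score detections first, then give still-unmatched tracks a second chance against the low-score detections instead of discarding them. A minimal greedy sketch of that cascade (hypothetical threshold values; production trackers typically use Hungarian matching rather than this greedy loop):

```python
def iou(a, b):
    # Boxes as (x1, y1, x2, y2).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def associate(tracks, detections, high_thr=0.6, iou_thr=0.3):
    """Two-stage association: stage 1 matches tracks against
    high-score detections; stage 2 retries the leftover tracks
    against the low-score detections."""
    high = [d for d in detections if d["score"] >= high_thr]
    low = [d for d in detections if d["score"] < high_thr]
    matches, unmatched = [], list(tracks)
    for dets in (high, low):  # stage 1 then stage 2
        remaining = []
        for t in unmatched:
            best = max(dets, key=lambda d: iou(t["box"], d["box"]), default=None)
            if best is not None and iou(t["box"], best["box"]) >= iou_thr:
                matches.append((t["id"], best))
                dets.remove(best)
            else:
                remaining.append(t)
        unmatched = remaining
    return matches, unmatched
```

In this toy setup a detection scoring 0.3 would be dropped by a single-stage tracker, but stage 2 still uses it to keep its track alive.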
- October 2024: Dataset and code are now open-source!
- September 2024: Paper accepted by Pattern Recognition!
| Dataset | Feature Extractor | NPR | PR | SR | HOTA | DetA | AssA | MOTA | IDF1 |
|---|---|---|---|---|---|---|---|---|---|
| UniRTL | RGB+T+Low-light | 0.461 | 0.676 | 0.607 | 56.6 | 62.2 | 51.7 | 69.3 | 62.9 |
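For reference, the SOT success rate (SR) reported above is, in OTB-style evaluation, the area under the success plot: the fraction of frames whose predicted-vs-ground-truth IoU exceeds a threshold, averaged over thresholds from 0 to 1. A minimal sketch of that computation (an assumption about the metric's definition, not the official UniRTL toolkit):

```python
def iou(a, b):
    """IoU of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0

def success_rate(pred_boxes, gt_boxes, steps=21):
    """Area under the success plot: mean, over IoU thresholds in
    [0, 1], of the fraction of frames whose overlap exceeds the
    threshold."""
    overlaps = [iou(p, g) for p, g in zip(pred_boxes, gt_boxes)]
    thresholds = [i / (steps - 1) for i in range(steps)]
    return sum(
        sum(o > t for o in overlaps) / len(overlaps) for t in thresholds
    ) / steps
```

The MOT columns (HOTA, DetA, AssA, MOTA, IDF1) come from TrackEval, linked in the evaluation section.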
- Operating System: Ubuntu 20.04
- Python: 3.8.20
- PyTorch: 1.9.1
```bash
git clone --recursive https://github.com/Liamzh0331/Unismot.git
cd Unismot
```

If you cloned without `--recursive`, initialize submodules:

```bash
git submodule update --init --recursive
```

Ensure the following OpenGL-related packages are installed:

```bash
sudo apt update
sudo apt install -y libgl1-mesa-dri libglu1-mesa libgl1-mesa-glx
```

It's recommended to use Conda or a virtual environment.
Using Conda:

```bash
conda create -n unismot python=3.8
conda activate unismot
```

Using a Python virtual environment:

```bash
python -m venv unismot_env
source unismot_env/bin/activate
```

Due to known issues with some packages, pin `pip` and install `numpy` and `lap` first:

```bash
pip install --upgrade pip==22.0.3
pip install numpy==1.24.4
pip install lap==0.4.0
```

Then install the remaining dependencies and set up the package:

```bash
pip install -r requirements.txt
python setup.py develop
```

Open a Python REPL and run:

```python
import unismot.core
import unismot.data.datasets
```

Ensure no errors are raised. For additional verification, you can run:

```bash
python -c "import unismot; print('Unismot installed successfully!')"
```

Download the UniRTL dataset from the following links:
- Baidu Pan: UniRTL Dataset (code: liam)
- Google Drive: UniRTL Dataset
Extract the dataset and place it under <Unismot_HOME>/datasets with the following structure:
```
datasets/
└── UniRTL/
    ├── MOT/
    │   ├── train/
    │   ├── test/
    │   ├── test_high_seqmap.txt
    │   ├── test_low_seqmap.txt
    │   ├── test_med_seqmap.txt
    │   ├── test_seqmap.txt
    │   └── train_seqmap.txt
    └── SOT/
        ├── train/
        ├── test/
        └── eval/
```
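A quick way to confirm the layout before running the converters is to check for the expected entries (a small sketch; the paths are copied from the tree above):

```python
from pathlib import Path

# Expected entries under datasets/UniRTL, per the directory tree above.
EXPECTED = [
    "MOT/train", "MOT/test",
    "MOT/test_high_seqmap.txt", "MOT/test_low_seqmap.txt",
    "MOT/test_med_seqmap.txt", "MOT/test_seqmap.txt",
    "MOT/train_seqmap.txt",
    "SOT/train", "SOT/test", "SOT/eval",
]

def missing_entries(root):
    """Return the expected dataset entries that are absent under root."""
    root = Path(root)
    return [rel for rel in EXPECTED if not (root / rel).exists()]

if __name__ == "__main__":
    missing = missing_entries("datasets/UniRTL")
    print("dataset OK" if not missing else f"missing: {missing}")
```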
Navigate to the project root and execute the following scripts:
```bash
cd <Unismot_HOME>
python tools/convert_sot_to_coco.py
python tools/convert_mot_to_coco.py
```

Follow the steps in `mix_data_UniRTL.py` to create a data folder and establish the necessary links. Then execute:

```bash
cd <Unismot_HOME>
python tools/mix_data_UniRTL.py
```

Pretrained models trained on half of the UniRTL training set and evaluated on the UniRTL test set are available for download:
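After conversion, you can sanity-check an output file by counting its images, annotations, and categories; COCO-style JSON keeps these as top-level lists. A small sketch (the exact output filename depends on the converter, so any path you pass is your own):

```python
import json

def summarize_coco(coco):
    """Return (num_images, num_annotations, num_categories)
    for a COCO-style dict."""
    return (len(coco.get("images", [])),
            len(coco.get("annotations", [])),
            len(coco.get("categories", [])))

def summarize_coco_file(path):
    """Load a COCO-style JSON file and summarize it."""
    with open(path) as f:
        return summarize_coco(json.load(f))
```

For example, `summarize_coco_file("datasets/UniRTL/annotations/train.json")` (placeholder path) should report counts consistent with the sequences you converted.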
- Baidu Pan: Pretrained Models (code: liam)
- Google Drive: Pretrained Models
| Model | SR | HOTA |
|---|---|---|
| unismot_l_RGBTL2 | 0.607 | 56.6 |
| unismot_x_RGBT | 0.582 | 54.2 |
| unismot_l_RGBT | 0.582 | 54.1 |
| unismot_m_RGBT | 0.613 | 52.9 |
| unismot_s_RGBT | 0.532 | 50.6 |
| unismot_tiny_RGBT | 0.546 | 50.3 |
| unismot_nano_RGBT | 0.525 | 46.1 |
After downloading the pretrained models, place them in the `<Unismot_HOME>/pretrained` directory.

To train the RGBTL2 model:

```bash
cd <Unismot_HOME>
python tools/train.py -f exps/example/mot/unismot_l_RGBTL2.py -d 2 -b 10 --fp16 -o -c pretrained/unismot_l_RGBTL2.pth.tar --nlow
```

To train the RGBT model:

```bash
cd <Unismot_HOME>
python tools/train.py -f exps/example/mot/unismot_l_RGBT.py -d 2 -b 10 --fp16 -o -c pretrained/unismot_l_RGBT.pth.tar
```

Run the Unismot tracker for a demonstration:

```bash
cd <Unismot_HOME>
python tools/demo_track.py -f exps/example/mot/unismot_l_RGBTL2.py -c pretrained/unismot_l_RGBTL2.pth.tar --nlow
```

- Single Object Tracking (SOT):

  ```bash
  cd <Unismot_HOME>
  python tools/run_tracker.py -f exps/example/mot/unismot_l_RGBTL2.py -c pretrained/unismot_l_RGBTL2.pth.tar -pt ./datasets/UniRTL/SOT/train -tm SOT --nlow
  ```

- Multiple Object Tracking (MOT):

  ```bash
  cd <Unismot_HOME>
  python tools/run_tracker.py -f exps/example/mot/unismot_l_RGBTL2.py -c pretrained/unismot_l_RGBTL2.pth.tar -pt ./datasets/UniRTL/MOT/train -tm MOT --nlow
  ```
Download evaluation tools:
- SOT Evaluation Toolkit: UniRTL Evaluation Toolkit (Baidu Pan) | (Google Drive)
- MOT Evaluation Tools: TrackEval (Baidu Pan) | (Google Drive)
Follow the instructions provided with each toolkit to evaluate tracking results.
If you use Unismot or UniRTL in your research, please consider citing the following paper:
```bibtex
@article{ZHANG2025110984,
  title = {UniRTL: A universal RGBT and low-light benchmark for object tracking},
  journal = {Pattern Recognition},
  volume = {158},
  pages = {110984},
  year = {2025},
  issn = {0031-3203},
  author = {Lian Zhang and Lingxue Wang and Yuzhen Wu and Mingkun Chen and Dezhi Zheng and Liangcai Cao and Bangze Zeng and Yi Cai}
}
```

Contributions are welcome! Please follow these steps to contribute:
1. Fork the repository.

2. Create a feature branch:

   ```bash
   git checkout -b feature/YourFeature
   ```

3. Commit your changes:

   ```bash
   git commit -m "Add Your Feature"
   ```

4. Push to the branch:

   ```bash
   git push origin feature/YourFeature
   ```

5. Open a pull request.
Please ensure your code follows the project's coding standards and includes relevant tests.
This project builds upon the excellent work of:
- YOLOX: Base detection framework.
- ByteTrack: Advanced tracking methods.
- RGBT Benchmark and MOT16: Dataset references.
Many thanks to the authors and contributors of these projects for their outstanding work. We appreciate the open-source community for their continuous support and contributions.
This project is licensed under the MIT License.
For any questions or support, please open an issue or contact the maintainer:
- Lian Zhang: liamzh0331[AT]gmail.com or liamzh0331[AT]qq.com.

