This repository is an official implementation of QTNet.
In this paper, we propose a simple and effective Query-based Temporal Fusion Network (QTNet). The main idea is to exploit object queries from previous frames to enhance the representation of current object queries via the proposed Motion-guided Temporal Modeling (MTM) module, which uses the spatial positions of object queries along the temporal dimension to reliably associate queries across adjacent frames. QTNet can be integrated into advanced LiDAR-only or multi-modality 3D detectors and achieves new state-of-the-art performance with negligible computation cost and latency on the nuScenes dataset.
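For intuition only, below is a minimal PyTorch sketch of what motion-guided fusion between object queries of adjacent frames could look like. The module name, tensor shapes, and the distance-gated cross-attention rule are illustrative assumptions, not the released MTM implementation; please refer to the repository code and `configs/qtnet.py` for the actual module.

```python
# Illustrative sketch only (not the released MTM code): fuse current-frame
# queries with ego-motion-aligned previous-frame queries, restricting
# attention to spatially nearby queries.
import torch
import torch.nn as nn


class MotionGuidedQueryFusion(nn.Module):
    def __init__(self, embed_dim=256, num_heads=8, radius=2.0):
        super().__init__()
        self.radius = radius  # metres in BEV; previous queries farther away are masked out
        self.cross_attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)

    def forward(self, cur_feat, cur_center, prev_feat, prev_center, prev_vel, dt):
        # cur_feat:    (B, Nc, C) current-frame query features
        # cur_center:  (B, Nc, 2) BEV centers of current queries
        # prev_feat:   (B, Np, C) previous-frame query features (e.g. from a memory bank)
        # prev_center: (B, Np, 2) BEV centers of previous queries, already ego-motion aligned
        # prev_vel:    (B, Np, 2) predicted BEV velocities of previous queries
        # dt:          time gap between the two frames in seconds

        # Warp each previous query center to the current timestamp using its velocity.
        warped_center = prev_center + prev_vel * dt                 # (B, Np, 2)

        # Distance-gated mask: a current query only attends to previous queries
        # whose warped centers lie within `radius`; always keep the nearest one
        # so no row is fully masked.
        dist = torch.cdist(cur_center, warped_center)               # (B, Nc, Np)
        attn_mask = dist > self.radius                               # True = blocked
        attn_mask.scatter_(-1, dist.argmin(dim=-1, keepdim=True), False)

        # Expand to (B * num_heads, Nc, Np) as expected by MultiheadAttention.
        attn_mask = attn_mask.repeat_interleave(self.cross_attn.num_heads, dim=0)

        fused, _ = self.cross_attn(cur_feat, prev_feat, prev_feat, attn_mask=attn_mask)
        return self.norm(cur_feat + fused)
```

The distance gate reflects the idea described above: query positions, warped by predicted motion, decide which previous queries are relevant to each current query before their features are fused.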
- [2023-12-06] The memory bank training code is released.
- [2023-09-22] QTNet is accepted by NeurIPS 2023. 🔥
Memory Bank Data
Please download the memory bank of TransFusion-L, which contains the query features and detection results.
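The sketch below shows one way to inspect a downloaded memory-bank file. The file path and the dictionary keys are assumptions for illustration only and may differ from the actual files.

```python
# Hypothetical inspection of one memory-bank file; path and keys are assumed.
import pickle

with open('data/memorybank/example_frame.pkl', 'rb') as f:  # hypothetical file name
    entry = pickle.load(f)

print(entry.keys())
# Expected (assumed) content: per-frame query features from TransFusion-L
# together with its detection results, e.g.
#   entry['query_feats'] -> (num_queries, channels)
#   entry['boxes']       -> (num_detections, box_dim)
#   entry['scores']      -> (num_detections,)
```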
Training and Validation Infos
You need to run the data conversion script in this repo (create_data). We also provide the processed training and validation infos.
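As a quick sanity check after data preparation, you can load the generated info file and count the samples. This is a sketch, not part of the repo; the file name below follows the common mmdet3d convention and may differ here.

```python
# Sanity-check sketch: load the generated nuScenes info file and count samples.
# The file name is an assumption based on the usual mmdet3d convention.
import pickle

with open('data/nuscenes/nuscenes_infos_train.pkl', 'rb') as f:
    infos = pickle.load(f)

# mmdet3d-style info files are typically a dict with an 'infos' list,
# or a plain list of per-sample dicts.
num_samples = len(infos['infos']) if isinstance(infos, dict) and 'infos' in infos else len(infos)
print(f'training samples: {num_samples}')
```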
After preparation, you will be able to see the following directory structure:
```
QTNet
├── configs
│   ├── qtnet.py
├── mmdet3d
├── tools
├── data
│   ├── nuscenes
│   ├── memorybank
│   ├── ...
├── ...
├── README.md
```
You can train the model as follows:

```shell
tools/dist_train.sh configs/qtnet.py 8 --work-dir work_dirs/qtnet_4frames/
```
You can evaluate the model as follows:

```shell
tools/dist_test.sh configs/qtnet.py work_dirs/qtnet_4frames/latest.pth 8 --eval mAP
```
| Model | Setting          | mAP  | NDS  | Config | Download |
|-------|------------------|------|------|--------|----------|
| QTNet | LiDAR - 4 frames | 66.5 | 70.9 | config | model    |
| Model | Setting          | mAP  | NDS  |
|-------|------------------|------|------|
| QTNet | LiDAR - 4 frames | 68.4 | 72.2 |
- Release the paper.
- Release the code of QTNet (Memory Bank Training).
We thank these great works and open-source repositories: TransFusion, DeepInteraction, and MMDetection3D.
@inproceedings{hou2023querybased,
title={Query-based Temporal Fusion with Explicit Motion for 3D Object Detection},
author={Jinghua Hou and Zhe Liu and Dingkang Liang and Zhikang Zou and Xiaoqing Ye and Xiang Bai},
booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
year={2023},
}