Multi-Object Tracking and Segmentation via Neural Message Passing

This is the official implementation of our IJCV 2022 paper Multi-Object Tracking and Segmentation via Neural Message Passing (Guillem Brasó*, Orcun Cetintas*, Laura Leal-Taixé)

This work builds upon our previous CVPR 2020 (oral) paper Learning a Neural Solver for Multiple Object Tracking and extends it by:

  1. integrating an attentive module into our neural message passing scheme to yield a unified model for multi-object tracking and segmentation
  2. providing an extensive evaluation of our tracking model on three challenging datasets, including MOT20, KITTI, and the recently proposed Human in Events dataset.
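At its core, neural message passing alternates edge and node feature updates over a graph of detections, so that edge features come to encode neighborhood context before being classified into active/inactive links. A minimal NumPy sketch of this pattern (the graph, dimensions, and fixed random projections standing in for learned MLPs are all illustrative, not the paper's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy detection graph: 4 detections (nodes), candidate-match edges.
num_nodes, dim = 4, 8
edges = [(0, 1), (1, 2), (2, 3), (0, 2)]

h = rng.normal(size=(num_nodes, dim))           # node embeddings
e = {ij: rng.normal(size=dim) for ij in edges}  # edge embeddings

# Stand-ins for learned MLPs: fixed random projections + ReLU.
W_edge = rng.normal(size=(dim, 3 * dim)) / np.sqrt(3 * dim)
W_node = rng.normal(size=(dim, 2 * dim)) / np.sqrt(2 * dim)

for _ in range(3):  # a few message passing rounds
    # Edge update: each edge combines its two endpoint nodes with itself.
    e = {(i, j): np.maximum(W_edge @ np.concatenate([h[i], h[j], e[(i, j)]]), 0.0)
         for (i, j) in edges}
    # Node update: each node aggregates the features of its incident edges.
    msgs = np.zeros_like(h)
    for (i, j), feat in e.items():
        msgs[i] += feat
        msgs[j] += feat
    h = np.maximum(np.concatenate([h, msgs], axis=1) @ W_node.T, 0.0)

# After a few rounds, each edge feature reflects its graph neighborhood
# and could be scored to decide whether its two detections belong
# to the same track.
```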


Setup

  1. Clone and enter this repository:

    git clone https://github.com/ocetintas/MPNTrackSeg.git
    cd MPNTrackSeg
    
  2. Put the project directories in your PYTHONPATH by adding the following lines to your .bashrc file:

     export PYTHONPATH="${PYTHONPATH}:[PATH_TO_YOUR_PROJECT]/MPNTrackSeg/src"
     export PYTHONPATH="${PYTHONPATH}:[PATH_TO_YOUR_PROJECT]/MPNTrackSeg/MOTChallengeEvalKit/src"
     export PYTHONPATH="${PYTHONPATH}:[PATH_TO_YOUR_PROJECT]/MPNTrackSeg/tracktor-mots/src"
    
  3. Create an Anaconda environment for this project:

    1. conda env create -f environment.yml
    2. conda activate MPNTrackSeg
  4. Modify the variables DATA_PATH and OUTPUT_PATH in src/mot_neural_solver/path_cfg.py so that they point to your preferred locations for storing datasets and output results, respectively.

  5. Download MOTS20 and/or KITTIMOTS datasets. Expected folder structure:

    DATA_PATH
    ├── KITTIMOTS
    │   └── ...
    └── MOTS20
        ├── train
        │   ├── MOTS20-02
        │   │   ├── det
        │   │   │   └── det.txt
        │   │   ├── gt
        │   │   │   └── gt.txt
        │   │   ├── img1
        │   │   │   └── ...
        │   │   └── seqinfo.ini
        │   └── ...
        └── test
            └── ...
    
    
  6. (OPTIONAL) We provide our trained models and detections:

    1. [Models]

      Default folder structure is:

      MPNTrackSeg
      └── output
          └── trained_models
              ├── mots
              │   ├── kitti.ckpt
              │   └── mots20.ckpt
              └── reid
                  └── resnet50_market_cuhk_duke.tar-232
      
      
    2. [Detections]

      By default, each detection file is placed under the det folder of its sequence and named det.txt (see step 5)
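A quick sanity check that the MOTS20 layout from step 5 is in place. This helper is not part of the repository; the sequence names and required entries simply mirror the tree shown above:

```python
from pathlib import Path


def check_mots20_layout(data_path):
    """Return a list of problems with the expected MOTS20 layout (step 5)."""
    problems = []
    train = Path(data_path) / "MOTS20" / "train"
    if not train.is_dir():
        return [f"missing directory: {train}"]
    for seq in sorted(train.iterdir()):
        if not seq.is_dir():
            continue
        # Each sequence should contain det/det.txt, gt/gt.txt, img1/, seqinfo.ini.
        for required in ("det/det.txt", "gt/gt.txt", "img1", "seqinfo.ini"):
            if not (seq / required).exists():
                problems.append(f"{seq.name}: missing {required}")
    return problems


# Demonstration on a synthetic layout built in a temporary directory:
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    seq = Path(tmp) / "MOTS20" / "train" / "MOTS20-02"
    for sub in ("det", "gt", "img1"):
        (seq / sub).mkdir(parents=True)
    (seq / "det" / "det.txt").touch()
    (seq / "gt" / "gt.txt").touch()
    (seq / "seqinfo.ini").touch()
    print(check_mots20_layout(tmp))  # -> []
```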

Training

Specify hyperparameters via tracking_cfg.yaml and choose the train and validation splits in train.py.

python scripts/train.py
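Configuration files like tracking_cfg.yaml are typically loaded into a nested dictionary before training. A minimal sketch of that pattern using PyYAML; the keys shown here are illustrative only and do not reflect the repository's actual schema:

```python
import yaml  # PyYAML

# Illustrative config text; the real tracking_cfg.yaml defines its own keys.
cfg_text = """
train_params:
  lr: 0.0003
  num_epochs: 25
data_splits:
  train: mots20_train
  val: mots20_val
"""

cfg = yaml.safe_load(cfg_text)
print(cfg["train_params"]["lr"])   # 0.0003
print(cfg["data_splits"]["val"])   # mots20_val
```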

Evaluation

Specify hyperparameters via tracking_cfg.yaml and choose the trained model in evaluate.py. Our hyperparameters for the MOTS20 and KITTIMOTS datasets are provided under the configs folder.

python scripts/evaluate.py
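MOTS evaluation matches predicted instance masks to ground-truth masks by mask IoU (commonly with a 0.5 threshold). A self-contained NumPy illustration of the measure itself, independent of the evaluation kits used by this repository:

```python
import numpy as np


def mask_iou(a, b):
    """IoU of two boolean instance masks of the same shape."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 0.0


# Two overlapping square masks on a 10x10 canvas.
pred = np.zeros((10, 10), dtype=bool)
gt = np.zeros((10, 10), dtype=bool)
pred[2:6, 2:6] = True  # 16 px
gt[4:8, 4:8] = True    # 16 px, overlapping pred on a 2x2 patch (4 px)

iou = mask_iou(pred, gt)  # 4 / (16 + 16 - 4) = 4/28 ~ 0.143
print(round(iou, 3))
```

With a 0.5 threshold, this pair would not count as a match; only mask pairs with sufficiently high IoU are associated when computing MOTS metrics.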

Citation

If you use our work in your research, please cite our publications:

  1. Multi-Object Tracking and Segmentation via Neural Message Passing (IJCV 2022)

         @article{MPNTrackSeg,
             author = {Bras{\'o}, Guillem and Cetintas, Orcun and Leal-Taix{\'e}, Laura},
             title = {Multi-Object Tracking and Segmentation via Neural Message Passing},
             journal = {International Journal of Computer Vision},
             year = {2022},
             doi = {10.1007/s11263-022-01678-6},
             issn = {1573-1405},
             url = {https://doi.org/10.1007/s11263-022-01678-6}
         }
    
  2. Learning a Neural Solver for Multiple Object Tracking (CVPR 2020)

        @InProceedings{braso_2020_CVPR,
            author = {Guillem Brasó and Laura Leal-Taixé},
            title = {Learning a Neural Solver for Multiple Object Tracking},
            booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
            month = {June},
            year = {2020}
        }
    

Acknowledgements

We use the codebases of Tracktor for preprocessing and MOTChallengeEvalKit and TrackEval for evaluation. We thank the authors of these codebases for their great work!
