Here we provide instructions and results for applying DenseCL pre-trained models to AdelaiDet. Please refer to https://git.io/DenseCL for the pre-training code.
Dense Contrastive Learning for Self-Supervised Visual Pre-Training,
Xinlong Wang, Rufeng Zhang, Chunhua Shen, Tao Kong, Lei Li
In: Proc. IEEE Conf. Computer Vision and Pattern Recognition (CVPR), 2021, Oral
arXiv preprint (arXiv 2011.09157)
First, follow the default instructions to install the project, and follow datasets/README.md to set up the datasets (e.g., MS-COCO).
The following pre-trained models are available:

pre-train method | pre-train dataset | backbone | #epoch | Link |
---|---|---|---|---|
DenseCL | COCO | ResNet-50 | 800 | download |
DenseCL | COCO | ResNet-50 | 1600 | download |
DenseCL | ImageNet | ResNet-50 | 200 | download |
DenseCL | ImageNet | ResNet-101 | 200 | download |
Download a pre-trained model, e.g., the 200-epoch ImageNet ResNet-50:

```
PRETRAIN_DIR=./
wget https://cloudstor.aarnet.edu.au/plus/s/hdAg5RYm8NNM2QP/download -O ${PRETRAIN_DIR}/densecl_r50_imagenet_200ep.pth
```
Use convert-pretrain-to-detectron2.py to convert the pre-trained backbone weights:
```
WEIGHT_FILE=${PRETRAIN_DIR}/densecl_r50_imagenet_200ep.pth
OUTPUT_FILE=${PRETRAIN_DIR}/densecl_r50_imagenet_200ep.pkl
python convert-pretrain-to-detectron2.py ${WEIGHT_FILE} ${OUTPUT_FILE}
```
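For reference, converters of this kind (e.g., the MoCo-style `convert-pretrain-to-detectron2.py`) typically rename torchvision ResNet parameter names to detectron2's layout and dump the result as a pickle. The sketch below illustrates that renaming; the helper names `rename_resnet_key` and `convert` are illustrative, not part of the actual script.

```python
import pickle

def rename_resnet_key(k):
    """Map a torchvision ResNet parameter name to detectron2's naming."""
    if "layer" not in k:
        k = "stem." + k                    # conv1/bn1 belong to the stem
    for t in [1, 2, 3, 4]:
        k = k.replace(f"layer{t}", f"res{t + 1}")  # layer1 -> res2, ...
    for t in [1, 2, 3]:
        k = k.replace(f"bn{t}", f"conv{t}.norm")   # attach BN to its conv
    k = k.replace("downsample.0", "shortcut")
    k = k.replace("downsample.1", "shortcut.norm")
    return k

def convert(state_dict, output_file):
    """Write a detectron2-loadable .pkl from a plain backbone state_dict."""
    model = {rename_resnet_key(k): v for k, v in state_dict.items()}
    with open(output_file, "wb") as f:
        # matching_heuristics lets detectron2 match renamed weights loosely
        pickle.dump({"model": model, "matching_heuristics": True}, f)
```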
To train SOLOv2 with the converted weights, run:
```
OMP_NUM_THREADS=1 python tools/train_net.py \
    --config-file configs/DenseCL/SOLOv2_R50_1x_DenseCL.yaml \
    --num-gpus 8 \
    OUTPUT_DIR training_dir/SOLOv2_R50_1x_DenseCL \
    MODEL.WEIGHTS ${PRETRAIN_DIR}/densecl_r50_imagenet_200ep.pkl
```
To train FCOS, run:
```
OMP_NUM_THREADS=1 python tools/train_net.py \
    --config-file configs/DenseCL/FCOS_R50_1x_DenseCL.yaml \
    --num-gpus 8 \
    OUTPUT_DIR training_dir/FCOS_R50_1x_DenseCL \
    MODEL.WEIGHTS ${PRETRAIN_DIR}/densecl_r50_imagenet_200ep.pkl
```
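After training, the same script can evaluate a checkpoint. This is a sketch assuming detectron2's standard `--eval-only` flag; the checkpoint path shown is illustrative (detectron2 writes `model_final.pth` to the output directory):

```shell
OMP_NUM_THREADS=1 python tools/train_net.py \
    --config-file configs/DenseCL/FCOS_R50_1x_DenseCL.yaml \
    --eval-only \
    --num-gpus 8 \
    OUTPUT_DIR training_dir/FCOS_R50_1x_DenseCL \
    MODEL.WEIGHTS training_dir/FCOS_R50_1x_DenseCL/model_final.pth
```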
Results of SOLOv2 (1x schedule) on COCO instance segmentation:

pre-train method | pre-train dataset | mask AP |
---|---|---|
Supervised | ImageNet | 35.2 |
MoCo-v2 | ImageNet | 35.2 |
DenseCL | ImageNet | 35.7 (+0.5) |
Results of FCOS (1x schedule) on COCO object detection:

pre-train method | pre-train dataset | box AP |
---|---|---|
Supervised | ImageNet | 39.9 |
MoCo-v2 | ImageNet | 40.3 |
DenseCL | ImageNet | 40.9 (+1.0) |
Please consider citing our paper in your publications if the project helps your research. The BibTeX entry is as follows:
```
@inproceedings{wang2020densecl,
  title     = {Dense Contrastive Learning for Self-Supervised Visual Pre-Training},
  author    = {Wang, Xinlong and Zhang, Rufeng and Shen, Chunhua and Kong, Tao and Li, Lei},
  booktitle = {Proc. IEEE Conf. Computer Vision and Pattern Recognition (CVPR)},
  year      = {2021}
}
```