[NTIRE2024 ESR] SlimRLFN: Making RLFN Smaller and Faster Again

SlimRLFN

SlimRLFN: Making RLFN Smaller and Faster Again

NTIRE2024 Efficient Super-Resolution

Shenglong Zhou1 and Binbin Chen2

1University of Science and Technology of China (USTC)

2Huazhong University of Science and Technology (HUST)

Introduction

[Figure: overall architecture of SlimRLFN]

We propose SlimRLFN for the efficient super-resolution task. The network architecture is inspired by the design of RLFN, while fully exploiting reparameterizable convolution, lightweight knowledge distillation, and iterative model pruning. The overall architecture, shown above, mainly consists of six SRLFB modules followed by a pixel-shuffle module. Reparameterizable convolutions are used inside the SRLFB module to improve super-resolution quality without introducing any additional parameter overhead at inference time: the multi-branch training-time convolutions are merged into a single convolution before deployment. Meanwhile, the network is optimized with a pixel-wise loss such as the Charbonnier loss or the L2 loss, along with a distillation loss provided by a lightweight but effective teacher model. Last but not least, in the final training stage we apply iterative pruning to shrink the model size while maintaining promising performance.
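To illustrate why reparameterizable convolutions add no inference-time overhead, here is a minimal single-channel NumPy sketch of the merging step. The actual SlimRLFN layers are multi-channel PyTorch modules and their exact branch structure is not specified here; the three branches below (3x3 conv, 1x1 conv, identity) are a common reparameterization pattern assumed for illustration.

```python
import numpy as np

def conv2d(x, k):
    # "same" zero-padded cross-correlation, single channel
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))
k3 = rng.standard_normal((3, 3))   # 3x3 branch
k1 = rng.standard_normal((1, 1))   # 1x1 branch

# training-time multi-branch output: 3x3 conv + 1x1 conv + identity
y_branches = conv2d(x, k3) + conv2d(x, k1) + x

# inference-time merge: embed the 1x1 kernel and the identity
# into 3x3 kernels, then sum all kernels into one
k1_pad = np.zeros((3, 3)); k1_pad[1, 1] = k1[0, 0]
k_id = np.zeros((3, 3));   k_id[1, 1] = 1.0
k_merged = k3 + k1_pad + k_id

# a single 3x3 conv now reproduces the multi-branch output exactly
y_merged = conv2d(x, k_merged)
assert np.allclose(y_branches, y_merged)
```

The merge is exact because convolution is linear in the kernel, so the deployed model pays for only one 3x3 convolution per reparameterized layer.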
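The training objective can be sketched as a pixel-wise Charbonnier loss against the ground truth plus an L2 distillation term against the teacher's output. The weighting `lam` and the output-level form of the distillation loss are assumptions for illustration; the paper's exact formulation may differ.

```python
import numpy as np

def charbonnier(pred, target, eps=1e-6):
    # smooth, L1-like pixel loss commonly used for super-resolution
    return np.mean(np.sqrt((pred - target) ** 2 + eps ** 2))

def total_loss(student_sr, teacher_sr, hr, lam=0.5):
    # lam is a hypothetical weight; SlimRLFN's value is unspecified
    pixel = charbonnier(student_sr, hr)
    distill = np.mean((student_sr - teacher_sr) ** 2)  # L2 to teacher output
    return pixel + lam * distill

# toy tensors standing in for super-resolved and ground-truth images
rng = np.random.default_rng(0)
hr = rng.standard_normal((16, 16))
student = hr + 0.1 * rng.standard_normal((16, 16))
teacher = hr + 0.05 * rng.standard_normal((16, 16))
total = total_loss(student, teacher, hr)
```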
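Iterative pruning can be sketched as repeated magnitude pruning with fine-tuning in between: at each step the smallest-magnitude weights are zeroed and the model is retrained to recover quality. The sparsity schedule below is illustrative; SlimRLFN's actual pruning criterion and schedule are not specified here.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    # zero out the smallest-magnitude fraction of the weights
    flat = np.abs(weights).ravel()
    k = int(flat.size * sparsity)
    if k == 0:
        return weights.copy()
    thresh = np.partition(flat, k - 1)[k - 1]
    return np.where(np.abs(weights) <= thresh, 0.0, weights)

# hypothetical schedule: raise sparsity gradually, fine-tune between steps
w = np.random.default_rng(1).standard_normal((64, 64))
for s in (0.2, 0.4, 0.6):
    w = magnitude_prune(w, s)
    # ... fine-tune the pruned model here to recover performance ...

frac_zero = float(np.mean(w == 0))  # roughly 0.6 after the last step
```

Gradual schedules like this typically preserve accuracy better than pruning to the target sparsity in a single shot.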

Requirements

For the datasets and environments, please refer to the Official Repo.

Usage

To obtain the results of our SlimRLFN, run the following command:

CUDA_VISIBLE_DEVICES=0 python test_demo.py --data_dir [path to your data dir] --save_dir [path to your save dir] --model_id 21

To obtain the results of our SlimRLFN after pruning, run the following command:

CUDA_VISIBLE_DEVICES=0 python test_demo.py --data_dir [path to your data dir] --save_dir [path to your save dir] --model_id 21 --use_prune

Contact

Please feel free to contact us by e-mail ([email protected]) or WeChat (ZslBlcony) if you have any questions.
