Contents

Although there have been significant advances in the field of image restoration recently, the system complexity of the state-of-the-art (SOTA) methods is increasing as well, which may hinder the convenient analysis and comparison of methods. In this paper, we propose a simple baseline that exceeds the SOTA methods and is computationally efficient. To further simplify the baseline, we reveal that the nonlinear activation functions, e.g. Sigmoid, ReLU, GELU, Softmax, etc. are not necessary: they could be replaced by multiplication or removed. Thus, we derive a Nonlinear Activation Free Network, namely NAFNet, from the baseline. SOTA results are achieved on various challenging benchmarks, e.g. 33.69 dB PSNR on GoPro (for image deblurring), exceeding the previous SOTA 0.38 dB with only 8.4% of its computational costs; 40.30 dB PSNR on SIDD (for image denoising), exceeding the previous SOTA 0.28 dB with less than half of its computational costs.
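The "replaced by multiplication" idea from the abstract refers to a gating operation: instead of applying a nonlinearity, the feature map is split in half along the channel dimension and the two halves are multiplied element-wise. As a rough sketch (using NumPy rather than the repository's MindSpore code, so the function name and shapes here are illustrative, not the project's API):

```python
import numpy as np

def simple_gate(x):
    """Activation-free gating: split channels in half, multiply element-wise.

    x: feature map of shape (N, C, H, W) with an even channel count C.
    Returns a tensor of shape (N, C // 2, H, W).
    """
    x1, x2 = np.split(x, 2, axis=1)  # two (N, C/2, H, W) halves
    return x1 * x2

# Example: an (1, 8, 4, 4) feature map gates down to (1, 4, 4, 4).
feat = np.ones((1, 8, 4, 4))
out = simple_gate(feat)
print(out.shape)  # (1, 4, 4, 4)
```

Because the multiplication of two learned feature halves is itself nonlinear in the input, the network keeps expressive power without any Sigmoid/ReLU/GELU layers.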

Paper: Simple Baselines for Image Restoration

Original github repository

This work uses the GoPro dataset, which consists of 2,103 training and 1,111 test images with a resolution of 1280 × 720.

Download link.

Ascend 910

  • Hardware (Ascend)
    • Prepare the hardware environment with an Ascend 910 (cann_6.0.0, euler_2.8, py_3.7)
  • Framework
    • MindSpore 1.9.0
| Parameters                 | NAFNet deblur (8xNPU)        |
| -------------------------- | ---------------------------- |
| Model Version              | NAFNet                       |
| Resources                  | 1x Ascend 910A               |
| Uploaded Date              | 06/14/2023 (month/day/year)  |
| MindSpore Version          | 1.9.0                        |
| Dataset                    | GoPro                        |
| Training Parameters        | batch_size=16, 10000 epochs  |
| Optimizer                  | Adam                         |
| Speed                      | 780 ms/step                  |
| Total time                 | 2d 12h 31m                   |
| Checkpoint for Fine tuning | 274.9 MB (.ckpt file)        |
| Parameters         | NAFNet deblur (1xNPU, CANN) |
| ------------------ | --------------------------- |
| Model Version      | NAFNet                      |
| Resources          | 1x Ascend 910A              |
| Uploaded Date      | 06/14/2023 (month/day/year) |
| MindSpore Version  | 1.9.0                       |
| Dataset            | GoPro                       |
| Batch size         | 1                           |
| Inference speed, s | 0.041 (1280 × 720)          |
| PSNR metric        | 30.48                       |
| SSIM metric        | 0.9041                      |
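For reference, the PSNR values quoted above follow the standard definition, 10·log10(MAX² / MSE). A minimal sketch (plain NumPy, not the repository's evaluation code, and assuming 8-bit images with MAX = 255):

```python
import numpy as np

def psnr(reference, restored, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two images of equal shape."""
    mse = np.mean((reference.astype(np.float64) - restored.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# Example: a constant error of 10 gray levels gives MSE = 100,
# so PSNR = 10 * log10(255^2 / 100) ≈ 28.13 dB.
ref = np.zeros((4, 4))
out = np.full((4, 4), 10.0)
print(round(psnr(ref, out), 2))  # 28.13
```

Higher is better: the 40.30 dB SIDD and 33.69 dB GoPro figures in the abstract are measured the same way, averaged over the test set.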