Revisiting Implicit Neural Representations in Low-Level Vision

Wentian Xu, Jianbo Jiao

Research output: Working paper/Preprint


Abstract

Implicit Neural Representation (INR) has been emerging in computer vision in recent years. It has been shown to be effective in parameterising continuous signals such as dense 3D models from discrete image data, e.g. the neural radiance field (NeRF). However, INR is under-explored in 2D image processing tasks. Considering the basic definition and the structure of INR, we are interested in its effectiveness in low-level vision problems such as image restoration. In this work, we revisit INR and investigate its application in low-level image restoration tasks including image denoising, super-resolution, inpainting, and deblurring. Extensive experimental evaluations suggest the superior performance of INR in several low-level vision tasks with limited resources, outperforming its counterparts by over 2dB. Code and models are available at https://github.com/WenTXuL/LINR.
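To illustrate the core idea behind INR (not the paper's actual model, which is an MLP trained by gradient descent): an image is treated as samples of a continuous function f(x, y) → intensity, and a parametric model is fitted to those samples so the signal can be queried at arbitrary coordinates. A minimal sketch below uses random Fourier features with a linear least-squares fit as a stand-in for the neural network; all names and parameter choices here are illustrative assumptions.

```python
import numpy as np

# Minimal INR-style sketch: represent a discrete image as a
# continuous function of pixel coordinates. Here the "network" is
# simplified to random Fourier features + a linear readout fitted
# by least squares; a real INR would train an MLP instead.

rng = np.random.default_rng(0)

# "Ground-truth" 16x16 grayscale image: a smooth diagonal gradient.
H = W = 16
ys, xs = np.mgrid[0:H, 0:W]
img = (xs + ys) / (H + W - 2)  # values in [0, 1]

# Pixel coordinates normalised to [-1, 1], one row per pixel.
coords = np.stack([xs.ravel(), ys.ravel()], axis=1) / (W - 1) * 2.0 - 1.0

# Random Fourier feature encoding: gamma(v) = [cos(Bv), sin(Bv)].
B = rng.normal(scale=3.0, size=(2, 64))
feats = np.concatenate([np.cos(coords @ B), np.sin(coords @ B)], axis=1)

# Fit linear weights so that feats @ w approximates the pixel values.
w, *_ = np.linalg.lstsq(feats, img.ravel(), rcond=None)

# The representation is continuous: any (x, y) in [-1, 1]^2 can be
# queried, not just the original pixel grid.
recon = (feats @ w).reshape(H, W)
mse = float(np.mean((recon - img) ** 2))
print(f"reconstruction MSE: {mse:.2e}")
```

Because the representation is a function of continuous coordinates, tasks like super-resolution or inpainting reduce to evaluating the fitted function at new or missing coordinate locations, which is the property the paper exploits for low-level restoration.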
Original language: English
Publisher: arXiv
Publication status: Published - 20 Apr 2023

Bibliographical note

Published at the ICLR 2023 Neural Fields workshop. Project Webpage: https://wentxul.github.io/LINR-projectpage
