
Spatial-Temporal Correlation Learning for Real-Time Video DeInterlacing

Yuqing Liu, Xinfeng Zhang, Shanshe Wang, Siwei Ma, and Wen Gao
Accepted by ICME 2021 (IEEE Xplore)

Abstract

Deinterlacing is a classical problem in video processing, which aims to generate a progressive video from an interlaced one. Although numerous algorithms have been proposed over the past decades, their performance is still not satisfactory in terms of both quality of experience and processing efficiency. This paper focuses on the spatial-temporal correlation within the given frame and designs a network for recovering the missing field. Intra-frame motion compensation between the given fields is considered for detail refinement. Furthermore, we exploit the inherent correlations among image features with channel attention. Extensive experimental results on different video sequences show that our method outperforms state-of-the-art methods in both objective and subjective evaluations while satisfying the real-time requirement.
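For readers new to the task, the sketch below illustrates two generic building blocks mentioned in the abstract: splitting an interlaced frame into its two fields (the input/target layout any deinterlacer works with) and a squeeze-and-excitation style channel-attention block. This is a minimal PyTorch illustration under our own assumptions, not the released ST-DeInt architecture; all function and class names here are hypothetical.

```python
# Illustrative sketch only: common deinterlacing building blocks,
# not the authors' released network.
import torch
import torch.nn as nn


def split_fields(frame: torch.Tensor):
    """Split an interlaced frame (N, C, H, W) into top and bottom fields.

    The top field holds the even scan lines, the bottom field the odd ones;
    a deinterlacer must reconstruct the rows missing from each field.
    """
    top = frame[:, :, 0::2, :]     # even scan lines
    bottom = frame[:, :, 1::2, :]  # odd scan lines
    return top, bottom


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention over feature maps."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # global spatial average pooling
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(self.pool(x))  # per-channel re-weighting


if __name__ == "__main__":
    frame = torch.rand(1, 3, 480, 720)    # dummy interlaced frame
    top, bottom = split_fields(frame)
    print(top.shape, bottom.shape)        # (1, 3, 240, 720) each
    attn = ChannelAttention(64)
    feats = torch.rand(1, 64, 240, 720)
    print(attn(feats).shape)              # (1, 64, 240, 720)
```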

News

2021/11/04: We won 5th place on the MSU Deinterlacer Benchmark (link).
