TimeLens: Event-based Video Frame Interpolation

Keywords: Ghosting, Interpolation, Motion interpolation, Optical Flow, Key frame, Frame rate
DOI: 10.48550/arxiv.2106.07286 Publication Date: 2021-01-01
ABSTRACT
State-of-the-art frame interpolation methods generate intermediate frames by inferring object motions in the image from consecutive key-frames. In the absence of additional information, first-order approximations, i.e. optical flow, must be used, but this choice restricts the types of motions that can be modeled, leading to errors in highly dynamic scenarios. Event cameras are novel sensors that address this limitation by providing auxiliary visual information in the blind-time between frames. They asynchronously measure per-pixel brightness changes and do so with high temporal resolution and low latency. Event-based frame interpolation methods typically adopt a synthesis-based approach, where predicted frame residuals are directly applied to the key-frames. However, while these approaches can capture non-linear motions, they suffer from ghosting and perform poorly in low-texture regions with few events. Thus, synthesis-based and flow-based approaches are complementary. In this work, we introduce Time Lens, a novel method that leverages the advantages of both. We extensively evaluate our method on three synthetic and two real benchmarks, where we show an up to 5.21 dB improvement in terms of PSNR over state-of-the-art frame-based and event-based methods. Finally, we release a new large-scale dataset in highly dynamic scenarios, aimed at pushing the limits of existing methods.
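The abstract contrasts flow-based interpolation, which relies on a first-order (linear-motion) approximation, with synthesis-based prediction. As a rough illustration of the linear-motion assumption only (this is not the Time Lens method; the function names and the nearest-neighbor warp are illustrative), the sketch below synthesizes a middle frame by warping both key-frames along a scaled flow field and blending:

```python
import numpy as np

def backward_warp(frame, flow):
    """Nearest-neighbor warp: output pixel (y, x) samples frame at
    (y - v, x - u), where flow[..., 0] = u and flow[..., 1] = v."""
    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(np.round(xs - flow[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys - flow[..., 1]).astype(int), 0, h - 1)
    return frame[src_y, src_x]

def interpolate_flow_based(frame0, frame1, flow01, t):
    """First-order interpolation at time t in (0, 1).

    Assumes pixels move linearly along flow01 (the flow from frame0 to
    frame1): each key-frame is warped toward time t and the two warps
    are blended. This is exactly the assumption the abstract notes
    breaks down for highly dynamic (non-linear) motion.
    """
    warped0 = backward_warp(frame0, t * flow01)           # pull frame0 forward in time
    warped1 = backward_warp(frame1, -(1 - t) * flow01)    # pull frame1 backward in time
    return (1 - t) * warped0 + t * warped1
```

For a uniform rightward motion of 2 pixels, an impulse at x=2 in the first key-frame and x=4 in the second lands at x=3 in the interpolated middle frame, as the linear model predicts.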