Traditional focal stack methods require multiple shots to capture images of the same scene focused at different distances, which makes them ill-suited to dynamic scenes. Generating a high-quality all-in-focus image from a single shot is challenging due to the highly ill-posed nature of single-image defocus deblurring. In this paper, to restore an all-in-focus image, we propose the event focal stack, defined as the event stream captured during a continuous focal sweep. Given an RGB image focused at an arbitrary distance, we exploit the high temporal resolution of event streams to automatically select refocusing timestamps and reconstruct the corresponding refocused images from events, forming a focal stack. Guided by the events neighboring the selected timestamps, we merge the focal stack with proper weights and restore a sharp all-in-focus image. Experimental results on both synthetic and real datasets show superior performance over state-of-the-art methods.
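To make the merging step concrete, here is a minimal sketch of how a focal stack can be fused into an all-in-focus image using per-pixel sharpness weights. This is a classical Laplacian-energy baseline for illustration only, not the paper's learned, event-guided merging; all function names (`laplacian_energy`, `merge_focal_stack`) are hypothetical.

```python
import numpy as np

def laplacian_energy(img):
    """Per-pixel focus measure: magnitude of a discrete 4-neighbor Laplacian.
    Sharp (in-focus) regions contain high-frequency content, so their
    Laplacian response is large; defocused regions respond weakly."""
    lap = (np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0) +
           np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1) - 4.0 * img)
    return np.abs(lap)

def merge_focal_stack(stack, eps=1e-8):
    """Fuse a focal stack into one image by a sharpness-weighted average.

    stack: (N, H, W) array of grayscale slices focused at N distances.
    Returns an (H, W) image where, at each pixel, the sharpest slices
    dominate the weighted combination.
    """
    weights = np.stack([laplacian_energy(s) for s in stack])   # (N, H, W)
    weights = weights / (weights.sum(axis=0, keepdims=True) + eps)
    return (weights * stack).sum(axis=0)

# Toy usage: merge three random "slices" of a small stack.
stack = np.random.rand(3, 16, 16)
all_in_focus = merge_focal_stack(stack)
```

In the paper's setting, the slices of this stack are not captured sequentially; they are reconstructed from the event focal stack at the selected refocusing timestamps, and the merging weights are derived from the neighboring events rather than a hand-crafted focus measure.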

Animated Results

Visual quality comparison with an image-based focal stack method.

(a) Image focal stack.

(b) All-in-focus image restored by Zhou et al.

(c) Visualization of the event focal stack (EFS).

(d) All-in-focus image restored by our method.


@InProceedings{Lou_2023_CVPR,
    author    = {Lou, Hanyue and Teng, Minggui and Yang, Yixin and Shi, Boxin},
    title     = {All-in-Focus Imaging From Event Focal Stack},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2023},
    pages     = {17366-17375}
}