A Hybrid Approach for Near-Range Video Stabilization

Authors: Liu, Shuaicheng*; Xu, Binhan; Deng, Chuang; Zhu, Shuyuan; Zeng, Bing*; Gabbouj, Moncef
Source: IEEE Transactions on Circuits and Systems for Video Technology, 2017, 27(9): 1922-1933.
DOI: 10.1109/TCSVT.2016.2556587

Abstract

Near-range videos contain objects that are close to the camera. These videos often contain discontinuous depth variation (DDV), which is the main challenge to existing video stabilization methods. Traditionally, 2D methods are robust to various camera motions (e.g., quick rotation and zooming) under scenes with continuous depth variation (CDV). However, in the presence of DDV, they often generate wobbled results due to the limited ability of their 2D motion models. Alternatively, 3D methods are more robust in handling near-range videos. We show that, by compensating rotational motions and ignoring translational motions, near-range videos can be successfully stabilized by 3D methods without sacrificing stability too much. However, it is time-consuming to reconstruct the 3D structure for the entire video, and sometimes even impossible due to rapid camera motions. In this paper, we combine the advantages of 2D and 3D methods, yielding a hybrid approach that is robust to various camera motions and handles near-range scenarios well. To this end, we automatically partition the input video into CDV and DDV segments. Then, the 2D and 3D approaches are adopted for the CDV and DDV clips, respectively. Finally, these segments are stitched seamlessly via a constrained optimization. We validate our method on a large variety of consumer videos.
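
The abstract describes a three-stage pipeline: partition the video into CDV and DDV segments, stabilize each segment with the matching 2D or 3D method, and stitch the results via a constrained optimization. The following Python sketch only illustrates that control flow under assumptions; every function name, the per-frame depth-variation score, and the placeholder stabilization and stitching steps are hypothetical and do not reflect the authors' actual implementation.

# Minimal sketch of the segment-then-stitch pipeline from the abstract.
# All names and scores below are hypothetical placeholders.
import numpy as np

def depth_variation_score(frame_features):
    # Hypothetical per-frame score: large values stand in for discontinuous
    # depth variation (DDV), small values for continuous depth variation (CDV).
    return float(np.var(frame_features))

def partition_video(per_frame_scores, threshold):
    # Label each frame CDV or DDV by thresholding, then group consecutive
    # frames with the same label into segments (start, end, label).
    labels = ["DDV" if s > threshold else "CDV" for s in per_frame_scores]
    segments, start = [], 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            segments.append((start, i, labels[start]))
            start = i
    return segments

def stabilize_segment(frames, label):
    # Placeholder: a 2D motion model would handle CDV clips and a
    # rotation-compensating 3D method would handle DDV clips; here the
    # frames are returned unchanged instead of being warped.
    return frames

def stitch(stabilized_segments):
    # Placeholder for the constrained optimization that joins segments
    # seamlessly; here the segments are simply concatenated.
    return [f for seg in stabilized_segments for f in seg]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = [rng.random((4, 4)) for _ in range(10)]   # toy stand-in for a video
    scores = [depth_variation_score(f) for f in frames]
    segments = partition_video(scores, threshold=float(np.median(scores)))
    out = stitch([stabilize_segment(frames[a:b], lab) for a, b, lab in segments])
    print(len(out), "frames in", len(segments), "segments")

The sketch keeps the per-segment stabilizers and the stitching step as no-ops so the overall structure stays visible; in the paper these are the 2D model, the rotation-compensating 3D method, and the constrained optimization, respectively.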