Abstract

Most existing monocular-vision localization methods imitate stereo vision by matching corresponding points, which is often computationally expensive. This paper proposes a novel absolute localization estimation method based on monocular vision that exploits the mapping between projection points and their corresponding points in the image, together with an algorithm for error amending. Firstly, the pinhole-imaging mapping from 3D world points to the 2D image is established, and a model relating actual area in metric units to image area in pixel units is built through camera calibration. The area in the image is calculated by accumulating finitely many small rectangles and compared with the actual area in the world, from which the distance along the z-axis is computed. The absolute distance between the camera and the target is then derived from the relationship between projection points based on projective geometry. Next, a method for recovering the 3D position of a known target is proposed, together with an error-correction method based on line fitting. Finally, two experiments are carried out that validate the proposed method.
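The core area-to-depth step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a fronto-parallel planar target and known camera intrinsics, under which the pinhole model maps a world patch of area A_w at depth Z to an image patch of roughly fx·fy·A_w/Z² pixels. The function name and parameters are hypothetical.

```python
import math

def estimate_depth(area_world_m2, area_image_px, fx, fy):
    """Estimate camera-to-target distance Z (metres) for a fronto-parallel
    planar target.  Pinhole relation: area_image_px = fx*fy*area_world_m2/Z**2,
    so Z = sqrt(fx*fy*area_world_m2/area_image_px).
    (Hypothetical helper; the paper measures the pixel area by accumulating
    small rectangles over the target's image region.)"""
    return math.sqrt(fx * fy * area_world_m2 / area_image_px)

# Example: a 0.10 m x 0.10 m marker (0.01 m^2) imaged as 2500 px^2
# with fx = fy = 1000 px  ->  Z = sqrt(1000*1000*0.01/2500) = 2.0 m
z = estimate_depth(0.01, 2500.0, 1000.0, 1000.0)
```

With Z known, the remaining x and y coordinates follow from the standard pinhole back-projection of the target's image coordinates, which is the "relationship between projection points" the abstract refers to.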