Detecting Dominant Vanishing Points in Natural Scenes with Application to Composition-Sensitive Image Retrieval

Zihan Zhou, Farshid Farhat and James Z. Wang
The Pennsylvania State University

Abstract:

Linear perspective is widely used in landscape photography to create the impression of depth on a 2D photo. Automated understanding of linear perspective in landscape photography has several real-world applications, including aesthetics assessment, image retrieval, and on-site feedback for photo composition, yet adequate automated understanding has been elusive. We address this problem by detecting the dominant vanishing point and the associated line structures in a photo. However, natural landscape scenes pose great technical challenges because they often lack a sufficient number of strong edges converging to the dominant vanishing point. To overcome this difficulty, we propose a novel vanishing point detection method that exploits global structures in the scene via contour detection. We show that our method significantly outperforms state-of-the-art methods on a public ground truth landscape image dataset that we have created. Based on the detection results, we further demonstrate how our approach to linear perspective understanding provides on-site guidance to amateur photographers on their work through a novel viewpoint-specific image retrieval system.
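Geometrically, a dominant vanishing point is the point toward which the detected line structures converge. As a generic illustration only (not the paper's contour-based method), a least-squares estimate of the convergence point from a set of line segments can be sketched as follows; the segment format and function name are illustrative assumptions:

```python
import numpy as np

def vanishing_point(segments):
    """Estimate the point minimizing the summed squared perpendicular
    distance to the supporting lines of the given segments.

    segments: iterable of (x1, y1, x2, y2) endpoint tuples.
    Returns the least-squares intersection point as a NumPy array.
    """
    segments = np.asarray(segments, dtype=float)
    p1, p2 = segments[:, :2], segments[:, 2:]
    d = p2 - p1
    # Unit normal of each segment's supporting line.
    n = np.stack([-d[:, 1], d[:, 0]], axis=1)
    n /= np.linalg.norm(n, axis=1, keepdims=True)
    # Each line satisfies n . x = n . p1; stack into an
    # overdetermined linear system and solve in least squares.
    b = np.sum(n * p1, axis=1)
    vp, *_ = np.linalg.lstsq(n, b, rcond=None)
    return vp

# Two segments whose supporting lines intersect at (1, 2).
segs = [(0.0, 2.0, 0.5, 2.0),   # on the horizontal line y = 2
        (1.0, 0.0, 1.0, 0.5)]   # on the vertical line x = 1
print(vanishing_point(segs))    # approximately [1. 2.]
```

In practice, a robust variant (e.g., RANSAC-style voting) would be needed to discard segments that do not belong to the converging structure, which is where natural scenes with few strong edges become difficult.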


Full Paper
(PDF, 13MB)

Datasets (dominant vanishing points in 1,316 images from the AVA landscape dataset and 959 images from Flickr)
(ZIP, 453KB)


Citation: Zihan Zhou, Farshid Farhat and James Z. Wang, "Detecting Dominant Vanishing Points in Natural Scenes with Application to Composition-Sensitive Image Retrieval," IEEE Transactions on Multimedia, vol. 19, 15 pages, 2017.

© 2017 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

Last Modified: May 9, 2017