
An Efficient Parallel Strategy for Matching Visual Self-similarities in Large Image Databases

Katharina Schwarz, Tobias Häußler, and Hendrik P.A. Lensch

Computer Graphics, Tübingen University, 72076 Tübingen, Germany

Abstract. Due to the high interest in social online systems, there exists a huge and still increasing amount of image data on the web. In order to handle this massive amount of visual information, algorithms often need to be redesigned. In this work, we developed an efficient approach to find visual similarities between images that runs completely on the GPU and is applicable to large image databases. Based on local self-similarity descriptors, the approach finds similarities even across modalities. Given a set of images, a database is created by storing all descriptors in an arrangement suitable for parallel GPU-based comparison. A novel voting scheme further considers the spatial layout of descriptors with hardly any overhead. Thousands of images are searched in only a few seconds. We apply our algorithm to cluster a set of image responses to identify the various senses of ambiguous words and to re-tag similar images with missing tags.
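To illustrate the kind of parallel descriptor comparison the abstract refers to, the following is a minimal CUDA sketch, not taken from the paper: it assumes the database descriptors are stored as one flat float array (a hypothetical layout), uses an assumed descriptor length of 80, and stands in for the paper's comparison and voting scheme with a brute-force squared L2 distance computed by one thread per database descriptor.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// One thread per database descriptor: compute the squared L2 distance
// between the query descriptor and each stored descriptor.
// (Illustrative sketch only; the paper's actual comparison and voting
// scheme are not reproduced here.)
__global__ void matchDescriptors(const float *db, const float *query,
                                 float *dist, int numDesc, int dim)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numDesc) return;

    float d = 0.0f;
    for (int k = 0; k < dim; ++k) {
        float diff = db[i * dim + k] - query[k];
        d += diff * diff;
    }
    dist[i] = d;
}

int main()
{
    const int numDesc = 1 << 16;  // hypothetical database size
    const int dim     = 80;       // assumed self-similarity descriptor length

    // Allocate device buffers (zero-initialised here for brevity;
    // a real system would upload descriptors extracted from images).
    float *dDb, *dQuery, *dDist;
    cudaMalloc(&dDb,    numDesc * dim * sizeof(float));
    cudaMalloc(&dQuery, dim * sizeof(float));
    cudaMalloc(&dDist,  numDesc * sizeof(float));
    cudaMemset(dDb, 0, numDesc * dim * sizeof(float));
    cudaMemset(dQuery, 0, dim * sizeof(float));

    // Launch one thread per database descriptor.
    int threads = 256;
    int blocks  = (numDesc + threads - 1) / threads;
    matchDescriptors<<<blocks, threads>>>(dDb, dQuery, dDist, numDesc, dim);
    cudaDeviceSynchronize();

    cudaFree(dDb); cudaFree(dQuery); cudaFree(dDist);
    return 0;
}
```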

LNCS 7583, p. 281 ff.



© Springer-Verlag Berlin Heidelberg 2012