Gates Building, Room 100, 3:15-4:30
Friday, 16 Feb, 2001

The Stanford Semantics-sensitive Integrated Matching for Picture Libraries (SIMPLIcity) System

Dr. Gio Wiederhold, Computer Science, Stanford University

This lecture will describe and demonstrate the Stanford SIMPLIcity system (Semantics-sensitive Integrated Matching for Picture LIbraries), an image database retrieval system that uses high-level semantic classification and integrated region matching based on image segmentation. The SIMPLIcity system represents an image by a set of regions, roughly corresponding to objects, which are characterized by color, texture, shape, and location. Based on the segmented regions, the system classifies images into categories intended to capture semantically meaningful distinctions. These high-level categories, such as textured versus nontextured, indoor versus outdoor, objectionable versus benign, and graph versus photograph, enhance retrieval by narrowing the search range in the database and permitting semantically adaptive search methods.
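As a rough illustration of a region-based representation of this kind (a minimal sketch under assumed choices, not the system's actual segmentation algorithm), the fragment below groups pixels with k-means clustering on color and position features and summarizes each region by its average color, spatial centroid, and relative size. The function name segment_into_regions and the particular descriptors are hypothetical.

    # Hypothetical sketch of a region-based image representation.
    # Assumes an RGB image as an (H, W, 3) float array in [0, 1];
    # the real SIMPLIcity segmentation and region features differ.
    import numpy as np
    from sklearn.cluster import KMeans

    def segment_into_regions(image, n_regions=4):
        h, w, _ = image.shape
        ys, xs = np.mgrid[0:h, 0:w]
        # Per-pixel features: color plus normalized spatial position.
        feats = np.column_stack([
            image.reshape(-1, 3),
            (ys / h).ravel()[:, None],
            (xs / w).ravel()[:, None],
        ])
        labels = KMeans(n_clusters=n_regions, n_init=5).fit_predict(feats)

        regions = []
        for k in range(n_regions):
            mask = labels == k
            regions.append({
                "mean_color": feats[mask, :3].mean(axis=0),  # color descriptor
                "centroid": feats[mask, 3:].mean(axis=0),    # location descriptor
                "area": mask.mean(),                         # relative region size
            })
        return regions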

The algorithm characterizes the color variations over the spatial extent of the image in a manner that provides semantically meaningful image comparisons. The indexing algorithm applies a Daubechies wavelet transform to each of the three opponent color components. The wavelet coefficients in the lowest frequency bands, and their variances, are stored in feature vectors. The color histogram is also computed and stored. A modified tree-structured vector quantization is used to partition the high-dimensional feature space efficiently during the indexing stage. The image classes, along with the centroid image and the boundary image of each class, are stored in the tree structure.
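The sketch below illustrates this style of wavelet-based feature extraction, assuming the PyWavelets library, a simple opponent-color transform, and a mean/variance summary of the lowest-frequency band; the exact color transform, decomposition depth, and feature layout used in the system may differ.

    # Hypothetical sketch of wavelet-based index features.
    # Assumes an RGB image as an (H, W, 3) float array in [0, 1]; the
    # opponent-color transform and feature layout are illustrative only.
    import numpy as np
    import pywt

    def opponent_channels(rgb):
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        # One common opponent-color decomposition (an assumption here).
        return [(r + g + b) / 3.0, (r - b) / 2.0, (2.0 * g - r - b) / 4.0]

    def index_features(rgb, wavelet="db8", level=3, hist_bins=8):
        feats = []
        for chan in opponent_channels(rgb):
            # Multi-level Daubechies decomposition; coeffs[0] is the
            # lowest-frequency (approximation) band.
            coeffs = pywt.wavedec2(chan, wavelet, level=level)
            lowest = coeffs[0]
            feats.extend([lowest.mean(), lowest.var()])
        # Coarse color histogram over the three original channels.
        hist, _ = np.histogramdd(rgb.reshape(-1, 3), bins=hist_bins,
                                 range=[(0, 1)] * 3)
        return np.asarray(feats), hist / hist.sum()

In the system described above, feature vectors of this kind would then be organized by the tree-structured vector quantizer so that a query only needs to descend the tree rather than scan the whole database.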

The user of the system may search the database by browsing the automatically defined classes. When a query is accepted, the system performs a search refinement so that images in the same class as the query can be sorted and displayed to the user based on similarity.
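A minimal sketch of this within-class refinement step might look like the following, assuming feature vectors produced by an indexing routine such as the one sketched above; the function and index layout are illustrative.

    # Hypothetical within-class search refinement: rank the images that
    # share the query's class by feature-vector distance to the query.
    import numpy as np

    def refine_search(query_vec, query_class, index):
        """index maps image_id -> (class_label, feature_vector)."""
        candidates = [(img_id, vec) for img_id, (cls, vec) in index.items()
                      if cls == query_class]
        ranked = sorted(candidates,
                        key=lambda item: np.linalg.norm(item[1] - query_vec))
        return [img_id for img_id, _ in ranked]  # most similar first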