IRM: Integrated Region Matching for Image Retrieval
Jia Li, James Z. Wang, Gio Wiederhold
Stanford University, Stanford, CA 94305
Abstract:
Content-based image retrieval using region segmentation has been an
active research area. We present IRM (Integrated Region Matching), a
novel similarity measure for region-based image similarity comparison.
The image retrieval systems targeted by IRM represent an image by a set of
regions, roughly corresponding to objects, which are characterized by
features reflecting color, texture, shape, and location properties.
The IRM measure for evaluating overall similarity between images
incorporates properties of all the regions in the images by a
region-matching scheme. Compared with retrieval based on individual
regions, the overall similarity approach reduces the influence of
inaccurate segmentation, helps to clarify the semantics of a
particular region, and enables a simple querying interface for
region-based image retrieval systems. IRM has been implemented as
part of our experimental SIMPLIcity image retrieval system. The
application to a database of about 200,000 general-purpose images
shows exceptional robustness to image alterations such as intensity
variation, sharpness variation, color distortions, shape distortions,
cropping, shifting, and rotation. Compared with several existing
systems, our system generally achieves more accurate retrieval at
higher speed.
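To make the region-matching idea concrete, the following is a minimal sketch (not the authors' code) of how an integrated region-matching distance could be computed. It assumes each image is given as a list of (feature vector, weight) region pairs whose weights sum to 1, uses Euclidean distance between region features, and allocates significance credits greedily to the most similar region pairs first; all of these specifics are illustrative assumptions, not details taken from the abstract.

```python
import math

def region_distance(f1, f2):
    """Euclidean distance between two region feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(f1, f2)))

def irm_distance(regions1, regions2):
    """Significance-weighted distance between two region sets.

    regions1, regions2: lists of (features, weight) tuples; the region
    weights of each image sum to 1.  (Illustrative sketch only.)
    """
    # Remaining significance credit for each region.
    w1 = [w for _, w in regions1]
    w2 = [w for _, w in regions2]
    # All pairwise region distances, sorted so that the most similar
    # pair is matched (and consumes significance credit) first.
    pairs = sorted(
        (region_distance(f1, f2), i, j)
        for i, (f1, _) in enumerate(regions1)
        for j, (f2, _) in enumerate(regions2)
    )
    total = 0.0
    for d, i, j in pairs:
        s = min(w1[i], w2[j])  # credit assignable to this region pair
        if s > 0:
            total += s * d
            w1[i] -= s
            w2[j] -= s
    return total
```

Because every region of one image can be matched, fractionally, against several regions of the other, a single mis-segmented region contributes only part of the overall distance, which is one way to read the abstract's claim of robustness to inaccurate segmentation.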
Full Paper in Color: PDF (5MB), HTML
On-line Demo
Citation:
Jia Li, James Z. Wang and Gio Wiederhold, "IRM: Integrated Region
Matching for Image Retrieval," Proc. ACM Multimedia, pp. 147-156, Los
Angeles, CA, ACM, October 2000.
Copyright 2000 ACM.
Personal use of this
material is permitted. However, permission to reprint/republish this
material for advertising or promotional purposes or for creating new
collective works for resale or redistribution to servers or lists, or
to reuse any copyrighted component of this work in other works, must
be obtained from the ACM.
Last Modified:
July 10, 2000