Sparse Color Interest Points for Image Retrieval and Object Categorization
Interest point detection is an important research area in the field of image processing and computer vision. In particular, image retrieval and object categorization heavily rely on interest point detection, from which local image descriptors are computed for image matching. In general, interest points are based on luminance, and color has been largely ignored. However, the use of color increases the distinctiveness of interest points. The use of color may therefore provide selective search, reducing the total number of interest points used for image matching. This paper proposes color interest points for sparse image representation. To reduce the sensitivity to varying imaging conditions, light-invariant interest points are introduced. Color statistics based on occurrence probability lead to color boosted points, which are obtained through saliency-based feature selection. Furthermore, a principal component analysis-based scale selection method is proposed, which gives a robust scale estimation per interest point. From large-scale experiments, it is shown that the proposed color interest point detector has higher repeatability than a luminance-based one. Furthermore, in the context of image retrieval, a reduced and predictable number of color features shows an increase in performance compared to state-of-the-art interest points. Finally, in the context of object recognition, for the Pascal VOC 2007 challenge, our method gives comparable performance to state-of-the-art methods using only a small fraction of the features, reducing the computing time considerably.
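The idea of boosting points by color occurrence probability can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, the color quantization into 8 bins per channel, and the simple top-k selection are assumptions made here for illustration. Pixels whose quantized color is rare in the image receive a high saliency score and are kept as sparse interest points:

```python
import numpy as np

def color_boosted_points(img, num_points=100, bins=8):
    """Toy color-boosted interest point selection (illustrative only).

    img: H x W x 3 float array with values in [0, 1].
    Returns up to num_points (row, col) locations of rare-colored pixels.
    """
    # 1. Quantize each channel into `bins` levels and build a joint color index.
    q = np.clip((img * bins).astype(int), 0, bins - 1)
    idx = q[..., 0] * bins * bins + q[..., 1] * bins + q[..., 2]

    # 2. Estimate the occurrence probability of each quantized color.
    counts = np.bincount(idx.ravel(), minlength=bins ** 3).astype(float)
    prob = counts / counts.sum()

    # 3. Boost rare colors: saliency is the inverse occurrence probability.
    saliency = 1.0 / (prob[idx] + 1e-6)

    # 4. Keep the top-k most salient pixel locations as sparse interest points.
    flat = np.argsort(saliency.ravel())[::-1][:num_points]
    ys, xs = np.unravel_index(flat, saliency.shape)
    return list(zip(ys.tolist(), xs.tolist()))
```

For example, in an image that is almost entirely red with a single blue pixel, the blue pixel's color has the lowest occurrence probability, so it is returned first. A real detector would combine such color statistics with a spatial differential structure (e.g., Harris energy) rather than ranking raw pixels.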
- System : Pentium Dual Core
- Hard Disk : 120 GB
- Monitor : 15'' LED
- Input Devices : Keyboard, Mouse
- RAM : 1 GB
- Operating System : Windows 7
- Coding Language : MATLAB
- Tool : MATLAB R2013a
Julian Stöttinger, Allan Hanbury, Nicu Sebe, and Theo Gevers, "Sparse Color Interest Points for Image Retrieval and Object Categorization," IEEE Transactions on Image Processing, vol. 21, no. 5, May 2012.