Mean shift


Mean shift is a non-parametric feature-space analysis technique for locating the maxima of a density function, a so-called mode-seeking algorithm.^{[1]} Application domains include cluster analysis in computer vision and image processing.^{[2]}
History
The mean shift procedure was originally presented in 1975 by Fukunaga and Hostetler.^{[3]}
Overview
Mean shift is a procedure for locating the maxima of a density function given discrete data sampled from that function.^{[1]} It is useful for detecting the modes of this density.^{[1]} It is an iterative method: we start with an initial estimate $x$. Let a kernel function $K(x_i - x)$ be given. This function determines the weight of nearby points for re-estimation of the mean. Typically a Gaussian kernel on the distance to the current estimate is used, $K(x_i - x) = e^{-c\|x_i - x\|^2}$. The weighted mean of the density in the window determined by $K$ is

$$m(x) = \frac{\sum_{x_i \in N(x)} K(x_i - x)\, x_i}{\sum_{x_i \in N(x)} K(x_i - x)}$$

where $N(x)$ is the neighborhood of $x$, a set of points for which $K(x_i - x) \neq 0$.
The difference $m(x) - x$ is called the mean shift in Fukunaga and Hostetler.^{[3]} The mean-shift algorithm now sets $x \leftarrow m(x)$, and repeats the estimation until $m(x)$ converges.
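The iteration just described can be sketched in a few lines of NumPy. The function and parameter names below are illustrative rather than from any particular library, and a Gaussian kernel is assumed:

```python
import numpy as np

def mean_shift_point(x, points, bandwidth=1.0, tol=1e-6, max_iter=500):
    """Iterate x <- m(x) with a Gaussian kernel until the shift is tiny.

    m(x) is the kernel-weighted mean of the sample points; `bandwidth`
    plays the role of the parameter h.
    """
    x = np.asarray(x, dtype=float)
    points = np.asarray(points, dtype=float)
    for _ in range(max_iter):
        # Gaussian kernel weights K(x_i - x) = exp(-||x_i - x||^2 / (2 h^2))
        sq_dist = np.sum((points - x) ** 2, axis=1)
        weights = np.exp(-sq_dist / (2 * bandwidth ** 2))
        m = weights @ points / weights.sum()   # weighted mean m(x)
        if np.linalg.norm(m - x) < tol:        # converged: mean shift ~ 0
            return m
        x = m
    return x

# Starting away from a cloud of samples drawn around the origin,
# the iterate settles near the sample mode.
rng = np.random.default_rng(0)
data = rng.normal(0.0, 0.5, size=(200, 2))
mode = mean_shift_point([2.0, 2.0], data, bandwidth=1.0)
```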
Although the mean shift algorithm has been widely used in many applications, a rigorous proof of the convergence of the algorithm using a general kernel in a high-dimensional space is still missing.^{[4]} Aliyari Ghassabeh showed the convergence of the mean shift algorithm in one dimension with a differentiable, convex, and strictly decreasing profile function.^{[5]} However, the one-dimensional case has limited real-world applications. The convergence of the algorithm in higher dimensions with a finite number of (or isolated) stationary points has also been proved.^{[4]}^{[6]} However, sufficient conditions for a general kernel function to have finitely many (or isolated) stationary points have not been provided.
Details
Let data be a finite set $S$ embedded in the $n$-dimensional Euclidean space, $X$. Let $K$ be a flat kernel that is the characteristic function of the $\lambda$-ball in $X$,

$$K(x) = \begin{cases} 1 & \text{if } \|x\| \le \lambda \\ 0 & \text{if } \|x\| > \lambda. \end{cases}$$
In each iteration of the algorithm, $s \leftarrow m(s)$ is performed for all $s \in S$ simultaneously. The first question, then, is how to estimate the density function given a sparse set of samples. One of the simplest approaches is to just smooth the data, e.g., by convolving it with a fixed kernel of width $h$,

$$f(x) = \sum_i K(x - x_i) = \sum_i k\!\left(\frac{\|x - x_i\|^2}{h^2}\right)$$

where $x_i$ are the input samples and $k(r)$ is the kernel function (or Parzen window). $h$ is the only parameter in the algorithm and is called the bandwidth. This approach is known as kernel density estimation or the Parzen window technique. Once we have computed $f(x)$ from the equation above, we can find its local maxima using gradient ascent or some other optimization technique. The problem with this "brute force" approach is that, for higher dimensions, it becomes computationally prohibitive to evaluate $f(x)$ over the complete search space. Instead, mean shift uses a variant of what is known in the optimization literature as multiple restart gradient descent. Starting at some guess for a local maximum, $y_k$, which can be a random input data point $x_1$, mean shift computes the gradient of the density estimate $f(x)$ at $y_k$ and takes an uphill step in that direction.
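As a concrete illustration of the Parzen-window estimate just described, here is a minimal sketch with an unnormalised Gaussian profile (the names are hypothetical):

```python
import numpy as np

def kde(x, samples, h=1.0):
    """Parzen-window density estimate f(x) = sum_i k(||x - x_i||^2 / h^2)
    using an unnormalised Gaussian profile k(r) = exp(-r / 2)."""
    samples = np.asarray(samples, dtype=float)
    r = np.sum((samples - np.asarray(x, dtype=float)) ** 2, axis=1) / h ** 2
    return np.exp(-r / 2).sum()

samples = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]])
# The estimated density is higher near the pair of points at the
# origin than near the lone point at (5, 5).
```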
Types of kernels
Kernel definition: Let $X$ be the $n$-dimensional Euclidean space, $\mathbb{R}^n$. Denote the $i$th component of $x \in X$ by $x_i$. The norm of $x$ is a non-negative number, $\|x\|^2 = x^\top x$. A function $K: X \to \mathbb{R}$ is said to be a kernel if there exists a profile, $k: [0, \infty) \to \mathbb{R}$, such that

$$K(x) = k(\|x\|^2)$$

and
 $k$ is non-negative.
 $k$ is non-increasing: $k(a) \ge k(b)$ if $a < b$.
 $k$ is piecewise continuous and $\int_0^\infty k(r)\,dr < \infty$.
The two most frequently used kernel profiles for mean shift are:
 Flat kernel

$$k(x) = \begin{cases} 1 & \text{if } x \le \lambda \\ 0 & \text{if } x > \lambda \end{cases}$$

 Gaussian kernel

$$k(x) = e^{-x/(2\sigma^2)},$$

where the standard deviation parameter $\sigma$ works as the bandwidth parameter, $h$.
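The two profiles can be written directly as small functions. The following is a sketch; `lam` and `sigma` stand in for $\lambda$ and $\sigma$:

```python
import numpy as np

def flat_profile(r, lam=1.0):
    """Flat profile: 1 inside the lambda-ball (r <= lambda), 0 outside."""
    return np.where(np.asarray(r, dtype=float) <= lam, 1.0, 0.0)

def gaussian_profile(r, sigma=1.0):
    """Gaussian profile k(r) = exp(-r / (2 sigma^2)); sigma acts as the
    bandwidth h. Non-negative and non-increasing in r, as required."""
    return np.exp(-np.asarray(r, dtype=float) / (2 * sigma ** 2))
```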
Applications
Clustering
Consider a set of points in twodimensional space. Assume a circular window centered at C and having radius r as the kernel. Mean shift is a hill climbing algorithm which involves shifting this kernel iteratively to a higher density region until convergence. Every shift is defined by a mean shift vector. The mean shift vector always points toward the direction of the maximum increase in the density. At every iteration the kernel is shifted to the centroid or the mean of the points within it. The method of calculating this mean depends on the choice of the kernel. In this case if a Gaussian kernel is chosen instead of a flat kernel, then every point will first be assigned a weight which will decay exponentially as the distance from the kernel's center increases. At convergence, there will be no direction at which a shift can accommodate more points inside the kernel.
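The clustering procedure described above, shifting every point uphill and then grouping points that converged to the same mode, can be sketched as follows (illustrative code, not a library API; modes closer than one bandwidth are merged):

```python
import numpy as np

def mean_shift_cluster(points, bandwidth, tol=1e-4, max_iter=300):
    """Shift every point with a Gaussian kernel, then group points
    whose iterates converged to (nearly) the same mode."""
    pts = np.asarray(points, dtype=float)
    shifted = pts.copy()
    for _ in range(max_iter):
        moved = 0.0
        for i in range(len(shifted)):
            x = shifted[i]
            w = np.exp(-np.sum((pts - x) ** 2, axis=1) / (2 * bandwidth ** 2))
            m = w @ pts / w.sum()          # centroid of the weighted window
            moved = max(moved, float(np.linalg.norm(m - x)))
            shifted[i] = m
        if moved < tol:                    # all mean shift vectors ~ 0
            break
    # Group converged points: same cluster if modes are within one bandwidth.
    modes, labels = [], []
    for y in shifted:
        for j, mode in enumerate(modes):
            if np.linalg.norm(y - mode) < bandwidth:
                labels.append(j)
                break
        else:
            modes.append(y)
            labels.append(len(modes) - 1)
    return np.array(modes), np.array(labels)

# Two well-separated blobs should yield two modes.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(4, 0.3, (50, 2))])
modes, labels = mean_shift_cluster(data, bandwidth=1.0)
```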
Tracking
The mean shift algorithm can be used for visual tracking. The simplest such algorithm would create a confidence map in the new image based on the color histogram of the object in the previous image, and use mean shift to find the peak of the confidence map near the object's old position. The confidence map is a probability density function on the new image, assigning each pixel of the new image a probability, which is the probability of the pixel color occurring in the object in the previous image. A few algorithms, such as ensemble tracking^{[7]} and CAMshift,^{[8]}^{[9]} expand on this idea.
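A toy version of this idea, a grayscale histogram back-projected into a confidence map, with a square window mean-shifted to the map's peak, might look like the sketch below. All names and the test frame are hypothetical; real trackers such as CAMshift use color histograms and adapt the window size:

```python
import numpy as np

def confidence_map(image, object_hist, bins=8):
    """Back-project a grayscale object histogram: each pixel receives the
    probability of its intensity bin under the object's histogram."""
    idx = np.minimum((image * bins).astype(int), bins - 1)
    return object_hist[idx]

def track_window(conf, center, radius=3, steps=20):
    """Mean shift a square window over the confidence map to its peak."""
    ys, xs = np.mgrid[0:conf.shape[0], 0:conf.shape[1]]
    cy, cx = center
    for _ in range(steps):
        mask = (np.abs(ys - cy) <= radius) & (np.abs(xs - cx) <= radius)
        w = conf * mask
        if w.sum() == 0:
            break
        ny = w.ravel() @ ys.ravel() / w.sum()   # weighted centroid of window
        nx = w.ravel() @ xs.ravel() / w.sum()
        if abs(ny - cy) < 0.5 and abs(nx - cx) < 0.5:
            break                               # shift vector ~ 0: converged
        cy, cx = ny, nx
    return int(round(cy)), int(round(cx))

# Hypothetical frame: a bright 5x5 "object" on a dark background.
frame = np.zeros((20, 20))
frame[10:15, 10:15] = 0.9
hist = np.zeros(8)
hist[7] = 1.0            # the object's (assumed) intensity histogram
conf = confidence_map(frame, hist)
peak = track_window(conf, center=(8, 8))
```

Starting from the stale position (8, 8), the window climbs the confidence map and comes to rest on the object's centre at (12, 12).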
Smoothing
Let $x_i$ and $z_i$, $i = 1, \ldots, n$, be the $d$-dimensional input and filtered image pixels in the joint spatial-range domain. For each pixel,
 Initialize $j = 1$ and $y_{i,1} = x_i$.
 Compute $y_{i,j+1}$ according to the mean shift procedure until convergence, $y = y_{i,c}$.
 Assign $z_i = (x_i^s, y_{i,c}^r)$. The superscripts $s$ and $r$ denote the spatial and range components of a vector, respectively. The assignment specifies that the filtered data at the spatial location $x_i^s$ will have the range component of the point of convergence $y_{i,c}^r$.
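The filtering steps above can be sketched as follows. This is a deliberately slow, per-pixel implementation with assumed bandwidths `hs` and `hr` for the spatial and range components:

```python
import numpy as np

def mean_shift_filter(image, hs=2.0, hr=0.2, tol=1e-3, max_iter=30):
    """Discontinuity-preserving smoothing sketch: each pixel is a point
    (row, col, intensity) in the joint spatial-range domain; iterate it
    to convergence and keep only the converged range (intensity) part."""
    rows, cols = image.shape
    ys, xs = np.mgrid[0:rows, 0:cols]
    # Scale each component by its bandwidth so a unit Gaussian applies.
    pts = np.stack([ys.ravel() / hs, xs.ravel() / hs,
                    image.ravel() / hr], axis=1)
    out = np.empty(rows * cols)
    for i in range(rows * cols):
        y = pts[i].copy()
        for _ in range(max_iter):
            w = np.exp(-np.sum((pts - y) ** 2, axis=1) / 2)
            m = w @ pts / w.sum()              # mean shift update m(y)
            if np.linalg.norm(m - y) < tol:
                break
            y = m
        out[i] = y[2] * hr   # keep the range component of convergence
    return out.reshape(rows, cols)

# A noisy step edge: smoothing flattens each side but keeps the edge,
# because the range distance across the step suppresses those weights.
rng = np.random.default_rng(2)
img = np.zeros((8, 8))
img[:, 4:] = 1.0
img += rng.normal(0, 0.05, (8, 8))
smoothed = mean_shift_filter(img)
```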
Strengths
 Mean shift is an application-independent tool suitable for real data analysis.
 Does not assume any predefined shape of the data clusters.
 It is capable of handling arbitrary feature spaces.
 The procedure relies on the choice of a single parameter: the bandwidth.
 The bandwidth/window size h has a physical meaning, unlike in k-means.
Weaknesses
 The selection of a window size is not trivial.
 Inappropriate window size can cause modes to be merged, or generate additional “shallow” modes.
 Often requires using adaptive window size.
References
 ↑ Cheng, Yizong (August 1995). "Mean Shift, Mode Seeking, and Clustering". IEEE Transactions on Pattern Analysis and Machine Intelligence. 17 (8): 790–799. doi:10.1109/34.400568.
 ↑ Comaniciu, Dorin; Meer, Peter (May 2002). "Mean Shift: A Robust Approach Toward Feature Space Analysis". IEEE Transactions on Pattern Analysis and Machine Intelligence. 24 (5): 603–619. doi:10.1109/34.1000236.
 ↑ Fukunaga, Keinosuke; Hostetler, Larry D. (January 1975). "The Estimation of the Gradient of a Density Function, with Applications in Pattern Recognition". IEEE Transactions on Information Theory. 21 (1): 32–40. doi:10.1109/TIT.1975.1055330.
 ↑ Aliyari Ghassabeh, Youness (March 2015). "A sufficient condition for the convergence of the mean shift algorithm with Gaussian kernel". Journal of Multivariate Analysis. 135: 1–10. doi:10.1016/j.jmva.2014.11.009.
 ↑ Aliyari Ghassabeh, Youness (September 2013). "On the convergence of the mean shift algorithm in the one-dimensional space". Pattern Recognition Letters. 34 (12): 1423–1427. doi:10.1016/j.patrec.2013.05.004.
 ↑ Li, Xiangru; Hu, Zhanyi; Wu, Fuchao (June 2007). "A note on the convergence of the mean shift". Pattern Recognition. 40 (6): 1756–1762. doi:10.1016/j.patcog.2006.10.016.
 ↑ Avidan, Shai (2005). "Ensemble Tracking". 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05). San Diego, California: IEEE. ISBN 0-7695-2372-2.
 ↑ Bradski, Gary (1998). "Computer Vision Face Tracking for Use in a Perceptual User Interface". Intel Technology Journal, Q2.
 ↑ Emami, Ebrahim (2013). "Online failure detection and correction for CAMShift tracking algorithm". 2013 Iranian Conference on Machine Vision and Image Processing (MVIP). IEEE: 180–183.
External links
Code implementations
 Aiphial. Java-based mean shift implementation for numeric data clustering and image segmentation.
 ImageJ. Image filtering using the mean shift filter.
 Mahout. A MapReduce-based implementation of mean shift clustering written on Apache Hadoop.
 MeanShift_py. Simple mean shift implementation in Python.
 OpenCV contains a mean shift implementation via the cvMeanShift method.
 Orfeo toolbox. A C++ implementation.
 scikit-learn. NumPy/Python implementation that uses a ball tree for efficient neighboring-point lookup.