Normalization (statistics)

In statistics and applications of statistics, normalization can have a range of meanings.[1] In the simplest cases, normalization of ratings means adjusting values measured on different scales to a notionally common scale, often prior to averaging. In more complicated cases, normalization may refer to more sophisticated adjustments where the intention is to bring the entire probability distributions of adjusted values into alignment. In the case of normalization of scores in educational assessment, there may be an intention to align distributions to a normal distribution. A different approach to normalization of probability distributions is quantile normalization, where the quantiles of the different measures are brought into alignment.

In another usage in statistics, normalization refers to the creation of shifted and scaled versions of statistics, where the intention is that these normalized values allow the comparison of corresponding values across different datasets in a way that eliminates the effects of certain gross influences, as in an anomaly time series. Some types of normalization involve only a rescaling, to arrive at values relative to some size variable. In terms of levels of measurement, such ratios only make sense for ratio measurements (where ratios of measurements are meaningful), not interval measurements (where only distances are meaningful, but not ratios).
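
As a concrete illustration of this kind of shift-and-scale normalization, the following minimal Python sketch (assuming NumPy; the series, the choice of baseline, and the variable names are invented for illustration) converts a raw series into an anomaly series by subtracting a baseline mean, and into a standardized anomaly series by additionally dividing by the baseline standard deviation:

    import numpy as np

    # Hypothetical monthly values; the numbers are invented for illustration only.
    series = np.array([14.2, 15.1, 13.8, 16.0, 15.5, 14.9, 16.3, 15.7])

    # Baseline statistics from a reference portion of the series
    # (here the first four observations stand in for a reference period).
    baseline = series[:4]
    baseline_mean = baseline.mean()
    baseline_std = baseline.std(ddof=1)

    # Shifted version: anomalies relative to the baseline mean.
    anomalies = series - baseline_mean

    # Shifted and scaled version: standardized anomalies, comparable across datasets.
    standardized = anomalies / baseline_std

    print(anomalies)
    print(standardized)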

In theoretical statistics, parametric normalization can often lead to pivotal quantities – functions whose sampling distribution does not depend on the parameters – and to ancillary statistics – pivotal quantities that can be computed from observations, without knowing parameters.
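
For example, if X_1, \ldots, X_n are independent observations from a normal distribution with mean \mu and variance \sigma^2, then \frac{\overline{X} - \mu}{S/\sqrt{n}}, where S is the sample standard deviation, follows a Student's t distribution with n - 1 degrees of freedom regardless of the values of \mu and \sigma, and is therefore a pivotal quantity (though not an ancillary statistic, since it still involves the unknown \mu).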

Examples

There are various normalizations in statistics – nondimensional ratios of errors, residuals, means and standard deviations, which are hence scale invariant – some of which may be summarized as follows. Note that in terms of levels of measurement, these ratios only make sense for ratio measurements (where ratios of measurements are meaningful), not interval measurements (where only distances are meaningful, but not ratios). See also Category:Statistical ratios.

  • Standard score: \frac{X - \mu}{\sigma} – normalizing errors when population parameters are known. Works well for populations that are normally distributed.
  • Student's t-statistic: \frac{X - \overline{X}}{s} – normalizing residuals when population parameters are unknown (estimated).
  • Studentized residual: \frac{\hat \epsilon_i}{\hat \sigma_i} = \frac{X_i - \hat \mu_i}{\hat \sigma_i} – normalizing residuals when parameters are estimated, particularly across different data points in regression analysis.
  • Standardized moment: \frac{\mu_k}{\sigma^k} – normalizing moments, using the standard deviation \sigma as a measure of scale.
  • Coefficient of variation: \frac{\sigma}{\mu} – normalizing dispersion, using the mean \mu as a measure of scale, particularly for positive distributions such as the exponential distribution and Poisson distribution.
  • Feature scaling: X' = \frac{X - X_{\min}}{X_{\max} - X_{\min}} – rescaling used to bring all values into the range [0, 1]; this is also called unity-based normalization. It can be generalized to restrict the values to any arbitrary interval [a, b] using X' = a + \frac{\left(X - X_{\min}\right)\left(b - a\right)}{X_{\max} - X_{\min}}.
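
As a concrete illustration of several of the ratios above, the following minimal Python sketch (assuming NumPy; the data and the assumed population parameters are invented for illustration) computes standard scores, unity-based feature scaling onto [0, 1], the generalized rescaling onto an arbitrary interval [a, b], and the coefficient of variation:

    import numpy as np

    # Hypothetical observations; the numbers are invented for illustration only.
    x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])

    # Standard score, assuming the population parameters mu and sigma are known.
    mu, sigma = 5.0, 2.0
    z = (x - mu) / sigma

    # Feature scaling (unity-based normalization) onto the range [0, 1].
    x01 = (x - x.min()) / (x.max() - x.min())

    # Generalized feature scaling onto an arbitrary interval [a, b].
    a, b = -1.0, 1.0
    xab = a + (x - x.min()) * (b - a) / (x.max() - x.min())

    # Coefficient of variation: sample dispersion relative to the sample mean.
    cv = x.std(ddof=1) / x.mean()

    print(z, x01, xab, cv, sep="\n")

The coefficient of variation here uses the sample standard deviation (ddof=1) and sample mean, whereas the standard score uses the assumed known population parameters, matching the distinction drawn between the first two entries above.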

Note that some other ratios, such as the variance-to-mean ratio \left(\frac{\sigma^2}{\mu}\right), are also used for normalization but are not nondimensional: the units do not cancel, so the ratio has units and is not scale invariant.

Other types

Other non-dimensional normalizations that can be used with no assumptions on the distribution include:

  • Assignment of percentiles. This is common on standardized tests. See also quantile normalization.
  • Normalization by adding and/or multiplying by constants so values fall between 0 and 1. This is used for probability density functions, with applications in fields such as physical chemistry in assigning probabilities to |ψ|². A minimal sketch of both items in this list appears below.
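
The following Python sketch (assuming NumPy; the scores and weights are invented for illustration) uses the simple "proportion of observations at or below a value" convention for percentile ranks (other conventions exist) and a discrete analogue of scaling a density so that total probability equals 1:

    import numpy as np

    # Hypothetical test scores; the numbers are invented for illustration only.
    scores = np.array([55, 62, 71, 71, 80, 88, 93])

    # Percentile rank of each score: the proportion of observations less than
    # or equal to it, expressed as a percentage (one convention among several).
    n = len(scores)
    percentile_ranks = np.array([(scores <= s).sum() / n * 100 for s in scores])

    # Discrete analogue of normalizing a density: scale nonnegative weights
    # so they sum to 1 and can be read as probabilities.
    weights = np.array([0.5, 1.5, 3.0])
    probabilities = weights / weights.sum()

    print(percentile_ranks)
    print(probabilities)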

References

  1. Dodge, Y. (2003). The Oxford Dictionary of Statistical Terms. OUP. ISBN 0-19-920613-9 (entry for normalization of scores).

<templatestyles src="Asbox/styles.css"></templatestyles>