Stochastic volatility

Stochastic volatility models are those in which the variance of a stochastic process is itself randomly distributed.[1] They are used in the field of mathematical finance to evaluate derivative securities, such as options. The name derives from the models' treatment of the underlying security's volatility as a random process, governed by state variables such as the price level of the underlying security, the tendency of volatility to revert to some long-run mean value, and the variance of the volatility process itself, among others.

Stochastic volatility models are one approach to resolve a shortcoming of the Black–Scholes model. In particular, models based on Black–Scholes assume that the underlying volatility is constant over the life of the derivative, and unaffected by changes in the price level of the underlying security. However, such models cannot explain long-observed features of the implied volatility surface such as the volatility smile and skew, which indicate that implied volatility does tend to vary with strike price and expiry. By assuming that the volatility of the underlying price is a stochastic process rather than a constant, it becomes possible to model derivatives more accurately.

Basic model

Starting from a constant volatility approach, assume that the derivative's underlying asset price follows a standard model for geometric Brownian motion:

 dS_t = \mu S_t\,dt + \sigma S_t\,dW_t \,

where \mu \, is the constant drift (i.e. expected return) of the security price S_t \,, \sigma \, is the constant volatility, and dW_t \, is a standard Wiener process with zero mean and unit rate of variance. The explicit solution of this stochastic differential equation is

S_t= S_0 e^{(\mu- \frac{1}{2} \sigma^2) t+ \sigma W_t}.

The maximum likelihood estimator of the constant volatility \sigma \, for given stock prices S_{t_i} \, at different times t_i \, is

\begin{align}\hat{\sigma}^2 &= \left(\frac{1}{n} \sum_{i=1}^n \frac{(\ln S_{t_i}- \ln S_{t_{i-1}})^2}{t_i-t_{i-1}} \right) - \frac 1 n \frac{(\ln S_{t_n}- \ln S_{t_0})^2}{t_n-t_0}\\
& = \frac 1 n \sum_{i=1}^n (t_i-t_{i-1})\left(\frac{\ln \frac{S_{t_i}}{S_{t_{i-1}}}}{t_i-t_{i-1}} - \frac{\ln \frac{S_{t_n}}{S_{t_{0}}}}{t_n-t_0}\right)^2;\end{align}

its expectation value is E \left[ \hat{\sigma}^2\right]= \frac{n-1}{n} \sigma^2.
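Under the same assumptions (log-prices observed at arbitrary times t_i), the estimator can be computed directly from price data. A minimal sketch in Python; the function and variable names are my own:

```python
import numpy as np

def mle_constant_vol(prices, times):
    """MLE of the constant variance sigma^2 from prices S_{t_i} observed
    at times t_i, following the formula above (illustrative sketch)."""
    prices = np.asarray(prices, dtype=float)
    times = np.asarray(times, dtype=float)
    dlog = np.diff(np.log(prices))   # ln S_{t_i} - ln S_{t_{i-1}}
    dt = np.diff(times)              # t_i - t_{i-1}
    n = len(dlog)
    # Second term: (1/n) * (ln S_{t_n} - ln S_{t_0})^2 / (t_n - t_0)
    total = (np.log(prices[-1]) - np.log(prices[0])) ** 2 / (times[-1] - times[0])
    return np.mean(dlog ** 2 / dt) - total / n
```

On a long simulated geometric Brownian motion path the estimate converges to the true \sigma^2, up to the small-sample bias noted above.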

This basic model with constant volatility \sigma \, is the starting point for non-stochastic volatility models such as the Black–Scholes model and the Cox–Ross–Rubinstein model.

For a stochastic volatility model, replace the constant volatility \sigma \, with a process \nu_t \, that models the variance of S_t \,. This variance process is itself modeled as a diffusion driven by its own Brownian motion, and the form of \nu_t \, depends on the particular SV model under study.

 dS_t = \mu S_t\,dt + \sqrt{\nu_t} S_t\,dW_t \,
 d\nu_t = \alpha_{\nu,t}\,dt + \beta_{\nu,t}\,dB_t \,

where \alpha_{\nu,t} \, and \beta_{\nu,t} \, are some functions of \nu \,, and dB_t \, is another standard Wiener process, correlated with dW_t \, with constant correlation factor \rho \,.
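The generic dynamics above can be discretised with an Euler–Maruyama scheme, with the correlated increments built from two independent normals. A minimal sketch; the function and parameter names are my own, and the drift \alpha and diffusion \beta are passed in as callables:

```python
import numpy as np

def simulate_sv(S0, v0, mu, alpha, beta, rho, T, n, seed=None):
    """Euler-Maruyama sketch of dS = mu*S dt + sqrt(v)*S dW and
    dv = alpha(v,t) dt + beta(v,t) dB, with corr(dW, dB) = rho.
    Illustrative only; not a production discretisation."""
    rng = np.random.default_rng(seed)
    dt = T / n
    S, v = np.empty(n + 1), np.empty(n + 1)
    S[0], v[0] = S0, v0
    for i in range(n):
        z1, z2 = rng.standard_normal(2)
        dW = np.sqrt(dt) * z1
        dB = np.sqrt(dt) * (rho * z1 + np.sqrt(1 - rho**2) * z2)  # correlated increment
        S[i + 1] = S[i] + mu * S[i] * dt + np.sqrt(max(v[i], 0.0)) * S[i] * dW
        v[i + 1] = v[i] + alpha(v[i], i * dt) * dt + beta(v[i], i * dt) * dB
    return S, v
```

Each concrete SV model below amounts to a specific choice of \alpha_{\nu,t} and \beta_{\nu,t}.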

Heston model

The popular Heston model is a commonly used SV model, in which the randomness of the variance process varies as the square root of variance. In this case, the differential equation for variance takes the form:

 d\nu_t = \theta(\omega - \nu_t)dt + \xi \sqrt{\nu_t}\,dB_t \,

where \omega is the long-term mean of the variance, \theta is the rate at which the variance reverts toward its long-term mean, \xi is the volatility of the variance process, and dB_t is, like dW_t, a standard Wiener process. dW_t and dB_t are correlated with the constant correlation value \rho.

In other words, the Heston SV model assumes that the variance is a random process that

  1. exhibits a tendency to revert towards a long-term mean \omega at a rate \theta,
  2. exhibits a volatility proportional to the square root of its level, and
  3. whose source of randomness is correlated (with correlation \rho) with the randomness of the underlying's price process.
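These properties can be illustrated by simulating the Heston variance equation with a full-truncation Euler scheme, which floors the variance at zero inside the drift and diffusion terms. A minimal sketch; parameter values and names are illustrative:

```python
import numpy as np

def heston_variance_path(v0, theta_rate, omega, xi, T, n, seed=None):
    """Simulate d(nu) = theta*(omega - nu) dt + xi*sqrt(nu) dB via
    full-truncation Euler (an illustrative sketch, not a reference scheme)."""
    rng = np.random.default_rng(seed)
    dt = T / n
    v = np.empty(n + 1)
    v[0] = v0
    for i in range(n):
        vp = max(v[i], 0.0)  # truncate at zero so sqrt() stays real
        v[i + 1] = (v[i] + theta_rate * (omega - vp) * dt
                    + xi * np.sqrt(vp) * np.sqrt(dt) * rng.standard_normal())
    return v
```

With mean reversion active, a long path started at \omega stays distributed around that long-term level.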

A few parametrisations of the volatility surface based on the Heston model are known (Schönbucher, SVI and gSVI), along with methodologies for removing arbitrage from them.[2]

CEV model

The CEV (constant elasticity of variance) model describes the relationship between volatility and price, letting the volatility depend on the price level:

dS_t=\mu S_t \, dt + \sigma S_t ^{\, \gamma} \, dW_t

Conceptually, in some markets volatility rises when prices rise (e.g. commodities), which corresponds to \gamma > 1. In other markets, volatility tends to rise as prices fall (e.g. equities), modelled with \gamma < 1.
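The effect of \gamma can be made concrete by looking at the instantaneous lognormal volatility implied by the CEV dynamics, which is \sigma S^{\gamma - 1}. A small helper (the function name is my own):

```python
def cev_local_vol(S, sigma, gamma):
    """Instantaneous lognormal volatility sigma * S**(gamma - 1) implied by
    dS = mu*S dt + sigma*S**gamma dW (illustrative helper)."""
    return sigma * S ** (gamma - 1)
```

For \gamma < 1 the volatility falls as the price rises (the equity-like case); for \gamma > 1 it rises with the price (the commodity-like case); \gamma = 1 recovers constant-volatility geometric Brownian motion.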

Some argue that because the CEV model does not incorporate its own stochastic process for volatility, it is not truly a stochastic volatility model. Instead, they call it a local volatility model.

SABR volatility model

The SABR model (Stochastic Alpha, Beta, Rho) describes a single forward F (related to any asset e.g. an index, interest rate, bond, currency or equity) under stochastic volatility \sigma:

dF_t=\sigma_t F^\beta_t\, dW_t,
d\sigma_t=\alpha\sigma_t\, dZ_t.

The initial values F_0 and \sigma_0 are the current forward price and volatility, whereas W_t and Z_t are two correlated Wiener processes (i.e. Brownian motions) with correlation coefficient -1<\rho<1. The constant parameters \beta,\;\alpha are such that 0\leq\beta\leq 1,\;\alpha\geq 0.
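A sketch of simulating the SABR dynamics: since the volatility follows geometric Brownian motion, its step can be taken exactly as a lognormal increment, while the forward is stepped with Euler and floored at zero. Names and parameter values are illustrative:

```python
import numpy as np

def sabr_paths(F0, sigma0, alpha, beta, rho, T, n, seed=None):
    """Simulate dF = sigma * F**beta dW and dsigma = alpha * sigma dZ,
    with corr(dW, dZ) = rho (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    dt = T / n
    F, sig = np.empty(n + 1), np.empty(n + 1)
    F[0], sig[0] = F0, sigma0
    for i in range(n):
        z1, z2 = rng.standard_normal(2)
        dW = np.sqrt(dt) * z1
        dZ = np.sqrt(dt) * (rho * z1 + np.sqrt(1 - rho**2) * z2)  # correlated
        F[i + 1] = max(F[i] + sig[i] * F[i] ** beta * dW, 0.0)    # Euler, floored at 0
        sig[i + 1] = sig[i] * np.exp(alpha * dZ - 0.5 * alpha**2 * dt)  # exact lognormal step
    return F, sig
```

In practice SABR is usually used via Hagan's implied-volatility approximation rather than by simulation; the sketch above only illustrates the dynamics.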

The main feature of the SABR model is its ability to reproduce the volatility smile observed in derivatives markets.

GARCH model

The Generalized Autoregressive Conditional Heteroskedasticity (GARCH) model is another popular model for estimating stochastic volatility. It assumes that the randomness of the variance process varies with the variance, as opposed to the square root of the variance as in the Heston model. The standard GARCH(1,1) model has the following form for the variance differential:

 d\nu_t = \theta(\omega - \nu_t)\,dt + \xi \nu_t\,dB_t \,

The GARCH model has been extended via numerous variants, including the NGARCH, TGARCH, IGARCH, LGARCH, EGARCH, GJR-GARCH, etc. Strictly, however, the conditional volatilities from GARCH models are not stochastic since at time t the volatility is completely pre-determined (deterministic) given previous values.[3]
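In discrete time, the point about pre-determined conditional variances is easy to see: the standard GARCH(1,1) recursion \sigma^2_t = \omega + \alpha \epsilon^2_{t-1} + \beta \sigma^2_{t-1} involves only quantities known at time t-1 (here \omega, \alpha, \beta are the usual GARCH parameters, not the symbols used above). A minimal sketch; function and variable names are my own:

```python
import numpy as np

def garch11_filter(returns, omega, alpha, beta, var0):
    """Conditional variances of a GARCH(1,1):
    sigma2_t = omega + alpha * r_{t-1}**2 + beta * sigma2_{t-1}.
    Given the past returns, each sigma2_t is fully deterministic."""
    sigma2 = np.empty(len(returns) + 1)
    sigma2[0] = var0
    for t, r in enumerate(returns):
        sigma2[t + 1] = omega + alpha * r**2 + beta * sigma2[t]
    return sigma2
```

Running the filter twice on the same returns yields identical variance paths, which is exactly why GARCH conditional volatilities are called deterministic rather than stochastic.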

3/2 model

The 3/2 model is similar to the Heston model, but assumes that the randomness of the variance process varies with \nu_t^{3/2}. The form of the variance differential is:

 d\nu_t = \nu_t(\omega - \theta\nu_t)\,dt + \xi \nu_t^\frac{3}{2}\,dB_t. \,

However, the meaning of the parameters differs from the Heston model. In this model, both the mean-reversion and volatility-of-variance parameters are stochastic quantities, given by \theta\nu_t and \xi\nu_t respectively.

Chen model

In interest rate modeling, Lin Chen in 1994 developed the first stochastic mean and stochastic volatility model, the Chen model. Specifically, the dynamics of the instantaneous interest rate are given by the following stochastic differential equations:

 dr_t = (\theta_t-\alpha_t)\,dt + \sqrt{r_t}\,\sigma_t\, dW_t,
 d \alpha_t = (\zeta_t-\alpha_t)\,dt + \sqrt{\alpha_t}\,\sigma_t\, dW_t,
 d \sigma_t = (\beta_t-\sigma_t)\,dt + \sqrt{\sigma_t}\,\eta_t\, dW_t.

Calibration and Estimation

Once a particular SV model is chosen, it must be calibrated against existing market data. Calibration is the process of identifying the set of model parameters that are most likely given the observed data. One popular technique is maximum likelihood estimation (MLE). For instance, in the Heston model, the set of model parameters \Psi_0 = \{\omega, \theta, \xi, \rho\} \, can be estimated by applying an MLE algorithm such as Powell's directed-set method [1] to observations of historic underlying security prices.

In this case, one starts with an estimate for \Psi_0 \,, computes the residual errors when applying the historic price data to the resulting model, and then adjusts \Psi \, to minimize these errors. Once the calibration has been performed, it is standard practice to re-calibrate the model periodically.
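This loop can be sketched with SciPy, which exposes Powell's derivative-free method via minimize(..., method="Powell"). For brevity the objective below is the Gaussian log-likelihood of a constant-volatility model rather than a full Heston likelihood (which would require a filtering step for the unobserved variance); the structure of the calibration loop is the same:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, log_returns, dt):
    """Negative Gaussian log-likelihood of log-returns under constant-vol GBM.
    A stand-in objective for illustration; a Heston objective would need a
    filter for the latent variance."""
    mu = params[0]
    sigma = max(abs(params[1]), 1e-8)  # keep sigma positive without bounds
    m = (mu - 0.5 * sigma**2) * dt     # mean of each log-return
    v = sigma**2 * dt                  # variance of each log-return
    return 0.5 * np.sum(np.log(2 * np.pi * v) + (log_returns - m) ** 2 / v)

def calibrate(log_returns, dt, x0=(0.0, 0.3)):
    """Minimise the objective with Powell's directed-set method."""
    return minimize(neg_log_likelihood, x0=list(x0),
                    args=(log_returns, dt), method="Powell")
```

On simulated data the recovered volatility is close to the true value; for a real SV calibration the objective and parameter vector change, but the optimiser call is the same.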

An alternative to calibration is statistical estimation, which accounts for parameter uncertainty. Many frequentist and Bayesian methods have been proposed and implemented, typically for a subset of the above-mentioned models. The following list contains extension packages for the open-source statistical software R that have been specifically designed for heteroskedasticity estimation. The first three cater for GARCH-type models with deterministic volatilities; the fourth deals with stochastic volatility estimation.

  • rugarch: ARFIMA, in-mean, external regressors and various GARCH flavors, with methods for fit, forecast, simulation, inference and plotting.[4]
  • fGarch: Part of the Rmetrics environment for teaching "Financial Engineering and Computational Finance".
  • bayesGARCH: Bayesian estimation of the GARCH(1,1) model with Student's t innovations.[5]
  • stochvol: Efficient algorithms for fully Bayesian estimation of stochastic volatility (SV) models via Markov chain Monte Carlo (MCMC) methods.[6][7]

There are also alternate statistical estimation libraries in other languages such as Python:

  • PyFlux: Includes Bayesian and classical inference support for GARCH and beta-t-EGARCH models.

References

  1. Gatheral, J. (2006). The volatility surface: a practitioner's guide. Wiley.
  2. [citation not rendered in source]
  3. [citation not rendered in source]
  4. [citation not rendered in source]
  5. [citation not rendered in source]
  6. [citation not rendered in source]
  7. [citation not rendered in source]
