History of quantum field theory

In particle physics, the history of quantum field theory starts with its creation by Paul Dirac, when he attempted to quantize the electromagnetic field in the late 1920s. Major advances in the theory were made in the 1940s and 1950s, leading to the introduction of renormalized quantum electrodynamics (QED). QED was so successful and "natural" that efforts were made to apply the same basic concepts to the other forces of nature. These efforts succeeded in the application of gauge theory to the strong nuclear force and the weak nuclear force, producing the modern standard model of particle physics. Efforts to describe gravity using the same techniques have, to date, failed. The study of quantum field theory is alive and flourishing, as are applications of this method to many physical problems. It remains one of the most vital areas of theoretical physics today, providing a common language to many branches of physics.

Early developments

Quantum field theory originated in the 1920s from the problem of creating a quantum mechanical theory of the electromagnetic field. In particular, de Broglie in 1924 introduced the idea of a wave description of elementary systems in the following way: "we proceed in this work from the assumption of the existence of a certain periodic phenomenon of a yet to be determined character, which is to be attributed to each and every isolated energy parcel".[1]

In 1925, Werner Heisenberg, Max Born, and Pascual Jordan constructed just such a theory by expressing the field's internal degrees of freedom as an infinite set of harmonic oscillators, and by then applying the canonical quantization procedure to these oscillators; their paper was published in 1926.[2][3][4] This theory assumed that no electric charges or currents were present and today would be called a free field theory.
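
To illustrate the idea in modern notation (a schematic sketch in natural units, \(\hbar = c = 1\), not the notation of the 1926 paper), a free scalar field expanded in Fourier modes is formally a collection of independent harmonic oscillators, one per mode, quantized by imposing canonical commutation relations on the mode amplitudes:

\[
% schematic free-field mode expansion; natural units, not the original 1926 notation
\phi(\mathbf{x},t)=\int\frac{d^{3}k}{(2\pi)^{3}\sqrt{2\omega_{\mathbf{k}}}}\left(a_{\mathbf{k}}\,e^{i(\mathbf{k}\cdot\mathbf{x}-\omega_{\mathbf{k}}t)}+a_{\mathbf{k}}^{\dagger}\,e^{-i(\mathbf{k}\cdot\mathbf{x}-\omega_{\mathbf{k}}t)}\right),\qquad
[a_{\mathbf{k}},a_{\mathbf{k}'}^{\dagger}]=(2\pi)^{3}\delta^{3}(\mathbf{k}-\mathbf{k}'),
\]

so that the field Hamiltonian is a sum of oscillator Hamiltonians, \(H=\int\frac{d^{3}k}{(2\pi)^{3}}\,\omega_{\mathbf{k}}\,a_{\mathbf{k}}^{\dagger}a_{\mathbf{k}}\), up to the (divergent) zero-point constant.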

The first reasonably complete theory of quantum electrodynamics, which included both the electromagnetic field and electrically charged matter (specifically, electrons) as quantum mechanical objects, was created by Paul Dirac in 1927.[5] This quantum field theory could be used to model important processes such as the emission of a photon by an electron dropping into a quantum state of lower energy, a process in which the number of particles changes—one atom in the initial state becomes an atom plus a photon in the final state. It is now understood that the ability to describe such processes is one of the most important features of quantum field theory.

The final crucial step was Enrico Fermi's theory of β-decay (1934).[6][7] In it, fermion species nonconservation was shown to follow from second quantization: creation and annihilation of fermions came to the fore and quantum field theory was seen to describe particle decays.

Incorporating special relativity

It was evident from the beginning that a proper quantum treatment of the electromagnetic field had to somehow incorporate Einstein's relativity theory, which had grown out of the study of classical electromagnetism. This need to put together relativity and quantum mechanics was the second major motivation in the development of quantum field theory. Pascual Jordan and Wolfgang Pauli showed in 1928[8][9] that quantum fields could be made to behave in the way predicted by special relativity during coordinate transformations (specifically, they showed that the field commutators were Lorentz invariant).

A further boost for quantum field theory came with the discovery of the Dirac equation, which was originally formulated and interpreted as a single-particle equation analogous to the Schrödinger equation, but which, unlike the Schrödinger equation, satisfies both Lorentz invariance, that is, the requirements of special relativity, and the rules of quantum mechanics. The Dirac equation accommodated the spin-1/2 value of the electron and accounted for its magnetic moment, as well as giving accurate predictions for the spectrum of hydrogen.

The attempted interpretation of the Dirac equation as a single-particle equation could not be maintained for long, however, and it was eventually shown that several of its undesirable properties (such as negative-energy states) could be made sense of by reformulating and reinterpreting the Dirac equation as a true field equation, in this case for the quantized "Dirac field" or the "electron field", with the "negative-energy solutions" pointing to the existence of anti-particles. This work was performed first by Dirac himself, with the invention of hole theory in 1930, and then by Wendell Furry, Robert Oppenheimer, Vladimir Fock, and others. Schrödinger, during the same period in which he discovered his famous equation in 1926, also independently found its relativistic generalization, known as the Klein–Gordon equation, but dismissed it since, without spin, it predicted impossible properties for the hydrogen spectrum. (See Oskar Klein and Walter Gordon.) All relativistic wave equations that describe spin-zero particles are said to be of the Klein–Gordon type.
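
For reference, the two equations just mentioned have the standard modern forms (natural units, \(\hbar = c = 1\))

\[
(\partial^{\mu}\partial_{\mu}+m^{2})\,\phi=0
\qquad\text{(Klein–Gordon)},\qquad
(i\gamma^{\mu}\partial_{\mu}-m)\,\psi=0
\qquad\text{(Dirac)},
\]

where \(\phi\) is a one-component (spin-0) field, \(\psi\) is a four-component spinor, and the \(\gamma^{\mu}\) are matrices satisfying \(\{\gamma^{\mu},\gamma^{\nu}\}=2\eta^{\mu\nu}\); the Dirac equation is first order in time, which is what allows it to carry the spin-1/2 structure and a positive-definite probability density at the single-particle level.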

Role of Soviet scientists

Of great importance are the studies of the Soviet physicists Viktor Ambartsumian and Dmitri Ivanenko, in particular the Ambartsumian–Ivanenko hypothesis of the creation of massive particles (published in 1930), which is the cornerstone of contemporary quantum field theory.[10] The idea is that not only the quanta of the electromagnetic field, photons, but also other particles (including particles having nonzero rest mass) may be created and may disappear as a result of their interaction with other particles. This idea of Ambartsumian and Ivanenko formed the basis of modern quantum field theory and the theory of elementary particles.[11][12]

Uncertainty, again

A subtle and careful analysis, carried out in 1933 and again in 1950 by Niels Bohr and Léon Rosenfeld, showed that there is a fundamental limitation on the ability to simultaneously measure the electric and magnetic field strengths that enter into the description of charges in interaction with radiation, imposed by the uncertainty principle, which must apply to all canonically conjugate quantities. This limitation is crucial for the successful formulation and interpretation of a quantum field theory of photons and electrons (quantum electrodynamics), and indeed of any perturbative quantum field theory. The analysis of Bohr and Rosenfeld explains the fluctuations in the values of the electromagnetic field that occur even far from the sources of the field and that differ from the classically "allowed" values. Their analysis was crucial to showing that the limitations and physical implications of the uncertainty principle apply to all dynamical systems, whether fields or material particles. Their analysis also convinced most physicists that any notion of returning to a fundamental description of nature based on classical field theory, such as what Einstein aimed at with his numerous and failed attempts at a classical unified field theory, was simply out of the question.

Second quantization

The third thread in the development of quantum field theory was the need to handle the statistics of many-particle systems consistently and with ease. In 1927, Pascual Jordan tried to extend the canonical quantization of fields to the many-body wave functions of identical particles[13] using a formalism which is known as statistical transformation theory;[14] this procedure is now sometimes called second quantization.[15][16] In 1928, Jordan and Eugene Wigner found that the quantum field describing electrons, or other fermions, had to be expanded using anti-commuting creation and annihilation operators due to the Pauli exclusion principle (see Jordan–Wigner transformation). This thread of development was incorporated into many-body theory and strongly influenced condensed matter physics and nuclear physics.
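
In modern notation (a schematic summary rather than Jordan and Wigner's original presentation), the difference from the bosonic case is that fermionic creation and annihilation operators obey anticommutation relations,

\[
\{a_{i},a_{j}^{\dagger}\}=\delta_{ij},\qquad
\{a_{i},a_{j}\}=\{a_{i}^{\dagger},a_{j}^{\dagger}\}=0,
\]

so that \((a_{i}^{\dagger})^{2}=0\): no single-particle state can be occupied twice, which is precisely the Pauli exclusion principle.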

The problem of infinities


Despite its early successes, quantum field theory was plagued by several serious theoretical difficulties. Basic physical quantities, such as the self-energy of the electron and the energy shift of electron states due to the presence of the electromagnetic field, gave infinite, divergent contributions (a nonsensical result) when computed using the perturbative techniques available in the 1930s and most of the 1940s. The electron self-energy problem was already a serious issue in classical electromagnetic field theory, where the attempt to attribute to the electron a finite size or extent (the classical electron radius) led immediately to the question of what non-electromagnetic stresses would need to be invoked, which would presumably hold the electron together against the Coulomb repulsion of its finite-sized "parts". The situation was dire, and had certain features that reminded many of the "Rayleigh–Jeans catastrophe". What made the situation in the 1940s so desperate and gloomy, however, was the fact that the correct ingredients (the second-quantized Maxwell–Dirac field equations) for the theoretical description of interacting photons and electrons were well in place, and no major conceptual change was needed analogous to that which was necessitated by a finite and physically sensible account of the radiative behavior of hot objects, as provided by the Planck radiation law.
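
For concreteness, the classical electron radius referred to here is the length scale at which the electrostatic self-energy of the electron's charge becomes comparable to its rest energy,

\[
r_{e}=\frac{e^{2}}{4\pi\varepsilon_{0}\,m_{e}c^{2}}\approx2.8\times10^{-15}\ \text{m},
\]

so a point electron (\(r\to0\)) has a formally infinite classical self-energy, foreshadowing the divergences of the quantum theory.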

Renormalization procedures

This "divergence problem" was solved in the case of quantum electrodynamics during the late 1940s and early 1950s by Hans Bethe, Sin-Itiro Tomonaga, Julian Schwinger, Richard Feynman, and Freeman Dyson, through the procedure known as renormalization. Great progress was made after it was realized that all of the infinities in quantum electrodynamics are related to two effects: the self-energy of the electron/positron, and vacuum polarization. Renormalization requires paying very careful attention to just what is meant by, for example, the very concepts "charge" and "mass" as they occur in the pure, non-interacting field equations. The "vacuum" is itself polarizable and, hence, populated by virtual particle pairs (on shell and off shell), and is therefore a seething and busy dynamical system in its own right. This was a critical step in identifying the source of the "infinities" and "divergences". The "bare mass" and the "bare charge" of a particle, the values that appear in the free-field equations (the non-interacting case), are abstractions that are simply not realized in experiment (in interaction). What we measure, and hence what we must take account of with our equations, and what the solutions must account for, are the "renormalized mass" and the "renormalized charge" of a particle. That is to say, the "shifted" or "dressed" values that these quantities must have, once due care is taken to include all deviations from their "bare" values, are dictated by the very nature of quantum fields themselves.
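
Schematically (a heuristic sketch of the bookkeeping rather than any one author's formulation), the measured parameters are the bare parameters plus interaction-induced shifts,

\[
m_{\text{phys}}=m_{0}+\delta m,\qquad
e_{\text{phys}}=e_{0}+\delta e,
\]

where \(\delta m\) and \(\delta e\) are computed order by order in perturbation theory; in QED each shift is separately divergent, but any prediction re-expressed in terms of the finite, measured \(m_{\text{phys}}\) and \(e_{\text{phys}}\) comes out finite.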

Gauge invariance

The first approach that bore fruit is known as the "interaction representation" (see the article Interaction picture), a Lorentz-covariant and gauge-invariant generalization of the time-dependent perturbation theory used in ordinary quantum mechanics, developed by Tomonaga and Schwinger, who generalized earlier efforts of Dirac, Fock and Podolsky. Tomonaga and Schwinger invented a relativistically covariant scheme for representing field commutators and field operators, intermediate between the two main representations of a quantum system, the Schrödinger and Heisenberg representations. Within this scheme, field commutators at separated points can be evaluated in terms of "bare" field creation and annihilation operators. This allows one to keep track of the time evolution of both the "bare" and the "renormalized", or perturbed, values of the Hamiltonian, and expresses everything in terms of the coupled, gauge-invariant "bare" field equations. Schwinger gave the most elegant formulation of this approach. The next and most famous development is due to Feynman, who devised rules for assigning a "graph" or "diagram" to each of the terms in the scattering matrix (see S-matrix and Feynman diagrams). These diagrams directly correspond (through the Schwinger–Dyson equations) to the measurable physical processes (cross sections, probability amplitudes, decay widths and lifetimes of excited states) one needs to be able to calculate. This revolutionized how quantum field theory calculations are carried out in practice.
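
In modern textbook form (a schematic statement in the interaction picture), the scattering matrix whose terms Feynman's diagrams enumerate is the time-ordered exponential of the interaction Hamiltonian,

\[
S=T\exp\!\left(-i\int_{-\infty}^{\infty}dt\,H_{I}(t)\right)
=\sum_{n=0}^{\infty}\frac{(-i)^{n}}{n!}\int dt_{1}\cdots dt_{n}\,
T\bigl[H_{I}(t_{1})\cdots H_{I}(t_{n})\bigr],
\]

with each term of the expansion represented by the diagrams containing \(n\) interaction vertices.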

Two classic textbooks from the 1960s, J. D. Bjorken and S. D. Drell, Relativistic Quantum Mechanics (1964), and J. J. Sakurai, Advanced Quantum Mechanics (1967), thoroughly developed the Feynman graph expansion techniques using physically intuitive and practical methods following from the correspondence principle, without worrying about the technicalities involved in deriving the Feynman rules from the superstructure of quantum field theory itself. Although both Feynman's heuristic and pictorial style of dealing with the infinities and the formal methods of Tomonaga and Schwinger worked extremely well, and gave spectacularly accurate answers, the true analytical nature of the question of "renormalizability", that is, whether any theory formulated as a "quantum field theory" would give finite answers, was not worked out until much later, when the urgency of trying to formulate finite theories for the strong and electroweak (and gravitational) interactions demanded its solution.

The renormalization of QED was largely fortuitous: the smallness of the coupling constant (the so-called fine-structure constant), the fact that the coupling has no dimensions involving mass, and the zero mass of the gauge boson involved, the photon, together rendered the small-distance/high-energy behavior of QED manageable. Also, electromagnetic processes are very "clean" in the sense that they are not badly suppressed/damped and/or hidden by the other gauge interactions. By 1958 Sidney Drell observed: "Quantum electrodynamics (QED) has achieved a status of peaceful coexistence with its divergences ...".
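
The dimensionless coupling in question is the fine-structure constant,

\[
\alpha=\frac{e^{2}}{4\pi\varepsilon_{0}\hbar c}\approx\frac{1}{137},
\]

and it is the smallness of \(\alpha\) that makes the perturbative expansion of QED, order by order in powers of \(\alpha\), so effective.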

The unification of the electromagnetic force with the weak force encountered initial difficulties due to the lack of accelerator energies high enough to reveal processes beyond the Fermi interaction range. Additionally, a satisfactory theoretical understanding of hadron substructure had to be developed, culminating in the quark model.

The strong force

In the case of the strong interactions, progress concerning their short-distance/high-energy behavior was much slower and more frustrating. For strong interactions with the electroweak fields, there were difficult issues regarding the strength of the coupling, the mass generation of the force carriers, and their non-linear self-interactions. Although there has been theoretical progress toward a grand unified quantum field theory incorporating the electromagnetic force, the weak force and the strong force, empirical verification is still pending. Superunification, incorporating the gravitational force as well, is still very speculative, and is under intensive investigation by many of the best minds in contemporary theoretical physics. Gravitation is described by a tensor field corresponding to a spin-2 gauge boson, the "graviton", and is further discussed in the articles on general relativity and quantum gravity.

Quantum gravity

From the point of view of the techniques of (four-dimensional) quantum field theory, and as the numerous efforts to formulate a consistent quantum gravity theory attest, gravitational quantization has been the reigning champion for bad behavior.[17]

There are technical problems rooted in the fact that the gravitational coupling constant has dimensions involving inverse powers of mass and, as a simple consequence, the theory is plagued by perturbatively badly behaved non-linear self-interactions. Gravity is itself a source of gravity, analogously to gauge theories (whose couplings are, by contrast, dimensionless), leading to uncontrollable divergences at increasing orders of perturbation theory.
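
In units with \(\hbar=c=1\), Newton's constant has dimensions of an inverse mass squared,

\[
G_{N}=\frac{1}{M_{\text{Pl}}^{2}},\qquad M_{\text{Pl}}\approx1.2\times10^{19}\ \text{GeV},
\]

so the effective dimensionless coupling at energy \(E\) grows like \(E^{2}/M_{\text{Pl}}^{2}\), and simple power counting shows that each higher order of perturbation theory requires new types of divergent counterterms: the theory is perturbatively non-renormalizable.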

Moreover, gravity couples to all energy equally strongly, as per the equivalence principle, so this makes the notion of ever really "switching off", "cutting off", or separating the gravitational interaction from the other interactions ambiguous, since, with gravitation, we are dealing with the very structure of space-time itself.


Moreover, it has not been established that a theory of quantum gravity is necessary (see Quantum field theory in curved spacetime).

An Abelian gauge theory

Thanks to the somewhat brute-force, clanky and heuristic methods of Feynman, and to the elegant and abstract methods of Tomonaga and Schwinger, from the period of early renormalization we have the modern theory of quantum electrodynamics (QED). It is still the most accurate physical theory known, the prototype of a successful quantum field theory. Beginning in the 1950s with the work of Yang and Mills, following the earlier lead of Weyl and Pauli, deep explorations illuminated the types of symmetries and invariances any field theory must satisfy. QED, and indeed all field theories, were generalized to a class of quantum field theories known as gauge theories. Quantum electrodynamics is the most famous example of what is known as an Abelian gauge theory. It relies on the symmetry group U(1) and has one massless gauge field, with the U(1) gauge symmetry dictating the form of the interactions involving the electromagnetic field and the photon being the gauge boson. That symmetries dictate, limit and necessitate the form of interaction between particles is the essence of the "gauge theory revolution". Yang and Mills formulated the first explicit example of a non-Abelian gauge theory, Yang–Mills theory, with an attempted explanation of the strong interactions in mind. The strong interactions were then (incorrectly) understood in the mid-1950s to be mediated by the pi-mesons, the particles predicted by Hideki Yukawa in 1935, based on his profound reflections concerning the reciprocal connection between the mass of any force-mediating particle and the range of the force it mediates, a connection allowed by the uncertainty principle. The 1960s and 1970s saw the formulation of a gauge theory now known as the Standard Model of particle physics, which systematically describes the elementary particles and the interactions between them.
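
Yukawa's mass–range connection, mentioned above, can be summarized in one standard formula: a force mediated by a quantum of mass \(m\) has a static potential of finite range,

\[
V(r)\propto\frac{e^{-r/r_{0}}}{r},\qquad r_{0}=\frac{\hbar}{mc},
\]

so the roughly \(10^{-15}\) m range of the nuclear force pointed to a mediating particle with a mass of the order of 100 MeV, close to that of the pion found later.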

Electroweak unification

The electroweak interaction part of the standard model was formulated by Sheldon Glashow in 1958–60 with his discovery of the SU(2)×U(1) group structure of the theory. Steven Weinberg and Abdus Salam then invoked the Anderson–Higgs mechanism for the generation of the W and Z masses (the intermediate vector bosons responsible for the weak interactions and neutral currents) while keeping the mass of the photon zero. The Goldstone and Higgs idea for generating mass in gauge theories was sparked in the late 1950s and early 1960s when a number of theoreticians (including Yoichiro Nambu, Steven Weinberg, Jeffrey Goldstone, François Englert, Robert Brout, G. S. Guralnik, C. R. Hagen, Tom Kibble and Philip Warren Anderson) noticed a possibly useful analogy to the (spontaneous) breaking of the U(1) symmetry of electromagnetism in the formation of the BCS ground state of a superconductor. The gauge boson involved in that situation, the photon, behaves as though it has acquired a finite mass. There is the further possibility that the physical vacuum (ground state) does not respect the symmetries implied by the "unbroken" electroweak Lagrangian from which one arrives at the field equations (see the article Electroweak interaction for more details). The electroweak theory of Weinberg and Salam was shown to be renormalizable (finite), and hence consistent, by Gerardus 't Hooft and Martinus Veltman. The Glashow–Weinberg–Salam theory (GWS theory) is a triumph and, in certain applications, gives an accuracy on a par with quantum electrodynamics.
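
The essential ingredient can be sketched with the textbook example of a complex scalar field \(\phi\) with the potential

\[
V(\phi)=\mu^{2}\,\phi^{\dagger}\phi+\lambda\,(\phi^{\dagger}\phi)^{2},\qquad \mu^{2}<0,\ \lambda>0,
\]

whose minima lie at \(|\phi|=v/\sqrt{2}\) with \(v=\sqrt{-\mu^{2}/\lambda}\). The vacuum settles into one of these degenerate minima, the gauge symmetry is spontaneously broken ("hidden"), and the gauge bosons coupled to the broken generators acquire masses proportional to \(v\), while the photon, coupled to the unbroken combination, stays massless.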

Common trends in particle and condensed matter physics

Also during the 1970s, parallel developments in the study of phase transitions in condensed matter physics led Leo Kadanoff, Michael Fisher and Kenneth Wilson (extending the work of Ernst Stueckelberg, Andre Peterman, Murray Gell-Mann, and Francis Low) to a set of ideas and methods for monitoring the changes in the behavior of a theory with scale, known as the renormalization group. By providing a deep physical understanding of the formal renormalization procedure invented in the 1940s, the renormalization group sparked what has been called the "grand synthesis" of theoretical physics, uniting the quantum field theoretical techniques used in particle physics and condensed matter physics into a single theoretical framework.
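
At its core, the renormalization group tracks how an effective coupling \(g\) changes with the scale \(\mu\) at which the theory is probed, summarized by the beta function

\[
\mu\frac{dg}{d\mu}=\beta(g),
\]

with the sign and zeros of \(\beta\) determining whether the coupling grows or shrinks at short distances and whether the theory flows to a fixed point.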

The gauge field theory of the strong interactions, quantum chromodynamics (QCD), relies crucially on the renormalization group for its distinguishing characteristic features: asymptotic freedom and color confinement.

Notes

  1. de Broglie, L., Recherches sur la théorie des quanta (Ann. de Phys., 10, III, 1925; translation by A. F. Kracklauer)
  2. Todorov, Ivan (2012). "Quantization is a mystery", Bulg. J. Phys. 39 (2012) 107–149; arXiv: 1206.3116 [math-ph]
  3. Born, M.; Heisenberg, W.; Jordan, P. (1926). "Zur Quantenmechanik II", Zeitschrift für Physik 35: 557–615. The paper was received on 16 November 1925. [English translation in: van der Waerden 1968, 15 "On Quantum Mechanics II"]
  4. This paper was preceded by an earlier one by Born and Jordan published in 1925: Born, M.; Jordan, P. (1925). "Zur Quantenmechanik", Zeitschrift für Physik 34: 858–888.
  5. Dirac, P.A.M. (1927). The Quantum Theory of the Emission and Absorption of Radiation, Proceedings of the Royal Society of London, Series A, Vol. 114, p. 243.
  6. Chen Ning Yang (2012). "Fermi's β-decay Theory", Asia Pac. Phys. Newslett. 1, p. 27. doi: 10.1142/S2251158X12000045 online
  7. Fermi, E. (1934). "Versuch einer Theorie der β-Strahlen", Z. Phys. 88: 161–177. doi: 10.1007/BF01351864
  8. Jordan, P., and Pauli, W. (1928), "Zur Quantenelektrodynamik", Zeitschrift für Physik 47: 151–173
  9. Jagdish Mehra, Helmut Rechenberg, The Probability Interpretation and the Statistical Transformation Theory, the Physical Interpretation, and the Empirical and Mathematical Foundations of Quantum Mechanics 1926–1932, Springer, 2000, p. 199.
  10. G-sardanashvily.ru
  11. Vaprize.sci.am
  12. Sciteclibrary.ru
  13. P. Jordan, "Über eine neue Begründung der Quantenmechanik". Zeitschrift für Physik 40: 809-838, 1927 and "Über eine neue Begründung der Quantenmechanik II". Zeitschrift für Physik 44: 1-25, 1927.
  14. "Quantum Mechanics in Context: Pascual Jordan's 1936 Anschauliche Quantentheorie"
  15. Daniel Greenberger, Klaus Hentschel, Friedel Weinert (eds.), Compendium of Quantum Physics: Concepts, Experiments, History and Philosophy, Springer, 2009: "Quantization (First, Second)".
  16. Arthur I. Miller, Early Quantum Electrodynamics: A Sourcebook, Cambridge University Press, 1995, p. 18.
  17. Brian Hatfield, Fernando Morinigo, Richard P. Feynman, William Wagner (2002) "Feynman Lectures on Gravitation", ISBN 978-0-8133-4038-8

Further reading

  • Pais, Abraham; Inward Bound - Of Matter & Forces in the Physical World, Oxford University Press (1986) [ISBN 0-19-851997-4] Written by a former Einstein assistant at Princeton, this is a beautiful detailed history of modern fundamental physics, from 1895 (discovery of X-rays) to 1983 (discovery of the vector bosons at CERN).
  • Richard Feynman; Lecture Notes in Physics. Princeton University Press: Princeton, (1986).
  • Richard Feynman; QED. Princeton University Press: Princeton, (1982).
  • Weinberg, Steven; The Quantum Theory of Fields - Foundations (vol. I), Cambridge University Press (1995) [ISBN 0-521-55001-7] The first chapter (pp. 1–40) of Weinberg's monumental treatise gives a brief history of Q.F.T., pp. 608.
  • Weinberg, Steven; The Quantum Theory of Fields - Modern Applications (vol. II), Cambridge University Press:Cambridge, U.K. (1996) [ISBN 0-521-55001-7], pp. 489.
  • Weinberg, Steven; The Quantum Theory of Fields - Supersymmetry (vol. III), Cambridge University Press:Cambridge, U.K. (2000) [ISBN 0-521-55002-5], pp. 419.
  • Schweber, Silvan S.; Q.E.D. and the men who made it: Dyson, Feynman, Schwinger, and Tomonaga, Princeton University Press (1994) [ISBN 0-691-03327-7]
  • Ynduráin, Francisco José; Quantum Chromodynamics: An Introduction to the Theory of Quarks and Gluons, Springer Verlag, New York, 1983. [ISBN 0-387-11752-0]
  • Miller, Arthur I.; Early Quantum Electrodynamics : A Sourcebook, Cambridge University Press (1995) [ISBN 0-521-56891-9]
  • Schwinger, Julian; Selected Papers on Quantum Electrodynamics, Dover Publications, Inc. (1958) [ISBN 0-486-60444-6]
  • O'Raifeartaigh, Lochlainn; The Dawning of Gauge Theory, Princeton University Press (May 5, 1997) [ISBN 0-691-02977-6]
  • Cao, Tian Yu; Conceptual Developments of 20th Century Field Theories, Cambridge University Press (1997) [ISBN 0-521-63420-2]
  • Darrigol, Olivier; La genèse du concept de champ quantique, Annales de Physique (France) 9 (1984) pp. 433–501. Text in French, adapted from the author's Ph.D. thesis.