Quantum mechanics, including quantum field theory, is a fundamental theory in physics which describes nature at the smallest scales of energy levels of atoms and subatomic particles. Classical physics, the physics existing before quantum mechanics, describes nature at ordinary (macroscopic) scales. Most theories in classical physics can be derived from quantum mechanics as an approximation valid at large scales. Quantum mechanics differs from classical physics in that energy, angular momentum and other quantities of a bound system are restricted to discrete values. Quantum mechanics arose from theories developed to explain observations which could not be reconciled with classical physics, such as Max Planck's solution in 1900 to the black-body radiation problem, and from the correspondence between energy and frequency in Albert Einstein's 1905 paper which explained the photoelectric effect. Early quantum theory was profoundly re-conceived in the mid-1920s by Erwin Schrödinger, Werner Heisenberg, Max Born and others; the modern theory is formulated in various specially developed mathematical formalisms.
In one of them, a mathematical function, the wave function, provides information about the probability amplitude of position and other physical properties of a particle. Important applications of quantum theory include quantum chemistry, quantum optics, quantum computing, superconducting magnets, light-emitting diodes, the laser, the transistor and semiconductor devices such as the microprocessor, and research and medical imaging such as magnetic resonance imaging and electron microscopy. Explanations for many biological and physical phenomena are rooted in the nature of the chemical bond, most notably the macromolecule DNA. Scientific inquiry into the wave nature of light began in the 17th and 18th centuries, when scientists such as Robert Hooke, Christiaan Huygens and Leonhard Euler proposed a wave theory of light based on experimental observations. In 1803, Thomas Young, an English polymath, performed the famous double-slit experiment that he described in a paper titled On the nature of light and colours.
This experiment played a major role in the general acceptance of the wave theory of light. In 1838, Michael Faraday discovered cathode rays; these studies were followed by the 1859 statement of the black-body radiation problem by Gustav Kirchhoff, the 1877 suggestion by Ludwig Boltzmann that the energy states of a physical system can be discrete, and the 1900 quantum hypothesis of Max Planck. Planck's hypothesis that energy is radiated and absorbed in discrete "quanta" matched the observed patterns of black-body radiation. In 1896, Wilhelm Wien empirically determined a distribution law of black-body radiation, known as Wien's law in his honor. Ludwig Boltzmann independently arrived at this result by considerations of Maxwell's equations. However, Wien's law underestimated the radiance at low frequencies. Planck corrected this model using Boltzmann's statistical interpretation of thermodynamics and proposed what is now called Planck's law, which led to the development of quantum mechanics. Following Max Planck's solution in 1900 to the black-body radiation problem, Albert Einstein offered a quantum-based theory to explain the photoelectric effect.
Around 1900–1910, the atomic theory and the corpuscular theory of light first came to be widely accepted as scientific fact. Among the first to study quantum phenomena in nature were Arthur Compton, C. V. Raman and Pieter Zeeman, each of whom has a quantum effect named after him. Robert Andrews Millikan studied the photoelectric effect experimentally, and Albert Einstein developed a theory for it. At the same time, Ernest Rutherford experimentally discovered the nuclear model of the atom, for which Niels Bohr developed his theory of atomic structure, later confirmed by the experiments of Henry Moseley. In 1913, Peter Debye extended Niels Bohr's theory of atomic structure by introducing elliptical orbits, a concept also introduced by Arnold Sommerfeld; this phase is known as the old quantum theory. According to Planck, each energy element is proportional to its frequency: E = hν, where h is Planck's constant. Planck cautiously insisted that this was only an aspect of the processes of absorption and emission of radiation and had nothing to do with the physical reality of the radiation itself.
In fact, he considered his quantum hypothesis a mathematical trick to get the right answer rather than a substantial discovery. However, in 1905 Albert Einstein interpreted Planck's quantum hypothesis realistically and used it to explain the photoelectric effect, in which shining light on certain materials can eject electrons from the material; he won the 1921 Nobel Prize in Physics for this work. Einstein further developed this idea to show that an electromagnetic wave such as light could also be described as a particle, with a discrete quantum of energy that depends on its frequency. The foundations of quantum mechanics were established during the first half of the 20th century by Max Planck, Niels Bohr, Werner Heisenberg, Louis de Broglie, Arthur Compton, Albert Einstein, Erwin Schrödinger, Max Born, John von Neumann, Paul Dirac, Enrico Fermi, Wolfgang Pauli, Max von Laue, Freeman Dyson, David Hilbert and others.
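As a numerical illustration of the Planck relation E = hν and Einstein's photoelectric picture described above, the sketch below computes the energy of a single light quantum and the maximum kinetic energy of an ejected electron. The 400 nm wavelength and 2.3 eV work function are illustrative values chosen for the example, not figures taken from the text.

```python
# Planck relation E = h*nu, and Einstein's photoelectric equation
# E_kin,max = h*nu - phi, where phi is the work function of the material.
h = 6.62607015e-34      # Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
eV = 1.602176634e-19    # joules per electronvolt

wavelength = 400e-9     # illustrative: violet light, 400 nm
phi = 2.3 * eV          # illustrative work function, roughly that of sodium

nu = c / wavelength                 # frequency of the light
E_photon = h * nu                   # energy of one quantum (photon)
E_kin = max(E_photon - phi, 0.0)    # max kinetic energy of the ejected electron

print(f"photon energy: {E_photon / eV:.2f} eV")              # about 3.10 eV
print(f"max electron kinetic energy: {E_kin / eV:.2f} eV")   # about 0.80 eV
```

The key point the example makes concrete is that the electron energy depends on the light's frequency, not its intensity, which is what classical wave theory could not explain.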
Quantum field theory
In theoretical physics, quantum field theory (QFT) is a theoretical framework that combines classical field theory, special relativity and quantum mechanics, and is used to construct physical models of subatomic particles and quasiparticles. QFT treats particles as excited states of their underlying fields, which are, in a sense, more fundamental than the particles themselves. Interactions between particles are described by interaction terms in the Lagrangian involving their corresponding fields; each interaction can be visually represented by Feynman diagrams, which are formal computational tools used in relativistic perturbation theory. As a successful theoretical framework today, quantum field theory emerged from the work of generations of theoretical physicists spanning much of the 20th century. Its development began in the 1920s with the description of interactions between light and electrons, culminating in the first quantum field theory, quantum electrodynamics. A major theoretical obstacle soon followed with the appearance and persistence of various infinities in perturbative calculations, a problem only resolved in the 1950s with the invention of the renormalization procedure.
A second major barrier came with QFT's apparent inability to describe the weak and strong interactions, to the point where some theorists called for the abandonment of the field-theoretic approach. The development of gauge theory and the completion of the Standard Model in the 1970s led to a renaissance of quantum field theory. Quantum field theory is the result of the combination of classical field theory, quantum mechanics and special relativity. A brief overview of these theoretical precursors is in order. The earliest successful classical field theory is one that emerged from Newton's law of universal gravitation, despite the complete absence of the concept of fields from his 1687 treatise Philosophiæ Naturalis Principia Mathematica. The force of gravity as described by Newton is an "action at a distance": its effects on faraway objects are instantaneous, no matter the distance. In an exchange of letters with Richard Bentley, Newton stated that "it is inconceivable that inanimate brute matter should, without the mediation of something else, not material, operate upon and affect other matter without mutual contact."
It was not until the 18th century that mathematical physicists discovered a convenient description of gravity based on fields: a numerical quantity assigned to every point in space indicating the action of gravity on any particle at that point. However, this was considered a mathematical trick. Fields began to take on an existence of their own with the development of electromagnetism in the 19th century. Michael Faraday coined the English term "field" in 1845, and he introduced fields as properties of space having physical effects. He argued against "action at a distance" and proposed that interactions between objects occur via space-filling "lines of force"; this description of fields remains to this day. The theory of classical electromagnetism was completed in 1862 with Maxwell's equations, which described the relationship between the electric field, the magnetic field, electric current and electric charge. Maxwell's equations implied the existence of electromagnetic waves, a phenomenon whereby electric and magnetic fields propagate from one spatial point to another at a finite speed, which turns out to be the speed of light.
Action-at-a-distance was thus conclusively refuted. Despite the enormous success of classical electromagnetism, it was unable to account for the discrete lines in atomic spectra, nor for the distribution of black-body radiation across different wavelengths. Max Planck's study of black-body radiation marked the beginning of quantum mechanics. He treated atoms, which absorb and emit electromagnetic radiation, as tiny oscillators with the crucial property that their energies can only take on a series of discrete, rather than continuous, values. These are known as quantum harmonic oscillators; this process of restricting energies to discrete values is called quantization. Building on this idea, Albert Einstein proposed in 1905 an explanation for the photoelectric effect, namely that light is composed of individual packets of energy called photons; this implied that electromagnetic radiation, while being waves in the classical electromagnetic field, also exists in the form of particles. In 1913, Niels Bohr introduced the Bohr model of atomic structure, wherein electrons within atoms can only take on a series of discrete, rather than continuous, energies.
This is another example of quantization. The Bohr model successfully explained the discrete nature of atomic spectral lines. In 1924, Louis de Broglie proposed the hypothesis of wave–particle duality: that microscopic particles exhibit both wave-like and particle-like properties under different circumstances. Uniting these scattered ideas, a coherent discipline, quantum mechanics, was formulated between 1925 and 1926, with important contributions from de Broglie, Werner Heisenberg, Max Born, Erwin Schrödinger, Paul Dirac and Wolfgang Pauli. In the same year as his paper on the photoelectric effect, Einstein published his theory of special relativity, built on Maxwell's electromagnetism. New rules, called the Lorentz transformations, were given for the way time and space coordinates of an event change under changes in the observer's velocity, and the distinction between time and space was blurred. It was proposed that all physical laws must be the same for observers at different velocities, i.e. that physical laws be invariant under Lorentz transformations.
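To make the quantization described above concrete, here is a small sketch using the standard Bohr-model formula E_n = −13.6 eV / n² for hydrogen; the particular transitions shown are chosen for illustration. Discrete energy levels translate directly into discrete spectral lines.

```python
# Bohr model of hydrogen: discrete energy levels E_n = -13.6 eV / n^2.
# A photon emitted in a transition n_i -> n_f carries energy E_ni - E_nf,
# so only certain wavelengths appear: discrete spectral lines, not a continuum.
h = 6.62607015e-34      # Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
eV = 1.602176634e-19    # joules per electronvolt
RYDBERG_EV = 13.605693  # hydrogen ground-state binding energy, eV

def energy_level(n: int) -> float:
    """Energy of the n-th Bohr level of hydrogen, in eV."""
    return -RYDBERG_EV / n**2

def transition_wavelength_nm(n_i: int, n_f: int) -> float:
    """Wavelength of the photon emitted in the n_i -> n_f transition."""
    delta_e = (energy_level(n_i) - energy_level(n_f)) * eV  # joules
    return h * c / delta_e * 1e9

# The first few Balmer lines (transitions down to n = 2), in nanometres:
for n in (3, 4, 5):
    print(f"{n} -> 2: {transition_wavelength_nm(n, 2):.1f} nm")
# roughly 656.3, 486.2 and 434.0 nm: the discrete visible lines of hydrogen
```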
Two difficulties remained. Observationally, the Schrödinger equation underlying quantum mechanics could not account for the spontaneous emission of radiation from atoms; theoretically, the equation could not describe photons and was inconsistent with the principles of special relativity.
Loop quantum gravity
Loop quantum gravity (LQG) is a theory of quantum gravity, merging quantum mechanics and general relativity, making it a possible candidate for a theory of everything. Its goal is to unify gravity in a common theoretical framework with the other three fundamental forces of nature, beginning with general relativity and adding quantum features; in this it competes with string theory, which begins with quantum field theory and adds gravity. From the point of view of Einstein's theory, all attempts to treat gravity as simply another quantum force equal in importance to electromagnetism and the nuclear forces have failed. According to Einstein, gravity is not a force; it is a property of spacetime itself. Loop quantum gravity is an attempt to develop a quantum theory of gravity based directly on Einstein's geometric formulation. To do this, LQG quantizes space and time, analogously to the way quantities like energy and momentum are quantized in quantum mechanics. The theory gives a physical picture of spacetime in which space and time are granular and discrete directly because of quantization, just like photons in the quantum theory of electromagnetism and the discrete energy levels of atoms.
Distance thus has a minimum value. The structure of space is an extremely fine fabric or network woven of finite loops; these networks of loops are called spin networks. The evolution of a spin network, or spin foam, has a scale on the order of the Planck length, approximately 10⁻³⁵ metres, and smaller scales do not exist. Not just matter, but space itself, thus has an atomic structure. The research has developed in several directions and involves about 30 research groups worldwide, all of which share the basic physical assumptions and the mathematical description of quantum space. Research follows two directions: the more traditional canonical loop quantum gravity, and the newer covariant loop quantum gravity, called spin foam theory. The physical consequences of the theory are being explored in several directions; the most well-developed application is to cosmology, called loop quantum cosmology, the study of the early universe and the physics of the Big Bang. Its most notable consequence is that the evolution of the universe may continue beyond the Big Bang, in what is called the Big Bounce.
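The Planck-length figure quoted above can be checked directly from the fundamental constants, assuming the standard definition ℓ_P = √(ħG/c³); a minimal sketch:

```python
# Planck length: the scale at which loop quantum gravity predicts
# spacetime granularity. l_P = sqrt(hbar * G / c^3).
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8         # speed of light, m/s

planck_length = math.sqrt(hbar * G / c**3)
print(f"{planck_length:.3e} m")   # about 1.6e-35 m
```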
In 1986, Abhay Ashtekar reformulated Einstein's general relativity in a language closer to that of the rest of fundamental physics. Shortly after, Ted Jacobson and Lee Smolin realized that the formal equation of quantum gravity, called the Wheeler–DeWitt equation, admitted solutions labelled by loops when rewritten in the new Ashtekar variables. Carlo Rovelli and Lee Smolin defined a nonperturbative and background-independent quantum theory of gravity in terms of these loop solutions. Jorge Pullin and Jerzy Lewandowski understood that the intersections of the loops are essential for the consistency of the theory, and that the theory should therefore be formulated in terms of intersecting loops, or graphs. In 1994, Rovelli and Smolin showed that the quantum operators of the theory associated to area and volume have a discrete spectrum; that is, geometry is quantized. This result defines an explicit basis of states of quantum geometry, which turned out to be labelled by Roger Penrose's spin networks, which are graphs labelled by spins.
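The discreteness of geometry referred to here is commonly summarized by the spectrum of the area operator. In the form usually quoted in the loop quantum gravity literature (stated here for illustration, not derived in the text above), a surface crossed by spin-network links carrying spins j_i has area

$$ A = 8\pi\gamma\,\ell_P^{2}\sum_i \sqrt{j_i\,(j_i + 1)}, \qquad j_i \in \{\tfrac{1}{2}, 1, \tfrac{3}{2}, \dots\}, $$

where γ is the Barbero–Immirzi parameter and ℓ_P is the Planck length; the smallest nonzero eigenvalue sets a minimal quantum of area.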
The canonical version of the dynamics was put on firm ground by Thomas Thiemann, who defined an anomaly-free Hamiltonian operator, showing the existence of a mathematically consistent background-independent theory. The covariant, or spin foam, version of the dynamics developed over several decades and crystallized in 2008 from the joint work of research groups in France, Canada, the UK and Germany, leading to the definition of a family of transition amplitudes which, in the classical limit, can be shown to be related to a family of truncations of general relativity. The finiteness of these amplitudes was proven in 2011; the result requires the existence of a positive cosmological constant, which is consistent with the observed acceleration of the expansion of the Universe. In theoretical physics, general covariance is the invariance of the form of physical laws under arbitrary differentiable coordinate transformations; the essential idea is that coordinates are only artifices used in describing nature, and hence should play no role in the formulation of fundamental physical laws.
A more significant requirement is the principle of general relativity, which states that the laws of physics take the same form in all reference systems. This is a generalization of the principle of special relativity, which states that the laws of physics take the same form in all inertial frames. In mathematics, a diffeomorphism is an isomorphism in the category of smooth manifolds; it is an invertible function that maps one differentiable manifold to another, such that both the function and its inverse are smooth. These are the defining symmetry transformations of general relativity, since the theory is formulated only in terms of a differentiable manifold. In general relativity, general covariance is intimately related to "diffeomorphism invariance"; this symmetry is one of the defining features of the theory. However, it is a common misunderstanding that "diffeomorphism invariance" refers to the invariance of the physical predictions of a theory under arbitrary coordinate transformations. Diffeomorphisms, as mathematicians define them, correspond to something much more radical.
Diffeomorphisms are the true symmetry transformations of general relativity, and they come about from the assertion that the formulation of the theory is based on a bare differentiable manifold, with no prior background geometry.
Large Hadron Collider
The Large Hadron Collider (LHC) is the world's largest and most powerful particle collider and the largest machine in the world. It was built by the European Organization for Nuclear Research (CERN) between 1998 and 2008 in collaboration with over 10,000 scientists and hundreds of universities and laboratories from more than 100 countries. It lies in a tunnel 27 kilometres in circumference and as deep as 175 metres beneath the France–Switzerland border near Geneva. First collisions were achieved in 2010 at an energy of 3.5 teraelectronvolts (TeV) per beam, about four times the previous world record. After upgrades it reached 6.5 TeV per beam. At the end of 2018, it entered a two-year shutdown period for further upgrades. The collider has four crossing points, around which are positioned seven detectors, each designed for certain kinds of research. The LHC primarily collides proton beams, but it can also use beams of heavy ions: lead–lead collisions and proton–lead collisions are typically done for one month per year. The aim of the LHC's detectors is to allow physicists to test the predictions of different theories of particle physics, including measuring the properties of the Higgs boson, searching for the large family of new particles predicted by supersymmetric theories, and studying other unsolved questions of physics.
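To make the quoted beam energies concrete, here is a small sketch of the centre-of-mass energy and the Lorentz factor of the protons at the energies mentioned above, assuming head-on proton–proton collisions and neglecting the proton rest mass where appropriate.

```python
# LHC beam-energy arithmetic: for two beams colliding head-on, the
# centre-of-mass energy is sqrt(s) = 2 * E_beam (the proton rest energy
# is negligible at these energies). The Lorentz factor gamma = E / (m_p c^2)
# shows how ultra-relativistic the circulating protons are.
PROTON_MASS_GEV = 0.938272  # proton rest energy, GeV

for e_beam_tev in (3.5, 6.5):
    e_beam_gev = e_beam_tev * 1000.0
    sqrt_s_tev = 2 * e_beam_tev             # centre-of-mass energy, TeV
    gamma = e_beam_gev / PROTON_MASS_GEV    # Lorentz factor of each proton
    print(f"{e_beam_tev} TeV/beam -> sqrt(s) = {sqrt_s_tev} TeV, gamma ~ {gamma:.0f}")
# 3.5 TeV/beam gives 7 TeV collisions; 6.5 TeV/beam gives 13 TeV collisions
```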
The term hadron refers to composite particles composed of quarks held together by the strong force. The best-known hadrons are the baryons, such as protons and neutrons. A collider is a type of particle accelerator with two directed beams of particles. In particle physics, colliders are used as a research tool: they accelerate particles to high kinetic energies and let them impact other particles. Analysis of the byproducts of these collisions gives scientists good evidence of the structure of the subatomic world and the laws of nature governing it. Many of these byproducts are produced only by high-energy collisions, and they decay after very short periods of time; thus many of them are nearly impossible to study in other ways. Physicists hope that the Large Hadron Collider will help answer some of the fundamental open questions in physics, concerning the basic laws governing the interactions and forces among the elementary objects and the deep structure of space and time, in particular the interrelation between quantum mechanics and general relativity.
Data are needed from high-energy particle experiments to suggest which versions of current scientific models are more likely to be correct, in particular to choose between the Standard Model and Higgsless models, and to validate their predictions and allow further theoretical development. Many theorists expect new physics beyond the Standard Model to emerge at the TeV energy level, as the Standard Model appears to be unsatisfactory. Issues explored by LHC collisions include: Is the mass of elementary particles generated by the Higgs mechanism via electroweak symmetry breaking? It was expected that the collider experiments would either demonstrate or rule out the existence of the elusive Higgs boson, thereby allowing physicists to consider whether the Standard Model or its Higgsless alternatives are more likely to be correct. Is supersymmetry, an extension of the Standard Model and Poincaré symmetry, realized in nature, implying that all known particles have supersymmetric partners? Are there extra dimensions, as predicted by various models based on string theory, and can we detect them?
What is the nature of the dark matter that appears to account for 27% of the mass–energy of the universe? Other open questions that may be explored using high-energy particle collisions: It is known that electromagnetism and the weak nuclear force are different manifestations of a single force called the electroweak force; the LHC may clarify whether the electroweak force and the strong nuclear force are just different manifestations of one universal unified force, as predicted by various Grand Unification Theories. Why is gravity, the fourth fundamental force, so many orders of magnitude weaker than the other three fundamental forces? See hierarchy problem. Are there additional sources of quark flavour mixing, beyond those present within the Standard Model? Why are there apparent violations of the symmetry between matter and antimatter? See CP violation. What are the nature and properties of the quark–gluon plasma, thought to have existed in the early universe and in certain compact and strange astronomical objects today?
This will be investigated by heavy-ion collisions, primarily in ALICE, but also in CMS, ATLAS and LHCb. First observed in 2010, findings published in 2012 confirmed the phenomenon of jet quenching in heavy-ion collisions. The LHC is the world's largest and highest-energy particle accelerator. The collider is contained in a circular tunnel with a circumference of 26.7 kilometres, at a depth ranging from 50 to 175 metres underground. The 3.8-metre-wide concrete-lined tunnel, constructed between 1983 and 1988, was formerly used to house the Large Electron–Positron Collider. It crosses the border between Switzerland and France, with most of it in France. Surface buildings hold ancillary equipment such as compressors, ventilation equipment, control electronics and refrigeration plants. The collider tunnel contains two adjacent parallel beamlines, each containing a beam, which travel in opposite directions around the ring. The beams intersect at four points around the ring, where the particle collisions take place.
Yang–Mills theory
Yang–Mills theory is a gauge theory based on the SU(N) group, or more generally any compact, reductive Lie algebra. Yang–Mills theory seeks to describe the behavior of elementary particles using these non-abelian Lie groups and is at the core of the unification of the electromagnetic and weak forces, as well as quantum chromodynamics, the theory of the strong force; thus it forms the basis of our understanding of the Standard Model of particle physics. In a private correspondence, Wolfgang Pauli formulated in 1953 a six-dimensional theory of Einstein's field equations of general relativity, extending the five-dimensional theory of Kaluza, Klein and others to a higher-dimensional internal space. However, there is no evidence that Pauli developed the Lagrangian of a gauge field or its quantization; because Pauli found that his theory "leads to some rather unphysical shadow particles", he refrained from publishing his results formally. Although Pauli did not publish his six-dimensional theory, he gave two talks about it in Zürich.
Recent research shows that an extended Kaluza–Klein theory is in general not equivalent to Yang–Mills theory, as the former contains additional terms. In early 1954, Chen Ning Yang and Robert Mills extended the concept of gauge theory for abelian groups, e.g. quantum electrodynamics, to non-abelian groups, in order to provide an explanation for strong interactions. Yang and Mills' idea was criticized by Pauli, because the quanta of the Yang–Mills field must be massless in order to maintain gauge invariance. The idea was set aside until 1960, when the concept of particles acquiring mass through symmetry breaking in massless theories was put forward by Jeffrey Goldstone, Yoichiro Nambu and Giovanni Jona-Lasinio. This prompted a significant restart of Yang–Mills theory studies that proved successful in the formulation of both electroweak unification and quantum chromodynamics (QCD). The electroweak interaction is described by the gauge group SU(2) × U(1), while QCD is an SU(3) Yang–Mills theory. The massless gauge bosons of the electroweak SU(2) × U(1) mix after spontaneous symmetry breaking to produce the three massive weak bosons as well as the still-massless photon field.
The dynamics of the photon field and its interactions with matter are, in turn, governed by the U(1) gauge theory of quantum electrodynamics. The Standard Model combines the strong interaction with the unified electroweak interaction through the symmetry group SU(3) × SU(2) × U(1). In the current epoch the strong interaction is not unified with the electroweak interaction, but from the observed running of the coupling constants it is believed they all converge to a single value at high energies. Phenomenology at lower energies in quantum chromodynamics is not completely understood, due to the difficulties of managing such a theory with a strong coupling; this may be the reason why confinement has not been theoretically proven, though it is a consistent experimental observation. A proof that QCD confines at low energy is a mathematical problem of great relevance, and a prize has been offered by the Clay Mathematics Institute for whoever is able to prove the existence of Yang–Mills theory and that it has a mass gap. Yang–Mills theories are a special example of gauge theory with a non-abelian symmetry group, given by the Lagrangian $\mathcal{L}_{\mathrm{gf}} = -\tfrac{1}{2}\operatorname{Tr}(F_{\mu\nu}F^{\mu\nu}) = -\tfrac{1}{4}F^{a}_{\mu\nu}F^{a\,\mu\nu}$, with the generators $T^a$ of the Lie algebra, indexed by $a$, corresponding to the $F$-quantities and satisfying $\operatorname{Tr}(T^a T^b) = \tfrac{1}{2}\delta^{ab}$ and $[T^a, T^b] = i f^{abc} T^c$, where the $f^{abc}$ are the structure constants of the Lie algebra, and the covariant derivative is defined as $D_\mu = I\partial_\mu - i g T^a A^a_\mu$, where $I$ is the identity matrix, $A^a_\mu$ is the vector potential, and $g$ is the coupling constant. In four dimensions, the coupling constant $g$ is a pure number, and for an SU(N) group one has $a, b, c = 1 \ldots N^2 - 1$. The relation $F^a_{\mu\nu} = \partial_\mu A^a_\nu - \partial_\nu A^a_\mu + g f^{abc} A^b_\mu A^c_\nu$ can be derived from the commutator $[D_\mu, D_\nu] = -i g T^a F^a_{\mu\nu}$.
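As a concrete check of the conventions above, namely the normalization Tr(T^a T^b) = ½δ^{ab} and the structure constants in [T^a, T^b] = i f^{abc} T^c, the sketch below works them out numerically for SU(2), where T^a = σ^a/2 and f^{abc} reduces to the Levi-Civita symbol. This is an illustration of the definitions, not part of the original text.

```python
import numpy as np

# Pauli matrices; T^a = sigma^a / 2 are the SU(2) generators.
sigma = [
    np.array([[0, 1], [1, 0]], dtype=complex),
    np.array([[0, -1j], [1j, 0]], dtype=complex),
    np.array([[1, 0], [0, -1]], dtype=complex),
]
T = [s / 2 for s in sigma]

# Check the normalization Tr(T^a T^b) = (1/2) delta_ab.
for a in range(3):
    for b in range(3):
        tr = np.trace(T[a] @ T[b])
        expected = 0.5 if a == b else 0.0
        assert abs(tr - expected) < 1e-12

# Extract the structure constants from [T^a, T^b] = i f^abc T^c using
# f^abc = -2i Tr([T^a, T^b] T^c); for SU(2) this reproduces the
# Levi-Civita symbol epsilon_abc.
f = np.zeros((3, 3, 3))
for a in range(3):
    for b in range(3):
        comm = T[a] @ T[b] - T[b] @ T[a]
        for c in range(3):
            f[a, b, c] = np.real(-2j * np.trace(comm @ T[c]))

print(f[0, 1, 2], f[1, 2, 0], f[2, 0, 1])  # each prints 1.0
```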
String theory
In physics, string theory is a theoretical framework in which the point-like particles of particle physics are replaced by one-dimensional objects called strings. It describes how these strings propagate through space and interact with each other. On distance scales larger than the string scale, a string looks just like an ordinary particle, with its mass and other properties determined by the vibrational state of the string. In string theory, one of the many vibrational states of the string corresponds to the graviton, a quantum mechanical particle that carries the gravitational force; thus string theory is a theory of quantum gravity. String theory is a broad and varied subject that attempts to address a number of deep questions of fundamental physics. String theory has been applied to a variety of problems in black hole physics, early universe cosmology, nuclear physics and condensed matter physics, and it has stimulated a number of major developments in pure mathematics. Because string theory provides a unified description of gravity and particle physics, it is a candidate for a theory of everything, a self-contained mathematical model that describes all fundamental forces and forms of matter.
Despite much work on these problems, it is not known to what extent string theory describes the real world or how much freedom the theory allows in the choice of its details. String theory was first studied in the late 1960s as a theory of the strong nuclear force, before being abandoned in favor of quantum chromodynamics. Subsequently, it was realized that the properties that made string theory unsuitable as a theory of nuclear physics made it a promising candidate for a quantum theory of gravity; the earliest version of string theory, bosonic string theory, incorporated only the class of particles known as bosons. It developed into superstring theory, which posits a connection called supersymmetry between bosons and the class of particles called fermions. Five consistent versions of superstring theory were developed before it was conjectured in the mid-1990s that they were all different limiting cases of a single theory in eleven dimensions known as M-theory. In late 1997, theorists discovered an important relationship called the AdS/CFT correspondence, which relates string theory to another type of physical theory called a quantum field theory.
One of the challenges of string theory is that the full theory does not have a satisfactory definition in all circumstances. Another issue is that the theory is thought to describe an enormous landscape of possible universes, which has complicated efforts to develop theories of particle physics based on string theory. These issues have led some in the community to criticize these approaches to physics and to question the value of continued research on string theory unification. In the twentieth century, two theoretical frameworks emerged for formulating the laws of physics. The first is Albert Einstein's general theory of relativity, a theory that explains the force of gravity and the structure of space and time. The other is quantum mechanics, a completely different formulation that uses probability principles to describe physical phenomena. By the late 1970s, these two frameworks had proven to be sufficient to explain most of the observed features of the universe, from elementary particles to atoms to the evolution of stars and the universe as a whole.
In spite of these successes, there are still many problems. One of the deepest problems in modern physics is the problem of quantum gravity: the general theory of relativity is formulated within the framework of classical physics, whereas the other fundamental forces are described within the framework of quantum mechanics. A quantum theory of gravity is needed in order to reconcile general relativity with the principles of quantum mechanics, but difficulties arise when one attempts to apply the usual prescriptions of quantum theory to the force of gravity. In addition to the problem of developing a consistent theory of quantum gravity, there are many other fundamental problems in the physics of atomic nuclei, black holes and the early universe. String theory is a theoretical framework that attempts to address these questions and many others. The starting point for string theory is the idea that the point-like particles of particle physics can be modeled as one-dimensional objects called strings. String theory describes how strings propagate through space and interact with each other.
In a given version of string theory, there is only one kind of string, which may look like a small loop or segment of ordinary string, and it can vibrate in different ways. On distance scales larger than the string scale, a string will look just like an ordinary particle, with its mass and other properties determined by the vibrational state of the string. In this way, all of the different elementary particles may be viewed as vibrating strings. In string theory, one of the vibrational states of the string gives rise to the graviton, a quantum mechanical particle that carries the gravitational force; thus string theory is a theory of quantum gravity. One of the main developments of the past several decades in string theory was the discovery of certain "dualities", mathematical transformations that identify one physical theory with another. Physicists studying string theory have discovered a number of these dualities between different versions of string theory, and this has led to the conjecture that all consistent versions of string theory are subsumed in a single framework known as M-theory.
Studies of string theory have yielded a number of results on the nature of black holes and the gravitational interaction. There are certain paradoxes that arise when one attempts to understand the quantum aspects of black holes, and work on string theory has attempted to clarify these issues.
Dark matter
Dark matter is a hypothetical form of matter thought to account for approximately 85% of the matter in the universe and about a quarter of its total energy density. The majority of dark matter is thought to be non-baryonic in nature, possibly being composed of some as-yet-undiscovered subatomic particles. Its presence is implied in a variety of astrophysical observations, including gravitational effects that cannot be explained by accepted theories of gravity unless more matter is present than can be seen. For this reason, most experts think dark matter to be ubiquitous in the universe and to have had a strong influence on its structure and evolution. Dark matter is called dark because it does not appear to interact with observable electromagnetic radiation, such as light, and is thus invisible to the entire electromagnetic spectrum, making it difficult to detect using usual astronomical equipment. The primary evidence for dark matter is that calculations show that many galaxies would fly apart instead of rotating, or would not have formed or move as they do, if they did not contain a large amount of unseen matter.
Other lines of evidence include observations in gravitational lensing, the cosmic microwave background, astronomical observations of the observable universe's current structure, the formation and evolution of galaxies, mass location during galactic collisions, and the motion of galaxies within galaxy clusters. In the standard Lambda-CDM model of cosmology, the total mass–energy of the universe contains 5% ordinary matter and energy, 27% dark matter and 68% of an unknown form of energy known as dark energy. Thus, dark matter constitutes about 85% of the total mass, while dark energy plus dark matter constitute 95% of the total mass–energy content. Because dark matter has not yet been observed directly, if it exists, it must barely interact with ordinary baryonic matter and radiation, except through gravity. The primary candidate for dark matter is some new kind of elementary particle that has not yet been discovered, in particular, weakly interacting massive particles, or gravitationally interacting massive particles.
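As a check on how these figures fit together, assuming the budget quoted above: considering only the matter components, dark matter accounts for 27% / (5% + 27%) ≈ 0.84 of the total mass, which rounds to roughly 85%, while dark matter plus dark energy together make up 27% + 68% = 95% of the total mass–energy content.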
Many experiments to directly detect and study dark matter particles are being undertaken, but none has yet succeeded. Dark matter is classified as cold, warm, or hot according to its velocity. Current models favor a cold dark matter scenario, in which structures emerge by gradual accumulation of particles. Although the existence of dark matter is generally accepted by the scientific community, some astrophysicists, intrigued by certain observations that do not fit the dark matter theory, argue for various modifications of the standard laws of general relativity, such as modified Newtonian dynamics, tensor–vector–scalar gravity, or entropic gravity; these models attempt to account for all observations without invoking supplemental non-baryonic matter. The hypothesis of dark matter has an elaborate history. In a talk given in 1884, Lord Kelvin estimated the number of dark bodies in the Milky Way from the observed velocity dispersion of the stars orbiting around the center of the galaxy. By using these measurements, he estimated the mass of the galaxy, which he determined is different from the mass of visible stars.
Lord Kelvin thus concluded that "many of our stars, perhaps a great majority of them, may be dark bodies". In 1906, Henri Poincaré, in "The Milky Way and Theory of Gases", used the term "dark matter", or "matière obscure" in French, in discussing Kelvin's work. The first to suggest the existence of dark matter using stellar velocities was the Dutch astronomer Jacobus Kapteyn in 1922. Fellow Dutchman and radio astronomy pioneer Jan Oort also hypothesized the existence of dark matter in 1932. Oort was studying stellar motions in the local galactic neighborhood and found that the mass in the galactic plane must be greater than what was observed, but this measurement was later determined to be erroneous. In 1933, the Swiss astrophysicist Fritz Zwicky, who studied galaxy clusters while working at the California Institute of Technology, made a similar inference. Zwicky applied the virial theorem to the Coma Cluster and obtained evidence of unseen mass that he called dunkle Materie ("dark matter"). Zwicky estimated its mass based on the motions of galaxies near its edge and compared that to an estimate based on its brightness and number of galaxies.
His estimate showed that the gravitational effect of the visible galaxies was far too small to account for such fast orbits; mass must therefore be hidden from view. Based on these conclusions, Zwicky inferred that some unseen matter provided the mass and associated gravitational attraction to hold the cluster together; this was the first formal inference about the existence of dark matter. Zwicky's estimates were off by more than an order of magnitude, mainly due to an obsolete value of the Hubble constant. However, Zwicky did correctly infer that the bulk of the matter was dark. Further indications that the mass-to-light ratio was not unity came from measurements of galaxy rotation curves. In 1939, Horace W. Babcock reported the rotation curve for the Andromeda nebula, which suggested that the mass-to-luminosity ratio increases radially. He attributed it to either light absorption within the galaxy or modified dynamics in the outer portions of the spiral, and not to the missing matter that he had effectively uncovered. Following Babcock's 1939 report of unexpectedly rapid rotation in the outskirts of the Andromeda galaxy and a mass-to-light ratio of 50, in 1940 Jan Oort discovered and wrote about the large non-visible halo of NGC 3115.
Vera Rubin, Kent Ford and Ken Freeman's work in the 1960s and 1970s on galaxy rotation curves provided further strong evidence that most of the mass in spiral galaxies is unseen.
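A minimal sketch of the rotation-curve argument behind this line of evidence, assuming a simple Newtonian circular-orbit model (the rotation speed and radii below are illustrative, not fitted to any particular galaxy): if the rotation speed v stays roughly flat out to large radius r, the enclosed mass M(r) = v²r/G keeps growing linearly, far beyond where the visible light falls off.

```python
# Rotation-curve evidence for dark matter: for a circular orbit,
# v^2 / r = G * M(<r) / r^2, so the mass enclosed within radius r is
# M(<r) = v^2 * r / G. A flat rotation curve (v roughly constant)
# implies M(<r) growing linearly with r, i.e. unseen mass well
# beyond the visible disk.
G = 6.67430e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
KPC = 3.086e19         # metres per kiloparsec

v = 220e3              # illustrative flat rotation speed, 220 km/s

for r_kpc in (5, 10, 20, 40):
    r = r_kpc * KPC
    m_enclosed = v**2 * r / G
    print(f"r = {r_kpc:2d} kpc -> M(<r) ~ {m_enclosed / M_SUN:.1e} solar masses")
# the enclosed mass doubles every time r doubles, even where little light is seen
```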