In physics, string theory is a theoretical framework in which the point-like particles of particle physics are replaced by one-dimensional objects called strings. It describes how these strings propagate through space and interact with each other. On distance scales larger than the string scale, a string looks just like an ordinary particle, with its mass and other properties determined by the vibrational state of the string. In string theory, one of the many vibrational states of the string corresponds to the graviton, a quantum mechanical particle that carries gravitational force; thus string theory is a theory of quantum gravity. String theory is a broad and varied subject that attempts to address a number of deep questions of fundamental physics. It has been applied to a variety of problems in black hole physics, early universe cosmology, nuclear physics, and condensed matter physics, and it has stimulated a number of major developments in pure mathematics. Because string theory provides a unified description of gravity and particle physics, it is a candidate for a theory of everything, a self-contained mathematical model that describes all fundamental forces and forms of matter.
Despite much work on these problems, it is not known to what extent string theory describes the real world or how much freedom the theory allows in the choice of its details. String theory was first studied in the late 1960s as a theory of the strong nuclear force, before being abandoned in favor of quantum chromodynamics. Subsequently, it was realized that the very properties that made string theory unsuitable as a theory of nuclear physics made it a promising candidate for a quantum theory of gravity. The earliest version of string theory, bosonic string theory, incorporated only the class of particles known as bosons. It later developed into superstring theory, which posits a connection called supersymmetry between bosons and the class of particles called fermions. Five consistent versions of superstring theory were developed before it was conjectured in the mid-1990s that they were all different limiting cases of a single theory in eleven dimensions known as M-theory. In late 1997, theorists discovered an important relationship called the AdS/CFT correspondence, which relates string theory to another type of physical theory called a quantum field theory.
One of the challenges of string theory is that the full theory does not have a satisfactory definition in all circumstances. Another issue is that the theory is thought to describe an enormous landscape of possible universes, which has complicated efforts to develop theories of particle physics based on string theory. These issues have led some in the community to criticize these approaches to physics and to question the value of continued research on string theory unification. In the twentieth century, two theoretical frameworks emerged for formulating the laws of physics. The first is Albert Einstein's general theory of relativity, a theory that explains the force of gravity and the structure of space and time. The other is quantum mechanics, a completely different formulation that uses known probability principles to describe physical phenomena. By the late 1970s, these two frameworks had proven to be sufficient to explain most of the observed features of the universe, from elementary particles to atoms to the evolution of stars and the universe as a whole.
In spite of these successes, there are still many problems. One of the deepest problems in modern physics is the problem of quantum gravity: the general theory of relativity is formulated within the framework of classical physics, whereas the other fundamental forces are described within the framework of quantum mechanics. A quantum theory of gravity is needed in order to reconcile general relativity with the principles of quantum mechanics, but difficulties arise when one attempts to apply the usual prescriptions of quantum theory to the force of gravity. In addition to the problem of developing a consistent theory of quantum gravity, there are many other fundamental problems in the physics of atomic nuclei, black holes, and the early universe. String theory is a theoretical framework that attempts to address these and many other questions. The starting point for string theory is the idea that the point-like particles of particle physics can be modeled as one-dimensional objects called strings. String theory describes how strings propagate through space and interact with each other.
In a given version of string theory, there is only one kind of string, which may look like a small loop or segment of ordinary string, and it can vibrate in different ways. On distance scales larger than the string scale, a string will look just like an ordinary particle, with its mass and other properties determined by the vibrational state of the string. In this way, all of the different elementary particles may be viewed as vibrating strings. In string theory, one of the vibrational states of the string gives rise to the graviton, a quantum mechanical particle that carries gravitational force; thus string theory is a theory of quantum gravity. One of the main developments of the past several decades in string theory was the discovery of certain "dualities", mathematical transformations that identify one physical theory with another. Physicists studying string theory have discovered a number of these dualities between different versions of string theory, and this has led to the conjecture that all consistent versions of string theory are subsumed in a single framework known as M-theory.
Studies of string theory have yielded a number of results on the nature of black holes and the gravitational interaction. There are certain paradoxes that arise when one attempts to understand the quantum aspects of black holes, and work on string theory has attempted to clarify these issues.
In physical cosmology and astronomy, dark energy is an unknown form of energy, hypothesized to permeate all of space, tending to accelerate the expansion of the universe. Dark energy is the most accepted hypothesis to explain observations since the 1990s indicating that the universe is expanding at an accelerating rate. Assuming that the standard model of cosmology is correct, the best current measurements indicate that dark energy contributes 68% of the total energy in the present-day observable universe; the mass–energy of dark matter and ordinary matter contribute 27% and 5%, respectively, and other components such as neutrinos and photons contribute a very small amount. The density of dark energy is very low, much less than the density of ordinary matter or dark matter within galaxies; however, because it is uniform across space, it dominates the mass–energy of the universe. Two proposed forms of dark energy are the cosmological constant, representing a constant energy density filling space homogeneously, and scalar fields such as quintessence or moduli, dynamic quantities whose energy density can vary in time and space.
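The quoted energy budget can be summarized with the standard dimensionless density parameters of the ΛCDM model (notation follows common convention; values rounded as in the text):

```latex
\Omega_\Lambda \approx 0.68, \qquad
\Omega_{\mathrm{dm}} \approx 0.27, \qquad
\Omega_b \approx 0.05, \qquad
\Omega_\Lambda + \Omega_{\mathrm{dm}} + \Omega_b \approx 1
```

The near-unity sum expresses that the total density is close to the critical density, as later measurements discussed below confirmed.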
Contributions from scalar fields that are constant in space are usually also included in the cosmological constant. The cosmological constant can be formulated to be equivalent to the zero-point radiation of space, i.e. the vacuum energy. Scalar fields that change in space can be difficult to distinguish from a cosmological constant because the change may be extremely slow. The "cosmological constant" is a constant term that can be added to Einstein's field equations of general relativity. If considered as a "source term" in the field equations, it can be viewed as equivalent to the mass of empty space, or "vacuum energy". The cosmological constant was first proposed by Einstein as a mechanism to obtain a solution of the gravitational field equations that would lead to a static universe, effectively using dark energy to balance gravity. Einstein gave the cosmological constant the symbol Λ. Einstein stated that the cosmological constant required that "empty space takes the role of gravitating negative masses which are distributed all over the interstellar space".
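As a sketch of how the constant term enters, the field equations with Λ can be written in standard notation; moving Λ to the right-hand side gives the "vacuum energy" reading, with an effective energy density set by Λ:

```latex
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu},
\qquad
\rho_{\mathrm{vac}} = \frac{\Lambda c^2}{8\pi G}
```

Here $G_{\mu\nu}$ is the Einstein tensor, $g_{\mu\nu}$ the metric, and $T_{\mu\nu}$ the stress–energy tensor.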
The mechanism was an example of fine-tuning, and it was later realized that Einstein's static universe would not be stable: local inhomogeneities would lead to either runaway expansion or contraction of the universe. The equilibrium is unstable: if the universe expands slightly, the expansion releases vacuum energy, which causes yet more expansion; likewise, a universe which contracts slightly will continue contracting. These sorts of disturbances are inevitable, due to the uneven distribution of matter throughout the universe. Further, observations made by Edwin Hubble in 1929 showed that the universe appears to be expanding and not static at all. Einstein referred to his failure to predict the idea of a dynamic universe, in contrast to a static universe, as his greatest blunder. Alan Guth and Alexei Starobinsky proposed in 1980 that a negative pressure field, similar in concept to dark energy, could drive cosmic inflation in the early universe. Inflation postulates that some repulsive force, qualitatively similar to dark energy, resulted in an enormous and exponential expansion of the universe shortly after the Big Bang.
Such expansion is an essential feature of most current models of the Big Bang. However, inflation must have occurred at a much higher energy density than the dark energy we observe today, and is thought to have ended when the universe was just a fraction of a second old; it is unclear what relation, if any, exists between dark energy and inflation. After inflationary models became accepted, the cosmological constant was thought to be irrelevant to the current universe. Nearly all inflation models predict that the total density of the universe should be very close to the critical density. During the 1980s, most cosmological research focused on models with critical density in matter only, usually 95% cold dark matter and 5% ordinary matter. These models were found to be successful at forming realistic galaxies and clusters, but some problems appeared in the late 1980s: in particular, the model required a value for the Hubble constant lower than preferred by observations, and the model under-predicted observations of large-scale galaxy clustering.
These difficulties became stronger after the discovery of anisotropy in the cosmic microwave background by the COBE spacecraft in 1992, and several modified CDM models came under active study through the mid-1990s: these included the Lambda-CDM model and a mixed cold/hot dark matter model. The first direct evidence for dark energy came from supernova observations of accelerated expansion in 1998, in Riess et al. and in Perlmutter et al., and the Lambda-CDM model then became the leading model. Soon after, dark energy was supported by independent observations: in 2000, the BOOMERanG and Maxima cosmic microwave background experiments observed the first acoustic peak in the CMB, showing that the total density is close to 100% of critical density. In 2001, the 2dF Galaxy Redshift Survey gave strong evidence that the matter density is around 30% of critical; the large difference between these two values supports a smooth component of dark energy making up the difference. Much more precise measurements from WMAP in 2003–2010 have continued to support the standard model and give more accurate measurements of the key parameters.
The term "dark energy", echoing Fritz Zwicky's "dark matter" from the 1930s, was coined by Michael Turner in 1998. High-precision measurements of the expansion of the universe are required to understand how the expansion rate changes over time.
Loop quantum gravity
Loop quantum gravity is a theory of quantum gravity, merging quantum mechanics and general relativity, making it a possible candidate for a theory of everything. Its goal is to unify gravity in a common theoretical framework with the other three fundamental forces of nature, beginning with general relativity and adding quantum features; it competes with string theory, which begins with quantum field theory and adds gravity. From the point of view of Einstein's theory, all attempts to treat gravity as another quantum force equal in importance to electromagnetism and the nuclear forces have failed. According to Einstein, gravity is not a force – it is a property of spacetime itself. Loop quantum gravity is an attempt to develop a quantum theory of gravity based directly on Einstein's geometric formulation. To do this, in LQG theory space and time are quantized, analogously to the way quantities like energy and momentum are quantized in quantum mechanics. The theory gives a physical picture of spacetime in which space and time are granular and discrete directly because of quantization, just like photons in the quantum theory of electromagnetism and the discrete energy levels of atoms.
Distance thus has a minimum value. The structure of space prefers an extremely fine fabric or network woven of finite loops; these networks of loops are called spin networks. The evolution of a spin network, or spin foam, has a scale on the order of a Planck length, approximately 10⁻³⁵ metres; smaller scales do not exist. Not just matter, but space itself, prefers an atomic structure. The vast areas of research developed in several directions that involve about 30 research groups worldwide. They all share the basic physical assumptions and the mathematical description of quantum space. Research follows two directions: the more traditional canonical loop quantum gravity, and the newer covariant loop quantum gravity, called spin foam theory. Physical consequences of the theory have been explored in several directions. The most well-developed applies to cosmology and is called loop quantum cosmology, the study of the early universe and the physics of the Big Bang. Its greatest consequence sees the evolution of the universe continuing beyond the Big Bang, called the Big Bounce.
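The Planck-length scale quoted above can be estimated directly from fundamental constants; a minimal sketch (constant values are approximate CODATA figures):

```python
import math

# Fundamental constants (SI units, approximate CODATA values)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # Newtonian gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

# Planck length: the granularity scale at which spin networks live
planck_length = math.sqrt(hbar * G / c**3)
print(f"{planck_length:.3e} m")  # on the order of 1.6e-35 m
```

This confirms the order of magnitude cited in the text: roughly 10⁻³⁵ metres.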
In 1986, Abhay Ashtekar reformulated Einstein's general relativity in a language closer to that of the rest of fundamental physics. Shortly after, Ted Jacobson and Lee Smolin realized that the formal equation of quantum gravity, called the Wheeler–DeWitt equation, admitted solutions labelled by loops when rewritten in the new Ashtekar variables. Carlo Rovelli and Lee Smolin defined a nonperturbative and background-independent quantum theory of gravity in terms of these loop solutions. Jorge Pullin and Jerzy Lewandowski understood that the intersections of the loops are essential for the consistency of the theory, and that the theory should be formulated in terms of intersecting loops, or graphs. In 1994, Rovelli and Smolin showed that the quantum operators of the theory associated to area and volume have a discrete spectrum; that is, geometry is quantized. This result defines an explicit basis of states of quantum geometry, which turned out to be labelled by Roger Penrose's spin networks, which are graphs labelled by spins.
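The discreteness of geometry is often illustrated by the spectrum of the area operator. In the commonly quoted form, a surface punctured by spin-network links carrying spins $j_i$ has area

```latex
A = 8\pi \gamma\, \ell_P^2 \sum_i \sqrt{j_i (j_i + 1)}
```

where $\gamma$ is the Barbero–Immirzi parameter and $\ell_P$ the Planck length; since the spins $j_i$ take half-integer values, the allowed areas form a discrete set.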
The canonical version of the dynamics was put on firm ground by Thomas Thiemann, who defined an anomaly-free Hamiltonian operator, showing the existence of a mathematically consistent background-independent theory. The covariant, or spin foam, version of the dynamics developed over several decades and crystallized in 2008, from the joint work of research groups in France, Canada, the UK, and Germany, leading to the definition of a family of transition amplitudes, which in the classical limit can be shown to be related to a family of truncations of general relativity; the finiteness of these amplitudes was proven in 2011. The result requires the existence of a positive cosmological constant, and this is consistent with the observed acceleration in the expansion of the Universe. In theoretical physics, general covariance is the invariance of the form of physical laws under arbitrary differentiable coordinate transformations; the essential idea is that coordinates are only artifices used in describing nature, and hence should play no role in the formulation of fundamental physical laws.
A more significant requirement is the principle of general relativity, which states that the laws of physics take the same form in all reference systems. This is a generalization of the principle of special relativity, which states that the laws of physics take the same form in all inertial frames. In mathematics, a diffeomorphism is an isomorphism in the category of smooth manifolds: it is an invertible function that maps one differentiable manifold to another, such that both the function and its inverse are smooth. These are the defining symmetry transformations of general relativity, since the theory is formulated only in terms of a differentiable manifold. In general relativity, general covariance is intimately related to "diffeomorphism invariance"; this symmetry is one of the defining features of the theory. However, it is a common misunderstanding that "diffeomorphism invariance" refers to the invariance of the physical predictions of a theory under arbitrary coordinate transformations. Diffeomorphisms, as mathematicians define them, correspond to something much more radical.
Diffeomorphisms are the true symmetry transformations of general relativity, and come about from the assertion that the formulation of the theory is based on a bare differentiable manifold, not on any prior background geometry.
The Higgs boson is an elementary particle in the Standard Model of particle physics, produced by the quantum excitation of the Higgs field, one of the fields in particle physics theory. It is named after physicist Peter Higgs, who in 1964, along with five other scientists, proposed the mechanism which suggested the existence of such a particle; its existence was confirmed in 2012 by the ATLAS and CMS collaborations based on collisions in the LHC at CERN. On December 10, 2013, two of the physicists, Peter Higgs and François Englert, were awarded the Nobel Prize in Physics for their theoretical predictions. Although Higgs's name has come to be associated with this theory, several researchers between about 1960 and 1972 independently developed different parts of it. In mainstream media the Higgs boson has been called the "God particle", from a 1993 book on the topic, although the nickname is disliked by many physicists, including Higgs himself, who regard it as sensationalism. Physicists explain the properties of forces between elementary particles in terms of the Standard Model – a widely accepted framework for understanding almost everything in the known universe, other than gravity.
In this model, the fundamental forces in nature arise from properties of our universe called gauge invariance and symmetries. The forces are transmitted by particles known as gauge bosons. In the Standard Model, the Higgs particle is a boson with spin zero, no electric charge and no colour charge; it is very unstable, decaying into other particles almost immediately. The Higgs field is a scalar field, with two neutral and two electrically charged components that form a complex doublet of the weak isospin SU(2) symmetry; the Higgs field has a "Mexican hat-shaped" potential. In its ground state, this causes the field to have a nonzero value everywhere; as a result, below some very high energy it breaks the weak isospin symmetry of the electroweak interaction. When this happens, three components of the Higgs field are "absorbed" by the SU(2) and U(1) gauge bosons to become the longitudinal components of the now-massive W and Z bosons of the weak force. The remaining electrically neutral component either manifests as a Higgs particle, or may couple separately to other particles known as fermions, causing these to acquire mass as well.
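The "Mexican hat" shape can be sketched with the standard quartic potential for the doublet $\phi$ (schematic form, with $\mu^2, \lambda > 0$):

```latex
V(\phi) = -\mu^2\, \phi^\dagger \phi + \lambda\, (\phi^\dagger \phi)^2,
\qquad
|\langle \phi \rangle| = \frac{v}{\sqrt{2}}, \quad v = \frac{\mu}{\sqrt{\lambda}}
```

Because the quadratic term is negative, the minimum of $V$ sits at a nonzero field value $v$ rather than at zero, which is exactly the "nonzero value everywhere" described above.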
Field theories had been used with great success in understanding the electromagnetic field and the strong force, but by around 1960 all attempts to create a gauge invariant theory for the weak force had failed, and gauge theories were thereby starting to fall into disrepute. The problem was that the symmetry requirements of gauge theory predicted that both electromagnetism's gauge boson and the weak force's gauge bosons should have zero mass. Although the photon is indeed massless, experiments show that the weak force's bosons have mass. This meant that either gauge invariance was an incorrect approach, or something else – unknown – was giving these particles their mass, but all attempts to suggest a theory able to solve this problem just seemed to create new theoretical issues. In the late 1950s, physicists had "no idea" how to resolve these issues, which were significant obstacles to developing a full-fledged theory for particle physics. By the early 1960s, physicists had realised that a given symmetry law might not always be followed under certain conditions, at least in some areas of physics.
This was recognised in the late 1950s by Yoichiro Nambu. Symmetry breaking can lead to unexpected results. In 1962 physicist Philip Anderson – an expert in superconductivity – wrote a paper that considered symmetry breaking in particle physics, and suggested that symmetry breaking might be the missing piece needed to solve the problems of gauge invariance in particle physics. If electroweak symmetry was somehow being broken, it might explain why electromagnetism's boson is massless yet the weak force bosons have mass, and solve the related problems. Shortly afterwards, in 1963, this was shown to be theoretically possible, at least for some limited cases. Following the 1962 and 1963 papers, three groups of researchers independently published the 1964 PRL symmetry breaking papers with similar conclusions: that the conditions for electroweak symmetry would be "broken" if an unusual type of field existed throughout the universe, and that, indeed, some fundamental particles would thereby acquire mass. The field required for this to happen became known as the Higgs field, and the mechanism by which it led to symmetry breaking became known as the Higgs mechanism.
A key feature of the necessary field is that it would take less energy for the field to have a non-zero value than a zero value, so, unlike all other known fields, the Higgs field has a non-zero value everywhere. This was the first proposal capable of showing how the weak force gauge bosons could have mass despite their governing symmetry, within a gauge invariant theory. Although these ideas did not gain much initial support or attention, by 1972 they had been developed into a comprehensive theory and proved capable of giving "sensible" results that accurately described particles known at the time and which, with exceptional accuracy, predicted several other particles discovered during the following years. During the 1970s these theories became the Standard Model of particle physics.
Yang–Mills theory is a gauge theory based on the SU(N) group, or more generally any compact, reductive Lie algebra. Yang–Mills theory seeks to describe the behavior of elementary particles using these non-abelian Lie groups and is at the core of the unification of the electromagnetic and weak forces as well as quantum chromodynamics, the theory of the strong force; thus it forms the basis of our understanding of the Standard Model of particle physics. In a private correspondence, Wolfgang Pauli formulated in 1953 a six-dimensional theory of Einstein's field equations of general relativity, extending the five-dimensional theory of Kaluza, Klein and others to a higher-dimensional internal space. However, there is no evidence that Pauli developed the Lagrangian of a gauge field or its quantization. Because Pauli found that his theory "leads to some rather unphysical shadow particles", he refrained from publishing his results formally. Although Pauli did not publish his six-dimensional theory, he gave two talks about it in Zürich.
Recent research shows that an extended Kaluza–Klein theory is in general not equivalent to Yang–Mills theory, as the former contains additional terms. In early 1954, Chen Ning Yang and Robert Mills extended the concept of gauge theory for abelian groups, e.g. quantum electrodynamics, to nonabelian groups, to provide an explanation for strong interactions. The idea of Yang and Mills was criticized by Pauli, as the quanta of the Yang–Mills field must be massless in order to maintain gauge invariance. The idea was set aside until 1960, when the concept of particles acquiring mass through symmetry breaking in massless theories was put forward by Jeffrey Goldstone, Yoichiro Nambu, and Giovanni Jona-Lasinio. This prompted a significant restart of Yang–Mills theory studies that proved successful in the formulation of both electroweak unification and quantum chromodynamics. The electroweak interaction is described by the gauge group SU(2) × U(1), while QCD is an SU(3) Yang–Mills theory. The massless gauge bosons of the electroweak SU(2) × U(1) mix after spontaneous symmetry breaking to produce the three massive weak bosons as well as the still-massless photon field.
The dynamics of the photon field and its interactions with matter are, in turn, governed by the U(1) gauge theory of quantum electrodynamics. The Standard Model combines the strong interaction with the unified electroweak interaction through the symmetry group SU(3) × SU(2) × U(1). In the current epoch the strong interaction is not unified with the electroweak interaction, but from the observed running of the coupling constants it is believed they all converge to a single value at very high energies. Phenomenology at lower energies in quantum chromodynamics is not completely understood due to the difficulties of managing such a theory with a strong coupling; this may be the reason why confinement has not been theoretically proven, though it is a consistent experimental observation. Proof that QCD confines at low energy is a mathematical problem of great relevance, and an award has been offered by the Clay Mathematics Institute for whoever is able to prove that Yang–Mills theory exists and has a mass gap. Yang–Mills theories are a special example of gauge theory with a non-abelian symmetry group, given by the Lagrangian

$$\mathcal{L}_{\mathrm{gf}} = -\frac{1}{2}\operatorname{Tr}(F^2) = -\frac{1}{4} F^{a}_{\mu\nu} F_{a}^{\mu\nu}$$

with the generators $T^a$ of the Lie algebra, indexed by $a$, satisfying

$$\operatorname{Tr}(T^a T^b) = \frac{1}{2}\,\delta^{ab}, \qquad [T^a, T^b] = i f^{abc}\, T^c,$$

where the $f^{abc}$ are structure constants of the Lie algebra, and with the covariant derivative defined as

$$D_\mu = I\,\partial_\mu - i g\, T^a A^a_\mu,$$

where $I$ is the identity matrix, $A^a_\mu$ is the vector potential, and $g$ is the coupling constant. In four dimensions, the coupling constant $g$ is a pure number, and for an SU(N) group one has $a, b, c = 1, \ldots, N^2 - 1$. The field strength

$$F^a_{\mu\nu} = \partial_\mu A^a_\nu - \partial_\nu A^a_\mu + g f^{abc} A^b_\mu A^c_\nu$$

can be derived from the commutator $[D_\mu, D_\nu] = -i g\, T^a F^a_{\mu\nu}$.
Large Hadron Collider
The Large Hadron Collider is the world's largest and most powerful particle collider and the largest machine in the world. It was built by the European Organization for Nuclear Research (CERN) between 1998 and 2008 in collaboration with over 10,000 scientists and hundreds of universities and laboratories from more than 100 countries. It lies in a tunnel 27 kilometres in circumference and as deep as 175 metres beneath the France–Switzerland border near Geneva. First collisions were achieved in 2010 at an energy of 3.5 teraelectronvolts (TeV) per beam, about four times the previous world record. After upgrades it reached 6.5 TeV per beam. At the end of 2018, it entered a two-year shutdown period for further upgrades. The collider has four crossing points, around which are positioned seven detectors, each designed for certain kinds of research. The LHC primarily collides proton beams, but it can also use beams of heavy ions: lead–lead collisions and proton–lead collisions are typically done for one month per year. The aim of the LHC's detectors is to allow physicists to test the predictions of different theories of particle physics, including measuring the properties of the Higgs boson, searching for the large family of new particles predicted by supersymmetric theories, and studying other unsolved questions of physics.
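As a quick arithmetic check of the quoted beam energies: for two equal, ultra-relativistic beams colliding head-on, the centre-of-mass energy is simply twice the per-beam energy. A minimal sketch:

```python
# Centre-of-mass energy for symmetric head-on collisions:
# sqrt(s) = 2 * E_beam (ultra-relativistic limit, equal beam energies)
def center_of_mass_energy_tev(e_beam_tev: float) -> float:
    return 2.0 * e_beam_tev

print(center_of_mass_energy_tev(3.5))  # 2010 runs: 7.0 TeV
print(center_of_mass_energy_tev(6.5))  # post-upgrade: 13.0 TeV
```

This is why the 6.5 TeV beams quoted above are commonly described as 13 TeV collisions.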
The term hadron refers to composite particles composed of quarks held together by the strong force. The best-known hadrons are the baryons, such as protons and neutrons. A collider is a type of particle accelerator with two directed beams of particles. In particle physics, colliders are used as a research tool: they accelerate particles to very high kinetic energies and let them impact other particles. Analysis of the byproducts of these collisions gives scientists good evidence of the structure of the subatomic world and the laws of nature governing it. Many of these byproducts are produced only by high-energy collisions, and they decay after very short periods of time; thus many of them are nearly impossible to study in other ways. Physicists hope that the Large Hadron Collider will help answer some of the fundamental open questions in physics, concerning the basic laws governing the interactions and forces among the elementary objects, the deep structure of space and time, and in particular the interrelation between quantum mechanics and general relativity.
Data are needed from high-energy particle experiments to suggest which versions of current scientific models are more likely to be correct – in particular to choose between the Standard Model and Higgsless models, and to validate their predictions and allow further theoretical development. Many theorists expect new physics beyond the Standard Model to emerge at the TeV energy level, as the Standard Model appears to be unsatisfactory. Issues explored by LHC collisions include: Is the mass of elementary particles generated by the Higgs mechanism via electroweak symmetry breaking? It was expected that the collider experiments would either demonstrate or rule out the existence of the elusive Higgs boson, thereby allowing physicists to consider whether the Standard Model or its Higgsless alternatives are more likely to be correct. Is supersymmetry, an extension of the Standard Model and Poincaré symmetry, realized in nature, implying that all known particles have supersymmetric partners? Are there extra dimensions, as predicted by various models based on string theory, and can we detect them?
What is the nature of the dark matter that appears to account for 27% of the mass–energy of the universe? Other open questions that may be explored using high-energy particle collisions: It is known that electromagnetism and the weak nuclear force are different manifestations of a single force called the electroweak force; the LHC may clarify whether the electroweak force and the strong nuclear force are just different manifestations of one universal unified force, as predicted by various Grand Unification Theories. Why is gravity so many orders of magnitude weaker than the other three fundamental forces? See hierarchy problem. Are there additional sources of quark flavour mixing, beyond those present within the Standard Model? Why are there apparent violations of the symmetry between matter and antimatter? See CP violation. What are the nature and properties of quark–gluon plasma, thought to have existed in the early universe and in certain compact and strange astronomical objects today?
This will be investigated mainly by heavy ion collisions in ALICE, but also in CMS, ATLAS and LHCb. First observed in 2010, findings published in 2012 confirmed the phenomenon of jet quenching in heavy-ion collisions. The LHC is the world's largest and highest-energy particle accelerator. The collider is contained in a circular tunnel, with a circumference of 26.7 kilometres, at a depth ranging from 50 to 175 metres underground. The 3.8-metre wide concrete-lined tunnel, constructed between 1983 and 1988, was formerly used to house the Large Electron–Positron Collider. It crosses the border between Switzerland and France, with most of it in France. Surface buildings hold ancillary equipment such as compressors, ventilation equipment, control electronics and refrigeration plants. The collider tunnel contains two adjacent parallel beamlines, each containing a beam, which travel in opposite directions around the ring. The beams intersect at four points around the ring, where the particle collisions take place.
Quantum mechanics, including quantum field theory, is a fundamental theory in physics which describes nature at the smallest scales of energy levels of atoms and subatomic particles. Classical physics, the physics existing before quantum mechanics, describes nature at ordinary (macroscopic) scales. Most theories in classical physics can be derived from quantum mechanics as an approximation valid at large scale. Quantum mechanics differs from classical physics in that energy, angular momentum and other quantities of a bound system are restricted to discrete values. Quantum mechanics arose from theories to explain observations which could not be reconciled with classical physics, such as Max Planck's solution in 1900 to the black-body radiation problem, and from the correspondence between energy and frequency in Albert Einstein's 1905 paper which explained the photoelectric effect. Early quantum theory was profoundly re-conceived in the mid-1920s by Erwin Schrödinger, Werner Heisenberg, Max Born and others; the modern theory is formulated in various specially developed mathematical formalisms.
In one of them, a mathematical function, the wave function, provides information about the probability amplitude of position and other physical properties of a particle. Important applications of quantum theory include quantum chemistry, quantum optics, quantum computing, superconducting magnets, light-emitting diodes, the laser, the transistor and semiconductor devices such as the microprocessor, and medical and research imaging such as magnetic resonance imaging and electron microscopy. Explanations for many biological and physical phenomena are rooted in the nature of the chemical bond, most notably in the macro-molecule DNA. Scientific inquiry into the wave nature of light began in the 17th and 18th centuries, when scientists such as Robert Hooke, Christiaan Huygens and Leonhard Euler proposed a wave theory of light based on experimental observations. In 1803, Thomas Young, an English polymath, performed the famous double-slit experiment that he described in a paper titled On the nature of light and colours.
This experiment played a major role in the general acceptance of the wave theory of light. In 1838, Michael Faraday discovered cathode rays. These studies were followed by the 1859 statement of the black-body radiation problem by Gustav Kirchhoff, the 1877 suggestion by Ludwig Boltzmann that the energy states of a physical system can be discrete, and the 1900 quantum hypothesis of Max Planck. In 1896, Wilhelm Wien had empirically determined a distribution law of black-body radiation, known as Wien's law in his honor; Ludwig Boltzmann independently arrived at this result by considerations of Maxwell's equations. However, Wien's law underestimated the radiance at low frequencies. Planck corrected this model using Boltzmann's statistical interpretation of thermodynamics and proposed what is now called Planck's law; his hypothesis that energy is radiated and absorbed in discrete "quanta" matched the observed patterns of black-body radiation and led to the development of quantum mechanics. Following Planck's solution in 1900 to the black-body radiation problem, Albert Einstein offered a quantum-based theory to explain the photoelectric effect.
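The low-frequency failure of Wien's law can be checked numerically against Planck's law, B(ν, T) = (2hν³/c²)/(exp(hν/kT) − 1), of which Wien's form (2hν³/c²)·exp(−hν/kT) is the high-frequency limit. A minimal sketch, using exact SI constants and an illustrative low frequency of 1 THz at room temperature:

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
K_B = 1.380649e-23   # Boltzmann constant, J/K
C = 2.99792458e8     # speed of light, m/s

def planck(nu: float, T: float) -> float:
    """Planck's law: B = (2 h nu^3 / c^2) / (exp(h nu / k T) - 1)."""
    x = H * nu / (K_B * T)
    return (2 * H * nu**3 / C**2) / math.expm1(x)  # expm1 = exp(x) - 1, accurately

def wien(nu: float, T: float) -> float:
    """Wien's approximation: B = (2 h nu^3 / c^2) * exp(-h nu / k T)."""
    x = H * nu / (K_B * T)
    return (2 * H * nu**3 / C**2) * math.exp(-x)

# At 1 THz and 300 K, h*nu << k*T, and Wien's form falls well below
# the true (Planck) radiance:
nu, T = 1e12, 300.0
print(planck(nu, T) / wien(nu, T))  # ratio well above 1
```

At high frequencies the two expressions agree closely, which is why Wien's law worked where it was first tested; the discrepancy Planck repaired appears only in the low-frequency tail.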
Around 1900–1910, the atomic theory and the corpuscular theory of light first came to be accepted as scientific fact. Among the first to study quantum phenomena in nature were Arthur Compton, C. V. Raman and Pieter Zeeman, each of whom has a quantum effect named after him. Robert Andrews Millikan studied the photoelectric effect experimentally, and Albert Einstein developed a theory for it. At the same time, Ernest Rutherford experimentally discovered the nuclear model of the atom, for which Niels Bohr developed his theory of atomic structure, later confirmed by the experiments of Henry Moseley. In 1913, Peter Debye extended Niels Bohr's theory of atomic structure by introducing elliptical orbits, a concept also introduced by Arnold Sommerfeld; this phase is known as the old quantum theory. According to Planck, each energy element is proportional to its frequency: E = hν, where h is Planck's constant. Planck cautiously insisted that this was only an aspect of the processes of absorption and emission of radiation and had nothing to do with the physical reality of the radiation itself.
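The Planck relation E = hν is easy to evaluate for a concrete case. A small sketch, using the exact SI value of h and green light near 540 THz as an illustrative frequency:

```python
H = 6.62607015e-34       # Planck constant, J*s (exact SI value)
EV = 1.602176634e-19     # one electron-volt in joules (exact SI value)

def photon_energy(nu: float) -> float:
    """Planck relation E = h * nu: energy of one quantum at frequency nu (Hz), in joules."""
    return H * nu

# Green light near 540 THz carries quanta of roughly 2.2 eV each:
e_joules = photon_energy(5.4e14)
print(e_joules, e_joules / EV)
```

The tiny size of h is why these discrete quanta go unnoticed at ordinary scales: a single visible-light quantum carries only a few times 10⁻¹⁹ joules.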
In fact, he considered his quantum hypothesis a mathematical trick to get the right answer rather than a genuine discovery. However, in 1905 Albert Einstein interpreted Planck's quantum hypothesis realistically and used it to explain the photoelectric effect, in which shining light on certain materials can eject electrons from the material; he won the 1921 Nobel Prize in Physics for this work. Einstein further developed this idea to show that an electromagnetic wave such as light could also be described as a particle, with a discrete quantum of energy that depends on its frequency. The foundations of quantum mechanics were established during the first half of the 20th century by Max Planck, Niels Bohr, Werner Heisenberg, Louis de Broglie, Arthur Compton, Albert Einstein, Erwin Schrödinger, Max Born, John von Neumann, Paul Dirac, Enrico Fermi, Wolfgang Pauli, Max von Laue, Freeman Dyson, David Hilbert, Wi
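Einstein's explanation of the photoelectric effect is usually summarized by the equation KE_max = hν − W, where W is the material's work function: the maximum kinetic energy of an ejected electron depends on the light's frequency, not its intensity. A minimal sketch; the 2.3 eV work function below is an assumed illustrative value (roughly typical of an alkali metal), not a figure from the text:

```python
H = 6.62607015e-34    # Planck constant, J*s
EV = 1.602176634e-19  # one electron-volt in joules

def max_kinetic_energy_ev(nu: float, work_function_ev: float) -> float:
    """Einstein's photoelectric equation KE_max = h*nu - W, in eV.
    A negative result means no electron is ejected at that frequency."""
    return (H * nu) / EV - work_function_ev

# Violet light (~750 THz) ejects electrons from the assumed material;
# red light (~400 THz) does not, no matter how intense the beam:
print(max_kinetic_energy_ev(7.5e14, 2.3))  # positive
print(max_kinetic_energy_ev(4.0e14, 2.3))  # negative
```

This frequency threshold, inexplicable if light delivered energy continuously, was the key evidence for Einstein's realistic reading of Planck's quanta.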