ALICE (A Large Ion Collider Experiment) is one of seven detector experiments at the Large Hadron Collider at CERN; the other six are ATLAS, CMS, TOTEM, LHCb, LHCf and MoEDAL. ALICE is optimized to study heavy-ion collisions at a centre-of-mass energy of 2.76 TeV per nucleon pair. The resulting temperature and energy density are expected to be high enough to produce quark–gluon plasma, a state of matter in which quarks and gluons are freed. Similar conditions are believed to have existed a fraction of a second after the Big Bang, before quarks and gluons bound together to form hadrons and heavier particles. ALICE focuses on the physics of strongly interacting matter at extreme energy densities; the existence of the quark–gluon plasma and its properties are key issues in quantum chromodynamics for understanding colour confinement and chiral-symmetry restoration. Recreating this primordial form of matter and understanding how it evolves is expected to shed light on how matter is organized, on the mechanism that confines quarks and gluons, and on the nature of strong interactions and how they generate the bulk of the mass of ordinary matter.
Quantum chromodynamics predicts that at sufficiently high energy densities there will be a phase transition from conventional hadronic matter, in which quarks are locked inside nuclear particles, to a plasma of deconfined quarks and gluons. The reverse of this transition is believed to have taken place when the universe was just 10⁻⁶ s old, and may still play a role today in the cores of collapsing neutron stars and other astrophysical objects. The idea of building a dedicated heavy-ion detector for the LHC was first aired at the historic Evian meeting "Towards the LHC Experimental Programme" in March 1992. From the ideas presented there, the ALICE collaboration was formed, and in 1993 a Letter of Intent was submitted. ALICE was first proposed as a central detector in 1993 and complemented by an additional forward muon spectrometer designed in 1995. In 1997, ALICE received the green light from the LHC Committee to proceed towards final design and construction; the first ten years were spent on an extensive R&D effort.
As for all other LHC experiments, it became clear from the outset that the challenges of heavy-ion physics at the LHC could not be met with existing technology. Significant advances, and in some cases technological breakthroughs, would be required to build on the ground what physicists had dreamed up on paper for their experiments. The broad, well-organised and well-supported R&D effort, sustained over most of the 1990s, led to many evolutionary and some revolutionary advances in detectors and computing. Designing a dedicated heavy-ion experiment in the early 1990s, some 15 years before its use at the LHC, posed daunting challenges: the detector had to be general-purpose, able to measure most signals of potential interest even if their relevance would only become apparent later, and flexible, allowing additions and modifications along the way as new avenues of investigation opened up. In both respects ALICE did quite well, as it included in its initial menu a number of observables whose importance only became clear later.
Various major detection systems were added over time, from the muon spectrometer in 1995 and the transition radiation detectors in 1999 to a large jet calorimeter added in 2007. ALICE recorded data from the first lead–lead collisions at the LHC in 2010. Data sets taken during the heavy-ion periods in 2010 and 2011, as well as proton–lead data from 2013, have provided an excellent basis for an in-depth look at the physics of the quark–gluon plasma. As of 2014, after more than three years of successful operation, the ALICE detector was about to undergo a major programme of consolidation and upgrade during the long shutdown of CERN's accelerator complex. A new subdetector called the dijet calorimeter will be installed, and all 18 of the existing ALICE subdetectors will be upgraded. There will also be major renovation work on the ALICE infrastructure, including the electrical and cooling systems. The wealth of published scientific results and the intense upgrade programme have attracted numerous institutes and scientists from all over the world.
Today the ALICE Collaboration has more than 1,800 members coming from 176 institutes in 41 countries. Searches for quark–gluon plasma and a deeper understanding of QCD started at CERN and Brookhaven with lighter ions in the 1980s. Today's programme at these laboratories has moved on to ultrarelativistic collisions of heavy ions and is just reaching the energy threshold at which the phase transition is expected to occur; the LHC, with a centre-of-mass energy around 5.5 TeV per nucleon pair, pushes the energy reach further still. During head-on collisions of lead ions at the LHC, hundreds of protons and neutrons smash into one another at energies upwards of a few TeV. Lead ions are accelerated to more than 99.9999% of the speed of light, and collisions at the LHC are 100 times more energetic than those of protons, heating up matter at the interaction point to a temperature 100,000 times higher than that in the core of the Sun. When the two lead nuclei slam into each other, matter undergoes a transition to form, for a brief instant, a droplet of primordial matter, the so-called quark–gluon plasma, believed to have filled the universe a few microseconds after the Big Bang.
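The quoted beam speed can be checked with a short back-of-the-envelope calculation. The sketch below assumes a beam energy of 1.38 TeV per nucleon (half of the 2.76 TeV per nucleon pair quoted above) and an average nucleon mass of about 0.931 GeV/c²; both numbers are standard values, not taken from the text.

```python
import math

# Approximate mass per nucleon in a lead nucleus (GeV/c^2); standard value.
M_NUCLEON = 0.931

def lorentz_factor(energy_per_nucleon_gev):
    """Lorentz factor gamma = E / (m c^2) for an ultrarelativistic nucleon."""
    return energy_per_nucleon_gev / M_NUCLEON

def beta(gamma):
    """Speed as a fraction of c, derived from the Lorentz factor."""
    return math.sqrt(1.0 - 1.0 / gamma**2)

# 2.76 TeV per nucleon pair in the centre of mass -> 1.38 TeV per beam nucleon.
gamma = lorentz_factor(1380.0)
v = beta(gamma)
```

With these inputs the Lorentz factor comes out near 1,500 and v/c exceeds 0.999999, consistent with the "more than 99.9999% of the speed of light" figure in the text.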
The quark–gluon plasma is formed as protons and neutrons "melt" into their elementary constituents and quarks and gluons become asymptotically free. As the droplet of QGP cools, the individual quarks and gluons recombine into a blizzard of ordinary matter that speeds away in all directions; the debris contains particles
Contrails (condensation trails) are line-shaped clouds produced by aircraft engine exhaust or by changes in air pressure at aircraft cruise altitudes, several miles above the Earth's surface. Contrails are composed of water, in the form of ice crystals; the combination of water vapor in aircraft engine exhaust and the low ambient temperatures that exist at high altitudes allows the formation of the trails. Impurities in the engine exhaust from the fuel, including sulfur compounds, provide some of the particles that can serve as sites for water-droplet growth in the exhaust; if water droplets form, they may freeze into the ice particles that compose a contrail. Contrail formation can also be triggered by changes in air pressure in wingtip vortices or in the air over the entire wing surface. Contrails and other clouds directly resulting from human activity are collectively named homogenitus. Depending on the temperature and humidity at the altitude at which contrails form, they may be visible for only a few seconds or minutes, or may persist for hours and spread to be several miles wide, resembling natural cirrus or altocumulus clouds.
Persistent contrails are of particular interest to scientists because they increase the cloudiness of the atmosphere. The resulting cloud forms are formally described as homomutatus; they may resemble cirrus, cirrocumulus or cirrostratus, and are sometimes called cirrus aviaticus. Persistent spreading contrails are suspected to have an effect on global climate. The main products of hydrocarbon fuel combustion are carbon dioxide and water vapor. At high altitudes this water vapor emerges into a cold environment, and the local increase in water vapor can raise the relative humidity of the air past the saturation point; the vapor then condenses into tiny water droplets, which freeze if the temperature is low enough. These millions of tiny water droplets and/or ice crystals form the contrail. The time taken for the vapor to cool enough to condense accounts for the contrail forming some distance behind the aircraft. At high altitudes, supercooled water vapor requires a trigger to encourage deposition or condensation; the particles in the aircraft's exhaust act as this trigger, causing the trapped vapor to condense rapidly.
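The saturation mechanism described above can be illustrated numerically. This sketch uses the Magnus approximation for the saturation vapor pressure over liquid water with commonly quoted coefficients; the cruise-altitude temperature and the example vapor pressure are illustrative values, not taken from the text.

```python
import math

def saturation_vapor_pressure_hpa(t_celsius):
    """Magnus approximation for saturation vapor pressure over liquid water (hPa)."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

def relative_humidity(vapor_pressure_hpa, t_celsius):
    """Relative humidity (%) for a given partial pressure of water vapor."""
    return 100.0 * vapor_pressure_hpa / saturation_vapor_pressure_hpa(t_celsius)

# At a cruise-altitude temperature of about -50 C, the saturation pressure is
# tiny, so even a small addition of exhaust water vapor can push the air past
# saturation (RH > 100%), allowing a contrail to form and persist.
e_sat_cruise = saturation_vapor_pressure_hpa(-50.0)
rh = relative_humidity(0.08, -50.0)  # 0.08 hPa of water vapor (illustrative)
```

The key point is that the saturation pressure falls roughly exponentially with temperature, which is why the same exhaust that leaves no trail at low altitude readily saturates the cold air at cruise altitude.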
Exhaust contrails usually form at high altitudes, but they can form closer to the ground when the air is cold and moist. A 2013–2014 study jointly supported by NASA, the German aerospace center DLR and Canada's National Research Council (NRC) determined that biofuels could reduce contrail generation; the reduction was explained by the finding that biofuels produce fewer soot particles, which are the nuclei around which the ice crystals form. The tests were performed by flying a DC-8 at cruising altitude with a sample-gathering aircraft flying in trail. In these samples, the contrail-producing soot particle count was reduced by 50 to 70 percent using a 50% blend of conventional Jet A1 fuel and HEFA biofuel produced from camelina. As a wing generates lift, it causes a vortex to form at the wingtip, and at the tip of the flap when deployed. These wingtip vortices persist in the atmosphere long after the aircraft has passed, and the reduction in pressure and temperature across each vortex can cause water to condense and make the cores of the wingtip vortices visible.
This effect is more common on humid days. Wingtip vortices can sometimes be seen behind the wing flaps of airliners during takeoff and landing, and during landing of the Space Shuttle. The visible cores of wingtip vortices contrast with the other major type of contrail, which is caused by the combustion of fuel. Contrails produced from jet engine exhaust are seen at high altitude, directly behind each engine. By contrast, the visible cores of wingtip vortices are seen only at low altitude, where the aircraft is travelling after takeoff or before landing and where the ambient humidity is higher, and they trail behind the wingtips and wing flaps rather than behind the engines. At high-thrust settings the fan blades at the intake of a turbofan engine reach transonic speeds, causing a sudden drop in air pressure; this creates the condensation fog sometimes observed by air travelers during takeoff. The tips of rotating surfaces sometimes produce visible contrails as well. Contrails, by affecting the Earth's radiation balance, act as a radiative forcing.
Studies have found that contrails trap outgoing longwave radiation emitted by the Earth and atmosphere at a greater rate than they reflect incoming solar radiation. NASA conducted detailed research on the atmospheric and climatological effects of contrails during the Atmospheric Effects of Aviation Project, including effects on ozone, ice-crystal formation and particle composition. Global radiative forcing has been calculated from reanalysis data, climatological models and radiative-transfer codes; it is estimated to amount to 0.012 W/m² for 2005, with an uncertainty range of 0.005 to 0.026 W/m² and a low level of scientific understanding. The overall net effect of contrails is therefore positive, i.e. a warming effect. However, the effect varies daily and annually, and overall the magnitude of the forcing is not well known: globally, estimates range from 3.5 mW/m² to 17 mW/m². Other studies have determined that night flights are responsible for most of the warming effect: while accounting for only 25% of daily air traffic, they contribute 60 to 80% of contrail radiative forcing.
ATLAS is one of the seven particle detector experiments constructed at the Large Hadron Collider, a particle accelerator at CERN in Switzerland. The experiment is designed to take advantage of the unprecedented energy available at the LHC and to observe phenomena that involve massive particles which were not observable using earlier lower-energy accelerators. ATLAS was one of the two LHC experiments involved in the discovery of the Higgs boson in July 2012, and it was also designed to search for evidence of theories of particle physics beyond the Standard Model. The ATLAS detector is 46 metres long, 25 metres in diameter, and weighs about 7,000 tonnes; the experiment is a collaboration involving 3,000 physicists from over 175 institutions in 38 countries. The project was led for the first 15 years by Peter Jenni; it was headed by Fabiola Gianotti between 2009 and 2013, by David Charlton from 2013 to 2017, and by Karl Jakobs afterwards. The ATLAS Collaboration, the group of physicists who built and run the detector, was formed in 1992 when the proposed EAGLE and ASCOT collaborations merged their efforts to build a single, general-purpose particle detector for the Large Hadron Collider.
The design was a combination of the two previous proposals and benefited from the detector research and development done for the Superconducting Super Collider. The ATLAS experiment was proposed in its current form in 1994 and funded by the CERN member countries in 1995. Additional countries and laboratories have joined in subsequent years. Construction work began at individual institutions, with detector components being shipped to CERN and assembled in the ATLAS experiment pit starting in 2003. Construction was completed in 2008, and the experiment detected its first single-beam events on 10 September of that year. Data taking was then interrupted for over a year due to an LHC magnet quench incident. On 23 November 2009, the first proton–proton collisions occurred at the LHC and were recorded by ATLAS, at a low injection energy of 450 GeV per beam. Since then the LHC energy has been increasing: 900 GeV per beam at the end of 2009, 3,500 GeV per beam for the whole of 2010 and 2011, and 4,000 GeV per beam in 2012. After a long shutdown in 2013 and 2014, ATLAS saw 6,500 GeV per beam in 2015.
The first cyclotron, an early type of particle accelerator, was built by Ernest O. Lawrence in 1931, with a radius of just a few centimetres and a particle energy of 1 megaelectronvolt. Since then, accelerators have grown enormously in the quest to produce new particles of greater and greater mass, and as accelerators have grown, so too has the list of known particles that they might be used to investigate. The most comprehensive model of particle interactions available today is known as the Standard Model of particle physics. With the important exception of the Higgs boson, now detected by the ATLAS and CMS experiments, all of the particles predicted by the model had been observed by previous experiments. While the Standard Model predicts that quarks and neutrinos should exist, it does not explain why the masses of these particles differ by orders of magnitude. Due to this, many particle physicists believe it is possible that the Standard Model will break down at energies at the teraelectronvolt scale or higher.
If such beyond-the-Standard-Model physics is observed, a new model, identical to the Standard Model at energies thus far probed, can be developed to describe particle physics at higher energies. Most of the proposed theories predict new higher-mass particles, some of which may be light enough to be observed by ATLAS. ATLAS is designed to be a general-purpose detector; when the proton beams produced by the Large Hadron Collider interact in the center of the detector, a variety of different particles with a broad range of energies are produced. Rather than focusing on a particular physical process, ATLAS is designed to measure the broadest possible range of signals; this is intended to ensure that whatever form any new physical processes or particles might take, ATLAS will be able to detect them and measure their properties. Experiments at earlier colliders, such as the Tevatron and Large Electron-Positron Collider, were designed based on a similar philosophy. However, the unique challenges of the Large Hadron Collider – its unprecedented energy and high rate of collisions – require ATLAS to be larger and more complex than previous experiments.
At 27 kilometres in circumference, the Large Hadron Collider collides two beams of protons together, with each proton carrying up to 6.5 TeV of energy – enough to produce particles with masses greater than those of any particles currently known, if such particles exist. ATLAS is designed to detect these particles and measure their properties, namely their masses, energies, lifetimes and spins. In order to identify all particles produced at the interaction point where the particle beams collide, the detector is designed in layers made up of detectors of different types, each of which is designed to observe specific types of particles. The different traces that particles leave in each layer of the detector allow for effective particle identification and accurate measurements of energy and momentum. As the energy of the particles produced by the accelerator increases, the detectors attached to it must grow in order to measure and stop higher-energy particles; as of 2017, ATLAS is the largest detector built at a particle collider. ATLAS investigates ma
A multi-wire proportional chamber (MWPC) is a type of proportional counter that detects charged particles and photons and can give positional information on their trajectory by tracking the trails of gaseous ionization. The multi-wire chamber uses an array of wires at high voltage, which run through a chamber with conductive walls held at ground potential. Alternatively, the wires may act as the cathode, held at a high negative voltage. The chamber is filled with a chosen gas, such as an argon/methane mix, such that any ionizing particle that passes through will ionize surrounding gas atoms. The resulting ions and electrons are accelerated by the electric field across the chamber, causing a localised cascade of ionization known as a Townsend avalanche; this charge collects on the nearest wire and results in a pulse proportional to the ionising effect of the detected particle. By recording the pulses from all the wires, the particle trajectory can be found. Adaptations of this basic design are the resistive plate chamber and the drift chamber.
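One simple way to turn the per-wire pulses into a position estimate is a charge-weighted centroid across neighbouring wires. The sketch below is illustrative only; the 2 mm wire pitch and pulse charges are made-up values, not from the text.

```python
def crossing_position(wire_positions_mm, pulse_charges):
    """Charge-weighted centroid of the pulses on adjacent wires: an estimate
    of where the avalanche (and hence the particle) crossed the wire plane."""
    total = sum(pulse_charges)
    return sum(x * q for x, q in zip(wire_positions_mm, pulse_charges)) / total

# Wires spaced every 2 mm; most of the avalanche charge collects on the wire
# nearest the track, with smaller pulses induced on its neighbours.
positions = [0.0, 2.0, 4.0, 6.0]
charges = [0.05, 0.80, 0.12, 0.03]
x = crossing_position(positions, charges)
```

Because the charge sharing between neighbouring wires is used, the position resolution can be finer than the wire spacing itself.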
The drift-chamber principle is in turn subdivided into more specialised designs, including the time projection chamber and the microstrip gas chamber, as well as detectors that use silicon. In 1968, Georges Charpak, while at the European Organization for Nuclear Research (CERN), developed the multi-wire proportional chamber; this invention earned him the Nobel Prize in Physics in 1992. The chamber was an advance over the earlier bubble chamber, improving the detection rate from only one or two particles per second to about 1,000 particle detections per second. The MWPC produced electronic signals from particle detection, allowing scientists to examine data via computers. The multi-wire chamber is a development of the spark chamber. In a typical experiment, the chamber contains a mixture of gases such as argon, isobutane and freon; the chamber could also be filled with liquid xenon. For high-energy physics experiments, it is used to observe a particle's path. For a long time, bubble chambers were used for this purpose, but with the improvement of electronics it became desirable to have a detector with fast electronic read-out.
A wire chamber is a chamber with many parallel wires, arranged as a grid and put on high voltage, with the metal casing at ground potential. As in the Geiger counter, a particle leaves a trace of ions and electrons, which drift toward the case or the nearest wire, respectively. By marking off the wires that carried a pulse of current, one can see the particle's path. The chamber has good relative time resolution, good positional accuracy and self-triggered operation. The development of the chamber enabled scientists to study the trajectories of particles with much improved precision, and for the first time to observe and study rarer interactions. If one precisely measures the timing of the current pulses on the wires and takes into account that the charges need some time to drift to the nearest wire, one can infer the distance at which the particle passed the wire; this increases the accuracy of the path reconstruction, and a chamber operated this way is known as a drift chamber. The drift chamber works by balancing the energy that drifting charges lose in collisions with gas molecules against the energy they gain from the applied electric field, so that they drift at a nearly constant velocity.
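The drift-time measurement described above reduces to a simple product of drift velocity and pulse delay. The sketch below assumes a drift velocity of about 50 µm/ns, a typical order of magnitude for argon-based gas mixtures; both numbers are illustrative, not from the text.

```python
# Typical drift velocity in argon-based gas mixtures (~5 cm/us); illustrative.
DRIFT_VELOCITY_UM_PER_NS = 50.0

def drift_distance_mm(drift_time_ns):
    """Distance from the track to the sense wire, inferred from the delay
    between the particle's passage and the arrival of the pulse on the wire."""
    return DRIFT_VELOCITY_UM_PER_NS * drift_time_ns / 1000.0

d = drift_distance_mm(120.0)  # a 120 ns delay corresponds to 6 mm from the wire
```

The near-constant drift velocity is what makes this linear conversion from time to distance possible.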
The design is similar to that of the multi-wire chamber, but with the central-layer wires spaced farther apart. The detection of charged particles within the chamber is possible through the ionization of gas molecules by the moving charged particle. The Fermilab detector CDF II contains such a drift chamber; it is filled with argon and ethane gas, with wires separated by 3.56-millimetre gaps. If two drift chambers are used with the wires of one orthogonal to the wires of the other, and both orthogonal to the beam direction, a more precise detection of the position is obtained. If an additional simple detector, with poor or no positional resolution, is used to detect the particle at a fixed distance before or after the wires, a three-dimensional reconstruction can be made and the speed of the particle deduced from the difference in the times of passage through the different parts of the detector; this setup gives a detector called a time projection chamber. For measuring the velocity of electrons in a gas there are special drift chambers, velocity drift chambers, which measure the drift time for a known location of ionisation.
A semiconductor detector in ionising radiation detection physics is a device that uses a semiconductor to measure the effect of incident charged particles or photons. Semiconductor detectors find broad application in radiation protection, gamma and X-ray spectrometry, and as particle detectors. In semiconductor detectors, ionizing radiation is measured by the number of charge carriers it sets free in the detector material, which is arranged between two electrodes. Ionizing radiation produces free electrons and holes; the number of electron-hole pairs is proportional to the energy deposited by the radiation in the semiconductor. As a result, a number of electrons are transferred from the valence band to the conduction band, and an equal number of holes are created in the valence band. Under the influence of an electric field, electrons and holes travel to the electrodes, where they result in a pulse that can be measured in an outer circuit, as described by the Shockley–Ramo theorem.
The holes travel in the opposite direction to the electrons and can also be measured. As the amount of energy required to create an electron-hole pair is known, and is independent of the energy of the incident radiation, measuring the number of electron-hole pairs allows the energy of the incident radiation to be determined. The energy required to produce an electron-hole pair is low compared to the energy required to produce an ion pair in a gas detector, so in semiconductor detectors the statistical variation of the pulse height is smaller and the energy resolution is higher. As the electrons travel fast, the time resolution is also very good, and depends on the rise time. Compared with gaseous ionization detectors, the density of a semiconductor detector is high, and charged particles of high energy can give off their energy in a semiconductor of small dimensions. Most silicon particle detectors work, in principle, by doping narrow strips of silicon to turn them into diodes, which are then reverse biased; as charged particles pass through these strips, they cause small ionization currents that can be detected and measured.
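The pair-counting statistics above can be made concrete. This sketch uses standard textbook values for the mean pair-creation energy (about 3.6 eV in silicon, 2.96 eV in germanium) and an approximate Fano factor of 0.1; the 662 keV gamma energy is an illustrative example, and none of these numbers come from the text.

```python
import math

# Mean energy to create one electron-hole pair (eV); standard textbook values.
EPSILON = {"Si": 3.6, "Ge": 2.96}
FANO = 0.1  # approximate Fano factor; suppresses pure-Poisson fluctuations

def n_pairs(deposited_ev, material):
    """Mean number of electron-hole pairs for a given deposited energy."""
    return deposited_ev / EPSILON[material]

def fwhm_ev(deposited_ev, material):
    """Statistical limit on energy resolution (FWHM, eV) from pair statistics."""
    return 2.355 * math.sqrt(FANO * EPSILON[material] * deposited_ev)

# A 662 keV gamma fully absorbed in germanium:
n = n_pairs(662_000, "Ge")
res = fwhm_ev(662_000, "Ge")
```

The large number of pairs (hundreds of thousands) compared with a gas detector, where each ion pair costs roughly ten times more energy, is what gives semiconductor detectors their superior energy resolution.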
Arranging thousands of these detectors around a collision point in a particle accelerator can yield an accurate picture of the paths particles take. Silicon detectors have a much higher resolution in tracking charged particles than older technologies such as cloud chambers or wire chambers. The drawback is that silicon detectors are much more expensive than these older technologies and require sophisticated cooling to reduce leakage currents. They also suffer degradation over time from radiation; however, this can be reduced thanks to the Lazarus effect. Diamond detectors have many similarities with silicon detectors, but are expected to offer significant advantages, in particular a high radiation hardness and low drift currents; at present, they are more difficult to manufacture. Germanium detectors are used for gamma spectroscopy in nuclear physics, as well as for X-ray spectroscopy. While silicon detectors cannot be thicker than a few millimetres, germanium can have a depleted, sensitive thickness of centimetres, and can therefore be used as a total-absorption detector for gamma rays up to a few MeV.
These detectors are called high-purity germanium (HPGe) detectors or hyperpure germanium detectors. Before current purification techniques were refined, germanium crystals could not be produced with purity sufficient to enable their use as spectroscopy detectors: impurities in the crystals trap electrons and holes, ruining the performance of the detectors. Instead, germanium crystals were doped with lithium ions in order to produce an intrinsic region in which the electrons and holes would be able to reach the contacts and produce a signal. When germanium detectors were first developed, only small crystals were available. Low efficiency was the result, and germanium detector efficiency is still quoted relative to that of a "standard" 3″ × 3″ NaI scintillation detector. Crystal-growth techniques have since improved, allowing detectors to be manufactured that are as large as or larger than available NaI crystals, although such detectors cost more than €100,000 as of 2012. HPGe detectors use lithium diffusion to make an n+ ohmic contact and boron implantation to make a p+ contact.
Coaxial detectors with a central n+ contact are referred to as n-type detectors, while p-type detectors have a p+ central contact. The thickness of these contacts represents a dead layer around the surface of the crystal, within which energy depositions do not result in detector signals. The central contact in these detectors is of the opposite type to the surface contact, making the dead layer in n-type detectors smaller than that in p-type detectors. Typical dead-layer thicknesses are several hundred micrometres for an Li diffusion layer and a few tenths of a micrometre for a B implantation layer. The major drawback of germanium detectors is that they must be cooled to liquid-nitrogen temperatures to produce spectroscopic data. At higher temperatures, electrons can thermally cross the band gap in the crystal and reach the conduction band, where they are free to respond to the electric field, producing too much electrical noise for the device to be useful as a spectrometer. Cooling to liquid-nitrogen temperature reduces thermal excitations of valence electrons so that only a gamma-ray interaction can give an electron the energy necessary to cross the band gap and reach the conduction band.
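The benefit of cooling can be estimated from the Boltzmann factor governing intrinsic carrier generation, which scales roughly as exp(−Eg/2kT). The sketch below uses the approximate germanium band gap of 0.67 eV and compares room temperature with the liquid-nitrogen boiling point of 77 K; these are standard values, not from the text.

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K
E_GAP_GE = 0.67            # germanium band gap in eV (approximate)

def thermal_factor(temp_k):
    """Boltzmann factor governing intrinsic carrier generation, ~exp(-Eg/2kT)."""
    return math.exp(-E_GAP_GE / (2.0 * K_BOLTZMANN_EV * temp_k))

# Ratio of thermal carrier generation at room temperature vs. 77 K.
suppression = thermal_factor(300.0) / thermal_factor(77.0)
```

The suppression factor is astronomically large, which is why cooling to 77 K turns an unusably noisy crystal into a precision spectrometer.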
Cooling with liquid nitrogen is inconvenie
A cloud chamber, also known as a Wilson cloud chamber, is a particle detector used for visualizing the passage of ionizing radiation. A cloud chamber consists of a sealed environment containing a supersaturated vapor of water or alcohol. An energetic charged particle interacts with the gaseous mixture by knocking electrons off gas molecules via electrostatic forces during collisions, resulting in a trail of ionized gas particles. The resulting ions act as condensation centers around which a mist-like trail of small droplets forms if the gas mixture is at the point of condensation. These droplets are visible as a "cloud" track that persists for several seconds while the droplets fall through the vapor. The tracks have characteristic shapes: for example, an alpha-particle track is thick and straight, while an electron track is wispy and shows more evidence of deflections by collisions. Cloud chambers played a prominent role in experimental particle physics from the 1920s to the 1950s, until the advent of the bubble chamber.
In particular, the discoveries of the positron in 1932 and the muon in 1936, both by Carl Anderson, used cloud chambers, and the discovery of the kaon by George Rochester and Clifford Charles Butler in 1947 was made using a cloud chamber as the detector. In each case, cosmic rays were the source of the ionizing radiation. Charles Thomson Rees Wilson, a Scottish physicist, is credited with inventing the cloud chamber. Inspired by sightings of the Brocken spectre while working on the summit of Ben Nevis in 1894, he began to develop expansion chambers for studying cloud formation and optical phenomena in moist air; he discovered that ions could act as centers for water-droplet formation in such chambers. He pursued the application of this discovery and perfected the first cloud chamber in 1911. In Wilson's original chamber, the air inside the sealed device was saturated with water vapor, and a diaphragm was used to expand the air inside the chamber, cooling it and beginning to condense the water vapor; hence the name expansion cloud chamber.
When an ionizing particle passes through the chamber, water vapor condenses on the resulting ions and the trail of the particle is visible in the vapor cloud. Wilson, along with Arthur Compton, received the Nobel Prize in Physics in 1927, in Wilson's case for his work on the cloud chamber. This kind of chamber is called a pulsed chamber because the conditions for operation are not continuously maintained. Further developments were made by Patrick Blackett, who utilised a stiff spring to expand and compress the chamber rapidly, making it sensitive to particles several times a second; a cine film was used to record the images. The diffusion cloud chamber was developed in 1936 by Alexander Langsdorf. This chamber differs from the expansion cloud chamber in that it is continuously sensitized to radiation, and in that the bottom must be cooled to a rather low temperature, colder than −26 °C. Instead of water vapor, alcohol is used because of its lower freezing point. Cloud chambers cooled by dry ice or by Peltier-effect thermoelectric cooling are common demonstration and hobbyist devices.
Diffusion-type cloud chambers will be discussed here. A simple cloud chamber consists of a warm top plate and a cold bottom plate. It requires a source of liquid alcohol at the warm side of the chamber, where the liquid evaporates, forming a vapor that cools as it falls through the gas and condenses on the cold bottom plate. Some sort of ionizing radiation is also needed. Methanol, isopropanol or another alcohol vapor saturates the chamber; the vapor falls as it cools down, and the cold condenser provides a steep temperature gradient. The result is a supersaturated environment. As energetic charged particles pass through the gas, they leave ionization trails, and the alcohol vapor condenses around the gaseous ion trails left behind by the ionizing particles. This occurs because alcohol and water molecules are polar, resulting in a net attractive force toward a nearby free charge. The result is a misty, cloud-like formation, seen by the presence of droplets falling down to the condenser. When the tracks are emitted radially outward from a source, their point of origin can be determined.
Just above the cold condenser plate there is a volume of the chamber that is sensitive to ionization tracks. The ion trail left by the radioactive particles provides an optimal trigger for condensation and cloud formation. This sensitive volume is increased in height by employing a steep temperature gradient and stable conditions. A strong electric field is often used to draw cloud tracks down to the sensitive region of the chamber and increase its sensitivity; the electric field can also serve to prevent large amounts of background "rain", caused by condensation forming above the sensitive volume, from obscuring tracks through constant precipitation. A black background makes it easier to observe cloud tracks, and a tangential light source is needed to illuminate the white droplets against the black background. The tracks are often not apparent until a shallow pool of alcohol has formed at the condenser plate. If a magnetic field is applied across the cloud chamber, positively and negatively charged particles will curve in opposite directions, according to the Lorentz force law.
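The curvature produced by the magnetic field follows from the Lorentz force: the radius of the circular track is r = p/(qB). The sketch below uses the standard conversion for momentum in GeV/c, charge in units of e and field in tesla; the 0.1 GeV/c momentum and 0.5 T field are illustrative values, not from the text.

```python
def radius_m(p_gev, b_tesla, charge=1):
    """Radius of curvature r = p/(qB) of a charged-particle track.
    With p in GeV/c, q in units of e and B in tesla, r[m] ~ 3.336 * p / (q*B)."""
    return 3.336 * p_gev / (charge * b_tesla)

# A particle with 0.1 GeV/c momentum in a 0.5 T field (illustrative numbers):
r = radius_m(0.1, 0.5)
```

Measuring the track radius in a known field therefore gives the particle's momentum, and the sense of the curvature gives the sign of its charge, which is how Anderson distinguished the positron from the electron.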
The bubble chamber was invented by Donald A. Glaser of the United States in 1952; for this, he was awarded the Nobel Prize in Physics in 1960.
A particle accelerator is a machine that uses electromagnetic fields to propel charged particles to high speeds and energies and to contain them in well-defined beams. Large accelerators are used for basic research in particle physics; the most powerful accelerator is the Large Hadron Collider near Geneva, built by the European collaboration CERN. It is a collider, which accelerates two beams of protons to an energy of 6.5 TeV each and causes them to collide head-on, creating a center-of-mass energy of 13 TeV. Other powerful accelerators are KEKB at KEK in Japan, RHIC at Brookhaven National Laboratory, and the Tevatron at Fermilab in Illinois. Accelerators are also used as synchrotron light sources for the study of condensed matter physics. Smaller particle accelerators are used in a wide variety of applications, including particle therapy for oncological purposes, radioisotope production for medical diagnostics, ion implanters for the manufacture of semiconductors, and accelerator mass spectrometers for measurements of rare isotopes such as radiocarbon.
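The advantage of the head-on collider geometry can be made concrete with standard relativistic kinematics: two equal beams give a centre-of-mass energy of 2E, while the same beam hitting a stationary target yields only about √(2·E·m). A small sketch (natural units, GeV; not from the text):

```python
# Sketch: centre-of-mass energy sqrt(s) for a symmetric collider vs a
# fixed-target setup, in natural units (GeV), beam mass neglected against E.
import math

M_PROTON = 0.938272  # proton mass, GeV/c^2

def sqrt_s_collider(beam_energy_gev: float) -> float:
    """Two equal beams colliding head-on: sqrt(s) = 2E."""
    return 2.0 * beam_energy_gev

def sqrt_s_fixed_target(beam_energy_gev: float,
                        target_mass_gev: float = M_PROTON) -> float:
    """Beam on a stationary proton: s = 2*E*m + 2*m^2."""
    return math.sqrt(2.0 * beam_energy_gev * target_mass_gev
                     + 2.0 * target_mass_gev ** 2)

# LHC proton beams at 6.5 TeV:
#   collider mode:     sqrt(s) = 13 000 GeV
#   fixed-target mode: sqrt(s) ~ 110 GeV only
```

The square-root scaling of the fixed-target result is why high-energy frontier machines are built as colliders.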
There are more than 30,000 accelerators in operation around the world. There are two basic classes of accelerators: electrostatic and electrodynamic accelerators. Electrostatic accelerators use static electric fields to accelerate particles; the most common types are the Cockcroft–Walton generator and the Van de Graaff generator. A small-scale example of this class is the cathode-ray tube in an ordinary old television set. The achievable kinetic energy for particles in these devices is determined by the accelerating voltage, which is limited by electrical breakdown. Electrodynamic or electromagnetic accelerators, on the other hand, use changing electromagnetic fields to accelerate particles. Since in these types the particles can pass through the same accelerating field multiple times, the output energy is not limited by the strength of the accelerating field. This class, first developed in the 1920s, is the basis for most modern large-scale accelerators. Rolf Widerøe, Gustav Ising, Leó Szilárd, Max Steenbeck, and Ernest Lawrence are considered pioneers of this field, having conceived and built the first operational linear particle accelerator, the betatron, and the cyclotron.
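The electrostatic limit noted above follows from a one-line relation: a particle of charge q crossing a potential difference V gains kinetic energy qV, so a singly charged ion accelerated through one megavolt gains exactly 1 MeV. A trivial sketch (the terminal voltages used are illustrative examples):

```python
# Sketch: energy gained in an electrostatic accelerator is E = q * V,
# so the accelerating voltage (limited by electrical breakdown) caps the
# energy. Voltages below are illustrative, not from the text.

def kinetic_energy_mev(charge_number: int, voltage_megavolts: float) -> float:
    """Kinetic energy in MeV gained by an ion of charge
    charge_number * e crossing voltage_megavolts megavolts once."""
    return charge_number * voltage_megavolts

# A proton (q = +1e) through a 5 MV Van de Graaff terminal gains 5 MeV;
# a doubly charged helium ion through the same terminal gains 10 MeV.
```

Electrodynamic machines escape this cap precisely because the particle traverses the accelerating field many times, adding energy on each pass.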
Because colliders can give evidence of the structure of the subatomic world, accelerators were commonly referred to as atom smashers in the 20th century. Although most accelerators actually propel subatomic particles, the term persists in popular usage when referring to particle accelerators in general. Beams of high-energy particles are useful for fundamental and applied research in the sciences, and also in many technical and industrial fields unrelated to fundamental research; it has been estimated that there are 30,000 accelerators worldwide. Of these, only about 1% are research machines with energies above 1 GeV, while about 44% are for radiotherapy, 41% for ion implantation, 9% for industrial processing and research, and 4% for biomedical and other low-energy research. The bar graph shows the breakdown of the number of industrial accelerators according to their applications. The numbers are based on 2012 statistics drawn from various sources, including production and sales data published in presentations or market surveys and data provided by a number of manufacturers.
For the most basic inquiries into the dynamics and structure of matter, space, and time, physicists seek the simplest kinds of interactions at the highest possible energies. These entail particle energies of many GeV and interactions of the simplest kinds of particles: leptons and quarks for matter, or photons and gluons for the field quanta. Since isolated quarks are experimentally unavailable due to color confinement, the simplest available experiments involve the interactions, first, of leptons with each other, and second, of leptons with nucleons, which are composed of quarks and gluons. To study the collisions of quarks with each other, scientists resort to collisions of nucleons, which at high energy may be usefully considered as essentially 2-body interactions of the quarks and gluons of which they are composed. Thus, elementary particle physicists tend to use machines creating beams of electrons, positrons, protons, and antiprotons, interacting with each other or with the simplest nuclei at the highest possible energies, generally hundreds of GeV or more.
The largest and highest-energy particle accelerator used for elementary particle physics is the Large Hadron Collider at CERN, operating since 2009. Nuclear physicists and cosmologists may use beams of bare atomic nuclei, stripped of electrons, to investigate the structure and properties of the nuclei themselves, and of condensed matter at the high temperatures and densities that might have occurred in the first moments of the Big Bang. These investigations involve collisions of heavy nuclei – of atoms like iron or gold – at energies of several GeV per nucleon. The largest such particle accelerator is the Relativistic Heavy Ion Collider at Brookhaven National Laboratory. Particle accelerators can also produce proton beams, which can produce proton-rich medical or research isotopes, as opposed to the neutron-rich ones made in fission reactors. An example of this type of machine is LANSCE at Los Alamos. Besides being of fundamental interest, the acceleration of electrons in a magnetic field causes the high-energy electrons to emit extre