Quantum mechanics, including quantum field theory, is a fundamental theory in physics which describes nature at the smallest scales of energy levels of atoms and subatomic particles. Classical physics, the physics that existed before quantum mechanics, describes nature at ordinary (macroscopic) scale. Most theories in classical physics can be derived from quantum mechanics as an approximation valid at large scale. Quantum mechanics differs from classical physics in that energy, angular momentum and other quantities of a bound system are restricted to discrete values. Quantum mechanics arose from theories developed to explain observations that could not be reconciled with classical physics, such as Max Planck's solution in 1900 to the black-body radiation problem and the correspondence between energy and frequency in Albert Einstein's 1905 paper, which explained the photoelectric effect. Early quantum theory was profoundly re-conceived in the mid-1920s by Erwin Schrödinger, Werner Heisenberg, Max Born and others; the modern theory is formulated in various specially developed mathematical formalisms.
In one of them, a mathematical function, the wave function, provides information about the probability amplitude of position and other physical properties of a particle. Important applications of quantum theory include quantum chemistry, quantum optics, quantum computing, superconducting magnets, light-emitting diodes, the laser, the transistor and semiconductor devices such as the microprocessor, and medical and research imaging such as magnetic resonance imaging and electron microscopy. Explanations for many biological and physical phenomena are rooted in the nature of the chemical bond, most notably the macro-molecule DNA. Scientific inquiry into the wave nature of light began in the 17th and 18th centuries, when scientists such as Robert Hooke, Christiaan Huygens and Leonhard Euler proposed a wave theory of light based on experimental observations. In 1803, Thomas Young, an English polymath, performed the famous double-slit experiment that he described in a paper titled On the nature of light and colours.
This experiment played a major role in the general acceptance of the wave theory of light. In 1838, Michael Faraday discovered cathode rays; these studies were followed by the 1859 statement of the black-body radiation problem by Gustav Kirchhoff, the 1877 suggestion by Ludwig Boltzmann that the energy states of a physical system can be discrete, and the 1900 quantum hypothesis of Max Planck. Planck's hypothesis that energy is radiated and absorbed in discrete "quanta" matched the observed patterns of black-body radiation. In 1896, Wilhelm Wien empirically determined a distribution law of black-body radiation, known as Wien's law in his honor. Ludwig Boltzmann independently arrived at this result by considerations of Maxwell's equations. However, Wien's law underestimated the radiance at low frequencies. Planck corrected this model using Boltzmann's statistical interpretation of thermodynamics and proposed what is now called Planck's law, which led to the development of quantum mechanics. Following Max Planck's solution in 1900 to the black-body radiation problem, Albert Einstein offered a quantum-based theory to explain the photoelectric effect.
Around 1900–1910, the atomic theory and the corpuscular theory of light first came to be accepted as scientific fact. Among the first to study quantum phenomena in nature were Arthur Compton, C. V. Raman and Pieter Zeeman, each of whom has a quantum effect named after him. Robert Andrews Millikan studied the photoelectric effect experimentally, and Albert Einstein developed a theory for it. At the same time, Ernest Rutherford experimentally discovered the nuclear model of the atom, for which Niels Bohr developed his theory of atomic structure, later confirmed by the experiments of Henry Moseley. In 1913, Peter Debye extended Niels Bohr's theory of atomic structure, introducing elliptical orbits, a concept also introduced by Arnold Sommerfeld; this phase is known as the old quantum theory. According to Planck, each energy element is proportional to its frequency: E = hν, where h is Planck's constant. Planck cautiously insisted that this was only an aspect of the processes of absorption and emission of radiation and had nothing to do with the physical reality of the radiation itself.
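As a minimal numerical sketch of the Planck relation E = hν (the constant is the standard SI value; the chosen frequency, roughly green visible light, is an illustrative assumption, not a figure from the text):

```python
# Photon energy from the Planck relation E = h * nu.
h = 6.62607015e-34  # Planck's constant in J*s, exact by SI definition

def photon_energy(frequency_hz: float) -> float:
    """Return the energy in joules of one energy quantum at the given frequency."""
    return h * frequency_hz

nu = 5.6e14  # Hz, roughly green visible light (assumed example value)
E = photon_energy(nu)
print(f"E = {E:.3e} J  (~{E / 1.602176634e-19:.2f} eV)")  # ~3.71e-19 J, ~2.32 eV
```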
In fact, he considered his quantum hypothesis a mathematical trick to get the right answer rather than a sizable discovery. However, in 1905 Albert Einstein interpreted Planck's quantum hypothesis realistically and used it to explain the photoelectric effect, in which shining light on certain materials can eject electrons from the material; he won the 1921 Nobel Prize in Physics for this work. Einstein further developed this idea to show that an electromagnetic wave such as light could also be described as a particle, with a discrete quantum of energy dependent on its frequency. The foundations of quantum mechanics were established during the first half of the 20th century by Max Planck, Niels Bohr, Werner Heisenberg, Louis de Broglie, Arthur Compton, Albert Einstein, Erwin Schrödinger, Max Born, John von Neumann, Paul Dirac, Enrico Fermi, Wolfgang Pauli, Max von Laue, Freeman Dyson, David Hilbert, Wilhelm Wien and others.
Atmospheric entry is the movement of an object from outer space into and through the gases of an atmosphere of a planet, dwarf planet, or natural satellite. There are two main types of atmospheric entry: uncontrolled entry, such as the entry of astronomical objects, space debris, or bolides; and controlled entry or reentry of a spacecraft that can be navigated or follows a predetermined course. Technologies and procedures allowing the controlled atmospheric entry, descent, and landing of spacecraft are collectively termed EDL. Atmospheric drag and aerodynamic heating can cause atmospheric breakup capable of disintegrating smaller objects; these forces may cause objects with lower compressive strength to explode. Crewed space vehicles must be slowed to subsonic speeds before parachutes or air brakes may be deployed; such vehicles have kinetic energies between 50 and 1,800 megajoules, and atmospheric dissipation is the only way of expending that kinetic energy. The amount of rocket fuel required to slow the vehicle would be nearly equal to the amount used to accelerate it, and it is thus impractical to use retro rockets for the entire Earth reentry procedure.
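A rough back-of-the-envelope sketch of why so much energy must be dissipated aerodynamically; the orbital speed used below is a typical low-Earth-orbit value assumed for illustration, not a figure from the text:

```python
# Specific kinetic energy (energy per kilogram) of an orbiting vehicle:
# E/m = v^2 / 2.  7,800 m/s is an assumed, typical LEO speed.
v_orbital = 7800.0  # m/s
specific_ke = 0.5 * v_orbital**2  # J/kg
print(f"{specific_ke / 1e6:.1f} MJ per kilogram")  # ~30.4 MJ/kg
# For comparison, detonating TNT releases roughly 4.2 MJ/kg, so every
# kilogram of an orbital vehicle carries several times that in kinetic energy.
```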
While the high temperature generated at the surface of the heat shield is due to adiabatic compression, the vehicle's kinetic energy is ultimately lost to gas friction after the vehicle has passed by. Other smaller energy losses include black-body radiation directly from the hot gases and chemical reactions between ionized gases. Ballistic warheads and expendable vehicles do not require slowing at reentry and, in fact, are made streamlined so as to maintain their speed. Furthermore, slow-speed returns to Earth from near-space, such as parachute jumps from balloons, do not require heat shielding because the gravitational acceleration of an object starting at relative rest from within the atmosphere itself cannot create enough velocity to cause significant atmospheric heating. For Earth, atmospheric entry occurs by convention at the Kármán line at an altitude of 100 km above the surface, while at Venus atmospheric entry occurs at 250 km and at Mars at about 80 km. Uncontrolled objects reach high velocities while accelerating through space toward the Earth under the influence of Earth's gravity, and are slowed by friction upon encountering Earth's atmosphere.
Meteors are often travelling quite fast relative to the Earth because their own orbital path is different from that of the Earth before they encounter Earth's gravity well. Most controlled objects enter at hypersonic speeds due to their suborbital, orbital, or unbounded trajectories. Various advanced technologies have been developed to enable atmospheric reentry and flight at extreme velocities. An alternative low-velocity method of controlled atmospheric entry is buoyancy, suitable for planetary entry where thick atmospheres, strong gravity, or both complicate high-velocity hyperbolic entry, such as the atmospheres of Venus and the gas giants. The concept of the ablative heat shield was described as early as 1920 by Robert Goddard: "In the case of meteors, which enter the atmosphere with speeds as high as 30 miles per second, the interior of the meteors remains cold, and the erosion is due, to a large extent, to chipping or cracking of the heated surface. For this reason, if the outer surface of the apparatus were to consist of layers of a very infusible hard substance with layers of a poor heat conductor between, the surface would not be eroded to any considerable extent, especially as the velocity of the apparatus would not be nearly so great as that of the average meteor." Practical development of reentry systems began as the range and reentry velocity of ballistic missiles increased.
For early short-range missiles, like the V-2, stabilization and aerodynamic stress were important issues, but heating was not a serious problem. Medium-range missiles like the Soviet R-5, with a 1,200-kilometer range, required ceramic composite heat shielding on separable reentry vehicles; the first ICBMs, with ranges of 8,000 to 12,000 kilometers, were only possible with the development of modern ablative heat shields and blunt-shaped vehicles. In the United States, this technology was pioneered by H. Julian Allen and A. J. Eggers Jr. of the National Advisory Committee for Aeronautics at Ames Research Center. In 1951, they made the counterintuitive discovery that a blunt shape made the most effective heat shield. From simple engineering principles, Allen and Eggers showed that the heat load experienced by an entry vehicle was inversely proportional to the drag coefficient. If the reentry vehicle is made blunt, air cannot "get out of the way" quickly enough and acts as an air cushion that pushes the shock wave and heated shock layer forward, away from the vehicle.
Since most of the hot gases are no longer in direct contact with the vehicle, the heat energy stays in the shocked gas and moves around the vehicle to dissipate into the atmosphere. The Allen and Eggers discovery, though initially treated as a military secret, was published in 1958. Over the decades since the 1950s, a rich technical jargon has grown around the engineering of vehicles designed to enter planetary atmospheres; it is recommended that the reader review the jargon glossary before continuing with this article on atmospheric reentry. When atmospheric entry is part of a spacecraft landing or recovery, particularly on a planetary body other than Earth, entry is part of a phase referred to as entry, descent, and landing, or EDL.
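The blunt-body result can be made concrete with a standard stagnation-point heating correlation, the Sutton–Graves relation. This correlation is not given in the article itself, and the density, speed, and nose radii below are illustrative assumptions; the sketch only shows the qualitative trend, that heat flux falls as the nose radius grows:

```python
import math

# Sutton-Graves estimate of stagnation-point convective heat flux:
#   q = k * sqrt(rho / r_n) * v^3
# K_EARTH is the commonly quoted Earth-atmosphere constant in SI units.
K_EARTH = 1.7415e-4  # kg^0.5 / m

def stagnation_heat_flux(rho: float, nose_radius_m: float, v_m_s: float) -> float:
    """Return approximate stagnation-point heat flux in W/m^2."""
    return K_EARTH * math.sqrt(rho / nose_radius_m) * v_m_s**3

rho = 1e-4  # kg/m^3, thin upper atmosphere (assumed)
v = 7500.0  # m/s, near-orbital entry speed (assumed)
for r_n in (0.1, 1.0, 3.0):  # sharp vs. blunt nose radii, in meters
    q = stagnation_heat_flux(rho, r_n, v)
    print(f"r_n = {r_n:4.1f} m -> q ~ {q / 1e4:7.1f} W/cm^2")
# Blunter noses (larger r_n) see markedly lower heat flux, which is the
# engineering content of the Allen-Eggers blunt-body discovery.
```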
United States House Committee on Financial Services
The United States House Committee on Financial Services, also referred to as the House Banking Committee and formerly known as the Committee on Banking and Currency, is the committee of the United States House of Representatives that oversees the entire financial services industry, including the securities, insurance and housing industries. The Financial Services Committee oversees the work of the Federal Reserve, the United States Department of the Treasury, the U.S. Securities and Exchange Commission and other financial services regulators. It is chaired by Democrat Maxine Waters of California, who was elected as chair of the committee and assumed office on January 3, 2019. The Banking and Currency Committee was created on December 11, 1865, to take over responsibilities previously handled by the Ways and Means Committee, and it continued to function under that name until 1968. The Financial Services Committee operates with six subcommittees.
The jurisdiction over insurance was transferred in 2001 to the then-House Banking and Financial Services Committee from the House Energy and Commerce Committee. Since that time it had been the purview of the Subcommittee on Capital Markets and Government Sponsored Enterprises, but "with plans to reform Fannie Mae and Freddie Mac expected to take up much of that panel's agenda, insurance instead moved to a new Subcommittee on Insurance and Community Opportunity." In the 115th Congress, a new Subcommittee on Terrorism and Illicit Finance was created, dedicated to disrupting the financing of terrorist organizations.
In mathematics, non-Euclidean geometry consists of two geometries based on axioms closely related to those specifying Euclidean geometry. As Euclidean geometry lies at the intersection of metric geometry and affine geometry, non-Euclidean geometry arises when either the metric requirement is relaxed, or the parallel postulate is replaced with an alternative one. In the latter case one obtains hyperbolic geometry and elliptic geometry, the traditional non-Euclidean geometries; when the metric requirement is relaxed, there are affine planes associated with the planar algebras, which give rise to kinematic geometries that have also been called non-Euclidean geometry. The essential difference between the metric geometries is the nature of parallel lines. Euclid's fifth postulate, the parallel postulate, is equivalent to Playfair's postulate, which states that, within a two-dimensional plane, for any given line ℓ and a point A not on ℓ, there is exactly one line through A that does not intersect ℓ. In hyperbolic geometry, by contrast, there are infinitely many lines through A not intersecting ℓ, while in elliptic geometry, any line through A intersects ℓ.
Another way to describe the differences between these geometries is to consider two straight lines indefinitely extended in a two-dimensional plane that are both perpendicular to a third line: In Euclidean geometry, the lines remain at a constant distance from each other and are known as parallels. In hyperbolic geometry, they "curve away" from each other, increasing in distance as one moves further from the points of intersection with the common perpendicular. In elliptic geometry, the lines eventually intersect. Euclidean geometry, named after the Greek mathematician Euclid, includes some of the oldest known mathematics, and geometries that deviated from it were not accepted as legitimate until the 19th century. The debate that eventually led to the discovery of the non-Euclidean geometries began almost as soon as Euclid's work Elements was written. In the Elements, Euclid began with a limited number of assumptions and sought to prove all the other results in the work. The most notorious of the postulates is referred to as "Euclid's Fifth Postulate," or the "parallel postulate", which in Euclid's original formulation is: If a straight line falls on two straight lines in such a manner that the interior angles on the same side are together less than two right angles, then the straight lines, if produced indefinitely, meet on that side on which the angles are less than two right angles.
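The contrast can be stated compactly. The triangle angle-sum column below is a standard equivalent characterization of the three geometries, added here for clarity rather than taken from the text:

```latex
% Given a line \ell and a point A \notin \ell, the number of lines through A
% not meeting \ell, together with the triangle angle sum, distinguishes
% the three metric geometries:
\begin{array}{lcc}
\text{Geometry}   & \text{parallels through } A & \text{angle sum of a triangle} \\
\hline
\text{Euclidean}  & \text{exactly one}          & = \pi \\
\text{Hyperbolic} & \text{infinitely many}      & < \pi \\
\text{Elliptic}   & \text{none}                 & > \pi
\end{array}
```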
Other mathematicians have devised simpler forms of this property. Regardless of the form of the postulate, however, it consistently appears more complicated than Euclid's other postulates:

1. To draw a straight line from any point to any point.
2. To produce a finite straight line continuously in a straight line.
3. To describe a circle with any centre and distance.
4. That all right angles are equal to one another.

For at least a thousand years, geometers were troubled by the disparate complexity of the fifth postulate, and believed it could be proved as a theorem from the other four. Many attempted to find a proof by contradiction, including Ibn al-Haytham, Omar Khayyám, Nasīr al-Dīn al-Tūsī and Giovanni Girolamo Saccheri. The theorems of Ibn al-Haytham, Khayyam and al-Tusi on quadrilaterals, including the Lambert quadrilateral and Saccheri quadrilateral, were "the first few theorems of the hyperbolic and the elliptic geometries." These theorems, along with their alternative postulates, such as Playfair's axiom, played an important role in the later development of non-Euclidean geometry.
These early attempts at challenging the fifth postulate had a considerable influence on its development among later European geometers, including Witelo, Levi ben Gerson, John Wallis and Saccheri. All of these early attempts at formulating a non-Euclidean geometry provided flawed proofs of the parallel postulate, containing assumptions that were essentially equivalent to it; they did, however, establish some early properties of the hyperbolic and elliptic geometries. Khayyam, for example, tried to derive it from an equivalent postulate he formulated from "the principles of the Philosopher": "Two convergent straight lines intersect and it is impossible for two convergent straight lines to diverge in the direction in which they converge." Khayyam considered the three cases (right, obtuse and acute) that the summit angles of a Saccheri quadrilateral can take and, after proving a number of theorems about them, he refuted the obtuse and acute cases based on his postulate and hence derived the classic postulate of Euclid, which he did not realize was equivalent to his own.
Another example is al-Tusi's son, Sadr al-Din, who wrote a book on the subject in 1298, based on al-Tusi's later thoughts, which presented another hypothesis equivalent to the parallel postulate. "He essentially revised both the Euclidean system of axioms and postulates and the proofs of many propositions from the Elements." His work was published in Rome in 1594 and was studied by European geometers, including Saccheri, who criticised this work as well as that of Wallis. Giordano Vitale, in his book Euclide restituo, used the Saccheri quadrilateral to prove that if three points are equidistant on the base AB and the summit CD, then AB and CD are everywhere equidistant.
Wave–particle duality is the concept in quantum mechanics that every particle or quantum entity may be described in terms not only of particles, but also of waves. It expresses the inability of the classical concepts "particle" or "wave" to fully describe the behaviour of quantum-scale objects. As Albert Einstein wrote: It seems as though we must use sometimes the one theory and sometimes the other, while at times we may use either. We are faced with a new kind of difficulty. We have two contradictory pictures of reality. Through the work of Max Planck, Albert Einstein, Louis de Broglie, Arthur Compton, Niels Bohr and many others, current scientific theory holds that all particles exhibit a wave nature and vice versa; this phenomenon has been verified not only for elementary particles, but also for compound particles like atoms and molecules. For macroscopic particles, because of their extremely short wavelengths, wave properties usually cannot be detected. Although the use of wave–particle duality has worked well in physics, the meaning or interpretation has not been satisfactorily resolved.
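The de Broglie relation λ = h/p makes the claim about macroscopic particles quantitative. The electron and baseball parameters below are illustrative assumptions, chosen only to show the scale difference:

```python
h = 6.62607015e-34  # Planck's constant, J*s

def de_broglie_wavelength(mass_kg: float, speed_m_s: float) -> float:
    """Wavelength lambda = h / (m * v) for a non-relativistic particle."""
    return h / (mass_kg * speed_m_s)

# An electron at 1% of light speed vs. a thrown baseball (assumed values).
print(de_broglie_wavelength(9.109e-31, 3e6))  # ~2.4e-10 m: atomic scale, detectable
print(de_broglie_wavelength(0.145, 40.0))     # ~1.1e-34 m: far too short to detect
```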
Bohr regarded the "duality paradox" as a metaphysical fact of nature. A given kind of quantum object will exhibit sometimes wave, sometimes particle, character, in different physical settings, he saw such duality as one aspect of the concept of complementarity. Bohr regarded renunciation of the cause-effect relation, or complementarity, of the space-time picture, as essential to the quantum mechanical account. Werner Heisenberg considered the question further, he saw the duality as present for all quantic entities, but not quite in the usual quantum mechanical account considered by Bohr. He saw it in what is called second quantization, which generates an new concept of fields which exist in ordinary space-time, causality still being visualizable. Classical field values are replaced by an new kind of field value, as considered in quantum field theory. Turning the reasoning around, ordinary quantum mechanics can be deduced as a specialized consequence of quantum field theory. Democritus argued that all things in the universe, including light, are composed of indivisible sub-components.
At the beginning of the 11th century, the Arabic scientist Ibn al-Haytham wrote the first comprehensive Book of Optics, describing reflection and the operation of a pinhole lens via rays of light traveling from the point of emission to the eye. He asserted that these rays were composed of particles of light. In 1630, René Descartes popularized and accredited the opposing wave description in his treatise on light, The World, showing that the behavior of light could be re-created by modeling wave-like disturbances in a universal medium, i.e. the luminiferous aether. Beginning in 1670 and progressing over three decades, Isaac Newton developed and championed his corpuscular theory, arguing that the perfectly straight lines of reflection demonstrated light's particle nature, since only particles could travel in such straight lines; he explained refraction by positing that particles of light accelerated laterally upon entering a denser medium. Around the same time, Newton's contemporaries Robert Hooke and Christiaan Huygens, and later Augustin-Jean Fresnel, mathematically refined the wave viewpoint, showing that if light traveled at different speeds in different media, refraction could be explained as the medium-dependent propagation of light waves.
The resulting Huygens–Fresnel principle was extremely successful at reproducing light's behavior and was subsequently supported by Thomas Young's discovery of wave interference of light in his double-slit experiment in 1801. The wave view did not immediately displace the ray and particle view, but began to dominate scientific thinking about light in the mid 19th century, since it could explain polarization phenomena that the alternatives could not. James Clerk Maxwell discovered that he could apply his previously discovered Maxwell's equations, along with a slight modification, to describe self-propagating waves of oscillating electric and magnetic fields, and it quickly became apparent that visible light, ultraviolet light and infrared light were all electromagnetic waves of differing frequency. In 1901, Max Planck published an analysis that succeeded in reproducing the observed spectrum of light emitted by a glowing object. To accomplish this, Planck had to make a mathematical assumption of quantized energy of the oscillators, i.e. the atoms of the black body that emit radiation.
Einstein later proposed that electromagnetic radiation itself is quantized, not just the energy of radiating atoms. Black-body radiation, the emission of electromagnetic energy due to an object's heat, could not be explained from classical arguments alone. The equipartition theorem of classical mechanics, the basis of all classical thermodynamic theories, stated that an object's energy is partitioned equally among the object's vibrational modes. But applying the same reasoning to the electromagnetic emission of such a thermal object was not so successful. That thermal objects emit light had been long known, and since light was known to be waves of electromagnetism, physicists hoped to describe this emission via classical laws; this became known as the black-body problem. Since the equipartition theorem worked so well in describing the vibrational modes of the thermal object itself, it was natural to assume that it would perform equally well in describing the radiative emission of such objects. But a problem arose: if each mode received an equal partition of energy, the short-wavelength modes would consume all the energy.
This became clear when plotting the Rayleigh–Jeans law, which, while correctly predicting the intensity of long-wavelength emissions, predicted infinite total energy, as the intensity diverges to infinity for short wavelengths.
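A short numerical sketch of this divergence, comparing the classical Rayleigh–Jeans spectrum with Planck's law. Both formulas are standard; the temperature is an assumed example value:

```python
import math

h = 6.62607015e-34   # Planck's constant, J*s
c = 2.99792458e8     # speed of light, m/s
k_B = 1.380649e-23   # Boltzmann constant, J/K

def rayleigh_jeans(nu: float, T: float) -> float:
    """Classical spectral radiance: grows without bound as nu^2."""
    return 2.0 * nu**2 * k_B * T / c**2

def planck(nu: float, T: float) -> float:
    """Planck's law: the exponential cutoff tames the short-wavelength divergence."""
    return (2.0 * h * nu**3 / c**2) / math.expm1(h * nu / (k_B * T))

T = 5000.0  # K, assumed temperature for illustration
for nu in (1e13, 1e14, 1e15):  # infrared -> visible -> ultraviolet
    print(f"nu = {nu:.0e} Hz: Rayleigh-Jeans = {rayleigh_jeans(nu, T):.3e}, "
          f"Planck = {planck(nu, T):.3e}  (W m^-2 Hz^-1 sr^-1)")
# At low frequency the two agree; at high frequency the classical value
# overshoots by orders of magnitude -- the ultraviolet catastrophe.
```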
A barometer is a scientific instrument used to measure air pressure. Pressure tendency can forecast short-term changes in the weather. Many measurements of air pressure are used within surface weather analysis to help find surface troughs, high-pressure systems and frontal boundaries. Barometers and pressure altimeters are essentially the same instrument, but used for different purposes. An altimeter is intended to be used at different levels, matching the corresponding atmospheric pressure to the altitude, while a barometer is kept at the same level and measures subtle pressure changes caused by weather. The word barometer is derived from the Ancient Greek βάρος (báros, 'weight') and μέτρον (métron, 'measure'). Although Evangelista Torricelli is universally credited with inventing the barometer in 1643, historical documentation suggests Gasparo Berti, an Italian mathematician and astronomer, unintentionally built a water barometer sometime between 1640 and 1643. French scientist and philosopher René Descartes described the design of an experiment to determine atmospheric pressure as early as 1631, but there is no evidence that he built a working barometer at that time.
On July 27, 1630, Giovanni Battista Baliani wrote a letter to Galileo Galilei explaining an experiment he had made in which a siphon, led over a hill about twenty-one meters high, failed to work. Galileo responded with an explanation of the phenomenon: he proposed that it was the power of a vacuum that held the water up, and that at a certain height the amount of water became too much and the force could not hold any more, like a cord that can support only so much weight. This was a restatement of the theory of horror vacui, which dates to Aristotle and which Galileo restated as resistenza del vacuo. Galileo's ideas reached Rome in December 1638 in his Discorsi. Raffaele Magiotti and Gasparo Berti were excited by these ideas and decided to seek a better way to attempt to produce a vacuum other than with a siphon. Magiotti devised such an experiment, and sometime between 1639 and 1641, Berti carried it out. Four accounts of Berti's experiment exist, but a simple model of his experiment consisted of filling a long tube, plugged at both ends, with water, then standing the tube in a basin already full of water.
The bottom end of the tube was then opened, and water inside it poured out into the basin. However, only part of the water in the tube flowed out: the level of the water inside the tube stayed at an exact height, which happened to be 10.3 m, the same height Baliani and Galileo had observed to limit the siphon. What was most important about this experiment was that the lowering water had left a space above it in the tube which had no intermediate contact with air to fill it up; this seemed to suggest the possibility of a vacuum existing in the space above the water. Torricelli, a friend and student of Galileo, interpreted the results of the experiments in a novel way: he proposed that the weight of the atmosphere, not an attracting force of the vacuum, held the water in the tube. In a letter to Michelangelo Ricci in 1644 concerning the experiments, he wrote: Many have said that a vacuum does not exist, others that it does exist in spite of the repugnance of nature and with difficulty. I argued thus: If there can be found a manifest cause from which the resistance can be derived, which is felt if we try to make a vacuum, it seems to me foolish to try to attribute to vacuum those operations which follow evidently from some other cause.
It was traditionally thought that air did not have weight: that is, that the kilometers of air above the surface did not exert any weight on the bodies below. Even Galileo had accepted the weightlessness of air as a simple truth. Torricelli questioned that assumption and instead proposed that air had weight and that it was the latter, not the attracting force of the vacuum, which held up the column of water; he thought that the level the water stayed at was reflective of the force of the air's weight pushing on it. In other words, he viewed the barometer as a balance, an instrument for measurement, and because he was the first to view it this way, he is traditionally considered the inventor of the barometer. Because of rumors circulating in Torricelli's gossipy Italian neighborhood, which included that he was engaged in some form of sorcery or witchcraft, Torricelli realized he had to keep his experiment secret to avoid the risk of being arrested. He needed to use a liquid heavier than water, and from his previous association with and suggestions by Galileo, he deduced that by using mercury, a shorter tube could be used.
With mercury, which is about 14 times denser than water, a tube only about 80 cm long was now needed, not 10.5 m. In 1646, Blaise Pascal, along with Pierre Petit, repeated and perfected Torricelli's experiment after hearing about it from Marin Mersenne, who himself had been shown the experiment by Torricelli toward the end of 1644. Pascal further devised an experiment to test the Aristotelian proposition that it was vapors from the liquid that filled the space in a barometer; his experiment compared water with wine, and since the latter was considered more "spirituous", the Aristotelians expected the wine to stand lower. It did not.
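Torricelli's balance picture reduces to a one-line calculation: the column height at which the liquid's weight balances atmospheric pressure, h = P/(ρg). The densities and standard pressure below are textbook values assumed for illustration, not figures from the text:

```python
g = 9.81          # m/s^2, standard gravity
P_atm = 101325.0  # Pa, standard atmospheric pressure

def column_height(density_kg_m3: float) -> float:
    """Height of a liquid column balancing atmospheric pressure: h = P/(rho*g)."""
    return P_atm / (density_kg_m3 * g)

print(f"water:   {column_height(1000.0):.2f} m")   # ~10.33 m, as Berti observed
print(f"mercury: {column_height(13534.0):.3f} m")  # ~0.76 m, Torricelli's short tube
```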
The Space Shuttle was a reusable low Earth orbital spacecraft system operated by the U.S. National Aeronautics and Space Administration (NASA) as part of the Space Shuttle program. Its official program name was Space Transportation System (STS), taken from a 1969 plan for a system of reusable spacecraft of which it was the only item funded for development. The first of four orbital test flights occurred in 1981, leading to operational flights beginning in 1982. In addition to the prototype whose completion was cancelled, five complete Shuttle systems were built and used on a total of 135 missions from 1981 to 2011, launched from the Kennedy Space Center in Florida. Operational missions launched numerous satellites, interplanetary probes and the Hubble Space Telescope; the Shuttle fleet's total mission time was 1,322 days, 19 hours, 21 minutes and 23 seconds. Shuttle components included the Orbiter Vehicle (OV) with three clustered Rocketdyne RS-25 main engines, a pair of recoverable solid rocket boosters (SRBs), and the expendable external tank (ET) containing liquid hydrogen and liquid oxygen.
The Space Shuttle was launched vertically, like a conventional rocket, with the two SRBs operating in parallel with the OV's three main engines, which were fueled from the ET. The SRBs were jettisoned before the vehicle reached orbit, and the ET was jettisoned just before orbit insertion, which used the orbiter's two Orbital Maneuvering System (OMS) engines. At the conclusion of the mission, the orbiter fired its OMS to deorbit and re-enter the atmosphere; the orbiter then glided as a spaceplane to a runway landing at the Shuttle Landing Facility at Kennedy Space Center, Florida, or at Rogers Dry Lake at Edwards Air Force Base, California. After landing at Edwards, the orbiter was flown back to the KSC on the Shuttle Carrier Aircraft, a specially modified Boeing 747. The first orbiter, Enterprise, was built in 1976 for use in Approach and Landing Tests and had no orbital capability. Four fully operational orbiters were initially built: Columbia, Challenger, Discovery and Atlantis. Of these, two were lost in mission accidents: Challenger in 1986 and Columbia in 2003, with a total of fourteen astronauts killed.
A fifth operational orbiter, Endeavour, was built in 1991 to replace Challenger. The Space Shuttle was retired from service upon the conclusion of Atlantis's final flight on July 21, 2011; the U.S. has since relied on the Russian Soyuz spacecraft to transport astronauts to the International Space Station, pending the Commercial Crew Development and Space Launch System programs, whose first flights were then scheduled for 2019 and 2020. The Space Shuttle was a reusable human spaceflight vehicle capable of reaching low Earth orbit, operated by the U.S. National Aeronautics and Space Administration from 1981 to 2011. It resulted from shuttle design studies conducted by NASA and the U.S. Air Force in the 1960s and was first proposed for development as part of an ambitious second-generation Space Transportation System of space vehicles to follow the Apollo program, in a September 1969 report of a Space Task Group headed by Vice President Spiro Agnew to President Richard Nixon. Nixon's post-Apollo NASA budgeting withdrew support for all system components except the Shuttle, to which NASA applied the STS name.
The vehicle consisted of a spaceplane for orbit and re-entry, fueled by an expendable external tank containing liquid hydrogen and liquid oxygen, with two reusable strap-on solid rocket boosters. The first of four orbital test flights occurred in 1981, leading to operational flights beginning in 1982, all launched from the Kennedy Space Center, Florida. The system was retired from service in 2011 after 135 missions, with Atlantis making the final launch of the three-decade Shuttle program on July 8, 2011. The program ended after Atlantis landed at the Kennedy Space Center on July 21, 2011. Major missions included launching numerous satellites and interplanetary probes, conducting space science experiments, and servicing and construction of space stations. The first orbiter vehicle, named Enterprise, was used in the initial Approach and Landing Tests phase, but installation of the engines, heat shielding and other equipment necessary for orbital flight was cancelled. A total of five operational orbiters were built; of these, two were destroyed in accidents.
It was used for orbital space missions by NASA, the U.S. Department of Defense, the European Space Agency and Germany. The United States funded Shuttle development and operations except for the Spacelab modules used on the D1 and D2 missions, which were sponsored by Germany; SL-J was funded by Japan. At launch, the Space Shuttle consisted of the "stack", including the dark orange external tank; some payloads were launched into higher orbits with either of two different upper stages developed for the STS. The Space Shuttle was stacked in the Vehicle Assembly Building, and the stack was mounted on a mobile launch platform held down by four frangible nuts on each SRB, which were detonated at launch. The Shuttle stack launched vertically like a conventional rocket, lifting off under the power of its two SRBs and three main engines, which were fueled by liquid hydrogen and liquid oxygen from the ET. The Space Shuttle had a two-stage ascent: the SRBs provided additional thrust during first-stage flight, and about two minutes after liftoff the SRBs were released and parachuted into the ocean, to be retrieved by ships for refurbishment and reuse.