An endotherm is an organism that maintains its body at a metabolically favorable temperature using heat released by its internal bodily functions, instead of relying purely on ambient heat. Such internally generated heat is mainly an incidental product of the animal's routine metabolism, but under conditions of excessive cold or low activity an endotherm may employ special mechanisms adapted to heat production: special-function muscular exertion such as shivering, and uncoupled oxidative metabolism such as that within brown adipose tissue. Birds and mammals are the only extant groups of animals that are universally endothermic, although certain lamnid sharks and billfishes are also endothermic. In common parlance, endotherms are characterized as "warm-blooded"; the opposite of endothermy is ectothermy, although in general there is no absolute or clear separation between endotherms and ectotherms. Many endotherms have more mitochondria per cell than ectotherms, which enables them to generate heat by increasing the rate at which they metabolize sugars.
Accordingly, to sustain their higher metabolism, endothermic animals require several times as much food as ectothermic animals do, and a more sustained supply of metabolic fuel. In many endothermic animals, a controlled temporary state of hypothermia conserves energy by permitting the body temperature to drop nearly to ambient levels; such states may be brief, regular circadian cycles called torpor, or they may occur in much longer seasonal cycles called hibernation. The body temperatures of many small birds and small mammals fall during daily inactivity, nightly in diurnal animals or during the day in nocturnal animals, reducing the energy cost of maintaining body temperature. Less drastic intermittent reductions in body temperature occur in other, larger endotherms. Smaller temperature variations also occur, either endogenously or in response to external circumstances or vigorous exertion, as either an increase or a drop. The resting human body generates about two-thirds of its heat through metabolism in internal organs in the thorax and abdomen, as well as in the brain.
The brain generates about 16% of the total heat produced by the body. Heat loss is a major threat to smaller creatures, as they have a larger ratio of surface area to volume. Small warm-blooded animals have insulation in the form of fur or feathers. Aquatic warm-blooded animals such as seals have deep layers of blubber under the skin in addition to any pelage they might have; penguins have both blubber and feathers, the feathers serving for insulation as well as streamlining. Endotherms that live in cold circumstances, or in conditions predisposing to heat loss such as polar waters, tend to have specialised structures of blood vessels in their extremities that act as heat exchangers: the veins lie adjacent to arteries full of warm blood, so some of the arterial heat is recycled back into the trunk. Wading birds have well-developed heat-exchange mechanisms in their legs; those in the legs of emperor penguins are part of the adaptations that enable them to spend months on Antarctic winter ice. In response to cold, many warm-blooded animals reduce blood flow to the skin by vasoconstriction, reducing heat loss.
As a result, they blanch. In equatorial climates and during temperate summers, overheating is as great a threat as cold. In hot conditions, many warm-blooded animals increase heat loss by panting, which cools the animal by increasing water evaporation in the breath, and/or by flushing, increasing blood flow to the skin so that heat radiates into the environment. Hairless and short-haired mammals, including humans, also sweat, since the evaporation of the water in sweat removes heat. Elephants keep cool by using their huge ears much like the radiators in automobiles: the ears are thin, the blood vessels are close to the skin, and flapping the ears to increase the airflow over them cools the blood, which in turn reduces the core body temperature as the blood moves through the rest of the circulatory system. The major advantage of endothermy over ectothermy is decreased vulnerability to fluctuations in external temperature: regardless of location, endothermy maintains a constant core temperature for optimum enzyme activity.
Endotherms control body temperature by internal homeostatic mechanisms. In mammals, two separate homeostatic mechanisms are involved in thermoregulation: one raises body temperature while the other lowers it, and the presence of two separate mechanisms provides a high degree of control. This is important because the core temperature of mammals can then be held as close as possible to the optimum temperature for enzyme activity; the overall rate of an animal's metabolism increases by a factor of about two for every 10 °C rise in temperature, limited by the need to avoid hyperthermia. Endothermy does not provide greater speed in movement than ectothermy: ectothermic animals can move as fast as warm-blooded animals of the same size and build when the ectotherm is near or at its optimum temperature, but they cannot maintain high metabolic activity for as long as endotherms can. Endothermic/homeothermic animals can be optimally active at more times during the diurnal cycle in places of sharp temperature variation between day and night and
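The roughly twofold increase in metabolic rate per 10 °C rise described above is known as a Q10 temperature coefficient of about 2. A minimal numerical sketch, in which the reference temperature, reference rate, and function name are illustrative assumptions rather than values from the text:

```python
# Hypothetical sketch of the ~2x metabolic rate change per 10 °C
# (a Q10 coefficient of about 2), as described in the text.

def metabolic_rate(rate_at_ref: float, temp_c: float,
                   ref_temp_c: float = 37.0, q10: float = 2.0) -> float:
    """Scale a reference metabolic rate to another body temperature.

    Q10 rule: the rate doubles (q10 = 2) for every 10 degC rise.
    """
    return rate_at_ref * q10 ** ((temp_c - ref_temp_c) / 10.0)

# A 10 degC drop (e.g. entering torpor) roughly halves the rate;
# a 10 degC rise roughly doubles it.
print(metabolic_rate(100.0, 27.0))  # -> 50.0
print(metabolic_rate(100.0, 47.0))  # -> 200.0
```

This also illustrates why torpor saves energy: letting body temperature fall toward ambient levels cuts the metabolic fuel requirement roughly in half for each 10 °C of cooling.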
Sublimation (phase transition)
Sublimation is the transition of a substance directly from the solid to the gas phase, without passing through the intermediate liquid phase. It is an endothermic process that occurs at temperatures and pressures below a substance's triple point in its phase diagram, which corresponds to the lowest pressure at which the substance can exist as a liquid. The reverse process of sublimation is deposition, or desublimation, in which a substance passes directly from the gas to the solid phase. Sublimation has also been used as a generic term to describe a solid-to-gas transition followed by a gas-to-solid transition. While a transition from liquid to gas is described as evaporation if it occurs below the boiling point of the liquid, and as boiling if it occurs at the boiling point, there is no such distinction within the solid-to-gas transition, which is always described as sublimation. At normal pressures, most chemical compounds and elements possess three different states at different temperatures; in these cases, the transition from the solid to the gaseous state requires an intermediate liquid state.
The pressure referred to here is the partial pressure of the substance, not the total pressure of the entire system, so all solids that possess an appreciable vapour pressure at a given temperature can sublime in air. For some substances, such as carbon and arsenic, sublimation is much easier than evaporation from the melt, because the pressure of their triple point is high and it is difficult to obtain them as liquids. The term sublimation refers to a physical change of state and is not used to describe the transformation of a solid to a gas in a chemical reaction. For example, the dissociation on heating of solid ammonium chloride into hydrogen chloride and ammonia is not sublimation but a chemical reaction, and the combustion of candles, containing paraffin wax, to carbon dioxide and water vapor is not sublimation but a chemical reaction with oxygen. Sublimation is caused by the absorption of heat, which provides enough energy for some molecules to overcome the attractive forces of their neighbors and escape into the vapor phase.
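The triple-point rule stated above can be sketched as a toy decision function: at a fixed pressure below the substance's triple-point pressure, heating the solid takes it directly to the gas phase; above it, melting occurs first. The CO2 triple-point figure below is an approximate literature value used only for illustration, and the function name is hypothetical:

```python
# Toy sketch of the triple-point rule for solid CO2 (dry ice).
# Approximate literature value for the CO2 triple-point pressure.
CO2_TRIPLE_P_ATM = 5.11

def heats_via(pressure_atm: float) -> str:
    """Which transition a heated CO2 solid undergoes at a given pressure."""
    if pressure_atm < CO2_TRIPLE_P_ATM:
        return "sublimation (solid -> gas)"
    return "melting (solid -> liquid), then boiling"

print(heats_via(1.0))   # dry ice at atmospheric pressure sublimes
print(heats_via(10.0))  # pressurized CO2 melts into a liquid first
```

This is why dry ice "smokes" away at room conditions without ever pooling as a liquid: atmospheric pressure is well below its triple-point pressure.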
Since the process requires additional energy, it is an endothermic change. The enthalpy of sublimation can be calculated by adding the enthalpy of fusion and the enthalpy of vaporization. Solid carbon dioxide sublimes everywhere along the phase-boundary line below the triple point (e.g. at −78.5 °C at atmospheric pressure), whereas its melting into liquid CO2 can occur only along the line at pressures and temperatures above the triple point. Snow and ice sublime, though slowly, at temperatures below the freezing/melting line at 0 °C for most pressures. In freeze-drying, the material to be dehydrated is frozen and its water is allowed to sublime under reduced pressure or vacuum; the loss of snow from a snowfield during a cold spell is caused by sunshine acting directly on the upper layers of the snow. Ablation is a process that includes erosive wear of glacier ice. Naphthalene, an organic compound found in pesticides such as mothballs, sublimes because it is made of non-polar molecules that are held together only by van der Waals intermolecular forces.
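The additivity relation mentioned above (enthalpy of sublimation = enthalpy of fusion + enthalpy of vaporization, all at the same temperature) can be checked with rough literature figures for water near 0 °C; the numbers below are approximate values assumed for illustration, not taken from the text:

```python
# Sketch of dH_sub = dH_fus + dH_vap, using rough literature
# values for water near 0 degC (illustrative assumptions).

dH_fusion_kJ_mol = 6.01        # ice -> liquid water
dH_vaporization_kJ_mol = 45.1  # liquid water -> vapor, near 0 degC

dH_sublimation_kJ_mol = dH_fusion_kJ_mol + dH_vaporization_kJ_mol
print(f"Estimated enthalpy of sublimation: {dH_sublimation_kJ_mol:.1f} kJ/mol")
```

The sum comes out near 51 kJ/mol, consistent with measured sublimation enthalpies for ice, which is why this shortcut is useful when direct sublimation data are unavailable.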
Naphthalene is a solid that sublimes readily at standard atmospheric pressure, with a sublimation point of around 80 °C (176 °F). Even at relatively low temperatures its vapour pressure is high enough, 1 mmHg at 53 °C, for solid naphthalene to evaporate into gas. On cool surfaces, the naphthalene vapours solidify to form needle-like crystals. Iodine produces fumes on gentle heating, although it is possible to obtain liquid iodine at atmospheric pressure by controlling the temperature at just above the melting point of iodine. In forensic science, iodine vapor can reveal latent fingerprints on paper. Arsenic can sublime at high temperatures. Cadmium and zinc are not suitable materials for use in vacuum because they sublime much more than other common materials. Sublimation is a technique used by chemists to purify compounds: a solid is placed in a sublimation apparatus and heated under vacuum. Under this reduced pressure, the solid volatilizes and condenses as a purified compound on a cooled surface, leaving a non-volatile residue of impurities behind.
Once heating ceases and the vacuum is removed, the purified compound may be collected from the cooling surface. For higher purification efficiencies, a temperature gradient is applied, which allows for the separation of different fractions. Typical setups use an evacuated glass tube, heated in a controlled manner; the material flows from the hot end, where the initial material is placed, to the cold end, which is connected to a pump stand. By controlling temperatures along the length of the tube, the operator can control the zones of re-condensation: highly volatile compounds are pumped out of the system, moderately volatile compounds re-condense along the tube according to their different volatilities, and non-volatile compounds remain in the hot end. Vacuum sublimation of this type is the method of choice for purification of organic compounds for use in the organic electronics industry, where very high purities are needed to satisfy the standards for consumer electronics and other applications. In ancient alchemy, a protoscience that contributed to the development of modern chemistry and medicine, alchemists developed a structure of basic laboratory techniques, theory, and experimental methods.
Sublimation was used to refer to the process in which a
Thermal decomposition, or thermolysis, is a chemical decomposition caused by heat. The decomposition temperature of a substance is the temperature at which the substance chemically decomposes. The reaction is usually endothermic, as heat is required to break the chemical bonds in the compound undergoing decomposition; if decomposition is sufficiently exothermic, a positive feedback loop is created, producing thermal runaway and possibly an explosion. Calcium carbonate decomposes into calcium oxide and carbon dioxide when heated. The chemical reaction is as follows:

CaCO3 → CaO + CO2

The reaction is used to make quicklime, an industrially important product. Some oxides of weakly electropositive metals decompose when heated to a high enough temperature; a classical example is the decomposition of mercuric oxide to give mercury metal and oxygen, a reaction used by Joseph Priestley to prepare samples of gaseous oxygen for the first time. When water is heated to well over 2000 °C, a small percentage of it decomposes into OH, monatomic oxygen, monatomic hydrogen, O2, and H2.
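The quicklime reaction above is a convenient stoichiometry exercise: one mole of CaCO3 yields one mole each of CaO and CO2. A minimal sketch, with molar masses hard-coded as approximate values and the helper function name being a hypothetical label:

```python
# Stoichiometry sketch for CaCO3 -> CaO + CO2 (1:1:1 mole ratio).
# Approximate molar masses in g/mol, hard-coded for illustration.
M_CaCO3 = 100.09
M_CaO = 56.08
M_CO2 = 44.01

def quicklime_yield(mass_caco3_g: float) -> tuple[float, float]:
    """Return (mass of CaO, mass of CO2) from fully decomposing CaCO3."""
    mol = mass_caco3_g / M_CaCO3     # moles of CaCO3
    return mol * M_CaO, mol * M_CO2  # 1:1:1 stoichiometry

cao, co2 = quicklime_yield(1000.0)   # decompose 1 kg of limestone
print(f"CaO: {cao:.0f} g, CO2: {co2:.0f} g")
```

Note that the product masses sum back to the starting mass, a quick mass-conservation check on the calculation.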
The compound with the highest known decomposition temperature is carbon monoxide, at ≈3870 °C. Ammonium dichromate on heating yields nitrogen, water, and chromium(III) oxide. Ammonium nitrate on strong heating yields dinitrogen oxide and water. Ammonium nitrite on heating yields nitrogen and water. Barium azide on heating yields barium and nitrogen gas. Sodium nitrate on heating yields sodium nitrite and oxygen gas. Organic compounds such as quaternary ammonium hydroxides on heating undergo Hofmann elimination and yield tertiary amines and alkenes. Compounds of metals near the bottom of the reactivity series decompose readily on heating, because stronger bonds form between atoms of metals towards the top of the reactivity series, and strong bonds break less easily. For example, copper is near the bottom of the reactivity series, and copper sulfate begins to decompose at about 200 °C, with decomposition increasing at higher temperatures up to about 560 °C. In contrast, potassium is near the top of the reactivity series, and potassium sulfate does not decompose at its melting point of about 1069 °C, nor even at its boiling point.
See also: Ellingham diagram; thermochemical cycle; thermal depolymerization; chemical thermodynamics; pyrolysis (thermolysis of organic material); gas generator
Gibbs free energy
In thermodynamics, the Gibbs free energy is a thermodynamic potential that can be used to calculate the maximum reversible work that may be performed by a thermodynamic system at constant temperature and pressure. The Gibbs free energy is the maximum amount of non-expansion work that can be extracted from a thermodynamically closed system; when a system transforms reversibly from an initial state to a final state, the decrease in Gibbs free energy equals the work done by the system on its surroundings, minus the work of the pressure forces. The Gibbs energy is also the thermodynamic potential that is minimized when a system reaches chemical equilibrium at constant pressure and temperature: its derivative with respect to the reaction coordinate of the system vanishes at the equilibrium point. As such, a reduction in G is a necessary condition for the spontaneity of processes at constant pressure and temperature. The Gibbs free energy, originally called available energy, was developed in the 1870s by the American scientist Josiah Willard Gibbs.
In 1873, Gibbs described this "available energy" as the greatest amount of mechanical work which can be obtained from a given quantity of a certain substance in a given initial state, without increasing its total volume or allowing heat to pass to or from external bodies, except such as at the close of the processes are left in their initial condition. The initial state of the body, according to Gibbs, is supposed to be such that "the body can be made to pass from it to states of dissipated energy by reversible processes". In his 1876 magnum opus On the Equilibrium of Heterogeneous Substances, a graphical analysis of multi-phase chemical systems, he set out his thoughts on chemical free energy in full. According to the second law of thermodynamics, for systems reacting at a fixed temperature and pressure there is a general natural tendency to achieve a minimum of the Gibbs free energy. A quantitative measure of the favorability of a given reaction at constant temperature and pressure is the change ΔG in Gibbs free energy caused by the reaction.
As a necessary condition for the reaction to occur at constant temperature and pressure, ΔG must be smaller than the non-pressure-volume (non-PV) work, which is often equal to zero; for a reversible process, ΔG equals the maximum amount of non-PV work that can be performed as a result of the chemical reaction. If the analysis indicates a positive ΔG for the reaction, then energy, in the form of electrical or other non-PV work, would have to be added to the reacting system for ΔG to be smaller than the non-PV work and make it possible for the reaction to occur. We can think of ΔG as the amount of "free" or "useful" energy available to do work. The equation can also be seen from the perspective of the system taken together with its surroundings: first, assume that the given reaction at constant temperature and pressure is the only one occurring. Then the entropy released or absorbed by the system equals the entropy that the environment must absorb or release, respectively, and the reaction will only be allowed if the total entropy change of the universe is positive.
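The spontaneity criterion ΔG < 0 discussed above can be sketched numerically using the standard relation ΔG = ΔH − TΔS (this formula is not stated in the text but is the conventional definition at constant temperature and pressure); the sample numbers below are illustrative assumptions, roughly in the range of an endothermic decomposition:

```python
# Minimal sketch of the spontaneity criterion dG < 0 using the
# standard relation dG = dH - T*dS. Sample values are illustrative.

def delta_g(dH_kJ: float, T_K: float, dS_kJ_per_K: float) -> float:
    """Gibbs free energy change at constant temperature and pressure."""
    return dH_kJ - T_K * dS_kJ_per_K

# An endothermic reaction (dH > 0) can still be exergonic (dG < 0)
# at high temperature if its entropy gain dS is large enough.
print(delta_g(dH_kJ=178.0, T_K=1200.0, dS_kJ_per_K=0.161) < 0)  # spontaneous
print(delta_g(dH_kJ=178.0, T_K=1000.0, dS_kJ_per_K=0.161) < 0)  # not yet
```

The crossover temperature sits where ΔH = TΔS, illustrating why some decompositions proceed only above a threshold temperature.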
This is reflected in a negative ΔG, and the reaction is called exergonic. If we couple reactions, then an otherwise endergonic chemical reaction can be made to happen; the input of heat into an inherently endergonic reaction, such as the elimination of cyclohexanol to cyclohexene, can be seen as coupling an unfavourable reaction to a favourable one such that the total entropy change of the universe is greater than or equal to zero, making the total Gibbs free energy difference of the coupled reactions negative. In traditional use, the term "free" was included in "Gibbs free energy" to mean "available in the form of useful work"; the characterization becomes more precise if we add the qualification that it is the energy available for non-volume work. However, an increasing number of books and journal articles do not include the attachment "free", referring to G simply as "Gibbs energy"; this is the result of a 1988 IUPAC meeting to set unified terminologies for the international scientific community, in which the adjective "free" was banished.
This standard, however, has not yet been universally adopted. The quantity called "free energy" is a more advanced and accurate replacement for the outdated term affinity, used by chemists in the earlier years of physical chemistry to describe the force that caused chemical reactions. In 1873, Willard Gibbs published A Method of Geometrical Representation of the Thermodynamic Properties of Substances by Means of Surfaces, in which he sketched the principles of his new equation, able to predict or estimate the tendencies of various natural processes to ensue when bodies or systems are brought into contact. By studying the interactions of homogeneous substances in contact, i.e. bodies composed of part solid, part liquid, and part vapor, and by using a three-dimensional volume-entropy-internal energy graph, Gibbs was able to determine three states of equilibrium, i.e. "necessarily stable", "neutral", and "unstable", and whether or not changes woul
Castle Bravo was the first in a series of high-yield thermonuclear weapon design tests conducted by the United States at Bikini Atoll, Marshall Islands, as part of Operation Castle. Detonated on March 1, 1954, the device was the most powerful nuclear device ever detonated by the United States and its first lithium deuteride fueled thermonuclear weapon. Castle Bravo's yield was 15 megatons of TNT, 2.5 times the predicted 6.0 megatons, due to unforeseen additional reactions involving 7Li, which led to unexpected radioactive contamination of areas east of Bikini Atoll. Fallout from the detonation fell on residents of Rongelap and Utirik atolls and spread around the world; the inhabitants of the islands were not evacuated until three days later and suffered radiation sickness. Twenty-three crew members of the Japanese fishing vessel Daigo Fukuryū Maru were also contaminated by the fallout and experienced acute radiation syndrome. The blast incited international reaction over atmospheric thermonuclear testing.
The Bravo Crater is located at 11°41′50″N 165°16′19″E, and the remains of the Castle Bravo causeway are at 11°42′6″N 165°17′7″E. The Castle Bravo device was housed in a cylinder that weighed 23,500 pounds and measured 179.5 inches in length and 53.9 inches in diameter. The primary was a COBRA deuterium-tritium gas-boosted atomic bomb made by Los Alamos Scientific Laboratory, a compact MK 7 device. This boosted fission device was tested in the Upshot-Knothole Climax event and yielded 61 kilotonnes of TNT; it was considered successful enough that the planned operation series Domino, designed to explore the same question about a suitable primary for thermonuclear bombs, could be cancelled. The implosion system was quite lightweight at 410 kg, because it eliminated the aluminum pusher shell around the tamper and used the more compact ring lenses, a design feature shared with the Mark 5, 12, 13, and 18 designs. The explosive material of the inner charges in the MK 7 was changed to the more powerful Cyclotol 75/25, instead of the Composition B used in most stockpiled bombs at the time, as Cyclotol 75/25 was denser than Composition B and thus could generate the same explosive force in a smaller volume.
The composite uranium-plutonium COBRA core was levitated in a type-D pit. COBRA was Los Alamos' most recent product of design work on the "new principles" of the hollow core. A copper pit liner encasing the weapon-grade plutonium inner capsule prevented diffusion of the DT gas into the plutonium, a technique first tested in Greenhouse Item. The assembled module weighed 830 kg. It was located at one end of the device, which, as seen in the declassified film, shows a small cone projecting from the ballistic case; this cone is part of the paraboloid used to focus the radiation emanating from the primary onto the secondary. The device was codenamed SHRIMP and had the same basic configuration as the Ivy Mike wet device, except with a different type of fusion fuel: SHRIMP used lithium deuteride, which is solid at room temperature. Castle Bravo was the first United States test of a practical, deliverable fusion bomb, though the TX-21 as proof-tested in the Bravo event was not weaponized. The successful test rendered obsolete the cryogenic design used by Ivy Mike and its weaponized derivative, the JUGHEAD, which had been slated to be tested as the initial Castle Yankee.
It used a 9.5 cm thick ballistic case of 7075 aluminum. Aluminum was used to drastically reduce the bomb's weight while still providing sufficient radiation-confinement time to raise the yield, a departure from the heavy stainless steel casing employed by contemporary weapon projects. The SHRIMP was, at least in theory and in many critical aspects, identical in geometry to the RUNT and RUNT II devices proof-fired in Castle Romeo and Castle Yankee respectively; on paper it was a scaled-down version of these devices, and its origins can be traced back to the spring and summer of 1953. The United States Air Force had indicated the importance of lighter thermonuclear weapons for delivery by the B-47 Stratojet and B-58 Hustler. Los Alamos National Laboratory responded with a follow-up enriched version of the RUNT, scaled down to a 3/4-scale radiation-implosion system called the SHRIMP; the proposed weight reduction would give the Air Force a much more versatile deliverable gravity bomb. The final version tested in Castle used enriched lithium as its fusion fuel.
Natural lithium is a mixture of lithium-6 and lithium-7 isotopes. The enriched lithium used in Bravo was nominally 40% lithium-6; the fuel slugs varied in enrichment from 37 to 40% in 6Li, and the slugs with lower enrichment were positioned at the end of the fusion-fuel chamber, away from the primary. The lower levels of lithium enrichment in the fuel slugs, compared with the ALARM CLOCK and many later hydrogen weapons, were due to shortages of enriched lithium at the time, as the first of the Alloy Development Plants had only started production by the fall of 1953. The volume of LiD fuel used was about 60% of the volume of the fusion-fuel filling used in the wet SAUSAGE and the dry RUNT I and II devices, or about 500 liters, corresponding to about 400 kg of lithium deuteride. The mixture cost about 4.54 USD/g at th
Tritium is a radioactive isotope of hydrogen. The nucleus of tritium contains one proton and two neutrons, whereas the nucleus of protium, the most common hydrogen isotope, contains one proton and no neutrons. Naturally occurring tritium is rare on Earth, where trace amounts are formed by the interaction of the atmosphere with cosmic rays; it can also be produced by irradiating lithium metal or lithium-bearing ceramic pebbles in a nuclear reactor. Tritium is used as a radioactive tracer, in radioluminescent light sources for watches and instruments, and, along with deuterium, as a fuel for nuclear fusion reactions with applications in energy generation and weapons. The name of this isotope is derived from the Greek τρίτος, meaning 'third'. While tritium has several different experimentally determined values of its half-life, the National Institute of Standards and Technology lists 4,500 ± 8 days. It decays into helium-3 by beta decay, as in this nuclear equation:

3H → 3He + e− + ν̄e

and it releases 18.6 keV of energy in the process. The electron's kinetic energy varies, with an average of 5.7 keV, while the remaining energy is carried off by the nearly undetectable electron antineutrino.
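The NIST half-life quoted above (4,500 days, about 12.3 years) fixes how quickly a tritium sample decays away; a minimal sketch of the surviving fraction after a given time, with the function name being an illustrative label:

```python
# Sketch of radioactive decay using the NIST tritium half-life
# quoted in the text: N(t)/N(0) = 0.5 ** (t / half_life).

HALF_LIFE_DAYS = 4500.0

def surviving_fraction(days: float) -> float:
    """Fraction of an initial tritium sample remaining after `days`."""
    return 0.5 ** (days / HALF_LIFE_DAYS)

print(surviving_fraction(4500.0))  # -> 0.5  (one half-life)
print(surviving_fraction(9000.0))  # -> 0.25 (two half-lives)
```

This exponential loss is why, for example, tritium-based radioluminescent light sources dim noticeably over a decade or two.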
Beta particles from tritium can penetrate only about 6.0 mm of air, and they are incapable of passing through the dead outermost layer of human skin. The unusually low energy released in tritium beta decay makes the decay suitable for absolute neutrino mass measurements in the laboratory, while the low energy of tritium's radiation makes it difficult to detect tritium-labeled compounds except by liquid scintillation counting. Tritium is produced in nuclear reactors by neutron activation of lithium-6; this is possible with neutrons of any energy and is an exothermic reaction yielding 4.8 MeV. In comparison, the fusion of deuterium with tritium releases about 17.6 MeV of energy. For applications in proposed fusion energy reactors, such as ITER, pebbles consisting of lithium-bearing ceramics, including Li2TiO3 and Li4SiO4, are being developed for tritium breeding within a helium-cooled pebble bed known as a breeder blanket. High-energy neutrons can also produce tritium from lithium-7 in an endothermic reaction, consuming 2.466 MeV.
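The three reaction energies quoted above can be combined into a simple energy-bookkeeping sketch for a D-T fusion cycle in which the tritium is bred from lithium; the arithmetic below uses only the figures from the text:

```python
# Energy bookkeeping for the reactions quoted in the text (MeV).
Q_Li6_n = +4.8    # 6Li + n -> 4He + T   (exothermic)
Q_Li7_n = -2.466  # 7Li + n -> 4He + T + n'  (endothermic)
Q_DT = +17.6      # D + T -> 4He + n

# Net energy per fusion when the tritium is bred from 6Li:
print(f"6Li-bred D-T cycle: {Q_Li6_n + Q_DT:.1f} MeV")
# Breeding from 7Li costs energy but regenerates a neutron:
print(f"7Li-bred D-T cycle: {Q_Li7_n + Q_DT:.3f} MeV")
```

The comparison shows why 6Li is the preferred breeding target: it adds energy to the cycle, whereas 7Li breeding consumes some of the fusion output (though it returns a neutron that can breed again).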
High-energy neutrons irradiating boron-10 will also occasionally produce tritium, though a more common result of boron-10 neutron capture is 7Li and a single alpha particle. Tritium is also produced in heavy water-moderated reactors whenever a deuterium nucleus captures a neutron; this reaction has a quite small absorption cross section, which is what makes heavy water a good neutron moderator, so relatively little tritium is produced. Even so, cleaning tritium from the moderator may be desirable after several years to reduce the risk of its escaping to the environment. Ontario Power Generation's Tritium Removal Facility processes up to 2,500 tonnes of heavy water a year, separating out about 2.5 kg of tritium and making it available for other uses. Deuterium's absorption cross section for thermal neutrons is about 0.52 millibarns, whereas that of oxygen-16 is about 0.19 millibarns and that of oxygen-17 is about 240 millibarns. Tritium is also an uncommon product of the nuclear fission of uranium-235, plutonium-239, and uranium-233, with a production of about one atom per 10,000 fissions.
The release or recovery of tritium needs to be considered in the operation of nuclear reactors, in the reprocessing of nuclear fuels, and in the storage of spent nuclear fuel. The production of tritium is not a goal but a side-effect, and it is discharged to the atmosphere in small quantities by some nuclear power plants. In June 2016 the Tritiated Water Task Force released a report on the status of tritium in tritiated water at the Fukushima Daiichi nuclear plant, as part of considering options for final disposal of this water. The report identified that the March 2016 holding of tritium on-site was 760 TBq in a total of 860,000 m³ of stored water, and noted the decreasing concentration of tritium in the water extracted from the buildings etc. for storage, a factor-of-ten decrease over the five years considered, from 3.3 MBq/L to 0.3 MBq/L. According to a report by an expert panel considering the best approach to dealing with this issue, "Tritium could be separated theoretically, but there is no practical separation technology on an industrial scale.
Accordingly, a controlled environmental release is said to be the best way to treat low-tritium-concentration water." Tritium's decay product, helium-3, has a very large cross section for reacting with thermal neutrons, expelling a proton, so it is converted back to tritium in nuclear reactors. Tritium also occurs naturally due to cosmic rays interacting with atmospheric gases. In the most important reaction for natural production, a fast neutron interacts with atmospheric nitrogen: 14N + n → 12C + 3H. Worldwide, the production of tritium from natural sources is 148 petabecquerels per year; the global equilibrium inventory of tritium created by natural sources remains nearly constant at 2,590 petabecquerels, because losses are proportional to the inventory. According to a 1996 report from the Institute for Energy and Environmental Research on the US Department of Energy, only 225 kg of tritium had been produced in the United States from 1955 to 1996. Since it continually de
In statistical mechanics, entropy is an extensive property of a thermodynamic system. It is related to the number Ω of microscopic configurations (microstates) that are consistent with the macroscopic quantities that characterize the system. Under the assumption that each microstate is equally probable, the entropy S is the natural logarithm of the number of microstates, multiplied by the Boltzmann constant kB. Formally,

S = kB ln Ω.

Macroscopic systems have a very large number Ω of possible microscopic configurations. For example, the entropy of an ideal gas is proportional to the number of gas molecules N; twenty liters of gas at room temperature and atmospheric pressure has N ≈ 6×10²³. At equilibrium, each of the Ω ≈ e^N configurations can be regarded as random and equally likely. The second law of thermodynamics states that the entropy of an isolated system never decreases; such systems spontaneously evolve towards the state with maximum entropy. Non-isolated systems may lose entropy, provided their environment's entropy increases by at least as much, so that the total entropy increases.
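The formula S = kB ln Ω and the ideal-gas example above can be checked numerically: with Ω ≈ e^N, the logarithm is simply N, so the entropy is of order N·kB. A minimal sketch (the function name is an illustrative label; the Boltzmann constant is the exact SI value):

```python
# Numerical sketch of S = k_B * ln(Omega). For the ideal-gas example
# in the text, Omega ~ e**N, so ln(Omega) ~ N and S ~ N * k_B.

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(ln_omega: float) -> float:
    """Entropy from the natural log of the microstate count."""
    return K_B * ln_omega

# Twenty liters of gas at room conditions: N ~ 6e23, ln(Omega) ~ N.
N = 6e23
print(f"S ~ {boltzmann_entropy(N):.1f} J/K")  # on the order of 8 J/K
```

Working with ln Ω directly sidesteps the impossibility of representing Ω ≈ e^(6×10²³) itself, and makes the extensivity explicit: doubling N doubles S.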
Entropy is a function of the state of the system, so the change in entropy of a system is determined by its initial and final states. In the idealization that a process is reversible, the entropy does not change, while irreversible processes always increase the total entropy. Because it is determined by the number of random microstates, entropy is related to the amount of additional information needed to specify the exact physical state of a system, given its macroscopic specification; for this reason, it is often said that entropy is an expression of the disorder, or randomness, of a system, or of the lack of information about it. The concept of entropy plays a central role in information theory. Boltzmann's constant, and therefore entropy, have dimensions of energy divided by temperature, with units of joules per kelvin in the International System of Units. The entropy of a substance is usually given as an intensive property, either entropy per unit mass or entropy per unit amount of substance. The French mathematician Lazare Carnot proposed in his 1803 paper Fundamental Principles of Equilibrium and Movement that in any machine the accelerations and shocks of the moving parts represent losses of moment of activity.
In other words, in any natural process there exists an inherent tendency towards the dissipation of useful energy. Building on this work, in 1824 Lazare's son Sadi Carnot published Reflections on the Motive Power of Fire, which posited that in all heat-engines, whenever "caloric" falls through a temperature difference, work or motive power can be produced from the actions of its fall from a hot to a cold body; he made the analogy with water falling in a water wheel. This was an early insight into the second law of thermodynamics. Carnot based his views of heat partly on the early 18th-century "Newtonian hypothesis" that both heat and light were types of indestructible forms of matter, which are attracted and repelled by other matter, and partly on the contemporary views of Count Rumford, who showed that heat could be created by friction, as when cannon bores are machined. Carnot reasoned that if the body of the working substance, such as a body of steam, is returned to its original state at the end of a complete engine cycle, then "no change occurs in the condition of the working body".
The first law of thermodynamics, deduced from the heat-friction experiments of James Joule in 1843, expresses the concept of energy and its conservation in all processes. In the 1850s and 1860s, the German physicist Rudolf Clausius objected to the supposition that no change occurs in the working body, and gave this "change" a mathematical interpretation by questioning the nature of the inherent loss of usable heat when work is done, e.g. heat produced by friction. Clausius described entropy as the transformation-content, i.e. dissipative energy use, of a thermodynamic system or working body of chemical species during a change of state. This was in contrast to earlier views, based on the theories of Isaac Newton, that heat was an indestructible particle that had mass. Later, scientists such as Ludwig Boltzmann, Josiah Willard Gibbs, and James Clerk Maxwell gave entropy a statistical basis. In 1877 Boltzmann visualized a probabilistic way to measure the entropy of an ensemble of ideal gas particles, in which he defined entropy to be proportional to the natural logarithm of the number of microstates such a gas could occupy.
Henceforth, the essential problem in statistical thermodynamics has been, according to Erwin Schrödinger, to determine the distribution of a given amount of energy E over N identical systems. Carathéodory linked entropy with a mathematical definition of irreversibility, in terms of trajectories and integrability. There are two related definitions of entropy: the thermodynamic definition and the statistical-mechanics definition; the classical thermodynamics definition developed first. In the classical thermodynamics viewpoint, the system is composed of large numbers of constituents and the state of the system is described by the average thermodynamic properties of those constituents.