History of the metre
The history of the metre starts with the scientific revolution that began with Nicolaus Copernicus's work in 1543. As accurate measurements became necessary, scientists looked for measures that were universal and could be based on natural phenomena rather than royal decree or physical prototypes. Rather than the various complex systems of subdivision then in use, they preferred a decimal system to ease their calculations. With the French Revolution came a desire to replace many features of the Ancien Régime, including the traditional units of measure. As a base unit of length, many scientists favored the seconds pendulum, but this was rejected when it was discovered that its length varied from place to place with local gravity. A new unit of length, the metre, was introduced, defined as one ten-millionth of the distance from the North Pole to the equator. For practical purposes, however, the standard metre was made available in the form of a platinum bar held in Paris; this in turn was replaced in 1889 by thirty platinum-iridium bars kept across the globe.
However, using such physical objects as the standard had been something that the original definition had aimed to avoid, so in 1960 a new definition, based on a specific number of wavelengths of light from a specific transition in krypton-86, allowed the standard to be universally available by measurement. In 1983 this was updated to the current definition: "the length of the path travelled by light in a vacuum in 1/299,792,458 of a second". During the mid nineteenth century the metre gained adoption worldwide in scientific usage, and it was established as an international measurement unit by the Metre Convention of 1875. Where older traditional length measures are still used, they are now defined in terms of the metre; for example, the yard has since 1959 been defined as 0.9144 metre. The standard measures of length in Europe diverged from one another after the fall of the Carolingian Empire: while measures could be standardized within a given jurisdiction, there were numerous variations of measure between regions.
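Because the yard is now defined exactly in terms of the metre (0.9144 m since 1959) and the metre in terms of the speed of light, unit conversions reduce to exact arithmetic. A minimal sketch of that arithmetic (the helper name is illustrative, not from any standard library):

```python
# The metre is defined via the speed of light: light travels exactly
# 299,792,458 m in one second, so one metre is the distance covered
# in 1/299,792,458 of a second.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

# Since 1959 the international yard has been defined as exactly 0.9144 m.
YARD_IN_METRES = 0.9144

def metres_to_yards(m: float) -> float:
    """Convert a length in metres to yards using the 1959 definition."""
    return m / YARD_IN_METRES

# Distance light travels in 1/299,792,458 s is one metre by definition
# (up to floating-point rounding).
one_metre = SPEED_OF_LIGHT_M_PER_S * (1 / 299_792_458)
print(one_metre)
print(metres_to_yards(100))  # a 100 m length is about 109.36 yards
```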
Indeed, as the measures were used as the basis for taxation, the use of a particular measure was associated with the sovereignty of a given ruler and dictated by law. With the increasing scientific activity of the 17th century came calls for the institution of a "universal measure" or "metro cattolico", which would be based on natural phenomena rather than royal decree and would be decimal rather than use the various systems of subdivision, often duodecimal, which coexisted at the time. According to Wilkins, it was Christopher Wren's idea to choose the length of a "seconds pendulum" as the unit of length: such pendulums had been demonstrated by Christiaan Huygens, and their length is quite close to one modern metre. However, it was soon discovered that the length of a seconds pendulum varies from place to place: the French astronomer Jean Richer had measured a 0.3% difference in length between Cayenne and Paris. Jean Richer and Giovanni Domenico Cassini measured the parallax of Mars between Paris and Cayenne in French Guiana when Mars was at its closest to Earth in 1672.
They arrived at a figure for the solar parallax of 9.5 arcseconds, equivalent to an Earth–Sun distance of about 22,000 Earth radii. They were also the first astronomers to have access to an accurate and reliable value for the radius of the Earth, which had been measured by their colleague Jean Picard in 1669 as 3,269,000 toises. Isaac Newton used this measurement for establishing his law of universal gravitation. Picard's geodetic observations had been confined to the determination of the magnitude of the Earth considered as a sphere, but the discovery made by Jean Richer turned the attention of mathematicians to its deviation from a spherical form. The determination of the figure of the Earth became a problem of the highest importance in astronomy, inasmuch as the diameter of the Earth was the unit to which all celestial distances had to be referred. Geodetic surveys found practical applications in French cartography and in the Anglo-French Survey, which aimed to connect the Paris and Greenwich Observatories and led to the Principal Triangulation of Great Britain.
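The parallax figure converts directly into a distance: for a small angle p (the solar parallax, the angle the Earth's radius subtends as seen from the Sun), the Earth–Sun distance is roughly one Earth radius divided by p in radians. A quick sketch of that arithmetic:

```python
import math

# Solar parallax of 9.5 arcseconds, the figure derived by Richer and Cassini.
parallax_arcsec = 9.5
parallax_rad = math.radians(parallax_arcsec / 3600)

# Small-angle approximation: distance ≈ R_earth / parallax,
# expressed here in units of Earth radii.
distance_earth_radii = 1 / parallax_rad
print(round(distance_earth_radii))  # ≈ 21,700 Earth radii, i.e. about 22,000
```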
The unit of length used by the French was the Toise de Paris, divided into six feet, whose standard was the Toise of Châtelet, fixed outside the Grand Châtelet in Paris from 1668 to 1776; the English unit of length was the yard. In 1735 two geodetic standards were calibrated against the Toise of Châtelet. One of them, the Toise of Peru, was used for the Spanish-French Geodesic Mission. In 1766 the Toise of Peru became the official standard of the toise in France and was renamed the Toise of the Academy. Despite scientific progress in the field of geodesy, little practical advance was made towards the establishment of the "universal measure" until the French Revolution of 1789. France was particularly affected by the proliferation of length measures, and the need for reform was accepted across all political viewpoints, even if it needed the push of revolution to bring it about. Talleyrand resurrected the idea of the seconds pendulum before the Constituent Assembly in 1790, suggesting that the new measure be defined at 45°N (a latitude that, in France, runs just north of Bordeaux and just
A Kibble balance or watt balance is an electromechanical measuring instrument that measures the weight of a test object precisely via the electric current and voltage needed to produce a compensating force. It is a metrological instrument that can realize the new definition of the kilogram, the unit of mass, in terms of fundamental constants, and is therefore termed an electronic or electrical kilogram. The name watt balance comes from the fact that the weight of the test mass is proportional to the product of a current and a voltage, a product measured in units of watts. In June 2016, two months after the death of the balance's inventor, Bryan Kibble, metrologists of the Consultative Committee for Units of the International Committee for Weights and Measures agreed to rename the device in his honor. Since 1889, the definition of the kilogram had been based on a physical object known as the International Prototype of the Kilogram. In 2013, accuracy criteria were agreed upon by the General Conference on Weights and Measures for replacing this definition with one based on the use of a Kibble balance.
After these criteria had been achieved, the CGPM voted unanimously on November 16, 2018 to change the definition of the kilogram and several other units, effective May 20, 2019, to coincide with World Metrology Day. The Kibble balance is a more accurate version of the ampere balance, an early current-measuring instrument in which the force between two current-carrying coils of wire is measured and used to calculate the magnitude of the current. In this new application, the balance is used in the opposite sense: the balance determines the mass of the object from known electrical quantities. Thus the mass of the object is defined in terms of a current and a voltage, as described below, an "electronic kilogram". The principle used in the Kibble balance was proposed by Bryan Kibble of the UK National Physical Laboratory in 1975, originally for measurement of the gyromagnetic ratio. The main weakness of the ampere balance method is that the result depends on the accuracy with which the dimensions of the coils are measured; the Kibble balance method adds an extra calibration step in which the effect of the geometry of the coils is eliminated, removing the main source of uncertainty.
This extra step involves moving the force coil through a known magnetic flux at a known speed; it was first performed in 1990. The Kibble balance originating from the National Physical Laboratory was transferred to the National Research Council of Canada in 2009, where scientists from the two labs continued to refine the instrument. In 2014, NRC researchers published the most accurate measurement of the Planck constant at that time, with a relative uncertainty of 1.8×10⁻⁸. A final paper by NRC researchers was published in May 2017, presenting a measurement of the Planck constant with an uncertainty of only 9.1 parts per billion, the measurement with the least uncertainty to date. Other Kibble balance experiments are conducted at the US National Institute of Standards and Technology, the Swiss Federal Office of Metrology in Bern, the International Bureau of Weights and Measures near Paris, and the Laboratoire national de métrologie et d'essais in Trappes, France. A conducting wire of length L that carries an electric current I perpendicular to a magnetic field of strength B experiences a Lorentz force F = B L I, the product of these variables.
In the Kibble balance, the current is varied so that this force counteracts the weight w of a standard mass m; this principle is derived from the ampere balance. The weight w is given by the mass m multiplied by the local gravitational acceleration g, so w = m g = B L I. The Kibble balance avoids the problem of measuring B and L in a second calibration step: the same wire is moved through the same magnetic field at a known speed v, and by Faraday's law of induction a potential difference U is generated across the ends of the wire, with U = B L v. The unknown product B L can thus be eliminated from the equations to give U I = m g v. With U, I, g and v measured, this gives an accurate value for m. Both sides of the equation have the dimensions of power, measured in watts in the International System of Units. Accurate measurements of electric current and potential difference are made in conventional electrical units, which are based on fixed "conventional values" of the Josephson constant KJ = 2e/h and the von Klitzing constant RK = h/e², denoted KJ-90 and RK-90 respectively.
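The two measurement modes combine as m = U I / (g v). A toy calculation with made-up but plausible numbers (the values below are illustrative, not from any actual experiment):

```python
# Weighing mode: current I balances the weight of the mass (w = m*g = B*L*I).
# Moving mode: the coil moves at speed v and induces voltage U (U = B*L*v).
# Dividing the two equations cancels the unknown B*L product: U*I = m*g*v.

U = 1.962   # induced voltage in volts (illustrative value)
I = 0.010   # balancing current in amperes (illustrative value)
g = 9.81    # local gravitational acceleration, m/s^2
v = 0.002   # coil speed in the moving phase, m/s

mass = (U * I) / (g * v)   # kilograms
print(mass)                # ≈ 1.0 kg for these numbers
```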
The current Kibble balance experiments are equivalent to measuring the value of the conventional watt in SI units. From the definition of the conventional watt, this is equivalent to measuring the value of the product KJ²RK in SI units instead of its fixed value in conventional electrical units:

KJ² RK = KJ-90² RK-90 × (U I) / (m g v),

where U and I are the electrical measurements expressed in conventional units and m g v is the mechanical power in SI units.
The volt is the derived unit for electric potential, electric potential difference, and electromotive force. It is named after the Italian physicist Alessandro Volta. One volt is defined as the difference in electric potential between two points of a conducting wire when an electric current of one ampere dissipates one watt of power between those points. It is also equal to the potential difference between two parallel, infinite planes spaced 1 meter apart that create an electric field of 1 newton per coulomb, and to the potential difference between two points that will impart one joule of energy per coulomb of charge that passes through it. It can be expressed in terms of SI base units as V = energy/charge = J/C = kg⋅m²/(A⋅s³). It can also be expressed as amperes times ohms, watts per ampere, or joules per coulomb, equivalent to electronvolts per elementary charge: V = A⋅Ω = W/A = J/C = eV/e. The "conventional" volt, V90, defined in 1987 by the 18th General Conference on Weights and Measures and in use from 1990, is implemented using the Josephson effect for exact frequency-to-voltage conversion, combined with the caesium frequency standard.
For the Josephson constant, KJ = 2e/h, the "conventional" value KJ-90 = 0.4835979 GHz/μV is used. This standard is typically realized using a series-connected array of several thousand or tens of thousands of junctions, excited by microwave signals between 10 and 80 GHz. Empirically, several experiments have shown that the method is independent of device design, measurement setup, etc., and no correction terms are required in a practical implementation. In the water-flow analogy, sometimes used to explain electric circuits by comparing them with water-filled pipes, voltage is likened to a difference in water pressure, and current is proportional to the amount of water flowing at that pressure. A resistor would be a reduced diameter somewhere in the piping, and a capacitor or inductor could be likened to a "U"-shaped pipe where a higher water level on one side can store energy temporarily. The relationship between voltage and current is defined by Ohm's law; Ohm's law is analogous to the Hagen–Poiseuille equation, as both are linear models relating flux and potential in their respective systems.
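A Josephson junction driven at frequency f develops a voltage step of f/KJ-90, and a series array of N junctions multiplies that step by N, which is why practical standards use thousands of junctions. A rough sketch of the arithmetic (the junction count and drive frequency are illustrative assumptions):

```python
# Conventional Josephson constant: KJ-90 = 0.4835979 GHz/uV = 483,597.9 GHz/V.
KJ90_HZ_PER_V = 483_597.9e9

# One junction driven at 75 GHz produces a first-order voltage step of f/KJ-90.
f_drive_hz = 75e9                      # microwave drive frequency (illustrative)
step_v = f_drive_hz / KJ90_HZ_PER_V    # ~155 microvolts per junction
print(step_v * 1e6)                    # ≈ 155.1 (in microvolts)

# A series array of 10,000 junctions sums the steps to a practical level.
n_junctions = 10_000
print(n_junctions * step_v)            # ≈ 1.55 V
```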
The voltage produced by each electrochemical cell in a battery is determined by the chemistry of that cell (see Galvanic cell § Cell voltage). Cells can be combined in series for multiples of that voltage, or additional circuitry added to adjust the voltage to a different level. Mechanical generators can be constructed to any voltage in a range of feasibility. Nominal voltages of familiar sources:
Nerve cell resting potential: ~75 mV
Single-cell, rechargeable NiMH or NiCd battery: 1.2 V
Single-cell, non-rechargeable alkaline battery: 1.5 V
Some antique vehicle electrical systems: 6.3 V
Electric vehicle battery: 400 V when charged
Household mains electricity (AC): 100 V in Japan, 120 V in North America, 230 V in Europe, Asia and Australia
Rapid transit third rail: 600–750 V
High-speed train overhead power lines: 25 kV at 50 Hz (but see the List of railway electrification systems and 25 kV at 60 Hz for exceptions)
High-voltage electric power transmission lines: 110 kV and up
Lightning: varies, often around 100 MV
In 1800, as the result of a professional disagreement over the galvanic response advocated by Luigi Galvani, Alessandro Volta developed the so-called voltaic pile, a forerunner of the battery, which produced a steady electric current. Volta had determined that the most effective pair of dissimilar metals to produce electricity was zinc and silver. In 1861, Latimer Clark and Sir Charles Bright coined the name "volt" for the unit of resistance. By 1873, the British Association for the Advancement of Science had defined the volt and the farad. In 1881, the International Electrical Congress, now the International Electrotechnical Commission, approved the volt as the unit for electromotive force, making the volt equal to 10⁸ cgs units of voltage.
A micrometer, sometimes known as a micrometer screw gauge, is a device incorporating a calibrated screw, used for accurate measurement of components in mechanical engineering and machining as well as most mechanical trades, along with other metrological instruments such as dial and digital calipers. Micrometers are usually, but not always, in the form of calipers: the spindle is an accurately machined screw, and the object to be measured is placed between the spindle and the anvil. The spindle is moved by turning the ratchet knob or thimble until the object to be measured is touched by both the spindle and the anvil. Micrometers are also used in telescopes or microscopes to measure the apparent diameter of celestial bodies or microscopic objects; the micrometer used with a telescope was invented about 1638 by William Gascoigne, an English astronomer. Colloquially the word micrometer is often shortened to mike or mic. The word micrometer is a neoclassical coinage from Greek micros, meaning 'small', and metron, meaning 'measure'.
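The calibrated screw is what turns rotation into measurement: with a known thread pitch, each division on the thimble corresponds to a fixed axial advance of the spindle. A sketch of that readout arithmetic, assuming a common metric micrometer geometry (0.5 mm pitch, 50 thimble divisions) as an illustrative choice, not a universal standard:

```python
# A typical metric micrometer: the spindle screw has a 0.5 mm pitch and the
# thimble is graduated into 50 divisions, so one division = 0.01 mm of travel.
PITCH_MM = 0.5
THIMBLE_DIVISIONS = 50

def reading_mm(main_scale_mm: float, thimble_division: int) -> float:
    """Combine the main (sleeve) scale and the thimble scale into one reading."""
    return main_scale_mm + thimble_division * (PITCH_MM / THIMBLE_DIVISIONS)

# Sleeve shows 5.5 mm, thimble lines up at division 28:
print(reading_mm(5.5, 28))  # ≈ 5.78 mm
```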
The Merriam-Webster Collegiate Dictionary says that English got the word from French and that its first known appearance in English writing was in 1670. At that time, neither the metre nor the micrometre nor the micrometer as we know them existed. However, the people of that time did have much need for, and interest in, the ability to measure small things and small differences; the word was no doubt coined in reference to this endeavor, even if it did not refer to its present-day senses. The first micrometric screw was invented by William Gascoigne in the 17th century, as an enhancement of the vernier. Henry Maudslay built a bench micrometer in the early 19th century, jocularly nicknamed "the Lord Chancellor" among his staff because it was the final judge on measurement accuracy and precision in the firm's work. In 1844 details of Whitworth's workshop micrometer were published; this was described as having a strong frame of cast iron, the opposite ends of which held two finished steel cylinders that traversed longitudinally by the action of screws.
The ends of the cylinders, where they met, were hemispherical. One screw was fitted with a wheel graduated to measure to the ten-thousandth of an inch; Whitworth's object was to furnish ordinary mechanics with an instrument which, while it afforded accurate indications, was not liable to be deranged by the rough handling of the workshop. The first documented development of handheld micrometer-screw calipers was by Jean Laurent Palmer of Paris in 1848. The micrometer caliper was introduced to the mass market in anglophone countries by Brown & Sharpe in 1867, allowing the penetration of the instrument's use into the average machine shop; Brown & Sharpe were inspired by several earlier devices, one of them being Palmer's design. In 1888 Edward W. Morley added to the precision of micrometric measurements and proved their accuracy in a complex series of experiments. The culture of toolroom accuracy and precision, which started with interchangeability pioneers including Gribeauval, North, Hall and Colt, and continued through leaders such as Maudslay, Whitworth, Sharpe, Whitney and others, grew during the Machine Age to become an important part of combining applied science with technology.
Beginning in the early 20th century, one could no longer master tool and die making, machine tool building, or engineering without some knowledge of the science of metrology, as well as the sciences of chemistry and physics. Each type of micrometer caliper can be fitted with specialized anvils and spindle tips for particular measuring tasks. For example, the anvil may be shaped in the form of a segment of screw thread, in the form of a v-block, or in the form of a large disc. Universal micrometer sets come with interchangeable anvils, such as flat, spline, blade and knife-edge; the term universal micrometer may also refer to a type of micrometer whose frame has modular components, allowing one micrometer to function as an outside mic, depth mic, step mic, etc. Blade micrometers have a matching set of narrow tips; they allow, for example, the measuring of a narrow o-ring groove. Pitch-diameter micrometers have a matching set of thread-shaped tips for measuring the pitch diameter of screw threads. Limit mics have two anvils and two spindles, and are used like a snap gauge.
The part being checked must pass through the first gap and must stop at the second gap in order to be within specification; the two gaps reflect the top and bottom of the tolerance range. A bore micrometer is a three-anvil head on a micrometer base used to measure inside diameters. Tube micrometers have a cylindrical anvil positioned perpendicularly to the spindle and are used to measure the thickness of tubes. Micrometer stops are micrometer heads mounted on the table of a manual milling machine, the bedways of a lathe, or another machine tool, in place of simple stops; they help the operator to position the carriage precisely. Stops can also be used to actuate kickout mechanisms or limit switches to halt an automatic feed system. Ball micrometers have ball-shaped anvils; they may have one flat and one ball anvil, i
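The go/no-go logic of a limit mic or snap gauge can be stated compactly: a part is in spec exactly when it fits through the upper-limit gap and is stopped by the lower-limit gap. A small illustrative sketch (the tolerance values are invented for the example):

```python
def within_spec(part_size: float, lower: float, upper: float) -> bool:
    """Limit-gauge logic: the part must pass the 'go' gap (upper limit)
    and be stopped by the 'no-go' gap (lower limit)."""
    passes_go = part_size <= upper        # fits through the first gap
    stopped_by_nogo = part_size >= lower  # does not fit through the second
    return passes_go and stopped_by_nogo

# A shaft specified as 10.00 mm +/-0.05 mm (illustrative tolerance):
print(within_spec(10.02, 9.95, 10.05))  # True  - within tolerance
print(within_spec(10.08, 9.95, 10.05))  # False - oversize, fails the 'go' gap
```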
The Weston cell is a wet-chemical cell that produces a stable voltage suitable as a laboratory standard for calibration of voltmeters. Invented by Edward Weston in 1893, it was adopted as the international standard for EMF from 1911 until superseded by the Josephson voltage standard in 1990. The anode is an amalgam of cadmium with mercury, and the cathode is pure mercury over which a paste of mercurous sulfate and mercury is placed. The electrolyte is a saturated solution of cadmium sulfate, and the depolarizer is a paste of mercurous sulfate. The cell is set up in an H-shaped glass vessel with the cadmium amalgam in one leg and the pure mercury in the other. Electrical connections to the cadmium amalgam and the mercury are made by platinum wires fused through the lower ends of the legs.
Anode reaction: Cd → Cd²⁺ + 2e⁻
Cathode reaction: Hg₂SO₄ + 2e⁻ → 2Hg + SO₄²⁻
Reference cells must be applied in such a way that no current is drawn from them. The original design was a saturated cadmium cell producing a 1.018638 V reference, and it had the advantage of a lower temperature coefficient than the previously used Clark cell.
One of the great advantages of the Weston normal cell is its small change of electromotive force with change of temperature. At any temperature t between 0 °C and 40 °C,

Et/V = E20/V − 0.0000406 (t/°C − 20) − 0.00000095 (t/°C − 20)² + 0.00000001 (t/°C − 20)³,

where E20 is the EMF at 20 °C. This temperature formula was adopted by the London conference of 1908. The temperature coefficient can be reduced by shifting to an unsaturated design, the predominant type today. However, an unsaturated cell's output decreases by some 80 microvolts per year, which is compensated by periodic calibration against a saturated cell.
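The 1908 temperature formula is easy to evaluate numerically. A sketch using the cell's nominal 1.018638 V output as E20 (treating that figure as the 20 °C value is an assumption for illustration):

```python
def weston_emf(t_celsius: float, e20: float = 1.018638) -> float:
    """EMF of a saturated Weston cell in volts, per the 1908 London
    conference formula, valid for 0 degC <= t <= 40 degC."""
    dt = t_celsius - 20.0
    return e20 - 0.0000406 * dt - 0.00000095 * dt**2 + 0.00000001 * dt**3

print(weston_emf(20))  # 1.018638 V at the reference temperature
print(weston_emf(25))  # slightly lower: the leading coefficient is -40.6 uV/degC
```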
International System of Units
The International System of Units (SI) is the modern form of the metric system and is the most widely used system of measurement. It comprises a coherent system of units of measurement built on seven base units (the second, metre, kilogram, ampere, kelvin, mole and candela), together with a set of twenty prefixes to the unit names and unit symbols that may be used when specifying multiples and fractions of the units. The system also specifies names for 22 derived units, such as the lumen and watt, for other common physical quantities. The base units are derived from invariant constants of nature, such as the speed of light in vacuum and the triple point of water, which can be observed and measured with great accuracy, and from one physical artefact. The artefact is the international prototype kilogram, certified in 1889, consisting of a cylinder of platinum-iridium, which nominally has the same mass as one litre of water at the freezing point. Its stability has been a matter of significant concern, culminating in a revision of the definition of the base units in terms of constants of nature, scheduled to be put into effect on 20 May 2019.
Derived units may be defined in terms of other derived units. They are adopted to facilitate measurement of diverse quantities; the SI is intended to be an evolving system. The most recently added derived unit, the katal, was defined in 1999. The reliability of the SI depends not only on the precise measurement of standards for the base units in terms of various physical constants of nature, but also on the precise definition of those constants. The set of underlying constants is modified as more stable constants are found, or as constants can be more precisely measured. For example, in 1983 the metre was redefined as the distance that light propagates in vacuum in a given fraction of a second, thus making the value of the speed of light exact in terms of the defined units. The motivation for the development of the SI was the diversity of units that had sprung up within the centimetre–gram–second (CGS) systems and the lack of coordination between the various disciplines that used them. The General Conference on Weights and Measures (CGPM), established by the Metre Convention of 1875, brought together many international organisations to establish the definitions and standards of a new system and to standardise the rules for writing and presenting measurements.
The system was published in 1960 as a result of an initiative that began in 1948. It is based on the metre–kilogram–second system of units rather than any variant of the CGS. Since then, the SI has been adopted by all countries except the United States and Myanmar. The International System of Units consists of a set of base units and derived units, together with a set of decimal-based multipliers that are used as prefixes. The units, excluding prefixed units, form a coherent system of units, based on a system of quantities in such a way that the equations between the numerical values expressed in coherent units have the same form, including numerical factors, as the corresponding equations between the quantities. For example, 1 N = 1 kg × 1 m/s² says that one newton is the force required to accelerate a mass of one kilogram at one metre per second squared, as related through the principle of coherence to the equation relating the corresponding quantities: F = m × a. Derived units apply to derived quantities, which may by definition be expressed in terms of base quantities and thus are not independent.
Other useful derived quantities can be specified in terms of the SI base and derived units but have no named units in the SI system, such as acceleration, which is defined in SI units as m/s². The SI base units are the building blocks of the system, and all other units are derived from them. When Maxwell first introduced the concept of a coherent system, he identified three quantities that could be used as base units: mass, length and time. Giorgi later identified the need for an electrical base unit, for which the unit of electric current was chosen for SI. Another three base units were added later. The early metric systems defined a unit of weight as a base unit, while the SI defines an analogous unit of mass. In everyday use, these are largely interchangeable, but in scientific contexts the difference matters. Mass, strictly the inertial mass, represents a quantity of matter; it relates the acceleration of a body to the applied force via Newton's law, F = m × a: force equals mass times acceleration. A force of 1 N applied to a mass of 1 kg will accelerate it at 1 m/s².
This is true whether the object is floating in space or in a gravity field, e.g. at the Earth's surface. Weight is the force exerted on a body by a gravitational field, and hence its weight depends on the strength of the gravitational field. The weight of a 1 kg mass at the Earth's surface is m × g. Since the acceleration due to gravity is local and varies by location and altitude on the Earth, weight is unsuitable for precision measurement.
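The mass/weight distinction above can be made concrete: the same 1 kg mass has the same inertia everywhere, but its weight w = m × g tracks the local gravitational acceleration. A small sketch (the g values are standard approximate figures):

```python
# Weight w = m * g depends on the local gravitational acceleration g,
# while the mass m itself does not change with location.
mass_kg = 1.0

g_values = {
    "Earth, equator": 9.780,  # m/s^2, approximate
    "Earth, poles":   9.832,  # m/s^2, approximate
    "Moon":           1.62,   # m/s^2, approximate
}

for place, g in g_values.items():
    weight_n = mass_kg * g
    print(f"{place}: {weight_n:.3f} N")
```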
Measurement is the assignment of a number to a characteristic of an object or event, which can be compared with other objects or events. The scope and application of measurement depend on the discipline. In the natural sciences and engineering, measurements do not apply to nominal properties of objects or events, consistent with the guidelines of the International Vocabulary of Metrology published by the International Bureau of Weights and Measures. However, in other fields, such as statistics and the social and behavioral sciences, measurements can have multiple levels, including nominal, ordinal, interval and ratio scales. Measurement is a cornerstone of trade, science and quantitative research in many disciplines. Historically, many measurement systems existed for the varied fields of human existence to facilitate comparisons in these fields; these were achieved by local agreements between trading partners or collaborators. Since the 18th century, developments have progressed towards unifying accepted standards that resulted in the modern International System of Units.
This system reduces all physical measurements to a mathematical combination of seven base units. The science of measurement is pursued in the field of metrology. The measurement of a property may be categorized by the following criteria: type, magnitude, unit and uncertainty; these enable unambiguous comparisons between measurements. The level of measurement is a taxonomy for the methodological character of a comparison: for example, two states of a property may be compared by difference or by ordinal preference. The type is often not explicitly expressed, but implicit in the definition of a measurement procedure. The magnitude is the numerical value of the characterization, obtained with a suitably chosen measuring instrument. A unit assigns a mathematical weighting factor to the magnitude, derived as a ratio to the property of an artifact used as a standard or to a natural physical quantity. An uncertainty represents the random and systematic errors of the measurement procedure; errors are evaluated by methodically repeating measurements and considering the accuracy and precision of the measuring instrument.
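Evaluating uncertainty by repetition, as described above, usually means taking the mean of repeated readings and the standard deviation of the mean as the standard uncertainty. A minimal sketch with invented readings:

```python
import statistics

# Repeated readings of the same quantity (invented values, in millimetres).
readings = [10.03, 10.01, 10.04, 10.02, 10.03, 10.02]

mean = statistics.mean(readings)
stdev = statistics.stdev(readings)              # sample standard deviation
std_uncertainty = stdev / len(readings) ** 0.5  # standard error of the mean

print(f"result: {mean:.4f} mm ± {std_uncertainty:.4f} mm")
```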
Measurements most commonly use the International System of Units (SI) as a comparison framework. The system defines seven fundamental units: the second, metre, kilogram, ampere, kelvin, mole and candela. Six of these units are defined without reference to a particular physical object serving as a standard, while the kilogram is still embodied in an artifact which rests at the headquarters of the International Bureau of Weights and Measures in Sèvres near Paris. Artifact-free definitions fix measurements at an exact value related to a physical constant or other invariable phenomenon in nature, in contrast to standard artifacts, which are subject to deterioration or destruction; instead, the measurement unit can only change through increased accuracy in determining the value of the constant it is tied to. The first proposal to tie an SI base unit to an experimental standard independent of fiat was by Charles Sanders Peirce, who proposed to define the metre in terms of the wavelength of a spectral line. This directly influenced the Michelson–Morley experiment.
With the exception of a few fundamental quantum constants, units of measurement are derived from historical agreements. Nothing inherent in nature dictates that an inch has to be a certain length, nor that a mile is a better measure of distance than a kilometre. Over the course of human history, however, first for convenience and then for necessity, standards of measurement evolved so that communities would have certain common benchmarks. Laws regulating measurement were originally developed to prevent fraud in commerce. Units of measurement are now defined on a scientific basis, overseen by governmental or independent agencies, and established in international treaties, pre-eminent of which is the General Conference on Weights and Measures (CGPM), established in 1875 by the Metre Convention, overseeing the International System of Units and having custody of the International Prototype Kilogram. The metre, for example, was redefined in 1983 by the CGPM in terms of the speed of light, while in 1960 the international yard was defined by the governments of the United States, United Kingdom and South Africa as being 0.9144 metres.
In the United States, the National Institute of Standards and Technology (NIST), a division of the United States Department of Commerce, regulates commercial measurements. In the United Kingdom, the role is performed by the National Physical Laboratory; in Australia, by the National Measurement Institute; in South Africa, by the Council for Scientific and Industrial Research; and in India, by the National Physical Laboratory of India. Before SI units were adopted around the world, the British systems of English units and imperial units were used in Britain, the Commonwealth and the United States. The system came to be known as U.S. customary units in the United States and is still in use there and in a few Caribbean countries. These various systems of measurement have at times been called foot-pound-second systems, after the imperial units for length, weight and time, though the tons, hundredweights and nautical miles, for example, are different for the U.S. units. Many imperial units remain in use in Britain, which has officially switched to the SI system, with a few exceptions such as road signs, which are still in miles.
Draught beer and cider must be sold by the imperial pint, and milk in returnable bottles can be sold by the imperial pint. Many people meas