Vacuum is space devoid of matter. The word stems from the Latin adjective vacuus, meaning "vacant" or "void". An approximation to such a vacuum is a region with a gaseous pressure much less than atmospheric pressure. Physicists often discuss ideal test results that would occur in a perfect vacuum, which they sometimes simply call "vacuum" or free space, and use the term partial vacuum to refer to an actual imperfect vacuum as one might have in a laboratory or in space. In engineering and applied physics, on the other hand, vacuum refers to any space in which the pressure is lower than atmospheric pressure; the Latin term in vacuo is used to describe an object surrounded by a vacuum. The quality of a partial vacuum refers to how closely it approaches a perfect vacuum. Other things equal, lower gas pressure means a higher-quality vacuum. For example, a typical vacuum cleaner produces enough suction to reduce air pressure by around 20%. Much higher-quality vacuums are possible. Ultra-high vacuum chambers, common in chemistry and engineering, operate below one trillionth of atmospheric pressure and can reach around 100 particles/cm³.
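The pressure figures above can be connected to particle counts through the ideal gas law, n = P/(k_B·T). The Python sketch below is illustrative only; the room-temperature assumption and the function name are mine, not from the text:

```python
# Estimate gas particle number density from pressure via the ideal gas law,
# n = P / (k_B * T).  Room temperature (293.15 K) is assumed for illustration.

K_B = 1.380649e-23  # Boltzmann constant, J/K
ATM = 101325.0      # standard atmosphere, Pa

def particles_per_m3(pressure_pa, temperature_k=293.15):
    """Number density (per cubic metre) of an ideal gas at the given pressure."""
    return pressure_pa / (K_B * temperature_k)

# Ordinary air at 1 atm holds roughly 2.5e25 particles per cubic metre.
print(f"1 atm:           {particles_per_m3(ATM):.2e} /m^3")

# A vacuum cleaner lowering pressure by ~20% still leaves ~80% of that density.
print(f"vacuum cleaner:  {particles_per_m3(0.8 * ATM):.2e} /m^3")

# An ultra-high vacuum near one trillionth of an atmosphere.
print(f"UHV (1e-12 atm): {particles_per_m3(1e-12 * ATM):.2e} /m^3")
```

The point of the sketch is only that density scales linearly with pressure at fixed temperature, which is why "lower pressure" and "higher-quality vacuum" can be used interchangeably.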
Outer space is an even higher-quality vacuum, with the equivalent of just a few hydrogen atoms per cubic meter on average in intergalactic space. According to modern understanding, even if all matter could be removed from a volume, it would still not be "empty" due to vacuum fluctuations, dark energy, transiting gamma rays, cosmic rays and other phenomena in quantum physics. In the study of electromagnetism in the 19th century, vacuum was thought to be filled with a medium called aether. In modern particle physics, the vacuum state is considered the ground state of a field. Vacuum has been a frequent topic of philosophical debate since ancient Greek times, but was not studied empirically until the 17th century. Evangelista Torricelli produced the first laboratory vacuum in 1643, and other experimental techniques were developed as a result of his theories of atmospheric pressure. A Torricellian vacuum is created by filling a tall glass container, closed at one end, with mercury and then inverting it into a bowl to contain the mercury.
Vacuum became a valuable industrial tool in the 20th century with the introduction of incandescent light bulbs and vacuum tubes, and a wide array of vacuum technology has since become available. The recent development of human spaceflight has raised interest in the impact of vacuum on human health, and on life forms in general. The word vacuum comes from Latin, meaning 'an empty space, void', a noun use of the neuter of vacuus, meaning "empty", related to vacare, meaning "be empty". Vacuum is one of the few words in the English language that contains two consecutive letters 'u'. There has been much dispute over whether such a thing as a vacuum can exist. Ancient Greek philosophers debated the existence of a vacuum, or void, in the context of atomism, which posited void and atom as the fundamental explanatory elements of physics. Following Plato, the abstract concept of a featureless void faced considerable skepticism: it could not be apprehended by the senses, it could not provide additional explanatory power beyond the physical volume with which it was commensurate and, by definition, it was quite literally nothing at all, which cannot rightly be said to exist.
Aristotle believed that no void could occur because the denser surrounding material continuum would fill any incipient rarity that might give rise to a void. In his Physics, book IV, Aristotle offered numerous arguments against the void: for example, that motion through a medium which offered no impediment could continue ad infinitum, there being no reason that something would come to rest anywhere in particular. Although Lucretius argued for the existence of vacuum in the first century BC and Hero of Alexandria tried unsuccessfully to create an artificial vacuum in the first century AD, it was European scholars such as Roger Bacon, Blasius of Parma and Walter Burley in the 13th and 14th centuries who focused considerable attention on these issues. Following Stoic physics in this instance, scholars from the 14th century onward departed from the Aristotelian perspective in favor of a supernatural void beyond the confines of the cosmos itself, a conclusion acknowledged by the 17th century, which helped to segregate natural and theological concerns.
Two thousand years after Plato, René Descartes proposed a geometrically based alternative theory of atomism, without the problematic nothing–everything dichotomy of void and atom. Although Descartes agreed with the contemporary position that a vacuum does not occur in nature, the success of his namesake coordinate system and, more implicitly, the spatial–corporeal component of his metaphysics would come to define the philosophically modern notion of empty space as a quantified extension of volume. By the ancient definition, however, directional information and magnitude were conceptually distinct. In the medieval Middle Eastern world, the physicist and Islamic scholar Al-Farabi conducted a small experiment concerning the existence of vacuum, in which he investigated handheld plungers in water; he concluded that air's volume can expand to fill available space, and he suggested that the concept of a perfect vacuum was incoherent. However, according to Nader El-Bizri, the physicist Ibn al-Haytham and the Mu'tazili theologians disagreed with Aristotle and Al-Farabi and supported the existence of a void.
Using geometry, Ibn al-Haytham mathematically demonstrated that place is the imagined three-dimensional void between the inner surfaces of a containing body. According to Ahmad Dallal, Abū Rayhān al-Bīrūnī states that "there is no observable
The inch is a unit of length in the imperial and United States customary systems of measurement. It is equal to 1⁄12 of a foot. Derived from the Roman uncia, the word inch is sometimes used to translate similar units in other measurement systems, usually understood as deriving from the width of the human thumb. Standards for the exact length of an inch have varied in the past, but since the adoption of the international yard during the 1950s and 1960s it has been based on the metric system and defined as 25.4 mm. The English word "inch" was an early borrowing from Latin uncia, not present in other Germanic languages; the vowel change from Latin /u/ to Old English /y/ is known as umlaut. The consonant change from the Latin /k/ to English /tʃ/ is palatalisation. Both were features of Old English phonology. "Inch" is cognate with "ounce", whose separate pronunciation and spelling reflect its reborrowing in Middle English from Anglo-Norman unce and ounce. In many other European languages, the word for "inch" is the same as or derived from the word for "thumb", as a man's thumb is about an inch wide.
Examples include Afrikaans: duim. The inch is a commonly used customary unit of length in the United States and the United Kingdom; it is also used in Japan for electronic parts, especially display screens. In most of continental Europe, the inch is used informally as a measure for display screens. For the United Kingdom, guidance on public sector use states that, since 1 October 1995, without time limit, the inch is to be used as a primary unit for road signs and related measurements of distance and may continue to be used as a secondary or supplementary indication following a metric measurement for other purposes. The international standard symbol for inch is in, but traditionally the inch is denoted by a double prime, often approximated by double quotes, and the foot by a prime, often approximated by an apostrophe. For example, three feet two inches can be written as 3′ 2″. Subdivisions of an inch are written using dyadic fractions with odd-number numerators; for example, 3/8″ or 5/16″.

One international inch is equal to:
- 10,000 tenths
- 1,000 thou or mil
- 100 points or gries
- 72 PostScript points
- 10, 12, 16, or 40 lines
- 6 computer picas
- 3 barleycorns
- 25.4 millimetres
- 0.999998 US Survey inches
- 1/3 (about 0.333) palm
- 1/4 (0.25) hand
- 1/12 (about 0.08333) foot
- 1/36 (about 0.02777) yard

The earliest known reference to the inch in England is from the Laws of Æthelberht dating to the early 7th century, surviving in a single manuscript, the Textus Roffensis, from 1120.
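Several of the equivalences listed above are simple multiplications and can be encoded directly. The sketch below is illustrative; the dictionary and function names are my own, and only a subset of the listed units is included:

```python
# Convert inches to several of the units listed above.
# Each entry gives how many of that unit make up 1 international inch.
PER_INCH = {
    "thou": 1_000,
    "postscript_points": 72,
    "picas": 6,
    "barleycorns": 3,
    "millimetres": 25.4,
    "feet": 1 / 12,
    "yards": 1 / 36,
}

def convert_inches(value_in_inches, unit):
    """Convert a length in inches to the requested unit."""
    return value_in_inches * PER_INCH[unit]

print(convert_inches(3, "millimetres"))   # 3 in is 76.2 mm
print(convert_inches(2, "barleycorns"))   # 2 in is 6 barleycorns
```

Because the inch is defined as exactly 25.4 mm, every conversion here is exact arithmetic on a fixed factor rather than a measured ratio.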
Paragraph LXVII sets out the fine for wounds of various depths: one inch, one shilling; two inches, two shillings; and so on. An Anglo-Saxon unit of length was the barleycorn. After 1066, 1 inch was equal to 3 barleycorns, which continued to be its legal definition for several centuries, with the barleycorn being the base unit. One of the earliest such definitions is that of 1324, when the legal definition of the inch was set out in a statute of Edward II of England, defining it as "three grains of barley, dry and round, placed end to end, lengthwise". Similar definitions are recorded in both English and Welsh medieval law tracts. One, dating from the first half of the 10th century, is contained in the Laws of Hywel Dda, which superseded those of Dyfnwal, an earlier definition of the inch in Wales. Both definitions, as recorded in Ancient Laws and Institutes of Wales, are that "three lengths of a barleycorn is the inch". King David I of Scotland, in his Assize of Weights and Measures, is said to have defined the Scottish inch as the width of an average man's thumb at the base of the nail, including the requirement to calculate the average of a small, a medium, and a large man's measures.
However, the oldest surviving manuscripts date from the early 14th century and appear to have been altered with the inclusion of newer material. In 1814, Charles Butler, a mathematics teacher at Cheam School, recorded the old legal definition of the inch to be "three grains of sound ripe barley being taken out of the middle of the ear, well dried, and laid end to end in a row", and placed the barleycorn, not the inch, as the base unit of the English Long Measure system, from which all other units were derived. John Bouvier similarly recorded in his 1843 law dictionary that the barleycorn was the fundamental measure. Butler observed that "as the length of the barley-corn cannot be fixed, so the inch according to this method will be uncertain", noting that a standard inch measure was now kept in the Exchequer chamber and that this was the legal definition of the inch; this was a point made by George Long in his 1842 Penny Cyclopædia, observing that st
The micrometre or micrometer, commonly known by the previous name micron, is an SI derived unit of length equalling 1×10⁻⁶ metre. The micrometre is a common unit of measurement for wavelengths of infrared radiation as well as sizes of biological cells and bacteria, and for grading wool by the diameter of the fibres. The width of a single human hair ranges from 10 to 200 μm. The longest human chromosome is approximately 10 μm in length.

Between 1 μm and 10 μm:
- 1–10 μm – length of a typical bacterium
- 10 μm – size of fungal hyphae
- 5 μm – length of a typical human spermatozoon's head
- 3–8 μm – width of a strand of spider web silk
- about 10 μm – size of a fog, mist, or cloud water droplet

Between 10 μm and 100 μm:
- about 10–12 μm – thickness of plastic wrap
- 10 to 55 μm – width of wool fibre
- 17 to 181 μm – diameter of human hair
- 70 to 180 μm – thickness of paper

The term micron and the symbol μ were accepted for use in isolation to denote the micrometre in 1879, but revoked by the International System of Units in 1967; this became necessary because the older usage was incompatible with the official adoption of the unit prefix micro-, denoted μ, during the creation of the SI in 1960.
In the SI, the systematic name micrometre became the official name of the unit, and μm became the official unit symbol. In practice, "micron" remains a widely used term in preference to "micrometre" in many English-speaking countries, both in academic science and in applied science and industry. Additionally, in American English, the use of "micron" helps differentiate the unit from the micrometer, a measuring device, because the unit's name in mainstream American spelling is a homograph of the device's name. In spoken English, they may be distinguished by pronunciation, as the name of the measuring device is invariably stressed on the second syllable, whereas the systematic pronunciation of the unit name, in accordance with the convention for pronouncing SI units in English, places the stress on the first syllable. The plural of micron is "microns", though "micra" was used before 1950. The official symbol for the SI prefix micro- is a Greek lowercase mu (μ). In Unicode, there is a micro sign with the code point U+00B5, distinct from the code point U+03BC of the Greek letter lowercase mu.
According to the Unicode Consortium, the Greek letter character is preferred, but implementations must recognize the micro sign as well. Most fonts use the same glyph for the two characters.
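The two code points can be inspected programmatically. The NFKC behaviour shown below is standard Unicode (the micro sign has a compatibility decomposition to the Greek letter); the variable names are my own:

```python
import unicodedata

micro_sign = "\u00b5"   # MICRO SIGN
greek_mu = "\u03bc"     # GREEK SMALL LETTER MU

# Two distinct code points, though most fonts render them identically.
print(unicodedata.name(micro_sign))  # MICRO SIGN
print(unicodedata.name(greek_mu))    # GREEK SMALL LETTER MU

# NFKC compatibility normalization folds the micro sign into the Greek letter,
# which is why robust text processing normalizes strings before comparing them.
assert micro_sign != greek_mu
assert unicodedata.normalize("NFKC", micro_sign) == greek_mu
```

This is the practical consequence of the Consortium's preference: software that normalizes with NFKC treats "µm" typed with either character as the same unit symbol.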
The metre or meter is the base unit of length in the International System of Units. The SI unit symbol is m. The metre is defined as the length of the path travelled by light in vacuum in 1/299 792 458 of a second. The metre was originally defined in 1793 as one ten-millionth of the distance from the equator to the North Pole; as a result, the Earth's polar circumference is approximately 40,000 km today. In 1799, it was redefined in terms of a prototype metre bar. In 1960, the metre was redefined in terms of a certain number of wavelengths of a certain emission line of krypton-86. In 1983, the current definition was adopted. The imperial inch is defined as 0.0254 metres. One metre is about 3 3⁄8 inches longer than a yard, i.e. about 39 3⁄8 inches. Metre is the standard spelling of the metric unit for length in nearly all English-speaking nations except the United States and the Philippines, which use meter. Other Germanic languages, such as German and the Scandinavian languages, spell the word meter. Measuring devices are spelled "-meter" in all variants of English.
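Because the 1983 definition fixes the speed of light exactly, the metre and its inch equivalent follow by plain arithmetic. A minimal sketch using only the exact defined constants from the text:

```python
# The metre from the defined speed of light: light travels exactly
# 299,792,458 m in one second, so one metre is the distance light
# covers in 1/299,792,458 of a second.
C = 299_792_458  # speed of light in vacuum, m/s (exact by definition)

t = 1 / C  # time for light to cross one metre, seconds
print(f"light crosses 1 m in {t:.3e} s")

# The inch is defined as exactly 25.4 mm, so one metre is:
metre_in_inches = 1000 / 25.4
print(f"1 m = {metre_in_inches:.4f} in")  # about 39 3/8 inches, as stated
```

The computed value, roughly 39.37 inches, matches the article's "about 39 3⁄8 inches" (39.375).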
The suffix "-meter" has the same Greek origin as the unit of length. The etymological roots of metre can be traced to the Greek verb μετρέω and noun μέτρον, which were used for physical measurement, for poetic metre, and by extension for moderation or avoiding extremism; this range of uses is found in Latin, French and other languages. The motto ΜΕΤΡΩ ΧΡΩ in the seal of the International Bureau of Weights and Measures, a saying of the Greek statesman and philosopher Pittacus of Mytilene that may be translated as "Use measure!", thus calls for both measurement and moderation. In 1668 the English cleric and philosopher John Wilkins proposed in an essay a decimal-based unit of length, the universal measure or standard, based on a pendulum with a two-second period; the use of the seconds pendulum to define length had been suggested to the Royal Society in 1660 by Christopher Wren. Christiaan Huygens had observed that length to be 39.26 English inches. No official action was taken regarding these suggestions.
In 1670 Gabriel Mouton, Bishop of Lyon, suggested a universal length standard with decimal multiples and divisions, to be based on a one-minute angle of the Earth's meridian arc or on a pendulum with a two-second period. In 1675, the Italian scientist Tito Livio Burattini, in his work Misura Universale, used the phrase metro cattolico, derived from the Greek μέτρον καθολικόν, to denote the standard unit of length derived from a pendulum. As a result of the French Revolution, the French Academy of Sciences charged a commission with determining a single scale for all measures. On 7 October 1790 that commission advised the adoption of a decimal system, and on 19 March 1791 advised the adoption of the term mètre for a basic unit of length, which they defined as equal to one ten-millionth of the distance between the North Pole and the Equator. In 1793, the French National Convention adopted the proposal. In 1791, the French Academy of Sciences had selected the meridional definition over the pendular definition because the force of gravity varies over the surface of the Earth, which affects the period of a pendulum.
To establish a universally accepted foundation for the definition of the metre, more accurate measurements of this meridian were needed. The French Academy of Sciences commissioned an expedition led by Jean Baptiste Joseph Delambre and Pierre Méchain, lasting from 1792 to 1799, which attempted to measure the distance between a belfry in Dunkerque and Montjuïc castle in Barcelona to estimate the length of the meridian arc through Dunkerque; this portion of the meridian, assumed to be the same length as the Paris meridian, was to serve as the basis for the length of the half meridian connecting the North Pole with the Equator. The problem with this approach is that the exact shape of the Earth is not a simple mathematical shape, such as a sphere or oblate spheroid, at the level of precision required for defining a standard of length; the irregular and particular shape of the Earth smoothed to sea level is represented by a mathematical model called a geoid, which means "Earth-shaped". Despite these issues, in 1793 France adopted this definition of the metre as its official unit of length based on provisional results from this expedition.
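The arithmetic behind the provisional definition is easy to check against a modern meridian value. In the sketch below, the quarter-meridian length is an assumed modern estimate (about 10,001,966 m, a WGS84-style figure), not a number from the text:

```python
# The 1793 metre was defined as one ten-millionth of the quarter meridian
# (pole to equator).  Checking that definition against a modern estimate of
# the quarter meridian shows how far off a 1-metre prototype bar would be.
QUARTER_MERIDIAN_M = 10_001_966.0  # assumed modern estimate, metres

ideal_metre = QUARTER_MERIDIAN_M / 10_000_000  # the definitional metre
prototype_metre = 1.0                          # the bar as actually made

shortfall = ideal_metre - prototype_metre
print(f"ideal metre: {ideal_metre:.7f} m")
print(f"shortfall:   {shortfall * 1e6:.0f} micrometres")  # roughly 200 um
```

Under this assumed meridian length, the shortfall comes out near 0.2 mm, consistent with the roughly 200 micrometre error attributed to the miscalculated flattening of the Earth.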
However, it was later determined that the first prototype metre bar was short by about 200 micrometres because of miscalculation of the flattening of the Earth, making the prototype about 0.02% shorter than the original proposed definition of the metre. Regardless, this length became the French standard and was progressively adopted by other countries in Europe. The expedition was fictionalised in Le mètre du Monde. Ken Alder wrote factually about the expedition in The Measure of All Things: The Seven-Year Odyssey and Hidden Error That Transformed the World. In 1867, at the second general conference of the International Association of Geodesy held in Berlin, the question of an international standard unit of length was discussed in order to combine the measurements made in different countries to determine the size and shape of the Earth; the conference recommended the adoption of the metre and the creation of an internatio
Paper is a thin material produced by pressing together moist fibres of cellulose pulp derived from wood, rags or grasses, and drying them into flexible sheets. It is a versatile material with many uses, including writing, packaging, decorating, and a number of industrial and construction processes. Papers are essential in legal and non-legal documentation. The pulp papermaking process is said to have been developed in China during the early 2nd century CE, possibly as early as the year 105 CE, by the Han court eunuch Cai Lun, although the earliest archaeological fragments of paper derive from the 2nd century BCE in China. The modern pulp and paper industry is global, with China leading its production and the United States following close behind. The oldest known archaeological fragments of the immediate precursor to modern paper date to the 2nd century BCE in China. The pulp paper-making process is ascribed to a 2nd-century CE Han court eunuch. In the 13th century, the knowledge and uses of paper spread from China through the Middle East to medieval Europe, where the first water-powered paper mills were built.
Because paper was introduced to the West through the city of Baghdad, it was first called bagdatikos. In the 19th century, industrialization greatly reduced the cost of manufacturing paper. In 1844, the Canadian inventor Charles Fenerty and the German F. G. Keller independently developed processes for pulping wood fibres. Before the industrialisation of paper production, the most common fibre source was recycled fibres from used textiles, called rags; the rags were from hemp and cotton. A process for removing printing inks from recycled paper was invented by the German jurist Justus Claproth in 1774. Today this method is called deinking. It was not until the introduction of wood pulp in 1843 that paper production ceased to depend on recycled materials from ragpickers. The word "paper" is etymologically derived from Latin papyrus, which comes from the Greek πάπυρος, the word for the Cyperus papyrus plant. Papyrus is a thick, paper-like material produced from the pith of the Cyperus papyrus plant, which was used in ancient Egypt and other Mediterranean cultures for writing before the introduction of paper into the Middle East and Europe.
Although the word paper is etymologically derived from papyrus, the two are produced differently and the development of the first is distinct from the development of the second. Papyrus is a lamination of natural plant fibres, while paper is manufactured from fibres whose properties have been changed by maceration. To make pulp from wood, a chemical pulping process separates lignin from cellulose fibres; this is accomplished by dissolving lignin in a cooking liquor so that it may be washed from the cellulose. Paper made from chemical pulps is known as wood-free paper (not to be confused with tree-free paper). The pulp can be bleached to produce white paper, but this consumes 5% of the fibres. There are three main chemical pulping processes: the sulfite process dates back to the 1840s and was the dominant method before the Second World War; the kraft process, invented in the 1870s and first used in the 1890s, is now the most widely practiced strategy; one of its advantages is that the chemical reaction with lignin produces heat, which can be used to run a generator.
Most pulping operations using the kraft process are net contributors to the electricity grid or use the electricity to run an adjacent paper mill. Another advantage is that this process reuses all inorganic chemical reagents. Soda pulping is another specialty process, used to pulp straws and hardwoods with high silicate content. There are two major mechanical pulps: thermomechanical pulp (TMP) and groundwood pulp. In the TMP process, wood is chipped and fed into steam-heated refiners, where the chips are squeezed and converted to fibres between two steel discs. In the groundwood process, debarked logs are fed into grinders where they are pressed against rotating stones to be made into fibres. Mechanical pulping does not remove the lignin, so the yield is high, >95%; however, it causes the paper thus produced to turn yellow and become brittle over time. Mechanical pulps have rather short fibres. Although large amounts of electrical energy are required to produce mechanical pulp, it costs less than the chemical kind. Paper recycling processes can use mechanically produced pulp.
Most recycled paper contains a proportion of virgin fibre for the sake of quality. There are three main classifications of recycled fibre:

- Mill broke or internal mill waste – This incorporates any substandard or grade-change paper made within the paper mill itself, which goes back into the manufacturing system to be re-pulped back into paper. Such out-of-specification paper is not sold and is therefore not classified as genuine reclaimed recycled fibre; however, most paper mills have been reusing their own waste fibre for many years, long before recycling became popular.
- Preconsumer waste – This is offcut and processing waste, such as guillotine trims and envelope blank waste.
The hectometre or hectometer is an uncommonly used unit of length in the metric system, equal to one hundred metres. The word comes from a combination of "metre" and the SI prefix "hecto-", meaning "hundred". A soccer field is about 1 hectometre in length. The hectare, a common metric unit for land area, is equal to one square hectometre.
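The hectare relation is a one-step calculation: a square with 1 hm sides covers 100 m × 100 m. A tiny illustrative check (variable names are my own):

```python
# One hectometre is 100 metres, so one square hectometre (the hectare)
# is 100 m x 100 m.
HM_IN_M = 100

hectare_m2 = HM_IN_M * HM_IN_M
print(hectare_m2)  # 10000 square metres in one hectare
```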
American and British English spelling differences
Many of the differences between American and British English date back to a time when spelling standards had not yet developed. For instance, some spellings seen as "American" today were once used in Britain and some spellings seen as "British" were once used in the United States. A "British standard" began to emerge following the 1755 publication of Samuel Johnson's A Dictionary of the English Language, and an "American standard" started following the work of Noah Webster and, in particular, his An American Dictionary of the English Language, first published in 1828. Webster's efforts at spelling reform were somewhat effective in his native country, resulting in certain well-known patterns of spelling differences between the American and British varieties of English. However, English-language spelling reform has rarely been adopted otherwise, so modern English orthography varies somewhat between countries and is far from phonemic in any country. In the early 18th century, English spelling was inconsistent.
These differences became noticeable after the publishing of influential dictionaries. Today's British English spellings mostly follow Johnson's A Dictionary of the English Language, while many American English spellings follow Webster's An American Dictionary of the English Language. Webster was a proponent of English spelling reform for reasons both philological and nationalistic. In A Companion to the American Revolution, John Algeo notes: "it is often assumed that characteristically American spellings were invented by Noah Webster. He was influential in popularizing certain spellings in America, but he did not originate them. Rather he chose existing options such as center and check on grounds of simplicity, analogy or etymology". William Shakespeare's first folios, for example, used spellings like center and color as much as centre and colour. Webster did attempt to introduce some reformed spellings, as did the Simplified Spelling Board in the early 20th century, but most were not adopted. In Britain, the influence of those who preferred the Norman spellings of words proved to be decisive.
Spelling adjustments in the United Kingdom had little effect on today's American spellings, and vice versa. For the most part, the spelling systems of most Commonwealth countries and Ireland resemble the British system. In Canada, the spelling system can be said to follow both British and American forms, and Canadians are somewhat more tolerant of foreign spellings when compared with other English-speaking nationalities. Australian spelling has strayed slightly from British spelling, with some American spellings incorporated as standard. New Zealand spelling is almost identical to British spelling, except in the word fiord. There is an increasing use of macrons in words that originated in Māori and an unambiguous preference for -ise endings. Most words ending in an unstressed -our in British English end in -or in American English. Wherever the vowel is unreduced in pronunciation (e.g. contour, velour and troubadour), the spelling is consistent everywhere. Most words of this kind came from Latin, where the ending was spelled -or; they were first adopted into English from early Old French, and the ending was spelled -or or -ur.
After the Norman conquest of England, the ending became -our to match the Old French spelling. The -our ending was not only used in new English borrowings, but was also applied to the earlier borrowings that had used -or. However, -or was still sometimes found, and the first three folios of Shakespeare's plays used both spellings before they were standardised to -our in the Fourth Folio of 1685. After the Renaissance, new borrowings from Latin were taken up with their original -or ending, and many words once ending in -our went back to -or. Many words of the -our/-or group do not have a Latin counterpart; some 16th- and early 17th-century British scholars indeed insisted that -or be used for words from Latin and -our for French loans. Webster's 1828 dictionary had only -or and is given much of the credit for the adoption of this form in the United States. By contrast, Johnson's 1755 dictionary used -our for all words still so spelled in Britain, but also for words where the u has since been dropped: ambassadour, governour, inferiour, superiour.
Johnson, unlike Webster, was not an advocate of spelling reform, but chose the spelling best derived, as he saw it, from among the variations in his sources. He preferred French over Latin spellings because, as he put it, "the French generally supplied us". English speakers who moved to America took these preferences with them. H. L. Mencken notes that "honor appears in the 1776 Declaration of Independence, but it seems to have got there rather by accident than by design"; in Jefferson's original draft it is spelled "honour". In Britain, examples of color, behavior and neighbor appear in Old Bailey court records from the 17th and 18th centuries, whereas there are thousands of examples of their -our counterparts. One notable exception is honor. Honor and honour were frequent in Br