Scanning electron microscope
A scanning electron microscope (SEM) is a type of electron microscope that produces images of a sample by scanning the surface with a focused beam of electrons. The electrons interact with atoms in the sample, producing various signals that contain information about the surface topography and composition of the sample. The electron beam is scanned in a raster pattern, and the position of the beam is combined with the intensity of the detected signal to produce an image. In the most common SEM mode, secondary electrons emitted by atoms excited by the electron beam are detected using an Everhart-Thornley detector; the number of secondary electrons that can be detected, and thus the signal intensity, depends, among other things, on specimen topography. SEM can achieve resolution better than 1 nanometer. Specimens are observed in high vacuum in conventional SEM, in low vacuum or wet conditions in variable-pressure or environmental SEM, and at a wide range of cryogenic or elevated temperatures with specialized instruments.
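The raster-scan image formation described above — beam position plus detected intensity — can be sketched as a toy simulation. The slope-based detector below is purely illustrative (not a physical emission model), and all names are invented for the sketch:

```python
import numpy as np

def raster_scan(sample, detector):
    """Toy model of SEM image formation: visit each (x, y) beam position in
    raster order and record the detected signal intensity at that position."""
    h, w = sample.shape
    image = np.zeros((h, w))
    for y in range(h):          # slow scan axis
        for x in range(w):      # fast scan axis
            image[y, x] = detector(x, y)
    return image

def make_se_detector(surface):
    """Hypothetical secondary-electron detector: yield rises with local
    surface slope, which is why SEM images emphasize topography."""
    gy, gx = np.gradient(surface)
    return lambda x, y: float(np.hypot(gx[y, x], gy[y, x]))

rng = np.random.default_rng(0)
surface = rng.random((64, 64))   # stand-in for specimen topography
img = raster_scan(surface, make_se_detector(surface))
print(img.shape)
```

The point of the sketch is only that the image is built pixel by pixel from the scan position and the detector signal, exactly as the text describes.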
An account of the early history of SEM has been presented by McMullan. Although Max Knoll produced a photo with a 50 mm object-field-width showing channeling contrast using an electron beam scanner, it was Manfred von Ardenne who in 1937 invented a true microscope with high magnification by scanning a small raster with a demagnified and finely focused electron beam. Ardenne applied the scanning principle not only to achieve magnification but to purposefully eliminate the chromatic aberration otherwise inherent in the electron microscope. He further discussed the various detection modes and theory of SEM, together with the construction of the first high-magnification SEM. Further work was reported by Zworykin's group, followed by the Cambridge groups in the 1950s and early 1960s headed by Charles Oatley, all of which led to the marketing of the first commercial instrument by Cambridge Scientific Instrument Company as the "Stereoscan" in 1965, delivered to DuPont. The signals used by a scanning electron microscope to produce an image result from interactions of the electron beam with atoms at various depths within the sample.
Various types of signals are produced, including secondary electrons, reflected or back-scattered electrons (BSE), characteristic X-rays and light, absorbed current and transmitted electrons. Secondary electron detectors are standard equipment in all SEMs, but it is rare for a single machine to have detectors for all other possible signals. In secondary electron imaging (SEI), the secondary electrons are emitted from close to the specimen surface. SEI can produce high-resolution images of a sample surface, revealing details less than 1 nm in size. Back-scattered electrons are beam electrons that are reflected from the sample by elastic scattering; they emerge from deeper locations within the specimen, so the resolution of BSE images is lower than that of SE images. However, BSE are used in analytical SEM, along with the spectra made from the characteristic X-rays, because the intensity of the BSE signal is related to the atomic number of the specimen. BSE images can provide information about the distribution, but not the identity, of different elements in the sample.
In samples predominantly composed of light elements, such as biological specimens, BSE imaging can image colloidal gold immuno-labels of 5 or 10 nm diameter, which would otherwise be difficult or impossible to detect in secondary electron images. Characteristic X-rays are emitted when the electron beam removes an inner shell electron from the sample, causing a higher-energy electron to fill the shell and release energy; the energy or wavelength of these characteristic X-rays can be measured by Energy-dispersive X-ray spectroscopy or Wavelength-dispersive X-ray spectroscopy and used to identify and measure the abundance of elements in the sample and map their distribution. Due to the narrow electron beam, SEM micrographs have a large depth of field yielding a characteristic three-dimensional appearance useful for understanding the surface structure of a sample; this is exemplified by the micrograph of pollen shown above. A wide range of magnifications is possible, from about 10 times to more than 500,000 times, about 250 times the magnification limit of the best light microscopes.
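The element identification that EDS performs — matching measured characteristic X-ray energies against known emission lines — can be sketched as a toy lookup. The Kα energies below are approximate reference values, and the function name and tolerance are invented for illustration; a real EDS system uses a calibrated spectral database:

```python
# Approximate K-alpha line energies in keV (for the sketch only).
K_ALPHA_KEV = {"C": 0.277, "O": 0.525, "Al": 1.487, "Si": 1.740,
               "Fe": 6.404, "Cu": 8.048}

def identify_peaks(peak_energies, tolerance=0.05):
    """Match each measured X-ray peak energy (keV) to candidate elements
    whose reference K-alpha line lies within the given tolerance."""
    hits = {}
    for e in peak_energies:
        for elem, ref in K_ALPHA_KEV.items():
            if abs(e - ref) <= tolerance:
                hits.setdefault(e, []).append(elem)
    return hits

print(identify_peaks([1.74, 6.40]))  # {1.74: ['Si'], 6.4: ['Fe']}
```

Mapping these identifications back to beam positions is what produces the elemental distribution maps mentioned above.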
SEM samples have to be small enough to fit on the specimen stage, and may need special preparation to increase their electrical conductivity and to stabilize them so that they can withstand the high-vacuum conditions and the high-energy beam of electrons. Samples are mounted rigidly on a specimen holder or stub using a conductive adhesive. SEM is used extensively for defect analysis of semiconductor wafers; manufacturers make instruments that can examine any part of a 300 mm semiconductor wafer. Many instruments have chambers that can tilt an object of that size to 45° and provide continuous 360° rotation. Nonconductive specimens collect charge when scanned by the electron beam, which, in secondary electron imaging mode, causes scanning faults and other image artifacts. For conventional imaging in the SEM, specimens must be electrically conductive, at least at the surface, and electrically grounded to prevent the accumulation of electrostatic charge. Metal objects require little special preparation for SEM except for cleaning and conductive mounting to a specimen stub.
Non-conducting materials are coated with an ultrathin coating of electrically conducting material, deposited on the sample either by low-vacuum sputter coating or by high-vacuum evaporation. Conductive materials in current use for specimen coating include gold, gold/palladium alloy, platinum, i
A micrograph or photomicrograph is a photograph or digital image taken through a microscope or similar device to show a magnified image of an object. This is opposed to a macrograph or photomacrograph, an image that is also taken on a microscope but is magnified less than 10 times. Micrography is the art of using microscopes to make photographs. A micrograph contains extensive details of microstructure. A wealth of information can be obtained from a simple micrograph, such as the behavior of the material under different conditions, the phases found in the system, failure analysis, grain size estimation and elemental analysis. Micrographs are used in all fields of microscopy. A light micrograph or photomicrograph is a micrograph prepared using an optical microscope, a process referred to as photomicroscopy. At a basic level, photomicroscopy may be performed by connecting a camera to a microscope, thereby enabling the user to take photographs at reasonably high magnification. Scientific use began in England in 1850 with Prof Richard Hill Norris FRSE and his studies of blood cells.
Roman Vishniac was a pioneer in the field of photomicroscopy, specializing in the photography of living creatures in full motion. He made major developments in light-interruption photography and color photomicroscopy. Photomicrographs may be obtained using a USB microscope attached directly to a home computer or laptop. An electron micrograph is a micrograph prepared using an electron microscope. Micrographs have micron bars, or magnification ratios, or both. Magnification is the ratio between the size of an object in an image and its real size. Magnification can be a misleading parameter as it depends on the final size of a printed picture and therefore varies with picture size. A scale bar, or micron bar, is a line of known length displayed on a picture; the bar can be used for measurements on a picture. When the picture is resized the bar is resized with it, making it possible to recalculate the magnification. Ideally, all pictures destined for publication or presentation should be supplied with a scale bar. All but one of the micrographs presented on this page do not have a micron bar.
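The magnification arithmetic above can be made concrete with a small sketch; the function name and example numbers are invented for illustration:

```python
def magnification(displayed_size_mm, real_size_um):
    """Magnification = displayed size / real size (converted to same units).
    1 mm = 1000 um, so the displayed size is scaled before dividing."""
    return (displayed_size_mm * 1000.0) / real_size_um

# A 10 um scale bar printed at 20 mm implies 2000x magnification.
print(magnification(20.0, 10.0))  # 2000.0

# Resize the print to half size: the stated magnification is now wrong,
# but the scale bar shrinks with the image, so it can be recomputed.
print(magnification(10.0, 10.0))  # 1000.0
```

This is exactly why the scale bar is preferred over a printed magnification ratio: the bar survives resizing, while the ratio silently becomes stale.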
The microscope has been used for scientific discovery. It has also been linked to the arts since its invention in the 17th century. Early adopters of the microscope, such as Robert Hooke and Antonie van Leeuwenhoek, were excellent illustrators. After the invention of photography in the 1820s the microscope was combined with the camera to take pictures instead of relying on an artistic rendering. Since the early 1970s individuals have been using the microscope as an artistic instrument. Websites and traveling art exhibits such as the Nikon Small World and Olympus Bioscapes have featured a range of images for the sole purpose of artistic enjoyment; some collaborative groups, such as the Paper Project, have incorporated microscopic imagery into tactile art pieces as well as 3D immersive rooms and dance performances.

See also: Close-up, Digital microscope, Macro photography, Microphotograph, Microscopy, USB microscope.

External links: Make a Micrograph, a presentation by the research department of Children's Hospital Boston showing how researchers create a three-color micrograph; Shots with a Microscope, a basic, comprehensive guide to photomicrography; free scientific-quality photomicrographs by Doc. RNDr. Josef Reischig, CSc.; Micrographs of 18 natural fibres by the International Year of Natural Fibres 2009; Seeing Beyond the Human Eye, a video produced by Off Book; Solomon C. Fuller bio; Charles Krebs Microscopic Images; Dennis Kunkel Microscopy; Andrew Paul Leonard, APL Microscopic; Cell Centered Database – Montage; Nikon Small World; Olympus Bioscapes; other examples.
New Scientist, first published on 22 November 1956, is a weekly, English-language magazine that covers all aspects of science and technology. New Scientist, based in London, publishes editions in the UK, the United States and Australia. Since 1996 it has been available online. Sold in retail outlets and on subscription, the magazine covers news, features and commentary on developments in science and technology and their implications. New Scientist also publishes speculative articles, ranging from the technical to the philosophical. The magazine was founded in 1956 by Tom Margerison, Max Raison and Nicholas Harrison as The New Scientist, with Issue 1 on 22 November, priced one shilling. The British monthly science magazine Science Journal, published 1965–71, was merged with New Scientist to form New Scientist and Science Journal. The cover of New Scientist listed articles in plain text. Page numbering followed academic practice with sequential numbering for each quarterly volume. So, for example, the first page of an issue in March could be 649 instead of 1.
Later, issues were numbered separately. From the beginning of 1961 "The" was dropped from the title. From 1965, the front cover was illustrated; until the 1970s, colour was not used except on the cover. Since its first issue, New Scientist has written about the applications of science through its coverage of technology. For example, the first issue included an article "Where next from Calder Hall?" on the future of nuclear power in the UK, a topic that it has covered throughout its history. In 1964 there was a regular "Science in British Industry" section with several items. An article in the magazine's 10th anniversary issue provides anecdotes on the founding of the magazine. In 1970, the Reed Group, which went on to become Reed Elsevier, acquired New Scientist when it merged with IPC Magazines. Reed retained the magazine when it sold most of its consumer titles in a management buyout to what is now TI Media. Throughout most of its history, New Scientist has published cartoons as light relief and comment on the news, with contributions from regulars such as Mike Peyton and David Austin.
The Grimbledon Down comic strip, by cartoonist Bill Tidy, appeared from 1970 to 1994. The Ariadne pages in New Scientist commented on the lighter side of science and technology and included contributions from Daedalus, a fictitious inventor who devised plausible but impractical and humorous inventions developed by the DREADCO corporation. Daedalus later moved to Nature. Issues of New Scientist from Issue 1 to the end of 1989 have been made free to read online; subsequent issues require a subscription. In the first half of 2013, the international circulation of New Scientist averaged 125,172. While this was a 4.3% reduction on the previous year's figure, it was a much smaller reduction in circulation than many mainstream magazines of similar or greater circulation suffered. For 2014, UK circulation fell by 3.2%, but stronger international sales increased the total circulation to 129,585. In April 2017, New Scientist changed hands when RELX Group, formerly known as Reed Elsevier, sold the magazine to Kingston Acquisitions, a group set up by Sir Bernard Gray, Louise Rogers and Matthew O'Sullivan to acquire New Scientist.
Kingston Acquisitions renamed itself New Scientist Ltd. New Scientist contains the following sections: Leader, Technology, Features, CultureLab, The Last Word and Jobs & Careers. A Tom Gauld cartoon appears on the Letters page. A readers' letters section discusses recent articles, and discussions take place on the website. Readers contribute observations on examples of pseudoscience to Feedback, and offer questions and answers on scientific and technical topics to The Last Word. New Scientist has produced a series of books compiled from contributions to The Last Word. There are 51 issues a year, with a New Year double issue; the double issue in 2014 was the 3,000th edition of the magazine. The Editor-in-chief is Emily Wilson, the Executive Editor is Graham Lawton, the Managing Editor is Rowan Hooper and the Editor-at-Large is Jeremy Webb. Consultants include Fred Pearce, Marcus Chown and Linda Geddes. Simon Ings and former editor Alun Anderson are contributors. Past editors include Percy Cudlipp, Nigel Calder, Donald Gould, Bernard Dixon, Michael Kenward, David Dickson, Alun Anderson, Jeremy Webb, Roger Highfield, Sumit Paul-Choudhury and Emily Wilson. The New Scientist website carries blogs and news articles.
Users with free-of-charge registration have limited access to new content and can receive emailed New Scientist newsletters. Subscribers to the print edition have full access to all articles and the archive of past content that has so far been digitised. Online readership takes various forms: overall global views of an online database of over 100,000 articles stood at 8.0m by 3.6m unique users, according to Adobe Reports & Analytics, as of September 2014. On social media there were 1.47m+ Twitter followers, 2.3m+ Facebook likes and 365,000+ Google+ followers as of January 2015. New Scientist has published books derived from its content, many of which are selected questions and answers from the Last Word section of the magazine and website: The Last Word (1998), ISBN 978-0-19-286199-3; The Last Word 2 (2000), ISBN 978-0-19-286204-4; Does Anything Eat Wasps? (2005), ISBN 978-1-86197-973-5; Why Don't Penguins' Feet Freeze? (2006), ISBN 978-1861978769; 2007. How to
A detonator, also called a blasting cap, is a device used to trigger an explosive device. Detonators can be chemically, mechanically, or electrically initiated, the latter two being the most common. The commercial use of explosives relies on electrical detonators or the capped fuse, a length of safety fuse to which an ordinary detonator has been crimped. Many detonators' primary explosive is a material called ASA compound; this compound is formed from lead azide, lead styphnate and aluminium and is pressed into place above the base charge, which is TNT or tetryl in military detonators and PETN in commercial detonators. Other materials such as DDNP are used as the primary charge to reduce the amount of lead emitted into the atmosphere by mining and quarrying operations. Old detonators used mercury fulminate as the primary explosive, often mixed with potassium chlorate to yield better performance. A blasting cap is a small, sensitive primary explosive device used to detonate a larger, more powerful and less sensitive secondary explosive such as TNT, dynamite, or plastic explosive.
Blasting caps come in a variety of types, including non-electric caps, electric caps and fuse caps. They are used in commercial mining and demolition. Electric types are set off by a short burst of current conducted from a blasting machine by a long wire to the cap, to ensure safety. Traditional fuse caps have a fuse that is ignited by a flame source, such as a match or a lighter. The need for detonators such as blasting caps came from the development of safer explosives. Different explosives require different amounts of energy to detonate. Most commercial explosives are formulated with a high activation energy to make them stable and safe to handle, so they will not explode if accidentally dropped, mishandled, or exposed to fire; these are called secondary explosives. However, they are correspondingly difficult to detonate intentionally and require a small initiating explosion, which is provided by a detonator. A detonator contains an easy-to-ignite primary explosive that provides the initial activation energy to start the detonation in the main charge.
Explosives used in detonators include mercury fulminate, lead azide, lead styphnate and DDNP. Blasting caps and some detonators are stored separately and not inserted into the main explosive charge until just before use, keeping the main charge safe. Early blasting caps used silver fulminate, but it has been replaced with cheaper and safer primary explosives. Silver azide is still used sometimes, but rarely, due to its high price. Detonators are hazardous for untrained personnel to handle; they are sometimes not recognized as explosives due to their appearance, leading to injuries. Ordinary detonators take the form of ignition-based explosives. While less common in commercial operations, ordinary detonators are still used in military operations; this form of detonator is most often initiated using a safety fuse and is used in non-time-critical detonations, e.g. conventional munitions disposal. Well-known primary explosives used in such detonators are lead azide (Pb(N3)2), silver azide and mercury fulminate. There are three categories of electrical detonators: instantaneous electrical detonators, short-period delay detonators (SPDs) and long-period delay detonators (LPDs).
SPD delays are measured in milliseconds and LPD delays are measured in seconds. In situations where nanosecond accuracy is required, notably in the implosion charges of nuclear weapons, exploding-bridgewire detonators are employed; the initial shock wave is created by vaporizing a length of thin wire with an electric discharge. A newer development is the slapper detonator, which uses thin plates accelerated by an electrically exploded wire or foil to deliver the initial shock; it is in use in some modern weapon systems. A variant of this concept is used in mining operations, where the foil is exploded by a laser pulse delivered to the foil by optical fiber. A non-electric detonator is a shock tube detonator designed to initiate explosions for the purpose of demolition of buildings and for use in the blasting of rock in mines and quarries. Instead of electric wires, a hollow plastic tube delivers the firing impulse to the detonator, making it immune to most of the hazards associated with stray electric current. It consists of a small-diameter, three-layer plastic tube coated on the innermost wall with a reactive explosive compound that, when ignited, propagates a low-energy signal, similar to a dust explosion.
The reaction travels at about 6,500 ft/s (roughly 2,000 m/s) along the length of the tubing with minimal disturbance outside of the tube. The design of non-electric detonators incorporates patented technology, including the Cushion Disk and Delay Ignition Buffer, to provide reliability and accuracy in all blasting applications. Non-electric detonators were invented by the Swedish company Nitro Nobel in the 1960s and 1970s under the leadership of Per-Anders Persson and launched to the demolitions market in 1973. Nonel is a contraction of "non-electric detonator". In civil mining, electronic detonators offer better precision for delays. Electronic detonators are designed to provide the precise control necessary to produce accurate and consistent blasting results in a variety of blasting applications in the mining and construction industries. Electronic detonators may be programmed in 1-millisecond increments from 1 millisecond to 10,000 milliseconds using a dedicated programming device called the logger. Benefits include 100% verification of the reliability of connections in the initiation network.
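The 1-millisecond programmability described above amounts to simple validation arithmetic. A minimal sketch, with all names invented for illustration (a real logger also verifies the initiation-network connections, which this sketch does not model):

```python
MIN_DELAY_MS, MAX_DELAY_MS = 1, 10_000

def program_delays(requested_ms):
    """Round each requested delay to the 1 ms resolution and reject
    values outside the programmable 1-10,000 ms range."""
    programmed = []
    for d in requested_ms:
        d = int(round(d))                      # snap to 1 ms increments
        if not MIN_DELAY_MS <= d <= MAX_DELAY_MS:
            raise ValueError(f"delay {d} ms outside 1-10,000 ms range")
        programmed.append(d)
    return programmed

print(program_delays([25, 50.4, 9999]))  # [25, 50, 9999]
```

The sketch only shows why millisecond-level programmability gives finer timing control than the fixed-interval SPD/LPD series mentioned earlier.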
Delay range of 1–10,000 ms with an incr
ASTM International known as American Society for Testing and Materials, is an international standards organization that develops and publishes voluntary consensus technical standards for a wide range of materials, products and services. Some 12,575 ASTM voluntary consensus standards operate globally; the organization's headquarters is in West Conshohocken, about 5 mi northwest of Philadelphia. Founded in 1898 as the American Section of the International Association for Testing Materials, ASTM International predates other standards organizations such as the BSI, IEC, DIN, ANSI, AFNOR, ISO. A group of scientists and engineers, led by Charles Dudley, formed ASTM in 1898 to address the frequent rail breaks affecting the fast-growing railroad industry; the group developed a standard for the steel used to fabricate rails. Called the "American Society for Testing Materials" in 1902, it became the "American Society for Testing and Materials" in 1961 before it changed its name to “ASTM International” in 2001 and added the tagline "Standards Worldwide".
In 2014, it changed the tagline to "Helping our World Work better". ASTM International now has offices in Belgium, China and Washington, D.C. Membership in the organization is open to anyone with an interest in its activities. Standards are developed within committees, and new committees are formed as needed upon request of interested members. Membership in most committees is voluntary and is initiated by the member's own request, not by appointment nor by invitation. Members are classified as producers, users, consumers or "general interest"; the latter includes consultants. Users include industry users, who may be producers in the context of other technical committees, and end-users such as consumers. In order to meet the requirements of antitrust laws, producers must constitute less than 50% of every committee or subcommittee, and votes are limited to one per producer company. Because of these restrictions, there can be a substantial waiting list of producers seeking organizational memberships on the more popular committees.
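The producer-balance rule — fewer than 50% producer votes, with votes limited to one per producer company — can be sketched as a small check. The data layout and function names here are invented for illustration:

```python
def producer_votes(members):
    """members: list of (name, classification, company) tuples.
    Producer votes are limited to one per producer company."""
    return len({co for _, cls, co in members if cls == "producer"})

def committee_balanced(members):
    """Sketch of the antitrust rule: producers must hold fewer
    than 50% of the committee's votes."""
    p = producer_votes(members)
    others = sum(1 for _, cls, _ in members if cls != "producer")
    return p < 0.5 * (p + others)

roster = [("A", "producer", "SteelCo"), ("B", "producer", "SteelCo"),
          ("C", "user", "-"), ("D", "consumer", "-")]
print(committee_balanced(roster))  # True: 1 producer vote out of 3
```

Note how the two SteelCo members collapse to a single vote, which is what keeps popular committees' waiting lists long on the producer side.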
Members can still participate without a formal vote, and their input will be considered. As of 2015, ASTM has more than 30,000 members, including over 1,150 organizational members, from more than 140 countries; the members serve on one or more of 140+ ASTM technical committees. ASTM International has several awards for contributions to standards authorship, including the ASTM International Award of Merit. ASTM International is classified by the United States Internal Revenue Service as a 501(c)(3) nonprofit organization. ASTM International has no role in enforcing compliance with its standards; the standards may, however, become mandatory when referenced by an external contract, corporation, or government. In the United States, ASTM standards have been adopted, by incorporation or by reference, in many federal and municipal government regulations. The National Technology Transfer and Advancement Act, passed in 1995, requires the federal government to use voluntary consensus standards whenever possible.
Other governments have also referenced ASTM standards, and corporations doing international business may choose to reference an ASTM standard. All toys sold in the United States must meet the safety requirements of ASTM F963, Standard Consumer Safety Specification for Toy Safety, as part of the Consumer Product Safety Improvement Act of 2008; the law makes the ASTM F963 standard a mandatory requirement for toys while the Consumer Product Safety Commission studies the standard's effectiveness and issues final consumer guidelines for toy safety.

See also: International Organization for Standardization, Materials property, Pt/Co scale, Technical standard. Media related to ASTM at Wikimedia Commons.
Mass spectrometry is an analytical technique that ionizes chemical species and sorts the ions based on their mass-to-charge ratio. In simpler terms, a mass spectrum measures the masses within a sample. Mass spectrometry is used in many different fields and is applied to pure samples as well as complex mixtures. A mass spectrum is a plot of the ion signal as a function of the mass-to-charge ratio; these spectra are used to determine the elemental or isotopic signature of a sample, the masses of particles and of molecules, to elucidate the chemical structures of molecules and other chemical compounds. In a typical MS procedure, a sample, which may be solid, liquid, or gas, is ionized, for example by bombarding it with electrons; this may cause some of the sample's molecules to break into charged fragments. These ions are separated according to their mass-to-charge ratio by accelerating them and subjecting them to an electric or magnetic field: ions of the same mass-to-charge ratio will undergo the same amount of deflection.
The ions are detected by a mechanism capable of detecting charged particles, such as an electron multiplier. Results are displayed as spectra of the relative abundance of detected ions as a function of the mass-to-charge ratio. The atoms or molecules in the sample can be identified by correlating known masses to the identified masses or through a characteristic fragmentation pattern. In 1886, Eugen Goldstein observed rays in gas discharges under low pressure that traveled away from the anode and through channels in a perforated cathode, opposite to the direction of negatively charged cathode rays. Goldstein called these positively charged anode rays "Kanalstrahlen". Wilhelm Wien found that strong electric or magnetic fields deflected the canal rays and, in 1899, constructed a device with perpendicular electric and magnetic fields that separated the positive rays according to their charge-to-mass ratio. English scientist J. J. Thomson later improved on the work of Wien by reducing the pressure to create the mass spectrograph.
The word spectrograph had become part of the international scientific vocabulary by 1884. Early spectrometry devices that measured the mass-to-charge ratio of ions were called mass spectrographs which consisted of instruments that recorded a spectrum of mass values on a photographic plate. A mass spectroscope is similar to a mass spectrograph except that the beam of ions is directed onto a phosphor screen. A mass spectroscope configuration was used in early instruments when it was desired that the effects of adjustments be observed. Once the instrument was properly adjusted, a photographic plate was exposed; the term mass spectroscope continued to be used though the direct illumination of a phosphor screen was replaced by indirect measurements with an oscilloscope. The use of the term mass spectroscopy is now discouraged due to the possibility of confusion with light spectroscopy. Mass spectrometry is abbreviated as mass-spec or as MS. Modern techniques of mass spectrometry were devised by Arthur Jeffrey Dempster and F.
W. Aston in 1918 and 1919, respectively. Sector mass spectrometers known as calutrons were developed by Ernest O. Lawrence and used for separating the isotopes of uranium during the Manhattan Project. Calutron mass spectrometers were used for uranium enrichment at the Oak Ridge, Tennessee Y-12 plant established during World War II. In 1989, half of the Nobel Prize in Physics was awarded to Hans Dehmelt and Wolfgang Paul for the development of the ion trap technique in the 1950s and 1960s. In 2002, the Nobel Prize in Chemistry was awarded to John Bennett Fenn for the development of electrospray ionization and Koichi Tanaka for the development of soft laser desorption, and their application to the ionization of biological macromolecules, especially proteins. A mass spectrometer consists of three components: an ion source, a mass analyzer and a detector. The ionizer converts a portion of the sample into ions. There is a wide variety of ionization techniques, depending on the phase of the sample and the efficiency of various ionization mechanisms for the unknown species.
An extraction system removes ions from the sample, which are then directed through the mass analyzer and into the detector. The differences in the masses of the fragments allow the mass analyzer to sort the ions by their mass-to-charge ratio; the detector measures the value of an indicator quantity and thus provides data for calculating the abundances of each ion present. Some detectors also give spatial information, e.g. a multichannel plate. The following example describes the operation of a spectrometer mass analyzer of the sector type. Consider a sample of sodium chloride. In the ion source, the sample is ionized into sodium and chloride ions. Sodium atoms and ions are monoisotopic, with a mass of about 23 u. Chloride atoms and ions come in two isotopes with masses of about 35 u and 37 u. The analyzer part of the spectrometer contains electric and magnetic fields, which exert forces on ions traveling through these fields. The speed of a charged particle may be increased or decreased while passing through the electric field, and its direction may be altered by the magnetic field.
The magnitude of the deflection of the moving ion's trajectory depends on its mass-to-charge ratio.
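For a magnetic sector, an ion of mass m and charge q accelerated through a potential V follows a circular arc of radius r = sqrt(2mV/q)/B in a field B, so the sodium and chlorine ions of the example above land at different positions. A sketch under those standard assumptions (the field and voltage values are arbitrary illustrative choices):

```python
from math import sqrt

U = 1.660539e-27   # atomic mass unit, kg
E = 1.602177e-19   # elementary charge, C

def sector_radius(mass_u, charge_e, accel_volts, b_tesla):
    """Radius of curvature in a magnetic sector: r = sqrt(2*m*V/q) / B.
    Ions with larger m/z curve less sharply, so isotopes separate."""
    m = mass_u * U
    q = charge_e * E
    return sqrt(2 * m * accel_volts / q) / b_tesla

# Singly charged ions from the NaCl example: 5 kV acceleration, 0.5 T field.
for label, mass in [("Na-23", 23), ("Cl-35", 35), ("Cl-37", 37)]:
    r_cm = sector_radius(mass, 1, 5000, 0.5) * 100
    print(f"{label}: {r_cm:.2f} cm")
```

Because r grows with sqrt(m), Na-23, Cl-35 and Cl-37 follow progressively wider arcs and strike the detector plane at distinct positions, which is the basis of the sector separation described above.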
A weapon, arm or armament is any device that can be used with intent to inflict damage or harm. Weapons are used to increase the efficacy and efficiency of activities such as hunting, law enforcement, self-defense and warfare. In a broader context, weapons may be construed to include anything used to gain a tactical, material or mental advantage over an adversary or enemy target. While ordinary objects such as sticks, cars, or pencils can be used as weapons, many are expressly designed for the purpose, ranging from simple implements such as clubs and axes to complicated modern intercontinental ballistic missiles, biological weapons and cyberweapons. Something repurposed, converted, or enhanced to become a weapon of war is termed weaponized, such as a weaponized virus or weaponized laser. The use of objects as weapons has been observed among chimpanzees, leading to speculation that early hominids used weapons as early as five million years ago. However, this cannot be confirmed using physical evidence because wooden clubs and unshaped stones would have left an ambiguous record.
The earliest unambiguous weapons to be found are the Schöningen spears, eight wooden throwing spears dating back more than 300,000 years. At the site of Nataruk in Turkana, numerous human skeletons dating to 10,000 years ago present possible evidence of traumatic injuries to the head, ribs and hands, including obsidian projectiles embedded in the bones, that might have been caused by arrows and clubs during conflict between two hunter-gatherer groups; however, this interpretation of warfare at Nataruk has been challenged. The earliest ancient weapons were evolutionary improvements of late neolithic implements, but significant improvements in materials and crafting techniques led to a series of revolutions in military technology. The development of metal tools began with copper during the Copper Age and was followed by the Bronze Age, leading to the creation of the Bronze Age sword and similar weapons. During the Bronze Age, the first defensive structures and fortifications appeared as well, indicating an increased need for security.
Weapons designed to breach fortifications followed soon after, such as the battering ram, in use by 2500 BC. The development of iron-working around 1300 BC in Greece had an important impact on the development of ancient weapons. It was not the introduction of early Iron Age swords that mattered most, however, as they were not superior to their bronze predecessors, but rather the domestication of the horse and the widespread use of spoked wheels by c. 2000 BC. This led to the creation of the light, horse-drawn chariot, whose improved mobility proved important during this era. Spoke-wheeled chariot usage peaked around 1300 BC and then declined, ceasing to be militarily relevant by the 4th century BC. Cavalry developed in its place, as the horse increased the speed of attacks. In addition to land-based weaponry, warships such as the trireme were in use by the 7th century BC. European warfare during post-classical history was dominated by elite groups of knights supported by massed infantry; they were involved in mobile combat and in sieges, which involved various siege tactics.
Knights on horseback developed tactics for charging with lances, providing an impact on the enemy formations, and then drawing more practical weapons once they entered the melee. By contrast, infantry, in the age before structured formations, relied on cheap, sturdy weapons such as spears and billhooks in close combat and bows from a distance. As armies became more professional, their equipment was standardized and infantry transitioned to pikes, typically seven to eight feet in length and used in conjunction with smaller side-arms. In Eastern and Middle Eastern warfare, similar tactics were developed independently of European influences. The introduction of gunpowder from Asia at the end of this period revolutionized warfare. Formations of musketeers, protected by pikemen, came to dominate open battles, and the cannon replaced the trebuchet as the dominant siege weapon. The European Renaissance marked the beginning of the implementation of firearms in western warfare. Guns and rockets were introduced to the battlefield.
Firearms are qualitatively different from earlier weapons because they release energy from combustible propellants such as gunpowder, rather than from a counterweight or spring. This energy is released rapidly and can be replicated without much effort by the user, so early firearms such as the arquebus were much more powerful than human-powered weapons. Firearms became increasingly important and effective from the 16th century to the 19th century, with progressive improvements in ignition mechanisms followed by revolutionary changes in ammunition handling and propellant. During the U.S. Civil War, new applications of firearms, including the machine gun and the ironclad warship, emerged that would still be recognizable and useful military weapons today in limited conflicts. In the 19th century, warship propulsion changed from sail power to fossil-fuel-powered steam engines. From the mid-18th-century French and Indian War in North America through the beginning of the 20th century, human-powered weapons were gradually displaced as the primary weaponry of the battlefield, yielding to gunpowder-based weaponry.
Sometimes referred to as the "Age of Rifles", this period was characterized by the development of firearms for infantry and cannons for support, as well as the beginnings of mechanized weapons such as the machine gun. Of particular note, howitzers were able to destroy masonry fortresses and other fortifications; this single invention caused a Revolution in