Paleontology or palaeontology is the scientific study of life that existed prior to, and sometimes including, the start of the Holocene Epoch. It includes the study of fossils to determine organisms' evolution and interactions with each other and their environments. Paleontological observations have been documented as far back as the 5th century BC; the science became established in the 18th century as a result of Georges Cuvier's work on comparative anatomy, and developed rapidly in the 19th century. The term itself originates from Greek παλαιός, palaios, "old, ancient", ὄν, on, "being, creature" and λόγος, logos, "speech, study". Paleontology lies on the border between biology and geology, but differs from archaeology in that it excludes the study of anatomically modern humans. It now uses techniques drawn from a wide range of sciences, including biochemistry and engineering. Use of all these techniques has enabled paleontologists to discover much of the evolutionary history of life, all the way back to when Earth became capable of supporting life, about 3.8 billion years ago.
As knowledge has increased, paleontology has developed specialised sub-divisions, some of which focus on different types of fossil organisms while others study ecology and environmental history, such as ancient climates. Body fossils and trace fossils are the principal types of evidence about ancient life, while geochemical evidence has helped to decipher the evolution of life before there were organisms large enough to leave body fossils. Estimating the dates of these remains is essential but difficult: sometimes adjacent rock layers allow radiometric dating, which provides absolute dates that are accurate to within 0.5%, but more often paleontologists have to rely on relative dating by solving the "jigsaw puzzles" of biostratigraphy. Classifying ancient organisms is also difficult, as many do not fit well into the Linnaean taxonomy used to classify living organisms, so paleontologists more often use cladistics to draw up evolutionary "family trees". The final quarter of the 20th century saw the development of molecular phylogenetics, which investigates how closely organisms are related by measuring the similarity of the DNA in their genomes.
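Radiometric dating rests on the exponential decay of an unstable parent isotope into a stable daughter: measuring the daughter-to-parent ratio gives an absolute age. The following is a minimal sketch, assuming a closed system with no daughter isotope present at formation; the function name is illustrative, and the half-life in the example is approximately that of potassium-40.

```python
import math

def radiometric_age(parent, daughter, half_life):
    """Age of a sample from measured parent/daughter isotope amounts.

    Assumes a closed system with no daughter isotope at formation:
    N_parent(t) = N0 * exp(-lam * t), with lam = ln(2) / half_life,
    so t = ln(1 + daughter/parent) / lam.
    """
    decay_constant = math.log(2) / half_life
    return math.log(1 + daughter / parent) / decay_constant

# When exactly half of the parent has decayed, one half-life has elapsed:
age = radiometric_age(parent=50.0, daughter=50.0, half_life=1.25e9)  # years
```

Real methods (potassium-argon, uranium-lead and others) must additionally correct for initial daughter content and for any open-system behaviour of the sample.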
Molecular phylogenetics has been used to estimate the dates when species diverged, but there is controversy about the reliability of the molecular clock on which such estimates depend. The simplest definition of paleontology is "the study of ancient life"; the field seeks information about several aspects of past organisms: "their identity and origin, their environment and evolution, and what they can tell us about the Earth's organic and inorganic past". Paleontology is one of the historical sciences, along with archaeology, astronomy, cosmology and history itself: it aims to describe phenomena of the past and to reconstruct their causes. When trying to explain the past, paleontologists and other historical scientists construct a set of hypotheses about the causes and then look for a "smoking gun", a piece of evidence that accords with one hypothesis better than the others. Sometimes the smoking gun is discovered by a fortunate accident during other research. For example, the discovery by Luis and Walter Alvarez of iridium, a metal rare in Earth's crust but relatively abundant in meteorites, in the Cretaceous–Tertiary boundary layer made asteroid impact the most favored explanation for the Cretaceous–Paleogene extinction event, although the contribution of volcanism continues to be debated.
The other main type of science is experimental science, often said to work by conducting experiments to disprove hypotheses about the workings and causes of natural phenomena. This approach cannot prove a hypothesis, since some later experiment may disprove it, but the accumulation of failures to disprove is compelling evidence in favor. However, when confronted with unexpected phenomena, such as the first evidence for invisible radiation, experimental scientists use the same approach as historical scientists: construct a set of hypotheses about the causes and look for a "smoking gun". Paleontology lies between biology and geology since it focuses on the record of past life, while its main source of evidence is fossils in rocks. For historical reasons, paleontology is part of the geology department at many universities: in the 19th and early 20th centuries, geology departments found fossil evidence important for dating rocks, while biology departments showed little interest. Paleontology also has some overlap with archaeology, which works with objects made by humans and with human remains; paleontologists, by contrast, are interested in the characteristics and evolution of humans as a species.
When dealing with evidence about humans, archaeologists and paleontologists may work together – for example, paleontologists might identify animal or plant fossils around an archaeological site, to discover what the people who lived there ate. In addition, paleontology borrows techniques from other sciences, including biology, ecology, chemistry and mathematics. For example, geochemical signatures from rocks may help to discover when life first arose on Earth, and analyses of carbon isotope ratios may help to identify climate changes and to explain major transitions such as the Permian–Triassic extinction event. A relatively recent discipline, molecular phylogenetics, compares the DNA and RNA of modern organisms to re-construct the "family trees" of their evolutionary ancestors.
A stone tool is, in the most general sense, any tool made either partially or entirely out of stone. Although stone tool-dependent societies and cultures still exist today, most stone tools are associated with prehistoric cultures that have become extinct. Archaeologists who study such prehistoric societies refer to the study of stone tools as lithic analysis. Ethnoarchaeology has been a valuable research field for furthering the understanding and cultural implications of stone tool use and manufacture. Stone has been used to make a wide variety of different tools throughout history, including arrow heads and querns. Stone tools may be made of either ground stone or chipped stone, and a person who creates tools out of the latter is known as a flintknapper. Chipped stone tools are made from cryptocrystalline materials such as chert or flint, chalcedony, obsidian and quartzite via a process known as lithic reduction. One simple form of reduction is to strike stone flakes from a nucleus of material using a hammerstone or similar hard hammer fabricator.
If the goal of the reduction strategy is to produce flakes, the remnant lithic core may be discarded once it has become too small to use. In some strategies, however, a flintknapper reduces the core to a rough unifacial or bifacial preform, which is further reduced using soft hammer flaking techniques or by pressure flaking the edges. More complex forms of reduction include the production of standardized blades, which can then be fashioned into a variety of tools such as scrapers, knives and microliths. In general terms, chipped stone tools are nearly ubiquitous in all pre-metal-using societies because they are easily manufactured, the tool stone is usually plentiful, and they are easy to transport and sharpen. Archaeologists classify stone tools into industries that share distinctive technological or morphological characteristics. In 1969, in the 2nd edition of World Prehistory, Grahame Clark proposed an evolutionary progression of flint-knapping in which the "dominant lithic technologies" occurred in a fixed sequence from Mode 1 through Mode 5.
He assigned to them relative dates: Modes 1 and 2 to the Lower Palaeolithic, 3 to the Middle Palaeolithic, 4 to the Advanced (Upper) Palaeolithic and 5 to the Mesolithic. They were not to be conceived, however, as universal – that is, they did not account for all lithic technology – nor as strictly synchronous across regions: Mode 1, for example, remained in use in Europe after more advanced modes had appeared elsewhere. Clark's scheme was adopted enthusiastically by the archaeological community. One of its advantages was the simplicity of terminology, and the transitions between modes are of particular interest. In the literature the stone tools used in the period of the Palaeolithic are divided into four "modes", each of which designates a different form of complexity and which in most cases followed a rough chronological order. In Kenya, stone tools found from 2011 to 2014 at Lake Turkana, dated to 3.3 million years ago, predate the genus Homo by about half a million years. The oldest known Homo fossil is 2.8 million years old, compared to the 3.3-million-year-old stone tools. The stone tools may have been made by Australopithecus afarensis – the species whose best-known fossil example is Lucy – or by Kenyanthropus platyops, both of which inhabited East Africa at the same time as the date of the oldest stone tools.
The tools were dated by means of the volcanic ash layers in which they were found and the magnetic signature of the rock at the site. In Ethiopia, grooved and fractured animal bone fossils, marked by the use of stone tools, were found in Dikika near the remains of Selam, a young Australopithecus afarensis girl who lived about 3.3 million years ago. The earliest stone tools in the life span of the genus Homo are Mode 1 tools and come from what has been termed the Oldowan Industry, named after the type site in Olduvai Gorge, where they were discovered in large quantities. Oldowan tools were characterised by their simple construction, predominantly using core forms: these cores were river pebbles, or rocks similar to them, struck by a spherical hammerstone to cause conchoidal fractures removing flakes from one surface, creating an edge and a sharp tip. The blunt end is the proximal surface. Oldowan is a percussion technology: grasping the proximal surface, the hominid brought the distal surface down hard on an object he wished to detach or shatter, such as a bone or tuber.
The earliest known Oldowan tools yet found date from 2.6 million years ago, during the Lower Palaeolithic period, and have been uncovered at Gona in Ethiopia. The Oldowan Industry subsequently spread throughout much of Africa, although archaeologists are unsure which hominin species first developed the tools, with some speculating that it was Australopithecus garhi and others believing that it was in fact Homo habilis. Homo habilis was the hominin who used the tools for most of the Oldowan in Africa, but at about 1.9–1.8 million years ago Homo erectus inherited them. The Industry flourished in southern and eastern Africa between 2.6 and 1.7 million years ago, but was spread out of Africa and into Eurasia by travelling bands of H. erectus, who took it as far east as Java by 1.8 million years ago and Northern China by 1.6 million years ago. More complex, Mode 2 tools began to be developed through the Acheulean Industry, named after the site of Saint-Acheul in France.
History is the study of the past as it is described in written documents. Events occurring before the written record are considered prehistory. "History" is also an umbrella term that relates to past events as well as the memory, collection, organization and interpretation of information about these events. Scholars who write about history are called historians. History can refer to the academic discipline which uses a narrative to examine and analyse a sequence of past events and objectively determine the patterns of cause and effect that shape them. Historians sometimes debate the nature of history and its usefulness by discussing the study of the discipline as an end in itself and as a way of providing "perspective" on the problems of the present. Stories common to a particular culture, but not supported by external sources, are classified as cultural heritage or legends, because they do not show the "disinterested investigation" required of the discipline of history. Herodotus, a 5th-century BC Greek historian, is considered within the Western tradition to be the "father of history"; along with his contemporary Thucydides, he helped form the foundations for the modern study of human history.
Their works continue to be read today, and the gap between the culture-focused Herodotus and the military-focused Thucydides remains a point of contention in modern historical writing. In East Asia, a state chronicle, the Spring and Autumn Annals, is known to have been compiled from as early as 722 BC, although only 2nd-century BC texts have survived. Ancient influences have helped spawn variant interpretations of the nature of history which have evolved over the centuries and continue to change today. The modern study of history is wide-ranging, and includes the study of specific regions and the study of certain topical or thematic elements of historical investigation. History is taught as part of primary and secondary education, and the academic study of history is a major discipline in university studies. The word history comes from the Ancient Greek ἱστορία, meaning 'inquiry', 'knowledge from inquiry', or 'judge'. The ancestor word ἵστωρ is attested early on in the Homeric Hymns, the Athenian ephebes' oath, and in Boiotic inscriptions.
The Greek word was borrowed into Classical Latin as historia, meaning "investigation, research, description, written account of past events, writing of history, historical narrative, recorded knowledge of past events, narrative". History was borrowed from Latin into Old English as stær, but this word fell out of use in the late Old English period. Meanwhile, as Latin became Old French, historia developed into forms such as istorie and historie, with new developments in the meaning: "account of the events of a person's life, account of events as relevant to a group of people or people in general, dramatic or pictorial representation of historical events, body of knowledge relative to human evolution, narrative of real or imaginary events, story". It was from Anglo-Norman that history was borrowed into Middle English, and this time the loan stuck. It appears in the 13th-century Ancrene Wisse, but seems to have become a common word in the late 14th century, with an early attestation appearing in John Gower's Confessio Amantis of the 1390s: "I finde in a bok compiled | To this matiere an old histoire, | The which comth nou to mi memoire".
In Middle English, the meaning of history was "story" in general. The restriction to the meaning "the branch of knowledge that deals with past events" arose in the mid-15th century. With the Renaissance, older senses of the word were revived, and it was in the Greek sense that Francis Bacon used the term in the late 16th century, when he wrote about "Natural History". For him, historia was "the knowledge of objects determined by space and time", that sort of knowledge provided by memory. In an expression of the linguistic synthetic vs. analytic/isolating dichotomy, English, like Chinese, now designates separate words for human history and storytelling in general. In modern German and most other Germanic and Romance languages, which are solidly synthetic and inflected, the same word is still used to mean both 'history' and 'story'. Historian in the sense of a "researcher of history" is attested from 1531. In all European languages, the substantive history is still used to mean both "what happened" and "the scholarly study of what happened", the latter sense sometimes distinguished with a capital letter, or by the word historiography.
The adjective historical is attested from 1661, and historic from 1669. Historians write in the context of their own time, with due regard to the current dominant ideas of how to interpret the past, and sometimes write to provide lessons for their own society. In the words of Benedetto Croce, "All history is contemporary history". History is facilitated by the formation of a "true discourse of the past" through the production of narrative and analysis of past events relating to the human race; the modern discipline of history is dedicated to the institutional production of this discourse. All events that are remembered and preserved in some authentic form constitute the historical record; the task of historians is to identify the sources that can most usefully contribute to the production of accurate accounts of the past.
Relative dating is the science of determining the relative order of past events, without necessarily determining their absolute age. In geology, rock or superficial deposits and lithologies can be used to correlate one stratigraphic column with another. Prior to the discovery of radiometric dating in the early 20th century, which provided a means of absolute dating, archaeologists and geologists used relative dating to determine the ages of materials. Though relative dating can only determine the sequential order in which a series of events occurred, not when they occurred, it remains a useful technique. Relative dating by biostratigraphy is the preferred method in paleontology and is, in some respects, more accurate. The Law of Superposition, which states that older layers will be deeper in a site than more recent layers, was the summary outcome of 'relative dating' as observed in geology from the 17th century to the early 20th century. The regular order of the occurrence of fossils in rock layers was discovered around 1800 by William Smith.
While digging the Somerset Coal Canal in southwest England, he found that fossils were always in the same order in the rock layers. As he continued his job as a surveyor, he found the same patterns across England: certain animals were found in only certain layers, and they were in the same layers all across England. Due to that discovery, Smith was able to recognize the order in which the rocks had been formed. Sixteen years after his discovery, he published a geological map of England showing the rocks of the different geologic time eras. Methods for relative dating were developed when geology first emerged as a natural science in the 18th century. Geologists still use the following principles today as a means to provide information about geologic history and the timing of geologic events. The principle of uniformitarianism states that the geologic processes observed in operation that modify the Earth's crust at present have worked in much the same way over geologic time. A fundamental principle of geology advanced by the 18th-century Scottish physician and geologist James Hutton is that "the present is the key to the past."
In Hutton's words: "the past history of our globe must be explained by what can be seen to be happening now." The principle of intrusive relationships concerns crosscutting intrusions. In geology, when an igneous intrusion cuts across a formation of sedimentary rock, it can be determined that the igneous intrusion is younger than the sedimentary rock. There are a number of different types of intrusions, including stocks, batholiths and dikes. The principle of cross-cutting relationships pertains to the formation of faults and the age of the sequences through which they cut: faults are younger than the rocks they cut. Finding the key bed in these situations may help determine whether the fault is a normal fault or a thrust fault. The principle of inclusions and components explains that, with sedimentary rocks, if inclusions are found in a formation, the inclusions must be older than the formation that contains them. For example, in sedimentary rocks, it is common for gravel from an older formation to be ripped up and included in a newer layer.
A similar situation occurs with igneous rocks. These foreign bodies are picked up as magma or lava flows, and are incorporated, later to cool, in the matrix; as a result, xenoliths are older than the rock that contains them. The principle of original horizontality states that the deposition of sediments occurs as essentially horizontal beds. Observation of modern marine and non-marine sediments in a wide variety of environments supports this generalization. The law of superposition states that a sedimentary rock layer in a tectonically undisturbed sequence is younger than the one beneath it and older than the one above it. This is because it is not possible for a younger layer to slip beneath a layer previously deposited. The only common disturbance that the layers experience is bioturbation, in which animals and plants move material within the layers; however, this process is rarely enough to allow the layers to change their positions. This principle allows sedimentary layers to be viewed as a form of vertical time line, a partial or complete record of the time elapsed from deposition of the lowest layer to deposition of the highest bed.
The principle of faunal succession is based on the appearance of fossils in sedimentary rocks. As organisms exist during the same time period throughout the world, their presence or absence may be used to provide a relative age of the formations in which they are found. Based on principles laid out by William Smith a hundred years before the publication of Charles Darwin's theory of evolution, the principles of succession were developed independently of evolutionary thought. The principle becomes quite complex in practice, however, given the uncertainties of fossilization, the localization of fossil types due to lateral changes in habitat, and the fact that not all fossils may be found globally at the same time. The principle of lateral continuity states that layers of sediment initially extend laterally in all directions; as a result, rocks that are otherwise similar, but are now separated by a valley or other erosional feature, can be assumed to be originally continuous. Layers of sediment do not, of course, extend indefinitely.
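The principles above lend themselves to a simple computational sketch. Below, a hypothetical stratigraphic column (listed top to bottom) is converted into a relative age order via the law of superposition, and two columns are correlated by a shared index fossil in the spirit of faunal succession; all layer and fossil names are invented for illustration.

```python
def oldest_to_youngest(column_top_to_bottom):
    """Law of superposition: in an undisturbed sequence, deeper layers are
    older, so reversing a top-to-bottom listing gives oldest-first order."""
    return list(reversed(column_top_to_bottom))

def correlate(column_a, column_b):
    """Faunal succession sketch: layers in different columns that share an
    index fossil are taken to be of the same relative age."""
    matches = []
    for layer_a, fossils_a in column_a:
        for layer_b, fossils_b in column_b:
            if set(fossils_a) & set(fossils_b):
                matches.append((layer_a, layer_b))
    return matches

order = oldest_to_youngest(["topsoil", "clay", "sandstone", "shale"])
links = correlate([("clay", ["ammonite-X"])], [("mudstone", ["ammonite-X"])])
```

Note that this only yields a relative sequence, never an absolute age: that is exactly the limitation of relative dating described above.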
Magnification is the process of enlarging the apparent size, not the physical size, of something. This enlargement is quantified by a calculated number also called "magnification"; when this number is less than one, it refers to a reduction in size, sometimes called minification or de-magnification. Typically, magnification is related to scaling up visuals or images to be able to see more detail, increasing resolution, using a microscope, printing techniques, or digital processing. In all cases, the magnification of the image does not change the perspective of the image. Some optical instruments provide visual aid by magnifying small or distant subjects. A magnifying glass uses a positive lens to make things look bigger by allowing the user to hold them closer to their eye. A telescope uses its large objective lens or primary mirror to create an image of a distant object and allows the user to examine the image with a smaller eyepiece lens, thus making the object look larger. A microscope makes a small object appear as a much larger image at a comfortable distance for viewing.
A microscope is similar in layout to a telescope except that the object being viewed is close to the objective, which is usually much smaller than the eyepiece. A slide projector projects a large image of a small slide on a screen; a photographic enlarger is similar. Optical magnification is the ratio between the apparent size of an object and its true size, and thus it is a dimensionless number. Optical magnification is sometimes referred to as "power", although this can lead to confusion with optical power. For real images, such as images projected on a screen, size means a linear dimension. For optical instruments with an eyepiece, the linear dimension of the image seen in the eyepiece cannot be given, so size means the angle subtended by the object at the focal point. Strictly speaking, one should take the tangent of that angle. Thus, angular magnification is given by MA = ε / ε0, where ε0 is the angle subtended by the object at the front focal point of the objective and ε is the angle subtended by the image at the rear focal point of the eyepiece.
For example, the mean angular size of the Moon's disk as viewed from Earth's surface is about 0.52°. Thus, through binoculars with 10× magnification, the Moon appears to subtend an angle of about 5.2°. By convention, for magnifying glasses and optical microscopes, where the size of the object is a linear dimension and the apparent size is an angle, the magnification is the ratio between the apparent size as seen in the eyepiece and the angular size of the object when placed at the conventional closest distance of distinct vision, 25 cm from the eye. The linear magnification of a thin lens is M = f / (f − do), where f is the focal length and do is the distance from the lens to the object. Note that for real images, M is negative and the image is inverted. For virtual images, M is positive and the image is upright. With di being the distance from the lens to the image, hi the height of the image and ho the height of the object, the magnification can also be written as M = −di / do = hi / ho. Note again that a negative magnification implies an inverted image.
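The two forms of the linear magnification formula are consistent with the thin-lens equation 1/f = 1/do + 1/di, which the following sketch checks numerically; the focal length and object distance are arbitrary example values, and the function names are illustrative.

```python
def image_distance(f, d_o):
    # Thin-lens equation: 1/f = 1/d_o + 1/d_i  =>  d_i = 1 / (1/f - 1/d_o)
    return 1.0 / (1.0 / f - 1.0 / d_o)

def linear_magnification(f, d_o):
    # M = f / (f - d_o), equivalent to M = -d_i / d_o
    return f / (f - d_o)

# A lens of 10 cm focal length with the object 30 cm away:
# the image is real (d_i > 0), inverted (M < 0), and half-size.
d_i = image_distance(10.0, 30.0)       # 15 cm
M = linear_magnification(10.0, 30.0)   # -0.5
```

The negative sign of M confirms the rule stated above: a real image formed by a positive lens of an object beyond the focal point is inverted.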
The image recorded by a photographic film or image sensor is always a real image and is usually inverted. When measuring the height of an inverted image using the cartesian sign convention, the value for hi will be negative, and as a result M will also be negative. However, the traditional sign convention used in photography is "real is positive, virtual is negative". Therefore, in photography, object height and distance are always positive; when the focal length is positive, the image's height and magnification are real and positive; only if the focal length is negative are the image's height and magnification virtual and negative. Therefore, the photographic magnification formulae are traditionally presented as M = di / do = hi / ho = f / (do − f) = (di − f) / f. The angular magnification of an optical telescope is given by MA = fo / fe, in which fo is the focal length of the objective lens in a refractor or of the primary mirror in a reflector, and fe is the focal length of the eyepiece. The maximum angular magnification of a magnifying glass depends on how the glass and the object are held, relative to the eye.
If the lens is held at a distance from the object such that its front focal point is on the object being viewed, the relaxed eye can view the image with angular magnification MA = 25 cm / f. Here, f is the focal length of the lens in centimeters. The constant 25 cm is an estimate of the "near point" distance of the eye – the closest distance at which the healthy naked eye can focus. In this case the angular magnification is independent of the distance kept between the eye and the magnifying glass. If instead the lens is held close to the eye and the object is placed closer to the lens than its focal point so that the observer focuses on the near point, a larger angular magnification can be obtained, approaching MA = 25 cm / f + 1. A different interpretation of the working of the latter case is that the magnifying glass changes the diopter of the eye so that the object can be placed closer to the eye, resulting in a larger angular magnification.
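The angular magnification formulae for the telescope and for the two ways of holding a magnifying glass can be collected into a short sketch; the numbers are arbitrary examples and the function names are illustrative.

```python
NEAR_POINT_CM = 25.0  # conventional near-point distance of the eye

def telescope_magnification(f_objective, f_eyepiece):
    # MA = fo / fe (both focal lengths in the same units)
    return f_objective / f_eyepiece

def magnifying_glass_magnification(f_cm, focused_at_near_point=False):
    # Relaxed eye: MA = 25/f; focused at the near point: MA = 25/f + 1
    m = NEAR_POINT_CM / f_cm
    return m + 1.0 if focused_at_near_point else m

scope = telescope_magnification(1000.0, 10.0)              # 100x
loupe = magnifying_glass_magnification(5.0)                # 5x, relaxed eye
loupe_near = magnifying_glass_magnification(5.0, True)     # 6x, near point
```

The comparison between loupe and loupe_near shows why the near-point case matters only for short focal lengths: the extra "+1" is significant for a 5 cm loupe but negligible for a weak lens.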
Human prehistory is the period between the use of the first stone tools, c. 3.3 million years ago by hominins, and the invention of writing systems. The earliest writing systems appeared c. 5,300 years ago, but it took thousands of years for writing to be widely adopted, and it was not used in some human cultures until the 19th century, or even until the present. The end of prehistory therefore came at different dates in different places, and the term is less often used in discussing societies where prehistory ended relatively recently. Sumer in Mesopotamia, the Indus valley civilization and ancient Egypt were the first civilizations to develop their own scripts and to keep historical records; their neighboring civilizations were the next to follow. Most other civilizations reached the end of prehistory during the Iron Age. The three-age system of division of prehistory into the Stone Age, followed by the Bronze Age and Iron Age, remains in use for much of Eurasia and North Africa, but is not used in those parts of the world where the working of hard metals arrived abruptly with contact with Eurasian cultures, such as the Americas, Oceania and much of Sub-Saharan Africa.
These areas, with some exceptions in the Pre-Columbian civilizations of the Americas, did not develop complex writing systems before the arrival of Eurasians, so their prehistory reaches into relatively recent periods. The period when a culture is written about by others, but has not developed its own writing, is known as the protohistory of the culture. By definition, there are no written records from human prehistory, so dating of prehistoric materials is crucial. Clear techniques for dating were not well-developed until the 19th century. This article is concerned with human prehistory, the time since behaviorally and anatomically modern humans first appeared until the beginning of recorded history. Earlier periods are called "prehistoric". Beginning: the term "prehistory" can refer to the vast span of time since the beginning of the Universe or the Earth, but more often it refers to the period since life appeared on Earth, or more specifically to the time since human-like beings appeared. End: the date marking the end of prehistory is typically defined as the advent of the contemporary written historical record.
The date varies from region to region depending on the date when relevant records become a useful academic resource. For example, in Egypt it is generally accepted that prehistory ended around 3200 BCE, whereas in New Guinea the end of the prehistoric era is set much more recently, at around 1900 CE. In Europe the well-documented classical cultures of Ancient Greece and Ancient Rome had neighbouring cultures, including the Celts and to a lesser extent the Etruscans, with little or no writing, and historians must decide how much weight to give to the often highly prejudiced accounts of these "prehistoric" cultures in Greek and Roman literature. Time periods: in dividing up human prehistory in Eurasia, historians typically use the three-age system, whereas scholars of pre-human time periods use the well-defined geologic record and its internationally defined stratum bases within the geologic time scale. The three-age system is the periodization of human prehistory into three consecutive time periods, named for their respective predominant tool-making technologies: the Stone Age, the Bronze Age and the Iron Age. The notion of "prehistory" began to surface during the Enlightenment in the work of antiquarians who used the word 'primitive' to describe societies that existed before written records.
The first use of the word prehistory in English occurred in the Foreign Quarterly Review in 1836. The use of the geologic time scale for pre-human time periods, and of the three-age system for human prehistory, emerged during the late nineteenth century in the work of British and Scandinavian archeologists and anthropologists. The main source for prehistory is archaeology, but some scholars are beginning to make more use of evidence from the natural and social sciences. This view has been articulated by advocates of deep history. The primary researchers into human prehistory are archaeologists and physical anthropologists who use excavation, geographic surveys and other scientific analyses to reveal and interpret the nature and behavior of pre-literate and non-literate peoples. Human population geneticists and historical linguists are also providing valuable insight into these questions. Cultural anthropologists help provide context for societal interactions, by which objects of human origin pass among people, allowing an analysis of any article that arises in a human prehistoric context.
Therefore, data about prehistory is provided by a wide variety of natural and social sciences, such as paleontology, archaeology, geology, comparative linguistics, molecular genetics and many others. Human prehistory differs from history not only in terms of its chronology but in the way it deals with the activities of archaeological cultures rather than named nations or individuals. Restricted to material processes and artifacts rather than written records, prehistory is anonymous; because of this, the reference terms that prehistorians use, such as "Neanderthal" or "Iron Age", are modern labels with definitions sometimes subject to debate. The concept of a "Stone Age" is found useful in the archaeology of most of the world, though in the archaeology of the Americas it is called by different names and begins with a Lithic stage.
A microscope is an instrument used to see objects that are too small to be seen by the naked eye. Microscopy is the science of investigating small structures using such an instrument. Microscopic means invisible to the eye unless aided by a microscope. There are many types of microscopes, and they may be grouped in different ways. One way is to describe the way the instruments interact with a sample to create images: either by sending a beam of light or electrons to a sample in its optical path, or by scanning across, and a short distance from, the surface of a sample using a probe. The most common microscope is the optical microscope, which uses light passing through a sample to produce an image. Other major types of microscopes are the fluorescence microscope, the electron microscope and the various types of scanning probe microscopes. Although objects resembling lenses date back 4,000 years and there are Greek accounts of the optical properties of water-filled spheres, followed by many centuries of writings on optics, the earliest known use of simple microscopes dates back to the widespread use of lenses in eyeglasses in the 13th century.
The earliest known examples of compound microscopes, which combine an objective lens near the specimen with an eyepiece to view a real image, appeared in Europe around 1620. The inventor is unknown. Several claims revolve around the spectacle-making centers of the Netherlands: that it was invented in 1590 by Zacharias Janssen and/or his father, Hans Martens; that it was invented by their neighbor and rival spectacle maker, Hans Lippershey; and that it was invented by the expatriate Cornelis Drebbel, who is noted to have had a version in London in 1619. Galileo Galilei seems to have found after 1610 that he could focus his telescope at close range to view small objects and, after seeing a compound microscope built by Drebbel exhibited in Rome in 1624, built his own improved version. Giovanni Faber coined the name "microscope" for the compound microscope Galileo submitted to the Accademia dei Lincei in 1625. The first detailed account of the microscopic anatomy of organic tissue based on the use of a microscope did not appear until 1644, in Giambattista Odierna's L'occhio della mosca, or The Fly's Eye.
The microscope remained a novelty until the 1660s and 1670s, when naturalists in Italy, the Netherlands and England began using microscopes to study biology. The Italian scientist Marcello Malpighi, called the father of histology by some historians of biology, began his analysis of biological structures with the lungs. Robert Hooke's Micrographia had a huge impact because of its impressive illustrations. A significant contribution came from Antonie van Leeuwenhoek, who achieved up to 300 times magnification using a simple single-lens microscope: he sandwiched a small glass ball lens between the holes in two metal plates riveted together, with a screw-adjustable needle attached to mount the specimen. Van Leeuwenhoek re-discovered red blood cells and spermatozoa and helped popularise the use of microscopes to view biological ultrastructure. On 9 October 1676, van Leeuwenhoek reported the discovery of micro-organisms. The performance of a light microscope depends on the quality and correct use of the condenser lens system, which focuses light on the specimen, and the objective lens, which captures the light from the specimen and forms an image.
Early instruments were limited until this principle was fully appreciated and developed, from the late 19th to the early 20th century, and until electric lamps became available as light sources. In 1893 August Köhler developed a key principle of sample illumination, Köhler illumination, which is central to achieving the theoretical limits of resolution for the light microscope; this method produces even illumination and overcomes the limited contrast and resolution imposed by earlier techniques of sample illumination. Further developments in sample illumination came from Frits Zernike's invention of phase contrast in the 1930s (for which he received the Nobel Prize in 1953) and Georges Nomarski's differential interference contrast illumination in 1955. In the early 20th century a significant alternative to the light microscope was developed: an instrument that uses a beam of electrons rather than light to generate an image. The German physicist Ernst Ruska, working with the electrical engineer Max Knoll, developed the first prototype electron microscope, a transmission electron microscope, in 1931.
The transmission electron microscope works on principles similar to those of an optical microscope, but uses electrons in place of light and electromagnets in place of glass lenses. The use of electrons, instead of light, allows for much higher resolution. Development of the transmission electron microscope was followed in 1935 by Max Knoll's development of the scanning electron microscope. Although TEMs were being used for research before the Second World War and became popular afterwards, the SEM was not commercially available until 1965. Ernst Ruska, working at Siemens, developed the first commercial transmission electron microscope and, in the 1950s, major scientific conferences on electron microscopy started being held. The first commercial scanning electron microscope followed in 1965.
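The resolution advantage of electrons can be made concrete with the Abbe diffraction limit, d = λ / (2·NA): the smallest resolvable detail scales with the wavelength of the illumination, and the de Broglie wavelength of an electron accelerated through 100 kV is roughly 100,000 times shorter than that of visible light. A minimal sketch of the arithmetic, using standard physical constants and the non-relativistic de Broglie formula (the specific numbers chosen here are illustrative, not from the source):

```python
import math

# Physical constants (SI units)
H = 6.626e-34      # Planck constant, J*s
M_E = 9.109e-31    # electron rest mass, kg
Q_E = 1.602e-19    # elementary charge, C

def abbe_limit(wavelength_m, numerical_aperture):
    """Smallest resolvable separation: d = lambda / (2 * NA)."""
    return wavelength_m / (2 * numerical_aperture)

def electron_wavelength(accel_voltage_v):
    """Non-relativistic de Broglie wavelength of an electron
    accelerated through a potential V: lambda = h / sqrt(2 m e V)."""
    momentum = math.sqrt(2 * M_E * Q_E * accel_voltage_v)
    return H / momentum

# Green light (550 nm) with a high-NA oil-immersion objective (NA = 1.4):
# resolution is limited to roughly 200 nm.
print(abbe_limit(550e-9, 1.4))        # ~1.96e-07 m

# Electrons accelerated through 100 kV have a wavelength of only ~3.9 pm.
print(electron_wavelength(100e3))
```

In practice, aberrations in electromagnetic lenses keep the usable numerical aperture of an electron microscope very small, so real TEM resolution is on the order of 0.1 nm rather than the picometre scale the wavelength alone would suggest; still, this is far beyond what any light microscope can achieve.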