New wave music
New wave is a genre of rock music popular in the late 1970s and the 1980s with ties to mid-1970s punk rock. New wave moved away from blues and rock and roll sounds to create rock or pop music that incorporated disco and electronic music. Initially similar to punk rock before becoming a distinct genre, new wave subsequently engendered subgenres and fusions, including synth-pop. New wave differs from other movements with ties to first-wave punk in that it displays characteristics common to pop music rather than the more "artsy" post-punk. Although it incorporates much of the original punk rock sound and ethos, new wave exhibits greater complexity in both music and lyrics. Common characteristics of new wave music include the use of synthesizers and electronic production, and a distinctive visual style featured in music videos and fashion. New wave has been called one of the definitive genres of the 1980s, after it was promoted by MTV; the popularity of several new wave artists is attributed to their exposure on the channel.
In the mid-1980s, differences between new wave and other music genres began to blur. New wave has enjoyed resurgences since the 1990s amid rising nostalgia for several new wave-influenced artists, and the genre has gone on to influence other styles. During the 2000s, a number of acts, such as the Strokes, Franz Ferdinand and the Killers, explored new wave and post-punk influences; these acts were sometimes labeled "new wave of new wave". The catch-all nature of new wave music has been a source of much controversy; the 1985 discography Who's New Wave in Music listed artists in over 130 separate categories. The New Rolling Stone Encyclopedia of Rock calls the term "virtually meaningless", while AllMusic notes its "stylistic diversity". The term "new wave" first emerged as a rock-genre label in the early 1970s, used by critics including Nick Kent and Dave Marsh to classify such New York-based groups as the Velvet Underground and the New York Dolls; it gained currency beginning in 1976, when it appeared in UK punk fanzines such as Sniffin' Glue and newsagent music weeklies such as Melody Maker and New Musical Express.
In November 1976, Caroline Coon used Malcolm McLaren's term "new wave" to designate music by bands that were not exactly punk but were related to the same musical scene. The term was used in that sense by music journalist Charles Shaar Murray in his comments about the Boomtown Rats. For a period in 1976 and 1977, the terms "new wave" and "punk" were somewhat interchangeable. By the end of 1977, "new wave" had replaced "punk" as the label for new underground music in the UK. In the United States, Sire Records chairman Seymour Stein, believing that the term "punk" would mean poor sales for Sire's acts who had played the club CBGB, launched a "Don't Call It Punk" campaign designed to replace the term with "new wave"; radio consultants in the United States had advised their clients that punk rock was a fad, so they settled on the term "new wave". Like the filmmakers of the French new wave movement, the new artists were anti-corporate and experimental. At first, most U.S. writers used the term "new wave" for British punk acts.
Starting in December 1976, The New York Rocker, suspicious of the term "punk", became the first American journal to enthusiastically use the term "new wave", beginning with British acts and later applying it to acts associated with the CBGB scene. Part of what attracted Stein and others to new wave was the music's stripped-back style and upbeat tempos, which they viewed as a much-needed return to the energetic rush of rock and roll and 1960s rock that had dwindled in the 1970s with the ascendance of overblown progressive rock and stadium spectacles. Music historian Vernon Joynson claimed that new wave emerged in the UK in late 1976, when many bands began disassociating themselves from punk. Music that followed the anarchic garage band ethos of the Sex Pistols was distinguished as "punk", while music that tended toward experimentation, lyrical complexity or more polished production came to be categorized as "new wave". In the U.S., the first new wavers were the not-so-punk acts associated with the New York club CBGB.
CBGB owner Hilly Kristal, referring to the first show of the band Television at his club in March 1974, said, "I think of that as the beginning of new wave." Furthermore, many artists who would otherwise have been classified as punk were termed new wave. A 1977 Phonogram Records compilation album of the same name (New Wave) features US artists including the Dead Boys, Talking Heads and the Runaways. New wave is much more closely tied to punk, and it came and went more quickly in the United Kingdom than in the United States. At the time punk began, it was a major phenomenon in the United Kingdom and a minor one in the United States; thus, when new wave acts started getting noticed in America, punk meant little to the mainstream audience, and it was common for rock clubs and discos to play British dance mixes and videos between live sets by American guitar acts. Post-punk music developments in the UK were considered unique cultural events. By the early 1980s, British journalists had abandoned the term "new wave" in favor of subgenre terms such as "synth-pop".
By 1983, the term of choice for the US music industry had become "new music", while to the majority of US fans it was still a "new wave" reacting against album-oriented rock. New wave died out in the mid-1980s, knocked out by guitar-driven rock reacting against new wave itself.
Cannibalism involves consuming all or part of another individual of the same species as food. Consuming the same species, or showing cannibalistic behavior, is a common ecological interaction in the animal kingdom and has been recorded in more than 1,500 species. Human cannibalism is well documented in both ancient and recent times. The rate of cannibalism increases in nutritionally poor environments as individuals turn to conspecifics as an additional food source. Cannibalism regulates population numbers, whereby resources such as food and territory become more available with the decrease of potential competition. Although it may benefit the individual, it has been shown that the presence of cannibalism decreases the expected survival rate of the whole population and increases the risk of consuming a relative. Other negative effects may include the increased risk of pathogen transmission as the encounter rate of hosts increases. Cannibalism does not, as once believed, occur only as a result of extreme food shortage or of artificial or unnatural conditions; it may occur under natural conditions in a variety of species.
Cannibalism seems prevalent in aquatic ecosystems, in which up to 90% of organisms engage in cannibalistic activity at some point in their life cycle. Cannibalism is not restricted to carnivorous species: it also occurs in herbivores and detritivores. Sexual cannibalism involves the consumption of the male by the female before, during or after copulation. Other forms of cannibalism include intrauterine cannibalism. Behavioural and morphological adaptations have evolved to decrease the rate of cannibalism in individual species. In environments where food availability is constrained, individuals can receive extra nutrition and energy if they use conspecifics as an additional food source; this would, in turn, increase the survival rate of the cannibal and thus provide an evolutionary advantage in environments where food is scarce. A study conducted on wood frog tadpoles showed that those that exhibited cannibalistic tendencies had faster growth rates and higher fitness levels than non-cannibals.
An increase in size and growth would give them the added benefit of protection from potential predators, such as other cannibals, and an advantage when competing for resources. The nutritional benefits of cannibalism may allow for more efficient conversion of a conspecific diet into reusable resources than a herbivorous diet, which facilitates faster development. However, studies have shown a noticeable size difference: animals fed a high conspecific diet were smaller than those fed a low conspecific diet. Hence, individual fitness is increased only if developmental rate and size are properly balanced, and studies show that this balance is achieved on low conspecific diets. Cannibalism regulates population numbers and benefits the cannibalistic individual and its kin, as resources such as extra shelter and food are freed. However, this is only the case if the cannibal recognizes its own kin, so that it does not hinder its chances of perpetuating its genes in future generations.
The elimination of competition can also increase mating opportunities, allowing further spread of an individual's genes. Animals whose diets consist predominantly of conspecific prey expose themselves to a greater risk of injury and expend more energy foraging for suitable prey than non-cannibalistic species. To reduce the risk of personal injury, a predator targets younger or more vulnerable prey; however, the time required by such selective predation can result in a failure to meet the predator's nutritional requirements. In addition, the consumption of conspecific prey may involve the ingestion of defense compounds and hormones, which have the capacity to impact the developmental growth of the cannibal's offspring. Hence, predators tend to partake in a cannibalistic diet only in conditions where alternative food sources are absent or less available. Failure to recognize kin as prey is a disadvantage, particularly since cannibals target and consume younger individuals. For example, a male stickleback fish may mistake its own eggs for a competitor's eggs and hence inadvertently eliminate some of its own genes from the available gene pool.
Kin recognition has been observed in tadpoles of the spadefoot toad, whereby cannibalistic tadpoles of the same clutch tended to avoid consuming and harming siblings while eating non-siblings. The act of cannibalism may facilitate trophic disease transmission within a population, though most cannibalistically spread pathogens and parasites also employ alternative modes of infection. Cannibalism can reduce the prevalence of parasites in a population by decreasing the number of susceptible hosts and indirectly killing the parasite inside the host. Some studies have shown that the risk of encountering an infected victim increases when there is a higher cannibalism rate, though this risk drops as the number of available hosts decreases. Cannibalism is generally an ineffective method of disease spread: because cannibalism in the animal kingdom is a one-on-one interaction, the wide spread of disease requires group cannibalism.
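The encounter-rate trade-off described above can be illustrated with a toy simulation (a minimal sketch, not drawn from any cited study; the population size, transmission probability, and cannibalism rates below are arbitrary assumptions chosen purely for illustration):

```python
import random

def simulate(pop_size=1000, infected0=50, cannibalism_rate=0.05,
             p_transmit=0.5, steps=100, seed=1):
    """Toy model of trophic disease transmission via cannibalism.

    Each step, a fraction of the population each eats one random
    conspecific. Eating an infected victim infects the eater with
    probability p_transmit; the victim (and its parasite) is removed
    either way. All parameter values are illustrative assumptions.
    """
    rng = random.Random(seed)
    # Represent the population as a list of infection states.
    pop = [True] * infected0 + [False] * (pop_size - infected0)
    for _ in range(steps):
        n_meals = max(1, int(cannibalism_rate * len(pop)))
        for _ in range(n_meals):
            if len(pop) < 2:                # population effectively collapsed
                return pop
            victim = pop.pop(rng.randrange(len(pop)))  # host removed from pool
            eater = rng.randrange(len(pop))
            if victim and rng.random() < p_transmit:
                pop[eater] = True           # one-on-one trophic transmission
    return pop

for rate in (0.01, 0.05, 0.20):
    pop = simulate(cannibalism_rate=rate)
    prevalence = sum(pop) / len(pop) if pop else 0.0
    print(f"rate={rate:.2f}  hosts left={len(pop):5d}  prevalence={prevalence:.1%}")
```

In this sketch, raising the cannibalism rate increases the chance that any given meal involves an infected victim, but it also depletes the pool of susceptible hosts, which is the tension the studies cited above describe.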
Polygenism is a theory of human origins which posits that the human races are of different origins. This view is the opposite of monogenism. Modern scientific views no longer favor the polygenic model: the monogenic "Out of Africa" theory and its variants are the most widely accepted models for human origins. Many oral traditions feature polygenesis in their creation stories. For example, Bambuti mythology and other creation stories from the pygmies of Congo state that the supreme god of the pygmies created three different races of man separately out of three kinds of clay: one black, one white, one red. In some cultures, polygenism in the creation narrative served an etiological function; these narratives provided an explanation for why other people groups exist who are not affiliated with the tribe. Moreover, distinctions made between the creation of foreign people groups and that of the tribe or ethnic group to which the creation myth pertains served to reinforce tribal or ethnic unity, to stress the need for wariness and caution when dealing with outsiders, or to underline the unique nature of the relationship between that tribe and the deities of its religious system.
An example may be found in the creation myth of the Asmat people, a hunter-gatherer tribe situated along the south-western coast of New Guinea. This creation myth asserts that the Asmat themselves came into being when a deity placed carved wooden statues in a ceremonial house and began to beat a drum; the statues began to dance. Some time later, a great crocodile attempted to attack this ceremonial house but was defeated by the power of the deity; the crocodile was cut into several pieces, and these were tossed in different directions. Each piece became one of the foreign tribes known to the Asmat. The idea of polygenesis is also found in some ancient Greek and Roman literature. For example, the Roman emperor Julian the Apostate wrote in his Letter to a Priest that he believed Zeus made multiple creations of men and women, and in his Against the Galilaeans Julian presented his reasoning for this belief. Julian had noticed that the Germanics and Scythians differed in their bodies from the Ethiopians; he could not imagine such differences in physical attributes originating from common ancestry, and so maintained separate creations for different races.
In early classical and medieval geography, the idea of polygenism surfaced because of the suggested possibility of there being inhabitants of the antipodes. These inhabitants were considered by some to have separate origins because of their geographical extremity. The religion of the Ainu people likewise claims that the ancestors of the Ainu arrived on Earth from the skies separately from the other races (see the Ainu creation myth). Traditionally, most Jews and Muslims have embraced monogenism in the form that all modern humans are descended from a single mating pair, named Adam and Eve. In this context, polygenism described all alternative explanations for the origin of humankind that involved more than two individual "first people"; this definition of polygenism is still employed among some Creationists and within the Roman Catholic Church. With the development of the evolutionary paradigm of human origins, it has become recognized within the scientific community that at no point did there exist a single "first man" and a single "first woman" who constituted the first true humans and to whom all lineages of modern humans converge.
If Adam and Eve existed as distinct historical persons, they were members of a much larger population of the same species. However, a common scientific explanation of human origins asserts that the population directly ancestral to all modern humans remained united as a single population by constant gene flow. Therefore, on the level of the entire human population, this explanation of human origin is classified as monogenism: all modern humans share the same origin from this single ancestral population. Modern polygenists do not accept this scientific monogenism; they believe that the variation among human racial types cannot be accounted for by monogenism or by evolutionary processes occurring since the proposed recent African origin of modern humans. Polygenists reject the argument that human races must belong to a single species because they can interbreed. There are several polygenist hypotheses, including biblical creationist polygenism and polygenist evolution. To make polygenism compatible with the Biblical account in the early chapters of the Book of Genesis, some argument is needed to the effect that what is in the Bible is incomplete.
Standard positions include Pre-Adamism. In Christian terms, polygenesis remained an uncommon Biblical interpretation until the mid-19th century and was considered heretical. A major reason for the emergence of Biblical polygenism from around the 18th century was the observation that the number of races could not have developed within the commonly accepted Biblical timeframe. Francis Dobbs, an eccentric member of the Irish Parliament, believed in a different kind of biblical polygenism: in his Concise View from History, written in 1800, he maintained that there was a race resulting from a clandestine affair between Eve and the Devil. Polygenism was criticized by the 20th-century Roman Catholic Church, notably by Pope Pius XII in the encyclical Humani generis.
Pollution is the introduction of contaminants into the natural environment that cause adverse change. Pollution can take the form of chemical substances or energy, such as noise, heat or light. Pollutants, the components of pollution, can be either foreign substances/energies or naturally occurring contaminants. Pollution is classed as point source or nonpoint source pollution. In 2015, pollution killed 9 million people worldwide. Major forms of pollution include air pollution, light pollution, noise pollution, plastic pollution, soil contamination, radioactive contamination, thermal pollution, visual pollution and water pollution. Air pollution has always accompanied civilizations, starting from prehistoric times. According to a 1983 article in the journal Science, "soot found on ceilings of prehistoric caves provides ample evidence of the high levels of pollution that was associated with inadequate ventilation of open fires." Metal forging appears to be a key turning point in the creation of significant air pollution levels outside the home.
Core samples of glaciers in Greenland indicate increases in pollution associated with Greek and Chinese metal production. The burning of coal and wood, and the presence of many horses in concentrated areas, made cities the primary sources of pollution, and the Industrial Revolution brought an infusion of untreated chemicals and wastes into local streams that served as the water supply. King Edward I of England banned the burning of sea-coal by proclamation in London in 1272, after its smoke became a problem, but it was the Industrial Revolution that gave birth to environmental pollution as we know it today. London recorded one of the earlier extreme cases of water quality problems with the Great Stink on the Thames of 1858, which led to construction of the London sewerage system soon afterward. Pollution issues escalated as population growth far exceeded the capacity of neighborhoods to handle their waste. Reformers began to demand sewer systems and clean water. In 1870, the sanitary conditions in Berlin were among the worst in Europe. August Bebel recalled conditions before a modern sewer system was built in the late 1870s: "Waste-water from the houses collected in the gutters running alongside the curbs and emitted a fearsome smell.
There were no public toilets in the squares. Visitors, especially women, became desperate when nature called. In the public buildings the sanitary facilities were unbelievably primitive.... As a metropolis, Berlin did not emerge from a state of barbarism into civilization until after 1870." The primitive conditions were intolerable for a world national capital, and the Imperial German government brought in its scientists and urban planners not only to solve the deficiencies but to forge Berlin into the world's model city. A British expert in 1906 concluded that Berlin represented "the most complete application of science and method of public life," adding "it is a marvel of civic administration, the most modern and most organized city that there is." The emergence of great factories and the consumption of immense quantities of coal gave rise to unprecedented air pollution, and the large volume of industrial chemical discharges added to the growing load of untreated human waste. Chicago and Cincinnati were the first two American cities to enact laws ensuring cleaner air, in 1881.
Pollution became a major issue in the United States in the early twentieth century, as progressive reformers took issue with air pollution caused by coal burning, water pollution caused by bad sanitation, and street pollution caused by the 3 million horses who worked in American cities in 1900, generating large quantities of urine and manure. As historian Martin Melosi notes, the generation that first saw automobiles replacing the horses saw cars as "miracles of cleanliness". By the 1940s, automobile-caused smog was a major issue in Los Angeles. Other cities followed around the country until early in the 20th century, when the short-lived Office of Air Pollution was created under the Department of the Interior. Extreme smog events were experienced by the cities of Los Angeles and Donora, Pennsylvania, in the late 1940s, serving as another public reminder. Air pollution continued to be a problem in England after the industrial revolution, extending into the recent past with the Great Smog of 1952.
Awareness of atmospheric pollution spread after World War II, with fears triggered by reports of radioactive fallout from atomic warfare and testing. A non-nuclear event, the Great Smog of 1952 in London, killed at least 4,000 people; this prompted some of the first major modern environmental legislation: the Clean Air Act of 1956. Pollution began to draw major public attention in the United States between the mid-1950s and early 1970s, when Congress passed the Noise Control Act, the Clean Air Act, the Clean Water Act and the National Environmental Policy Act. Severe incidents of pollution helped increase consciousness. PCB dumping in the Hudson River resulted in a ban by the EPA on consumption of its fish in 1974. National news stories in the late 1970s – the long-term dioxin contamination at Love Canal starting in 1947 and uncontrolled dumping in Valley of the Drums – led to the Superfund legislation of 1980. The pollution of industrial land gave rise to the name brownfield, a term now common in city planning.
The development of nuclear science introduced radioactive contamination, which can remain lethally radioactive for hundreds of thousands of years. Lake Karachay, named by the Worldwatch Institute as the "most polluted spot" on Earth, served as a disposal site for the Soviet Union's nuclear program.
Trepanning, also known as trepanation, trephining or making a burr hole, is a surgical intervention in which a hole is drilled or scraped into the human skull, exposing the dura mater, in order to treat health problems related to intracranial diseases or to release pressured blood buildup from an injury. It may also refer to any "burr" hole created through other body surfaces, including nail beds, where it is used to relieve pressure beneath the surface. A trephine is an instrument used for cutting out a round piece of skull bone. In ancient times, holes were drilled into a person who was behaving in what was considered an abnormal way, to let out what people believed were evil spirits. Evidence of trepanation has been found in prehistoric human remains from Neolithic times onward; the bone that was trepanned was kept by the prehistoric people and may have been worn as a charm to keep evil spirits away. Evidence suggests that trepanation served as primitive emergency surgery after head wounds, to remove shattered bits of bone from a fractured skull and clean out the blood that pools under the skull after a blow to the head.
Such injuries were typical for primitive weaponry such as slings and war clubs. There is some contemporary use of the term. In modern eye surgery, a trephine instrument is used in corneal transplant surgery. The procedure of drilling a hole through a fingernail or toenail is also known as trephination; it is performed by a surgeon to relieve the pain associated with a subungual hematoma. In abdominal surgery, a trephine incision is when a small disc of abdominal skin is excised to accommodate a stoma. Although the abdominal wall does not contain bone, the use of the word 'trephine' in this context may relate to the round excised area of skin being similar in shape to a burr hole. Trepanation is the oldest surgical procedure for which there is archaeological evidence, and in some areas it may have been quite widespread. At one burial site in France dated to 6500 BCE, 40 out of 120 prehistoric skulls found had trepanation holes. Many prehistoric and premodern patients had signs of their skull structure healing, suggesting that many of those subjected to the surgery survived.
Another skull with a trepanation hole was found at the burial site Chalaghantepe, dated to the 5th millennium BCE. More than 1,500 trephined skulls from the Neolithic period have been uncovered throughout the world – from Europe, Siberia and the Americas. Most of the trephined crania belong to adult males, but women and children are also represented. A cow skull dating to 3400–3000 BCE, upon which trepanation had been performed, was discovered in France. In the more recent times of postclassical pre-Columbian Mesoamerica, evidence for the practice of trepanation and an assortment of other cranial deformation techniques comes from a variety of sources, including physical cranial remains of burials, allusions in iconographic artworks and reports from the post-colonial period. Among New World societies, trepanning is most commonly found in the Andean civilizations, such as pre-Incan cultures – for example, the Paracas culture of Ica, situated south of Lima. It has also been found in the Muisca Confederation and the Inca Empire.
In both, cranioplasty also existed. The prevalence of trepanation among Mesoamerican civilizations is much lower, at least judging from the comparatively few trepanated crania that have been uncovered; the archaeological record in Mesoamerica is further complicated by the practice of skull mutilation and modification carried out after the death of the subject, to fashion "trophy skulls" and the like of captives and enemies. This was a widespread tradition, illustrated in pre-Columbian art that depicts rulers adorned with or carrying the modified skulls of their defeated enemies, or the ritualistic display of sacrificial victims. Several Mesoamerican cultures used a skull-rack, on which skulls were impaled in rows or columns of wooden stakes. Even so, some evidence of genuine trepanation in Mesoamerica has survived. The earliest archaeological survey published of trepanated crania was a late 19th-century study of several specimens recovered from the Tarahumara mountains by the Norwegian ethnographer Carl Lumholtz. Later studies documented cases identified from a range of sites in Oaxaca and central Mexico, such as Tilantongo and the major Zapotec site of Monte Albán.
Two specimens from the Tlatilco civilization's homelands indicate the practice has a lengthy tradition. A study of ten low-status burials from the Late Classic period at Monte Albán concluded that the trepanation had been applied non-therapeutically and, since multiple techniques had been used and some people had received more than one trepanation, that it had been done experimentally. Inferring the events to represent experiments on people until they died, the study interpreted the use of trepanation as an indicator of the stressful sociopolitical climate that not long thereafter resulted in the abandonment of Monte Albán as the primary regional administrative center in the Oaxacan highlands. Specimens identified from the Maya civilization region of southern Mexico and the Yucatán Peninsula show no evidence of the drilling or cutting techniques found in central and highland Mexico. Instead, the pre-Columbian Maya used an abrasive technique that ground away at the skull.
Pseudoscience consists of statements, beliefs, or practices that are claimed to be both scientific and factual but are incompatible with the scientific method. Pseudoscience is often characterized by contradictory, exaggerated or unfalsifiable claims. The term pseudoscience is considered pejorative because it suggests something is being presented as science inaccurately or deceptively, and those described as practicing or advocating pseudoscience dispute the characterization. The demarcation between science and pseudoscience has scientific implications, and differentiating science from pseudoscience has practical implications in the case of health care, expert testimony, environmental policies and science education. Distinguishing scientific facts and theories from pseudoscientific beliefs, such as those found in astrology, alternative medicine, occult beliefs, religious beliefs and creation science, is part of science education and scientific literacy. Pseudoscience can have negative consequences in the real world.
Antivaccine activists present pseudoscientific studies that falsely call into question the safety of vaccines, and homeopathic remedies with no active ingredients have been promoted as treatment for deadly diseases. The word pseudoscience is derived from the Greek root pseudo, meaning false, and the English word science, from the Latin word scientia, meaning "knowledge". Although the term has been in use since at least the late 18th century, the concept of pseudoscience as distinct from real or proper science seems to have become more widespread during the mid-19th century. Among the earliest uses of "pseudo-science" was in an 1844 article in the Northern Journal of Medicine, issue 387: "That opposite kind of innovation which pronounces what has been recognized as a branch of science, to have been a pseudo-science, composed of so-called facts, connected together by misapprehensions under the disguise of principles." An earlier use of the term was in 1843 by the French physiologist François Magendie.
During the 20th century, the word was used pejoratively to describe explanations of phenomena which were claimed to be scientific, but which were not in fact supported by reliable experimental evidence. From time to time, the word was also used in a more formal, technical manner in response to a perceived threat to individual and institutional security in a social and cultural setting. Philosophers classify types of knowledge. In English, the word science is used to indicate the natural sciences and related fields, which are called the social sciences. Different philosophers of science may disagree on the exact limits – for example, is mathematics a formal science that is closer to the empirical ones, or is pure mathematics closer to the philosophical study of logic and therefore not a science? – but all agree that all of the ideas that are not scientific are non-scientific. The large category of non-science includes all matters outside the natural and social sciences, such as the study of history, religion and the humanities.
Dividing the category further, unscientific claims are a subset of the large category of non-scientific claims; this category includes all matters that are directly opposed to good science. Un-science includes pseudoscience: pseudoscience is a subset of un-science, and un-science, in turn, is a subset of non-science. Pseudoscience is differentiated from science because – although it claims to be science – pseudoscience does not adhere to accepted scientific standards, such as the scientific method, falsifiability of claims and Mertonian norms. A number of basic principles are accepted by scientists as standards for determining whether a body of knowledge, method, or practice is scientific. Experimental results should be reproducible and verified by other researchers; these principles are intended to ensure that experiments can be reproduced measurably given the same conditions, allowing further investigation to determine whether a hypothesis or theory related to given phenomena is valid and reliable. Standards require the scientific method to be applied throughout, and bias to be controlled for or eliminated through randomization, fair sampling procedures, blinding of studies and other methods.
All gathered data, including the experimental or environmental conditions, are expected to be documented for scrutiny and made available for peer review, allowing further experiments or studies to be conducted to confirm or falsify results. Statistical quantification of significance and error are important tools for the scientific method. During the mid-20th century, the philosopher Karl Popper emphasized the criterion of falsifiability to distinguish science from nonscience. Statements, hypotheses, or theories have falsifiability or refutability if there is the inherent possibility that they can be proven false; that is, if it is possible to conceive of an observation or an argument which negates them. Popper used astrology and psychoanalysis as examples of pseudoscience and Einstein's theory of relativity as an example of science. He subdivided nonscience into philosophical, mythological and metaphysical formulations on the one hand and pseudoscientific formulations on the other, though he did not provide clear criteria for the differences.
Another example illustrates the distinct need for a claim to be falsifiable.
Human cannibalism is the act or practice of humans eating the flesh or internal organs of other human beings. A person who practices cannibalism is called a cannibal. The expression cannibalism has been extended into zoology to mean one individual of a species consuming all or part of another individual of the same species as food, including sexual cannibalism. Some scholars have argued that no firm evidence exists that cannibalism has ever been a socially acceptable practice anywhere in the world, at any time in history. The Island Carib people of the Lesser Antilles, from whom the word cannibalism is derived, acquired a long-standing reputation as cannibals following the recording of their legends in the 17th century. Some controversy exists over the accuracy of these legends and the prevalence of actual cannibalism in the culture. Cannibalism was practiced in New Guinea and in parts of the Solomon Islands, and flesh markets existed in some parts of Melanesia. Fiji was once known as the "Cannibal Isles". Cannibalism has been well documented around the world, from Fiji to the Amazon Basin to the Congo to the Māori people of New Zealand.
Neanderthals are believed to have practiced cannibalism, and Neanderthals may in turn have been eaten by anatomically modern humans. Cannibalism was practiced in Egypt in ancient and Roman times, and during famines such as the great famine of 1201. Cannibalism has been both practiced and fiercely condemned in several wars in Liberia and the Democratic Republic of the Congo, and it was still practiced in Papua New Guinea as of 2012, for cultural reasons and in ritual and in war in various Melanesian tribes. Cannibalism has been said to test the bounds of cultural relativism because it challenges anthropologists "to define what is or is not beyond the pale of acceptable human behavior". Cannibalism has also been practiced as a last resort by people suffering from famine in modern times. Famous examples include the ill-fated Donner Party and, more recently, the crash of Uruguayan Air Force Flight 571, after which some survivors ate the bodies of dead passengers. Some mentally ill people, such as Jeffrey Dahmer and Albert Fish, have also eaten human flesh.
There is resistance to formally labeling cannibalism a mental disorder. The word "cannibalism" is derived from Caníbales, the Spanish name for the Caribs, a West Indies tribe that may have practiced cannibalism, itself from Spanish canibal or caribal, "a savage". Cannibalism is also called anthropophagy. In some societies, particularly tribal societies, cannibalism is a cultural norm. Consumption of a person from within the same community is called endocannibalism; exocannibalism is the consumption of a person from outside the community, as a celebration of victory against a rival tribe. Both types of cannibalism can be fueled by the belief that eating a person's flesh or internal organs will endow the cannibal with some of the characteristics of the deceased. In most parts of the world, cannibalism is not a societal norm, but it is sometimes resorted to in situations of extreme necessity. The survivors of the shipwrecks of the Essex and Méduse in the 19th century are said to have engaged in cannibalism, as did the members of Franklin's lost expedition and the Donner Party.
Such cases involve necro-cannibalism, as opposed to homicidal cannibalism. In English law, the latter is always considered a crime, even in the most trying circumstances; the case of R v Dudley and Stephens, in which two men were found guilty of murder for killing and eating a cabin boy while adrift at sea in a lifeboat, set the precedent that necessity is no defence to a charge of murder. In pre-modern medicine, the explanation given by the now-discredited theory of humorism for cannibalism was that it came about within a black acrimonious humor which, being lodged in the linings of the ventricle, produced the voracity for human flesh. A well-known case of mortuary cannibalism is that of the Fore tribe in New Guinea, which resulted in the spread of the prion disease kuru. Although the Fore's mortuary cannibalism was well documented, the practice had ceased before the cause of the disease was recognized. However, some scholars argue that although post-mortem dismemberment was the practice during funeral rites, cannibalism was not.
Marvin Harris theorizes that it happened during a famine period coincident with the arrival of Europeans and was rationalized as a religious rite. In 2003, a publication in Science received a large amount of press attention when it suggested that early humans may have practiced extensive cannibalism. According to this research, genetic markers found in modern humans worldwide suggest that many people today carry a gene that evolved as protection against the brain diseases that can be spread by consuming human brain tissue. A 2006 reanalysis of the data questioned this hypothesis, claiming to have found a data collection bias that led to an erroneous conclusion; the claimed bias came from incidents of cannibalism used in the analysis not being due to local cultures but having been carried out by explorers, stranded seafarers or escaped convicts. The original authors published a subsequent paper in 2008 defending their conclusions. Cannibalism features in the folklore and legends of many cultures and is most often attributed to evil characters or presented as extreme retribution for some wrongdoing.
Examples include the witch in "Hansel and Gretel", Lamia of Greek mythology and Baba Yaga of Slavic folklore. A number of stories in Greek mythology involve cannibalism, in particular cannibalism of close family members.