In computer science, artificial intelligence (AI), sometimes called machine intelligence, is intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans and animals. Computer science defines AI research as the study of "intelligent agents": any device that perceives its environment and takes actions that maximize its chance of achieving its goals. Colloquially, the term "artificial intelligence" is used to describe machines that mimic "cognitive" functions that humans associate with other human minds, such as "learning" and "problem solving". As machines become increasingly capable, tasks considered to require "intelligence" are removed from the definition of AI, a phenomenon known as the AI effect. A quip in Tesler's Theorem says, "AI is whatever hasn't been done yet." For instance, optical character recognition is excluded from things considered to be AI, having become a routine technology. Modern machine capabilities classified as AI include understanding human speech, competing at the highest level in strategic game systems, autonomously operating cars, intelligent routing in content delivery networks, and military simulations.
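The "intelligent agent" framing above can be made concrete with a small sketch. The following toy Python program is purely illustrative: the names choose_action, transition, and utility are invented for this example rather than drawn from any AI library. An agent repeatedly perceives its state and picks whichever available action its model predicts will best advance its goal.

```python
# Illustrative sketch of the agent abstraction: perceive the current state,
# then take the action predicted to maximize goal achievement.
def choose_action(state, actions, transition, utility):
    """Pick the action whose predicted successor state scores highest."""
    return max(actions, key=lambda a: utility(transition(state, a)))

# Toy environment: an agent on a number line whose goal is to reach 10.
transition = lambda s, a: s + a      # actions shift the agent's position
utility = lambda s: -abs(10 - s)     # states closer to the goal score higher

state = 0
while state != 10:
    state = transition(state, choose_action(state, [-1, +1], transition, utility))
print("goal reached at state", state)
```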
Artificial intelligence can be classified into three different types of systems: analytical, human-inspired, and humanized artificial intelligence. Analytical AI has only characteristics consistent with cognitive intelligence. Human-inspired AI has elements from emotional intelligence. Humanized AI shows characteristics of all types of competencies, is able to be self-conscious and is self-aware in interactions with others. Artificial intelligence was founded as an academic discipline in 1956, and in the years since has experienced several waves of optimism, followed by disappointment and the loss of funding, followed by new approaches and renewed funding. For most of its history, AI research has been divided into subfields that often fail to communicate with each other; these subfields are based on technical considerations, such as particular goals, the use of particular tools, or deep philosophical differences. Subfields have also been based on social factors. The traditional problems of AI research include reasoning, knowledge representation, learning, natural language processing, and the ability to move and manipulate objects.
General intelligence is among the field's long-term goals. Approaches include statistical methods, computational intelligence, and traditional symbolic AI. Many tools are used in AI, including versions of search and mathematical optimization, artificial neural networks, and methods based on statistics and economics; the AI field draws upon computer science, information engineering, psychology, linguistics and many other fields. The field was founded on the claim that human intelligence "can be so precisely described that a machine can be made to simulate it"; this raises philosophical arguments about the nature of the mind and the ethics of creating artificial beings endowed with human-like intelligence, issues that have been explored by myth and philosophy since antiquity. Some people consider AI to be a danger to humanity if it progresses unabated. Others believe that AI, unlike previous technological revolutions, will create a risk of mass unemployment. In the twenty-first century, AI techniques have experienced a resurgence following concurrent advances in computer power, large amounts of data, and theoretical understanding.
Thought-capable artificial beings have appeared as storytelling devices since antiquity and have been common in fiction, as in Mary Shelley's Frankenstein or Karel Čapek's R.U.R. These characters and their fates raised many of the same issues now discussed in the ethics of artificial intelligence. The study of mechanical or "formal" reasoning began with philosophers and mathematicians in antiquity. The study of mathematical logic led directly to Alan Turing's theory of computation, which suggested that a machine, by shuffling symbols as simple as "0" and "1", could simulate any conceivable act of mathematical deduction; this insight, that digital computers can simulate any process of formal reasoning, is known as the Church–Turing thesis. Along with concurrent discoveries in neurobiology, information theory and cybernetics, this led researchers to consider the possibility of building an electronic brain. Turing proposed that if a human could not distinguish between responses from a machine and a human, the machine could be considered "intelligent".
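Turing's "shuffling symbols" picture is easy to make concrete. The sketch below is a deliberately minimal Turing-machine simulator, assuming an invented rule-table format (it is not drawn from Turing's paper or from any library); it increments a binary number purely by reading and writing "0"s and "1"s on a tape, one cell at a time.

```python
# Toy Turing machine: a finite rule table drives a head over a tape of
# "0"/"1" cells. This one increments a binary number.
def run_tm(tape, rules, state="start", head=0):
    """rules: (state, symbol) -> (new_symbol, move, new_state)."""
    tape = dict(enumerate(tape))
    while state != "halt":
        symbol = tape.get(head, "_")                  # "_" marks a blank cell
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol
        head += {"R": 1, "L": -1, "S": 0}[move]
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Increment: scan to the rightmost bit, then propagate the carry leftward.
rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),   # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("1", "S", "halt"),    # absorb the carry
    ("carry", "_"): ("1", "S", "halt"),    # overflow into a new cell
}

print(run_tm("1011", rules))  # -> "1100" (11 + 1 = 12)
```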
The first work now recognized as AI was McCulloch and Pitts' 1943 formal design for Turing-complete "artificial neurons". The field of AI research was born at a workshop at Dartmouth College in 1956. Attendees Allen Newell, Herbert Simon, John McCarthy, Marvin Minsky and Arthur Samuel became the founders and leaders of AI research, and they and their students produced programs that the press described as "astonishing": computers were learning checkers strategies (and by 1959 were playing better than the average human).
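To give a sense of what McCulloch and Pitts' design amounts to, here is a hedged sketch of a threshold unit in their spirit: a neuron fires when the weighted sum of its binary inputs reaches a threshold, and small networks of such units realize Boolean logic. The particular weights and thresholds below are illustrative choices for this example, not values from the 1943 paper.

```python
# McCulloch–Pitts-style binary threshold unit.
def mp_neuron(inputs, weights, threshold):
    """Fire (return 1) iff the weighted input sum meets the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

# Logic gates as single neurons; NOT uses an inhibitory (negative) weight.
AND = lambda a, b: mp_neuron([a, b], [1, 1], threshold=2)
OR  = lambda a, b: mp_neuron([a, b], [1, 1], threshold=1)
NOT = lambda a:    mp_neuron([a],    [-1],   threshold=0)

assert AND(1, 1) == 1 and AND(1, 0) == 0
assert OR(0, 1) == 1 and NOT(1) == 0 and NOT(0) == 1
```

Since such units compose into arbitrary Boolean circuits, networks of them can implement any finite logical function, which is the sense in which the design was computationally general.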
Sun Microsystems, Inc. was an American company that sold computers, computer components and information technology services, and that created the Java programming language, the Solaris operating system, ZFS, the Network File System and SPARC. Sun contributed to the evolution of several key computing technologies, among them Unix, RISC processors, thin client computing and virtualized computing. Sun was founded on February 24, 1982. At its height, the Sun headquarters were in Santa Clara, California, on the former west campus of the Agnews Developmental Center. On April 20, 2009, it was announced that Oracle Corporation would acquire Sun; the deal was completed on January 27, 2010. Sun products included computer servers and workstations built on its own RISC-based SPARC processor architecture, as well as on x86-based AMD Opteron and Intel Xeon processors. Sun developed its own storage systems and a suite of software products, including the Solaris operating system, developer tools, Web infrastructure software and identity management applications. Other technologies included the Java platform and NFS.
In general, Sun was a proponent of open systems, particularly Unix. It was a major contributor to open-source software, as evidenced by its $1 billion purchase, in 2008, of MySQL, an open-source relational database management system. At various times, Sun had manufacturing facilities in several locations worldwide, including Newark, California. However, by the time the company was acquired by Oracle, it had outsourced most manufacturing responsibilities. The initial design for what became Sun's first Unix workstation, the Sun-1, was conceived by Andy Bechtolsheim when he was a graduate student at Stanford University in Palo Alto, California. Bechtolsheim designed the SUN workstation for the Stanford University Network communications project as a personal CAD workstation. It was designed around the Motorola 68000 processor with an advanced memory management unit to support the Unix operating system with virtual memory support. He built the first ones from spare parts obtained from Stanford's Department of Computer Science and Silicon Valley supply houses.
On February 24, 1982, Vinod Khosla, Andy Bechtolsheim and Scott McNealy, all Stanford graduate students, founded Sun Microsystems. Bill Joy of Berkeley, a primary developer of the Berkeley Software Distribution, joined soon after and is counted as one of the original founders. The Sun name is derived from the initials of the Stanford University Network. Sun was profitable from its first quarter in July 1982. By 1983 Sun was known for producing 68k-based systems with high-quality graphics that were the only computers other than DEC's VAX to run 4.2BSD. It licensed the computer design to other manufacturers, which used it to build Multibus-based systems running Unix from UniSoft. Sun's initial public offering was in 1986, under the stock symbol SUNW, for Sun Workstations; the symbol was changed in 2007 to JAVA. Sun's logo, which features four interleaved copies of the word sun in the form of a rotationally symmetric ambigram, was designed by professor Vaughan Pratt of Stanford. The initial version of the logo was orange and had the sides oriented horizontally and vertically, but it was subsequently rotated to stand on one corner and re-colored, first purple and later blue.
In the dot-com bubble, Sun began making much more money, and its shares rose dramatically. It began spending much more, hiring workers and building itself out. Some of this was because of genuine demand, but much was from web start-up companies anticipating business that would never happen. In 2000, the bubble burst. Sales in Sun's important hardware division went into free-fall as customers closed shop and auctioned off high-end servers. Several quarters of steep losses led to executive departures, rounds of layoffs and other cost cutting. In December 2001, the stock fell to its 1998, pre-bubble level of about $100, but it kept falling, faster than many other tech companies. A year later it had dipped below $10 but bounced back to $20. In mid-2004, Sun closed their Newark, California, factory and consolidated all manufacturing to Hillsboro, Oregon. In 2006, the rest of the Newark campus was put on the market. In 2004, Sun canceled two major processor projects which emphasized high instruction-level parallelism and operating frequency.
Instead, the company chose to concentrate on processors optimized for multi-threading and multiprocessing, such as the UltraSPARC T1 processor. The company also announced a collaboration with Fujitsu to use the Japanese company's processor chips in mid-range and high-end Sun servers; these servers were announced on April 17, 2007, as the M-Series, part of the SPARC Enterprise series. In February 2005, Sun announced the Sun Grid, a grid computing deployment on which it offered utility computing services priced at US$1 per CPU/hour for processing and US$1 per GB/month for storage. This offering built upon an existing 3,000-CPU server farm used for internal R&D for over 10 years, which Sun marketed as being able to achieve 97% utilization. In August 2005, the first commercial use of this grid was announced for financial risk simulations, launched as its first software as a service product. In January 2005, Sun reported a net profit of $19 million for fiscal 2005 second quarter, for the first time in three years.
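As a quick illustration of the utility-pricing model just described, the snippet below multiplies usage by the two published unit prices. The example job sizes are invented for illustration; only the US$1/CPU-hour and US$1/GB-month rates come from the text above.

```python
# Toy cost arithmetic for Sun Grid's published utility pricing.
CPU_HOUR_USD = 1.00   # US$1 per CPU-hour of processing
GB_MONTH_USD = 1.00   # US$1 per GB-month of storage

def grid_cost(cpu_hours, gb_months):
    """Total charge for a job under the flat per-unit pricing."""
    return cpu_hours * CPU_HOUR_USD + gb_months * GB_MONTH_USD

# Hypothetical job: 1,000 CPUs for 6 hours, plus 50 GB stored for a month.
print(grid_cost(cpu_hours=1_000 * 6, gb_months=50))  # -> 6050.0
```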
This was followed by a net loss of $9 million on a GAAP basis for the third quarter of 2005, as reported on April 14, 2005. In January 2007, Sun reported a net GAAP profit of $126 million.
Brown University is a private Ivy League research university in Providence, Rhode Island. Founded in 1764 as the College in the English Colony of Rhode Island and Providence Plantations, it is the seventh-oldest institution of higher education in the United States and one of the nine colonial colleges chartered before the American Revolution. At its foundation, Brown was the first college in the U.S. to accept students regardless of their religious affiliation. Its engineering program was established in 1847, and it was one of the early doctoral-granting U.S. institutions in the late 19th century, adding master's and doctoral studies in 1887. In 1969, after a period of student lobbying, Brown adopted a New Curriculum, sometimes referred to as the Brown Curriculum. The New Curriculum eliminated mandatory "general education" distribution requirements, made students "the architects of their own syllabus" and allowed them to take any course for a grade of satisfactory or unrecorded no-credit. In 1971, Brown's coordinate women's institution, Pembroke College, was merged into the university.
Undergraduate admission is selective, with an acceptance rate of 6.6% for the class of 2023. The university comprises the College, the Graduate School, Alpert Medical School, the School of Engineering, the School of Public Health and the School of Professional Studies. Brown's international programs are organized through the Watson Institute for International and Public Affairs, and the university is academically affiliated with the Marine Biological Laboratory and the Rhode Island School of Design. The Brown/RISD Dual Degree Program, offered in conjunction with the Rhode Island School of Design, is a five-year course that awards degrees from both institutions. Brown's main campus is located in the College Hill Historic District in the city of Providence, Rhode Island. The University's neighborhood is a federally listed architectural district with a dense concentration of Colonial-era buildings. Benefit Street, on the western edge of the campus, contains "one of the finest cohesive collections of restored seventeenth- and eighteenth-century architecture in the United States".
As of August 2018, 8 Nobel Prize winners have been affiliated with Brown University as alumni, faculty members or researchers. In addition, Brown's faculty and alumni include five National Humanities Medalists and ten National Medal of Science laureates. Other notable alumni include eight billionaire graduates, a U.S. Supreme Court Chief Justice, four U.S. Secretaries of State and other Cabinet officials, 54 members of the United States Congress, 56 Rhodes Scholars, 52 Gates Cambridge Scholars, 49 Marshall Scholars, 14 MacArthur Genius Fellows, 21 Pulitzer Prize winners, various royals and nobles, as well as leaders and founders of Fortune 500 companies. The origin of Brown University can be dated to 1761, when three residents of Newport, Rhode Island, drafted a petition to the General Assembly of the colony: Your Petitioners propose to open a literary institution or School for instructing young Gentlemen in the Languages, Geography & History, & such other branches of Knowledge as shall be desired.
That for this End... it will be necessary... to erect a public Building or Buildings for the boarding of the youth & the Residence of the Professors. The three petitioners were Ezra Stiles, pastor of Newport's Second Congregational Church and future president of Yale; William Ellery, Jr., future signer of the Declaration of Independence; and Josias Lyndon, future governor of the colony. Stiles and Ellery were co-authors of the Charter of the College two years later. The editor of Stiles's papers observes, "This draft of a petition connects itself with other evidence of Dr. Stiles's project for a Collegiate Institution in Rhode Island, before the charter of what became Brown University." There is further documentary evidence that Stiles was making plans for a college in 1762. On January 20, Chauncey Whittelsey, pastor of the First Church of New Haven, answered a letter from Stiles: The week before last I sent you the Copy of Yale College Charter... Should you make any Progress in the Affair of a Colledge, I should be glad to hear of it. The Philadelphia Association of Baptist Churches had an eye on Rhode Island, home of the mother church of their denomination: the First Baptist Church in America, founded in Providence in 1638 by Roger Williams.
The Baptists were as yet unrepresented among the colonial colleges. Isaac Backus, the historian of the New England Baptists and an inaugural Trustee of Brown, writing in 1784, described the October 1762 resolution taken at Philadelphia: The Philadelphia Association obtained such an acquaintance with our affairs, as to bring them to an apprehension that it was practicable and expedient to erect a college in the Colony of Rhode-Island, under the chief direction of the Baptists. Mr. James Manning, who took his first degree in New-Jersey college in September, 1762, was esteemed a suitable leader in this important work. Manning arrived at Newport in July 1763 and was introduced to Stiles, who agreed to write the Charter for the College. Stiles's first draft was read to the General Assembly in August 1763 and rejected by Baptist members who worried that the College Board of Fellows would under-represent the Baptists. A revised Charter written by Stiles and Ellery was adopted by the Assembly on March 3, 1764.
In September 1764, the inaugural meeting of the College Corporation was held at Newport.
OCLC Online Computer Library Center, Incorporated, d/b/a OCLC, is an American nonprofit cooperative organization "dedicated to the public purposes of furthering access to the world's information and reducing information costs". It was founded in 1967 as the Ohio College Library Center. OCLC and its member libraries cooperatively produce and maintain WorldCat, the largest online public access catalog in the world. OCLC is funded by the fees that libraries pay for its services. OCLC also maintains the Dewey Decimal Classification system. OCLC began in 1967, as the Ohio College Library Center, through a collaboration of university presidents, vice presidents and library directors who wanted to create a cooperative computerized network for libraries in the state of Ohio. The group first met on July 5, 1967 on the campus of the Ohio State University to sign the articles of incorporation for the nonprofit organization, and hired Frederick G. Kilgour, a former Yale University medical school librarian, to design the shared cataloging system.
Kilgour wished to merge the latest information storage and retrieval system of the time, the computer, with the oldest, the library. The plan was to merge the catalogs of Ohio libraries electronically through a computer network and database to streamline operations, control costs and increase efficiency in library management, bringing libraries together to cooperatively keep track of the world's information in order to best serve researchers and scholars. The first library to do online cataloging through OCLC was the Alden Library at Ohio University, on August 26, 1971; this was the first online cataloging by any library worldwide. Membership in OCLC is based on use of services and contribution of data. Between 1967 and 1977, OCLC membership was limited to institutions in Ohio, but in 1978, a new governance structure was established that allowed institutions from other states to join. In 2002, the governance structure was again modified to accommodate participation from outside the United States.
As OCLC expanded services in the United States outside Ohio, it relied on establishing strategic partnerships with "networks", organizations that provided training and marketing services. By 2008, there were 15 independent United States regional service providers. OCLC networks played a key role in OCLC governance, with networks electing delegates to serve on the OCLC Members Council. During 2008, OCLC commissioned two studies to look at distribution channels. In early 2009, OCLC negotiated new contracts with the former networks and opened a centralized support center. OCLC provides bibliographic and full-text information to anyone. OCLC and its member libraries cooperatively produce and maintain WorldCat—the OCLC Online Union Catalog, the largest online public access catalog in the world. WorldCat has holding records from public and private libraries worldwide. The Open WorldCat program, launched in late 2003, exposed a subset of WorldCat records to Web users via popular Internet search and bookselling sites.
In October 2005, the OCLC technical staff began a wiki project, WikiD, allowing readers to add commentary and structured-field information associated with any WorldCat record; WikiD was later phased out. The Online Computer Library Center acquired the trademark and copyrights associated with the Dewey Decimal Classification System when it bought Forest Press in 1988. A browser for books with their Dewey Decimal Classifications was available until July 2013. Until August 2009, when it was sold to Backstage Library Works, OCLC owned a preservation microfilm and digitization operation called the OCLC Preservation Service Center, with its principal office in Bethlehem, Pennsylvania. The reference management service QuestionPoint provides libraries with tools to communicate with users; this around-the-clock reference service is provided by a cooperative of participating global libraries. Starting in 1971, OCLC produced catalog cards for members alongside its shared online catalog. OCLC commercially sells software, such as CONTENTdm for managing digital collections.
It also offers the bibliographic discovery system WorldCat Discovery, which allows library patrons to use a single search interface to access an institution's catalog, database subscriptions and more. OCLC has been conducting research for the library community for more than 30 years. In accordance with its mission, OCLC makes its research outcomes known through various publications; these publications, including journal articles, reports and presentations, are available through the organization's website. OCLC Publications – Research articles from various journals, including Code4Lib Journal, OCLC Research, Reference & User Services Quarterly, College & Research Libraries News, Art Libraries Journal and National Education Association Newsletter; the most recent publications are displayed first, and all archived resources, starting in 1970, are available. Membership Reports – A number of significant reports on topics ranging from virtual reference in libraries to perceptions about library funding. Newsletters – Current and archived newsletters for the library and archive community.
Presentations – Presentations from both guest speakers and OCLC researchers from conferences and other events. The presentations are organized into five categories: Conference presentations, Dewey presentations, Distinguished Seminar Series, Guest presentations and Research staff presentations.
Computer science is the study of processes that interact with data and that can be represented as data in the form of programs. It enables the use of algorithms to manipulate, store and communicate digital information. A computer scientist studies the theory of computation and the practice of designing software systems. Its fields can be divided into theoretical and practical disciplines: computational complexity theory is abstract, while computer graphics emphasizes real-world applications. Programming language theory considers approaches to the description of computational processes, while computer programming itself involves the use of programming languages and complex systems. Human–computer interaction considers the challenges in making computers useful and accessible. The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks such as the abacus have existed since antiquity, aiding in computations such as multiplication and division.
Algorithms for performing computations have existed since antiquity, even before the development of sophisticated computing equipment. Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623. In 1673, Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner; he may be considered the first computer scientist and information theorist for, among other reasons, documenting the binary number system. In 1820, Thomas de Colmar launched the mechanical calculator industry when he released his simplified arithmometer, the first calculating machine strong enough and reliable enough to be used daily in an office environment. Charles Babbage started the design of the first automatic mechanical calculator, his Difference Engine, in 1822, which eventually gave him the idea of the first programmable mechanical calculator, his Analytical Engine. He started developing this machine in 1834, and "in less than two years, he had sketched out many of the salient features of the modern computer".
"A crucial step was the adoption of a punched card system derived from the Jacquard loom" making it infinitely programmable. In 1843, during the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm to compute the Bernoulli numbers, considered to be the first computer program. Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information. In 1937, one hundred years after Babbage's impossible dream, Howard Aiken convinced IBM, making all kinds of punched card equipment and was in the calculator business to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine, which itself used cards and a central computing unit; when the machine was finished, some hailed it as "Babbage's dream come true". During the 1940s, as new and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors.
As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. In 1945, IBM founded the Watson Scientific Computing Laboratory at Columbia University in New York City; the renovated fraternity house on Manhattan's West Side was IBM's first laboratory devoted to pure science. The lab is the forerunner of IBM's Research Division, which today operates research facilities around the world; the close relationship between IBM and the university was instrumental in the emergence of a new scientific discipline, with Columbia offering one of the first academic-credit courses in computer science in 1946. Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s; the world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science degree program in the United States was formed at Purdue University in 1962.
Since practical computers became available, many applications of computing have become distinct areas of study in their own rights. Although many initially believed it was impossible that computers themselves could be a scientific field of study, in the late fifties it became accepted among the greater academic population. It is the now well-known IBM brand that formed part of the computer science revolution during this time. IBM released the IBM 704 and the IBM 709 computers, which were used during the exploration period of such devices. "Still, working with the IBM was frustrating... if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again." During the late 1950s, the computer science discipline was still very much in its developmental stages, and such issues were commonplace. Time has seen significant improvements in the effectiveness of computing technology. Modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals, to a near-ubiquitous user base.
Initially, computers were quite costly, and some degree of human aid was needed for efficient use—in part from professional computer operators. As computer adoption became more widespread and affordable, less human assistance was needed for common usage. Despite its short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society—in fact, along with electronics, it is a founding science of the current epoch of human history called the Information Age.
Association for Computing Machinery
The Association for Computing Machinery (ACM) is an international learned society for computing. It was founded in 1947 and is the world's largest scientific and educational computing society. The ACM is a non-profit professional membership group, with nearly 100,000 members as of 2019. Its headquarters are in New York City. The ACM is an umbrella organization for scholarly interests in computer science. Its motto is "Advancing Computing as a Science & Profession". The ACM was founded in 1947 under the name Eastern Association for Computing Machinery, which was changed the following year to the Association for Computing Machinery. ACM is organized into over 171 local chapters and 37 Special Interest Groups (SIGs), through which it conducts most of its activities. Additionally, there are over 500 university chapters; the first student chapter was founded in 1961 at the University of Louisiana at Lafayette. Many of the SIGs, such as SIGGRAPH, SIGPLAN, SIGCSE and SIGCOMM, sponsor regular conferences, which have become famous as the dominant venue for presenting innovations in certain fields.
The groups publish a large number of specialized journals and newsletters. ACM also sponsors other computer science related events such as the worldwide ACM International Collegiate Programming Contest, and has sponsored some other events such as the chess match between Garry Kasparov and the IBM Deep Blue computer. ACM publishes over 50 journals, including the prestigious Journal of the ACM, and two general magazines for computer professionals, Communications of the ACM and Queue. Other publications of the ACM include: ACM XRDS, formerly "Crossroads", which was redesigned in 2010 and is the most popular student computing magazine in the US; ACM Interactions, an interdisciplinary HCI publication focused on the connections between experiences and technology, and the third largest ACM publication; ACM Computing Surveys; ACM Computers in Entertainment; ACM Special Interest Group: Computers and Society; and a number of journals, specific to subfields of computer science, titled ACM Transactions. Some of the more notable Transactions include: ACM Transactions on Computer Systems; IEEE/ACM Transactions on Computational Biology and Bioinformatics; ACM Transactions on Computational Logic; ACM Transactions on Computer-Human Interaction; ACM Transactions on Database Systems; ACM Transactions on Graphics; ACM Transactions on Mathematical Software; ACM Transactions on Multimedia Computing, Communications, and Applications; IEEE/ACM Transactions on Networking; and ACM Transactions on Programming Languages and Systems. Although Communications no longer publishes primary research and is not considered a prestigious venue for new research, many of the great debates and results in computing history have been published in its pages.
ACM has made all of its publications available to paid subscribers online at its Digital Library, and also has a Guide to Computing Literature. Individual members additionally have access to Safari Books Online and Books24x7. ACM also offers insurance, online courses and other services to its members. In 1997, ACM Press published Wizards and Their Wonders: Portraits in Computing, written by Christopher Morgan, with new photographs by Louis Fabian Bachrach; the book is a collection of historic and current portrait photographs of figures from the computer industry. The ACM Portal is an online service of the ACM; its core consists of two main sections: the ACM Digital Library and the ACM Guide to Computing Literature. The ACM Digital Library is the full-text collection of all articles published by the ACM in its journals, magazines and conference proceedings; the Guide is a bibliography in computing with over one million entries. The ACM Digital Library contains a comprehensive archive, starting in the 1950s, of the organization's journals, magazines and conference proceedings.
Online services include a forum called Tech News digest. There is an extensive underlying bibliographic database containing key works of all genres from all major publishers of computing literature; this secondary database is a rich discovery service known as The ACM Guide to Computing Literature. ACM adopted a hybrid Open Access (OA) publishing model in 2013. Authors who do not choose to pay the OA fee must grant ACM publishing rights by either a copyright transfer agreement or a publishing license agreement. ACM was a "green" publisher: authors may post documents on their own websites and in their institutional repositories with a link back to the ACM Digital Library's permanently maintained Version of Record. All metadata in the Digital Library is open to the world, including abstracts, linked references, citing works and usage statistics, as well as all functionality and services. Other than the free articles, the full texts are accessed by subscription. There is also a mounting challenge to the ACM's publication practices coming from the open access movement.
Some authors see a centralized peer-review process as less relevant and publish on their home pages or on unreviewed sites like arXiv. Other organizations have sprung up which do their peer review free and online, such as the Journal of Artificial Intelligence Research, the Journal of Machine Learning Research and the Journal of Research and Practice in Information Technology. In addition to student and regular members, ACM has several advanced membership grades to recognize those with multiple years of membership and "demonstrated performance that sets them apart from their peers". The number of Fellows, Distinguished Members and Senior Members cannot exceed 1%, 10% and 25% of the total number of professional members, respectively.
Scientific opinion on climate change
Scientific opinion on climate change is a judgment of scientists regarding the degree to which global warming is occurring, its causes, and its probable consequences. A related—but not identical—term, "scientific consensus on climate change", is the prevailing view on climate change within the scientific community. The consensus is that: Earth's climate has warmed since the late 1800s; human activities are the primary cause; continuing emissions will increase the severity of global effects; and people could manage future climate change effects through intense efforts at reducing further warming while preparing for any unavoidable climate changes. Several studies of the consensus have been undertaken. Among the most-cited is a 2013 study of nearly 12,000 abstracts of peer-reviewed papers on climate science published since 1990, of which just over 4,000 papers expressed an opinion on the cause of recent global warming. Of these, 97% agree that global warming is happening and is human-caused; it is "extremely likely" that this warming arises from human activities, principally emissions of greenhouse gases, in the atmosphere.
Natural change alone would have had a slight cooling effect rather than a warming effect. This scientific opinion is expressed in synthesis reports, by scientific bodies of national or international standing, and by surveys of opinion among climate scientists. Individual scientists and laboratories contribute to the overall scientific opinion via their peer-reviewed publications, and the areas of collective agreement and relative certainty are summarised in these respected reports and surveys. The IPCC's Fifth Assessment Report was completed in 2014. Its conclusions are summarized below: "Warming of the climate system is unequivocal, and since the 1950s, many of the observed changes are unprecedented over decades to millennia." "Atmospheric concentrations of carbon dioxide, methane and nitrous oxide have increased to levels unprecedented in at least the last 800,000 years." Human influence on the climate system is clear. It is extremely likely that human influence was the dominant cause of global warming between 1951 and 2010.
"Increasing magnitudes of warming increase the likelihood of severe and irreversible impacts." "A first step towards adaptation to future climate change is reducing vulnerability and exposure to present climate variability." "The overall risks of climate change impacts can be reduced by limiting the rate and magnitude of climate change" Without new policies to mitigate climate change, projections suggest an increase in global mean temperature in 2100 of 3.7 to 4.8 °C, relative to pre-industrial levels. The current trajectory of global greenhouse gas emissions is not consistent with limiting global warming to below 1.5 or 2°C, relative to pre-industrial levels. Pledges made as part of the Cancún Agreements are broadly consistent with cost-effective scenarios that give a "likely" chance of limiting global warming to below 3 °C, relative to pre-industrial levels. National and international science academies and scientific societies have assessed current scientific opinion on global warming; these assessments are consistent with the conclusions of the Intergovernmental Panel on Climate Change.
Some scientific bodies have recommended specific policies to governments, and science can play a role in informing an effective response to climate change. Policy decisions, however, may require value judgements and so are not included in the scientific opinion. No scientific body of national or international standing maintains a formal opinion dissenting from any of these main points. The last national or international scientific body to drop dissent was the American Association of Petroleum Geologists, which in 2007 updated its statement to its current non-committal position. Some other organizations, primarily those focusing on geology, also hold non-committal positions. Synthesis reports are assessments of scientific literature that compile the results of a range of stand-alone studies in order to achieve a broad level of understanding, or to describe the state of knowledge of a given subject. The IPCC Fifth Assessment Report followed the same general format as the Fourth Assessment Report, with three Working Group reports and a Synthesis report.
The Working Group I report was published in September 2013. The report's Summary for Policymakers stated that warming of the climate system is 'unequivocal', with changes unprecedented over decades to millennia, including warming of the atmosphere and oceans, loss of snow and ice, and sea level rise. Greenhouse gas emissions, driven largely by economic and population growth, have led to greenhouse gas concentrations that are unprecedented in at least the last 800,000 years. These, together with other anthropogenic drivers, are "extremely likely" to have been the dominant cause of the observed global warming since the mid-20th century. It said that continued emission of greenhouse gases will cause further warming and long-lasting changes in all components of the climate system, increasing the likelihood of severe and irreversible impacts for people and ecosystems. Limiting climate change would require substantial and sustained reductions in greenhouse gas emissions which, together with adaptation, can limit climate change risks.
Reporting on the publication of the report, The Guardian said that in the end it all boils down to risk management: the stronger our efforts to reduce greenhouse gas emissions, the lower the risk of extreme climate impacts; the higher our emissions, the larger the climate changes we'll face.