Futures studies, also called futurology, is the study of postulating possible and preferable futures and the worldviews and myths that underlie them. In general, it can be considered a branch of the social sciences and parallel to the field of history. Futures studies seeks to understand what is likely to continue and what could plausibly change. Part of the discipline thus seeks a systematic and pattern-based understanding of the past and present in order to determine the likelihood of future events and trends. Unlike the physical sciences, where a narrower, more specified system is studied, futurology concerns a much larger and more complex world system; its methodology and knowledge are much less proven than those of natural science or of social sciences such as sociology and economics. There is debate as to whether this discipline is an art or a science, and it is sometimes described by scientists as a pseudoscience. Futures studies is an interdisciplinary field that aggregates and analyzes trends, with both lay and professional methods, to compose possible futures.
It includes analyzing the sources and causes of change and stability in an attempt to develop foresight. Around the world the field is variously referred to as futures studies, strategic foresight, futures thinking, and futurology. Futures studies and strategic foresight are the academic field's most commonly used terms in the English-speaking world. Foresight was the original term and was first used in this sense by H. G. Wells in 1932. "Futurology" is a term common in encyclopedias, though today it is used almost exclusively by nonpractitioners, at least in the English-speaking world. "Futurology" is defined as the "study of the future." The term was coined by German professor Ossip K. Flechtheim in the mid-1940s, who proposed it as a new branch of knowledge that would include a new science of probability. The term has fallen from favor in recent decades because modern practitioners stress the importance of alternative and plural futures rather than one monolithic future, and the limitations of prediction and probability as opposed to the creation of possible and preferable futures.
Three factors distinguish futures studies from the research conducted by other disciplines. First, futures studies examines trends to compose possible and preferable futures, along with the role "wild cards" can play in future scenarios. Second, futures studies attempts to gain a holistic or systemic view based on insights from a range of different disciplines, generally focusing on the STEEP categories of Social, Technological, Economic, Environmental, and Political. Third, futures studies challenges and unpacks the assumptions behind dominant and contending views of the future; the future thus is not empty but is fraught with hidden assumptions. For example, many people expect the collapse of the Earth's ecosystem in the near future, while others believe the current ecosystem will survive indefinitely. A foresight approach would seek to highlight the assumptions underpinning such views. As a field, futures studies expands on the research component by emphasizing the communication of a strategy and the actionable steps needed to implement the plan or plans leading to the preferable future.
It is in this regard that futures studies evolves from an academic exercise into a more traditional business-like practice, looking to better prepare organizations for the future. Futures studies does not focus on short-term predictions, such as interest rates over the next business cycle, or the concerns of managers or investors with short-term time horizons. Most strategic planning, which develops goals and objectives with time horizons of one to three years, is not considered futures work. Plans and strategies with longer time horizons that attempt to anticipate possible future events are part of the field; as a rule, futures studies is concerned with changes of transformative impact rather than those of an incremental or narrow scope. The futures field also excludes those who make future predictions through professed supernatural means. Johan Galtung and Sohail Inayatullah argue in Macrohistory and Macrohistorians that the search for grand patterns of social change goes all the way back to Ssu-Ma Chien and his theory of the cycles of virtue, although the work of Ibn Khaldun, such as The Muqaddimah, would be an example that is more intelligible to modern sociology.
Early western examples include Sir Thomas More's "Utopia," published in 1516 and based upon Plato's "Republic," in which a future society has overcome poverty and misery to create a perfect model for living. This work was so powerful that utopias have come to represent positive and fulfilling futures in which everyone's needs are met. Some intellectual foundations of futures studies appeared in the mid-19th century. Auguste Comte, considered the father of scientific philosophy, was influenced by the work of utopian socialist Henri Saint-Simon, and his discussion of the metapatterns of social change presages futures studies as a scholarly dialogue. The first works that attempt to make systematic predictions for the future were written in the 18th century. Memoirs of the Twentieth Century, written by Samuel Madden in 1733, takes the form of a series of diplomatic letters written in 1997 and 1998 from British representatives in the foreign cities of Constantinople, Rome, and Moscow. However, the technology of the 20th century is identical to that of Madden's own era; the focus is instead on the political and religious state of the world in the future.
Madden went on to write The Reign of George VI, 1900 to 1925, where (in th
Systems engineering is an interdisciplinary field of engineering and engineering management that focuses on how to design and manage complex systems over their life cycles. At its core, systems engineering utilizes systems thinking principles to organize this body of knowledge; the individual outcome of such efforts, an engineered system, can be defined as a combination of components that work in synergy to collectively perform a useful function. Issues such as requirements engineering, logistics, coordination of different teams, evaluation, maintainability, and many other disciplines necessary for successful system development, design, and ultimate decommissioning become more difficult when dealing with large or complex projects. Systems engineering deals with work processes, optimization methods, and risk-management tools in such projects; it overlaps technical and human-centered disciplines such as industrial engineering, mechanical engineering, manufacturing engineering, control engineering, software engineering, electrical engineering, organizational studies, civil engineering, and project management.
Systems engineering ensures that all aspects of a project or system are considered and integrated into a whole. The systems engineering process is a discovery process, quite unlike a manufacturing process. A manufacturing process is focused on repetitive activities that achieve high-quality outputs with minimum cost and time; the systems engineering process must begin by discovering the real problems that need to be resolved and identifying the most probable or highest-impact failures that can occur. Systems engineering involves finding solutions to these problems. The term systems engineering can be traced back to Bell Telephone Laboratories in the 1940s. The need to identify and manipulate the properties of a system as a whole, which in complex engineering projects may differ from the sum of the parts' properties, motivated various industries, especially those developing systems for the U.S. military, to apply the discipline. When it was no longer possible to rely on design evolution to improve upon a system and the existing tools were not sufficient to meet growing demands, new methods began to be developed that addressed the complexity directly.
The continuing evolution of systems engineering comprises the development and identification of new methods and modeling techniques. These methods aid in a better comprehension of the design and developmental control of engineering systems as they grow more complex. Popular tools that are used in the systems engineering context were developed during these times, including USL, UML, QFD, and IDEF0. In 1990, a professional society for systems engineering, the National Council on Systems Engineering (NCOSE), was founded by representatives from a number of U.S. corporations and organizations. NCOSE was created to address the need for improvements in systems engineering practices and education; as a result of growing involvement from systems engineers outside of the U.S., the name of the organization was changed to the International Council on Systems Engineering (INCOSE) in 1995. Schools in several countries offer graduate programs in systems engineering, and continuing education options are available for practicing engineers.
Systems engineering signifies only an approach and, more recently, a discipline in engineering. The aim of education in systems engineering is to formalize various approaches and, in doing so, identify new methods and research opportunities similar to those that occur in other fields of engineering. As an approach, systems engineering is interdisciplinary in flavour. The traditional scope of engineering embraces the conception, development, and operation of physical systems. Systems engineering, as originally conceived, falls within this scope. "Systems engineering", in this sense of the term, refers to the building of engineering concepts. The use of the term "systems engineer" has evolved over time to embrace a wider, more holistic concept of "systems" and of engineering processes. This evolution of the definition has been a subject of ongoing controversy, and the term continues to apply to both the narrower and the broader scope. Traditional systems engineering was seen as a branch of engineering in the classical sense, that is, as applied only to physical systems, such as spacecraft and aircraft.
More recently, systems engineering has evolved to take on a broader meaning, especially when humans were seen as an essential component of a system. Checkland, for example, captures the broader meaning of systems engineering by stating that 'engineering' "can be read in its general sense." Enterprise Systems Engineering pertains to the view of enterprises, that is, organizations or combinations of organizations, as systems. Service Systems Engineering has to do with the engineering of service systems. Checkland defines a service system as a system which is conceived as serving another system. Most civil infrastructure systems are service systems. Systems engineering focuses on analyzing and eliciting customer needs and required functionality early in the development cycle, documenting requirements, then proceeding with design synthesis and system validation while considering the complete problem, the system lifecycle; this includes understanding all of the stakeholders involved. Oliver et al. claim that the systems engineerin
The Delphi method is a structured communication technique or method, originally developed as a systematic, interactive forecasting method which relies on a panel of experts. The technique can also be adapted for use in face-to-face meetings, and is then called mini-Delphi or Estimate-Talk-Estimate. Delphi has been used for business forecasting and has certain advantages over another structured forecasting approach, prediction markets. Delphi is based on the principle that forecasts from a structured group of individuals are more accurate than those from unstructured groups. The experts answer questionnaires in two or more rounds. After each round, a facilitator or change agent provides an anonymised summary of the experts' forecasts from the previous round as well as the reasons they provided for their judgments. Experts are thus encouraged to revise their earlier answers in light of the replies of other members of their panel; it is believed that during this process the range of the answers will decrease and the group will converge towards the "correct" answer.
The process is stopped after a predefined stop criterion, and the mean or median scores of the final rounds determine the results. The name "Delphi" derives from the Oracle of Delphi, although the authors of the method were unhappy with the oracular connotation of the name, "smacking a little of the occult". The Delphi method is based on the assumption that group judgments are more valid than individual judgments. The Delphi method was developed at the beginning of the Cold War to forecast the impact of technology on warfare. In 1944, General Henry H. Arnold ordered the creation of a report for the U.S. Army Air Corps on the future technological capabilities that might be used by the military. Different approaches were tried, but the shortcomings of traditional forecasting methods, such as theoretical approaches, quantitative models, or trend extrapolation, became apparent in areas where precise scientific laws had not yet been established. To combat these shortcomings, the Delphi method was developed by Project RAND during the 1950s and 1960s by Olaf Helmer, Norman Dalkey, and Nicholas Rescher.
It has been used ever since, together with various modifications and reformulations, such as the Imen-Delphi procedure. Experts were asked to give their opinion on the probability and intensity of possible enemy attacks. Other experts could anonymously give feedback; this process was repeated several times. The following key characteristics of the Delphi method help the participants to focus on the issues at hand and separate Delphi from other methodologies: in this technique a panel of experts is drawn from both inside and outside the organisation; the panel consists of experts having knowledge of the area requiring decision making. Each expert is asked to make anonymous predictions. All participants remain anonymous; their identity is not revealed, even after the completion of the final report. This prevents the authority, personality, or reputation of some participants from dominating others in the process. Arguably, it also frees participants from their personal biases, minimizes the "bandwagon effect" or "halo effect", allows free expression of opinions, encourages open critique, and facilitates admission of errors when revising earlier judgments.
The initial contributions from the experts are collected in the form of answers to questionnaires and their comments on these answers. The panel director controls the interactions among the participants by processing the information and filtering out irrelevant content; this avoids the negative effects of face-to-face panel discussions and solves the usual problems of group dynamics. The Delphi method allows participants to comment on the responses of others and on the progress of the panel as a whole, and to revise their own forecasts and opinions in real time. The person coordinating the Delphi method is known as a facilitator or leader, and facilitates the responses of their panel of experts, who are selected because they hold knowledge of a relevant opinion or view. The facilitator sends out questionnaires, surveys, etc., and if the panel of experts accept, they follow instructions and present their views. Responses are collected and analyzed, and common and conflicting viewpoints are identified. If consensus is not reached, the process continues through thesis and antithesis, working gradually towards synthesis and building consensus.
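As a rough illustration of how a facilitator might run such rounds, the sketch below simulates a simple numeric Delphi exercise in Python: the anonymised feedback is the panel median, the spread is measured with the interquartile range (IQR), and the process stops once the spread falls below a threshold or a maximum number of rounds is reached. The function names, the 0.5 revision factor, and the IQR-based stop rule are illustrative assumptions, not part of any standard Delphi specification.

```python
import statistics

def summarize_round(estimates):
    """Anonymised summary a facilitator might feed back: median, IQR, and range."""
    ordered = sorted(estimates)
    q1, q2, q3 = statistics.quantiles(ordered, n=4)  # quartile cut points
    return {"median": q2, "iqr": q3 - q1, "low": ordered[0], "high": ordered[-1]}

def run_delphi(initial_estimates, revise, max_rounds=4, iqr_threshold=1.0):
    """Iterate rounds until the spread (IQR) falls below a threshold or rounds run out.

    `revise` stands in for the experts: given one expert's previous estimate and the
    anonymised summary, it returns that expert's revised estimate for the next round.
    """
    estimates = list(initial_estimates)
    for round_no in range(1, max_rounds + 1):
        summary = summarize_round(estimates)
        if summary["iqr"] <= iqr_threshold:            # predefined stop criterion
            return summary["median"], round_no, summary
        estimates = [revise(e, summary) for e in estimates]
    summary = summarize_round(estimates)
    return summary["median"], max_rounds, summary

# Toy usage: experts move part-way toward the group median each round.
if __name__ == "__main__":
    panel = [3.0, 5.0, 8.0, 12.0, 20.0]               # initial forecasts, e.g. "years until X"
    toward_median = lambda est, s: est + 0.5 * (s["median"] - est)
    forecast, rounds_used, final = run_delphi(panel, toward_median)
    print(f"group forecast={forecast:.1f} after {rounds_used} round(s), IQR={final['iqr']:.2f}")
```

In a real Delphi exercise the revision step is performed by the human experts themselves; the callback here only stands in for their judgment so that the round-by-round control flow can be shown end to end.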
During the past decades, facilitators have used many different measures and thresholds to measure the degree of consensus or dissent. A comprehensive literature review and summary is compiled in an article by von der Gracht. The first applications of the Delphi method were in the field of technology forecasting; the objective was to combine expert opinions on the likelihood and expected development time of a particular technology into a single indicator. One of the first such reports, prepared in 1964 by Gordon and Helmer, assessed the direction of long-term trends in science and technology development, covering such topics as scientific breakthroughs, population control, space progress, war prevention, and weapon systems. Other technology forecasts dealt with vehicle-highway systems, industrial robots, intelligent internet, broadband connections, and technology in education. The Delphi method was later applied in other areas, especially those related to public policy issues, such as economic trends and education.
It was also applied, with high accuracy, in business forecasting. For example, in one case reported by Basu and Schroeder, the
History is the study of the past as it is described in written documents. Events occurring before written record are considered prehistory. History is an umbrella term that relates to past events as well as the memory, collection, organization, and interpretation of information about these events. Scholars who write about history are called historians. History can refer to the academic discipline which uses a narrative to examine and analyse a sequence of past events, and to objectively determine the patterns of cause and effect that determine them. Historians sometimes debate the nature of history and its usefulness by discussing the study of the discipline as an end in itself and as a way of providing "perspective" on the problems of the present. Stories common to a particular culture, but not supported by external sources, are classified as cultural heritage or legends, because they do not show the "disinterested investigation" required of the discipline of history. Herodotus, a 5th-century BC Greek historian, is considered within the Western tradition to be the "father of history" and, along with his contemporary Thucydides, helped form the foundations for the modern study of human history.
Their works continue to be read today, and the gap between the culture-focused Herodotus and the military-focused Thucydides remains a point of contention or approach in modern historical writing. In East Asia, a state chronicle, the Spring and Autumn Annals, was known to be compiled from as early as 722 BC, although only 2nd-century BC texts have survived. Ancient influences have helped spawn variant interpretations of the nature of history which have evolved over the centuries and continue to change today. The modern study of history is wide-ranging, and includes the study of specific regions and the study of certain topical or thematical elements of historical investigation. History is taught as part of primary and secondary education, and the academic study of history is a major discipline in university studies. The word history comes from the Ancient Greek ἱστορία, meaning 'inquiry', 'knowledge from inquiry', or 'judge'. It was in that sense; the ancestor word ἵστωρ is attested early on in Homeric Hymns, the Athenian ephebes' oath, and in Boiotic inscriptions.
The Greek word was borrowed into Classical Latin as historia, meaning "investigation, research, description, written account of past events, writing of history, historical narrative, recorded knowledge of past events, narrative". History was borrowed from Latin into Old English as stær, but this word fell out of use in the late Old English period. Meanwhile, as Latin became Old French, historia developed into forms such as istorie and historie, with new developments in the meaning: "account of the events of a person's life, account of events as relevant to a group of people or people in general, dramatic or pictorial representation of historical events, body of knowledge relative to human evolution, narrative of real or imaginary events, story". It was from Anglo-Norman that history was borrowed into Middle English, and this time the loan stuck. It appears in the 13th-century Ancrene Wisse, but seems to have become a common word in the late 14th century, with an early attestation appearing in John Gower's Confessio Amantis of the 1390s: "I finde in a bok compiled | To this matiere an old histoire, | The which comth nou to mi memoire".
In Middle English, the meaning of history was "story" in general. The restriction to the meaning "the branch of knowledge that deals with past events" arose later. With the Renaissance, older senses of the word were revived, and it was in the Greek sense that Francis Bacon used the term in the late 16th century, when he wrote about "Natural History". For him, historia was "the knowledge of objects determined by space and time", that sort of knowledge provided by memory. In an expression of the linguistic synthetic vs. analytic/isolating dichotomy, English (like Chinese) now designates separate words for human history and storytelling in general. In modern German and most Germanic and Romance languages, which are solidly synthetic and inflected, the same word is still used to mean both 'history' and 'story'. Historian in the sense of a "researcher of history" is attested from 1531. In all European languages, the substantive history is still used to mean both "what happened with men" and "the scholarly study of the happened", the latter sense sometimes distinguished with a capital letter, or by the word historiography.
The adjective historical is attested from 1661, and historic from 1669. Historians write in the context of their own time, with due regard to the current dominant ideas of how to interpret the past, and sometimes write to provide lessons for their own society. In the words of Benedetto Croce, "All history is contemporary history". History is facilitated by the formation of a "true discourse of past" through the production of narrative and analysis of past events relating to the human race; the modern discipline of history is dedicated to the institutional production of this discourse. All events that are remembered and preserved in some authentic form constitute the historical record; the task of histori
A conceptual model is a representation of a system, made of the composition of concepts which are used to help people know, understand, or simulate the subject the model represents. It is also a set of concepts; some models, by contrast, are physical objects. The term conceptual model may be used to refer to models which are formed after a conceptualization or generalization process. Conceptual models are abstractions of things in the real world, whether physical or social. Semantic studies are relevant to various stages of concept formation. Semantics is about concepts, the meaning that thinking beings give to various elements of their experience. The term conceptual model is ambiguous: it could mean "a model of a concept" or it could mean "a model that is conceptual." A distinction can be made between what models are made of and what they are models of. With the exception of iconic models, such as a scale model of Winchester Cathedral, most models are concepts, but they are intended to be models of real-world states of affairs. The value of a model is directly proportional to how well it corresponds to a past, future, actual, or potential state of affairs.
A model of a concept is quite different, because in order to be a good model it need not have this real-world correspondence. In artificial intelligence, conceptual models and conceptual graphs are used for building expert systems and knowledge-based systems. Conceptual models range in type from the more concrete, such as the mental image of a familiar physical object, to the formal generality and abstractness of mathematical models which do not appear to the mind as an image. Conceptual models also range in terms of the scope of the subject matter that they are taken to represent. A model may, for instance, represent a single thing, whole classes of things, or very vast domains of subject matter such as the physical universe; the variety and scope of conceptual models is due to the variety of purposes of the people using them. Conceptual modeling is the activity of formally describing some aspects of the physical and social world around us for the purposes of understanding and communication. A conceptual model's primary objective is to convey the fundamental principles and basic functionality of the system which it represents.
A conceptual model must be developed in such a way as to provide an easily understood system interpretation for the model's users. A conceptual model, when implemented properly, should satisfy four fundamental objectives: enhance an individual's understanding of the representative system; facilitate efficient conveyance of system details between stakeholders; provide a point of reference for system designers to extract system specifications; and document the system for future reference and provide a means for collaboration. The conceptual model plays an important role in the overall system development life cycle. Figure 1 below depicts the role of the conceptual model in a typical system development scheme. It is clear that if the conceptual model is not fully developed, the execution of fundamental system properties may not be implemented properly, giving way to future problems or system shortfalls. Such failures do occur, and they have been linked to weak links in the system design and development process that can be traced to improper execution of the fundamental objectives of conceptual modeling.
The importance of conceptual modeling is evident when such systemic failures are mitigated by thorough system development and adherence to proven development objectives and techniques. As systems have become more complex, the role of conceptual modeling has expanded. With that expanded presence, the effectiveness of conceptual modeling at capturing the fundamentals of a system is being realized. Building on that realization, numerous conceptual modeling techniques have been created; these techniques can be applied across multiple disciplines to increase the user's understanding of the system to be modeled. A few techniques are described in the following text, but many more exist or are being developed. Some commonly used conceptual modeling techniques and methods include: workflow modeling, workforce modeling, rapid application development, object-role modeling, and the Unified Modeling Language (UML). Data flow modeling (DFM) is a basic conceptual modeling technique that graphically represents elements of a system. DFM is a simple technique; however, like many conceptual modeling techniques, it is possible to construct higher- and lower-level representative diagrams.
The data flow diagram does not convey complex system details such as parallel development considerations or timing information, but rather works to bring the major system functions into context. Data flow modeling is a central technique used in systems development that utilizes the structured systems analysis and design method (SSADM). Entity-relationship modeling (ERM) is a conceptual modeling technique used for software system representation. Entity-relationship diagrams, which are a product of executing the ERM technique, are used to represent database models and information systems. The main components of the diagram are the entities and the relationships. The entities can represent objects or events; the relationships are responsible for relating the entities to one another. To form a system process, the relationships are combined with the entities and any attributes.
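To make the entity-relationship idea concrete, the following minimal Python sketch renders a hypothetical two-entity model, a Customer entity and an Order entity joined by a one-to-many "places" relationship, as plain data classes. The entity names and attributes are invented for illustration and are not drawn from the text above.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical entities from a small ER model: each entity carries its own attributes.
@dataclass
class Customer:
    customer_id: int
    name: str

@dataclass
class Order:
    order_id: int
    amount: float
    customer_id: int  # reference expressing the "Customer places Order" relationship

# The one-to-many relationship, resolved in code as a customer with its orders.
@dataclass
class CustomerOrders:
    customer: Customer
    orders: List[Order] = field(default_factory=list)

if __name__ == "__main__":
    alice = Customer(1, "Alice")
    rel = CustomerOrders(alice, [Order(10, 99.5, alice.customer_id),
                                 Order(11, 12.0, alice.customer_id)])
    print(f"{rel.customer.name} has {len(rel.orders)} order(s)")
```

An entity-relationship diagram would draw the same structure graphically; the code form simply shows how entities carry attributes while a relationship ties the entities together.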
World War II, also known as the Second World War, was a global war that lasted from 1939 to 1945. The vast majority of the world's countries, including all the great powers, eventually formed two opposing military alliances: the Allies and the Axis. A state of total war emerged, directly involving more than 100 million people from over 30 countries; the major participants threw their entire economic and scientific capabilities behind the war effort, blurring the distinction between civilian and military resources. World War II was the deadliest conflict in human history, marked by 50 to 85 million fatalities, most of whom were civilians in the Soviet Union and China. It included massacres, the genocide of the Holocaust, strategic bombing, premeditated death from starvation and disease, and the only use of nuclear weapons in war. Japan, which aimed to dominate Asia and the Pacific, was at war with China by 1937, though neither side had declared war on the other. World War II is said to have begun on 1 September 1939, with the invasion of Poland by Germany and subsequent declarations of war on Germany by France and the United Kingdom.
From late 1939 to early 1941, in a series of campaigns and treaties, Germany conquered or controlled much of continental Europe and formed the Axis alliance with Italy and Japan. Under the Molotov–Ribbentrop Pact of August 1939, Germany and the Soviet Union partitioned and annexed territories of their European neighbours, including Finland and the Baltic states. Following the onset of campaigns in North Africa and East Africa, and the fall of France in mid-1940, the war continued between the European Axis powers and the British Empire. War in the Balkans, the aerial Battle of Britain, the Blitz, and the long Battle of the Atlantic followed. On 22 June 1941, the European Axis powers launched an invasion of the Soviet Union, opening the largest land theatre of war in history; this Eastern Front trapped the Axis, most crucially the German Wehrmacht, in a war of attrition. In December 1941, Japan launched a surprise attack on the United States as well as European colonies in the Pacific. Following an immediate U.S. declaration of war against Japan, supported by one from Great Britain, the European Axis powers declared war on the U.S.
in solidarity with their Japanese ally. Rapid Japanese conquests over much of the Western Pacific ensued, perceived by many in Asia as liberation from Western dominance and resulting in the support of several armies from defeated territories. The Axis advance in the Pacific halted in 1942. Key setbacks in 1943, which included a series of German defeats on the Eastern Front, the Allied invasions of Sicily and Italy, and Allied victories in the Pacific, cost the Axis its initiative and forced it into strategic retreat on all fronts. In 1944, the Western Allies invaded German-occupied France, while the Soviet Union regained its territorial losses and turned toward Germany and its allies. During 1944 and 1945 the Japanese suffered major reverses in mainland Asia in Central China, South China, and Burma, while the Allies crippled the Japanese Navy and captured key Western Pacific islands. The war in Europe concluded with an invasion of Germany by the Western Allies and the Soviet Union, culminating in the capture of Berlin by Soviet troops, the suicide of Adolf Hitler, and the German unconditional surrender on 8 May 1945.
Following the Potsdam Declaration by the Allies on 26 July 1945 and the refusal of Japan to surrender under its terms, the United States dropped atomic bombs on the Japanese cities of Hiroshima and Nagasaki on 6 and 9 August respectively. With an invasion of the Japanese archipelago imminent, the possibility of additional atomic bombings, and the Soviet entry into the war against Japan and its invasion of Manchuria, Japan announced its intention to surrender on 15 August 1945, cementing total victory in Asia for the Allies. Tribunals were set up by fiat by the Allies, and war crimes trials were conducted in the wake of the war both against the Germans and the Japanese. World War II changed the political and social structure of the globe. The United Nations was established to foster international co-operation and prevent future conflicts. The Soviet Union and the United States emerged as rival superpowers, setting the stage for the nearly half-century-long Cold War. In the wake of European devastation, the influence of its great powers waned, triggering the decolonisation of Africa and Asia.
Most countries whose industries had been damaged moved towards economic expansion. Political integration, especially in Europe, emerged as an effort to end pre-war enmities and create a common identity. The start of the war in Europe is held to be 1 September 1939, beginning with the German invasion of Poland. The dates for the beginning of war in the Pacific include the start of the Second Sino-Japanese War on 7 July 1937, or the Japanese invasion of Manchuria on 19 September 1931. Others follow the British historian A. J. P. Taylor, who held that the Sino-Japanese War and the war in Europe and its colonies occurred simultaneously, and the two wars merged in 1941; this article uses the conventional dating. Other starting dates sometimes used for World War II include the Italian invasion of Abyssinia on 3 October 1935. The British historian Antony Beevor views the beginning of World War II as the Battles of Khalkhin Gol fought between Japan and the fo
Science fiction is a genre of speculative fiction dealing with imaginative and futuristic concepts such as advanced science and technology, space exploration, time travel, and extraterrestrial life. Science fiction explores the potential consequences of scientific and other innovations, and has been called a "literature of ideas." "Science fiction" is difficult to define, as it includes a wide range of concepts and themes. James Blish wrote: "Wells used the term to cover what we would today call 'hard' science fiction, in which a conscientious attempt to be faithful to known facts was the substrate on which the story was to be built, if the story was to contain a miracle, it ought at least not to contain a whole arsenal of them." Isaac Asimov said: "Science fiction can be defined as that branch of literature which deals with the reaction of human beings to changes in science and technology." According to Robert A. Heinlein, "A handy short definition of all science fiction might read: realistic speculation about possible future events, based solidly on adequate knowledge of the real world, past and present, and on a thorough understanding of the nature and significance of the scientific method." Lester del Rey wrote, "Even the devoted aficionado or fan has a hard time trying to explain what science fiction is," and that the reason for there not being a "full satisfactory definition" is that "there are no delineated limits to science fiction."
Author and editor Damon Knight summed up the difficulty, saying "science fiction is what we point to when we say it." Mark C. Glassy described the definition of science fiction in the way that U.S. Supreme Court Justice Potter Stewart described the definition of pornography: "I know it when I see it." Science fiction had its beginnings in a time when the line between myth and fact was arguably more blurred than in the present day. Written in the 2nd century CE by the satirist Lucian, A True Story contains many themes and tropes that are characteristic of contemporary science fiction, including travel to other worlds, extraterrestrial lifeforms, interplanetary warfare, and artificial life; some consider it the first science-fiction novel. Some of the stories from The Arabian Nights, along with the 10th-century The Tale of the Bamboo Cutter and Ibn al-Nafis's 13th-century Theologus Autodidactus, contain elements of science fiction. Products of the Age of Reason and the development of modern science itself, Johannes Kepler's Somnium, Francis Bacon's New Atlantis, Cyrano de Bergerac's Comical History of the States and Empires of the Moon and The States and Empires of the Sun, Margaret Cavendish's "The Blazing World", Jonathan Swift's Gulliver's Travels, Ludvig Holberg's Nicolai Klimii Iter Subterraneum, and Voltaire's Micromégas are regarded as some of the first true science-fantasy works.
Indeed, Isaac Asimov and Carl Sagan considered Somnium the first science-fiction story. Following the 18th-century development of the novel as a literary form, Mary Shelley's books Frankenstein and The Last Man helped define the form of the science-fiction novel. Brian Aldiss has argued that Frankenstein was the first work of science fiction. Edgar Allan Poe wrote several stories considered science fiction, including "The Unparalleled Adventure of One Hans Pfaall", which featured a trip to the Moon. Jules Verne was noted for his attention to detail and scientific accuracy, especially in Twenty Thousand Leagues Under the Sea, which predicted the contemporary nuclear submarine. In 1887, the novel El anacronópete by Spanish author Enrique Gaspar y Rimbau introduced the first time machine. Many critics consider H. G. Wells one of science fiction's most important authors, or even "the Shakespeare of science fiction." His notable science-fiction works include The Time Machine, The Island of Doctor Moreau, The Invisible Man, and The War of the Worlds. His science fiction imagined alien invasion, biological engineering, and time travel.
In his non-fiction futurist works he predicted the advent of airplanes, military tanks, nuclear weapons, satellite television, space travel, and something resembling the World Wide Web. In 1912, Edgar Rice Burroughs published A Princess of Mars, the first of his three-decade-long planetary romance series of Barsoom novels, set on Mars and featuring John Carter as the hero. In 1926, Hugo Gernsback published the first American science-fiction magazine, Amazing Stories, in which he wrote: By 'scientifiction' I mean the Jules Verne, H. G. Wells and Edgar Allan Poe type of story—a charming romance intermingled with scientific fact and prophetic vision... Not only do these amazing tales make tremendously interesting reading—they are always instructive, they supply knowledge... in a palatable form... New adventures pictured for us in the scientifiction of today are not at all impossible of realization tomorrow... Many great science stories destined to be of historical interest are still to be written...
Posterity will point to them as having blazed a new trail, not only in literature and fiction, but progress as well. In 1928, E. E. "Doc" Smith's first published work, The Skylark of Space, written in collaboration with Lee Hawkins Garby, appeared in Amazing Stories. It is called the first great space opera. The same year, Philip Francis Nowlan's original Buck Rogers story, Armageddon 2419, appeared in Amazing Stories. This was followed by the first serious science-fiction comic. In 1937, John W. Campbell became editor of Astounding Science Fiction, an event sometimes conside