It is traditionally known as an argument from universal causation, an argument from first cause, or the causal argument. Whichever term is employed, there are three variants of the argument, each with subtle yet important distinctions: the arguments in causa, in esse, and in fieri. The basic premise of all of these is the concept of causality. Contemporary defenders of cosmological arguments include William Lane Craig, Robert Koons, Alexander Pruss, and William L. Rowe. The cosmological argument has been defended by theists and contested by atheists. Plato and Aristotle both posited first-cause arguments, though each had certain notable caveats. In The Laws, Plato posited that all movement in the world and the Cosmos was imparted motion; this required a self-originated motion to set it in motion and to maintain it. In Timaeus, Plato posited a demiurge of supreme wisdom and intelligence as the creator of the Cosmos. Aristotle argued against the idea of a first cause, often confused with the idea of a prime mover or unmoved mover, in his Physics and Metaphysics. Like Plato, Aristotle believed in an eternal cosmos with no beginning and no end.
From an aspiration or desire, the spheres imitate that purely intellectual activity as best they can. The unmoved movers inspiring the planetary spheres are no different in kind from the prime mover; the motions of the planets are subordinate to the motion inspired by the prime mover in the sphere of fixed stars. Aristotle's natural theology admitted no creation or capriciousness from the immortal pantheon. Plotinus, a third-century Platonist, taught that the One, the transcendent absolute, caused the universe to exist simply as a consequence of its existence. His disciple Proclus stated, "The One is God." Centuries later, the Islamic philosopher Avicenna inquired into the question of being, in which he distinguished between essence and existence. He reasoned that existence must be due to an agent cause that necessitates, imparts, and gives existence to an essence; to do so, the cause must coexist with its effect and be an existing thing. Referring to the argument as the Kalam cosmological argument, Duncan asserts that it received its fullest articulation at the hands of Muslim and Jewish exponents of Kalam.
Thomas Aquinas adapted and enhanced the argument he found in his reading of Aristotle, and his conception of First Cause was the idea that the Universe must have been caused by something that was itself uncaused, which he asserted was God. In the scholastic era, Aquinas formulated the argument from contingency: since the Universe could, under different circumstances, conceivably not exist, its existence must have a cause – not merely another contingent thing, but something that exists by necessity. In other words, Aquinas further argued, even if the Universe has always existed, it owes its existence to an Uncaused Cause. Aquinas's argument from contingency allows for the possibility of a Universe that has no beginning in time, and it is a form of argument from universal causation. Aquinas observed that, in nature, there were things with contingent existences; since it is possible for such things not to exist, there must be some time at which these things did not in fact exist.
The scientific method is a body of techniques for investigating phenomena, acquiring new knowledge, or correcting and integrating previous knowledge. To be termed scientific, a method of inquiry is commonly based on empirical or measurable evidence subject to specific principles of reasoning, and experiments need to be designed to test hypotheses. The experiment is the most important part of the method. The scientific method is a process, which usually begins with observations about the natural world. Human beings are naturally inquisitive, and so often come up with questions about things they see or hear. The best hypotheses lead to predictions that can be tested in various ways; in general, the strongest tests of hypotheses come from carefully controlled and replicated experiments that gather empirical data. Depending on how well the tests match the predictions, the hypothesis may require refinement. If a particular hypothesis becomes very well supported, a theory may be developed. Although procedures vary from one field of inquiry to another, identifiable features are shared in common between them.
The overall process of the method involves making conjectures and deriving predictions from them as logical consequences. A hypothesis is a conjecture, based on knowledge obtained while formulating the question; the hypothesis might be very specific or it might be broad. Scientists test hypotheses by conducting experiments; the purpose of an experiment is to determine whether observations agree with or conflict with the predictions derived from a hypothesis. Experiments can take place anywhere from a college lab to CERN's Large Hadron Collider. There are difficulties in a formulaic statement of method, however. Though the scientific method is often presented as a fixed sequence of steps, not all of the steps take place in every scientific inquiry, nor always in the same order. Some philosophers and scientists, such as Lee Smolin, have argued that there is no scientific method. Nola and Sankey remark that "For some, the idea of a theory of scientific method is yester-year's debate."
In monotheism, God is conceived of as the Supreme Being and principal object of faith. The concept of God as described by most theologians includes the attributes of omniscience, omnipresence, and divine simplicity; many theologians also describe God as omnibenevolent and all-loving. Furthermore, some religions attribute only a purely grammatical gender to God. Conceptions of the incorporeality or corporeity of God are related to conceptions of the transcendence and immanence of God, with positions of synthesis such as the "immanent transcendence" of Chinese theology. God has been conceived as personal or impersonal. In theism, God is the creator and sustainer of the universe, while in deism, God is the creator but not the sustainer of the universe; in pantheism, God is the universe itself. In atheism, God is not believed to exist, while God is deemed unknown or unknowable within the context of agnosticism. God has also been conceived as the source of all moral obligation, and the greatest conceivable existent. Many notable philosophers have developed arguments for and against the existence of God. There are many names for God, and different names are attached to different cultural ideas about God's identity and attributes.
In the ancient Egyptian era of Atenism, possibly the earliest recorded monotheistic religion, this deity was called Aten, premised on being the one true Supreme Being and creator of the universe. In the Hebrew Bible and Judaism, God is named "He Who Is" and "I Am that I Am"; in the Christian doctrine of the Trinity, God, consubstantial in three persons, is called the Father, the Son, and the Holy Spirit. In Judaism, it is common to refer to God by the titular names Elohim or Adonai; in Islam, the name Allah is used, while Muslims also have a multitude of titular names for God. In Hinduism, Brahman is often considered a concept of God. In Chinese religion, God is conceived as the progenitor of the universe, intrinsic to it. Other religions also have names for God: for instance, Bahá in the Bahá'í Faith, Waheguru in Sikhism, and Ahura Mazda in Zoroastrianism. The earliest written form of the Germanic word God comes from the 6th-century Christian Codex Argenteus; the English word itself is derived from the Proto-Germanic *ǥuđan.
The reconstructed Proto-Indo-European form *ǵhu-tó-m was likely based on the root *ǵhau-. In the English language, the capitalized form of God continues to represent a distinction between the monotheistic God and the gods of polytheism. The same holds for Hebrew El, but in Judaism, God is also given a proper name; in many translations of the Bible, when the word LORD is in all capitals, it signifies that the word represents the tetragrammaton. Allāh is the Arabic term, with no plural, used by Muslims and Arabic-speaking Christians and Jews, meaning "The God"; Ahura Mazda is the name for God used in Zoroastrianism. Mazda, or rather the Avestan stem-form Mazdā- (nominative Mazdå), is generally taken to be the proper name of the spirit and, like its Sanskrit cognate medhā, means intelligence or wisdom. Both the Avestan and Sanskrit words reflect Proto-Indo-Iranian *mazdhā-, from Proto-Indo-European *mn̩sdʰeh₁, literally meaning "placing one's mind". Waheguru is a term most often used in Sikhism to refer to God.
Established scientific theories have withstood rigorous scrutiny and are a comprehensive form of scientific knowledge. It is important to note that the definition of a theory as used in the disciplines of science is significantly different from the common vernacular usage of the word theory. These different usages are comparable to the differing, and often opposing, usages of the word prediction in science versus prediction in vernacular speech. The strength of a theory is related to the diversity of phenomena it can explain. In certain cases, a less-accurate, unmodified scientific theory can still be treated as a theory if it is useful as an approximation under specific conditions. Scientific theories are testable and make falsifiable predictions; they describe the causal elements responsible for a particular natural phenomenon, and are used to explain and predict aspects of the physical universe or specific areas of inquiry. Scientists use theories as a foundation to further scientific knowledge.
As with other forms of knowledge, scientific theories are both deductive and inductive in nature and aim for predictive power and explanatory capability. Paleontologist, evolutionary biologist, and science historian Stephen Jay Gould said that "facts and theories are different things, not rungs in a hierarchy of increasing certainty. Theories are structures of ideas that explain and interpret facts." The defining characteristic of all scientific knowledge, including theories, is the ability to make falsifiable or testable predictions; the relevance and specificity of those predictions determine how potentially useful the theory is. A would-be theory that makes no observable predictions is not a theory at all. Predictions not sufficiently specific to be tested are similarly not useful; in both cases, the term theory is not applicable. A body of descriptions of knowledge can be called a theory if it fulfills the following criteria: it is well-supported by many independent strands of evidence, rather than a single foundation, and it is consistent with preexisting experimental results and at least as accurate in its predictions as any preexisting theories. These qualities are certainly true of such established theories as special and general relativity, quantum mechanics, plate tectonics, and the modern evolutionary synthesis.
It is also among the most parsimonious explanations, economical in the use of proposed entities or explanatory steps as per Occam's razor. The United States National Academy of Sciences defines a scientific theory as a comprehensive explanation of some aspect of nature that is supported by a vast body of evidence. Such fact-supported theories are not guesses but reliable accounts of the real world; the theory of biological evolution, for instance, is more than "just a theory". It is as factual an explanation of the universe as the atomic theory of matter or the germ theory of disease.
Speculation is the purchase of an asset with the hope that it will become more valuable at a future date. Many speculators pay little attention to the fundamental value of a security. Speculation can in principle involve any tradable good or financial instrument; speculators are particularly common in the markets for stocks, commodity futures, fine art, real estate, and derivatives. In the United States, the number of shareholders increased from 4.4 million in 1900 to 26 million in 1932. The view of what distinguishes investment from speculation, and speculation from excessive speculation, varies widely among pundits and academics. Some sources note that speculation is simply a higher-risk form of investment. Others define speculation more narrowly as positions not characterized as hedging. The U.S. Commodity Futures Trading Commission emphasizes that speculators serve important market functions, but defines excessive speculation as harmful to the proper functioning of futures markets. According to Ben Graham in The Intelligent Investor, the prototypical defensive investor is
"one interested chiefly in safety plus freedom from bother". Speculation is condemned on ethical-moral grounds as creating money from money and thereby promoting the vices of avarice and gambling. When a harvest is too small to satisfy consumption at its normal rate, speculators come in: their purchases raise the price, thereby checking consumption so that the supply will last longer. Producers, encouraged by the higher price, further lessen the shortage by growing or importing more. On the other side, when the price is higher than the speculators think the facts warrant, they sell; this reduces prices, encouraging consumption and exports and helping to reduce the surplus. If any market, such as pork bellies, had no speculators, there would be fewer players in the market and thus a larger spread between the current bid and ask price of pork bellies. A commodity speculator may profit from the difference in this spread and, in competition with other speculators, reduce it. Some schools of thought argue that speculators increase the liquidity in a market, and therefore promote an efficient market.
This efficiency is difficult to achieve without speculators; a very beneficial by-product of speculation for the economy is price discovery. On the other hand, as more speculators participate in a market, underlying real demand and supply can diminish compared to trading volume. Speculators also perform a very important risk-bearing role that is beneficial to society. For example, a farmer might be considering planting corn on some unused farmland; however, he might not want to do so because he is concerned that the price might fall too far by harvest time. By selling his crop in advance at a fixed price to a speculator, he is now able to hedge the price risk. Thus, speculators can actually increase production through their willingness to take on risk, and they make prices better reflect the true quality of operation of the firms.
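The farmer-and-speculator example can be made concrete with toy numbers (the quantities, prices, and function name below are invented for illustration): the hedged farmer's revenue is fixed at the forward price, while the speculator absorbs the gain or loss from subsequent price moves.

```python
def revenue_at_harvest(bushels, spot_price, forward_price=None):
    """Farmer's revenue: the locked-in forward price if hedged, else spot."""
    price = forward_price if forward_price is not None else spot_price
    return bushels * price

BUSHELS = 10_000
FORWARD = 4.00   # $/bushel the speculator agrees to pay today
SPOT = 3.00      # $/bushel the market actually pays at harvest

unhedged = revenue_at_harvest(BUSHELS, SPOT)          # exposed to the fall
hedged = revenue_at_harvest(BUSHELS, SPOT, FORWARD)   # fixed in advance
speculator_pnl = BUSHELS * (SPOT - FORWARD)           # bought at 4, worth 3

# The price risk (here a $10,000 shortfall) has been transferred from
# the farmer to the speculator, who accepted it for a chance of profit.
print(unhedged, hedged, speculator_pnl)
```

Had the spot price instead risen above the forward price, the same transfer would work in the speculator's favor; either way, the farmer's planting decision no longer depends on the price at harvest.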
The Big Bang theory is the prevailing cosmological model for the universe from the earliest known periods through its subsequent large-scale evolution. If the known laws of physics are extrapolated back to the highest-density regime, the result is an initial state of extreme density and temperature. Detailed measurements of the expansion rate of the universe place this moment at approximately 13.8 billion years ago, which is thus considered the age of the universe. After the initial expansion, the universe cooled sufficiently to allow the formation of subatomic particles; giant clouds of the primordial elements later coalesced through gravity in halos of dark matter, eventually forming the stars and galaxies visible today. Georges Lemaître first noted in 1927 that an expanding universe could be traced back in time to an originating single point. More recently, measurements of the redshifts of supernovae indicate that the expansion of the universe is accelerating. The known physical laws of nature can be used to calculate the characteristics of the universe in detail back in time to an initial state of extreme density and temperature.
American astronomer Edwin Hubble observed that the distances to faraway galaxies were strongly correlated with their redshifts. Assuming the Copernican principle, the only remaining interpretation is that all observable regions of the universe are receding from all others. Since we know that the distance between galaxies increases today, it must mean that in the past galaxies were closer together; the continuous expansion of the universe implies that the universe was denser and hotter in the past. Large particle accelerators can replicate the conditions that prevailed after the early moments of the universe, resulting in confirmation and refinement of the model, but these accelerators can only probe so far into high-energy regimes. Consequently, the state of the universe in the earliest instants of the Big Bang expansion is still poorly understood. The first subatomic particles to be formed included protons and electrons. Though simple atomic nuclei formed within the first three minutes after the Big Bang, thousands of years passed before the first electrically neutral atoms formed; the majority of atoms produced by the Big Bang were hydrogen, along with helium and traces of lithium.
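The ~13.8-billion-year figure quoted above can be motivated with a back-of-the-envelope calculation: Hubble's law v = H0·d implies a characteristic expansion timescale of 1/H0 (the "Hubble time"), which for H0 ≈ 70 km/s/Mpc comes out near 14 billion years. This is only an order-of-magnitude sketch; the precise age requires a full cosmological model:

```python
KM_PER_MPC = 3.0857e19   # kilometres in one megaparsec
SEC_PER_YEAR = 3.156e7   # seconds in one year (approx.)

def hubble_time_gyr(h0_km_s_per_mpc):
    """Hubble time 1/H0 in gigayears, for H0 given in km/s/Mpc."""
    h0_per_second = h0_km_s_per_mpc / KM_PER_MPC   # convert H0 to 1/s
    seconds = 1.0 / h0_per_second                  # 1/H0 in seconds
    return seconds / SEC_PER_YEAR / 1e9            # convert to gigayears

# H0 ~ 70 km/s/Mpc gives roughly 14 Gyr, the right order of magnitude
# for the measured age of about 13.8 billion years.
print(hubble_time_gyr(70))
```

That the naive 1/H0 estimate lands so close to the measured age reflects the near-cancellation of early deceleration and late acceleration in the expansion history.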
Giant clouds of primordial elements coalesced through gravity to form stars and galaxies. The framework for the Big Bang model relies on Albert Einstein's theory of general relativity and on simplifying assumptions such as the homogeneity and isotropy of space. The governing equations were formulated by Alexander Friedmann, and similar solutions were worked on by Willem de Sitter. Extrapolation of the expansion of the universe backwards in time using general relativity yields an infinite density and temperature at a finite time in the past. This singularity indicates that general relativity is not an adequate description of the laws of physics in this regime. How closely models based on general relativity alone can be used to extrapolate toward the singularity is debated – certainly no closer than the end of the Planck epoch. This primordial singularity is itself sometimes called "the Big Bang", but the term can also refer to a more generic early hot, dense phase of the universe. The agreement of independent measurements of this age supports the model that describes in detail the characteristics of the universe.
The earliest phases of the Big Bang are subject to much speculation. In the most common models, the universe was filled homogeneously and isotropically with a very high energy density and huge temperatures and pressures, and was very rapidly expanding and cooling.
Sir Roger Penrose OM FRS is an English mathematical physicist and philosopher of science. He is the Emeritus Rouse Ball Professor of Mathematics at the Mathematical Institute of the University of Oxford. Penrose is known for his work in mathematical physics, in particular for his contributions to general relativity and cosmology, and has received many prizes and awards, including the 1988 Wolf Prize for physics. Penrose told a Russian audience that his grandmother had left St. Petersburg in the late 1880s; his uncle was the artist Roland Penrose, whose son with photographer Lee Miller is Antony Penrose. Penrose is the brother of physicist Oliver Penrose and of chess Grandmaster Jonathan Penrose. Penrose attended University College School and University College London, where he graduated with a first-class degree in mathematics. In 1955, while still a student, Penrose reintroduced E. H. Moore's generalised matrix inverse, now known as the Moore–Penrose inverse, after it had been reinvented by Arne Bjerhammar in 1951.
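The Moore–Penrose inverse mentioned above generalises matrix inversion to non-square matrices. As a minimal sketch, for a tall matrix A with full column rank it reduces to the left inverse (AᵀA)⁻¹Aᵀ; the helper names below are invented for the example, and real code would simply call a library routine such as numpy.linalg.pinv:

```python
def transpose(M):
    """Transpose a matrix given as a list of rows."""
    return [list(row) for row in zip(*M)]

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def pinv_tall(A):
    """Moore–Penrose inverse of a full-column-rank matrix with 2 columns."""
    At = transpose(A)
    return matmul(inv2(matmul(At, A)), At)

A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
Ap = pinv_tall(A)
# One of the four Penrose conditions: A @ A+ @ A == A.
AAA = matmul(matmul(A, Ap), A)
assert all(abs(AAA[i][j] - A[i][j]) < 1e-9 for i in range(3) for j in range(2))
```

Because A here has full column rank, A⁺A is exactly the 2×2 identity, which is why the pseudoinverse gives the least-squares solution of overdetermined systems Ax = b.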
He devised and popularised the Penrose triangle in the 1950s, describing it as "impossibility in its purest form", and exchanged material with the artist M. C. Escher, whose earlier depictions of impossible objects partly inspired it. Escher's Waterfall and Ascending and Descending were in turn inspired by Penrose. As reviewer Manjit Kumar puts it, as a student in 1954 Penrose encountered Escher's work, and soon he was trying to conjure up impossible figures of his own; he discovered the tribar – a triangle that looks like a real, solid three-dimensional object, but isn't. Together with his father, a physicist and mathematician, Penrose went on to design a staircase that simultaneously loops up and down. An article followed, and a copy was sent to Escher. Completing a cyclical flow of creativity, the Dutch master of illusions was inspired to produce his two masterpieces. One approach to the singularity problem was the use of perturbation theory. The importance of Penrose's epoch-making paper "Gravitational collapse and space-time singularities" lay not only in its result but also in the new mathematical methods it introduced. Following up his weak cosmic censorship hypothesis, Penrose went on, in 1979, to formulate a stronger version called the strong censorship hypothesis.
Together with the BKL conjecture and issues of stability, settling the censorship conjectures is one of the most important outstanding problems in general relativity. Also from 1979 dates Penrose's influential Weyl curvature hypothesis on the initial conditions of the observable part of the universe. Penrose and James Terrell independently realised that objects travelling near the speed of light appear to undergo a peculiar skewing or rotation; this effect has come to be called the Terrell rotation or Penrose–Terrell rotation. In 1967, Penrose invented twistor theory, which maps geometric objects in Minkowski space into the 4-dimensional complex space with the metric signature (2, 2). Penrose developed his tiling ideas based on the article "Deux types fondamentaux de distribution statistique" by the Czech geographer, demographer, and statistician Jaromír Korčák; in 1984, such patterns were observed in the arrangement of atoms in quasicrystals.
Philosophy of science
Philosophy of science is a branch of philosophy concerned with the foundations, methods, and implications of science. The central questions of this study concern what qualifies as science and the reliability of scientific theories. This discipline overlaps with metaphysics and epistemology, for example when it explores the relationship between science and truth; in addition to these general questions about science as a whole, philosophers of science consider problems that apply to particular sciences. Some philosophers of science use contemporary results in science to reach conclusions about philosophy itself. Karl Popper and Charles Sanders Peirce moved on from positivism to establish a modern set of standards for scientific methodology. Subsequently, the coherentist approach to science, in which a theory is validated if it makes sense of observations as part of a coherent whole, became prominent due to W. V. Quine and others. Some thinkers, such as Stephen Jay Gould, seek to ground science in axiomatic assumptions. Another approach to thinking about science involves studying how knowledge is created from a sociological perspective, an approach represented by scholars like David Bloor and Barry Barnes.
Finally, a tradition in continental philosophy approaches science from the perspective of an analysis of human experience. Philosophies of the particular sciences range from questions about the nature of time raised by Einstein's general relativity to questions about the implications of economics for public policy. A central theme is whether one scientific discipline can be reduced to the terms of another: can chemistry be reduced to physics, or can sociology be reduced to individual psychology? The general questions of philosophy of science also arise with greater specificity in some particular sciences. For instance, the question of the validity of scientific reasoning is seen in a different guise in the foundations of statistics, and the question of what counts as science and what should be excluded arises as a life-or-death matter in the philosophy of medicine. Distinguishing between science and non-science is referred to as the demarcation problem. For example, should psychoanalysis be considered science? How about so-called creation science, the multiverse hypothesis, or macroeconomics?
Karl Popper called this the central question in the philosophy of science. However, no unified account of the problem has won acceptance among philosophers; Martin Gardner has argued for the use of a Potter Stewart ("I know it when I see it") standard for recognizing pseudoscience. Early attempts by the logical positivists grounded science in observation, while non-science was non-observational. Popper instead argued that the central property of science is falsifiability: every genuinely scientific claim is capable of being proven false, at least in principle. A closely related question is what counts as a good scientific explanation. In addition to providing predictions about events, society often takes scientific theories to provide explanations for events that occur regularly or have already occurred. One early and influential theory of scientific explanation is the deductive-nomological model. It says that a successful scientific explanation must deduce the occurrence of the phenomena in question from a scientific law; for example, the period of a pendulum can be deduced from the law of the pendulum together with the pendulum's length.
President Dwight D. Eisenhower established NASA in 1958 with a distinctly civilian orientation, encouraging peaceful applications in space science. The National Aeronautics and Space Act was passed on July 29, 1958, disestablishing NASA's predecessor, the National Advisory Committee for Aeronautics (NACA); the new agency became operational on October 1, 1958. Since that time, most US space exploration efforts have been led by NASA, including the Apollo Moon landing missions and the Skylab space station. Currently, NASA is supporting the International Space Station and is overseeing the development of the Orion Multi-Purpose Crew Vehicle. The agency is responsible for the Launch Services Program, which provides oversight of launch operations and countdown management for unmanned NASA launches. NASA shares data with various national and international organizations, such as data from the Greenhouse Gases Observing Satellite. Since 2011, NASA has been criticized for low cost efficiency. From 1946, the National Advisory Committee for Aeronautics had been experimenting with rocket planes such as the supersonic Bell X-1.
In the early 1950s, there was a challenge to launch an artificial satellite for the International Geophysical Year. An effort for this was the American Project Vanguard. After the Soviet launch of the world's first artificial satellite, Sputnik 1, on October 4, 1957, the attention of the United States turned toward its own fledgling space efforts. This led to an agreement that a new federal agency based on NACA was needed to conduct all non-military activity in space. The Advanced Research Projects Agency was also created in February 1958 to develop space technology for military application. On July 29, 1958, Eisenhower signed the National Aeronautics and Space Act, and a NASA seal was approved by President Eisenhower in 1959. Elements of the Army Ballistic Missile Agency and the United States Naval Research Laboratory were incorporated into NASA, and earlier research efforts within the US Air Force and many of ARPA's early space programs were transferred to NASA as well. In December 1958, NASA gained control of the Jet Propulsion Laboratory. NASA has conducted many manned and unmanned spaceflight programs throughout its history.
Some missions include both manned and unmanned aspects, such as the Galileo probe, which was deployed by astronauts in Earth orbit before being sent unmanned to Jupiter. The experimental rocket-powered aircraft programs started by NACA were extended by NASA as support for manned spaceflight. This was followed by a one-man space capsule program, and in turn by a two-man capsule program. The goal of landing a man on the Moon was met in 1969 by the Apollo program, but reduction of the perceived threat and changing political priorities almost immediately caused the termination of most of the follow-on plans. NASA then turned its attention to an Apollo-derived temporary space laboratory. To date, NASA has launched a total of 166 manned space missions on rockets, and thirteen X-15 rocket flights above the USAF definition of spaceflight altitude, 260,000 feet. The X-15 was an NACA experimental rocket-powered hypersonic research aircraft, developed in conjunction with the US Air Force; the design featured a slender fuselage with fairings along the side containing fuel and early computerized control systems.