1.
Probability theory
–
Probability theory is the branch of mathematics concerned with probability, the analysis of random phenomena. Although it is not possible to predict precisely the results of random events, patterns emerge in the aggregate; two representative mathematical results describing such patterns are the law of large numbers and the central limit theorem. As a mathematical foundation for statistics, probability theory is essential to human activities that involve quantitative analysis of large sets of data. Methods of probability theory also apply to descriptions of complex systems given only partial knowledge of their state. A great discovery of twentieth-century physics was the probabilistic nature of physical phenomena at atomic scales, described in quantum mechanics. Christiaan Huygens published a book on the subject in 1657. Initially, probability theory mainly considered discrete events, and its methods were mainly combinatorial. Eventually, analytical considerations compelled the incorporation of continuous variables into the theory, and this culminated in modern probability theory, on foundations laid by Andrey Nikolaevich Kolmogorov. Kolmogorov combined the notion of sample space, introduced by Richard von Mises, with measure theory; his axiom system became the mostly undisputed basis for modern probability theory. Most introductions to probability theory treat discrete probability distributions and continuous probability distributions separately; the more mathematically advanced measure-theoretic treatment of probability covers the discrete and continuous cases uniformly. Consider an experiment that can produce a number of outcomes. The set of all outcomes is called the sample space of the experiment. The power set of the sample space is formed by considering all different collections of possible results. For example, rolling an honest die produces one of six possible results; one collection of possible results corresponds to getting an odd number. Thus, the subset {1, 3, 5} is an element of the power set of the sample space of die rolls.
In this case, {1, 3, 5} is the event that the die falls on some odd number. If the results that actually occur fall in a given event, that event is said to have occurred. Probability is a way of assigning every event a value between zero and one, with the requirement that the event made up of all possible results be assigned a value of one. The probability that any one of the outcomes 1, 2, 3, 4, or 6 will occur is 5/6. This is the same as saying that the probability of the event {1, 2, 3, 4, 6} is 5/6; this event encompasses the possibility of any number except five being rolled. The mutually exclusive event {5} has a probability of 1/6, and the event {1, 2, 3, 4, 5, 6} has a probability of 1. Discrete probability theory deals with events that occur in countable sample spaces. The modern definition starts with a finite or countable set called the sample space, which relates to the set of all possible outcomes in the classical sense, denoted by Ω.
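The die-rolling example above can be sketched directly in a few lines (a minimal illustration, not from the source; the event names are chosen for readability):

```python
from fractions import Fraction

# Sample space of a fair six-sided die; each outcome is equally likely.
omega = {1, 2, 3, 4, 5, 6}

def prob(event):
    """Probability of an event, i.e. a subset of the sample space."""
    return Fraction(len(event & omega), len(omega))

odd = {1, 3, 5}          # the event "the die falls on an odd number"
not_five = omega - {5}   # the event "any number except five is rolled"

print(prob(odd))       # 1/2
print(prob(not_five))  # 5/6
print(prob({5}))       # 1/6
print(prob(omega))     # 1
```

Because every outcome is equally likely, each probability is just the size of the event divided by the size of the sample space, matching the 5/6 and 1/6 values quoted above.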
Probability theory
–
The
normal distribution, a continuous probability distribution.
Probability theory
–
The
Poisson distribution, a discrete probability distribution.
2.
Glossary of probability and statistics
–
The following is a glossary of terms used in the mathematical sciences of statistics and probability.
alternative hypothesis
atomic event – Another name for elementary event.
bar chart
bias – A sample that is not representative of the population.
causal study – A study that investigates causal questions, for example: how will my headache feel if I take aspirin? Causal studies may be either experimental or observational.
conditional probability – The conditional probability of A given B is written P(A | B), and is read "the probability of A, given B".
confidence interval – In inferential statistics, a CI is a range of plausible values for the population mean, for example one based on a study of sleep habits among 100 people. This is different from the sample mean, which can be measured directly.
confidence level – Also known as a confidence coefficient, the confidence level indicates the probability that the confidence interval captures the true population mean. For example, an interval with a 95 percent confidence level has a 95 percent chance of capturing the population mean. Technically, this means that, if the experiment were repeated many times, 95 percent of the CIs would contain the population mean.
continuous variable
correlation – Also called the correlation coefficient, a measure of the strength of the linear relationship between two random variables. An example is the Pearson product-moment correlation coefficient, which is found by dividing the covariance of the two variables by the product of their standard deviations.
expected value – The sum of the probability of each possible outcome of the experiment multiplied by its payoff. Thus, it represents the amount one expects to win per bet if bets with identical odds are repeated many times. For example, the expected value of a six-sided die roll is 3.5. The concept is similar to the mean.
joint probability – The joint probability of A and B is written P(A ∩ B) or P(A, B).
kurtosis – A measure of the peakedness of the probability distribution of a real-valued random variable. For example, imagine pulling a ball with the number k from a bag of n balls.
marginal probability – The marginal probability of A is written P(A). Contrast with conditional probability.
mean – 1. The expected value of a random variable. 2. The arithmetic average of a sample; think of the result of a series of coin-flips.
null hypothesis – For example, if one wanted to test whether light has an effect on sleep, the null hypothesis would be that light has no effect. It is often symbolized as H0.
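Two of the quantitative entries above, expected value and the Pearson correlation coefficient, can be checked numerically (a sketch using only the Python standard library; the sample data are invented for illustration):

```python
import math

# Expected value of a fair six-sided die: each outcome k has payoff k
# and probability 1/6, so E = (1 + 2 + ... + 6) / 6 = 3.5.
expected = sum(range(1, 7)) / 6
print(expected)  # 3.5

# Pearson product-moment correlation: cov(x, y) / (sd(x) * sd(y)).
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x) / n)
    sy = math.sqrt(sum((b - my) ** 2 for b in y) / n)
    return cov / (sx * sy)

# y is an exact linear function of x, so the correlation is (up to
# floating-point rounding) equal to 1.
print(pearson([1, 2, 3, 4], [2, 4, 6, 8]))
```

The 3.5 here is the glossary's die-roll example computed from the definition; the correlation function is a direct transcription of "covariance divided by the product of the standard deviations".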
Glossary of probability and statistics
–
Statistics
3.
Notation in probability and statistics
–
Probability theory and statistics have some commonly used conventions, in addition to standard mathematical notation and mathematical symbols. Random variables are written in upper-case roman letters: X, Y, etc. Particular realizations of a random variable are written in the corresponding lower-case letters; for example, x1, x2, …, xn could be a sample corresponding to the random variable X. P(A ∩ B) or P(A, B) indicates the probability that events A and B both occur; P(A ∪ B) indicates the probability of either event A or event B occurring. σ-algebras are usually written with upper-case calligraphic letters. Probability density functions and probability mass functions are denoted by lower-case letters, e.g. f. Cumulative distribution functions are denoted by upper-case letters, e.g. F. Greek letters are commonly used to denote unknown parameters. A tilde (~) denotes "has the probability distribution of". Placing a hat, or caret, over a true parameter denotes an estimator of it; e.g., θ̂ is an estimator for θ. The arithmetic mean of a series of values x1, x2, …, xn is often denoted by placing an overbar over the symbol, e.g. x̄, pronounced "x bar". The α-level upper critical value of a probability distribution is the value exceeded with probability α. Column vectors are usually denoted by boldface lower-case letters, e.g. x. The transpose operator is denoted by either a superscript T or a prime symbol; a row vector is written as the transpose of a column vector, e.g. xT or x′. Common abbreviations include: a.e. (almost everywhere), a.s. (almost surely), cdf (cumulative distribution function), cmf (cumulative mass function), df (degrees of freedom), and i.i.d. (independent and identically distributed).
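Several of these conventions map directly onto code. The sketch below is illustrative only; the variable names x_bar and theta_hat mirror the x̄ and θ̂ notation, and the plug-in variance estimator is chosen simply as an example of the "hat" convention:

```python
# A sample x1, ..., xn: particular realizations of the random variable X,
# written in lower case per the convention above.
x = [2.0, 4.0, 6.0, 8.0]
n = len(x)

# x_bar plays the role of x-bar: the arithmetic mean of the sample.
x_bar = sum(x) / n
print(x_bar)  # 5.0

# theta_hat plays the role of theta-hat, the "hat" convention for an
# estimator; here, the plug-in estimator of the variance is the example.
theta_hat = sum((xi - x_bar) ** 2 for xi in x) / n
print(theta_hat)  # 5.0
```

The overbar and hat are typographic markers, not operations; in code they survive only as naming conventions like these.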
Notation in probability and statistics
–
Statistics
4.
Belief
–
Belief is the state of mind in which a person thinks something to be the case, with or without there being empirical evidence to prove that it is the case with factual certainty. Another way of defining belief sees it as a representation of an attitude positively oriented towards the likelihood of something being true. In the context of Ancient Greek thought, two related concepts were identified with regard to the concept of belief: pistis and doxa. Simplified, we may say that pistis refers to trust and confidence, while doxa refers to opinion and acceptance. The English word "orthodoxy" derives from doxa. Jonathan Leicester suggests that belief has the purpose of guiding action rather than indicating truth. In epistemology, philosophers use the term "belief" to refer to personal attitudes associated with true or false ideas. However, belief does not require active introspection and circumspection: for example, we never ponder whether or not the sun will rise; we simply assume it will. Since belief is an important aspect of mundane life, a related question, posed by Eric Schwitzgebel in the Stanford Encyclopedia of Philosophy, asks how a physical organism can have beliefs. Epistemology is concerned with delineating the boundary between justified belief and opinion, and is involved generally with a philosophical study of knowledge. The primary problem in epistemology is to understand exactly what is needed in order for us to have knowledge. Plato dismisses the possibility of a relation between belief and knowledge even when the one who opines grounds his belief on the rule. Among American epistemologists, Gettier and Goldman have questioned the justified-true-belief definition.
Mainstream psychology and related disciplines have traditionally treated belief as if it were the simplest form of mental representation. Philosophers have tended to be more abstract in their analysis, and much of the work examining the viability of the belief concept stems from philosophical analysis. The concept of belief presumes a subject and an object of belief. Beliefs are sometimes divided into core beliefs and dispositional beliefs. For example, if asked "do you believe tigers wear pink pajamas?", a person might answer that they do not, despite the fact that they may never have thought about this situation before. This has important implications for understanding the neuropsychology and neuroscience of belief: if the concept of belief is incoherent, then any attempt to find the underlying neural processes that support it will fail. Jerry Fodor is one of the defenders of this point of view. Most notably, philosopher Stephen Stich has argued for this understanding of belief. In these cases science hasn't provided us with a detailed account of these theories.
Belief
–
We are influenced by many factors that ripple through our minds as our beliefs form, evolve, and may eventually change
Belief
–
A
Venn /
Euler diagram which grants that
truth and belief may be distinguished and that their
intersection is
knowledge. Unsurprisingly, this is
a controversial analysis.
Belief
–
This article is about the general concept. For other uses, see
Belief (disambiguation).
Belief
–
Philosopher Jonathan Glover warns that belief systems are like whole boats in the water; it is extremely difficult to alter them all at once (e.g., it may be too stressful, or people may maintain their biases without realizing it).
5.
Doubt
–
Doubt characterises a status in which the mind remains suspended between two contradictory propositions and unable to assent to either of them. Doubt on an emotional level is indecision between belief and disbelief. Doubt involves uncertainty, distrust, or lack of sureness of a fact or an action. Doubt questions a notion of reality, and may involve delaying or rejecting relevant action out of concern for mistakes, faults, or appropriateness. Doubt sometimes tends to call on reason. Doubt may encourage people to hesitate before acting, and/or to apply more rigorous methods. Doubt may have particular importance as leading towards disbelief or non-acceptance. Societally, doubt creates an atmosphere of distrust, being accusatory in nature and de facto alleging either foolishness or deceit on the part of another. Such a stance has been fostered in Western European society since the Enlightenment, in opposition to tradition. Psychoanalytic theory attributes doubt to childhood, when the ego develops. Childhood experiences, these theories maintain, can plant doubt about one's abilities. Cognitive and mental as well as more spiritual approaches abound in response to the wide variety of potential causes for doubt. Behavioral therapy, in which a person systematically asks his own mind if the doubt has any real basis, uses rational, Socratic methods. This method contrasts with those of, say, the Buddhist faith: Buddhism sees doubt as a negative attachment to one's perceived past and future. Letting go of the history of one's life plays a central role in releasing the doubts developed in that history. Partial or intermittent negative reinforcement can create a climate of fear and doubt. Descartes employed Cartesian doubt as a pre-eminent methodological tool in his fundamental philosophical investigations. Branches of philosophy like logic devote much effort to distinguishing the dubious, the probable, and the certain. Much of illogic rests on dubious assumptions, dubious data, or dubious conclusions, with rhetoric and whitewashing playing their part.
Doubt that God exists may form the basis of agnosticism, the belief that one cannot determine the existence or non-existence of God. It may also form other brands of skepticism, such as Pyrrhonism, which do not take a stance on the existence of God. Alternatively, doubt over the existence of God may lead to acceptance of a particular religion. Doubt of a specific theology, scriptural or deistic, may bring into question the truth of that theology's set of beliefs. On the other hand, doubt as to some doctrines but acceptance of others may lead to the growth of heresy and/or the splitting off of sects or groups of thought. Thus proto-Protestants doubted papal authority and substituted alternative methods of governance in their new churches.
Doubt
–
The Incredulity of Saint Thomas by
Caravaggio.
Doubt
–
Doubt
Doubt
–
Doubts, by
Henrietta Rae, 1886
6.
Determinism
–
Determinism is the philosophical position that for every event there exist conditions that could cause no other event. There are many determinisms, depending on what preconditions are considered to be determinative of an event or action. Deterministic theories throughout the history of philosophy have sprung from diverse and sometimes overlapping motives and considerations. Some forms of determinism can be tested with ideas from physics. The opposite of determinism is some kind of indeterminism, and determinism is often contrasted with free will. Determinism often is taken to mean causal determinism, which in physics is known as cause-and-effect: the concept that events within a given paradigm are bound by causality in such a way that any state is completely determined by prior states. This meaning can be distinguished from the varieties of determinism mentioned below. Numerous historical debates involve many philosophical positions and varieties of determinism; they include debates concerning determinism and free will, technically denoted as compatibilism and incompatibilism. Determinism should not be confused with the self-determination of human actions by reasons and motives, and determinism rarely requires that perfect prediction be practically possible. Causal determinism is a broad enough term to consider that one's deliberations and choices are themselves links in the causal chain. Causal determinism proposes that there is a chain of prior occurrences stretching back to the origin of the universe. The relation between events may not be specified, nor the origin of that universe. Causal determinists believe that there is nothing in the universe that is uncaused or self-caused. Historical determinism can also be synonymous with causal determinism. Causal determinism has also been considered more generally as the idea that everything that happens or exists is caused by antecedent conditions; yet such claims can also be considered metaphysical in origin.
Nomological determinism is the most common form of causal determinism: the notion that the past and the present dictate the future entirely and necessarily by rigid natural laws, and that every occurrence results inevitably from prior events. Quantum mechanics and various interpretations thereof pose a challenge to this view. Nomological determinism is sometimes illustrated by the thought experiment of Laplace's demon. Nomological determinism is sometimes called scientific determinism, although that is a misnomer; physical determinism is generally used synonymously with nomological determinism. Necessitarianism is closely related to the causal determinism described above. It is a metaphysical principle that denies all mere possibility: there is exactly one way for the world to be. Leucippus claimed there were no uncaused events, and that everything occurs for a reason.
Determinism
–
Many philosophical theories of determinism frame themselves with the idea that reality follows a sort of predetermined path
Determinism
–
Adequate determinism focuses on
the fact that, even without a full understanding of microscopic physics, we can predict the distribution of 1000 coin tosses
Determinism
–
Nature and nurture interact in humans. A scientist looking at a sculpture after some time does not ask whether we are seeing the effects of the starting materials or of environmental influences.
Determinism
–
A technological determinist might suggest that technology like the mobile phone is the greatest factor shaping human civilization.
7.
Fatalism
–
Fatalism is a philosophical doctrine that stresses the subjugation of all events or actions to fate. Fatalism generally refers to any of the following ideas. First, the view that we are powerless to do anything other than what we actually do; included in this is the belief that man has no power to influence the future, or indeed his own actions. This belief is very similar to predeterminism. Second, an attitude of resignation in the face of some event or events which are thought to be inevitable; Friedrich Nietzsche named this idea "Turkish fatalism" in his book The Wanderer and His Shadow. Third, the view that acceptance is appropriate, rather than resistance against inevitability. This belief is similar to defeatism. Ājīvika was a system of ancient Indian philosophy and a movement of the Mahajanapada period in the Indian subcontinent. The same sources therefore make the Ājīvikas out to be strict fatalists. If all future occurrences are rigidly determined, coming events may in some sense be said to exist already: the future exists in the present, and both exist in the past. Time is thus on ultimate analysis illusory; every phase of a process is always present. In a soul which has attained salvation its earthly births are still present; nothing is destroyed and nothing is produced. Not only are all things determined, but their change and development is a cosmic illusion. Makkhali Gosala was an ascetic teacher of ancient India. He is regarded to have been born in 484 BCE and was a contemporary of Siddhartha Gautama, the founder of Buddhism, and of Mahavira. While the terms are often used interchangeably, fatalism, determinism, and predeterminism are distinct. However, all these doctrines share common ground: determinists generally agree that human actions affect the future but that human action is itself determined by a causal chain of prior events. Their view does not accentuate a submission to fate or destiny; fatalism is a looser term than determinism. The presence of historical indeterminisms or chances, i.e.
events that could not be predicted by sole knowledge of other events, is an idea still compatible with fatalism. Necessity will happen just as inevitably as a chance; both can be imagined as sovereign. Likewise, determinism is a broader term than predeterminism.
Fatalism
–
Time Portal
8.
Hypothesis
–
A hypothesis is a proposed explanation for a phenomenon. For a hypothesis to be a scientific hypothesis, the scientific method requires that one can test it. Scientists generally base scientific hypotheses on previous observations that cannot satisfactorily be explained with the available scientific theories, even though the words "hypothesis" and "theory" are often used synonymously. A working hypothesis is a provisionally accepted hypothesis proposed for further research. In a "what if" question, P is the assumption: "Remember, the way that you prove an implication is by assuming the hypothesis." (Philip Wadler). In its ancient usage, hypothesis referred to a summary of the plot of a classical drama. The English word hypothesis comes from the ancient Greek word ὑπόθεσις (hupothesis). In Plato's Meno, Socrates dissects virtue with a method used by mathematicians, that of investigating from a hypothesis. In this sense, hypothesis refers to an idea or to a convenient mathematical approach that simplifies cumbersome calculations. In common usage in the 21st century, a hypothesis refers to an idea whose merit requires evaluation. For proper evaluation, the framer of a hypothesis needs to define specifics in operational terms. A hypothesis requires more work by the researcher in order to either confirm or disprove it. In due course, a hypothesis may become part of a theory or occasionally may grow to become a theory itself. Normally, scientific hypotheses have the form of a mathematical model. In entrepreneurial science, a hypothesis is used to formulate provisional ideas within a business setting. The formulated hypothesis is then evaluated, and shown to be either true or false through a verifiability- or falsifiability-oriented experiment. Any useful hypothesis will enable predictions by reasoning. It might predict the outcome of an experiment in a laboratory setting or the observation of a phenomenon in nature.
The prediction may also invoke statistics and only talk about probabilities. Other philosophers of science have rejected the criterion of falsifiability or supplemented it with other criteria, such as verifiability or coherence. The scientific method involves experimentation to test the ability of some hypothesis to adequately answer the question under investigation. In contrast, unfettered observation is not as likely to raise unexplained issues or open questions in science. A thought experiment might also be used to test the hypothesis. In framing a hypothesis, the investigator must not currently know the outcome of a test, or it must remain reasonably under continuing investigation. Only in such cases does the experiment, test, or study potentially increase the probability of showing the truth of a hypothesis.
Hypothesis
–
Andreas Cellarius's hypothesis, demonstrating the planetary motions in eccentric and epicyclical
orbits
9.
Nihilism
–
Nihilism is a philosophical doctrine that suggests the lack of belief in one or more reputedly meaningful aspects of life. Most commonly, nihilism is presented in the form of existential nihilism, which argues that life is without objective meaning, purpose, or intrinsic value. Moral nihilists assert that there is no inherent morality, and that accepted moral values are abstractly contrived. Nihilism may also take epistemological, ontological, or metaphysical forms, meaning respectively that, in some aspect, knowledge is not possible, or reality does not actually exist. Movements such as Futurism and deconstruction, among others, have been identified by commentators as nihilistic. Nihilism has many definitions, and thus can describe multiple arguably independent philosophical positions. Extreme metaphysical nihilism is commonly defined as the belief that nothing exists as a correspondent component of the self-efficient world. The American Heritage Medical Dictionary defines one form of nihilism as "a form of skepticism that denies all existence". A similar skepticism concerning the world can be found in solipsism. However, despite the fact that both deny the certainty of objects' true existence, the nihilist would deny the existence of self whereas the solipsist would affirm it; both these positions are considered forms of anti-realism. Epistemological nihilism is a form of skepticism in which all knowledge is accepted as being possibly untrue or as being unable to be confirmed true. Any interpretation of existence must be based on resolution; therefore, there is no way to surmise or measure the validity of mereological nihilism. Thus, the resolution with which the ant views the world it exists within is an important determining factor in how the ant experiences this "within the world" feeling. Existential nihilism is the belief that life has no meaning or value. With respect to the universe, existential nihilism posits that a human or even the entire human species is insignificant, without purpose.
The meaninglessness of life is explored in the philosophical school of existentialism. For example, a moral nihilist would say that killing someone, for whatever reason, is neither inherently right nor inherently wrong; in this way a moral nihilist believes that all moral claims are void of any truth value. An alternative scholarly perspective is that moral nihilism is a morality in itself; Cooper writes, "In the widest sense of the word morality, moral nihilism is a morality." An influential analysis of political nihilism is presented by Leo Strauss. The Russian Nihilist movement was a Russian trend in the 1860s that rejected all authority. Their name derives from the Latin nihil, meaning "nothing". After the assassination of Tsar Alexander II in 1881, the Nihilists gained a reputation throughout Europe as proponents of the use of violence for political change. The Nihilists expressed anger at what they described as the corrupt nature of the Eastern Orthodox Church and of the tsarist monarchy.
Nihilism
–
Søren Aabye Kierkegaard
Nihilism
–
Friedrich Wilhelm Nietzsche
10.
Scientific theory
–
Established scientific theories have withstood rigorous scrutiny and are a comprehensive form of scientific knowledge. It is important to note that the definition of a theory as used in the disciplines of science is significantly different from the common vernacular usage of the word. These different usages are comparable to the differing, and often opposing, usages of the word "prediction" in science versus in vernacular speech. The strength of a theory is related to the diversity of phenomena it can explain. In certain cases, a less-accurate, unmodified scientific theory can still be treated as a theory if it is useful as an approximation under specific conditions. Scientific theories are testable and make falsifiable predictions. They describe the causal elements responsible for a particular natural phenomenon, and are used to explain and predict aspects of the physical universe or specific areas of inquiry. Scientists use theories as a foundation to further scientific knowledge. As with other forms of knowledge, scientific theories are both deductive and inductive in nature and aim for predictive power and explanatory capability. Paleontologist, evolutionary biologist, and science historian Stephen Jay Gould said: "Facts and theories are different things, not rungs in a hierarchy of increasing certainty. Theories are structures of ideas that explain and interpret facts." The defining characteristic of all scientific knowledge is the ability to make predictions; the relevance and specificity of those predictions determine how potentially useful the theory is. A would-be theory that makes no observable predictions is not a theory at all. Predictions not sufficiently specific to be tested are similarly not useful; in both cases, the term "theory" is not applicable. A body of descriptions of knowledge can be called a theory if it fulfills the following criteria: it is well-supported by many independent strands of evidence, rather than a single foundation.
It is consistent with preexisting experimental results and at least as accurate in its predictions as any preexisting theories. These qualities are certainly true of such established theories as special and general relativity, quantum mechanics, plate tectonics, and the modern evolutionary synthesis. It is among the most parsimonious explanations, economical in the use of proposed entities or explanatory steps, as per Occam's razor. The United States National Academy of Sciences defines scientific theories as follows: a theory refers to a comprehensive explanation of some aspect of nature that is supported by a vast body of evidence. Such fact-supported theories are not guesses but reliable accounts of the real world. The theory of biological evolution is more than "just a theory"; it is as factual an explanation of the universe as the theory of matter or the germ theory of disease.
Scientific theory
–
A central prediction from a current theory: the
general theory of relativity predicts the
bending of light in a gravitational field. This prediction was first tested during the
solar eclipse of May 1919.
Scientific theory
–
The first observation of
cells, by
Robert Hooke, using an early
microscope. This led to the development of
cell theory.
Scientific theory
–
Precession of the
perihelion of
Mercury (exaggerated). The deviation in Mercury's position from the Newtonian prediction is about 43
arc-seconds (about two-thirds of 1/60 of a
degree) per century.
Scientific theory
–
Planets of the
Solar System, with the
Sun at the center. (Sizes to scale; distances and illumination not to scale.)
11.
Solipsism
–
Solipsism is the philosophical idea that only one's own mind is sure to exist. As an epistemological position, solipsism holds that knowledge of anything outside one's own mind is unsure; as a metaphysical position, solipsism goes further, to the conclusion that the external world and other minds do not exist. There are varying degrees of solipsism that parallel the varying degrees of skepticism. There are also weaker versions of metaphysical solipsism, such as Caspar Hare's egocentric presentism, in which other people are conscious but their experiences are simply not present. Epistemological solipsism is the variety of idealism according to which only the directly accessible mental contents of the solipsistic philosopher can be known. The existence of an external world is regarded as an unresolvable question rather than actually false. If a person sets up a camera to photograph the moon when he is not looking at it, logically this does not assure that the moon itself existed at the time the photograph is supposed to have been taken. To establish that it is an image of an independent moon requires many other assumptions that amount to begging the question. Methodological solipsism is an agnostic variant of solipsism. It exists in opposition to the strict epistemological requirements for "knowledge". It still entertains the points that any induction is fallible and that we may be brains in vats; only the existence of thoughts is known for certain. Importantly, methodological solipsists do not intend to conclude that the stronger forms of solipsism are actually true. They simply emphasize that justifications of an external world must be founded on indisputable facts about their own consciousness. The methodological solipsist believes that subjective impressions or innate knowledge are the only possible or proper starting point for philosophical construction. Often, methodological solipsism is not held as a belief system.
Denial of materialistic existence, in itself, does not constitute solipsism. A feature of the metaphysical solipsistic worldview is the denial of the existence of other minds. Since personal experiences are private and ineffable, another being's experience can be known only by analogy. Philosophers try to build knowledge on more than an inference or analogy; the experience of a given person is necessarily private to that person. Solipsism was first recorded by the Greek presocratic sophist Gorgias, who is quoted by the Roman skeptic Sextus Empiricus as having stated that even if something exists, nothing can be known about it, and even if something could be known about it, knowledge about it can't be communicated to others. Much of the point of the Sophists was to show that objective knowledge was a literal impossibility.
Solipsism
–
René Descartes. Portrait by Frans Hals, 1648.
12.
Truth
–
Truth is most often used to mean being in accord with fact or reality, or fidelity to an original or standard. Truth may also often be used in modern contexts to refer to an idea of truth to self, the commonly understood opposite of truth is falsehood, which, correspondingly, can also take on a logical, factual, or ethical meaning. The concept of truth is discussed and debated in several contexts, including philosophy, art, Some philosophers view the concept of truth as basic, and unable to be explained in any terms that are more easily understood than the concept of truth itself. Commonly, truth is viewed as the correspondence of language or thought to an independent reality, other philosophers take this common meaning to be secondary and derivative. On this view, the conception of truth as correctness is a derivation from the concepts original essence. Various theories and views of truth continue to be debated among scholars, philosophers, language and words are a means by which humans convey information to one another and the method used to determine what is a truth is termed a criterion of truth. The English word truth is derived from Old English tríewþ, tréowþ, trýwþ, Middle English trewþe, cognate to Old High German triuwida, like troth, it is a -th nominalisation of the adjective true. Old Norse trú, faith, word of honour, religious faith, thus, truth involves both the quality of faithfulness, fidelity, loyalty, sincerity, veracity, and that of agreement with fact or reality, in Anglo-Saxon expressed by sōþ. All Germanic languages besides English have introduced a distinction between truth fidelity and truth factuality. To express factuality, North Germanic opted for nouns derived from sanna to assert, affirm, while continental West Germanic opted for continuations of wâra faith, trust, pact. 
Romance languages use terms following the Latin veritas, while the Greek aletheia and Russian pravda have separate etymological origins. Each of the major substantive theories of truth presents perspectives that are widely shared by published scholars. However, the theories are not universally accepted. More recently developed deflationary or minimalist theories of truth have emerged as competitors to the substantive theories. Minimalist reasoning centres around the notion that the application of a term like true to a statement does not assert anything significant about it, for instance, anything about its nature; it treats truth as a label utilised in general discourse to express agreement or to stress claims. Correspondence theories emphasise that true beliefs and true statements correspond to the actual state of affairs. This type of theory stresses a relationship between thoughts or statements on one hand, and things or objects on the other; it is a traditional model tracing its origins to ancient Greek philosophers such as Socrates, Plato, and Aristotle. This class of theories holds that the truth or the falsity of a representation is determined in principle entirely by how it relates to things. Aquinas restated the theory as: a judgment is said to be true when it conforms to the external reality. Many modern theorists have stated that this ideal cannot be achieved without analysing additional factors; for example, language plays a role in that all languages have words to represent concepts that are virtually undefined in other languages.
Truth
–
Time Saving Truth from Falsehood and
Envy,
François Lemoyne, 1737
Truth
–
Truth, holding a
mirror and a
serpent (1896).
Olin Levi Warner, Library of Congress
Thomas Jefferson Building,
Washington, D.C.
Truth
–
An angel carrying the banner of "Truth", Roslin, Midlothian
Truth
–
Walter Seymour Allward 's Veritas (Truth) outside
Supreme Court of Canada,
Ottawa, Ontario Canada
13.
Uncertainty
–
Uncertainty is a situation that involves imperfect and/or unknown information. The term, however, lacks a single straightforward definition: it applies both to predictions of future events and to physical measurements that have already been made. Uncertainty arises in partially observable and/or stochastic environments, as well as from ignorance or indolence. Uncertainty: a state of having limited knowledge where it is impossible to exactly describe the existing state, a future outcome, or more than one possible outcome. Risk: a state of uncertainty where some possible outcomes have an undesired effect or significant loss. Measurement of risk: a set of measured uncertainties where some possible outcomes are losses, together with the magnitudes of those losses; this also includes loss functions over continuous variables. It will appear that a measurable uncertainty, or risk proper, is so far different from an unmeasurable one that it is not in effect an uncertainty at all. If probabilities are applied to the possible outcomes, using weather forecasts or even just a calibrated probability assessment, the uncertainty has been quantified. Suppose it is quantified as a 90% chance of sunshine. If there is a major, costly, outdoor event planned for tomorrow, then there is a risk, since there is a 10% chance of rain, and rain would be undesirable. Furthermore, suppose that $100,000 would be lost if it rains. These situations can be made even more realistic by quantifying light rain vs. heavy rain and the associated costs. Some may represent the risk in this example as the expected opportunity loss (EOL), the chance of the loss multiplied by the amount of the loss. That is useful if the organizer of the event is risk neutral, which most people are not. Most would be willing to pay a premium to avoid the loss; an insurance company, for example, would compute an EOL as a minimum for any insurance coverage, then add onto that other operating costs and profit.
Since many people are willing to buy insurance for many reasons, clearly the EOL alone is not the perceived value of avoiding the risk. Quantitative uses of the terms uncertainty and risk are fairly consistent across fields such as probability theory, actuarial science, and information theory. Some also create new terms without substantially changing the definitions of uncertainty or risk; for example, surprisal is a variation on uncertainty sometimes used in information theory. But outside of the more mathematical uses of the term, usage may vary widely. In cognitive psychology, uncertainty can be real, or just a matter of perception, such as expectations or threats. Vagueness or ambiguity are sometimes described as second-order uncertainty, where there is uncertainty even about the definitions of uncertain states or outcomes; the difference here is that this uncertainty is about human definitions and concepts, not an objective fact of nature. It is usually modelled by some variation on Zadeh's fuzzy logic. It has been argued that ambiguity, however, is always avoidable, while uncertainty is not necessarily avoidable. Uncertainty may be purely a consequence of a lack of knowledge of obtainable facts. That is, there may be uncertainty about whether a new rocket design will work, but this uncertainty can be removed with further analysis and experimentation.
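The expected opportunity loss described above can be sketched in a few lines of code. This is a minimal illustration using the figures from the text (a 10% chance of rain and a $100,000 loss); the function name is ours, not a standard API.

```python
# A minimal sketch of the example above: the expected opportunity loss
# (EOL) is the chance of the loss multiplied by the amount of the loss.
# The 10% chance of rain and $100,000 loss are the figures from the text.

def expected_opportunity_loss(p_loss: float, loss_amount: float) -> float:
    """Chance of the loss times the amount of the loss."""
    return p_loss * loss_amount

eol = expected_opportunity_loss(0.10, 100_000)
print(eol)  # about $10,000; a risk-neutral insurer would charge at least this
```

As the text notes, this number is only a floor: a risk-averse organizer would pay more than the EOL to avoid the loss, and an insurer would add operating costs and profit on top of it.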
Uncertainty
–
We are frequently presented with situations wherein a decision must be made when we are uncertain of exactly how to proceed.
14.
Agnosticism
–
Agnosticism is the philosophical view that the existence of God or the supernatural is unknown and unknowable. Agnosticism is a doctrine or set of tenets rather than a religion as such. The English biologist Thomas Henry Huxley coined the word agnostic in 1869; earlier works, however, had promoted agnostic points of view, for example the Nasadiya Sukta in the Rigveda, which is agnostic about the origin of the universe. In Huxley's words: Agnosticism is of the essence of science, whether ancient or modern. It simply means that a man shall not say he knows or believes that which he has no scientific grounds for professing to know or believe. Consequently, agnosticism puts aside not only the greater part of popular theology. On the whole, the bosh of heterodoxy is more offensive to me than that of orthodoxy, because heterodoxy professes to be guided by reason and science, and orthodoxy does not. Agnosticism, in fact, is not a creed, but a method. Positively the principle may be expressed: in matters of the intellect, follow your reason as far as it will take you, without regard to any other consideration. And negatively: in matters of the intellect, do not pretend that conclusions are certain which are not demonstrated or demonstrable. Being a scientist above all else, Huxley presented agnosticism as a form of demarcation: a hypothesis with no supporting, objective, testable evidence is not an objective, scientific claim; as such, there would be no way to test said hypotheses, leaving the results inconclusive. His agnosticism was not compatible with forming a belief as to the truth, or falsehood, of such claims. Karl Popper would also describe himself as an agnostic. Others have redefined the concept, making it compatible with forming a belief; George H. Smith rejects agnosticism as a third alternative to theism and atheism and promotes terms such as agnostic atheism and agnostic theism. Agnostic was used by Thomas Henry Huxley in a speech at a meeting of the Metaphysical Society in 1869 to describe his philosophy. Early Christian church leaders used the Greek word gnosis to describe spiritual knowledge.
Agnosticism is not to be confused with religious views opposing the ancient religious movement of Gnosticism in particular; Huxley used the term in a broader, more abstract sense. Huxley identified agnosticism not as a creed but rather as a method of skeptical, evidence-based inquiry. In recent years, scientific literature dealing with neuroscience and psychology has used the word to mean not knowable. In technical and marketing literature, agnostic can also mean independence from some parameters, for example platform agnostic or hardware agnostic. The Scottish Enlightenment philosopher David Hume contended that meaningful statements about the universe are always qualified by some degree of doubt; he asserted that the fallibility of human beings means that they cannot obtain absolute certainty except in trivial cases where a statement is true by definition. A strong agnostic would say, I cannot know whether a deity exists or not, and neither can you. A weak agnostic would say, I don't know whether any deities exist or not, but maybe one day, if there is evidence, we can find something out. An apathetic agnostic holds the question of the existence of deities to be meaningless; therefore, their existence has little to no impact on human affairs. Agnostic thought, in the form of skepticism, emerged as a philosophical position in ancient Greece.
Agnosticism
–
Thomas Henry Huxley
Agnosticism
–
Robert G. Ingersoll
Agnosticism
–
Bertrand Russell
15.
Epistemology
–
Epistemology is the branch of philosophy concerned with the theory of knowledge. Epistemology studies the nature of knowledge, justification, and the rationality of belief. The term epistemology was first used by the Scottish philosopher James Frederick Ferrier in 1854. However, according to Brett Warren, King James VI of Scotland had previously personified this philosophical concept as the character Epistemon in 1591. This personification signified a philomath seeking to obtain greater knowledge through epistemology with the use of theology; the dialogue was used by King James to educate society on various concepts, including their history. The word epistemology is derived from the ancient Greek epistēmē, meaning knowledge, and the suffix -logy, meaning a logical discourse. J. F. Ferrier coined epistemology on the model of ontology, to designate that branch of philosophy which aims to discover the meaning of knowledge, and called it the true beginning of philosophy. The word is equivalent to the concept Wissenschaftslehre, which was used by German philosophers such as Johann Fichte. French philosophers then gave the term épistémologie a narrower meaning as theory of knowledge; Émile Meyerson opened his Identity and Reality, written in 1908, using the term in that sense. In mathematics, it is known that 2 + 2 = 4, but there is also knowing how to add two numbers, and knowing a person, place, thing, or activity. Some philosophers think there is an important distinction between knowing that, knowing how, and acquaintance-knowledge, with epistemology being primarily concerned with the first of these. While these distinctions are not explicit in English, they are defined explicitly in other languages. In French, Portuguese, Spanish and Dutch, to know a person is translated using connaître, conhecer, conocer and kennen respectively. Modern Greek has the verbs γνωρίζω and ξέρω. Italian has the verbs conoscere and sapere, and the nouns for knowledge are conoscenza and sapienza. German has the verbs wissen and kennen.
The verb itself implies a process: you have to go from one state to another, from a state of not-knowing to a state of knowing. This verb seems to be the most appropriate in terms of describing the episteme in one of the modern European languages, hence the German name Erkenntnistheorie. The theoretical interpretation and significance of these linguistic issues remains controversial. In his paper On Denoting, and in his later book Problems of Philosophy, Bertrand Russell stressed the distinction between knowledge by description and knowledge by acquaintance. Gilbert Ryle is similarly credited with stressing the distinction between knowing how and knowing that in The Concept of Mind; he argued that a failure to acknowledge the distinction between knowledge that and knowledge how leads to infinite regress. In epistemology, belief includes everything that we accept as true for ourselves from a cognitive point of view. Whether someone's belief is true is not a prerequisite for its being a belief; on the other hand, if something is actually known, then it categorically cannot be false. For example, suppose a person believes a bridge is safe and attempts to cross it, but the bridge collapses under his weight. It would not be accurate to say that he knew that the bridge was safe, because plainly it was not. By contrast, if the bridge actually supported his weight, then he might say that he had believed that the bridge was safe, whereas now, after proving it to himself, he knows it was. Epistemologists argue over whether belief is the proper truth-bearer. Some would rather describe knowledge as a system of justified true propositions; Plato, in his Gorgias, argues that belief is the most commonly invoked truth-bearer.
Epistemology
–
Plato –
Kant –
Nietzsche
16.
Measure (mathematics)
–
In mathematical analysis, a measure on a set is a systematic way to assign a number to each suitable subset of that set, intuitively interpreted as its size. In this sense, a measure is a generalization of the concepts of length, area, and volume. For instance, the Lebesgue measure of the interval [0, 1] in the real numbers is its length in the everyday sense of the word, specifically 1. Technically, a measure is a function that assigns a non-negative real number or +∞ to (certain) subsets of a set X. It must further be countably additive: the measure of a subset that can be decomposed into a finite (or countably infinite) number of smaller disjoint subsets is the sum of the measures of the smaller subsets. In general, if one wants to associate a consistent size to each subset of a given set while satisfying the other axioms of a measure, one only finds trivial examples like the counting measure. This problem was resolved by defining measure only on a sub-collection of all subsets, the so-called measurable subsets, which are required to form a σ-algebra. This means that countable unions, countable intersections and complements of measurable subsets are measurable. Non-measurable sets in a Euclidean space, on which the Lebesgue measure cannot be defined consistently, are necessarily complicated in the sense of being badly mixed up with their complement; indeed, their existence is a non-trivial consequence of the axiom of choice. Measure theory was developed in successive stages during the late 19th and early 20th centuries by Émile Borel, Henri Lebesgue, Johann Radon and others. The main applications of measures are in the foundations of the Lebesgue integral and in Andrey Kolmogorov's axiomatisation of probability theory. Probability theory considers measures that assign to the whole set the size 1, and considers measurable subsets to be events whose probability is given by the measure. Ergodic theory considers measures that are invariant under, or arise naturally from, a dynamical system. Let X be a set and Σ a σ-algebra over X. A function μ from Σ to the extended real number line is called a measure if it satisfies the following properties. Non-negativity: for all E in Σ, μ(E) ≥ 0. Null empty set: μ(∅) = 0.
Countable additivity (or σ-additivity): for all countable collections {E_k}, k = 1, 2, …, of pairwise disjoint sets in Σ, μ( ⋃_{k=1}^{∞} E_k ) = ∑_{k=1}^{∞} μ(E_k). One may require that at least one set E has finite measure. Then the empty set automatically has measure zero because of countable additivity: μ(E) = μ(E ∪ ∅ ∪ ∅ ∪ …) = μ(E) + μ(∅) + μ(∅) + …, which implies that μ(∅) = 0. If only the second and third conditions of the definition of measure above are met, and μ takes on at most one of the values ±∞, then μ is called a signed measure. The pair (X, Σ) is called a measurable space, and the members of Σ are called measurable sets. If (X, Σ_X) and (Y, Σ_Y) are two measurable spaces, then a function f : X → Y is called measurable if for every Y-measurable set B ∈ Σ_Y, the inverse image is X-measurable, that is, f⁻¹(B) ∈ Σ_X. A triple (X, Σ, μ) is called a measure space. A probability measure is a measure with total measure one, i.e. μ(X) = 1, and a probability space is a measure space with a probability measure.
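The axioms above can be checked concretely on the simplest example mentioned in the text, the counting measure on a finite set. This is a small illustrative sketch; the function name is ours.

```python
# A minimal sketch of the definition above, using the counting measure
# on subsets of a finite set. Every subset of a finite set is
# measurable, and the measure of a subset is its number of elements.

def counting_measure(subset: frozenset) -> int:
    """Assigns to each subset the number of its elements."""
    return len(subset)  # non-negative by construction

A = frozenset({1, 3, 5})   # two disjoint subsets of X = {1, ..., 6}
B = frozenset({2, 4})

# Additivity for disjoint sets: mu(A ∪ B) = mu(A) + mu(B)
assert counting_measure(A | B) == counting_measure(A) + counting_measure(B)
# The empty set has measure zero
assert counting_measure(frozenset()) == 0
```

On an infinite set, by contrast, the counting measure assigns +∞ to every infinite subset, which is why richer measures such as Lebesgue measure are needed for analysis.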
Measure (mathematics)
–
Informally, a measure has the property of being
monotone in the sense that if A is a
subset of B, the measure of A is less than or equal to the measure of B. Furthermore, the measure of the
empty set is required to be 0.
17.
Event (probability theory)
–
In probability theory, an event is a set of outcomes of an experiment (a subset of the sample space) to which a probability is assigned. A single outcome may be an element of many different events. Any event defines a complementary event, namely the complementary set (the event not occurring). Typically, when the sample space is finite, any subset of the sample space is an event. However, this approach does not work well in cases where the sample space is uncountably infinite. So, when defining a probability space it is possible, and often necessary, to exclude certain subsets of the sample space from being events. If we assemble a deck of 52 playing cards with no jokers, and draw a single card from the deck, then the sample space is a 52-element set, as each card is a possible outcome. An event, however, is any subset of the sample space, including any singleton set, the empty set and the sample space itself. Other events are subsets of the sample space that contain multiple elements. So, for example, potential events include: red and black at the same time without being a joker (0 elements), the 5 of Hearts (1 element), a King (4 elements), a face card (12 elements), a Spade (13 elements), a face card or a red suit (32 elements). Since all events are sets, they are usually written as sets. Defining all subsets of the sample space as events works well when there are only finitely many outcomes. For many standard probability distributions, such as the normal distribution, the sample space is the set of real numbers or a subset of it, and attempts to define probabilities for all subsets of the real numbers run into difficulties when one considers badly behaved sets, such as those that are nonmeasurable. Hence, it is necessary to restrict attention to a more limited family of subsets. The most natural choice is the family of Borel measurable sets, derived from unions and intersections of intervals; however, the larger class of Lebesgue measurable sets proves more useful in practice. In the general measure-theoretic description of probability spaces, an event may be defined as an element of a selected σ-algebra of subsets of the sample space. Under this definition, any subset of the sample space that is not an element of the σ-algebra is not an event.
With a reasonable specification of the probability space, however, all events of interest are elements of the σ-algebra. Even though events are subsets of some sample space Ω, they are often written as propositional formulas involving random variables. For example, if X is a real-valued random variable defined on the sample space Ω, the event "X lies between u and v" can be written more conveniently as simply u < X ≤ v.
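The deck-of-cards example above is easy to make concrete. Here is a small sketch that builds the 52-element sample space and computes event probabilities under equally likely outcomes; the variable names are ours.

```python
from fractions import Fraction
from itertools import product

# A small sketch of the deck example above: the sample space is the set
# of 52 cards, events are subsets of it, and with equally likely
# outcomes an event's probability is its size divided by 52.

ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
suits = ['Hearts', 'Diamonds', 'Clubs', 'Spades']
sample_space = set(product(ranks, suits))               # 52 outcomes

kings = {c for c in sample_space if c[0] == 'K'}        # the event "a King"
spades = {c for c in sample_space if c[1] == 'Spades'}  # the event "a Spade"

def prob(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(sample_space))

print(prob(kings))           # 1/13
print(prob(spades))          # 1/4
print(prob(kings | spades))  # the union is also an event: 4/13
```

Note that unions, intersections and complements of these sets are again events, which is exactly the closure property the σ-algebra formalizes in the infinite case.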
Event (probability theory)
–
A
Venn diagram of an event. B is the sample space and A is an event. By the ratio of their areas, the probability of A is approximately 0.4.
18.
Mathematics
–
Mathematics is the study of topics such as quantity, structure, space, and change. There is a range of views among mathematicians and philosophers as to the exact scope and definition of mathematics. Mathematicians seek out patterns and use them to formulate new conjectures; they resolve the truth or falsity of conjectures by mathematical proof. When mathematical structures are good models of real phenomena, mathematical reasoning can provide insight or predictions about nature. Through the use of abstraction and logic, mathematics developed from counting, calculation, and measurement. Practical mathematics has been a human activity from as far back as written records exist. The research required to solve mathematical problems can take years or even centuries of sustained inquiry. Rigorous arguments first appeared in Greek mathematics, most notably in Euclid's Elements. Galileo Galilei said: The universe cannot be read until we have learned the language in which it is written. It is written in mathematical language, and the letters are triangles, circles and other geometrical figures, without which means it is humanly impossible to comprehend a single word. Without these, one is wandering about in a dark labyrinth. Carl Friedrich Gauss referred to mathematics as the Queen of the Sciences. Benjamin Peirce called mathematics the science that draws necessary conclusions. David Hilbert said of mathematics: We are not speaking here of arbitrariness in any sense. Mathematics is not like a game whose tasks are determined by arbitrarily stipulated rules. Rather, it is a conceptual system possessing internal necessity that can only be so and by no means otherwise. Albert Einstein stated that as far as the laws of mathematics refer to reality, they are not certain, and as far as they are certain, they do not refer to reality. Mathematics is essential in many fields, including natural science, engineering, medicine, finance and the social sciences.
Applied mathematics has led to entirely new mathematical disciplines, such as statistics and game theory. Mathematicians also engage in pure mathematics, or mathematics for its own sake, without having any application in mind. There is no clear line separating pure and applied mathematics. The history of mathematics can be seen as an ever-increasing series of abstractions. The earliest uses of mathematics were in trading, land measurement, painting and weaving patterns. In Babylonian mathematics, elementary arithmetic first appears in the archaeological record. Numeracy pre-dated writing, and numeral systems have been many and diverse. Between 600 and 300 BC the Ancient Greeks began a systematic study of mathematics in its own right with Greek mathematics. Mathematics has since been greatly extended, and there has been a fruitful interaction between mathematics and science, to the benefit of both. Mathematical discoveries continue to be made today; the overwhelming majority of works in this ocean contain new mathematical theorems and their proofs. The word máthēma is derived from μανθάνω, while the modern Greek equivalent is μαθαίνω, both of which mean to learn. In Greece, the word for mathematics came to have the narrower and more technical meaning mathematical study even in Classical times.
Mathematics
–
Euclid (holding
calipers), Greek mathematician, 3rd century BC, as imagined by
Raphael in this detail from
The School of Athens.
Mathematics
–
Greek mathematician
Pythagoras (c. 570 – c. 495 BC), commonly credited with discovering the
Pythagorean theorem
Mathematics
–
Leonardo Fibonacci, the
Italian mathematician who introduced the Hindu–Arabic numeral system to the Western World
Mathematics
–
Carl Friedrich Gauss, known as the prince of mathematicians
19.
Statistics
–
Statistics is a branch of mathematics dealing with the collection, analysis, interpretation, presentation, and organization of data. In applying statistics to, e.g., a scientific, industrial, or social problem, it is conventional to begin with a statistical population or process to be studied. Populations can be diverse topics, such as all people living in a country or every atom composing a crystal. Statistics deals with all aspects of data, including the planning of data collection in terms of the design of surveys and experiments. The statistician Sir Arthur Lyon Bowley defined statistics as numerical statements of facts in any department of inquiry placed in relation to each other. When census data cannot be collected, statisticians collect data by developing specific experiment designs and survey samples. Representative sampling assures that inferences and conclusions can safely extend from the sample to the population as a whole. In contrast, an observational study does not involve experimental manipulation. Inferences in mathematical statistics are made under the framework of probability theory, which deals with the analysis of random phenomena. A standard statistical procedure involves the test of the relationship between two data sets, or a data set and synthetic data drawn from an idealized model. A hypothesis is proposed for the statistical relationship between the two data sets, and this is compared as an alternative to an idealized null hypothesis of no relationship between the two data sets. Rejecting or disproving the null hypothesis is done using statistical tests that quantify the sense in which the null can be proven false, given the data used in the test. Working from a null hypothesis, two basic forms of error are recognized: Type I errors (the null hypothesis is falsely rejected, giving a false positive) and Type II errors (the null hypothesis fails to be rejected when it is in fact false, giving a false negative). Multiple problems have come to be associated with this framework, ranging from obtaining a sufficient sample size to specifying an adequate null hypothesis. Measurement processes that generate statistical data are also subject to error.
Many of these errors are classified as random (noise) or systematic (bias). The presence of missing data or censoring may result in biased estimates, and specific techniques have been developed to address these problems. Statistics continues to be an area of active research, for example on the problem of how to analyze Big data. Statistics is a body of science that pertains to the collection, analysis, interpretation or explanation, and presentation of data. Some consider statistics to be a distinct mathematical science rather than a branch of mathematics. While many scientific investigations make use of data, statistics is concerned with the use of data in the context of uncertainty. Mathematical techniques used for this include mathematical analysis, linear algebra, stochastic analysis, differential equations, and measure-theoretic probability theory. In applying statistics to a problem, it is common practice to start with a population or process to be studied. Populations can be diverse topics such as all people living in a country or every atom composing a crystal. Ideally, statisticians compile data about the entire population; this may be organized by governmental statistical institutes.
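The standard procedure described above, comparing two data sets against a null hypothesis of no relationship, can be sketched with a simple permutation test. The data below are invented purely for illustration; a real analysis would of course use the actual observations.

```python
import random
import statistics

# A minimal sketch of the procedure described above: test whether two
# data sets differ in mean, against a null hypothesis of no relationship,
# using a permutation test. The numbers are made up for illustration.

a = [2.1, 2.5, 2.9, 3.2, 2.8, 3.0]   # first data set
b = [3.4, 3.8, 3.1, 3.9, 3.6, 3.5]   # second data set
observed = statistics.mean(b) - statistics.mean(a)

# Under the null hypothesis the group labels are exchangeable, so we
# shuffle the pooled data and count how often a difference at least as
# large as the observed one arises by chance.
random.seed(0)
pooled = a + b
count = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[6:]) - statistics.mean(pooled[:6])
    if diff >= observed:
        count += 1

p_value = count / trials  # a small value is evidence against the null
print(p_value)
```

Rejecting the null when the p-value falls below a chosen threshold is exactly where the Type I and Type II errors mentioned above arise: the threshold controls the false-positive rate, at the cost of some power.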
Statistics
–
Scatter plots are used in descriptive statistics to show the observed relationships between different variables.
Statistics
–
More
probability density is found as one gets closer to the expected (mean) value in a
normal distribution. Statistics used in
standardized testing assessment are shown. The scales include
standard deviations, cumulative percentages, percentile equivalents, Z-scores, T-scores, standard nines, and percentages in standard nines.
Statistics
–
Gerolamo Cardano, a pioneer of the mathematics of probability.
Statistics
–
Karl Pearson, a founder of mathematical statistics.
20.
Finance
–
Finance is a field that deals with the study of investments. It includes the dynamics of assets and liabilities over time under conditions of different degrees of uncertainty and risk. Finance can also be defined as the science of money management. Finance aims to price assets based on their risk level and their expected rate of return. Finance can be broken into three sub-categories: public finance, corporate finance and personal finance. Personal finance addresses, e.g., health and property insurance, and investing and saving for retirement. Personal finance may also involve paying for a loan or other debt obligations. Net worth is a person's balance sheet, calculated by adding up all assets under that person's control, minus all liabilities of the household, at one point in time. Household cash flow totals up all the expected sources of income within a year, minus all expected expenses within the same year. From this analysis, the financial planner can determine to what degree and in what time the personal goals can be accomplished. Adequate protection: the analysis of how to protect a household from unforeseen risks. These risks can be divided into liability, property, death, disability, health and long-term care. Some of these risks may be self-insurable, while most will require the purchase of an insurance contract. Determining how much insurance to get, at the most cost-effective terms, requires knowledge of the market for personal insurance. Business owners, professionals, athletes and entertainers require specialized insurance professionals to adequately protect themselves. Since insurance also enjoys some tax benefits, utilizing insurance investment products may be a piece of the overall investment planning. Tax planning: typically the income tax is the single largest expense in a household. Managing taxes is not a question of whether you will pay taxes, but when and how much. Governments give many incentives in the form of tax deductions and credits, and most modern governments use a progressive tax. Typically, as one's income grows, a higher marginal rate of tax must be paid.
Understanding how to take advantage of the tax breaks when planning one's personal finances can make a significant impact and save money in the long term. Investment and accumulation goals: planning how to accumulate enough money for large purchases. Major reasons to accumulate assets include purchasing a house or car, starting a business, paying for education expenses, and saving for retirement. Achieving these goals requires projecting what they will cost and when the funds will need to be withdrawn. A major risk to the household in achieving its accumulation goal is the rate of price increases over time, or inflation. Using net present value calculators, the financial planner will suggest a combination of asset earmarking and regular savings. In order to overcome the rate of inflation, the investment portfolio has to earn a higher rate of return, which typically subjects it to a number of risks. Managing these portfolio risks is most often accomplished using asset allocation, which seeks to diversify investment risk and opportunity.
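The inflation risk mentioned above is exactly what present-value discounting quantifies. Here is a minimal sketch; the 3% inflation rate, 10-year horizon and $50,000 goal are hypothetical figures for illustration only.

```python
# A minimal sketch of the inflation point above, using present-value
# discounting. The 3% rate, 10-year horizon and $50,000 target are
# hypothetical figures, not recommendations.

def present_value(future_amount: float, rate: float, years: int) -> float:
    """Discount a future nominal amount back to today at an annual rate."""
    return future_amount / (1 + rate) ** years

# A fixed nominal $50,000 received in 10 years, with 3% annual inflation,
# is worth roughly $37,200 in today's purchasing power, so the portfolio
# must earn more than inflation just to preserve the goal.
print(round(present_value(50_000, 0.03, 10), 2))
```

This is the calculation a net present value tool performs for each goal before the planner chooses an asset allocation to beat the assumed inflation rate.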
Finance
–
London Stock Exchange, global center of finance.
Finance
–
Wall Street, the center of American finance.
21.
Gambling
–
Gambling is the wagering of money or something of value on an event with an uncertain outcome, with the primary intent of winning money and/or material goods. Gambling thus requires three elements to be present: consideration, chance and prize. The term gaming in this context typically refers to instances in which the activity has been specifically permitted by law. However, this distinction is not universally observed in the English-speaking world; for instance, in the United Kingdom, the regulator of gambling activities is called the Gambling Commission (not the Gaming Commission). Gambling is also an international commercial activity, with the legal gambling market totaling an estimated $335 billion in 2009. In other forms, gambling can be conducted with materials which have a value but are not real money, such as marbles or collectible game pieces. Many popular games played in modern casinos originate from Europe and China. Games such as craps, baccarat, roulette, and blackjack originate from different areas of Europe, while a version of keno, an ancient Chinese lottery game, is played in casinos around the world. In addition, pai gow poker, a hybrid between pai gow and poker, is also played. Many jurisdictions, local as well as national, either ban gambling or heavily control it by licensing the vendors. Such regulation generally leads to gambling tourism and illegal gambling in the areas where it is not allowed. There is generally legislation requiring that the odds in gaming devices are statistically random, to prevent manufacturers from making some high-payoff results impossible. Since these high payoffs have very low probability, a house bias can quite easily be missed unless the odds are checked carefully. Most jurisdictions that allow gambling require participants to be above a certain age; in some jurisdictions, the gambling age differs depending on the type of gambling. For example, in many American states one must be over 21 to enter a casino, but may buy a lottery ticket after turning 18.
Nonetheless, both insurance and gambling contracts are typically considered aleatory contracts under most legal systems, though they are subject to different types of regulation. Under common law, particularly English law, a gambling contract may not give a casino bona fide purchaser status. For case law on recovery of gambling losses where the loser had stolen the funds, see Rights of owner of money as against one who won it in gambling transaction from thief. This was a plot point in a Perry Mason novel, The Case of the Singing Skirt. Religious perspectives on gambling have been mixed. Ancient Hindu poems like the Gambler's Lament and the Mahabharata testify to the popularity of gambling among ancient Indians; however, the text Arthashastra recommends taxation and control of gambling. Ancient Jewish authorities frowned on gambling, even disqualifying professional gamblers from testifying in court. For these social and religious reasons, most legal jurisdictions limit gambling. In at least one case, the same bishop opposing a casino has sold land to be used for its construction. Different interpretations of Islamic law exist in the Muslim world, though gambling is generally forbidden.
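The earlier point about low-probability payoffs, that a bias is easy to miss unless the odds are checked carefully, can be illustrated with a quick simulation. The 1-in-1000 odds and trial counts below are purely hypothetical.

```python
import random

# A sketch of the point above: when a payoff has very low probability,
# a bias in a gaming device is easy to miss unless many trials are
# checked. The odds and trial counts here are hypothetical.

def estimate_rate(p: float, trials: int, rng: random.Random) -> float:
    """Empirical frequency of a rare win whose true probability is p."""
    wins = sum(rng.random() < p for _ in range(trials))
    return wins / trials

rng = random.Random(42)
claimed, actual = 1 / 1000, 1 / 2000  # device pays half as often as claimed

# A casual check with 1,000 plays cannot reliably tell the two apart...
print(estimate_rate(claimed, 1_000, rng), estimate_rate(actual, 1_000, rng))
# ...but a careful check with 200,000 plays makes the bias visible.
print(estimate_rate(claimed, 200_000, rng), estimate_rate(actual, 200_000, rng))
```

The small samples typically show zero or a handful of wins for both devices, while the large samples separate cleanly around 0.001 and 0.0005, which is why regulators test devices over very long runs.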
Gambling
–
Caravaggio,
The Cardsharps, c. 1594
Gambling
–
Gamblers in the
Ship of Fools, 1494
Gambling
–
Bag with 65 Inlaid Gambling Sticks, Tsimshian (Native American), 19th century,
Brooklyn Museum
Gambling
–
The
Caesars Palace main fountain. The statue is a copy of the ancient
Winged Victory of Samothrace.
22.
Science
–
Science is a systematic enterprise that builds and organizes knowledge in the form of testable explanations and predictions about the universe. The formal sciences are often excluded, as they do not depend on empirical observations. Disciplines which use science, like engineering and medicine, may also be considered to be applied sciences. During the Islamic Golden Age, foundations for the scientific method were laid by Ibn al-Haytham in his Book of Optics. In the 17th and 18th centuries, scientists increasingly sought to formulate knowledge in terms of physical laws. Over the course of the 19th century, the word science became increasingly associated with the scientific method itself as a disciplined way to study the natural world. It was during this time that scientific disciplines such as biology, chemistry and physics reached their modern shapes. Science in a broad sense existed before the modern era and in many historical civilizations; modern science is distinct in its approach and successful in its results. Science in its original sense was a word for a type of knowledge rather than a specialized word for the pursuit of such knowledge. In particular, it was the type of knowledge which people can communicate to each other. For example, knowledge about the working of natural things was gathered long before recorded history and led to the development of complex abstract thought. This is shown by the construction of calendars and by techniques for making poisonous plants edible. The early Greek natural philosophers were the first to seek explanations of nature without appeal to the supernatural; for this reason, it is claimed these men were the first philosophers in the strict sense. They were mainly speculators or theorists, particularly interested in astronomy. In contrast, trying to use knowledge of nature to imitate nature was seen by such thinkers as a more appropriate interest for lower-class artisans.
A clear-cut distinction between formal and empirical science was made by the pre-Socratic philosopher Parmenides; although his work Peri Physeos is a poem, it may be viewed as an epistemological essay on method in natural science. Parmenides' ἐὸν may refer to a formal system or calculus which can describe nature more precisely than natural languages. Physis may be identical to ἐὸν. He criticized the older type of study of physics as too purely speculative and lacking in self-criticism, and was particularly concerned that some of the early physicists treated nature as if it could be assumed that it had no intelligent order, explaining things merely in terms of motion and matter. The study of such things had been the realm of mythology and tradition, however. Aristotle later created a less controversial systematic programme of Socratic philosophy which was teleological, and he rejected many of the conclusions of earlier scientists. For example, in his physics, the sun goes around the earth, and each thing has a formal cause, a final cause and a role in a rational cosmic order. Motion and change are described as the actualization of potentials already in things. While the Socratics insisted that philosophy should be used to consider the practical question of the best way to live for a human being, they did not argue for any other types of applied science.
Science
–
Maize, known in some English-speaking countries as corn, is a large grain plant domesticated by indigenous peoples in Mesoamerica in prehistoric times.
The scale of the universe mapped to the branches of science and the hierarchy of science.
Aristotle, 384 BC – 322 BC, one of the early figures in the development of the scientific method.
Galen (129 – c. 216) noted the optic chiasm is X-shaped. (Engraving from Vesalius, 1543)
23.
Physics
–
Physics is the natural science that involves the study of matter and its motion and behavior through space and time, along with related concepts such as energy and force. One of the most fundamental scientific disciplines, the main goal of physics is to understand how the universe behaves. Physics is one of the oldest academic disciplines, perhaps the oldest through its inclusion of astronomy. Physics intersects with many interdisciplinary areas of research, such as biophysics and quantum chemistry, and the boundaries of physics are not rigidly defined. New ideas in physics often explain the fundamental mechanisms of other sciences while opening new avenues of research in areas such as mathematics. Physics also makes significant contributions through advances in new technologies that arise from theoretical breakthroughs; the United Nations named 2005 the World Year of Physics. Astronomy is the oldest of the natural sciences. The stars and planets were often a target of worship, believed to represent gods, though the explanations for these phenomena were often unscientific and lacking in evidence. According to Asger Aaboe, the origins of Western astronomy can be found in Mesopotamia, and all Western efforts in the exact sciences are descended from late Babylonian astronomy. During the Islamic Golden Age, the most notable innovations were in the field of optics and vision, which came from the works of many scientists like Ibn Sahl, Al-Kindi, Ibn al-Haytham, Al-Farisi and Avicenna. The most notable work was The Book of Optics, written by Ibn al-Haytham, in which he was not only the first to disprove the ancient Greek idea about vision but also came up with a new theory. In the book, he was also the first to study the phenomenon of the pinhole camera. Many later European scholars and fellow polymaths, from Robert Grosseteste and Leonardo da Vinci to René Descartes, Johannes Kepler and Isaac Newton, were in his debt.
Indeed, the influence of Ibn al-Haytham's Optics ranks alongside that of Newton's work of the same title, and the translation of The Book of Optics had a huge impact on Europe. From it, later European scholars were able to build the same devices that Ibn al-Haytham had described, and from this, such important inventions as eyeglasses, magnifying glasses and telescopes followed. Physics became a separate science when early modern Europeans used experimental and quantitative methods to discover what are now considered to be the laws of physics. Newton also developed calculus, the mathematical study of change, which provided new mathematical methods for solving physical problems. The discovery of new laws in thermodynamics, chemistry and electromagnetics resulted from greater research efforts during the Industrial Revolution as energy needs increased. However, inaccuracies in classical mechanics for very small objects and very high velocities led to the development of modern physics in the 20th century. Modern physics began in the early 20th century with the work of Max Planck in quantum theory and Albert Einstein's theory of relativity; both of these theories came about due to inaccuracies in classical mechanics in certain situations. Quantum mechanics would come to be pioneered by Werner Heisenberg, Erwin Schrödinger and Paul Dirac, and from this early work, and work in related fields, the Standard Model of particle physics was derived. Areas of mathematics in general are important to this field, such as the study of probabilities. In many ways, physics stems from ancient Greek philosophy.
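The new mathematical methods Newton brought to physical problems can be illustrated with a minimal numerical sketch. This uses Euler's method rather than anything from the text, and all parameter values (`height`, `g`, `dt`) are illustrative assumptions: it integrates the second law F = ma for a body in free fall.

```python
# Minimal sketch: integrating Newton's second law F = m*a for a
# falling body under constant gravity, using Euler's method.
# All parameter values (height, g, dt) are illustrative assumptions.

def simulate_fall(height, g=9.81, dt=0.001):
    """Return the approximate time for a body dropped from `height`
    (metres) to reach the ground."""
    y, v, t = height, 0.0, 0.0
    while y > 0:
        v += -g * dt      # a = F/m = -g (mass cancels in free fall)
        y += v * dt       # advance position with the updated velocity
        t += dt
    return t

t = simulate_fall(20.0)
# Closed form gives t = sqrt(2h/g) ≈ 2.02 s; Euler's method approximates it.
```

Shrinking `dt` trades run time for accuracy, which is the basic bargain of numerical methods for physical problems.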
Physics
–
Further information: Outline of physics
Ancient Egyptian astronomy is evident in monuments like the ceiling of Senemut's tomb from the Eighteenth Dynasty of Egypt.
Sir Isaac Newton (1643–1727), whose laws of motion and universal gravitation were major milestones in classical physics.
Albert Einstein (1879–1955), whose work on the photoelectric effect and the theory of relativity led to a revolution in 20th century physics.
24.
Artificial intelligence
–
Artificial intelligence is intelligence exhibited by machines. Colloquially, the term artificial intelligence is applied when a machine mimics cognitive functions that humans associate with other human minds, such as learning and problem solving. As machines become increasingly capable, mental facilities once thought to require intelligence are removed from the definition; for instance, optical character recognition is no longer perceived as an example of artificial intelligence, having become a routine technology. AI research is divided into subfields that focus on specific problems, on specific approaches, on the use of a particular tool, or towards satisfying particular applications. The central problems of AI research include reasoning, knowledge, planning, learning, natural language processing and perception; general intelligence is among the field's long-term goals. Approaches include statistical methods, computational intelligence and traditional symbolic AI. Many tools are used in AI, including versions of search and mathematical optimization, logic, and methods based on probability and economics. The AI field draws upon computer science, mathematics, psychology, linguistics, philosophy and neuroscience, and the field was founded on the claim that human intelligence can be so precisely described that a machine can be made to simulate it. Some people also consider AI a danger to humanity if it progresses unabatedly. While thought-capable artificial beings appeared as storytelling devices in antiquity, the idea of actually trying to build a machine to perform useful reasoning may have begun with Ramon Llull. With his calculus ratiocinator, Gottfried Leibniz extended the concept of the calculating machine, and since the 19th century, artificial beings have been common in fiction, as in Mary Shelley's Frankenstein or Karel Čapek's R. U. R.
The study of mechanical or formal reasoning began with philosophers and mathematicians in antiquity. In the 19th century, George Boole refined those ideas into propositional logic and Gottlob Frege developed a notational system for mechanical reasoning. Around the 1940s, Alan Turing's theory of computation suggested that a machine, by shuffling symbols as simple as 0 and 1, could simulate any conceivable act of mathematical deduction. This insight, that digital computers can simulate any process of formal reasoning, is known as the Church–Turing thesis. Along with concurrent discoveries in neurology, information theory and cybernetics, this led researchers to consider the possibility of building an electronic brain; the first work that is now generally recognized as AI was McCulloch and Pitts' 1943 formal design for Turing-complete artificial neurons. The field of AI research was born at a conference at Dartmouth College in 1956, where attendees Allen Newell, Herbert Simon, John McCarthy, Marvin Minsky and Arthur Samuel became the founders and leaders of AI research. At the conference, Newell and Simon, together with programmer J. C. Shaw, presented the first true artificial intelligence program, the Logic Theorist. This spurred tremendous research in the domain: computers were winning at checkers, solving word problems in algebra, and proving logical theorems. By the middle of the 1960s, research in the U. S. was heavily funded by the Department of Defense and laboratories had been established around the world. AI's founders were optimistic about the future: Herbert Simon predicted that "machines will be capable, within twenty years, of doing any work a man can do", and Marvin Minsky agreed, writing that "within a generation ... the problem of creating artificial intelligence will substantially be solved". They failed to recognize the difficulty of some of the remaining tasks.
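The propositional logic that Boole refined lends itself to mechanical checking, which is the point of the Church–Turing observation above. A minimal sketch, where the helper names (`is_tautology`, `modus_ponens`) are illustrative choices rather than anything from the text:

```python
# Minimal sketch of Boolean propositional logic: evaluate a formula
# over every truth assignment and check whether it is a tautology.
# The helper names (is_tautology, modus_ponens) are illustrative.
from itertools import product

def is_tautology(formula, variables):
    """formula: a function mapping a dict of truth values to a bool.
    Return True if it holds under every assignment."""
    return all(
        formula(dict(zip(variables, values)))
        for values in product([False, True], repeat=len(variables))
    )

# Modus ponens as a formula: ((p -> q) and p) -> q,
# encoding x -> y as (not x) or y.
modus_ponens = lambda v: (not ((not v["p"] or v["q"]) and v["p"])) or v["q"]
# is_tautology(modus_ponens, ["p", "q"]) evaluates to True
```

Exhaustive truth-table evaluation is exactly the kind of symbol shuffling a digital computer can carry out, though it scales exponentially in the number of variables.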
Artificial intelligence
–
Kismet, a robot with rudimentary social skills
An ontology represents knowledge as a set of concepts within a domain and the relationships between those concepts.
25.
Machine learning
–
Machine learning is the subfield of computer science that, in Arthur Samuel's 1959 formulation, gives computers the ability to learn without being explicitly programmed. Machine learning is closely related to computational statistics, which also focuses on prediction-making through the use of computers, and it has strong ties to mathematical optimization, which delivers methods, theory and application domains to the field. Machine learning is sometimes conflated with data mining, where the latter subfield focuses more on exploratory data analysis and is known as unsupervised learning. Machine learning can also be unsupervised and be used to learn and establish baseline behavioral profiles for various entities. Tom M. Mitchell provided a widely quoted, more formal definition: "A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P if its performance at tasks in T, as measured by P, improves with experience E." Alan Turing proposed that the question "Can machines think?" be replaced with the question "Can machines do what we (as thinking entities) can do?"; in the proposal he explores the characteristics that could be possessed by a thinking machine. Machine learning tasks are typically classified into three categories, depending on the nature of the learning signal or feedback available to the learning system. These are: Supervised learning, where the computer is presented with example inputs and their desired outputs, given by a teacher. Unsupervised learning, where no labels are given to the learning algorithm; unsupervised learning can be a goal in itself or a means towards an end. Reinforcement learning, where a computer program interacts with an environment in which it must perform a certain goal, and is provided feedback in terms of rewards and punishments as it navigates its problem space. Between supervised and unsupervised learning is semi-supervised learning, where the teacher gives an incomplete training signal: a training set with some of the target outputs missing. Transduction is a special case of this principle where the entire set of problem instances is known at learning time. Among other categories of machine learning problems, learning to learn learns its own inductive bias based on previous experience; this is typically tackled in a supervised way.
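The contrast between the first two categories above can be shown with a toy sketch in pure Python. The data, labels and helper names are illustrative assumptions: a supervised nearest-centroid classifier learns from labelled 1-D points, while an unsupervised pass groups the same numbers without any labels.

```python
# Toy sketch contrasting supervised and unsupervised learning.
# Data, labels and helper names are illustrative assumptions.

def centroid(points):
    return sum(points) / len(points)

# Supervised: labelled 1-D examples -> learn one centroid per class.
train = {"low": [1.0, 2.0, 1.5], "high": [8.0, 9.0, 8.5]}
model = {label: centroid(xs) for label, xs in train.items()}

def classify(x):
    """Predict the label whose learned centroid is nearest to x."""
    return min(model, key=lambda label: abs(x - model[label]))

# Unsupervised: same numbers, no labels -> split around the overall mean.
data = [1.0, 2.0, 1.5, 8.0, 9.0, 8.5]
mean = centroid(data)
clusters = [[x for x in data if x <= mean], [x for x in data if x > mean]]

# classify(1.2) -> "low"; `clusters` separates the two groups without labels
```

The supervised model needed a teacher to supply the labels; the unsupervised split recovered the same grouping from the data's own structure.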
Spam filtering is an example of classification, where the inputs are email messages and the classes are spam and not spam. In regression, also a supervised problem, the outputs are continuous rather than discrete. In clustering, a set of inputs is to be divided into groups; unlike in classification, the groups are not known beforehand, making this typically an unsupervised task. Density estimation finds the distribution of inputs in some space, and dimensionality reduction simplifies inputs by mapping them into a lower-dimensional space. Topic modeling is a related problem, where a program is given a list of human-language documents and is tasked to find out which documents cover similar topics. As a scientific endeavour, machine learning grew out of the quest for artificial intelligence; already in the early days of AI as an academic discipline, some researchers were interested in having machines learn from data.
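Regression, with its continuous outputs, has a particularly compact classical solution. A minimal sketch using the closed-form ordinary least squares formulas for a line; the sample points are illustrative:

```python
# Minimal sketch of regression: fit y = a*x + b to sample points by
# ordinary least squares, using the closed-form formulas.
# The sample points are illustrative.

def fit_line(xs, ys):
    """Return slope a and intercept b minimising squared error."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])   # points lying on y = 2x + 1
# a ≈ 2.0, b ≈ 1.0
```

On noisy data the same formulas return the line of best fit rather than an exact recovery, which is the usual setting for regression.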
Machine learning
–
Machine learning and data mining
26.
Computer science
–
Computer science is the study of the theory, experimentation, and engineering that form the basis for the design and use of computers. An alternate, more succinct definition of computer science is the study of automating algorithmic processes that scale. A computer scientist specializes in the theory of computation and the design of computational systems, and the field can be divided into a variety of theoretical and practical disciplines. Some fields, such as computational complexity theory, are highly abstract, while other fields focus on the challenges in implementing computation. Human–computer interaction considers the challenges in making computers and computations useful and usable. The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks such as the abacus have existed since antiquity; further, algorithms for performing computations have existed since antiquity, even before the development of sophisticated computing equipment. Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623, and in 1673, Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner. Leibniz may be considered the first computer scientist and information theorist, for, among other reasons, documenting the binary number system. Charles Babbage started developing his programmable Analytical Engine in 1834, and in less than two years he had sketched out many of the salient features of the modern computer. A crucial step was the adoption of a punched card system derived from the Jacquard loom, making it infinitely programmable. Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information. When the machine was finished, some hailed it as Babbage's dream come true.
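One algorithm that predates any computing equipment is Euclid's method for the greatest common divisor, a concrete instance of the ancient algorithms mentioned above. A minimal sketch:

```python
# Euclid's algorithm, one of the oldest known algorithms: repeatedly
# replace the pair (a, b) with (b, a mod b) until the remainder is zero.

def gcd(a, b):
    """Return the greatest common divisor of two non-negative integers."""
    while b:
        a, b = b, a % b
    return a

# gcd(1071, 462) -> 21
```

The same procedure Euclid described geometrically runs unchanged on a modern machine, which is why it is a standard first example of an algorithm.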
During the 1940s, as new and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors. As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as a distinct academic discipline in the 1950s. The world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science degree program in the United States was formed at Purdue University in 1962. Since practical computers became available, many applications of computing have become distinct areas of study in their own rights, and it is the now well-known IBM brand that formed part of the computer science revolution during this time. IBM released the IBM 704 and later the IBM 709 computers. Still, working with the IBM computer was frustrating: if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again. During the late 1950s, the computer science discipline was very much in its developmental stages. Time has seen significant improvements in the usability and effectiveness of computing technology, and modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals to a near-ubiquitous user base.
Computer science
–
Ada Lovelace is credited with writing the first algorithm intended for processing on a computer.
The German military used the Enigma machine (shown here) during World War II for communications they wanted kept secret. The large-scale decryption of Enigma traffic at Bletchley Park was an important factor that contributed to Allied victory in WWII.
Digital logic
27.
Game theory
–
Game theory is the study of mathematical models of conflict and cooperation between intelligent rational decision-makers. Game theory is used in economics, political science and psychology, as well as in logic, computer science and biology. Originally, it addressed zero-sum games, in which one person's gains result in losses for the other participants. Today, game theory applies to a wide range of behavioral relations, and is now an umbrella term for the science of logical decision making in humans, animals and computers. Modern game theory began with the idea regarding the existence of mixed-strategy equilibria in two-person zero-sum games and its proof by John von Neumann. Von Neumann's original proof used Brouwer's fixed-point theorem on continuous mappings into compact convex sets, and his paper was followed by the 1944 book Theory of Games and Economic Behavior, co-written with Oskar Morgenstern, which considered cooperative games of several players. The second edition of this book provided an axiomatic theory of expected utility. This theory was developed extensively in the 1950s by many scholars, and game theory was later explicitly applied to biology in the 1970s, although similar developments go back at least as far as the 1930s. Game theory has been recognized as an important tool in many fields, with the Nobel Memorial Prize in Economic Sciences going to game theorist Jean Tirole in 2014; John Maynard Smith was awarded the Crafoord Prize for his application of game theory to biology. Early discussions of examples of two-person games occurred long before the rise of modern mathematical game theory. The first known discussion of game theory occurred in a letter written by Charles Waldegrave, an active Jacobite and uncle to James Waldegrave, a British diplomat, in 1713. In this letter, Waldegrave provides a mixed strategy solution to a two-person version of the card game le Her. James Madison made what we now recognize as a game-theoretic analysis of the ways states can be expected to behave under different systems of taxation.
In 1913, Ernst Zermelo published Über eine Anwendung der Mengenlehre auf die Theorie des Schachspiels, which proved that the optimal chess strategy is strictly determined. This paved the way for more general theorems. The Danish mathematician Zeuthen proved that the mathematical model had a winning strategy by using Brouwer's fixed point theorem. In his 1938 book Applications aux Jeux de Hasard and earlier notes, Émile Borel conjectured the non-existence of mixed-strategy equilibria in two-person zero-sum games, a conjecture that was proved false. Game theory did not really exist as a unique field until John von Neumann published a paper in 1928.
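A mixed strategy of the kind Waldegrave found for le Her can be computed directly for a 2×2 zero-sum game with no saddle point, using the standard indifference condition. The payoff matrix below (matching pennies) and the function name are illustrative choices, not from the text:

```python
# Minimal sketch: mixed-strategy equilibrium of a 2x2 zero-sum game
# with no saddle point, via the indifference condition (the row player
# mixes so that both of the column player's choices pay the same).
# The example matrix, matching pennies, is an illustrative choice.

def row_mixed_strategy(a, b, c, d):
    """Payoffs to the row player: [[a, b], [c, d]], no saddle point.
    Return (p, value): probability of playing row 1, and the game value."""
    p = (d - c) / ((a - b) + (d - c))
    value = p * a + (1 - p) * c      # expected payoff against column 1
    return p, value

# Matching pennies: win 1 on a match, lose 1 on a mismatch.
p, v = row_mixed_strategy(1, -1, -1, 1)
# p = 0.5 (play each side half the time), v = 0.0 (a fair game)
```

Against this strategy, neither pure choice by the opponent does better than the game value, which is what makes the mixture an equilibrium.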
Game theory
–
An extensive form game
28.
Philosophy
–
Philosophy is the study of general and fundamental problems concerning matters such as existence, knowledge, values, reason, mind, and language. The term was probably coined by Pythagoras. Philosophical methods include questioning, critical discussion, rational argument and systematic presentation. Classic philosophical questions include: Is it possible to know anything and to prove it? However, philosophers might also pose more practical and concrete questions, such as: Is it better to be just or unjust? Historically, philosophy encompassed any body of knowledge; from the time of the Ancient Greek philosopher Aristotle to the 19th century, natural philosophy encompassed astronomy, medicine and physics. For example, Newton's 1687 Mathematical Principles of Natural Philosophy later became classified as a book of physics. In the 19th century, the growth of modern research universities led academic philosophy and other disciplines to professionalize and specialize. In the modern era, some investigations that were part of philosophy became separate academic disciplines, including psychology and sociology. Other investigations closely related to art, science, politics, or other pursuits remained part of philosophy. For example: Is beauty objective or subjective? Are there many scientific methods or just one? Is political utopia a hopeful dream or a hopeless fantasy? Major sub-fields of academic philosophy include metaphysics, epistemology, ethics, aesthetics, political philosophy, logic and philosophy of science. Since the 20th century, professional philosophers contribute to society primarily as professors, researchers and writers. Traditionally, the term philosophy referred to any body of knowledge; in this sense, philosophy is closely related to religion, mathematics, natural science and education. This division is not obsolete but has changed: natural philosophy has split into the various natural sciences, especially astronomy, physics, chemistry, biology and cosmology.
Moral philosophy has birthed the social sciences, but still includes value theory. Metaphysical philosophy has birthed formal sciences such as logic, mathematics and philosophy of science, but still includes epistemology, cosmology and others. Many philosophical debates that began in ancient times are still debated today; Colin McGinn and others claim that no philosophical progress has occurred during that interval, while Chalmers and others, by contrast, see progress in philosophy similar to that in science. In one general sense, philosophy is associated with wisdom, intellectual culture and a search for knowledge. In that sense, all cultures and literate societies ask philosophical questions, such as how we are to live, and a broad and impartial conception of philosophy then finds a reasoned inquiry into such matters as reality, morality and life in all world civilizations. Socrates was an influential philosopher who insisted that he possessed no wisdom but was a pursuer of wisdom.
Philosophy
–
René Descartes
Thomas Aquinas
Jeremy Bentham
Thomas Hobbes
29.
Complex systems
–
Complex systems present problems both in mathematical modelling and philosophical foundations. The subject is also called complex systems theory, complexity science, the study of complex systems, complex networks or network science. Such a systems approach is used in computer science, biology, economics, physics, chemistry and architecture, and a variety of abstract theoretical complex systems is studied as a field of mathematics. The key problems of complex systems are difficulties with their formal modelling and simulation. From such a perspective, in different research contexts complex systems are defined on the basis of their different attributes. Since all complex systems have many interconnected components, the science of networks and network theory are important and useful tools for the study of complex systems. A theory for the resilience of systems of systems, represented by a network of interdependent networks, was developed by Buldyrev et al. A consensus regarding a single universal definition of complex system does not yet exist. For systems that are less usefully represented with equations, various other kinds of narratives and methods are used. The study of complex system models is used for many scientific questions poorly suited to the traditional mechanistic conception provided by science. Linear systems represent the class of systems for which general techniques for stability control exist; however, many systems are inherently complex systems in terms of the definition above. This debate would notably lead economists, politicians and other parties to explore the question of computational complexity. Gregory Bateson played a key role in establishing the connection between anthropology and systems theory; he recognized that the interactive parts of cultures function much like ecosystems. The first research institute focused on complex systems, the Santa Fe Institute, was founded in 1984. Today, there are over 50 institutes and research centers focusing on complex systems.
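The network tools mentioned above start from a very plain representation: the system's interconnected components as an adjacency list, over which structural questions such as reachability can be asked. A minimal sketch; the example graph and helper name are illustrative assumptions:

```python
# Minimal sketch: a system's interconnected components as an
# adjacency list, with a breadth-first reachability check.
# The example graph and the helper name are illustrative.
from collections import deque

graph = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A"],
    "D": ["B"],
    "E": [],          # an isolated component
}

def reachable(graph, start):
    """Return the set of nodes reachable from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for neighbour in graph[node]:
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return seen

# reachable(graph, "A") -> {"A", "B", "C", "D"}; "E" is disconnected
```

Resilience studies of interdependent networks, such as the Buldyrev et al. work cited above, build on exactly this kind of connectivity question, asked repeatedly as nodes or links are removed.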
The traditional approach to dealing with complexity is to reduce or constrain it. Typically, this involves compartmentalisation: dividing a large system into separate parts. Organizations, for instance, divide their work into departments that each deal with separate issues, and engineering systems are often designed using modular components. However, modular designs become susceptible to failure when issues arise that bridge the divisions. As projects and acquisitions become increasingly complex, companies and governments are challenged to find effective ways to manage mega-acquisitions such as the Army Future Combat Systems. Acquisitions such as the FCS rely on a web of interrelated parts which interact unpredictably. Over the last decades, within the emerging field of complexity economics, new predictive tools have been developed to explain economic growth.
Complex systems
–
A Braitenberg simulation, programmed in breve, an artificial life simulator
A complex adaptive system model
A schematic representation of three types of mathematical models of complex systems with the level of their mechanistic understanding.
30.
Probability interpretations
–
The word probability has been used in a variety of ways since it was first applied to the mathematical study of games of chance. Does probability measure the real, physical tendency of something to occur, or is it a measure of how strongly one believes it will occur? In answering such questions, mathematicians interpret the probability values of probability theory. There are two broad categories of probability interpretations, which can be called physical and evidential probabilities. Physical probabilities, which are also called objective or frequency probabilities, are associated with random physical systems such as roulette wheels and rolling dice. In such systems, a given type of event tends to occur at a persistent rate, or relative frequency, in a long run of trials. Physical probabilities either explain, or are invoked to explain, these stable frequencies. The two main kinds of theory of physical probability are frequentist accounts and propensity accounts. On most accounts, evidential probabilities are considered to be degrees of belief. The four main evidential interpretations are the classical interpretation, the subjective interpretation, the epistemic or inductive interpretation and the logical interpretation. There are also interpretations of probability covering groups, which are often labelled as intersubjective. Some interpretations of probability are associated with approaches to statistical inference, including theories of estimation. The physical interpretation, for example, is taken by followers of frequentist statistical methods, such as Ronald Fisher and Jerzy Neyman. This article, however, focuses on the interpretations of probability rather than theories of statistical inference. The terminology of this topic is rather confusing, in part because probabilities are studied within a variety of academic fields. The word frequentist is especially tricky: to philosophers it refers to a particular theory of physical probability.
To scientists, on the other hand, frequentist probability is just another name for physical probability, and those who promote Bayesian inference view frequentist statistics as an approach to statistical inference that recognises only physical probabilities. It is unanimously agreed that statistics depends somehow on probability. But, as to what probability is and how it is connected with statistics, there has seldom been such complete disagreement and breakdown of communication since the Tower of Babel. Doubtless, much of the disagreement is merely terminological and would disappear under sufficiently sharp analysis. The philosophy of probability presents problems chiefly in matters of epistemology and the uneasy interface between mathematical concepts and ordinary language as it is used by non-mathematicians. Probability theory is an established field of study in mathematics. The classical definition of probability, the first attempt at mathematical rigour in the field, developed from studies of games of chance; it states that probability is shared equally between all the possible outcomes, provided these outcomes can be deemed equally likely.
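The classical and frequentist readings can be contrasted in a few lines of code. The classical definition counts equally likely outcomes, while a frequentist estimate records the observed relative frequency over many trials; the seed and trial count below are arbitrary choices:

```python
# Classical vs frequentist in miniature, for "rolling an odd number"
# on a fair die. Seed and trial count are arbitrary choices.
import random
from fractions import Fraction

# Classical: favourable outcomes over equally likely total outcomes.
outcomes = range(1, 7)
p_odd = Fraction(sum(1 for n in outcomes if n % 2 == 1), len(outcomes))

# Frequentist: simulate rolls and record the relative frequency.
random.seed(0)
trials = 100_000
freq = sum(random.randint(1, 6) % 2 for _ in range(trials)) / trials

# p_odd == Fraction(1, 2); freq settles near 0.5 as trials accumulate
```

The classical value is exact by counting; the frequentist value is an estimate that stabilises only in the long run, which is precisely the "persistent rate" that physical interpretations appeal to.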
Probability interpretations
–
The classical definition of probability works well for situations with only a finite number of equally-likely outcomes.
For frequentists, the probability of the ball landing in any pocket can be determined only by repeated trials in which the observed result converges to the underlying probability in the long run.
Gambling odds reflect the average bettor's 'degree of belief' in the outcome.