Martin Gardner

Martin Gardner was an American popular mathematics and popular science writer, with interests encompassing scientific skepticism, philosophy, and literature, especially the writings of Lewis Carroll, L. Frank Baum, and G. K. Chesterton. He is recognized as a leading authority on Lewis Carroll; The Annotated Alice, which incorporated the text of Carroll's two Alice books, was his most successful work and sold over a million copies. He had a lifelong interest in magic and illusion and was regarded as one of the most important magicians of the twentieth century. He was considered the doyen of American puzzlers and was a versatile author, publishing more than 100 books. Gardner was best known for creating and sustaining interest in recreational mathematics, and by extension mathematics in general, throughout the latter half of the 20th century, principally through his "Mathematical Games" columns, which appeared for twenty-five years in Scientific American, and through his subsequent books collecting them. Gardner was also one of the foremost anti-pseudoscience polemicists of the 20th century.

His 1957 book Fads and Fallacies in the Name of Science became a classic and seminal work of the skeptical movement. In 1976 he joined with fellow skeptics to found CSICOP, an organization promoting scientific inquiry and the use of reason in examining extraordinary claims. Gardner, the son of a petroleum geologist father and an educator and artist mother, grew up in and around Tulsa, Oklahoma. His lifelong interest in puzzles started in his boyhood, when his father gave him a copy of Sam Loyd's Cyclopedia of 5000 Puzzles and Conundrums. He attended the University of Chicago, where he earned his bachelor's degree in philosophy in 1936. Early jobs included reporter on the Tulsa Tribune, writer at the University of Chicago Office of Press Relations, and case worker in Chicago's Black Belt for the city's Relief Administration. During World War II, he served for four years in the U.S. Navy as a yeoman on board the destroyer escort USS Pope in the Atlantic; his ship was still in the Atlantic when the war came to an end with the surrender of Japan in August 1945.

After the war, Gardner returned to the University of Chicago, where he attended graduate school for a year. In 1950 he wrote an article in the Antioch Review entitled "The Hermit Scientist", one of his earliest articles about junk science; in 1952 a much-expanded version became his first published book, In the Name of Science: An Entertaining Survey of the High Priests and Cultists of Science, Past and Present. In the late 1940s, Gardner moved to New York City and became a writer and editor at Humpty Dumpty magazine, where for eight years he wrote features and stories for it and several other children's magazines. His paper-folding puzzles at that magazine led to his first work at Scientific American. For many decades, Gardner, his wife Charlotte, and their two sons, Jim and Tom, lived in Hastings-on-Hudson, New York, where he earned his living as a freelance author, publishing books with several different publishers as well as hundreds of magazine and newspaper articles. Appropriately enough, given his interest in logic and mathematics, they lived on Euclid Avenue.

The year 1960 saw the original edition of his best-selling book The Annotated Alice. In 1979, Gardner retired from Scientific American, and he and Charlotte moved to Hendersonville, North Carolina. Gardner never retired as an author: he continued to write math articles, sending them to The Mathematical Intelligencer, Math Horizons, The College Mathematics Journal, and Scientific American, and he revised some of his older books, such as Origami, Eleusis, and the Soma Cube. Charlotte died in 2000, and two years later Gardner returned to Norman, Oklahoma, where his son, James Gardner, was a professor of education at the University of Oklahoma. He died there on May 22, 2010. An autobiography, Undiluted Hocus-Pocus: The Autobiography of Martin Gardner, was published posthumously. Martin Gardner had a major impact on mathematics in the second half of the 20th century. His column was called "Mathematical Games", but it was much more than that: his writing introduced many readers to real mathematics for the first time in their lives.

The column lasted for 25 years and was read avidly by the generation of mathematicians and physicists who grew up in the years 1956 to 1981; it was the original inspiration for many of them to become scientists themselves. David Auerbach wrote: "A case can be made, in purely practical terms, for Martin Gardner as one of the most influential writers of the 20th century. His popularizations of science and mathematical games in Scientific American, over the 25 years he wrote for them, might have helped create more young mathematicians and computer scientists than any other single factor prior to the advent of the personal computer." Among the wide array of mathematicians, computer scientists, magicians, artists, and other influential thinkers who inspired and were in turn inspired by Gardner are John Horton Conway, Bill Gosper, Ron Rivest, Richard K. Guy, Piet Hein, Ronald Graham, Donald Knuth, Robert Nozick, Lee Sallows, Scott Kim, M. C. Escher, Douglas Hofstadter, Roger Penrose, Ian Stewart, David A. Klarner, Benoit Mandelbrot, Elwyn R. Berlekamp, Solomon W. Golomb, Raymond Smullyan, James Randi, Persi Diaconis, Penn & Teller, and Ray Hyman.

His admirers included such diverse people as W. H. Auden, Arthur C. Clarke, Carl Sagan, Isaac Asimov, Richard Dawkins, Stephen Jay Gould, and the entire French literary group known as the Oulipo. Salvador Dalí once sought him out to discuss four-dimensional hypercubes. Gardner wrote to M. C. Escher in 1961 to ask permission

Rock–paper–scissors

Rock–paper–scissors is a hand game played between two people, in which each player forms one of three shapes with an outstretched hand. These shapes are "rock", "paper", and "scissors". "Scissors" is identical to the two-fingered V sign except that it is pointed horizontally instead of being held upright in the air. A simultaneous, zero-sum game, it has only two possible outcomes: a draw, or a win for one player and a loss for the other. A player who decides to play rock will beat another player who has chosen scissors, but will lose to one who has played paper. If both players choose the same shape, the game is tied and is immediately replayed to break the tie. The game originated in China and spread with increased contact with East Asia, while developing different variants in signs over time. Other names for the game in the English-speaking world include roshambo and other orderings of the three items, with "rock" sometimes being called "stone". Rock–paper–scissors is used as a fair choosing method between two people, similar to coin flipping, drawing straws, or throwing dice, in order to settle a dispute or make an unbiased group decision.
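The cyclic win relation described above can be sketched in a few lines of code; the names used here are illustrative, not part of any standard library.

```python
# Each shape beats exactly one other shape: rock blunts scissors,
# scissors cut paper, paper covers rock.
BEATS = {"rock": "scissors", "scissors": "paper", "paper": "rock"}

def rps_winner(a: str, b: str) -> int:
    """Return 0 for a draw, 1 if the first throw wins, 2 if the second wins."""
    if a == b:
        return 0  # identical shapes: the game is tied and would be replayed
    return 1 if BEATS[a] == b else 2
```

For example, `rps_winner("rock", "scissors")` returns 1, while `rps_winner("rock", "paper")` returns 2.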

Unlike random selection methods, rock–paper–scissors can be played with a degree of skill by recognizing and exploiting non-random behavior in opponents. The players count aloud to three, or speak the name of the game, each time either raising one hand in a fist and swinging it down on the count or holding it behind their back; they then "throw" by extending their hand towards their opponent. Variations include a version where players use only three counts before throwing their gesture, or a version where they shake their hands three times before "throwing". The first known mention of the game was in the book Wuzazu by the Chinese Ming-dynasty writer Xie Zhaozhi, who wrote that the game dated back to the time of the Chinese Han dynasty. In the book, the game was called shoushiling. Li Rihua's book Note of Liuyanzhai also mentions this game, calling it shoushiling, huozhitou, or huoquan. Throughout Japanese history there are frequent references to sansukumi-ken, meaning ken (fist) games based on "the three who are afraid of one another".

This type of game originated in China before being imported to Japan and subsequently becoming popular among the Japanese. The earliest Japanese sansukumi-ken game was known as mushi-ken, imported directly from China. In mushi-ken the "frog" is superseded by the "slug", which, in turn, is superseded by the "snake", which is superseded by the "frog". Although this game was imported from China, the Japanese version differs in the animals represented: in adopting the game, the original Chinese characters for the poisonous centipede were confused with the characters for the slug. The most popular sansukumi-ken game in Japan was kitsune-ken. In the game, a supernatural fox called a kitsune defeats the village head, the village head defeats the hunter, and the hunter defeats the fox. Kitsune-ken, unlike mushi-ken or rock–paper–scissors, is played by making gestures with both hands. Today, the best-known sansukumi-ken is called jan-ken, a variation of the Chinese games introduced in the 17th century. Jan-ken uses the rock, paper, and scissors signs and is the game from which the modern version of rock–paper–scissors derives directly.

Hand games using gestures to represent the three conflicting elements of rock, paper, and scissors have been most common since the modern version of the game was created in the late 19th century, between the Edo and Meiji periods. By the early 20th century, rock–paper–scissors had spread beyond Asia through increased Japanese contact with the West. Its English-language name is taken from a translation of the names of the three Japanese hand gestures for rock, paper, and scissors; elsewhere in Asia the open-palm gesture represents "cloth" rather than "paper". The shape of the scissors is adopted from the Japanese style. In Britain in 1924 the game was described in a letter to The Times as a hand game of Mediterranean origin, called "zhot". A reader wrote in to say that the game "zhot" referred to was evidently jan-ken-pon, which she had seen played throughout Japan. Although at this date the game appears to have been new enough to British readers to need explaining, the appearance by 1927 of a popular thriller with the title Scissors Cut Paper, followed by Stone Blunts Scissors, suggests it quickly became popular.

In 1927 La Vie au patronage, a children's magazine in France, described the game in detail, referring to it as a "jeu japonais" ("Japanese game"). Its French name, "chi-fou-mi", is based on the Old Japanese words for "one, two, three". A 1932 New York Times article on the Tokyo rush hour described the rules of the game for the benefit of American readers, suggesting it was not at that time widely known in the U.S. The 1933 edition of the Compton's Pictured Encyclopedia described it as a common method of settling disputes between children in its article on Japan.

Extensive-form game

An extensive-form game is a specification of a game in game theory, allowing for the explicit representation of a number of key aspects: the sequencing of players' possible moves, their choices at every decision point, the information each player has about the other players' moves when making a decision, and the payoffs for all possible game outcomes. Extensive-form games also allow for the representation of incomplete information in the form of chance events modeled as "moves by nature". Some authors, particularly in introductory textbooks, define the extensive-form game as being just a game tree with payoffs, and add the other elements in subsequent chapters as refinements. While the rest of this article follows this gentle approach with motivating examples, we present upfront the definition of finite extensive-form games. This general definition was introduced by Harold W. Kuhn in 1953, who extended an earlier definition of von Neumann from 1928. Following the presentation from Hart, an n-player extensive-form game thus consists of the following: a finite set of n players; a rooted tree, called the game tree; for each terminal node of the game tree, an n-tuple of payoffs, meaning there is one payoff for each player at the end of every possible play; and a partition of the non-terminal nodes of the game tree into n+1 subsets, one for each player, with a special subset for a fictitious player called Chance.

Each player's subset of nodes is referred to as the "nodes of the player". Each node of the Chance player has a probability distribution over its outgoing edges. Each set of nodes of a rational player is further partitioned into information sets, which make certain choices indistinguishable for the player when making a move, in the sense that: there is a one-to-one correspondence between the outgoing edges of any two nodes of the same information set (thus the set of all outgoing edges of an information set is partitioned into equivalence classes, each class representing a possible choice for a player's move at some point); and every path in the tree from the root to a terminal node can cross each information set at most once. The complete description of the game specified by the above parameters is common knowledge among the players. A play is thus a path through the tree from the root to a terminal node. At any given non-terminal node belonging to Chance, an outgoing branch is chosen according to the probability distribution.
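The ingredients listed above can be collected into a small data structure. This is only an illustrative sketch; the class and field names are assumptions, not standard terminology.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class GameNode:
    player: Optional[int] = None   # deciding player's index; None for Chance or terminal
    chance: Optional[dict] = None  # Chance nodes: probability for each outgoing edge
    info_set: Optional[str] = None # nodes sharing a label are indistinguishable
    children: dict = field(default_factory=dict)  # edge label -> GameNode
    payoffs: tuple = ()            # n-tuple of payoffs, set only at terminal nodes

# A Chance node carries a probability distribution over its outgoing edges,
# and each terminal node carries one payoff per player.
toss = GameNode(chance={"heads": 0.5, "tails": 0.5},
                children={"heads": GameNode(payoffs=(1, 0)),
                          "tails": GameNode(payoffs=(0, 1))})
```

A play is then a path of edge labels from the root down to a node whose `children` dictionary is empty.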

At any rational player's node, the player must choose one of the equivalence classes for the edges, which determines exactly one outgoing edge at each node of the information set, except that the player does not know which node is actually being followed. A pure strategy for a player thus consists of a selection: choosing one class of outgoing edges for every information set. In a game of perfect information, the information sets are singletons. Finally, it is assumed that each player has a von Neumann–Morgenstern utility function defined for every game outcome. The above presentation, while defining the mathematical structure over which the game is played, elides the more technical discussion of formalizing statements about how the game is played, such as "a player cannot distinguish between nodes in the same information set when making a decision"; these can be made precise using epistemic modal logic. A perfect-information two-player game over a game tree can be represented as an extensive-form game with outcomes; examples of such games include tic-tac-toe and infinite chess.

A game over an expectiminimax tree, like that of backgammon, has no imperfect information but has moves of chance. Poker, by contrast, has both moves of chance and imperfect information. A complete extensive-form representation specifies: the players of a game; for every player, every opportunity they have to move; what each player can do at each of their moves; what each player knows for every move; and the payoffs received by every player for every possible combination of moves. The game on the right has two players: 1 and 2. The numbers by every non-terminal node indicate to which player that decision node belongs, the numbers by every terminal node represent the payoffs to the players, and the labels by every edge of the graph are the names of the actions. The initial node belongs to player 1. Play according to the tree is as follows: player 1 chooses between U and D, player 2 then chooses between U' and D', and the payoffs are as specified in the tree. There are four outcomes, represented by the four terminal nodes of the tree, each with its associated pair of payoffs. If player 1 plays D, player 2 will play U' to maximise their payoff, and so player 1 will only receive 1.

However, if player 1 plays U, player 2 maximises their payoff by playing D', and player 1 receives 2. Player 1 prefers 2 to 1 and so will play U, and player 2 will play D'.
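Backward induction on a tree of this shape can be sketched as follows. The player-2 payoffs used here are assumptions, chosen only to be consistent with the prose (player 1 receives 1 after D followed by U', and 2 after U followed by D'); they are not taken from the figure.

```python
def backward_induction(node):
    """Return (payoffs, actions) reached under optimal play from this node."""
    if node.get("children") is None:   # terminal node: nothing left to choose
        return node["payoffs"], []
    mover = node["player"]
    best = None
    for action, child in node["children"].items():
        payoffs, path = backward_induction(child)
        # The mover keeps whichever continuation maximises their own payoff.
        if best is None or payoffs[mover] > best[0][mover]:
            best = (payoffs, [action] + path)
    return best

# Hypothetical payoff tuples, ordered (player 1, player 2).
game = {"player": 0, "children": {
    "U": {"player": 1, "children": {
        "U'": {"children": None, "payoffs": (0, 0)},
        "D'": {"children": None, "payoffs": (2, 1)}}},
    "D": {"player": 1, "children": {
        "U'": {"children": None, "payoffs": (1, 2)},
        "D'": {"children": None, "payoffs": (3, 1)}}}}}
```

Here `backward_induction(game)` yields the payoffs (2, 1) via the path U then D', matching the reasoning above.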

Probability

Probability is the measure of the likelihood that an event will occur (see also the glossary of probability and statistics). Probability is quantified as a number between 0 and 1, where, loosely speaking, 0 indicates impossibility and 1 indicates certainty. The higher the probability of an event, the more likely it is that the event will occur. A simple example is the tossing of a fair coin: since the coin is fair, the two outcomes ("heads" and "tails") are equally probable. These concepts have been given an axiomatic mathematical formalization in probability theory, which is used in such areas of study as mathematics, finance, science, artificial intelligence/machine learning, computer science, game theory, and philosophy to, for example, draw inferences about the expected frequency of events. Probability theory is also used to describe the underlying mechanics and regularities of complex systems. When dealing with experiments that are random and well-defined in a purely theoretical setting, probabilities can be numerically described by the number of desired outcomes divided by the total number of all outcomes.

For example, tossing a fair coin twice will yield the outcomes "head-head", "head-tail", "tail-head", and "tail-tail", each equally likely. The probability of getting an outcome of "head-head" is 1 out of 4 outcomes, or, in numerical terms, 1/4, 0.25, or 25%. However, when it comes to practical application, there are two major competing categories of probability interpretations, whose adherents hold different views about the fundamental nature of probability. Objectivists assign numbers to describe some objective or physical state of affairs. The most popular version of objective probability is frequentist probability, which claims that the probability of a random event denotes the relative frequency of occurrence of an experiment's outcome when repeating the experiment; this interpretation considers probability to be the relative frequency "in the long run" of outcomes. A modification of this is propensity probability, which interprets probability as the tendency of some experiment to yield a certain outcome, even if it is performed only once.
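The two-toss computation above can be checked by brute enumeration of the equally likely outcomes:

```python
from itertools import product

# All equally likely outcomes of tossing a fair coin twice: HH, HT, TH, TT.
outcomes = list(product("HT", repeat=2))
p_head_head = sum(o == ("H", "H") for o in outcomes) / len(outcomes)
print(len(outcomes), p_head_head)  # 4 outcomes, probability 0.25
```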

Subjectivists assign numbers per subjective probability, that is, as a degree of belief. The degree of belief has been interpreted as "the price at which you would buy or sell a bet that pays 1 unit of utility if E, 0 if not E." The most popular version of subjective probability is Bayesian probability, which includes expert knowledge as well as experimental data to produce probabilities. The expert knowledge is represented by some prior probability distribution, and the experimental data are incorporated in a likelihood function. The product of the prior and the likelihood results in a posterior probability distribution that incorporates all the information known to date. By Aumann's agreement theorem, Bayesian agents whose prior beliefs are similar will end up with similar posterior beliefs; however, sufficiently different priors can lead to different conclusions regardless of how much information the agents share.

The word probability derives from the Latin probabilitas, which can also mean "probity", a measure of the authority of a witness in a legal case in Europe, often correlated with the witness's nobility.
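The prior-times-likelihood update described above can be illustrated with a toy discrete example; the two hypotheses and their numbers here are invented purely for illustration.

```python
# Two hypotheses about a coin, held with equal prior belief.
prior = {"fair": 0.5, "biased": 0.5}
# Likelihood of observing one head under each hypothesis.
likelihood = {"fair": 0.5, "biased": 0.8}

# Posterior is proportional to prior times likelihood, normalised to sum to 1.
unnormalised = {h: prior[h] * likelihood[h] for h in prior}
total = sum(unnormalised.values())
posterior = {h: p / total for h, p in unnormalised.items()}
```

After one observed head, belief shifts toward the "biased" hypothesis, since 0.5 × 0.8 outweighs 0.5 × 0.5 before normalisation.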

In a sense, this differs much from the modern meaning of probability, which, in contrast, is a measure of the weight of empirical evidence and is arrived at from inductive reasoning and statistical inference. The scientific study of probability is a modern development of mathematics. Gambling shows that there has been an interest in quantifying the ideas of probability for millennia, but exact mathematical descriptions arose much later. There are reasons for the slow development of the mathematics of probability: whereas games of chance provided the impetus for the mathematical study of probability, fundamental issues are still obscured by the superstitions of gamblers. According to Richard Jeffrey, "Before the middle of the seventeenth century, the term 'probable' meant approvable, and was applied in that sense, unequivocally, to opinion and to action. A probable action or opinion was one such as sensible people would undertake or hold, in the circumstances." However, in legal contexts especially, 'probable' could also apply to propositions for which there was good evidence.

The sixteenth-century Italian polymath Gerolamo Cardano demonstrated the efficacy of defining odds as the ratio of favourable to unfavourable outcomes. Aside from this elementary work by Cardano, the doctrine of probabilities dates to the correspondence of Pierre de Fermat and Blaise Pascal. Christiaan Huygens gave the earliest known scientific treatment of the subject. Jakob Bernoulli's Ars Conjectandi and Abraham de Moivre's Doctrine of Chances treated the subject as a branch of mathematics. See Ian Hacking's The Emergence of Probability and James Franklin's The Science of Conjecture for histories of the early development of the concept of mathematical probability. The theory of errors may be traced back to Roger Cotes's Opera Miscellanea, but a memoir prepared by Thomas Simpson in 1755 first applied the theory to the discussion of errors of observation. The reprint of this memoir lays down the axioms that positive and negative errors are equally probable, and that certain assignable limits define the range of all errors.

Simpson discusses c

Tit for tat

Tit for tat is an English saying meaning "equivalent retaliation". It is also a highly effective strategy in game theory for the iterated prisoner's dilemma. The strategy was first introduced by Anatol Rapoport in Robert Axelrod's two tournaments, held around 1980; notably, it was both the simplest strategy entered and the most successful in direct competition. The phrase came from an earlier phrase, "tip for tap", first used in 1558. An agent using this strategy will first cooperate, then subsequently replicate the opponent's previous action: if the opponent was cooperative, the agent is cooperative; if not, the agent is not. This is similar to reciprocal altruism in biology. The success of the tit-for-tat strategy, which is cooperative despite its name emphasizing an adversarial nature, took many by surprise. Arrayed against strategies produced by various teams, it won in two competitions. After the first competition, new strategies formulated to combat tit-for-tat failed due to their negative interactions with each other. This result may give insight into how groups of animals have come to live in cooperative societies, rather than the individualistic "red in tooth and claw" way that might be expected from individuals engaged in a Hobbesian state of nature.

This, and its application to human society and politics, is the subject of Robert Axelrod's book The Evolution of Cooperation. Moreover, the tit-for-tat strategy has been of beneficial use to social psychologists and sociologists in studying effective techniques to reduce conflict. Research has indicated that when individuals who have been in competition for a period of time no longer trust one another, the most effective competition reverser is the use of the tit-for-tat strategy. Individuals engage in behavioral assimilation, a process in which they tend to match their own behaviors to those displayed by cooperating or competing group members. Therefore, if the tit-for-tat strategy begins with cooperation, cooperation ensues. On the other hand, if the other party competes, the tit-for-tat strategy will lead the alternate party to compete as well: each action by the other member is countered with a matching response, competition with competition and cooperation with cooperation. In the case of conflict resolution, the tit-for-tat strategy is effective for several reasons: the technique is recognized as clear, nice, provocable, and forgiving.

First, it is a recognizable strategy: those using it recognize its contingencies and adjust their behavior accordingly. It is considered to be nice, as it begins with cooperation and only defects following a competitive move. The strategy is provocable because it provides immediate retaliation against those who compete, and it is forgiving, as it returns to cooperation should the competitor make a cooperative move. The implications of the tit-for-tat strategy have been of relevance to conflict research and many aspects of applied social science.

Take for example the following infinitely repeated prisoner's dilemma game with discount factor δ. The tit-for-tat strategy copies what the other player chose in the previous round. If both players cooperate in every round, they cooperate forever, which gives the payoff

6 + 6δ + 6δ² + 6δ³ + ...

By the geometric sum rule this simplifies to

6 + 6δ + 6δ² + 6δ³ + ... = 6/(1 − δ).

If a player deviates to defecting, the next round they get punished, and play then alternates between outcomes where player 1 cooperates and player 2 deviates, and vice versa. Deviation gives the payoff

9 + 2δ + 9δ² + 2δ³ + 9δ⁴ + 2δ⁵ + ...

This sum can be split as (9 + 9δ² + 9δ⁴ + ...) + (2δ + 2δ³ + 2δ⁵ + ...). Applying the geometric sum rule to each part gives 9/(1 − δ²) and 2δ/(1 − δ²), so the deviation payoff is (9 + 2δ)/(1 − δ²). Cooperation is therefore preferred whenever 6/(1 − δ) ≥ (9 + 2δ)/(1 − δ²), that is, whenever 6(1 + δ) ≥ 9 + 2δ, which holds exactly when δ ≥ 3/4.
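These discounted sums can be checked numerically; the payoff stream (6 every round under mutual cooperation, alternating 9 and 2 after a deviation) follows the worked example above.

```python
def discounted(cycle, delta, rounds=10_000):
    """Discounted sum of a payoff stream that repeats the given cycle."""
    return sum(cycle[t % len(cycle)] * delta**t for t in range(rounds))

delta = 0.9
cooperate = discounted([6], delta)    # matches 6 / (1 - delta)
deviate = discounted([9, 2], delta)   # matches (9 + 2*delta) / (1 - delta**2)
# With delta = 0.9, which exceeds 3/4, cooperating forever beats deviating.
print(round(cooperate, 3), round(deviate, 3))
```

Truncating the infinite sum at 10,000 rounds is harmless here, since δ^10000 is vanishingly small for any δ < 1.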

Sequential game

In game theory, a sequential game is a game where one player chooses their action before the others choose theirs. The later players must have some information about the first player's choice; otherwise, the difference in time would have no strategic effect. Sequential games are hence governed by the time axis and are represented in the form of decision trees. Unlike sequential games, simultaneous games have no time axis: players choose their moves without being sure of the others', and such games are represented in the form of payoff matrices. Extensive-form representations are used for sequential games, since they explicitly illustrate the sequential aspects of a game. Combinatorial games are sequential games. Games such as chess, infinite chess, tic-tac-toe, and Go are examples of sequential games. The size of the decision tree varies with game complexity, ranging from the small game tree of tic-tac-toe to the immensely complex game tree of chess, so large that computers cannot map it completely. In sequential games with perfect information, a subgame perfect equilibrium can be found by backward induction.

See also: Simultaneous game, Subgame perfection, Sequential auction

Preference (economics)

In economics and other social sciences, preference is the ordering of alternatives based on their relative utility, a process which results in an optimal "choice". The character of individual preferences is determined purely by taste factors, independent of considerations of prices, income, or the availability of goods. With the help of the scientific method, many practical decisions of life can be modelled, resulting in testable predictions about human behavior. Although economists are usually not interested in the underlying causes of the preferences themselves, they are interested in the theory of choice because it serves as a background for empirical demand analysis. In 1926 Ragnar Frisch developed for the first time a mathematical model of preferences in the context of economic demand and utility functions. Up to then, economists had developed an elaborate theory of demand that omitted primitive characteristics of people. This omission ceased when, at the end of the 19th and the beginning of the 20th century, logical positivism predicated the need for theoretical concepts to be related to observables.

Whereas economists in the 18th and 19th centuries felt comfortable theorizing about utility, with the advent of logical positivism in the 20th century, they felt that it needed more of an empirical structure. Because binary choices are directly observable, they instantly appealed to economists; the search for observables in microeconomics is taken even further by revealed preference theory. Since the pioneering efforts of Frisch in the 1920s, one of the major issues which has pervaded the theory of preferences is the representability of a preference structure with a real-valued function. This has been achieved by mapping it to the mathematical index called utility. Von Neumann and Morgenstern's 1944 book Theory of Games and Economic Behavior treated preferences as a formal relation whose properties can be stated axiomatically. This type of axiomatic handling of preferences soon began to influence other economists: Marschak adopted it by 1950, Houthakker employed it in a 1950 paper, and Kenneth Arrow perfected it in his 1951 book Social Choice and Individual Values.

Gérard Debreu, influenced by the ideas of the Bourbaki group, championed the axiomatization of consumer theory in the 1950s, and the tools he borrowed from the mathematical field of binary relations have become mainstream since then. Though the economics of choice can be examined either at the level of utility functions or at the level of preferences, moving from one to the other can be useful. For example, shifting the conceptual basis from an abstract preference relation to an abstract utility scale results in a new mathematical framework, allowing new kinds of conditions on the structure of preference to be formulated and investigated. Another historical turning point can be traced back to 1895, when Georg Cantor proved that if a binary relation is linearly ordered, it is isomorphically embeddable in the ordered real numbers. This notion would become influential for the theory of preferences in economics: by the 1940s prominent authors such as Paul Samuelson would theorize about people having weakly ordered preferences.

Suppose the set of all states of the world is X and an agent has a preference relation on X. It is common to mark the weak preference relation by ⪯, so that x ⪯ y means "the agent wants y at least as much as x" or "the agent weakly prefers y to x". The symbol ∼ is used as a shorthand for the indifference relation: x ∼ y ⟺ (x ⪯ y and y ⪯ x), which reads "the agent is indifferent between y and x". The symbol ≺ is used as a shorthand for the strong preference relation: x ≺ y ⟺ (x ⪯ y and not y ⪯ x), which reads "the agent strictly prefers y to x". In everyday speech, the statement "x is preferred to y" is understood to mean that someone chooses x over y. However, decision theory rests on more precise definitions of preferences, given that there are many experimental conditions influencing people's choices in many directions. Suppose a person is confronted with a mental experiment that she must solve with the aid of introspection: she is offered apples and oranges and is asked to verbally choose one of the two. A decision scientist observing this single event would be inclined to say that whichever is chosen is the preferred alternative.
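The derived relations ∼ and ≺ can be spelled out directly from the weak relation; the three-element example used here is invented for illustration.

```python
# Weak preference as a set of pairs: (a, b) in weak means a ⪯ b.
weak = {("x", "x"), ("y", "y"), ("z", "z"),
        ("x", "y"), ("y", "x"),   # the agent is indifferent between x and y
        ("x", "z"), ("y", "z")}   # z is weakly preferred to both x and y

def indifferent(a, b):
    """a ~ b: a ⪯ b and b ⪯ a both hold."""
    return (a, b) in weak and (b, a) in weak

def strict_pref(a, b):
    """a ≺ b: a ⪯ b holds but b ⪯ a does not (the agent strictly prefers b)."""
    return (a, b) in weak and (b, a) not in weak
```

Here `indifferent("x", "y")` is True, and `strict_pref("x", "z")` is True, meaning the agent strictly prefers z to x.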

Under several repetitions of this experiment, if the scientist observes that apples are chosen 51% of the time, it would mean that x ≻ y. If oranges are chosen half of the time, then x ∼ y. If she chooses oranges 51% of the time, it means that y ≻ x. Preference is here being identified with a greater frequency of choice. This experiment implicitly assumes that a choice is made on every repetition; otherwise, out of 100 repetitions, some of them will give as a result that neither apples nor oranges is chosen, and these few cases of uncertainty will ruin any preference information resulting from the frequency attributes of the other valid cases. However, this example was used