OCLC Online Computer Library Center, Incorporated d/b/a OCLC is an American nonprofit cooperative organization "dedicated to the public purposes of furthering access to the world's information and reducing information costs". It was founded in 1967 as the Ohio College Library Center. OCLC and its member libraries cooperatively produce and maintain WorldCat, the largest online public access catalog in the world. OCLC is funded mainly by the fees that libraries pay for its services, and it maintains the Dewey Decimal Classification system. OCLC began in 1967, as the Ohio College Library Center, through a collaboration of university presidents, vice presidents and library directors who wanted to create a cooperative computerized network for libraries in the state of Ohio. The group first met on July 5, 1967 on the campus of the Ohio State University to sign the articles of incorporation for the nonprofit organization, and hired Frederick G. Kilgour, a former Yale University medical school librarian, to design the shared cataloging system.
Kilgour wished to merge the latest information storage and retrieval system of the time, the computer, with the oldest, the library. The plan was to merge the catalogs of Ohio libraries electronically through a computer network and database in order to streamline operations, control costs and increase efficiency in library management, bringing libraries together to cooperatively keep track of the world's information so as to best serve researchers and scholars. The first library to do online cataloging through OCLC was the Alden Library at Ohio University, on August 26, 1971; this was the first online cataloging by any library worldwide. Membership in OCLC is based on use of services and contribution of data. Between 1967 and 1977, OCLC membership was limited to institutions in Ohio, but in 1978 a new governance structure was established that allowed institutions from other states to join. In 2002, the governance structure was again modified to accommodate participation from outside the United States.
As OCLC expanded services in the United States outside Ohio, it relied on establishing strategic partnerships with "networks", organizations that provided training and marketing services. By 2008, there were 15 independent United States regional service providers. OCLC networks played a key role in OCLC governance, with networks electing delegates to serve on the OCLC Members Council. During 2008, OCLC commissioned two studies to look at distribution channels. In early 2009, OCLC negotiated new contracts with the former networks and opened a centralized support center. OCLC provides bibliographic and full-text information to anyone. OCLC and its member libraries cooperatively produce and maintain WorldCat—the OCLC Online Union Catalog, the largest online public access catalog in the world. WorldCat has holding records from public and private libraries worldwide. The Open WorldCat program, launched in late 2003, exposed a subset of WorldCat records to Web users via popular Internet search and bookselling sites.
In October 2005, the OCLC technical staff began a wiki project, WikiD, allowing readers to add commentary and structured-field information associated with any WorldCat record; WikiD was later phased out. OCLC acquired the trademark and copyrights associated with the Dewey Decimal Classification System when it bought Forest Press in 1988. A browser for books with their Dewey Decimal Classifications was available until July 2013. Until August 2009, when it was sold to Backstage Library Works, OCLC owned a preservation microfilm and digitization operation called the OCLC Preservation Service Center, with its principal office in Bethlehem, Pennsylvania. The reference management service QuestionPoint provides libraries with tools to communicate with users; this around-the-clock reference service is provided by a cooperative of participating global libraries. Starting in 1971, OCLC produced catalog cards for members alongside its shared online catalog. OCLC commercially sells software, such as CONTENTdm for managing digital collections.
It offers the bibliographic discovery system WorldCat Discovery, which allows library patrons to use a single search interface to access an institution's catalog, database subscriptions and more. OCLC has been conducting research for the library community for more than 30 years. In accordance with its mission, OCLC makes its research outcomes known through various publications; these publications, including journal articles, reports and presentations, are available through the organization's website. OCLC Publications – Research articles from various journals including Code4Lib Journal, OCLC Research, Reference & User Services Quarterly, College & Research Libraries News, Art Libraries Journal and National Education Association Newsletter. The most recent publications are displayed first, and all archived resources, dating back to 1970, are available. Membership Reports – A number of significant reports on topics ranging from virtual reference in libraries to perceptions about library funding. Newsletters – Current and archived newsletters for the library and archive community.
Presentations – Presentations from both guest speakers and OCLC research staff at conferences and other events. The presentations are organized into five categories: conference presentations, Dewey presentations, Distinguished Seminar Series, guest presentations and research staff presentations.
In game theory, the centipede game, first introduced by Robert Rosenthal in 1981, is an extensive-form game in which two players take turns choosing either to take a larger share of an increasing pot or to pass the pot to the other player. The payoffs are arranged so that if one passes the pot to one's opponent and the opponent takes the pot on the next round, one receives less than if one had taken the pot on this round. Although the traditional centipede game had a limit of 100 rounds, any game with this structure but a different number of rounds is also called a centipede game. The unique subgame perfect equilibrium of these games indicates that the first player should take the pot on the first round of the game. These results are taken to show that subgame perfect equilibria and Nash equilibria fail to predict human play in some circumstances. The centipede game is used in introductory game theory courses and texts to highlight the concept of backward induction and the iterated elimination of dominated strategies, which show a standard way of providing a solution to the game.
One possible version of a centipede game could be played as follows. Consider two players: Alice and Bob. Alice moves first. At the start of the game, Alice has two piles of coins in front of her: one pile contains 4 coins and the other pile contains 1 coin. Each player has two moves available: either "take" the larger pile of coins and give the smaller pile to the other player, or "push" both piles across the table to the other player. Each time the piles of coins pass across the table, the quantity of coins in each pile doubles. For example, assume that Alice chooses to "push" the piles on her first move, handing the piles of 1 and 4 coins over to Bob and doubling them to 2 and 8. Bob could now use his first move either to "take" the pile of 8 coins and give 2 coins to Alice, or to "push" the two piles back across the table to Alice, again increasing the size of the piles, to 4 and 16 coins. The game continues for a fixed number of rounds or until a player decides to end the game by pocketing a pile of coins.
The addition of coins is taken to be an externality. Standard game-theoretic tools predict that the first player will defect on the first round, taking the pile of coins for himself. In the centipede game, a pure strategy consists of a set of actions, and a mixed strategy is a probability distribution over the possible pure strategies. There are several pure-strategy Nash equilibria of the centipede game and infinitely many mixed-strategy Nash equilibria. However, there is only one subgame perfect equilibrium. In the unique subgame perfect equilibrium, each player chooses to defect at every opportunity. This, of course, means defection at the first stage. In the Nash equilibria, by contrast, the actions that would be taken after the initial choice opportunities may be cooperative. Defection by the first player is the unique subgame perfect equilibrium and is required by any Nash equilibrium; it can be established by backward induction. Suppose two players reach the final round of the game; there, the second player does better by defecting and taking the larger share of the pot. Since we suppose the second player will defect, the first player does better by defecting in the second-to-last round, taking a higher payoff than she would have received by allowing the second player to defect in the last round.
But knowing this, the second player ought to defect in the third-to-last round, taking a higher payoff than he would have received by allowing the first player to defect in the second-to-last round. This reasoning proceeds backwards through the game tree until one concludes that the best action is for the first player to defect in the first round; the same reasoning can apply at any node in the game tree. For a game that ends after four rounds, this reasoning proceeds as follows. If we were to reach the last round of the game, Player 2 would do better by choosing d instead of r, receiving 4 coins instead of 3. However, given that 2 will choose d, 1 should choose D in the second-to-last round, receiving 3 instead of 2. Given that 1 would choose D in the second-to-last round, 2 should choose d in the third-to-last round, receiving 2 instead of 1. But given this, Player 1 should choose D in the first round, receiving 1 instead of 0. There are a large number of Nash equilibria in a centipede game, but in each, the first player defects on the first round and the second player defects in the next round frequently enough to dissuade the first player from passing.
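The backward-induction argument can be checked mechanically for the coin-pile version described earlier (piles of 4 and 1, doubling on every "push"). The sketch below is a minimal Python model under those assumptions; the function name, the four-round horizon, and the tuple encoding are illustrative choices, not part of the original description:

```python
# Backward induction on the coin-pile centipede game: piles start at
# (4, 1) and double each time they are pushed across the table.
# mover is 0 for Alice (moves first) and 1 for Bob.

def best_play(big, small, mover, rounds_left):
    """Return (alice_payoff, bob_payoff, mover's_action) under optimal play."""
    # "take": the mover takes the larger pile, the other gets the smaller.
    take = (big, small) if mover == 0 else (small, big)
    if rounds_left == 1:
        return take + ("take",)
    # "push": both piles double and the other player moves next.
    pushed = best_play(big * 2, small * 2, 1 - mover, rounds_left - 1)
    push = pushed[:2]
    # The mover compares his/her own payoff under "take" vs "push".
    if take[mover] >= push[mover]:
        return take + ("take",)
    return push + ("push",)

print(best_play(4, 1, 0, 4))  # (4, 1, 'take'): Alice takes on round one
```

Even though pushing four times would grow the piles to 64 and 16 coins, the recursion reproduces the subgame perfect equilibrium: each player would take at every node, so Alice takes immediately.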
Being in a Nash equilibrium does not require that strategies be rational at every point in the game, as in the subgame perfect equilibrium. This means that strategies that are cooperative in the never-reached rounds of the game could still be in a Nash equilibrium. In the example above, one Nash equilibrium is for both players to defect on each round. Another Nash equilibrium is for player 1 to defect on the first round but pass on the third round, and for player 2 to defect at any opportunity. Several studies have demonstrated that Nash equilibrium play is rarely observed. Instead, subjects show partial cooperation, playing "R" for several moves before choosing "D"; it is rare f
Reinhard Justus Reginald Selten was a German economist who won the 1994 Nobel Memorial Prize in Economic Sciences. He is well known for his work in bounded rationality and can be considered one of the founding fathers of experimental economics. Selten was born in Breslau in Lower Silesia, now in Poland, to a Jewish father, Adolf Selten, and a Protestant mother, Käthe Luther. Reinhard Selten was raised as a Protestant. After a brief family exile in Saxony and Austria, Selten returned to Hesse, Germany after the war and, in high school, read an article in Fortune magazine about game theory by the business writer John D. McDonald. He recalled that he would occupy his "mind with problems of elementary geometry and algebra" while walking back and forth to school during that time. He studied mathematics at Goethe University Frankfurt and obtained his diploma in 1957; he then worked as a scientific assistant to Heinz Sauermann until 1967. In 1959, he married Elisabeth Lang Reiner; they had no children. In 1961, he received his doctorate in mathematics in Frankfurt with a thesis on the evaluation of n-person games.
He was a visiting professor at Berkeley and taught from 1969 to 1972 at the Free University of Berlin and, from 1972 to 1984, at the University of Bielefeld. He then accepted a professorship at the University of Bonn, where he built the BonnEconLab, a laboratory for experimental economic research, in which he remained active after his retirement. Selten was professor emeritus at the University of Bonn and held several honorary doctoral degrees. He met his wife through the Esperanto movement and was a co-founder of the International Academy of Sciences San Marino. For the 2009 European Parliament election, he was the top candidate for the German wing of Europe – Democracy – Esperanto. For his work in game theory, Selten won the 1994 Nobel Memorial Prize in Economic Sciences; he was Germany's first and, at the time of his death, only Nobel winner for economics. With Gerd Gigerenzer he edited the book Bounded Rationality: The Adaptive Toolbox.
He developed an example of a game called Selten's Horse because of the shape of its extensive form representation. His last work was "Impulse Balance Theory and its Extension by an Additional Criterion". He is noted for publishing in non-refereed journals to avoid being forced to make unwanted changes to his work. His principal works include:
Preispolitik der Mehrproduktenunternehmung in der statischen Theorie (in German; "Pricing Policy of the Multiproduct Firm in Static Theory"), Berlin-Heidelberg-New York: Springer-Verlag.
General Equilibrium with Price-Making Firms, Lecture Notes in Economics and Mathematical Systems, Berlin-Heidelberg-New York: Springer-Verlag.
A General Theory of Equilibrium Selection in Games, Cambridge, Massachusetts: MIT Press.
Models of Strategic Rationality, Theory and Decision Library, Series C: Game Theory, Mathematical Programming and Operations Research, Dordrecht-Boston-London: Kluwer Academic Publishers.
Enkonduko en la Teorion de Lingvaj Ludoj – Ĉu mi lernu Esperanton? (in Esperanto; "Introduction to the Theory of Language Games – Should I Learn Esperanto?"), Berlin-Paderborn: Akademia Libroservo, Institut für Kybernetik.
Game Theory and Economic Behavior: Selected Essays, 2 vols., Cheltenham-Northampton: Edward Elgar Publishing.
Models of Strategic Rationality, new edition with a Chinese introduction, in the series Outstanding Academic Works on Economics by Nobel Prize Winners, Dordrecht-Boston-London: Kluwer Academic Publishers.
Models of Strategic Rationality, Chinese translation, in the same series, Dordrecht-Boston-London: Kluwer Academic Publishers.
A General Theory of Equilibrium Selection in Games, Russian translation, Cambridge, Massachusetts: MIT Press.
Gigerenzer, G. & Selten, R., Bounded Rationality: The Adaptive Toolbox, Cambridge, Massachusetts: MIT Press.
Impulse Balance Theory and its Extension by an Additional Criterion, BoD.
An extensive-form game is a specification of a game in game theory, allowing for the explicit representation of a number of key aspects: the sequencing of players' possible moves, their choices at every decision point, the information each player has about the other player's moves when they make a decision, and their payoffs for all possible game outcomes. Extensive-form games also allow for the representation of incomplete information in the form of chance events modeled as "moves by nature". Some authors in introductory textbooks define the extensive-form game as being just a game tree with payoffs, and add the other elements in subsequent chapters as refinements. Whereas the rest of this article follows this gentle approach with motivating examples, we present upfront the finite extensive-form games as ultimately constructed here. This general definition was introduced by Harold W. Kuhn in 1953, who extended an earlier definition of von Neumann from 1928. Following the presentation from Hart, an n-player extensive-form game thus consists of the following:
- A finite set of n players.
- A rooted tree, called the game tree.
- An n-tuple of payoffs at each terminal node of the game tree, meaning there is one payoff for each player at the end of every possible play.
- A partition of the non-terminal nodes of the game tree into n+1 subsets, one for each player, with a special subset for a fictitious player called Chance.
Each player's subset of nodes is referred to as the "nodes of the player". Each node of the Chance player has a probability distribution over its outgoing edges. Each set of nodes of a rational player is further partitioned into information sets, which make certain choices indistinguishable for the player when making a move, in the sense that:
- there is a one-to-one correspondence between the outgoing edges of any two nodes of the same information set—thus the set of all outgoing edges of an information set is partitioned into equivalence classes, each class representing a possible choice for a player's move at some point—and
- every path in the tree from the root to a terminal node can cross each information set at most once.
The complete description of the game specified by the above parameters is common knowledge among the players. A play is thus a path through the tree from the root to a terminal node. At any given non-terminal node belonging to Chance, an outgoing branch is chosen according to the probability distribution.
At any rational player's node, the player must choose one of the equivalence classes of edges, which determines exactly one outgoing edge at each node of the information set, although the player does not know which of those nodes is actually being played. A pure strategy for a player thus consists of a selection—choosing one class of outgoing edges for every information set. In a game of perfect information, the information sets are singletons. Less evidently, it is assumed that each player has a von Neumann–Morgenstern utility function defined for every game outcome. The above presentation, while defining the mathematical structure over which the game is played, elides the more technical discussion of formalizing statements about how the game is played, such as "a player cannot distinguish between nodes in the same information set when making a decision"; these can be made precise using epistemic modal logic. A perfect-information two-player game over a game tree can be represented as an extensive-form game with outcomes. Examples of such games include tic-tac-toe and infinite chess.
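The definition of a pure strategy as one choice per information set can be made concrete. In the Python sketch below, the information-set names and action labels are invented for illustration: a player who faces two information sets, offering two and three indistinguishable-node choices respectively, has 2 × 3 = 6 pure strategies.

```python
from itertools import product

# Two information sets for one player. Nodes inside a set are
# indistinguishable to the player, so a pure strategy must fix ONE
# action (equivalence class of outgoing edges) per information set.
info_sets = {
    "I1": ["a", "b"],        # classes of outgoing edges at set I1
    "I2": ["x", "y", "z"],   # classes of outgoing edges at set I2
}

# A pure strategy selects one class of outgoing edges for every
# information set: the Cartesian product enumerates them all.
pure_strategies = [dict(zip(info_sets, combo))
                   for combo in product(*info_sets.values())]

print(len(pure_strategies))   # 6
print(pure_strategies[0])     # {'I1': 'a', 'I2': 'x'}
```

A mixed strategy would then be a probability distribution over these six dictionaries.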
A game over an expectiminimax tree, like that of backgammon, has no imperfect information but has moves of chance. Poker, by contrast, has both moves of chance and imperfect information. A complete extensive-form representation specifies:
- the players of a game,
- for every player, every opportunity they have to move,
- what each player can do at each of their moves,
- what each player knows for every move, and
- the payoffs received by every player for every possible combination of moves.
The game on the right has two players: 1 and 2. The numbers by every non-terminal node indicate to which player that decision node belongs. The numbers by every terminal node represent the payoffs to the players; the labels by every edge of the graph are the name of the action. The initial node belongs to player 1. Play according to the tree is as follows: player 1 chooses between U and D, then player 2 chooses between U' and D'; the payoffs are as specified in the tree. There are four outcomes, represented by the four terminal nodes of the tree, each with its associated pair of payoffs. If player 1 plays D, player 2 will play U' to maximise their payoff, and so player 1 will only receive 1.
However, if player 1 plays U, player 2 maximises their payoff by playing D' and player 1 receives 2. Player 1 prefers 2 to 1 and s
Game theory is the study of mathematical models of strategic interaction between rational decision-makers. It has applications in all fields of social science, as well as in computer science. Originally, it addressed zero-sum games, in which one person's gains result in losses for the other participants. Today, game theory applies to a wide range of behavioral relations and is now an umbrella term for the science of logical decision making in humans and computers. Modern game theory began with the idea regarding the existence of mixed-strategy equilibria in two-person zero-sum games and its proof by John von Neumann. Von Neumann's original proof used the Brouwer fixed-point theorem on continuous mappings into compact convex sets, which became a standard method in game theory and mathematical economics. His paper was followed by the 1944 book Theory of Games and Economic Behavior, co-written with Oskar Morgenstern, which considered cooperative games of several players. The second edition of this book provided an axiomatic theory of expected utility, which allowed mathematical statisticians and economists to treat decision-making under uncertainty.
Game theory was developed extensively in the 1950s by many scholars. It was explicitly applied to biology in the 1970s, although similar developments go back at least as far as the 1930s. Game theory has been recognized as an important tool in many fields. As of 2014, with the Nobel Memorial Prize in Economic Sciences going to game theorist Jean Tirole, eleven game theorists had won the economics Nobel Prize. John Maynard Smith was awarded the Crafoord Prize for his application of game theory to biology. Early discussions of examples of two-person games occurred long before the rise of modern, mathematical game theory. The first known discussion of game theory occurred in a letter written in 1713 by Charles Waldegrave, an active Jacobite and uncle to James Waldegrave, a British diplomat. In this letter, Waldegrave provides a minimax mixed-strategy solution to a two-person version of the card game le Her; the problem is now known as the Waldegrave problem. In his 1838 Recherches sur les principes mathématiques de la théorie des richesses (Researches into the Mathematical Principles of the Theory of Wealth), Antoine Augustin Cournot considered a duopoly and presented a solution that is a restricted version of the Nash equilibrium.
In 1913, Ernst Zermelo published Über eine Anwendung der Mengenlehre auf die Theorie des Schachspiels ("On an Application of Set Theory to the Theory of the Game of Chess"), which proved that the optimal chess strategy is strictly determined; this paved the way for more general theorems. In 1938, the Danish mathematical economist Frederik Zeuthen proved that the mathematical model had a winning strategy by using Brouwer's fixed point theorem. In his 1938 book Applications aux Jeux de Hasard and earlier notes, Émile Borel proved a minimax theorem for two-person zero-sum matrix games only when the pay-off matrix was symmetric. Borel conjectured that mixed-strategy equilibria would fail to exist in some two-person zero-sum games, a conjecture that was proved false. Game theory did not exist as a unique field until John von Neumann published the paper On the Theory of Games of Strategy in 1928. Von Neumann's original proof used Brouwer's fixed-point theorem on continuous mappings into compact convex sets, which became a standard method in game theory and mathematical economics. His paper was followed by his 1944 book Theory of Games and Economic Behavior, co-authored with Oskar Morgenstern.
The second edition of this book provided an axiomatic theory of utility, which reincarnated Daniel Bernoulli's old theory of utility as an independent discipline. Von Neumann's work in game theory culminated in this 1944 book; this foundational work contains the method for finding mutually consistent solutions for two-person zero-sum games. During the following time period, work on game theory was focused on cooperative game theory, which analyzes optimal strategies for groups of individuals, presuming that they can enforce agreements between them about proper strategies. In 1950, the first mathematical discussion of the prisoner's dilemma appeared, and an experiment was undertaken by the notable mathematicians Merrill M. Flood and Melvin Dresher as part of the RAND Corporation's investigations into game theory. RAND pursued the studies because of possible applications to global nuclear strategy. Around this same time, John Nash developed a criterion for mutual consistency of players' strategies, known as Nash equilibrium, applicable to a wider variety of games than the criterion proposed by von Neumann and Morgenstern.
Nash proved that every finite n-player, non-zero-sum, non-cooperative game has what is now known as a Nash equilibrium. Game theory experienced a flurry of activity in the 1950s, during which time the concepts of the core, the extensive-form game, fictitious play, repeated games and the Shapley value were developed. In addition, the first applications of game theory to philosophy and political science occurred during this time. In 1965, Reinhard Selten introduced his solution concept of subgame perfect equilibria, which further refined the Nash equilibrium. In 1979, Robert Axelrod tried setting up computer programs as players and found that in tournaments between them the winner was a simple "tit-for-tat" program that cooperates on the first step and on subsequent steps just does whatever its opponent did on the previous step; the same winner was often obtained by natural selection. In 1994, Nash, Selten and Harsanyi became Economics Nobel Laureates for their contributions to game theory.
In economics and other social sciences, preference is the ordering of alternatives based on their relative utility, a process which results in an optimal "choice". The character of individual preferences is determined purely by taste factors, independent of considerations of prices, income, or availability of goods. With the help of the scientific method, many practical decisions of life can be modelled, resulting in testable predictions about human behavior. Although economists are not interested in the underlying causes of the preferences in themselves, they are interested in the theory of choice because it serves as a background for empirical demand analysis. In 1926, Ragnar Frisch developed for the first time a mathematical model of preferences in the context of economic demand and utility functions. Until then, economists had developed an elaborate theory of demand that omitted primitive characteristics of people. This omission ceased when, at the end of the 19th and the beginning of the 20th century, logical positivism predicated the need for theoretical concepts to be related to observables.
Whereas economists in the 18th and 19th centuries felt comfortable theorizing about utility, with the advent of logical positivism in the 20th century they felt that it needed more of an empirical structure. Because binary choices are directly observable, they immediately appealed to economists; the search for observables in microeconomics is taken even further by revealed preference theory. Since the pioneering efforts of Frisch in the 1920s, one of the major issues which has pervaded the theory of preferences is the representability of a preference structure with a real-valued function. This has been achieved by mapping it to the mathematical index called utility. Von Neumann and Morgenstern's 1944 book Theory of Games and Economic Behavior treated preferences as a formal relation whose properties can be stated axiomatically. This type of axiomatic handling of preferences soon began to influence other economists: Marschak adopted it by 1950, Houthakker employed it in a 1950 paper, and Kenneth Arrow perfected it in his 1951 book Social Choice and Individual Values.
Gérard Debreu, influenced by the ideas of the Bourbaki group, championed the axiomatization of consumer theory in the 1950s, and the tools he borrowed from the mathematical field of binary relations have become mainstream since then. Though the economics of choice can be examined either at the level of utility functions or at the level of preferences, moving from one to the other can be useful. For example, shifting the conceptual basis from an abstract preference relation to an abstract utility scale results in a new mathematical framework, allowing new kinds of conditions on the structure of preference to be formulated and investigated. Another historical turning point can be traced back to 1895, when Georg Cantor proved in a theorem that if a binary relation is linearly ordered it is isomorphically embeddable in the ordered real numbers. This notion became influential for the theory of preferences in economics: by the 1940s, prominent authors such as Paul Samuelson would theorize about people having weakly ordered preferences.
Suppose the set of all states of the world is X and an agent has a preference relation on X. It is common to mark the weak preference relation by ⪯, so that x ⪯ y means "the agent wants y at least as much as x" or "the agent weakly prefers y to x". The symbol ∼ is used as a shorthand for the indifference relation: x ∼ y ⟺ (x ⪯ y and y ⪯ x), which reads "the agent is indifferent between y and x". The symbol ≺ is used as a shorthand for the strong preference relation: x ≺ y ⟺ (x ⪯ y and not y ⪯ x), which reads "the agent strictly prefers y to x". In everyday speech, the statement "x is preferred to y" is understood to mean that someone chooses x over y. However, decision theory rests on more precise definitions of preferences, given that there are many experimental conditions influencing people's choices in many directions. Suppose a person is confronted with a mental experiment that she must solve with the aid of introspection. She is offered apples and oranges and is asked to verbally choose one of the two. A decision scientist observing this single event would be inclined to say that whichever is chosen is the preferred alternative.
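The shorthand relations above can be sketched in code. In the minimal Python sketch below, an assumed utility function (the goods and their values are invented for illustration, and the function names are not from the text) stands in for the agent's tastes, and indifference and strict preference are derived from the weak relation exactly as defined:

```python
# An assumed utility index standing in for an agent's tastes.
utility = {"apple": 2, "orange": 2, "banana": 1}

def weakly_below(x, y):
    """x ⪯ y: the agent wants y at least as much as x."""
    return utility[x] <= utility[y]

def indifferent(x, y):
    """x ∼ y  ⟺  (x ⪯ y and y ⪯ x)."""
    return weakly_below(x, y) and weakly_below(y, x)

def strictly_below(x, y):
    """x ≺ y  ⟺  (x ⪯ y and not y ⪯ x)."""
    return weakly_below(x, y) and not weakly_below(y, x)

print(indifferent("apple", "orange"))     # True: equal utility
print(strictly_below("banana", "apple"))  # True: the agent prefers apples
```

Because the relation is induced by a real-valued utility, it is automatically complete and transitive, which is exactly the representability issue discussed above.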
Under several repetitions of this experiment, if the scientist observes that apples are chosen 51% of the time, this would mean that x ≻ y. If oranges are chosen half of the time, then x ∼ y. If she chooses oranges 51% of the time, then y ≻ x. Preference is here being identified with a greater frequency of choice. This experiment implicitly assumes that the subject always produces a definite answer; otherwise, out of 100 repetitions, some of them will give as a result that neither apples, nor oranges, nor ties are chosen, and these few cases of uncertainty will ruin any preference information resulting from the frequency attributes of the other valid cases. However, this example was used
In game theory, a sequential game is a game where one player chooses their action before the others choose theirs. The later players must have some information about the first player's choice, otherwise the difference in time would have no strategic effect. Sequential games are hence governed by the time axis and are represented in the form of decision trees. Unlike sequential games, simultaneous games do not have a time axis: players choose their moves without being sure of the others', and such games are represented in the form of payoff matrices. Extensive-form representations are used for sequential games, since they explicitly illustrate the sequential aspects of a game. Combinatorial games are sequential games. Games such as chess, infinite chess, tic-tac-toe and Go are examples of sequential games. The size of the decision tree can vary according to game complexity, ranging from the small game tree of tic-tac-toe to the immensely complex game tree of chess, so large that computers cannot map it completely. In sequential games with perfect information, a subgame perfect equilibrium can be found by backward induction.
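Backward induction on a perfect-information decision tree can be sketched in a few lines of Python. The tree encoding and the payoffs below are illustrative assumptions, not a game taken from the text: each inner node names the player to move, and each leaf carries one payoff per player.

```python
# Backward induction on a perfect-information game tree. Inner nodes
# hold {"player": i, "children": [...]}; leaves hold {"payoffs": (...)}.

def backward_induction(node):
    """Return the payoff profile reached under optimal play from `node`."""
    if "payoffs" in node:                     # terminal node: nothing to choose
        return node["payoffs"]
    mover = node["player"]
    # The mover picks the child whose induced outcome maximizes
    # the mover's OWN payoff, assuming optimal play further down.
    return max((backward_induction(child) for child in node["children"]),
               key=lambda payoffs: payoffs[mover])

# Player 0 moves first: end the game for (1, 3), or hand the move to
# player 1, who would then pick (0, 2) over (2, 1).
game = {"player": 0, "children": [
    {"payoffs": (1, 3)},
    {"player": 1, "children": [{"payoffs": (2, 1)}, {"payoffs": (0, 2)}]},
]}
print(backward_induction(game))  # (1, 3)
```

Foreseeing that player 1 would choose the (0, 2) branch, player 0 ends the game immediately, which is precisely the subgame-perfect logic described above.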
See also: Simultaneous game; Subgame perfection; Sequential auction.