A computer program is a collection of instructions that performs a specific task when executed by a computer. A computer requires programs to function. A computer program is written by a computer programmer in a programming language. From the program in its human-readable form of source code, a compiler can derive machine code—a form consisting of instructions that the computer can directly execute. Alternatively, a computer program may be executed with the aid of an interpreter. A collection of computer programs and related data is referred to as software. Computer programs may be categorized along functional lines, such as application software and system software; the underlying method used for some calculation or manipulation is known as an algorithm. The earliest programmable machines preceded the invention of the digital computer. In 1801, Joseph-Marie Jacquard devised a loom that would weave a pattern by following a series of perforated cards. Patterns could be repeated by arranging the cards.
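The compile-then-execute pipeline described above can be seen in miniature in Python. This is only an analogy: Python's `compile` produces bytecode for its virtual machine rather than native machine code, but the two-phase structure—translate source, then execute the translated form—is the same idea.

```python
# Python illustrates the compile-then-execute pipeline in miniature:
# source code is first translated to bytecode, which the interpreter
# (a virtual machine) then executes. Bytecode is not native machine
# code, but the two-phase structure is the same idea.
source = "result = sum(range(10))"

code_object = compile(source, "<example>", "exec")  # source -> bytecode
namespace = {}
exec(code_object, namespace)                        # bytecode -> execution

print(namespace["result"])  # 45
```

An interpreter, by contrast, interleaves translation and execution rather than producing a complete translated program first.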
In 1837, Charles Babbage was inspired by Jacquard's loom to attempt to build the Analytical Engine. The names of the components of the calculating device were borrowed from the textile industry. In the textile industry, yarn was brought from the store to be milled; the device would have had a "store"—memory to hold 1,000 numbers of 40 decimal digits each. Numbers from the "store" would have been transferred to the "mill" for processing, and a "thread" was the execution of programmed instructions by the device. It was programmed using two sets of perforated cards—one to direct the operation and the other for the input variables. However, after more than £17,000 of the British government's money had been spent, the thousands of cogged wheels and gears never worked together. During a nine-month period in 1842–43, Ada Lovelace translated the memoir of Italian mathematician Luigi Menabrea; the memoir covered the Analytical Engine. The translation contained Note G, which detailed a method for calculating Bernoulli numbers using the Analytical Engine.
This note is recognized by some historians as the world's first written computer program. In 1936, Alan Turing introduced the Universal Turing machine—a theoretical device that can model every computation that can be performed on a Turing complete computing machine; it is a finite-state machine with an infinitely long read/write tape. The machine can move the tape back and forth, changing its contents as it performs an algorithm; the machine starts in the initial state, goes through a sequence of steps, and halts when it encounters the halt state. This machine is considered by some to be the origin of the stored-program computer—used by John von Neumann for the "Electronic Computing Instrument" that now bears the von Neumann architecture name. The Z3 computer, invented by Konrad Zuse in Germany, was a programmable computer. A digital computer uses electricity as the calculating component; the Z3 contained 2,400 relays to create the circuits. The circuits provided a floating-point, nine-instruction computer. Programming the Z3 was through a specially designed keyboard and punched tape.
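The state-plus-tape model described above can be sketched in a few lines of code. The simulator and the example transition table below are illustrative inventions, not tied to any particular historical formulation:

```python
# A minimal Turing machine simulator. The transition table maps
# (state, symbol) -> (new_state, symbol_to_write, head_move).
def run_turing_machine(transitions, tape, state="start", halt="halt"):
    tape = dict(enumerate(tape))  # sparse tape; unwritten cells are blank " "
    head = 0
    while state != halt:
        symbol = tape.get(head, " ")
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip()

# Example program: invert every bit, halting at the first blank cell.
invert = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", " "): ("halt", " ", "R"),
}
print(run_turing_machine(invert, "1011"))  # 0100
```

The machine starts in its initial state, steps through the table, and stops when it reaches the halt state, exactly as in the description above.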
The Electronic Numerical Integrator And Computer (ENIAC) was a Turing complete, general-purpose computer that used 17,468 vacuum tubes to create the circuits. At its core, it was a series of Pascalines wired together; its 40 units weighed 30 tons, occupied 1,800 square feet, and consumed $650 per hour in electricity when idle. It had 20 base-10 accumulators. Programming the ENIAC took up to two months. Three function tables on wheels needed to be rolled up to fixed function panels. Function tables were connected to function panels using heavy black cables; each function table had 728 rotating knobs. Programming the ENIAC also involved setting some of its 3,000 switches. Debugging a program could take a week. The programmers of the ENIAC were women who were known collectively as the "ENIAC girls." The ENIAC featured parallel operations: different sets of accumulators could work on different algorithms. It used punched card machines for input and output, and it was controlled with a clock signal. It ran for eight years, calculating hydrogen bomb parameters, predicting weather patterns, and producing firing tables to aim artillery guns.
The Manchester Baby was a stored-program computer, and with it programming transitioned away from setting dials. Only three bits of memory were available to store each instruction, so it was limited to eight instructions. 32 switches were available for programming. As computers of this kind were manufactured, the computer program would be written out on paper for reference. An instruction was represented by a configuration of on/off settings. After setting the configuration, an execute button was pressed; this process was repeated. Later, computer programs were manually input via paper tape or punched cards. After the medium was loaded, the starting address was set via switches and the execute button was pressed. In 1961, the Burroughs B5000 was built to be programmed in the ALGOL 60 language; its hardware featured circuits to ease the compile phase. In 1964, the IBM System/360 was a line of six computers, each having the same instruction set architecture; the Model 30 was the least expensive. Customers could retain the same application software, and each System/360 model featured multiprogramming.
With operating system support, multiple programs could be in memory at once. When one was waiting for input/output, another could compute. Each model could also emulate other computers, so customers could upgrade to the System/360 and retain their existing application software.
A mnemonic device, or memory device, is any learning technique that aids information retention or retrieval in the human memory. Mnemonics make use of elaborative encoding, retrieval cues, and imagery as specific tools to encode any given information in a way that allows for efficient storage and retrieval. Mnemonics aid original information in becoming associated with something more accessible or meaningful—which, in turn, provides better retention of the information. Commonly encountered mnemonics are used for lists and in auditory form, such as short poems, acronyms, or memorable phrases, but mnemonics can also be used for other types of information and in visual or kinesthetic forms. Their use is based on the observation that the human mind more easily remembers spatial, surprising, sexual, humorous, or otherwise "relatable" information, rather than more abstract or impersonal forms of information. The word "mnemonic" is derived from the Ancient Greek word μνημονικός, meaning "of memory, or relating to memory", and is related to Mnemosyne, the name of the goddess of memory in Greek mythology.
Both of these words are derived from μνήμη, "remembrance, memory". Mnemonics in antiquity were most often considered in the context of what is today known as the art of memory. Ancient Greeks and Romans distinguished between two types of memory: the "natural" memory and the "artificial" memory. The former is inborn and is the one that everyone uses instinctively; the latter, in contrast, has to be trained and developed through the learning and practice of a variety of mnemonic techniques. Mnemonic systems are strategies consciously used to improve memory; they help use information stored in long-term memory to make memorisation an easier task. The general name of mnemonics, or memoria technica, was the name applied to devices for aiding the memory, to enable the mind to reproduce an unfamiliar idea, or a series of dissociated ideas, by connecting it, or them, in some artificial whole, the parts of which are mutually suggestive. Mnemonic devices were much cultivated by Greek sophists and philosophers and are referred to by Plato and Aristotle.
In later times, the poet Simonides was credited with the development of these techniques, perhaps for no reason other than that the power of his memory was famous. Cicero, who attaches considerable importance to the art, but more to the principle of order as the best help to memory, speaks of Carneades of Athens and Metrodorus of Scepsis as distinguished examples of people who used well-ordered images to aid the memory. The Greek and the Roman system of mnemonics was founded on the use of mental places and signs or pictures, known as "topical" mnemonics. The most usual method was to choose a large house, of which the apartments, windows, furniture, etc. were each associated with certain names, events or ideas, by means of symbolic pictures. To recall these, an individual had only to search over the apartments of the house until discovering the places where images had been placed by the imagination. In accordance with this system, if it were desired to fix a historic date in memory, it was localised in an imaginary town divided into a certain number of districts, each with ten houses, each house with ten rooms, and each room with a hundred quadrates or memory-places on the floor, the four walls, and the roof.
Therefore, if it were desired to fix in the memory the date of the invention of printing, an imaginary book, or some other symbol of printing, would be placed in the thirty-sixth quadrate or memory-place of the fourth room of the first house of the historic district of the town. Apart from the rules of mnemonics referred to by Martianus Capella, nothing further is known regarding the practice until the 13th century. Among the voluminous writings of Roger Bacon is a tractate De arte memorativa. Ramon Llull devoted special attention to mnemonics in connection with his ars generalis. The first important modification of the method of the Romans was that invented by the German poet Konrad Celtes, who, in his Epitoma in utramque Ciceronis rhetoricam cum arte memorativa nova, used letters of the alphabet for associations, rather than places. About the end of the 15th century, Petrus de Ravenna provoked such astonishment in Italy by his mnemonic feats that he was believed by many to be a necromancer.
His Phoenix artis memoriae went through as many as nine editions, the seventh being published at Cologne in 1608. About the end of the 16th century, Lambert Schenkel, who taught mnemonics in France and Germany, similarly surprised people with his memory. He was denounced as a sorcerer by the University of Louvain, but in 1593 he published his tractate De memoria at Douai with the sanction of that celebrated theological faculty. The most complete account of his system is given in two works by his pupil Martin Sommer, published in Venice in 1619. In 1618 John Willis published Mnemonica. Giordano Bruno included a memoria technica in his treatise De umbris idearum, as part of his study of the ars generalis of Llull. Other writers of this period include the Florentine Publicius and Porta, author of Ars reminiscendi. In 1648 Stanislaus Mink von Wennsshein revealed what he called the "most fertile secret" in mnemonics—using consonants for figures, and thus expressing numbers by words.
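The consonants-for-figures idea later evolved into what is now called the major system. The sketch below uses the modern conventional digit-to-consonant table, which is an assumption of this example and not necessarily Wennsshein's original mapping:

```python
# The "consonants for figures" idea: each digit is assigned one or
# more consonants; vowels carry no value, so a number can be encoded
# as a pronounceable, memorable word. Mapping shown is the modern
# conventional one (an assumption, not Wennsshein's original table).
DIGIT_TO_CONSONANTS = {
    "0": "sz", "1": "td", "2": "n", "3": "m", "4": "r",
    "5": "l", "6": "jg", "7": "kc", "8": "fv", "9": "pb",
}
CONSONANT_TO_DIGIT = {c: d for d, cs in DIGIT_TO_CONSONANTS.items() for c in cs}

def word_to_number(word):
    """Decode a mnemonic word back into its digit string.

    Only letters present in the consonant table contribute digits;
    vowels and unmapped letters are skipped.
    """
    return "".join(CONSONANT_TO_DIGIT[ch] for ch in word.lower()
                   if ch in CONSONANT_TO_DIGIT)

print(word_to_number("tomcat"))  # t->1, m->3, c->7, t->1 => "1371"
```

To memorize a number, one searches for a word whose consonants decode to it, then remembers the word instead of the digits.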
Set theory is a branch of mathematical logic that studies sets, which informally are collections of objects. Although any type of object can be collected into a set, set theory is most often applied to objects that are relevant to mathematics; the language of set theory can be used to define nearly all mathematical objects. The modern study of set theory was initiated by Georg Cantor and Richard Dedekind in the 1870s. After the discovery of paradoxes in naive set theory, such as Russell's paradox, numerous axiom systems were proposed in the early twentieth century, of which the Zermelo–Fraenkel axioms, with or without the axiom of choice, are the best-known. Set theory is commonly employed as a foundational system for mathematics in the form of Zermelo–Fraenkel set theory with the axiom of choice. Beyond its foundational role, set theory is a branch of mathematics in its own right, with an active research community. Contemporary research into set theory includes a diverse collection of topics, ranging from the structure of the real number line to the study of the consistency of large cardinals.
Mathematical topics typically emerge and evolve through interactions among many researchers. Set theory, however, was founded by a single paper in 1874 by Georg Cantor: "On a Property of the Collection of All Real Algebraic Numbers". Since the 5th century BC, beginning with Greek mathematician Zeno of Elea in the West and early Indian mathematicians in the East, mathematicians had struggled with the concept of infinity. Especially notable is the work of Bernard Bolzano in the first half of the 19th century. Modern understanding of infinity began in 1870–1874 and was motivated by Cantor's work in real analysis. An 1872 meeting between Cantor and Richard Dedekind influenced Cantor's thinking and culminated in Cantor's 1874 paper. Cantor's work initially polarized the mathematicians of his day. While Karl Weierstrass and Dedekind supported Cantor, Leopold Kronecker, now seen as a founder of mathematical constructivism, did not. Cantorian set theory eventually became widespread, due to the utility of Cantorian concepts such as one-to-one correspondence among sets, his proof that there are more real numbers than integers, and the "infinity of infinities" resulting from the power set operation.
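Cantor's proof that there are more real numbers than integers rests on the diagonal argument. The finite sketch below is purely illustrative (the enumeration, and the representation of infinite binary sequences as functions, are assumptions of the sketch, not the full uncountability proof):

```python
# Cantor's diagonal argument in miniature: given any enumeration of
# infinite binary sequences (modelled as functions from index to bit),
# the "diagonal" sequence differs from the n-th sequence at position n,
# so it cannot appear anywhere in the enumeration.
def diagonal(enumeration):
    """Return a sequence differing from enumeration(n) at index n."""
    return lambda n: 1 - enumeration(n)(n)

# An (artificial) enumeration for demonstration: sequence n is the
# constant sequence of bit n % 2.
enum = lambda n: (lambda k, bit=n % 2: bit)

d = diagonal(enum)
# d differs from every enumerated sequence at the diagonal position.
print(all(d(n) != enum(n)(n) for n in range(100)))  # True
```

Since this construction works for any proposed enumeration, no list indexed by the integers can contain every infinite binary sequence.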
This utility of set theory led to the article "Mengenlehre", contributed in 1898 by Arthur Schoenflies to Klein's encyclopedia. The next wave of excitement in set theory came around 1900, when it was discovered that some interpretations of Cantorian set theory gave rise to several contradictions, called antinomies or paradoxes. Bertrand Russell and Ernst Zermelo independently found the simplest and best known paradox, now called Russell's paradox: consider "the set of all sets that are not members of themselves", which leads to a contradiction since it must be both a member of itself and not a member of itself. In 1899 Cantor had himself posed the question "What is the cardinal number of the set of all sets?", and obtained a related paradox. Russell used his paradox as a theme in his 1903 review of continental mathematics in his The Principles of Mathematics. In 1906 English readers gained the book Theory of Sets of Points by husband and wife William Henry Young and Grace Chisholm Young, published by Cambridge University Press.
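Russell's paradox can be mimicked with functions standing in for sets. This is an analogy only (predicates are not sets): membership "o ∈ A" becomes function application A(o), and the self-referential question never settles.

```python
# A functional analogue of Russell's paradox: define R as "true of
# exactly those predicates that are not true of themselves", then ask
# whether R is true of itself. In naive set theory this is a flat
# contradiction; in this analogy the self-reference shows up as
# unbounded recursion.
def R(x):
    return not x(x)

try:
    R(R)  # "Is R a member of itself?" -- the question never settles
    outcome = "settled"
except RecursionError:
    outcome = "no consistent answer (infinite regress)"

print(outcome)
```

Axiomatic systems such as ZFC avoid the paradox by restricting which comprehensions define sets at all.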
The momentum of set theory was such that debate on the paradoxes did not lead to its abandonment. The work of Zermelo in 1908 and the work of Abraham Fraenkel and Thoralf Skolem in 1922 resulted in the set of axioms ZFC, which became the most commonly used set of axioms for set theory. The work of analysts such as Henri Lebesgue demonstrated the great mathematical utility of set theory, which has since become woven into the fabric of modern mathematics. Set theory is commonly used as a foundational system, although in some areas—such as algebraic geometry and algebraic topology—category theory is thought to be a preferred foundation. Set theory begins with a fundamental binary relation between an object o and a set A. If o is a member (or element) of A, the notation o ∈ A is used. Since sets are objects, the membership relation can relate sets as well. A derived binary relation between two sets is the subset relation, also called set inclusion. If all the members of set A are also members of set B, then A is a subset of B, denoted A ⊆ B. For example, {1, 2} is a subset of {1, 2, 3}, and so is {2}, but {1, 4} is not. As implied by this definition, a set is a subset of itself.
For cases where this possibility is unsuitable, or where one would want to rule it out, the term proper subset is defined: A is called a proper subset of B if and only if A is a subset of B, but A is not equal to B. Note that 1, 2, and 3 are members of the set {1, 2, 3}, but are not subsets of it. Just as arithmetic features binary operations on numbers, set theory features binary operations on sets. The union of the sets A and B, denoted A ∪ B, is the set of all objects that are a member of A, or B, or both; the union of {1, 2, 3} and {2, 3, 4} is the set {1, 2, 3, 4}. The intersection of the sets A and B, denoted A ∩ B, is the set of all objects that are members of both A and B; the intersection of {1, 2, 3} and {2, 3, 4} is the set {2, 3}. The set difference of U and A, denoted U \ A, is the set of all members of U that are not members of A; the set difference {1, 2, 3} \ {2, 3, 4} is {1}, while, conversely, the set difference {2, 3, 4} \ {1, 2, 3} is {4}. When A is a subset of U, the set difference U \ A is called the complement of A in U. In this case, if the choice of U is clear from the context, the notation Ac is sometimes used instead of U \ A, particularly if U is a universal set as in the study of Venn diagrams.
The symmetric difference of sets A and B, denoted A △ B or A ⊖ B, is the set of all objects that are a member of exactly one of A and B—elements which are in one of the sets, but not in both; for the sets {1, 2, 3} and {2, 3, 4}, the symmetric difference is {1, 4}.
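The membership relation and the binary operations above can be checked directly with Python's built-in set type, which is a finite model of these definitions:

```python
# Set relations and operations from the text, using Python's set type.
A = {1, 2, 3}
B = {2, 3, 4}

assert 1 in A                    # membership: o ∈ A
assert {2, 3} <= A               # subset: {2, 3} ⊆ A
assert A <= A and not A < A      # a set is a subset of itself,
                                 # but not a *proper* subset (<)
assert A | B == {1, 2, 3, 4}     # union A ∪ B
assert A & B == {2, 3}           # intersection A ∩ B
assert A - B == {1}              # set difference A \ B
assert B - A == {4}              # and conversely
assert A ^ B == {1, 4}           # symmetric difference A △ B
print("all set identities hold")
```

Python's operator choices (`|`, `&`, `-`, `^`, `<=`) mirror the standard notation ∪, ∩, \, △, ⊆ one for one.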
Augustus De Morgan
Augustus De Morgan was a British mathematician and logician. He formulated De Morgan's laws and introduced the term mathematical induction, making its idea rigorous. Augustus De Morgan was born in Madurai, India, in 1806. His father was Lieut.-Colonel John De Morgan, who held various appointments in the service of the East India Company. His mother, Elizabeth Dodson, was a descendant of James Dodson, who computed a table of anti-logarithms, that is, the numbers corresponding to exact logarithms. Augustus De Morgan became blind in one eye; the family moved to England. As his father and grandfather had both been born in India, De Morgan used to say that he was neither English, nor Scottish, nor Irish, but a Briton "unattached", using the technical term applied to an undergraduate of Oxford or Cambridge who is not a member of any one of the Colleges. When De Morgan was ten years old, his father died. Mrs De Morgan resided at various places in the southwest of England, and her son received his elementary education at various schools of no great account.
His mathematical talents went unnoticed until he was fourteen, when a family friend discovered him making an elaborate drawing of a figure in Euclid with ruler and compasses. She explained the aim of Euclid to Augustus and gave him an initiation into demonstration. He received his secondary education from Mr Parsons, a fellow of Oriel College, who appreciated classics more than mathematics. His mother was an active and ardent member of the Church of England and desired that her son should become a clergyman, but by this time De Morgan had begun to show his non-conforming disposition; he became an atheist. He wrote: "There is a word in our language with which I shall not confuse this subject, both on account of the dishonourable use made of it, as an imputation thrown by one sect upon another, and of the variety of significations attached to it. I shall use the word Anti-Deism to signify the opinion that there does not exist a Creator who made and sustains the Universe." In 1823, at the age of sixteen, he entered Trinity College, Cambridge, where he came under the influence of George Peacock and William Whewell, who became his lifelong friends.
His college tutor was John Philips Higman, FRS. At college he was prominent in the musical clubs, and his love of knowledge for its own sake interfered with training for the great mathematical race; he graduated as fourth wrangler. This entitled him to the degree of Bachelor of Arts, but the higher degree of Master of Arts required a theological test, and to the signing of any such test De Morgan felt a strong objection, although he had been brought up in the Church of England. It was not until about 1875 that theological tests for academic degrees were abolished in the Universities of Oxford and Cambridge. As no career was open to him at his own university, he decided to go to the Bar and took up residence in London. About this time the movement for founding London University took shape. The two ancient universities of Oxford and Cambridge were so guarded by theological tests that no Jew or Dissenter from the Church of England could enter as a student, still less be appointed to any office. A body of liberal-minded men resolved to meet the difficulty by establishing in London a University on the principle of religious neutrality.
De Morgan, then 22 years of age, was appointed professor of mathematics. His introductory lecture "On the study of mathematics" is a discourse upon mental education of permanent value and has been reprinted in the United States. The London University was a new institution, and the relations of the Council of management, the Senate of professors, and the body of students were not well defined. A dispute arose between the professor of anatomy and his students, and in consequence of the action taken by the Council, several professors resigned, headed by De Morgan. Another professor of mathematics was appointed, who drowned a few years later. De Morgan had shown himself a prince of teachers: he was invited to return to his chair, which thereafter became the continuous centre of his labours for thirty years. The same body of reformers—headed by Lord Brougham, a Scotsman eminent both in science and politics, who had instituted the London University—founded about the same time a Society for the Diffusion of Useful Knowledge.
Its object was to spread scientific and other knowledge by means of cheap and clearly written treatises by the best writers of the time. One of its most voluminous and effective writers was De Morgan; he wrote a great work on The Differential and Integral Calculus, published by the Society. When De Morgan came to reside in London he found a congenial friend in William Frend, notwithstanding the latter's mathematical heresy about negative quantities. Both were arithmeticians and actuaries, and their religious views were somewhat similar. Frend lived in what was then a suburb of London, in a country-house formerly occupied by Daniel Defoe and Isaac Watts. De Morgan with his flute was a welcome visitor. The London University of which De Morgan was a professor was a different institution from the University of London; the University of London was founded about ten years later by the Government for the purpose of granting degrees after examination.
A Venn diagram is a diagram that shows all possible logical relations between a finite collection of different sets. These diagrams depict elements as points in the plane and sets as regions inside closed curves. A Venn diagram consists of multiple overlapping closed curves, usually circles, each representing a set; the points inside a curve labelled S represent elements of the set S, while points outside the boundary represent elements not in the set S. This lends itself to visualizations that are easy to read. In Venn diagrams the curves are overlapped in every possible way, showing all possible relations between the sets; they are thus a special case of Euler diagrams, which do not necessarily show all relations. Venn diagrams were conceived around 1880 by John Venn. They are used to teach elementary set theory, as well as to illustrate simple set relationships in probability, statistics and computer science. A Venn diagram in which the area of each shape is proportional to the number of elements it contains is called an area-proportional or scaled Venn diagram.
This example involves two sets, A and B, represented here as coloured circles. The orange circle, set A, represents all living creatures that are two-legged; the blue circle, set B, represents the living creatures that can fly. Each separate type of creature can be imagined as a point somewhere in the diagram. Living creatures that both can fly and have two legs—for example, parrots—are in both sets, so they correspond to points in the region where the blue and orange circles overlap. It is important to note that this overlapping region would only contain those elements that are members of both set A and set B. Humans and penguins are bipedal, and so are in the orange circle, but since they cannot fly they appear in the left part of the orange circle, where it does not overlap with the blue circle. Mosquitoes have six legs, and fly, so the point for mosquitoes is in the part of the blue circle that does not overlap with the orange one. Creatures that are not two-legged and cannot fly would all be represented by points outside both circles.
The combined region of sets A and B is called the union of A and B, denoted by A ∪ B. The union in this case contains all living creatures that are either two-legged or that can fly, or both. The region included in both A and B, where the two sets overlap, is called the intersection of A and B, denoted by A ∩ B. For example, the intersection of the two sets is not empty, because there are points that represent creatures that are in both the orange and blue circles. Venn diagrams were introduced in 1880 by John Venn in a paper entitled On the Diagrammatic and Mechanical Representation of Propositions and Reasonings in the Philosophical Magazine and Journal of Science, about the different ways to represent propositions by diagrams. The use of these types of diagrams in formal logic, according to Frank Ruskey and Mark Weston, is "not an easy history to trace, but it is certain that the diagrams that are popularly associated with Venn, in fact, originated much earlier. They are rightly associated with Venn, because he comprehensively surveyed and formalized their usage, and was the first to generalize them".
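The two-circle example above maps directly onto finite sets. The particular creatures listed below are chosen purely for illustration, following the article's example:

```python
# The article's two-set Venn example as Python sets.
two_legged = {"human", "penguin", "parrot"}   # set A: two-legged creatures
can_fly    = {"parrot", "mosquito"}           # set B: creatures that can fly

print(sorted(two_legged | can_fly))  # union A ∪ B: either circle
# ['human', 'mosquito', 'parrot', 'penguin']
print(sorted(two_legged & can_fly))  # intersection A ∩ B: the overlap
# ['parrot']
print(sorted(two_legged - can_fly))  # orange-only region
# ['human', 'penguin']
print(sorted(can_fly - two_legged))  # blue-only region
# ['mosquito']
```

Each printed set corresponds to one region of the diagram; creatures in neither set would lie outside both circles.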
Venn himself did not use the term "Venn diagram" and referred to his invention as "Eulerian Circles". For example, in the opening sentence of his 1880 article Venn writes, "Schemes of diagrammatic representation have been so familiarly introduced into logical treatises during the last century or so, that many readers, even those who have made no professional study of logic, may be supposed to be acquainted with the general nature and object of such devices. Of these schemes one only, viz. that called 'Eulerian circles,' has met with any general acceptance..." Lewis Carroll includes "Venn's Method of Diagrams" as well as "Euler's Method of Diagrams" in an "Appendix, Addressed to Teachers" of his book Symbolic Logic. The term "Venn diagram" was later used by Clarence Irving Lewis in 1918, in his book A Survey of Symbolic Logic. Venn diagrams are similar to Euler diagrams, which were invented by Leonhard Euler in the 18th century. M. E. Baron has noted that Leibniz in the 17th century produced similar diagrams before Euler, but much of this work was unpublished.
She also observes even earlier Euler-like diagrams by Ramon Llull in the 13th century. In the 20th century, Venn diagrams were further developed. D. W. Henderson showed in 1963 that the existence of an n-Venn diagram with n-fold rotational symmetry implied that n was a prime number, and he showed that such symmetric Venn diagrams exist when n is five or seven. In 2002 Peter Hamburger found symmetric Venn diagrams for n = 11, and in 2003, Griggs and Savage showed that symmetric Venn diagrams exist for all other primes; thus rotationally symmetric n-Venn diagrams exist if and only if n is a prime number. Venn diagrams and Euler diagrams were incorporated as part of instruction in set theory as part of the new math movement in the 1960s. Since then, they have also been adopted in the curriculum of other fields such as reading. A Venn diagram is constructed with a collection of simple closed curves drawn in a plane. According to Lewis, the "principle of these diagrams is that classes be represented by regions in such relation to one another that all the possible logical relations of these classes can be indicated in the same diagram.
That is, the diagram initially leaves room for any possible relation of the classes, and the actual or given relation can then be specified by indicating that some particular region is null or is not-null".
In mathematics, a theorem is a statement that has been proven on the basis of previously established statements, such as other theorems, and generally accepted statements, such as axioms. A theorem is a logical consequence of the axioms. The proof of a mathematical theorem is a logical argument for the theorem statement given in accord with the rules of a deductive system, and the proof of a theorem is often interpreted as justification of the truth of the theorem statement. In light of the requirement that theorems be proved, the concept of a theorem is fundamentally deductive, in contrast to the notion of a scientific law, which is experimental. Many mathematical theorems are conditional statements. In this case, the proof deduces the conclusion from conditions called hypotheses or premises. In light of the interpretation of proof as justification of truth, the conclusion is viewed as a necessary consequence of the hypotheses: the conclusion is true in the case that the hypotheses are true, without any further assumptions. However, the conditional could be interpreted differently in certain deductive systems, depending on the meanings assigned to the derivation rules and the conditional symbol.
Although they can be written in a completely symbolic form, for example within the propositional calculus, theorems are often expressed in a natural language such as English. The same is true of proofs, which are often expressed as logically organized and clearly worded informal arguments, intended to convince readers of the truth of the statement of the theorem beyond any doubt, and from which a formal symbolic proof can in principle be constructed. Such arguments are typically easier to check than purely symbolic ones—indeed, many mathematicians would express a preference for a proof that not only demonstrates the validity of a theorem, but also explains in some way why it is true. In some cases, a picture alone may even be sufficient to prove a theorem. Because theorems lie at the core of mathematics, they are also central to its aesthetics. Theorems are often described as being "trivial", or "difficult", or "deep", or even "beautiful"; these subjective judgments vary not only from person to person, but also with time: for example, as a proof is simplified or better understood, a theorem that was once difficult may become trivial.
On the other hand, a deep theorem may be stated simply, but its proof may involve surprising and subtle connections between disparate areas of mathematics. Fermat's Last Theorem is a particularly well-known example of such a theorem. Logically, many theorems are of the form of an indicative conditional: if A, then B. Such a theorem does not assert B, only that B is a necessary consequence of A. In this case, A is called the hypothesis and B the conclusion. The theorem "If n is an even natural number, then n/2 is a natural number" is a typical example in which the hypothesis is "n is an even natural number" and the conclusion is "n/2 is a natural number". To be proved, a theorem must be expressible as a formal statement. Nevertheless, theorems are usually expressed in natural language rather than in a completely symbolic form, with the intention that the reader can produce a formal statement from the informal one. It is common in mathematics to choose a number of hypotheses within a given language and declare that the theory consists of all statements provable from these hypotheses. These hypotheses are called axioms or postulates.
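The conditional form just described can be rendered symbolically. Below is the article's example in standard notation; the quantifier and the divisibility symbol are conventional choices for formalizing the informal statement, not part of the original text:

```latex
% General form of a conditional theorem: hypothesis A implies conclusion B.
\[ A \implies B \]
% The article's example: if n is an even natural number,
% then n/2 is a natural number.
\[ \forall n \in \mathbb{N} :\quad 2 \mid n \implies \tfrac{n}{2} \in \mathbb{N} \]
```

Producing such a formal statement from the informal English is exactly the translation step the text says the reader is expected to be able to perform.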
The field of mathematics known as proof theory studies formal languages and the structure of proofs. Some theorems are "trivial", in the sense that they follow from definitions and other theorems in obvious ways and do not contain any surprising insights. Some, on the other hand, may be called "deep", because their proofs may be long and difficult, involve areas of mathematics superficially distinct from the statement of the theorem itself, or show surprising connections between disparate areas of mathematics. A theorem might be simple to state and yet be deep. An excellent example is Fermat's Last Theorem, and there are many other examples of simple yet deep theorems in number theory and combinatorics, among other areas. Other theorems have a known proof that cannot easily be written down; the most prominent examples include the four color theorem and the Kepler conjecture. Both of these theorems are only known to be true by reducing them to a computational search that is then verified by a computer program. Initially, many mathematicians did not accept this form of proof, but it has become more widely accepted.
The mathematician Doron Zeilberger has gone so far as to claim that these are the only nontrivial results that mathematicians have proved. Many mathematical theorems can be reduced to more straightforward computation, including polynomial identities, trigonometric identities and hypergeometric identities. To establish a mathematical statement as a theorem, a proof is required, that is, a line of reasoning from axioms in the system to the given statement must be demonstrated. However, the proof is considered as separate from the theorem statement. Although more than one proof may be known for a single theorem, only one proof is required to establish the status of a statement as a theorem; the Pythagorean theorem and the law of quadratic reciprocity are contenders for the title of theorem with the greatest number of distinct proofs. Theorems in mathematics and theories in science are fundamentally different in their epistemology. A scientific theory cannot be proved.