1.
History of logic
–
The history of logic deals with the study of the development of the science of valid inference. Formal logics developed in ancient times in China, India, and Greece. Greek methods, particularly Aristotelian logic as found in the Organon, found wide application and acceptance in Western science and mathematics for millennia. The Stoics, especially Chrysippus, began the development of propositional logic. Christian and Islamic philosophers such as Boethius and William of Ockham further developed Aristotle's logic in the Middle Ages, reaching a high point in the mid-fourteenth century. The period between the fourteenth century and the beginning of the nineteenth century saw largely decline and neglect; empirical methods ruled the day, as evidenced by Sir Francis Bacon's Novum Organum of 1620. Valid reasoning has been employed in all periods of human history; logic, however, studies the principles of valid reasoning and inference. It is probable that the idea of demonstrating a conclusion first arose in connection with geometry. The ancient Egyptians empirically discovered some truths of geometry, including the formula for the volume of a truncated pyramid, and ancient Babylon was also skilled in mathematics; but the great achievement of the ancient Greeks was to replace empirical methods by demonstrative proof. Both Thales and Pythagoras among the Pre-Socratic philosophers seem to have been aware of geometric methods. Fragments of early proofs are preserved in the works of Plato and Aristotle, and the idea of a deductive system was probably known in the Pythagorean school and the Platonic Academy. The proofs of Euclid of Alexandria are a paradigm of Greek geometry, whose three basic principles are as follows. First, certain propositions must be accepted as true without demonstration; such a proposition is known as an axiom of geometry.
Every proposition that is not an axiom of geometry must be demonstrated as following from the axioms of geometry; and the proof must be formal, that is, the derivation of the proposition must be independent of the particular subject matter in question. Further evidence that early Greek thinkers were concerned with the principles of reasoning is found in the fragment called the Dissoi Logoi, part of a protracted debate about truth and falsity. Thales was said to have offered a sacrifice in celebration of discovering Thales' theorem, just as Pythagoras did for the Pythagorean theorem; Indian and Babylonian mathematicians knew his theorem for special cases before he proved it. It is believed that Thales learned that an angle inscribed in a semicircle is a right angle during his travels to Babylon. Before 520 BC, on one of his visits to Egypt or Greece, Pythagoras might have met Thales, who was roughly 54 years his senior. The systematic study of proof seems to have begun with the school of Pythagoras in the sixth century BC; indeed, the Pythagoreans, believing all was number, were the first philosophers to emphasize form rather than matter. Heraclitus is known for his obscure sayings: this logos holds always, but humans always prove unable to understand it, both before hearing it and when they have first heard it; other people fail to notice what they do when awake. In contrast to Heraclitus, Parmenides held that all is one and nothing changes.
2.
Lookup table
–
In computer science, a lookup table is an array that replaces runtime computation with a simpler array indexing operation. The savings in terms of processing time can be significant, since retrieving a value from memory is often faster than undergoing an expensive computation or input/output operation. The tables may be precalculated and stored in static program storage, or calculated as part of a program's initialization phase. FPGAs also make extensive use of reconfigurable, hardware-implemented lookup tables to provide programmable hardware functionality. Before the advent of computers, lookup tables of values were used to speed up hand calculations of complex functions, such as in trigonometry and logarithms; in ancient India, Aryabhata created one of the first sine tables, which he encoded in a Sanskrit-letter-based number system. Early in the history of computers, input/output operations were particularly slow, even in comparison to processor speeds of the time. Despite the introduction of systemwide caching that now automates this process, application-level lookup tables can still improve performance for data items that rarely, if ever, change. Lookup tables were one of the earliest functionalities implemented in computer spreadsheets; this has been followed by subsequent spreadsheets, such as Microsoft Excel, and complemented by specialized VLOOKUP and HLOOKUP functions to simplify lookup in a vertical or horizontal table. For a one-dimensional array or linked list, the lookup is usually to determine whether or not there is a match with an input data value. A linear (brute-force) search checks each element for equality in turn; this is often the slowest search method unless frequently occurring values occur early in the list. A binary search is possible only if the list is sorted, but it gives good performance even if the list is lengthy.
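A minimal Python sketch of a table precalculated during program initialization, as described above; the one-entry-per-degree resolution and the function names are illustrative choices, not from the text:

```python
import math

# Precompute sin(x) once, during initialization, so that each later
# "call" is a single array indexing operation instead of a computation.
SINE_TABLE = [math.sin(math.radians(d)) for d in range(360)]

def fast_sin_degrees(d: int) -> float:
    # Table lookup replaces the runtime math.sin computation.
    return SINE_TABLE[d % 360]
```

Whether this beats the direct computation depends on the cost of the replaced function and on cache behaviour; for a cheap function the memory access can be the slower path.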
For a trivial hash function lookup, the raw data value is used directly as an index into a one-dimensional table to extract a result. For small ranges, this can be amongst the fastest lookups, even exceeding binary search speed, with zero branches and execution in constant time. One discrete problem that is expensive to solve on many computers is counting the number of bits which are set to 1 in a number, sometimes called the population count. For example, the decimal number 37 is 00100101 in binary, so it contains three bits set to 1. A simple bit-by-bit loop can be ameliorated using loop unrolling and some other compiler optimizations, but there is a simpler and much faster algorithmic solution: a trivial hash function table lookup. Simply construct a table, bits_set, with 256 entries giving the number of one bits set in each possible byte value, then use this table to find the number of ones in each byte of the integer using a trivial hash function lookup on each byte in turn. For a 32-bit value this requires no branches and just four indexed memory accesses. Such code can be improved easily by recasting x as a 4-byte unsigned char array and, preferably, coding it in-line as a single statement instead of as a function.
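The byte-table technique described above can be sketched as follows; the table name bits_set follows the text, while the split of a 32-bit value into four bytes via shifts and masks is an assumption about the surrounding code:

```python
# bits_set: 256-entry table giving the number of 1 bits in each byte value.
bits_set = [bin(b).count("1") for b in range(256)]

def popcount32(x: int) -> int:
    # Four indexed memory accesses, no branches: one trivial hash
    # function lookup per byte of the 32-bit input.
    return (bits_set[x & 0xFF]
            + bits_set[(x >> 8) & 0xFF]
            + bits_set[(x >> 16) & 0xFF]
            + bits_set[(x >> 24) & 0xFF])

assert popcount32(37) == 3  # 37 is 00100101 in binary
```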
3.
Logic
–
Logic, originally meaning the word or what is spoken, is generally held to consist of the systematic study of the form of arguments. A valid argument is one where there is a relation of logical support between the assumptions of the argument and its conclusion. Historically, logic has been studied in philosophy and mathematics; more recently it has also been studied in computer science, linguistics, and psychology. The concept of form is central to logic: the validity of an argument is determined by its logical form. Traditional Aristotelian syllogistic logic and modern symbolic logic are examples of formal logic. Informal logic is the study of natural language arguments; the study of fallacies is an important branch of informal logic, since much informal argument is not, strictly speaking, deductive. On some conceptions of logic, formal logic is the study of inference with purely formal content. An inference possesses a purely formal content if it can be expressed as an application of a wholly abstract rule, that is, a rule that is not about any particular thing or property. The works of Aristotle contain the earliest known study of logic, and modern formal logic follows and expands on Aristotle. In many definitions of logic, logical inference and inference with purely formal content are the same. This does not render the notion of informal logic vacuous, because no formal logic captures all of the nuances of natural language. Symbolic logic is the study of symbolic abstractions that capture the formal features of logical inference; it is divided into two main branches, propositional logic and predicate logic. Mathematical logic is an extension of symbolic logic into other areas, in particular the study of model theory, proof theory, and set theory. Logic is generally considered formal when it analyzes and represents the form of any valid argument type; the form of an argument is displayed by representing its sentences in the formal grammar and symbolism of a logical language so as to make its content usable in formal inference.
Simply put, formalising means translating English sentences into the language of logic; this is called showing the logical form of the argument. It is necessary because indicative sentences of ordinary language show a considerable variety of form. Certain parts of the sentence must be replaced with schematic letters: thus, for example, the expression "all Ps are Qs" shows the logical form common to the sentences "all men are mortals", "all cats are carnivores", "all Greeks are philosophers", and so on. The schema can further be condensed into the formula A, where the letter A indicates the judgement "all - are -". The importance of form was recognised from ancient times.
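The point that validity depends only on the schema, not on the particular predicates, can be sketched computationally; the function and predicate names here are illustrative, not from the text:

```python
# "All Ps are Qs": one schematic form, checkable for any predicates P and Q
# over a finite domain. Its truth depends only on the form of the claim.
def all_ps_are_qs(domain, P, Q) -> bool:
    return all(Q(x) for x in domain if P(x))

# Instantiating the schema with different predicates:
# "all multiples of 4 are even" (true) vs "all evens are multiples of 4" (false).
domain = range(10)
assert all_ps_are_qs(domain, lambda n: n % 4 == 0, lambda n: n % 2 == 0)
assert not all_ps_are_qs(domain, lambda n: n % 2 == 0, lambda n: n % 4 == 0)
```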
4.
Metamathematics
–
Metamathematics is the study of mathematics itself using mathematical methods. This study produces metatheories, which are mathematical theories about other mathematical theories. The emphasis on metamathematics owes itself to David Hilbert's attempt to secure the foundations of mathematics in the early part of the 20th century. Metamathematics provides a mathematical technique for investigating a great variety of foundation problems for mathematics. An important feature of metamathematics is its emphasis on differentiating between reasoning from inside a system and from outside a system; an informal illustration of this is categorizing the proposition "2+2=4" as belonging to mathematics while categorizing the proposition "'2+2=4' is valid" as belonging to metamathematics. Something similar can be said of the well-known Russell's paradox. Metamathematics was intimately connected to mathematical logic, so that the early histories of the two fields, during the late 19th and early 20th centuries, largely overlap. More recently, mathematical logic has often included the study of new pure mathematics, such as set theory, recursion theory, and pure model theory. Serious metamathematical reflection began with the work of Gottlob Frege, especially his Begriffsschrift. David Hilbert was the first to invoke the term "metamathematics" with regularity; in his hands, it meant something akin to contemporary proof theory, in which finitary methods are used to study various axiomatized mathematical theorems. Today, metalogic and metamathematics are largely synonymous with each other. The discovery of hyperbolic geometry had important philosophical consequences for metamathematics: before its discovery there was just one geometry and one mathematics, and the idea that another geometry existed was considered improbable. The uproar of the Boeotians came and went, and gave an impetus to metamathematics and to great improvements in mathematical rigour, analytical philosophy, and logic.
Begriffsschrift is a book on logic by Gottlob Frege, published in 1879. "Begriffsschrift" is usually translated as concept writing or concept notation; the full title of the book identifies it as "a formula language, modeled on that of arithmetic, of pure thought". Frege's motivation for developing his formal approach to logic resembled Leibniz's motivation for his calculus ratiocinator. Frege went on to employ his logical calculus in his research on the foundations of mathematics, carried out over the next quarter century. As such, this project is of great importance in the history of mathematics and philosophy. One of the inspirations and motivations for Principia Mathematica (PM) was the earlier work of Gottlob Frege on logic. PM sought to avoid the problem raised by Russell's paradox by ruling out the creation of arbitrary sets; this was achieved by replacing the notion of a set with the notion of a hierarchy of sets of different types. Contemporary mathematics, however, avoids paradoxes such as Russell's in less unwieldy ways. Gödel's completeness theorem is a fundamental theorem in mathematical logic that establishes a correspondence between semantic truth and syntactic provability in first-order logic. It makes a link between model theory, which deals with what is true in different models, and proof theory, which studies what can be formally proven in particular formal systems. More formally, the theorem says that if a formula is logically valid then there is a finite deduction of the formula.
5.
Logical NOR
–
In Boolean logic, logical NOR or joint denial is a truth-functional operator which produces a result that is the negation of logical OR. That is, a sentence of the form "p NOR q" is true precisely when neither p nor q is true, i.e. when both p and q are false. In grammar, "nor" is a coordinating conjunction. As with its dual, the NAND operator, NOR can be used by itself, without any other logical operator, to constitute a logical formal system. It is also known as Quine's dagger, and Peirce called it the ampheck. One way of expressing p NOR q is ¬(p ∨ q), conventionally written by placing a bar over p ∨ q, where the symbol ∨ signifies OR and the bar signifies the negation of the expression under it; other ways of expressing p NOR q are Xpq and a bar over p + q. The computer used in the spacecraft that first carried humans to the moon, the Apollo Guidance Computer, was constructed entirely from NOR gates with three inputs. The NOR operation is an operation on two logical values, typically the values of two propositions, that produces a value of true if and only if both operands are false; in other words, it produces a value of false if at least one operand is true. The truth table of A NOR B: T NOR T = F, T NOR F = F, F NOR T = F, F NOR F = T. Logical NOR does not possess any of the five qualities (truth-preserving, falsehood-preserving, linearity, monotonicity, self-duality) required to be absent from at least one member of a set of functionally complete operators; thus, the set containing only NOR suffices as a complete set. NOR has the interesting feature that all other logical operators can be expressed by interlaced NOR operations; the logical NAND operator also has this ability. The logical NOR ↓ is the negation of the disjunction. Expressed in terms of ↓, the usual operators of propositional logic are: ¬p = p ↓ p; p ∨ q = (p ↓ q) ↓ (p ↓ q); p ∧ q = (p ↓ p) ↓ (q ↓ q); and p → q = ((p ↓ p) ↓ q) ↓ ((p ↓ p) ↓ q).
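The functional completeness claimed above can be checked mechanically; this is a small sketch, with the derived definitions taken from the NOR identities listed in the text:

```python
def NOR(p: bool, q: bool) -> bool:
    return not (p or q)

# Every other connective built from interlaced NOR operations alone.
def NOT(p):        return NOR(p, p)
def OR(p, q):      return NOR(NOR(p, q), NOR(p, q))
def AND(p, q):     return NOR(NOR(p, p), NOR(q, q))
def IMPLIES(p, q): return NOR(NOR(NOR(p, p), q), NOR(NOR(p, p), q))

# Verify against the built-in operators over every truth assignment.
for p in (False, True):
    for q in (False, True):
        assert NOT(p) == (not p)
        assert OR(p, q) == (p or q)
        assert AND(p, q) == (p and q)
        assert IMPLIES(p, q) == ((not p) or q)
```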
6.
Material nonimplication
–
Material nonimplication or abjunction is the negation of material implication. That is to say, for any two propositions P and Q, the material nonimplication from P to Q is true if and only if the material implication from P to Q is false; this is more naturally stated as: the material nonimplication from P to Q is true if and only if P is true and Q is false. The symbol for material nonimplication is simply a crossed-out material implication symbol (↛). In computing, the corresponding bitwise operation is A & ~B and the corresponding logical operation is A && !B.
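A short sketch of the truth condition just stated, together with its bitwise analogue (the byte-width mask here is an illustrative assumption):

```python
# Material nonimplication: true exactly when P is true and Q is false.
def nonimplication(p: bool, q: bool) -> bool:
    return p and not q

# Bitwise analogue, A & ~B, applied per bit of one byte.
def bitwise_nonimplication(a: int, b: int) -> int:
    return a & ~b & 0xFF

assert nonimplication(True, False) is True
assert bitwise_nonimplication(0b1100, 0b1010) == 0b0100
```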
7.
Exclusive or
–
Exclusive or, or exclusive disjunction, is a logical operation that outputs true only when its inputs differ. It is symbolized by the prefix operator J and by the infix operators XOR, EOR, EXOR, ⊻, ⊕, and ↮. The negation of XOR is the logical biconditional, which outputs true only when both inputs are the same. It gains the name "exclusive or" because the meaning of "or" is ambiguous when both operands are true; the exclusive or operator excludes that case. This is sometimes thought of as "one or the other but not both", which could be written as "A or B, but not, A and B". More generally, XOR is true only when an odd number of inputs are true: a chain of XORs (a XOR b XOR c XOR d) is true whenever an odd number of the inputs are true and is false whenever an even number of inputs are true. The truth table of A XOR B shows that it outputs true whenever the inputs differ: T XOR T = F, T XOR F = T, F XOR T = T, F XOR F = F. Exclusive disjunction essentially means "either one, but not both"; in other words, the statement is true if and only if one operand is true and the other is false. For example, if two horses are racing, then one of the two will win the race, but not both of them. The exclusive or is equivalent to the negation of a logical biconditional, by the rules of material implication. The systems of logical values under OR and under AND are monoids but not groups, which unfortunately prevents combining those two systems into larger structures, such as a mathematical ring. However, the system using exclusive or is an abelian group, and the combination of the operators ∧ and ⊕ over the elements {T, F} produces the well-known two-element field F2. This field can represent any logic obtainable with the system and has the benefit of the arsenal of algebraic analysis tools for fields. The Oxford English Dictionary explains "either ... or" as follows: the primary function of either, etc., is to emphasize the perfect indifference of the two things or courses, but a secondary function is to emphasize the mutual exclusiveness: either of the two, but not both. The exclusive-or explicitly states "one or the other, but not neither nor both".
Following this kind of common-sense intuition about "or", it is sometimes argued that in many natural languages, English included, the word "or" has an exclusive sense. The exclusive disjunction of a pair of propositions p, q is supposed to mean that p is true or q is true, but not both. For example, it might be argued that the intention of a statement like "You may have coffee, or you may have tea" is to forbid taking both. Certainly under some circumstances a sentence like this example should be taken as forbidding the possibility of accepting both options. Even so, there is reason to suppose that this sort of sentence is not disjunctive at all.
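The parity rule for chained XORs described earlier, and XOR's role as addition in the field F2, can be sketched as follows:

```python
from functools import reduce
from operator import xor

# A chain of XORs is true exactly when an odd number of inputs are true.
def xor_chain(*inputs: bool) -> bool:
    return reduce(xor, inputs, False)

assert xor_chain(True, True, True) is True    # three true inputs: odd
assert xor_chain(True, True, False) is False  # two true inputs: even

# In the two-element field F2, XOR is addition mod 2 and AND is
# multiplication mod 2.
for a in (0, 1):
    for b in (0, 1):
        assert (a ^ b) == (a + b) % 2
        assert (a & b) == (a * b) % 2
```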
8.
Logical conjunction
–
In logic and mathematics, "and" is the truth-functional operator of logical conjunction; the "and" of a set of operands is true if and only if all of its operands are true. The logical connective that represents this operator is typically written as ∧ or ⋅. "A and B" is true only if A is true and B is true. An operand of a conjunction is a conjunct. Related concepts in other fields are: in natural language, the coordinating conjunction "and"; in programming languages, the short-circuit "and" control structure. "And" is usually denoted by an infix operator: in mathematics and logic, ∧, ⋅, or ×; in electronics, ⋅; and in programming languages, &, &&, or and. In Jan Łukasiewicz's prefix notation for logic, the operator is K. Logical conjunction is an operation on two logical values, typically the values of two propositions, that produces a value of true if and only if both of its operands are true. The conjunctive identity is 1 (true), which is to say that AND-ing an expression with 1 will never change the value of the expression. In keeping with the concept of vacuous truth, when conjunction is defined as an operator or function of arbitrary arity, the empty conjunction is often defined as having the result true. The truth table of A ∧ B: T ∧ T = T, T ∧ F = F, F ∧ T = F, F ∧ F = F. As a rule of inference, conjunction introduction is a classically valid, simple argument form. The argument form has two premises, A and B; intuitively, it permits the inference of their conjunction: A, B; therefore, A and B. In logical operator notation: A, B ⊢ A ∧ B. Here is an example of an argument that fits the form conjunction introduction: Bob likes apples. Bob likes oranges. Therefore, Bob likes apples and oranges. Conjunction elimination is another classically valid, simple argument form; intuitively, it permits the inference from any conjunction of either element of that conjunction: A and B; therefore, A. Or alternately: A and B; therefore, B. In logical operator notation: A ∧ B ⊢ A, and A ∧ B ⊢ B. Conjunction is falsehood-preserving: when all inputs are false, the output is false. If using binary values for true and false, then logical conjunction works exactly like normal arithmetic multiplication.
Many languages also provide short-circuit control structures corresponding to logical conjunction. Logical conjunction is often used for bitwise operations, where 0 corresponds to false and 1 to true: 0 AND 0 = 0; 0 AND 1 = 0; 1 AND 0 = 0; 1 AND 1 = 1. The operation can also be applied to two binary words viewed as bitstrings of equal length, by taking the bitwise AND of each pair of bits at corresponding positions. For example, 11000110 AND 10100011 = 10000010. This can be used to select part of a bitstring using a bit mask; for example, 10011101 AND 00001000 = 00001000 extracts the fifth bit of an 8-bit bitstring.
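The two bitwise examples above can be reproduced directly; the variable names are illustrative:

```python
# Bitwise AND of two 8-bit words, position by position.
a = 0b11000110
b = 0b10100011
assert a & b == 0b10000010

# Selecting part of a bitstring with a bit mask: the mask keeps only
# the fifth bit (from the left) of the 8-bit value.
x = 0b10011101
mask = 0b00001000
assert x & mask == 0b00001000

print(format(a & b, "08b"))  # prints 10000010
```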
9.
Logical biconditional
–
In logic and mathematics, the logical biconditional is the logical connective of two statements asserting "p if and only if q", where p is an antecedent and q is a consequent. This is often abbreviated "p iff q". The operator is denoted using a doubleheaded arrow (↔ or ⇔), a prefixed E, an equality sign (=), an equivalence sign (≡), or EQV. It is logically equivalent to (p → q) ∧ (q → p), and hence to the XNOR boolean operator; it is also logically equivalent to "both or neither". The only difference from the material conditional is the case when the hypothesis is false but the conclusion is true: in that case, the conditional is true, yet the biconditional is false. In the conceptual interpretation, a = b means "all a's are b's and all b's are a's"; in other words, the two concepts have the same extension. This does not mean that the concepts have the same meaning. Examples: "triangle" and "trilateral", "equiangular trilateral" and "equilateral triangle". The antecedent is the subject and the consequent is the predicate of a universal affirmative proposition. In the propositional interpretation, a ⇔ b means that a implies b and b implies a; in other words, the propositions are equivalent. This does not mean that they have the same meaning. Example: "The triangle ABC has two equal sides" and "The triangle ABC has two equal angles". The antecedent is the premise or the cause and the consequent is the consequence; when an implication is translated by a hypothetical judgment, the antecedent is called the hypothesis and the consequent is called the thesis. A common way of demonstrating a biconditional is to use its equivalence to the conjunction of two converse conditionals, demonstrating these separately. When both members of the biconditional are propositions, it can be separated into two conditionals, of which one is called a theorem and the other its reciprocal.
Thus whenever a theorem and its reciprocal are true, we have a biconditional. A simple theorem gives rise to an implication whose antecedent is the hypothesis and whose consequent is the thesis of the theorem. When a theorem and its reciprocal are true, we say that its hypothesis is the necessary and sufficient condition of the thesis; that is to say, the hypothesis is at the same time both cause and consequence. Logical equality is an operation on two values, typically the values of two propositions, that produces a value of true if and only if both operands are false or both operands are true. The truth table for A ↔ B: T ↔ T = T, T ↔ F = F, F ↔ T = F, F ↔ F = T. More than two statements combined by ↔ are ambiguous: x1 ↔ x2 ↔ x3 ↔ ... ↔ xn may be meant as (((x1 ↔ x2) ↔ x3) ↔ ...) ↔ xn, or may be used to say that all xi are together true or together false. As for its properties: commutativity, yes; associativity, yes; distributivity, the biconditional does not distribute over any binary function, but logical disjunction distributes over the biconditional; idempotency, no; monotonicity, no; truth-preserving, yes (when all inputs are true, the output is true); falsehood-preserving, no (when all inputs are false, the output is not false). Like all connectives in first-order logic, the biconditional is truth-functional. Biconditional introduction allows one to infer that, if B follows from A, and A follows from B, then A if and only if B.
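The ambiguity of chained biconditionals noted above can be made concrete: the left-associated reading is true exactly when an even number of inputs are false, which differs from "all together true or together false". A small sketch (function names are illustrative):

```python
# Left-associated chain: ((x1 <-> x2) <-> x3) <-> ...
def chained_iff(*xs: bool) -> bool:
    result = xs[0]
    for x in xs[1:]:
        result = (result == x)   # biconditional of two truth values
    return result

# The other reading: all inputs together true or together false.
def all_equal(*xs: bool) -> bool:
    return all(xs) or not any(xs)

# Three all-false inputs: the two readings disagree.
assert chained_iff(False, False, False) is False  # odd number of False inputs
assert all_equal(False, False, False) is True
```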
10.
Logical disjunction
–
In logic and mathematics, "or" is the truth-functional operator of disjunction, also known as alternation; the "or" of a set of operands is true if and only if one or more of its operands is true. The logical connective that represents this operator is typically written as ∨ or +. "A or B" is true if A is true, or if B is true, or if both A and B are true. In logic, "or" by itself means the inclusive or, distinguished from an exclusive or. An operand of a disjunction is called a disjunct. Related concepts in other fields are: in natural language, the coordinating conjunction "or"; in programming languages, the short-circuit "or" control structure. "Or" is usually expressed with an infix operator: in mathematics and logic, ∨; in electronics, +; and in most programming languages, |, ||, or or. In Jan Łukasiewicz's prefix notation for logic, the operator is A. Logical disjunction is an operation on two logical values, typically the values of two propositions, that has a value of false if and only if both of its operands are false. More generally, a disjunction is a formula that can have one or more literals separated only by "or"s; a single literal is often considered to be a degenerate disjunction. The disjunctive identity is false, which is to say that the "or" of an expression with false has the same value as the original expression. In keeping with the concept of vacuous truth, when disjunction is defined as an operator or function of arbitrary arity, the empty disjunction is often defined as having the result false. Disjunction is falsehood-preserving: the interpretation under which all variables are assigned a value of false produces a truth value of false as a result of disjunction. The mathematical symbol for logical disjunction varies in the literature: in addition to the word "or" and the formula Apq, the symbol ∨, deriving from the Latin word vel, is commonly used for disjunction. For example, "A ∨ B" is read as "A or B"; such a disjunction is false if both A and B are false, and true in all other cases. All of the following are disjunctions: A ∨ B; ¬A ∨ B; A ∨ ¬B ∨ ¬C ∨ D ∨ ¬E.
The corresponding operation in set theory is the set-theoretic union. Operators corresponding to logical disjunction exist in most programming languages. Disjunction is often used for bitwise operations; for example, x = x | 0b00000001 will force the final bit to 1 while leaving other bits unchanged. Logical disjunction is usually short-circuited: if the first operand evaluates to true, then the second operand is not evaluated; the logical disjunction operator thus usually constitutes a sequence point. In a parallel language, it is possible to evaluate both sides in parallel, and if one terminates with value true, the other can be interrupted.
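Both behaviours described above, the bit-forcing OR and short-circuit evaluation, can be demonstrated directly; the helper name expensive is illustrative:

```python
# Bitwise OR with a mask forces the final bit to 1, leaving others unchanged.
x = 0b10010100
x = x | 0b00000001
assert x == 0b10010101

# Short-circuit: when the first operand is true, the second is never evaluated.
def expensive() -> bool:
    raise RuntimeError("should not be evaluated")

assert (True or expensive()) is True
```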