1.
Propositional calculus
–
Logical connectives are found in natural languages; in English, for example, some examples are "and", "or", and "not". The following is a very simple inference within the scope of propositional logic. Premise 1: If it's raining, then it's cloudy. Premise 2: It's raining. Conclusion: It's cloudy. Both premises and the conclusion are propositions. The premises are taken for granted, and with the application of modus ponens the conclusion follows. Moreover, the same holds for any other inference of this form. Propositional logic may be studied through a formal system in which formulas of a formal language may be interpreted to represent propositions. A system of rules and axioms allows certain formulas to be derived; these derived formulas are called theorems and may be interpreted to be true propositions. A constructed sequence of such formulas is known as a derivation or proof, and the last formula of the sequence is the theorem. The derivation may be interpreted as a proof of the proposition represented by the theorem. When a formal system is used to represent formal logic, only statement letters are represented directly. In truth-functional propositional logic, formulas are interpreted as having either a truth value of true or a truth value of false. Truth-functional propositional logic, and systems isomorphic to it, are considered to be zeroth-order logic. Although propositional logic had been hinted at by earlier philosophers, it was developed into a formal logic by Chrysippus in the 3rd century BC and expanded by his successor Stoics. Stoic logic was focused on propositions, an advancement different from the traditional syllogistic logic, which was focused on terms. Later in antiquity, however, the propositional logic developed by the Stoics was no longer understood, and the system was essentially reinvented by Peter Abelard in the 12th century.
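The truth-functional reading described above can be made concrete with a small brute-force check. The sketch below (the helper name `implies` is illustrative, not from any standard library) verifies that modus ponens is valid: in every valuation in which both P → Q and P are true, Q is true as well.

```python
from itertools import product

def implies(p, q):
    """Material implication: false only when p is true and q is false."""
    return (not p) or q

# Modus ponens is valid iff no valuation makes both premises true
# while the conclusion is false.
valid = all(
    q
    for p, q in product([True, False], repeat=2)
    if implies(p, q) and p
)
print(valid)  # True: the inference holds in every interpretation
```

Exhaustive evaluation of valuations is exactly what makes truth-functional propositional logic decidable.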
Propositional logic was eventually refined using symbolic logic. The 17th/18th-century mathematician Gottfried Leibniz has been credited with being the founder of symbolic logic for his work with the calculus ratiocinator. Although his work was the first of its kind, it was unknown to the larger logical community; consequently, many of the advances achieved by Leibniz were recreated by logicians like George Boole and Augustus De Morgan completely independently of him. Just as propositional logic can be considered an advancement from the earlier syllogistic logic, one author describes predicate logic as combining the distinctive features of syllogistic logic and propositional logic. Predicate logic thus ushered in a new era in the history of logic; however, advances in propositional logic were still made after Frege, including natural deduction, invented by Gerhard Gentzen and Jan Łukasiewicz, and truth trees, invented by Evert Willem Beth. The invention of truth tables, however, is of controversial attribution: ideas influential to the invention of truth tables appear in works by Frege and Bertrand Russell, but the actual tabular structure itself is credited to either Ludwig Wittgenstein or Emil Post.
2.
Law of excluded middle
–
In logic, the law of excluded middle is the third of the three classic laws of thought. It states that for any proposition, either that proposition is true or its negation is true. The law is also known as the law of the excluded third, in Latin principium tertii exclusi. Another Latin designation for this law is tertium non datur: "no third possibility is given". The principle was stated as a theorem of propositional logic by Russell and Whitehead in Principia Mathematica as ∗2.11. The principle should not be confused with the principle of bivalence. The principle of excluded middle, along with its complement, the law of contradiction, are correlates of the law of identity. Some systems of logic have different but analogous laws: for some finite n-valued logics, there is an analogous law called the law of excluded (n+1)th. If negation is cyclic and "∨" is a max operator, then the law can be expressed in the language, and it is easy to check that the sentence must receive at least one of the n truth values. Other systems reject the law entirely. For example, if P is the proposition "Socrates is mortal", then the law of excluded middle holds that the logical disjunction "Either Socrates is mortal, or it is not the case that Socrates is mortal" is true by virtue of its form alone. That is, the position that Socrates is neither mortal nor not-mortal is excluded by logic. An example of an argument that depends on the law of excluded middle follows: we seek to prove that there exist two irrational numbers a and b such that a^b is rational. It is known that √2 is irrational; consider the number √2^√2. Clearly this number is either rational or irrational. If it is rational, the proof is complete, with a = √2 and b = √2. But if √2^√2 is irrational, then let a = √2^√2 and b = √2. Then a^b = (√2^√2)^√2 = √2^(√2·√2) = √2^2 = 2, which is rational. In the above argument, the assertion "this number is either rational or irrational" invokes the law of excluded middle.
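The argument above is non-constructive: it never settles which pair (a, b) works. A purely numerical illustration (floating point, so not a proof of anything) shows the algebraic identity it relies on, namely that (√2^√2)^√2 equals 2.

```python
import math

sqrt2 = math.sqrt(2)
# The proof considers a = sqrt(2)**sqrt(2) and b = sqrt(2);
# then a**b = sqrt(2)**(sqrt(2)*sqrt(2)) = sqrt(2)**2 = 2.
a = sqrt2 ** sqrt2
result = a ** sqrt2
print(round(result, 10))  # 2.0, up to floating-point rounding
```

The computation confirms the identity numerically; deciding whether √2^√2 itself is rational is exactly the step the law of excluded middle lets the classical proof skip.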
An intuitionist, for example, would not accept this argument without further support for that statement. This might come in the form of a proof that the number in question is in fact irrational, or a finite algorithm that could determine whether the number is rational. By "non-constructive" Davis means that "a proof that there actually are mathematic entities satisfying certain conditions would not have to provide a method to exhibit explicitly the entities in question". For example, to prove "there exists an n such that P(n)", under both classical and intuitionistic logic one may proceed by reductio ad absurdum, which gives "not for all n, not P(n)". Classical logic allows this result to be transformed into "there exists an n such that P(n)"; intuitionistic logic does not. Indeed, David Hilbert and Luitzen E. J. Brouwer both give examples of the law of excluded middle extended to the infinite.
3.
Law of noncontradiction
–
In classical logic, the law of non-contradiction is the second of the three laws of thought. It states that contradictory statements cannot both be true in the same sense at the same time; e.g., the two propositions "A is B" and "A is not B" are mutually exclusive. The principle was stated as a theorem of propositional logic by Russell and Whitehead in Principia Mathematica. Together with the law of excluded middle it forms a dichotomy: the law of non-contradiction is merely an expression of the mutually exclusive aspect of that dichotomy, and the law of excluded middle an expression of its jointly exhaustive aspect. One difficulty in applying the law of non-contradiction is ambiguity in the propositions. For instance, if time is not explicitly specified as part of the propositions A and B, then A may be B at one time and not at another. A and B may in some cases be made to sound mutually exclusive linguistically even though A may be partly B and partly not B at the same time. However, it is impossible to predicate of the same thing, at the same time, and in the same sense, both the presence and the absence of a quality. According to both Plato and Aristotle, Heraclitus was said to have denied the law of non-contradiction, and this is quite likely if, as Plato pointed out, the law of non-contradiction does not hold for changing things in the world. If a philosophy of Becoming is not possible without change, then what is to become must already exist in the present object. Unfortunately, so little remains of Heraclitus' aphorisms that not much about his philosophy can be said with certainty. "The road up and down are one and the same" implies either that the road leads both ways or that there can be no road at all, and this is the logical complement of the law of non-contradiction. According to Heraclitus, change, and the constant conflict of opposites, is the logos of nature.
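In propositional form the law says that ¬(A ∧ ¬A) holds under every valuation. A one-line exhaustive check makes this concrete:

```python
# The law of non-contradiction: not (A and not A)
# is true for every truth value of A.
holds = all(not (a and not a) for a in (True, False))
print(holds)  # True
```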
Personal subjective perceptions or judgments can only be said to be true at the same time and in the same respect. The most famous saying of Protagoras is: "Man is the measure of all things: of things which are, that they are, and of things which are not, that they are not." However, Protagoras was referring to things that are used by, or in some way related to, humans, and this makes a difference in the meaning of his aphorism. Properties, social entities, ideas, feelings, and judgments originate in the human mind; however, Protagoras never suggested that man must be the measure of the stars or of the motion of the stars. Parmenides employed an ontological version of the law of non-contradiction to prove that being is, and to deny the void and change; he also similarly disproved contrary propositions. He stated his case in his poem On Nature. The nature of the "is" or what-is in Parmenides is a contentious subject.
4.
First-order logic
–
First-order logic – also known as first-order predicate calculus and predicate logic – is a collection of formal systems used in mathematics, philosophy, linguistics, and computer science. First-order logic uses quantified variables over a domain of discourse; this distinguishes it from propositional logic, which does not use quantifiers. Sometimes "theory" is understood in a more formal sense, as just a set of sentences in first-order logic. In first-order theories, predicates are associated with sets; in interpreted higher-order theories, predicates may be interpreted as sets of sets. There are many deductive systems for first-order logic which are both sound and complete. Although the logical consequence relation is only semidecidable, much progress has been made in automated theorem proving in first-order logic. First-order logic also satisfies several metalogical theorems that make it amenable to analysis in proof theory, such as the Löwenheim–Skolem theorem. First-order logic is the standard for the formalization of mathematics into axioms and is studied in the foundations of mathematics. Peano arithmetic and Zermelo–Fraenkel set theory are first-order axiomatizations of number theory and set theory, respectively. No first-order theory, however, has the strength to uniquely describe a structure with an infinite domain, such as the natural numbers or the real line. Axiom systems that do fully describe these two structures can be obtained in stronger logics such as second-order logic. For a history of first-order logic and how it came to dominate formal logic, see José Ferreirós. While propositional logic deals with simple declarative propositions, first-order logic additionally covers predicates and quantification. A predicate takes an entity or entities in the domain of discourse as input and outputs either True or False. Consider the two sentences "Socrates is a philosopher" and "Plato is a philosopher". In propositional logic, these sentences are viewed as being unrelated and might be denoted, for example, by variables such as p and q.
The predicate "is a philosopher" occurs in both sentences, which have a common structure of "a is a philosopher". The variable a is instantiated as "Socrates" in the first sentence and as "Plato" in the second. While first-order logic allows for the use of predicates, such as "is a philosopher" in this example, propositional logic does not. Relationships between predicates can be stated using logical connectives. Consider, for example, the first-order formula "if a is a philosopher, then a is a scholar". This formula is a conditional statement with "a is a philosopher" as its hypothesis and "a is a scholar" as its conclusion. The truth of this formula depends on which object is denoted by a. Quantifiers can be applied to variables in a formula. The variable a in the previous formula can be universally quantified, for instance, with the first-order sentence "For every a, if a is a philosopher, then a is a scholar". The universal quantifier "for every" in this sentence expresses the idea that the claim "if a is a philosopher, then a is a scholar" holds for all choices of a. The negation of the sentence "For every a, if a is a philosopher, then a is a scholar" is logically equivalent to the sentence "There exists a such that a is a philosopher and a is not a scholar".
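Over a finite domain, the quantifier equivalence at the end of this passage can be checked mechanically. The sketch below uses an invented toy domain (the names and set memberships are illustrative, not from the source) and confirms that negating the universal claim is the same as asserting a counterexample.

```python
# A toy finite domain; the sets stand in for the predicates
# "is a philosopher" and "is a scholar".
domain = ["socrates", "plato", "rex"]
philosopher = {"socrates", "plato", "rex"}
scholar = {"socrates", "plato"}  # rex is a philosopher but not a scholar

# For every a: philosopher(a) -> scholar(a)
forall = all((a not in philosopher) or (a in scholar) for a in domain)
# There exists a: philosopher(a) and not scholar(a)
exists_counter = any((a in philosopher) and (a not in scholar) for a in domain)

print((not forall) == exists_counter)  # True: the two claims coincide
```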
5.
Monotonicity of entailment
–
Monotonicity of entailment is a property of many logical systems that states that the hypotheses of any derived fact may be freely extended with additional assumptions. Logical systems with this property are occasionally called monotonic logics in order to differentiate them from non-monotonic logics. To illustrate, starting from the natural deduction sequent Γ ⊢ C, weakening allows one to conclude Γ, A ⊢ C. For example, a valid argument can be weakened by adding a premise such as "All men are mortal"; the validity of the conclusion is not changed by the addition of premises. In most logics, weakening is either an inference rule or a metatheorem if the logic doesn't have an explicit rule. Notable exceptions are: strict logic or relevant logic, where every hypothesis must be necessary for the conclusion; linear logic, which disallows arbitrary contraction in addition to arbitrary weakening; bunched implications, where weakening is restricted to additive composition; abductive reasoning, the process of deriving the most likely explanations of the known facts; and reasoning about knowledge, where statements specifying that something is not known need to be retracted when that thing is learned. See also: Contraction, Exchange rule, Substructural logic, No-cloning theorem.
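Semantically, weakening can be verified by brute force: if every valuation satisfying Γ satisfies C, that remains true after throwing in an extra premise. The sketch below (the helper `entails` and the encoding of formulas as functions on valuations are my own) checks this for Γ = {P, P → Q} and C = Q.

```python
from itertools import product

def entails(premises, conclusion, n_vars):
    """Gamma |- C holds iff every valuation satisfying all premises
    also satisfies the conclusion (brute force over n_vars atoms)."""
    return all(
        conclusion(v)
        for v in product([True, False], repeat=n_vars)
        if all(p(v) for p in premises)
    )

# Gamma = {P, P -> Q}; conclusion C = Q, over atoms (P, Q, A).
gamma = [lambda v: v[0], lambda v: (not v[0]) or v[1]]
concl = lambda v: v[1]

print(entails(gamma, concl, 3))                       # True
# Weakening: adding an arbitrary unused premise A preserves entailment.
print(entails(gamma + [lambda v: v[2]], concl, 3))    # True
```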
6.
Tautology (logic)
–
In logic, a tautology is a formula that is true in every possible interpretation. Philosopher Ludwig Wittgenstein first applied the term to redundancies of propositional logic in 1921. A formula is satisfiable if it is true under at least one interpretation, and thus a tautology is a formula whose negation is unsatisfiable. Unsatisfiable statements are known formally as contradictions, and a formula that is neither a tautology nor a contradiction is said to be logically contingent. Such a formula can be made either true or false based on the values assigned to its propositional variables. The double turnstile notation ⊨ S is used to indicate that S is a tautology. Tautology is sometimes symbolized by "Vpq", and contradiction by "Opq". Tautologies are a key concept in propositional logic, where a tautology is defined as a propositional formula that is true under any possible Boolean valuation of its propositional variables. A key property of tautologies in propositional logic is that an effective method exists for testing whether a given formula is always satisfied. The definition of tautology can be extended to sentences in predicate logic. In propositional logic, there is no distinction between a tautology and a logically valid formula; in predicate logic, the set of tautologies is a proper subset of the set of logically valid sentences. In 1800, Immanuel Kant wrote in his book Logic: "The identity of concepts in analytical judgments can be explicit or non-explicit. In the former case analytic propositions are tautological." Here "analytic proposition" refers to an analytic truth, a statement in natural language that is true solely because of the terms involved. In 1884, Gottlob Frege proposed in his Grundlagen that a truth is analytic if it can be derived using logic.
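The effective testing method mentioned above is simply exhaustive evaluation. A minimal tautology checker (the function name and formula encoding are my own) can be sketched as:

```python
from itertools import product

def is_tautology(formula, n_vars):
    """True iff the formula is true under every Boolean valuation."""
    return all(formula(*v) for v in product([True, False], repeat=n_vars))

# Law of excluded middle: p or not p, a tautology.
print(is_tautology(lambda p: p or not p, 1))       # True
# p -> q is contingent, not a tautology.
print(is_tautology(lambda p, q: (not p) or q, 2))  # False
```

The cost is exponential in the number of variables, but for propositional logic the procedure always terminates, which is exactly the decidability property the text describes.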
But he maintained a distinction between analytic truths and tautologies. In 1921, in his Tractatus Logico-Philosophicus, Ludwig Wittgenstein proposed that statements that can be deduced by logical deduction are tautological, as well as being analytic truths. Henri Poincaré had made similar remarks in Science and Hypothesis in 1905. Russell later described a proposition of logic as something that "has got to be something that has some quality, which I do not know how to define". Here "logical proposition" refers to a proposition that is provable using the laws of logic. During the 1930s, the formalization of the semantics of propositional logic in terms of truth assignments was developed, and the term "tautology" began to be applied to those propositional formulas that are true regardless of the truth or falsity of their propositional variables. Some early books on logic used the term for any proposition that is universally valid. Propositional logic begins with propositional variables, atomic units that represent concrete propositions.
7.
Boolean algebra
–
In mathematics and mathematical logic, Boolean algebra is the branch of algebra in which the values of the variables are the truth values true and false, usually denoted 1 and 0 respectively. It is thus a formalism for describing logical relations in the same way that ordinary algebra describes numeric relations. Boolean algebra was introduced by George Boole in his first book The Mathematical Analysis of Logic; according to Huntington, the term "Boolean algebra" was first suggested by Sheffer in 1913. Boolean algebra has been fundamental in the development of digital electronics, and it is also used in set theory and statistics. Boole's algebra predated the modern developments in abstract algebra and mathematical logic. In an abstract setting, Boolean algebra was perfected in the late 19th century by Jevons, Schröder, and Huntington; in fact, M. H. Stone proved in 1936 that every Boolean algebra is isomorphic to a field of sets. Shannon already had at his disposal this abstract mathematical apparatus, and thus he cast his switching algebra as the two-element Boolean algebra. In circuit engineering settings today there is little need to consider other Boolean algebras, so "switching algebra" and "Boolean algebra" are often used interchangeably. Efficient implementation of Boolean functions is a fundamental problem in the design of combinational logic circuits. Logic sentences that can be expressed in classical propositional calculus have an equivalent expression in Boolean algebra; thus, Boolean logic is sometimes used to denote propositional calculus performed in this way. Boolean algebra is not sufficient to capture logic formulas using quantifiers. The closely related model of computation known as a Boolean circuit relates time complexity to circuit complexity. Whereas in elementary algebra expressions denote mainly numbers, in Boolean algebra they denote the truth values false and true; these values are represented with the bits 0 and 1.
Addition and multiplication then play the Boolean roles of XOR and AND respectively. Boolean algebra also deals with functions which have their values in the set {0, 1}. A sequence of bits is a commonly used such function. Another common example is the subsets of a set E: to a subset F of E is associated the indicator function that takes the value 1 on F and 0 outside F. The most general example is the elements of a Boolean algebra. As with elementary algebra, the purely equational part of the theory may be developed without considering explicit values for the variables. The basic operations of Boolean calculus are as follows: AND, denoted x∧y, satisfies x∧y = 1 if x = y = 1 and x∧y = 0 otherwise; OR, denoted x∨y, satisfies x∨y = 0 if x = y = 0 and x∨y = 1 otherwise; NOT, denoted ¬x, satisfies ¬x = 0 if x = 1 and ¬x = 1 if x = 0. Alternatively, the values of x∧y, x∨y, and ¬x can be expressed by tabulating their values with truth tables. The derived operation x → y, or Cxy, is called material implication: if x is true, then the value of x → y is taken to be that of y; if x is false, then the value of x → y is true.
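The basic operations and the derived implication can be tabulated directly on the bits 0 and 1. A minimal sketch, expressing x → y as (¬x) ∨ y:

```python
# Basic Boolean operations on the bits 0 and 1, with material
# implication x -> y derived as (not x) or y.
rows = []
for x in (0, 1):
    for y in (0, 1):
        rows.append((x, y, x & y, x | y, 1 - x, (1 - x) | y))

print("x y AND OR NOT(x) x->y")
for row in rows:
    print(" ".join(str(v) for v in row))
```

Note that the only row in which x → y is 0 is x = 1, y = 0, matching the definition of material implication above.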
8.
Contradiction
–
In classical logic, a contradiction consists of a logical incompatibility between two or more propositions. It occurs when the propositions, taken together, yield two conclusions which form the logical, usually opposite, inversions of each other. Illustrating a general tendency in applied logic, Aristotle's law of noncontradiction states that one cannot say of something that it is and that it is not, in the same respect and at the same time. By extension, outside of classical logic, one can speak of contradictions between actions when one presumes that their motives contradict each other. By creation of a paradox, Plato's Euthydemus dialogue demonstrates the need for the notion of contradiction: in the ensuing dialogue Dionysodorus denies the existence of contradiction, all the while Socrates is contradicting him. "I in my astonishment said: What do you mean Dionysodorus?" The dictum is that there is no such thing as a falsehood; a man must either say what is true or say nothing. Indeed, Dionysodorus agrees that there is no such thing as false opinion and no such thing as ignorance, and demands of Socrates to "Refute me". Socrates responds, "But how can I refute you, if, as you say, to tell a falsehood is impossible?" Note: the symbol ⊥ represents an arbitrary contradiction, with the dual tee symbol ⊤ used to denote an arbitrary tautology. Contradiction is sometimes symbolized by "Opq", and tautology by "Vpq". The turnstile symbol ⊢ is often read as "yields" or "proves". In classical logic, particularly in propositional and first-order logic, a proposition φ is a contradiction if and only if ⊥ can be derived from it. Since for contradictory φ it is true that ⊢ φ → ψ for all ψ, one may prove any proposition from a set of axioms which contains contradictions. This is called the principle of explosion, or ex falso quodlibet. In a complete logic, a formula is contradictory if and only if it is unsatisfiable. Therefore, a proof that ¬φ ⊢ ⊥ also proves that φ is true. The use of this fact constitutes the technique of proof by contradiction, which mathematicians use extensively.
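The principle of explosion has a simple semantic reading: a contradiction entails any ψ vacuously, because no valuation satisfies it. A brute-force sketch, taking φ = p ∧ ¬p and an arbitrary ψ = q:

```python
from itertools import product

# phi = p and not p (a contradiction); psi = q (arbitrary).
# Entailment holds vacuously: the filter admits no valuation at all,
# so "every valuation satisfying phi also satisfies psi" is true.
explosion = all(
    q
    for p, q in product([True, False], repeat=2)
    if p and not p
)
print(explosion)  # True: ex falso quodlibet
```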
This applies only in a logic using the excluded middle A ∨ ¬A as an axiom. In mathematics, the symbol used to represent a contradiction within a proof varies. A consistency proof requires an axiomatic system and a demonstration that it is not the case that both the formula p and its negation ~p can be derived in the system. Post's solution to the problem is described in the demonstration "An Example of a Successful Absolute Proof of Consistency" offered by Ernest Nagel and James R. Newman, and they too observe a problem with respect to the notion of contradiction with its usual truth values of truth and falsity. They observe that the property of being a tautology has been defined in notions of truth and falsity, yet these notions obviously involve a reference to something outside the formula calculus; therefore, the procedure mentioned in the text in effect offers an interpretation of the calculus. This being so, the authors have not done what they promised. Proofs of consistency which are based on models, and which argue from the truth of axioms to their consistency, merely shift the problem: given some primitive formulas such as PM's primitives S1 V S2, what will be the definition of "tautologous"?
9.
Logical NOR
–
In Boolean logic, logical NOR or joint denial is a truth-functional operator which produces a result that is the negation of logical OR. That is, a sentence of the form (p NOR q) is true precisely when neither p nor q is true, i.e. when both p and q are false. In grammar, "nor" is a coordinating conjunction. As with its dual, the NAND operator, NOR can be used by itself, without any other logical operator, to constitute a logical formal system. It is also known as Quine's dagger, or the ampheck (Peirce's term). One way of expressing p NOR q is ¬(p ∨ q), where the symbol ∨ signifies OR and ¬ signifies negation; in the overbar notation, the bar signifies the negation of the expression under it. Other ways of expressing p NOR q are Xpq and p + q with an overbar. The computer used in the spacecraft that first carried humans to the moon, the Apollo Guidance Computer, was constructed entirely from NOR gates with three inputs. The NOR operation is an operation on two logical values, typically the values of two propositions, that produces a value of true if and only if both operands are false; in other words, it produces a value of false if at least one operand is true. The truth table of A NOR B reflects this. Logical NOR does not possess any of the five qualities (truth-preserving, falsehood-preserving, linear, monotonic, self-dual) required to be absent from at least one member of a set of functionally complete operators; thus, the set containing only NOR suffices as a complete set. NOR has the interesting feature that all other logical operators can be expressed by interlaced NOR operations; the logical NAND operator also has this ability. The logical NOR ↓ is the negation of the disjunction, and the usual operators of propositional logic can all be expressed in terms of ↓.
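The functional completeness of NOR can be sketched directly: the definitions below (the helper names are my own) build NOT, OR, and AND out of NOR alone and check them against Python's built-in operators.

```python
def nor(p, q):
    """Joint denial: true exactly when both operands are false."""
    return not (p or q)

# All other connectives can be built from interlaced NOR operations:
def not_(p):    return nor(p, p)
def or_(p, q):  return nor(nor(p, q), nor(p, q))
def and_(p, q): return nor(nor(p, p), nor(q, q))

for p in (True, False):
    for q in (True, False):
        assert not_(p) == (not p)
        assert or_(p, q) == (p or q)
        assert and_(p, q) == (p and q)
print("NOR alone expresses NOT, OR, AND")
```

Since {NOT, OR, AND} is known to be functionally complete, {NOR} is complete as well.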
10.
Converse nonimplication
–
In logic, converse nonimplication is a logical connective which is the negation of the converse of implication. Written p ⊄ q, it is the same as ¬(q → p): it is true exactly when p is false and q is true. The Venn diagram of converse nonimplication is the region described by "it is not the case that B implies A". It is also related to the set-theoretic complement, where the relative complement of A in B is denoted B ∖ A. Notations include Mpq, which uses a prefixed capital letter, and p ↚ q, where ↚ combines the converse implication's left arrow denied by means of a stroke. See: The Art of Computer Programming, Volume 4A, Combinatorial Algorithms, Part 1.
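A minimal sketch of the connective (the function name is my own), checked against its definition as the negation of the converse implication q → p:

```python
def converse_nonimplication(p, q):
    """p <-/- q: true exactly when p is false and q is true,
    i.e. the negation of the converse implication q -> p."""
    return (not p) and q

# Verify equivalence with not(q -> p) on all four valuations.
for p in (True, False):
    for q in (True, False):
        assert converse_nonimplication(p, q) == (not ((not q) or p))
print("matches the negation of q -> p on all valuations")
```

The single true row, p false and q true, mirrors the set difference B ∖ A mentioned above.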
11.
Negation
–
In logic, negation is an operation that takes a proposition p to another proposition, "not p"; negation is thus a unary logical connective. It may be applied as an operation on propositions or truth values. In classical logic, negation is normally identified with the truth function that takes truth to falsity and vice versa. In intuitionistic logic, according to the Brouwer–Heyting–Kolmogorov interpretation, the negation of a proposition p is the proposition whose proofs are the refutations of p. Classical negation is an operation on one logical value, typically the value of a proposition, that produces a value of true when its operand is false and a value of false when its operand is true. So, if statement A is true, then ¬A would be false. Classical negation can be defined in terms of other logical operations. For example, ¬p can be defined as p → F, where F is falsum; conversely, one can define F as p & ¬p for any proposition p, where & is logical conjunction. The idea here is that any contradiction is false. While these ideas work in both classical and intuitionistic logic, they do not work in paraconsistent logic, where contradictions are not necessarily false. In classical logic we also get a further identity: p → q can be defined as ¬p ∨ q. Algebraically, classical negation corresponds to complementation in a Boolean algebra, and intuitionistic negation to pseudocomplementation in a Heyting algebra; these algebras provide a semantics for classical and intuitionistic logic respectively. The negation of a proposition p is notated in different ways in various contexts of discussion and fields of application. Among these variants: in set theory, \ is also used to indicate "not a member of", so U \ A is the set of all members of U that are not members of A. No matter how it is notated or symbolized, the negation ¬p / −p can be read as "it is not the case that p" or simply "not p". Within a system of logic, double negation (that is, the negation of the negation of a proposition p) may or may not be equivalent to p. In intuitionistic logic, a proposition implies its double negation but not conversely, and this marks one important difference between classical and intuitionistic negation.
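The definition of negation as implication into falsum, and the classical behavior of double negation, can both be checked exhaustively. A minimal sketch (helper names are my own):

```python
def implies(p, q):
    """Material implication: false only when p is true and q is false."""
    return (not p) or q

F = False  # falsum

# Classical negation defined as implication into falsum: not p == (p -> F).
for p in (True, False):
    assert (not p) == implies(p, F)

# Double negation elimination holds classically: not(not p) == p.
for p in (True, False):
    assert (not (not p)) == p
print("negation as p -> F; double negation eliminates classically")
```

The second check is exactly what fails to be derivable in intuitionistic logic, where ¬¬p does not in general yield p.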
Algebraically, classical negation is called an involution of period two. In intuitionistic logic a weaker relation holds: a propositional formula is classically provable exactly when its double negation is intuitionistically provable, a result known as Glivenko's theorem. De Morgan's laws provide a way of distributing negation over disjunction and conjunction: ¬(a ∨ b) ≡ ¬a ∧ ¬b and ¬(a ∧ b) ≡ ¬a ∨ ¬b. In Boolean algebra, a linear function is one such that there exist a0, a1, …, an ∈ {0, 1} with f(b1, …, bn) = a0 ⊕ (a1 ∧ b1) ⊕ … ⊕ (an ∧ bn) for all b1, …, bn ∈ {0, 1}. Another way to express this is that each variable either always makes a difference in the truth value of the operation or never makes a difference. Negation is a linear logical operator.
12.
Material nonimplication
–
Material nonimplication or abjunction is the negation of material implication. That is to say, for any two propositions P and Q, the material nonimplication from P to Q is true if and only if the material implication from P to Q is false; this is more naturally stated as: the material nonimplication from P to Q is true if and only if P is true and Q is false. The symbol for material nonimplication is simply a crossed-out material implication symbol. Analogues in programming are the bitwise operation A & ~B and the logical operation A && !B. See also: Implication.
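A minimal sketch of the connective (the function name is my own), checked against its definition as the negation of P → Q:

```python
def material_nonimplication(p, q):
    """P -/-> Q: true exactly when P is true and Q is false,
    i.e. the negation of material implication P -> Q."""
    return p and not q

# Verify equivalence with not(P -> Q) on all four valuations.
for p in (True, False):
    for q in (True, False):
        assert material_nonimplication(p, q) == (not ((not p) or q))
print("P -/-> Q agrees with not(P -> Q) on all valuations")
```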
13.
Exclusive or
–
Exclusive or, or exclusive disjunction, is a logical operation that outputs true only when its inputs differ (one is true, the other is false). It is symbolized by the prefix operator J and by the infix operators XOR, EOR, EXOR, ⊻, ⊕, and ↮. The negation of XOR is the logical biconditional, which outputs true only when both inputs are the same. The operation gains the name "exclusive or" because the meaning of "or" is ambiguous when both operands are true; the exclusive-or operator excludes that case. This is sometimes thought of as "one or the other but not both", which could be written as "A or B, but not, A and B". More generally, XOR is true only when an odd number of inputs are true: a chain of XORs (a XOR b XOR c XOR d) is true whenever an odd number of the inputs are true and false whenever an even number of inputs are true. The truth table of A XOR B shows that it outputs true whenever the inputs differ: 0 XOR 0 is false, 0 XOR 1 is true, 1 XOR 0 is true, and 1 XOR 1 is false. Exclusive disjunction essentially means "either one, but not both"; in other words, the statement is true if and only if one operand is true and the other is false. For example, if two horses are racing, then one of the two will win the race, but not both of them. The exclusive or is equivalent to the negation of a logical biconditional, by the rules of material implication. Inclusive disjunction has no inverse operation, which unfortunately prevents combining it with conjunction into larger structures such as a mathematical ring. However, the system using exclusive or is an abelian group, and the combination of the operators ∧ and ⊕ over the elements {0, 1} produces the well-known field F2. This field can represent any logic obtainable with the system and has the benefit of the arsenal of algebraic analysis tools for fields. The Oxford English Dictionary explains "either ... or" as follows: the primary function of "either", etc., is to emphasize the perfect indifference of the two things or courses, but a secondary function is to emphasize the mutual exclusiveness, i.e. either of the two, but not both. The exclusive-or explicitly states "one or the other, but not neither nor both".
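The parity behavior of chained XORs, and the identification of ⊕ with addition mod 2 (and ∧ with multiplication mod 2) in the field F2, can both be checked directly:

```python
from functools import reduce
from operator import xor

# A chain of XORs computes parity: true iff an odd number of inputs are true.
inputs = [True, False, True, True]  # three True inputs, an odd count
parity = reduce(xor, inputs)
print(parity)  # True

# Over the bits 0 and 1, XOR is addition mod 2 and AND is
# multiplication mod 2, giving the two-element field F2.
ok = all(
    (a ^ b) == (a + b) % 2 and (a & b) == (a * b) % 2
    for a in (0, 1)
    for b in (0, 1)
)
print(ok)  # True
```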
Following this kind of common-sense intuition about "or", it is sometimes argued that in many natural languages, English included, the word "or" has an exclusive sense. The exclusive disjunction of a pair of propositions p and q is supposed to mean that p is true or q is true, but not both. For example, it might be argued that the intention of a statement like "You may have coffee, or you may have tea" is to offer exactly one of the options. Certainly under some circumstances a sentence like this example should be taken as forbidding the possibility of accepting both options. Even so, there is reason to suppose that this sort of sentence is not disjunctive at all.
14.
Sheffer stroke
–
The Sheffer stroke, written |, denotes the negation of conjunction; it is also called NAND or the alternative denial, since it says in effect that at least one of its operands is false. In Boolean algebra and digital electronics it is known as the NAND operation. Like its dual, the NOR operator, NAND can be used by itself, without any other logical operator, to constitute a logical formal system. This property makes the NAND gate crucial to modern digital electronics. The NAND operation is a logical operation on two logical values: it produces a value of true if, and only if, at least one of the propositions is false. The truth table of A NAND B reflects this. The stroke is named after Henry M. Sheffer. Because of the self-duality of Boolean algebras, Sheffer's axioms are equally valid for either of the NAND or NOR operations in place of the stroke. Sheffer interpreted the stroke as a sign for non-disjunction in his paper, mentioning non-conjunction only in a footnote; it was Jean Nicod who first used the stroke as a sign for non-conjunction, in a paper of 1917, and this has since become current practice. Russell and Whitehead used the Sheffer stroke in the 1927 second edition of Principia Mathematica and suggested it as a replacement for the "or" and "not" operations of the first edition. Charles Sanders Peirce had discovered the functional completeness of NAND (and of NOR) more than 30 years earlier, using the term ampheck. {NAND} therefore is a functionally complete set. This can also be seen as follows: all three elements of a known complete set (NOT, AND, OR) can be constructed using only NAND, thus the set {NAND} must be complete as well. Since the stroke is its only connective, any formal system built on it must also include a means of indicating grouping, and we shall employ parentheses to this effect. We also write p, q, r, … instead of p0, p1, p2, …. Construction Rule I: for each natural number n, the symbol pn is a well-formed formula (wff), called an atom.
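The completeness argument in this passage can be sketched directly: the definitions below (helper names are my own) construct NOT, AND, and OR from NAND alone and check them against Python's built-ins.

```python
def nand(p, q):
    """Alternative denial: true iff at least one operand is false."""
    return not (p and q)

# NOT, AND, OR constructed from NAND alone:
def not_(p):    return nand(p, p)
def and_(p, q): return nand(nand(p, q), nand(p, q))
def or_(p, q):  return nand(nand(p, p), nand(q, q))

for p in (True, False):
    for q in (True, False):
        assert not_(p) == (not p)
        assert and_(p, q) == (p and q)
        assert or_(p, q) == (p or q)
print("NAND alone expresses NOT, AND, OR")
```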
Construction Rule II: if X and Y are wffs, then (X|Y) is a wff. Closure Rule: any formulae which cannot be constructed by means of the first two construction rules are not wffs. The letters U, V, W, X, and Y are metavariables standing for wffs. To check whether a formula is well formed, repeat this recursive deconstruction process on each of the subformulae; eventually the formula should be reduced to its atoms, and if some subformula cannot be so reduced, the formula is not a wff. All wffs of a certain specified form are axioms, and instances of U ⊢ W are inference rules. Since the only connective of this logic is |, the symbol | could be discarded altogether, leaving only the parentheses to group the letters; a pair of parentheses must always enclose a pair of wffs, and theorems can be written in this simplified notation. The notation can be simplified further by additional abbreviation conventions, though this simplification requires changing some of the rules: more than two letters are then allowed within parentheses.
15.
Logical conjunction
–
In logic and mathematics, and is the truth-functional operator of logical conjunction; the and of a set of operands is true if and only if all of its operands are true. The logical connective that represents this operator is written as ∧ or ⋅. A ∧ B is true only if A is true and B is true. An operand of a conjunction is a conjunct. Related concepts in other fields are: in natural language, the coordinating conjunction "and"; in programming languages, the short-circuit and control structure. And is usually denoted by an infix operator: in mathematics and logic, ∧ or ×; in electronics, ⋅. In Jan Łukasiewicz's prefix notation for logic, the operator is K. Logical conjunction is an operation on two logical values, typically the values of two propositions, that produces a value of true if and only if both of its operands are true. The conjunctive identity is 1 (true), which is to say that AND-ing an expression with true never changes the value of the expression. In keeping with the concept of vacuous truth, when conjunction is defined as an operator or function of arbitrary arity, the empty conjunction is often defined as having the result true. The truth table of A ∧ B is as follows:

A B | A ∧ B
T T |   T
T F |   F
F T |   F
F F |   F

As a rule of inference, conjunction introduction is a classically valid, simple argument form. The argument form has two premises, A and B; intuitively, it permits the inference of their conjunction: A; B; therefore, A and B. In logical operator notation: A, B ⊢ A ∧ B. Here is an example of an argument that fits the form conjunction introduction: Bob likes apples. Bob likes oranges. Therefore, Bob likes apples and oranges. Conjunction elimination is another classically valid, simple argument form. Intuitively, it permits the inference, from any conjunction, of either element of that conjunction: A and B; therefore, A. Or alternately: A and B; therefore, B. In logical operator notation: A ∧ B ⊢ A, and A ∧ B ⊢ B. Conjunction is falsehood-preserving: when all inputs are false, the output is false. Its nonlinearity is 1. If binary values are used for true (1) and false (0), then logical conjunction works exactly like ordinary arithmetic multiplication.
Many languages also provide short-circuit control structures corresponding to logical conjunction. Logical conjunction is used for bitwise operations, where 0 corresponds to false and 1 to true: 0 AND 0 = 0; 0 AND 1 = 0; 1 AND 0 = 0; 1 AND 1 = 1. The operation can also be applied to two binary words viewed as bitstrings of equal length, by taking the bitwise AND of each pair of bits at corresponding positions. For example, 11000110 AND 10100011 = 10000010. This can be used to select part of a bitstring using a bit mask; for example, 10011101 AND 00001000 = 00001000 extracts the fifth bit of an 8-bit bitstring.
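The bitwise examples above can be sketched directly; 0 corresponds to false and 1 to true, applied bit by bit:

```python
# Bitwise AND on two 8-bit words, as in the text.
a = 0b11000110
b = 0b10100011
masked = a & b          # expected: 0b10000010

# Using a bit mask to extract the fifth bit (counting from the left)
# of an 8-bit bitstring.
word = 0b10011101
mask = 0b00001000
fifth_bit = word & mask
```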
16.
Logical biconditional
–
In logic and mathematics, the logical biconditional is the logical connective of two statements asserting "p if and only if q", where p is an antecedent and q is a consequent. This is often abbreviated "p iff q". The operator is denoted using a double-headed arrow (↔), a prefixed E, an equality sign, an equivalence sign, or EQV. It is logically equivalent to (p → q) ∧ (q → p), and to the XNOR boolean operator; it is also logically equivalent to (p ∧ q) ∨ (¬p ∧ ¬q), meaning "both or neither". The only difference from the material conditional is the case when the hypothesis is false but the conclusion is true: in that case, the conditional is true, yet the biconditional is false. In the conceptual interpretation, a = b means "all a's are b's and all b's are a's"; in other words, the concepts have the same extension. This does not mean that the concepts have the same meaning. Examples: triangle and trilateral; equiangular trilateral and equilateral triangle. The antecedent is the subject and the consequent is the predicate of a universal affirmative proposition. In the propositional interpretation, a ⇔ b means that a implies b and b implies a; in other words, that the propositions are equivalent. This does not mean that they have the same meaning. Example: "The triangle ABC has two equal sides" and "The triangle ABC has two equal angles". The antecedent is the premise or the cause and the consequent is the consequence; when an implication is translated by a hypothetical judgment, the antecedent is called the hypothesis and the consequent is called the thesis. A common way of demonstrating a biconditional is to use its equivalence to the conjunction of two converse conditionals, demonstrating these separately. When both members of the biconditional are propositions, it can be separated into two conditionals, of which one is called a theorem and the other its reciprocal.
Thus whenever a theorem and its reciprocal are true, we have a biconditional. A simple theorem gives rise to an implication whose antecedent is the hypothesis and whose consequent is the thesis of the theorem. When a theorem and its reciprocal are true, we say that its hypothesis is the necessary and sufficient condition of the thesis; that is to say, it is at the same time both cause and consequence. Logical equality is an operation on two logical values, typically the values of two propositions, that produces a value of true if and only if both operands are false or both operands are true. The truth table for A ↔ B is as follows:

A B | A ↔ B
T T |   T
T F |   F
F T |   F
F F |   T

More than two statements combined by ↔ are ambiguous: x1 ↔ x2 ↔ x3 ↔ … ↔ xn may be meant as (((x1 ↔ x2) ↔ x3) ↔ …) ↔ xn, or may be used to say that all xi are together true or together false. The biconditional is commutative and associative; it does not distribute over any binary function, but logical disjunction distributes over the biconditional. It is not idempotent and not monotonic. It is truth-preserving (when all inputs are true, the output is true) but not falsehood-preserving (when all inputs are false, the output is not false). Its nonlinearity is 0. Biconditional introduction allows one to infer that, if B follows from A, and A follows from B, then A if and only if B.
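The equivalence between the biconditional and the conjunction of two converse conditionals can be checked mechanically. A minimal sketch (function names are mine, not from the source):

```python
# The biconditional as logical equality (XNOR): true exactly when
# both operands agree.
def iff(p: bool, q: bool) -> bool:
    return p == q

# The same connective written as the conjunction of the two converse
# conditionals (p -> q) and (q -> p).
def iff_via_conditionals(p: bool, q: bool) -> bool:
    return ((not p) or q) and ((not q) or p)
```

Enumerating all four input pairs confirms the two definitions agree, which is exactly the "two converse conditionals" demonstration strategy described above.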
17.
Material conditional
–
The material conditional is a logical connective that is often symbolized by a forward arrow →. The material conditional is used to form statements of the form p → q, read "if p then q".
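As a truth function, the material conditional is false only when its antecedent is true and its consequent is false. A minimal sketch of that behaviour (the function name is mine, not from the source):

```python
# Material conditional p -> q as a truth function: equivalent to
# (not p) or q; false only when p is true and q is false.
def implies(p: bool, q: bool) -> bool:
    return (not p) or q
```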
18.
Logical disjunction
–
In logic and mathematics, or is the truth-functional operator of disjunction, also known as alternation; the or of a set of operands is true if and only if one or more of its operands is true. The logical connective that represents this operator is written as ∨ or +. A ∨ B is true if A is true, or if B is true, or if both A and B are true. In logic, "or" by itself means the inclusive or, distinguished from the exclusive or. An operand of a disjunction is called a disjunct. Related concepts in other fields are: in natural language, the coordinating conjunction "or"; in programming languages, the short-circuit or control structure. Or is usually expressed with an infix operator: in mathematics and logic, ∨; in electronics, +; and in most programming languages, |, ||, or "or". In Jan Łukasiewicz's prefix notation for logic, the operator is A. Logical disjunction is an operation on two logical values, typically the values of two propositions, that has a value of false if and only if both of its operands are false. More generally, a disjunction is a formula that can have one or more literals separated only by "or"s; a single literal is often considered to be a degenerate disjunction. The disjunctive identity is false, which is to say that the or of an expression with false has the same value as the original expression. In keeping with the concept of vacuous truth, when disjunction is defined as an operator or function of arbitrary arity, the empty disjunction is often defined as having the result false. Disjunction is falsehood-preserving: the interpretation under which all variables are assigned a value of false produces a truth value of false as a result of disjunction. The mathematical symbol for logical disjunction varies in the literature: in addition to the word "or" and the formula Apq, the symbol ∨, deriving from the Latin word vel, is commonly used for disjunction. For example, A ∨ B is read as "A or B". Such a disjunction is false if both A and B are false; in all other cases it is true. All of the following are disjunctions: A ∨ B; ¬A ∨ B; A ∨ ¬B ∨ ¬C ∨ D ∨ ¬E.
The corresponding operation in set theory is the set-theoretic union. Operators corresponding to logical disjunction exist in most programming languages. Disjunction is often used for bitwise operations; for example, x = x | 0b00000001 will force the final bit to 1 while leaving other bits unchanged. Logical disjunction is usually short-circuited: if the first operand evaluates to true, then the second operand is not evaluated. The logical disjunction operator thus usually constitutes a sequence point. In a parallel language, it is possible to evaluate both sides in parallel, and if one terminates with value true, the other may be interrupted.
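The bitwise and short-circuit behaviour just described can be sketched as follows (the `crash` helper is mine, only there to prove the second operand is skipped):

```python
# Bitwise OR: force the final bit to 1, leaving the other bits unchanged.
x = 0b00000000
x = x | 0b00000001

def crash() -> bool:
    raise RuntimeError("should never be evaluated")

# Short-circuiting: once the first operand is true, the second is
# not evaluated, so crash() is never called.
result = True or crash()
```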
19.
Lookup table
–
In computer science, a lookup table is an array that replaces runtime computation with a simpler array indexing operation. The savings in terms of processing time can be significant, since retrieving a value from memory is often faster than undergoing an expensive computation or input/output operation. The tables may be precalculated and stored in static program storage, or calculated as part of a program's initialization phase. FPGAs also make extensive use of reconfigurable, hardware-implemented lookup tables to provide programmable hardware functionality. Before the advent of computers, lookup tables of values were used to speed up hand calculations of complex functions, such as in trigonometry and logarithms. In ancient India, Aryabhata created one of the first sine tables, which he encoded in a Sanskrit-letter-based number system. Early in the history of computers, input/output operations were particularly slow, even in comparison to processor speeds of the time. Despite the introduction of systemwide caching that now automates this process, application-level lookup tables can still improve performance for data items that rarely, if ever, change. Lookup tables were one of the earliest functionalities implemented in computer spreadsheets; this has been followed by subsequent spreadsheets, such as Microsoft Excel, and complemented by specialized VLOOKUP and HLOOKUP functions to simplify lookup in a vertical or horizontal table. The simplest method is known as a linear search or brute-force search, each element being checked for equality in turn; this is often the slowest search method unless frequently occurring values occur early in the list. For a one-dimensional array or linked list, the lookup is usually to determine whether or not there is a match with an input data value. A binary search is possible only if the list is sorted, but it gives good performance even if the list is lengthy.
For a trivial hash function lookup, the raw data value is used directly as an index into a one-dimensional table to extract a result. For small ranges, this can be among the fastest lookups, even exceeding binary search speed, with zero branches and constant-time execution. One discrete problem that is expensive to solve on many computers is counting the number of bits which are set to 1 in a number, sometimes called the population function. For example, the decimal number 37 is 00100101 in binary, so it contains three bits that are set to 1. A naive bit-by-bit counting loop is slow; this can be ameliorated using loop unrolling and some other compiler optimizations. There is, however, a simple and much faster algorithmic solution using a hash-function table lookup: simply construct a table, bits_set, with 256 entries giving the number of one bits set in each possible byte value, then use this table to find the number of ones in each byte of the word using a trivial hash function lookup on each byte in turn. This requires no branches and just four indexed memory accesses for a 32-bit word. The approach can be improved further by recasting x as a 4-byte unsigned char array and, preferably, coding the lookup inline as a single statement instead of as a function.
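The population-count technique above can be sketched as follows, assuming a 32-bit word (the names `bits_set` and `popcount32` follow the text's description, not any particular library):

```python
# A 256-entry table giving the number of 1 bits in each possible
# byte value, built once up front.
bits_set = [bin(i).count("1") for i in range(256)]

def popcount32(x: int) -> int:
    """Count set bits in a 32-bit word: four table lookups, no branches."""
    return (bits_set[x & 0xFF]
            + bits_set[(x >> 8) & 0xFF]
            + bits_set[(x >> 16) & 0xFF]
            + bits_set[(x >> 24) & 0xFF])
```

Each byte of the word serves directly as the index, i.e. a trivial hash function; for example, 37 (binary 00100101) yields a count of 3.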
20.
Logic
–
Logic, originally meaning "the word" or "what is spoken", is generally held to consist of the systematic study of the form of arguments. A valid argument is one where there is a relation of logical support between the assumptions of the argument and its conclusion. Historically, logic has been studied in philosophy and mathematics; more recently it has also been studied in science, linguistics, and psychology. The concept of form is central to logic: the validity of an argument is determined by its logical form. Traditional Aristotelian syllogistic logic and modern symbolic logic are examples of formal logic. Informal logic is the study of natural language arguments; the study of fallacies is an important branch of informal logic, since much informal argument is not, strictly speaking, deductive. On some conceptions of logic, formal logic is the study of inference with purely formal content. An inference possesses a purely formal content if it can be expressed as an application of a wholly abstract rule, that is, a rule that is not about any particular thing or property. The works of Aristotle contain the earliest known study of logic, and modern formal logic follows and expands on Aristotle. In many definitions of logic, logical inference and inference with purely formal content are the same. This does not render the notion of informal logic vacuous, because no formal logic captures all of the nuances of natural language. Symbolic logic is the study of symbolic abstractions that capture the formal features of logical inference; it is divided into two main branches, propositional logic and predicate logic. Mathematical logic is an extension of symbolic logic into other areas, in particular the study of model theory, proof theory, and set theory. Logic is generally considered formal when it analyzes and represents the form of any valid argument type. The form of an argument is displayed by representing its sentences in the formal grammar and symbolism of a logical language to make its content usable in formal inference.
Simply put, formalising means translating English sentences into the language of logic; this is called showing the logical form of the argument. It is necessary because indicative sentences of ordinary language show a considerable variety of form. Certain parts of the sentence must be replaced with schematic letters; thus, for example, the expression "all Ps are Qs" shows the logical form common to the sentences "all men are mortals", "all cats are carnivores", "all Greeks are philosophers", and so on. The schema can further be condensed into the formula A, where the letter A indicates the judgement "all - are -". The importance of form was recognised from ancient times.
21.
History of logic
–
The history of logic deals with the study of the development of the science of valid inference. Formal logics developed in ancient times in China, India, and Greece. Greek methods, particularly Aristotelian logic as found in the Organon, found wide application and acceptance in Western science and mathematics for millennia. The Stoics, especially Chrysippus, began the development of propositional logic. Christian and Islamic philosophers such as Boethius and William of Ockham further developed Aristotle's logic in the Middle Ages, reaching a high point in the mid-fourteenth century. The period between the fourteenth century and the beginning of the nineteenth century saw largely decline and neglect; empirical methods ruled the day, as evidenced by Sir Francis Bacon's Novum Organum of 1620. Valid reasoning has been employed in all periods of human history; logic, however, studies the principles of valid reasoning and inference. It is probable that the idea of demonstrating a conclusion first arose in connection with geometry. The ancient Egyptians discovered geometry, including the formula for the volume of a truncated pyramid, and ancient Babylon was also skilled in mathematics. While the ancient Egyptians empirically discovered some truths of geometry, the great achievement of the ancient Greeks was to replace empirical methods by demonstrative proof. Both Thales and Pythagoras, among the Pre-Socratic philosophers, seem to have been aware of geometry's methods. Fragments of early proofs are preserved in the works of Plato and Aristotle, and the idea of a deductive system was probably known in the Pythagorean school and the Platonic Academy. The proofs of Euclid of Alexandria are a paradigm of Greek geometry. The three basic principles of geometry are as follows: certain propositions must be accepted as true without demonstration; such a proposition is known as an axiom of geometry.
Every proposition that is not an axiom of geometry must be demonstrated as following from the axioms of geometry, and the proof must be formal; that is, the derivation of the proposition must be independent of the particular subject matter in question. Further evidence that early Greek thinkers were concerned with the principles of reasoning is found in the fragment called dissoi logoi, part of a protracted debate about truth and falsity. Thales was said to have made a sacrifice in celebration of discovering Thales' theorem, just as Pythagoras did for the Pythagorean theorem; Indian and Babylonian mathematicians knew his theorem for special cases before he proved it. It is believed that Thales learned that an angle inscribed in a semicircle is a right angle during his travels to Babylon. Before 520 BC, on one of his visits to Egypt or Greece, Pythagoras might have met Thales, who was about 54 years older. The systematic study of proof seems to have begun with the school of Pythagoras in the sixth century BC. Indeed, the Pythagoreans, believing all was number, were the first philosophers to emphasize form rather than matter. Heraclitus is known for his obscure sayings; his logos "holds always but humans always prove unable to understand it, both before hearing it and when they have first heard it", and other people fail to notice what they do when awake. In contrast to Heraclitus, Parmenides held that all is one and nothing changes.