1.
Enquiry character
–
In computer communications, enquiry (ENQ) is a transmission-control character that requests a response from the receiving station with which a connection has been set up. It is a signal intended to trigger a response at the receiving end; the response, an answer-back code sent to the terminal that transmitted the WRU ("who are you?") signal, may include station identification, the type of equipment in service, and the status of the remote station. Some teleprinters had a drum that could hold a 20- or 22-character message; the message was encoded by breaking tabs off the drum, and the sequence could be transmitted upon receipt of an enquiry signal, if enabled, or by pressing the "Here is" key on the keyboard. The 5-bit ITA2 code has such a character, as does the later ASCII. In the 1960s, DEC routinely disabled the feature on Teletype Model 33 terminals because it interfered with the use of the paper-tape reader. The DEC VT100 terminals from 1978, however, responded to enquiry with a user-configurable answerback message.
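The ENQ/answerback exchange can be sketched in a few lines. This is only an illustration: the `respond` function and the answerback string are hypothetical, not any standard API.

```python
# ENQ is control code 0x05 in ASCII (C0 set); ITA2 has the analogous WRU signal.
ENQ = b"\x05"

def respond(byte: bytes, answerback: bytes = b"VT100-DESK-7") -> bytes:
    """Mimic a VT100-style terminal: reply with the user-configurable
    answerback message when ENQ arrives; otherwise send nothing."""
    return answerback if byte == ENQ else b""

reply = respond(ENQ)      # the configured answerback message
silence = respond(b"A")   # ordinary data triggers no reply
```

On a real VT100 the answerback string is set by the user in the terminal's setup screen; here it is simply a default argument.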
2.
Knowledge
–
Knowledge can refer to a theoretical or practical understanding of a subject. It can be implicit or explicit, and more or less formal or systematic; several definitions of knowledge and theories to explain it exist. Knowledge acquisition involves complex cognitive processes: perception, communication, and reasoning. The eventual demarcation of philosophy from science was made possible by the notion that philosophy's core was the theory of knowledge, a theory distinct from the sciences because it was their foundation. Without this idea of a theory of knowledge, it is hard to imagine what philosophy could have been in the age of modern science. The definition of knowledge is a matter of ongoing debate among philosophers in the field of epistemology. Some claim that the classical conditions are not sufficient, as Gettier-case examples allegedly demonstrate. Richard Kirkham suggests that our definition of knowledge requires that the evidence for the belief necessitates its truth. In contrast to this approach, Ludwig Wittgenstein observed, following Moore's paradox, that one can say "He believes it, but it isn't so," but not "He knows it, but it isn't so." He goes on to argue that these do not correspond to distinct mental states; what is different is not the mental state of the speaker, but the activity in which they are engaged. For example, on this account, to know that the kettle is boiling is not to be in a particular state of mind. Wittgenstein sought to bypass the difficulty of definition by looking at the way "knowledge" is used in natural languages, and he saw knowledge as a case of family resemblance. Following this idea, knowledge has been reconstructed as a cluster concept that points out relevant features. Symbolic representations can be used to indicate meaning and can be thought of as a dynamic process; hence the transfer of a symbolic representation can be viewed as one ascription process whereby knowledge can be transferred.
Other forms of communication include observation and imitation, and verbal exchange; philosophers of language and semioticians construct and analyze theories of knowledge transfer or communication. In his collection of essays Technopoly, Neil Postman illustrates the argument against the use of writing with an excerpt from Plato's Phaedrus. In this excerpt, Socrates recounts the story of Thamus, the Egyptian king, and Theuth, the inventor of the written word. Theuth presents his new invention, writing, to King Thamus, who is skeptical of it and rejects it as a tool of recollection rather than of retained knowledge. Media theorists like Andrew Robinson emphasise that written knowledge in the modern world was often seen as truer than oral knowledge: it is harder to preserve records of what was said or who originally said it, since usually neither the speaker nor the content can be verified. Gossip and rumors are examples prevalent in both media. Major libraries today can hold millions of books of knowledge.
3.
Doubt
–
Doubt characterises a status in which the mind remains suspended between two contradictory propositions, unable to assent to either of them. Doubt on an emotional level is indecision between belief and disbelief. It involves uncertainty, distrust or lack of sureness about a fact or an action. Doubt questions a notion of reality, and may involve delaying or rejecting relevant action out of concern for mistakes, faults or appropriateness. Doubt sometimes tends to call on reason; it may encourage people to hesitate before acting and to apply more rigorous methods. Doubt may have particular importance as leading towards disbelief or non-acceptance. Societally, doubt creates an atmosphere of distrust, being accusatory in nature and de facto alleging either foolishness or deceit on the part of another. Such a stance has been fostered in Western European society since the Enlightenment, in opposition to tradition. Psychoanalytic theory attributes doubt to childhood, when the ego develops; childhood experiences, these theories maintain, can plant doubt about one's abilities. Cognitive and mental as well as more spiritual approaches abound in response to the wide variety of potential causes for doubt. Behavioral therapy, in which a person systematically asks his own mind whether the doubt has any real basis, uses rational, Socratic methods. This method contrasts with those of, say, the Buddhist faith: Buddhism sees doubt as a negative attachment to one's perceived past and future, and letting go of the personal history of one's life plays a central role in releasing the doubts so developed. Partial or intermittent negative reinforcement can create a climate of fear and doubt. Descartes employed Cartesian doubt as a pre-eminent methodological tool in his fundamental philosophical investigations, and branches of philosophy like logic devote much effort to distinguishing the dubious, the probable and the certain. Much of illogic rests on dubious assumptions, dubious data or dubious conclusions, with rhetoric and whitewashing playing their part.
Doubt that God exists may form the basis of agnosticism, the belief that one cannot determine the existence or non-existence of God. It may also form other brands of skepticism, such as Pyrrhonism, which take no stance on the existence of God. Alternatively, doubt over the existence of God may lead to acceptance of a particular religion. Doubt of a specific theology, scriptural or deistic, may bring into question the truth of that theology's set of beliefs. On the other hand, doubt as to some doctrines but acceptance of others may lead to the growth of heresy and/or the splitting off of sects or groups of thought; thus proto-Protestants doubted papal authority and substituted alternative methods of governance in their new churches.
4.
Aristotle
–
Aristotle was an ancient Greek philosopher and scientist born in the city of Stagira, Chalkidice, on the northern periphery of Classical Greece. His father, Nicomachus, died when Aristotle was a child. At seventeen or eighteen years of age, he joined Plato's Academy in Athens and remained there until the age of thirty-seven. Shortly after Plato died, Aristotle left Athens and, at the request of Philip II of Macedon, tutored Alexander the Great. Teaching Alexander gave Aristotle many opportunities and an abundance of supplies. He established a library in the Lyceum which aided in the production of many of his hundreds of books, and he believed all people's concepts and all of their knowledge were ultimately based on perception. Aristotle's views on the natural sciences represent the groundwork underlying many of his works, and his views on physical science profoundly shaped medieval scholarship. Their influence extended from Late Antiquity and the Early Middle Ages into the Renaissance. Some of Aristotle's zoological observations, such as that on the hectocotyl arm of the octopus, were not confirmed or refuted until the 19th century. His works contain the earliest known study of logic, which was incorporated in the late 19th century into modern formal logic. Aristotle was well known among medieval Muslim intellectuals and revered as "The First Teacher", and his ethics, though always influential, gained renewed interest with the modern advent of virtue ethics. All aspects of Aristotle's philosophy continue to be the object of academic study today. Though Aristotle wrote many elegant treatises and dialogues (Cicero described his style as "a river of gold"), it is thought that only around a third of his original output has survived. Aristotle, whose name means "the best purpose", was born in 384 BC in Stagira, Chalcidice. His father Nicomachus was the physician to King Amyntas of Macedon.
Aristotle was orphaned at a young age. Although there is little information on his childhood, he probably spent some time within the Macedonian palace, making his first connections with the Macedonian monarchy. At the age of seventeen or eighteen, Aristotle moved to Athens to continue his education at Plato's Academy, and he remained there for nearly twenty years before leaving Athens in 348/47 BC. Aristotle then accompanied Xenocrates to the court of his friend Hermias of Atarneus in Asia Minor. There, he traveled with Theophrastus to the island of Lesbos, where together they researched the botany and zoology of the island. Aristotle married Pythias, either Hermias's adoptive daughter or his niece, and she bore him a daughter, whom they also named Pythias. Soon after Hermias's death, Aristotle was invited by Philip II of Macedon to become the tutor to his son Alexander in 343 BC, and Aristotle was appointed head of the royal academy of Macedon. During that time he gave lessons not only to Alexander but also to two other future kings, Ptolemy and Cassander.
5.
Prior Analytics
–
The Prior Analytics is Aristotle's work on deductive reasoning, which is known as his syllogistic. It is one of the six extant Aristotelian writings on logic and scientific method. Modern work on Aristotle's logic builds on the tradition started in 1951 with the establishment by Jan Łukasiewicz of a revolutionary paradigm. The term analytics comes from the Greek words ἀναλυτός and ἀναλύω; however, in Aristotle's corpus there are distinguishable differences in the meaning of ἀναλύω and its cognates. There is also the possibility that Aristotle borrowed his use of the word "analysis" from his teacher Plato. In this sense, analysis is the process of finding the reasoned facts. Aristotle's Prior Analytics represents the first time in history that logic is scientifically investigated. On those grounds alone, Aristotle could be considered the "Father of Logic", for as he says in Sophistical Refutations, when it comes to this subject, it is not the case that part had been worked out before in advance and part had not. Some scholars prefer to use the word "deduction" instead as the meaning given by Aristotle to the Greek word συλλογισμός (syllogismos). In the Analytics, then, Prior Analytics is the first theoretical part, dealing with the science of deduction; it gives an account of deductions in general, narrowed down to three basic syllogisms, while Posterior Analytics deals with demonstration. In the Prior Analytics, Aristotle defines a syllogism as "a deduction in a discourse in which, certain things being supposed, something different from the things supposed results of necessity because these things are so". In modern times, this definition has led to a debate as to how the word syllogism should be interpreted: scholars Jan Łukasiewicz, Józef Maria Bocheński and Günther Patzig have sided with the protasis-apodosis dichotomy, while John Corcoran prefers to consider a syllogism as simply a deduction.
In the third century AD, Alexander of Aphrodisias composed the oldest extant commentary on the Prior Analytics; in the sixth century, Boethius composed the first known Latin translation of it. No Westerner between Boethius and Bernard of Utrecht is known to have read the Prior Analytics. The so-called Anonymus Aurelianensis III, from the second half of the twelfth century, is the first extant Latin commentary, or rather a fragment of a commentary. The Prior Analytics represents the first formal study of logic, where logic is understood as the study of arguments. An argument is a series of true or false statements which lead to a true or false conclusion. In the Prior Analytics, Aristotle identifies valid and invalid forms of arguments called syllogisms. A syllogism is an argument that consists of at least three sentences: at least two premises and a conclusion. Although Aristotle does not call them "categorical sentences", tradition does; he deals with them briefly in the Analytics. Each proposition of a syllogism is a categorical sentence which has a subject and a predicate connected by a verb. In his formulation of syllogistic propositions, instead of the copula ("is/is not"), Aristotle uses the expressions "belongs to / does not belong to all/some" or "is said / is not said of all/some". There are four different types of categorical sentences: universal affirmative, particular affirmative, universal negative and particular negative. Depending on the position of the middle term, Aristotle divides the syllogism into three kinds: syllogisms in the first, second and third figure.
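As a modern sketch (not Aristotle's own notation), the universal-affirmative form and the validity of a first-figure syllogism such as Barbara ("All M are P; all S are M; therefore all S are P") can be checked exhaustively over a small finite domain; the domain size and helper names below are illustrative assumptions:

```python
from itertools import product

DOM = range(3)  # a small finite domain is enough to illustrate the check

def all_are(P, Q):
    """Form A, universal affirmative: "All P are Q" over DOM."""
    return all(Q[x] for x in DOM if P[x])

def some_are(P, Q):
    """Form I, particular affirmative: "Some P are Q" over DOM."""
    return any(P[x] and Q[x] for x in DOM)

# Barbara: search every assignment of the three terms for a countermodel.
preds = list(product([False, True], repeat=len(DOM)))
barbara_valid = all(
    all_are(S, P)
    for S in preds for M in preds for P in preds
    if all_are(M, P) and all_are(S, M)
)
```

The search over all 8^3 predicate assignments finds no countermodel, so `barbara_valid` is true; an invalid form would yield a countermodel in the same search.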
6.
Lune (geometry)
–
In plane geometry, a lune is the concave-convex area bounded by two circular arcs, while a convex-convex area is termed a lens. The word lune derives from luna, the Latin word for Moon. Formally, a lune is the relative complement of one disk in another; alternatively, if A and B are disks, then L = A − A ∩ B is a lune. In the 5th century BC, Hippocrates of Chios showed that certain lunes could be exactly squared by straightedge and compass.
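Under that definition, the area of the lune A − A ∩ B follows from the standard circle-circle intersection (lens) formula. This sketch assumes the two circles genuinely intersect (or are tangent) with neither disk containing the other:

```python
import math

def lens_area(r1, r2, d):
    """Area of the lens A ∩ B for disks of radii r1, r2 whose centres
    are distance d apart (standard circle-circle intersection formula)."""
    a1 = r1**2 * math.acos((d*d + r1*r1 - r2*r2) / (2*d*r1))
    a2 = r2**2 * math.acos((d*d + r2*r2 - r1*r1) / (2*d*r2))
    tri = 0.5 * math.sqrt((-d+r1+r2) * (d+r1-r2) * (d-r1+r2) * (d+r1+r2))
    return a1 + a2 - tri

def lune_area(r1, r2, d):
    """Area of the lune: disk A minus the lens, i.e. A - (A ∩ B)."""
    return math.pi * r1**2 - lens_area(r1, r2, d)
```

For two unit circles whose centres are one radius apart, the lens area is the classical 2π/3 − √3/2; at tangency (d = r1 + r2) the lens vanishes and the lune is the whole disk.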
7.
Charles Sanders Peirce
–
Charles Sanders Peirce was an American philosopher, logician, mathematician, and scientist who is sometimes known as "the father of pragmatism". He was educated as a chemist and employed as a scientist for 30 years; today he is appreciated largely for his contributions to logic, mathematics, philosophy, scientific methodology, and semiotics, and for his founding of pragmatism. An innovator in mathematics, statistics, philosophy, research methodology, and various sciences, Peirce considered himself, first and foremost, a logician. He made major contributions to logic, but logic for him encompassed much of what is now called epistemology and philosophy of science. As early as 1886 he saw that logical operations could be carried out by electrical switching circuits. In 1934, the philosopher Paul Weiss called Peirce "the most original and versatile of American philosophers and America's greatest logician". Webster's Biographical Dictionary said in 1943 that Peirce was "now regarded as the most original thinker and greatest logician of his time", and Keith Devlin similarly referred to Peirce as one of the greatest philosophers ever. Peirce was born at 3 Phillips Place in Cambridge, Massachusetts. He was the son of Sarah Hunt Mills and Benjamin Peirce, himself a professor of astronomy and mathematics at Harvard University and perhaps the first serious research mathematician in America. At age 12, Charles read his older brother's copy of Richard Whately's Elements of Logic; so began his lifelong fascination with logic and reasoning. At Harvard, he began lifelong friendships with Francis Ellingwood Abbot and Chauncey Wright. One of his Harvard instructors, Charles William Eliot, formed an unfavorable opinion of Peirce. This opinion proved fateful, because Eliot, while President of Harvard (1869–1909, a period encompassing nearly all of Peirce's working life), repeatedly vetoed Harvard's employing Peirce in any capacity. Peirce suffered from his late teens onward from a condition then known as facial neuralgia.
Its consequences may have led to the isolation which made his life's later years so tragic. His employment with the United States Coast Survey exempted Peirce from having to take part in the Civil War; it would have been very awkward for him to do so. At the Survey, he worked mainly in geodesy and gravimetry, and he was elected a resident fellow of the American Academy of Arts and Sciences in January 1867. From 1869 to 1872, he was employed as an Assistant in Harvard's astronomical observatory, doing important work on determining the brightness of stars. On April 20, 1877 he was elected a member of the National Academy of Sciences. Also in 1877, he proposed measuring the meter as so many wavelengths of light of a certain frequency. During the 1880s, Peirce's indifference to bureaucratic detail waxed while the quality and timeliness of his Survey work waned; Peirce took years to write reports that he should have completed in months. Meanwhile, he wrote entries, ultimately thousands during 1883–1909, on philosophy, logic, science, and other subjects for the encyclopedic Century Dictionary. In 1885, an investigation by the Allison Commission exonerated Peirce, but in 1891 Peirce resigned from the Coast Survey at Superintendent Thomas Corwin Mendenhall's request. He never again held regular employment. In 1879, Peirce had been appointed Lecturer in logic at Johns Hopkins University, which had strong departments in a number of areas that interested him, such as philosophy, psychology, and mathematics.
8.
William James
–
William James was an American philosopher and psychologist who was also trained as a physician. A Review of General Psychology survey, published in 2002, ranked James as the 14th most cited psychologist of the 20th century; he also developed the philosophical perspective known as radical empiricism. James's work has influenced intellectuals such as Émile Durkheim, W. E. B. Du Bois, Edmund Husserl, Bertrand Russell, Ludwig Wittgenstein, Hilary Putnam, and Richard Rorty, and has even influenced presidents such as Jimmy Carter. Born into a wealthy family, James was the son of the Swedenborgian theologian Henry James Sr. James wrote widely on many topics, including epistemology, education, metaphysics, psychology, and religion. William James was born at the Astor House in New York City, the son of Henry James Sr., a noted and independently wealthy Swedenborgian theologian well acquainted with the literary and intellectual elites of his day. William James received an eclectic trans-Atlantic education, developing fluency in both German and French; education in the James household encouraged cosmopolitanism. The family made two trips to Europe while William James was still a child, setting a pattern that resulted in thirteen more European journeys during his life. In his early adulthood, James suffered from a variety of ailments, including those of the eyes, back, and stomach. Two younger brothers, Garth Wilkinson and Robertson, fought in the Civil War; the other three siblings all suffered from periods of invalidism. He took up studies at Harvard Medical School in 1864. His studies were interrupted once again due to illness in April 1867; he traveled to Germany in search of a cure and remained there until November 1868, at which time he was 26 years old. During this period he began to publish, and reviews of his works appeared in periodicals such as the North American Review. James finally earned his M.D.
degree in June 1869. What he called his "soul-sickness" would only be resolved in 1872, after an extended period of philosophical searching. He married Alice Gibbens in 1878, and in 1882 he joined the Theosophical Society. James's time in Germany proved intellectually fertile, helping him find that his interests lay not in medicine but in philosophy. Later, in 1902, he would write: "I originally studied medicine in order to be a physiologist. I never had any philosophic instruction, the first lecture on psychology I ever heard being the first I ever gave." In 1875–1876, James taught one of the first courses in experimental psychology in the United States at Harvard; over his career he exchanged ideas with figures such as Henry Pickering Bowditch, Charles Pickering Putnam, G. Stanley Hall, Henri Bergson and Sigmund Freud.
9.
John Dewey
–
John Dewey was an American philosopher, psychologist, and educational reformer whose ideas have been influential in education and social reform. Dewey is one of the central figures associated with the philosophy of pragmatism and is considered one of the fathers of functional psychology. A Review of General Psychology survey, published in 2002, ranked Dewey as the 93rd most cited psychologist of the 20th century. A well-known public intellectual, he was also a major voice of progressive education and liberalism. Although Dewey is known best for his publications about education, he also wrote about many other topics, including epistemology, metaphysics, aesthetics, art, logic, and social theory. He was an educational reformer for the 20th century. The overriding theme of Dewey's works was his belief in democracy, be it in politics, education or communication. As Dewey himself stated in 1888, while still at the University of Michigan: "Democracy and the one, ultimate, ethical ideal of humanity are to my mind synonymous." John Dewey was born in Burlington, Vermont, to a family of modest means. Dewey was one of four sons born to Archibald Sprague Dewey and his wife Lucina. The second-born son, and the first John born to Archibald and Lucina, died in an accident on January 17, 1859. On October 20, 1859, John Dewey was born, forty weeks after the death of his older brother. Like his older, surviving brother, Davis Rich Dewey, he attended the University of Vermont, where he was initiated into Delta Psi and graduated Phi Beta Kappa in 1879. A significant professor of Dewey's at the University of Vermont was Henry A. P. Torrey; Dewey studied privately with Torrey between his graduation from Vermont and his enrollment at Johns Hopkins University. After studying with George Sylvester Morris, Charles Sanders Peirce and Herbert Baxter Adams at Johns Hopkins, in 1884 he accepted a faculty position at the University of Michigan with the help of George Sylvester Morris.
His unpublished and now lost dissertation was titled "The Psychology of Kant". In 1894 Dewey joined the newly founded University of Chicago, where he developed his belief in rational empiricism, becoming associated with the newly emerging pragmatic philosophy. Disagreements with the administration ultimately caused his resignation from the university. In 1899, Dewey was elected president of the American Psychological Association. From 1904 until his retirement in 1930 he was professor of philosophy at both Columbia University and Columbia University's Teachers College; in 1905 he became president of the American Philosophical Association. He was a member of the American Federation of Teachers. Along with the historians Charles A. Beard and James Harvey Robinson, and the economist Thorstein Veblen, Dewey is one of the founders of The New School.
10.
Logic
–
Logic, originally meaning "the word" or "what is spoken", is generally held to consist of the systematic study of the form of arguments. A valid argument is one where there is a relation of logical support between the assumptions of the argument and its conclusion. Historically, logic has been studied in philosophy and mathematics; more recently it has also been studied in computer science, linguistics and psychology. The concept of logical form is central to logic: the validity of an argument is determined by its logical form. Traditional Aristotelian syllogistic logic and modern symbolic logic are examples of formal logic. Informal logic is the study of natural language arguments; the study of fallacies is an important branch of informal logic. Since much informal argument is not strictly speaking deductive, on some conceptions of logic, informal logic is not logic at all. Formal logic is the study of inference with purely formal content. An inference possesses a purely formal content if it can be expressed as an application of a wholly abstract rule, that is, a rule that is not about any particular thing or property. The works of Aristotle contain the earliest known study of logic, and modern formal logic follows and expands on Aristotle. In many definitions of logic, logical inference and inference with purely formal content are the same. This does not render the notion of informal logic vacuous, because no formal logic captures all of the nuances of natural language. Symbolic logic is the study of symbolic abstractions that capture the formal features of logical inference; it is divided into two main branches, propositional logic and predicate logic. Mathematical logic is an extension of symbolic logic into other areas, in particular to the study of model theory, proof theory and set theory. Logic is generally considered formal when it analyzes and represents the form of any valid argument type. The form of an argument is displayed by representing its sentences in the formal grammar and symbolism of a logical language to make its content usable in formal inference.
Simply put, formalising means translating English sentences into the language of logic; this is called showing the logical form of the argument. It is necessary because indicative sentences of ordinary language show a considerable variety of form. Formalisation requires, first, ignoring those grammatical features irrelevant to logic and, second, replacing certain parts of the sentence with schematic letters. Thus, for example, the expression "all Ps are Qs" shows the logical form common to the sentences "all men are mortals", "all cats are carnivores", "all Greeks are philosophers", and so on. The schema can further be condensed into the formula A(P,Q), where the letter A indicates the judgement "all - are -". The importance of form was recognised from ancient times.
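The schematic form "all Ps are Qs" — in modern notation, for all x, P(x) implies Q(x) — can be evaluated mechanically over a finite domain. The predicates and domain below are illustrative, not part of any standard library:

```python
def A(domain, P, Q):
    """The judgement "all - are -": true iff every P in the domain is a Q,
    i.e. P(x) -> Q(x) holds for every x (written here as (not P) or Q)."""
    return all((not P(x)) or Q(x) for x in domain)

# Hypothetical predicates over a tiny domain:
things  = {"socrates", "plato", "fido"}
men     = {"socrates", "plato"}
mortals = {"socrates", "plato", "fido"}

all_men_are_mortal = A(things, lambda x: x in men, lambda x: x in mortals)  # True
```

Swapping P and Q shows the schema is not symmetric: "all mortals are men" fails on this domain, since fido is a mortal but not a man.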
11.
Immanuel Kant
–
Immanuel Kant was a German philosopher who is considered a central figure in modern philosophy. Kant took himself to have effected a "Copernican revolution" in philosophy, and his ideas continue to have a major influence on contemporary philosophy, especially in the fields of metaphysics, epistemology, ethics, political theory, and aesthetics. Politically, Kant was one of the earliest exponents of the idea that peace could be secured through universal democracy; he believed that this would be the eventual outcome of universal history. Kant wanted to put an end to an era of futile and speculative theories of human experience. He argued that our experiences are structured by necessary features of our minds: in his view, the mind shapes and structures experience so that, on an abstract level, all human experience shares certain structural features. Among other things, Kant believed that the concepts of space and time are integral to all human experience, as are our concepts of cause and effect. Kant published other important works on ethics, religion, law, aesthetics, astronomy, and history. These included the Critique of Practical Reason and the Metaphysics of Morals, which dealt with ethics, and the Critique of Judgment. Immanuel Kant was born in 1724 in Königsberg, Prussia. His mother, Anna Regina Reuter, was born in Königsberg to a father from Nuremberg. His father, Johann Georg Kant, was a German harness maker from Memel. Immanuel Kant believed that his paternal grandfather Hans Kant was of Scottish origin. Kant was the fourth of nine children. Baptized Emanuel, he changed his name to Immanuel after learning Hebrew. Young Kant was a solid, albeit unspectacular, student, and he was brought up in a Pietist household that stressed religious devotion, humility, and a literal interpretation of the Bible.
His education was strict, punitive and disciplinary, and focused on Latin and religious instruction over mathematics. Despite his religious upbringing and maintaining a belief in God, Kant was skeptical of religion in later life; various commentators have labelled him agnostic. Common myths about Kant's personal mannerisms are listed, explained, and refuted in Goldthwait's introduction to his translation of Observations on the Feeling of the Beautiful and Sublime. It is often held that Kant lived a strict and disciplined life. He never married, but seemed to have a rewarding social life: he was a popular teacher, and he had a circle of friends whom he met frequently, among them Joseph Green. A common myth is that Kant never traveled more than 16 kilometres from Königsberg his whole life; in fact, between 1750 and 1754 he worked as a tutor in Judtschen and in Groß-Arnsdorf.
12.
George Boole
–
George Boole was an English mathematician, educator, philosopher and logician. He worked in the fields of differential equations and algebraic logic, and Boolean logic is credited with laying the foundations for the information age. Boole was born in Lincoln, Lincolnshire, England, the son of John Boole Sr. He had a primary school education and received lessons from his father, but had little further formal and academic teaching. William Brooke, a bookseller in Lincoln, may have helped him with Latin, and he was self-taught in modern languages. At age 16 Boole became the breadwinner for his parents and three siblings, taking up a junior teaching position in Doncaster at Heigham's School. Boole participated in the Mechanics' Institute, in the Greyfriars, Lincoln; without a teacher, it took him many years to master calculus. At age 19, Boole successfully established his own school in Lincoln; four years later he took over Hall's Academy in Waddington, outside Lincoln, following the death of Robert Hall. In 1840 he moved back to Lincoln, where he ran a boarding school. Boole became a prominent local figure, an admirer of John Kaye, the bishop. He took part in the campaign for early closing, and with E. R. Larken and others he set up a building society in 1847. He associated also with the Chartist Thomas Cooper, whose wife was a relation. From 1838 onwards Boole was making contacts with sympathetic British academic mathematicians and reading more widely. He studied algebra in the form of symbolic methods, as far as these were understood at the time. Boole's status as a mathematician was recognised by his appointment in 1849 as the first professor of mathematics at Queen's College, Cork in Ireland. He met his wife, Mary Everest, there in 1850 while she was visiting her uncle John Ryall, who was Professor of Greek. They married some years later, in 1855. He maintained his ties with Lincoln, working there with E. R. Larken in a campaign to reduce prostitution.
Boole was awarded the Keith Medal by the Royal Society of Edinburgh in 1855 and was elected a Fellow of the Royal Society in 1857, and he received honorary degrees of LL.D. from the University of Dublin and the University of Oxford. In late November 1864, Boole walked, in the rain, from his home at Lichfield Cottage in Ballintemple to the university. He soon became ill, developing a cold and high fever. As his wife believed that remedies should resemble their cause, she put her husband to bed and poured buckets of water over him, the wet having brought on his illness. Boole's condition worsened, and on 8 December 1864 he died of fever-induced pleural effusion.
13.
Augustus De Morgan
–
Augustus De Morgan was a British mathematician and logician. He formulated De Morgan's laws and introduced the term "mathematical induction". Augustus De Morgan was born in Madurai, India in 1806. His father was Lieutenant-Colonel John De Morgan, who held various appointments in the service of the East India Company. His mother, Elizabeth Dodson, was descended from James Dodson, who computed a table of anti-logarithms, that is, the numbers corresponding to exact logarithms. Augustus De Morgan became blind in one eye a month or two after he was born. The family moved to England when Augustus was seven months old, and when De Morgan was ten years old, his father died. Mrs. De Morgan resided at various places in the southwest of England, and his mathematical talents went unnoticed until he was fourteen, when a family friend discovered him making an elaborate drawing of a figure in Euclid with ruler and compasses. The friend explained the aim of Euclid to Augustus and gave him an initiation into demonstration. He received his secondary education from Mr. Parsons, a fellow of Oriel College, Oxford, who appreciated classics better than mathematics. His mother was an active and ardent member of the Church of England and desired that her son should become a clergyman; De Morgan, however, held unorthodox views, writing: "I shall use the word Anti-Deism to signify the opinion that there does not exist a Creator who made and sustains the Universe." His college tutor was John Philips Higman, FRS; at college he played the flute for recreation and was prominent in the musical clubs. His love of knowledge for its own sake interfered with training for the great mathematical race; as a consequence he came out fourth wrangler. This entitled him to the degree of Bachelor of Arts, but to take the degree of Master of Arts, and thereby become eligible for a fellowship, it was then necessary to pass a theological test. To the signing of any such test De Morgan felt a strong objection. In about 1875 theological tests for academic degrees were abolished in the Universities of Oxford and Cambridge.
As no career was open to him at his own university, he decided to go to the Bar and took up residence in London. About this time the movement for founding London University took shape: a body of liberal-minded men resolved to meet the difficulty by establishing in London a university on the principle of religious neutrality, and De Morgan, then 22 years of age, was appointed professor of mathematics. His introductory lecture, On the study of mathematics, is a discourse upon mental education of permanent value. The London University was a new institution, and the relations of the Council of management, the Senate of professors and the body of students were not well defined. A dispute arose between the professor of anatomy and his students, and in consequence of the action taken by the Council, De Morgan resigned; another professor of mathematics was appointed, who then drowned a few years later. De Morgan had shown himself a prince of teachers, and he was invited to return to his chair. The same circle of reformers also founded the Society for the Diffusion of Useful Knowledge, whose object was to spread scientific and other knowledge by means of cheap and clearly written treatises by the best writers of the time; one of its most voluminous and effective writers was De Morgan. When De Morgan came to reside in London he found a congenial friend in William Frend, notwithstanding Frend's mathematical heresy about negative quantities
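De Morgan's laws, mentioned above, can be checked mechanically by exhausting the four Boolean assignments; a minimal Python sketch, for illustration only:

```python
from itertools import product

# De Morgan's laws: not (a and b) == (not a) or (not b),
#                   not (a or b)  == (not a) and (not b).
# Verify both identities over every Boolean assignment.
def de_morgan_holds():
    for a, b in product([False, True], repeat=2):
        if (not (a and b)) != ((not a) or (not b)):
            return False
        if (not (a or b)) != ((not a) and (not b)):
            return False
    return True

print(de_morgan_holds())  # True
```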
14.
Hypothesis
–
A hypothesis is a proposed explanation for a phenomenon. For a hypothesis to be a scientific hypothesis, the scientific method requires that one can test it. Scientists generally base scientific hypotheses on previous observations that cannot satisfactorily be explained with the available scientific theories. Even though the words hypothesis and theory are often used synonymously, a scientific hypothesis is not the same as a scientific theory. A working hypothesis is a provisionally accepted hypothesis proposed for further research. In a "what if" question, P is the assumption whose consequences are explored: "Remember, the way that you prove an implication is by assuming the hypothesis." --Philip Wadler. In its ancient usage, hypothesis referred to a summary of the plot of a classical drama. The English word hypothesis comes from the ancient Greek word ὑπόθεσις (hupothesis). In Plato's Meno, Socrates dissects virtue with a method used by mathematicians, that of investigating from a hypothesis. In this sense, hypothesis refers to an idea or to a convenient mathematical approach that simplifies cumbersome calculations. In common usage in the 21st century, a hypothesis refers to a provisional idea whose merit requires evaluation. For proper evaluation, the framer of a hypothesis needs to define specifics in operational terms; a hypothesis requires more work by the researcher in order to either confirm or disprove it. In due course, a confirmed hypothesis may become part of a theory or occasionally may grow to become a theory itself. Normally, scientific hypotheses have the form of a mathematical model. In entrepreneurial science, a hypothesis is used to formulate provisional ideas within a business setting; the formulated hypothesis is then evaluated, where it is proven to be either true or false through a verifiability- or falsifiability-oriented experiment. Any useful hypothesis will enable predictions by reasoning. It might predict the outcome of an experiment in a laboratory setting or the observation of a phenomenon in nature. 
The prediction may also invoke statistics and only talk about probabilities. Other philosophers of science have rejected the criterion of falsifiability or supplemented it with other criteria, such as verifiability or coherence. The scientific method involves experimentation to test the ability of some hypothesis to adequately answer the question under investigation; in contrast, unfettered observation is not as likely to raise unexplained issues or open questions in science. A thought experiment might also be used to test the hypothesis. In framing a hypothesis, the investigator must not currently know the outcome of a test, or it must remain reasonably under continuing investigation; only in such cases does the experiment, test or study potentially increase the probability of showing the truth of a hypothesis
15.
Community of inquiry
–
The community of inquiry is broadly defined as any group of individuals involved in a process of empirical or conceptual inquiry into problematic situations. The Buddhist parable of the blind men and the elephant offers a colorful way to make sense of the notion of the community of inquiry. The tale finds many blind men fumbling about an elephant, each trying to discover what it is they are touching. One finds the elephant's leg and believes it a tree; another finds its trunk and believes it a rope; yet another finds its side and believes it a wall. The insight is that we are all trapped inside our limited experience. By sharing their experiences in a democratic and participatory manner, the men could arrive at a more comprehensive truth than their impoverished perspectives allow in isolation from each other. They would show each other why one found the elephant to be like a rope, and they would go further, using other ways to collect evidence. Together they would try to reconcile their conflicting conclusions. The blind men would never see the elephant, but they would no longer be trapped in their own limited perspectives; in short, they would be likely to resolve the problematic situation. But resolution is never final: even their consensus could be in error, and all findings are provisional and subject to revision. This is the self-correcting quality of the community of inquiry. While Peirce originally intended the concept of the community of inquiry as a way to model the natural sciences, the concept has been borrowed and adapted in other fields; this article touches on the contributions in the fields of education and public administration. According to Matthew Lipman, C. S. Peirce originally restricted the concept to the community of scientists; John Dewey broadened its scope, applying it to the educational setting. Borrowing from Dewey, Lipman systematically applies the concept to the educational setting. 
He argues that a classroom is a type of community of inquiry, which leads to “questioning, reasoning, connecting, deliberating, challenging, and developing problem-solving techniques.” Students and teachers involved in inquiry form a community of inquiry under certain circumstances; therefore, a community of students and teachers engaged in authentic inquiry is the working definition of the key term ‘community of inquiry’. There is a further dimension to the concept that is underlined by Lipman: he defined the community of inquiry as a rigorous, democratic and reflective form of discussion built up over time with the same group of learners. Lipman also provides a set of antonymic statements that contrasts the standard educational paradigm with the reflective educational paradigm in which communities of inquiry can occur
16.
Rhetoric
–
Rhetoric is the art of discourse, wherein a writer or speaker strives to inform, persuade or motivate particular audiences in specific situations. It is studied both as a subject and practised as a productive civic art. Its best known definition comes from Aristotle, who considers it a counterpart of both logic and politics and calls it the faculty of observing in any given case the available means of persuasion. The five canons of rhetoric, which trace the traditional tasks in designing a persuasive speech, were first codified in classical Rome: invention, arrangement, style, memory, and delivery. Along with grammar and logic, rhetoric is one of the three ancient arts of discourse, and from Ancient Greece to the late 19th century it was a central part of Western education. Scholars have debated the scope of rhetoric since ancient times: although some have limited rhetoric to the specific realm of political discourse, many modern scholars liberate it to encompass every aspect of culture. Contemporary studies of rhetoric address a more diverse range of domains than was the case in ancient times, and many contemporary approaches treat rhetoric as purposeful human communication. Public relations, lobbying, law, marketing, professional and technical writing, and advertising are modern professions that employ rhetorical practitioners. Because the ancient Greeks highly valued public political participation, rhetoric emerged as a crucial tool to influence politics; consequently, rhetoric remains associated with its political origins. However, even the original instructors of Western speech—the Sophists—disputed this limited view of rhetoric. According to the Sophists, such as Gorgias, a successful rhetorician could speak convincingly on any topic; this method suggested rhetoric could be a means of communicating any expertise, not just politics. 
In his Encomium of Helen, Gorgias even applied rhetoric to fiction by seeking, for his own pleasure, to prove the blamelessness of the mythical Helen of Troy in starting the Trojan War. Plato, another key rhetorical theorist, defined the scope of rhetoric according to his negative opinions of the art. He criticized the Sophists for using rhetoric as a means of deceit instead of discovering truth. In Gorgias, one of his Socratic Dialogues, Plato defines rhetoric as the persuasion of ignorant masses within the courts and assemblies. Rhetoric, in Plato's opinion, is merely a form of flattery and functions similarly to cookery; thus, Plato considered any speech of lengthy prose aimed at flattery as within the scope of rhetoric. Aristotle both redeemed rhetoric from his teacher and narrowed its focus by defining three genres of rhetoric—deliberative, forensic or judicial, and epideictic. When one considers that rhetoric included torture, it is clear that rhetoric cannot be viewed only in academic terms. The enthymeme based upon logic was nevertheless viewed as the basis of rhetoric; however, since the time of Aristotle, logic has changed. For example, modal logic has undergone a major development that also modifies rhetoric. Yet Aristotle also outlined generic constraints that focused the rhetorical art squarely within the domain of public political practice
17.
Syllogism
–
A syllogism is a kind of logical argument that applies deductive reasoning to arrive at a conclusion based on two or more propositions that are asserted or assumed to be true. In its earliest form, defined by Aristotle, a syllogism arises from the combination of a general statement (the major premise) and a specific statement (the minor premise), from which a conclusion is deduced. For example, knowing that all men are mortal and that Socrates is a man, one may validly conclude that Socrates is mortal. Syllogistic arguments are usually represented in a three-line form: All men are mortal. Socrates is a man. Therefore, Socrates is mortal. In antiquity, two rival theories of the syllogism existed: Aristotelian syllogistic and Stoic syllogistic. Aristotle defines the syllogism as "a discourse in which, certain things having been supposed, something different from the things supposed results of necessity because these things are so." Despite this very general definition, in his work Prior Analytics Aristotle limits himself to categorical syllogisms that consist of three categorical propositions. From the Middle Ages onwards, categorical syllogism and syllogism were usually used interchangeably, and this article is concerned only with this traditional use. The use of syllogisms as a tool for understanding can be dated back to the logical reasoning discussions of Aristotle. The onset of a New Logic, or logica nova, arose alongside the reappearance of Prior Analytics, the work in which Aristotle develops his theory of the syllogism. Prior Analytics, upon re-discovery, was regarded by logicians as a closed and complete body of doctrine, leaving very little for thinkers of the day to debate. Aristotle's theories on the syllogism for assertoric sentences were considered especially remarkable. Aristotle's Prior Analytics did not, however, incorporate such a comprehensive theory on the modal syllogism—a syllogism that has at least one modalized premise. Aristotle's terminology in this aspect of his theory was deemed vague and in many cases unclear, and his original assertions on this specific component of the theory were left open to a considerable amount of conversation, resulting in a wide array of solutions put forth by commentators of the day. 
The system for modal syllogisms laid forth by Aristotle would ultimately be deemed unfit for practical use. Boethius contributed an effort to make the ancient Aristotelian logic more accessible, though his Latin translation of Prior Analytics went primarily unused before the twelfth century. Peter Abelard's perspective on syllogisms can be found in works such as his Logica Ingredientibus, and with the help of Abelard's distinction between de dicto modal sentences and de re modal sentences, medieval logicians began to shape a more coherent concept of Aristotle's modal syllogism model. For two hundred years after Buridan's discussions, little was said about syllogistic logic, and the Aristotelian syllogism dominated Western philosophical thought for many centuries. In the 17th century, Sir Francis Bacon rejected the idea of the syllogism as being the best way to draw conclusions in nature; instead, Bacon proposed a more inductive approach to the observation of nature. In the 19th century, modifications to the syllogism were incorporated to deal with disjunctive and conditional statements. Kant famously claimed, in Logic, that logic was the one completed science; though there were alternative systems of logic, such as Avicennian logic or Indian logic, elsewhere, Kant's opinion stood unchallenged in the West until 1879, when Frege published his Begriffsschrift. This introduced a calculus, a method of representing categorical statements by the use of quantifiers and variables. In the last 20 years, Bolzano's work has resurfaced and become the subject of both translation and contemporary study
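The three-line form above can be modelled with sets, reading "All A are B" as A being a subset of B; a minimal Python sketch, in which the particular sets chosen are illustrative:

```python
# Model categorical terms as sets: "All A are B" means A is a subset of B.
men = {"Socrates", "Plato"}            # illustrative extension of "man"
mortals = men | {"Fido"}               # illustrative extension of "mortal"

def all_are(a, b):
    """Truth of the categorical statement 'All A are B' under set semantics."""
    return a <= b                      # subset test

# Premise 1: All men are mortal.  Premise 2: Socrates is a man.
# The conclusion "Socrates is mortal" is then forced by the premises.
if all_are(men, mortals) and "Socrates" in men:
    print("Socrates" in mortals)       # True
```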
18.
Boolean algebra (logic)
–
In mathematics and mathematical logic, Boolean algebra is the branch of algebra in which the values of the variables are the truth values true and false, usually denoted 1 and 0 respectively. It is thus a formalism for describing logical relations in the same way that ordinary algebra describes numeric relations. Boolean algebra was introduced by George Boole in his first book The Mathematical Analysis of Logic; according to Huntington, the term Boolean algebra was first suggested by Sheffer in 1913. Boolean algebra has been fundamental in the development of digital electronics, and it is also used in set theory and statistics. Boole's algebra predated the modern developments in abstract algebra and mathematical logic. In an abstract setting, Boolean algebra was perfected in the late 19th century by Jevons, Schröder, Huntington and others; in fact, M. H. Stone proved in 1936 that every Boolean algebra is isomorphic to a field of sets. Shannon already had at his disposal the abstract mathematical apparatus, and thus he cast his switching algebra as the two-element Boolean algebra. In circuit engineering settings today there is little need to consider other Boolean algebras, so switching algebra and Boolean algebra are often used interchangeably. Efficient implementation of Boolean functions is a fundamental problem in the design of combinational logic circuits. Logic sentences that can be expressed in classical propositional calculus have an equivalent expression in Boolean algebra; thus, Boolean logic is sometimes used to denote propositional calculus performed in this way. Boolean algebra is not sufficient to capture logic formulas using quantifiers. The closely related model of computation known as a Boolean circuit relates time complexity to circuit complexity. Whereas in elementary algebra expressions denote mainly numbers, in Boolean algebra they denote the truth values false and true; these values are represented with the bits 0 and 1. 
Addition and multiplication then play the Boolean roles of XOR (exclusive-or) and AND (conjunction), respectively. Boolean algebra also deals with functions which have their values in the set {0, 1}; a sequence of bits is a commonly used example of such a function. Another common example is the subsets of a set E: to a subset F of E is associated the indicator function that takes the value 1 on F and 0 outside F. The most general example is the elements of a Boolean algebra. As with elementary algebra, the purely equational part of the theory may be developed without considering explicit values for the variables. The basic operations of Boolean calculus are as follows. AND, denoted x∧y, satisfies x∧y = 1 if x = y = 1 and x∧y = 0 otherwise. OR, denoted x∨y, satisfies x∨y = 0 if x = y = 0 and x∨y = 1 otherwise. NOT, denoted ¬x, satisfies ¬x = 0 if x = 1 and ¬x = 1 if x = 0. Alternatively, the values of x∧y, x∨y, and ¬x can be expressed by tabulating their values with truth tables. A further derived operation, x → y, or Cxy, is called material implication: if x is true, the value of x → y is taken to be that of y; if x is false, x → y is taken to be true
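The basic operations just described can be tabulated directly; a small Python sketch over the bits 0 and 1 (the operator names are illustrative):

```python
from itertools import product

AND  = lambda x, y: x & y               # x∧y: 1 only when x = y = 1
OR   = lambda x, y: x | y               # x∨y: 0 only when x = y = 0
NOT  = lambda x: 1 - x                  # ¬x: swaps 0 and 1
IMPL = lambda x, y: OR(NOT(x), y)       # x→y, material implication

# Tabulate the operations, mirroring their truth tables.
for x, y in product((0, 1), repeat=2):
    print(x, y, AND(x, y), OR(x, y), IMPL(x, y))
```

Note that the implication column is 1 everywhere except at x = 1, y = 0, matching the prose definition: when x is true the result is y, and when x is false the result is true.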
19.
Propositional logic
–
Logical connectives are found in natural languages; in English, for example, some examples are "and", "or", and "not". The following is an example of a very simple inference within the scope of propositional logic. Premise 1: If it's raining then it's cloudy. Premise 2: It's raining. Conclusion: It's cloudy. Both premises and the conclusion are propositions; the premises are taken for granted, and with the application of modus ponens the conclusion follows. Not only that, but they will also correspond with any other inference of this form. Propositional logic may be studied through a formal system in which formulas of a formal language may be interpreted to represent propositions. A system of inference rules and axioms allows certain formulas to be derived. These derived formulas are called theorems and may be interpreted to be true propositions; a constructed sequence of such formulas is known as a derivation or proof, and the last formula of the sequence is the theorem. The derivation may be interpreted as proof of the proposition represented by the theorem. When a formal system is used to represent formal logic, only statement letters are represented directly. Usually, in truth-functional propositional logic, formulas are interpreted as having either a truth value of true or a truth value of false. Truth-functional propositional logic, and systems isomorphic to it, are considered to be zeroth-order logic. Although propositional logic had been hinted at by earlier philosophers, it was developed into a formal logic by Chrysippus in the 3rd century BC and expanded by his successor Stoics. Their logic was focused on propositions, an advancement different from the traditional syllogistic logic, which was focused on terms. However, later in antiquity the propositional logic developed by the Stoics was no longer understood; consequently, the system was essentially reinvented by Peter Abelard in the 12th century. 
Propositional logic was eventually refined using symbolic logic. The 17th/18th-century mathematician Gottfried Leibniz has been credited with being the founder of symbolic logic for his work with the calculus ratiocinator. Although his work was the first of its kind, it was unknown to the larger logical community; consequently, many of the advances achieved by Leibniz were recreated by logicians like George Boole and Augustus De Morgan, completely independently of Leibniz. Just as propositional logic can be considered an advancement from the earlier syllogistic logic, Gottlob Frege's predicate logic can be considered an advancement from the earlier propositional logic; one author describes predicate logic as combining the distinctive features of syllogistic logic and propositional logic. Consequently, predicate logic ushered in a new era in logic's history; however, advances in propositional logic were still made after Frege, including natural deduction, truth trees and truth tables. Natural deduction was invented by Gerhard Gentzen and Jan Łukasiewicz; truth trees were invented by Evert Willem Beth. The invention of truth tables, however, is of controversial attribution: within works by Frege and Bertrand Russell are ideas influential to the invention of truth tables. The actual tabular structure itself is credited to either Ludwig Wittgenstein or Emil Post
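The modus ponens inference described above can be checked for formal validity by enumerating every truth-value assignment, the same check a truth table performs; a minimal Python sketch (the helper names are illustrative):

```python
from itertools import product

# An argument form is valid when no valuation makes every premise
# true while making the conclusion false.
def implies(p, q):
    return (not p) or q            # material conditional

def is_valid(premises, conclusion, n_vars):
    for vals in product([False, True], repeat=n_vars):
        if all(f(*vals) for f in premises) and not conclusion(*vals):
            return False           # found a counterexample valuation
    return True

# Modus ponens: from P→Q and P, infer Q.
modus_ponens = is_valid(
    premises=[lambda p, q: implies(p, q), lambda p, q: p],
    conclusion=lambda p, q: q,
    n_vars=2,
)
print(modus_ponens)  # True
```

The same checker rejects invalid forms such as affirming the consequent (from P→Q and Q, infer P), since the valuation P false, Q true is a counterexample.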
20.
Logical consequence
–
Logical consequence is a fundamental concept in logic which describes the relationship between statements that holds true when one statement logically follows from one or more other statements. A valid logical argument is one in which the conclusion is entailed by the premises. The philosophical analysis of logical consequence involves the questions: In what sense does a conclusion follow from its premises? And what does it mean for a conclusion to be a consequence of premises? All of philosophical logic is meant to provide accounts of the nature of logical consequence and the nature of logical truth. Logical consequence is necessary and formal, by way of examples that explain with formal proof and models of interpretation. A sentence is said to be a logical consequence of a set of sentences, for a given language, if and only if, using only logic, the sentence must be true if every sentence in the set is true. The most widely prevailing view on how best to account for logical consequence is to appeal to formality; this is to say that whether statements follow from one another logically depends on the structure or logical form of the statements, without regard to the contents of that form. Syntactic accounts of logical consequence rely on schemes using inference rules. For instance, we can express the logical form of a valid argument as: All A are B. All C are A. Therefore, all C are B. This argument is formally valid, because every instance of arguments constructed using this scheme is valid. This is in contrast to an argument like "Fred is Mike's brother's son; therefore Fred is Mike's nephew," whose validity depends on the meanings of the words. If you know that Q follows logically from P, no information about the possible interpretations of P or Q will affect that knowledge; our knowledge that Q is a consequence of P cannot be influenced by empirical knowledge. Deductively valid arguments can be known to be so without recourse to experience. However, formality alone does not guarantee that logical consequence is not influenced by empirical knowledge, so the a priori property of logical consequence is considered to be independent of formality. 
The two prevailing techniques for providing accounts of logical consequence involve expressing the concept in terms of proofs and in terms of models. The study of syntactic consequence is called proof theory, whereas the study of semantic consequence is called model theory. A formula A is a syntactic consequence within some formal system FS of a set Γ of formulas if there is a formal proof in FS of A from the set Γ, written Γ ⊢_FS A. Syntactic consequence does not depend on any interpretation of the formal system. A formula A is a semantic consequence of a set Γ, written Γ ⊨ A, if there is no interpretation in which all members of Γ are true and A is false; or, in other words, if the set of the interpretations that make all members of Γ true is a subset of the set of the interpretations that make A true. Modal accounts of logical consequence are variations on the following basic idea: Γ ⊢ A is true if and only if it is necessary that if all of the elements of Γ are true, then A is true. Alternatively, Γ ⊢ A is true if and only if it is impossible for all of the elements of Γ to be true and A false. Such accounts are called modal because they appeal to the modal notions of logical necessity and logical possibility. Consider the modal account in terms of the argument given as an example above: the conclusion is a logical consequence of the premises because we can't imagine a possible world where all frogs are green, Kermit is a frog, and Kermit is not green
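The subset formulation of semantic consequence lends itself to a brute-force check over interpretations in the propositional case; a minimal Python sketch, where the encoding of the frogs example with two atoms f ("Kermit is a frog") and g ("Kermit is green") is an illustrative assumption:

```python
from itertools import product

# Γ ⊨ A: the set of interpretations making every member of Γ true
# is a subset of the set of interpretations making A true.
def models(formulas, n_vars):
    return {v for v in product([False, True], repeat=n_vars)
            if all(f(*v) for f in formulas)}

def entails(gamma, a, n_vars):
    return models(gamma, n_vars) <= models([a], n_vars)  # subset test

# Premises: f→g ("if Kermit is a frog, Kermit is green") and f.
gamma = [lambda f, g: (not f) or g, lambda f, g: f]
print(entails(gamma, lambda f, g: g, 2))  # True: g follows
```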
21.
Analogy
–
Analogy is a cognitive process of transferring information or meaning from a particular subject (the source) to another (the target), or a linguistic expression corresponding to such a process. The word analogy can also refer to the relation between the source and the target themselves, which is often, though not necessarily, a similarity, as in the biological notion of analogy. Analogy plays a significant role in problem solving, as well as in decision making, perception, memory, creativity, emotion and explanation. It lies behind basic tasks such as the identification of places, objects and people, for example in face perception, and it has been argued that analogy is the core of cognition. Specific analogical language comprises exemplification, comparisons, metaphors, similes, allegories, and parables; phrases like "and so on", "and the like", "as if", and the very word "like" also rely on an analogical understanding by the receiver of a message including them. Analogy is important not only in ordinary language and common sense but also in science and philosophy. In cognitive linguistics, the notion of conceptual metaphor may be equivalent to that of analogy. Analogy has been studied and discussed since classical antiquity by philosophers, scientists, and lawyers, and the last few decades have shown a renewed interest in it. In ancient Greek the word αναλογια originally meant proportionality, in the mathematical sense, and it was indeed sometimes translated to Latin as proportio. From there analogy was understood as identity of relation between any two ordered pairs, whether of mathematical nature or not; Kant's Critique of Judgment held to this notion. Kant argued that there can be exactly the same relation between two completely different objects. The same notion of analogy was used in the US-based SAT tests, for example: Hand is to palm as foot is to ____. This relation is not apparent in some definitions of palm and sole, where the former is defined as the inner surface of the hand and the latter as the underside of the foot. 
Analogy and abstraction are different cognitive processes, and analogy is often an easier one. This analogy is not comparing all the properties between a hand and a foot, but rather comparing the relationship between a hand and its palm to that between a foot and its sole. While a hand and a foot have many dissimilarities, the analogy focuses on their similarity in having an inner surface. A computer algorithm has achieved human-level performance on multiple-choice analogy questions from the SAT test. The algorithm measures the similarity of relations between pairs of words by statistical analysis of a large collection of text; it answers SAT questions by selecting the choice with the highest relational similarity. Greek philosophers such as Plato and Aristotle used a wider notion of analogy: they saw analogy as a shared abstraction. Analogous objects did not necessarily share a relation, but could also share an idea, a pattern, a regularity, an attribute, an effect or a philosophy. These authors also accepted that comparisons, metaphors and images could be used as arguments; analogies should also make those abstractions easier to understand and give confidence to the ones using them
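The relational-similarity idea can be illustrated, very loosely, by representing each word pair as a difference of word vectors and comparing pairs with cosine similarity. The vectors below are entirely made up for illustration; the actual SAT-solving algorithm derives its statistics from a large text corpus, not from hand-written numbers:

```python
import math

# Made-up toy word vectors (illustrative only).
vec = {
    "hand": [0.9, 0.1, 0.4], "palm": [0.8, 0.1, 0.1],
    "foot": [0.1, 0.9, 0.4], "sole": [0.0, 0.9, 0.1],
    "tree": [0.5, 0.5, 0.9],
}

def relation(a, b):
    """Represent the pair (a, b) by the difference of their vectors."""
    return [x - y for x, y in zip(vec[a], vec[b])]

def cosine(u, v):
    dot = sum(x * y for x, y in zip(u, v))
    norm = math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(x * x for x in v))
    return dot / norm

# "Hand is to palm as foot is to ____": pick the choice whose
# relation vector is most similar to the stem's relation vector.
stem = relation("hand", "palm")
choices = {"sole": relation("foot", "sole"), "tree": relation("foot", "tree")}
best = max(choices, key=lambda w: cosine(stem, choices[w]))
print(best)  # sole
```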
22.
Graph (abstract data type)
–
In computer science, a graph is an abstract data type that is meant to implement the undirected graph and directed graph concepts from mathematics. A graph data structure consists of a finite (and possibly mutable) set of vertices, together with a set of unordered pairs of these vertices for an undirected graph, or a set of ordered pairs for a directed graph. These pairs are known as edges, arcs, or lines for an undirected graph and as arrows, directed edges, directed arcs, or directed lines for a directed graph. The vertices may be part of the graph structure, or may be external entities represented by integer indices or references. A graph data structure may also associate to each edge some edge value. Common representations include the following. Adjacency list: vertices are stored as records or objects, and every vertex stores a list of adjacent vertices. This data structure allows the storage of additional data on the vertices; additional data can be stored if edges are also stored as objects, in which case each vertex stores its incident edges. Adjacency matrix: a two-dimensional matrix, in which the rows represent source vertices and the columns represent destination vertices; data on edges and vertices must be stored externally, and only the cost for one edge can be stored between each pair of vertices. Incidence matrix: a two-dimensional Boolean matrix, in which the rows represent the vertices and the columns represent the edges; the entries indicate whether the vertex at a row is incident to the edge at a column. The time complexity of performing operations on graphs can be analyzed for each of these representations, with |V| the number of vertices and |E| the number of edges. In the matrix representations, the entries encode the cost of following an edge, and the cost of edges that are not present is assumed to be ∞. Adjacency lists are generally preferred because they efficiently represent sparse graphs; an adjacency matrix is preferred if the graph is dense. Implementations include the Boost Graph Library and NetworkX, a Python graph library
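The adjacency-list representation described above might be sketched as follows; a minimal illustration for an undirected graph, not a production data structure:

```python
from collections import defaultdict

# Adjacency-list graph: each vertex maps to the set of its neighbours.
class Graph:
    def __init__(self):
        self.adj = defaultdict(set)

    def add_edge(self, u, v):
        # Undirected: record the edge in both directions.
        self.adj[u].add(v)
        self.adj[v].add(u)

    def adjacent(self, u, v):
        return v in self.adj[u]

    def neighbours(self, u):
        # O(deg(u)) to iterate: the payoff for sparse graphs, where an
        # adjacency matrix would spend O(|V|) per vertex regardless.
        return self.adj[u]

g = Graph()
g.add_edge("a", "b")
g.add_edge("b", "c")
print(g.adjacent("a", "b"), g.adjacent("a", "c"))  # True False
```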
23.
Subset
–
In mathematics, especially in set theory, a set A is a subset of a set B (or equivalently B is a superset of A) if A is contained inside B, that is, if all elements of A are also elements of B. The relationship of one set being a subset of another is called inclusion, or sometimes containment. The subset relation defines a partial order on sets, and the algebra of subsets forms a Boolean algebra in which the order relation is inclusion. For any set S, the inclusion relation ⊆ is a partial order on the set P(S) of all subsets of S, defined by A ≤ B ⟺ A ⊆ B. We may also partially order P(S) by reverse set inclusion, by defining A ≤ B ⟺ B ⊆ A. When quantified, A ⊆ B is represented as ∀x (x ∈ A → x ∈ B). Some authors use the symbols ⊂ and ⊃ with the same meaning as ⊆ and ⊇; for such authors, it is true of every set A that A ⊂ A. Other authors prefer to use the symbols ⊂ and ⊃ to indicate proper subset and proper superset, respectively; this usage makes ⊆ and ⊂ analogous to the inequality symbols ≤ and <. For example, if x ≤ y then x may or may not equal y, but if x < y, then x definitely does not equal y. Similarly, using the convention that ⊂ means proper subset, if A ⊆ B, then A may or may not equal B. The set A = {1, 2} is a proper subset of B = {1, 2, 3}; thus both expressions A ⊆ B and A ⊊ B are true. The set D = {1, 2, 3} is a subset, but not a proper subset, of E = {1, 2, 3}; thus D ⊆ E is true. Any set is a subset of itself, but not a proper subset. The empty set, denoted by ∅, is a subset of any given set X; it is also always a proper subset of any set except itself. The set of natural numbers is a proper subset of the set of rational numbers: here both the subset and the whole set are infinite, and the subset has the same cardinality as the whole. The set of rational numbers is in turn a proper subset of the set of real numbers; in this example both sets are infinite, but the latter set has a larger cardinality than the former set. Inclusion is the canonical partial order in the sense that every partially ordered set is isomorphic to some collection of sets ordered by inclusion. 
The ordinal numbers are a simple example: if each ordinal n is identified with the set of all ordinals less than or equal to n, then a ≤ b if and only if a ⊆ b. For the power set P(S) of a set S, the inclusion partial order is the Cartesian product of k = |S| copies of the partial order on {0, 1} for which 0 < 1. This can be illustrated by enumerating S = {s1, s2, …, sk} and associating with each subset T ⊆ S the k-tuple from {0, 1}^k of which the i-th coordinate is 1 if and only if si is a member of T
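The correspondence between subsets and k-tuples of bits described above can be made concrete; a minimal Python sketch, where the particular enumeration of S is illustrative:

```python
# Identify each subset T of S = (s1, ..., sk) with the k-tuple whose
# i-th coordinate is 1 iff si is a member of T. Subset inclusion then
# coincides with the coordinatewise order on {0, 1}^k.
S = ("s1", "s2", "s3")

def to_bits(T):
    return tuple(1 if s in T else 0 for s in S)

def included(bits_a, bits_b):
    # A ⊆ B iff a_i <= b_i in every coordinate.
    return all(x <= y for x, y in zip(bits_a, bits_b))

A, B = {"s1"}, {"s1", "s3"}
print(to_bits(A), to_bits(B), included(to_bits(A), to_bits(B)))
# (1, 0, 0) (1, 0, 1) True
```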
24.
Logical implication
–
Logical consequence is a fundamental concept in logic which describes the relationship between statements that holds true when one statement logically follows from one or more other statements. A valid logical argument is one in which the conclusion is entailed by the premises. The philosophical analysis of logical consequence involves the questions: In what sense does a conclusion follow from its premises? And what does it mean for a conclusion to be a consequence of premises? All of philosophical logic is meant to provide accounts of the nature of logical consequence and the nature of logical truth. Logical consequence is necessary and formal, by way of examples that explain with formal proof and models of interpretation. A sentence is said to be a logical consequence of a set of sentences, for a given language, if and only if, using only logic, the sentence must be true if every sentence in the set is true. The most widely prevailing view on how best to account for logical consequence is to appeal to formality; this is to say that whether statements follow from one another logically depends on the structure or logical form of the statements, without regard to the contents of that form. Syntactic accounts of logical consequence rely on schemes using inference rules. For instance, we can express the logical form of a valid argument as: All A are B. All C are A. Therefore, all C are B. This argument is formally valid, because every instance of arguments constructed using this scheme is valid. This is in contrast to an argument like "Fred is Mike's brother's son; therefore Fred is Mike's nephew," whose validity depends on the meanings of the words. If you know that Q follows logically from P, no information about the possible interpretations of P or Q will affect that knowledge; our knowledge that Q is a consequence of P cannot be influenced by empirical knowledge. Deductively valid arguments can be known to be so without recourse to experience. However, formality alone does not guarantee that logical consequence is not influenced by empirical knowledge, so the a priori property of logical consequence is considered to be independent of formality. 
The two prevailing techniques for providing accounts of logical consequence involve expressing the concept in terms of proofs and in terms of models. The study of syntactic consequence is called proof theory, whereas the study of semantic consequence is called model theory. A formula A is a syntactic consequence within some formal system FS of a set Γ of formulas if there is a formal proof in FS of A from the set Γ; this is written Γ ⊢FS A. Syntactic consequence does not depend on any interpretation of the formal system. A formula A is a semantic consequence of Γ, written Γ ⊨ A, if there is no interpretation that makes every member of Γ true and A false; in other words, the set of the interpretations that make all members of Γ true is a subset of the set of the interpretations that make A true. Modal accounts of logical consequence are variations on the following basic idea: Γ ⊨ A is true if and only if it is necessary that, if all of the elements of Γ are true, then A is true. Alternatively, Γ ⊨ A is true if and only if it is impossible for all of the elements of Γ to be true and A false. Such accounts are called modal because they appeal to the modal notions of logical necessity and logical possibility. Consider the modal account in terms of an argument such as "All frogs are green; Kermit is a frog; therefore, Kermit is green." The conclusion is a logical consequence of the premises because we cannot imagine a possible world where all frogs are green, Kermit is a frog, and Kermit is not green.
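For propositional logic, semantic consequence can be decided directly by enumerating truth assignments: Γ ⊨ A holds exactly when every assignment that makes all premises true also makes A true. A minimal sketch (the helper `entails` and the lambda encodings are illustrative, not a standard API):

```python
from itertools import product

def entails(premises, conclusion, atoms):
    """Gamma |= A iff every truth assignment satisfying all premises satisfies A.

    premises and conclusion are functions from an assignment dict to bool;
    atoms is the list of propositional variable names.
    """
    for values in product([False, True], repeat=len(atoms)):
        v = dict(zip(atoms, values))
        if all(p(v) for p in premises) and not conclusion(v):
            return False  # interpretation found with premises true, conclusion false
    return True

# Modus ponens: {P, P -> Q} |= Q
P = lambda v: v["P"]
P_implies_Q = lambda v: (not v["P"]) or v["Q"]
Q = lambda v: v["Q"]
print(entails([P, P_implies_Q], Q, ["P", "Q"]))  # True
print(entails([P_implies_Q], Q, ["P", "Q"]))     # False
```

The second call fails because the assignment with P and Q both false makes the lone premise true while the conclusion is false.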
25.
Curiosity
–
Curiosity is a quality related to inquisitive thinking, such as exploration, investigation, and learning, evident by observation in humans and other animals. Curiosity is heavily associated with all aspects of human development, from which derive the process of learning and the desire to acquire knowledge. The term curiosity can also be used to denote the behavior or emotion of being curious. Curiosity as a behavior and an emotion has been credited over millennia as the driving force behind not only human development but developments in science, language, and industry. Curiosity can be seen as a quality of many different species. It is common to human beings at all ages, from infancy through adulthood, and is easy to observe in other animal species, such as apes and cats. Early definitions cite curiosity as a motivated desire for information. This motivational desire has been said to stem from a passion or an appetite for knowledge, information, and understanding. Like other desires and need states that take on an appetitive quality, curiosity is linked with exploratory behavior and experiences of reward. Curiosity can be described as the positive emotion of acquiring knowledge; when one's curiosity has been aroused, satisfying it is considered inherently rewarding. Discovering new information may also be rewarding because it can help reduce undesirable states of uncertainty rather than stimulating interest. Theories have arisen in attempts to understand this need to rectify states of uncertainty. Curiosity-drive theory relates to the unpleasant experiences of uncertainty; the reduction of these unpleasant feelings, in turn, is rewarding. This theory suggests that people desire coherence and understanding in their thought processes, and that curiosity subsides once understanding of the unfamiliar has been achieved and coherence has been restored. Causes can range from basic needs that must be satisfied to needs in fear-induced situations. 
Each of these theories states that, whether the need is primary or secondary, curiosity develops from experiences that create a sensation of uncertainty or perceived unpleasantness. Curiosity then acts as a means by which to dispel this uncertainty: by exhibiting curious and exploratory behavior, one is able to gain knowledge of the unfamiliar and thus reduce the state of uncertainty or unpleasantness. This theory, however, does not address the idea that curiosity can often be displayed even in the absence of new or unfamiliar situations; this type of exploratory behavior is common in many species. Take the example of a toddler who, if bored in his current situation devoid of arousing stimuli, will wander about until he finds something interesting. The observation of curiosity even in the absence of novel stimuli pinpoints one of the major shortcomings of the curiosity-drive model. Optimal-arousal theory developed out of the need to explain the desire of some to seek out opportunities to engage in exploratory behaviors without the presence of uncertain or ambiguous situations.
26.
Entropy (information theory)
–
In information theory, systems are modeled by a transmitter, channel, and receiver. The transmitter produces messages that are sent through the channel; the channel modifies the message in some way. The receiver attempts to infer which message was sent. In this context, entropy is the expected value of the information contained in each message. Messages can be modeled by any flow of information. In a more technical sense, there are reasons to define information as the negative of the logarithm of the probability distribution of possible events or messages. The amount of information of every event forms a random variable whose expected value, or average, is the Shannon entropy. Units of entropy are the shannon, nat, or hartley, depending on the base of the logarithm used to define it, though the shannon is commonly referred to as a bit. The logarithm of the probability distribution is useful as a measure of entropy because it is additive for independent sources; for instance, the entropy of a single fair coin toss is 1 shannon, whereas that of m tosses is m shannons. Generally, you need log2(n) bits to represent a variable that can take one of n values, if n is a power of 2. If these values are equally probable, the entropy (in shannons) is equal to the number of bits; equality between the number of bits and shannons holds only while all outcomes are equally probable. If one of the events is more probable than the others, observation of that event is less informative. Conversely, rarer events provide more information when observed. Since observation of less probable events occurs more rarely, the net effect is that the entropy received from non-uniformly distributed data is less than log2(n). Entropy is zero when one outcome is certain. Shannon entropy quantifies all these considerations exactly when a probability distribution of the source is known. The meaning of the events observed does not matter in the definition of entropy; only the probability of observing each event does. Generally, entropy refers to disorder or uncertainty. Shannon entropy was introduced by Claude E. 
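The two claims above, that information is the negative logarithm of probability and that the logarithm makes information additive for independent sources, can be illustrated in a few lines of Python (the function name `self_information` is an illustrative choice):

```python
import math

def self_information(p):
    """Information content of an event with probability p, in shannons (bits)."""
    return -math.log2(p)

# A fair coin toss carries 1 shannon. Ten independent tosses have joint
# probability 0.5**10, and the log turns that product into a sum: 10 shannons.
print(self_information(0.5))        # 1.0
print(self_information(0.5 ** 10))  # 10.0
```

Additivity is exactly why the logarithm appears: probabilities of independent events multiply, while their information contents add.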
Shannon in his 1948 paper "A Mathematical Theory of Communication". Shannon entropy provides an absolute limit on the best possible average length of lossless encoding or compression of an information source. Entropy is a measure of the unpredictability of the state or, equivalently, of its average information content. To get an intuitive understanding of these terms, consider the example of a political poll. Usually, such polls happen because the outcome of the poll is not already known; learning the results therefore provides genuinely new information. Now consider the case that the same poll is performed a second time shortly after the first poll: since the result of the first poll is already known, the outcome of the second poll can be predicted well, and its results should contain little new information. Now consider the example of a coin toss. Assuming the probability of heads is the same as the probability of tails, the entropy of the coin toss is as high as it could be. Such a coin toss has one shannon of entropy, since there are two possible outcomes that occur with equal probability, and learning the actual outcome contains one shannon of information. Contrarily, a toss of a coin that has two heads and no tails has zero entropy, since the coin will always come up heads.
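A minimal Shannon-entropy function makes the coin examples above concrete (the `entropy` helper is an illustrative sketch, not a library call):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in shannons; 0*log(0) is taken as 0."""
    return 0.0 - sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0 -- fair coin: maximal uncertainty
print(entropy([1.0, 0.0]))  # 0.0 -- two-headed coin: outcome certain
print(entropy([0.9, 0.1]))  # ~0.469 -- biased coin: less than 1 shannon
```

The biased-coin case shows the earlier point about non-uniform distributions: with two outcomes the entropy is below log2(2) = 1 shannon whenever the probabilities are unequal.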
27.
Information theory
–
Information theory studies the quantification, storage, and communication of information. A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (two equally likely outcomes) provides less information than specifying the outcome of a roll of a die (six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, and error exponents. Applications of fundamental topics of information theory include lossless data compression, lossy data compression, and channel coding. The field is at the intersection of mathematics, statistics, computer science, physics, neurobiology, and electrical engineering. Information theory studies the transmission, processing, utilization, and extraction of information. Abstractly, information can be thought of as the resolution of uncertainty. Information theory is a broad and deep mathematical theory, with equally broad and deep applications, amongst which is the vital field of coding theory. These codes can be subdivided into data compression and error-correction techniques. In the latter case, it took many years to find the methods Shannon's work proved were possible. A third class of information-theory codes are cryptographic algorithms; concepts, methods, and results from coding theory and information theory are widely used in cryptography and cryptanalysis (see the article on the ban for a historical application). Information theory is also used in information retrieval, intelligence gathering, gambling, statistics, and even in musical composition. Prior to Shannon's landmark 1948 paper, limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability; the unit of information was therefore the decimal digit, much later renamed the hartley in Ralph Hartley's honour as a unit or scale or measure of information. Alan Turing in 1940 used similar ideas as part of the statistical analysis of the breaking of the German second world war Enigma ciphers. 
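The coin-versus-die comparison above can be quantified directly: with n equally likely outcomes, the entropy is simply log2(n) shannons. A small illustrative calculation:

```python
import math

# With n equally likely outcomes, entropy is log2(n) shannons:
coin_bits = math.log2(2)  # fair coin flip: 1.0 shannon
die_bits = math.log2(6)   # fair die roll: ~2.585 shannons
print(coin_bits, die_bits)
```

Learning the outcome of a die roll thus resolves over two and a half times as much uncertainty as learning the outcome of a coin flip.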
Much of the mathematics behind information theory involving events of different probabilities was developed for the field of thermodynamics by Ludwig Boltzmann and J. Willard Gibbs. Information theory is based on probability theory and statistics. Information theory often concerns itself with measures of information of the distributions associated with random variables. Important quantities of information are entropy, a measure of the information in a single random variable, and mutual information, a measure of the information in common between two random variables. The choice of logarithmic base in the following formulae determines the unit of information entropy that is used. A common unit of information is the bit, based on the binary logarithm; other units include the nat, which is based on the natural logarithm, and the hartley, which is based on the common logarithm. In what follows, an expression of the form p log p is considered by convention to be equal to zero whenever p = 0. This is justified because lim p→0+ p log p = 0 for any logarithmic base.
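The 0·log 0 convention can be demonstrated numerically; a brief sketch (the helper name `plogp` is illustrative):

```python
import math

def plogp(p):
    """p * log2(p), with the convention that the value is 0 when p == 0."""
    return p * math.log2(p) if p > 0 else 0.0

# The convention is justified by the limit: p * log2(p) -> 0 as p -> 0+.
for p in (0.1, 0.01, 0.001, 0.0):
    print(p, plogp(p))
```

Guarding the p = 0 case this way lets entropy formulas sum over a full distribution without special-casing outcomes of probability zero.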