John Ball (cognitive scientist)
John Samuel Ball is an American cognitive scientist, an expert in machine intelligence and computer architecture, and the inventor of Patom theory. He was born in Iowa, USA, while his Australian father, Samuel Ball, was completing a PhD in educational psychology; the family returned to Australia in 1978, and Ball finished his secondary schooling on the north shore of Sydney. Ball received a Bachelor of Science from the University of Sydney in 1984, a Master of Cognitive Science from the University of New South Wales in 1989, and a Master of Business Administration from MGSM in 1997. From a young age, Ball was fascinated by computers, having been exposed to early mainframes at the Educational Testing Service in Princeton in the 1970s. As an undergraduate he was challenged by a lecturer to pursue machine intelligence when she announced that computers would never be able to perform human-like functions such as language or visual recognition. His career began at IBM Australia as a mainframe engineer, leading to a role as country support specialist, responsible for supporting and training hardware engineers across Australia and New Zealand on mainframes and I/O devices.
His expertise was in the IBM System/370 I/O architecture, which he learned from Kenneth Trowell, a global designer of channel architecture. After leaving IBM in 1996, he worked in other large Australian corporations, managing and defining the commercial terms of complex IT contracts between stakeholders. Always interested in how machines could better emulate human brain function, he postulated Patom theory, the name combining "pattern matching" and "atom"; this reflected his belief that the brain's storage and use of hierarchical, bidirectional linkset patterns is sufficient to explain human capabilities. He claimed this is the approach the human brain takes to language and vision, and he first aired the theory publicly in 2000 on Robyn Williams' Ockham's Razor. Over the years, exchanges with artificial intelligence experts such as Marvin Minsky led him to work on a prototype to demonstrate and prove his theory. Ball left corporate life to focus full-time on proving a natural language understanding system, with samples across diverse languages including Mandarin, German, Spanish, French and Portuguese.
Since 2007, Ball has filed two patents. In 2011, while visiting a Barnes & Noble store in Princeton, New Jersey, Ball came across a book by Emma L. Pavey that referenced Role and Reference Grammar (RRG), a linguistic theory developed by Professor Robert Van Valin, Jr. and Professor William A. Foley. Ball judged the meaning-based linguistic framework described in Pavey's book to be the missing link for implementing his theory, and he began integrating RRG into his prototype. Unlike dominant linguistic theories such as Noam Chomsky's Universal Grammar, Ball's approach focused on meaning, providing a way for computers to break down any human language by meaning and enabling communication between humans and machines. In his paper "From NLP to NLU", Van Valin discusses progressing from natural language processing to natural language understanding by introducing meaning through the combination of RRG and Patom theory. In 2014, the University of Sydney completed an external review analyzing the prototype's capabilities across word sense disambiguation, context tracking, word boundary identification, machine translation and conversation.
By 2015, Ball had included samples across nine languages and could demonstrate a solution to open scientific problems in the field of NLU, including word sense disambiguation, context tracking, machine translation and word boundary identification. In 2015, Ball wrote a seven-part series for Computerworld, "Speaking Artificial Intelligence", in which he traced the dominant approaches of statistical analysis and machine learning from the 1980s to the present. Applications for this technology and its implications for intelligent machines have been published by Dr Hossein Eslambolchi at the World Economic Forum. Ball's work to date disputes the widely held belief that the human brain 'processes' information like a computer; his lab work and NLU demonstrations show human-like conversation and accuracy in translation, described in his papers "The Science of NLU" and "Patom Theory". In December 2018, his machine intelligence company, Pat Inc, received the 'Best New Algorithm for AI' award from the London-based Into.AI organization, in recognition of his novel approach to the AI-hard problem of natural-language understanding.
Using NLU in Context for Question Answering: Improving on Facebook's bAbI Tasks
Machine Intelligence
Can Machines Talk Series
Patom Theory
Speaking Artificial Intelligence
https://pat.ai/
In language, a clause is the smallest grammatical unit that can express a complete proposition. A typical clause consists of a subject and a predicate, the latter typically a verb phrase: a verb together with any objects and other modifiers. However, the subject is sometimes left unexpressed; this is common in null-subject languages when the subject is retrievable from context, and it also occurs in other languages such as English (for instance in imperatives). A simple sentence consists of a single finite clause that can stand alone as an independent sentence; more complex sentences may contain multiple clauses. A primary division in the discussion of clauses is the distinction between main clauses and subordinate clauses. A main clause can stand alone, i.e. it can constitute a complete sentence by itself. A subordinate clause, in contrast, is reliant on the appearance of a main clause and would be incomplete on its own. A second major distinction concerns the difference between finite and non-finite clauses.
A finite clause contains a structurally central finite verb, whereas the structurally central word of a non-finite clause is a non-finite verb. Traditional grammar focuses on finite clauses, awareness of non-finite clauses having arisen much later, in connection with the modern study of syntax. The discussion here focuses on finite clauses, although some aspects of non-finite clauses are considered further below. Clauses can be classified according to a distinctive trait, a prominent characteristic of their syntactic form. The position of the finite verb is one major trait used for classification, and the appearance of a specific type of focusing word is another. These two criteria overlap to an extent, which means that no single aspect of syntactic form is always decisive in determining how the clause functions. There are, however, strong tendencies. Standard SV-clauses (subject–verb) are the norm in English, and they are usually declarative, e.g.

a. The pig has not yet been fed. - Declarative clause, standard SV order
b. I've been hungry for two hours. - Declarative clause, standard SV order
c. ...that I've been hungry for two hours. - Declarative clause, standard SV order, but functioning as a subordinate clause due to the appearance of the subordinator that

Declarative clauses like these are by far the most frequently occurring type of clause in any language. They can be viewed as basic, with other clause types derived from them. Standard SV-clauses can also be interrogative or exclamative, given the appropriate intonation contour and/or the appearance of a question word, e.g.

a. The pig has not yet been fed? - Rising intonation on fed makes the clause a yes/no-question.
b. The pig has not yet been fed! - Spoken forcefully, this clause is exclamative.
c. You've been hungry for how long? - Appearance of the interrogative word how and rising intonation make the clause a constituent question.

Examples like these demonstrate that how a clause functions cannot be known based on a single distinctive syntactic criterion. SV-clauses are usually declarative, but intonation and/or the appearance of a question word can render them interrogative or exclamative.
Verb-first clauses in English play one of three roles: 1. they express a yes/no-question via subject–auxiliary inversion, 2. they express a condition as an embedded clause, or 3. they express a command (imperative), e.g.

a. He must stop laughing. - Standard declarative SV-clause
b. Should he stop laughing? - Yes/no-question expressed by verb-first order
c. Had he stopped laughing... - Condition expressed by verb-first order
d. Stop laughing! - Imperative formed with verb-first order

a. They have done the job. - Standard declarative SV-clause
b. Have they done the job? - Yes/no-question expressed by verb-first order
c. Had they done the job... - Condition expressed by verb-first order
d. Do the job! - Imperative formed with verb-first order

Most verb-first clauses are main clauses; verb-first conditional clauses, however, must be classified as embedded clauses because they cannot stand alone. Wh-clauses contain a wh-word. Wh-words serve to help express a constituent question; they are also prevalent as relative pronouns, in which case they serve to introduce a relative clause and are not part of a question.
The wh-word focuses a particular constituent, and most of the time it appears in clause-initial position. The following examples illustrate standard interrogative wh-clauses; the b-sentences are direct questions, and the c-sentences contain the corresponding indirect questions:

a. Sam likes the meat. - Standard declarative SV-clause
b. Who likes the meat? - Matrix interrogative wh-clause focusing on the subject
c. They asked who likes the meat. - Embedded interrogative wh-clause focusing on the subject

a. Larry sent Susan to the store. - Standard declarative SV-clause
b. Whom did Larry send to the store? - Matrix interrogative wh-clause focusing on the object, subject-auxiliary inversion present
c. We know whom Larry sent to the store. - Embedded wh-clause focusing on the object, subject-auxiliary inversion absent

a. Larry sent Susan to the store. - Standard declarative SV-clause
b. Where did Larry send Susan? - Matrix interrogative wh-clause focusing on the oblique object
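The form-based distinctions above (standard SV order, verb-first order, a clause-initial wh-word) are mechanical enough that a toy classifier can approximate them from word order alone. The sketch below is an illustrative heuristic only, not a serious parser; the word lists and the function name classify_clause are invented for the example, and, as the text notes, real classification would also need intonation and context.

```python
# Toy heuristic: guess a clause type from word order alone.
# Illustrative only -- intonation and context are ignored.

AUXILIARIES = {"do", "does", "did", "have", "has", "had", "is", "are",
               "was", "were", "should", "can", "will", "must"}
WH_WORDS = {"who", "whom", "what", "which", "where", "when", "why", "how"}

def classify_clause(clause: str) -> str:
    words = clause.strip().rstrip(".!?").lower().split()
    first = words[0]
    if first in WH_WORDS:
        # Wh-word in clause-initial position
        return "wh-question"
    if first in AUXILIARIES:
        # Verb-first order: yes/no-question, or a condition when embedded
        return "yes/no-question or condition"
    if clause.strip().endswith("!") and first not in ("the", "a"):
        # Crude imperative check: verb-first plus exclamation
        return "imperative"
    return "declarative (SV order)"

print(classify_clause("The pig has not yet been fed."))  # declarative (SV order)
print(classify_clause("Have they done the job?"))        # yes/no-question or condition
print(classify_clause("Where did Larry send Susan?"))    # wh-question
print(classify_clause("Stop laughing!"))                 # imperative
```

Note that, exactly as the examples in the text show, such a heuristic misclassifies clauses whose function is carried by intonation alone: "The pig has not yet been fed?" still comes back as declarative, since nothing in its word order marks it as a question.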
In linguistics, grammar is the set of structural rules governing the composition of clauses, phrases and words in any given natural language. The term also refers to the study of such rules; this field includes morphology, syntax and phonology, complemented by phonetics, semantics and pragmatics. Speakers of a language have a set of internalized rules for using that language, and these rules constitute that language's grammar. The vast majority of the information in the grammar is, at least in the case of one's native language, acquired not by conscious study or instruction but by observing other speakers. Much of this work is done during early childhood. Thus, grammar is the cognitive information underlying language use. The term "grammar" can also be used to describe the rules that govern the linguistic behavior of a group of speakers. The term "English grammar", for example, may have several meanings: it may refer to the whole of English grammar, that is, to the grammars of all the speakers of the language, in which case the term encompasses a great deal of variation.
Alternatively, it may refer only to what is common to the grammars of all, or of the vast majority of, English speakers; or it may refer to the rules of a particular, well-defined variety of English. A specific description, study or analysis of such rules may also be referred to as a grammar, and a reference book describing the grammar of a language is called a "reference grammar" or simply "a grammar". A fully explicit grammar that exhaustively describes the grammatical constructions of a particular lect is called a descriptive grammar. This kind of linguistic description contrasts with linguistic prescription, an attempt to discourage or suppress some grammatical constructions while codifying and promoting others, either in an absolute sense or in reference to a standard variety. For example, preposition stranding occurs in Germanic languages, has a long history in English, and is considered standard usage; John Dryden, however, objected to it, leading other English speakers to avoid the construction and discourage its use. Outside linguistics, the term grammar is used in a rather different sense.
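In the computational setting, a "fully explicit grammar" of the descriptive kind just mentioned can be written down as rewrite rules and checked mechanically. The fragment below is a deliberately tiny, hypothetical grammar covering a handful of English sentences, shown only to illustrate what an explicit set of structural rules looks like in practice; the rules, lexicon and function names are invented for the example and make no claim about English as a whole.

```python
# A minimal context-free grammar fragment and a brute-force recognizer.
# The rules and lexicon are a toy illustration, not a description of English.

RULES = {
    "S":  [["NP", "VP"]],            # a sentence is a noun phrase plus a verb phrase
    "NP": [["Det", "N"]],            # a noun phrase is a determiner plus a noun
    "VP": [["V", "NP"], ["V"]],      # a verb phrase is a verb, with an optional object
}
LEXICON = {
    "Det": {"the", "a"},
    "N":   {"pig", "farmer"},
    "V":   {"fed", "laughed"},
}

def derives(symbol, words):
    """Return True if `symbol` can derive exactly the word sequence `words`."""
    if symbol in LEXICON:                      # terminal category: match one word
        return len(words) == 1 and words[0] in LEXICON[symbol]
    return any(matches(expansion, words) for expansion in RULES.get(symbol, []))

def matches(symbols, words):
    """Return True if the symbol sequence can derive exactly `words`."""
    if not symbols:
        return not words
    head, rest = symbols[0], symbols[1:]
    # Try every split point for the first symbol.
    return any(derives(head, words[:i]) and matches(rest, words[i:])
               for i in range(1, len(words) + 1))

print(matches(["S"], "the farmer fed the pig".split()))  # True
print(matches(["S"], "pig fed the farmer".split()))      # False
```

A descriptive grammar in the linguist's sense is vastly larger and richer than this, but the underlying idea is the same: the rules state which combinations the lect's speakers actually produce, rather than which combinations anyone thinks they ought to produce.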
In some respects, it may be used more broadly, including rules of spelling and punctuation, which linguists would not consider to form part of grammar but rather of orthography, the set of conventions used for writing a language. In other respects, it may be used more narrowly, to refer only to a set of prescriptive norms, excluding those aspects of a language's grammar that are not subject to variation or debate on their normative acceptability. Jeremy Butterfield claimed that, for non-linguists, "Grammar is a generic way of referring to any aspect of English that people object to." The word grammar is derived from Greek γραμματικὴ τέχνη, which means "art of letters", from γράμμα, "letter", itself from γράφειν, "to draw, to write". The same Greek root appears in the words graphics and photograph. Vedic Sanskrit is among the earliest attested languages, and the first systematic grammar, of Sanskrit, originated in Iron Age India, with Yaska, Pāṇini and his commentators Pingala and Patanjali.
Tolkāppiyam, the earliest Tamil grammar, is dated to before the 5th century AD. The Babylonians also made some early attempts at language description. In the West, grammar emerged as a discipline in the Hellenistic period, from the 3rd century BC onward, with authors such as Rhianus and Aristarchus of Samothrace. The oldest known grammar handbook is the Art of Grammar, a succinct guide to speaking and writing, written by the ancient Greek scholar Dionysius Thrax, a student of Aristarchus of Samothrace who established a school on the Greek island of Rhodes. Dionysius Thrax's grammar book remained the primary grammar textbook for Greek schoolboys until as late as the twelfth century AD; the Romans based their grammatical writings on it, and its basic format remains the basis for grammar guides in many languages today. Latin grammar developed by following Greek models from the 1st century BC, due to the work of authors such as Orbilius Pupillus, Remmius Palaemon, Marcus Valerius Probus, Verrius Flaccus and Aemilius Asper.
A grammar of Irish originated in the 7th century with the Auraicept na n-Éces, and Arabic grammar emerged with Abu al-Aswad al-Du'ali in the same century. The first treatises on Hebrew grammar appeared in the context of the Mishnah. The Karaite tradition originated in Abbasid Baghdad, and the Diqduq is one of the earliest grammatical commentaries on the Hebrew Bible. In the 12th century, Ibn Barun compared the Hebrew language with Arabic in the Islamic grammatical tradition. Belonging to the trivium of the seven liberal arts, grammar was taught as a core discipline throughout the Middle Ages, following the influence of authors from Late Antiquity such as Priscian. Treatment of vernaculars began during the High Middle Ages, with isolated works such as the First Grammatical Treatise, but became influential only in the Renaissance and Baroque periods. Antonio de Nebrija published Las introduciones Latinas contrapuesto el romance al Latin in 1486 and the first Spanish grammar, Gramática de la lengua castellana, in 1492.
During the 16th-century Italian Ren