Record (computer science)
In computer science, a record is a basic data structure. Records in a database or spreadsheet are called "rows". A record is a collection of fields, possibly of different data types, in fixed number and sequence; the fields of a record may be called members, particularly in object-oriented programming. For example, a date could be stored as a record containing a numeric year field, a month field represented as a string, and a numeric day-of-month field. A personnel record might contain a name, a salary, and a rank. A circle record might contain a center and a radius; in this instance, the center itself might be represented as a point record containing x and y coordinates. Records are distinguished from arrays by the fact that their number of fields is typically fixed, each field has a name, and each field may have a different type. A record type is a data type that describes such variables. Most modern computer languages allow the programmer to define new record types; the definition includes specifying the data type of each field and an identifier by which it can be accessed.
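As an illustration, here is a minimal sketch in C (all names and values are invented for the example) showing how the date and personnel records described above map onto record types:

```c
#include <stdio.h>

/* A date record: fields of different types, in fixed number and order. */
struct Date {
    int  year;       /* numeric year field            */
    char month[10];  /* month represented as a string */
    int  day;        /* numeric day-of-month field    */
};

/* A personnel record built from simpler named fields. */
struct Employee {
    char   name[32];
    double salary;
    int    rank;
};

int main(void) {
    struct Date d = { 2024, "June", 15 };
    struct Employee e = { "Ada Lovelace", 52000.0, 3 };

    /* Each field is accessed by the identifier in its definition. */
    printf("%s %d, %d\n", d.month, d.day, d.year);
    printf("%s earns %.2f at rank %d\n", e.name, e.salary, e.rank);
    return 0;
}
```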
In type theory, product types (with no field names) are generally preferred due to their simplicity, but proper record types are studied in languages such as System F-sub. Since type-theoretical records may contain first-class function-typed fields in addition to data, they can express many features of object-oriented programming.

Records can exist in any storage medium, including main memory and mass storage devices such as magnetic tapes or hard disks. Records are a fundamental component of most data structures, especially linked data structures. Many computer files are organized as arrays of logical records, often grouped into larger physical records or blocks for efficiency. The parameters of a function or procedure can be viewed as the fields of a record variable. In the call stack, used to implement procedure calls, each entry is an activation record or call frame, containing the procedure parameters and local variables, the return address, and other internal fields. An object in an object-oriented language is essentially a record that contains procedures specialized to handle that record.
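To make that last point concrete, here is a hypothetical sketch in C: giving a record a function-typed field (a function pointer) yields something very close to an object with a method. The Counter type and all of its names are invented for illustration.

```c
#include <stdio.h>

/* A "counter object" as a plain record: data plus a procedure
   specialized to operate on that record. */
struct Counter {
    int count;
    void (*increment)(struct Counter *self);  /* function-typed field */
};

static void counter_increment(struct Counter *self) {
    self->count += 1;
}

int main(void) {
    struct Counter c = { 0, counter_increment };
    c.increment(&c);   /* invoke the record's own procedure */
    c.increment(&c);
    printf("count = %d\n", c.count);  /* prints: count = 2 */
    return 0;
}
```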
In most object-oriented languages, records are just special cases of objects and are known as plain old data structures, in contrast with objects that use OO features. A record can be viewed as the computer analog of a mathematical tuple, although a tuple may or may not be considered a record, and vice versa, depending on conventions and the specific programming language. In the same vein, a record type can be viewed as the computer language analog of the Cartesian product of two or more mathematical sets, or the implementation of an abstract product type in a specific language.

A record may have zero or more keys. A key is a set of fields in the record that serves as an identifier. A key that is unique across all records is called the primary key, or the record key. For example, an employee file might contain employee number, name, department code, and salary; the employee number would be the primary key. Depending on the storage medium and file organization, the employee number might be indexed, that is, stored in a separate file to make lookup faster, as in the sketch below.
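Here is a minimal sketch in C of the idea behind such an index (the records, field names, and numbers are invented for the example): a separate sorted array of keys allows lookup by binary search instead of a scan of every record.

```c
#include <stdio.h>
#include <stdlib.h>

struct Employee {
    int    number;    /* primary key: unique         */
    char   name[32];
    char   dept[8];   /* department code: not unique */
    double salary;
};

/* An index entry pairs a key with the record's position in the file. */
struct IndexEntry {
    int    number;
    size_t pos;
};

static int cmp_entry(const void *a, const void *b) {
    const struct IndexEntry *x = a, *y = b;
    return (x->number > y->number) - (x->number < y->number);
}

int main(void) {
    struct Employee file[] = {
        { 1003, "Grace",  "ENG", 72000.0 },
        { 1001, "Alan",   "ENG", 68000.0 },
        { 1002, "Edsger", "OPS", 70000.0 },
    };
    size_t n = sizeof file / sizeof file[0];

    /* Build and sort the index; in practice it would be kept
       in a separate file alongside the data file. */
    struct IndexEntry idx[3];
    for (size_t i = 0; i < n; i++)
        idx[i] = (struct IndexEntry){ file[i].number, i };
    qsort(idx, n, sizeof idx[0], cmp_entry);

    /* Fast lookup by primary key via binary search on the index. */
    struct IndexEntry key = { 1002, 0 };
    struct IndexEntry *hit = bsearch(&key, idx, n, sizeof idx[0], cmp_entry);
    if (hit)
        printf("found %s\n", file[hit->pos].name);
    return 0;
}
```

A real file system would maintain the index as records are added and deleted; the sketch shows only the lookup path.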
Unlike the employee number, the department code may not be unique; if it is not indexed, the entire employee file would have to be scanned to produce a listing of all employees in a specific department. The salary field would not normally be considered usable as a key at all. Indexing is one factor considered when choosing the key fields of a file.

The concept of a record can be traced to the various types of tables and ledgers used in accounting since ancient times. The modern notion of records in computer science, with fields of well-defined type and size, was already implicit in 19th century mechanical calculators, such as Babbage's Analytical Engine. The original machine-readable medium used for data was the punched card, used for the records of the 1890 United States Census: each punched card was a single record. Records were well established in the first half of the 20th century, when most data processing was done using punched cards; each record of a data file would be recorded in one punched card, with specific columns assigned to specific fields.
A record was typically the smallest unit that could be read in from external storage. Most machine language implementations and early assembly languages did not have special syntax for records, but the concept was available through the use of index registers, indirect addressing, and self-modifying code. Some early computers, such as the IBM 1620, had hardware support for delimiting records and fields, and special instructions for copying such records. The concept of records and fields was also central in some early file sorting and tabulating utilities, such as IBM's Report Program Generator. COBOL was the first widespread programming language to support record types, and its record definition facilities were quite sophisticated at the time. The language allows for the definition of nested records with alphanumeric and fractional fields of arbitrary size and precision, as well as fields that automatically format any value assigned to them (e.g. insertion of currency signs and decimal points).
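COBOL has its own syntax for such definitions; as a rough, hypothetical analogy in C, a nested record is simply one struct embedded in another, like the circle-and-point example from the introduction:

```c
#include <stdio.h>

/* A nested record: the Point record is embedded in the Circle record,
   much as a COBOL group item contains subordinate fields. */
struct Point {
    double x;
    double y;
};

struct Circle {
    struct Point center;  /* nested record field */
    double radius;
};

int main(void) {
    struct Circle c = { { 1.0, 2.0 }, 5.0 };
    /* Nested fields are reached by chaining field identifiers. */
    printf("center = (%.1f, %.1f), radius = %.1f\n",
           c.center.x, c.center.y, c.radius);
    return 0;
}
```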
Computer science
Computer science is the study of processes that interact with data and that can be represented as data in the form of programs. It enables the use of algorithms to manipulate and communicate digital information. A computer scientist studies the theory of computation and the practice of designing software systems. The field can be divided into theoretical and practical disciplines: computational complexity theory is highly abstract, while computer graphics emphasizes real-world applications. Programming language theory considers approaches to the description of computational processes, while computer programming itself involves the use of programming languages and complex systems. Human–computer interaction considers the challenges in making computers useful and accessible.

The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks such as the abacus have existed since antiquity, aiding in computations such as multiplication and division.
Algorithms for performing computations have existed since antiquity, even before the development of sophisticated computing equipment. Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623. In 1673, Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner; he may be considered the first computer scientist and information theorist, among other reasons for documenting the binary number system. In 1820, Thomas de Colmar launched the mechanical calculator industry when he released his simplified arithmometer, the first calculating machine strong enough and reliable enough to be used daily in an office environment. Charles Babbage started the design of the first automatic mechanical calculator, his Difference Engine, in 1822, which eventually gave him the idea of the first programmable mechanical calculator, his Analytical Engine. He started developing this machine in 1834, and "in less than two years, he had sketched out many of the salient features of the modern computer".
"A crucial step was the adoption of a punched card system derived from the Jacquard loom" making it infinitely programmable. In 1843, during the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm to compute the Bernoulli numbers, considered to be the first computer program. Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information. In 1937, one hundred years after Babbage's impossible dream, Howard Aiken convinced IBM, making all kinds of punched card equipment and was in the calculator business to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine, which itself used cards and a central computing unit; when the machine was finished, some hailed it as "Babbage's dream come true". During the 1940s, as new and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors.
As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. In 1945, IBM founded the Watson Scientific Computing Laboratory at Columbia University in New York City; the renovated fraternity house on Manhattan's West Side was IBM's first laboratory devoted to pure science. The lab is the forerunner of IBM's Research Division, which today operates research facilities around the world. The close relationship between IBM and the university was instrumental in the emergence of a new scientific discipline, with Columbia offering one of the first academic-credit courses in computer science in 1946. Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s. The world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science degree program in the United States was formed at Purdue University in 1962.
Since practical computers became available, many applications of computing have become distinct areas of study in their own right. Although many initially believed it impossible that computers themselves could constitute a scientific field of study, in the late fifties the idea became accepted among the greater academic population. It is the now well-known IBM brand that formed part of the computer science revolution during this time: IBM released the IBM 704 and later the IBM 709 computers, which were widely used during the exploration period of such devices. "Still, working with the IBM [computer] was frustrating... if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again". During the late 1950s, the computer science discipline was still very much in its developmental stages, and such issues were commonplace. Time has since seen significant improvements in the usability and effectiveness of computing technology. Modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals to a near-ubiquitous user base.
Computers were quite costly, and some degree of human aid was needed for efficient use, in part from professional computer operators. As computer adoption became more widespread and affordable, less human assistance was needed for common usage. Despite its short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society.
The Art of Computer Programming
The Art of Computer Programming is a comprehensive monograph written by Donald Knuth that covers many kinds of programming algorithms and their analysis. Knuth began the project, originally conceived as a single book with twelve chapters, in 1962. The first three volumes of what was then expected to be a seven-volume set were published in 1968, 1969, and 1973. The first published installment of Volume 4 appeared in paperback as Fascicle 2 in 2005; the hardback Volume 4A, combining Volume 4, Fascicles 0–4, was published in 2011. Volume 4, Fascicle 6 was released in December 2015. Fascicles 5 and 6 are expected to comprise the first two-thirds of Volume 4B, scheduled to be published on May 17, 2019.

After winning a Westinghouse Talent Search scholarship, Knuth enrolled at the Case Institute of Technology, where his performance was so outstanding that the faculty voted to award him a master of science upon his completion of the baccalaureate degree. During his summer vacations, Knuth was hired by the Burroughs Corporation to write compilers, earning more in his summer months than full professors did for an entire year.
Such exploits made Knuth a topic of discussion among the mathematics department, which included Richard S. Varga. Knuth started to write a book about compiler design in 1962, and soon realized that the scope of the book needed to be much larger. In June 1965, Knuth finished the first draft of what was planned to be a single volume of twelve chapters. His hand-written first-draft manuscript was 3000 pages long: he had assumed that about five hand-written pages would translate into one printed page, but his publisher said instead that about 1 1⁄2 hand-written pages translated to one printed page. This meant the book would run to roughly 2000 printed pages (3000 ÷ 1.5). The publisher was nervous about accepting such a project from a graduate student. At this point, Knuth received support from Richard S. Varga, who was the scientific adviser to the publisher. Varga was visiting Olga Taussky-Todd and John Todd at Caltech. With Varga's enthusiastic endorsement, the publisher accepted Knuth's expanded plans. In its expanded version, the book would be published in seven volumes, each with just one or two chapters.
Due to the growth in the material, the plan for Volume 4 has since expanded to include Volumes 4A, 4B, 4C, 4D, and more. In 1976, Knuth prepared a second edition of Volume 2, requiring it to be typeset again, but the style of type used in the first edition was no longer available. In 1977, he decided to spend some time creating something more suitable. Eight years later, he returned with TeX, which is used for all volumes. The offer of a so-called Knuth reward check worth "one hexadecimal dollar" (US$2.56) for any errors found, together with the correction of these errors in subsequent printings, has contributed to the polished and still-authoritative nature of the work, long after its first publication. Another characteristic of the volumes is the variation in the difficulty of the exercises, which ranges from "warm-up" exercises to unsolved research problems. Knuth's dedication reads:

This series of books is affectionately dedicated to the Type 650 computer once installed at Case Institute of Technology, with whom I have spent many pleasant evenings.
All examples in the books use a language called "MIX assembly language", which runs on the hypothetical MIX computer. The MIX computer is being replaced by the MMIX computer, a RISC version. Software such as GNU MDK exists to provide emulation of the MIX architecture. Knuth considers the use of assembly language necessary for the speed and memory usage of algorithms to be judged. Knuth was awarded the 1974 Turing Award "for his major contributions to the analysis of algorithms, in particular for his contributions to the 'art of computer programming' through his well-known books in a continuous series by this title". American Scientist has included this work among "100 or so Books that shaped a Century of Science", referring to the twentieth century, and within the computer science community it is regarded as the first and still the best comprehensive treatment of its subject. Covers of the third edition of Volume 1 quote Bill Gates as saying, "If you think you're a good programmer… read Art of Computer Programming… You should send me a résumé if you can read the whole thing."
The New York Times referred to it as "the profession's defining treatise".

Volume 1 – Fundamental Algorithms
  Chapter 1 – Basic concepts
  Chapter 2 – Information structures
Volume 2 – Seminumerical Algorithms
  Chapter 3 – Random numbers
  Chapter 4 – Arithmetic
Volume 3 – Sorting and Searching
  Chapter 5 – Sorting
  Chapter 6 – Searching
Volume 4A – Combinatorial Algorithms
  Chapter 7 – Combinatorial searching
Volume 4B... – Combinatorial Algorithms
  Chapter 7 – Combinatorial searching (continued)
  Chapter 8 – Recursion
Volume 5 – Syntactic Algorithms
  Chapter 9 – Lexical scanning
  Chapter 10 – Parsing techniques
Volume 6 – The Theory of Context-Free Languages
Volume 7 – Compiler Techniques

Chapter 1 – Basic concepts
  1.1. Algorithms
  1.2. Mathematical Preliminaries
    1.2.1. Mathematical Induction
    1.2.2. Numbers, Powers, and Logarithms
    1.2.3. Sums and Products
    1.2.4. Integer Functions and Elementary Number Theory
    1.2.5. Permutations and Factorials
    1.2.6. Binomial Coefficients
    1.2.7. Harmonic Numbers
    1.2.8. Fibonacci Numbers
    1.2.9. Generating Functions
    1.2.10. Analysis of an Algorithm
Donald Knuth
Donald Ervin Knuth is an American computer scientist and professor emeritus at Stanford University. He is the author of the multi-volume work The Art of Computer Programming. He contributed to the development of the rigorous analysis of the computational complexity of algorithms and systematized formal mathematical techniques for it; in the process he popularized the asymptotic notation. In addition to fundamental contributions in several branches of theoretical computer science, Knuth is the creator of the TeX computer typesetting system, the related METAFONT font definition language and rendering system, and the Computer Modern family of typefaces. As a writer and scholar, Knuth created the WEB and CWEB computer programming systems, designed to encourage and facilitate literate programming, and designed the MIX/MMIX instruction set architectures. Knuth opposes the granting of software patents, having expressed his opinion to the United States Patent and Trademark Office and the European Patent Organisation.

Knuth was born in Milwaukee, Wisconsin, to German-Americans Ervin Henry Knuth and Louise Marie Bohning.
His father had two jobs: running a small printing company and teaching bookkeeping at Milwaukee Lutheran High School. Donald, a student at Milwaukee Lutheran High School, received academic accolades there because of the ingenious ways he thought of solving problems. For example, in eighth grade, he entered a contest to find the number of words that the letters in "Ziegler's Giant Bar" could be rearranged to create. Although the judges had only 2,500 words on their list, Donald found 4,500 words, winning the contest. As prizes, the school received a new television and enough candy bars for all of his schoolmates to eat. In 1956, Knuth received a scholarship to the Case Institute of Technology in Ohio, where he joined the Beta Nu Chapter of the Theta Chi fraternity. While studying physics at Case, Knuth was introduced to the IBM 650, one of the early mainframes. After reading the computer's manual, Knuth decided to rewrite the assembly and compiler code for the machine used in his school, because he believed he could do it better.
In 1958, Knuth created a program to help his school's basketball team win their games. He assigned "values" to players in order to gauge their probability of scoring points, a novel approach that Newsweek and CBS Evening News later reported on. Knuth was one of the founding editors of the Engineering and Science Review, which won a national award as best technical magazine in 1959. He then switched from physics to mathematics, and in 1960 he received his bachelor of science degree, simultaneously being given a master of science degree by a special award of the faculty, who considered his work exceptionally outstanding. In 1963, with mathematician Marshall Hall as his adviser, he earned a PhD in mathematics from the California Institute of Technology. After receiving his PhD, Knuth joined Caltech's faculty as an assistant professor. He accepted a commission to write a book on computer programming language compilers. While working on this project, Knuth decided that he could not adequately treat the topic without first developing a fundamental theory of computer programming, which became The Art of Computer Programming.
He originally planned to publish this as a single book. As Knuth developed his outline for the book, he concluded that he required six volumes, and then seven, to cover the subject. He published the first volume in 1968. Just before publishing the first volume of The Art of Computer Programming, Knuth left Caltech to accept employment with the Institute for Defense Analyses' Communications Research Division, situated on the Princeton University campus, performing mathematical research in cryptography to support the National Security Agency. Knuth then left this position to join the Stanford University faculty, where he is now Fletcher Jones Professor of Computer Science, Emeritus. Knuth is a writer as well as a computer scientist; he has been called the "father of the analysis of algorithms". In the 1970s, Knuth described computer science as "a new field with no real identity, and the standard of available publications was not that high. A lot of the papers coming out were quite wrong.... So one of my motivations was to put straight a story that had been badly told."
By 2011, the first three volumes and part one of volume four of his series had been published. Concrete Mathematics: A Foundation for Computer Science (2nd ed.), which originated as an expansion of the mathematical preliminaries section of Volume 1 of TAoCP, has also been published. Bill Gates has praised the difficulty of the subject matter in The Art of Computer Programming, stating, "If you think you're a good programmer... You should send me a résumé if you can read the whole thing." Knuth is also the author of Surreal Numbers, a mathematical novelette on John Conway's set-theoretic construction of an alternate system of numbers. Instead of simply explaining the subject, the book seeks to show the development of the mathematics. Knuth wanted the book to prepare students for doing creative research. In 1995, Knuth wrote the foreword to the book A=B by Marko Petkovšek, Herbert Wilf and Doron Zeilberger. Knuth is an occasional contributor of language puzzles to Word Ways: The Journal of Recreational Linguistics, and he has delved into recreational mathematics.
He contributed articles to the Journal of Recreational Mathematics beginning in the 1960s, and was acknowledged as a major contributor in Joseph Madachy's Mathematics on Vacation. Knuth has also appeared in a number of Numberphile and Computerphile videos on YouTube.