Leslie Gabriel Valiant is a British computer scientist and computational theorist. He is the T. Jefferson Coolidge Professor of Computer Science and Applied Mathematics at Harvard University. Valiant was educated at King's College, Cambridge, Imperial College London, and the University of Warwick, where he received a PhD in computer science in 1974. Valiant is world-renowned for his work in theoretical computer science. Among his many contributions to complexity theory, he introduced the notion of #P-completeness to explain why enumeration and reliability problems are intractable, introduced the "probably approximately correct" (PAC) model of machine learning that has helped the field of computational learning theory grow, and developed the concept of holographic algorithms. In computer systems, he is best known for introducing the bulk synchronous parallel processing model. His earlier work in automata theory includes an algorithm for context-free parsing that is still the asymptotically fastest known. He also works in computational neuroscience, focusing on understanding memory and learning.
Valiant's 2013 book is Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World. In it he argues, among other things, that evolutionary biology does not explain the rate at which evolution occurs, for example: "The evidence for Darwin's general schema for evolution being correct is convincing to the great majority of biologists; this author has been to enough natural history museums. All this does not mean the current theory of evolution is adequately explanatory. At present the theory of evolution can offer no account of the rate at which evolution progresses to develop complex mechanisms or to maintain them in changing environments." Valiant started teaching at Harvard University in 1982 and is the T. Jefferson Coolidge Professor of Computer Science and Applied Mathematics in the Harvard School of Engineering and Applied Sciences. Prior to 1982 he taught at Carnegie Mellon University, the University of Leeds, and the University of Edinburgh. Valiant received the Nevanlinna Prize in 1986, the Knuth Prize in 1997, the EATCS Award in 2008, and the ACM Turing Award in 2010.
He was elected a Fellow of the Royal Society in 1991, a Fellow of the Association for the Advancement of Artificial Intelligence, and a member of the National Academy of Sciences. Valiant's nomination for the Royal Society reads: Valiant has contributed in a decisive way to the growth of every branch of theoretical computer science. His work is concerned with quantifying mathematically the resource costs of solving problems on a computer. In early work he found the asymptotically fastest algorithm known for recognising context-free languages. At the same time, he pioneered the use of communication properties of graphs for analysing computations. In 1977 he defined the notion of #P-completeness and established its utility in classifying counting or enumeration problems according to computational tractability; the first application was to counting matchings. In 1984 Valiant introduced a definition of inductive learning that for the first time reconciles computational feasibility with applicability to non-trivial classes of logical rules to be learned. More recently, he has devised a scheme for efficient routing of communications in a multiprocessor system.
He showed that the overheads involved in a sparse network need not grow with the size of the system. This establishes, from a theoretical viewpoint, the possibility of efficient general-purpose parallel computers. His two sons, Gregory Valiant and Paul Valiant, are both theoretical computer scientists, on the faculty at Stanford University and Brown University respectively. This article incorporates text available under the CC BY 4.0 license.
A finite-state machine (FSM) or finite-state automaton, finite automaton, or simply state machine, is a mathematical model of computation. It is an abstract machine that can be in exactly one of a finite number of states at any given time; the FSM can change from one state to another in response to some external inputs. An FSM is defined by a list of its states, its initial state, and the conditions for each transition. Finite-state machines are of two types: deterministic finite-state machines and non-deterministic finite-state machines. A deterministic finite-state machine can be constructed equivalent to any non-deterministic one. The behavior of state machines can be observed in many devices in modern society that perform a predetermined sequence of actions depending on a sequence of events with which they are presented. Simple examples are vending machines, which dispense products when the proper combination of coins is deposited; elevators, whose sequence of stops is determined by the floors requested by riders; traffic lights, which change sequence when cars are waiting; and combination locks, which require the input of combination numbers in the proper order.
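The equivalence between deterministic and non-deterministic machines can be made concrete with the classic subset construction, which builds a DFA whose states are sets of NFA states. The sketch below is illustrative only; the dictionary encoding of the transition function and the function names are our own assumptions, not from the text.

```python
def nfa_to_dfa(nfa_delta, start, accepts, alphabet):
    """Subset construction: build a DFA equivalent to a given NFA.

    nfa_delta maps (state, symbol) to a set of successor states;
    the DFA's states are frozensets of NFA states.
    """
    start_set = frozenset([start])
    dfa_delta, seen, work = {}, {start_set}, [start_set]
    while work:
        current = work.pop()
        for symbol in alphabet:
            # The DFA successor is the union of all NFA successors.
            nxt = frozenset(q for s in current
                            for q in nfa_delta.get((s, symbol), ()))
            dfa_delta[(current, symbol)] = nxt
            if nxt not in seen:
                seen.add(nxt)
                work.append(nxt)
    dfa_accepts = {s for s in seen if s & accepts}
    return dfa_delta, start_set, dfa_accepts

def dfa_run(dfa_delta, start_set, dfa_accepts, string):
    """Run the constructed deterministic machine on an input string."""
    state = start_set
    for symbol in string:
        state = dfa_delta[(state, symbol)]
    return state in dfa_accepts
```

For example, an NFA over the alphabet {0, 1} that accepts strings ending in "01" yields a small DFA whose states are the reachable subsets of NFA states.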
The finite-state machine has less computational power than some other models of computation such as the Turing machine. This distinction means there are computational tasks that a Turing machine can do but an FSM cannot; this is because an FSM's memory is limited by the number of states it has. FSMs are studied in the more general field of automata theory. An example of a simple mechanism that can be modeled by a state machine is a turnstile. A turnstile, used to control access to subways and amusement park rides, is a gate with three rotating arms at waist height, one across the entryway. Initially the arms are locked, blocking the entry and preventing patrons from passing through. Depositing a coin or token in a slot on the turnstile unlocks the arms, allowing a single customer to push through. After the customer passes through, the arms are locked again. Considered as a state machine, the turnstile has two possible states: Locked and Unlocked. There are two possible inputs that affect its state: putting a coin in the slot (coin) and pushing the arm (push).
In the locked state, pushing on the arm has no effect. Putting a coin in – that is, giving the machine a coin input – shifts the state from Locked to Unlocked. In the unlocked state, putting additional coins in has no effect. However, a customer pushing through the arms, giving a push input, shifts the state back to Locked. The turnstile state machine can be represented by a state transition table, showing for each possible state the transitions between them and the outputs resulting from each input. It can also be represented by a directed graph called a state diagram. Each state is represented by a node. Edges show the transitions from one state to another; each arrow is labeled with the input that triggers the transition. An input that doesn't cause a change of state is represented by a circular arrow returning to the original state. The arrow into the Locked node from the black dot indicates the initial state. A state is a description of the status of a system that is waiting to execute a transition. A transition is a set of actions to be executed when a condition is fulfilled or when an event is received.
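The turnstile's transition table can be written down directly as a small dictionary. This is a minimal Python sketch; the state and input names mirror the description above, and everything else is illustrative.

```python
# Turnstile FSM transition table: (current state, input) -> next state.
TRANSITIONS = {
    ("Locked", "coin"): "Unlocked",    # a coin unlocks the arms
    ("Locked", "push"): "Locked",      # pushing a locked arm has no effect
    ("Unlocked", "coin"): "Unlocked",  # extra coins have no effect
    ("Unlocked", "push"): "Locked",    # a customer passes; arms lock again
}

def run_turnstile(inputs, state="Locked"):
    """Feed a sequence of inputs to the turnstile; return the final state."""
    for event in inputs:
        state = TRANSITIONS[(state, event)]
    return state
```

Note that the self-loop entries (the "no effect" rows) correspond exactly to the circular arrows in the state diagram.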
For example, when using an audio system to listen to the radio, receiving a "next" stimulus results in moving to the next station. When the system is in the "CD" state, the "next" stimulus results in moving to the next track. Identical stimuli thus trigger different actions depending on the current state. In some finite-state machine representations, it is also possible to associate actions with a state: an entry action, performed when entering the state, and an exit action, performed when exiting the state. Several state transition table types are used; in the most common representation, the combination of current state and input determines the next state. The complete action information is not directly described in the table and can only be added using footnotes. An FSM definition including the full action information is possible using state tables. The Unified Modeling Language (UML) has a notation for describing state machines. UML state machines overcome the limitations of traditional finite-state machines while retaining their main benefits.
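Entry and exit actions can be attached to the states themselves, as in the audio-system example above. The following sketch is hypothetical: the State and Machine classes and the "source" input are illustrative, not from any particular FSM library.

```python
class State:
    """A named state with optional entry and exit actions."""
    def __init__(self, name, on_entry=None, on_exit=None):
        self.name = name
        self.on_entry = on_entry  # performed when entering the state
        self.on_exit = on_exit    # performed when exiting the state

class Machine:
    """Drives transitions from a (state name, input) -> State table."""
    def __init__(self, initial, table):
        self.state = initial
        self.table = table

    def handle(self, event):
        nxt = self.table[(self.state.name, event)]
        if nxt is not self.state:
            if self.state.on_exit:
                self.state.on_exit()   # exit action of the old state
            if nxt.on_entry:
                nxt.on_entry()         # entry action of the new state
        self.state = nxt

# Illustrative audio system: a "source" input toggles radio and CD,
# and each state's entry action fires on the change.
log = []
radio = State("radio", on_entry=lambda: log.append("tune radio"))
cd = State("cd", on_entry=lambda: log.append("spin up disc"))
table = {("radio", "source"): cd, ("cd", "source"): radio}
```

A self-transition (a state mapped back to itself) deliberately skips both actions here; whether self-transitions re-run entry/exit actions is a design choice that varies between notations.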
UML state machines introduce the new concepts of hierarchically nested states and orthogonal regions, while extending the notion of actions. UML state machines have the characteristics of both Moore and Mealy machines: they support actions that depend on both the state of the system and the triggering event, as in Mealy machines, as well as entry and exit actions, which are associated with states rather than transitions, as in Moore machines. The Specification and Description Language (SDL) is a standard from ITU that includes graphical symbols to describe actions in a transition: send an event, receive an event, start a timer, cancel a timer, start another concurrent state machine, and make a decision. SDL embeds basic data types called "Abstract Data Types", an action language, and execution semantics in order to make the finite-state machine executable. There are a large number of variants to represent an FSM, such as the one in figure 3. In addition to their use in modeling reactive systems
The Art of Computer Programming
The Art of Computer Programming is a comprehensive monograph written by Donald Knuth that covers many kinds of programming algorithms and their analysis. Knuth began the project, originally conceived as a single book with twelve chapters, in 1962. The first three volumes of what was then expected to be a seven-volume set were published in 1968, 1969, and 1973. The first published installment of Volume 4 appeared in paperback as Fascicle 2 in 2005; the hardback Volume 4A, combining Volume 4, Fascicles 0–4, was published in 2011. Volume 4, Fascicle 6 was released in December 2015. Fascicles 5 and 6 are expected to comprise the first two-thirds of Volume 4B, scheduled to be published on May 17, 2019. After winning a Westinghouse Talent Search scholarship, Knuth enrolled at the Case Institute of Technology, where his performance was so outstanding that the faculty voted to award him a master of science degree upon his completion of the baccalaureate degree. During his summer vacations, Knuth was hired by the Burroughs Corporation to write compilers, earning more in his summer months than full professors did for an entire year.
Such exploits made Knuth a topic of discussion among the mathematics department, which included Richard S. Varga. Knuth started to write a book about compiler design in 1962 and soon realized that the scope of the book needed to be much larger. In June 1965, Knuth finished the first draft of what was planned to be a single volume of twelve chapters. His hand-written first-draft manuscript was 3,000 pages long: he had assumed that about five hand-written pages would translate into one printed page, but his publisher said instead that about 1 1⁄2 hand-written pages translated to one printed page. This meant the book would be about 2,000 pages in length. The publisher was nervous about accepting such a project from a graduate student. At this point, Knuth received support from Richard S. Varga, the scientific adviser to the publisher. Varga was visiting Olga Taussky-Todd and John Todd at Caltech. With Varga's enthusiastic endorsement, the publisher accepted Knuth's expanded plans. In its expanded version, the book would be published in seven volumes, each with just one or two chapters.
Due to the growth in the material, the plan for Volume 4 has since expanded to include Volumes 4A, 4B, 4C, 4D, and more. In 1976, Knuth prepared a second edition of Volume 2, requiring it to be typeset again, but the style of type used in the first edition was no longer available. In 1977, he decided to spend some time creating something more suitable. Eight years later, he returned with TeX, which is now used for all volumes. The offer of a so-called Knuth reward check worth "one hexadecimal dollar" for any errors found, and the correction of these errors in subsequent printings, has contributed to the polished and still-authoritative nature of the work, long after its first publication. Another characteristic of the volumes is the variation in the difficulty of the exercises; the level of difficulty ranges from "warm-up" exercises to unsolved research problems. Knuth's dedication reads: This series of books is affectionately dedicated to the Type 650 computer once installed at Case Institute of Technology, with whom I have spent many pleasant evenings.
All examples in the books use a language called "MIX assembly language", which runs on the hypothetical MIX computer. The MIX computer is being replaced by the MMIX computer, a RISC version. Software such as GNU MDK exists to provide emulation of the MIX architecture. Knuth considers the use of assembly language necessary for the speed and memory usage of algorithms to be judged. Knuth was awarded the 1974 Turing Award "for his major contributions to the analysis of algorithms, in particular for his contributions to the 'art of computer programming' through his well-known books in a continuous series by this title." American Scientist has included this work among "100 or so Books that shaped a Century of Science", referring to the twentieth century, and within the computer science community it is regarded as the first and still the best comprehensive treatment of its subject. Covers of the third edition of Volume 1 quote Bill Gates as saying, "If you think you're a good programmer… read Art of Computer Programming… You should send me a résumé if you can read the whole thing."
The New York Times referred to it as "the profession's defining treatise". The planned outline of the series is:

Volume 1 – Fundamental Algorithms
  Chapter 1 – Basic concepts
  Chapter 2 – Information structures
Volume 2 – Seminumerical Algorithms
  Chapter 3 – Random numbers
  Chapter 4 – Arithmetic
Volume 3 – Sorting and Searching
  Chapter 5 – Sorting
  Chapter 6 – Searching
Volume 4A – Combinatorial Algorithms
  Chapter 7 – Combinatorial searching
Volume 4B... – Combinatorial Algorithms
  Chapter 7 – Combinatorial searching (continued)
  Chapter 8 – Recursion
Volume 5 – Syntactic Algorithms
  Chapter 9 – Lexical scanning
  Chapter 10 – Parsing techniques
Volume 6 – The Theory of Context-Free Languages
Volume 7 – Compiler Techniques

Chapter 1 – Basic concepts
  1.1. Algorithms
  1.2. Mathematical Preliminaries
    1.2.1. Mathematical Induction
    1.2.2. Numbers and Logarithms
    1.2.3. Sums and Products
    1.2.4. Integer Functions and Elementary Number Theory
    1.2.5. Permutations and Factorials
    1.2.6. Binomial Coefficients
    1.2.7. Harmonic Numbers
    1.2.8. Fibonacci Numbers
    1.2.9. Generating Functions
    1.2.10. Analysis of an Algorithm
Donald Ervin Knuth is an American computer scientist and professor emeritus at Stanford University. He is the author of the multi-volume work The Art of Computer Programming. He contributed to the development of the rigorous analysis of the computational complexity of algorithms and systematized formal mathematical techniques for it; in the process he popularized asymptotic notation. In addition to fundamental contributions in several branches of theoretical computer science, Knuth is the creator of the TeX computer typesetting system, the related METAFONT font definition language and rendering system, and the Computer Modern family of typefaces. As a writer and scholar, Knuth created the WEB and CWEB computer programming systems designed to encourage and facilitate literate programming, and designed the MIX/MMIX instruction set architectures. Knuth opposes granting software patents, having expressed his opinion to the United States Patent and Trademark Office and the European Patent Organisation. Knuth was born in Milwaukee, Wisconsin, to German-Americans Ervin Henry Knuth and Louise Marie Bohning.
His father had two jobs: running a small printing company and teaching bookkeeping at Milwaukee Lutheran High School. Donald, a student at Milwaukee Lutheran High School, received academic accolades there because of the ingenious ways he thought of solving problems. For example, in eighth grade, he entered a contest to find the number of words that the letters in "Ziegler's Giant Bar" could be rearranged to create. Although the judges had only 2,500 words on their list, Donald found 4,500 words, winning the contest. As prizes, the school received a new television and enough candy bars for all of his schoolmates to eat. In 1956, Knuth received a scholarship to the Case Institute of Technology in Ohio, where he joined the Beta Nu Chapter of the Theta Chi fraternity. While studying physics at Case, Knuth was introduced to the IBM 650, one of the early mainframes. After reading the computer's manual, Knuth decided to rewrite the assembly and compiler code for the machine used in his school, because he believed he could do it better.
In 1958, Knuth created a program to help his school's basketball team win their games. He assigned "values" to players in order to gauge their probability of scoring points, a novel approach that Newsweek and CBS Evening News reported on. Knuth was one of the founding editors of the Engineering and Science Review, which won a national award as best technical magazine in 1959. He switched from physics to mathematics, and in 1960 he received his bachelor of science degree, simultaneously being given a master of science degree by a special award of the faculty, who considered his work exceptionally outstanding. In 1963, with mathematician Marshall Hall as his adviser, he earned a PhD in mathematics from the California Institute of Technology. After receiving his PhD, Knuth joined Caltech's faculty as an assistant professor and accepted a commission to write a book on computer programming language compilers. While working on this project, Knuth decided that he could not adequately treat the topic without first developing a fundamental theory of computer programming, which became The Art of Computer Programming.
He planned to publish this as a single book, but as Knuth developed his outline, he concluded that he required six volumes, and then seven, to cover the subject. He published the first volume in 1968. Just before publishing the first volume of The Art of Computer Programming, Knuth left Caltech to accept employment with the Institute for Defense Analyses' Communications Research Division, situated on the Princeton University campus, performing mathematical research in cryptography to support the National Security Agency. Knuth later left this position to join the Stanford University faculty, where he is now Fletcher Jones Professor of Computer Science, Emeritus. Knuth is a writer as well as a computer scientist. Knuth has been called the "father of the analysis of algorithms". In the 1970s, Knuth described computer science as "a new field with no real identity, and the standard of available publications was not that high. A lot of the papers coming out were quite wrong.... So one of my motivations was to put straight a story that had been badly told."
By 2011, the first three volumes and part one of volume four of his series had been published. Concrete Mathematics: A Foundation for Computer Science (2nd ed.), which originated as an expansion of the mathematical preliminaries section of Volume 1 of TAoCP, has also been published. Bill Gates has praised the difficulty of the subject matter in The Art of Computer Programming, stating, "If you think you're a good programmer... You should send me a résumé if you can read the whole thing." Knuth is the author of Surreal Numbers, a mathematical novelette on John Conway's set theory construction of an alternate system of numbers. Instead of explaining the subject, the book seeks to show the development of the mathematics. Knuth wanted the book to prepare students for doing creative research. In 1995, Knuth wrote the foreword to the book A=B by Marko Petkovšek, Herbert Wilf and Doron Zeilberger. Knuth is an occasional contributor of language puzzles to Word Ways: The Journal of Recreational Linguistics. Knuth has also delved into recreational mathematics.
He contributed articles to the Journal of Recreational Mathematics beginning in the 1960s, and was acknowledged as a major contributor in Joseph Madachy's Mathematics on Vacation. Knuth has appeared in a number of Numberphile and Computerphile videos on YouTube where he has discussed topics f
In mathematics and computer science, an algorithm is an unambiguous specification of how to solve a class of problems. Algorithms can perform calculation, data processing, automated reasoning, and other tasks. As an effective method, an algorithm can be expressed within a finite amount of space and time and in a well-defined formal language for calculating a function. Starting from an initial state and initial input, the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, producing "output" and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input. The concept of algorithm has existed for centuries. Greek mathematicians used algorithms in the sieve of Eratosthenes for finding prime numbers, and the Euclidean algorithm for finding the greatest common divisor of two numbers. The word algorithm itself is derived from the name of the 9th-century mathematician Muḥammad ibn Mūsā al-Khwārizmī, Latinized Algoritmi.
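The sieve of Eratosthenes mentioned above is a good example of an algorithm that can be stated as a short, unambiguous set of instructions. Here is an illustrative Python rendering; the function name is our own.

```python
def primes_up_to(n):
    """Sieve of Eratosthenes: mark the multiples of each prime as
    composite, then collect the numbers that were never marked."""
    is_prime = [True] * (n + 1)
    is_prime[0:2] = [False, False]          # 0 and 1 are not prime
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            # Start at p*p: smaller multiples of p were already marked.
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [i for i, flag in enumerate(is_prime) if flag]
```

The procedure terminates after a finite, predictable number of steps for any input n, which is exactly the property the surrounding definition of "algorithm" demands.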
A partial formalization of what would become the modern concept of algorithm began with attempts to solve the Entscheidungsproblem posed by David Hilbert in 1928. Formalizations were framed as attempts to define "effective calculability" or "effective method"; those formalizations included the Gödel–Herbrand–Kleene recursive functions of 1930, 1934 and 1935, Alonzo Church's lambda calculus of 1936, Emil Post's Formulation 1 of 1936, and Alan Turing's Turing machines of 1936–37 and 1939. The word 'algorithm' has its roots in Latinizing the name of Muhammad ibn Musa al-Khwarizmi in a first step to algorismus. Al-Khwārizmī was a Persian mathematician, astronomer and scholar in the House of Wisdom in Baghdad, whose name means 'the native of Khwarazm', a region that was part of Greater Iran and is now in Uzbekistan. About 825, al-Khwarizmi wrote an Arabic-language treatise on the Hindu–Arabic numeral system, which was translated into Latin during the 12th century under the title Algoritmi de numero Indorum. This title means "Algoritmi on the numbers of the Indians", where "Algoritmi" was the translator's Latinization of al-Khwarizmi's name.
Al-Khwarizmi was the most widely read mathematician in Europe in the late Middle Ages, through another of his books, the Algebra. In late medieval Latin, English 'algorism', the corruption of his name, meant the "decimal number system". In the 15th century, under the influence of the Greek word ἀριθμός ('number'), the Latin word was altered to algorithmus, and the corresponding English term 'algorithm' is first attested in the 17th century. In English, algorism was first used in about 1230 and then by Chaucer in 1391. English later adopted the French term, but it wasn't until the late 19th century that "algorithm" took on the meaning that it has in modern English. Another early use of the word is from 1240, in a manual titled Carmen de Algorismo composed by Alexandre de Villedieu. It begins thus: Haec algorismus ars praesens dicitur, in qua / Talibus Indorum fruimur bis quinque figuris. Which translates as: Algorism is the art by which at present we use those Indian figures, which number two times five. The poem is a few hundred lines long and summarizes the art of calculating with the new style of Indian dice, or Talibus Indorum, or Hindu numerals.
An informal definition could be "a set of rules that defines a sequence of operations", which would include all computer programs, including programs that do not perform numeric calculations. A program is only an algorithm if it stops eventually. A prototypical example of an algorithm is the Euclidean algorithm to determine the greatest common divisor of two integers. Boolos & Jeffrey (1974, 1999) offer an informal meaning of the word in the following quotation: No human being can write fast enough, or long enough, or small enough to list all members of an enumerably infinite set by writing out their names, one after another, in some notation. But humans can do something useful, in the case of certain enumerably infinite sets: They can give explicit instructions for determining the nth member of the set, for arbitrary finite n. Such instructions are to be given quite explicitly, in a form in which they could be followed by a computing machine, or by a human capable of carrying out only elementary operations on symbols.
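The Euclidean algorithm named above can itself be written in a few lines; this Python rendering is illustrative.

```python
def gcd(m, n):
    """Euclidean algorithm: repeatedly replace (m, n) with (n, m mod n)
    until the remainder is zero; the last non-zero value is the GCD."""
    while n != 0:
        m, n = n, m % n
    return m
```

It also satisfies the informal definition in the text: it is a finite set of rules and it always stops eventually, because the remainder strictly decreases on every step.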
An "enumerably infinite set" is one whose elements can be put into one-to-one correspondence with the integers. Thus Boolos and Jeffrey are saying that an algorithm implies instructions for a process that "creates" output integers from an arbitrary "input" integer or integers that, in theory, can be arbitrarily large. For example, an algorithm can be an algebraic equation such as y = m + n, with two arbitrary "input variables" m and n that produce an output y. But various authors' attempts to define the notion indicate that the word implies much more than this, something on the order of: Precise instructions for a fast, efficient, "good" process that specifies the "moves" of "the computer" to find and process arbitrary input integers/symbols m and n, symbols + and =, and "effectively" produce, in a "reasonable" time, output-integer y at a specified place and in a specified format.
Computer science is the study of processes that interact with data and that can be represented as data in the form of programs. It enables the use of algorithms to manipulate and communicate digital information. A computer scientist studies the theory of computation and the practice of designing software systems. Its fields can be divided into theoretical and practical disciplines. Computational complexity theory is abstract, while computer graphics emphasizes real-world applications. Programming language theory considers approaches to the description of computational processes, while computer programming itself involves the use of programming languages and complex systems. Human–computer interaction considers the challenges in making computers useful and accessible. The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks such as the abacus have existed since antiquity, aiding in computations such as multiplication and division.
Algorithms for performing computations have existed since antiquity, even before the development of sophisticated computing equipment. Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623. In 1673, Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner. Leibniz may be considered the first computer scientist and information theorist, for, among other reasons, documenting the binary number system. In 1820, Thomas de Colmar launched the mechanical calculator industry when he released his simplified arithmometer, the first calculating machine strong enough and reliable enough to be used daily in an office environment. Charles Babbage started the design of the first automatic mechanical calculator, his Difference Engine, in 1822, which gave him the idea of the first programmable mechanical calculator, his Analytical Engine. He started developing this machine in 1834, and "in less than two years, he had sketched out many of the salient features of the modern computer".
"A crucial step was the adoption of a punched card system derived from the Jacquard loom", making it infinitely programmable. In 1843, during the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm to compute the Bernoulli numbers, which is considered to be the first computer program. Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information. In 1937, one hundred years after Babbage's impossible dream, Howard Aiken convinced IBM, which was making all kinds of punched card equipment and was also in the calculator business, to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine, which itself used cards and a central computing unit. When the machine was finished, some hailed it as "Babbage's dream come true". During the 1940s, as new and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors.
As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. In 1945, IBM founded the Watson Scientific Computing Laboratory at Columbia University in New York City; the renovated fraternity house on Manhattan's West Side was IBM's first laboratory devoted to pure science. The lab is the forerunner of IBM's Research Division, which today operates research facilities around the world; the close relationship between IBM and the university was instrumental in the emergence of a new scientific discipline, with Columbia offering one of the first academic-credit courses in computer science in 1946. Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s; the world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science degree program in the United States was formed at Purdue University in 1962.
Since practical computers became available, many applications of computing have become distinct areas of study in their own right. Although many believed it was impossible that computers themselves could be a scientific field of study, in the late fifties it became accepted among the greater academic population. It is the now well-known IBM brand that formed part of the computer science revolution during this time. IBM released the IBM 704 and the IBM 709 computers, which were used during the exploration period of such devices. "Still, working with the IBM was frustrating: if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again". During the late 1950s, the computer science discipline was very much in its developmental stages, and such issues were commonplace. Time has since seen significant improvements in the effectiveness of computing technology. Modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals to a near-ubiquitous user base.
Initially, computers were quite costly, and some degree of human aid was needed for efficient use, in part from professional computer operators. As computer adoption became more widespread and affordable, less human assistance was needed for common usage. Despite its short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society; in fact, along with electronics, it is
Air Force Research Laboratory
The Air Force Research Laboratory (AFRL) is a scientific research organization operated by the United States Air Force Materiel Command. It is dedicated to leading the discovery and integration of affordable aerospace warfighting technologies and executing the Air Force science and technology program, providing warfighting capabilities to United States air and cyberspace forces. It controls the entire Air Force science and technology research budget, which was $2.4 billion in 2006. The Laboratory was formed at Wright-Patterson Air Force Base, Ohio, on 31 October 1997 as a consolidation of four Air Force laboratory facilities and the Air Force Office of Scientific Research under a unified command. The Laboratory is composed of seven technical directorates, one wing, and the Office of Scientific Research. Each technical directorate emphasizes a particular area of research within the AFRL mission, which it specializes in, performing experiments in conjunction with universities and contractors. Since the Laboratory's formation in 1997, it has conducted numerous experiments and technical demonstrations in conjunction with NASA, the Department of Energy National Laboratories, DARPA, and other research organizations within the Department of Defense.
Notable projects include the X-37, X-40, X-53, HTV-3X, YAL-1A, the Advanced Tactical Laser, and the Tactical Satellite Program. The Laboratory may face problems in the future, as 40 percent of its workers are slated to retire over the next two decades, while since 1980 the United States has not produced enough science and engineering degrees to keep up with demand. The Air Force Cambridge Research Laboratories, established in 1945, were active until 2011, when they were consolidated to Wright-Patterson Air Force Base and Kirtland Air Force Base under the 2005 Base Realignment and Closure Commission. The labs were founded as the Air Force Cambridge Research Center (AFCRC), a Cold War systems development organization which developed telephone modem communications for a Digital Radar Relay in 1949. Created by General Henry H. Arnold in 1945, AFCRC participated in Project Space Track and in the development of the Semi-Automatic Ground Environment. The path to a consolidated Air Force Research Laboratory began with the passage of the Goldwater–Nichols Act, designed to streamline the use of resources by the Department of Defense.
In addition to this Act, the end of the Cold War began a period of budgetary and personnel reductions within the armed forces, in preparation for a "stand-down" transition out of readiness for a global war with the Soviet Union. Prior to 1990, the Air Force laboratory system spread research across 13 different laboratories and the Rome Air Development Center, each of which reported up two separate chains of command: to a product center for personnel matters, and to the Air Force Systems Command Director of Science & Technology for budgetary purposes. Bowing to the constraints of a reduced budget and workforce, the Air Force merged the existing research laboratories into four "superlabs" in December 1990. During this same period, Air Force Systems Command and Air Force Logistics Command merged to form Air Force Materiel Command in July 1992. While the initial consolidation of Air Force laboratories reduced overhead and budgetary pressure, another push towards a unified laboratory structure came in the form of the National Defense Authorization Act for Fiscal Year 1996, Section 277.
This section instructed the Department of Defense to produce a five-year plan for the consolidation and restructuring of all defense laboratories. The existing laboratory structure was created in October 1997 through the consolidation of Phillips Laboratory, headquartered in Albuquerque, New Mexico; Wright Laboratory in Dayton, Ohio; Rome Laboratory in Rome, New York; Armstrong Laboratory in San Antonio, Texas; and the Air Force Office of Scientific Research. The single-laboratory concept was developed and championed by Maj Gen Richard Paul, Director of Science & Technology for AFMC under Gen Henry Viccellio Jr; Paul became the first Commander of AFRL. With the merger of the laboratories into a single entity, the history offices at each site ceased to maintain independent histories, and all history functions were transferred to a central History Office located at AFRL HQ at Wright-Patterson AFB. In homage to the predecessor laboratories, the new organization named four of the research sites after them and ensured that each laboratory's history would be preserved with the inactivated units.
The laboratory is divided into eight Technical Directorates, one wing, and the Air Force Office of Scientific Research, based on different areas of research. AFOSR is a funding body for external research, while the other directorates perform research in-house or under contract to external entities. A directorate is equivalent to a military wing; each directorate is composed of a number of divisions and has at least three support divisions in addition to its research divisions. The Operations and Integration Division provides the directorate with business computing, human resource management, and business development services; the Financial Management Division manages the financial resources; and the Procurement Division provides an in-house contracting capability. The support divisions at any given location work together to minimize overhead at that research site. Each division is further broken down into branches, each equivalent to a military squadron. Superimposed on the overall AFRL structure are the eight detachments.
Each detachment is composed of the AFRL military personnel at any given geographical location. For example, the personnel