A regular expression (regex or regexp) is a sequence of characters that defines a search pattern. Such patterns are used by string-searching algorithms for "find" or "find and replace" operations on strings, and for input validation; the technique developed in formal language theory. The concept arose in the 1950s, when the American mathematician Stephen Cole Kleene formalized the description of a regular language, and it came into common use with Unix text-processing utilities. Since the 1980s, different syntaxes for writing regular expressions have existed, one being the POSIX standard and another, widely used, being the Perl syntax. Regular expressions are used in search engines, in search-and-replace dialogs of word processors and text editors, in text-processing utilities such as sed and AWK, and in lexical analysis. Many programming languages provide regex capabilities, either built in or via libraries; the phrase regular expressions (or regexes) is often used to mean the specific, standard textual syntax for representing patterns for matching text.
Each character in a regular expression is either a metacharacter, having a special meaning, or a regular character with a literal meaning. For example, in the regex a., a is a literal character that matches just 'a', while '.' is a metacharacter that matches every character except a newline. Therefore, this regex matches, for example, 'a ' (a followed by a space), 'ax', or 'a0'. Together, metacharacters and literal characters can be used to identify text of a given pattern or to process a number of instances of it. Pattern matches may vary from a precise equality to a general similarity, as controlled by the metacharacters. For example, . is a very general pattern, [a-z] (matching any lowercase letter) is less general, and a is a precise pattern. The metacharacter syntax is designed to represent prescribed targets in a concise and flexible way, directing the automation of text processing of a variety of input data, in a form easy to type using a standard ASCII keyboard. A simple use of a regular expression in this syntax is to locate a word spelled two different ways in a text editor: the regular expression seriali[sz]e matches both "serialise" and "serialize".
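These two examples can be exercised with Python's re module; the sample strings tested here are illustrative:

```python
import re

# In "a.", the 'a' is a literal and '.' is a metacharacter matching
# any single character except a newline.
assert re.match(r"a.", "ax")
assert re.match(r"a.", "a0")
assert re.match(r"a.", "a") is None      # '.' must consume a character
assert re.match(r"a.", "a\n") is None    # '.' does not match a newline

# "seriali[sz]e" matches both spellings of the word.
assert re.fullmatch(r"seriali[sz]e", "serialise")
assert re.fullmatch(r"seriali[sz]e", "serialize")
```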
Wildcards achieve this too, but are more limited in what they can pattern, as they have fewer metacharacters and a simpler language base. The usual context of wildcard characters is in globbing similar names in a list of files, whereas regexes are employed in applications that pattern-match text strings in general. For example, the regex ^[ \t]+|[ \t]+$ matches excess whitespace at the beginning or end of a line. An advanced regular expression that matches any numeral is [+-]?(\d+(\.\d*)?|\.\d+)([eE][+-]?\d+)?. A regex processor translates a regular expression in the above syntax into an internal representation that can be executed and matched against a string representing the text being searched. One possible approach is Thompson's construction algorithm, which builds a nondeterministic finite automaton (NFA) that is then made deterministic; the resulting deterministic finite automaton (DFA) is run on the target text string to recognize substrings that match the regular expression. The picture shows the NFA scheme N(s*) obtained from the regular expression s*, where s denotes a simpler regular expression that has in turn been recursively translated to the NFA N(s).
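The whitespace-trimming and numeral patterns above can be demonstrated with Python's re module; the sample strings are illustrative:

```python
import re

# Excess whitespace at the beginning or end of each line.
trim = re.compile(r"^[ \t]+|[ \t]+$", re.MULTILINE)
assert trim.sub("", "  hello  \n\tworld ") == "hello\nworld"

# A numeral: optional sign, integer and/or fractional part, optional exponent.
numeral = re.compile(r"[+-]?(\d+(\.\d*)?|\.\d+)([eE][+-]?\d+)?")
for s in ("42", "-3.14", ".5", "+6.02e23"):
    assert numeral.fullmatch(s), s
```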
Regular expressions originated in 1951, when mathematician Stephen Cole Kleene described regular languages using his mathematical notation called regular sets. These arose in theoretical computer science, in the subfields of automata theory and the description and classification of formal languages. Other early implementations of pattern matching include the SNOBOL language, which did not use regular expressions but instead its own pattern-matching constructs. Regular expressions entered popular use from 1968 in two applications: pattern matching in a text editor and lexical analysis in a compiler. Among the first appearances of regular expressions in program form was when Ken Thompson built Kleene's notation into the editor QED as a means to match patterns in text files. For speed, Thompson implemented regular expression matching by just-in-time compilation to IBM 7094 code on the Compatible Time-Sharing System, an important early example of JIT compilation. He later added this capability to the Unix editor ed, which led to the popular search tool grep's use of regular expressions.
Around the same time that Thompson developed QED, a group of researchers including Douglas T. Ross implemented a tool based on regular expressions that was used for lexical analysis in compiler design. Many variations of these original forms of regular expressions were used in Unix programs at Bell Labs in the 1970s, including vi, sed, AWK, and expr, and in other programs such as Emacs. Regexes were subsequently adopted by a wide range of programs, with these early forms standardized in the POSIX.2 standard in 1992. In the 1980s, the more complicated regexes arose in Perl, which originally derived from a regex library written by Henry Spencer, who later wrote an implementation of Advanced Regular Expressions for Tcl; the Tcl library is a hybrid NFA/DFA implementation with improved performance characteristics. Software projects that have adopted Spencer's Tcl regular expression implementation include PostgreSQL. Perl expanded on Spencer's original library
The IBM 7090 is a second-generation, transistorized version of the earlier IBM 709 vacuum tube mainframe computer, designed for "large-scale scientific and technological applications". The 7090 is the third member of the IBM 700/7000 series of scientific computers; the first 7090 installation was in November 1959. In 1960, a typical system could be rented for $63,500 a month. The 7090 uses a 36-bit word length, with an address space of 32,768 words. It operates with a basic memory cycle of 2.18 μs, using the IBM 7302 Core Storage core memory technology from the IBM 7030 project. With a processing speed of around 100 Kflop/s, the 7090 is six times faster than the 709 and could be rented for half the price. Although the 709 was a superior machine to its predecessor, the 704, it was being built and sold at the time that transistor circuitry was supplanting vacuum tube circuits. Hence, IBM redeployed its 709 engineering group to the design of a transistorized successor; that project was called the 709-T, which, because of the sound when spoken, shifted to the nomenclature 7090.
Related machines such as the 7070 and other 7000 series equipment were sometimes called by names of the form digit-digit-decade. An upgraded version, the IBM 7094, was first installed in September 1962. It has seven index registers, instead of three on the earlier machines, and its console has a distinctive box on top. The 7094 introduced double-precision floating point and additional instructions, but is backward compatible with the 7090. Minor changes in instruction formats, and in the way the additional index registers are addressed, sometimes caused problems. On the earlier models, when more than one bit is set in the tag field, the contents of the two or three selected index registers are ORed, not added, together before the decrement takes place. On the 7094, if the three-bit tag field is not zero, it selects just one of seven index registers; however, the "or" behavior remains available in a "multiple tag" compatibility mode. In April 1964, the first 7094 II was installed; it had almost twice the general speed of the 7090 due to a faster clock cycle, dual memory banks, and improved overlap of instruction execution, an early instance of pipelined design.
In 1963, IBM introduced two new, lower-cost machines called the IBM 7040 and 7044. They have a 36-bit architecture based on the 7090, but with some instructions omitted or optional, and simplified input/output that allows the use of more modern, higher-performance peripherals from the IBM 1400 series. The 7094/7044 Direct Coupled System (DCS) was developed by an IBM customer, the Aerospace Corporation, seeking greater cost efficiency and scheduling flexibility than IBM's IBSYS tape operating system provided. DCS used a less expensive IBM 7044 to handle input/output, with the 7094 performing computation. Aerospace developed the Direct Couple operating system, an extension to IBSYS, which was shared with other IBM customers, and IBM later introduced the DCS as a product. The 7090 uses more than 50,000 germanium alloy-junction transistors and germanium diffused-junction drift transistors, on Standard Modular System cards using current-mode logic, some using diffused-junction drift transistors. The basic instruction format is the same as the IBM 709: a three-bit prefix, 15-bit decrement, three-bit tag, and 15-bit address.
The prefix field specifies the class of instruction. The decrement field contains an immediate operand to modify the results of the operation, or is used to further define the instruction type. The three bits of the tag specify three index registers, the contents of which are subtracted from the address to produce an effective address. The address field contains either an address or an immediate operand. Fixed-point numbers are stored in binary sign/magnitude format. Single-precision floating-point numbers have a magnitude sign, an eight-bit excess-128 exponent, and a 27-bit magnitude. Double-precision floating-point numbers, introduced on the 7094, have a magnitude sign, an eight-bit excess-128 exponent, and a 54-bit magnitude; the double-precision number is stored in memory in an even-odd pair of consecutive words. Alphanumeric characters are six-bit BCD, packed six to a word. Octal notation is used in programming. The 7090 series features a data channel architecture for input and output, a forerunner of modern direct memory access (DMA) I/O.
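As a sketch of how these four fields pack into a 36-bit word (the decode_7090_word helper and the sample field values are illustrative, not taken from any IBM manual):

```python
def decode_7090_word(word: int):
    """Split a 36-bit 709/7090-style instruction word into the four
    fields described above: a 3-bit prefix, 15-bit decrement, 3-bit tag,
    and 15-bit address (3 + 15 + 3 + 15 = 36 bits)."""
    address   =  word        & 0o77777   # low 15 bits
    tag       = (word >> 15) & 0o7       # next 3 bits select index registers
    decrement = (word >> 18) & 0o77777   # next 15 bits
    prefix    = (word >> 33) & 0o7       # top 3 bits: instruction class
    return prefix, decrement, tag, address

# Octal notation, as the text notes, was the norm in 7090 programming.
word = (0o4 << 33) | (0o12345 << 18) | (0o3 << 15) | 0o54321
assert decode_7090_word(word) == (0o4, 0o12345, 0o3, 0o54321)
```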
Up to eight data channels can be attached, with up to ten IBM 729 tape drives attached to each channel. The data channels have their own limited set of operations, called commands; these are used with tape storage as well as card units and printers, and offered high performance for the time. Printing and punched-card I/O employed the same modified unit record equipment introduced with the 704 and was slow. It became common to use a less expensive IBM 1401 computer to read cards onto magnetic tape for transfer to the 7090/94. Output would be spooled onto tape and transferred to the 1401 for printing or card punching using its much faster peripherals, notably the IBM 1403 line printer. IBM introduced the 7094/7044 Direct Coupled System, using data channel to data channel communication, with the 7094 pri
Fortran is a general-purpose, compiled imperative programming language, suited to numeric computation and scientific computing. Developed by IBM in the 1950s for scientific and engineering applications, FORTRAN came to dominate this area of programming early on and has been in continuous use for over half a century in computationally intensive areas such as numerical weather prediction, finite element analysis, computational fluid dynamics, computational physics, and computational chemistry. It is a popular language for high-performance computing and is used for programs that benchmark and rank the world's fastest supercomputers. Fortran encompasses a lineage of versions, each of which evolved to add extensions to the language while retaining compatibility with prior versions. Successive versions have added support for structured programming and processing of character-based data, array programming, modular programming and generic programming, high-performance Fortran, object-oriented programming, and concurrent programming.
Fortran's design was the basis for many other programming languages. Among the better known is BASIC, which is based on FORTRAN II with a number of syntax cleanups, notably better logical structures, and other changes to work more easily in an interactive environment. The names of earlier versions of the language through FORTRAN 77 were conventionally spelled in all capitals; the capitalization has been dropped in referring to newer versions beginning with Fortran 90, and the official language standards now refer to the language as "Fortran" rather than all-caps "FORTRAN". In late 1953, John W. Backus submitted a proposal to his superiors at IBM to develop a more practical alternative to assembly language for programming their IBM 704 mainframe computer. Backus' historic FORTRAN team consisted of programmers Richard Goldberg, Sheldon F. Best, Harlan Herrick, Peter Sheridan, Roy Nutt, Robert Nelson, Irving Ziller, Lois Haibt, and David Sayre. Its concepts included easier entry of equations into a computer, an idea developed by J. Halcombe Laning and demonstrated in the Laning and Zierler system of 1952.
A draft specification for The IBM Mathematical Formula Translating System was completed by November 1954. The first manual for FORTRAN appeared in October 1956, with the first FORTRAN compiler delivered in April 1957. This was the first optimizing compiler, because customers were reluctant to use a high-level programming language unless its compiler could generate code with performance comparable to that of hand-coded assembly language. While the community was skeptical that this new method could outperform hand-coding, it reduced the number of programming statements necessary to operate a machine by a factor of 20 and gained acceptance. John Backus said during a 1979 interview with Think, the IBM employee magazine, "Much of my work has come from being lazy. I didn't like writing programs, so, when I was working on the IBM 701, writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs." The language was widely adopted by scientists for writing numerically intensive programs, which encouraged compiler writers to produce compilers that could generate faster and more efficient code.
The inclusion of a complex number data type in the language made Fortran well suited to technical applications such as electrical engineering. By 1960, versions of FORTRAN were available for the IBM 709, 650, 1620, and 7090 computers. The increasing popularity of FORTRAN spurred competing computer manufacturers to provide FORTRAN compilers for their machines, so that by 1963 over 40 FORTRAN compilers existed. For these reasons, FORTRAN is considered to be the first widely used programming language supported across a variety of computer architectures. The development of Fortran paralleled the early evolution of compiler technology; many advances in the theory and design of compilers were motivated by the need to generate efficient code for Fortran programs. The initial release of FORTRAN for the IBM 704 contained 32 statements, including: DIMENSION and EQUIVALENCE statements; assignment statements; the three-way arithmetic IF statement, which passed control to one of three locations in the program depending on whether the result of the arithmetic expression was negative, zero, or positive; and IF statements for checking exceptions.
The arithmetic IF statement was reminiscent of a three-way comparison instruction available on the 704. The statement provided the only way to compare numbers: by testing their difference, with an attendant risk of overflow. This deficiency was overcome by the "logical" facilities introduced in FORTRAN IV. The FREQUENCY statement was used to give branch probabilities for the three branch cases of the arithmetic IF statement; the first FORTRAN compiler used this weighting to perform at compile time a Monte Carlo simulation of the generated code, the results of which were used to optimize the
Humanities are academic disciplines that study aspects of human society and culture. In the Renaissance, the term contrasted with divinity and referred to what is now called classics, the main area of secular study in universities at the time. Today, the humanities are more often contrasted with the natural, and sometimes social, sciences, as well as with professional training. The humanities use methods that are primarily critical or speculative and have a significant historical element, as distinguished from the empirical approaches of the natural sciences; unlike the sciences, the humanities have no central discipline. The humanities include ancient and modern languages, philosophy, human geography, politics, and art. Scholars in the humanities are humanists; the term "humanist" also describes the philosophical position of humanism, which some "antihumanist" scholars in the humanities reject. Renaissance scholars and artists were also called humanists. Some secondary schools offer humanities classes consisting of literature, global studies, and art.
Human disciplines like history and cultural anthropology study subject matters to which the manipulative experimental method does not apply, and instead use the comparative method and comparative research. Anthropology is a science of the totality of human existence; the discipline deals with the integration of different aspects of the social sciences and human biology. In the twentieth century, academic disciplines were institutionally divided into three broad domains. The natural sciences seek to derive general laws through verifiable experiments. The humanities study local traditions, through their history, literature, and arts, with an emphasis on understanding particular individuals, events, or eras. The social sciences have attempted to develop scientific methods to understand social phenomena in a generalizable way, though with methods distinct from those of the natural sciences. The anthropological social sciences often develop nuanced descriptions rather than the general laws derived in physics or chemistry, or they may explain individual cases through more general principles, as in many fields of psychology.
Anthropology does not neatly fit into one of these categories; different branches of anthropology draw on one or more of these domains. Within the United States, anthropology is divided into four sub-fields: archaeology, physical or biological anthropology, anthropological linguistics, and cultural anthropology; it is an area of study offered at most undergraduate institutions. The word anthropos is Greek for "human being" or "person". Eric Wolf described sociocultural anthropology as "the most scientific of the humanities, the most humanistic of the sciences". The goal of anthropology is to provide a holistic account of human nature. This means that, though anthropologists generally specialize in only one sub-field, they always keep in mind the biological, linguistic, and cultural aspects of any problem. Since anthropology arose as a science in Western societies that were complex and industrial, a major trend within anthropology has been a methodological drive to study peoples in societies with more simple social organization, sometimes called "primitive" in anthropological literature, but without any connotation of "inferior".
Today, anthropologists use terms such as "less complex" societies, or refer to specific modes of subsistence or production, such as "pastoralist" or "forager" or "horticulturalist", to discuss humans living in non-industrial, non-Western cultures; such people or folk remain of great interest within anthropology. The quest for holism leads most anthropologists to study a people in detail, using biogenetic and linguistic data alongside direct observation of contemporary customs. In the 1990s and 2000s, calls were heard for clarification of what constitutes a culture, of how an observer knows where his or her own culture ends and another begins, and of other crucial topics in writing anthropology. It is possible to view all human cultures as part of one large. These dynamic relationships, between what can be observed on the ground and what can be observed by compiling many local observations, remain fundamental in any kind of anthropology, whether cultural, linguistic, or archaeological.
Archaeology is the study of human activity through the analysis of material culture. The archaeological record consists of artifacts, biofacts or ecofacts, and cultural landscapes. Archaeology can be considered a branch of the humanities. It has various goals, which range from understanding culture history, to reconstructing past lifeways, to documenting and explaining changes in human societies through time. Archaeology is thought of as a branch of anthropology in the United States, while in Europe it is viewed as a discipline in its own right or grouped under related disciplines such as history. Classics, in the Western academic tradition, refers to the study of the cultures of classical antiquity, namely the Ancient Greek and Latin languages and the Ancient Greek and Roman cultures. Classical studies is considered one of the cornerstones of the humanities; the influence of classical ideas on many humanities disciplines, such as philosophy and literature, remains strong. History is systematically collected information about the past.
When used as the name of a field of study, history refers to the study and interpretation of the record of humans, societies and any to
In computer science, an associative array, symbol table, or dictionary is an abstract data type composed of a collection of (key, value) pairs, such that each possible key appears at most once in the collection. Operations associated with this data type allow: the addition of a pair to the collection; the removal of a pair from the collection; the modification of an existing pair; and the lookup of a value associated with a particular key. The dictionary problem is a classic computer science problem: the task of designing a data structure that maintains a set of data during 'search', 'delete', and 'insert' operations. The two major solutions to the dictionary problem are a hash table and a search tree. In some cases it is also possible to solve the problem using directly addressed arrays, binary search trees, or other more specialized structures. Many programming languages include associative arrays as primitive data types, and they are available in software libraries for many others. Content-addressable memory is a form of direct hardware-level support for associative arrays.
Associative arrays have many applications, including such fundamental programming patterns as memoization and the decorator pattern. The name does not come from the associative property known in mathematics; rather, it arises from the fact that values are associated with keys. In an associative array, the association between a key and a value is known as a "binding", and the same word "binding" may be used to refer to the process of creating a new association. The operations that are defined for an associative array are: Add or insert: add a new (key, value) pair to the collection, binding the new key to its new value. The arguments to this operation are the key and the value. Reassign: replace the value in one of the pairs that are in the collection, binding an old key to a new value; as with an insertion, the arguments to this operation are the key and the value. Remove or delete: remove a pair from the collection, unbinding a given key from its value; the argument to this operation is the key. Lookup: find the value, if any, bound to a given key; the argument to this operation is the key, and the value is returned from the operation.
If no value is found, some associative array implementations raise an exception, while others create a pair with the given key and the default value of the value type. Often, instead of separate add and reassign operations, there is a single set operation that adds a new pair if one does not already exist and otherwise reassigns it. In addition, associative arrays may include other operations such as determining the number of bindings or constructing an iterator to loop over all the bindings. For such an operation, the order in which the bindings are returned may be arbitrary. A multimap generalizes an associative array by allowing multiple values to be associated with a single key. A bidirectional map is a related abstract data type in which the bindings operate in both directions: each value must be associated with a unique key, and a second lookup operation takes a value as argument and looks up the key associated with that value. Suppose that the set of loans made by a library is represented in a data structure. Each book in a library may be checked out only by a single library patron at a time.
However, a single patron may be able to check out multiple books. Therefore, the information about which books are checked out to which patrons may be represented by an associative array, in which the books are the keys and the patrons are the values. Using notation from Python or JSON, the data structure would map book titles to patron names. A lookup operation on the key "Great Expectations" would return "John". If John returns his book, that causes a deletion operation, and if Pat checks out a book, that causes an insertion operation, leading to a different state. For dictionaries with small numbers of bindings, it may make sense to implement the dictionary using an association list, a linked list of bindings. With this implementation, the time to perform the basic dictionary operations is linear in the total number of bindings. Another simple implementation technique, usable when the keys are restricted to a narrow range of integers, is direct addressing into an array: the value for a given key k is stored at the array cell A[k], or if there is no binding for k, the cell stores a special sentinel value that indicates the absence of a binding.
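The library-loan example can be sketched with a Python dict; only the "Great Expectations"/"John" binding comes from the text, and the other titles and patron names are invented for illustration:

```python
# Books are keys, patrons are values.
loans = {
    "Pride and Prejudice": "Alice",   # illustrative entry
    "Great Expectations": "John",     # binding given in the text
}

assert loans["Great Expectations"] == "John"   # lookup

del loans["Great Expectations"]       # deletion: John returns his book
loans["Wuthering Heights"] = "Pat"    # insertion: Pat checks out a book

assert "Great Expectations" not in loans
assert loans["Wuthering Heights"] == "Pat"
```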
As well as being simple, this technique is fast: each dictionary operation takes constant time. However, the space requirement for this structure is the size of the entire keyspace, making it impractical unless the keyspace is small. The two major approaches to implementing dictionaries are a hash table and a search tree. The most frequently used general-purpose implementation of an associative array is a hash table: an array combined with a hash function that separates each key into a separate "bucket" of the array. The basic idea behind a hash table is that accessing an element of an array via its index is a simple, constant-time operation. Therefore, the average overhead of an operation for a hash table is only the computation of the key's hash
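A minimal separate-chaining hash table in Python, sketching the bucket idea described above (the class and method names are my own, not from any particular library):

```python
class HashTable:
    """Toy associative array: hash(key) picks a bucket, and each bucket
    is a list of (key, value) pairs whose keys share that hash slot."""

    def __init__(self, n_buckets=8):
        self.buckets = [[] for _ in range(n_buckets)]

    def _bucket(self, key):
        return self.buckets[hash(key) % len(self.buckets)]

    def set(self, key, value):
        """Add or reassign: insert the pair, or replace an existing binding."""
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)
                return
        bucket.append((key, value))

    def get(self, key):
        """Lookup: return the value bound to key, or raise KeyError."""
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)

table = HashTable()
table.set("apples", 3)
table.set("apples", 4)          # reassign replaces the old binding
assert table.get("apples") == 4
```

A production hash table would also resize its bucket array as the number of bindings grows, keeping the average chain short and operations close to constant time.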
David J. Farber
David J. Farber is a professor of computer science, noted for his major contributions to programming languages and computer networking. He is Distinguished Career Professor of Computer Science and Public Policy at the School of Computer Science, Heinz College, and Department of Engineering and Public Policy at Carnegie Mellon University. Farber graduated from the Stevens Institute of Technology in 1956 and began an 11-year career at Bell Laboratories, where he helped design the first electronic switching system and the SNOBOL programming languages. He subsequently held industry positions at the Rand Corporation and Scientific Data Systems, followed by academic positions at the University of California, Irvine and the University of Delaware. At Irvine his research work was focused on creating the world's first operational distributed computer system. While a member of the electrical engineering department of the University of Delaware, he helped conceive and organize the major American research networks CSNET, NSFNet, and the National Research and Education Network.
He helped create the NSF/DARPA-funded Gigabit Network Testbed Initiative and served as Chairman of the Gigabit Testbed Coordinating Committee. Farber was subsequently appointed Alfred Fitler Moore Professor of Telecommunication Systems at the University of Pennsylvania, where he held appointments as Professor of Business and Public Policy at the Wharton School of Business and as a Faculty Associate of the Annenberg School for Communication. He served as Chief Technologist at the US Federal Communications Commission while on leave from the university. Farber is a founding editor of ICANNWatch and serves on the board of advisers of The Liquid Information Company. Farber is an IEEE Fellow and ACM Fellow, and a recipient of the 1995 SIGCOMM Award for lifelong contributions to computer communications. He has served on the board of directors of the Electronic Frontier Foundation, on the Electronic Privacy Information Center advisory board, on the Board of Trustees of the Internet Society, and as a member of the Presidential Advisory Committee on High Performance Computing and Communications, Information Technology and Next Generation Internet.
He runs a large mailing list called Interesting-People. In 2012, in memory of his son, he established the Joseph M. Farber Prize at the Stevens Institute of Technology, which recognizes a graduating senior majoring in one of the disciplines of the College of Arts and Letters who displays a keen interest in and concern for civil liberties and their importance in preserving and protecting human rights. On August 3, 2013, Farber was inducted into the Pioneers Circle of the Internet Hall of Fame.
Nokia Bell Labs is an industrial research and scientific development company owned by the Finnish company Nokia. Its headquarters are located in New Jersey, and other laboratories are located around the world. Bell Labs has its origins in the complex past of the Bell System. In the late 19th century, the laboratory began as the Western Electric Engineering Department, located at 463 West Street in New York City. In 1925, after years of conducting research and development under Western Electric, the Engineering Department was reformed into Bell Telephone Laboratories, under the shared ownership of the American Telephone & Telegraph Company and Western Electric. Researchers working at Bell Labs are credited with the development of radio astronomy, the transistor, the laser, the photovoltaic cell, the charge-coupled device, information theory, the Unix operating system, and the programming languages C, C++, and S. Nine Nobel Prizes have been awarded for work completed at Bell Laboratories. In 1880, when the French government awarded Alexander Graham Bell the Volta Prize of 50,000 francs (approximately US$10,000 at that time) for the invention of the telephone, he used the award to fund the Volta Laboratory in Washington, D.
C., in collaboration with Sumner Tainter and Bell's cousin Chichester Bell. The laboratory was variously known as the Volta Bureau, the Bell Carriage House, the Bell Laboratory, and the Volta Laboratory, and it focused on the analysis and transmission of sound. Bell used his considerable profits from the laboratory for further research and education to permit the "diffusion of knowledge relating to the deaf", resulting in the founding of the Volta Bureau, located at Bell's father's house at 1527 35th Street N.W. in Washington, D.C.; its carriage house became their headquarters in 1889. In 1893, Bell constructed a new building close by at 1537 35th Street N.W. to house the lab. This building was declared a National Historic Landmark in 1972. After the invention of the telephone, Bell maintained a distant role with the Bell System as a whole, but continued to pursue his own personal research interests. The Bell Patent Association was formed by Alexander Graham Bell, Thomas Sanders, and Gardiner Hubbard when filing the first patents for the telephone in 1876.
Bell Telephone Company, the first telephone company, was formed a year later. It later became a part of the American Bell Telephone Company. The American Telephone & Telegraph Company, originally a subsidiary of American Bell, took control of American Bell and the Bell System by 1899. American Bell held a controlling interest in Western Electric, whereas AT&T was doing research into the service providers. In 1884, the American Bell Telephone Company created the Mechanical Department from the Electrical and Patent Department formed a year earlier. In 1896, Western Electric bought property at 463 West Street to station its manufacturers and engineers, who supplied AT&T with their products; this included everything from telephones to telephone exchange switches and transmission equipment. In 1925, Bell Laboratories was created to better consolidate the research activities of the Bell System. Ownership was evenly split between Western Electric and AT&T. Throughout the next decade, the AT&T research and development branch moved into West Street.
Bell Labs carried out consulting work for the Bell Telephone Company and U.S. government work, and a few workers were assigned to basic research. The first president of research at Bell Labs was Frank B. Jewett, who stayed there until 1940. By the early 1940s, Bell Labs engineers and scientists had begun to move to other locations away from the congestion and environmental distractions of New York City, and in 1967 the Bell Laboratories headquarters was relocated to Murray Hill, New Jersey. Among the Bell Laboratories locations in New Jersey were Holmdel, Crawford Hill, the Deal Test Site, Lincroft, Long Branch, Neptune, Piscataway, Red Bank, and Whippany. Of these, Murray Hill and Crawford Hill remain in existence. The largest grouping of people in the company was in Illinois, at Naperville-Lisle in the Chicago area, which had the largest concentration of employees prior to 2001. There were also groups of employees in Indianapolis, Indiana. Since 2001, many of the former locations have closed. The Holmdel site, a 1.9 million square foot structure set on 473 acres, was closed in 2007.
The mirrored-glass building was designed by Eero Saarinen. In August 2013, Somerset Development bought the building, intending to redevelop it into a mixed commercial and residential project. A 2012 article expressed doubt about the success of the newly named Bell Works site; however, several large tenants announced plans to move in through 2016 and 2017. Bell Laboratories was, and is, regarded by many as the premier research facility of its type, developing a wide range of revolutionary technologies, including radio astronomy, the transistor, the laser, information theory, the operating system Unix, the programming languages C and C++, solar cells, the CCD, the floating-gate MOSFET, and a whole host of optical and wired communications