Lisp machines are general-purpose computers designed to efficiently run Lisp as their main software and programming language, via hardware support. They are an example of a high-level language computer architecture; in a sense, they were the first commercial single-user workstations. Despite being modest in number, Lisp machines commercially pioneered many now-commonplace technologies, including effective garbage collection, laser printing, windowing systems, computer mice, high-resolution bit-mapped raster graphics, computer graphic rendering, and networking innovations such as Chaosnet. Several firms built and sold Lisp machines in the 1980s: Symbolics, Lisp Machines Incorporated, Texas Instruments, and Xerox; the operating systems were written in Lisp Machine Lisp and partly in Common Lisp. Artificial intelligence computer programs of the 1960s and 1970s intrinsically required what was then considered a huge amount of computer power, as measured in processor time and memory space. The power requirements of AI research were exacerbated by the Lisp symbolic programming language at a time when commercial hardware was designed and optimized for assembly- and Fortran-like programming languages.
At first, the cost of such computer hardware meant that it had to be shared among many users. As integrated circuit technology shrank the size and cost of computers in the 1960s and early 1970s, and as the memory needs of AI programs began to exceed the address space of the most common research computer, the DEC PDP-10, researchers considered a new approach: a computer designed to develop and run large artificial intelligence programs, tailored to the semantics of the Lisp language. To keep the operating system simple, these machines would not be shared, but would be dedicated to single users. In 1973, Richard Greenblatt and Thomas Knight, programmers at the Massachusetts Institute of Technology Artificial Intelligence Laboratory, began what would become the MIT Lisp Machine Project: a computer hardwired to run certain basic Lisp operations in hardware rather than in software, with a 24-bit tagged architecture; the machine also performed incremental garbage collection. Moreover, since Lisp variables are typed at runtime rather than at compile time, a simple addition of two variables could take five times as long on conventional hardware, due to test and branch instructions.
Lisp machines ran the type tests in parallel with the more conventional single-instruction additions. If the simultaneous tests failed, the result was discarded and recomputed; this simultaneous-checking approach was used as well in testing the bounds of arrays when referenced, and for other memory management necessities. Type checking was further improved and automated when the conventional 32-bit word was lengthened to 36 bits for Symbolics 3600-model Lisp machines, and eventually to 40 bits or more; the first group of extra bits was used to hold type data, making the machine a tagged architecture, and the remaining bits were used to implement CDR coding, aiding garbage collection by an order of magnitude. A further improvement was two microcode instructions which supported Lisp functions, reducing the cost of calling a function to as little as 20 clock cycles in some Symbolics implementations. The first machine was called the CONS machine; it was affectionately referred to as the Knight machine, since Knight wrote his master's thesis on the subject.
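The runtime tag tests described above can be sketched in software. The following Python sketch models why a generic addition needs test-and-branch work before the arithmetic itself; the tag names and the tuple representation are illustrative inventions, not taken from any actual Lisp machine.

```python
# A minimal sketch of tagged values: each value carries a type tag,
# as in a tagged architecture, but here checked in software.
FIXNUM, FLONUM = "fixnum", "flonum"  # illustrative tag names

def tagged(tag, value):
    return (tag, value)

def generic_add(a, b):
    """Add two tagged values; the tag tests model the extra
    test-and-branch work conventional hardware must perform."""
    ta, va = a
    tb, vb = b
    if ta == FIXNUM and tb == FIXNUM:      # fast path: integer add
        return tagged(FIXNUM, va + vb)
    if ta in (FIXNUM, FLONUM) and tb in (FIXNUM, FLONUM):
        return tagged(FLONUM, float(va) + float(vb))  # mixed/float add
    raise TypeError(f"cannot add {ta} and {tb}")

print(generic_add(tagged(FIXNUM, 2), tagged(FIXNUM, 3)))  # ('fixnum', 5)
```

On a Lisp machine the hardware ran these checks concurrently with the fast-path addition, so the common case paid no extra latency.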
It was subsequently improved into a version called CADR, based on the same architecture. About 25 of these prototype CADRs were sold within and without MIT for about $50,000 each; the machine was so well received at an AI conference held at MIT in 1978 that the Defense Advanced Research Projects Agency began funding its development. In 1979, Russell Noftsker, convinced that Lisp machines had a bright commercial future due to the strength of the Lisp language and the enabling factor of hardware acceleration, proposed to Greenblatt that they commercialize the technology. In a counter-intuitive move for an AI Lab hacker, Greenblatt acquiesced, hoping that he could recreate the informal and productive atmosphere of the Lab in a real business; these ideas and goals were considerably different from those of Noftsker. The two negotiated at length; since the proposed firm could succeed only with the full and undivided assistance of the AI Lab hackers as a group, Greenblatt decided that the fate of the enterprise was up to them, and so the choice should be left to the hackers.
The ensuing discussions of the choice divided the lab into two factions. In February 1979, matters came to a head: the hackers sided with Noftsker, believing that a commercial venture-fund-backed firm had a better chance of surviving and commercializing Lisp machines than Greenblatt's proposed self-sustaining start-up. Greenblatt lost the battle.
Richard Matthew Stallman, often known by his initials RMS, is an American free software movement activist and programmer. He campaigns for software to be distributed in a manner such that its users receive the freedoms to use, study, and modify that software. Software that ensures these freedoms is termed free software. Stallman launched the GNU Project, founded the Free Software Foundation, developed the GNU Compiler Collection and GNU Emacs, and wrote the GNU General Public License. Stallman launched the GNU Project in September 1983 to create a Unix-like computer operating system composed of free software. With this, he also launched the free software movement. He has been the GNU Project's lead architect and organizer, and has developed a number of pieces of widely used GNU software including, among others, the GNU Compiler Collection, the GNU Debugger, and the GNU Emacs text editor. In October 1985 he founded the Free Software Foundation. Stallman pioneered the concept of copyleft, which uses the principles of copyright law to preserve the right to use and distribute free software, and he is the main author of the free software licenses which describe those terms, most notably the GNU General Public License, the most widely used free software license.
In 1989, he co-founded the League for Programming Freedom. Since the mid-1990s, Stallman has spent most of his time advocating for free software, as well as campaigning against software patents, digital rights management, and other legal and technical systems which he sees as taking away users' freedoms; this has included software license agreements, non-disclosure agreements, activation keys, copy restriction, proprietary formats, and binary executables without source code. Stallman was born on March 16, 1953, in New York City, to a family of Jewish heritage, though he is an atheist. His parents were Alice Lippman, a school teacher, and Daniel Stallman, a printing press broker. Stallman had a difficult relationship with his parents, as his father had a drinking habit and verbally abused Stallman's stepmother; he came to describe his parents as "tyrants". He was interested in computers at a young age. From 1967 to 1969, Stallman attended a Columbia University Saturday program for high school students. Stallman was also a volunteer laboratory assistant in the biology department at Rockefeller University.
Although he was interested in mathematics and physics, his supervising professor at Rockefeller thought he showed promise as a biologist. His first experience with actual computers was at the IBM New York Scientific Center when he was in high school. He was hired for the summer of 1970, following his senior year of high school, to write a numerical analysis program in Fortran. He completed the task after a couple of weeks and spent the rest of the summer writing a text editor in APL and a preprocessor for the PL/I programming language on the IBM System/360. As a first-year student at Harvard University in fall 1970, Stallman was known for his strong performance in Math 55. He was happy: "For the first time in my life, I felt I had found a home at Harvard." In 1971, near the end of his first year at Harvard, he became a programmer at the MIT Artificial Intelligence Laboratory and became a regular in the hacker community, where he was known by his initials, RMS. Stallman received a bachelor's degree in physics from Harvard in 1974.
Stallman considered staying on at Harvard, but instead decided to enroll as a graduate student at the Massachusetts Institute of Technology. He pursued a doctorate in physics for one year, but left that program to focus on his programming at the MIT AI Laboratory. While working as a research assistant at MIT under Gerry Sussman, Stallman published a paper in 1977 on an AI truth maintenance system, called dependency-directed backtracking; this paper was an early work on the problem of intelligent backtracking in constraint satisfaction problems. As of 2009, the technique Stallman and Sussman introduced was still the most general and powerful form of intelligent backtracking; the technique of constraint recording, wherein partial results of a search are recorded for reuse, was also introduced in this paper. As a hacker in MIT's AI Laboratory, Stallman worked on software projects such as TECO, Emacs for ITS, and the Lisp machine operating system, and he became an ardent critic of restricted computer access in the lab, which at that time was funded by the Defense Advanced Research Projects Agency.
When MIT's Laboratory for Computer Science installed a password control system in 1977, Stallman found a way to decrypt the passwords and sent users messages containing their decoded password, with a suggestion to change it to the empty string instead, to re-enable anonymous access to the systems. Around 20 percent of the users followed his advice at the time, although passwords ultimately prevailed. Stallman boasted of the success of his campaign for many years afterward. In the late 1970s and early 1980s, the hacker culture that Stallman thrived on began to fragment. To prevent software from being used on their competitors' computers, most manufacturers stopped distributing source code and began using copyright and restrictive software licenses to limit or prohibit copying and redistribution.
Adobe Flash is a deprecated multimedia software platform used for the production of animations, rich Internet applications, desktop applications, mobile applications, mobile games, and embedded web browser video players. Flash displays text, vector graphics, and raster graphics to provide animations, video games, and applications; it allows streaming of audio and video, and can capture mouse, keyboard, and camera input. The related development platform Adobe AIR continues to be supported. Artists may produce Flash animations using Adobe Animate. Software developers may produce applications and video games using Adobe Flash Builder, FlashDevelop, Flash Catalyst, or any text editor when used with the Apache Flex SDK. End-users can view Flash content via AIR or third-party players such as Scaleform. Adobe Flash Player enables end-users to view Flash content using web browsers. Adobe Flash Lite enabled viewing Flash content on older smartphones, but has been discontinued and superseded by Adobe AIR. The ActionScript programming language allows the development of interactive animations, video games, web applications, desktop applications, and mobile applications.
Programmers can implement Flash software using an IDE such as Adobe Animate, Adobe Flash Builder, Adobe Director, FlashDevelop, or Powerflasher FDT. Adobe AIR enables full-featured desktop and mobile applications to be developed with Flash and published for Windows, macOS, Android, iOS, Xbox One, PlayStation 4, Wii U, and Nintendo Switch. Although Flash was previously a dominant platform for online multimedia content, it is being abandoned as Adobe favors a transition to HTML5. Flash Player has been deprecated and has an official end-of-life at the end of 2020. However, Adobe will continue to develop Adobe AIR, a related technology for building stand-alone applications and games. In the early 2000s, Flash was widely installed on desktop computers and was used to display interactive web pages and online games, and to play back video and audio content. In 2005, YouTube was founded by former PayPal employees; it used Flash Player as a means to display compressed video content on the web. Between 2000 and 2010, numerous businesses used Flash-based websites to launch new products, or to create interactive company portals.
Notable users included Nike, Hewlett-Packard, General Electric, World Wildlife Fund, HBO, Cartoon Network, and Motorola. After Adobe introduced hardware-accelerated 3D for Flash, Flash websites saw a growth of 3D content for product demonstrations and virtual tours. In 2007, YouTube began offering videos in HTML5 format to support the iPhone and iPad, which did not support Flash Player. After a controversy with Apple, Adobe stopped developing Flash Player for mobile, focusing its efforts on Adobe AIR applications and HTML5 animation. In 2015, Google introduced Google Swiffy to convert Flash animation to HTML5, a tool Google would use to automatically convert Flash web ads for mobile devices; in 2016, Google discontinued its support. In 2015, YouTube switched to HTML5 technology on all devices. After Flash 5 introduced ActionScript in 2000, developers combined the visual and programming capabilities of Flash to produce interactive experiences and applications for the Web; such Web-based applications came to be known as "Rich Internet Applications".
In 2004, Macromedia Flex was released, specifically targeting the application development market. Flex introduced new user interface components, advanced data visualization components, data remoting, and a modern IDE. Flex competed with Microsoft Silverlight during its tenure. Flex was later upgraded to support integration with remote data sources, using AMF, BlazeDS, Adobe LiveCycle, Amazon Elastic Compute Cloud, and others; as of 2015, Flex applications can be published for desktop platforms using Adobe AIR. Between 2006 and 2016, the Speedtest.net web service conducted over 9.0 billion speed tests using an RIA built with Adobe Flash. In 2016, the service shifted to HTML5 due to the decreasing availability of Adobe Flash Player on PCs. As of 2016, Web applications and RIAs can be developed with Flash using the ActionScript 3.0 programming language and related tools such as Adobe Flash Builder. Third-party IDEs such as FlashDevelop and Powerflasher FDT also enable developers to create Flash games and applications, and are similar to Microsoft Visual Studio.
Flex applications are often built using Flex frameworks such as PureMVC. Flash video games were popular on the Internet, with portals like Newgrounds and Armor Games dedicated to hosting Flash-based games. Popular games developed with Flash include Angry Birds, Clash of Clans, FarmVille, AdventureQuest, Hundreds, N, QWOP, and Solipskier. Adobe introduced various technologies to help build video games, including Adobe AIR, Adobe Scout, CrossBridge, and Stage3D. 3D frameworks like Away3D and Flare3D simplified the creation of 3D content for Flash. Adobe AIR allows the creation of Flash-based mobile games, which may be published to the Google Play and Apple app stores. Flash is also used to build interfaces and HUDs for 3D video games using Scaleform GFx, a technology that renders Flash content within non-Flash video games. Scaleform is supported by more than 10 major video game engines, including Unreal Engine, UDK, CryEngine, and PhyreEngine, and has been used to provide 3D interfaces for more than 150 major video game titles.
Recursion occurs when a thing is defined in terms of itself or of its type. Recursion is used in a variety of disciplines ranging from linguistics to logic; the most common application of recursion is in mathematics and computer science, where a function being defined is applied within its own definition. While this defines an infinite number of instances, it is done in such a way that no loop or infinite chain of references can occur. In mathematics and computer science, a class of objects or methods exhibits recursive behavior when it can be defined by two properties: a simple base case (a terminating scenario that does not use recursion to produce an answer), and a set of rules that reduce all other cases toward the base case. For example, the following is a recursive definition of a person's ancestors: one's parents are one's ancestors; the ancestors of one's ancestors are one's ancestors. The Fibonacci sequence is a classic example of recursion: Fib(0) = 0 as base case 1, Fib(1) = 1 as base case 2, and, for all integers n > 1, Fib(n) = Fib(n − 1) + Fib(n − 2).
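The Fibonacci definition above translates directly into code; a minimal Python version:

```python
def fib(n):
    """Fibonacci by the recursive definition: two base cases and a
    rule that reduces every other case toward them."""
    if n == 0:
        return 0      # base case 1
    if n == 1:
        return 1      # base case 2
    return fib(n - 1) + fib(n - 2)  # recursive rule for n > 1

print([fib(n) for n in range(8)])  # [0, 1, 1, 2, 3, 5, 8, 13]
```

This naive form recomputes subproblems exponentially often; it mirrors the mathematical definition rather than an efficient implementation.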
Many mathematical axioms are based upon recursive rules. For example, the formal definition of the natural numbers by the Peano axioms can be described as: 0 is a natural number, and each natural number has a successor, which is also a natural number. By this base case and recursive rule, one can generate the set of all natural numbers. Recursively defined mathematical objects include functions and fractals. There are also various more tongue-in-cheek "definitions" of recursion. Recursion is the process a procedure goes through when one of the steps of the procedure involves invoking the procedure itself. A procedure that goes through recursion is said to be 'recursive'. To understand recursion, one must recognize the distinction between a procedure and the running of a procedure. A procedure is a set of steps based on a set of rules; the running of a procedure involves following the rules and performing the steps. An analogy: a procedure is like a written recipe. Recursion is related to, but not the same as, a reference within the specification of a procedure to the execution of some other procedure.
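The Peano base case and recursive rule can be sketched as follows; the encoding of a number as nested tuples is an illustrative choice, not the axioms' own notation.

```python
ZERO = ()                 # base case: 0 is a natural number

def succ(n):
    """Each natural number has a successor, also a natural number."""
    return (n,)

def to_int(n):
    """Count the nesting depth to recover an ordinary integer."""
    depth = 0
    while n != ZERO:
        (n,) = n          # unwrap one successor
        depth += 1
    return depth

three = succ(succ(succ(ZERO)))
print(to_int(three))  # 3
```

Every natural number is reached by applying `succ` to `ZERO` finitely many times, which is exactly the generative reading of the axioms.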
For instance, a recipe might refer to cooking vegetables, another procedure that in turn requires heating water, and so forth. However, a recursive procedure is one where one of its steps calls for a new instance of the same procedure, like a sourdough recipe calling for some dough left over from the last time the same recipe was made; this creates the possibility of an endless loop. Even if properly defined, a recursive procedure is not easy for humans to perform, as it requires distinguishing the new from the old invocation of the procedure. For this reason, recursive definitions are rare in everyday situations. An example could be the following procedure to find a way through a maze: proceed forward until reaching either an exit or a branching point. If the point reached is an exit, terminate. Otherwise, try each branch in turn, using the procedure recursively. Whether this defines a terminating procedure depends on the nature of the maze: it must not allow loops. In any case, executing the procedure requires recording all explored branching points, and which of their branches have been exhaustively tried.
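The maze procedure above can be sketched in Python; the maze representation (a mapping from each point to its branches) and the sample maze are invented for illustration.

```python
def find_exit(maze, point, exits, visited=None):
    """Recursive maze search: terminate at an exit, otherwise try
    each branch in turn, recording explored points in `visited`."""
    visited = set() if visited is None else visited
    if point in visited:       # the maze must not allow loops;
        return None            # guard by recording visited points
    visited.add(point)
    if point in exits:         # reached an exit: terminate
        return [point]
    for branch in maze.get(point, []):   # try each branch in turn
        path = find_exit(maze, branch, exits, visited)
        if path is not None:
            return [point] + path
    return None                # dead end: backtrack

maze = {"start": ["a", "b"], "a": [], "b": ["exit"]}
print(find_exit(maze, "start", {"exit"}))  # ['start', 'b', 'exit']
```

The `visited` set plays the role the text describes: recording all explored branching points and which branches have been exhausted.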
Linguist Noam Chomsky, among many others, has argued that the lack of an upper bound on the number of grammatical sentences in a language, and the lack of an upper bound on grammatical sentence length, can be explained as the consequence of recursion in natural language. This can be understood in terms of a recursive definition of a syntactic category, such as a sentence. A sentence can have a structure in which what follows the verb is another sentence: Dorothy thinks witches are dangerous, in which the sentence witches are dangerous occurs inside the larger one. So a sentence can be defined recursively as something with a structure that includes a noun phrase, a verb, and optionally another sentence; this is just a special case of the mathematical definition of recursion. This provides a way of understanding the creativity of language—the unbounded number of grammatical sentences—because it predicts that sentences can be of arbitrary length: Dorothy thinks that Toto suspects that Tin Man said that.... There are many structures apart from sentences that can be defined recursively, and therefore many ways in which a sentence can embed instances of one category inside another.
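The recursive sentence rule can be made concrete with a toy generator; the vocabulary and the depth parameter are invented for illustration.

```python
def sentence(depth):
    """Recursively build "X thinks that <sentence>" with `depth`
    levels of embedding, showing that sentences can be of
    arbitrary length."""
    subjects = ["Dorothy", "Toto", "the Tin Man"]
    subject = subjects[depth % len(subjects)]
    if depth == 0:                     # base case: a simple clause
        return "witches are dangerous"
    return f"{subject} thinks that {sentence(depth - 1)}"

print(sentence(2))
# the Tin Man thinks that Toto thinks that witches are dangerous
```

Because the rule refers to itself, there is no largest sentence it can produce, which is the unboundedness the argument above relies on.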
Gödel, Escher, Bach
Gödel, Escher, Bach: An Eternal Golden Braid, also known as GEB, is a 1979 book by Douglas Hofstadter. By exploring common themes in the lives and works of logician Kurt Gödel, artist M. C. Escher, and composer Johann Sebastian Bach, the book expounds concepts fundamental to mathematics and intelligence. Through illustration and analysis, the book discusses how, through self-reference and formal rules, systems can acquire meaning despite being made of "meaningless" elements. It also discusses what it means to communicate, how knowledge can be represented and stored, the methods and limitations of symbolic representation, and the fundamental notion of "meaning" itself. In response to confusion over the book's theme, Hofstadter emphasized that Gödel, Escher, Bach is not about the relationships of mathematics, art, and music—but rather about how cognition emerges from hidden neurological mechanisms. One point in the book presents an analogy about how individual neurons in the brain coordinate to create a unified sense of a coherent mind by comparing it to the social organization displayed in a colony of ants.
The tagline "a metaphorical fugue on minds and machines in the spirit of Lewis Carroll" was used by the publisher to describe the book. Gödel, Escher, Bach takes the form of interweaving narratives; the main chapters alternate with dialogues between imaginary characters Achilles and the tortoise, first used by Zeno of Elea and later by Lewis Carroll in "What the Tortoise Said to Achilles". These origins are related in the first two dialogues, and later ones introduce new characters such as the Crab; these narratives dip into self-reference and metafiction. Word play features prominently in the work. Puns are used to connect ideas, such as "the Magnificrab, Indeed" with Bach's Magnificat in D. One dialogue contains a story about a genie and various "tonics", titled "Djinn and Tonic". One dialogue in the book is written in the form of a crab canon, in which every line before the midpoint corresponds to an identical line past the midpoint; the conversation still makes sense due to uses of common phrases that can serve as either greetings or farewells, and the positioning of lines that double as an answer to a question in the next line.
Another is a sloth canon, where one character repeats the lines of another. The book contains many instances of recursion and self-reference, where objects and ideas speak about or refer back to themselves. One is Quining, a term Hofstadter invented in homage to Willard Van Orman Quine, referring to programs that produce their own source code as output. Another is the presence of a fictional author in the index, Egbert B. Gebstadter, a man with initials E, G, and B and a surname that matches Hofstadter's. Other examples include a phonograph dubbed "Record Player X" that destroys itself by playing a record titled I Cannot Be Played on Record Player X, an examination of canon form in music, and a discussion of Escher's lithograph of two hands drawing each other. To describe such self-referencing objects, Hofstadter coins the term "strange loop"—a concept he examines in more depth in his follow-up book I Am a Strange Loop. To escape many of the logical contradictions brought about by these self-referencing objects, Hofstadter discusses Zen koans.
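"Quining" can be demonstrated with a classic two-line Python quine: comment lines aside, the program's output is exactly its own source.

```python
# A program whose output is its own source (ignoring these comment
# lines): the string holds a template of the program, and printing
# the template formatted with itself reproduces the source.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

The trick is the same self-reference Hofstadter describes: the data (`s`) contains a description of the code that prints it, including a description of itself.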
He attempts to show readers how to perceive reality outside their own experience and embrace such paradoxical questions by rejecting the premise—a strategy called "unasking". Elements of computer science such as call stacks are also discussed in Gödel, Escher, Bach, as one dialogue describes the adventures of Achilles and the Tortoise as they make use of "pushing potion" and "popping tonic", which involve entering and leaving different layers of reality. Subsequent sections discuss the basic tenets of logic, self-referring statements, and programming. Hofstadter further creates BlooP and FlooP, two simple programming languages, to illustrate his point. The book is filled with puzzles, including Hofstadter's famous MU puzzle. Another is hidden in the chapter titled Contracrostipunctus, which combines the words acrostic and contrapunctus. In this dialogue between Achilles and the Tortoise, the author hints that there is a contrapunctal acrostic in the chapter that refers both to the author and to Bach; this can be spelled out by taking the first word of each paragraph, to reveal: Hofstadter's Contracrostipunctus Acrostically Backwards Spells 'J. S. Bach'.
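The MU puzzle asks whether the string MU can be derived from MI in the book's MIU system. A short breadth-first search over strings up to a small length bound suggests the answer; the four rewriting rules used here are the commonly stated form of the puzzle (see the book for Hofstadter's own presentation), and the length bound is an illustrative cutoff, since the full search space is infinite.

```python
from collections import deque

def successors(s):
    """Apply the four MIU rewriting rules to string s."""
    out = set()
    if s.endswith("I"):
        out.add(s + "U")                      # rule 1: xI -> xIU
    out.add("M" + s[1:] * 2)                  # rule 2: Mx -> Mxx
    for i in range(len(s) - 2):
        if s[i:i + 3] == "III":
            out.add(s[:i] + "U" + s[i + 3:])  # rule 3: III -> U
    for i in range(len(s) - 1):
        if s[i:i + 2] == "UU":
            out.add(s[:i] + s[i + 2:])        # rule 4: drop UU
    return out

def reachable(limit=8):
    """Breadth-first search from "MI" over strings up to `limit`."""
    seen, queue = {"MI"}, deque(["MI"])
    while queue:
        s = queue.popleft()
        for t in successors(s):
            if len(t) <= limit and t not in seen:
                seen.add(t)
                queue.append(t)
    return seen

print("MU" in reachable())  # False
```

MU never appears, whatever the bound: the count of I's starts at 1 and no rule can make it divisible by 3, which is the invariant behind the puzzle's answer.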
The second acrostic is found by taking the first letters of those words and reading them backwards to get "J S Bach" – just as the acrostic sentence self-referentially claims. Gödel, Escher, Bach won the Pulitzer Prize for general non-fiction and the National Book Award for Science. Martin Gardner's July 1979 column in Scientific American stated, "Every few decades, an unknown author brings out a book of such depth, range, wit and originality that it is recognized at once as a major literary event." For Summer 2007, the Massachusetts Institute of Technology created an online course for high school students built around the book. In its February 19, 2010 investigative summary on the 2001 anthrax attacks, the Federal Bureau of Investigation suggested that Bruce Edwards Ivins was inspired by the book to hide secret codes based upon nucleotide sequences in the anthrax-laced letters he sent in September and October 2001, using bold letters, as suggested on page 404 of the book.
Wine is a free and open-source compatibility layer that aims to allow computer programs developed for Microsoft Windows to run on Unix-like operating systems. Wine also provides a software library, known as Winelib, against which developers can compile Windows applications to help port them to Unix-like systems. Wine provides its own Windows runtime environment which translates Windows system calls into POSIX-compliant system calls, recreating the directory structure of Windows systems and providing alternative implementations of Windows system libraries, as well as system services through wineserver and various other components. Wine is predominantly written using black-box testing reverse-engineering, to avoid copyright issues. The project's name, Wine, a recursive backronym for "Wine Is Not an Emulator", was settled on in August 1993 in the naming discussion and is credited to David Niemi. Some confusion has been caused by an early FAQ that used the phrase "Windows Emulator", and by other invalid sources that appeared after the project name was set.
No code emulation or virtualization occurs: "emulation" would refer to the execution of compiled code intended for one processor by interpreting or recompiling software running on a different processor. While the name sometimes appears in the forms WINE and wine, the project developers have agreed to standardize on the form Wine. Wine is developed primarily for Linux and macOS, and well-maintained packages are available for both platforms. In a 2007 survey by desktoplinux.com of 38,500 Linux desktop users, 31.5% of respondents reported using Wine to run Windows applications. This plurality was larger than that of all x86 virtualization programs combined, as well as larger than the 27.9% who reported not running Windows applications. Bob Amstadt, the initial project leader, and Eric Youngdale started the Wine project in 1993 as a way to run Windows applications on Linux. It was inspired by two Sun Microsystems products: Wabi for the Solaris operating system, and the Public Windows Initiative, an attempt to get the Windows API reimplemented in the public domain as an ISO standard, which was rejected due to pressure from Microsoft in 1996.
Wine originally targeted 16-bit applications for Windows 3.x, but as of 2010 focuses on 32-bit and 64-bit versions, which have become the standard on newer operating systems. The project originated in discussions on Usenet in comp.os.linux in June 1993. Alexandre Julliard has led the project since 1994. The project has proven time-consuming and difficult for the developers because of incomplete and incorrect documentation of the Windows API. While Microsoft extensively documents most Win32 functions, some areas such as file formats and protocols have no publicly available specification from Microsoft, and Windows also includes undocumented low-level functions, undocumented behavior, and obscure bugs that Wine must duplicate in order to allow some applications to work properly. The Wine team has consequently reverse-engineered many function calls and file formats in such areas as thunking. The Wine project originally released Wine under the same MIT License as the X Window System, but owing to concern about proprietary versions of Wine not contributing their changes back to the core project, work as of March 2002 has used the LGPL for its licensing.
Wine entered beta with version 0.9 on 25 October 2005. Version 1.0 was released on 17 June 2008, after 15 years of development. Version 1.2 was released on 16 July 2010, version 1.4 on 7 March 2012, version 1.6 on 18 July 2013, and version 1.8 on 19 December 2015. Development versions are released every two weeks. The main corporate sponsor of Wine is CodeWeavers, which employs Julliard and many other Wine developers to work on Wine and on CrossOver, CodeWeavers' supported version of Wine. CrossOver includes some application-specific tweaks not considered suitable for the WineHQ version, as well as some additional proprietary components. The involvement of Corel for a time also assisted the project, chiefly by employing Julliard and others to work on it. Corel had an interest in porting its office suite to Linux. Corel later cancelled all Linux-related projects after Microsoft made major investments in Corel, stopping their Wine effort. Other corporate sponsors include Google, which hired CodeWeavers to fix Wine so that Picasa ran well enough to be ported directly to Linux using the same binary as on Windows.
Wine is also a regular beneficiary of Google's Summer of Code program. The goal of Wine is to implement the Windows APIs, fully or partially, that are required by programs that the users of Wine wish to run on top of a Unix-like system; the Win32 function calls are collectively called the Win32 API. DirectX is a collection of APIs for rendering and input. While most office software does not make use of these, computer games do; as of 2017, Wine contains a DirectX 9.0c implementation. In February 2019, a re-implementation of the XAudio2 audio API was merged into Wine and was released as part of Wine 4.3. Many games which use a Direct3D 9 rendering path can run on top of Wine. Using the Gallium3D driver model, a free and open-source Gallium3D state tracker was written for Microsoft Direct3D 9 in C; after some modification to Wine, it is now possible to run Direct3D 9 games without the requirement to translate Direct3D calls to OpenGL.
Unix is a family of multitasking, multiuser computer operating systems that derive from the original AT&T Unix, whose development started in the 1970s at the Bell Labs research center by Ken Thompson, Dennis Ritchie, and others. Initially intended for use inside the Bell System, Unix was licensed by AT&T to outside parties in the late 1970s, leading to a variety of both academic and commercial Unix variants from vendors including the University of California, Microsoft, IBM, and Sun Microsystems. In the early 1990s, AT&T sold its rights in Unix to Novell, which then sold its Unix business to the Santa Cruz Operation in 1995. The UNIX trademark passed to The Open Group, a neutral industry consortium, which allows the use of the mark for certified operating systems that comply with the Single UNIX Specification. As of 2014, the Unix version with the largest installed base is Apple's macOS. Unix systems are characterized by a modular design that is sometimes called the "Unix philosophy"; this concept entails that the operating system provides a set of simple tools, each of which performs a limited, well-defined function, with a unified filesystem as the main means of communication and a shell scripting and command language to combine the tools to perform complex workflows.
Unix distinguishes itself from its predecessors as the first portable operating system: the entire operating system is written in the C programming language, thus allowing Unix to reach numerous platforms. Unix was originally meant to be a convenient platform for programmers developing software to be run on it and on other systems, rather than for non-programmers. The system grew larger as the operating system started spreading in academic circles, and as users added their own tools to the system and shared them with colleagues. At first, Unix was not designed to be multi-tasking; it later gained portability, multi-tasking, and multi-user capabilities in a time-sharing configuration. Unix systems are characterized by various concepts, among them the use of plain text for storing data; these concepts are collectively known as the "Unix philosophy". Brian Kernighan and Rob Pike summarize this in The Unix Programming Environment as "the idea that the power of a system comes more from the relationships among programs than from the programs themselves".
In an era when a standard computer consisted of a hard disk for storage and a data terminal for input and output, the Unix file model worked quite well, as I/O was generally linear. In the 1980s, non-blocking I/O and the set of inter-process communication mechanisms were augmented with Unix domain sockets, shared memory, message queues, and semaphores, and network sockets were added to support communication with other hosts. As graphical user interfaces developed, the file model proved inadequate to the task of handling asynchronous events such as those generated by a mouse. By the early 1980s, users began seeing Unix as a potential universal operating system, suitable for computers of all sizes. The Unix environment and the client–server program model were essential elements in the development of the Internet and the reshaping of computing as centered in networks rather than in individual computers. Both Unix and the C programming language were developed by AT&T and distributed to government and academic institutions, which led to both being ported to a wider variety of machine families than any other operating system.
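Of the inter-process communication mechanisms listed above, Unix domain sockets are perhaps the easiest to demonstrate; a minimal Python sketch using a connected socket pair, shown within a single process for brevity (the same two endpoints would ordinarily be split between a parent and child process):

```python
import socket

# socketpair() returns two connected Unix-domain sockets (on
# platforms that support AF_UNIX); bytes written to one end can be
# read from the other, just like communication between two processes.
parent, child = socket.socketpair()
parent.sendall(b"ping")
reply = child.recv(4)
print(reply)  # b'ping'
parent.close()
child.close()
```

Unlike network sockets, Unix domain sockets never leave the host, which is why the 1980s additions distinguished them from the network sockets added for communication with other hosts.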
Under Unix, the operating system consists of many libraries and utilities along with the master control program, the kernel. The kernel provides services to start and stop programs, handles the file system and other common "low-level" tasks that most programs share, schedules access to avoid conflicts when programs try to access the same resource or device simultaneously. To mediate such access, the kernel has special rights, reflected in the division between user space and kernel space - although in microkernel implementations, like MINIX or Redox, functions such as network protocols may run in user space; the origins of Unix date back to the mid-1960s when the Massachusetts Institute of Technology, Bell Labs, General Electric were developing Multics, a time-sharing operating system for the GE-645 mainframe computer. Multics featured several innovations, but presented severe problems. Frustrated by the size and complexity of Multics, but not by its goals, individual researchers at Bell Labs started withdrawing from the project.
The last to leave were Ken Thompson, Dennis Ritchie, Douglas McIlroy, and Joe Ossanna, who decided to reimplement their experiences in a new project of smaller scale. This new operating system was initially without organizational backing, and also without a name; it was a single-tasking system. In 1970, the group coined the name Unics, for Uniplexed Information and Computing Service, as a pun on Multics, which stood for Multiplexed Information and Computer Services. Brian Kernighan takes credit for the idea, but adds that "no one can remember" the origin of the final spelling Unix. Dennis Ritchie, Doug McIlroy, and Peter G. Neumann also credit Kernighan. The operating system was originally written in assembly language, but in 1973, Version 4 Unix was rewritten in C. Version 4 Unix, however, still had much PDP-11-dependent code and was not suitable for porting; the first port to another platform was made five years later.