A pun, also known as paronomasia, is a form of wordplay that exploits multiple meanings of a term, or of similar-sounding words, for an intended humorous or rhetorical effect. These ambiguities can arise from the intentional use of homophonic, metonymic, or figurative language. A pun differs from a malapropism in that a malapropism is an incorrect variation on a correct expression, while a pun involves expressions with multiple interpretations. Puns may be regarded as in-jokes or idiomatic constructions, as their usage and meaning are specific to a particular language or its culture. Puns have a long history in human writing; the Roman playwright Plautus, for example, was famous for word games. Puns can be classified in various ways. The homophonic pun, a common type, uses word pairs which sound alike but are not synonymous. Walter Redfern summarized this type with his statement, "To pun is to treat homonyms as synonyms." For example, in George Carlin's phrase "atheism is a non-prophet institution", the word prophet is put in place of its homophone profit, altering the common phrase "non-profit institution".
The joke "Question: Why do we still have troops in Germany? Answer: To keep the Russians in Czech" relies on the aural ambiguity of the homophones check and Czech. Not all puns are strictly homophonic; some play on words of similar, not identical, sound, as in the example from the Pinky and the Brain cartoon series: "I think so, but if we give peas a chance, won't the lima beans feel left out?", which plays with the similar, but not identical, sound of peas and peace in the anti-war slogan "Give Peace a Chance". A homographic pun exploits words which are spelled the same but possess different meanings and sounds; because of their nature, they rely on sight more than hearing, contrary to homophonic puns. Such puns are also known as heteronymic puns. Examples in which the punned words exist in two different parts of speech often rely on unusual sentence construction, as in the anecdote: "When asked to explain his large number of children, the pig answered simply: 'The wild oats of my sow gave us many piglets.'" An example that combines homophonic and homographic punning is Douglas Adams's line "You can tune a guitar, but you can't tuna fish.
Unless of course, you play bass." The phrase uses the homophonic qualities of tune a and tuna, as well as the homographic pun on bass, in which ambiguity is reached through the identical spellings of bass (a low-pitched instrument) and bass (a kind of fish). Homographic puns do not need to follow grammatical rules and often do not make sense when interpreted outside the context of the pun. Homonymic puns, another common type, arise from the exploitation of words which are both homographs and homophones. The statement "Being in politics is just like playing golf: you are trapped in one bad lie after another" puns on the two meanings of the word lie as "a deliberate untruth" and as "the position in which something rests". An adaptation of a joke repeated by Isaac Asimov gives us "Did you hear about the little moron who strained himself while running into the screen door?", playing on strained as "to give much effort" and "to filter". A homonymic pun may be polysemic, in which the words must be homonymic and also possess related meanings, a condition which is often subjective.
However, lexicographers define polysemes as words listed under a single dictionary lemma, while homonyms are treated in separate lemmata. A compound pun is a statement that contains two or more puns; in this case, the wordplay does not work if the words or phrases of the individual puns are taken separately from the statement as a whole. For example, a complex statement by Richard Whately includes four puns: "Why can a man never starve in the Great Desert? Because he can eat the sand which is there. But what brought the sandwiches there? Why, Noah sent Ham, and his descendants mustered and bred." This pun uses sand which is/sandwiches, Ham/ham, mustered/mustard, and bred/bread. The phrase "piano is not my forte" links two meanings of the words forte and piano, one for the dynamic markings in music and the second for the literal meaning of the sentence, as well as alluding to "pianoforte", the older name of the instrument. Compound puns may also combine two phrases that share a word. For example, "Where do mathematicians go on weekends? To a Möbius strip club!"
This puns on the terms Möbius strip and strip club. A recursive pun is one in which the second aspect of a pun relies on the understanding of an element in the first. An example is the statement "π is only half a pie", which requires knowing that π radians is half a circle, while a pie is a full circle. Another example is "a Freudian slip is when you say one thing but mean your mother." The recursive pun "Immanuel doesn't pun, he Kant" is attributed to Oscar Wilde. Visual puns are sometimes used in logos, emblems and other graphic symbols, in which one or more of the pun aspects is replaced by a picture. In European heraldry, this technique is called canting arms. Visual and other puns and word games are common in Dutch gable stones as well as in some cartoons, such as Lost Consonants and The Far Side. Another type of visual pun exists in some languages. For example, in Chinese, a pun may be based on a similarity in shape of the written character, despite a complete lack of phonetic similarity in the words punned upon. Mark Elvin describes how this "peculiarly Chinese form of visual punning involved comparing written characters to objects."
Richard J. Alexander notes two additional forms which puns may take: graphological (sometimes called visual) puns, such as concrete poetry, and morphological puns, such as portmanteau words.
In computing, a computer keyboard is a typewriter-style device which uses an arrangement of buttons or keys to act as mechanical levers or electronic switches. Following the decline of punch cards and paper tape, interaction via teleprinter-style keyboards became the main input method for computers. Keyboard keys typically have characters engraved or printed on them, and each press of a key corresponds to a single written symbol. However, producing some symbols may require pressing and holding several keys simultaneously or in sequence. While most keyboard keys produce letters, numbers or signs, other keys or simultaneous key presses can produce actions or execute computer commands. In normal usage, the keyboard is used as a text entry interface for typing text and numbers into a word processor, text editor or any other program. In a modern computer, the interpretation of key presses is left to the software: a computer keyboard distinguishes each physical key from every other key and reports all key presses to the controlling software.
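As a rough illustration of how controlling software might turn reported key presses into written symbols, the sketch below maps hypothetical scancodes to characters and tracks a Shift modifier. The scancode values, the `LAYOUT` table, and the `translate` helper are all invented for the example, not taken from any real keyboard standard or driver.

```python
# Hypothetical scancode -> (unshifted, shifted) symbol table.
LAYOUT = {
    30: ("a", "A"),
    31: ("s", "S"),
    2:  ("1", "!"),
    3:  ("2", "@"),
}
SHIFT = 42  # hypothetical scancode for the Shift modifier key

def translate(presses):
    """Turn a sequence of (scancode, is_down) key events into text."""
    shift_held = False
    out = []
    for code, is_down in presses:
        if code == SHIFT:
            shift_held = is_down            # track modifier state
        elif is_down and code in LAYOUT:
            unshifted, shifted = LAYOUT[code]
            out.append(shifted if shift_held else unshifted)
    return "".join(out)

# Holding Shift while pressing the "2" key yields "@" rather than "2".
print(translate([(SHIFT, True), (3, True), (3, False), (SHIFT, False), (30, True)]))
# → "@a"
```

The point of the sketch is that the keyboard itself only reports which physical key went down or up; deciding that Shift+2 means "@" is entirely the software's job.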
Keyboards are used for computer gaming, either regular keyboards or keyboards with special gaming features, which can expedite frequently used keystroke combinations. A keyboard is also used to give commands to the operating system of a computer, such as Windows' Control-Alt-Delete combination; on pre-Windows 95 Microsoft operating systems this forced a reboot, while it now brings up a system security options screen. A command-line interface is a type of user interface navigated entirely using a keyboard, or some other similar device that does the job of one. While typewriters are the definitive ancestor of all key-based text entry devices, the computer keyboard as a device for electromechanical data entry and communication derives largely from the utility of two devices: teleprinters and keypunches. It was through such devices that modern computer keyboards inherited their layouts. As early as the 1870s, teleprinter-like devices were used to type and transmit stock market text data from the keyboard across telegraph lines to stock ticker machines, to be copied and displayed onto ticker tape.
The teleprinter, in its more contemporary form, was developed from 1907 to 1910 by American mechanical engineer Charles Krum and his son Howard, with early contributions by electrical engineer Frank Pearne; earlier models had been developed separately by individuals such as Royal Earl House and Frederick G. Creed. Earlier still, Herman Hollerith developed the first keypunch devices, which by the 1930s had evolved to include keys for text and number entry akin to normal typewriters. The keyboard on the teleprinter played a strong role in point-to-point and point-to-multipoint communication for most of the 20th century, while the keyboard on the keypunch device played a strong role in data entry and storage for just as long. The earliest computers incorporated electric typewriter keyboards: the ENIAC computer incorporated a keypunch device as both its input and paper-based output device, while the BINAC computer made use of an electromechanically controlled typewriter for both data entry onto magnetic tape and data output.
The keyboard remained the primary, most integrated computer peripheral well into the era of personal computing, until the introduction of the mouse as a consumer device in 1984. By this time, text-only user interfaces with sparse graphics gave way to comparatively graphics-rich icons on screen. However, keyboards remain central to human-computer interaction to the present day, as mobile personal computing devices such as smartphones and tablets adopt the keyboard as an optional virtual, touchscreen-based means of data entry. One factor determining the size of a keyboard is the presence of duplicate keys, such as a separate numeric keypad or two each of the Shift, Alt and Ctrl keys for convenience. Keyboard size further depends on the extent to which a system uses combinations of subsequent or simultaneous keystrokes, or multiple presses of a single key, to produce a single action. A keyboard with few keys is called a keypad. Another factor determining the size of a keyboard is the spacing of the keys.
Reduction is limited by the practical consideration that the keys must be large enough to be pressed easily by fingers; alternatively, a tool is used for pressing small keys. Standard alphanumeric keyboards have keys on three-quarter-inch centers and a key travel of at least 0.150 inches. Desktop computer keyboards, such as the 101-key US traditional keyboards or the 104-key Windows keyboards, include alphabetic characters, punctuation symbols, numbers and a variety of function keys. The internationally common 102/104-key keyboards have a smaller left Shift key and an additional key with some more symbols between that and the letter to its right, and the Enter key is shaped differently. Computer keyboards are similar to electric-typewriter keyboards but contain additional keys, such as the command or Windows keys. There is no single standard computer keyboard, and three different PC keyboards emerged: the original PC keyboard with 83 keys, the AT keyboard with 84 keys and the enhanced keyboard with 101 keys.
The three differ somewhat in the placement of the function keys, the control keys, the Return key and the Shift keys. Keyboards on laptops and notebook computers usually have a shorter travel distance for the keystroke, a shorter over-travel distance and a reduced set of keys; they may not have a numeric keypad, and the function keys may be placed in locations that differ from their placement on a standard, full-sized keyboard. The switch
IBM PC compatible
IBM PC compatible computers are computers similar to the original IBM PC, XT and AT, able to use the same software and expansion cards. Such computers used to be referred to as PC clones, or IBM clones, because they duplicated all the significant features of the PC architecture, a feat facilitated by IBM's choice of commodity hardware components and by various manufacturers' ability to reverse engineer the BIOS firmware using a "clean room design" technique. Columbia Data Products built the first clone of the IBM personal computer by a clean-room implementation of its BIOS. Early IBM PC compatibles used the same computer bus as the AT models; the IBM AT compatible bus was named the Industry Standard Architecture (ISA) bus by manufacturers of compatible computers. The term "IBM PC compatible" is now a historical description only, since IBM has ended its personal computer sales. Descendants of the IBM PC compatibles comprise the majority of personal computers on the market today, with the dominant operating system being Microsoft Windows, although interoperability with the bus structure and peripherals of the original PC architecture may be limited or non-existent.
Some computers ran MS-DOS but had enough hardware differences that IBM compatible software could not be used. Only the Macintosh kept significant market share without compatibility with the IBM PC. IBM decided in 1980 to market a low-cost single-user computer as quickly as possible in response to Apple Computer's success in the burgeoning microcomputer market. On 12 August 1981, the first IBM PC went on sale. There were three operating systems available for it; the least expensive and most popular was PC DOS, made by Microsoft. In a crucial concession, IBM's agreement allowed Microsoft to sell its own version, MS-DOS, for non-IBM computers; the only component of the original PC architecture exclusive to IBM was the BIOS. IBM at first asked developers to avoid writing software that addressed the computer's hardware directly, and to instead make standard calls to BIOS functions that carried out hardware-dependent operations; such software would run on any machine using MS-DOS or PC DOS. Software that directly addressed the hardware instead of making standard calls was, however, faster.
Software addressing IBM PC hardware in this way would not run on MS-DOS machines with different hardware. The IBM PC was sold in high enough volumes to justify writing software for it, and this encouraged other manufacturers to produce machines which could use the same programs, expansion cards and peripherals as the PC. The 808x computer marketplace soon excluded all machines which were not hardware- and software-compatible with the PC; the 640 KB barrier on "conventional" system memory available to MS-DOS is a legacy of that period. Rumors of "lookalike", compatible computers, created without IBM's approval, began circulating immediately after the IBM PC's release. InfoWorld wrote on the first anniversary of the IBM PC that "The dark side of an open system is its imitators. If the specs are clear enough for you to design peripherals, they are clear enough for you to design imitations. Apple... has patents on two important components of its systems... IBM, which has no special patents on the PC, is more vulnerable. Numerous PC-compatible machines—the grapevine says 60 or more—have begun to appear in the marketplace."
By June 1983 PC Magazine defined "PC 'clone'" as "a computer [that can] accommodate the user who takes a disk home from an IBM PC, walks across the room, and plugs it into the 'foreign' machine". Because of a shortage of IBM PCs that year, many customers purchased clones instead. Columbia Data Products produced the first computer more or less compatible with the IBM PC standard in June 1982, soon followed by Eagle Computer. Compaq's first IBM PC compatible, the Compaq Portable, was the first sewing-machine-sized portable computer that was 100% PC-compatible. The company could not copy the BIOS directly as a result of the court decision in Apple v. Franklin, but it could reverse-engineer the IBM BIOS and write its own BIOS using clean room design. At the same time, many manufacturers such as Tandy/RadioShack, Hewlett-Packard, Digital Equipment Corporation, Texas Instruments, Tulip and Olivetti introduced personal computers that supported MS-DOS but were not software- or hardware-compatible with the IBM PC.
Tandy described the Tandy 2000, for example, as having a "'next generation' true 16-bit CPU", with "More speed. More disk storage. More expansion" than the IBM PC or "other MS-DOS computers". While admitting in 1984 that many MS-DOS programs did not support the computer, the company stated that "the most popular, sophisticated software on the market" was available either immediately or "over the next six months". Like IBM, Microsoft intended that application writers would write to the application programming interfaces in MS-DOS or the firmware BIOS, and that these would form what would now be termed a hardware abstraction layer. Each computer would have its own Original Equipment Manufacturer (OEM) version of MS-DOS, customized to its hardware, and any software written for MS-DOS would operate on any MS-DOS computer, despite variations in hardware design. This expectation seemed reasonable in the computer marketplace of the time: until then, Microsoft's business had been based on computer languages such as BASIC, and the established small-system operating software was CP/M from Digital Research, in use both at the hobbyist level and by the more professional of t
An operating system (OS) is system software that manages computer hardware and software resources and provides common services for computer programs. Time-sharing operating systems schedule tasks for efficient use of the system and may include accounting software for cost allocation of processor time, mass storage and other resources. For hardware functions such as input and output and memory allocation, the operating system acts as an intermediary between programs and the computer hardware, although the application code is executed directly by the hardware and makes system calls to an OS function or is interrupted by it. Operating systems are found on many devices that contain a computer, from cellular phones and video game consoles to web servers and supercomputers. The dominant desktop operating system is Microsoft Windows with a market share of around 82.74%; macOS by Apple Inc. is in second place, and the varieties of Linux are collectively in third place. In the mobile sector, Google's Android reached a share of up to 70% in 2017; according to third-quarter 2016 data, Android on smartphones is dominant with 87.5 percent and a growth rate of 10.3 percent per year, followed by Apple's iOS with 12.1 percent and a per-year decrease in market share of 5.2 percent, while other operating systems amount to just 0.3 percent.
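The intermediary role described above can be made concrete with a short sketch. In Python, the `os` module exposes thin wrappers over the kernel's low-level file system calls (`open`, `write`, `read`, `close` on POSIX systems): the program never drives the disk hardware itself, it only asks the operating system to do so. The file name `demo.txt` is just an example.

```python
import os

# Application code requests hardware functions (here, disk I/O) through
# system calls; the OS mediates between the program and the hardware.

fd = os.open("demo.txt", os.O_CREAT | os.O_WRONLY | os.O_TRUNC)  # open(2)
os.write(fd, b"hello, kernel\n")                                  # write(2)
os.close(fd)                                                      # close(2)

fd = os.open("demo.txt", os.O_RDONLY)                             # open(2)
data = os.read(fd, 64)                                            # read(2)
os.close(fd)
print(data)              # → b'hello, kernel\n'
os.remove("demo.txt")    # unlink(2): again a request to the OS
```

Higher-level facilities such as Python's built-in `open()` or C's `stdio` add buffering on top, but ultimately funnel through the same system-call boundary.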
Linux distributions are dominant in the supercomputing sector. Other specialized classes of operating systems, such as embedded and real-time systems, exist for many applications. A single-tasking system can only run one program at a time, while a multi-tasking operating system allows more than one program to run concurrently; this is achieved by time-sharing, where the available processor time is divided between multiple processes. These processes are each interrupted in time slices by a task-scheduling subsystem of the operating system. Multi-tasking may be characterized in preemptive and co-operative types. In preemptive multitasking, the operating system slices the CPU time and dedicates a slot to each of the programs. Unix-like operating systems, such as Solaris and Linux, as well as non-Unix-like ones, such as AmigaOS, support preemptive multitasking. Cooperative multitasking is achieved by relying on each process to provide time to the other processes in a defined manner; 16-bit versions of Microsoft Windows used cooperative multi-tasking.
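The difference between the two models can be sketched in a few lines. Below is a minimal, assumption-laden model of cooperative multitasking: each "process" is a Python generator that voluntarily yields control back to a round-robin scheduler (the `process` and `scheduler` names are invented for the illustration). In a preemptive system, by contrast, the scheduler would interrupt each task on a timer rather than waiting for it to yield.

```python
from collections import deque

def process(name, steps):
    """A toy process: does some work, then voluntarily yields control."""
    for i in range(steps):
        yield f"{name} step {i}"     # yielding = cooperating with the scheduler

def scheduler(procs):
    """Round-robin cooperative scheduler: run each task until it yields."""
    log = []
    queue = deque(procs)
    while queue:
        proc = queue.popleft()
        try:
            log.append(next(proc))   # let the process run until its next yield
            queue.append(proc)       # still runnable: requeue at the back
        except StopIteration:
            pass                     # process finished; drop it
    return log

log = scheduler([process("A", 2), process("B", 2)])
print(log)  # → ['A step 0', 'B step 0', 'A step 1', 'B step 1']
```

The model also shows cooperative multitasking's weakness: if one generator looped forever without yielding, every other task would starve, which is exactly why preemptive scheduling displaced it.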
32-bit versions of both Windows NT and Win9x used preemptive multi-tasking. Single-user operating systems have no facilities to distinguish users but may allow multiple programs to run in tandem. A multi-user operating system extends the basic concept of multi-tasking with facilities that identify the processes and resources, such as disk space, belonging to multiple users, and the system permits multiple users to interact with it at the same time. Time-sharing operating systems schedule tasks for efficient use of the system and may include accounting software for cost allocation of processor time, mass storage and other resources to multiple users. A distributed operating system manages a group of distinct computers and makes them appear to be a single computer; the development of networked computers that could be linked and communicate with each other gave rise to distributed computing. Distributed computations are carried out on more than one machine; when computers in a group work in cooperation, they form a distributed system.
In an OS, distributed- and cloud-computing context, templating refers to creating a single virtual machine image as a guest operating system and then saving it as a tool for spawning multiple running virtual machines. The technique is used both in virtualization and in cloud computing management, and is common in large server warehouses. Embedded operating systems are designed to be used in embedded computer systems; they are designed to operate on small machines, such as PDAs, with less autonomy. They are able to operate with a limited number of resources and are compact and efficient by design. Windows CE and Minix 3 are examples of embedded operating systems. A real-time operating system is an operating system that guarantees to process events or data by a specific moment in time. A real-time operating system may be single- or multi-tasking, but when multitasking, it uses specialized scheduling algorithms so that a deterministic nature of behavior is achieved. An event-driven system switches between tasks based on their priorities or external events, while time-sharing operating systems switch tasks based on clock interrupts.
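Priority-driven task selection, as used in event-driven real-time systems, can be sketched with a priority queue: the runnable task with the highest priority always runs next, instead of tasks rotating through fixed time slices. This is only an illustrative model under simplifying assumptions (tasks run to completion and arrive all at once); the task names and the `run` helper are invented for the example.

```python
import heapq

def run(tasks):
    """tasks: list of (priority, name) pairs, lower number = higher priority.
    Returns the order in which a priority-driven scheduler would run them."""
    heap = list(tasks)
    heapq.heapify(heap)                 # order ready tasks by priority
    order = []
    while heap:
        priority, name = heapq.heappop(heap)
        order.append(name)              # highest-priority ready task runs next
    return order

order = run([(3, "logging"), (1, "sensor-interrupt"), (2, "control-loop")])
print(order)  # → ['sensor-interrupt', 'control-loop', 'logging']
```

A real RTOS must additionally handle tasks that arrive mid-run (preempting a lower-priority task) and bound worst-case latency, but the core selection rule is the one modeled here.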
A library operating system is one in which the services that a typical operating system provides, such as networking, are provided in the form of libraries and composed with the application and configuration code to construct a unikernel: a specialized, single-address-space machine image that can be deployed to cloud or embedded environments. Early computers were built to perform a series of single tasks, like a calculator. Basic operating system features were developed in the 1950s, such as resident monitor functions that could automatically run different programs in succession to speed up processing. Operating systems did not exist in their more complex forms until the early 1960s. Hardware features were added that enabled use of runtime libraries and parallel processing. When personal computers became popular in the 1980s, operating systems were made for them that were similar in concept to those used on larger computers. In the 1940s, the earliest electronic digital systems had no operating systems.
Electronic systems of this time were programmed on rows of mechanical switches or by jumper wires on plug boards. These were special-purpose systems that, for example, generated ballistics tables for the military or controlled the pri
SUSE Linux is a computer operating system built on top of the free and open source Linux kernel and distributed with system and application software from other open source projects. SUSE Linux is of German origin, its name an acronym of "Software und System-Entwicklung", and was developed in Europe. The first version appeared in early 1994, making SUSE one of the oldest existing commercial distributions. It is known for its YaST configuration tool. Novell bought the SUSE brands and trademarks in 2003. Novell, one of the founding members of the Open Invention Network, decided to make the community an important part of its development process by opening the distribution development to outside contributors in 2005, creating the openSUSE distribution and the openSUSE Project. Novell employed more than 500 developers working on SUSE in 2004. On 27 April 2011, Novell was acquired by The Attachmate Group, which made SUSE an independent business unit. In October 2014, the entire Attachmate Group, including SUSE, was acquired by the British firm Micro Focus International.
SUSE continues to operate as an independent business unit. On 2 July 2018, it was announced that Micro Focus would sell SUSE to Blitz 18-679 GmbH, a subsidiary of EQT Partners, for $2.535 billion. Gesellschaft für Software und System Entwicklung mbH was founded on 2 September 1992 in Nuremberg, Germany, by Roland Dyroff, Thomas Fehr, Burchard Steinbild and Hubert Mantel. Three of the founders were still mathematics students at university; the original idea was that the company would develop software and function as an advisory UNIX group. According to Mantel, the group instead decided on offering Linux support. Their name at founding was "S.u.S.E.", chosen as a German acronym for "Software und System-Entwicklung", meaning "software and systems development". The full name was never used, and the company was known as "S.u.S.E.", shortened to "SuSE" in October 1998 and capitalized to "SUSE" in 2003. The official logo and current mascot of the distribution is a veiled chameleon named "Geeko", chosen following a competition.
As with the company's name, the Geeko logo has evolved over time to reflect the name changes. The company started as a service provider, which among other things released software packages that included Softlanding Linux System (SLS) and Slackware, printed UNIX/Linux manuals and offered technical assistance. These third-party products developed as follows: in mid-1992, Peter MacDonald created the comprehensive Linux distribution known as SLS, which offered elements such as X and TCP/IP and was distributed to others. In 1993, Patrick Volkerding cleaned up the SLS distribution and released a newer version as Slackware. In 1994, with help from Patrick Volkerding, Slackware scripts were translated into German, marking the first release of the S.u.S.E. Linux 1.0 distribution. It was available first on floppy disks and later on CDs. For building its own distribution of Linux, S.u.S.E. used first Slackware and then, in 1996, the jurix distribution as its starting point.
The jurix distribution was created by Florian La Roche, who joined the S.u.S.E. team and began to develop YaST, the installer and configuration tool that would become the central point of the distribution. In 1996, the first distribution under the name S.u.S.E. Linux was published as "S.u.S.E. Linux 4.2". The version number has caused much discussion: it should have been just version 1.1, but the number 4.2 was an intentional reference to the answer to the "Big Question about Life, the Universe and Everything" of The Hitchhiker's Guide to the Galaxy science fiction novels by the English writer Douglas Adams. YaST's first version number, "0.42", was a similar reference. Over time, SuSE Linux incorporated many aspects of Red Hat Linux, such as its RPM Package Manager and its file structure, and S.u.S.E. became the largest Linux distributor in Germany. In 1997, SuSE, LLC was established under the direction of President and Managing Partner James Gray in Oakland, which enabled the company to develop Linux markets in the Americas and Asia.
While Red Hat was ubiquitous in the United States, SuSE Linux continued to grow in Germany as well as in Nordic countries such as Finland and Sweden. In October 1998, the name was changed to SuSE. Linus Torvalds, the creator of the Linux kernel, used it often. SuSE entered the UK in 1999. In 2001, the company was forced to reduce its staff in order to survive. On 4 November 2003, Novell announced that it would acquire SuSE; the acquisition was finalized in January 2004. In a move to reach its business audience more effectively, SuSE had introduced the SUSE Linux Enterprise Server in 2001 and, a few months before Novell's purchase, changed the company name to "SUSE Linux"; "SUSE" is now a name in its own right, not an acronym. According to J. Philips, Novell's corporate technology strategist for the Asia Pacific region, Novell would not "in the medium term" alter the way in which SUSE was developed. At Novell's annual BrainShare conference in 2004, for the first time, all of their computers ran SUSE Linux, and it was announced that the proprietary SUSE administration program YaST2 would be released under the GPL license.
On 4 August 2005, Novell announced that the SUSE Professional series would become more open, with the launch of the openSUSE Project community. The software always
Berkeley Software Distribution
The Berkeley Software Distribution (BSD) was an operating system based on Research Unix and distributed by the Computer Systems Research Group (CSRG) at the University of California, Berkeley. Today, "BSD" refers to its descendants, such as FreeBSD, OpenBSD, NetBSD and DragonFly BSD. BSD was also called Berkeley Unix because it was based on the source code of the original Unix developed at Bell Labs. In the 1980s, BSD was adopted by workstation vendors in the form of proprietary Unix variants such as DEC Ultrix and Sun Microsystems' SunOS, due to its permissive licensing and its familiarity to many technology company founders and engineers. Although these proprietary BSD derivatives were largely superseded in the 1990s by UNIX SVR4 and OSF/1, later BSD releases provided the basis for several open-source operating systems including FreeBSD, OpenBSD, NetBSD, DragonFly BSD and TrueOS. These, in turn, have been used by proprietary operating systems, including Apple's macOS and iOS, which derived from them, and Microsoft Windows, which used a part of BSD's TCP/IP code.
The earliest distributions of Unix from Bell Labs in the 1970s included the source code to the operating system, allowing researchers at universities to modify and extend Unix. The operating system arrived at Berkeley in 1974, at the request of computer science professor Bob Fabry, who had been on the program committee for the Symposium on Operating Systems Principles where Unix was first presented. A PDP-11/45 was bought to run the system, but for budgetary reasons this machine was shared with the mathematics and statistics groups at Berkeley, who used RSTS, so that Unix only ran on the machine eight hours per day. A larger PDP-11/70 was installed at Berkeley the following year, using money from the Ingres database project. In 1975, Ken Thompson came to Berkeley as a visiting professor and started working on a Pascal implementation for the system. Graduate students Chuck Haley and Bill Joy improved Thompson's Pascal and implemented an improved text editor, ex. Other universities became interested in the software at Berkeley, so in 1977 Joy started compiling the first Berkeley Software Distribution, released on March 9, 1978.
1BSD was an add-on to Version 6 Unix rather than a complete operating system in its own right; some thirty copies were sent out. The second Berkeley Software Distribution (2BSD), released in May 1979, included updated versions of the 1BSD software as well as two new programs by Joy that persist on Unix systems to this day: the vi text editor and the C shell. Some 75 copies of 2BSD were sent out by Bill Joy. A VAX computer was installed at Berkeley in 1978, but the port of Unix to the VAX architecture, UNIX/32V, did not take advantage of the VAX's virtual memory capabilities. The kernel of 32V was rewritten by Berkeley students to include a virtual memory implementation, and a complete operating system, including the new kernel, ports of the 2BSD utilities to the VAX, and the utilities from 32V, was released as 3BSD at the end of 1979. 3BSD was alternatively called Virtual VAX/UNIX or VMUNIX, and BSD kernel images were normally called /vmunix until 4.4BSD. After 4.3BSD was released in June 1986, it was determined that BSD would move away from the aging VAX platform.
The Power 6/32 platform developed by Computer Consoles Inc. seemed promising at the time, but was abandoned by its developers shortly thereafter. Nonetheless, the 4.3BSD-Tahoe port proved valuable, as it led to a separation of machine-dependent and machine-independent code in BSD which would improve the system's future portability. In addition to portability, the CSRG worked on an implementation of the OSI network protocol stack, improvements to the kernel virtual memory system and new TCP/IP algorithms to accommodate the growth of the Internet. Until then, all versions of BSD used proprietary AT&T Unix code and were therefore subject to an AT&T software license. Source code licenses had become expensive, and several outside parties had expressed interest in a separate release of the networking code, which had been developed outside AT&T and would not be subject to the licensing requirement. This led to Networking Release 1 (Net/1), made available to non-licensees of AT&T code and redistributable under the terms of the BSD license.
It was released in June 1989. After Net/1, BSD developer Keith Bostic proposed that more non-AT&T sections of the BSD system be released under the same license as Net/1. To this end, he started a project to reimplement most of the standard Unix utilities without using the AT&T code. Within eighteen months, all of the AT&T utilities had been replaced, and it was determined that only a few AT&T files remained in the kernel. These files were removed, and the result was the June 1991 release of Networking Release 2 (Net/2), a nearly complete operating system that was freely distributable. Net/2 was the basis for two separate ports of BSD to the Intel 80386 architecture: the free 386BSD by William Jolitz and the proprietary BSD/386 by Berkeley Software Design (BSDi). 386BSD itself was short-lived, but became the initial code base of the NetBSD and FreeBSD projects that were started shortly thereafter. BSDi soon found itself in legal trouble with AT&T's Unix System Laboratories (USL) subsidiary, then the owner of the System V copyright and the Unix trademark.
The USL v. BSDi lawsuit was filed in 1992 and led to an injunction on the distribution of Net/2 until the validity of USL's copyright claims on the source could be determined; the lawsuit slowed development of the free-software descendants of BSD while their legal status was in question.
The Cathedral and the Bazaar
The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary is an essay, and later a book, by Eric S. Raymond on software engineering methods, based on his observations of the Linux kernel development process and his experiences managing an open source project, fetchmail. It examines the struggle between top-down and bottom-up design. The essay was first presented by the author at the Linux Kongress on May 27, 1997, in Würzburg and was published as part of the book in 1999. The illustration on the cover of the book is a 1913 painting by Liubov Popova titled Composition with Figures, which belongs to the collection of the State Tretyakov Gallery. The book was released under the Open Publication License v2.0 around 1999. The essay contrasts two different free software development models: The Cathedral model, in which source code is available with each software release, but code developed between releases is restricted to an exclusive group of software developers. GNU Emacs and GCC were presented as examples.
The Bazaar model, in which the code is developed over the Internet in view of the public. Raymond credits Linus Torvalds, leader of the Linux kernel project, as the inventor of this process. Raymond also provides anecdotal accounts of his own implementation of this model for the Fetchmail project. The essay's central thesis is Raymond's proposition that "given enough eyeballs, all bugs are shallow": the more widely available the source code is for public testing and experimentation, the more rapidly all forms of bugs will be discovered. In contrast, Raymond claims that an inordinate amount of time and energy must be spent hunting for bugs in the Cathedral model, since the working version of the code is available only to a few developers. Raymond points to 19 "lessons" learned from various software development efforts, each describing attributes associated with good practice in open source software development: Every good work of software starts by scratching a developer's personal itch. Good programmers know what to write. Great ones know what to rewrite (and reuse). Plan to throw one away; you will, anyhow.
If you have the right attitude, interesting problems will find you. When you lose interest in a program, your last duty to it is to hand it off to a competent successor. Treating your users as co-developers is your least-hassle route to rapid code improvement and effective debugging. Release early. Release often. And listen to your customers. Given a large enough beta-tester and co-developer base, almost every problem will be characterized quickly and the fix obvious to someone. Smart data structures and dumb code work a lot better than the other way around. If you treat your beta-testers as if they're your most valuable resource, they will respond by becoming your most valuable resource. The next best thing to having good ideas is recognizing good ideas from your users. Sometimes the latter is better. Often, the most striking and innovative solutions come from realizing that your concept of the problem was wrong. Perfection is achieved not when there is nothing more to add, but rather when there is nothing more to take away.
Any tool should be useful in the expected way, but a great tool lends itself to uses you never expected. When writing gateway software of any kind, take pains to disturb the data stream as little as possible, and never throw away information unless the recipient forces you to! When your language is nowhere near Turing-complete, syntactic sugar can be your friend. A security system is only as secure as its secret. Beware of pseudo-secrets. To solve an interesting problem, start by finding a problem that is interesting to you. Provided the development coordinator has a communications medium at least as good as the Internet and knows how to lead without coercion, many heads are inevitably better than one. In 1998, the essay helped provide the final push for Netscape Communications Corporation to release the source code for Netscape Communicator and start the Mozilla project. Netscape's public recognition of this influence brought Raymond renown in hacker culture. When O'Reilly Media published the book in 1999, it became one of the first complete and commercially distributed books published under the Open Publication License.
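One of the lessons above, that smart data structures and dumb code work better than the other way around, can be illustrated with a small sketch. The example below is not from the essay; the function names and the status-code table are invented here purely for illustration.

```python
# Illustration of the "smart data structures, dumb code" lesson:
# moving knowledge out of branching logic and into a data structure
# leaves the code that uses it short, obvious, and easy to extend.

# "Dumb data, smart code" version: the knowledge is buried in branches,
# and every new case means editing control flow.
def http_status_text_branchy(code: int) -> str:
    if code == 200:
        return "OK"
    elif code == 404:
        return "Not Found"
    elif code == 500:
        return "Internal Server Error"
    return "Unknown"

# "Smart data, dumb code" version: the knowledge lives in a table,
# and the driving code is a single lookup that never needs to change.
HTTP_STATUS_TEXT = {
    200: "OK",
    404: "Not Found",
    500: "Internal Server Error",
}

def http_status_text(code: int) -> str:
    return HTTP_STATUS_TEXT.get(code, "Unknown")
```

Adding a new status code to the second version means editing only the table, not the logic, which is the practical payoff the lesson points at.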
Marshall Poe, in his essay "The Hive", likens Wikipedia to the Bazaar model. Jimmy Wales himself was inspired by the work, arguing that "It opened my eyes to the possibility of mass collaboration". In 1999, Nikolai Bezroukov published two widely cited critical essays on Eric Raymond's views on open source software, the second one called "A Second Look at The Cathedral and the Bazaar"; they produced a sharp response from Eric Raymond. The distributed version control system GNU Bazaar was named to highlight its relation with the "bazaar" model.