Debian is a Unix-like operating system consisting of free software. Ian Murdock started the Debian Project on August 16, 1993. Debian 0.01 was released on September 15, 1993, and the first stable version, 1.1, was released on June 17, 1996. The Debian stable branch is the most popular edition for personal computers and network servers, and is used as the basis for many other distributions. Debian is one of the earliest operating systems based on the Linux kernel. The project's work is carried out over the Internet by a team of volunteers guided by the Debian Project Leader and three foundational documents: the Debian Social Contract, the Debian Constitution, and the Debian Free Software Guidelines. New distributions are updated continually, and the next candidate is released after a time-based freeze. Debian has been developed and distributed according to the principles of the GNU Project, which drew the support of the Free Software Foundation; the FSF sponsored the project from November 1994 to November 1995. When the sponsorship ended, the Debian Project formed the nonprofit Software in the Public Interest to continue supporting development financially.
Debian has access to online repositories that contain over 51,000 packages. Debian itself contains only free software, but non-free software can be downloaded and installed from the Debian repositories. Debian includes popular free programs such as LibreOffice, the Firefox web browser, Evolution mail, the K3b disc burner, the VLC media player, the GIMP image editor, and the Evince document viewer. Debian is a popular choice for servers, for example as the operating system component of a LAMP stack. Debian officially supports Linux, having also offered kFreeBSD for version 7 but not for version 8, and supports GNU Hurd unofficially. GNU/kFreeBSD was released as a technology preview for the IA-32 and x86-64 architectures, but lacked the amount of software available in Debian's Linux distribution; official support for kFreeBSD was removed in version 8, which did not provide a kFreeBSD-based distribution. Several flavors of the Linux kernel exist for each port. For example, the i386 port has flavors for IA-32 PCs supporting Physical Address Extension and real-time computing, for older PCs, and for x86-64 PCs.
The Linux kernel does not contain firmware without sources, although such firmware is available in non-free packages and alternative installation media. Debian offers CD images built for Xfce, the default desktop on CD, and DVD images for GNOME, KDE and others. MATE is supported, and Cinnamon support was added with Debian 8.0 Jessie. Less common window managers such as Enlightenment, Fluxbox, IceWM, Window Maker and others are available. The default desktop environment of version 7.0 Wheezy was temporarily switched to Xfce, because GNOME 3 did not fit on the first CD of the set. The default for version 8.0 Jessie was changed to Xfce in November 2013, and back to GNOME in September 2014. Several parts of Debian are translated into languages other than American English, including package descriptions, configuration messages and the website. The level of software localization depends on the language, ranging from the well-supported German and French to the hardly translated Creek and Samoan. The installer is available in 73 languages.
Debian offers CD images for installation that can be downloaded using BitTorrent or jigdo. Physical discs can be bought from retailers. The full sets are made up of several discs, but only the first disc is required for installation, as the installer can retrieve software not contained in the first disc image from online repositories. Debian offers different network installation methods. A minimal install of Debian is available via the netinst CD, whereby Debian is installed with just a base system, and added software can be downloaded from the Internet. Another option is to boot the installer from the network. Installation images can be used to create a bootable USB drive. The default bootstrap loader is GNU GRUB version 2, though the package name is simply grub, while version 1 was renamed to grub-legacy. This conflicts with, e.g., Fedora, where GRUB version 2 is named grub2. The default desktop may be chosen from the DVD boot menu among GNOME, KDE Plasma, Xfce and LXDE, and from special disc 1 CDs. Debian releases live install images for CDs, DVDs and USB thumb drives, for the IA-32 and x86-64 architectures, with a choice of desktop environments.
These Debian Live images allow users to boot from removable media and run Debian without affecting the contents of their computer. A full install of Debian to the computer's hard drive can be initiated from the live image environment. Personalized images can be built with the live-build tool for discs, USB drives and network booting purposes. Debian was first announced on August 16, 1993, by Ian Murdock, who called the system "the Debian Linux Release". The word "Debian" was formed as a portmanteau of the first name of his then-girlfriend Debra Lynn and his own first name. Before Debian's release, the Softlanding Linux System (SLS) had been a popular Linux distribution and the basis for Slackware; the perceived poor maintenance and prevalence of bugs in SLS motivated Murdock to launch a new distribution. Debian 0.01, released on September 15, 1993, was the first of several internal releases. Version 0.90 was the first public release, providing support through mailing lists hosted at Pixar. The release included the Debian Linux Manifesto, outlining Murdock's vision for the new operating system.
In it he called for the creation of a distribution to be maintained in the spirit of Linux and GNU. The Debian project released the 0.9x versions in 1994 and 1995. During this time it was sponso
Debugging is the process of finding and resolving defects or problems within a computer program that prevent correct operation of computer software or a system. Debugging tactics can involve interactive debugging, control flow analysis, unit testing, integration testing, log file analysis, monitoring at the application or system level, memory dumps, and profiling. The terms "bug" and "debugging" are popularly attributed to Admiral Grace Hopper in the 1940s. While she was working on a Mark II computer at Harvard University, her associates discovered a moth stuck in a relay, thereby impeding operation, whereupon she remarked that they were "debugging" the system. However, the term "bug", in the sense of "technical error", dates back at least to 1878 and Thomas Edison, and the term "debugging" seems to have been used in aeronautics before entering the world of computers. Indeed, in an interview Grace Hopper remarked that the moth fit the existing terminology, so it was saved. J. Robert Oppenheimer used the term in a letter to Dr. Ernest Lawrence at UC Berkeley, dated October 27, 1944, regarding the recruitment of additional technical staff.
The Oxford English Dictionary entry for "debug" quotes the term "debugging" used in reference to airplane engine testing in a 1945 article in the Journal of the Royal Aeronautical Society. An article in Airforce also refers to debugging, this time of aircraft cameras. Hopper's bug was found on September 9, 1947; the term was not adopted by computer programmers until the early 1950s. The seminal article by Gill in 1951 is the earliest in-depth discussion of programming errors, but it does not use the term "bug" or "debugging". In the ACM's digital library, the term "debugging" is first used in three papers from the 1952 ACM National Meetings. Two of the three use the term in quotation marks. By 1963 "debugging" was a common enough term to be mentioned in passing without explanation on page 1 of the CTSS manual. Kidwell's article Stalking the Elusive Computer Bug discusses the etymology of "bug" and "debug" in greater detail. As software and electronic systems have become more complex, the various common debugging techniques have expanded with more methods to detect anomalies, assess impact, and schedule software patches or full updates to a system.
The words "anomaly" and "discrepancy" can be used as more neutral terms to avoid the words "error" and "defect" or "bug", where there might be an implication that all so-called errors, defects or bugs must be fixed. Instead, an impact assessment can be made to determine whether changes to remove an anomaly would be cost-effective for the system, or whether a scheduled new release might render the change unnecessary. Not all issues are mission-critical in a system, and it is important to avoid the situation where a change might be more upsetting to users, long-term, than living with the known problem. Basing decisions on the acceptability of some anomalies can avoid a culture of a "zero-defects" mandate, where people might be tempted to deny the existence of problems so that the result would appear as zero defects. Considering collateral issues, such as the cost-versus-benefit impact assessment, broader debugging techniques will expand to determine the frequency of anomalies to help assess their impact on the overall system.
Debugging ranges in complexity from fixing simple errors to performing lengthy and tiresome tasks of data collection and scheduling updates. The debugging skill of the programmer can be a major factor in the ability to debug a problem, but the difficulty of software debugging varies with the complexity of the system and depends, to some extent, on the programming language used and the available tools, such as debuggers. Debuggers are software tools which enable the programmer to monitor the execution of a program, stop it, restart it, set breakpoints, and change values in memory; the term debugger can also refer to the person doing the debugging. High-level programming languages, such as Java, make debugging easier, because they have features such as exception handling and type checking that make real sources of erratic behaviour easier to spot. In programming languages such as C or assembly, bugs may cause silent problems such as memory corruption, and it is often difficult to see where the initial problem happened.
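Debuggers of the kind described above are typically built on a tracing hook exposed by the runtime or operating system. As a rough illustration (not any particular debugger's implementation), Python's sys.settrace can watch a function's local variables change line by line, much as a debugger does when stopped at a breakpoint; snapshot_locals and buggy_sum below are made-up names for this sketch:

```python
import sys

def snapshot_locals(func, *args):
    """Run func(*args) under a trace hook and record its local
    variables at every 'line' event -- a toy version of how a
    debugger observes a paused program (hypothetical helper,
    not a real debugger API)."""
    snapshots = []

    def tracer(frame, event, arg):
        if frame.f_code is func.__code__:
            if event == "line":
                snapshots.append(dict(frame.f_locals))
            return tracer          # keep tracing inside func
        return None                # ignore all other frames

    sys.settrace(tracer)
    try:
        result = func(*args)
    finally:
        sys.settrace(None)
    return result, snapshots

def buggy_sum(n):
    total = 0
    for i in range(n):
        total += i
    return total

result, snaps = snapshot_locals(buggy_sum, 4)
# result is 6; snaps shows total and i evolving line by line
```

For silent corruption in languages like C, where the damage happens below the level of source lines, such source-level tracing is not enough on its own.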
In those cases, memory debugger tools may be needed. In certain situations, general-purpose software tools that are language-specific in nature can be useful; these take the form of static code analysis tools. These tools look for a specific set of known problems, some common and some rare, within the source code, concentrating more on the semantics than on the syntax that compilers and interpreters check. Some tools claim to be able to detect over 300 different problems. Both commercial and free tools exist for various languages; these tools can be useful when checking large source trees, where it is impractical to do code walkthroughs. A typical example of a detected problem would be a variable dereference that occurs before the variable is assigned a value; as another example, some such tools perform strong type checking when the language does not require it. Thus, they are better at locating errors in code that is syntactically correct, but these tools have a reputation for false positives. The old Unix lint program is an early example.
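The "use before assignment" example can be made concrete with a toy checker. The sketch below (find_use_before_assign is a made-up name, and the checker ignores control flow and builtins, which real tools such as lint or pyflakes handle) scans each function body for names that are read before any assignment to them:

```python
import ast

def find_use_before_assign(source):
    """Toy static check: within each function body, flag names that
    are read before any assignment to them.  A deliberately minimal
    sketch of one class of problem that lint-style tools report."""
    problems = []
    for func in ast.walk(ast.parse(source)):
        if not isinstance(func, ast.FunctionDef):
            continue
        assigned = {a.arg for a in func.args.args}   # parameters count
        for stmt in func.body:
            # Check loads before recording this statement's stores,
            # so that `x = x + 1` with x unset is still flagged.
            for node in ast.walk(stmt):
                if (isinstance(node, ast.Name)
                        and isinstance(node.ctx, ast.Load)
                        and node.id not in assigned):
                    problems.append((func.name, node.id, node.lineno))
            for node in ast.walk(stmt):
                if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Store):
                    assigned.add(node.id)
    return problems

code = """
def f(a):
    b = total + a   # 'total' is read before it is assigned
    total = b
    return total
"""
# find_use_before_assign(code) reports ('f', 'total', 3)
```

Note that the analysis never runs the program: like the tools described above, it reasons about the source text alone, which is why such checkers can produce false positives.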
For debugging electronic hardware (e.g
In computing, a server is a computer program or a device that provides functionality for other programs or devices, called "clients". This architecture is called the client–server model; a single overall computation is distributed across multiple processes or devices. Servers can provide various functionalities, called "services", such as sharing data or resources among multiple clients, or performing computation for a client. A single server can serve multiple clients, and a single client can use multiple servers. A client process may run on the same device or may connect over a network to a server on a different device. Typical servers are database servers, file servers, mail servers, print servers, web servers, game servers, and application servers. Client–server systems are today most often implemented by the request–response model: a client sends a request to the server, which performs some action and sends a response back to the client with a result or acknowledgement. Designating a computer as "server-class hardware" implies that it is specialized for running servers on it.
This implies that it is more powerful and reliable than standard personal computers, but alternatively, large computing clusters may be composed of many simple, replaceable server components. The use of the word server in computing comes from queueing theory, where it dates to the mid-20th century, being notably used in the paper by Kendall that introduced Kendall's notation. In earlier papers, such as Erlang's, more concrete terms such as "operators" are used. In computing, "server" dates at least to RFC 5, one of the earliest documents describing ARPANET, where it is contrasted with "user", distinguishing two types of host: "server-host" and "user-host". The use of "serving" dates to even earlier documents, such as RFC 4, contrasting "serving-host" with "using-host". The Jargon File defines "server" in the common sense of a process performing service for requests, usually remote, with the 1981 version reading: SERVER n. A kind of DAEMON which performs a service for the requester, which often runs on a computer other than the one on which the server runs.
Strictly speaking, the term server refers to a computer program or process. Through metonymy, it also refers to a device used for running one or several server programs. On a network, such a device is called a host. In addition to server, the words serve and service are used, though servicer and servant are not; the word service may refer to the abstract form of functionality, e.g. a Web service. Alternatively, it may refer to a computer program that turns a computer into a server, e.g. a Windows service. Historically used as in "servers serve users", in the sense of "obey", today one says that "servers serve data", in the same sense as "give". For instance, web servers "serve web pages to users" or "service their requests". The server is part of the client–server model. The nature of communication between a client and server is request and response; this is in contrast with the peer-to-peer model. In principle, any computerized process that can be used or called by another process is a server, and the calling process or processes are clients; thus any general-purpose computer connected to a network can host servers.
For example, if files on a device are shared by some process, that process is a file server. Web server software can run on any capable computer, so a laptop or a personal computer can host a web server. While request–response is the most common client–server design, there are others, such as the publish–subscribe pattern. In the publish–subscribe pattern, clients register with a pub–sub server, subscribing to specified types of messages. Thereafter, the pub–sub server forwards matching messages to the clients without any further requests: the server pushes messages to the client, rather than the client pulling messages from the server as in request–response. The purpose of a server is to share data as well as to distribute work. A server computer can serve its own computer programs as well; the following table shows several scenarios. The entire structure of the Internet is based upon a client–server model. High-level root nameservers, DNS, and routers direct the traffic on the Internet. There are millions of servers connected to the Internet, running continuously throughout the world, and every action taken by an ordinary Internet user requires one or more interactions with one or more servers.
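The push-versus-pull distinction between publish–subscribe and request–response can be sketched in a few lines. In this toy in-process broker (PubSubBroker is a made-up name; real pub–sub servers operate across a network), subscribers register once and then receive every matching message without issuing further requests:

```python
class PubSubBroker:
    """Toy publish-subscribe broker kept in a single process."""

    def __init__(self):
        self.subscribers = {}              # topic -> list of callbacks

    def subscribe(self, topic, callback):
        """Register interest once; no further requests are needed."""
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        """Push the message to every subscriber of the topic."""
        for deliver in self.subscribers.get(topic, []):
            deliver(message)

inbox = []
broker = PubSubBroker()
broker.subscribe("news", inbox.append)     # client registers interest
broker.publish("news", "first story")      # server pushes to the client
broker.publish("sports", "not delivered")  # no subscriber for this topic
# inbox now holds only "first story"
```

In a request–response design, by contrast, the client would have to send a new request each time it wanted to check for messages.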
There are exceptions. Hardware requirements for servers vary depending on the server's purpose and its software. Since servers are accessed over a network, many run unattended without a monitor, input devices, audio hardware or USB interfaces. Many servers do not have a graphical user interface; they are managed remotely. Remote management can be conducted via various methods, including Microsoft Management Console, PowerShell, SSH and browser-based out-of-band management systems such as Dell's iDRAC or HP's iLO. Large traditional single servers would need to be run for long periods without interruption. Ava
Google LLC is an American multinational technology company that specializes in Internet-related services and products, which include online advertising technologies, a search engine, cloud computing and hardware. It is considered one of the Big Four technology companies, alongside Amazon, Apple and Facebook. Google was founded in 1998 by Larry Page and Sergey Brin while they were Ph.D. students at Stanford University in California. Together they own about 14 percent of its shares and control 56 percent of the stockholder voting power through supervoting stock. They incorporated Google as a privately held company on September 4, 1998. An initial public offering took place on August 19, 2004, and Google moved to its headquarters in Mountain View, California, nicknamed the Googleplex. In August 2015, Google announced plans to reorganize its various interests as a conglomerate called Alphabet Inc. Google is Alphabet's leading subsidiary and will continue to be the umbrella company for Alphabet's Internet interests. Sundar Pichai was appointed CEO of Google, replacing Larry Page, who became CEO of Alphabet.
The company's rapid growth since incorporation has triggered a chain of products and partnerships beyond Google's core search engine. It offers services designed for work and productivity, email and time management, cloud storage, instant messaging and video chat, language translation and navigation, video sharing, note-taking, photo organizing and editing; the company leads the development of the Android mobile operating system, the Google Chrome web browser, Chrome OS, a lightweight operating system based on the Chrome browser. Google has moved into hardware. Google has experimented with becoming an Internet carrier. Google.com is the most visited website in the world. Several other Google services figure in the top 100 most visited websites, including YouTube and Blogger. Google is the most valuable brand in the world as of 2017, but has received significant criticism involving issues such as privacy concerns, tax avoidance, antitrust and search neutrality. Google's mission statement is "to organize the world's information and make it universally accessible and useful".
The company's unofficial slogan "Don't be evil" was removed from the company's code of conduct around May 2018. Google began in January 1996 as a research project by Larry Page and Sergey Brin when they were both PhD students at Stanford University in Stanford, California. While conventional search engines ranked results by counting how many times the search terms appeared on the page, the two theorized about a better system that analyzed the relationships among websites; they called this new technology PageRank. Page and Brin nicknamed their new search engine "BackRub", because the system checked backlinks to estimate the importance of a site; they later changed the name to Google. The domain name for Google was registered on September 15, 1997, and the company was incorporated on September 4, 1998; it was based in the garage of a friend in California. Craig Silverstein, a fellow PhD student at Stanford, was hired as the first employee. Google was funded by an August 1998 contribution of $100,000 from Andy Bechtolsheim, co-founder of Sun Microsystems.
Google received money from three other angel investors in 1998: Amazon.com founder Jeff Bezos, Stanford University computer science professor David Cheriton, and entrepreneur Ram Shriram. Between these initial investors and family, Google raised around $1 million, which allowed the founders to open their original shop in Menlo Park, California. After some additional, small investments through the end of 1998 to early 1999, a new $25 million round of funding was announced on June 7, 1999, with major investors including the venture capital firms Kleiner Perkins and Sequoia Capital. In March 1999, the company moved its offices to Palo Alto, home to several prominent Silicon Valley technology start-ups. The next year, Google began selling advertisements associated with search keywords, against Page and Brin's initial opposition toward an advertising-funded search engine. To maintain an uncluttered page design, advertisements were text-based. In June 2000, it was announced that Google would become the default search engine provider for Yahoo!, one of the most popular websites at the time, replacing Inktomi.
In 2003, after outgrowing two other locations, the company leased an office complex from Silicon Graphics at 1600 Amphitheatre Parkway in Mountain View, California. The complex became known as the Googleplex, a play on the word googolplex, the number one followed by a googol of zeroes. Three years later, Google bought the property from SGI for $319 million. By that time, the name "Google
Der Spiegel is a German weekly news magazine published in Hamburg. With a weekly circulation of 840,000 copies, it is the largest such publication in Europe. It was founded in 1947 by John Seymour Chaloner, a British army officer, and Rudolf Augstein, a former Wehrmacht radio operator who was recognised in 2000 by the International Press Institute as one of the fifty World Press Freedom Heroes. Spiegel Online, the online sibling of Der Spiegel, was launched in 1994 with an independent editorial staff. The magazine has a content-to-advertising ratio of 2:1. Der Spiegel is known in German-speaking countries for its investigative journalism, and it has played a key role in uncovering many political scandals, such as the Spiegel scandal in 1962 and the Flick affair in the 1980s. According to The Economist, Der Spiegel is one of continental Europe's most influential magazines. The first edition of Der Spiegel was published in Hanover on Saturday, 4 January 1947. Its release was initiated and sponsored by the British occupational administration and preceded by a magazine titled Diese Woche, which had first been published in November 1946.
After disagreements with the British, the magazine was handed over to Rudolf Augstein as chief editor and was renamed Der Spiegel. From the first edition in January 1947, Augstein held the position of editor-in-chief, which he retained until his death on 7 November 2002. After 1950, the magazine was owned by John Jahr. In 1969, Augstein bought out Gruner + Jahr for DM 42 million and became the sole owner of Der Spiegel. In 1971, Gruner + Jahr bought back a 25% share in the magazine. In 1974, Augstein restructured the company to make the employees shareholders. All employees with more than three years' seniority were offered the opportunity to become an associate and participate in the management of the company, as well as in the profits. Since 1952, Der Spiegel has been headquartered in its own building in the old town part of Hamburg. Der Spiegel's circulation rose quickly. From 15,000 copies in 1947, it grew to 65,000 in 1948 and 437,000 in 1961, and it was nearly 500,000 copies in 1962. By the 1970s, it had reached a plateau at about 900,000 copies.
When German re-unification in 1990 made it available to a new readership in former East Germany, the circulation exceeded one million. The magazine's influence is based on two pillars. Since 1988, it has produced the TV programme Spiegel TV, and it diversified further during the 1990s. During the second quarter of 1992 the circulation of Der Spiegel was 1.1 million copies. In 1994, Spiegel Online was launched, with an editorial staff independent of Der Spiegel. In 1999, the circulation of Der Spiegel was 1,061,000 copies, and it had an average circulation of 1,076,000 copies in 2003. In 2007 the magazine started a new regional supplement in Switzerland, its first regional supplement. In 2010 Der Spiegel was employing the equivalent of 80 full-time fact checkers, which the Columbia Journalism Review called "most likely the world's largest fact checking operation". The same year it was the third best-selling general-interest magazine in Europe, with a circulation of 1,016,373 copies.
In 2018, Der Spiegel became involved in a journalistic scandal after it discovered and made public that one of its leading reporters, Claas Relotius, had "falsified his articles on a grand scale". When Stefan Aust took over in 1994, the magazine's readers realised that his personality was different from his predecessor's. In 2005, a documentary by Stephan Lamby quoted him as follows: "We stand at a big cannon!" Politicians of all stripes who had to deal with the magazine's attention voiced their disaffection for it. The outspoken conservative Franz Josef Strauß contended that Der Spiegel was "the Gestapo of our time", and he referred to journalists in general as "rats". The Social Democrat Willy Brandt called it "Scheißblatt" (a "lousy rag") during his term in office as Chancellor. Der Spiegel produces feature-length articles on problems affecting Germany and describes possible strategies and their risks in depth. The magazine plays the role of opinion leader in the German press. Der Spiegel has a distinctive reputation for revealing political misconduct and scandals.
The online Encyclopædia Britannica emphasizes this quality of the magazine as follows: "The magazine is renowned for its aggressive and well-written exposés of government malpractice and scandals." It merited recognition for this as early as 1950, when the federal parliament launched an inquiry into Spiegel's accusations that bribed members of parliament had promoted Bonn over Frankfurt as the seat of West Germany's government. During the Spiegel scandal in 1962, which followed the release of a report about the low state of readiness of the German armed forces, minister of defence and conservative figurehead Franz Josef Strauß had Der Spiegel investigated. In the course of this investigation, the editorial offices were raided by police, while Rudolf Augstein and other Der Spiegel editors were arrested on charges of treason. Despite a lack of sufficient authority, Strauß went after the article's author, Conrad Ahlers, who was arrested in Spain where he was on holiday. When the legal case collapsed, the scandal led to a m
Microsoft Windows is a group of several graphical operating system families, all of which are developed and sold by Microsoft. Each family caters to a certain sector of the computing industry. Active Windows families include Windows NT and Windows Embedded. Defunct Windows families include Windows Mobile and Windows Phone. Microsoft introduced an operating environment named Windows on November 20, 1985, as a graphical operating system shell for MS-DOS in response to the growing interest in graphical user interfaces. Microsoft Windows came to dominate the world's personal computer market with over 90% market share, overtaking Mac OS, which had been introduced in 1984. Apple came to see Windows as an unfair encroachment on their innovation in GUI development as implemented on products such as the Lisa and Macintosh. On PCs, Windows is still the most popular operating system. However, in 2014, Microsoft admitted losing the majority of the overall operating system market to Android, because of the massive growth in sales of Android smartphones.
In 2014, the number of Windows devices sold was less than 25% of the number of Android devices sold. This comparison, however, may not be fully relevant, as the two operating systems traditionally target different platforms. Still, numbers for server use of Windows show about one third market share, similar to that for end user use. As of October 2018, the most recent version of Windows for PCs, tablets and embedded devices is Windows 10. The most recent version for server computers is Windows Server 2019. A specialized version of Windows also runs on the Xbox One video game console. Microsoft, the developer of Windows, has registered several trademarks, each of which denotes a family of Windows operating systems that target a specific sector of the computing industry. As of 2014, the following Windows families are being developed: Windows NT: started as a family of operating systems with Windows NT 3.1, an operating system for server computers and workstations. It now consists of three operating system subfamilies that are released at the same time and share the same kernel: Windows: the operating system for mainstream personal computers and smartphones.
The latest version is Windows 10. The main competitor of this family is macOS by Apple for personal computers and Android for mobile devices. Windows Server: the operating system for server computers. The latest version is Windows Server 2019. Unlike its client sibling, it has adopted a strong naming scheme. The main competitor of this family is Linux. Windows PE: a lightweight version of its Windows sibling, meant to operate as a live operating system, used for installing Windows on bare-metal computers and for recovery or troubleshooting purposes. The latest version is Windows PE 10. Windows IoT: initially, Microsoft developed Windows CE as a general-purpose operating system for every device that was too resource-limited to be called a full-fledged computer. Windows CE was later renamed Windows Embedded Compact and was folded under the Windows Embedded trademark, which also consists of Windows Embedded Industry, Windows Embedded Professional, Windows Embedded Standard, Windows Embedded Handheld and Windows Embedded Automotive.
The following Windows families are no longer being developed: Windows 9x: an operating system that targeted the consumer market. It was discontinued because of suboptimal performance; Microsoft now caters to the consumer market with Windows NT. Windows Mobile: the predecessor to Windows Phone, it was a mobile phone operating system. The first version was called Pocket PC 2000; the last version is Windows Mobile 6.5. Windows Phone: an operating system sold only to manufacturers of smartphones. The first version was Windows Phone 7, followed by Windows Phone 8 and the last version, Windows Phone 8.1; it was succeeded by Windows 10 Mobile. The term Windows collectively describes any or all of several generations of Microsoft operating system products. The history of Windows dates back to 1981, when Microsoft started work on a program called "Interface Manager". It was announced in November 1983 under the name "Windows", but Windows 1.0 was not released until November 1985.
Windows 1.0 achieved little popularity. It is not a complete operating system; its shell is a program known as the MS-DOS Executive. Components included Calculator, Cardfile, Clipboard viewer, Control Panel, Paint, Reversi and Write. Windows 1.0 does not allow overlapping windows; instead, all windows are tiled, and only modal dialog boxes may appear over other windows. Microsoft sold Windows Development libraries with the C development environment, which included numerous windows samples. Windows 2.0 was released in December 1987 and was more popular than its predecessor. It features several improvements to the user interface and memory management. Windows 2.03 changed the OS from tiled windows to overlapping windows. This change led to Apple Computer filing a suit against Microsoft, alleging infringement of Apple's copyrights. Windows 2.0
GNOME is a free and open-source desktop environment for Unix-like operating systems. GNOME was originally an acronym for GNU Network Object Model Environment, but the acronym was dropped because it no longer reflected the vision of the GNOME project. GNOME is part of the GNU Project and is developed by The GNOME Project, composed of both volunteers and paid contributors, the largest corporate contributor being Red Hat. It is an international project that aims to develop software frameworks for the development of software, to program end-user applications based on these frameworks, and to coordinate efforts for internationalization, localization and accessibility of that software. GNOME 3 is the default desktop environment on many major Linux distributions, including Fedora, Ubuntu, SUSE Linux Enterprise, Red Hat Enterprise Linux, CentOS, Oracle Linux, Scientific Linux, SteamOS, Kali Linux and Endless OS. The continued fork of the last GNOME 2 release, which goes under the name MATE, is the default on many distributions that target low usage of system resources.
GNOME was started on August 15, 1997 by Miguel de Icaza and Federico Mena as a free software project to develop a desktop environment and applications for it. It was founded in part because K Desktop Environment, then growing in popularity, relied on the Qt widget toolkit, which used a proprietary software license until version 2.0. In place of Qt, the GTK toolkit was chosen as the base of GNOME. GTK uses the GNU Lesser General Public License (LGPL), a free software license that allows software linking to it to use a much wider set of licenses, including proprietary software licenses. GNOME itself is licensed under the LGPL for its libraries and under the GNU General Public License for its applications. The name "GNOME" referred to the original intention of creating a distributed object framework similar to Microsoft's OLE. The California startup Eazel developed the Nautilus file manager from 1999 to 2001.
De Icaza and Nat Friedman founded Helix Code in 1999 in Massachusetts. During the transition to GNOME 2 around 2001, and shortly thereafter, there were brief talks about creating a GNOME Office suite. On September 15, 2003, GNOME-Office 1.0, consisting of AbiWord 2.0, GNOME-DB 1.0 and Gnumeric 1.2.0, was released. Although some release planning for GNOME Office 1.2 took place on the gnome-office mailing list, and Gnumeric 1.4 was announced as part of it, the 1.2 release of the suite itself never materialized. As of May 4, 2014, the GNOME wiki mentions only "GNOME/Gtk applications that are useful in an office environment". GNOME 2 was similar to a conventional desktop interface, featuring a simple desktop in which users could interact with virtual objects such as windows and files. GNOME 2 started out with Sawfish but switched to Metacity as its default window manager; the handling of windows and files in GNOME 2 is similar to that of contemporary desktop operating systems. In the default configuration of GNOME 2, the desktop has a launcher menu for quick access to installed programs and file locations.
However, these features can be moved to any position or orientation the user desires, replaced with other functions, or removed altogether. As of 2009, GNOME 2 was the default desktop for OpenSolaris. GNOME 1 and 2 followed the traditional desktop metaphor. GNOME 3, released in 2011, changed this with GNOME Shell, a more abstract metaphor in which switching between tasks and virtual desktops takes place in a separate area called the "Overview". Since Mutter replaced Metacity as the default window manager, the minimize and maximize buttons no longer appear by default, and the title bar, menu bar and toolbar are combined into a single horizontal bar called the "header bar" via the client-side decoration mechanism. Adwaita replaced Clearlooks as the default theme. Many GNOME Core Applications went through redesigns to provide a more consistent user experience. The release of GNOME 3, notable for its move away from the traditional menu bar and taskbar, caused considerable controversy in the GNU and Linux community.
Many users and developers have expressed concerns about usability. A few projects have been initiated to continue development of GNOME 2.x or to modify GNOME 3.x to be more like the 2.x releases. GNOME 3 aims to provide a single interface for desktop computers and tablet computers; this means using only input techniques that work on all of those devices, requiring the abandonment of certain concepts to which desktop users were accustomed, such as right-clicking or saving files on the desktop. These major changes evoked widespread criticism. The MATE desktop environment was forked from the GNOME 2 code base with the intent of retaining the traditional GNOME 2 interface while keeping compatibility with modern Linux technology, such as GTK 3. The Linux Mint team addressed the issue in another way by developing the "Mint GNOME Shell Extensions", which ran on top of GNOME Shell and allowed it to be used via the traditional desktop metaphor; this led to the creation of the Cinnamon user interface, forked from the GNOME 3 codebase.
Among those critical of the early releases of GNOME 3 is Linus Torvalds, the creator of the Linux kernel. Torvalds abandoned GNOME for a wh