The Linux kernel is a free and open-source, Unix-like operating system kernel. The Linux family of operating systems is based on this kernel and is deployed both on traditional computer systems such as personal computers and servers, in the form of Linux distributions, and on various embedded devices such as routers, wireless access points, PBXes, set-top boxes, FTA receivers, smart TVs, PVRs, and NAS appliances. While the adoption of the Linux kernel in desktop operating systems is low, Linux-based operating systems dominate nearly every other segment of computing, from mobile devices to mainframes; as of November 2017, all of the world's 500 most powerful supercomputers run Linux. The Android operating system for tablet computers and smartwatches also uses the Linux kernel. The Linux kernel was conceived and created in 1991 by Linus Torvalds for his personal computer, with no cross-platform intentions, but it has since expanded to support a huge array of computer architectures, many more than other operating systems or kernels.
Linux attracted developers and users who adopted it as the kernel for other free software projects, notably the GNU operating system, which was created as a free, non-proprietary operating system based on UNIX, partly as a by-product of the fallout of the Unix wars. The Linux kernel API, the application programming interface through which user programs interact with the kernel, is meant to be stable and not to break userspace programs. As part of the kernel's functionality, device drivers control the hardware. However, the interface between the kernel and loadable kernel modules, unlike in many other kernels and operating systems, is deliberately not meant to be stable. The Linux kernel, developed by contributors worldwide, is a prominent example of free and open-source software; day-to-day development discussions take place on the Linux kernel mailing list. The Linux kernel is released under the GNU General Public License version 2, with some firmware images released under various non-free licenses. In April 1991, Linus Torvalds, at the time a 21-year-old computer science student at the University of Helsinki, started working on some simple ideas for an operating system.
He started with a task switcher in Intel 80386 assembly language and a terminal driver. On 25 August 1991, Torvalds posted the following to comp.os.minix, a newsgroup on Usenet: I'm doing a (free) operating system for 386 AT clones. This has been brewing since April, and is starting to get ready. I'd like any feedback on things people like/dislike in minix. I've ported bash and gcc, and things seem to work. This implies that I'll get something practical within a few months. Yes - it's free of any minix code, and it has a multi-threaded fs. It is NOT portable, and it never will support anything other than AT-harddisks, as that's all I have :-(. It's in C, but most people wouldn't call what I write C. It uses every conceivable feature of the 386 I could find, as it was a project to teach me about the 386. As mentioned, it uses an MMU, for both paging and segmentation. It's the segmentation that makes it really 386 dependent; some of my "C"-files are as much assembler as C. Unlike minix, I happen to LIKE interrupts, so interrupts are handled without trying to hide the reason behind them. After that, many people contributed code to the project.
Early on, the MINIX community contributed code and ideas to the Linux kernel. At the time, the GNU Project had created many of the components required for a free operating system, but its own kernel, GNU Hurd, was incomplete and unavailable, and the Berkeley Software Distribution had not yet freed itself from legal encumbrances. Despite the limited functionality of the early versions, Linux gained developers and users. In September 1991, Torvalds released version 0.01 of the Linux kernel on the FTP server of the Finnish University and Research Network. It had 10,239 lines of code. On 5 October 1991, version 0.02 of the Linux kernel was released. Torvalds assigned version 0 to the kernel to indicate that it was for testing and not intended for production use. In December 1991, Linux kernel 0.11 was released. This version was the first to be self-hosted, as Linux kernel 0.11 could be compiled on a computer running the same kernel version. When Torvalds released version 0.12 in February 1992, he adopted the GNU General Public License version 2 in place of his previous self-drafted license, which had not permitted commercial redistribution.
On 19 January 1992, the first post to the new newsgroup alt.os.linux was submitted. On 31 March 1992, the newsgroup was renamed comp.os.linux. The fact that Linux is a monolithic kernel rather than a microkernel was the topic of a debate between Andrew S. Tanenbaum, the creator of MINIX, and Torvalds. This discussion is known as the Tanenbaum–Torvalds debate and started in 1992 on the Usenet discussion group comp.os.minix as a general debate about Linux and kernel architecture. Tanenbaum argued that microkernels were superior to monolithic kernels and that therefore Linux was obsolete. Unlike in traditional monolithic kernels, device drivers in Linux can be configured as loadable kernel modules and loaded or unloaded while the system is running.
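As an illustration of how such modules hook into a running kernel, the following is a minimal sketch of a loadable kernel module in C. It assumes a conventional Linux build environment with the kernel headers installed; the module name and log messages are purely illustrative, not part of any real driver.

#include <linux/init.h>
#include <linux/module.h>
#include <linux/kernel.h>

/* Called when the module is inserted, e.g. via insmod. */
static int __init hello_init(void)
{
    printk(KERN_INFO "hello: module loaded\n");
    return 0;               /* returning non-zero would abort loading */
}

/* Called when the module is removed, e.g. via rmmod. */
static void __exit hello_exit(void)
{
    printk(KERN_INFO "hello: module unloaded\n");
}

module_init(hello_init);
module_exit(hello_exit);

MODULE_LICENSE("GPL");      /* non-GPL modules taint the kernel */
MODULE_DESCRIPTION("Illustrative loadable kernel module");

A module like this is built against the running kernel's source tree, inserted with insmod and removed with rmmod; the kernel then calls the registered init and exit functions at load and unload time.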
A Linux distribution is an operating system made from a software collection based upon the Linux kernel and a package management system. Linux users obtain their operating system by downloading one of the Linux distributions, which are available for a wide variety of systems ranging from embedded devices and personal computers to powerful supercomputers. A typical Linux distribution comprises a Linux kernel, GNU tools and libraries, additional software, documentation, a window system, a window manager, and a desktop environment. Most of the included software is free and open-source software made available both as compiled binaries and in source code form, allowing modifications to the original software. Linux distributions optionally include some proprietary software that may not be available in source code form, such as binary blobs required for some device drivers. A Linux distribution may be described as a particular assortment of application and utility software, packaged together with the Linux kernel in such a way that its capabilities meet the needs of many users.
The software is adapted to the distribution and packaged into software packages by the distribution's maintainers. The software packages are available online in so-called repositories, which are storage locations distributed around the world. Besides glue components, such as the distribution installers or the package management systems, there are only a few packages that are written from the ground up by the maintainers of a Linux distribution. Around six hundred Linux distributions exist, with close to five hundred of those in active development. Because of the wide availability of software, distributions have taken a wide variety of forms, including those suitable for use on desktops, laptops, mobile phones and tablets, as well as minimal environments for use in embedded systems. There are commercially backed distributions, such as Fedora, openSUSE and Ubuntu, and community-driven distributions, such as Debian, Slackware and Arch Linux. Most distributions come ready to use and pre-compiled for a specific instruction set, while some distributions are distributed in source code form and compiled locally during installation.
Linus Torvalds developed the Linux kernel and distributed its first version, 0.01, in 1991. Linux was initially distributed as source code only, and later as a pair of downloadable floppy disk images – one bootable and containing the Linux kernel itself, the other with a set of GNU utilities and tools for setting up a file system. Since the installation procedure was complicated, especially in the face of growing amounts of available software, distributions sprang up to simplify this. Early distributions included H. J. Lu's "Boot-root", the aforementioned disk image pair with the kernel and the absolute minimal tools needed to get started, in late 1991; MCC Interim Linux, made available to the public for download in February 1992; Softlanding Linux System (SLS), released in 1992, which for a short time was the most comprehensive distribution and included the X Window System; and Yggdrasil Linux/GNU/X, a commercial distribution first released in December 1992. The two oldest distribution projects that are still active started in 1993: the SLS distribution was not well maintained, so in July 1993 a new distribution, called Slackware and based on SLS, was released by Patrick Volkerding.
Dissatisfied with SLS, Ian Murdock set out to create a free distribution by founding Debian, which had its first release in December 1993. Users were attracted to Linux distributions as alternatives to the DOS and Microsoft Windows operating systems on IBM PC compatible computers, Mac OS on the Apple Macintosh, and proprietary versions of Unix. Most early adopters were familiar with Unix from school, and they embraced Linux distributions for their low cost and the availability of the source code for most or all of the software included. The distributions were originally a convenience, offering a free alternative to proprietary versions of Unix, but they later became the usual choice even for Unix and Linux experts. To date, Linux has become more popular in the server and embedded devices markets than in the desktop market. For example, Linux is used on over 50% of web servers, whereas its desktop market share is about 3.7%. Many Linux distributions provide an installation system akin to that provided with other modern operating systems. On the other hand, some distributions, including Gentoo Linux, provide only the binaries of a basic kernel, compilation tools, and an installer.
Distributions are segmented into packages. Each package contains a particular application or service. Examples of packages include a library for handling the PNG image format, a collection of fonts, or a web browser. The package is typically provided as compiled code, with installation and removal of packages handled by a package management system (PMS) rather than a simple file archiver. Each package intended for such a PMS contains meta-information such as a package description, a version, and its "dependencies". The package management system can evaluate this meta-information to allow package searches, to perform an automatic upgrade to a newer version, to check that all dependencies of a package are fulfilled, and/or to fulfill them automatically.
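To make the dependency mechanism concrete, the following is a small, self-contained sketch in C of how a package manager might walk a package's dependency graph before installation. The package names and the repository table are hypothetical, and a real package management system would also handle versions, conflicts, and already-installed packages.

#include <stdio.h>
#include <string.h>

/* A toy repository: each entry lists the packages it depends on.
 * All names here are hypothetical examples, not real packages. */
struct package {
    const char *name;
    const char *deps[4];   /* NULL-terminated list of dependency names */
};

static const struct package repo[] = {
    { "webbrowser", { "libpng", "fontpack", NULL } },
    { "libpng",     { "zlib", NULL } },
    { "fontpack",   { NULL } },
    { "zlib",       { NULL } },
};

static const struct package *find(const char *name)
{
    for (size_t i = 0; i < sizeof repo / sizeof repo[0]; i++)
        if (strcmp(repo[i].name, name) == 0)
            return &repo[i];
    return NULL;
}

/* Recursively check that a package and everything it depends on exist
 * in the repository, printing an install order as we go. */
static int resolve(const char *name)
{
    const struct package *p = find(name);
    if (!p) {
        fprintf(stderr, "missing dependency: %s\n", name);
        return 0;
    }
    for (int i = 0; p->deps[i] != NULL; i++)
        if (!resolve(p->deps[i]))
            return 0;
    printf("install %s\n", name);
    return 1;
}

int main(void)
{
    /* Resolving "webbrowser" prints zlib, libpng, fontpack, webbrowser. */
    return resolve("webbrowser") ? 0 : 1;
}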
A debugger or debugging tool is a computer program used to test and debug other programs. The code to be examined might alternatively be running on an instruction set simulator, a technique that allows great power in its ability to halt when specific conditions are encountered, but which will typically be somewhat slower than executing the code directly on the appropriate processor. Some debuggers offer two modes of operation, full or partial simulation, to limit this impact. A "trap" occurs when the program cannot continue because of a programming bug or invalid data. For example, the program might have tried to use an instruction not available on the current version of the CPU or attempted to access unavailable or protected memory. When the program "traps" or reaches a preset condition, the debugger shows the location in the original code if it is a source-level debugger or symbolic debugger, as commonly seen in integrated development environments. If it is a low-level debugger or a machine-language debugger, it shows the line in the disassembly.
Debuggers typically offer a query processor, a symbol resolver, an expression interpreter, and a debug support interface at the top level. Debuggers also offer more sophisticated functions such as running a program step by step, stopping at some event or specified instruction by means of a breakpoint, and tracking the values of variables. Some debuggers have the ability to modify program state: it may be possible to continue execution at a different location in the program to bypass a crash or logical error. The same functionality which makes a debugger useful for eliminating bugs allows it to be used as a software cracking tool to evade copy protection, digital rights management, and other software protection features. It also makes debuggers useful as general verification, fault coverage, and performance analysis tools, especially if instruction path lengths are shown. Most mainstream debugging engines, such as gdb and dbx, provide console-based command line interfaces. Debugger front-ends are popular extensions to debugger engines that provide IDE integration, program animation, and visualization features.
Some debuggers include a feature called "reverse debugging", also known as "historical debugging" or "backwards debugging". These debuggers make it possible to step a program's execution backwards in time. Various debuggers include this feature: Microsoft Visual Studio offers IntelliTrace reverse debugging for Visual Basic .NET and some other languages, but not for C++. Reverse debuggers also exist for C, C++, Python and other languages; some are open source. Some reverse debuggers slow down the target by orders of magnitude, but the best reverse debuggers cause a slowdown of 2× or less. Reverse debugging is useful for certain types of problems, but is still not commonly used. Some debuggers operate on a single specific language while others can handle multiple languages transparently. For example, if the main target program is written in COBOL but calls assembly language subroutines and PL/1 subroutines, the debugger may have to dynamically switch modes to accommodate the changes in language as they occur. Some debuggers also incorporate memory protection to avoid storage violations such as buffer overflow.
This may be important in transaction processing environments where memory is dynamically allocated from memory "pools" on a task-by-task basis. Most modern microprocessors have at least one of these features in their CPU design to make debugging easier: hardware support for single-stepping a program, such as the trap flag; an instruction set that meets the Popek and Goldberg virtualization requirements, which makes it easier to write debugger software that runs on the same CPU as the software being debugged; in-system programming (ISP), which allows an external hardware debugger to reprogram a system under test (many systems with such ISP support also have other hardware debug support); hardware support for code and data breakpoints, such as address comparators and data value comparators or, with more work involved, page fault hardware; and JTAG access to hardware debug interfaces such as those on ARM architecture processors or using the Nexus command set. Processors used in embedded systems typically have extensive JTAG debug support.
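On Linux, the single-stepping support mentioned above is exposed to user space through the ptrace system call, which is also the mechanism debuggers such as gdb build on. The following is a minimal sketch in C that traces a child process and single-steps it one machine instruction at a time; /bin/true is used here only as a convenient, short-lived target, and stepping every instruction of even a small program can take a noticeable amount of time.

#include <stdio.h>
#include <sys/ptrace.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    pid_t child = fork();
    if (child == 0) {
        /* Child: ask to be traced, then execute the target program. */
        ptrace(PTRACE_TRACEME, 0, NULL, NULL);
        execl("/bin/true", "true", (char *)NULL);
        _exit(1);                       /* only reached if exec fails */
    }

    int status;
    waitpid(child, &status, 0);         /* child stops at its first exec */

    long steps = 0;
    while (WIFSTOPPED(status)) {
        /* Resume the child for exactly one instruction (CPU trap flag). */
        if (ptrace(PTRACE_SINGLESTEP, child, NULL, NULL) == -1)
            break;
        waitpid(child, &status, 0);
        steps++;
    }
    printf("single-stepped %ld instructions before the child exited\n", steps);
    return 0;
}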
The GNU Manifesto was written by Richard Stallman and published in March 1985 in Dr. Dobb's Journal of Software Tools as an explanation of the goals of the GNU Project and as a call for support and participation in developing GNU, a free software computer operating system. It is held in high regard within the free software movement as a fundamental philosophical source. The full text is included with GNU software such as Emacs and is publicly available. Parts of the GNU Manifesto began as an announcement of the GNU Project posted by Richard Stallman on September 27, 1983, in the form of an email on Usenet newsgroups. The project's aim was to give computer users freedom and control over their computers by collaboratively developing and providing software, based on Stallman's idea of software freedom. The manifesto was written as a way to familiarize more people with these concepts and to find more support in the form of work, money and hardware. The GNU Manifesto took its name and full form in 1985 and was updated in minor ways in 1987.
The GNU Manifesto opens with an explanation of what the GNU Project is and of the progress that had been made, at the time, in creating the GNU operating system. The system, although based on and compatible with Unix, is meant by the author to have many improvements over it, which are listed in detail in the manifesto. One of the major driving points behind the GNU Project, according to Stallman, is the rapid trend toward Unix and its various components becoming proprietary software. The manifesto lays a philosophical basis for launching the project and the importance of bringing it to fruition: proprietary software is a way to divide users, who are no longer able to help each other. Stallman refuses to write proprietary software as a sign of solidarity with them. The author provides many reasons why the project and software freedom are beneficial to users, although he agrees that their wide adoption would make the work of programmers less profitable. A large part of the GNU Manifesto is focused on rebutting possible objections to the GNU Project's goals.
These include the programmer's need to make a living, the issue of advertising and distributing free software, and the perceived need of a profit incentive.
An operating system is system software that manages computer hardware and software resources and provides common services for computer programs. Time-sharing operating systems schedule tasks for efficient use of the system and may include accounting software for cost allocation of processor time, mass storage and other resources. For hardware functions such as input and output and memory allocation, the operating system acts as an intermediary between programs and the computer hardware, although the application code is usually executed directly by the hardware and makes system calls to an OS function or is interrupted by it. Operating systems are found on many devices that contain a computer – from cellular phones and video game consoles to web servers and supercomputers. The dominant desktop operating system is Microsoft Windows with a market share of around 82.74%; macOS by Apple Inc. is in second place, and the varieties of Linux are collectively in third place. In the mobile sector, Google's Android accounted for up to 70% of use in 2017; according to third quarter 2016 data, Android on smartphones is dominant with 87.5 percent and a growth rate of 10.3 percent per year, followed by Apple's iOS with 12.1 percent and a yearly decrease in market share of 5.2 percent, while other operating systems amount to just 0.3 percent.
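As noted above, application code requests services from the kernel through system calls. The following is a minimal, Linux-specific sketch in C that bypasses the usual C library wrappers to show the mechanism directly; in everyday programs the same transition into the kernel happens implicitly whenever a library function such as printf or read is used.

#include <unistd.h>
#include <sys/syscall.h>

int main(void)
{
    const char msg[] = "hello from a raw system call\n";
    /* Invoke the kernel's write(2) directly on file descriptor 1 (stdout),
     * instead of going through printf or the libc write() wrapper. */
    syscall(SYS_write, 1, msg, sizeof msg - 1);
    return 0;
}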
Linux distributions are dominant in the supercomputing sector. Other specialized classes of operating systems, such as embedded and real-time systems, exist for many applications. A single-tasking system can only run one program at a time, while a multi-tasking operating system allows more than one program to be running concurrently. This is achieved by time-sharing, where the available processor time is divided between multiple processes. These processes are each interrupted in time slices by a task-scheduling subsystem of the operating system. Multi-tasking may be characterized in preemptive and cooperative types. In preemptive multitasking, the operating system slices the CPU time and dedicates a slot to each of the programs. Unix-like operating systems, such as Solaris and Linux, as well as non-Unix-like ones, such as AmigaOS, support preemptive multitasking. Cooperative multitasking is achieved by relying on each process to provide time to the other processes in a defined manner. 16-bit versions of Microsoft Windows used cooperative multi-tasking.
32-bit versions of both Windows NT and Win9x used preemptive multi-tasking. Single-user operating systems have no facilities to distinguish users, but may allow multiple programs to run in tandem. A multi-user operating system extends the basic concept of multi-tasking with facilities that identify processes and resources, such as disk space, belonging to multiple users, and the system permits multiple users to interact with the system at the same time. Time-sharing operating systems schedule tasks for efficient use of the system and may include accounting software for cost allocation of processor time, mass storage and other resources to multiple users. A distributed operating system manages a group of distinct computers and makes them appear to be a single computer; the development of networked computers that could be linked and communicate with each other gave rise to distributed computing. Distributed computations are carried out on more than one machine; when computers in a group work in cooperation, they form a distributed system.
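Cooperative multitasking, described above, relies on each task voluntarily handing the processor to the others. The following is a minimal user-space sketch in C using the POSIX ucontext interface to switch between two tasks that yield to each other; the task names and iteration counts are illustrative only, and a real cooperative scheduler would manage many tasks and a run queue.

#include <stdio.h>
#include <ucontext.h>

static ucontext_t main_ctx, ctx_a, ctx_b;
static char stack_a[64 * 1024], stack_b[64 * 1024];

/* Each task runs a few steps and voluntarily yields to the other,
 * which is the essence of cooperative multitasking. */
static void task_a(void)
{
    for (int i = 0; i < 3; i++) {
        printf("task A, step %d\n", i);
        swapcontext(&ctx_a, &ctx_b);    /* yield to task B */
    }
}

static void task_b(void)
{
    for (int i = 0; i < 3; i++) {
        printf("task B, step %d\n", i);
        swapcontext(&ctx_b, &ctx_a);    /* yield to task A */
    }
}

int main(void)
{
    getcontext(&ctx_a);
    ctx_a.uc_stack.ss_sp = stack_a;
    ctx_a.uc_stack.ss_size = sizeof stack_a;
    ctx_a.uc_link = &main_ctx;          /* return to main when A finishes */
    makecontext(&ctx_a, task_a, 0);

    getcontext(&ctx_b);
    ctx_b.uc_stack.ss_sp = stack_b;
    ctx_b.uc_stack.ss_size = sizeof stack_b;
    ctx_b.uc_link = &main_ctx;
    makecontext(&ctx_b, task_b, 0);

    swapcontext(&main_ctx, &ctx_a);     /* start the first task */
    printf("back in main\n");
    return 0;
}

If one of the tasks never calls swapcontext, the other task and main are never scheduled again, which is exactly the weakness that preemptive multitasking addresses.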
In an OS, distributed and cloud computing context, templating refers to creating a single virtual machine image as a guest operating system and then saving it as a tool for multiple running virtual machines. The technique is used both in virtualization and in cloud computing management, and is common in large server warehouses. Embedded operating systems are designed to be used in embedded computer systems. They are designed to operate on small machines like PDAs with less autonomy, and they are able to operate with a limited number of resources, being compact and efficient by design. Windows CE and Minix 3 are some examples of embedded operating systems. A real-time operating system is an operating system that guarantees to process events or data by a specific moment in time. A real-time operating system may be single- or multi-tasking, but when multitasking, it uses specialized scheduling algorithms so that a deterministic nature of behavior is achieved. An event-driven system switches between tasks based on their priorities or external events, while time-sharing operating systems switch tasks based on clock interrupts.
A library operating system is one in which the services that a typical operating system provides, such as networking, are provided in the form of libraries and composed with the application and configuration code to construct a unikernel: a specialized, single-address-space machine image that can be deployed to cloud or embedded environments. Early computers were built to perform a series of single tasks, like a calculator. Basic operating system features were developed in the 1950s, such as resident monitor functions that could automatically run different programs in succession to speed up processing. Operating systems did not exist in their more complex forms until the early 1960s. Hardware features were added that enabled use of runtime libraries and parallel processing. When personal computers became popular in the 1980s, operating systems were made for them that were similar in concept to those used on larger computers. In the 1940s, the earliest electronic digital systems had no operating systems.
Electronic systems of this time were programmed on rows of mechanical switches or by jumper wires on plug boards. These were special-purpose systems that, for example, generated ballistics tables for the military or controlled the printing of payroll checks from data on punched paper cards.
Free software movement
The free software movement, also called the free/open-source software movement or the free/libre open-source software movement, is a social movement with the goal of obtaining and guaranteeing certain freedoms for software users, namely the freedom to run the software, to study and change the software, and to redistribute copies with or without changes. Although it draws on traditions and philosophies among members of the 1970s hacker culture and academia, the movement was formally founded by Richard Stallman in 1983 with the launch of the GNU Project. Stallman established the Free Software Foundation in 1985 to support the movement. The philosophy of the movement is that the use of computers should not lead to people being prevented from cooperating with each other. In practice, this means rejecting "proprietary software", which imposes such restrictions, and promoting free software, with the ultimate goal of liberating everyone in cyberspace – that is, every computer user. Stallman notes that this action will promote rather than hinder the progression of technology, since "it means that much wasteful duplication of system programming effort will be avoided.
This effort can go instead into advancing the state of the art". Members of the free software movement believe that all users of software should have the freedoms listed in The Free Software Definition. Many of them hold that it is immoral to prohibit or prevent people from exercising these freedoms, and that these freedoms are required to create a decent society where software users can help each other and have control over their computers. Some free software users and programmers do not believe that proprietary software is immoral, citing an increased profitability in the business models available for proprietary software, or technical features and convenience, as their reasons. "While social change may occur as an unintended by-product of technological change, advocates of new technologies have promoted them as instruments of positive social change." This quote by San Jose State professor Joel West explains much of the philosophy, or the reason, that the free software movement is alive. If it is assumed that social change is not only affected but, in some points of view, directed by the advancement of technology, is it ethical to withhold these technologies from certain people?
Even if it does not make a direct change, the movement exists to raise awareness about the effects that the technologies around us have. A computer, for instance, allows us many more freedoms than we have without one, but should these technological mediums confer implied freedoms or selective privileges? The debate over the morality of both sides of the free software movement is a difficult one to reconcile. The Free Software Foundation believes all software needs free documentation, in particular because conscientious programmers should be able to update manuals to reflect modifications that they have made to the software, but it deems the freedom to modify less important for other types of written works. Within the free software movement, the FLOSS Manuals foundation specialises in the goal of providing such documentation. Members of the free software movement advocate that works which serve a practical purpose should be free, and the core work of the free software movement has focused on software development.
Members of the free software movement reject proprietary software, refusing to install software that does not give them the freedoms of free software. According to Stallman, "The only thing in the software field, worse than an unauthorised copy of a proprietary program, is an authorised copy of the proprietary program because this does the same harm to its whole community of users, in addition the developer, the perpetrator of this evil, profits from it." Some supporters of the free software movement take up public speaking, or host a stall at software-related conferences, to raise awareness of software freedom. This is seen as important since people who receive free software, but who are not aware that it is free software, will later accept a non-free replacement or will add software that is not free. Margaret S. Elliot, a researcher at the Institute for Software Research at the University of California, Irvine, not only outlines many benefits that could come from a free software movement but also claims that it is inherently necessary to give every person equal opportunity to utilize the Internet, assuming that the computer is globally accessible.
Since the world has become more based in the framework of technology and its advancement, creating a selective internet that allows only some to surf the web is nonsensical, according to Elliot. If there is a desire to live in a more coexistent world, benefiting from communication and global assistance, free software is a goal to strive for, according to many scholars who promote awareness about the free software movement. The ideas sparked by the GNU associates are an attempt to promote a "cooperative environment" that understands the benefits of having a local community and a global community. A lot of lobbying work has been done against software patents and expansions of copyright law. Other lobbying focuses directly on the use of free software by government agencies and government-funded projects. The Venezuelan government implemented a free software law in January 2006: Decree No. 3,390 mandated all government agencies to migrate to free software over a two-year period. Congressmen Edgar David Villanueva and Jacques Rodrich Ackerman have been instrumental in introducing free software in Peru, with bill 1609 on "Free Software in Public Administration".
The incident attracted the attention of Microsoft, whose general manager wrote a letter to Villanueva. His response received worldwide attention and is seen as a classic piece of argumentation in favour of the use of free software in government.
GNU/Linux naming controversy
The GNU/Linux naming controversy is a dispute between members of the free software community and the open-source software community over whether to refer to computer operating systems that use a combination of GNU software and the Linux kernel as "GNU/Linux" or simply "Linux". Proponents of the term Linux argue that it is far more commonly used by the public and the media, and that it serves as a generic term for systems that combine that kernel with software from multiple other sources. Proponents of the term GNU/Linux note that GNU alone would be just as good a name for GNU variants which combine the GNU operating system software with software from other sources. GNU/Linux is a term promoted by the GNU Project's founder, Richard Stallman. Its proponents call for the use of the longer term on the grounds that "Linux" alone fails to give credit to the major contributor and to the associated free software philosophy. GNU is a longstanding project, begun in 1984, to develop a free operating system; it is argued that, when the Linux kernel was independently created in 1991, it provided a substantial missing piece.
Several distributions employ the FSF-endorsed name, such as Debian and Parabola GNU/Linux-libre. In 1983, Richard Stallman, founder of the Free Software Foundation, set forth plans for a complete Unix-like operating system, called GNU, composed of free software. In September of that year, Stallman publicly announced the project and outlined his vision of free software; the GNU Manifesto itself was published in Dr. Dobb's Journal in March 1985. Software development work began in January 1984. By 1991, the mid-level portions of the GNU operating system were complete, and the upper level could be supplied by the X Window System, but the lower level was still lacking. The GNU Project's own kernel was called GNU Hurd; the Hurd followed an ambitious design which proved unexpectedly difficult to implement and has only been marginally usable. Independently, in 1991, Linus Torvalds released the first version of the Linux kernel. Early Linux developers ported GNU code, including the GNU C Compiler, to run on Linux, and the free software community adopted the Linux kernel as the missing kernel for the GNU operating system.
This work filled the remaining gaps in providing a free operating system. Over the next few years, several suggestions arose for naming operating systems using the Linux kernel and GNU components. In 1992, the Yggdrasil Linux distribution adopted the name "Linux/GNU/X". In Usenet and mailing-list discussions, one can find usages of "GNU/Linux" as early as 1992 and of "GNU+Linux" as early as 1993. The Debian project, at one time sponsored by the Free Software Foundation, switched to calling its product "Debian GNU/Linux" in early 1994. GNU's June 1994 Bulletin describes "Linux" as a "free Unix system for 386 machines", but the January 1995 Bulletin switched to the term "GNU/Linux" instead. Stallman's and the FSF's efforts to include "GNU" in the name started around 1994, but were mostly conducted via private communications until 1996. In May 1996, Stallman released Emacs 19.31 with the Autoconf system target "linux" changed to "lignux", and included an essay, "Linux and the GNU system", suggesting that people use the term "Linux-based GNU system".
He used "GNU/Linux" and the essay was superseded by Stallman's 1997 essay, "Linux and the GNU project". Modern free software and Open-source software systems are composed of software by many different authors, including the Linux kernel developers, the GNU project, other vendors such as those behind the X Window System. Desktop- and server-based distributions use GNU components such as the GNU C Library, GNU Core Utilities, bash. In a 2002 analysis of the source code for Red Hat Linux 7.1, a typical Linux distribution, the total size of the packages from the GNU project was found to be much larger than the Linux kernel. A 2011 analysis of Ubuntu's "Natty" release main repository found that 8% to 13% of it consisted of GNU components, while only 6% is taken by the Linux kernel. Determining what constitutes the "operating system" per se is a matter of continuing debate. On the other hand, some embedded systems, such as handheld devices and smartphones, residential gateways, Voice over IP devices, are engineered with space efficiency in mind and use a Linux kernel with few or no components of GNU.
A system running μClinux, for instance, may substitute uClibc for glibc and BusyBox for Coreutils. Google's Linux-based Android operating system does not use any GNU components or libraries, replacing glibc with Google's own BSD-based Bionic C library; for systems such as these, the FSF agrees that "GNU/Linux" is not an appropriate name. There are also systems that use a GNU userspace and/or C library on top of a non-Linux kernel, for example Debian GNU/Hurd or Debian GNU/kFreeBSD. The FSF justifies the name "GNU/Linux" on the grounds that the GNU Project was developing a complete system, of which, they argue, the Linux kernel filled one of the final gaps.