Free and open-source software
Free and open-source software (FOSS) is software that can be classified as both free software and open-source software. That is, anyone is freely licensed to use, copy, and change the software in any way, and the source code is openly shared so that people are encouraged to voluntarily improve its design. This is in contrast to proprietary software, where the software is under restrictive copyright licensing and the source code is hidden from the users. FOSS maintains the software user's civil liberty rights. Other benefits of using FOSS can include decreased software costs, increased security and stability, protection of privacy, and more user control over their own hardware. Free and open-source operating systems such as Linux and descendants of BSD are widely used today, powering millions of servers, desktops and other devices. Free-software licenses and open-source licenses are used by many software packages; the free-software movement and the open-source-software movement are online social movements behind the widespread production and adoption of FOSS.
"Free and open-source software" is an umbrella term for software, considered both Free software and open-source software. FOSS allows the user to inspect the source code and provides a high level of control of the software's functions compared to proprietary software; the term "free software" does not refer to the monetary cost of the software at all, but rather whether the license maintains the software user's civil liberties. There are a number of related terms and abbreviations for free and open-source software, or free/libre and open-source software. Although there is a complete overlap between free-software licenses and open-source-software licenses, there is a strong philosophical disagreement between the advocates of these two positions; the terminology of FOSS or "Free and Open-source software" was created to be a neutral on these philosophical disagreements between the FSF and OSI and have a single unified term that could refer to both concepts. As the Free Software Foundation explains the philosophical difference between free software and open-source software: "The two terms describe the same category of software, but they stand for views based on fundamentally different values.
Open source is a development methodology; free software is a social movement. For the free-software movement, free software is an ethical imperative, essential respect for the users' freedom. By contrast, the philosophy of open source considers issues in terms of how to make software “better”—in a practical sense only." In parallel to this, the Open Source Initiative considers many free-software licenses to be open-source. These include the latest versions of the FSF's three main licenses: the GNU General Public License (GPL), the GNU Lesser General Public License (LGPL), and the GNU Affero General Public License (AGPL). Richard Stallman's Free Software Definition, adopted by the Free Software Foundation, defines free software as a matter of liberty, not price, and it upholds the Four Essential Freedoms. The earliest-known publication of the definition of his free-software idea was in the February 1986 edition of the FSF's now-discontinued GNU's Bulletin publication. The canonical source for the document is in the philosophy section of the GNU Project website; as of August 2017, it is published there in 40 languages.
To meet the definition of "free software", the FSF requires that the software's licensing respect the civil liberties / human rights of what the FSF calls the software user's "Four Essential Freedoms":
- The freedom to run the program as you wish, for any purpose.
- The freedom to study how the program works, and change it so it does your computing as you wish. Access to the source code is a precondition for this.
- The freedom to redistribute copies so you can help others.
- The freedom to distribute copies of your modified versions to others. By doing this you can give the whole community a chance to benefit from your changes. Access to the source code is a precondition for this.
The open-source-software definition is used by the Open Source Initiative to determine whether a software license qualifies for the organization's insignia for open-source software. The definition was based on the Debian Free Software Guidelines and adapted by Bruce Perens. Perens did not base his writing on the Four Essential Freedoms of free software from the Free Software Foundation, which were only later available on the web.
Perens subsequently stated that he felt Eric Raymond's promotion of open source unfairly overshadowed the Free Software Foundation's efforts, and reaffirmed his support for free software. In the 2000s, he spoke about open source again. From the 1950s through the 1980s, it was common for computer users to have the source code for all programs they used, and the permission and ability to modify it for their own use. Software, including source code, was commonly shared by individuals who used computers, often as public-domain software. Most companies had a business model based on hardware sales, and provided or bundled software with hardware free of charge. By the late 1960s, the prevailing business model around software was changing. A growing and evolving software industry was competing with the hardware manufacturers' bundled software products. Leased machines required software support while providing no revenue for software.
Operating system
An operating system (OS) is system software that manages computer hardware and software resources and provides common services for computer programs. Time-sharing operating systems schedule tasks for efficient use of the system and may include accounting software for cost allocation of processor time, mass storage and other resources. For hardware functions such as input and output and memory allocation, the operating system acts as an intermediary between programs and the computer hardware, although the application code is usually executed directly by the hardware and makes system calls to an OS function or is interrupted by it. Operating systems are found on many devices that contain a computer – from cellular phones and video game consoles to web servers and supercomputers. The dominant desktop operating system is Microsoft Windows with a market share of around 82.74%. MacOS by Apple Inc. is in second place, and the varieties of Linux are collectively in third place. In the mobile sector (including smartphones and tablets), Google's Android accounted for up to 70% of use in 2017. According to third-quarter 2016 data, Android on smartphones is dominant with 87.5 percent and a growth rate of 10.3 percent per year, followed by Apple's iOS with 12.1 percent and a 5.2 percent per-year decrease in market share, while other operating systems amount to just 0.3 percent.
Linux distributions are dominant in the server and supercomputing sectors. Other specialized classes of operating systems, such as embedded and real-time systems, exist for many applications. A single-tasking system can only run one program at a time, while a multi-tasking operating system allows more than one program to be running concurrently. This is achieved by time-sharing, where the available processor time is divided between multiple processes. These processes are each interrupted in time slices by a task-scheduling subsystem of the operating system. Multi-tasking may be characterized in preemptive and co-operative types. In preemptive multitasking, the operating system slices the CPU time and dedicates a slot to each of the programs. Unix-like operating systems, such as Solaris and Linux—as well as non-Unix-like, such as AmigaOS—support preemptive multitasking. Cooperative multitasking is achieved by relying on each process to provide time to the other processes in a defined manner. 16-bit versions of Microsoft Windows used cooperative multi-tasking.
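The following is a minimal sketch of the cooperative style just described, using Python generators; the task names and the round-robin scheduler are purely illustrative and not taken from any real operating system. Each task voluntarily yields control, so a task that never yields would starve the others, which is exactly the weakness preemptive time-slicing avoids.

```python
# A minimal sketch of cooperative multitasking using Python generators.
# Each "task" yields control voluntarily; the scheduler simply cycles
# through the runnable tasks in round-robin order.
from collections import deque

def task(name, steps):
    for i in range(steps):
        print(f"{name}: step {i}")
        yield  # voluntarily hand control back to the scheduler

def run(tasks):
    ready = deque(tasks)
    while ready:
        current = ready.popleft()
        try:
            next(current)          # let the task run until it yields
            ready.append(current)  # still runnable: back of the queue
        except StopIteration:
            pass                   # task finished; drop it

run([task("A", 3), task("B", 2)])  # interleaves A and B until both finish
```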
32-bit versions of both Windows NT and Win9x used preemptive multi-tasking. Single-user operating systems have no facilities to distinguish users, but may allow multiple programs to run in tandem. A multi-user operating system extends the basic concept of multi-tasking with facilities that identify processes and resources, such as disk space, belonging to multiple users, and the system permits multiple users to interact with the system at the same time. Time-sharing operating systems schedule tasks for efficient use of the system and may include accounting software for cost allocation of processor time, mass storage and other resources to multiple users. A distributed operating system manages a group of distinct computers and makes them appear to be a single computer; the development of networked computers that could be linked and made to communicate with each other gave rise to distributed computing. Distributed computations are carried out on more than one machine; when computers in a group work in cooperation, they form a distributed system.
In an OS, in the distributed and cloud computing context, templating refers to creating a single virtual machine image as a guest operating system, then saving it as a tool for multiple running virtual machines. The technique is used both in virtualization and cloud computing management, and is common in large server warehouses. Embedded operating systems are designed to be used in embedded computer systems; they are designed to operate on small machines like PDAs with less autonomy. They are able to operate with a limited number of resources, and they are compact and efficient by design. Windows CE and Minix 3 are some examples of embedded operating systems. A real-time operating system is an operating system that guarantees to process events or data by a specific moment in time. A real-time operating system may be single- or multi-tasking, but when multitasking, it uses specialized scheduling algorithms so that a deterministic nature of behavior is achieved. An event-driven system switches between tasks based on their priorities or external events, while time-sharing operating systems switch tasks based on clock interrupts.
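The sketch below illustrates the event-driven, priority-based switching just described, as opposed to clock-driven time-sharing; the scheduler, task names and priority numbering (lower number = more urgent) are invented for illustration and do not reflect any particular real-time OS.

```python
# A sketch of an event-driven scheduler: the most urgent posted task
# always runs next, rather than tasks rotating on clock ticks.
import heapq

class Scheduler:
    def __init__(self):
        self._ready = []  # min-heap of (priority, sequence, callback)
        self._seq = 0     # tie-breaker so equal priorities stay FIFO

    def post(self, priority, callback):
        heapq.heappush(self._ready, (priority, self._seq, callback))
        self._seq += 1

    def run(self):
        while self._ready:
            _priority, _, callback = heapq.heappop(self._ready)
            callback()  # run the most urgent task to completion

sched = Scheduler()
sched.post(2, lambda: print("log sensor reading"))  # least urgent
sched.post(0, lambda: print("handle alarm event"))  # most urgent
sched.post(1, lambda: print("update display"))
sched.run()  # prints alarm, display, sensor reading in priority order
```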
A library operating system is one in which the services that a typical operating system provides, such as networking, are provided in the form of libraries and composed with the application and configuration code to construct a unikernel: a specialized, single-address-space machine image that can be deployed to cloud or embedded environments. Early computers were built to perform a series of single tasks, like a calculator. Basic operating system features were developed in the 1950s, such as resident monitor functions that could automatically run different programs in succession to speed up processing. Operating systems did not exist in their more complex forms until the early 1960s. Hardware features were added that enabled use of runtime libraries and parallel processing; when personal computers became popular in the 1980s, operating systems were made for them, similar in concept to those used on larger computers. In the 1940s, the earliest electronic digital systems had no operating systems.
Electronic systems of this time were programmed on rows of mechanical switches or by jumper wires on plug boards. These were special-purpose systems that, for example, generated ballistics tables for the military or controlled the printing of payroll checks from data on punched paper cards.
Linux
Linux is a family of free and open-source operating systems based on the Linux kernel, an operating system kernel first released on September 17, 1991 by Linus Torvalds. Linux is typically packaged in a Linux distribution. Distributions include the Linux kernel and supporting system software and libraries, many of which are provided by the GNU Project. Many Linux distributions use the word "Linux" in their name, but the Free Software Foundation uses the name GNU/Linux to emphasize the importance of GNU software, causing some controversy. Popular Linux distributions include Debian and Ubuntu. Commercial distributions include SUSE Linux Enterprise Server. Desktop Linux distributions include a windowing system such as X11 or Wayland and a desktop environment such as GNOME or KDE Plasma. Distributions intended for servers may omit graphics altogether or include a solution stack such as LAMP. Because Linux is freely redistributable, anyone may create a distribution for any purpose. Linux was originally developed for personal computers based on the Intel x86 architecture, but has since been ported to more platforms than any other operating system.
Linux is the leading operating system on servers and other big iron systems such as mainframe computers, and the only OS used on TOP500 supercomputers. It is used by around 2.3 percent of desktop computers. The Chromebook, which runs the Linux kernel-based Chrome OS, dominates the US K–12 education market and represents nearly 20 percent of sub-$300 notebook sales in the US. Linux also runs on embedded systems, i.e. devices whose operating system is built into the firmware and is tailored to the system. This includes routers, automation controls, digital video recorders, video game consoles and smartwatches. Many smartphones and tablet computers run Android and other Linux derivatives; because of the dominance of Android on smartphones, Linux has the largest installed base of all general-purpose operating systems. Linux is one of the most prominent examples of open-source software collaboration; the source code may be used, modified and distributed—commercially or non-commercially—by anyone under the terms of its respective licenses, such as the GNU General Public License.
The Unix operating system was conceived and implemented in 1969 at AT&T's Bell Laboratories in the United States by Ken Thompson, Dennis Ritchie, Douglas McIlroy, and Joe Ossanna. First released in 1971, Unix was written in assembly language, as was common practice at the time. In a key pioneering approach in 1973, it was rewritten in the C programming language by Dennis Ritchie; the availability of a high-level language implementation of Unix made its porting to different computer platforms easier. Due to an earlier antitrust case forbidding it from entering the computer business, AT&T was required to license the operating system's source code to anyone who asked; as a result, Unix grew and became adopted by academic institutions and businesses. In 1984, AT&T divested itself of Bell Labs; the GNU Project, started in 1983 by Richard Stallman, had the goal of creating a "complete Unix-compatible software system" composed of free software. Work began in 1984. In 1985, Stallman started the Free Software Foundation and wrote the GNU General Public License in 1989.
By the early 1990s, many of the programs required in an operating system were completed, although low-level elements such as device drivers and the kernel, called GNU Hurd, were stalled and incomplete. Linus Torvalds has stated that if the GNU kernel had been available at the time, he would not have decided to write his own. Although not released until 1992, due to legal complications, development of 386BSD, from which NetBSD, OpenBSD and FreeBSD descended, predated that of Linux. Torvalds has also stated that if 386BSD had been available at the time, he would not have created Linux. MINIX was created by Andrew S. Tanenbaum, a computer science professor, and released in 1987 as a minimal Unix-like operating system targeted at students and others who wanted to learn operating system principles. Although the complete source code of MINIX was available, the licensing terms prevented it from being free software until the licensing changed in April 2000. In 1991, while attending the University of Helsinki, Torvalds became curious about operating systems.
Frustrated by the licensing of MINIX, which at the time limited it to educational use only, he began to work on his own operating system kernel, which became the Linux kernel. Torvalds began the development of the Linux kernel on MINIX, and applications written for MINIX were also used on Linux. Later, Linux matured and further Linux kernel development took place on Linux systems. GNU applications replaced all MINIX components, because it was advantageous to use the freely available code from the GNU Project with the fledgling operating system. Torvalds initiated a switch from his original license, which prohibited commercial redistribution, to the GNU GPL. Developers worked to integrate GNU components with the Linux kernel, making a functional and free operating system. Linus Torvalds had wanted to call his invention "Freax", a portmanteau of "free", "freak", and "x" (as an allusion to Unix).
Autocomplete
Autocomplete, or word completion, is a feature in which an application predicts the rest of a word a user is typing. In graphical user interfaces, users can typically press the tab key to accept a suggestion or the down arrow key to accept one of several. Autocomplete speeds up human-computer interactions when it correctly predicts the word a user intends to enter after only a few characters have been typed into a text input field. It works best in domains with a limited number of possible words, when some words are much more common, or when writing structured and predictable text. Many autocomplete algorithms learn new words after the user has written them a few times, and can suggest alternatives based on the learned habits of the individual user. The original purpose of word prediction software was to help people with physical disabilities increase their typing speed, as well as to help them decrease the number of keystrokes needed in order to complete a word or a sentence. The need to increase speed is noted by the fact that people who use speech-generating devices produce speech at a rate less than 10% as fast as people who use oral speech.
But the function is also very useful for anybody who writes text, especially people–such as medical doctors–who use long, hard-to-spell terminology that may be technical or medical in nature. Autocomplete or word completion works so that when the writer writes the first letter or letters of a word, the program predicts one or more possible words as choices. If the word the writer intends to write is included in the list, he can select it, for example by using the number keys. If the word that the user wants is not predicted, the writer must enter the next letter of the word. At this time, the word choice is altered so that the words provided begin with the same letters as those that have been entered; when the word that the user wants appears, it is selected and the word is inserted into the text. In another form of word prediction, words most likely to follow the just-written one are predicted, based on recent word pairs used. Word prediction uses language modeling, where within a set vocabulary the words most likely to occur are calculated.
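The following is a minimal sketch of the behavior just described: candidate words are narrowed by the typed prefix and ranked by how often the user has written them. The vocabulary and its counts are invented for illustration; real systems learn them from the user's writing.

```python
# Prefix-based word completion with frequency ranking (illustrative).
from bisect import bisect_left

vocabulary = {  # word -> observed usage count (a crude usage model)
    "the": 120, "their": 40, "them": 55, "theory": 8, "therapy": 12,
}
sorted_words = sorted(vocabulary)

def complete(prefix, limit=3):
    """Return up to `limit` words starting with `prefix`, most frequent first."""
    start = bisect_left(sorted_words, prefix)  # first candidate in sorted order
    matches = []
    for word in sorted_words[start:]:
        if not word.startswith(prefix):
            break  # sorted order: no later word can share the prefix
        matches.append(word)
    return sorted(matches, key=lambda w: -vocabulary[w])[:limit]

print(complete("the"))   # ['the', 'them', 'their']
print(complete("ther"))  # ['therapy']
```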
Along with language modeling, basic word prediction on AAC devices is coupled with a recency model, where words that are used more often by the AAC user are more likely to be predicted. Word prediction software also allows the user to enter their own words into the word prediction dictionaries either directly, or by "learning" words that have been written. Some search returns related to genitals or other vulgar terms are omitted from autocompletion technologies, as are morbid terms. There are standalone tools that add autocomplete functionality to existing applications. These programs monitor user keystrokes and suggest a list of words based on the first typed letters. Examples are LetMeType and TypingAid. LetMeType, freeware, is no longer developed; the author has published the source code and allows anybody to continue development. TypingAid, freeware, is actively developed. IntelliComplete, available in both a freeware and a payware version, works only in certain programs which hook into the IntelliComplete server program. Many autocomplete programs can also be used to create a shorthand list.
The original autocomplete software was Smartype, which dates back to the late 1980s and is still available today. It was developed for medical transcriptionists working in WordPerfect for MS-DOS, but it now functions for any application in any Windows or Web-based program. Shorthand, also called Autoreplace, is a related feature that involves automatic replacement of a particular string with another one, usually one that is longer and harder to type, such as "myname" with "Lee John Nikolai François Al Rahman"; this can also quietly fix simple typing errors, such as turning "teh" into "the". Several autocomplete programs, standalone or integrated in text editors, based on word lists, also include a shorthand function for frequently used phrases. Context completion is a text editor feature, similar to word completion, which completes words based on the current context and context of other similar words within the same document, or within some training data set; the main advantage of context completion is the ability to predict anticipated words more precisely and even with no initial letters.
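A toy sketch of the Autoreplace/shorthand behavior described above follows; the replacement table entries are the examples from the text, and the whole-word matching strategy is an assumption for illustration (real tools typically trigger on a word delimiter as it is typed).

```python
# Shorthand/Autoreplace sketch: expand abbreviations and fix common typos.
replacements = {
    "teh": "the",                                     # quiet typo fix
    "myname": "Lee John Nikolai François Al Rahman",  # shorthand expansion
}

def autoreplace(text):
    # Replace each whitespace-delimited word found in the table.
    return " ".join(replacements.get(word, word) for word in text.split())

print(autoreplace("teh report was signed by myname"))
# -> "the report was signed by Lee John Nikolai François Al Rahman"
```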
The main disadvantage is the need for a training data set, which is typically larger for context completion than for simpler word completion. The most common use of context completion is seen in advanced programming language editors and IDEs, where a training data set is inherently available and context completion makes more sense to the user than broad word completion would. Line completion is a type of context completion, first introduced by Juraj Simlovic in TED Notepad in July 2006. The context in line completion is the current line, while the current document poses as the training data set. When the user begins a line which starts with a frequently used phrase, the editor automatically completes it, up to the position where similar lines differ, or proposes a list of common continuations; a sketch of this appears after the next paragraph. Action completion tools are standalone tools that add autocomplete functionality to an existing application or all existing applications of an OS, based on the current context; the main advantage of action completion is the ability to predict anticipated actions.
The main disadvantage is the need for a data set. The most common use of action completion is seen in IDEs, but there are action completion tools that work globally, in parallel, across all applications of the entire PC without hindering the action completion of the respective applications.
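As an illustration of the line completion technique described above, the sketch below treats the current document's lines as the training data and completes a typed line prefix only as far as all matching lines agree; the sample document lines are invented.

```python
# Line completion sketch: the document itself is the training data set.
import os

def matching_lines(document_lines, typed_prefix):
    # Candidate completions: document lines that begin with the typed prefix.
    return [line for line in document_lines
            if line.startswith(typed_prefix) and line != typed_prefix]

def common_continuation(candidates, typed_prefix):
    # Complete only up to the position where the matching lines diverge.
    shared = os.path.commonprefix(candidates)
    return shared[len(typed_prefix):]

doc = ["for (int i = 0; i < n; i++) {",
       "for (int i = 0; i < m; i++) {",
       "return result;"]
typed = "for (int i = 0; "
print(common_continuation(matching_lines(doc, typed), typed))  # -> "i < "
```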
Software bug
A software bug is an error, failure or fault in a computer program or system that causes it to produce an incorrect or unexpected result, or to behave in unintended ways. The process of finding and fixing bugs is termed "debugging" and often uses formal techniques or tools to pinpoint bugs. Since the 1950s, some computer systems have been designed to deter, detect or auto-correct various computer bugs during operations. Most bugs arise from mistakes and errors made in either a program's source code or its design, or in components and operating systems used by such programs. A few are caused by compilers producing incorrect code. A program that contains a large number of bugs, and/or bugs that interfere with its functionality, is said to be buggy. Bugs can trigger errors. Bugs may cause the program to crash or freeze the computer. Other bugs qualify as security bugs and might, for example, enable a malicious user to bypass access controls in order to obtain unauthorized privileges. Some software bugs have been linked to disasters.
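As a minimal illustration of the kind of source-code mistake described above, the sketch below shows a classic off-by-one error: the program runs without crashing but silently produces a wrong result. The function and values are invented for illustration.

```python
# A classic off-by-one bug: the intent is to sum the first n elements,
# but the loop below skips the last one, giving an incorrect result.
def sum_first_n(values, n):
    total = 0
    for i in range(n - 1):  # BUG: should be range(n)
        total += values[i]
    return total

print(sum_first_n([1, 2, 3, 4], 4))  # prints 6, but 10 was expected
```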
Bugs in code that controlled the Therac-25 radiation therapy machine were directly responsible for patient deaths in the 1980s. In 1996, the European Space Agency's US$1 billion prototype Ariane 5 rocket had to be destroyed less than a minute after launch due to a bug in the on-board guidance computer program. In June 1994, a Royal Air Force Chinook helicopter crashed into the Mull of Kintyre, killing all 29 aboard; this was initially dismissed as pilot error, but an investigation by Computer Weekly convinced a House of Lords inquiry that it may have been caused by a software bug in the aircraft's engine-control computer. In 2002, a study commissioned by the US Department of Commerce's National Institute of Standards and Technology concluded that "software bugs, or errors, are so prevalent and so detrimental that they cost the US economy an estimated $59 billion annually, or about 0.6 percent of the gross domestic product". The term "bug" to describe defects has been a part of engineering jargon since the 1870s and predates electronic computers and computer software.
For instance, Thomas Edison wrote the following words in a letter to an associate in 1878: "It has been just so in all of my inventions. The first step is an intuition, and comes with a burst, then difficulties arise—this thing gives out and [it is] then that 'Bugs'—as such little faults and difficulties are called—show themselves and months of intense watching, study and labor are requisite before commercial success or failure is certainly reached." The Middle English word bugge is the basis for the terms "bugbear" and "bugaboo" as terms used for a monster. Baffle Ball, the first mechanical pinball game, was advertised as being "free of bugs" in 1931. Problems with military gear during World War II were referred to as bugs. In a book published in 1942, Louise Dickinson Rich, speaking of a powered ice cutting machine, said, "Ice sawing was suspended until the creator could be brought in to take the bugs out of his darling." Isaac Asimov used the term "bug" to relate to issues with a robot in his short story "Catch That Rabbit", published in 1944.
The term "bug" was used in an account by computer pioneer Grace Hopper, who publicized the cause of a malfunction in an early electromechanical computer. A typical version of the story is: In 1946, when Hopper was released from active duty, she joined the Harvard Faculty at the Computation Laboratory where she continued her work on the Mark II and Mark III. Operators traced an error in the Mark II to a moth trapped in a relay; this bug was removed and taped to the log book. Stemming from the first bug, today we call errors or glitches in a program a bug. Hopper did not find the bug, as she acknowledged; the date in the log book was September 9, 1947. The operators who found it, including William "Bill" Burke of the Naval Weapons Laboratory, Virginia, were familiar with the engineering term and amusedly kept the insect with the notation "First actual case of bug being found." Hopper loved to recount the story. This log book, complete with attached moth, is part of the collection of the Smithsonian National Museum of American History.
The related term "debug" appears to predate its usage in computing: the Oxford English Dictionary's etymology of the word contains an attestation from 1945, in the context of aircraft engines. The concept that software might contain errors dates back to Ada Lovelace's 1843 notes on the analytical engine, in which she speaks of the possibility of program "cards" for Charles Babbage's analytical engine being erroneous:... an analysing process must have been performed in order to furnish the Analytical Engine with the necessary operative data. Granted that the actual mechanism is unerring in its processes, the cards may give it wrong orders; the first documented use of the term "bug" for a technical malfunction was by Thomas Edison. The Open Technology Institute, run by the group, New America, released a report "Bugs in the System" in August 2016 stating that U. S. policymakers should make reforms to help researchers address software bugs. The report "highlights the need for reform in the field of software vulnerability discovery and disclosure."
One of the report's authors said that Congress has not done enough to address cyber software vulnerability, though Congress has passed a number of bills to combat the larger issue of cyber security. Government researchers and cyber security experts are the people who typically discover software flaws.
Terminal emulator
A terminal emulator, terminal application, or term, is a program that emulates a video terminal within some other display architecture. Though typically synonymous with a shell or text terminal, the term terminal covers all remote terminals, including graphical interfaces. A terminal emulator inside a graphical user interface is often called a terminal window. A terminal window allows the user access to a text terminal and all its applications, such as command-line interfaces and text user interface applications; these may be running either on the same machine or on a different one via telnet, ssh, or dial-up. On Unix-like operating systems, it is common to have one or more terminal windows connected to the local machine. Terminals support a set of escape sequences for controlling color, cursor position, etc. Examples include the family of terminal control sequence standards known as ECMA-48, ANSI X3.64 or ISO/IEC 6429. Terminal emulators may implement a local echo function, which may erroneously be named "half-duplex", or, still more incorrectly, "echoplex".
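The snippet below demonstrates a few of the ECMA-48/ANSI escape sequences mentioned above, as honored by most terminal emulators; the specific sequences shown (SGR color codes and cursor positioning) are standard, though exact rendering varies by emulator.

```python
# ECMA-48/ANSI escape sequences: CSI sequences begin with ESC '['.
CSI = "\x1b["  # Control Sequence Introducer

print(CSI + "31m" + "error: disk full" + CSI + "0m")  # red text, then reset
print(CSI + "2J" + CSI + "1;1H", end="")  # clear screen, cursor to row 1, col 1
print("top-left corner")
```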
Terminal emulators may implement local editing, also known as "line-at-a-time mode". This is also mistakenly referred to as "half-duplex". In this mode, the terminal emulator only sends complete lines of input to the host system; the user enters and edits a line, but it is held locally within the terminal emulator as it is being edited. It is not transmitted until the user signals its completion with the ↵ Enter key on the keyboard or a "send" button of some sort in the user interface. At that point, the entire line is transmitted. Line-at-a-time mode implies local echo, since otherwise the user will not be able to see the line as it is being edited and constructed. However, line-at-a-time mode does not require local echo; when entering a password, for example, line-at-a-time entry with local editing is possible, but local echo is turned off. The complexities of line-at-a-time mode are exemplified by the line-at-a-time mode option in the telnet protocol. To implement it correctly, the Network Virtual Terminal implementation provided by the terminal emulator program must be capable of recognizing and properly dealing with "interrupt" and "abort" events that arrive in the middle of locally editing a line.
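A minimal illustration of the echo distinction described above: in the sketch below, both prompts gather a whole line before the program sees it, but for the password the terminal's echo is suppressed so the typed characters never appear on screen. This uses Python's standard getpass module as a stand-in for the terminal-level mechanism.

```python
# Line-at-a-time entry with and without local echo.
import getpass

user = input("login: ")                 # echo on: typing is visible
secret = getpass.getpass("password: ")  # echo off; line sent on Enter
print(f"read {len(secret)} characters for {user!r}")
```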
In asynchronous terminals, data can flow in any direction at any time. In synchronous terminals, a protocol controls who may send data when. IBM 3270-based terminals used with IBM mainframe computers are an example of synchronous terminals; they operate in a "screen-at-a-time" mode. Users can make numerous changes to a page before submitting the updated screen to the remote machine as a single action. Terminal emulators that simulate the 3270 protocol are available for most operating systems, for use both by those administering systems such as the z9, as well as those using the corresponding applications such as CICS. Other examples of synchronous terminals include the IBM 5250, ICL 7561, Honeywell Bull VIP7800 and Hewlett-Packard 700/92. Many terminal emulators have been developed for terminals such as VT52, VT100, VT220, VT320, IBM 3270/8/9/E, IBM 5250, IBM 3179G, Data General D211, Hewlett Packard HP700/92, Sperry/Unisys 2000-series UTS60, Burroughs/Unisys A-series T27/TD830/ET1100, ADDS ViewPoint, Sun console, QNX, AT386, SCO-ANSI, SNI 97801 and Wyse 50/60.
Additionally, programs have been developed to emulate other terminal emulators, such as xterm and assorted console terminals. Some emulators refer to a standard, such as ANSI. Such programs are available on many platforms ranging from DOS and Unix to Windows and macOS to embedded operating systems found in cellphones and industrial hardware.
Fred Fish
Fred Fish was a computer programmer notable for work on the GNU Debugger and for his series of Fish disks of freeware for the Amiga. His pioneering spirit was pervasive in the Amiga community; the Fish Disks became a sort of early postal distribution system. Fish would send his disks around the world in time for regional and local user group meetings, whose members in turn duplicated them for local consumption. Only the cost of materials changed hands. The Fish Disk series ran from 1986 to 1994; in it, one can chart the growing sophistication of Amiga software and see the emergence of many software trends. The Fish Disks were distributed at Commodore Amiga enthusiast clubs. Contributors submitted applications and source code, and the best of these each month were assembled and released as a diskette. Since the Internet was not yet in popular usage outside military and university circles, this was a primary way for enthusiasts to share work and ideas. He also initiated the "GeekGadgets" project, a GNU standard environment for AmigaOS and BeOS.
Fish worked for Cygnus Solutions in the 1990s before he left for Be Inc. in 1998. In 1978, he self-published User Survival Guide for TI-58/59 Master Library, advertised in enthusiast newsletters covering the TI-59 programmable calculator. Fred Fish was married to Michelle Fish at the time of his death. He lived in Phoenix with his wife until he pursued his dream of living aboard a boat at sea. He and his wife lived at sea for two years with their two dogs, one of them named Thor. Living space was small, and they were cut off from most things. When they decided their two-year boat voyage was done, they pursued Michelle's dream of living in a "cabin"-like home, and moved to Idaho. Fred Fish died of a heart attack at his home in Idaho in April 2007. His widow lives in Phoenix, running a photography business, Fish Eye Photography. He had one biological son from a previous relationship, and two adopted fraternal twin sons, one of them named Adam, from a previous relationship of his wife Michelle.