Open-source software is a type of computer software in which source code is released under a license that grants users the rights to study, change, and distribute the software to anyone and for any purpose. Open-source software may be developed in a collaborative public manner, and it is a prominent example of open collaboration. Open-source development can generate a more diverse scope of design perspectives than any single company is capable of developing and sustaining long term. A 2008 report by the Standish Group stated that adoption of open-source software models had resulted in savings of about $60 billion per year for consumers. In the early days of computing, developers shared software in order to learn from each other and evolve the field; the open-source notion fell by the wayside with the commercialization of software during the 1970s and 1980s. However, academics still developed software collaboratively: for example, Donald Knuth with the TeX typesetting system in 1979, and Richard Stallman with the GNU operating system in 1983.
In 1997, Eric Raymond published The Cathedral and the Bazaar, a reflective analysis of the hacker community and free-software principles. The paper received significant attention in early 1998 and was one factor in motivating Netscape Communications Corporation to release its popular Netscape Communicator Internet suite as free software; this source code subsequently became the basis for SeaMonkey, Mozilla Firefox, and KompoZer. Netscape's act prompted Raymond and others to look into how to bring the Free Software Foundation's free-software ideas and perceived benefits to the commercial software industry. They concluded that the FSF's social activism was not appealing to companies like Netscape and looked for a way to rebrand the free-software movement to emphasize the business potential of sharing and collaborating on software source code. The new term they chose was "open source", which was soon adopted by Bruce Perens, publisher Tim O'Reilly, Linus Torvalds, and others. The Open Source Initiative was founded in February 1998 to encourage the use of the new term and evangelize open-source principles.
While the Open Source Initiative sought to encourage the use of the new term and evangelize the principles it adhered to, commercial software vendors found themselves threatened by the concept of freely distributed software and universal access to an application's source code. A Microsoft executive publicly stated in 2001 that "open source is an intellectual property destroyer. I can't imagine something that could be worse than this for the software business and the intellectual-property business." However, while free and open-source software has historically played a role outside the mainstream of private software development, companies as large as Microsoft have begun to develop official open-source presences on the Internet. IBM, Oracle, and State Farm are just a few of the companies with a serious public stake in today's competitive open-source market, reflecting a significant shift in corporate philosophy concerning the development of FOSS. The free-software movement was launched in 1983; in 1998, a group of individuals advocated that the term "free software" be replaced by "open-source software" as an expression less ambiguous and more comfortable for the corporate world.
Software licenses grant rights to users which would otherwise be reserved by copyright law to the copyright holder. Several open-source software licenses have qualified within the boundaries of the Open Source Definition; the most prominent and popular example is the GNU General Public License, which "allows free distribution under the condition that further developments and applications are put under the same licence", and thus remain free. The "open source" label came out of a strategy session held on April 7, 1998 in Palo Alto in reaction to Netscape's January 1998 announcement of a source-code release for Navigator. The group of individuals at the session included Tim O'Reilly, Linus Torvalds, Tom Paquin, Jamie Zawinski, Larry Wall, Brian Behlendorf, Sameer Parekh, Eric Allman, Greg Olson, Paul Vixie, John Ousterhout, Guido van Rossum, Philip Zimmermann, John Gilmore, and Eric S. Raymond. They used the opportunity before the release of Navigator's source code to clarify a potential confusion caused by the ambiguity of the word "free" in English.
Many people claim that the birth of the Internet in 1969 started the open-source movement, while others do not distinguish between the open-source and free-software movements. The Free Software Foun
Apple Inc. is an American multinational technology company headquartered in Cupertino, California, that designs and sells consumer electronics, computer software, and online services. It is considered one of the Big Four of technology along with Amazon, Google, and Facebook. The company's hardware products include the iPhone smartphone, the iPad tablet computer, the Mac personal computer, the iPod portable media player, the Apple Watch smartwatch, the Apple TV digital media player, and the HomePod smart speaker. Apple's software includes the macOS and iOS operating systems, the iTunes media player, the Safari web browser, and the iLife and iWork creativity and productivity suites, as well as professional applications like Final Cut Pro, Logic Pro, and Xcode. Its online services include the iTunes Store, the iOS App Store, the Mac App Store, Apple Music, Apple TV+, iMessage, and iCloud. Other services include the Apple Store, Genius Bar, AppleCare, Apple Pay, Apple Pay Cash, and Apple Card. Apple was founded by Steve Jobs, Steve Wozniak, and Ronald Wayne in April 1976 to develop and sell Wozniak's Apple I personal computer, though Wayne sold his share back within 12 days.
It was incorporated as Apple Computer, Inc. in January 1977, and sales of its computers, including the Apple II, grew quickly. Within a few years, Jobs and Wozniak had hired a staff of computer designers and had a production line. Apple went public in 1980 to instant financial success. Over the next few years, Apple shipped new computers featuring innovative graphical user interfaces, such as the original Macintosh in 1984, and Apple's marketing advertisements for its products received widespread critical acclaim. However, the high price of its products and a limited application library caused problems, as did power struggles between executives. In 1985, Wozniak departed Apple amicably and remained an honorary employee, while Jobs and others resigned to found NeXT. As the market for personal computers expanded and evolved through the 1990s, Apple lost market share to the lower-priced duopoly of Microsoft Windows on Intel PC clones. The board recruited CEO Gil Amelio for what would be a 500-day charge to rehabilitate the financially troubled company, reshaping it with layoffs, executive restructuring, and a renewed product focus.
In 1997, Amelio led Apple to buy NeXT, solving the company's failed operating-system strategy and bringing Jobs back. Jobs gradually regained a leadership role, becoming CEO in 2000. Apple swiftly returned to profitability under the revitalizing "Think different" campaign, as Jobs rebuilt Apple's status by launching the iMac in 1998, opening the retail chain of Apple Stores in 2001, and acquiring numerous companies to broaden the software portfolio. In January 2007, Jobs renamed the company Apple Inc., reflecting its shifted focus toward consumer electronics, and launched the iPhone to great critical acclaim and financial success. In August 2011, Jobs resigned as CEO due to health complications, and Tim Cook became the new CEO. Two months later, Jobs died, marking the end of an era for the company. Apple is well known for its size and revenues: its worldwide annual revenue totaled $265 billion for the 2018 fiscal year. Apple is the world's largest information technology company by revenue and the world's third-largest mobile phone manufacturer after Samsung and Huawei.
In August 2018, Apple became the first public U.S. company to be valued at over $1 trillion. The company employs 123,000 full-time employees and maintains 504 retail stores in 24 countries as of 2018; it operates the iTunes Store, the world's largest music retailer. As of January 2018, more than 1.3 billion Apple products are in use worldwide. The company enjoys a high level of brand loyalty and is ranked as the world's most valuable brand. However, Apple receives significant criticism regarding the labor practices of its contractors, its environmental practices, and allegedly unethical business practices, including anti-competitive behavior, as well as the origins of source materials. Apple Computer Company was founded on April 1, 1976, by Steve Jobs, Steve Wozniak, and Ronald Wayne. The company's first product was the Apple I, a computer designed and hand-built by Wozniak and first shown to the public at the Homebrew Computer Club. The Apple I was sold as a bare motherboard, a base-kit concept that would not now be marketed as a complete personal computer.
The Apple I went on sale in July 1976 and was market-priced at $666.66. Apple Computer, Inc. was incorporated on January 3, 1977, without Wayne, who had left and sold his share of the company back to Jobs and Wozniak for $800 only twelve days after having co-founded Apple. Multimillionaire Mike Markkula provided essential business expertise and funding of $250,000 during the incorporation of Apple. During the first five years of operations, revenues grew exponentially, doubling about every four months. Between September 1977 and September 1980, yearly sales grew from $775,000 to $118 million, an average annual growth rate of 533%. The Apple II, invented by Wozniak, was introduced on April 16, 1977, at the first West Coast Computer Faire. It differed from its major rivals, the TRS-80 and Commodore PET, in its character-cell-based color graphics and open architecture. While early Apple II models used ordinary cassette tapes as storage devices, these were superseded by the introduction of a 5 1⁄4-inch floppy disk drive and interface called the Disk II.
The Apple II was chosen to be the desktop platform for the first "killer app" of the business world: VisiCalc, a spreadsheet program. VisiCalc created a business market for the Apple II and gave home users an additional reason to buy an Apple II: compatibility with the office. Before VisiCalc, Apple had been a distant third place c
Carnegie Mellon University
Carnegie Mellon University is a private research university based in Pittsburgh, Pennsylvania. Founded in 1900 by Andrew Carnegie as the Carnegie Technical Schools, the university became the Carnegie Institute of Technology in 1912 and began granting four-year degrees. In 1967, the Carnegie Institute of Technology merged with the Mellon Institute of Industrial Research to form Carnegie Mellon University. With its main campus located 3 miles from Downtown Pittsburgh, Carnegie Mellon has grown into an international university with over a dozen degree-granting locations on six continents, including campuses in Qatar and Silicon Valley, and more than 20 research partnerships. The university has seven colleges and independent schools, all of which offer interdisciplinary programs: the College of Engineering, the College of Fine Arts, the Dietrich College of Humanities and Social Sciences, the Mellon College of Science, the Tepper School of Business, the H. John Heinz III College of Information Systems and Public Policy, and the School of Computer Science.
Carnegie Mellon counts 13,961 students from 109 countries, over 105,000 living alumni, and over 5,000 faculty and staff. Past and present faculty and alumni include 20 Nobel Prize laureates, 13 Turing Award winners, 23 Members of the American Academy of Arts and Sciences, 22 Fellows of the American Association for the Advancement of Science, 79 Members of the National Academies, 124 Emmy Award winners, 47 Tony Award laureates, and 10 Academy Award winners. The Carnegie Technical Schools were founded in 1900 in Pittsburgh by the Scottish-American industrialist and philanthropist Andrew Carnegie, who wrote the time-honored words "My heart is in the work" when he donated the funds to create the institution. Carnegie's vision was to open a vocational training school for the sons and daughters of working-class Pittsburghers. In designing his school, Carnegie was inspired by the Pratt Institute in Brooklyn, New York, founded by industrialist Charles Pratt in 1887. In 1912, the institution changed its name to the Carnegie Institute of Technology and began offering four-year degrees.
During this time, CIT consisted of four constituent schools: the School of Fine and Applied Arts, the School of Apprentices and Journeymen, the School of Science and Technology, and the Margaret Morrison Carnegie School for Women. The Mellon Institute of Industrial Research was founded in 1913 by the banker and industrialist brothers Andrew and Richard B. Mellon in honor of their father, Thomas Mellon, the patriarch of the Mellon family. The Institute began as a research organization that performed work for government and industry on contract, and was established as a department within the University of Pittsburgh. In 1927, the Mellon Institute incorporated as an independent nonprofit. In 1938, the Mellon Institute's iconic building was completed and the Institute moved to its new, current location on Fifth Avenue. In 1967, with support from Paul Mellon, Carnegie Tech merged with the Mellon Institute of Industrial Research to become Carnegie Mellon University. Carnegie Mellon's coordinate women's college, the Margaret Morrison Carnegie College, closed in 1973 and merged its academic programs with the rest of the university.
The industrial research mission of the Mellon Institute survived the merger as the Carnegie Mellon Research Institute (CMRI) and continued doing work on contract for industry and government. CMRI closed in 2001 and its programs were subsumed by other parts of the university or spun off into autonomous entities. Carnegie Mellon's 140-acre main campus is three miles from downtown Pittsburgh, between Schenley Park and the Squirrel Hill and Oakland neighborhoods. Carnegie Mellon is bordered to the west by the campus of the University of Pittsburgh. Carnegie Mellon owns 81 buildings in the Squirrel Hill area of Pittsburgh. For decades the center of student life on campus was the university's student union, Skibo Hall. Built in the 1950s, Skibo Hall's design was typical of mid-century modern architecture but poorly equipped to deal with advances in computer and Internet connectivity. The original Skibo was razed in the summer of 1994 and replaced by a new, Wi-Fi-enabled student union. Known as the University Center, the building was dedicated in 1996.
In 2014, Carnegie Mellon re-dedicated the University Center as the Cohon University Center in recognition of the university's eighth president, Jared Cohon. A large grassy area known as "the Cut" forms the backbone of the campus, with a separate grassy area known as "the Mall" running perpendicular to it. The Cut was formed by filling in a ravine with soil from a nearby hill, which was leveled to build the College of Fine Arts building. The northwestern part of the campus was acquired from the United States Bureau of Mines in the 1980s. In 2006, Carnegie Mellon trustee Jill Gansman Kraus donated the 80-foot-tall sculpture Walking to the Sky, placed on the lawn facing Forbes Avenue between the Cohon University Center and Warner Hall. The sculpture was controversial for its placement, for the general lack of input the campus community had, and for its aesthetic appeal. In April 2015, Carnegie Mellon University, in collaboration with Jones Lang LaSalle, announced the planning of a second office-space structure alongside the Robert Mehrabian Collaborative Innovation Center, plus an upscale, full-service hotel and a retail and dining development along Forbes Avenue.
This complex will connect to the Tepper Quadrangle, the Heinz College, the Tata Consultancy Services Building, the Gates-Hillman Center to create an innovation corridor on the university campus. The eff
Macro (computer science)
A macro in computer science is a rule or pattern that specifies how a certain input sequence should be mapped to a replacement output sequence according to a defined procedure. The mapping process that instantiates a macro use into a specific sequence is known as macro expansion. A facility for writing macros may be provided as part of a software application or as a part of a programming language. In the former case, macros are used to make tasks using the application less repetitive. In the latter case, they are a tool that allows a programmer to enable code reuse or to design domain-specific languages. Macros are used to make a sequence of computing instructions available to the programmer as a single program statement, making the programming task less tedious and less error-prone. Macros allow positional or keyword parameters that dictate what the conditional assembler program generates and have been used to create entire programs or program suites according to such variables as operating system, platform or other factors.
The term derives from "macro instruction"; such expansions were originally used in generating assembly-language code. Keyboard macros and mouse macros allow short sequences of keystrokes and mouse actions to be transformed into other, more time-consuming sequences of keystrokes and mouse actions. In this way, frequently used or repetitive sequences of keystrokes and mouse movements can be automated. Separate programs for creating these macros are called macro recorders. During the 1980s, macro programs – SmartKey, SuperKey, KeyWorks, ProKey – were popular, first as a means to automatically format screenplays, then for a variety of user-input tasks. These programs were based on the TSR (terminate-and-stay-resident) mode of operation and applied to all keyboard input, no matter in which context it occurred. They have to some extent fallen into obsolescence following the advent of mouse-driven user interfaces and the availability of keyboard and mouse macros in applications such as word processors and spreadsheets, which make it possible to create application-sensitive keyboard macros.
Keyboard macros have in more recent times come to life as a method of exploiting the economy of massively multiplayer online role-playing games (MMORPGs). By tirelessly performing a boring but low-risk action, a player running a macro can earn a large amount of the game's currency or resources; this effect is even larger when a macro-using player operates multiple accounts, or operates the accounts for a large amount of time each day. As this money is generated without human intervention, it can upset the economy of the game. For this reason, use of macros is a violation of the TOS or EULA of most MMORPGs, and administrators of MMORPGs fight a continual war to identify and punish macro users. Keyboard and mouse macros that are created using an application's built-in macro features are sometimes called application macros; they are created by letting the application record the user's actions. An underlying macro programming language, most commonly a scripting language, with direct access to the features of the application may also exist.
The programmers' text editor Emacs follows this idea to its conclusion: in effect, most of the editor is made of macros, and Emacs was originally devised as a set of macros in the editing language TECO. Another programmers' text editor, Vim, also has a full implementation of macros; it can record into a register what a person types on the keyboard, which can then be replayed or edited, much like VBA macros for Microsoft Office. Vim also has a scripting language called Vimscript for creating macros. Visual Basic for Applications (VBA) is a programming language included in Microsoft Office from Office 97 through Office 2019; its function has evolved from, and has replaced, the macro languages that were originally included in some of these applications. VBA can execute when documents are opened, which makes it easy to write computer viruses in VBA, known as macro viruses. In the mid-to-late 1990s, these became one of the most common types of computer virus. However, since the late 1990s Microsoft has been updating its programs, and current anti-virus programs counteract such attacks.
A parameterized macro is a macro that is able to insert given objects into its expansion. This gives the macro some of the power of a function. As a simple example, in the C programming language, this is a typical macro that is not a parameterized macro:

#define PI 3.14159

This causes "PI" to be replaced with "3.14159" wherever it occurs; the replacement text is fixed and cannot be modified in any way. An example of a parameterized macro, on the other hand, is this:

#define pred(x) ((x)-1)

What this macro expands to depends on what argument x is passed to it. Here are some possible expansions:

pred(0) → ((0)-1)
pred(x+1) → ((x+1)-1)
pred(f(y)) → ((f(y))-1)

Parameterized macros are a useful source-level mechanism for performing in-line expansion, but in languages such as C, where they use simple textual substitution, they have a number of severe disadvantages compared with other mechanisms for performing in-line expansion, such as inline functions; the parameterized macros used in languages such
x86-64 is the 64-bit version of the x86 instruction set. It introduces two new modes of operation, 64-bit mode and compatibility mode, along with a new 4-level paging mode. With 64-bit mode and the new paging mode, it supports vastly larger amounts of virtual memory and physical memory than its 32-bit predecessors, allowing programs to store larger amounts of data in memory. x86-64 expands the general-purpose registers to 64 bits, extends their number from 8 to 16, and provides numerous other enhancements. Floating-point operations are supported via mandatory SSE2-like instructions, and the x87/MMX-style registers are generally not used. In 64-bit mode, instructions are modified to support 64-bit addressing. Compatibility mode allows 16- and 32-bit user applications to run unmodified, coexisting with 64-bit applications, if the 64-bit operating system supports them. Since the full x86 16-bit and 32-bit instruction sets remain implemented in hardware without any intervening emulation, these older executables can run with little or no performance penalty, while newer or modified applications can take advantage of new features of the processor design to achieve performance improvements.
A processor supporting x86-64 still powers on in real mode for full backward compatibility. The original specification, created by AMD and released in 2000, has been implemented by AMD, Intel, and VIA. The AMD K8 processor was the first to implement it; this was the first significant addition to the x86 architecture designed by a company other than Intel. Intel was forced to follow suit and introduced a modified NetBurst family that was software-compatible with AMD's specification. VIA Technologies introduced x86-64 in the VIA Nano. The x86-64 architecture is distinct from the Intel Itanium architecture, which is not compatible at the native instruction-set level with the x86 architecture: operating systems and applications written for one cannot be run on the other. AMD64 was created as an alternative to the radically different IA-64 architecture designed by Intel and Hewlett-Packard. Announced in 1999, with a full specification available in August 2000, the AMD64 architecture was positioned by AMD from the beginning as an evolutionary way to add 64-bit computing capabilities to the existing x86 architecture, as opposed to Intel's approach of creating a new 64-bit architecture with IA-64.
The first AMD64-based processor, the Opteron, was released in April 2003. AMD's processors implementing the AMD64 architecture include the Opteron, Athlon 64, Athlon 64 X2, Athlon 64 FX, Athlon II, Turion 64, Turion 64 X2, Phenom, Phenom II, FX, Fusion/APU, and Ryzen/Epyc lines. The primary defining characteristics of AMD64 are the availability of 64-bit general-purpose processor registers, 64-bit integer arithmetic and logical operations, and 64-bit virtual addresses. The designers took the opportunity to make other improvements as well; some of the most significant changes are described below. 64-bit integer capability: All general-purpose registers are expanded from 32 bits to 64 bits, and all arithmetic and logical operations, memory-to-register and register-to-memory operations, etc. can now operate directly on 64-bit integers. Pushes and pops on the stack default to 8-byte strides, and pointers are 8 bytes wide. Additional registers: In addition to increasing the size of the general-purpose registers, the number of named general-purpose registers is increased from eight in x86 to 16.
It is therefore possible to keep more local variables in registers rather than on the stack, and to let registers hold frequently accessed constants. AMD64 still has fewer registers than many RISC instruction sets or VLIW-like machines such as IA-64; however, an AMD64 implementation may have far more internal registers than the number of architectural registers exposed by the instruction set. Additional XMM registers: Similarly, the number of 128-bit XMM registers is increased from 8 to 16. The traditional x87 FPU register stack is not included in the register-file size extension in 64-bit mode, unlike the XMM registers used by SSE2, which did get extended. The x87 register stack is not a simple register file, although it does allow direct access to individual registers by low-cost exchange operations. Larger virtual address space: The AMD64 architecture defines a 64-bit virtual address format, of which the low-order 48 bits are used in current implementations; this allows up to 256 TB of virtual address space.
The architecture definition allows this limit to be raised in future implementations to the full 64 bits, exten
Unix is a family of multitasking, multiuser computer operating systems that derive from the original AT&T Unix, whose development started in the 1970s at the Bell Labs research center by Ken Thompson, Dennis Ritchie, and others. Initially intended for use inside the Bell System, Unix was licensed by AT&T to outside parties in the late 1970s, leading to a variety of both academic and commercial Unix variants from vendors including the University of California, Microsoft, IBM, and Sun Microsystems. In the early 1990s, AT&T sold its rights in Unix to Novell, which sold its Unix business to the Santa Cruz Operation in 1995. The UNIX trademark passed to The Open Group, a neutral industry consortium, which allows the use of the mark for certified operating systems that comply with the Single UNIX Specification. As of 2014, the Unix version with the largest installed base is Apple's macOS. Unix systems are characterized by a modular design, sometimes called the "Unix philosophy": the operating system provides a set of simple tools, each of which performs a limited, well-defined function, with a unified filesystem as the main means of communication and a shell scripting and command language to combine the tools into complex workflows.
Unix distinguishes itself from its predecessors as the first portable operating system: the entire operating system is written in the C programming language, allowing Unix to reach numerous platforms. Unix was meant to be a convenient platform for programmers developing software to be run on it and on other systems, rather than for non-programmers. The system grew larger as it spread in academic circles and as users added their own tools to the system and shared them with colleagues. At first, Unix was not designed to be multi-tasking; it gained portability, multi-tasking, and multi-user capabilities in a time-sharing configuration later. Unix systems are characterized by various concepts, among them the use of plain text for storing data, a hierarchical file system, and the treatment of devices as files; these concepts are collectively known as the "Unix philosophy". Brian Kernighan and Rob Pike summarize this in The Unix Programming Environment as "the idea that the power of a system comes more from the relationships among programs than from the programs themselves".
In an era when a standard computer consisted of a hard disk for storage and a data terminal for input and output, the Unix file model worked quite well, as I/O was generally linear. In the 1980s, non-blocking I/O and the set of inter-process communication mechanisms were augmented with Unix domain sockets, shared memory, message queues, and semaphores, and network sockets were added to support communication with other hosts. As graphical user interfaces developed, the file model proved inadequate to the task of handling asynchronous events such as those generated by a mouse. By the early 1980s, users began seeing Unix as a potential universal operating system, suitable for computers of all sizes. The Unix environment and the client–server program model were essential elements in the development of the Internet and the reshaping of computing as centered in networks rather than in individual computers. Both Unix and the C programming language were developed by AT&T and distributed to government and academic institutions, which led to both being ported to a wider variety of machine families than any other operating system.
Under Unix, the operating system consists of many libraries and utilities along with the master control program, the kernel. The kernel provides services to start and stop programs, handles the file system and other common "low-level" tasks that most programs share, and schedules access to avoid conflicts when programs try to access the same resource or device simultaneously. To mediate such access, the kernel has special rights, reflected in the division between user space and kernel space, although in microkernel implementations, like MINIX or Redox, functions such as network protocols may run in user space. The origins of Unix date back to the mid-1960s, when the Massachusetts Institute of Technology, Bell Labs, and General Electric were developing Multics, a time-sharing operating system for the GE-645 mainframe computer. Multics featured several innovations but presented severe problems. Frustrated by the size and complexity of Multics, but not by its goals, individual researchers at Bell Labs started withdrawing from the project.
The last to leave were Ken Thompson, Dennis Ritchie, Douglas McIlroy, and Joe Ossanna, who decided to reimplement their experiences in a new project of smaller scale. This new operating system was initially without organizational backing and without a name; it was a single-tasking system. In 1970, the group coined the name Unics, for Uniplexed Information and Computing Service, as a pun on Multics, which stood for Multiplexed Information and Computer Services. Brian Kernighan takes credit for the idea, but adds that "no one can remember" the origin of the final spelling Unix. Dennis Ritchie, Doug McIlroy, and Peter G. Neumann also credit Kernighan. The operating system was originally written in assembly language, but in 1973 Version 4 Unix was rewritten in C. Version 4 Unix, however, still had much PDP-11-dependent code and was not suitable for porting; the first port to another platform was made five years f
An operating system is system software that manages computer hardware and software resources and provides common services for computer programs. Time-sharing operating systems schedule tasks for efficient use of the system and may include accounting software for cost allocation of processor time, mass storage, and other resources. For hardware functions such as input and output and memory allocation, the operating system acts as an intermediary between programs and the computer hardware, although the application code is executed directly by the hardware and makes system calls to an OS function or is interrupted by it. Operating systems are found on many devices that contain a computer, from cellular phones and video game consoles to web servers and supercomputers. The dominant desktop operating system is Microsoft Windows with a market share of around 82.74%; macOS by Apple Inc. is in second place, and the varieties of Linux are collectively in third place. In the mobile sector, Google's Android accounted for up to 70% of use in 2017, and according to third-quarter 2016 data, Android on smartphones is dominant with 87.5 percent and a growth rate of 10.3 percent per year, followed by Apple's iOS with 12.1 percent and a yearly decrease in market share of 5.2 percent, while other operating systems amount to just 0.3 percent.
Linux distributions are dominant in the supercomputing sector. Other specialized classes of operating systems, such as embedded and real-time systems, exist for many applications. A single-tasking system can only run one program at a time, while a multi-tasking operating system allows more than one program to run concurrently. This is achieved by time-sharing, in which the available processor time is divided between multiple processes, each interrupted repeatedly in time slices by a task-scheduling subsystem of the operating system. Multi-tasking may be characterized in preemptive and cooperative types. In preemptive multitasking, the operating system slices the CPU time and dedicates a slot to each of the programs. Unix-like operating systems, such as Solaris and Linux, as well as non-Unix-like ones, such as AmigaOS, support preemptive multitasking. Cooperative multitasking is achieved by relying on each process to provide time to the other processes in a defined manner; 16-bit versions of Microsoft Windows used cooperative multi-tasking.
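The cooperative model, in which each task must voluntarily hand control back, can be sketched with Python generators. This is an illustrative analogy rather than a real OS scheduler; the task names and helper functions are invented for the example:

```python
def task(name, steps, log):
    # Each task does a unit of work, then yields -- the cooperative
    # equivalent of "providing time to the other processes".
    for i in range(steps):
        log.append(f"{name}{i}")
        yield

def cooperative_scheduler(tasks):
    # Round-robin over the ready tasks. A task that never yielded
    # would starve all the others, which is the model's weakness.
    while tasks:
        current = tasks.pop(0)
        try:
            next(current)        # run until the task yields
            tasks.append(current)
        except StopIteration:
            pass                 # task finished; drop it

log = []
cooperative_scheduler([task("A", 2, log), task("B", 2, log)])
# log now interleaves the two tasks: A0, B0, A1, B1
```

A preemptive scheduler, by contrast, would not wait for the `yield`: it would interrupt each task on a clock tick, which is why a misbehaving program cannot monopolize the CPU under preemption.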
32-bit versions of both Windows NT and Win9x used preemptive multi-tasking. Single-user operating systems have no facilities to distinguish users but may allow multiple programs to run in tandem. A multi-user operating system extends the basic concept of multi-tasking with facilities that identify the processes and resources, such as disk space, belonging to multiple users, and the system permits multiple users to interact with it at the same time. Time-sharing operating systems schedule tasks for efficient use of the system and may include accounting software for cost allocation of processor time, mass storage, and other resources among multiple users. A distributed operating system manages a group of distinct computers and makes them appear to be a single computer. The development of networked computers that could be linked to and communicate with each other gave rise to distributed computing; distributed computations are carried out on more than one machine, and when computers in a group work in cooperation, they form a distributed system.
In an OS, distributed, and cloud computing context, templating refers to creating a single virtual machine image as a guest operating system and saving it as a tool for multiple running virtual machines. The technique is used both in virtualization and in cloud computing management, and is common in large server warehouses. Embedded operating systems are designed to be used in embedded computer systems; they are designed to operate on small machines, such as PDAs, with limited autonomy. Able to operate with a limited number of resources, they are compact and efficient by design. Windows CE and Minix 3 are examples of embedded operating systems. A real-time operating system is an operating system that guarantees to process events or data by a specific moment in time. A real-time operating system may be single- or multi-tasking, but when multitasking, it uses specialized scheduling algorithms so that a deterministic nature of behavior is achieved. An event-driven system switches between tasks based on their priorities or external events, while time-sharing operating systems switch tasks based on clock interrupts.
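The priority-driven switching described for event-driven real-time systems can be sketched with a priority queue. This is a toy model, not a real RTOS scheduler; the event names and priority values are invented for illustration:

```python
import heapq

def run_by_priority(events):
    # Dispatch pending events strictly by priority (lower number =
    # higher priority), as an event-driven system would, rather than
    # rotating on clock interrupts like a time-sharing system.
    heap = list(events)      # events are (priority, name) pairs
    heapq.heapify(heap)
    order = []
    while heap:
        _, name = heapq.heappop(heap)
        order.append(name)
    return order

order = run_by_priority([(3, "logging"), (1, "sensor-read"), (2, "actuate")])
# The highest-priority event runs first, regardless of submission order.
```

The deterministic behavior the paragraph mentions follows from this: given the same set of pending events, the dispatch order is always the same, which is what lets a real-time system guarantee deadlines.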
A library operating system is one in which the services that a typical operating system provides, such as networking, are supplied in the form of libraries and composed with the application and configuration code to construct a unikernel: a specialized, single-address-space machine image that can be deployed to cloud or embedded environments. Early computers were built to perform a series of single tasks, like a calculator. Basic operating system features were developed in the 1950s, such as resident monitor functions that could automatically run different programs in succession to speed up processing. Operating systems did not exist in their more complex forms until the early 1960s. Hardware features were added that enabled the use of runtime libraries and parallel processing. When personal computers became popular in the 1980s, operating systems were made for them that were similar in concept to those used on larger computers. In the 1940s, the earliest electronic digital systems had no operating systems.
Electronic systems of this time were programmed on rows of mechanical switches or by jumper wires on plug boards. These were special-purpose systems that, for example, generated ballistics tables for the military or controlled the pri