The Rainbow 100 was a microcomputer introduced by Digital Equipment Corporation in 1982. This desktop unit had a monitor similar to the VT220 in a dual-CPU box with both a 4 MHz Zilog Z80 and a 4.81 MHz Intel 8088. The Rainbow 100 was a triple-use machine: VT100 mode, 8-bit CP/M mode, and 16-bit CP/M-86 or MS-DOS mode using the 8088. The Rainbow came in three models: the 100A, the 100B and the 100+. The "A" model was the first released, followed by the "B" model; the most noticeable differences between the two were the firmware and slight hardware changes. The systems were referred to with the model numbers PC100-A and PC100-B respectively. The distinguishing characteristic of the "A" model from an end-user perspective was that its earlier firmware did not support booting from a hard disk. Other distinguishing hardware features included the three 2764 ROM chips holding the system firmware and the case fan/power supply combination. Versions of the 100A shipped outside the USA included a user-changeable ROM chip in a special casing.
The user could swap the built-in ROM for this one to support their keyboard layout and the language of the boot screen. The "B" model followed the "A" model and introduced a number of changes. Thanks to updated firmware, the "B" model could boot from a hard disk via the boot menu. The hardware changes included larger firmware stored on two 27128 ROMs and an improved case fan/power supply. The new firmware allowed selection of the boot screen language and keyboard layout, eliminating the need to switch ROMs. The "B" model also allowed remapping of hardware interrupts to be more compatible with MS-DOS. The "100+" model was a marketing designation signifying that the system shipped with a hard drive installed; when a hard-disk option was installed on the Rainbow, the kit included the 100+ emblem for the computer's case. The Rainbow contained two separate data buses controlled by the Zilog Z80 and the Intel 8088 respectively. The buses exchanged information via a shared 62 kB memory. When not executing 8-bit code, the Zilog Z80 was used for floppy disk access.
The 8088 bus was used to control all other subsystems, including graphics, hard disk access and communications. While it may have been theoretically possible to load Z80 binary code into the Rainbow to execute alongside 8088 code, this has never been demonstrated. The 8088 could be upgraded to the faster NEC V20 chip; this gave about a 10-15% speed improvement, but required a two-byte change to the system's ROMs to fix two timing loops. The 100A model shipped with 64 kB of memory on the motherboard, while the 100B had 128 kB. Daughterboards were available from Digital Equipment Corporation that could add up to a further 768 kB of system memory, for a total of 896 kB. The floppy disk drives, known as the RX50, accepted proprietary 400 kB single-sided, quad-density 5¼-inch diskettes. Initial versions of the operating systems on the Rainbow did not allow low-level formatting, requiring users to purchase RX50 media from Digital Equipment Corporation; the high cost of the media led to accusations of vendor "lock-in" against Digital.
However, later versions of MS-DOS and CP/M allowed formatting of diskettes. Of note was the single motor that drove both disk drives via a common spindle; the drives were arranged one on top of the other, which meant that one disk was inserted upside-down. This earned the diskette drive the nickname "toaster"; the unusual orientation confused many first-time users, who would complain that the machine would not read their disk. Digital Equipment Corporation produced a Winchester disk controller for the Rainbow capable of controlling hard disks compatible with the ST-506 interface. The controller was limited to a single drive with up to and including 8 heads and 1024 cylinders, limiting storage to a maximum of 67 MB. Third-party hard-disk controllers were available, including one with dual-Winchester support from CHS. The base Rainbow system was capable of displaying text in 80×24- or 132×24-character format in monochrome only; the system could apply attributes to text, including bold, double-width, and double-height/double-width.
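The 67 MB ceiling follows from that head/cylinder geometry once a sector layout is assumed. A quick check in Python; the layout of 16 sectors per track of 512 bytes each is an assumption here, chosen because it reproduces the quoted figure:

```python
# Capacity implied by the controller's geometry limit of 1024 cylinders x 8 heads.
# The sector layout below is an assumption, not a documented Rainbow parameter.
cylinders, heads = 1024, 8
sectors_per_track, bytes_per_sector = 16, 512

capacity_bytes = cylinders * heads * sectors_per_track * bytes_per_sector
print(capacity_bytes)                          # 67108864
print(f"{capacity_bytes / 1_000_000:.1f} MB")  # 67.1 MB
```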
The graphics option was a user-installable module that added graphics and color display capabilities to the Rainbow system. The module was based on a NEC 7220 graphic display controller and 8×64 KB of DRAM video memory. Due to the design of the graphics system, the Rainbow was capable of driving two monitors, one displaying graphics and the other displaying text. The base Rainbow system generated a TTL composite-video signal in monochrome; with the graphics option installed, the Rainbow could output sync-on-green RGB video signals at TTL levels. The Rainbow was most commonly paired with the 12-inch VR201 monochrome monitor or the 13-inch VR241 color monitor, both produced by Digital Equipment Corporation. The Rainbow 100 and the other two microcomputers which DEC announced at the same time had two quirks that annoyed conservative users; the LK201 keyboard used a new layout. However, the VT220 style of this keyboard can be seen in the layout of the enhanced 101-key keyboard adopted by IBM in 1985.
Third-party upgrades were available, including an 80286 processor upgrade, a 3.5-inch disk adapter kit and a battery-backed clock chip, all from Suitable Solutions.
Intellectual property is a category of property that includes intangible creations of the human intellect. Intellectual property encompasses two types of rights. It was not until the 19th century that the term "intellectual property" began to be used, and not until the late 20th century that it became commonplace in the majority of the world. The main purpose of intellectual property law is to encourage the creation of a wide variety of intellectual goods. To achieve this, the law gives people and businesses property rights to the information and intellectual goods they create, for a limited period of time. This provides an economic incentive for their creation, because it allows people to profit from the information and intellectual goods they create. These economic incentives are expected to stimulate innovation and contribute to the technological progress of countries, which depends on the extent of protection granted to innovators. The intangible nature of intellectual property presents difficulties when compared with traditional property like land or goods.
Unlike traditional property, intellectual property is "indivisible": an unlimited number of people can "consume" an intellectual good without it being depleted. Additionally, investments in intellectual goods suffer from problems of appropriation: a landowner can surround their land with a robust fence and hire armed guards to protect it, but a producer of information or an intellectual good can do little to stop their first buyer from replicating it and selling it at a lower price. Balancing rights so that they are strong enough to encourage the creation of intellectual goods but not so strong that they prevent the goods' wide use is the primary focus of modern intellectual property law. The Statute of Monopolies and the British Statute of Anne are seen as the origins of patent law and copyright respectively, firmly establishing the concept of intellectual property. "Literary property" was the term predominantly used in the British legal debates of the 1760s and 1770s over the extent to which authors and publishers of works had rights deriving from the common law of property.
The first known use of the term intellectual property dates to this time, when a piece published in the Monthly Review in 1769 used the phrase. The first clear example of modern usage goes back as early as 1808, when it was used as a heading title in a collection of essays. The German equivalent was used with the founding of the North German Confederation, whose constitution granted legislative power over the protection of intellectual property to the confederation. When the administrative secretariats established by the Paris Convention and the Berne Convention merged in 1893, they located in Berne and adopted the term intellectual property in their new combined title, the United International Bureaux for the Protection of Intellectual Property. The organization subsequently relocated to Geneva in 1960 and was succeeded in 1967 with the establishment of the World Intellectual Property Organization by treaty as an agency of the United Nations. According to legal scholar Mark Lemley, it was only at this point that the term began to be used in the United States, and it did not enter popular usage there until passage of the Bayh-Dole Act in 1980.
"The history of patents does not begin with inventions, but rather with royal grants by Queen Elizabeth I for monopoly privileges... 200 years after the end of Elizabeth's reign, however, a patent represents a legal right obtained by an inventor providing for exclusive control over the production and sale of his mechanical or scientific invention... the evolution of patents from royal prerogative to common-law doctrine." The term can be found used in an October 1845 Massachusetts Circuit Court ruling in the patent case Davoll et al. v. Brown. In which Justice Charles L. Woodbury wrote that "only in this way can we protect intellectual property, the labors of the mind and interests are as much a man's own...as the wheat he cultivates, or the flocks he rears." The statement that "discoveries are..property" goes back earlier. Section 1 of the French law of 1791 stated, "All new discoveries are the property of the author. In Europe, French author A. Nion mentioned propriété intellectuelle in his Droits civils des auteurs, artistes et inventeurs, published in 1846.
Until recently, the purpose of intellectual property law was to give as little protection as possible in order to encourage innovation. Rights were therefore granted only when they were necessary to encourage invention, and were limited in time and scope. This was a result of knowledge being traditionally viewed as a public good, so as to allow its extensive dissemination and improvement. The concept's origins can be traced back further. Jewish law includes several considerations whose effects are similar to those of modern intellectual property laws, though the notion of intellectual creations as property does not seem to exist; notably, the principle of Hasagat Ge'vul was used to justify limited-term publisher copyright in the 16th century. In 500 BCE, the government of the Greek state of Sybaris offered one year's patent "to all who should discover any new refinement in luxury". According to Jean-Frédéric Morin, "the global intellectual property regime is currently in a paradigm shift".
Apricot Computers was a British manufacturer of business personal computers, best known for its desktop machines of the mid-1980s. Founded in 1965 as "Applied Computer Techniques" and later renamed Apricot Computers, Ltd, it was a wholly owned UK company until it was acquired in the early 1990s by the Mitsubishi Electric Corporation, which hoped that Apricot would help it compete against Japanese PC manufacturers, in particular NEC, which commanded over 50% of the Japanese market at the time. Mitsubishi later shut down the Apricot brand. In 2008 a new, independent Apricot company was launched in the UK. Apricot was an innovative computer hardware company whose Birmingham R&D centre could build every aspect of a personal computer except the integrated circuits themselves, from custom BIOS and system-level programming to the silk-screening of motherboards and the metal-bending for internal chassis, all the way to radio-frequency testing of a finished system.
This, coupled with a smart and aggressive engineering team, allowed Apricot to claim several world firsts, including the first commercial shipment of an all-in-one system with a 3.5-inch floppy drive; in the early 1990s the company manufactured one of the world's most secure x86-based PCs, sold to the UK government. Their technical innovation led them down some paths which were technically advanced but proved to be disadvantageous in the marketplace. For example, when IBM abandoned its ill-fated but technically superior Micro Channel Architecture, Apricot was the only other OEM using it, in the Apricot Qi and VX FT ranges of PCs; this left the company at a technical dead-end without the financial or market power which helped IBM survive the failure of MCA. Apricot continued to experiment with unusual form factors in a market dominated by standardised 'beige boxes': they produced a range of high-availability servers with integrated uninterruptible power supplies, low-profile 'LANStation' PCs designed for use on office networks, and diskless workstations booted over the network.
This long-running pattern of tenaciously investing in technical innovation and complete end-to-end system design and manufacture created technically excellent computers, but meant that Apricot was slow to adapt as the worldwide market grew and changed. By the mid-1990s, major PC OEMs such as Compaq and Hewlett-Packard were outsourcing their complete end-to-end system design and manufacture to Original Design Manufacturers based in Taiwan, and were moving at least some of their manufacturing to cheaper locations overseas. Apricot was late in adopting this method of manufacturing, even though a motherboard designed and manufactured in Asia cost Apricot as little as a third of the cost of design and testing in Birmingham and manufacture in Scotland. Apricot tried to move to outsourcing, but the market outpaced them; MELCO closed the company down, selling off the final assets in 1999. A management buyout resulted in a new company, Network Si UK Ltd, being formed. In 1982 ACT released their first microcomputer, built by another company but marketed under the ACT brand.
In America it was a moderate success. In 1982 ACT signed a deal with Victor to distribute the "Victor 9000" as the ACT "Sirius 1" in the UK and Europe. The £2754 "Sirius 1" ran MS-DOS but was not hardware-compatible with the IBM PC. The Sirius 1 became the most popular 16-bit business computer in Europe, especially in Britain and Germany, while IBM delayed the release of the PC there. Its success led to the Apricot PC, or ACT Apricot, in September 1983, based on an Intel 8086 microprocessor running at 4.77 MHz. It ran MS-DOS or CP/M-86 but was not compatible at a hardware level with the IBM PC. It had two floppy drives and was one of the first systems to use 3.5" disks rather than the 5.25" disks which were the norm at the time. The graphics quality, with an 800 × 400 resolution, was critically acclaimed. The keyboard had eight "normal" function keys and six flat programmable ones, associated with a built-in LCD screen which displayed the current function of the keys, or could be configured to echo the current command line in MS-DOS.
The keyboard contained an integrated calculator. Microsoft Word and Multiplan were supplied with the Apricot PC. Lotus 1-2-3 was available and took advantage of the machine's high-resolution graphics. A flap covered the floppy drives when not in use. The industrial design of the machine was well conceived: the keyboard could be clipped to the base of the machine, and an integrated handle was used for transporting it. The supplied green-phosphor monitor had a nylon mesh glare filter. A model with a built-in 10 MB hard disk was made available in 1984. In 1984 ACT released a home computer, the "Apricot F1", which ran MS-DOS with a GUI front end. The machine was only successful in the UK. It was bundled with software for graphics, word processing, a spreadsheet, some games and system tools, and had one 3.5" floppy drive. The same infra-red trackball pointing device used with the Apricot Portable was available for the F1. In 1984, the Apricot Portable was released, with an infra-red keyboard, a voice system, a 4.77 MHz CPU and a 640 × 200 LCD display, for £1965.
In 1985 ACT was renamed "Apricot Computers". By this time, the F1 had become one model in the F Series.
A command-line interface or command language interpreter, also known as a command-line user interface, console user interface or character user interface, is a means of interacting with a computer program where the user issues commands to the program in the form of successive lines of text. A program which handles the interface is called a shell. The CLI was the primary means of interaction with most computer systems on computer terminals in the mid-1960s, and continued to be used throughout the 1970s and 1980s on OpenVMS, Unix systems and personal computer systems including MS-DOS, CP/M and Apple DOS. The interface is implemented with a command-line shell, a program that accepts commands as text input and converts them into appropriate operating system functions. Today, many end users rarely, if ever, use command-line interfaces and instead rely upon graphical user interfaces and menu-driven interactions. However, many software developers, system administrators and advanced users still rely on command-line interfaces to perform tasks more efficiently, configure their machine, or access programs and program features that are not available through a graphical interface.
Alternatives to the command line include, but are not limited to, text user interface menus, keyboard shortcuts, and various other desktop metaphors centered on the pointer. Examples of this include Microsoft Windows versions 1, 2, 3, 3.1 and 3.11, DOS Shell, and Mouse Systems PowerPanel. Programs with command-line interfaces are generally easier to automate via scripting. Command-line interfaces for software other than operating systems include a number of programming languages such as Tcl/Tk and PHP, as well as utilities such as the compression utility WinZip and some FTP and SSH/Telnet clients. Compared with a graphical user interface, a command line requires fewer system resources to implement. Since options to commands are given in a few characters in each command line, an experienced user finds the options easier to access. Automation of repetitive tasks is simplified: most operating systems using a command-line interface support some mechanism for storing frequently used sequences of commands in a disk file for re-use. A command-line history can be kept, allowing repetition of commands.
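As an illustration of such automation, here is a minimal Python sketch that drives a CLI program from a script and post-processes its output. The choice of `ls -l` is just a stand-in Unix command; any command-line tool with predictable text output works the same way:

```python
import subprocess

# Run a command-line program, capture its textual output, and reuse it
# programmatically. check=True raises an error if the command fails.
result = subprocess.run(["ls", "-l"], capture_output=True, text=True, check=True)

for line in result.stdout.splitlines():
    print(line)
```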
A command-line system may require paper or online manuals for the user's reference, although a "help" option often provides a concise review of a command's options. The command-line environment may not provide graphical enhancements such as different fonts or extended edit windows found in a GUI, and it may be difficult for a new user to become familiar with all the commands and options available, compared with the drop-down menus of a graphical user interface, without repeated reference to manuals. Operating system command-line interfaces are usually distinct programs supplied with the operating system. A program that implements such a text interface is called a command-line interpreter, command processor or shell. Examples of command-line interpreters include DEC's DIGITAL Command Language in OpenVMS and RSX-11, the various Unix shells, CP/M's CCP, DOS's COMMAND.COM, as well as the OS/2 and Windows CMD.EXE programs, the latter groups being based on DEC's RSX-11 and RSTS CLIs. Under most operating systems, it is possible to replace the default shell program with alternatives.
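At its core, a command-line interpreter is a loop that reads a line, splits it into a command name and arguments, and dispatches to a handler. A toy sketch in Python follows; the command set is invented for illustration, and a real shell adds piping, expansion, job control and much more on top of this loop:

```python
import shlex

# A toy command-line interpreter: read a line, split it into a command
# and its arguments, then dispatch to a handler function.
def repl():
    commands = {
        "echo": lambda args: print(" ".join(args)),
        "help": lambda args: print("commands:", ", ".join(sorted(commands))),
    }
    while True:
        try:
            line = input("$ ")
        except EOFError:
            break
        parts = shlex.split(line)   # respects quoting, like a real shell
        if not parts:
            continue
        cmd, args = parts[0], parts[1:]
        if cmd == "exit":
            break
        handler = commands.get(cmd)
        if handler:
            handler(args)
        else:
            print(f"{cmd}: command not found")

repl()
```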
Although the term 'shell' is often used to describe a command-line interpreter, strictly speaking a 'shell' can be any program that constitutes the user interface, including graphically oriented ones. For example, the default Windows GUI is a shell program named EXPLORER.EXE, as defined in the SHELL=EXPLORER.EXE line in the WIN.INI configuration file; these programs are shells, but not CLIs. Application programs may also have command-line interfaces. An application program may support none, any, or all of three major types of command-line interface mechanisms. Parameters: most operating systems support a means to pass additional information to a program when it is launched; when a program is launched from an OS command-line shell, additional text provided along with the program name is passed to the launched program (see the sketch after this list). Interactive command-line sessions: after launch, a program may provide an operator with an independent means to enter commands in the form of text. OS inter-process communication: most operating systems support means of inter-process communication.
Command lines from client processes may be redirected to a CLI program by one of these methods. Some applications support only a CLI, presenting a CLI prompt to the user and acting upon command lines as they are entered. Other programs support both a CLI and a GUI. In some cases, a GUI is simply a wrapper around a separate CLI executable file; in other cases, a program may provide a CLI as an optional alternative to its GUI. CLIs and GUIs often support different functionality. For example, all features of MATLAB, a numerical analysis computer program, are available via the CLI, whereas the MATLAB GUI exposes only a subset of features. The early Sierra games, such as the first three King's Quest games, used commands from an internal command line to move the character around in the graphic window. The command-line interface evolved from a form of dialog once conducted by humans over teleprinter machines, in which human operators remotely exchanged information.
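From the receiving program's point of view, the first of the three mechanisms above, parameters passed at launch, is the simplest: the shell hands the text typed after the program name to the program as a list of strings. A minimal Python sketch:

```python
import sys

# Text typed after the program name on the shell's command line arrives
# here as a list of strings; argv[0] is the program's own name.
print("program:", sys.argv[0])
print("parameters:", sys.argv[1:])
```

Invoked as, say, `python params_demo.py --verbose input.txt`, it reports `--verbose` and `input.txt` as its parameters (the script name and flags here are hypothetical).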
International Business Machines Corporation is an American multinational information technology company headquartered in Armonk, New York, with operations in over 170 countries. The company began in 1911, founded in Endicott, New York, as the Computing-Tabulating-Recording Company, and was renamed "International Business Machines" in 1924. IBM produces and sells computer hardware and software, and provides hosting and consulting services in areas ranging from mainframe computers to nanotechnology. IBM is also a major research organization, holding the record for most U.S. patents generated by a business for 26 consecutive years. Inventions by IBM include the automated teller machine, the floppy disk, the hard disk drive, the magnetic stripe card, the relational database, the SQL programming language, the UPC barcode and dynamic random-access memory. The IBM mainframe, exemplified by the System/360, was the dominant computing platform during the 1960s and 1970s. IBM has continually shifted business operations by focusing on higher-value, more profitable markets.
This includes spinning off printer manufacturer Lexmark in 1991, selling its personal computer and x86-based server businesses to Lenovo, and acquiring companies such as PwC Consulting, SPSS, The Weather Company and Red Hat. In 2014, IBM announced that it would go "fabless", continuing to design semiconductors but offloading manufacturing to GlobalFoundries. Nicknamed Big Blue, IBM is one of 30 companies included in the Dow Jones Industrial Average and one of the world's largest employers, with over 380,000 employees, known as "IBMers". At least 70% of IBMers are based outside the United States, and the country with the largest number of IBMers is India. IBM employees have been awarded five Nobel Prizes, six Turing Awards, ten National Medals of Technology and five National Medals of Science. In the 1880s, technologies emerged that would form the core of International Business Machines. Julius E. Pitrap patented the computing scale in 1885. On June 16, 1911, four companies in this field were amalgamated in New York State by Charles Ranlett Flint, forming a fifth company, the Computing-Tabulating-Recording Company, based in Endicott, New York.
The five companies had offices and plants in Endicott and Binghamton, New York, among other locations. They manufactured machinery for sale and lease, ranging from commercial scales, industrial time recorders and cheese slicers to tabulators and punched cards. Thomas J. Watson, Sr., fired from the National Cash Register Company by John Henry Patterson, called on Flint and, in 1914, was offered a position at CTR. Watson joined CTR as General Manager and, 11 months later, was made President when court cases relating to his time at NCR were resolved. Having learned Patterson's pioneering business practices, Watson proceeded to put the stamp of NCR onto CTR's companies. He implemented sales conventions, "generous sales incentives, a focus on customer service, an insistence on well-groomed, dark-suited salesmen and had an evangelical fervor for instilling company pride and loyalty in every worker". His favorite slogan, "THINK", became a mantra for each company's employees. During Watson's first four years, revenues reached $9 million and the company's operations expanded to Europe, South America and Australia.
Watson never liked the clumsy hyphenated name "Computing-Tabulating-Recording Company" and on February 14, 1924 chose to replace it with the more expansive title "International Business Machines". By 1933 most of the subsidiaries had been merged into one company, IBM. In 1937, IBM's tabulating equipment enabled organizations to process unprecedented amounts of data; its clients included the U.S. Government, during its first effort to maintain the employment records of 26 million people pursuant to the Social Security Act, and Hitler's Third Reich, which tracked persecuted groups through the German subsidiary Dehomag. In 1949, Thomas Watson, Sr. created IBM World Trade Corporation, a subsidiary of IBM focused on foreign operations. In 1952, he stepped down after 40 years at the company helm, and his son Thomas Watson, Jr. was named president. In 1956, the company demonstrated the first practical example of artificial intelligence when Arthur L. Samuel of IBM's Poughkeepsie, New York, laboratory programmed an IBM 704 not merely to play checkers but to "learn" from its own experience.
In 1957, the FORTRAN scientific programming language was developed. In 1961, IBM developed the SABRE reservation system for American Airlines and introduced the successful Selectric typewriter. In 1963, IBM employees and computers helped NASA track the orbital flights of the Mercury astronauts. A year later, it moved its corporate headquarters from New York City to Armonk, New York. The latter half of the 1960s saw IBM continue its support of space exploration, participating in the 1965 Gemini flights, the 1966 Saturn flights and the 1969 lunar mission. On April 7, 1964, IBM announced the first computer system family, the IBM System/360. It spanned the complete range of commercial and scientific applications from large to small, allowing companies for the first time to upgrade to models with greater computing capability without having to rewrite their applications. It was followed by the IBM System/370 in 1970.
The Intel 8088 microprocessor is a variant of the Intel 8086. Introduced on July 1, 1979, the 8088 had an eight-bit external data bus instead of the 16-bit bus of the 8086; the 16-bit registers and the one-megabyte address range were, however, unchanged. In fact, according to the Intel documentation, the 8086 and 8088 have the same execution unit; only the bus interface unit is different. The original IBM PC was based on the 8088. The 8088 was designed at Intel's laboratory in Haifa, Israel, as were a large number of Intel's processors. The 8088 was targeted at economical systems by allowing the use of an eight-bit data path and eight-bit support and peripheral chips. The prefetch queue of the 8088 was shortened to four bytes, from the 8086's six bytes, and the prefetch algorithm was modified to adapt to the narrower bus; these modifications of the basic 8086 design were among the first jobs assigned to Intel's then-new design office and laboratory in Haifa. Variants of the 8088 with more than 5 MHz maximal clock frequency include the 8088-2, fabricated using Intel's new enhanced nMOS process called HMOS and specified for a maximal frequency of 8 MHz.
It was followed by the 80C88, a static CHMOS design, which could operate at clock speeds from 0 to 8 MHz. There were several other, more or less similar, variants from other manufacturers. For instance, the NEC V20 was a pin-compatible and faster variant of the 8088, designed and manufactured by NEC. Successive NEC 8088-compatible processors would run at up to 16 MHz. In 1984, Commodore International signed a deal to manufacture the 8088 for use in a licensed Dynalogic Hyperion clone, in a move regarded as signaling a major new direction for the company. When announced, the list price of the 8088 was US$124.80. The 8088 is architecturally similar to the 8086; the main difference is the eight data lines of its external bus in place of the 8086's sixteen. All of the other pins of the device perform the same function as they do on the 8086, with two exceptions. First, pin 34 is no longer BHE; instead it outputs a status signal, SS0. Combined with the IO/M and DT/R signals, the bus cycles can be decoded. The second change is that the pin that signals whether a memory access or an input/output access is being made has had its sense reversed.
The pin on the 8088 is IO/M; on the 8086 it is M/IO. The reason for the reversal is that it makes the 8088 compatible with the 8085. Depending on the clock frequency, the number of memory wait states, and the characteristics of the particular application program, the average performance of the Intel 8088 ranged from 0.33 to 1 million instructions per second. Meanwhile, the mov reg,reg and ALU reg,reg instructions, taking two and three cycles respectively, yielded an absolute peak performance of between 1⁄3 and 1⁄2 MIPS per MHz, that is, somewhere in the range of 3–5 MIPS at 10 MHz. The speed of the execution unit and the bus of the 8086 CPU were well balanced; cutting the bus down to eight bits made it a serious bottleneck in the 8088. With the speed of instruction fetch reduced by 50% in the 8088 as compared to the 8086, a sequence of fast instructions can quickly drain the four-byte prefetch queue; when the queue is empty, instructions take as long to complete as they take to fetch. Both the 8086 and 8088 take four clock cycles to complete a bus cycle.
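Those peak figures follow directly from the cycle counts: one instruction every N cycles is 1/N instructions per cycle, hence 1/N MIPS per MHz. A quick arithmetic check in Python:

```python
# Peak throughput for the 2- and 3-cycle register-register instructions:
# one instruction per N clock cycles gives 1/N instructions per cycle.
for cycles in (2, 3):
    mips_per_mhz = 1.0 / cycles
    print(f"{cycles} cycles/instr -> {mips_per_mhz:.2f} MIPS per MHz, "
          f"{mips_per_mhz * 10:.1f} MIPS at 10 MHz")
# 2 cycles -> 0.50 MIPS/MHz -> 5.0 MIPS at 10 MHz
# 3 cycles -> 0.33 MIPS/MHz -> 3.3 MIPS at 10 MHz
```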
Therefore, for example, a two-byte shift or rotate instruction, which takes the EU only two clock cycles to execute, takes eight clock cycles to complete if it is not in the prefetch queue. A sequence of such fast instructions prevents the queue from being filled as fast as it is drained. In general, because so many basic instructions execute in fewer than four clocks per instruction byte (including all the ALU and data-movement instructions on register operands and some of those on memory operands), it is impossible to avoid idling the EU in the 8088 at least ¼ of the time while executing useful real-world programs, and it is not hard to idle it half the time. In short, an 8088 runs about half as fast as an 8086 clocked at the same rate, because of the bus bottleneck. A side effect of the 8088 design, with the slow bus and the small prefetch queue, is that the speed of code execution can be dependent on instruction order. When programming the 8088 for CPU efficiency, it is vital to interleave long-running instructions with short ones whenever possible.
For example, a repeated string operation or a shift by three or more will take long enough to allow time for the 4-byte prefetch queue to refill. If short instructions are placed between slower instructions like these, the short ones can execute at full speed out of the queue. If, on the other hand, the slow instructions are executed sequentially, back to back, then after the first of them the bus unit will be forced to idle because the queue will be full, with the consequence that more of the faster instructions that follow will suffer fetch delays that might have been avoidable. As some instructions, such as single-bit-position shifts and rotates, take four times as long to fetch as to execute, the overall effect is that the EU spends much of its time waiting on the bus.
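The effect of instruction ordering can be made concrete with a toy model: the bus delivers one instruction byte every four cycles into a four-byte queue, while the execution unit can only start an instruction once all of its bytes are queued. The instruction timings below are illustrative stand-ins, not exact 8088 figures, but they reproduce the qualitative behaviour described above:

```python
# Toy model of the 8088's four-byte prefetch queue. Each instruction is a
# (length_in_bytes, execute_cycles) pair. Timings are illustrative only.
def simulate(program, queue_size=4, cycles_per_byte=4):
    pending = list(program)
    queue = 0      # bytes currently in the prefetch queue
    fetch = 0      # progress toward fetching the next byte
    running = 0    # cycles left on the instruction in the EU
    cycles = idle = 0
    while pending or running:
        cycles += 1
        # Bus interface unit: keep fetching while the queue has room.
        if queue < queue_size:
            fetch += 1
            if fetch == cycles_per_byte:
                fetch, queue = 0, queue + 1
        # Execution unit: start the next instruction once its bytes are queued.
        if running == 0 and pending and queue >= pending[0][0]:
            size, exe = pending.pop(0)
            queue -= size
            running = exe
        if running:
            running -= 1
        else:
            idle += 1
    return cycles, idle

slow = (1, 17)  # stand-in for a long-running instruction (e.g. a multi-bit shift)
fast = (2, 3)   # stand-in for a short register-register instruction

for name, prog in [("interleaved", [slow, fast] * 4),
                   ("back-to-back", [slow] * 4 + [fast] * 4)]:
    cycles, idle = simulate(prog)
    print(f"{name:12s}: {cycles} total cycles, EU idle for {idle}")
```

With these made-up timings, the interleaved ordering finishes in fewer total cycles with far less EU idling than the back-to-back ordering of the same eight instructions, mirroring the interleaving advice above.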
Microsoft Corporation is an American multinational technology company with headquarters in Redmond, Washington. It develops, licenses and sells computer software, consumer electronics, personal computers and related services. Its best-known software products are the Microsoft Windows line of operating systems, the Microsoft Office suite, and the Internet Explorer and Edge web browsers. Its flagship hardware products are the Xbox video game consoles and the Microsoft Surface lineup of touchscreen personal computers. As of 2016, it is the world's largest software maker by revenue and one of the world's most valuable companies. The word "Microsoft" is a portmanteau of "microcomputer" and "software". Microsoft is ranked No. 30 in the 2018 Fortune 500 rankings of the largest United States corporations by total revenue. Microsoft was founded by Bill Gates and Paul Allen on April 4, 1975, to develop and sell BASIC interpreters for the Altair 8800. It rose to dominate the personal computer operating system market with MS-DOS in the mid-1980s, followed by Microsoft Windows.
The company's 1986 initial public offering, and the subsequent rise in its share price, created three billionaires and an estimated 12,000 millionaires among Microsoft employees. Since the 1990s, it has diversified from the operating system market and has made a number of corporate acquisitions, the largest being the acquisition of LinkedIn for $26.2 billion in December 2016, followed by the acquisition of Skype Technologies for $8.5 billion in May 2011. As of 2015, Microsoft is market-dominant in the IBM PC-compatible operating system market and the office software suite market, although it has lost the majority of the overall operating system market to Android. The company also produces a wide range of other consumer and enterprise software for desktops and servers, including Internet search, the digital services market, mixed reality, cloud computing and software development. Steve Ballmer replaced Gates as CEO in 2000 and later envisioned a "devices and services" strategy. This began to unfold with the acquisition of Danger Inc. in 2008 and with Microsoft entering the personal computer production market for the first time in June 2012, with the launch of the Microsoft Surface line of tablet computers.
Since Satya Nadella took over as CEO in 2014, the company has scaled back on hardware and has instead focused on cloud computing, a move that helped the company's shares reach their highest value since December 1999. In 2018, Microsoft surpassed Apple as the most valuable publicly traded company in the world, after having been dethroned by the tech giant in 2010. Childhood friends Bill Gates and Paul Allen sought to make a business using their shared skills in computer programming. In 1972 they founded their first company, named Traf-O-Data, which sold a rudimentary computer to track and analyze automobile traffic data. While Gates enrolled at Harvard, Allen pursued a degree in computer science at Washington State University, though he later dropped out to work at Honeywell. The January 1975 issue of Popular Electronics featured Micro Instrumentation and Telemetry Systems's (MITS) Altair 8800 microcomputer, which inspired Allen to suggest that they could program a BASIC interpreter for the device. After a call from Gates claiming to have a working interpreter, MITS requested a demonstration.
Since they didn't yet have one, Allen worked on a simulator for the Altair while Gates developed the interpreter. Although they developed the interpreter on a simulator and not the actual device, it worked flawlessly when they demonstrated it to MITS in Albuquerque, New Mexico. MITS agreed to distribute it, marketing it as Altair BASIC. Gates and Allen established Microsoft on April 4, 1975, with Gates as the CEO; the original name of "Micro-Soft" was suggested by Allen. In August 1977 the company formed an agreement with ASCII Magazine in Japan, resulting in its first international office, "ASCII Microsoft". Microsoft moved to a new home in Bellevue, Washington in January 1979. Microsoft entered the operating system business in 1980 with its own version of Unix, called Xenix; however, it was MS-DOS that solidified the company's dominance. After negotiations with Digital Research failed, IBM awarded a contract to Microsoft in November 1980 to provide a version of the CP/M OS, set to be used in the upcoming IBM Personal Computer.
For this deal, Microsoft purchased a CP/M clone called 86-DOS from Seattle Computer Products, which it branded as MS-DOS, though IBM rebranded it as PC DOS. Following the release of the IBM PC in August 1981, Microsoft retained ownership of MS-DOS. Since IBM had copyrighted the IBM PC BIOS, other companies had to reverse-engineer it in order for non-IBM hardware to run as IBM PC compatibles, but no such restriction applied to the operating system. Due to various factors, such as MS-DOS's available software selection, Microsoft became the leading PC operating system vendor. The company expanded into new markets with the release of the Microsoft Mouse in 1983, as well as with a publishing division named Microsoft Press. Paul Allen resigned from Microsoft in 1983 after developing Hodgkin's disease. Allen later claimed that Gates had wanted to dilute his share in the company when he was diagnosed, because Gates did not think he was working hard enough. After leaving Microsoft, Allen lost billions of dollars on ill-conceived or mistimed technology investments.
He also invested in low-tech sectors, sports teams and commercial real estate. Despite having begun jointly developing a new operating system, OS/2, with IBM in