Color Graphics Adapter
The Color Graphics Adapter (CGA), also called the Color/Graphics Adapter or IBM Color/Graphics Monitor Adapter, introduced in 1981, was IBM's first graphics card and first color display card for the IBM PC. It thereby became that computer's first color display standard. The standard IBM CGA graphics card was equipped with 16 kilobytes of video memory and could be connected either to a dedicated direct-drive CRT monitor using a 4-bit digital RGBI interface, such as the IBM 5153 color display, or to an NTSC-compatible television or composite video monitor via an RCA connector. The RCA connector provided only baseband video, so connecting the CGA card to a standard television set required a separate RF modulator unless the TV itself had an RCA jack; some users combined the modulator with an amplifier and an antenna to feed the picture to the television wirelessly. Built around the Motorola MC6845 display controller, the CGA card featured several graphics and text modes.
The highest display resolution of any mode was 640×200, and the highest color depth supported was 4-bit. CGA supports:

- 320×200 graphics in 4 colors from a 16-color hardware palette (pixel aspect ratio 1:1.2)
- 640×200 graphics in 2 colors (pixel aspect ratio 1:2.4)
- Text modes: 40×25 and 80×25, each with an 8×8 pixel font
- Extended graphics modes: a 160×100 16-color mode, and artifact colors using an NTSC monitor

IBM intended CGA to be compatible with a home television set: the 40×25 text and 320×200 graphics modes are usable with a television, while the 80×25 text and 640×200 graphics modes are intended for a monitor. Despite varying bit depths among the CGA graphics modes, CGA processes the colors of its palette in four bits, yielding 2⁴ = 16 different colors. The four color bits are arranged according to the RGBI color model: the lower three bits represent the red, green, and blue color components, and the fourth bit is an intensity bit. In graphics modes, colors are set per pixel; these four bits are passed on unmodified to the DE-9 connector at the back of the card, leaving all color processing to the RGBI monitor connected to it.
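Both graphics modes fill the card's 16 KB of video memory almost exactly. As a rough Python sketch (the helper that derives bits per pixel from the color count is illustrative, not part of the original text):

```python
def framebuffer_bytes(width, height, colors):
    """Bytes of video memory needed for a packed-pixel graphics mode:
    width * height * bits-per-pixel / 8."""
    bits_per_pixel = (colors - 1).bit_length()  # 4 colors -> 2 bpp, 2 colors -> 1 bpp
    return width * height * bits_per_pixel // 8

VIDEO_RAM = 16 * 1024  # the CGA card's 16 kilobytes

assert framebuffer_bytes(320, 200, 4) == 16000  # 4-color mode
assert framebuffer_bytes(640, 200, 2) == 16000  # 2-color mode
# Both modes just fit within the 16,384 bytes available.
assert framebuffer_bytes(320, 200, 4) <= VIDEO_RAM
```

This makes clear why both modes exist at the same memory cost: halving the color depth buys double the horizontal resolution.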
With respect to the RGBI color model described above, the monitor would use the following formula to process the digital four-bit color number into analog voltages ranging from 0.0 to 1.0:

red := 2/3 × (colorNumber & 4)/4 + 1/3 × (colorNumber & 8)/8
green := 2/3 × (colorNumber & 2)/2 + 1/3 × (colorNumber & 8)/8
blue := 2/3 × (colorNumber & 1)/1 + 1/3 × (colorNumber & 8)/8

Color 6 is treated differently: its green component is reduced so that the color appears as brown rather than dark yellow. For the composite output, these four-bit color numbers are encoded by the CGA's onboard hardware into an NTSC-compatible signal fed to the card's RCA output jack. For cost reasons, this is not done using an RGB-to-YIQ converter as called for by the NTSC standard, but by a series of flip-flops and delay lines; as a result, the hues seen are lacking in purity. The relative luminances of the colors produced by the composite color-generating circuit differ between CGA revisions: they are identical for colors 1–6 and 9–14 on early CGAs produced until 1983, but differ on later CGAs owing to the addition of extra resistors. As noted, however, this method only works on NTSC television sets; PAL TVs do not display the colors as expected when connected to the composite output, as PAL's superior color separation prevents artifacting from occurring.
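A minimal Python sketch of this conversion, assuming the formula above; the color 6 special case is modeled here as halving the green gun, which is the usual description of the brown fix, though exact monitor behavior varied:

```python
def rgbi_to_analog(color_number):
    """Map a 4-bit CGA color number to analog RGB levels in 0.0-1.0.

    Bits 0-2 select blue, green, and red; bit 3 is the intensity bit,
    which adds 1/3 of full scale to every gun (per the formula above).
    """
    intensity = (color_number >> 3) & 1
    red = 2/3 * ((color_number >> 2) & 1) + 1/3 * intensity
    green = 2/3 * ((color_number >> 1) & 1) + 1/3 * intensity
    blue = 2/3 * (color_number & 1) + 1/3 * intensity
    if color_number == 6:
        green /= 2  # brown fix (assumed): dark yellow becomes brown
    return (red, green, blue)
```

For example, color 15 (all bits set) yields full-scale white, while color 8 (intensity bit only) yields the dark gray (1/3, 1/3, 1/3).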
When the CGA was introduced in 1981, IBM did not yet offer an RGBI monitor; instead, customers were expected to use the RCA output with an RF modulator to connect the CGA to their television set. The IBM 5153 Personal Computer Color Display would not be introduced until March 1983. Because of the lack of available RGBI monitors in 1981 and 1982, many users used simpler RGB monitors, reducing the number of available colors to eight and displaying both colors 6 and 14 as yellow. This is relevant insofar as an application or game programmer who used either of these configurations would have expected color 6 to look dark yellow instead of brown. CGA offers four BIOS text modes, including 40×25 characters in up to 16 colors, where each character is a pattern of 8×8 dots. The effective screen resolution in this mode is 320×200 pixels, though individual pixels cannot be addressed independently; the choice of patterns for any location is thus limited to one of the 256 available characters, the patterns for which are stored in a ROM chip on the card itself.
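The arithmetic behind the effective-resolution figure can be sketched in Python. The two bytes per character cell (one code byte, one attribute byte) used below are the standard CGA text-mode layout, assumed here rather than stated in the text:

```python
def text_mode_geometry(columns, rows, cell_w=8, cell_h=8):
    """Effective pixel resolution and per-page memory of a CGA text mode.

    Each character cell occupies one character-code byte plus one
    attribute byte in video memory (an assumption of this sketch).
    """
    return {
        "pixels": (columns * cell_w, rows * cell_h),
        "page_bytes": columns * rows * 2,
    }

assert text_mode_geometry(40, 25)["pixels"] == (320, 200)
assert text_mode_geometry(40, 25)["page_bytes"] == 2000
assert text_mode_geometry(80, 25)["pixels"] == (640, 200)
```

A 40×25 page thus consumes only about 2 KB of the card's 16 KB, which is why text modes can hold multiple display pages.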
The display font in text mode i
A modem is a hardware device that converts data between transmission media so that it can be transmitted from computer to computer. The goal is to produce a signal that can be transmitted and decoded to reproduce the original digital data. Modems can be used with any means of transmitting analog signals, from light-emitting diodes to radio. A common type of modem is one that turns the digital data of a computer into a modulated electrical signal for transmission over telephone lines, to be demodulated by another modem at the receiver side to recover the digital data. Modems are classified by the maximum amount of data they can send in a given unit of time, expressed in bits per second or bytes per second. Modems can also be classified by their symbol rate, measured in baud; the baud unit denotes symbols per second, or the number of times per second the modem sends a new signal. For example, the ITU V.21 standard used audio frequency-shift keying with two possible frequencies, corresponding to two distinct symbols, to carry 300 bits per second using 300 baud.
By contrast, the original ITU V.22 standard, which could transmit and receive four distinct symbols, transmitted 1,200 bits per second by sending 600 symbols per second using phase-shift keying. News wire services in the 1920s used multiplex devices that satisfied the definition of a modem. However, the modem function was incidental to the multiplexing function, so they are not included in the history of modems. Modems grew out of the need to connect teleprinters over ordinary phone lines instead of the more expensive leased lines used for current loop–based teleprinters and automated telegraphs. In 1941, the Allies developed a voice encryption system called SIGSALY, which used a vocoder to digitize speech, encrypted the speech with a one-time pad, and encoded the digital data as tones using frequency-shift keying. Mass-produced modems in the United States began as part of the SAGE air-defense system in 1958, connecting terminals at various airbases, radar sites, and command-and-control centers to the SAGE director centers scattered around the United States and Canada.
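The relationship between baud and bits per second in the V.21 and V.22 examples follows directly from the symbol count: each symbol carries log2(number of distinct symbols) bits. A small Python sketch:

```python
import math

def bits_per_second(baud, distinct_symbols):
    """Bit rate = symbol rate (baud) x bits carried per symbol."""
    return baud * int(math.log2(distinct_symbols))

# ITU V.21: 2 symbols -> 1 bit per symbol, so bit rate equals baud rate.
assert bits_per_second(300, 2) == 300
# ITU V.22: 4 symbols -> 2 bits per symbol, so 600 baud carries 1,200 bit/s.
assert bits_per_second(600, 4) == 1200
```

This is why "baud" and "bits per second" coincide only for two-symbol modulations like V.21's FSK.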
SAGE modems were described by AT&T's Bell Labs as conforming to their newly published Bell 101 dataset standard. While they ran on dedicated telephone lines, the devices at each end were no different from commercial acoustically coupled Bell 101, 110-baud modems. The 201A and 201B Data-Phones were synchronous modems using two-bit-per-baud phase-shift keying. The 201A operated half-duplex at 2,000 bit/s over normal phone lines, while the 201B provided full-duplex 2,400 bit/s service on four-wire leased lines, the send and receive channels each running on their own set of two wires. The famous Bell 103A dataset standard was introduced by AT&T in 1962. It provided full-duplex service at 300 bit/s over normal phone lines. Frequency-shift keying was used, with the call originator transmitting at 1,070 or 1,270 Hz and the answering modem transmitting at 2,025 or 2,225 Hz. The 103A2 variant gave an important boost to the use of remote low-speed terminals such as the Teletype Model 33 ASR and KSR and the IBM 2741.
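The Bell 103 frequency plan lends itself to a simple illustration. The Python sketch below generates FSK audio samples for a bit sequence using the frequencies above; the 48 kHz sample rate and the continuous-phase detail are illustrative choices of this sketch, not part of the standard:

```python
import math

# Bell 103 frequency plan (Hz), as given in the text: the originating
# modem sends 1,070 Hz for a 0 (space) and 1,270 Hz for a 1 (mark);
# the answering modem uses 2,025 and 2,225 Hz.
ORIGINATE = {0: 1070.0, 1: 1270.0}
ANSWER = {0: 2025.0, 1: 2225.0}

def fsk_samples(bits, tones=ORIGINATE, baud=300, sample_rate=48000):
    """Generate audio samples for a bit sequence using frequency-shift keying.

    At 48 kHz and 300 baud each bit occupies 160 samples. Phase is kept
    continuous across bit boundaries so the tone switches without clicks.
    """
    samples_per_bit = sample_rate // baud
    phase = 0.0
    out = []
    for bit in bits:
        step = 2 * math.pi * tones[bit] / sample_rate
        for _ in range(samples_per_bit):
            out.append(math.sin(phase))
            phase += step
    return out
```

Because the two directions use disjoint frequency bands, both modems can transmit simultaneously over one voice channel, which is what makes the 103A full-duplex.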
AT&T reduced modem costs by introducing the answer-only 113B/C modems. For many years, the Bell System maintained a monopoly on the use of its phone lines and what devices could be connected to them. However, in the FCC's seminal Carterfone decision of 1968, the commission concluded that electronic devices could be connected to the telephone system as long as they used an acoustic coupler. Since most handsets were supplied by Western Electric and thus of a standard design, acoustic couplers were easy to build. Acoustically coupled Bell 103A-compatible 300 bit/s modems were common during the 1970s. Well-known models included the Novation CAT and the Anderson-Jacobson, the latter spun off from an in-house project at Stanford Research Institute. A lower-cost option was the Pennywhistle modem, designed to be built using parts from electronics scrap and surplus stores. In December 1972, Vadic introduced the VA3400, notable for full-duplex operation at 1,200 bit/s over the phone network. Like the 103A, it used different frequency bands for transmit and receive.
In November 1976, AT&T introduced the 212A modem to compete with Vadic. It used the lower frequency set for transmission. One could also use the 212A with a 103A modem at 300 bit/s. According to Vadic, the change in frequency assignments made the 212 intentionally incompatible with acoustic coupling, thereby locking out many potential modem manufacturers. In 1977, Vadic responded with the VA3467 triple modem, an answer-only modem sold to computer center operators that supported Vadic's 1,200-bit/s mode, AT&T's 212A mode, and 103A operation. The Hush-a-Phone decision had applied only to mechanical connections, but the Carterfone decision of 1968 led to the FCC introducing a rule setting stringent AT&T-designed tests for electronically coupling a device to the phone lines. This opened the door to direct-connect modems that plugged directly into the phone line rather than via a handset. However, the cost of passing the tests was considerable, so acoustically coupled modems remained common into the early 1980s.
The falling prices of electronics in the late 1970s led to an increasing number of direct-connect models around 1980. In spite of being directly connected, these modems were operated like their earlier acoustic versions – dialing and other phone-control operations were completed by hand, using an attached handset.
A floppy disk, also known as a floppy, diskette, or simply disk, is a type of disk storage composed of a disk of thin and flexible magnetic storage medium, sealed in a rectangular plastic enclosure lined with fabric that removes dust particles. Floppy disks are read and written by a floppy disk drive. Floppy disks, first as 8-inch media and later in 5 1⁄4-inch and 3 1⁄2-inch sizes, were a ubiquitous form of data storage and exchange from the mid-1970s into the first years of the 21st century. By 2006, computers were rarely manufactured with installed floppy disk drives; the formats are now handled mainly by older equipment. The prevalence of floppy disks in late-twentieth-century culture was such that many electronic devices and software programs still use the floppy disk as a save icon. While floppy disk drives still have some limited uses with legacy industrial computer equipment, they have been superseded by data storage methods with much greater capacity, such as USB flash drives, flash storage cards, portable external hard disk drives, optical discs, cloud storage, and storage available through computer networks.
The first commercial floppy disks, developed in the late 1960s, were 8 inches in diameter. These disks and their associated drives were produced and improved upon by IBM and other companies such as Memorex, Shugart Associates, and Burroughs Corporation. The term "floppy disk" appeared in print as early as 1970, and although IBM announced its first media as the "Type 1 Diskette" in 1973, the industry continued to use the terms "floppy disk" or "floppy". In 1976, Shugart Associates introduced the 5 1⁄4-inch FDD. By 1978 there were more than 10 manufacturers producing such FDDs. There were competing floppy disk formats, with hard- and soft-sectored versions and encoding schemes such as FM, MFM, M2FM, and GCR. The 5 1⁄4-inch format displaced the 8-inch one for most applications, and the hard-sectored disk format disappeared. The most common capacity of the 5 1⁄4-inch format in DOS-based PCs was 360 KB, for the double-sided double-density (DSDD) format using MFM encoding. In 1984, IBM introduced with its PC-AT model the 1.2 MB dual-sided 5 1⁄4-inch floppy disk, but it never became very popular.
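The 360 KB figure follows from the disk geometry. The sketch below assumes the standard DOS DSDD layout of 2 sides × 40 tracks × 9 sectors × 512 bytes, which the text does not spell out:

```python
def floppy_capacity_bytes(sides, tracks, sectors_per_track, bytes_per_sector):
    """Raw capacity of a floppy disk from its geometry."""
    return sides * tracks * sectors_per_track * bytes_per_sector

# Standard DOS 5 1/4-inch DSDD geometry (assumed; the text gives only the total):
capacity = floppy_capacity_bytes(sides=2, tracks=40,
                                 sectors_per_track=9, bytes_per_sector=512)
assert capacity == 368_640           # raw bytes
assert capacity == 360 * 1024        # i.e. exactly 360 KB
```

Doubling the track count and raising the sector count to 15, as the PC-AT's high-density format did, gives the 1.2 MB capacity mentioned above.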
IBM started using the 720 KB double-density 3 1⁄2-inch microfloppy disk on its Convertible laptop computer in 1986 and the 1.44 MB high-density version with the PS/2 line in 1987. These disk drives could be added to older PC models. In 1988, IBM introduced a drive for 2.88 MB "DSED" diskettes in its top-of-the-line PS/2 models, but this was a commercial failure. Throughout the early 1980s, limitations of the 5 1⁄4-inch format became clear. Designed to be more practical than the 8-inch format, it was itself too large. A number of solutions were developed, with drives at 2-, 2 1⁄2-, 3-, 3 1⁄4-, 3 1⁄2- and 4-inches offered by various companies. They all shared a number of advantages over the old format, including a rigid case with a sliding metal shutter over the head slot, which helped protect the delicate magnetic medium from dust and damage, and a sliding write-protection tab, far more convenient than the adhesive tabs used with earlier disks. The large market share of the well-established 5 1⁄4-inch format made it difficult for these diverse, mutually incompatible new formats to gain significant market share.
A variant on the Sony design, introduced in 1982 by a large number of manufacturers, was then rapidly adopted. The term floppy disk persisted even though the newer-style floppy disks have a rigid case around an internal flexible disk. By the end of the 1980s, 5 1⁄4-inch disks had been superseded by 3 1⁄2-inch disks. During this time, PCs came equipped with drives of both sizes. By the mid-1990s, 5 1⁄4-inch drives had disappeared, as the 3 1⁄2-inch disk became the predominant floppy disk. The advantages of the 3 1⁄2-inch disk were its higher capacity, its smaller size, and its rigid case, which provided better protection from dirt and other environmental risks. If a person touches the exposed disk surface of a 5 1⁄4-inch disk through the drive hole, fingerprints may foul the disk (and the disk drive head, if the disk is subsequently loaded into a drive), and it is easily possible to damage a disk of this type by folding or creasing it, rendering it at least partially unreadable. However, due to its simpler construction, the 5 1⁄4-inch disk's unit price was lower throughout its history, in the range of a third to a half that of a 3 1⁄2-inch disk.
Floppy disks became commonplace during the 1980s and 1990s in their use with personal computers to distribute software, transfer data, and create backups. Before hard disks became affordable to the general population, floppy disks were often used to store a computer's operating system. Most home computers from that period have an elementary OS and BASIC stored in ROM, with the option of loading a more advanced operating system from a floppy disk. By the early 1990s, increasing software size meant that large packages like Windows or Adobe Photoshop required a dozen disks or more. In 1996, there were an estimated five billion standard floppy disks in use. Distribution of larger packages was eventually replaced by CD-ROMs, DVDs, and online distribution.
The Zilog Z80 CPU is an 8-bit microprocessor. It was introduced by Zilog in 1976 as the startup company's first product. The Z80 was conceived by Federico Faggin in late 1974 and developed by him and his then-11 employees at Zilog from early 1975 until March 1976, when the first working samples were delivered. With the revenue from the Z80, the company built its own chip factories and grew to over a thousand employees over the following two years. The Z80 was a software-compatible extension and enhancement of the Intel 8080 and, like it, was aimed at embedded systems. According to the designers, the primary targets for the Z80 CPU were products like intelligent terminals, high-end printers, and advanced cash registers, as well as telecom equipment, industrial robots, and other kinds of automation equipment. The Z80 was introduced on the market in July 1976 and came to be used in general desktop computers running CP/M and other operating systems, as well as in the home computers of the 1980s.
It was also common in military applications, in musical equipment such as synthesizers, and in the computerized coin-operated video games of the late 1970s and early 1980s, the arcade machines or video game arcade cabinets. The Z80 was one of the most widely used CPUs in the home computer market from the late 1970s to the mid-1980s. Zilog licensed the Z80 to the US-based Synertek and Mostek, which had helped it with initial production, as well as to a European second-source manufacturer, SGS; the design was also copied by several Japanese, East European, and Soviet manufacturers. This won the Z80 acceptance in the world market, since large companies like NEC, Toshiba, and Hitachi started to manufacture the device. In recent decades Zilog has refocused on the ever-growing market for embedded systems, and the most recent Z80-compatible microcontroller family, the pipelined 24-bit eZ80 with a linear 16 MB address range, has been introduced alongside the simpler Z180 and Z80 products. The Z80 came about when physicist Federico Faggin left Intel at the end of 1974 to found Zilog with Ralph Ungermann.
At Fairchild Semiconductor, and later at Intel, Faggin had been working on fundamental transistor and semiconductor manufacturing technology. He developed the basic design methodology used for memories and microprocessors at Intel and led the work on the Intel 4004, the 8080, and several other ICs. Masatoshi Shima, the principal logic and transistor-level designer of the 4004 and the 8080 under Faggin's supervision, joined the Zilog team. By March 1976, Zilog had developed the Z80 as well as an accompanying assembler-based development system for its customers, and by July 1976 this was formally launched onto the market. Early Z80s were manufactured by Synertek and Mostek before Zilog had its own manufacturing factory ready in late 1976. These companies were chosen because they could do the ion implantation needed to create the depletion-mode MOSFETs that the Z80 design used as load transistors in order to cope with a single 5-volt power supply. Faggin designed the instruction set to be binary compatible with the Intel 8080, so that most 8080 code, notably the CP/M operating system and Intel's PL/M compiler for the 8080, would run unmodified on the new Z80 CPU.
Masatoshi Shima designed most of the microarchitecture as well as the gate and transistor levels of the Z80 CPU, assisted by a small number of engineers and layout people. CEO Federico Faggin was heavily involved in the chip layout work, together with two dedicated layout people; by his own account, Faggin worked 80 hours a week in order to meet the tight schedule set by the financial investors. The Z80 offered many improvements over the 8080:

- An enhanced instruction set including single-bit addressing; shifts/rotates on memory and on registers other than the accumulator; rotate instructions for BCD number strings in memory; program looping; program-counter-relative jumps; and block copy, block input/output, and byte search instructions.
- Better support for signed 8- and 16-bit arithmetic.
- New IX and IY index registers with instructions for direct base+offset addressing.
- A better interrupt system: a more automatic and general vectorized interrupt system (mode 2), intended for Zilog's line of counter/timer, DMA, and communications controllers, as well as a fixed-vector interrupt system (mode 1) for simple systems with minimal hardware.
- A non-maskable interrupt, which can be used to respond to power-down situations or other high-priority events.
- Two separate register files, which could be switched to speed up response to interrupts, such as fast asynchronous event handlers or a multitasking dispatcher. Although they were not intended as extra registers for general code, they were used that way in some applications.
- Less hardware required for power supply, clock generation, and interface to memory and I/O: a single 5-volt power supply, a single-phase 5 V clock, a built-in DRAM refresh mechanism, and non-multiplexed buses.
- A special reset function which clears only the program counter, so that a single Z80 CPU could be used in a
In telecommunications, an acoustic coupler is an interface device for coupling electrical signals by acoustical means, usually into and out of a telephone. The link is achieved by converting the electric signals from the phone line to sound and reconverting that sound back into the electric signals needed by the end terminal, such as a teletypewriter, rather than through a direct electrical connection. Prior to its breakup in 1984, the Bell System's legal monopoly over telephony in the United States allowed the company to impose strict rules on how consumers could access its network. Customers were only allowed to connect equipment sold by Bell to the network; the same set-up was operative in nearly all countries, where the telephone companies were nationally owned. In many households, telephones were hard-wired to wall terminals before connectors like RJ11 and BS 6312 became standardized; the situation was similar in other countries. In Australia, until 1975 the PMG, a government monopoly, owned all telephone wiring and equipment on user premises and prohibited the attachment of third-party devices. While most handsets were connected by 600 series connectors, these were peculiar to Australia, so imported equipment could not be directly connected in any case, despite the general electrical compatibility.
It was not until a landmark court ruling regarding the Hush-A-Phone in 1956 that the use of a phone attachment was allowed for the first time. A second decision in 1968 regarding the Carterfone further allowed any device not harmful to the system to be connected directly to the AT&T network; this decision enabled the proliferation of innovations like answering machines, fax machines, and modems. When inventors began developing devices to send non-voice signals over the telephone line, the need for a workaround for the Bell restrictions became apparent. As early as 1937, telefax machines used by newspapers employed some kind of couplers, some acoustic but more often magnetic, for single-directional communication. Multiplexed bidirectional telephone coupling was not needed by these early fax machines. Robert Weitbrecht created a workaround for the Bell restrictions in 1963: he developed a coupling device that converted sound from the earpiece of the telephone handset to electrical signals, and converted the electrical pulses coming from the teletypewriter to sound that went into the mouthpiece of the telephone handset.
His acoustic coupler is known as the Weitbrecht Modem. The Weitbrecht Modem inspired other engineers to develop modems that worked with 8-bit ASCII terminals at a faster rate. Such modems or couplers, which mimicked handset operations, were developed around 1966 by John van Geen at the Stanford Research Institute. An early commercial model was built by Livermore Data Systems in 1968. One would dial the computer system on one's phone and, when the connection was established, place the handset into the acoustic modem. Since the handsets were all supplied by the telephone company, most had the same shape, simplifying the physical interface. A microphone and a speaker inside the modem box would pick up and transmit the signaling tones, and circuitry would convert those frequency-shift-keyed audio signals into binary levels for an RS-232 output socket. With luck one could get 300 baud transmission rates; that speed was sufficient for typewriter-based terminals such as the IBM 2741, running at 134.5 baud, or a teleprinter, running at 110 baud.
The practical upper limit for acoustic-coupled modems was 1,200 baud, first made available in 1973 by Vadic and in 1977 by AT&T. 1,200-baud endpoints became widespread in 1985 with the advent of the Hayes Smartmodem 1200A, though it used an RJ11 jack and was not an acoustic coupler. Such devices facilitated the creation of dial-up bulletin board systems, a forerunner of modern internet chat rooms, message boards, and e-mail. A standard telephone handset was placed into a cradle engineered to fit around the microphone and earpiece of the handset. A modem would modulate a loudspeaker in the cup attached to the handset's microphone, and sound from the loudspeaker in the telephone handset's earpiece would be picked up by a microphone in the cup attached to the earpiece. In this way signals could be passed in both directions. Acoustic couplers were sensitive to external noise and depended on the widespread standardization of the dimensions of telephone handsets. Direct electrical connections to telephone networks, once they were made legal, became the preferred method of attaching modems, and the use of acoustic couplers dwindled.
Acoustic couplers were still used until at least the late 1990s by people travelling in areas of the world where electrical connection to the telephone network was illegal or impractical. Many models of TDDs still have a built-in acoustic coupler, which allows more universal use with pay phones and for 911 calls by deaf people. An acoustic coupler is prominently shown early in the 1983 film "WarGames", when the character David Lightman places a telephone handset into the cradle of a film-prop acoustic modem to accentuate the act of using telephone lines for interconnection to the developing computer networks of the period, in this case a military command computer. The earliest major motion picture depicting an acoustic coupler was the 1968 Steve McQueen film Bullitt.

See also: Carterfone; Federal Standard 1037C; MIL-STD-188; Telecommunications device for the deaf; 1964 Li
MS-DOS is an operating system for x86-based personal computers developed by Microsoft. Collectively, MS-DOS, its rebranding as IBM PC DOS, and some operating systems attempting to be compatible with MS-DOS are sometimes referred to as "DOS". MS-DOS was the main operating system for IBM PC compatible personal computers during the 1980s and the early 1990s, when it was gradually superseded by operating systems offering a graphical user interface, in particular by various generations of the graphical Microsoft Windows operating system. MS-DOS originated as 86-DOS, an operating system written by Seattle Computer Products; Microsoft acquired the rights to it in order to meet IBM's specifications. IBM released it on August 12, 1981 as PC DOS 1.0 for use in its PCs. Although MS-DOS and PC DOS were initially developed in parallel by Microsoft and IBM, the two products diverged after twelve years, in 1993, with recognizable differences in compatibility and capabilities. During its lifetime, several competing products were released for the x86 platform, and MS-DOS itself went through eight versions, until development ceased in 2000.
MS-DOS was targeted at Intel 8086 processors running on computer hardware using floppy disks to store and access not only the operating system, but application software and user data as well. Progressive version releases delivered support for other mass storage media in greater sizes and formats, along with added feature support for newer processors and evolving computer architectures. MS-DOS was the key product in Microsoft's growth from a programming language company to a diverse software development firm, providing the company with essential revenue and marketing resources. It was also the underlying basic operating system on which early versions of Windows ran as a GUI; it is a flexible operating system and consumes negligible installation space. MS-DOS was a renamed form of 86-DOS, owned by Seattle Computer Products and written by Tim Paterson. Development of 86-DOS took only six weeks, as it was essentially a clone of Digital Research's CP/M, ported to run on 8086 processors and with two notable differences compared to CP/M.
This first version was shipped in August 1980. Microsoft, which needed an operating system for the IBM Personal Computer, hired Tim Paterson in May 1981 and bought 86-DOS 1.10 for $75,000 in July of the same year. Microsoft kept the version number but renamed it MS-DOS, and licensed MS-DOS 1.10/1.14 to IBM, which, in August 1981, offered it as PC DOS 1.0 as one of three operating systems for the IBM 5150, the original IBM PC. Within a year Microsoft licensed MS-DOS to over 70 other companies. MS-DOS was designed to be an OS that could run on any 8086-family computer. Each computer would have its own distinct hardware and its own version of MS-DOS, similar to the situation that existed for CP/M, with MS-DOS emulating the same solution as CP/M to adapt to different hardware platforms. To this end, MS-DOS was designed with a modular structure: internal device drivers, minimally for primary disk drives and the console, integrated with the kernel and loaded by the boot loader, and installable device drivers for other devices loaded and integrated at boot time.
The OEM would use a development kit provided by Microsoft to build a version of MS-DOS with their basic I/O drivers and a standard Microsoft kernel, which they would supply on disk to end users along with the hardware. Thus, there were many different versions of "MS-DOS" for different hardware, and there is a major distinction between an IBM-compatible machine and an MS-DOS machine. Some machines, like the Tandy 2000, were MS-DOS compatible but not IBM-compatible, so they could run software written for MS-DOS without dependence on the peripheral hardware of the IBM PC architecture. This design would have worked well for compatibility if application programs had only used MS-DOS services to perform device I/O, and indeed the same design philosophy is embodied in Windows NT. However, in MS-DOS's early days, the greater speed attainable by programs through direct control of hardware was of particular importance, especially for games, which pushed the limits of their contemporary hardware. Soon an IBM-compatible architecture became the goal, and before long all 8086-family computers closely emulated IBM's hardware, so that only a single version of MS-DOS for a fixed hardware platform was needed for the market.
This version is the version of MS-DOS discussed here, as the dozens of other OEM versions of "MS-DOS" were only relevant to the systems they were designed for, and in any case were similar in function and capability to some standard version for the IBM PC (often the same-numbered version, but not always, since some OEMs used their own proprietary version-numbering schemes), with a few notable exceptions. Microsoft omitted multi-user support from MS-DOS because Xenix, Microsoft's Unix-based operating system, was multi-user. The company planned, over time, to improve MS-DOS so it would be almost indistinguishable from single-user Xenix, or XEDOS, which would also run on the Motorola 68000, the Zilog Z8000, and the LSI-11. Microsoft advertised MS-DOS and Xenix together, listing the shared features of its "single-user OS" and "the multi-user, multi-tasking, UNIX-derived operating system", promising easy
Commodore International was an American home computer and electronics manufacturer founded by Jack Tramiel. Commodore International, along with its subsidiary Commodore Business Machines, participated in the development of the home and personal computer industry in the 1970s and 1980s. The company developed and marketed the world's best-selling desktop computer, the Commodore 64, and released its Amiga computer line in July 1985. With quarterly sales ending 1983 of $49 million, Commodore was one of the world's largest personal computer manufacturers. The company that would become Commodore Business Machines, Inc. was founded in 1954 in Toronto as the Commodore Portable Typewriter Company by Polish-Jewish immigrant and Auschwitz survivor Jack Tramiel. For a few years he had been living in New York, driving a taxicab and running a small business repairing typewriters, when he managed to sign a deal with a Czechoslovakian company to manufacture their designs in Canada; he then moved to Toronto to start production.
By the late 1950s a wave of Japanese machines forced most North American typewriter companies to cease business, but Tramiel instead turned to adding machines. In 1955, the company was formally incorporated in Canada as Commodore Business Machines, Inc. In 1962 Commodore went public on the New York Stock Exchange, under the name of Commodore International Limited. In the late 1960s, history repeated itself when Japanese firms started producing and exporting adding machines. The company's main investor and chairman, Irving Gould, suggested that Tramiel travel to Japan to learn how to compete. Instead, Tramiel returned with the new idea to produce electronic calculators, which were just coming on the market. Commodore soon had a profitable calculator line and was one of the more popular brands in the early 1970s, producing both consumer and scientific/programmable calculators. However, in 1975, Texas Instruments, the main supplier of calculator parts, entered the market directly and put out a line of machines priced at less than Commodore's cost for the parts.
Commodore obtained an infusion of cash from Gould, which Tramiel used beginning in 1976 to purchase several second-source chip suppliers, including MOS Technology, Inc., in order to assure his supply. He agreed to buy MOS, which was having troubles of its own, only on the condition that its chip designer Chuck Peddle join Commodore directly as head of engineering. Through the 1970s Commodore also produced numerous peripherals and consumer electronic products, such as the Chessmate, a chess computer based around a MOS 6504 chip, released in 1978. In December 2007, when Tramiel was visiting the Computer History Museum in Mountain View for the 25th anniversary of the Commodore 64, he was asked why he had called his company Commodore. He said: "I wanted to call my company General, but there's so many Generals in the U.S.: General Electric, General Motors. I went to Admiral, but that was taken. So I wind up in Berlin, with my wife, we were in a cab, the cab made a short stop, and in front of us was an Opel Commodore." Tramiel gave this account in many interviews, but Opel's Commodore didn't debut until 1967, years after the company had been named.
Once Chuck Peddle had taken over engineering at Commodore, he convinced Jack Tramiel that calculators were a dead end and that they should turn their attention to home computers. Peddle packaged his single-board computer design in a metal case with a keyboard (initially using calculator keys, later a full-travel QWERTY keyboard), a monochrome monitor, and a tape recorder for program and data storage, to produce the Commodore PET. From the PET's 1977 debut, Commodore would be a computer company. Commodore had been reorganized the year before into Commodore International, Ltd., moving its financial headquarters to the Bahamas and its operational headquarters to West Chester, Pennsylvania, near the MOS Technology site. The operational headquarters, where research and development of new products occurred, retained the name Commodore Business Machines, Inc. In 1980 Commodore launched production for the European market in Braunschweig. By 1980, Commodore was one of the three largest microcomputer companies, and the largest in the Common Market.
The company had lost its early domestic-market sales leadership, however. BYTE stated of the business computer market that "the lack of a marketing strategy by Commodore, as well as its past nonchalant attitude toward the encouragement and development of good software, has hurt its credibility in comparison to the other systems on the market". The author of Programming the PET/CBM stated in its introduction that "CBM's product manuals are recognized to be unhelpful". Commodore reemphasized the US market with the VIC-20. The PET computer line was used in schools, where its tough all-metal construction and ability to share printers and disk drives on a simple local area network were advantages, but PETs did not compete well in the home setting, where graphics and sound were important. This was addressed with the VIC-20, introduced in 1981 at a cost of US$299 and sold in retail stores. Commodore ran aggressive advertisements featuring William Shatner asking consumers, "Why buy just a video game?"
The strategy worked, and the VIC-20 became the first computer to ship more than one million units. A total of 2.5 million units were sold over the machine's lifetime, and the VIC-20 also boosted Commodore's sales to Canadian schools. In another promotion aimed at schools (and as a