Random-access memory is a form of computer data storage that stores data and machine code currently being used. A random-access memory device allows data items to be read or written in almost the same amount of time irrespective of the physical location of data inside the memory. In contrast, with other direct-access data storage media such as hard disks, CD-RWs, DVD-RWs and the older magnetic tapes and drum memory, the time required to read and write data items varies significantly depending on their physical locations on the recording medium, due to mechanical limitations such as media rotation speeds and arm movement. RAM contains multiplexing and demultiplexing circuitry to connect the data lines to the addressed storage for reading or writing the entry. Usually more than one bit of storage is accessed by the same address, and RAM devices often have multiple data lines and are said to be "8-bit" or "16-bit", etc. devices. In today's technology, random-access memory takes the form of integrated circuits. RAM is normally associated with volatile types of memory, where stored information is lost if power is removed, although non-volatile RAM has also been developed.
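The defining property described above can be sketched in a few lines of code. The following is a minimal illustration (not a hardware model, and the class name is invented for this example): every address selects one storage word directly, so reads and writes cost the same regardless of location, and the masking shows what an "8-bit" data width means.

```python
class SimpleRAM:
    """An "8-bit" RAM sketch: each address selects one 8-bit word."""

    def __init__(self, address_bits: int):
        self.size = 1 << address_bits          # e.g. 10 address lines -> 1024 words
        self.cells = [0] * self.size           # the storage array

    def read(self, address: int) -> int:
        return self.cells[address]             # constant-time, any address

    def write(self, address: int, value: int) -> None:
        self.cells[address] = value & 0xFF     # mask to the 8-bit data width

ram = SimpleRAM(address_bits=10)
ram.write(0x3FF, 0x1A2)                        # a value wider than 8 bits is truncated
print(ram.read(0x3FF))                         # -> 162 (0xA2)
```

Accessing address 0x000 or 0x3FF takes the same single lookup, which is exactly the contrast with tape or drum storage drawn above.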
Other types of non-volatile memories exist that allow random access for read operations, but either do not allow write operations or have other kinds of limitations on them. These include most types of ROM and a type of flash memory called NOR-Flash. Integrated-circuit RAM chips came into the market in the early 1970s, with the first commercially available DRAM chip, the Intel 1103, introduced in October 1970. Early computers used relays, mechanical counters or delay lines for main memory functions. Ultrasonic delay lines could only reproduce data in the order in which it was written. Drum memory could be expanded at low cost, but efficient retrieval of memory items required knowledge of the physical layout of the drum to optimize speed. Latches built out of vacuum tube triodes, and later out of discrete transistors, were used for smaller and faster memories such as registers; such registers were relatively large and too costly to use for large amounts of data.
It stored data as electrically charged spots on the face of a cathode-ray tube. Since the electron beam of the CRT could read and write the spots on the tube in any order, memory was random access; the capacity of the Williams tube was a few hundred to around a thousand bits, but it was much smaller and more power-efficient than using individual vacuum tube latches. Developed at the University of Manchester in England, the Williams tube provided the medium on which the first electronically stored program was implemented in the Manchester Baby computer, which first ran a program on 21 June 1948. In fact, rather than the Williams tube memory being designed for the Baby, the Baby was a testbed to demonstrate the reliability of the memory. Magnetic-core memory was developed up until the mid-1970s and became a widespread form of random-access memory, relying on an array of magnetized rings. By changing the sense of each ring's magnetization, data could be stored, with one bit stored per ring. Since every ring had a combination of address wires to select and read or write it, access to any memory location in any sequence was possible.
Magnetic core memory was the standard form of memory system until displaced by solid-state memory in integrated circuits, starting in the early 1970s. Dynamic random-access memory allowed replacement of a 4 or 6-transistor latch circuit by a single transistor for each memory bit, greatly increasing memory density at the cost of volatility. Data was stored in the tiny capacitance of each transistor and had to be periodically refreshed every few milliseconds before the charge could leak away. The Toshiba Toscal BC-1411 electronic calculator, introduced in 1965, used a form of DRAM built from discrete components. DRAM was developed by Robert H. Dennard in 1968. Prior to the development of integrated read-only memory circuits, permanent random-access memory was constructed using diode matrices driven by address decoders, or specially wound core rope memory planes. The two widely used forms of modern RAM are static RAM (SRAM) and dynamic RAM (DRAM). In SRAM, a bit of data is stored using the state of a six-transistor memory cell.
This form of RAM is more expensive to produce, but is generally faster and requires less dynamic power than DRAM. In modern computers, SRAM is often used as cache memory for the CPU. DRAM stores a bit of data using a transistor and capacitor pair, which together comprise a DRAM cell; the capacitor holds a high or low charge, and the transistor acts as a switch that lets the control circuitry on the chip read the capacitor's state of charge or change it. As this form of memory is less expensive to produce than static RAM, it is the predominant form of computer memory used in modern computers. Both static and dynamic RAM are considered volatile, as their state is lost or reset when power is removed from the system. By contrast, read-only memory stores data by permanently enabling or disabling selected transistors, such that the memory cannot be altered. Writeable variants of ROM share properties of both ROM and RAM, enabling data to persist without power and to be updated without requiring special equipment; these persistent forms of semiconductor ROM include USB flash drives, memory cards for cameras and portable devices, and solid-state drives.
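The refresh requirement that distinguishes DRAM from SRAM can be illustrated with a toy model. In this sketch a cell is a capacitor whose charge decays each "tick", and a refresh cycle rewrites the stored bit before the charge falls below the read threshold; the decay rate and threshold are arbitrary values chosen only to show the mechanism, not real device parameters.

```python
LEAK_PER_TICK = 0.90      # fraction of charge remaining after each tick (arbitrary)
READ_THRESHOLD = 0.5      # charge above this level reads as a 1

class DRAMCell:
    def __init__(self):
        self.charge = 0.0

    def write(self, bit: int):
        self.charge = 1.0 if bit else 0.0

    def read(self) -> int:
        return 1 if self.charge > READ_THRESHOLD else 0

    def tick(self):
        self.charge *= LEAK_PER_TICK   # charge leaks away over time

    def refresh(self):
        self.write(self.read())        # read the bit and rewrite it at full charge

cell = DRAMCell()
cell.write(1)
for _ in range(10):
    cell.tick()                        # no refresh: 0.9**10 ~ 0.35 < threshold
print(cell.read())                     # -> 0, the stored 1 has been lost

cell.write(1)
for _ in range(10):
    cell.tick()
    cell.refresh()                     # refreshed each tick, the bit survives
print(cell.read())                     # -> 1
```

An SRAM latch, by contrast, holds its state actively as long as power is applied and needs no such refresh circuitry, which is part of why it is faster but takes more transistors per bit.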
ECC memory includes special circuitry to detect and/or correct random faults (memory errors) in the stored data, using parity bits or error correction codes.
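One classic error-correcting code of the kind ECC memory builds on is the Hamming(7,4) code, shown below as an illustration (real ECC DIMMs use wider SECDED codes over 64-bit words, but the principle is the same): 4 data bits are protected by 3 parity bits, and any single flipped bit can be located and corrected.

```python
def encode(d):
    """d: list of 4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    """c: 7-bit codeword, possibly with one flipped bit -> corrected 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 0 = no error, else 1-based error position
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1          # flip the faulty bit back
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
code = encode(word)
code[4] ^= 1                          # simulate a random single-bit memory fault
print(decode(code))                   # -> [1, 0, 1, 1], the fault is corrected
```

The syndrome computed from the parity checks points directly at the position of the flipped bit, which is what lets the memory controller repair a single-bit fault transparently.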
Cromemco was a Mountain View, California microcomputer company known for its high-end Z80-based S-100 bus computers and peripherals in the early days of the personal computer revolution. The company began as a partnership in 1974 between Harry Garland and Roger Melen, two Stanford Ph.D. students, and was named for their residence at Stanford University. Cromemco was incorporated in 1976; its first products were the Cromemco Cyclops digital camera and the Cromemco Dazzler color graphics interface - both groundbreaking at the time - before the company moved on to making computer systems. In December 1981 Inc. Magazine named Cromemco one of the top ten fastest-growing privately held companies in the U.S. The collaboration that was to become Cromemco began in 1970 when Harry Garland and Roger Melen, graduate students at Stanford University, began working on a series of articles for Popular Electronics magazine. These articles described construction projects for the electronic hobbyist. Since it was sometimes difficult for the hobbyist to find the needed parts for these projects, Garland and Melen licensed third-party suppliers to provide kits of parts.
A kit for one of these projects, an “Op Amp Tester”, was sold by a company called MITS, which would later launch a revolutionary microcomputer on the cover of Popular Electronics. In 1974 Roger Melen was visiting the New York editorial offices of Popular Electronics, where he saw a prototype of the MITS Altair microcomputer. Melen was so impressed with this machine that he changed his return flight to California to go through Albuquerque, where he met with Ed Roberts, the president of MITS. At that meeting Roberts encouraged Melen to develop add-on products for the Altair, beginning with the Cyclops digital camera, slated to appear in the February 1975 issue of Popular Electronics. On returning to California, Melen and Garland formed a partnership to produce the Cyclops camera and future microcomputer products; they named the company “Cromemco” after the Stanford dorm where they first began their collaboration. Melen and Garland began work on the Cyclops camera interface for the Altair, and this spawned several other projects for their young company.
There was no convenient way to store software for the Altair, other than on punched paper tape. To remedy this problem Melen and Garland went to work on designing a programmable read-only memory card they called the “Bytesaver.” The Bytesaver could hold a resident program that allowed the computer to function when it was powered up, without having to first manually load a bootstrap program, and it proved to be a popular peripheral. There was also no way to see a Cyclops image stored in the Altair, so work began on a graphics interface card; this card, called the Dazzler, was introduced in the February 1976 issue of Popular Electronics. One use for an Altair computer with a Dazzler was to play games, but there was no way to interface a game joystick to the Altair. So the next project was to design a joystick console and an interface card, the D+7A, that supported an 8-bit digital channel and 7 analog channels. The D+7A could do much more than just interface a joystick; it was this card that allowed the Altair to be connected to the world of data acquisition and industrial computing.
Cromemco called themselves “Specialists in Computer Peripherals” and had a reputation for innovative designs and quality construction. They were, however, just a few steps away from offering their own computer system based on the Altair computer bus structure, which Garland and Melen named the "S-100 bus". The first computer released by Cromemco was the Z-1 in August 1976. The Z-1 came with 8K of static RAM and used the same chassis as the IMSAI 8080, but featured the Z80 microprocessor rather than the IMSAI computer's Intel 8080 chip. The Z-1 was succeeded by the Z-2 in June 1977, which featured 64K of RAM and the ability to run Cromemco DOS, a CP/M-like operating system. The Z-2 added a parallel interface in addition to an RS-232C serial port and no longer included the large panel of switches that had been part of the Z-1 model. Cromemco re-packaged their systems to produce the System One, followed by the larger System Two and System Three. The System Three, announced in 1978, was capable of running both FORTRAN IV and Z80 BASIC programming languages.
The System Three was designed for multiuser professional use and included an optional hard disk, CRT terminal and the main computer unit. In 1979, Cromemco released CROMIX, the first Unix-like operating system for microcomputers. CROMIX ran on the System Three and would later run on Cromemco systems using the Motorola 68000 series of microprocessors. In 1982, Cromemco introduced a Motorola 68000 CPU card for their systems; called the DPU, it was a dual-processor card with both a Motorola 68000 and a Zilog Z-80 processor. Their System One, Two and Three computers evolved into the 100-series, 200-series and 300-series respectively, and a 400-series was additionally introduced in a tower-style case. The DPU was followed by the increasingly capable XPU and XXU cards based on the Motorola 68000 family of processors. Cromemco introduced the C-10 personal computer in 1982, a Z-80 floppy-disk-based system for the low end of the market. By 1983, Cromemco had annual revenues of US$55 million. The company was wholly owned by Garland and Melen until it was sold to Dynatech in 1987 as a supplier to their ColorGraphics Weather Systems subsidiary.
The European division of Cromemco reorganized as Cromemco AG and was in liquidation in 2018.
A desktop computer is a personal computer designed for regular use at a single location on or near a desk or table due to its size and power requirements. The most common configuration has a case that houses the power supply, motherboard and disk storage; the case may be oriented horizontally or vertically and placed either underneath, beside, or on top of a desk. Prior to the widespread use of microprocessors, a computer that could fit on a desk was considered remarkably small: early computers took up the space of a whole room, and minicomputers fit into one or a few refrigerator-sized racks. It was not until the 1970s that programmable computers appeared that could fit on top of a desk. 1970 saw the introduction of the Datapoint 2200, a "smart" computer terminal complete with keyboard and monitor that was designed to connect with a mainframe computer, but that didn't stop owners from using its built-in computational abilities as a stand-alone desktop computer. The HP 9800 series, which started out as programmable calculators in 1971 but was programmable in BASIC by 1972, used a smaller version of a minicomputer design based on ROM memory; it had small one-line LED alphanumeric displays and displayed graphics with a plotter.
The Wang 2200 of 1973 had cassette tape storage. The IBM 5100 in 1975 had a small CRT display and could be programmed in BASIC and APL; these were expensive specialized computers sold for business or scientific uses. The Apple II, TRS-80 and Commodore PET were first-generation personal home computers launched in 1977, aimed at the consumer market – rather than at businessmen or computer hobbyists. Byte magazine referred to these three as the "1977 Trinity" of personal computing. Throughout the 1980s and 1990s, desktop computers became the predominant type, the most popular being the IBM PC and its clones, followed by the Apple Macintosh, with the third-placed Commodore Amiga having some success in the mid-1980s but declining by the early 1990s. Early personal computers, like the original IBM Personal Computer, were enclosed in a "desktop case", horizontally oriented to have the display screen placed on top, thus saving space on the user's actual desk, although these cases had to be sturdy enough to support the weight of the CRT displays that were widespread at the time.
Over the course of the 1990s, desktop cases became less common than the more-accessible tower cases, which may be located on the floor under or beside a desk rather than on a desk. Not only do these tower cases have more room for expansion, they have freed up desk space for monitors, which were becoming larger every year. Desktop cases, particularly in compact form factors, remain popular for corporate computing environments and kiosks; some computer cases can be interchangeably positioned either horizontally or upright. Influential games such as Doom and Quake during the 1990s had pushed gamers and enthusiasts to upgrade to the latest CPUs and graphics cards for their desktops in order to run these applications, though this has slowed since the late 2000s as the growing popularity of Intel integrated graphics forced game developers to scale back. Creative Technology's Sound Blaster series were a de facto standard for sound cards in desktop PCs during the 1990s until the early 2000s, when they were reduced to a niche product, as OEM desktop PCs came with sound boards integrated directly onto the motherboard.
While desktops have long been the most common configuration for PCs, by the mid-2000s growth shifted from desktops to laptops. Notably, while desktops were mainly produced in the United States, laptops had long been produced by contract manufacturers based in Asia, such as Foxconn; this shift led to the closure of many desktop assembly plants in the United States by 2010. Another trend around this time was the increasing proportion of inexpensive base-configuration desktops being sold, hurting PC manufacturers such as Dell, whose build-to-order customization of desktops relied on upselling added features to buyers. Battery-powered portable computers had just 2% worldwide market share in 1986. However, laptops have since become popular, both for business and personal use. Around 109 million notebook PCs shipped worldwide in 2007, a growth of 33% compared to 2006. In 2008, it was estimated that 145.9 million notebooks were sold, and that the number would grow in 2009 to 177.7 million. The third quarter of 2008 was the first time when worldwide notebook PC shipments exceeded desktops, with 38.6 million units versus 38.5 million units.
The sales breakdown of the Apple Macintosh has shown sales of desktop Macs staying constant while being surpassed by Mac notebooks, whose sales rate has grown considerably. The change in sales of form factors is due to the desktop iMac moving from affordable to upscale, with subsequent releases considered premium all-in-ones. By contrast, the MSRP of the MacBook laptop lines has dropped through successive generations, such that the MacBook Air and MacBook Pro constitute the lowest price of entry to a Mac, with the exception of the even more inexpensive Mac Mini.
A microprocessor is a computer processor that incorporates the functions of a central processing unit on a single integrated circuit, or at most a few integrated circuits. The microprocessor is a multipurpose, clock-driven, register-based, digital integrated circuit that accepts binary data as input, processes it according to instructions stored in its memory, and provides results as output. Microprocessors contain both combinational and sequential digital logic, and operate on symbols represented in the binary number system. The integration of a whole CPU onto a single or a few integrated circuits greatly reduced the cost of processing power. Integrated circuit processors are produced in large numbers by highly automated processes, resulting in a low unit price. Single-chip processors also increase reliability because there are many fewer electrical connections that could fail. As microprocessor designs improve, the cost of manufacturing a chip (with smaller components built on a semiconductor chip the same size) generally stays the same, according to Rock's law. Before microprocessors, small computers had been built using racks of circuit boards with many medium- and small-scale integrated circuits.
Microprocessors combined this into one or a few large-scale ICs. Continued increases in microprocessor capacity have since rendered other forms of computers almost completely obsolete, with one or more microprocessors used in everything from the smallest embedded systems and handheld devices to the largest mainframes and supercomputers. The complexity of an integrated circuit is bounded by physical limitations on the number of transistors that can be put onto one chip, the number of package terminations that can connect the processor to other parts of the system, the number of interconnections it is possible to make on the chip, and the heat that the chip can dissipate. Advancing technology makes more powerful chips feasible to manufacture. A minimal hypothetical microprocessor might include only an arithmetic logic unit and a control logic section. The ALU performs addition, subtraction, and operations such as AND or OR. Each operation of the ALU sets one or more flags in a status register, which indicate the results of the last operation.
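The flag-setting behaviour described above can be sketched as a tiny function. This is an illustrative model only (the operation names and flag set are chosen for the example, not taken from any particular processor): after each operation the zero and carry flags record properties of the result, which the control logic can later test, for instance when deciding a conditional branch.

```python
def alu(op, a, b):
    """Perform one 8-bit ALU operation; return (result, flags)."""
    if op == "ADD":
        raw = a + b
    elif op == "AND":
        raw = a & b
    elif op == "OR":
        raw = a | b
    else:
        raise ValueError(f"unknown operation: {op}")
    result = raw & 0xFF                       # keep only the 8-bit result
    flags = {
        "zero": result == 0,                  # set when the result is zero
        "carry": raw > 0xFF,                  # set when ADD overflowed 8 bits
    }
    return result, flags

print(alu("ADD", 0xF0, 0x10))                 # -> (0, {'zero': True, 'carry': True})
```

Here 0xF0 + 0x10 = 0x100, which does not fit in 8 bits: the result register reads 0, and both the zero and carry flags are set, exactly the kind of status information a status register exposes.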
The control logic retrieves instruction codes from memory and initiates the sequence of operations required for the ALU to carry out the instruction. A single operation code might affect many individual data paths and other elements of the processor. As integrated circuit technology advanced, it was feasible to manufacture more and more complex processors on a single chip. The size of data objects became larger, and additional features were added to the processor architecture. Floating-point arithmetic, for example, was often not available on 8-bit microprocessors, but had to be carried out in software. Integration of the floating point unit, first as a separate integrated circuit and then as part of the same microprocessor chip, sped up floating point calculations. Physical limitations of integrated circuits sometimes made such practices as a bit-slice approach necessary. Instead of processing all of a long word on one integrated circuit, multiple circuits in parallel processed subsets of each data word. While this required extra logic to handle, for example, carry and overflow within each slice, the result was a system that could handle, for example, 32-bit words using integrated circuits with a capacity for only four bits each.
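The bit-slice arrangement can be sketched in software. In this illustration (function names invented for the example), a 32-bit addition is carried out by eight 4-bit "slices", each passing its carry to the next, mirroring how bit-slice processors chained narrow ICs into wide words.

```python
SLICE_BITS = 4
MASK = (1 << SLICE_BITS) - 1

def slice_add(a4, b4, carry_in):
    """One 4-bit ALU slice: add two 4-bit operands plus a carry-in."""
    total = a4 + b4 + carry_in
    return total & MASK, total >> SLICE_BITS   # (4-bit sum, carry-out)

def wide_add(a, b, width=32):
    """Add two wide words using only 4-bit slices chained by their carries."""
    result, carry = 0, 0
    for i in range(width // SLICE_BITS):       # process one slice at a time
        a4 = (a >> (i * SLICE_BITS)) & MASK
        b4 = (b >> (i * SLICE_BITS)) & MASK
        s, carry = slice_add(a4, b4, carry)    # the carry ripples between slices
        result |= s << (i * SLICE_BITS)
    return result & ((1 << width) - 1)

print(hex(wide_add(0x89ABCDEF, 0x12345678)))   # -> 0x9be02467
```

The extra logic the text mentions corresponds to the carry path between `slice_add` calls: each 4-bit device only sees its own nibble, and correctness of the 32-bit result depends entirely on the carries being chained in order.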
The ability to put large numbers of transistors on one chip makes it feasible to integrate memory on the same die as the processor. This CPU cache has the advantage of faster access than off-chip memory and increases the processing speed of the system for many applications. Processor clock frequency has increased more rapidly than external memory speed, so cache memory is necessary if the processor is not to be delayed by slower external memory. A microprocessor is a general-purpose entity, and several specialized processing devices have followed it: a digital signal processor is specialized for signal processing; graphics processing units are processors designed for realtime rendering of images; other specialized units exist for video processing and machine vision. Microcontrollers integrate a microprocessor with peripheral devices for use in embedded systems. Systems on chip integrate one or more microprocessor or microcontroller cores with other components. Microprocessors can be selected for differing applications based on their word size, which is a measure of their complexity.
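Why an on-die cache speeds things up can be shown with a small simulation. The sketch below (class and sizes invented for the example) models a direct-mapped cache: repeated accesses to a few addresses are served from the fast cache lines, and only the first touch of each address falls through to "slow" main memory.

```python
CACHE_LINES = 8

class DirectMappedCache:
    def __init__(self, memory):
        self.memory = memory                   # the slow backing store
        self.lines = [None] * CACHE_LINES      # each line holds (address, value)
        self.hits = self.misses = 0

    def read(self, address):
        line = address % CACHE_LINES           # index a line by low address bits
        entry = self.lines[line]
        if entry is not None and entry[0] == address:
            self.hits += 1                     # fast path: data already on-die
            return entry[1]
        self.misses += 1                       # slow path: fetch from memory
        value = self.memory[address]
        self.lines[line] = (address, value)    # keep a copy for next time
        return value

memory = list(range(100))
cache = DirectMappedCache(memory)
for _ in range(10):                            # a hot loop touching few addresses
    for addr in (0, 1, 2):
        cache.read(addr)
print(cache.hits, cache.misses)                # -> 27 3
```

Thirty accesses cost only three trips to main memory; since the processor clock outpaces external memory, that hit ratio is what keeps the CPU from stalling.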
Longer word sizes allow each clock cycle of a processor to carry out more computation, but correspond to physically larger integrated circuit dies with higher standby and operating power consumption. 4-, 8- or 12-bit processors are widely integrated into microcontrollers operating embedded systems. Where a system is expected to handle larger volumes of data or require a more flexible user interface, 16-, 32- or 64-bit processors are used. An 8- or 16-bit processor may be selected over a 32-bit processor for system-on-a-chip or microcontroller applications that require low-power electronics, or are part of a mixed-signal integrated circuit with noise-sensitive on-chip analog electronics such as high-resolution analog-to-digital converters, or both. Running 32-bit arithmetic on an 8-bit chip could end up using more power, as the chip must execute software with multiple instructions. Thousands of items that were traditionally not computer-related now include microprocessors.
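The cost of wide arithmetic on a narrow chip can be made concrete. In this rough illustration (the function is invented for the example), a 32-bit addition is emulated as a sequence of 8-bit add-with-carry steps, so what a 32-bit processor does in one instruction expands into several on an 8-bit one.

```python
def add32_on_8bit(a, b):
    """Add two 32-bit values using only 8-bit operations plus a carry flag."""
    result, carry, steps = 0, 0, 0
    for byte in range(4):                       # least-significant byte first
        a8 = (a >> (8 * byte)) & 0xFF
        b8 = (b >> (8 * byte)) & 0xFF
        total = a8 + b8 + carry                 # one 8-bit add-with-carry step
        result |= (total & 0xFF) << (8 * byte)
        carry = total >> 8
        steps += 1                              # count the emulation steps
    return result & 0xFFFFFFFF, steps

value, steps = add32_on_8bit(0xFFFF0001, 0x00010001)
print(hex(value), steps)                        # -> 0x2 4  (result wraps modulo 2**32)
```

Each of those four steps is at least one fetched and executed instruction on a real 8-bit CPU (often more, with loads and stores), which is why emulated wide arithmetic can consume more energy than running it natively on a wider core.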
The Compact Cassette, Compact Audio Cassette or Musicassette, commonly called the cassette tape or simply tape or cassette, is an analog magnetic tape recording format for audio recording and playback. It was developed by Philips in Hasselt, Belgium, and released in 1963. Compact cassettes come in two forms, either containing content as a prerecorded cassette or as a fully recordable "blank" cassette. Both forms are reversible by the user. The compact cassette technology was originally designed for dictation machines, but improvements in fidelity led the Compact Cassette to supplant the Stereo 8-track cartridge and reel-to-reel tape recording in most non-professional applications. Its uses ranged from portable audio to home recording to data storage for early microcomputers. The first cassette player designed for use in car dashboards was introduced in 1968. Between the early 1970s and the early 2000s, the cassette was one of the two most common formats for prerecorded music, first alongside the LP record and later the compact disc.
Compact Cassettes contain two miniature spools, between which the magnetically coated, polyester-type plastic film is passed and wound. These spools and their attendant parts are held inside a protective plastic shell, 4 by 2.5 by 0.5 inches at its largest dimensions. The tape itself is commonly referred to as "eighth-inch" tape, supposedly 1⁄8 inch wide, but it is slightly larger: 0.15 inches. Two stereo pairs of tracks or two monaural audio tracks are available on the tape; one pair is played or recorded when the tape is moving in one direction and the other when it is moving in the opposite direction. This reversal is achieved either by flipping the cassette, or by the reversal of tape movement when the mechanism detects that the tape has come to an end. In 1935, decades before the introduction of the Compact Cassette, AEG released the first reel-to-reel tape recorder, with the commercial name "Magnetophon". It was based on the invention of magnetic tape by Fritz Pfleumer, which used similar technology but with open reels. These instruments were expensive and difficult to use and were therefore used mostly by professionals in radio stations and recording studios.
In 1958, following four years of development, RCA Victor introduced the stereo, quarter-inch, reel-to-reel RCA tape cartridge. However, it was a large cassette and offered few pre-recorded tapes; despite multiple versions, it failed. Consumer use of magnetic tape machines took off in the early 1960s, after playback machines reached a comfortable, user-friendly design. This was aided by the introduction of transistors, which replaced the bulky and costly vacuum tubes of earlier designs. Reel-to-reel tape became more suitable to household use but still remained an esoteric product. WIRAG, the Vienna division of Philips, developed a cartridge described as a single-hole cassette, a translation of its German name Einloch-Kassette. Tape and tape speed were identical with the Compact Cassette. Grundig came up with the DC-International, derived from blueprints of the Compact Cassette, in 1965, but the format failed to win the support of distributing companies. In 1962, Philips invented the Compact Cassette medium for audio storage, introducing it in Europe on 30 August 1963 at the Berlin Radio Show, and in the United States in November 1964, with the trademark name Compact Cassette.
The team at Philips was led by Lou Ottens in Hasselt, Belgium. "Philips was competing with Telefunken and Grundig in a race to establish its cassette tape as the worldwide standard, it wanted support from Japanese electronics manufacturers." Philips' Compact Cassette ultimately became dominant as a result of Philips' decision to license the format free of charge. Philips released the Norelco Carry-Corder 150 recorder/player in the US in November 1964. By 1966 over 250,000 recorders had been sold in the US alone, and Japan soon became the major source of recorders. By 1968, 85 manufacturers had sold over 2.4 million players. By the end of the 1960s, the cassette business was worth an estimated 150 million dollars. In the early years sound quality was mediocre, but it improved by the early 1970s, when it caught up with the quality of 8-track tape and kept improving. The Compact Cassette went on to become a popular alternative to the 12-inch vinyl LP during the late 1970s. The mass production of "blank" Compact Cassettes began in 1964 in Germany.
Prerecorded music cassettes were launched in Europe in late 1965. The Mercury Record Company, a US affiliate of Philips, introduced M.C.s to the US in July 1966; the initial offering consisted of 49 titles. However, the system had been designed for dictation and portable use, and the audio quality of early players was not well suited for music; some early models also had an unreliable mechanical design. In 1971, the Advent Corporation introduced their Model 201 tape deck, which combined Dolby type B noise reduction and chromium oxide tape with a commercial-grade tape transport mechanism supplied by the Wollensak camera division of 3M Corporation. This resulted in the format being taken more seriously for musical use, and started the era of high-fidelity cassettes and players. Although the birth and growth of the cassette began in the 1960s, its cultural moment took place during the 1970s and 1980s.
IBM Z is a family name used by IBM for all of its non-POWER mainframe computers from the Z900 on. In July 2017, with another generation of products, the official family name was changed to IBM Z from IBM z Systems. The zSeries, zEnterprise, System z and IBM Z families were named for their availability – "z" stands for zero downtime. The systems are built with spare components capable of hot failovers to ensure continuous operations. The IBM Z family maintains full backward compatibility: in effect, current systems are the direct, lineal descendants of System/360, announced in 1964, and of the System/370 from the 1970s. Many applications written for these systems can still run unmodified on the newest IBM Z system over five decades later. Virtualization is required by default on IBM Z systems. First-layer virtualization is provided by the Processor Resource/System Manager (PR/SM) to deploy one or more Logical Partitions (LPARs); each LPAR supports a variety of operating systems. A hypervisor called z/VM can be run as second-layer virtualization in LPARs to create as many virtual machines as there are resources assigned to the LPARs to support them.
The first layer of IBM Z virtualization allows a z machine to run a limited number of LPARs. These can be considered virtual "bare metal" servers because PR/SM allows CPUs to be dedicated to individual LPARs. z/VM LPARs allocated within PR/SM LPARs can run a large number of virtual machines as long as there are adequate CPU, memory and I/O resources configured with the system for the desired performance and throughput. IBM Z's PR/SM and hardware attributes allow compute resources to be dynamically changed to meet workload demands. CPU and memory resources can be non-disruptively added to the system and dynamically assigned and used by LPARs. I/O resources such as IP and SAN ports can also be added dynamically, and they are shared across all LPARs. The hardware component that provides this capability is called the Channel Subsystem. Each LPAR can be configured to either "see" or "not see" the virtualized I/O ports to establish the desired sharing or isolation. This virtualization capability allows a significant reduction in I/O resources because of its ability to share them and drive up utilization.
PR/SM on IBM Z has earned Common Criteria Evaluation Assurance Level 5+ security certification, and z/VM has earned Common Criteria EAL4+ certification. The KVM hypervisor from Linux has also been ported to the platform. Since the move away from the System/390 name, a number of IBM Z models have been released; these can be grouped into families with similar architectural characteristics. They include the IBM z14 ZR1 single-frame mainframe, introduced on April 10, 2018; the IBM z14, introduced on July 17, 2017; the z Systems z13s, introduced on February 17, 2016; and the z Systems z13, introduced on January 13, 2015. The IBM zEnterprise System, announced in July 2010 with the z196 model, is designed to offer both mainframe and distributed server technologies in an integrated system. The zEnterprise System consists of three components. First is a System z server. Second is the IBM zEnterprise BladeCenter Extension (zBX). Last is the management layer, IBM zEnterprise Unified Resource Manager, which provides a single management view of zEnterprise resources.
The zEnterprise is designed to extend mainframe capabilities – management efficiency, dynamic resource allocation, serviceability – to other systems and workloads running on AIX on POWER7, Microsoft Windows or Linux on x86. The zEnterprise BladeCenter Extension is an infrastructure component that hosts both general-purpose blade servers and appliance-like workload optimizers, which can all be managed as if they were a single mainframe. The zBX supports a private high-speed internal network that connects it to the central processing complex, which reduces the need for networking hardware and provides inherently high security. The IBM zEnterprise Unified Resource Manager integrates the System z and zBX resources as a single virtualized system and provides unified and integrated management across the zEnterprise System; it can identify system bottlenecks or failures among disparate systems, and if a failure occurs it can dynamically reallocate system resources to prevent or reduce application problems.
The Unified Resource Manager provides energy monitoring and management, resource management, increased security, virtual networking, and information management from a single user interface. Highlights of the original zEnterprise z196 include the BladeCenter Extension and Unified Resource Manager, up to 80 central processors, 60% higher capacity than the z10, twice the memory capacity, and 5.2 GHz quad-core chips. The newer zEnterprise EC12, announced in August 2012, included up to 101 central processors, 50% higher capacity than the z196, Transactional Execution, 5.5 GHz hex-core chips, and Flash Express – integrated SSDs which improve paging and certain other I/O performance. On April 8, 2014, in honor of the 50th anniversary of the System/360 mainframe, IBM announced the release of its first converged infrastructure solution based on mainframe technology. Dubbed the IBM Enterprise Cloud System, this new offering combines IBM mainframe hardware and software.
The Z80 CPU is an 8-bit microprocessor. It was introduced by Zilog in 1976 as the startup company's first product. The Z80 was conceived by Federico Faggin in late 1974 and developed by him and his then-11 employees at Zilog from early 1975 until March 1976, when the first fully working samples were delivered. With the revenue from the Z80, the company built its own chip factories and grew to over a thousand employees over the following two years. The Zilog Z80 was a software-compatible extension and enhancement of the Intel 8080 and, like it, was mainly aimed at embedded systems. According to the designers, the primary targets for the Z80 CPU were products like intelligent terminals, high-end printers and advanced cash registers, as well as telecom equipment, industrial robots and other kinds of automation equipment. The Z80 was introduced on the market in July 1976 and came to be used in general desktop computers using CP/M and other operating systems, as well as in the home computers of the 1980s.
It was common in military applications, in musical equipment such as synthesizers, and in the computerized coin-operated video games of the late 1970s and early 1980s – the arcade machines or video game arcade cabinets. The Z80 was one of the most widely used CPUs in the home computer market from the late 1970s to the mid-1980s. Zilog licensed the Z80 to the US-based Synertek and Mostek, which had helped with initial production, as well as to a European second-source manufacturer, SGS; the design was also copied by several Japanese, East European and Soviet manufacturers. This won the Z80 acceptance in the world market, since large companies like NEC, Toshiba and Hitachi started to manufacture the device. In recent decades Zilog has refocused on the ever-growing market for embedded systems, and the most recent Z80-compatible microcontroller family, the pipelined 24-bit eZ80 with a linear 16 MB address range, has been introduced alongside the simpler Z180 and Z80 products. The Z80 came about when physicist Federico Faggin left Intel at the end of 1974 to found Zilog with Ralph Ungermann.
At Fairchild Semiconductor, and later at Intel, Faggin had worked on fundamental transistor and semiconductor manufacturing technology. He developed the basic design methodology used for memories and microprocessors at Intel and led the work on the Intel 4004, the 8080 and several other ICs. Masatoshi Shima, the principal logic and transistor-level designer of the 4004 and the 8080 under Faggin's supervision, joined the Zilog team. By March 1976, Zilog had developed the Z80 as well as an accompanying assembler-based development system for its customers; by July 1976, this was formally launched onto the market. Early Z80s were manufactured by Synertek and Mostek before Zilog had its own manufacturing factory ready in late 1976; these companies were chosen because they could do the ion implantation needed to create the depletion-mode MOSFETs that the Z80 design used as load transistors in order to cope with a single 5-volt power supply. Faggin designed the instruction set to be binary compatible with the Intel 8080, so that most 8080 code – notably the CP/M operating system and Intel's PL/M compiler for the 8080 – would run unmodified on the new Z80 CPU.
Masatoshi Shima designed most of the microarchitecture as well as the gate and transistor levels of the Z80 CPU, assisted by a small number of engineers and layout people. CEO Federico Faggin was heavily involved in the chip layout work, together with two dedicated layout people; according to himself, Faggin worked 80 hours a week in order to meet the tight schedule set by the financial investors. The Z80 offered many improvements over the 8080: an enhanced instruction set including single-bit addressing; shifts and rotates on memory and on registers other than the accumulator; rotate instructions for BCD number strings in memory; program looping, program counter relative jumps, block copy, block input/output and byte search instructions; better support for signed 8- and 16-bit arithmetic; new IX and IY index registers with instructions for direct base+offset addressing; and a better interrupt system – a more automatic and general vectorized interrupt system, mode 2, intended for Zilog's line of counter/timer, DMA and communications controllers, as well as a fixed-vector interrupt system, mode 1, for simple systems with minimal hardware.
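The block copy and byte search instructions mentioned above are known in Z80 mnemonics as LDIR and CPIR. Their documented register behavior can be sketched as a small Python model; this is an illustrative simulation under the usual description of those instructions, not actual Z80 code, and the addresses and data are made up for the example.

```python
# Illustrative Python model of two Z80 block instructions (a sketch, not
# Zilog code). LDIR copies BC bytes from address HL to address DE,
# incrementing both pointers and decrementing BC until it reaches zero.
def ldir(memory, hl, de, bc):
    """Model of Z80 LDIR: block copy of bc bytes from hl to de."""
    while bc != 0:
        memory[de] = memory[hl]
        hl = (hl + 1) & 0xFFFF   # 16-bit pointer wraparound
        de = (de + 1) & 0xFFFF
        bc = (bc - 1) & 0xFFFF
    return hl, de, bc

# CPIR scans up to BC bytes starting at HL for the value in the
# accumulator A, stopping early when a match is found.
def cpir(memory, hl, bc, a):
    """Model of Z80 CPIR: block search for byte a starting at hl."""
    found = False
    while bc != 0:
        bc = (bc - 1) & 0xFFFF
        match = memory[hl] == a
        hl = (hl + 1) & 0xFFFF   # HL ends up pointing past the match
        if match:
            found = True
            break
    return hl, bc, found

mem = bytearray(65536)               # flat 64 KB address space
mem[0x1000:0x1005] = b"HELLO"
ldir(mem, 0x1000, 0x2000, 5)         # copy 5 bytes from 0x1000 to 0x2000
print(bytes(mem[0x2000:0x2005]))     # b'HELLO'
hl, bc, found = cpir(mem, 0x2000, 5, ord("L"))
print(found, hex(hl))                # True 0x2003
```

On the 8080, the same copy or search required an explicit multi-instruction loop; folding it into one repeating instruction is what made these additions attractive for routines like screen updates and string handling.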
Other improvements included: a non-maskable interrupt, which can be used to respond to power-down situations or other high-priority events; two separate register files, which could be switched to speed up response to interrupts, for example in fast asynchronous event handlers or a multitasking dispatcher (although they were not intended as extra registers for general code, they were used that way in some applications); and less hardware required for the power supply, clock generation and the interface to memory and I/O – a single 5-volt power supply, a single-phase 5 V clock, a built-in DRAM refresh mechanism, and non-multiplexed buses. There was also a special reset function which clears only the program counter, so that a single Z80 CPU could be used in a