Central processing unit
A central processing unit (CPU), also called a central processor or main processor, is the electronic circuitry within a computer that carries out the instructions of a computer program by performing the basic arithmetic, logic and input/output operations specified by the instructions. The computer industry has used the term "central processing unit" at least since the early 1960s. Traditionally, the term "CPU" refers to a processor, more specifically to its processing unit and control unit, distinguishing these core elements of a computer from external components such as main memory and I/O circuitry; the form and implementation of CPUs have changed over the course of their history, but their fundamental operation remains almost unchanged. Principal components of a CPU include the arithmetic logic unit (ALU) that performs arithmetic and logic operations, processor registers that supply operands to the ALU and store the results of ALU operations, and a control unit that orchestrates the fetching and execution of instructions by directing the coordinated operations of the ALU, registers and other components.
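As a rough illustration of how the control unit, registers and ALU cooperate, the following toy fetch-decode-execute loop may help; the instruction format, register names and operations are invented for this sketch and do not correspond to any real CPU.

```python
# Toy fetch-decode-execute loop: a control loop reads instructions from
# memory, registers hold operands, and a tiny "ALU" computes results.
# The three-element instruction format used here is invented for illustration.

def run(program, registers=None):
    registers = registers or {"r0": 0, "r1": 0, "r2": 0}
    pc = 0                                  # program counter
    while pc < len(program):
        op, dst, src = program[pc]          # fetch and decode
        if op == "load":                    # load an immediate value
            registers[dst] = src
        elif op == "add":                   # ALU operation: addition
            registers[dst] = registers[dst] + registers[src]
        elif op == "sub":                   # ALU operation: subtraction
            registers[dst] = registers[dst] - registers[src]
        pc += 1                             # advance to the next instruction
    return registers

# Example: compute 5 + 3 into r0.
print(run([("load", "r0", 5), ("load", "r1", 3), ("add", "r0", "r1")]))
# {'r0': 8, 'r1': 3, 'r2': 0}
```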
Most modern CPUs are microprocessors, meaning they are contained on a single integrated circuit (IC) chip. An IC that contains a CPU may also contain memory, peripheral interfaces and other components of a computer; some computers employ a multi-core processor, a single chip containing two or more CPUs called "cores". Array processors or vector processors have multiple processors that operate in parallel, with no unit considered central. There also exists the concept of virtual CPUs, which are an abstraction of dynamically aggregated computational resources. Early computers such as the ENIAC had to be physically rewired to perform different tasks, which caused these machines to be called "fixed-program computers". Since the term "CPU" is defined as a device for software execution, the earliest devices that could rightly be called CPUs came with the advent of the stored-program computer; the idea of a stored-program computer had been present in the design of J. Presper Eckert and John William Mauchly's ENIAC, but was initially omitted so that the machine could be finished sooner.
On June 30, 1945, before ENIAC was completed, mathematician John von Neumann distributed the paper entitled First Draft of a Report on the EDVAC. It was the outline of a stored-program computer that would eventually be completed in August 1949. EDVAC was designed to perform a certain number of instructions of various types; the programs written for EDVAC were to be stored in high-speed computer memory rather than specified by the physical wiring of the computer. This overcame a severe limitation of ENIAC, the considerable time and effort required to reconfigure the computer to perform a new task. With von Neumann's design, the program that EDVAC ran could be changed simply by changing the contents of the memory. EDVAC, however, was not the first stored-program computer. Early CPUs were custom designs used as part of a larger and sometimes distinctive computer. However, this method of designing custom CPUs for a particular application has largely given way to the development of multi-purpose processors produced in large quantities; this standardization began in the era of discrete transistor mainframes and minicomputers and has accelerated with the popularization of the integrated circuit.
The IC has allowed increasingly complex CPUs to be designed and manufactured to tolerances on the order of nanometers. Both the miniaturization and standardization of CPUs have increased the presence of digital devices in modern life far beyond the limited application of dedicated computing machines. Modern microprocessors appear in electronic devices ranging from automobiles to cellphones, and sometimes even in toys. While von Neumann is most often credited with the design of the stored-program computer because of his design of EDVAC, and the design became known as the von Neumann architecture, others before him, such as Konrad Zuse, had suggested and implemented similar ideas; the so-called Harvard architecture of the Harvard Mark I, which was completed before EDVAC, also used a stored-program design using punched paper tape rather than electronic memory. The key difference between the von Neumann and Harvard architectures is that the latter separates the storage and treatment of CPU instructions and data, while the former uses the same memory space for both.
Most modern CPUs are primarily von Neumann in design, but CPUs with the Harvard architecture are seen as well, especially in embedded applications. Relays and vacuum tubes were commonly used as switching elements in early CPUs, and the overall speed of a system is dependent on the speed of the switches. Tube computers like EDVAC tended to average eight hours between failures, whereas relay computers like the Harvard Mark I failed very rarely. In the end, tube-based CPUs became dominant because the significant speed advantages they afforded generally outweighed the reliability problems. Most of these early synchronous CPUs ran at low clock rates compared to modern microelectronic designs. Clock signal frequencies ranging from 100 kHz to 4 MHz were common at this time, limited largely by the speed of the switching devices they were built with.
Amiga
The Amiga is a family of personal computers introduced by Commodore in 1985. The original model was part of a wave of 16- and 32-bit computers that featured 256 KB or more of RAM, mouse-based GUIs, and improved graphics and audio over 8-bit systems; this wave included the Atari ST (released the same year), Apple's Macintosh, and the Apple IIGS. Based on the Motorola 68000 microprocessor, the Amiga differed from its contemporaries through the inclusion of custom hardware to accelerate graphics and sound, including sprites and a blitter, and a pre-emptive multitasking operating system called AmigaOS; the Amiga 1000 was released in July 1985, but a series of production problems kept it from becoming widely available until early 1986. The best selling model, the Amiga 500, was introduced in 1987 and became one of the leading home computers of the late 1980s and early 1990s with four to six million sold; the A3000, introduced in 1990, started the second generation of Amiga systems, followed by the A500+ and the A600 in March 1992.
As the third generation, the A1200 and the A4000 were released in late 1992. The platform became particularly popular for gaming and programming demos; it also found a prominent role in the desktop video, video production and show control businesses, leading to video editing systems such as the Video Toaster. The Amiga's native ability to play back multiple digital sound samples made it a popular platform for early tracker music software; the powerful processor and ability to access several megabytes of memory enabled the development of several 3D rendering packages, including LightWave 3D, Aladdin4D, TurboSilver and Traces, a predecessor to Blender. Although early Commodore advertisements attempted to cast the computer as an all-purpose business machine when outfitted with the Amiga Sidecar PC compatibility add-on, the Amiga was most commercially successful as a home computer, with a wide range of games and creative software. Poor marketing and the failure of later models to repeat the technological advances of the first systems meant that the Amiga lost its market share to competing platforms, such as the fourth-generation game consoles and IBM PC compatibles, which dropped in price and gained 256-color VGA graphics in 1987.
Commodore went bankrupt in April 1994 after the Amiga CD32 model failed in the marketplace. Since the demise of Commodore, various groups have marketed successors to the original Amiga line, including Genesi, Eyetech, ACube Systems Srl and A-EON Technology. AmigaOS has influenced replacements and compatible systems such as MorphOS, AmigaOS 4 and AROS. "The Amiga was so far ahead of its time that nobody—including Commodore's marketing department—could articulate what it was all about. Today, it's obvious the Amiga was the first multimedia computer, but in those days it was derided as a game machine because few people grasped the importance of advanced graphics and video. Nine years later, vendors are still struggling to make systems that work like 1985 Amigas." Jay Miner joined Atari in the 1970s to develop custom integrated circuits and led development of the Atari 2600's TIA. As soon as its development was complete, the team began developing a much more sophisticated set of chips, CTIA, ANTIC and POKEY, that formed the basis of the Atari 8-bit family.
With the 8-bit line's launch in 1979, the team once again started looking at a next generation chipset. Nolan Bushnell had sold the company to Warner Communications in 1976, and the new management was much more interested in the existing lines than in development of new products that might cut into their sales. Miner wanted to start work with the new Motorola 68000, but management was only interested in another 6502-based system. Miner left the company and, for a time, the industry. In 1979, Larry Kaplan left Atari and co-founded Activision. In 1982, Kaplan was approached by a number of investors interested in a new game platform. Kaplan hired Miner to run the hardware side of the newly formed company, "Hi-Toro"; the system was code-named "Lorraine" in keeping with Miner's policy of giving systems female names, in this case after the company president's wife, Lorraine Morse. When Kaplan left the company late in 1982, Miner was promoted to head engineer and the company relaunched as Amiga Corporation. A breadboard prototype was completed by late 1983 and shown at the January 1984 Consumer Electronics Show.
At the time, the operating system was not ready, so the machine was demonstrated with the Boing Ball demo. A further developed version of the system was demonstrated at the June 1984 CES and shown to many companies in hopes of garnering further funding, but found little interest in a market that was in the final stages of the North American video game crash of 1983. In March, Atari expressed a tepid interest in Lorraine for its potential use in a games console or home computer tentatively known as the 1850XLD, but the talks were progressing slowly and Amiga was running out of money. A temporary arrangement in June led to a $500,000 loan from Atari to Amiga to keep the company going; the terms required the loan to be repaid at the end of the month, otherwise Amiga would forfeit the Lorraine design to Atari. During 1983, Atari had been losing over $1 million a day, due to the combined effects of the crash and the ongoing price war in the home computer market. By the end of the year, Warner was desperate to sell the company.
In January 1984, Jack Tramiel resigned from Commodore due to internal battles over the future direction of the company. A number of Commodore employees followed him to Tramiel Technology, including many of the senior technical staff, and there they began development of a 68000-based machine of their own.
Amiga 2000
The Amiga 2000, or A2000, is a personal computer released by Commodore in March 1987. It was introduced as a "big box" expandable variant of the Amiga 1000 but redesigned to share most of its electronic components with the contemporary Amiga 500 for cost reduction. Expansion capabilities include two 3.5" drive bays and one 5.25" bay that can be used by a 5.25" floppy drive, a hard drive, or, once they became available, a CD-ROM drive. The Amiga 2000 is the first Amiga model designed for internal expansion: SCSI host adapters, memory cards, CPU cards, network cards, graphics cards, serial port cards and PC compatibility cards were available, and multiple expansions can be used without requiring an expansion cage like the Amiga 1000 does. Not only does the Amiga 2000 include five Zorro II card slots, but the motherboard also has four PC ISA slots, two of which are inline with Zorro II slots for use with the A2088 bridgeboard, which adds IBM PC XT compatibility to the A2000; the Amiga 2000 was the most versatile and expandable Amiga computer until the Amiga 3000T was introduced four years later.
Aimed at the high-end market, the original Europe-only model adds a Zorro II backplane, implemented in programmable logic, to the custom Amiga chipset used in the Amiga 1000. Improved models have redesigned hardware using the more integrated A500 chipset, with the addition of a gate array called "Buster", which integrates the Zorro subsystem, enables hand-off of system control to a coprocessor slot device, and implements the full video slot for add-on video devices. Like the earlier Amiga 1000 and most IBM PC compatibles of the era, but unlike the Amiga 500, the A2000 comes in a desktop case with a separate keyboard; the case is taller than the A1000's to accommodate expansion cards and the two 3.5" and one 5.25" drive bays. The A2000's case lacks the "keyboard garage" of the Amiga 1000 but has space for five Zorro II expansion slots, two 16-bit and two 8-bit ISA slots, a CPU upgrade slot and a video slot. Unlike the A1000, the A2000's motherboard includes a battery-backed real-time clock.
The Amiga 2000 offers graphics capabilities exceeded among its contemporaries only by the Macintosh II, which sold for about twice the price of a comparably-outfitted Amiga 2000 additionally equipped with the IBM PC compatible bridgeboard and 5.25" floppy disk drive. Like the A1000, the A2000 was sold only by specialty computer dealers; it was announced at a price of US$1,495. The A2000 was succeeded by the Amiga 3000 in 1990. The 3000 features fewer options for internal expansion than the 2000 models, so Commodore supplemented the Amiga 3000 with the Amiga 3000T in 1991; the Amiga 2000 was designed with an open architecture. Commodore's engineers believed that the company would be unsuccessful in matching the rate of system obsolescence and replacement common in the PC industry, with new models every year or so. Commodore's approach was instead to build a single, expandable system architecture. Commodore was so successful at this that Info magazine judged that the A2000 would not become obsolete "until well after the turn of the century" at the earliest.
The final design was the result of an internal battle within Commodore, which pitted the USA division, who wanted to build a system more like the later Amiga 3000, against the German division, which, fresh from the successful introduction of the first Commodore PC-compatible systems, planned to include PC compatibility in the Amiga 2000 from the start. The bottom-line practicality of the German design won out, and the final A2000 shipped with not only Zorro II slots but also a complement of PC-standard ISA slots; this architecture was subsequently subject to major revisions. The "B2000-CR" motherboard, designed by Dave Haynie and Terry Fisher, was the most common; the earlier A2000 variant was a redesign of the Amiga 1000 motherboard, while the B2000-CR incorporated some Amiga 500 technological advances to achieve the "CR": Cost Reduction. The original Amiga 2000 shipped with just a single floppy drive for storage; this was followed shortly by the Amiga 2000/HD, which bundled an Amiga 2090 hard drive controller and a SCSI-based hard drive.
In 1988, Commodore shipped the Amiga 2500/20, which added to the A2000's CPU slot the Amiga 2620 CPU card, carrying a 14.3 MHz 68020, a 68881 FPU and a 68851 MMU, along with 2 MB of 32-bit-wide memory. The A2000's original 68000 CPU remained installed on the motherboard of these machines, but is not used. In 1989 this model was replaced by the Amiga 2500/30, which added an Amiga 2630 CPU card with a 25 MHz 68030, a 68882 FPU and up to 4 MB of 32-bit memory; the A2630 card can take a memory expansion daughter card capable of supporting up to 64 MB of additional memory, although Commodore never released one. In 1990, Commodore UK sold a variant of the A2000, the A1500, for £999; the model designation was not sanctioned by Commodore International. The A1500 shipped with dual floppy drives and 1 MB of Chip RAM as standard. Initial units came with Kickstart 1.3, though the Original Chip Set onboard includes an Agnus revision allowing the full 1 MB of Chip RAM. Early machines were bundled with a Commodore 1084SD1 monitor. Later machines came with the ECS chipset and AmigaOS 2.04.
The second floppy drive takes the place of the hard disk drive; the A1500 has no hard disk drive as standard. A1500s are convertible into A2000/HDs by the addition of a hard disk controller and drive; simply peeling off the A1500 label reveals the A2000 badge beneath.
RGB color model
The RGB color model is an additive color model in which red, green and blue light are added together in various ways to reproduce a broad array of colors. The name of the model comes from the initials of the three additive primary colors: red, green and blue; the main purpose of the RGB color model is for the sensing and display of images in electronic systems, such as televisions and computers, though it has also been used in conventional photography. Before the electronic age, the RGB color model already had a solid theory behind it, based in human perception of colors. RGB is a device-dependent color model: different devices detect or reproduce a given RGB value differently, since the color elements and their response to the individual R, G and B levels vary from manufacturer to manufacturer, or even in the same device over time; thus an RGB value does not define the same color across devices without some kind of color management. Typical RGB input devices are color TV and video cameras, image scanners and digital cameras. Typical RGB output devices are TV sets of various technologies, computer and mobile phone displays, video projectors, multicolor LED displays and large screens such as the JumboTron.
Color printers, on the other hand, are not RGB devices but subtractive color devices, typically using the CMYK color model. This article discusses concepts common to all the different color spaces that use the RGB color model, which are used in one implementation or another in color image-producing technology. To form a color with RGB, three light beams (one red, one green and one blue) must be superimposed; each of the three beams is called a component of that color, and each of them can have an arbitrary intensity, from fully off to fully on, in the mixture. The RGB color model is additive in the sense that the three light beams are added together, and their light spectra add, wavelength for wavelength, to make the final color's spectrum; this is the opposite of the subtractive color model that applies to paints, inks and other substances whose color depends on reflecting the light under which we see them. Because of these properties, mixing the three colors at full intensity creates white; this is in stark contrast to physical colors, such as dyes, which create black when mixed. Zero intensity for each component gives the darkest color (no light, perceived as black), and full intensity of each gives white.
When the intensities for all the components are the same, the result is a shade of gray, darker or lighter depending on the intensity. When the intensities are different, the result is a colorized hue, more or less saturated depending on the difference between the strongest and weakest of the intensities of the primary colors employed; when one of the components has the strongest intensity, the color is a hue near this primary color, and when two components have the same strongest intensity, the color is a hue of a secondary color. A secondary color is formed by the sum of two primary colors of equal intensity: cyan is green+blue, magenta is red+blue, and yellow is red+green; every secondary color is the complement of one primary color. The RGB color model itself does not define what is meant by red, green and blue colorimetrically, so the results of mixing them are not specified as absolute, but relative to the primary colors; when the exact chromaticities of the red, green and blue primaries are defined, the color model becomes an absolute color space, such as sRGB or Adobe RGB.
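These mixing rules amount to component-wise addition of the channel intensities. The sketch below uses 8-bit channel values and invented helper names to show the secondary colors, white, and a gray arising from equal intensities.

```python
# Additive RGB mixing: light beams add component-wise; each 8-bit channel
# saturates at 255 (full intensity). Names and values are illustrative only.

def mix(*colors):
    return tuple(min(255, sum(c[i] for c in colors)) for i in range(3))

red   = (255, 0, 0)
green = (0, 255, 0)
blue  = (0, 0, 255)

print(mix(red, green))        # (255, 255, 0)   -> yellow (secondary color)
print(mix(green, blue))       # (0, 255, 255)   -> cyan
print(mix(red, blue))         # (255, 0, 255)   -> magenta
print(mix(red, green, blue))  # (255, 255, 255) -> white
print((60, 60, 60))           # equal, lower intensities -> a shade of gray
```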
The choice of primary colors is related to the physiology of the human eye. The normal three kinds of light-sensitive photoreceptor cells (cones) in the human eye respond most to yellowish (long-wavelength), greenish (medium-wavelength) and violet (short-wavelength) light; the difference in the signals received from the three kinds allows the brain to differentiate a wide gamut of different colors, while being most sensitive to yellowish-green light and to differences between hues in the green-to-orange region. As an example, suppose that light in the orange range of wavelengths enters the eye and strikes the retina. Light of these wavelengths would activate both the medium and long wavelength cones of the retina, but not equally: the long-wavelength cells will respond more. The difference in the response can be detected by the brain, and this difference is the basis of our perception of orange. Thus, the orange appearance of an object results from light from the object entering our eye and stimulating the different cones simultaneously but to different degrees. However, use of the three primary colors is not sufficient to reproduce all colors; only colors within the gamut defined by the chromaticities of the chosen primaries can be reproduced.
The RGB color model is based on the Young–Helmholtz theory of trichromatic color vision, developed by Thomas Young and Hermann Helmholtz in the early to mid nineteenth century, and on James Clerk Maxwell's color triangle that elaborated that theory.
Software bug
A software bug is an error, flaw, failure or fault in a computer program or system that causes it to produce an incorrect or unexpected result, or to behave in unintended ways. The process of finding and fixing bugs is termed "debugging" and often uses formal techniques or tools to pinpoint bugs; since the 1950s, some computer systems have been designed to deter, detect or auto-correct various computer bugs during operations. Most bugs arise from mistakes and errors made in either a program's source code or its design, or in components and operating systems used by such programs. A few are caused by compilers producing incorrect code. A program that contains a large number of bugs, and/or bugs that seriously interfere with its functionality, is said to be buggy. Bugs can trigger errors that may have ripple effects; they may cause the program to crash or freeze the computer. Other bugs qualify as security bugs and might, for example, enable a malicious user to bypass access controls in order to obtain unauthorized privileges. Some software bugs have been linked to disasters.
Bugs in code that controlled the Therac-25 radiation therapy machine were directly responsible for patient deaths in the 1980s. In 1996, the European Space Agency's US$1 billion prototype Ariane 5 rocket had to be destroyed less than a minute after launch due to a bug in the on-board guidance computer program. In June 1994, a Royal Air Force Chinook helicopter crashed into the Mull of Kintyre, killing 29; this was dismissed as pilot error, but an investigation by Computer Weekly convinced a House of Lords inquiry that it may have been caused by a software bug in the aircraft's engine-control computer. In 2002, a study commissioned by the US Department of Commerce's National Institute of Standards and Technology concluded that "software bugs, or errors, are so prevalent and so detrimental that they cost the US economy an estimated $59 billion annually, or about 0.6 percent of the gross domestic product". The term "bug" to describe defects has been a part of engineering jargon since the 1870s and predates electronic computers and computer software.
For instance, Thomas Edison wrote the following words in a letter to an associate in 1878: It has been just so in all of my inventions. The first step is an intuition, and comes with a burst, then difficulties arise—this thing gives out and [it is] then that "Bugs"—as such little faults and difficulties are called—show themselves and months of intense watching, study and labor are requisite before commercial success or failure is certainly reached. The Middle English word bugge is the basis for the terms "bugbear" and "bugaboo" as terms used for a monster. Baffle Ball, the first mechanical pinball game, was advertised as being "free of bugs" in 1931. Problems with military gear during World War II were referred to as bugs. In a book published in 1942, Louise Dickinson Rich, speaking of a powered ice cutting machine, said, "Ice sawing was suspended until the creator could be brought in to take the bugs out of his darling." Isaac Asimov used the term "bug" to relate to issues with a robot in his short story "Catch That Rabbit", published in 1944.
The term "bug" was used in an account by computer pioneer Grace Hopper, who publicized the cause of a malfunction in an early electromechanical computer. A typical version of the story is: In 1946, when Hopper was released from active duty, she joined the Harvard Faculty at the Computation Laboratory where she continued her work on the Mark II and Mark III. Operators traced an error in the Mark II to a moth trapped in a relay; this bug was removed and taped to the log book. Stemming from the first bug, today we call errors or glitches in a program a bug. Hopper did not find the bug, as she acknowledged; the date in the log book was September 9, 1947. The operators who found it, including William "Bill" Burke of the Naval Weapons Laboratory, Virginia, were familiar with the engineering term and amusedly kept the insect with the notation "First actual case of bug being found." Hopper loved to recount the story. This log book, complete with attached moth, is part of the collection of the Smithsonian National Museum of American History.
The related term "debug" appears to predate its usage in computing: the Oxford English Dictionary's etymology of the word contains an attestation from 1945, in the context of aircraft engines. The concept that software might contain errors dates back to Ada Lovelace's 1843 notes on the analytical engine, in which she speaks of the possibility of program "cards" for Charles Babbage's analytical engine being erroneous:... an analysing process must have been performed in order to furnish the Analytical Engine with the necessary operative data. Granted that the actual mechanism is unerring in its processes, the cards may give it wrong orders; the first documented use of the term "bug" for a technical malfunction was by Thomas Edison. The Open Technology Institute, run by the group, New America, released a report "Bugs in the System" in August 2016 stating that U. S. policymakers should make reforms to help researchers address software bugs. The report "highlights the need for reform in the field of software vulnerability discovery and disclosure."
One of the report's authors said that Congress has not done enough to address cyber software vulnerability, even though Congress has passed a number of bills to combat the larger issue of cyber security. Government researchers and cyber security experts are the people who typically discover software flaws.
Crystal oscillator
A crystal oscillator is an electronic oscillator circuit that uses the mechanical resonance of a vibrating crystal of piezoelectric material to create an electrical signal with a precise frequency. This frequency is used to keep track of time, as in quartz wristwatches, to provide a stable clock signal for digital integrated circuits, and to stabilize frequencies for radio transmitters and receivers; the most common type of piezoelectric resonator used is the quartz crystal, so oscillator circuits incorporating them became known as crystal oscillators, but other piezoelectric materials including polycrystalline ceramics are used in similar circuits. A crystal oscillator, particularly one using a quartz crystal, works by distorting the crystal with an electric field when voltage is applied to an electrode near or on the crystal; this property is known as inverse piezoelectricity. When the field is removed, the quartz, which oscillates at a precise frequency, generates an electric field as it returns to its previous shape, and this can generate a voltage.
The result is that a quartz crystal behaves like an RLC circuit, but with a much higher Q. Quartz crystals are manufactured for frequencies from a few tens of kilohertz to hundreds of megahertz. More than two billion crystals are manufactured annually. Most are used for consumer devices such as wristwatches, radios and cellphones. Quartz crystals are also found inside test and measurement equipment, such as counters, signal generators and oscilloscopes. A crystal oscillator is an electronic oscillator circuit that uses a piezoelectric resonator, a crystal, as its frequency-determining element. "Crystal" is the common term used in electronics for the frequency-determining component, a wafer of quartz crystal or ceramic with electrodes connected to it. A more accurate term for it is piezoelectric resonator. Crystals are also used in other types of electronic circuits, such as crystal filters. Piezoelectric resonators are sold as separate components for use in crystal oscillator circuits.
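The RLC-like behavior can be illustrated numerically. The sketch below computes the series resonant frequency and quality factor of a crystal's equivalent series (motional) RLC branch; the motional values are merely illustrative of a roughly 10 MHz quartz crystal and are not data for any specific part.

```python
import math

# Equivalent-circuit sketch: series resonance f_s = 1 / (2*pi*sqrt(L*C)) and
# quality factor Q = sqrt(L/C) / R. The motional values below are assumed,
# illustrative figures for a ~10 MHz quartz crystal, not measurements.
L = 10e-3     # motional inductance, henries
C = 25e-15    # motional capacitance, farads
R = 25.0      # motional resistance, ohms

f_s = 1.0 / (2 * math.pi * math.sqrt(L * C))
Q = math.sqrt(L / C) / R

print(f"series resonance: {f_s / 1e6:.2f} MHz")  # ~10.07 MHz
print(f"quality factor Q: {Q:,.0f}")             # ~25,000, far higher than a discrete RLC circuit
```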
Such resonators are often incorporated in a single package with the crystal oscillator circuit. Piezoelectricity was discovered by Jacques and Pierre Curie in 1880. Paul Langevin first investigated quartz resonators for use in sonar during World War I; the first crystal-controlled oscillator, using a crystal of Rochelle salt, was built in 1917 and patented in 1918 by Alexander M. Nicholson at Bell Telephone Laboratories, although his priority was disputed by Walter Guyton Cady. Cady built the first quartz crystal oscillator in 1921. Other early innovators in quartz crystal oscillators include Louis Essen. Quartz crystal oscillators were developed for high-stability frequency references during the 1920s and 1930s. Prior to crystals, radio stations controlled their frequency with tuned circuits, which could easily drift off frequency by 3–4 kHz. Since broadcast stations were assigned frequencies only 10 kHz apart, interference between adjacent stations due to frequency drift was a common problem.
In 1925, Westinghouse installed a crystal oscillator in its flagship station KDKA, and by 1926 quartz crystals were used to control the frequency of many broadcasting stations and were popular with amateur radio operators. In 1928, Warren Marrison of Bell Telephone Laboratories developed the first quartz-crystal clock. With accuracies of up to 1 second in 30 years, quartz clocks replaced precision pendulum clocks as the world's most accurate timekeepers until atomic clocks were developed in the 1950s. Using the early work at Bell Labs, AT&T established their Frequency Control Products division, later spun off and known today as Vectron International. A number of firms started producing quartz crystals for electronic use during this time. Using what are now considered primitive methods, about 100,000 crystal units were produced in the United States during 1939. Through World War II, crystals were made from natural quartz crystal, virtually all from Brazil. Shortages of crystals during the war, caused by the demand for accurate frequency control of military and naval radios and radars, spurred postwar research into culturing synthetic quartz, and by 1950 a hydrothermal process for growing quartz crystals on a commercial scale was developed at Bell Laboratories.
By the 1970s, virtually all crystals used in electronics were synthetic. In 1968, Juergen Staudte invented a photolithographic process for manufacturing quartz crystal oscillators while working at North American Aviation that allowed them to be made small enough for portable products like watches. Although crystal oscillators still most commonly use quartz crystals, devices using other materials are becoming more common, such as ceramic resonators. A crystal is a solid in which the constituent atoms, molecules, or ions are packed in an ordered, repeating pattern extending in all three spatial dimensions. Almost any object made of an elastic material could be used like a crystal, with appropriate transducers, since all objects have natural resonant frequencies of vibration. For example, steel is elastic and has a high speed of sound; it was used in mechanical filters before quartz. The resonant frequency depends on size, shape and the speed of sound in the material. High-frequency crystals are typically cut in the shape of a simple rectangle or circular disk.
Low-frequency crystals, such as those used in digital watches, are typically cut in the shape of a tuning fork. For applications not needing very precise timing, a low-cost ceramic resonator is often used in place of a quartz crystal. When a crystal of quartz is properly cut and mounted, it can be made to distort in an electric field by applying a voltage to an electrode near or on the crystal.
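As a rough numerical illustration of how the resonant frequency scales with size, a plate vibrating in a thickness mode resonates near f ≈ v / (2t), where v is the acoustic velocity in the material and t the plate thickness. The velocity used below is an assumed round figure for shear waves in quartz, for illustration only.

```python
# Rough relation between plate thickness and fundamental resonant frequency
# for a thickness-mode quartz plate: f ~ v / (2 * t).
# The shear velocity is an assumed round number, not a precise material constant.
v = 3320.0                      # assumed acoustic shear velocity in quartz, m/s

def fundamental_frequency(thickness_m):
    return v / (2 * thickness_m)

t = 0.166e-3                    # a plate 0.166 mm thick
print(f"{fundamental_frequency(t) / 1e6:.1f} MHz")   # ~10 MHz
```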
Read-only memory
Read-only memory (ROM) is a type of non-volatile memory used in computers and other electronic devices. Data stored in ROM can only be modified with difficulty, or not at all, so it is mainly used to store firmware or application software in plug-in cartridges. Strictly speaking, read-only memory refers to memory that is hard-wired, such as a diode matrix or a mask ROM, which cannot be changed after manufacture. Although discrete circuits can be altered in principle, integrated circuits cannot, and are useless if the data is bad or requires an update; that such memory can never be changed is a disadvantage in many applications, as bugs and security issues cannot be fixed and new features cannot be added. More recently, ROM has come to include memory that is read-only in normal operation but can still be reprogrammed in some way. Erasable programmable read-only memory (EPROM) and electrically erasable programmable read-only memory (EEPROM) can be erased and re-programmed, but this can only be done at relatively slow speeds, may require special equipment, and is only possible a certain number of times.
IBM used Capacitor Read Only Storage and Transformer Read Only Storage to store microcode for the smaller System/360 models, the 360/85, and the initial two models of the S/370. On some models there was also a Writeable Control Store for additional diagnostics and emulation support. The simplest type of solid-state ROM is as old as the semiconductor technology itself. Combinational logic gates can be joined manually to map n-bit address input onto arbitrary values of m-bit data output (a look-up table). With the invention of the integrated circuit came mask ROM. Mask ROM consists of a grid of word lines (the address input) and bit lines (the data output), selectively joined together with transistor switches, and can represent an arbitrary look-up table with a regular physical layout and predictable propagation delay. In mask ROM, the data is physically encoded in the circuit, so it can only be programmed during fabrication; this leads to a number of serious disadvantages: (1) it is only economical to buy mask ROM in large quantities, since users must contract with a foundry to produce a custom design.
(2) The turnaround time between completing the design for a mask ROM and receiving the finished product is long, for the same reason. (3) Mask ROM is impractical for R&D work since designers need to modify the contents of memory as they refine a design. (4) If a product is shipped with faulty mask ROM, the only way to fix it is to recall the product and physically replace the ROM in every unit shipped. Subsequent developments have addressed these shortcomings. PROM, invented in 1956, allowed users to program its contents once by physically altering its structure with the application of high-voltage pulses; this addressed problems 1 and 2 above, since a company can order a large batch of fresh PROM chips and program them with the desired contents at its designers' convenience. The 1971 invention of EPROM solved problem 3, since EPROM can be reset to its unprogrammed state by exposure to strong ultraviolet light. EEPROM, invented in 1983, went a long way to solving problem 4, since an EEPROM can be programmed in-place if the containing device provides a means to receive the program contents from an external source.
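Functionally, the mask ROM described above behaves as a fixed look-up table: an n-bit address selects one of 2^n pre-programmed m-bit words, and the contents cannot be altered after fabrication. A minimal sketch, with invented contents, follows.

```python
# A ROM modeled as a fixed look-up table: a 3-bit address selects one of
# eight 8-bit words. The contents are invented; a tuple is used because it
# cannot be modified after creation, mirroring mask ROM's fixed contents.
ROM = (0x3E, 0x06, 0xCD, 0x21, 0x00, 0x10, 0xFF, 0x76)   # 2**3 words of 8 bits

def read(address):
    if not 0 <= address < len(ROM):
        raise ValueError("address out of range")
    return ROM[address]

print(hex(read(0b010)))    # 0xcd -- reading is always allowed...
# ROM[2] = 0x00            # ...but writing raises TypeError: tuples are immutable
```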
Flash memory, invented at Toshiba in the mid-1980s and commercialized in the early 1990s, is a form of EEPROM that makes efficient use of chip area and can be erased and reprogrammed thousands of times without damage. All of these technologies improved the flexibility of ROM, but at a significant cost-per-chip, so that in large quantities mask ROM would remain an economical choice for many years. Rewriteable technologies were envisioned as replacements for mask ROM; the most recent development is NAND flash, also invented at Toshiba. Its designers explicitly broke from past practice, stating plainly that "the aim of NAND Flash is to replace hard disks," rather than the traditional use of ROM as a form of non-volatile primary storage; as of 2007, NAND has partially achieved this goal by offering throughput comparable to hard disks, higher tolerance of physical shock, extreme miniaturization, and much lower power consumption.
Every non-trivial computer needs some form of mutable memory to record changes in its state as it executes. Forms of read-only memory were employed as non-volatile storage for programs in most early stored-program computers, such as ENIAC after 1948. Read-only memory was simpler to implement since it needed only a mechanism to read stored values, not to change them in-place, and thus could be implemented with very crude electromechanical devices. With the advent of integrated circuits in the 1960s, both ROM and its mutable counterpart static RAM were implemented as arrays of transistors in silicon chips.