RGB color model
The RGB color model is an additive color model in which red, green, and blue light are added together in various ways to reproduce a broad array of colors. The name of the model comes from the initials of the three additive primary colors: red, green, and blue. The main purpose of the RGB color model is the sensing and display of images in electronic systems, such as televisions and computers, though it has also been used in conventional photography. Before the electronic age, the RGB color model already had a solid theory behind it, based in human perception of colors. RGB is a device-dependent color model: different devices detect or reproduce a given RGB value differently, since the color elements and their response to the individual R, G, and B levels vary from manufacturer to manufacturer, or even in the same device over time; thus an RGB value does not define the same color across devices without some kind of color management. Typical RGB input devices are color TV and video cameras, image scanners, and digital cameras. Typical RGB output devices are TV sets of various technologies, mobile phone displays, video projectors, multicolor LED displays, and large screens such as the JumboTron.
Color printers, on the other hand, are subtractive color devices rather than RGB devices. This article discusses concepts common to all the different color spaces that use the RGB color model, which are used in one implementation or another in color image-producing technology. To form a color with RGB, three light beams (one red, one green, and one blue) must be superimposed; each of the three beams is called a component of that color, and each of them can have an arbitrary intensity, from fully off to fully on, in the mixture. The RGB color model is additive in the sense that the three light beams are added together, their light spectra adding wavelength for wavelength to make the final color's spectrum; this is the opposite of the subtractive color model that applies to paints, inks and other substances whose color depends on reflecting the light under which we see them. Because of this additivity, the three primary lights at full intensity create white, in stark contrast to physical colorants such as dyes, which create black when mixed. Zero intensity for each component gives the darkest color, and full intensity of each gives a white.
When the intensities for all the components are the same, the result is a shade of gray, darker or lighter depending on the intensity. When the intensities are different, the result is a colorized hue, more or less saturated depending on the difference between the strongest and weakest of the intensities of the primary colors employed; when one of the components has the strongest intensity, the color is a hue near this primary color, and when two components have the same strongest intensity, the color is a hue of a secondary color. A secondary color is formed by the sum of two primary colors of equal intensity: cyan is green+blue, magenta is red+blue, and yellow is red+green; every secondary color is the complement of one primary color. The RGB color model itself does not define what is meant by red, green, and blue colorimetrically, so the results of mixing them are not specified as absolute, but relative to the primary colors; when the exact chromaticities of the red, green, and blue primaries are defined, the color model becomes an absolute color space, such as sRGB or Adobe RGB.
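As a rough illustration of this additive mixing, the following C sketch assumes the common (but not universal) 8-bit-per-channel encoding, in which 0 means a component is off and 255 means full intensity; mixing pairs of full-intensity primaries yields the secondary colors, and mixing all three yields white.

```c
#include <stdio.h>

/* A color is a triple of component intensities, 0 (off) to 255 (full intensity). */
typedef struct { unsigned char r, g, b; } rgb;

/* Additive mixing: component intensities add, clamped to the encoding's maximum. */
static unsigned char clamp_add(unsigned char a, unsigned char b) {
    unsigned int s = (unsigned int)a + (unsigned int)b;
    return (unsigned char)(s > 255 ? 255 : s);
}

static rgb mix(rgb x, rgb y) {
    rgb out = { clamp_add(x.r, y.r), clamp_add(x.g, y.g), clamp_add(x.b, y.b) };
    return out;
}

int main(void) {
    rgb red   = { 255, 0, 0 }, green = { 0, 255, 0 }, blue = { 0, 0, 255 };

    rgb yellow  = mix(red, green);            /* (255,255,0)   */
    rgb magenta = mix(red, blue);             /* (255,0,255)   */
    rgb cyan    = mix(green, blue);           /* (0,255,255)   */
    rgb white   = mix(mix(red, green), blue); /* (255,255,255) */

    printf("yellow=(%d,%d,%d) magenta=(%d,%d,%d) cyan=(%d,%d,%d) white=(%d,%d,%d)\n",
           yellow.r, yellow.g, yellow.b, magenta.r, magenta.g, magenta.b,
           cyan.r, cyan.g, cyan.b, white.r, white.g, white.b);
    return 0;
}
```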
The choice of primary colors is related to the physiology of the human eye. The normal three kinds of light-sensitive photoreceptor cells in the human eye (cone cells) respond most to yellow (long-wavelength), green (medium-wavelength) and violet (short-wavelength) light; the difference in the signals received from the three kinds allows the brain to differentiate a wide gamut of different colors, while being most sensitive to yellowish-green light and to differences between hues in the green-to-orange region. As an example, suppose that light in the orange range of wavelengths enters the eye and strikes the retina. Light of these wavelengths would activate both the medium- and long-wavelength cones of the retina, but not equally: the long-wavelength cells will respond more. The difference in the response can be detected by the brain, and this difference is the basis of our perception of orange. Thus, the orange appearance of an object results from light from the object entering our eye and stimulating the different cones, but to different degrees. Use of the three primary colors is not sufficient to reproduce all colors.
The RGB color model is based on the Young–Helmholtz theory of trichromatic color vision, developed by Thomas Young and Hermann Helmholtz in the early to mid nineteenth century, on James Clerk Maxwell's c
MOS Technology 8563
The 8563 Video Display Controller (VDC) was an integrated circuit produced by MOS Technology. It was used in the Commodore 128 computer to generate an 80-column RGB video display, running alongside a VIC-II which supported Commodore 64-compatible graphics; the DCR models of the C128 used the later, more technically advanced 8568 VDC controller. Intended for a planned UNIX-based business computer based around the Zilog Z8000, Commodore designed the VDC into several prototype machines. Of these, only the Commodore 128 saw production. Unlike earlier MOS video chips such as the popular VIC-II, the VDC had dedicated video memory, 16 kilobytes in the original or "flat" C128 and 64 kilobytes in the C128DCR; this RAM was not directly accessible by the microprocessor. The 8563 was more difficult to produce than most of the rest of the MOS Technology line, and initial yields were low; the early units had significant reliability problems and tended to self-destruct from overheating. There were also timing issues with the VDC that would cause indirect load and store operations on its registers to malfunction.
The VDC was a text-only chip, although a careful reading of the technical literature from MOS Technology given to early C128 developers did indicate that a high-resolution bitmap mode was possible; it simply wasn't described in any detail. BASIC 7.0, the Commodore 128's built-in programming language, only supported high-resolution graphics in 40-column mode via the legacy VIC-II chip. Shortly after the release of the C128, the VDC's bitmap mode was described in considerable detail in the Data Becker book "Commodore 128 - Das große GRAFIK-Buch", in which the German authors Klaus Löffelmann and Dieter Vüllers provided an assembly language program that made it possible to set or clear any pixel and, using BASIC to perform the necessary calculations, to generate bitmapped geometric shapes on the 80-column screen. In February 1986, less than a year after the Commodore 128's release, RUN magazine published "Ultra Hi-Res Graphics", an article describing the VDC's bitmapped mode and including a type-in program that extended BASIC 7.0's capabilities to support 640×200 high-resolution graphics using the 8563.
Authors Lou Wallace and David Darus developed the Ultra Hi-Res utility into a commercial package, BASIC 8. One of the most popular third-party utilities for the C128, it offered more advanced VDC high-resolution capabilities to a wide audience of programmers. Commodore later offered complete official documentation on the VDC in the Commodore 128 Programmer's Reference Guide. VDC bitmap modes were used extensively in the C128 version of the GEOS operating system. The VDC lacked sprite capabilities; however, it did contain blitting capabilities to autonomously perform small block memory copies within its dedicated video RAM. While the VDC is performing such a copy, the system CPU can continue running code, provided no other VDC accesses are attempted before the copy is finished; these functions were used by the C128's screen editor ROM to scroll or clear screen sections. The VDC's features include RGBI output compatible with IBM's CGA video standard, and a 16 or 64 kilobyte address space for display, character shape and display attribute memory.
It supports up to 720×700 pixel video resolution in interlaced mode; other image sizes are possible depending on the programmer's needs, such as 640×200 non-interlaced, 640×400 interlaced, etc. Text resolution is 80×25 characters, with 8 colors at 2 intensities (16 colors in all). The CGA-compatible output applies to US 60 Hz C128s only: 50 Hz C128 machines output a signal with a 50 Hz vertical refresh. Although not conforming to the CGA standard, most CGA monitors were capable of displaying the 50 Hz signal without problems; however, some monitors either failed to resolve the signal, or succeeded in resolving it but sooner or later their deflection circuits would fail. Addressing the VDC's internal registers and dedicated video memory must be accomplished by indirect means. First the program must tell the VDC which internal register is to be accessed; next the program must wait until the VDC is ready for the access, after which a read or write on the selected internal register may be performed. Typical register read and write operations follow the pattern shown in the sketch after this paragraph. Owing to this somewhat cumbersome method of controlling the VDC, the maximum possible frame rate in bitmapped mode is too slow for arcade-style action video games, in which bit-intensive manipulation of the display is required.
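The sketch below illustrates that access pattern as C in the style of a 6502-targeting cross compiler such as cc65. The $D600 address/status register, the $D601 data register and the bit-7 "ready" flag are the commonly documented C128 values; treat the routines as an outline of the sequence rather than a verified drop-in implementation.

```c
#include <stdint.h>

#define VDC_ADDR (*(volatile uint8_t *)0xD600U)  /* address/status register */
#define VDC_DATA (*(volatile uint8_t *)0xD601U)  /* data register           */

/* Read one of the VDC's internal registers. */
static uint8_t vdc_read(uint8_t reg) {
    VDC_ADDR = reg;                /* tell the VDC which register we want */
    while (!(VDC_ADDR & 0x80)) {   /* wait for the ready bit (bit 7)      */
        ;
    }
    return VDC_DATA;               /* the access itself is a plain read   */
}

/* Write one of the VDC's internal registers. */
static void vdc_write(uint8_t reg, uint8_t value) {
    VDC_ADDR = reg;                /* select the internal register        */
    while (!(VDC_ADDR & 0x80)) {
        ;
    }
    VDC_DATA = value;              /* ...or a plain write                 */
}
```

For example, `vdc_write(18, hi); vdc_write(19, lo);` would set the VDC's current memory address before a block of data accesses, assuming registers 18 and 19 are the update-address pair as commonly documented.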
In standard text mode, the VDC behaves much like the VIC-II, except with 2K of screen memory instead of 1K. The power-on default configuration places screen memory at $0-$7FF and the color memory at $800-$9FF; both can be moved anywhere in VDC memory as long as they lie on a 2K boundary. Attributes are handled like the VIC-II's high-resolution mode, with a global background color and each character's foreground color set individually via the color RAM. In addition to color data, the latter contains attribute data for each character: bit 4 causes the character to blink if enabled, bit 5 produces underlined characters, bit 6 inverts the character's bitmap pattern, and bit 7 enables the alternate character set; the VDC can use as many as 512 characters. When the alternate character flag for a given character is enabled, the character pattern will be drawn from characters 256-511, thus if char
Random-access memory
Random-access memory (RAM) is a form of computer data storage that stores data and machine code currently being used. A random-access memory device allows data items to be read or written in almost the same amount of time irrespective of the physical location of data inside the memory. In contrast, with other direct-access data storage media such as hard disks, CD-RWs, DVD-RWs and the older magnetic tapes and drum memory, the time required to read and write data items varies significantly depending on their physical locations on the recording medium, due to mechanical limitations such as media rotation speeds and arm movement. RAM contains multiplexing and demultiplexing circuitry to connect the data lines to the addressed storage for reading or writing the entry. Usually more than one bit of storage is accessed by the same address, so RAM devices often have multiple data lines and are said to be "8-bit" or "16-bit", etc. devices. In today's technology, random-access memory takes the form of integrated circuits. RAM is normally associated with volatile types of memory, where stored information is lost if power is removed, although non-volatile RAM has also been developed.
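As a toy illustration of the distinction drawn above, the following C sketch models a "16-bit" RAM device as an array of 16-bit words that can be read at any address in the same time, and contrasts it with a tape-like medium whose access cost grows with how far the head must move; the sizes and the cost model are purely illustrative assumptions.

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#define WORDS 4096                     /* illustrative device size */

static uint16_t ram[WORDS];            /* a "16-bit" device: 16 data lines per address */

/* Random access: the cost of a read does not depend on the address. */
static uint16_t ram_read(uint32_t addr) {
    return ram[addr % WORDS];
}

/* Tape-like access: cost is proportional to how far the head must travel. */
static long tape_seek_cost(long head, long addr) {
    return labs(addr - head);
}

int main(void) {
    ram[10] = 0xBEEF;
    ram[4000] = 0xCAFE;
    printf("RAM:  word 10 = %04X, word 4000 = %04X (same access time)\n",
           (unsigned)ram_read(10), (unsigned)ram_read(4000));
    printf("Tape: reaching word 10 from position 0 costs %ld, word 4000 costs %ld\n",
           tape_seek_cost(0, 10), tape_seek_cost(0, 4000));
    return 0;
}
```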
Other types of non-volatile memories exist that allow random access for read operations, but either do not allow write operations or have other kinds of limitations on them. These include most types of ROM and a type of flash memory called NOR flash. Integrated-circuit RAM chips came onto the market in the early 1970s, with the first commercially available DRAM chip, the Intel 1103, introduced in October 1970. Early computers used relays, mechanical counters or delay lines for main memory functions. Ultrasonic delay lines could only reproduce data in the order in which it was written. Drum memory could be expanded at relatively low cost, but efficient retrieval of memory items required knowledge of the physical layout of the drum to optimize speed. Latches built out of vacuum tube triodes, and later out of discrete transistors, were used for smaller and faster memories such as registers; such registers were relatively large and too costly to use for large amounts of data. The first practical form of random-access memory was the Williams tube, starting in 1947.
It stored data as electrically charged spots on the face of a cathode ray tube. Since the electron beam of the CRT could read and write the spots on the tube in any order, memory was random access. The capacity of the Williams tube was a few hundred to around a thousand bits, but it was much smaller and more power-efficient than using individual vacuum tube latches. Developed at the University of Manchester in England, the Williams tube provided the medium on which the first electronically stored program was implemented in the Manchester Baby computer, which first ran a program on 21 June 1948. In fact, rather than the Williams tube memory being designed for the Baby, the Baby was a testbed to demonstrate the reliability of the memory. Magnetic-core memory, developed up until the mid-1970s, became a widespread form of random-access memory. Data could be stored by changing the sense of each ring's magnetization, with one bit stored per ring. Since every ring had a combination of address wires to select and read or write it, access to any memory location in any sequence was possible.
Magnetic core memory remained the standard form of memory system until displaced by solid-state memory in integrated circuits, starting in the early 1970s. Dynamic random-access memory (DRAM) allowed replacement of a 4- or 6-transistor latch circuit by a single transistor for each memory bit, greatly increasing memory density at the cost of volatility. Data was stored in the tiny capacitance of each transistor and had to be periodically refreshed every few milliseconds before the charge could leak away. The Toshiba Toscal BC-1411 electronic calculator, introduced in 1965, used a form of DRAM built from discrete components. DRAM was developed by Robert H. Dennard in 1968. Prior to the development of integrated read-only memory circuits, permanent random-access memory was constructed using diode matrices driven by address decoders, or specially wound core rope memory planes. The two widely used forms of modern RAM are static RAM (SRAM) and dynamic RAM (DRAM). In SRAM, a bit of data is stored using the state of a six-transistor memory cell.
This form of RAM is more expensive to produce, but is generally faster and requires less dynamic power than DRAM. In modern computers, SRAM is used as cache memory for the CPU. DRAM stores a bit of data using a transistor and capacitor pair, which together comprise a DRAM cell; the capacitor holds a high or low charge, and the transistor acts as a switch that lets the control circuitry on the chip read the capacitor's state of charge or change it. As this form of memory is less expensive to produce than static RAM, it is the predominant form of computer memory used in modern computers. Both static and dynamic RAM are considered volatile, as their state is lost or reset when power is removed from the system. By contrast, read-only memory (ROM) stores data by permanently enabling or disabling selected transistors, such that the memory cannot be altered. Writeable variants of ROM share properties of both ROM and RAM, enabling data to persist without power and to be updated without requiring special equipment; these persistent forms of semiconductor ROM include USB flash drives, memory cards for cameras and portable devices, and solid-state drives.
ECC memory includes special circuitry to detect and/or correct random faults (mem
Volt
The volt (symbol: V) is the derived unit for electric potential, electric potential difference (voltage), and electromotive force. It is named after the Italian physicist Alessandro Volta. One volt is defined as the difference in electric potential between two points of a conducting wire when an electric current of one ampere dissipates one watt of power between those points. It is also equal to the potential difference between two parallel, infinite planes spaced 1 meter apart that create an electric field of 1 newton per coulomb, and to the potential difference between two points that will impart one joule of energy per coulomb of charge that passes through it. It can be expressed in terms of SI base units as V = potential energy / charge = J/C = kg·m²/(A·s³). It can also be expressed as amperes times ohms, watts per ampere, or joules per coulomb, equivalently electronvolts per elementary charge: V = A·Ω = W/A = J/C = eV/e. The "conventional" volt, V90, defined in 1987 by the 18th General Conference on Weights and Measures and in use from 1990, is implemented using the Josephson effect for exact frequency-to-voltage conversion, combined with the caesium frequency standard.
For the Josephson constant, KJ = 2e/h (where e is the elementary charge and h is the Planck constant), the "conventional" value KJ-90 is used: KJ-90 = 0.4835979 GHz/μV. This standard is typically realized using a series-connected array of several thousand or tens of thousands of junctions, excited by microwave signals between 10 and 80 GHz. Empirically, several experiments have shown that the method is independent of device design, measurement setup, etc., and no correction terms are required in a practical implementation. In the water-flow analogy, sometimes used to explain electric circuits by comparing them with water-filled pipes, voltage is likened to a difference in water pressure, and current is proportional to the amount of water flowing at that pressure. A resistor would be a reduced diameter somewhere in the piping, and a capacitor or inductor could be likened to a "U"-shaped pipe where a higher water level on one side can store energy temporarily. The relationship between voltage and current is defined by Ohm's law; Ohm's law is analogous to the Hagen–Poiseuille equation, as both are linear models relating flux and potential in their respective systems.
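To make these relations concrete, here is a short C sketch: it checks numerically that one ampere through one ohm dissipating one watt corresponds to one volt, and uses the conventional constant quoted above to convert a microwave drive frequency into the voltage of a single junction on its first constant-voltage step. The 70 GHz drive frequency and the single-junction, single-step case are illustrative simplifications of the series arrays described above.

```c
#include <stdio.h>

int main(void) {
    /* V = A * ohm = W / A = J / C: 1 A through 1 ohm dissipates 1 W across 1 V. */
    double current = 1.0;                      /* amperes */
    double resistance = 1.0;                   /* ohms    */
    double voltage = current * resistance;     /* volts   */
    double power = voltage * current;          /* watts   */
    printf("V = %.3f V, P = %.3f W, P/I = %.3f V\n", voltage, power, power / current);

    /* Conventional Josephson constant K_J-90 = 0.4835979 GHz/uV = 483597.9 GHz/V. */
    double kj90 = 483597.9e9;                  /* hertz per volt */
    double freq = 70.0e9;                      /* hertz; an illustrative drive frequency */
    printf("single junction, first step, at 70 GHz: %.1f uV\n", freq / kj90 * 1e6);
    return 0;
}
```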
The voltage produced by each electrochemical cell in a battery is determined by the chemistry of that cell; see Galvanic cell § Cell voltage. Cells can be combined in series for multiples of that voltage, or additional circuitry added to adjust the voltage to a different level. Mechanical generators can be constructed to any voltage in a range of feasibility. Nominal voltages of familiar sources:
Nerve cell resting potential: ~75 mV
Single-cell, rechargeable NiMH or NiCd battery: 1.2 V
Single-cell, non-rechargeable alkaline battery: 1.5 V
Some antique vehicles' electrical systems: 6.3 V
Electric vehicle battery: 400 V when charged
Household mains electricity (AC): 100 V in Japan, 120 V in North America, 230 V in Europe, Asia and Australia
Rapid transit third rail: 600–750 V
High-speed train overhead power lines: 25 kV at 50 Hz (but see the List of railway electrification systems and 25 kV at 60 Hz for exceptions)
High-voltage electric power transmission lines: 110 kV and up
Lightning: varies, often around 100 MV
In 1800, as the result of a professional disagreement over the galvanic response advocated by Luigi Galvani, Alessandro Volta developed the so-called voltaic pile, a forerunner of the battery, which produced a steady electric current. Volta had determined that the most effective pair of dissimilar metals to produce electricity was zinc and silver. In 1861, Latimer Clark and Sir Charles Bright coined the name "volt" for the unit of resistance. By 1873, the British Association for the Advancement of Science had defined the volt and the farad. In 1881, the International Electrical Congress, now the International Electrotechnical Commission, approved the volt as the unit for electromotive force; they made the volt equal to 10^8 cgs units of voltage
Productivity software
Productivity software is application software used for producing information. Its name arose from the fact that it increases the productivity of individual office workers, from typists to knowledge workers, although its scope is now wider than that. Office suites, which brought word processing, spreadsheet and relational database programs to the desktop in the 1980s, are the core example of productivity software. They revolutionized the office with the magnitude of the productivity increase they brought as compared with the pre-1980s office environments of typewriters, paper filing, and handwritten lists and ledgers. Some 78% of "middle-skill" occupations now require the use of productivity software. In the 2010s, productivity software has become more consumerized than it previously was, as computing has become more integrated into daily personal life. Productivity software traditionally runs directly on a computer; for example, the Commodore Plus/4 computer contained productivity software applications in ROM. Productivity software is one of the reasons people use personal computers.
Productivity software can fall into categories such as the following. Time management software tracks how time is spent on a desktop without any user intervention; this allows a person to analyse how much time is spent on each task, and to re-prioritise tasks so that time is spent on the most important ones. Project management software lets one delegate and track major projects and get a quick overview of the progress made by each team member. An office suite is a collection of bundled productivity software intended to be used by knowledge workers; the components are distributed together, have a consistent user interface and can interact with each other, sometimes in ways that the operating system would not normally allow. The earliest office suite for personal computers was Starburst in the early 1980s, comprising the word processor WordStar together with the companion applications CalcStar and DataStar. Various other suites arose in the 1980s, and over the course of the 1990s Microsoft Office came to dominate the market, a position it retains as of 2018.
Existing office suites contain a wide range of components. The most common base components are a word processor, a spreadsheet and a presentation program. Other components of office suites include database software, a graphics suite, desktop publishing software, a formula editor, diagramming software, an email client, communication software, a personal information manager, notetaking software, groupware, project management software and web log analysis software.
Monochrome
A monochromic image is composed of one color. The term monochrome comes from the Ancient Greek monochromos, literally 'having one color'. A monochromatic object or image reflects colors in shades of limited hues. Images using only shades of grey are called grayscale or black-and-white. However, scientifically speaking, monochromatic light refers to visible light of a narrow band of wavelengths. Of an image, the term monochrome is usually taken to mean the same as black and white or, more likely, grayscale, but it may also be used to refer to other combinations containing only tones of a single color, such as green-and-white or green-and-red. It may also refer to sepia images displaying tones from light tan to dark brown, to cyanotype images, and to early photographic methods such as daguerreotypes and tintypes, each of which may be used to produce a monochromatic image. In computing, monochrome has two meanings: it may mean having only one color which is either on or off, or allowing shades of that color. A monochrome computer display is able to display only a single color, such as green, red or white, and often also shades of that color.
In film photography, monochrome is typically the use of black-and-white film. Originally, all photography was done in monochrome. Although color photography was possible even in the late 19th century, easily used color films such as Kodachrome were not available until the mid-1930s. In digital photography, monochrome is the capture of only shades of black by the sensor, or the post-processing of a color image to present only the perceived brightness by combining the values of multiple channels; the weighting of the individual channels may be selected to achieve a desired artistic effect. If the red channel is eliminated and the green and blue channels are combined, the effect will be similar to that of orthochromatic film or the use of a cyan filter on panchromatic film. The selection of weighting thus allows a wide range of artistic expression in the final monochromatic image. For production of an anaglyph image, the original color stereogram source may first be reduced to monochrome in order to simplify the rendering of the image; this is sometimes required in cases where a color image would render in a confusing manner given the colors and patterns present in the source image and the selection filters used.
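As a sketch of such channel weighting, the following C example converts a pixel to a single brightness value. The Rec. 601 luma weights used for the first conversion are one common choice (an assumption here, not something the text prescribes), and dropping the red channel approximates the orthochromatic-film effect mentioned above.

```c
#include <stdint.h>
#include <stdio.h>

typedef struct { uint8_t r, g, b; } pixel;

/* Weighted-sum monochrome conversion; the caller chooses the channel weights. */
static uint8_t to_gray(pixel p, double wr, double wg, double wb) {
    double y = wr * p.r + wg * p.g + wb * p.b;
    if (y > 255.0) y = 255.0;
    return (uint8_t)(y + 0.5);
}

int main(void) {
    pixel p = { 200, 120, 40 };   /* an orange-ish sample pixel */

    /* Perceived brightness, using the Rec. 601 luma coefficients. */
    printf("luma weights:   %d\n", to_gray(p, 0.299, 0.587, 0.114));
    /* No red channel: roughly the orthochromatic / cyan-filter look. */
    printf("no red channel: %d\n", to_gray(p, 0.0, 0.5, 0.5));
    return 0;
}
```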
In physics, monochromatic light is electromagnetic radiation of a single frequency. In the physical sense, no source of electromagnetic radiation is purely monochromatic, since that would require a wave of infinite duration as a consequence of the Fourier transform's localization property; even controlled sources such as lasers operate in a range of frequencies. In practice, filtered light, diffraction-grating-separated light and laser light are all referred to as monochromatic. Light sources can be compared, and one can be labeled as "more monochromatic" than another. A device which isolates a narrow band of frequencies from a broader-bandwidth source is called a monochromator, even though the bandwidth is often explicitly specified, and thus a collection of frequencies is understood. Related terms:
Duotone – the use of two ink colors in printing
Halftone – the use of black and white in a pattern, perceived as shades of grey
Polychrome – of multiple colors, the opposite of monochrome
Monochromacy
Monochromatic color
Selective color – use of monochrome and color selectively within an image
Monochrome painting – monochromes in art
Personal computer
A personal computer (PC) is a multi-purpose computer whose size, capabilities and price make it feasible for individual use. Personal computers are intended to be operated directly by an end user, rather than by a computer expert or technician. Unlike large, costly minicomputers and mainframes, personal computers are not time-shared by many people at the same time. Institutional or corporate computer owners in the 1960s had to write their own programs to do any useful work with the machines. While personal computer users may develop their own applications, usually these systems run commercial software, free-of-charge software or free and open-source software, provided in ready-to-run form. Software for personal computers is typically developed and distributed independently from the hardware or operating system manufacturers. Many personal computer users no longer need to write their own programs to make any use of a personal computer, although end-user programming is still feasible. This contrasts with mobile systems, where software is often only available through a manufacturer-supported channel and end-user program development may be discouraged by lack of support by the manufacturer.
Since the early 1990s, Microsoft operating systems and Intel hardware have dominated much of the personal computer market, first with MS-DOS and later with Microsoft Windows. Alternatives to Microsoft's Windows operating systems occupy a minority share of the industry; these include free and open-source Unix-like operating systems such as Linux. Advanced Micro Devices (AMD) provides the main alternative to Intel's processors. The advent of personal computers and the concurrent Digital Revolution have affected the lives of people in all countries. "PC" is an initialism for "personal computer". The IBM Personal Computer incorporated the designation in its model name, and it is sometimes useful to distinguish personal computers of the "IBM Personal Computer" family from personal computers made by other manufacturers. For example, "PC" is used in contrast with "Mac", an Apple Macintosh computer. Since none of these Apple products were mainframes or time-sharing systems, they were all "personal computers", but not "PC" computers.
The "brain" may one day come down to our level and help with our income-tax and book-keeping calculations. But this is speculation and there is no sign of it so far. In the history of computing, early experimental machines could be operated by a single attendant. For example, ENIAC which became operational in 1946 could be run by a single, albeit trained, person; this mode pre-dated the batch programming, or time-sharing modes with multiple users connected through terminals to mainframe computers. Computers intended for laboratory, instrumentation, or engineering purposes were built, could be operated by one person in an interactive fashion. Examples include such systems as the Bendix G15 and LGP-30of 1956, the Programma 101 introduced in 1964, the Soviet MIR series of computers developed from 1965 to 1969. By the early 1970s, people in academic or research institutions had the opportunity for single-person use of a computer system in interactive mode for extended durations, although these systems would still have been too expensive to be owned by a single person.
In what was to be called the Mother of All Demos, SRI researcher Douglas Engelbart in 1968 gave a preview of what would become the staples of daily working life in the 21st century: e-mail, word processing, video conferencing and the mouse. The demonstration required technical support staff and a mainframe time-sharing computer that were far too costly for individual business use at the time. The development of the microprocessor, with widespread commercial availability starting in the mid-1970s, made computers cheap enough for small businesses and individuals to own. Early personal computers, generally called microcomputers, were often sold in kit form and in limited volumes, and were of interest mostly to hobbyists and technicians. Minimal programming was done with toggle switches to enter instructions, and output was provided by front panel lamps. Practical use required adding peripherals such as keyboards, computer displays, disk drives and printers. The Micral N was the earliest commercial, non-kit microcomputer based on a microprocessor, the Intel 8008.
It was built starting in 1972, and a few hundred units were sold. It had been preceded by the Datapoint 2200 in 1970, for which the Intel 8008 had been commissioned, though not accepted for use; the CPU design implemented in the Datapoint 2200 became the basis for the x86 architecture used in the original IBM PC and its descendants. In 1973, the IBM Los Gatos Scientific Center developed a portable computer prototype called SCAMP, based on the IBM PALM processor with a Philips compact cassette drive, small CRT and full-function keyboard. SCAMP emulated an IBM 1130 minicomputer in order to run APL/1130. In 1973, APL was generally available only on mainframe computers, and most desktop-sized microcomputers such as the Wang 2200 or HP 9800 offered only BASIC. Because SCAMP was the first to emulate APL/1130 performance on a portable, single-user computer, PC Magazine in 1983 designated SCAMP a "revolutionary concept" and "the world's first personal computer". This seminal, single-user portable computer now resides in the Smithsonian Institution in Washington, D.C.
Successful demonstrations of the 1973 SCAMP prototype led to the IBM 5100 portable microcomputer, launched in 1975 with the ability to be programmed in both APL and BASIC for engineers, analysts and other business problem-solvers. In the late 1960s such a machine would have been nearly as large as two desks and would have weigh