Computer graphics are pictures and films created using computers. The term refers to computer-generated image data created with the help of specialized graphical hardware and software, and it is a vast and recently developed area of computer science. The phrase was coined in 1960 by computer graphics researchers Verne Hudson and William Fetter of Boeing. It is often abbreviated as CG, though sometimes erroneously referred to as computer-generated imagery (CGI). Topics in computer graphics include user interface design, sprite graphics, vector graphics, 3D modeling, shaders, GPU design, implicit surface visualization with ray tracing, and computer vision, among others; the overall methodology depends on the underlying sciences of geometry and physics. Computer graphics is responsible for displaying art and image data effectively and meaningfully to the consumer, and it is also used for processing image data received from the physical world. Computer graphics development has had a significant impact on many types of media and has revolutionized animation, advertising, video games, and graphic design in general.
The term computer graphics has been used in a broad sense to describe "almost everything on computers that is not text or sound". More specifically, it refers to several different things: the representation and manipulation of image data by a computer; the various technologies used to create and manipulate images; and the sub-field of computer science which studies methods for digitally synthesizing and manipulating visual content. Today, computer graphics is widespread; such imagery is found on television, in weather reports, and in a variety of medical investigations and surgical procedures. A well-constructed graph can present complex statistics in a form that is easier to understand and interpret. In the media, "such graphs are used to illustrate papers, theses" and other presentation material. Many tools have been developed to visualize data. Computer-generated imagery can be categorized into several different types: two-dimensional, three-dimensional, and animated graphics. As technology has improved, 3D computer graphics have become more common, but 2D computer graphics are still widely used.
Computer graphics has emerged as a sub-field of computer science which studies methods for digitally synthesizing and manipulating visual content. Over the past decade, other specialized fields have developed, like information visualization, and scientific visualization, which is more concerned with "the visualization of three dimensional phenomena, where the emphasis is on realistic renderings of volumes, illumination sources, and so forth, with a dynamic component". The precursor sciences to the development of modern computer graphics were the advances in electrical engineering and television that took place during the first half of the twentieth century. Screens could display art since the Lumière brothers' use of mattes to create special effects for the earliest films, dating from 1895, but such displays were limited and not interactive. The first cathode ray tube, the Braun tube, was invented in 1897; it in turn would permit the oscilloscope and the military control panel, the more direct precursors of the field, as they provided the first two-dimensional electronic displays that responded to programmatic or user input.
Computer graphics remained relatively unknown as a discipline until the 1950s and the post-World War II period, during which time it emerged from a combination of pure university and laboratory academic research into more advanced computers and the United States military's further development of wartime technologies like radar, advanced aviation, and rocketry. New kinds of displays were needed to process the wealth of information resulting from such projects, leading to the development of computer graphics as a discipline. Early projects like the Whirlwind and SAGE projects introduced the CRT as a viable display and interaction interface and introduced the light pen as an input device. Douglas T. Ross of the Whirlwind SAGE system performed a personal experiment in which a small program he wrote captured the movement of his finger and displayed its vector on a display scope. One of the first interactive video games to feature recognizable, interactive graphics, Tennis for Two, was created for an oscilloscope by William Higinbotham in 1958 to entertain visitors at Brookhaven National Laboratory; it simulated a tennis match.
In 1959, Douglas T. Ross innovated again while working at MIT on transforming mathematical statements into computer-generated 3D machine tool vectors, taking the opportunity to create a display scope image of a Disney cartoon character. Electronics pioneer Hewlett-Packard went public in 1957, having incorporated the decade prior, and established strong ties with Stanford University through its founders, who were alumni. This began the decades-long transformation of the southern San Francisco Bay Area into the world's leading computer technology hub, now known as Silicon Valley. The field of computer graphics developed with the emergence of computer graphics hardware, and further advances in computing led to greater advancements in interactive computer graphics. In 1959, the TX-2 computer was developed at MIT's Lincoln Laboratory; the TX-2 integrated a number of new man-machine interfaces. A light pen could be used to draw sketches on the computer using Ivan Sutherland's revolutionary Sketchpad software.
Using a light pen, Sketchpad allowed one to draw simple shapes on the computer screen, save them, and recall them later. The light pen itself had a small photoelectric cell in its tip.
Random-access memory (RAM) is a form of computer data storage that stores data and machine code currently being used. A random-access memory device allows data items to be read or written in the same amount of time irrespective of the physical location of the data inside the memory. In contrast, with other direct-access data storage media such as hard disks, CD-RWs, DVD-RWs, and the older magnetic tapes and drum memory, the time required to read and write data items varies depending on their physical locations on the recording medium, due to mechanical limitations such as media rotation speeds and arm movement. RAM contains multiplexing and demultiplexing circuitry to connect the data lines to the addressed storage for reading or writing the entry. Usually more than one bit of storage is accessed by the same address, and RAM devices often have multiple data lines and are said to be "8-bit" or "16-bit" devices. In today's technology, random-access memory takes the form of integrated circuits. RAM is normally associated with volatile types of memory, where stored information is lost if power is removed, although non-volatile RAM has also been developed.
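The contrast between address-independent and position-dependent access can be made concrete with a toy model. The C sketch below is purely illustrative (the names and the drum latency model are invented for this example, not any real device's interface): the RAM-style array costs the same to reach at every address, while the drum-style store pays a "wait" that depends on how far the requested cell is from the current rotational position.

    /* Toy contrast between random access and drum-style access. */
    #include <stdint.h>
    #include <stdio.h>

    #define WORDS 1024

    static uint16_t ram[WORDS];   /* a "16-bit" device: one 16-bit word per address */

    /* RAM: one array index; the same cost for every address. */
    uint16_t ram_read(uint16_t addr)               { return ram[addr]; }
    void     ram_write(uint16_t addr, uint16_t v)  { ram[addr] = v; }

    /* Drum/delay-line style storage: the cost of a read depends on how far
       the requested word is from the current rotational position. */
    static uint16_t drum[WORDS];
    static int drum_pos = 0;

    int drum_read(int addr, uint16_t *out) {
        int wait = (addr - drum_pos + WORDS) % WORDS; /* cells passing under the head */
        drum_pos = addr;
        *out = drum[addr];
        return wait;                                  /* "latency" in cell times */
    }

    int main(void) {
        uint16_t v;
        ram_write(513, 0xBEEF);
        printf("RAM[513] = 0x%04X (constant time)\n", (unsigned)ram_read(513));
        printf("drum read latency: %d cell times\n", drum_read(513, &v));
        return 0;
    }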
Other types of non-volatile memories exist that allow random access for read operations, but either do not allow write operations or have other kinds of limitations on them. These include most types of ROM and a type of flash memory called NOR flash. Integrated-circuit RAM chips came onto the market in the early 1970s, with the first commercially available DRAM chip, the Intel 1103, introduced in October 1970. Early computers used relays, mechanical counters, or delay lines for main memory functions. Ultrasonic delay lines could only reproduce data in the order in which it was written. Drum memory could be expanded at low cost, but efficient retrieval of memory items required knowledge of the physical layout of the drum to optimize speed. Latches built out of vacuum tube triodes, and later out of discrete transistors, were used for smaller and faster memories such as registers, but such latches were too large and costly to use for large amounts of data. The first practical form of random-access memory was the Williams tube, starting in 1947.
It stored data as electrically charged spots on the face of a cathode ray tube. Since the electron beam of the CRT could read and write the spots on the tube in any order, memory was random access. The capacity of the Williams tube was a few hundred to around a thousand bits, but it was much smaller and more power-efficient than using individual vacuum tube latches. Developed at the University of Manchester in England, the Williams tube provided the medium on which the first electronically stored program was implemented in the Manchester Baby computer, which first ran a program on 21 June 1948. In fact, rather than the Williams tube memory being designed for the Baby, the Baby was a testbed to demonstrate the reliability of the memory. Magnetic-core memory, developed up until the mid-1970s, became a widespread form of random-access memory. Data could be stored by changing the sense of each ring's magnetization, with one bit stored per ring. Since every ring had a combination of address wires to select and read or write it, access to any memory location in any sequence was possible.
Magnetic core memory was the standard form of memory system until displaced by solid-state memory in integrated circuits, starting in the early 1970s. Dynamic random-access memory (DRAM) allowed the replacement of a 4- or 6-transistor latch circuit by a single transistor for each memory bit, greatly increasing memory density at the cost of volatility. Data was stored in the tiny capacitance of each transistor and had to be periodically refreshed every few milliseconds before the charge could leak away. The Toshiba Toscal BC-1411 electronic calculator, introduced in 1965, used a form of DRAM built from discrete components. Single-transistor-cell DRAM was developed by Robert H. Dennard in 1968. Prior to the development of integrated read-only memory (ROM) circuits, permanent random-access memory was often constructed using diode matrices driven by address decoders, or specially wound core rope memory planes. The two main forms of modern RAM are static RAM (SRAM) and dynamic RAM (DRAM). In SRAM, a bit of data is stored using the state of a six-transistor memory cell.
This form of RAM is more expensive to produce, but is generally faster and requires less dynamic power than DRAM. In modern computers, SRAM is often used as cache memory for the CPU. DRAM stores a bit of data using a transistor and capacitor pair, which together comprise a DRAM cell. The capacitor holds a high or low charge, and the transistor acts as a switch that lets the control circuitry on the chip read the capacitor's state of charge or change it. As this form of memory is less expensive to produce than static RAM, it is the predominant form of computer memory used in modern computers. Both static and dynamic RAM are considered volatile, as their state is lost or reset when power is removed from the system. By contrast, read-only memory (ROM) stores data by permanently enabling or disabling selected transistors, such that the memory cannot be altered. Writeable variants of ROM share properties of both ROM and RAM, enabling data to persist without power and to be updated without requiring special equipment. These persistent forms of semiconductor ROM include USB flash drives, memory cards for cameras and portable devices, and solid-state drives.
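The refresh requirement of the DRAM cell described above can be sketched with a toy model. In the C sketch below, the leak rate, refresh interval, and sense-amplifier threshold are all invented for illustration; real DRAM controllers refresh each row on a fixed schedule measured in milliseconds.

    /* Toy model of a DRAM cell: a capacitor whose charge leaks over time
       and must be refreshed (read and rewritten) periodically. */
    #include <stdio.h>

    typedef struct { double charge; } dram_cell;   /* 1.0 = full, 0.0 = empty */

    void write_bit(dram_cell *c, int bit) { c->charge = bit ? 1.0 : 0.0; }
    int  read_bit(const dram_cell *c)     { return c->charge > 0.5; }        /* sense threshold */
    void leak(dram_cell *c, double ms)    { c->charge *= 1.0 - 0.01 * ms; }  /* arbitrary rate */
    void refresh(dram_cell *c)            { write_bit(c, read_bit(c)); }     /* read, rewrite */

    int main(void) {
        dram_cell c;
        write_bit(&c, 1);
        for (int t = 0; t < 64; t += 8) {   /* refreshing every 8 "ms" keeps the bit alive; */
            leak(&c, 8.0);                  /* leaking for 64 ms unrefreshed would lose it */
            refresh(&c);
        }
        printf("bit after refreshed decay: %d\n", read_bit(&c));
        return 0;
    }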
ECC memory includes special circuitry to detect and/or correct random faults (memory errors) in the stored data.
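The codes used by ECC hardware generalize the classic Hamming scheme, which a short sketch can illustrate. The Hamming(7,4) code below protects 4 data bits with 3 parity bits, enough to locate and correct any single flipped bit; it illustrates the principle only, not the wider SECDED codes used by actual memory controllers.

    /* Hamming(7,4): positions 1,2,4 hold parity; positions 3,5,6,7 hold data. */
    #include <stdio.h>

    /* Encode 4 data bits d3..d0 into a 7-bit codeword. */
    unsigned encode(unsigned d) {
        unsigned d0 = d & 1, d1 = (d >> 1) & 1, d2 = (d >> 2) & 1, d3 = (d >> 3) & 1;
        unsigned p1 = d0 ^ d1 ^ d3;   /* covers positions 3,5,7 */
        unsigned p2 = d0 ^ d2 ^ d3;   /* covers positions 3,6,7 */
        unsigned p4 = d1 ^ d2 ^ d3;   /* covers positions 5,6,7 */
        return p1 | (p2 << 1) | (d0 << 2) | (p4 << 3) | (d1 << 4) | (d2 << 5) | (d3 << 6);
    }

    /* Decode, correcting a single flipped bit if one is present. */
    unsigned decode(unsigned cw) {
        unsigned s = 0;                       /* syndrome = position of the bad bit */
        for (int p = 1; p <= 7; p++)
            if (cw & (1u << (p - 1)))
                s ^= p;                       /* XOR of positions of all set bits */
        if (s) cw ^= 1u << (s - 1);           /* non-zero syndrome: flip the bit back */
        return ((cw >> 2) & 1) | (((cw >> 4) & 1) << 1) |
               (((cw >> 5) & 1) << 2) | (((cw >> 6) & 1) << 3);
    }

    int main(void) {
        unsigned cw = encode(0xB);
        cw ^= 1u << 4;                        /* simulate a random single-bit fault */
        printf("recovered data: 0x%X\n", decode(cw));   /* prints 0xB */
        return 0;
    }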
Animation is a method in which pictures are manipulated to appear as moving images. In traditional animation, images are drawn or painted by hand on transparent celluloid sheets to be photographed and exhibited on film. Today, most animations are made with computer-generated imagery. Computer animation can be very detailed 3D animation, while 2D computer animation can be used for stylistic reasons, low bandwidth, or faster real-time renderings. Other common animation methods apply a stop-motion technique to two- and three-dimensional objects like paper cutouts, puppets, or clay figures. The effect of animation is achieved by a rapid succession of sequential images that minimally differ from each other. The illusion, as in motion pictures in general, is thought to rely on the phi phenomenon and beta movement, but the exact causes are still uncertain. Analog mechanical animation media that rely on the rapid display of sequential images include the phénakisticope, the flip book, and film. Television and video are popular electronic animation media that originally were analog and now operate digitally.
For display on computers, techniques like animated GIF and Flash animation were developed. Animation has become ever more pervasive. Apart from short films, feature films, animated GIFs, and other media dedicated to the display of moving images, animation is heavily used for video games, motion graphics, and special effects, and it is prevalent in information technology interfaces. The physical movement of image parts through simple mechanics, as for instance in the moving images of magic lantern shows, can also be considered animation. The mechanical manipulation of puppets and objects to emulate living beings has a long history in automata, which were popularised as animatronics by Disney. Animators are artists specializing in creating animation. The word "animation" stems from the Latin "animationem", a noun of action from the past participle stem of "animare", meaning "the action of imparting life". The primary meaning of the English word is "liveliness", and it has been in use much longer than the meaning of "moving image medium". The history of animation started long before the development of cinematography.
Humans have attempted to depict motion as far back as the Paleolithic period. Shadow play and the magic lantern offered popular shows with moving images as the result of manipulation by hand and/or some minor mechanics. A 5,200-year-old pottery bowl discovered in Shahr-e Sukhteh has five sequential images painted around it that seem to show phases of a goat leaping up to nip at a tree. In 1833, the phenakistiscope introduced the stroboscopic principle of modern animation, which would provide the basis for the zoetrope, the flip book, the praxinoscope, and cinematography. Charles-Émile Reynaud further developed his projection praxinoscope into the Théâtre Optique, with transparent hand-painted colorful pictures in a long perforated strip wound between two spools, patented in December 1888. From 28 October 1892 to March 1900, Reynaud gave over 12,800 shows to a total of over 500,000 visitors at the Musée Grévin in Paris. His Pantomimes Lumineuses series of animated films each contained 300 to 700 frames that were manipulated back and forth to last 10 to 15 minutes per film.
Piano music and some dialogue were performed live, while some sound effects were synchronized with an electromagnet. When film became a common medium, some manufacturers of optical toys adapted small magic lanterns into toy film projectors for short loops of film. By 1902, they were producing many chromolithography film loops, usually by tracing live-action film footage. Some early filmmakers, including J. Stuart Blackton, Arthur Melbourne-Cooper, Segundo de Chomón, and Edwin S. Porter, experimented with stop-motion animation from around 1899. Blackton's The Haunted Hotel was the first huge success that baffled audiences with objects moving by themselves and inspired other filmmakers to try the technique for themselves. Blackton also experimented with animation drawn on blackboards and some cutout animation in Humorous Phases of Funny Faces. In 1908, Émile Cohl's Fantasmagorie was released, with a white-on-black chalk-line look created with negative prints from black ink drawings on white paper. The film consists of a stick figure moving about and encountering all kinds of morphing objects, including a wine bottle that transforms into a flower.
Inspired by Émile Cohl's stop-motion film Les allumettes animées, Ladislas Starevich started making his influential puppet animations in 1910. Winsor McCay's Little Nemo showcased very detailed drawings, and his Gertie the Dinosaur was an early example of character development in drawn animation. During the 1910s, the production of animated short films, typically referred to as "cartoons", became an industry of its own, and cartoon shorts were produced for showing in movie theaters. The most successful producer at the time was John Randolph Bray, who, along with animator Earl Hurd, patented the cel animation process that dominated the animation industry for the rest of the decade. El Apóstol was a 1917 Argentine animated film utilizing cutout animation, and the world's first animated feature film. A fire that destroyed producer Federico Valle's film studio incinerated the only known copy of El Apóstol, and it is now considered a lost film. In 1919, the silent animated short Feline Follies was released, marking the debut of Felix the Cat, the first animated character in the silent film era to win a high level of popularity.
The earliest extant feature-length animated film is The Adventures of Prince Achmed, Lotte Reiniger's 1926 silhouette animation.
The Amiga is a family of personal computers introduced by Commodore in 1985. The original model was part of a wave of 16- and 32-bit computers that featured 256 KB or more of RAM, mouse-based GUIs, and improved graphics and audio over 8-bit systems. This wave included the Atari ST, released the same year, Apple's Macintosh, and the Apple IIGS. Based on the Motorola 68000 microprocessor, the Amiga differed from its contemporaries through the inclusion of custom hardware to accelerate graphics and sound, including sprites and a blitter, and a pre-emptive multitasking operating system called AmigaOS. The Amiga 1000 was released in July 1985, but a series of production problems kept it from becoming widely available until early 1986. The best-selling model, the Amiga 500, was introduced in 1987 and became one of the leading home computers of the late 1980s and early 1990s, with four to six million sold. The A3000, introduced in 1990, started the second generation of Amiga systems, followed by the A500+ and the A600 in March 1992.
As the third generation, the A1200 and the A4000 were released in late 1992. The platform became popular for gaming and for programming demos, and it found a prominent role in the desktop video, video production, and show control businesses, leading to video editing systems such as the Video Toaster. The Amiga's native ability to play back multiple digital sound samples made it a popular platform for early tracker music software. The powerful processor and the ability to access several megabytes of memory enabled the development of several 3D rendering packages, including LightWave 3D, Aladdin4D, TurboSilver, and Traces, a predecessor to Blender. Although early Commodore advertisements attempted to cast the computer as an all-purpose business machine when outfitted with the Amiga Sidecar PC compatibility add-on, the Amiga was most commercially successful as a home computer, with a wide range of games and creative software. Poor marketing and the failure of later models to repeat the technological advances of the first systems meant that the Amiga lost its market share to competing platforms, such as the fourth-generation game consoles and ever-cheaper IBM PC compatibles, which gained 256-color VGA graphics in 1987.
Commodore went bankrupt in April 1994 after the Amiga CD32 model failed in the marketplace. Since the demise of Commodore, various groups have marketed successors to the original Amiga line, including Genesi, Eyetech, ACube Systems Srl, and A-EON Technology. AmigaOS has influenced replacements and compatible systems such as MorphOS, AmigaOS 4, and AROS. "The Amiga was so far ahead of its time that nobody—including Commodore's marketing department—could articulate what it was all about. Today, it's obvious the Amiga was the first multimedia computer, but in those days it was derided as a game machine because few people grasped the importance of advanced graphics and video. Nine years later, vendors are still struggling to make systems that work like 1985 Amigas." Jay Miner joined Atari in the 1970s to develop custom integrated circuits and led development of the Atari 2600's TIA. As soon as its development was complete, the team began developing a much more sophisticated set of chips, CTIA, ANTIC, and POKEY, that formed the basis of the Atari 8-bit family.
With the 8-bit line's launch in 1979, the team once again started looking at a next-generation chipset. Nolan Bushnell had sold the company to Warner Communications in 1976, and the new management was much more interested in the existing lines than in the development of new products that might cut into their sales. Miner wanted to start work with the new Motorola 68000, but management was only interested in another 6502-based system. Miner left the company and, for a time, the industry. In 1979, Larry Kaplan left Atari and co-founded Activision. In 1982, Kaplan was approached by a number of investors, and he hired Miner to run the hardware side of the newly formed company, "Hi-Toro". The system was code-named "Lorraine" in keeping with Miner's policy of giving systems female names, in this case after the company president's wife, Lorraine Morse. When Kaplan left the company late in 1982, Miner was promoted to head engineer and the company relaunched as Amiga Corporation. A breadboard prototype was completed by late 1983 and shown at the January 1984 Consumer Electronics Show (CES).
At the time, the operating system was not ready, so the machine was demonstrated with the Boing Ball demo. A further developed version of the system was demonstrated at the June 1984 CES and shown to many companies in hopes of garnering further funding, but it found little interest in a market that was in the final stages of the North American video game crash of 1983. In March, Atari had expressed a tepid interest in Lorraine for its potential use in a games console or home computer tentatively known as the 1850XLD, but the talks progressed slowly and Amiga was running out of money. A temporary arrangement in June led to a $500,000 loan from Atari to Amiga to keep the company going; the terms required the loan to be repaid at the end of the month, otherwise Amiga would forfeit the Lorraine design to Atari. During 1983, Atari had lost over $1 million a week due to the combined effects of the crash and the ongoing price war in the home computer market, and by the end of the year Warner was desperate to sell the company.
In January 1984, Jack Tramiel resigned from Commodore due to internal battles over the future direction of the company. A number of Commodore employees, including much of the senior technical staff, followed him to Tramiel Technology, where they began development of a 68000-based machine of their own.
Video Graphics Array
Video Graphics Array (VGA) is a graphics standard for video display controllers first introduced with the IBM PS/2 line of computers in 1987, following CGA and EGA introduced in earlier IBM personal computers. Through widespread adoption, the term has come to mean either an analog computer display standard, the 15-pin D-subminiature VGA connector, or the 640×480 resolution characteristic of the VGA hardware. VGA was the last IBM graphics standard to which the majority of PC clone manufacturers conformed, making it the lowest common denominator that all post-1990 PC graphics hardware can be expected to implement. It was followed by IBM's Extended Graphics Array standard, but was instead superseded by numerous different extensions to VGA made by clone manufacturers, collectively known as Super VGA. Today, the VGA analog interface is still used for high-definition video, including resolutions of 1080p and higher. While the transmission bandwidth of VGA is high enough to support higher-resolution playback, there can be picture quality degradation depending on cable quality and length.
How discernible this degradation is depends on the individual's eyesight and the display, though it is more noticeable when switching to and from digital inputs like HDMI or DVI. The VGA supports both graphics modes and alphanumeric text modes. Standard graphics modes are:
- 640×480 in 16 colors or monochrome
- 640×350 or 640×200 in 16 colors or monochrome
- 320×200 in 4 or 16 colors
- 320×200 in 256 colors
The 640×480 16-color and 320×200 256-color modes had redefinable palettes, with each entry selectable from within an 18-bit RGB table, although the high-resolution mode is most familiar from its use with a fixed palette under Microsoft Windows. The other color modes defaulted to standard EGA- or CGA-compatible palettes, but could still be redefined if desired using VGA-specific programming. Higher-resolution and other display modes are achievable with standard cards and most standard monitors. On the whole, a typical VGA system can produce displays with any combination of:
- 512 to 800 pixels wide in 16 colors, or 256 to 400 pixels wide in 256 colors,
with heights of:
- 200 lines, or 350 to 410 lines, at a 70 Hz refresh rate, or
- 224 to 256, or 448 to 512 lines, at a 60 Hz refresh rate, or
- 512 to 600 lines at reduced vertical refresh rates, depending on individual monitor compatibility.
For example, high-resolution modes with square pixels are available at 768×576 or 704×528 in 16 colors, or medium-low resolution at 320×240 with 256 colors. "Narrow" modes such as 256×224 tend to preserve the same pixel ratio as in, e.g., the 320×240 mode unless the monitor is adjusted to stretch the image out to fill the screen, as they are derived by masking down the wider mode instead of altering pixel or line timings, but they can be useful for reducing memory requirements and pixel addressing calculations for arcade game conversions or console emulators. Standard text modes are:
- 80×25 characters, rendered with a 9×16 pixel font, for an effective resolution of 720×400 in either 16 colors or monochrome, the latter being compatible with legacy MDA-based applications
- 40×25, using the same font grid, for an effective resolution of 360×400
- 80×43 or 80×50 in 16 colors, with an effective resolution of 640×344 or 640×400 pixels
As with the pixel-based graphics modes, additional text modes are technically possible, with an overall maximum of about 100×80 cells and an active area spanning about 88×64 cells, but these are rarely used, as it makes much more sense to just use a graphics mode, with a small proportional font, if a larger text display is required.
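The redefinable palettes mentioned above are programmed through the VGA's DAC registers: the entry index is written to port 0x3C8, followed by three 6-bit components to port 0x3C9. The sketch below is a minimal real-mode DOS example in the Turbo C style (outportb and int86 from dos.h); the gray-ramp palette it installs is purely illustrative.

    /* Reprogram the VGA DAC in mode 13h (320x200, 256 colors). */
    #include <dos.h>

    void set_mode13h(void) {
        union REGS r;
        r.x.ax = 0x0013;           /* BIOS int 10h: set video mode 13h */
        int86(0x10, &r, &r);
    }

    void set_palette_entry(unsigned char index,
                           unsigned char r6, unsigned char g6, unsigned char b6) {
        outportb(0x3C8, index);    /* DAC write index */
        outportb(0x3C9, r6);       /* each component is 0..63 (6 bits) */
        outportb(0x3C9, g6);
        outportb(0x3C9, b6);
    }

    int main(void) {
        int i;
        set_mode13h();
        for (i = 0; i < 256; i++)                         /* replace the default */
            set_palette_entry(i, i >> 2, i >> 2, i >> 2); /* palette with a gray ramp */
        return 0;
    }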
One variant sometimes seen is 80×30 or 80×60, using an 8×16 or 8×8 font and an effective 640×480 pixel display, which trades use of the more flickery 60 Hz mode for an additional 5 or 10 lines of text and square character blocks. VGA is referred to as an "array" instead of an "adapter" because it was implemented from the start as a single chip, an application-specific integrated circuit which replaced both the Motorola 6845 video address generator and the dozens of discrete logic chips that covered the full-length ISA boards of the MDA and CGA. More directly, it replaced many discrete logic chips on the EGA board. Its single-chip implementation allowed the VGA to be placed directly on a PC's motherboard with a minimum of difficulty, since it required only video memory, timing crystals, and an external RAMDAC; this in turn increased the reliability of the video subsystem by reducing the number of component connections. As a result, the first IBM PS/2 models were equipped with VGA on the motherboard, in contrast to all of the "family one" IBM PC desktop models (the PC, PC/XT, and PC AT), which required a display adapter installed in a slot in order to connect a monitor.
Enhanced Graphics Adapter
The Enhanced Graphics Adapter (EGA) is an IBM PC computer display standard from 1984 that superseded and exceeded the capabilities of the CGA standard introduced with the original IBM PC, and was itself superseded by the VGA standard in 1987. EGA was introduced in October 1984 by IBM, shortly after its new PC/AT. The EGA standard was made obsolete by the introduction in 1987 of MCGA and VGA with the PS/2 computer line. Shortly before the introduction of VGA, Genoa Systems introduced a half-size graphics card built around a proprietary chip set, which they called Super EGA. EGA produces a display of sixteen simultaneous colors from a palette of sixty-four, at a resolution of up to 640×350 pixels. The EGA card includes a 16 KB ROM to extend the system BIOS with additional graphics functions, and includes a custom CRT controller that has limited backward compatibility with the Motorola MC6845 chip used to generate video timing signals in earlier IBM PC graphics controllers. The EGA CRTC can support all of the modes of the IBM MDA and CGA adapters through specific mode options intended for this purpose, but even in its maximum-compatibility mode configuration it is not register-compatible with an MC6845, so programs that directly program the 6845 to set up video modes will fail on an EGA.
When an MDA or CGA mode is set up on the EGA by calling the BIOS, the raster timing, video memory layout, data format, and other low-level hardware details such as cursor control are identical to those aspects of the operation of an MDA or CGA, providing a high degree of direct software and hardware compatibility. In the 640×350 high-resolution mode, each of the sixteen on-screen colors can be selected from a palette comprising all possible combinations of two bits per pixel each for red, green, and blue, allowing four levels of intensity for each primary color and sixty-four possible colors overall. EGA also includes full sixteen-color versions of the CGA 640×200 and 320×200 graphics modes. Depending on the monitor, only the sixteen CGA/RGBI colors may be available in these modes. EGA's four-bit graphics modes are notable for a sophisticated use of bit planes and mask registers together with CPU bitwise operations, which constitutes an early graphics accelerator, a design inherited by VGA and numerous compatible hardware models.
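The bit-plane and mask-register mechanism can be illustrated with a short real-mode sketch, again in the Turbo C style (outportb and MK_FP from dos.h). The Sequencer's Map Mask register (index 2 at ports 0x3C4/0x3C5) is the documented EGA/VGA interface that selects which of the four planes a CPU write lands in; the sketch assumes a 16-color planar mode is already set and default write-mode-0 settings.

    /* Write one byte into selected EGA/VGA bit planes via the Map Mask. */
    #include <dos.h>

    void select_planes(unsigned char mask) {   /* bit n enables plane n */
        outportb(0x3C4, 0x02);                 /* Sequencer index: Map Mask */
        outportb(0x3C5, mask);
    }

    void write_planar_byte(unsigned offset, unsigned char planes, unsigned char bits) {
        unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, offset);
        select_planes(planes);
        *vram = bits;          /* the same byte lands in every enabled plane */
    }

    int main(void) {
        /* Set the 8 pixels of the first byte of the top scan line to color 5
           (binary 0101): 0xFF into planes 0 and 2, 0x00 into planes 1 and 3. */
        write_planar_byte(0, 0x05, 0xFF);
        write_planar_byte(0, 0x0A, 0x00);
        select_planes(0x0F);   /* restore all planes */
        return 0;
    }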
EGA is dual-sync, supporting both CGA-compatible 200-line timings and its native 350-line timings. The original CGA modes are present, though EGA is not 100% hardware compatible with CGA, as mentioned above. EGA can also drive an MDA monitor by a special setting of switches on the board. In summary, the EGA supports all BIOS-standard video modes of all previous IBM PC video adapters except the enhanced CGA modes of the IBM PCjr. The standard MDA and CGA modes are supported 100% at the BIOS level, with additional support for some details at the hardware register level. The three enhanced graphics modes of the PCjr (160×200 16-color, 320×200 16-color, and 640×200 4-color) are not supported by the EGA; IBM lists their BIOS video mode numbers as reserved. However, the EGA can generate displays equivalent to these modes using different modes. The EGA 320×200 16-color mode is not the same as, nor compatible with, the PCjr mode of the same format: the PCjr mode uses 4-bit pixels that are packed two pixels to a byte in a 32 KB video buffer, split into four banks of interleaved lines, whereas the EGA mode uses the linear bit-plane format native to the EGA.
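The difference between the two layouts comes down to how a pixel's address is computed. The sketch below contrasts the arithmetic (it deliberately omits the PCjr's bank interleaving and the EGA's plane I/O, and the helper names are invented for illustration):

    /* Packed 4bpp (PCjr-style) vs. planar (EGA-style) pixel addressing. */
    #include <stdio.h>

    /* Packed: two 4-bit pixels share each byte; leftmost pixel in the high nibble. */
    void packed_addr(int x, int y, int width, int *byte, int *shift) {
        *byte  = y * (width / 2) + x / 2;
        *shift = (x & 1) ? 0 : 4;
    }

    /* Planar: one bit per plane; the same byte offset applies in all 4 planes,
       and the pixel's 4 color bits sit at the same bit position in each. */
    void planar_addr(int x, int y, int width, int *byte, int *bit) {
        *byte = y * (width / 8) + x / 8;
        *bit  = 7 - (x & 7);           /* leftmost pixel is the most significant bit */
    }

    int main(void) {
        int b, s;
        packed_addr(101, 50, 320, &b, &s);
        printf("packed: byte %d, shift %d\n", b, s);
        planar_addr(101, 50, 320, &b, &s);
        printf("planar: byte %d, bit %d (in each of 4 planes)\n", b, s);
        return 0;
    }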
EGA cards were available in both eight- and sixteen-bit versions. The original IBM EGA card had 64 KB of onboard RAM and required a daughterboard to add an additional 64 KB. Most third-party cards came with the full 128 KB installed, and some with 256 KB, allowing multiple graphics pages, multiple text-mode character sets, and large scrolling displays. A few third-party EGA clones feature a range of extended graphics modes, as well as automatic monitor type detection and sometimes a special 400-line interlace mode for use on CGA monitors. EGA supports the following graphics modes:
- 640×350 with 16 colors, pixel aspect ratio of 1:1.37
- 640×350 with 2 colors, pixel aspect ratio of 1:1.37
- 640×200 with 16 colors, pixel aspect ratio of 1:2.4
- 320×200 with 16 colors, pixel aspect ratio of 1:1.2
Text modes:
- 40×25 with an 8×8 pixel font
- 80×25 with an 8×8 pixel font
- 80×25 with an 8×14 pixel font
- 80×43 with an 8×8 pixel font
Extended graphics modes of third-party boards include 640×400, 640×480, and 720×540. The EGA palette allows all 16 CGA colors to be used, and it allows substitution of each of these colors with any one from a total of 64 colors.
This allows the CGA's alternate brown color to be used without any additional display hardware. The VGA standard later built on this by allowing each of the 64 colors to be further customized. However, standard EGA monitors do not support use of the extended color palette in 200-line modes; the monitor cannot distinguish between being connected to a CGA card and an EGA card running a 200-line mode, and so interprets the color signals in the fixed CGA/RGBI fashion.
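The 64-color palette values themselves are small 6-bit numbers. A short sketch can decode them, assuming the commonly documented rgbRGB bit layout (bits 0-2 carry the high-intensity blue/green/red signals, bits 3-5 the low-intensity ones, giving four levels per channel); the scaling to 8-bit components is illustrative.

    /* Decode a 6-bit EGA palette value into approximate 8-bit RGB. */
    #include <stdio.h>

    void ega_to_rgb(unsigned color6,
                    unsigned char *r, unsigned char *g, unsigned char *b) {
        *r = 0xAA * ((color6 >> 2) & 1) + 0x55 * ((color6 >> 5) & 1);
        *g = 0xAA * ((color6 >> 1) & 1) + 0x55 * ((color6 >> 4) & 1);
        *b = 0xAA * ((color6 >> 0) & 1) + 0x55 * ((color6 >> 3) & 1);
    }

    int main(void) {
        unsigned char r, g, b;
        ega_to_rgb(0x14, &r, &g, &b);     /* 0x14 is the familiar CGA brown */
        printf("EGA 0x14 -> #%02X%02X%02X\n", r, g, b);   /* #AA5500 */
        return 0;
    }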
The Atari ST is a line of home computers from Atari Corporation and the successor to the Atari 8-bit family. The initial ST model, the 520ST, saw limited release in April–June 1985 and was widely available in July. The Atari ST is the first personal computer to come with a bitmapped color GUI, using a version of Digital Research's GEM released in February 1985. The 1040ST, released in 1986, is the first personal computer to ship with a megabyte of RAM in the base configuration, and the first with a cost-per-kilobyte of less than US$1. The Atari ST is part of a mid-1980s generation of home computers that have 16- or 32-bit processors, 256 KB or more of RAM, and mouse-controlled graphical user interfaces. This generation includes the Macintosh, Commodore Amiga, Apple IIGS, and, in certain markets, the Acorn Archimedes. "ST" stands for "Sixteen/Thirty-two", which refers to the Motorola 68000's 16-bit external bus and 32-bit internals. The ST was sold either with Atari's color monitor or with the less expensive monochrome monitor; the system's two color graphics modes are available only on the former, while the highest-resolution mode requires the monochrome monitor.
In some markets, particularly Germany, the machine gained a strong foothold as a small business machine for CAD and desktop publishing work. Thanks to its built-in MIDI ports, the ST enjoyed success running music-sequencer software and as a controller of musical instruments among amateurs and well-known musicians alike. The ST was eventually superseded by the Atari STE, Atari TT, Atari MEGA STE, and Falcon computers. The Atari ST was born from the rivalry between home-computer makers Atari, Inc. and Commodore International. Jay Miner, one of the original designers of the custom chips found in the Atari 2600 and the Atari 8-bit family, tried to convince Atari management to create a new chipset for a video game console and computer. When his idea was rejected, Miner left Atari to form a small think tank called Hi-Toro in 1982 and began designing the new "Lorraine" chipset. The company, renamed Amiga Corporation, pretended to sell video game controllers to deceive the competition while it developed a Lorraine-based computer.
When Amiga ran out of capital to complete Lorraine's development, Atari, owned by Warner Communications, paid Amiga to continue the work. In return, Atari received exclusive use of the Lorraine design for one year as a video game console; after one year, Atari would have the right to add a keyboard and market the complete computer, designated the 1850XLD. As Atari was involved with Disney at the time, it was code-named "Mickey", and the 256K memory expansion board was code-named "Minnie". After leaving Commodore International in January 1984, Jack Tramiel formed Tramel Technology with his sons and other ex-Commodore employees and, in April, began planning a new computer. The company considered the National Semiconductor NS320xx microprocessor but was disappointed with its performance; this started the move to the 68000. The lead designer of the Atari ST was ex-Commodore employee Shiraz Shivji, who had worked on the Commodore 64's development. Atari in mid-1984 was losing about a million dollars per day.
Interested in Atari's overseas manufacturing and worldwide distribution network for his new computer, Tramiel negotiated with Warner in May and June 1984 and bought Atari's Consumer Division in July. As executives and engineers left Commodore to join Tramiel's new Atari Corporation, Commodore responded by filing lawsuits against four former engineers for theft of trade secrets. The Tramiels did not purchase the employee contracts when they bought the assets of Atari, Inc., so one of their first acts was to interview Atari, Inc. employees to decide whom to hire at what was effectively a brand-new company. This company, called Tramel Technology (TTL), was renamed Atari Corp. At the time of the purchase of Atari, Inc.'s assets, there were 900 employees remaining from a high point of 10,000; after the interviews, 100 employees were hired to work at Atari Corp. At one point a custom sound processor called AMY was a planned component for the new ST computer design, but the chip needed more time to complete, so AMY was dropped in favor of an off-the-shelf Yamaha sound chip.
It was during this time, in late July/early August, that Leonard Tramiel discovered the original Amiga contract, which required Amiga Corporation to deliver the Lorraine chipset to Atari on June 30, 1984. Amiga Corp. had sought more monetary support from investors in spring 1984, and, having heard rumors that Tramiel was negotiating to buy Atari, entered into discussions with Commodore. The discussions led to Commodore wanting to purchase Amiga Corporation outright, which Commodore believed would cancel any outstanding contracts, including Atari's. Instead of Amiga Corp. delivering Lorraine to Atari, Commodore delivered a check for $500,000 to Atari on Amiga's behalf, in effect returning the funds Atari had invested in Amiga for the chipset. Tramiel countersued Amiga Corp. on August 13, 1984, seeking an injunction to bar Amiga from producing anything with its technology. At Commodore, the Amiga team was in limbo during the summer of 1984 because of the lawsuit; no word on the status of the chipset, the Lorraine computer, or the team's fate was known.
In the fall of 1984, Commodore informed the team that the Lorraine project was active again: the chipset was to be improved, the operating system developed, and the hardware design completed. While Commodore announced the Amiga 1000 with the Lorraine chipset in July 1985, the delay gave Atari, with its many ex-Commodore employees, time to bring the ST to market first.