MOS Technology 6502
The MOS Technology 6502 is an 8-bit microprocessor, designed by a small team led by Chuck Peddle for MOS Technology. When it was introduced in 1975, the 6502 was, by a considerable margin, the least expensive microprocessor on the market; it sold for less than one-sixth the cost of competing designs from larger companies such as Motorola and Intel, and it caused rapid decreases in pricing across the entire processor market. Along with the Zilog Z80, it sparked a series of projects that resulted in the home computer revolution of the early 1980s. Popular home video game consoles and computers, such as the Atari 2600, Atari 8-bit family, Apple II, Nintendo Entertainment System, Commodore 64, Atari Lynx, BBC Micro and others, used the 6502 or variations of the basic design. Soon after the 6502's introduction, MOS Technology was purchased outright by Commodore International, who continued to sell the microprocessor and licenses to other manufacturers. In its early days, the 6502 was second-sourced by Rockwell and Synertek, and later licensed to other companies.
In its CMOS form, developed by the Western Design Center, the 6502 family continues to be used in embedded systems, with estimated production volumes in the hundreds of millions. The 6502 was designed by many of the same engineers who had designed the Motorola 6800 microprocessor family. Motorola started the 6800 microprocessor project in 1971 with Tom Bennett as the main architect; the chip layout began in late 1972, the first 6800 chips were fabricated in February 1974, and the full family was released in November 1974. John Buchanan was the designer of the 6800 chip, and Rod Orgill, who later did the 6501, assisted Buchanan with circuit analyses and chip layout. Bill Mensch joined Motorola in June 1971 after graduating from the University of Arizona; his first assignment was helping define the peripheral ICs for the 6800 family, and he was the principal designer of the 6820 Peripheral Interface Adapter. Motorola's engineers could run digital simulations on an IBM 370-165 mainframe computer. Bennett hired Chuck Peddle in 1973 to do architectural support work on the 6800 family products in progress.
He contributed in many areas, including the design of the 6850 ACIA. Motorola's target customers were established electronics companies such as Hewlett-Packard, Tektronix, TRW, and Chrysler. In May 1972, Motorola's engineers began visiting select customers and sharing the details of their proposed 8-bit microprocessor system with ROM, RAM, and parallel and serial interfaces. In early 1974, they provided engineering samples of the chips so that customers could prototype their designs. Motorola's "total product family" strategy did not focus on the price of the microprocessor, but on reducing the customer's total design cost; they offered development software on a timeshare computer, the "EXORciser" debugging system, onsite training, and field application engineer support. Both Intel and Motorola had announced a $360 price for a single microprocessor; the actual price for production quantities was much less. Motorola offered a design kit containing the 6800 with six support chips for $300. Peddle, who would accompany the salespeople on customer visits, found that customers were put off by the high cost of the microprocessor chips.
To lower the price, the IC chip size would have to shrink so that more chips could be produced on each silicon wafer. This could be done by removing inessential features of the 6800 and using a newer fabrication technology, "depletion-mode" MOS transistors. Peddle and other team members started outlining the design of an improved-feature, reduced-size microprocessor. At that time, Motorola's new semiconductor fabrication facility in Austin, Texas, was having difficulty producing MOS chips, and mid-1974 was the beginning of a year-long recession in the semiconductor industry. Many of the Mesa, Arizona, employees were displeased with the upcoming relocation to Austin. Motorola Semiconductor Products Division's management was overwhelmed with problems and showed no interest in Peddle's low-cost microprocessor proposal. Chuck Peddle was frustrated with Motorola's management for missing this new opportunity. In a November 1975 interview, Motorola's chairman, Robert Galvin, agreed; he said, "We did not choose the right leaders in the Semiconductor Products division."
The division was reorganized and the management replaced. New group vice-president John Welty said, "The semiconductor sales organization lost its sensitivity to customer needs and couldn't make speedy decisions."
Peddle began looking for a source of funding for this new project and found a small semiconductor company in Pennsylvania. In August 1974, Chuck Peddle, Bill Mensch, Rod Orgill, Harry Bawcom, Ray Hirt, Terry Holdt, and Wil Mathys left Motorola to join MOS Technology. Of the seventeen chip designers and layout people on the 6800 team, seven left. There were 30 to 40 application engineers and system engineers on the 6800 team. That December, Gary Daniels transferred into the 6800 microprocessor group. Tom Bennett did not want to leave the Phoenix area, so Daniels took over the microprocessor development in Austin; his first project was a "depletion-mode" version of the 6800. The faster parts were available in July 1976; this was followed by the 6802, which added 128 bytes of RAM and an on-chip clock oscillator circuit.
MOS Technology was formed in 1969 by three executives from General Instrument (Mort Jaffe, Don McLaughlin, and John Pavinen) to produce metal-oxide-semiconductor integrated circuits. Allen-Br
An amusement arcade is a venue where people play arcade games such as video games, pinball machines, electro-mechanical games, redemption games, merchandisers, or coin-operated billiards or air hockey tables. In some countries, some types of arcades are legally permitted to provide gambling machines such as slot machines or pachinko machines. Games are housed in cabinets; the term used for ancestors of these venues at the beginning of the 20th century was penny arcades. Video games were introduced in amusement arcades in the late 1970s and were most popular during the golden age of arcade video games, the early 1980s. Arcades became popular with children and adolescents, which led parents to be concerned that video game playing might cause them to skip school. A penny arcade can be any type of venue for coin-operated devices for entertainment; the term came into use about 1905-1910. The name derives from the penny, once a staple coin for the machines. The machines used included bagatelles (a game with elements of billiards and non-electrical pinball), early forms of non-electrical pinball machines, fortune-telling machines, slot machines, coin-operated Amberolas, peep show machines (which allowed the viewer to see various objects and pictures), Mutoscopes, and love tester machines.
Penny arcades also featured coin-operated shooter games, and they led to the creation of video arcades in the 1970s. Arcades catering to video games began to gain momentum in the late 1970s with games such as Space Invaders and Galaxian and became widespread in 1980 with Pac-Man and others; the central processing unit in these games allowed for more complexity than earlier discrete-circuitry games such as Atari's Pong. During the late 1970s, video-arcade game technology had become sophisticated enough to offer good-quality graphics and sounds, but it remained basic, and so the success of a game had to rely on simple and fun gameplay; this emphasis on gameplay explains why many of these games continue to be enjoyed as of 2018, despite the progress made by modern computing technology. The golden age of video arcade games in the 1980s became a peak era of video arcade game popularity and earnings. Color arcade games became more prevalent, and video arcades themselves started appearing outside their traditional bowling-alley and bar locales.
Designers experimented with a wide variety of game genres, while developers still had to work within strict limits of available processor power and memory. The era saw the rapid spread of video arcades across North America, Western Europe and Japan; the number of video-game arcades in North America, for example, more than doubled between 1980 and 1982, reaching a peak of 13,000 video game arcades across the region. Beginning with Space Invaders, video arcade games started to appear in supermarkets, liquor stores, gas stations and many other retail establishments looking for extra income; this boom came to an end in the mid-1980s, in what has been referred to as "the great coin-op video crash of 1983". On November 30, 1982, Jerry Parker, the mayor of Ottumwa, Iowa, declared his city the "Video Game Capital of the World"; this initiative resulted in many firsts in video game history. Playing a central role in arcade history, Ottumwa saw the birth of the Twin Galaxies Intergalactic Scoreboard and the U.S. National Video Game Team, two organizations that still exist today.
Other firsts that happened in the Video Game Capital of the World included the first video-game-themed parade, the first video game world championship, the first study of the brain waves of video-game champions, the first billion-point video-game performance, and the first official day to honor a video-game player. High game turnover in Japanese arcades required quick game design, leading to the adoption of standardized systems like JAMMA, Neo-Geo and CPS-2. These systems provided arcade-only consoles where the video game ROM could be swapped to replace a game; this allowed easier development and replacement of games, but it discouraged the hardware innovation necessary to stay ahead of the technology curve. Most US arcades didn't see the intended benefit of this practice, since many games weren't exported to the US, and if they were, distributors refused to release them as a ROM, preferring to sell the entire system, sometimes with the cabinet, as a package. In fact, several arcade systems, such as Sega's NAOMI board, are arcade versions of home systems.
The arcade industry entered a major slump in mid-1994. Arcade attendance and per-visit spending, though not as poor as during the 1983 crash, declined to the point where several of the largest arcade chains either were put up for sale or declared bankruptcy, while many large arcade machine manufacturers moved to get out of the business. In the second quarter of 1996, video game factories reported 90,000 arcade cabinets sold, as compared to 150,000 cabinets sold in 1990; the main reason for the slump was increasing competition from console ports. During the 1980s it had taken several years for an arcade game to be released on a home console, and the port differed from the arcade version. In the late 1990s, a bar opened in the new Crown Casino complex in Melbourne, Australia named Barcode
The demoscene is an international computer art subculture focused on producing demos: self-contained, sometimes quite small, computer programs that produce audio-visual presentations. The purpose of a demo is to show off programming, visual art, and musical skills. Demos and other demoscene productions are shared at festivals known as demoparties, voted on by those who attend, and released online. The demoscene's roots are in the home computer revolution of the late 1970s and the subsequent advent of software cracking. Crackers altered the code of video games to remove copy protection, claiming credit by adding introduction screens of their own; they soon started competing for the best visual presentation of these additions. Through the making of intros and stand-alone demos, a new community evolved, independent of the gaming and software sharing scenes. Prior to the popularity of IBM PC compatibles, most home computers of a given line had little variance in their basic hardware, which made their capabilities identical.
Therefore, the variations among demos created for one computer line were attributed to programming alone, rather than to one computer having better hardware. This created a competitive environment in which demoscene groups would try to outperform each other in creating outstanding effects, to demonstrate why they felt one machine was better than another. Demo writers went to great lengths to get every last bit of performance out of their target machine. Where games and application writers were concerned with the stability and functionality of their software, the demo writer was interested in how many CPU cycles a routine would consume and, more importantly, how best to squeeze great activity onto the screen. Writers went so far as to exploit known hardware errors to produce effects that the manufacturer of the computer had not intended; the perception that the demo scene was going to extremes and charting new territory added to its draw. There are several categories into which demos are informally classified, the most important being the division between freeform demos and size-restricted intros, a difference visible in the competitions of nearly any demoparty.
The most typical competition categories for intros are the 64K intro and the 4K intro, where the size of the executable file is restricted to 65,536 and 4,096 bytes, respectively. In other competitions the choice of platform is restricted; such restrictions provide a challenge for coders and graphics artists, to make a device do more than was intended in its original design. The earliest computer programs that bear some resemblance to demos and demo effects can be found among the so-called display hacks. Display hacks predate the demoscene by several decades, with the earliest examples dating back to the early 1950s. Demos in the demoscene sense began as software crackers' "signatures", that is, crack screens and crack intros attached to software whose copy protection was removed. The first crack screens appeared on the Apple II in the late 1970s and early 1980s, and they were nothing but plain text screens crediting the cracker or their group. These static screens evolved into impressive-looking introductions containing animated effects and music.
Many cracker groups started to release intro-like programs separately, without being attached to unlicensed software. These programs were known by various names, such as letters or messages, but they came to be known as demos. In 1980, Atari, Inc. began using a looping demo with visual effects and music to show off the features of the Atari 400/800 computers in stores. At the 1985 Consumer Electronics Show, Atari showed a demoscene-style demo for its latest 8-bit computers that alternated between a 3D walking robot and a flying spaceship, each with its own music, animating larger objects than previously seen on those systems; the program was later released to the public. In 1985, a large, checkered ball casting a translucent shadow was the signature demo of what the hardware was capable of when Commodore's Amiga was announced. Simple demo-like music collections were put together on the C64 in 1985 by Charles Deenen, inspired by crack intros, using music taken from games and adding some homemade color graphics. In the following year the movement now known as the demoscene was born.
The Dutch groups 1001 Crew and The Judges, both Commodore 64-based, are often mentioned as the earliest demo groups. While competing with each other in 1986, they both produced pure demos with original graphics and music, involving more than just casual work and using extensive hardware trickery. At the same time, demos from others, such as Antony Crowther, had started circulating on Compunet in the United Kingdom. On the ZX Spectrum, Castor Cracking Group released their first demo, called Castor Intro, in 1986. The ZX Spectrum demo scene was slow to start, but it began to rise in the late 1980s, most noticeably in Eastern Europe. The demoscene is primarily a European phenomenon and is predominantly male. It is a competition-oriented subculture, with groups and individual artists competing against each other in technical and artistic excellence. Those who achieve excellence are dubbed "elite", while those who do not follow the demoscene's implicit rules are called "lamers". Both this competitiveness and the sense of cooperation among demosceners have led to comparisons with
S-Video is a signaling standard for standard-definition video, typically 480i or 576i. By separating the black-and-white and color signals, it achieves better image quality than composite video, but has lower color resolution than component video. Standard analog television signals go through several processing steps on their way to being broadcast, each of which discards information and lowers the quality of the resulting images. The image is captured in RGB form and processed into three signals known as YPbPr. The first of these signals is called Y, created from all three original signals based on a formula that produces an overall brightness of the image, or luma; this signal matches a traditional black-and-white television signal, and the Y/C method of encoding was key to offering backward compatibility. Once the Y signal is produced, it is subtracted from the blue signal to produce Pb and from the red signal to produce Pr. To recover the original RGB information for display, the signals are mixed with the Y to produce the original blue and red, and the sum of those is mixed with the Y to recover the green.
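The RGB-to-YPbPr conversion described above can be sketched in a few lines. This is a minimal illustration, not broadcast-accurate signal processing; it assumes the BT.601 luma weights (0.299, 0.587, 0.114), and it omits the amplitude scaling that real Pb/Pr encoding applies to the difference signals.

```python
# Sketch of the Y/Pb/Pr derivation: luma from a weighted RGB sum,
# then blue-minus-luma and red-minus-luma difference signals.
# Assumes BT.601 luma weights; Pb/Pr scaling factors are omitted.

def rgb_to_ypbpr(r, g, b):
    # Luma: weighted sum of R, G, B approximating perceived brightness.
    y = 0.299 * r + 0.587 * g + 0.114 * b
    pb = b - y  # blue color-difference signal
    pr = r - y  # red color-difference signal
    return y, pb, pr

def ypbpr_to_rgb(y, pb, pr):
    # Blue and red are recovered by adding luma back to each difference.
    b = pb + y
    r = pr + y
    # Green falls out of the luma equation once R and B are known.
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

y, pb, pr = rgb_to_ypbpr(0.2, 0.5, 0.8)
r, g, b = ypbpr_to_rgb(y, pb, pr)
```

Round-tripping any RGB triple through both functions returns the original values, which mirrors how a receiver reconstructs green from Y, Pb, and Pr rather than transmitting it.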
A signal with three components is no easier to broadcast than the original three-signal RGB, so additional processing is required. The first step is to combine the Pb and Pr to form the C signal, for chrominance; the phase and amplitude of this signal represent the two original signals. This signal is then bandwidth-limited to comply with requirements for broadcasting, and the resulting Y and C signals are mixed together to produce composite video. To play back composite video, the Y and C signals must be separated, and this is difficult to do without adding artifacts. Each of these steps is subject to unavoidable loss of quality. To retain that quality in the final image, it is desirable to eliminate as many of the encoding/decoding steps as possible. S-Video is an approach to this problem: it eliminates the final mixing of C with Y and the subsequent separation at playback time. The S-Video cable carries video using two synchronized signal and ground pairs, termed Y and C. Y is the luma signal, which carries the luminance, or black-and-white portion, of the picture, including synchronization pulses.
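The idea that the phase and amplitude of one chroma signal can carry both Pb and Pr is quadrature modulation, and it can be sketched numerically. This is an illustrative toy, not a real modulator: the sample rate, the exact subcarrier value, and the four-samples-per-cycle demodulator are all simplifying assumptions.

```python
# Toy sketch of quadrature modulation: Pb and Pr ride on two carriers
# 90 degrees apart, so one signal C encodes both in phase and amplitude.
# Sampling at exactly 4x the subcarrier is an assumption for simplicity.
import math

SUBCARRIER_HZ = 3_579_545            # approximate NTSC color subcarrier
SAMPLE_RATE_HZ = 4 * SUBCARRIER_HZ   # four samples per subcarrier cycle

def chroma_sample(pb, pr, n):
    # Sample n of C: pb on the sine carrier, pr on the cosine carrier.
    phase = 2 * math.pi * SUBCARRIER_HZ * n / SAMPLE_RATE_HZ
    return pb * math.sin(phase) + pr * math.cos(phase)

def recover(pb, pr):
    # Synchronous demodulation: multiply C by each carrier and average
    # over one subcarrier cycle (4 samples here) to separate pb and pr.
    samples = [chroma_sample(pb, pr, n) for n in range(4)]
    pb_est = sum(s * math.sin(2 * math.pi * n / 4)
                 for n, s in enumerate(samples)) / 2
    pr_est = sum(s * math.cos(2 * math.pi * n / 4)
                 for n, s in enumerate(samples)) / 2
    return pb_est, pr_est
```

Because the sine and cosine carriers are orthogonal over a full cycle, the demodulator recovers each component independently, which is why two color-difference signals can share one wire without interfering.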
C is the chroma signal, which carries the color information of the video. The luminance signal carries horizontal and vertical sync pulses in the same way as a composite video signal. Luma is a signal carrying luminance after gamma correction, and is therefore termed "Y" because of the similarity to the lower-case Greek letter gamma. In composite video, the two signals co-exist on different frequencies; to achieve this, the luminance signal must be low-pass filtered. As S-Video maintains the two as separate signals, such detrimental low-pass filtering of luminance is unnecessary, although the chrominance signal still has limited bandwidth relative to component video. Compared with component video, which carries the identical luminance signal but separates the color-difference signals into Cb/Pb and Cr/Pr, the color resolution of S-Video is limited by the modulation on a subcarrier frequency of 3.58 to 4.43 MHz, depending on the standard. This difference is meaningless on home videotape systems, as the chrominance is severely constrained by both VHS and Betamax.
Carrying the color information as one signal means that the color has to be encoded in some way, in accord with NTSC, PAL, or SECAM, depending on the applicable local standard. As a result, S-Video suffers from low color resolution: NTSC S-Video color resolution is typically 120 lines horizontal, versus 250 lines horizontal for the Rec. 601-encoded signal of a DVD, or 30 lines horizontal for standard VCRs. In many European Union countries, S-Video was less common because of the dominance of SCART connectors, which were present on most televisions. It is possible for a player to output S-Video over SCART, but most televisions' SCART connectors are not wired to accept it, in which case the display shows only a monochrome image. In this case it is sometimes possible to modify the SCART adapter cable to make it work. Some game consoles also output S-Video. Early consoles came with RF adapters, followed by composite video on the classic RCA-type video jack. Instead of S-Video, consoles like the GameCube had RGB output. In the US and some other NTSC countries, S-Video was provided on some video equipment, including most televisions and game consoles.
The primary exceptions were Beta VCRs. The European preference for RGB video exists because the RGB output quality of most retro computers and consoles is better than S-Video. The Atari 800 introduced separate chroma/luma output in late 1979; the signals were put on separate pins of a 5-pin 180-degree DIN connector socket. Atari did not sell a monitor for its 8-bit computer line, however. The Commodore 64, released in 1982, offers separate chroma and luma signals using a different connector. Although Commodore Business Machines did not use the term "S-Video", as the standard did not formally exist until 1987, a simple adapter connects the computer's "LCA" 8-pin DIN socket to an S-Video display, or an S-Video device to the Commodore 1702 monitor's LCA jacks. The four-pin mini-DIN connector is the most common of several S-Video connector types. The same mini-DIN connector is used in the Apple Desktop Bus for Macintosh computers, and the two cable types can be interchanged. Other connector variants include seven-pin locking "dub" connectors used on many professional S-VHS machines, and dual "Y" and "C" BNC connectors used fo
Video game console
A video game console is a computer device that outputs a video signal or visual image to display a video game that one or more people can play. The term "video game console" is used to distinguish a console machine designed for consumers to use for playing video games from arcade machines or home computers. An arcade machine consists of a video game computer, game controller, and speakers housed in a large chassis. A home computer is a personal computer designed for home use for a variety of purposes, such as bookkeeping, accessing the Internet and playing video games. While arcades and computers are expensive or "technical" devices, video game consoles were designed with affordability and accessibility to the general public in mind. Unlike similar consumer electronics such as music players and movie players, which use industry-wide standard formats, video game consoles use proprietary formats that compete with each other for market share. There are various types of video game consoles, including home video game consoles, handheld game consoles and dedicated consoles.
Although Ralph Baer had built working game consoles by 1966, it was nearly a decade before Pong made them commonplace in regular people's living rooms. Through evolution over the 1990s and 2000s, game consoles have expanded to offer additional functions such as CD players, DVD players, Blu-ray disc players, web browsers, set-top boxes and more. The first video games appeared in the 1960s; they were played on massive computers connected to vector displays, not analog televisions. Ralph H. Baer conceived the idea of a home video game in 1951. In the late 1960s, while working for Sanders Associates, Baer created a series of video game console designs. One of these designs, which gained the nickname of the "Brown Box" in 1966, featured changeable game modes and was demonstrated to several TV manufacturers, leading to an agreement between Sanders Associates and Magnavox. In 1972, Magnavox released the Magnavox Odyssey, the first home video game console that could be connected to a TV set. Ralph Baer's initial design had called for a huge row of switches that would allow players to turn on and off certain components of the console to create different games like tennis, volleyball and chase.
Magnavox replaced the switch design with separate cartridges for each game. Although Baer had sketched up ideas for cartridges that could include new components for new games, the carts released by Magnavox all served the same function as the switches and allowed players to choose from the Odyssey's built-in games. The Odyssey sold about 100,000 units, making it moderately successful, but it was not until Atari's arcade game Pong popularized video games that the public began to take more notice of the emerging industry. By autumn 1975, bowing to the popularity of Pong, Magnavox canceled the Odyssey and released a scaled-down version that played only Pong and hockey, the Odyssey 100. A second, "higher end" console, the Odyssey 200, was released with the 100 and added on-screen scoring, up to four players, and a third game, Smash. Released alongside Atari's own home Pong console through Sears, these consoles jump-started the consumer market. All three of the new consoles used simpler designs than the original Odyssey, with no board game pieces or extra cartridges.
In the years that followed, the market saw many companies rushing similar consoles to market. After General Instrument released their inexpensive microchips, each containing a complete console on a single chip, many small developers began releasing consoles that looked different externally but internally were playing the same games. Most of the consoles from this era were dedicated consoles playing only the games that came with the console; these video game consoles were often just called video games, because there was little reason to distinguish the two yet. While a few companies like Atari and newcomer Coleco pushed the envelope, the market became flooded with simple, similar video games. Fairchild released the Fairchild Video Entertainment System in 1976. While there had been previous game consoles that used cartridges, either the cartridges had no information and served the same function as flipping switches, or the console itself was empty and the cartridge contained all of the game components.
The VES, by contrast, contained a programmable microprocessor, so its cartridges only needed a single ROM chip to store microprocessor instructions. RCA and Atari soon released their own cartridge-based consoles, the RCA Studio II and the Atari 2600, respectively. The first handheld game console with interchangeable cartridges was the Microvision, designed by Smith Engineering and distributed and sold by Milton Bradley in 1979. Crippled by a small, fragile LCD display and a narrow selection of games, it was discontinued two years later. The Epoch Game Pocket Computer was released in Japan in 1984. The Game Pocket Computer featured an LCD screen with 75 × 64 resolution and could produce graphics at about the same level as early Atari 2600 games. The system sold poorly; as a result, only five games were made for it. Nintendo's Game & Watch series of dedicated game systems proved more successful; it helped to establish handheld gaming as popular and lasted until 1991. Many Game & Watch games were re-released on Nintendo's subsequent handheld systems.
The VES continued to be sold at a profit after 1977, and both Bally and Magnavox brought their own programmable cartridge-based consoles to the market. However, i
In telecommunications and computing, backward compatibility is a property of a system, product, or technology that allows for interoperability with an older legacy system, or with input designed for such a system. Backward compatibility is sometimes called downward compatibility. Modifying a system in a way that does not allow backward compatibility is sometimes called "breaking" backward compatibility. A complementary concept is forward compatibility: a forward-compatible design has a roadmap for compatibility with future standards and products. The associated benefits of backward compatibility are the appeal to an existing user base through an inexpensive upgrade path, as well as the network effect, which is important because it increases the value of goods and services in proportion to the size of the user base. One example of this is the Sony PlayStation 2, which is backward compatible with games for its predecessor, the PlayStation. While the selection of PS2 games available at launch was small, sales of the console were nonetheless strong in 2000-2001 thanks to the large library of games for the preceding PS1.
This bought time for the PS2 to grow a large installed base and for developers to release more quality PS2 games for the crucial 2001 holiday season. The associated costs of backward compatibility include a higher bill of materials if hardware is required to support the legacy systems. A notable example is the Sony PlayStation 3: the first PS3 iteration was expensive to manufacture in part because it included the Emotion Engine from the preceding PS2 in order to run PS2 games, since the PS3 architecture was different from the PS2's. Subsequent PS3 hardware revisions eliminated the Emotion Engine, which saved production costs but removed the ability to run PS2 titles, as Sony found that backward compatibility was not a major selling point for the PS3, in contrast to the PS2. The PS3's chief competitor, the Microsoft Xbox 360, took a different approach to backward compatibility by using software emulation to run games from the first Xbox, rather than including legacy hardware, since the original Xbox's architecture was quite different from the Xbox 360's; however, Microsoft stopped releasing emulation profiles after 2007.
A simple example of both backward and forward compatibility is the introduction of FM radio in stereo. FM radio was originally mono, with only one audio channel represented by one signal. With the introduction of two-channel stereo FM radio, a large number of listeners had only mono FM receivers. Forward compatibility for mono receivers with stereo signals was achieved by sending the sum of both left and right audio channels in one signal and the difference in another signal. That allows mono FM receivers to receive and decode the sum signal while ignoring the difference signal, which is necessary only for separating the audio channels. Stereo FM receivers can receive a mono signal and decode it without the need for a second signal, and they can separate a sum signal into left and right channels if both sum and difference signals are received. Without the requirement for backward compatibility, a simpler method could have been chosen. Full backward compatibility is particularly important in computer instruction set architectures, one of the most successful being the x86 family of microprocessors.
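The sum/difference scheme described above can be sketched as a few arithmetic functions. This is a conceptual illustration only; it ignores the pilot tone, subcarrier modulation, and normalization used in real FM stereo broadcasting.

```python
# Sketch of FM stereo's sum/difference multiplexing: mono receivers
# decode only the sum signal, stereo receivers combine sum and
# difference to recover both channels. (Real broadcasts also scale
# the signals and add a pilot tone, omitted here.)

def encode(left, right):
    # Sum signal (heard by mono receivers) and difference signal.
    return left + right, left - right

def decode_mono(s, d):
    # A mono receiver simply ignores the difference signal.
    return s

def decode_stereo(s, d):
    # Solving s = L + R and d = L - R recovers the two channels.
    left = (s + d) / 2
    right = (s - d) / 2
    return left, right
```

Because the mono path never touches the difference signal, old receivers keep working unchanged (backward compatibility), while new receivers extract the extra information (forward compatibility of the old hardware with the new signal).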
Their full backward compatibility spans back to the 16-bit Intel 8086/8088 processors introduced in 1978. Backward-compatible processors can process the same binary executable software instructions as their predecessors, allowing the use of a newer processor without having to acquire new applications or operating systems. The success of the Wi-Fi digital communication standard is also attributed to its broad forward and backward compatibility. Compiler backward compatibility may refer to the ability of a compiler for a newer version of a language to accept programs or data that worked under the previous version. A data format is said to be backward compatible with its predecessor if every message or file that is valid under the old format is still valid, retaining its meaning, under the new format.
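The data-format definition above can be made concrete with a small example. The record layout, field names, and `parse_contact_v2` function here are hypothetical, invented purely to illustrate the rule that every old-format message must stay valid and keep its meaning under the new format.

```python
# Hypothetical backward-compatible format change: "version 2" of a
# contact record adds an optional "email" field, so every valid
# version-1 record is still a valid version-2 record with the same
# meaning. Field names and helper are illustrative inventions.
import json

def parse_contact_v2(raw):
    record = json.loads(raw)
    return {
        "name": record["name"],              # required in both versions
        "email": record.get("email", None),  # new in v2, optional
    }

old_record = '{"name": "Ada"}'                    # written by a v1 tool
new_record = '{"name": "Ada", "email": "a@b.c"}'  # written by a v2 tool

parse_contact_v2(old_record)  # still parses: the change is backward compatible
parse_contact_v2(new_record)
```

A change that instead made "email" a required field would break backward compatibility, because previously valid v1 records would no longer parse.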