Video game programmer
A game programmer is a software engineer, programmer, or computer scientist who develops codebases for video games or related software, such as game development tools. Game programming has many specialized disciplines, all of which fall under the umbrella term "game programmer". A game programmer should not be confused with a game designer. In the early days of video games, a game programmer also took on the job of designer and artist, because the abilities of early computers were so limited that having specialized personnel for each function was unnecessary. Game concepts were light and games were meant to be played for only a few minutes at a time; art content and variation in gameplay were constrained by computers' limited power. As specialized arcade hardware and home systems became more powerful, game developers could develop deeper storylines and include features such as high-resolution, full-color graphics, advanced artificial intelligence and digital sound.
Technology has advanced to such a degree that contemporary games boast 3D graphics and full-motion video using assets developed by professional graphic artists. Nowadays, the derogatory term "programmer art" has come to imply the kind of bright colors and blocky design that were typical of early video games. The desire for more depth and assets in games necessitated a division of labor: art production was relegated to full-time artists, and game programming became a separate discipline from game design. Now, only some games, such as the puzzle game Bejeweled, are simple enough to require just one full-time programmer. Despite this division, most game developers have some say in the final design of contemporary games. A contemporary video game may include advanced physics, artificial intelligence, 3D graphics, digitized sound, an original musical score and complex strategy; it may use several input devices and may be playable against other people via the Internet or over a LAN. Each aspect of the game can consume all of one programmer's time and, in many cases, several programmers.
Some programmers specialize in one area of game programming, but many are familiar with several aspects. The number of programmers needed for each feature depends somewhat on programmers' skills, but is mostly dictated by the type of game being developed. Game engine programmers create the base engine of the game, including the simulated physics and graphics disciplines. Many video games use existing game engines, whether commercial, open source or free; these engines must be customized for a particular game, and engine programmers handle those modifications. A game's physics programmer is dedicated to developing the physics a game will employ. Typically, a game will only simulate a few aspects of real-world physics. For example, a space game may need simulated gravity, but would have no need to simulate water viscosity. Since processing cycles are always at a premium, physics programmers may employ "shortcuts" that are computationally inexpensive but look and act "good enough" for the game in question. In other cases, unrealistic physics are employed to allow easier gameplay or for dramatic effect.
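A physics "shortcut" of the kind described can be sketched in a few lines. The snippet below is a hypothetical illustration, not any engine's actual code: it applies semi-implicit Euler integration for gravity alone, skipping the drag, viscosity and collision handling that the game in question does not need.

```python
# Minimal "good enough" physics step: semi-implicit Euler for gravity only.
# All names and constants here are invented for illustration.

GRAVITY = -9.81  # m/s^2; in practice tuned per game rather than realistic

def step(position, velocity, dt):
    """Advance one simulation tick for a 2D point mass (cheap but stable)."""
    vy = velocity[1] + GRAVITY * dt          # update velocity first...
    new_pos = (position[0] + velocity[0] * dt,
               position[1] + vy * dt)        # ...then position from new vy
    return new_pos, (velocity[0], vy)

# One 1/60-second frame for an object at height 10 moving right at 2 m/s:
pos, vel = step((0.0, 10.0), (2.0, 0.0), 1.0 / 60.0)
```

Updating velocity before position (semi-implicit rather than explicit Euler) costs nothing extra and is noticeably more stable for game-scale time steps, which is why this particular shortcut is common.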
Sometimes, a specific subset of situations is specified and the physical outcomes of those situations are stored in a record of some sort, never being computed at runtime at all. Some physics programmers may delve into the difficult tasks of inverse kinematics and other motions attributed to game characters, but increasingly these motions are assigned via motion capture libraries so as not to overload the CPU with complex calculations. For a role-playing game such as World of Warcraft, only one physics programmer may be needed; for a complex combat game such as Battlefield 1942, teams of several physics programmers may be required. Historically, the title of graphics programmer belonged to a programmer who developed specialized blitter algorithms and clever optimizations for 2D graphics. Today, however, it is applied almost exclusively to programmers who specialize in developing and modifying complex 3D graphics renderers, though some 2D graphics skills have become useful again for developing games for the new generation of cell phones and handheld game consoles.
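The idea of storing outcomes for a fixed set of situations rather than computing them at runtime can be sketched as a lookup table. The example below is hypothetical (the situation chosen, projectile landing distances for a handful of launch angles, and every name in it are invented for illustration); the point is that the expensive evaluation happens once, offline, and runtime cost drops to a dictionary lookup.

```python
import math

def precompute(angles_deg, speed=20.0, g=9.81):
    """Evaluate the projectile-range formula once per authored situation."""
    table = {}
    for a in angles_deg:
        rad = math.radians(a)
        table[a] = speed * speed * math.sin(2 * rad) / g
    return table

# Built ahead of time (or at load), never during gameplay:
RANGE_TABLE = precompute([15, 30, 45, 60, 75])

def landing_distance(angle_deg):
    # Runtime cost is a lookup, not a simulation.
    return RANGE_TABLE[angle_deg]
```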
A 3D graphics programmer must have a firm grasp of advanced mathematical concepts such as vector and matrix math and linear algebra. Skilled programmers specializing in this area of game development can demand high wages and are a scarce commodity; their skills can be used for video games on any platform. An AI programmer develops the logic the game uses to simulate intelligence in enemies and opponents. AI has evolved into a specialized discipline, as these tasks used to be implemented by programmers who specialized in other areas. An AI programmer may program pathfinding and enemy tactics systems; this is one of the most challenging aspects of game programming, and its sophistication is developing rapidly. Contemporary games dedicate roughly 10 to 20 percent of their programming staff to AI. Some games, such as strategy games like Civilization III or role-playing video games such as The Elder Scrolls IV: Oblivion, use AI heavily, while others, such as puzzle games, use it sparingly or not at all. Many game developers have created entire scripting languages that can be used to program their games' AI.
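The pathfinding an AI programmer implements can be illustrated with a minimal sketch. The snippet below is a hypothetical example, not any engine's actual API: it uses breadth-first search on a small tile grid, where production code would more likely use A* with a distance heuristic, but the overall structure (a frontier queue plus a came-from map for path reconstruction) is the same.

```python
from collections import deque

def find_path(grid, start, goal):
    """Return a shortest list of (row, col) steps, or None if unreachable.
    grid is a list of strings; '#' marks a blocked tile."""
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cur = frontier.popleft()
        if cur == goal:
            path = []
            while cur is not None:          # walk back to the start
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] != '#' and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cur
                frontier.append((nr, nc))
    return None

grid = ["....",
        ".##.",
        "...."]
path = find_path(grid, (0, 0), (2, 3))
```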
These languages are less technical than the language used to implement the game and are typically used by the game or level designers to implement the world of the game. Many studios also make their games' scripting available to players.
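The idea behind such designer-facing scripting can be sketched very simply. The example below is entirely hypothetical (the command names, the one-command-per-line format, and the dictionary-dispatch interpreter are all invented for illustration): the engine exposes a few safe operations, and a level designer populates the world without touching the implementation language.

```python
def run_script(script, world):
    """Interpret a tiny designer script: one whitespace-separated command per line."""
    commands = {
        "spawn": lambda name, x, y: world.setdefault("actors", []).append(
            {"name": name, "pos": (int(x), int(y))}),
        "set":   lambda key, value: world.update({key: value}),
    }
    for line in script.strip().splitlines():
        op, *args = line.split()
        commands[op](*args)        # an unknown command raises KeyError
    return world

level_script = """
set weather rain
spawn goblin 3 4
spawn chest 7 1
"""
world = run_script(level_script, {})
```

Real engines embed a full language such as Lua or a proprietary one, but the division of labor is the same: programmers write `commands`, designers write scripts.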
The Xbox is a home video game console and the first installment in the Xbox series of consoles manufactured by Microsoft. It was released as Microsoft's first foray into the gaming console market on November 15, 2001, in North America, followed by Australia and Japan in 2002. It is classified as a sixth-generation console, competing with Sony's PlayStation 2 and Nintendo's GameCube, and was the first console produced by an American company since the Atari Jaguar ceased production in 1996. Announced in 2000, the Xbox was graphically powerful compared to its rivals, featuring a 733 MHz Intel Pentium III processor of the kind that could be found in a standard PC. It was noted for its PC-like size and weight, and was the first console to feature a built-in hard disk. In November 2002, Microsoft launched Xbox Live, a fee-based online gaming service that enabled subscribers to download new content and connect with other players through a broadband connection. Unlike online services from Sega and Sony, Xbox Live had support in the original console design through an integrated Ethernet port.
The service gave Microsoft an early foothold in online gaming and helped the Xbox become a competitor in the sixth generation of consoles. The popularity of blockbuster titles such as Bungie's Halo 2 contributed to the popularity of online console gaming, in particular first-person shooters. Despite holding second position in sales, ahead of Nintendo's GameCube and Sega's Dreamcast, the Xbox always trailed well behind Sony's PlayStation 2. The Xbox's successor and the next console in the series, the Xbox 360, was launched in November 2005 as part of the seventh generation. The Xbox was discontinued soon after, beginning in 2005 with Japan, Microsoft's worst-performing market; other countries followed suit in 2006. The last Xbox game in Europe was Xiaolin Showdown, released in June 2007, and the last game in North America was Madden NFL 09 from EA Sports, released in August 2008. Support for out-of-warranty Xbox consoles was discontinued on March 2, 2009, and support for Xbox Live on the console ended on April 15, 2010.
In 1998, four engineers from Microsoft's DirectX team, Kevin Bachus, Seamus Blackley, Ted Hase and DirectX team leader Otto Berkes, disassembled some Dell laptop computers to construct a prototype Microsoft Windows-based video game console. The team hoped to create a console using a standardized set of hardware to compete with Sony's upcoming PlayStation 2, which was luring game developers away from the Windows platform. The team approached Ed Fries, the leader of Microsoft's game publishing business at the time, and pitched their "DirectX Box" console based on the DirectX graphics technology developed by Berkes's team. Fries decided to support the team's idea of creating a Windows DirectX-based console. During development, the original DirectXbox name was shortened to Xbox. Microsoft's marketing department did not like the Xbox name and suggested many alternatives. During focus testing, the Xbox name was left on the list of possible names only to demonstrate how unpopular it would be with consumers. However, consumer testing revealed that Xbox was preferred by far over the other suggested names, and "Xbox" became the official name of the product.
It was Microsoft's first video game console after collaborating with Sega to port Windows CE to the Dreamcast console. Microsoft repeatedly delayed the console, which was first mentioned publicly in late 1999 during interviews with Microsoft's then-CEO Bill Gates. Gates stated: "we want Xbox to be the platform of choice for the best and most creative game developers in the world". The Xbox was announced at the Game Developers Conference on March 10, 2000, and audiences were impressed by the console's technology. At the time of Gates's announcement, Sega's Dreamcast sales were diminishing and Sony's PlayStation 2 was just going on sale in Japan. Gates was in talks with Sega's late chairman Isao Okawa about the possibility of Xbox compatibility with Dreamcast games, but negotiations fell apart over whether or not the Dreamcast's SegaNet online service should be implemented. The Xbox was unveiled to the public by Gates and guest professional wrestler The Rock at CES 2001 in Las Vegas on January 3, 2001, and Microsoft announced the Xbox's release prices at E3 2001 in May.
Most Xbox launch titles were unveiled at E3, most notably Halo: Combat Evolved and Dead or Alive 3. Due to the immense popularity of gaming consoles in Japan, Microsoft delayed the release of the Xbox in Europe to focus on the Japanese video game market. Although delayed, the European release proved to be more successful than the launch of the Xbox in Japan, and some of Microsoft's plans proved effective. In preparation for its launch, Microsoft acquired Bungie and used Halo: Combat Evolved as its launch title. At the time, GoldenEye 007 for the Nintendo 64 had been one of the few hit first-person shooters to appear on a console, alongside titles such as Perfect Dark and Medal of Honor. Halo: Combat Evolved proved a good application to drive the Xbox's sales, and in 2002 Microsoft took the second-place slot in consoles sold in North America. The Xbox Live service gave Microsoft an early foothold in online gaming and would help the Xbox become a relevant competitor to other sixth-generation consoles. In 2002, the Independent Television Commission banned a television advertisement for the Xbox in the United Kingdom after complaints that it was "offensive, shocking and in bad taste".
It depicted a mother giving birth to a baby boy, fired like a projectile through a window, aging as he flies through the air. The advertisement ends with an old man crash-landing into his own grave and the slogan, "Life is short. Play more." The Xbox's successor, the Xbox 360, was announced on May 12, 2005 on MTV. It was the first n
Dio was an American heavy metal band formed in 1982 and led by vocalist Ronnie James Dio, who founded it after leaving Black Sabbath with the intention of forming a new band with fellow former Black Sabbath member Vinny Appice, the band's drummer. The name Dio was chosen because it made sense from a commercial standpoint, as the name was well known at that time. The band released ten studio albums and had numerous line-up changes over the years, with Dio himself being the only constant member. Guitarists included Craig Goldy, Doug Aldrich, Vivian Campbell, Warren DeMartini, Tracy G, Jake E. Lee and Rowan Robertson. The band dissolved in 2010 when Ronnie James Dio died of stomach cancer at the age of 67. Dio has sold more than 10 million albums worldwide. In 1982, disagreements over the mixing of Black Sabbath's Live Evil resulted in the departure of Ronnie James Dio and Vinny Appice from that band. Wanting to continue together, the two formed Dio in October 1982 in the United States with Vivian Campbell and Jimmy Bain.
The band's debut album, Holy Diver, featured two hit singles, "Rainbow in the Dark" and "Holy Diver", which gained popularity through MTV. Ronnie James Dio and Jimmy Bain played keyboards in the studio, but recruited keyboardist Claude Schnell for live shows in 1983 prior to the Holy Diver tour. Schnell played to the side of the stage on the first two tours before coming out front in 1985. Dio had this to say of the band's origins: "It was a good time to be in that band, it was perfect for us. Everything just fell into place; the ethic in rehearsal was amazing. The effort in the recording was just as good. Everybody wanted it to be great. We believed in what we were doing and couldn't wait to get that product out and have people hear it." Now a quintet with Schnell on keyboards, the band released their second studio album, The Last in Line, on July 2, 1984, followed by their third album, Sacred Heart, on August 15, 1985. In 1985, Ronnie James Dio and Bain wrote the song "Stars" for the Hear 'n Aid project, with many other heavy metal luminaries of the time contributing.
Campbell became unhappy working with Dio, and the rift between them culminated in Campbell being fired from the band; he was subsequently invited to join Whitesnake in 1987. Several songs were recorded live during the Sacred Heart tour for the 1986 Intermission EP with Campbell still on guitar; however, the EP also contained the studio track "Time to Burn", which served to introduce fans to Craig Goldy as the new guitarist. On July 21, 1987, the band released their fourth album, Dream Evil. After Dream Evil, Goldy, wanting to pursue solo projects, left the band. In June 1989, 18-year-old Rowan Robertson was announced as Goldy's successor, but further changes were to follow, with Schnell and Appice also leaving the band. They were replaced by Jens Johansson, Teddy Cook and former AC/DC drummer Simon Wright, and the new line-up released the album Lock Up the Wolves in the spring of 1990. During the tour, Ronnie James Dio had a chance meeting with former Black Sabbath bandmate Geezer Butler, which led to that band's short-lived reunion, producing one album, Dehumanizer.
After this, Ronnie James Dio reassembled Dio once again. By early 1993, guitarist Tracy G, keyboardist Scott Warren of Warrant and bassist Jeff Pilson of Dokken had all joined. During this era, the band focused on modern issues; as a result, some fans regard the albums made during this period (1993's Strange Highways, 1996's Angry Machines and the live album Inferno: Last in Live) as the worst in Dio's catalogue, while others view them positively as a step away from the outdated sound of the 1980s. With disappointing record sales for Angry Machines, management wanted the band to return to its earlier style, prompting the departure of Tracy G, who was replaced by the returning Craig Goldy. In addition, Appice left Dio once again. Goldy's return facilitated the release of Dio's eighth studio album, Magica, in 2000, regarded by many as the band's "comeback album"; it reached No. 13 on the Billboard independent charts. It featured not only the return of Goldy but also of Simon Wright and Jimmy Bain, although on the European leg of the tour Chuck Garric played bass.
Scott Warren remained in the band and performed live, although Ronnie James Dio and Bain handled all of the keyboard and synthesizer parts on Magica. A concept album, Magica featured a return to the band's older, more successful sound, while increased use of keyboards gave it a more modern feel. During the following tour, tensions rose between Goldy on the one hand and Bain and Ronnie James Dio on the other, as Goldy was dealing with family obligations. Goldy left the band in January 2002 and was replaced by Doug Aldrich, whom Bain had met while recording a tribute album for Metallica. Because of his late arrival, Aldrich did not contribute much to Dio's ninth album, Killing the Dragon, written by Ronnie James Dio and Bain. Killing the Dragon was released in 2002 through Spitfire Records and was well received in the metal community, making the Billboard top 200. Aldrich stayed in the band until April of the following year, when he, like Campbell before him, joined Whitesnake, prompting Goldy's return. Soon afterwards, Bain left the band.
Dio released their tenth studio album, Master of the Moon, on August 30, 2004 in Europe through SPV Records and on September 7, 2004 in the United States through Sanctuary Records. The album features multi-instrumentalist Jeff Pilson on bass.
North America is a continent entirely within the Northern Hemisphere and almost all within the Western Hemisphere. It is bordered to the north by the Arctic Ocean, to the east by the Atlantic Ocean, to the west and south by the Pacific Ocean, and to the southeast by South America and the Caribbean Sea. North America covers an area of about 24,709,000 square kilometers, about 16.5% of the Earth's land area and about 4.8% of its total surface. It is the third-largest continent by area, following Asia and Africa, and the fourth by population after Asia, Africa and Europe. In 2013, its population was estimated at nearly 579 million people in 23 independent states, or about 7.5% of the world's population, if nearby islands are included. North America was reached by its first human populations during the last glacial period, via crossing of the Bering land bridge approximately 40,000 to 17,000 years ago. The so-called Paleo-Indian period is taken to have lasted until about 10,000 years ago. The Classic stage spans the 6th to 13th centuries.
The Pre-Columbian era ended in 1492 with the beginning of the transatlantic migrations: the arrival of European settlers during the Age of Discovery and the Early Modern period. Present-day cultural and ethnic patterns reflect interactions between European colonists, indigenous peoples, African slaves and their descendants. Owing to the European colonization of the Americas, most North Americans speak English, Spanish or French, and their culture reflects Western traditions. The Americas are generally accepted as having been named after the Italian explorer Amerigo Vespucci by the German cartographers Martin Waldseemüller and Matthias Ringmann. Vespucci, who explored South America between 1497 and 1502, was the first European to suggest that the Americas were not the East Indies, but a different landmass previously unknown to Europeans. In 1507, Waldseemüller produced a world map in which he placed the word "America" on the continent of South America, in the middle of what is today Brazil. He explained the rationale for the name in the accompanying book Cosmographiae Introductio: ... ab Americo inventore ... quasi Americi terram sive Americam.
For Waldseemüller, no one should object to the naming of the land after its discoverer. He used the Latinized version of Vespucci's name, but in its feminine form "America", following the examples of "Europa", "Asia" and "Africa". Other mapmakers extended the name America to the northern continent: in 1538, Gerard Mercator used America on his map of the world for all of the Western Hemisphere. Some argue that because the convention is to use the surname for naming discoveries, the derivation from "Amerigo Vespucci" could be put in question. In 1874, Thomas Belt proposed a derivation from the Amerrique mountains of Central America. The geologist Jules Marcou corresponded with Augustus Le Plongeon, who wrote: "The name AMERICA or AMERRIQUE in the Mayan language means, a country of perpetually strong wind, or the Land of the Wind, and ... the can mean ... a spirit that breathes, life itself." The United Nations formally recognizes "North America" as comprising three areas: Northern America, Central America and the Caribbean.
This has been formally defined by the UN Statistics Division. The term North America maintains various definitions in accordance with context. In Canadian English, North America may refer to the land mass as a whole, consisting of Mexico, the United States and Canada; which other countries are included is ambiguous and defined by context. In the United States, usage of the term may refer only to Canada and the US, and sometimes includes Greenland and Mexico, as well as offshore islands. In France, Portugal, Romania and the countries of Latin America, the cognates of North America designate a subcontinent of the Americas comprising Canada, the United States, Mexico, Greenland, Saint Pierre and Miquelon, and Bermuda. North America has been referred to by other names: Spanish North America was referred to as Northern America, and this was the first official name given to Mexico. Geographically, the North American continent has many subregions, including cultural and geographic regions. Economic regions include those formed by trade blocs, such as the North American Free Trade Agreement bloc and the Central American Free Trade Agreement.
Linguistically and culturally, the continent could be divided into Anglo-America and Latin America. Anglo-America includes most of Northern America and Caribbean islands with English-speaking populations. The southern part of the North American continent is composed of two regions, Central America and the Caribbean, and the north of the continent maintains recognized regions as well. In contrast to the common definition of "North America", which encompasses the whole continent, the term "North America" is sometimes used to refer only to Mexico, Canada, the United States and Greenland. The term Northern America refers to the northernmost countries and territories of North America: Canada, the United States, Bermuda, St. Pierre and Miquelon and Greenland. Although the term does not refer to a unified
A personal computer is a multi-purpose computer whose size and price make it feasible for individual use. Personal computers are intended to be operated directly by an end user, rather than by a computer expert or technician. Unlike large, costly minicomputers and mainframes, personal computers are not time-shared by many people at the same time. Institutional or corporate computer owners in the 1960s had to write their own programs to do any useful work with the machines. While personal computer users may develop their own applications, usually these systems run commercial software, free-of-charge software or free and open-source software, provided in ready-to-run form. Software for personal computers is typically developed and distributed independently from the hardware or operating system manufacturers. Many personal computer users no longer need to write their own programs to make any use of a personal computer, although end-user programming is still feasible. This contrasts with mobile systems, where software is often only available through a manufacturer-supported channel and end-user program development may be discouraged by lack of support by the manufacturer.
Since the early 1990s, Microsoft operating systems and Intel hardware have dominated much of the personal computer market, first with MS-DOS and later with Microsoft Windows. Alternatives to Microsoft's Windows operating systems occupy a minority share of the industry; these include free and open-source Unix-like operating systems such as Linux. Advanced Micro Devices provides the main alternative to Intel's processors. The advent of personal computers and the concurrent Digital Revolution have affected the lives of people in all countries. "PC" is an initialism for "personal computer". The IBM Personal Computer incorporated the designation in its model name, and it is sometimes useful to distinguish personal computers of the "IBM Personal Computer" family from personal computers made by other manufacturers. For example, "PC" is used in contrast with "Mac", an Apple Macintosh computer. Since none of these Apple products were mainframes or time-sharing systems, they were all "personal computers" but not "PC" computers.
The "brain" may one day come down to our level and help with our income-tax and book-keeping calculations. But this is speculation and there is no sign of it so far. In the history of computing, early experimental machines could be operated by a single attendant. For example, ENIAC, which became operational in 1946, could be run by a single, albeit highly trained, person. This mode pre-dated the batch programming, or time-sharing, modes with multiple users connected through terminals to mainframe computers. Computers intended for laboratory, instrumentation, or engineering purposes were built that could be operated by one person in an interactive fashion. Examples include such systems as the Bendix G15 and LGP-30 of 1956, the Programma 101 introduced in 1964, and the Soviet MIR series of computers developed from 1965 to 1969. By the early 1970s, people in academic or research institutions had the opportunity for single-person use of a computer system in interactive mode for extended durations, although these systems would still have been too expensive to be owned by a single person.
In what was later called the Mother of All Demos, SRI researcher Douglas Engelbart in 1968 gave a preview of what would become the staples of daily working life in the 21st century: e-mail, word processing, video conferencing and the mouse. The demonstration required technical support staff and a mainframe time-sharing computer that were far too costly for individual business use at the time. The development of the microprocessor, with widespread commercial availability starting in the mid-1970s, made computers cheap enough for small businesses and individuals to own. Early personal computers, generally called microcomputers, were often sold in kit form and in limited volumes, and were of interest mostly to hobbyists and technicians. Minimal programming was done with toggle switches to enter instructions, and output was provided by front panel lamps. Practical use required adding peripherals such as keyboards, computer displays, disk drives and printers. The Micral N was the earliest commercial, non-kit microcomputer based on a microprocessor, the Intel 8008.
It was built starting in 1972, and a few hundred units were sold. It had been preceded by the Datapoint 2200 in 1970, for which the Intel 8008 had been commissioned, though not accepted for use. The CPU design implemented in the Datapoint 2200 became the basis for the x86 architecture used in the original IBM PC and its descendants. In 1973, the IBM Los Gatos Scientific Center developed a portable computer prototype called SCAMP, based on the IBM PALM processor, with a Philips compact cassette drive, small CRT and full-function keyboard. SCAMP emulated an IBM 1130 minicomputer in order to run APL/1130. In 1973, APL was generally available only on mainframe computers, and most desktop-sized microcomputers such as the Wang 2200 or HP 9800 offered only BASIC. Because SCAMP was the first to emulate APL/1130 performance on a portable, single-user computer, PC Magazine in 1983 designated SCAMP a "revolutionary concept" and "the world's first personal computer". This seminal, single-user portable computer now resides in the Smithsonian Institution, Washington, D.C. Successful demonstrations of the 1973 SCAMP prototype led to the IBM 5100 portable microcomputer, launched in 1975 with the ability to be programmed in both APL and BASIC for engineers, analysts and other business problem-solvers. In the late 1960s such a machine would have been nearly as large as two desks and would have weighed
A video game is an electronic game that involves interaction with a user interface to generate visual feedback on a two- or three-dimensional video display device such as a TV screen, virtual reality headset or computer monitor. Since the 1980s, video games have become an important part of the entertainment industry; whether they are a form of art is a matter of dispute. The electronic systems used to play video games are called platforms. Video games are developed and released for one or several platforms and may not be available on others. Specialized platforms such as arcade games, which present the game in a large coin-operated chassis, were common in the 1980s in video arcades, but declined in popularity as other, more affordable platforms became available. These include dedicated devices such as video game consoles, as well as general-purpose computers like laptop, desktop or handheld computing devices. The input device used for games, the game controller, varies across platforms. Common controllers include gamepads, mouse devices, the touchscreens of mobile devices, or even a person's body, using a Kinect sensor.
Players view the game on a display device such as a television or computer monitor, or sometimes on virtual reality head-mounted display goggles. Game sound effects and voice actor lines come from loudspeakers or headphones, and some games in the 2000s include haptic, vibration-creating effects, force feedback peripherals and virtual reality headsets. In the 2010s, the commercial importance of the video game industry has been increasing; the emerging Asian markets and mobile games on smartphones in particular are driving the growth of the industry. As of 2015, video games generated sales of US$74 billion annually worldwide and were the third-largest segment in the U.S. entertainment market, behind broadcast and cable TV. Early games used interactive electronic devices with various display formats. The earliest example is from 1947: a "Cathode ray tube Amusement Device" was filed for a patent on 25 January 1947 by Thomas T. Goldsmith Jr. and Estle Ray Mann, and issued on 14 December 1948, as U.S.
Patent 2455992. Inspired by radar display technology, it consisted of an analog device that allowed a user to control a vector-drawn dot on the screen to simulate a missile being fired at targets, which were drawings fixed to the screen. Other early examples include the NIMROD computer at the 1951 Festival of Britain, OXO, Tennis for Two and Spacewar!. Each game used a different means of display: NIMROD used a panel of lights to play the game of Nim, OXO used a graphical display to play tic-tac-toe, Tennis for Two used an oscilloscope to display a side view of a tennis court, and Spacewar! used the DEC PDP-1's vector display to have two spaceships battle each other. In 1971, Computer Space, created by Nolan Bushnell and Ted Dabney, was the first commercially sold, coin-operated video game. It used a black-and-white television for its display, and the computer system was made of 74-series TTL chips. The game was featured in the 1973 science fiction film Soylent Green. Computer Space was followed in 1972 by the Magnavox Odyssey, the first home console. Modeled after a late-1960s prototype console developed by Ralph H. Baer called the "Brown Box", it used a standard television.
These were followed by two versions of Atari's Pong. The commercial success of Pong led numerous other companies to develop Pong clones and their own systems, spawning the video game industry. A flood of Pong clones led to the video game crash of 1977, which came to an end with the mainstream success of Taito's 1978 shooter game Space Invaders, marking the beginning of the golden age of arcade video games and inspiring dozens of manufacturers to enter the market. The game inspired arcade machines to become prevalent in mainstream locations such as shopping malls, traditional storefronts and convenience stores. The game became the subject of numerous articles and stories on television and in newspapers and magazines, establishing video gaming as a growing mainstream hobby. Space Invaders was soon licensed for the Atari VCS, becoming the first "killer app" and quadrupling the console's sales. This helped Atari recover from its earlier losses, and in turn the Atari VCS revived the home video game market during the second generation of consoles, up until the North American video game crash of 1983.
The home video game industry was revitalized shortly afterwards by the widespread success of the Nintendo Entertainment System, which marked a shift in the dominance of the video game industry from the United States to Japan during the third generation of consoles. A number of video game developers also emerged in Britain in the early 1980s. The term "platform" refers to the specific combination of electronic components or computer hardware which, in conjunction with software, allows a video game to operate; the term "system" is also commonly used. The distinctions below are not always clear, and there may be games that bridge one or more platforms. In addition to laptop/desktop computers and mobile devices, there are other devices which have the ability to play games but are not video game machines, such as PDAs and graphing calculators. In common use a "PC game" refers to a form of media that involves a player interacting with a personal computer conne
Latin honors are Latin phrases used to indicate the level of distinction with which an academic degree has been earned. This system is used in the United States, many countries of continental Europe, and some Southeast Asian countries with a European colonial history, such as Indonesia and the Philippines, although some institutions use translations of these phrases rather than the Latin originals. The honors distinction should not be confused with the honors degrees offered in some countries. A college's or university's regulations set out definite criteria that must be met in order for a student to obtain a given honors distinction. For example, the student might be required to achieve a specific grade point average, to submit an honors thesis for evaluation, to be part of an honors program, or to graduate early; each university sets its own standards. Since these standards vary, it is possible for the same level of Latin honors conferred by different institutions to represent different levels of academic achievement.
Some institutions may grant equivalent non-Latin honors to undergraduates. The University of Wisconsin–Madison, for example, has a series of plain-English grading honors based on class standing. These honors, when used, are awarded to undergraduates earning their bachelor's degree, with the exception of law school graduates, and much more rarely to graduate students receiving a master's or doctorate degree. The honor is indicated on the diploma. Latin honors are also conferred upon law school students graduating with a Juris Doctor or J.D., in which case they are based upon class rank or grade point average. In North America, Latin honors are awarded by colleges and universities for undergraduate degrees, such as the Bachelor of Arts or Bachelor of Science, and by law schools for the Juris Doctor degree. Latin honors are not used with other graduate degrees, such as M.D. or Ph.D. degrees. Most institutions use two or three levels of Latin honors, listed below in ascending order: cum laude, meaning "with praise".
This honor is awarded to graduates in the top 20%, top 25%, or top 30% of their class, depending on the institution. Magna cum laude, meaning "with great praise", is awarded to graduates in the top 10% or top 15% of their class, depending on the institution. Summa cum laude, meaning "with greatest praise", is awarded to graduates in the top 1%, top 2%, or top 5% of their class, depending on the institution. Not all institutions award the summa cum laude distinction, and some institutions have additional distinctions. For example, at a few universities maxima cum laude, meaning "with very great praise", is an intermediate honor between the magna and summa honors; it is sometimes used when the summa honor is reserved only for students with perfect academic records. A further distinction is that of egregia cum laude, meaning "with outstanding praise", which may be awarded either to students achieving summa cum laude honors in a difficult subject area or to recipients of a non-standard bachelor's degree.
For undergraduate degrees, Latin honors are used in only a few countries, such as the United States, Indonesia, the Dominican Republic, the Philippines and Canada. Most countries use a different scheme, such as the British undergraduate degree classification, which is more widely used, with varying criteria and nomenclature depending on the country; it is found in Australia, Barbados, Colombia, Hong Kong, Ireland, Kenya, New Zealand, Pakistan, Sri Lanka, South Africa, Trinidad and Tobago, the United Kingdom and many other countries. Malta shows the Latin honors on the degree certificates, but the UK model is shown on the transcript. In Austria, the only Latin honor in use is sub auspiciis Praesidentis rei publicae for doctoral degrees. Candidates must have excellent grades throughout high school and university, making it difficult to attain: only about 20 out of a total of 2,500 doctoral graduates per year achieve a sub auspiciis degree. In Belgium, the university degrees awarded are limited to: satisfaction, cum laude, magna cum laude and summa cum laude. In Brazil, the Instituto Tecnológico de Aeronáutica awards the cum laude honor to graduates with every individual grade above 8.5; the magna cum laude honor to graduates with an average grade above 8.5 and more than 50% of individual grades above 9.5; and the summa cum laude honor to graduates with an average grade above 9.5.
As of 2009, only 22 graduates have received the summa cum laude honor at ITA. The Federal University of Rio de Janeiro awards the cum laude honor to graduates with an average grade from 8.0 to 8.9, the magna cum laude honor for an average from 9.0 to 9.4, and the summa cum laude honor for an average from 9.5 to 10.0. The Federal University of Ceará awards the magna cum laude honor to undergraduates who have never failed a course, have achieved an average grade of at least 8.5, and have held fellowships in both Academic Extension and Teaching Initiation. In Estonia, up until 2010 both summa cum laude and cum laude were used; summa cum laude was awarded only for exceptional work. Since 1 September 2010, only cum laude is used. It