North America is a continent entirely within the Northern Hemisphere and almost all within the Western Hemisphere. It is bordered to the north by the Arctic Ocean, to the east by the Atlantic Ocean, to the west and south by the Pacific Ocean, and to the southeast by South America and the Caribbean Sea. North America covers an area of about 24,709,000 square kilometers, about 16.5% of the earth's land area and about 4.8% of its total surface. It is the third largest continent by area, following Asia and Africa, and the fourth largest by population after Asia, Africa and Europe. In 2013, its population was estimated at nearly 579 million people in 23 independent states, or about 7.5% of the world's population, if nearby islands are included. North America was reached by its first human populations during the last glacial period, via crossing the Bering land bridge approximately 40,000 to 17,000 years ago; the so-called Paleo-Indian period is taken to have lasted until about 10,000 years ago. The Classic stage spans roughly the 6th to 13th centuries.
The Pre-Columbian era ended in 1492 with the beginning of the transatlantic migrations, the arrival of European settlers during the Age of Discovery and the Early Modern period. Present-day cultural and ethnic patterns reflect interactions between European colonists, indigenous peoples, African slaves and their descendants. Owing to the European colonization of the Americas, most North Americans speak English, Spanish or French, and their culture commonly reflects Western traditions. The Americas are generally accepted as having been named after the Italian explorer Amerigo Vespucci by the German cartographers Martin Waldseemüller and Matthias Ringmann. Vespucci, who explored South America between 1497 and 1502, was the first European to suggest that the Americas were not the East Indies, but a different landmass previously unknown to Europeans. In 1507, Waldseemüller produced a world map in which he placed the word "America" on the continent of South America, in the middle of what is today Brazil. He explained the rationale for the name in the accompanying book Cosmographiae Introductio: "... ab Americo inventore... quasi Americi terram sive Americam" ("... from Amerigo the discoverer... as if it were the land of Americus, thus America").
For Waldseemüller, no one should object to the naming of the land after its discoverer. He used the Latinized version of Vespucci's name, but in its feminine form "America", following the examples of "Europa", "Asia" and "Africa". Other mapmakers extended the name America to the northern continent: in 1538, Gerard Mercator used America on his map of the world for all of the Western Hemisphere. Some argue that because the convention is to use the surname for naming discoveries, the derivation from "Amerigo Vespucci" could be put in question. In 1874, Thomas Belt proposed a derivation from the Amerrique mountains of Central America. The geologist Jules Marcou, who supported the same derivation, corresponded with Augustus Le Plongeon, who wrote: "The name AMERICA or AMERRIQUE in the Mayan language means, a country of perpetually strong wind, or the Land of the Wind, and... the can mean... a spirit that breathes, life itself." The United Nations formally recognizes "North America" as comprising three areas: Northern America, Central America, and the Caribbean.
This has been formally defined by the UN Statistics Division. The term North America maintains various definitions in accordance with context. In Canadian English, North America refers to the land mass as a whole, consisting of Mexico, the United States and Canada; which other countries are included is ambiguous and defined by context. In the United States of America, usage of the term may refer only to Canada and the US, and sometimes includes Greenland and Mexico, as well as offshore islands. In France, Portugal, Romania and the countries of Latin America, the cognates of North America designate a subcontinent of the Americas comprising Canada, the United States, Mexico, Greenland, Saint Pierre et Miquelon and Bermuda. North America has been referred to by other names. Spanish North America was often referred to as Northern America, and this was the first official name given to Mexico. Geographically, the North American continent has many subregions; these include cultural and geographic regions. Economic regions include those formed by trade blocs, such as the North American Free Trade Agreement (NAFTA) bloc and the Central American Free Trade Agreement.
Linguistically and culturally, the continent could be divided into Anglo-America and Latin America. Anglo-America includes most of Northern America and Caribbean islands with English-speaking populations. The southern part of the North American continent is composed of two regions: Central America and the Caribbean. The north of the continent maintains recognized regions as well. In contrast to the common definition of "North America", which encompasses the whole continent, the term "North America" is sometimes used to refer only to Mexico, Canada, the United States and Greenland. The term Northern America refers to the northern-most countries and territories of North America: Canada, the United States, Bermuda, St. Pierre and Miquelon and Greenland. Although the term does not refer to a unifie
iPad is a line of tablet computers designed and marketed by Apple Inc. which run the iOS mobile operating system. The first iPad was released on April 3, 2010; as of May 2017, Apple had sold more than 360 million iPads, though sales peaked in 2013. It is the most popular tablet computer by sales as of the second quarter of 2018. The user interface is built around the device's multi-touch screen, including a virtual keyboard. All iPads can connect via Wi-Fi. iPads can shoot video, take photos, play music, and perform Internet functions such as web browsing and emailing. Other functions – games, reference, GPS navigation, social networking, etc. – can be enabled by downloading and installing apps. As of March 2016, the App Store had more than a million apps for the iPad by third parties. There have been eight versions of the iPad; the first generation established design precedents. The 2nd-generation iPad introduced a new thinner design, a dual-core Apple A5 processor, and VGA front-facing and 720p rear-facing cameras designed for FaceTime video calling.
The third generation added a Retina Display, the new Apple A5X processor with a quad-core graphics processor, a 5-megapixel camera, HD 1080p video recording, voice dictation, 4G. The fourth generation added the Apple A6X processor and replaced the 30-pin connector with an all-digital Lightning connector; the iPad Air added the Apple A7 processor and the Apple M7 motion coprocessor, reduced the thickness for the first time since the iPad 2. The iPad Air 2 added the Apple A8X processor, the Apple M8 motion coprocessor, an 8-megapixel camera, the Touch ID fingerprint sensor; the iPad introduced in 2017 added the Apple A9 processor, while sacrificing some of the improvements the iPad Air 2 introduced in exchange for a lower launch price. There have been five versions of the iPad Mini; the first generation has similar internal specifications to the iPad 2 but uses the Lightning connector instead. The iPad Mini 2 added the Retina Display, the Apple A7 processor, the Apple M7 motion coprocessor matching the internal specifications of the iPad Air.
The iPad Mini 3 added the Touch ID fingerprint sensor. The iPad Mini 4 features the Apple M8 motion coprocessor; the 5th generation features the Apple A12 SoC. There have been three generations of the iPad Pro; the first generation came with 9.7" and 12.9" screen sizes, while the second came with 10.5" and 12.9" sizes, and the third with 11" and 12.9" sizes. The iPad Pros have unique features, such as the Smart Connector, that are exclusive to this series of iPads. Apple co-founder Steve Jobs said in a 1983 speech that the company's strategy was simple: "What we want to do is we want to put an incredibly great computer in a book that you can carry around with you and learn how to use in 20 minutes... and we want to do it with a radio link in it so you don't have to hook up to anything and you're in communication with all of these larger databases and other computers." Apple's first tablet computer was the Newton MessagePad 100, introduced in 1993, powered by an ARM6 processor core developed by ARM, a 1990 spinout of Acorn Computers in which Apple invested.
Apple developed a prototype PowerBook Duo-based tablet, the PenLite, but decided not to sell it in order to avoid hurting MessagePad sales. Apple released several more Newton-based PDAs. Apple re-entered the mobile-computing market in 2007 with the iPhone. Smaller than the iPad, but featuring a camera and mobile phone, it pioneered the multi-touch finger-sensitive touchscreen interface of Apple's iOS mobile operating system. By late 2009, the iPad's release had been rumored for several years; such speculation talked of "Apple's tablet". The iPad was announced on January 27, 2010, by Steve Jobs at an Apple press conference at the Yerba Buena Center for the Arts in San Francisco. Jobs said that Apple had begun developing the iPad before the iPhone. In 1991, Jonathan Ive had created an industrial design for a stylus-based tablet, the Macintosh Folio, as his first project for Apple. Ive stated that after seeking to produce the tablet first, he came to agree with Jobs that the phone was more important, as the tablet's innovations would work as well in it.
The iPad's internal codename was K48, as revealed in the court case surrounding the leaking of iPad information before launch. Apple began taking pre-orders for the first-generation iPad on March 12, 2010; the only major change to the device between its announcement and becoming available to pre-order was a change in the behavior of the side switch, to perform either sound muting or screen rotation locking. The Wi-Fi version of the iPad went on sale in the United States on April 3, 2010; the Wi-Fi + 3G version was released on April 30. 3G service in the United States is provided by AT&T and was sold with two prepaid contract-free data plan options: one for unlimited data and the other for 250 MB per month at half the price. On June 2, 2010, AT&T announced that effective June 7 the unlimited plan would be replaced for new
Tom W. Chick is an American television and movie actor and independent journalist. His most prominent TV roles were as Oscar's boyfriend Gil in the US version of The Office and the hard-hitting reporter Gordon in The West Wing. As a writer, Chick has contributed to many past video game publications; he ended his role as editor-in-chief of the now-closed Fidgit gaming blog to move on to other opportunities. He maintains a gaming and movie blog on his web site Quarter to Three. Chick attended Harvard Divinity School and received a Master of Theological Studies with a focus on the Old Testament. Deciding not to pursue the ministry, he moved to Hollywood, where he resides and pursues a career in writing about video games and acting in television roles. He is the co-founder and administrator of a web-based site for games discussion, Quarter to Three. In late September 2014, Chick revealed in a podcast that he had stage 4 hypopharyngeal cancer and was about to begin chemotherapy; this treatment would interrupt his podcasting activities at Quarter to Three due to its impact on his speaking voice.
Though he was posting on the Quarter to Three forums in winter 2014, he did not return to the podcast until March 17, 2015, at which point he stated he was cancer-free, though his voice was still in recovery. Chick is an independent journalist; as a freelance columnist, he has written for a number of sites, including Firing Squad, Yahoo Games, GameSpy, GameSpot, Xtreme Gamer, 1Up, Rotten Tomatoes and others. His articles have appeared in magazines, such as the "Tom vs. Bruce" series in Computer Gaming World, and he was listed as "one of the field's rare American practitioners" in an article on "New Games Journalism" in the New York Times. In May 2008, he partnered with the Sci-Fi Channel as editor-in-chief of a new co-branded gaming blog entitled Fidgit. Chick's reviews are no stranger to controversy: he has been an outspoken critic of what he calls the "7–9 rating scale" at some game review sites, and has written reviews considered well outside the mainstream, such as his harshly critical review of Deus Ex, of which he said, "I'd say it's only 90% bad."
That review appeared in June 2000. He has collaborated on other games journalism projects, such as a podcast with independent games developer Brad Wardell hosted at the now-defunct PowerUser.TV. He is now a regular panelist on the strategy game themed podcast Three Moves Ahead with colleagues Troy Goodfellow, Julian Murdoch, Rob Zacny and old collaborator Dr. Bruce Geryk, and he appeared in Joystiq Podcast number 114, released on October 23, 2009. After a successful Kickstarter campaign, Chick and Geryk brought back their popular gaming column, Tom vs Bruce, in an online format. Chick's most successful television acting engagement was a recurring role as reporter Gordon in nine episodes of The West Wing; he also played Oscar's homosexual lover Gil in The Office and Mario in The Nine. His screen credits include Frank & Jesse (Detective Whitcher), Beverly Hills, 90210 (Joe), ER (Weissbroot), Frasier (Waiter), The King of Queens (Guy), The West Wing (Gordon), The Office (Gil) and The Nine (Mario).
Video game programmer
A game programmer is a software engineer, programmer, or computer scientist who develops codebases for video games or related software, such as game development tools. Game programming has many specialized disciplines, all of which fall under the umbrella term of "game programmer". A game programmer should not be confused with a game designer. In the early days of video games, a game programmer also took on the job of a designer and artist; this was because the abilities of early computers were so limited that having specialized personnel for each function was unnecessary. Game concepts were light, and games were only meant to be played for a few minutes at a time; more ambitious art content and variations in gameplay were in any case constrained by computers' limited power. As specialized arcade hardware and home systems became more powerful, game developers could develop deeper storylines and could include such features as high-resolution and full color graphics, advanced artificial intelligence and digital sound.
Technology has advanced to such a degree that contemporary games boast 3D graphics and full-motion video using assets developed by professional graphic artists. Nowadays, the derogatory term "programmer art" has come to imply the kind of bright colors and blocky design that were typical of early video games. The desire for adding more depth and assets to games necessitated a division of labor. Art production was relegated to full-time artists, and game programming became a separate discipline from game design. Now, only some games, such as the puzzle game Bejeweled, are simple enough to require just one full-time programmer. Despite this division, most game developers have some say in the final design of contemporary games. A contemporary video game may include advanced physics, artificial intelligence, 3D graphics, digitised sound, an original musical score, and complex strategy, may use several input devices, and may be playable against other people via the Internet or over a LAN. Each aspect of the game can consume all of one programmer's time and, in many cases, several programmers.
Some programmers may specialize in one area of game programming, but many are familiar with several aspects. The number of programmers needed for each feature depends somewhat on programmers' skills, but is mostly dictated by the type of game being developed. Game engine programmers create the base engine of the game, including the simulated physics and graphics disciplines. Many video games use existing game engines, whether commercial, open source or free; such engines are customized for a particular game, and these programmers handle the modifications. A game's physics programmer is dedicated to developing the physics a game will employ. Typically, a game will only simulate a few aspects of real-world physics. For example, a space game may need simulated gravity, but would have no need for simulating water viscosity. Since processing cycles are always at a premium, physics programmers may employ "shortcuts" that are computationally inexpensive, but look and act "good enough" for the game in question. In other cases, unrealistic physics are employed to allow easier gameplay or for dramatic effect.
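As a hypothetical sketch of such a shortcut (all names here are invented for illustration and are not taken from any particular engine), the following integrates a projectile with fixed-timestep semi-implicit Euler steps and a constant gravity vector, ignoring drag and the variation of gravity with altitude because the result looks "good enough" on screen:

```python
# Minimal "good enough" game physics: semi-implicit Euler integration
# with constant gravity. Drag, lift and a varying gravitational field
# are deliberately ignored to keep the per-frame cost trivial.

GRAVITY = -9.81  # m/s^2, treated as constant everywhere

def step(pos, vel, dt):
    """Advance one frame: update velocity first, then position."""
    vx, vy = vel
    vy += GRAVITY * dt          # the constant-gravity shortcut
    x, y = pos
    x += vx * dt
    y += vy * dt
    return (x, y), (vx, vy)

# Simulate a projectile at 60 frames per second until it lands.
pos, vel = (0.0, 0.0), (10.0, 10.0)
dt = 1.0 / 60.0
while True:
    pos, vel = step(pos, vel, dt)
    if pos[1] < 0.0:
        break
```

Analytically the range of this projectile is about 20.4 m; the cheap fixed-step integration lands within a few percent of that, an error no player would notice in motion.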
Sometimes, a specific subset of situations is specified and the physical outcome of such situations is stored in a record of some sort and never computed at runtime at all. Some physics programmers may delve into the difficult tasks of inverse kinematics and other motions attributed to game characters, but often these motions are assigned via motion capture libraries so as not to overload the CPU with complex calculations. For a role-playing game such as World of Warcraft, only one physics programmer may be needed; for a complex combat game such as Battlefield 1942, teams of several physics programmers may be required. Historically, the title of graphics programmer belonged to a programmer who developed specialized blitter algorithms and clever optimizations for 2D graphics. Today, however, it is almost exclusively applied to programmers who specialize in developing and modifying complex 3D graphic renderers, though some 2D graphics skills have become useful again for developing games for the new generation of cell phones and handheld game consoles.
A 3D graphics programmer must have a firm grasp of advanced mathematical concepts such as vector and matrix math and linear algebra. Skilled programmers specializing in this area of game development can demand high wages and are a scarce commodity, and their skills can be used for video games on any platform. An AI programmer develops the logic of the game to simulate intelligence in opponents and enemy characters. AI programming has evolved into a specialized discipline; these tasks used to be handled by programmers who specialized in other areas. An AI programmer may program pathfinding and enemy tactic systems. This is one of the most challenging aspects of game programming, and its sophistication is developing rapidly. Contemporary games dedicate 10 to 20 percent of their programming staff to AI. Some games, such as strategy games like Civilization III or role-playing video games such as The Elder Scrolls IV: Oblivion, use AI heavily, while others, such as puzzle games, use it sparingly or not at all. Many game developers have created entire languages that can be used to program their own AI for games via scripts.
These languages are less technical than the language used to implement the game, and are often used by the game or level designers to implement the world of the game. Many studios also make their games' scripting available to players
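A pathfinding system of the kind an AI programmer might write can be sketched, in heavily simplified form, as a breadth-first search over a tile grid (a generic illustration only; production engines typically use A* over navigation meshes, and none of these names come from a real engine):

```python
from collections import deque

def find_path(grid, start, goal):
    """Breadth-first search on a tile grid (0 = walkable, 1 = wall).
    Returns the shortest list of (row, col) tiles from start to goal,
    or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}   # visited set and parent links in one
    while frontier:
        current = frontier.popleft()
        if current == goal:
            # Walk the parent links back to reconstruct the path.
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        r, c = current
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = current
                frontier.append((nr, nc))
    return None

# A unit must route around the wall in the middle row.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = find_path(grid, (0, 0), (2, 0))  # 7 tiles: around the right side
```

Exposing a routine like this through a scripting layer is exactly the kind of facility the designer-facing languages above provide, letting level designers request paths without touching engine code.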
Real-time tactics or RTT is a subgenre of tactical wargames played in real time, simulating the considerations and circumstances of operational warfare and military tactics. It is differentiated from real-time strategy gameplay by the lack of classic resource micromanagement and base or unit building, as well as by the greater importance of individual units and a focus on complex battlefield tactics. Typical real-time strategy titles encourage the player to focus on logistics and production as much as or more than combat, whereas real-time tactics games do not feature resource-gathering, base-building or economic management, instead focusing on tactical and operational aspects of warfare such as unit formations or the exploitation of terrain for tactical advantage. Real-time tactical gameplay is also characterized by the expectation that players complete their tasks using only the combat forces provided to them, and by the provision of a realistic representation of military tactics and operations. This contrasts with other current strategy game genres.
For instance, in large-scale turn-based strategy games, battles are abstracted and the gameplay is close to that of related board games. Real-time strategy games de-emphasize realism and focus on the collection and conversion of resources into production capacities which manufacture combat units that are thereafter used in highly stylized confrontations. In contrast, real-time tactics games' military tactical and realistic focus and comparatively short risk/reward cycle provide a distinctly more immediate and accessible experience of battlefield tactics and mêlée than strategy games of other genres. As suggested by the genre's name, fundamental to real-time tactics is real-time gameplay. The genre has its roots in tactical and miniature wargaming, where battle scenarios are recreated using miniatures or simple paper chits. These board and table-top games were, out of necessity, turn-based; only with computer support could turn-based play and strategy be transposed into real time. Turn-based strategy and turn-based tactics were obvious first candidates for computer implementation.
While some publications do refer to "RTT" as a distinct subgenre of real-time strategy or strategy, not all publications do so. Further, precise terminology is inconsistent. Nonetheless, efforts have been made to distinguish RTT games from RTSs. For instance, GameSpy described Axis & Allies as a "true RTS", but with a high level of military realism with such features as battlefield command organization and supply lines. A developer for Close Combat said their game never aspired to be an RTS in the "classic sense", but was rather a "real time tactical simulation", lacking such features as resource collection. A developer of Nexus: The Jupiter Incident remarked on his game being called a "tactical fleet simulator" rather than a "traditional RTS", citing its focus on tactical gameplay and fixed units at the start of each mission. In general terms, military strategy refers to the use of a broad arsenal of weapons including diplomatic, informational and economic resources, whereas military tactics is more concerned with short-term goals such as winning an individual battle.
In the context of strategy video games, the difference comes down to the more limited criterion of either a presence or absence of base building and unit production. Real-time strategy games have been criticized for an overabundance of tactical considerations when compared to the amount of strategic gameplay found in such games. According to Chris Taylor, lead designer of Supreme Commander, "[...] was my realizing that although we call this genre 'Real-Time Strategy,' it should have been called 'Real-Time Tactics' with a dash of strategy thrown in." Taylor went on to say that his own game featured added elements of a broader strategic level. In an article for GameSpy, Mark Walker said that developers need to begin looking outside the genre for new ideas in order for strategy games to continue to be successful in the future. In an article for Gamasutra, Nathan Toronto criticizes real-time strategy games for too often having only one valid means of victory, attrition, comparing them unfavorably to real-time tactics games.
According to Toronto, players' awareness that their only way to win is militarily makes them unlikely to respond to gestures of diplomacy. Troy Goodfellow counters this by saying that the problem is not that real-time strategy games are lacking in strategic elements; rather, he says that building and managing armies is the conventional definition of real-time strategy, and that it is unfair to make comparisons with other genres when they break convention. Wargaming with items or figurines representing soldiers or units, for training or entertainment, has been common for as long as there have been organised conflicts. Chess, for example, is based on essentialised battlefield movements of medieval unit types and, beyond its entertainment value, is intended to instill in players a rudimentary sense of tactical considerations. Today, miniature wargaming, where players mount armies of miniature figurines to battle each other, has become popular. Though similar to conventional modern board wargames (e.g. Axis
A computing platform or digital platform is the environment in which a piece of software is executed. It may be the hardware or the operating system, even a web browser and associated application programming interfaces, or other underlying software, as long as the program code is executed with it. Computing platforms have different abstraction levels, including a computer architecture, an OS, or runtime libraries. A computing platform is the stage on which computer programs can run. A platform can be seen both as a constraint on the software development process, in that different platforms provide different functionality and restrictions, and as an assistance to it, in that they provide low-level functionality ready-made. For example, an OS may be a platform that abstracts the underlying differences in hardware and provides a generic command for saving files or accessing the network. Platforms may include: Hardware alone, in the case of small embedded systems, which can access hardware directly without an OS. A browser, in the case of web-based software; the browser itself runs on a hardware-plus-OS platform, but this is not relevant to software running within the browser.
An application, such as a spreadsheet or word processor, which hosts software written in an application-specific scripting language, such as an Excel macro; this can be extended to writing fully-fledged applications with the Microsoft Office suite as a platform. Software frameworks. Cloud computing and Platform as a Service: extending the idea of a software framework, these allow application developers to build software out of components that are hosted not by the developer but by the provider, with internet communication linking them together; the social networking sites Twitter and Facebook are also considered development platforms. A virtual machine such as the Java virtual machine or the .NET CLR; applications are compiled into a format similar to machine code, known as bytecode, which is then executed by the VM. A virtualized version of a complete system, including virtualized hardware, OS and storage; these allow, for instance, a typical Windows program to run on a different host system. Some architectures have multiple layers, with each layer acting as a platform to the one above it.
In general, a component only has to be adapted to the layer beneath it. For instance, a Java program has to be written to use the Java virtual machine and associated libraries as a platform, but does not have to be adapted to run on the Windows, Linux or Macintosh OS platforms. However, the JVM, the layer beneath the application, does have to be built separately for each OS.

Operating system examples include AmigaOS and AmigaOS 4; FreeBSD, NetBSD and OpenBSD; IBM i; Linux; Microsoft Windows; OpenVMS; Classic Mac OS and macOS; OS/2; Solaris; Tru64 UNIX; VM; QNX; and z/OS. Mobile operating system examples include Android, Bada, BlackBerry OS, Firefox OS, iOS, Embedded Linux, Palm OS, Symbian, Tizen, WebOS, LuneOS, Windows Mobile and Windows Phone.

Software framework examples include Binary Runtime Environment for Wireless, Cocoa, Cocoa Touch, Common Language Infrastructure, Mono, .NET Framework, Silverlight, Flash, AIR, GNU, the Java platform (Java ME, Java SE, Java EE, JavaFX and JavaFX Mobile), LiveCode, Microsoft XNA, Mozilla Prism, XUL and XULRunner, Open Web Platform, Oracle Database, Qt, SAP NetWeaver, Shockwave, Smartface, Universal Windows Platform, Windows Runtime and Vexi.

Hardware examples, ordered from more common types to less common types: commodity computing platforms such as Wintel, that is, Intel x86 or compatible personal computer hardware with the Windows operating system; Macintosh, custom Apple Inc. hardware with the Classic Mac OS and macOS operating systems (68k-based, then PowerPC-based, now migrated to x86); ARM architecture based mobile devices, such as iPhone smartphones and iPad tablet computers running iOS from Apple; Gumstix or Raspberry Pi full-function miniature computers with Linux; Newton devices running the Newton OS from Apple; x86 with Unix-like systems such as Linux or BSD variants; CP/M computers based on the S-100 bus, maybe the earliest microcomputer platform; video game consoles of any variety; the 3DO Interactive Multiplayer, licensed to manufacturers; the Apple Pippin, a multimedia player platform for video game console development; RISC processor based machines running Unix variants, such as SPARC architecture computers running the Solaris or illumos operating systems and DEC Alpha clusters running OpenVMS or Tru64 UNIX; midrange computers with their custom operating systems, such as IBM OS/400; mainframe computers with their custom operating systems, such as IBM z/OS; and supercomputer architectures.
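The virtual-machine arrangement described earlier (applications compiled to bytecode, which a VM then executes) can be observed directly in CPython, whose interpreter compiles functions to bytecode for a stack-based virtual machine. This is a generic illustration of the concept rather than part of any platform listed above, and the exact opcode names vary between Python versions:

```python
import dis

def add(a, b):
    return a + b

# The function's code object holds compiled bytecode, a format similar
# to machine code, which the Python virtual machine executes.
bytecode = add.__code__.co_code
print(type(bytecode))  # <class 'bytes'>

# Disassemble the code object into the VM's instruction names; the
# addition shows up as a BINARY_* instruction on the VM's value stack.
ops = [ins.opname for ins in dis.get_instructions(add)]
```

Because the platform boundary is the bytecode format, the same compiled function runs unchanged wherever a conforming interpreter exists, mirroring the Java example above.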
Linux is a family of free and open-source software operating systems based on the Linux kernel, an operating system kernel first released on September 17, 1991 by Linus Torvalds. Linux is typically packaged in a Linux distribution. Distributions include the Linux kernel and supporting system software and libraries, many of which are provided by the GNU Project. Many Linux distributions use the word "Linux" in their name, but the Free Software Foundation uses the name GNU/Linux to emphasize the importance of GNU software, causing some controversy. Popular Linux distributions include Debian and Ubuntu. Commercial distributions include SUSE Linux Enterprise Server. Desktop Linux distributions include a windowing system such as X11 or Wayland and a desktop environment such as GNOME or KDE Plasma. Distributions intended for servers may omit graphics altogether or include a solution stack such as LAMP. Because Linux is freely redistributable, anyone may create a distribution for any purpose. Linux was originally developed for personal computers based on the Intel x86 architecture, but has since been ported to more platforms than any other operating system.
Linux is the leading operating system on servers and other big iron systems such as mainframe computers, and the only OS used on TOP500 supercomputers. It is used by around 2.3 percent of desktop computers. The Chromebook, which runs the Linux kernel-based Chrome OS, dominates the US K–12 education market and represents nearly 20 percent of sub-$300 notebook sales in the US. Linux also runs on embedded systems, i.e. devices whose operating system is built into the firmware and is tailored to the system; this includes routers, automation controls, digital video recorders, video game consoles and smartwatches. Many smartphones and tablet computers run Android and other Linux derivatives; because of the dominance of Android on smartphones, Linux has the largest installed base of all general-purpose operating systems. Linux is one of the most prominent examples of open-source software collaboration; the source code may be used and distributed, commercially or non-commercially, by anyone under the terms of its respective licenses, such as the GNU General Public License.
The Unix operating system was conceived and implemented in 1969, at AT&T's Bell Laboratories in the United States by Ken Thompson, Dennis Ritchie, Douglas McIlroy, Joe Ossanna. First released in 1971, Unix was written in assembly language, as was common practice at the time. In a key pioneering approach in 1973, it was rewritten in the C programming language by Dennis Ritchie; the availability of a high-level language implementation of Unix made its porting to different computer platforms easier. Due to an earlier antitrust case forbidding it from entering the computer business, AT&T was required to license the operating system's source code to anyone who asked; as a result, Unix grew and became adopted by academic institutions and businesses. In 1984, AT&T divested itself of Bell Labs; the GNU Project, started in 1983 by Richard Stallman, had the goal of creating a "complete Unix-compatible software system" composed of free software. Work began in 1984. In 1985, Stallman started the Free Software Foundation and wrote the GNU General Public License in 1989.
By the early 1990s, many of the programs required in an operating system were completed, although low-level elements such as device drivers and the kernel, called GNU Hurd, were stalled and incomplete. Linus Torvalds has stated that if the GNU kernel had been available at the time, he would not have decided to write his own. Although not released until 1992, due to legal complications, development of 386BSD, from which NetBSD, OpenBSD and FreeBSD descended, predated that of Linux. Torvalds has also stated that if 386BSD had been available at the time, he would not have created Linux. MINIX was created by Andrew S. Tanenbaum, a computer science professor, and released in 1987 as a minimal Unix-like operating system targeted at students and others who wanted to learn operating system principles. Although the complete source code of MINIX was available, the licensing terms prevented it from being free software until the licensing changed in April 2000. In 1991, while attending the University of Helsinki, Torvalds became curious about operating systems.
Frustrated by the licensing of MINIX, which at the time limited it to educational use only, he began to work on his own operating system kernel, which became the Linux kernel. Torvalds began the development of the Linux kernel on MINIX and applications written for MINIX were used on Linux. Linux matured and further Linux kernel development took place on Linux systems. GNU applications replaced all MINIX components, because it was advantageous to use the available code from the GNU Project with the fledgling operating system. Torvalds initiated a switch from his original license, which prohibited commercial redistribution, to the GNU GPL. Developers worked to integrate GNU components with the Linux kernel, making a functional and free operating system. Linus Torvalds had wanted to call his invention "Freax", a portmant