Linux is a family of free and open-source operating systems based on the Linux kernel, an operating system kernel first released on September 17, 1991, by Linus Torvalds. Linux is typically packaged in a Linux distribution. Distributions include the Linux kernel and supporting system software and libraries, many of which are provided by the GNU Project. Many Linux distributions use the word "Linux" in their name, but the Free Software Foundation uses the name GNU/Linux to emphasize the importance of GNU software, causing some controversy. Popular Linux distributions include Debian and Ubuntu. Commercial distributions include SUSE Linux Enterprise Server. Desktop Linux distributions include a windowing system such as X11 or Wayland and a desktop environment such as GNOME or KDE Plasma. Distributions intended for servers may omit graphics altogether or include a solution stack such as LAMP; because Linux is freely redistributable, anyone may create a distribution for any purpose. Linux was originally developed for personal computers based on the Intel x86 architecture, but has since been ported to more platforms than any other operating system.
Linux is the leading operating system on servers and other big iron systems such as mainframe computers, and the only OS used on TOP500 supercomputers. It is used by around 2.3 percent of desktop computers. The Chromebook, which runs the Linux kernel-based Chrome OS, dominates the US K–12 education market and represents nearly 20 percent of sub-$300 notebook sales in the US. Linux also runs on embedded systems, i.e. devices whose operating system is built into the firmware and is tailored to the system; this includes routers, automation controls, digital video recorders, video game consoles, and smartwatches. Many smartphones and tablet computers run Android and other Linux derivatives; because of the dominance of Android on smartphones, Linux has the largest installed base of all general-purpose operating systems. Linux is one of the most prominent examples of open-source software collaboration; the source code may be used and distributed, commercially or non-commercially, by anyone under the terms of its respective licenses, such as the GNU General Public License.
The Unix operating system was conceived and implemented in 1969 at AT&T's Bell Laboratories in the United States by Ken Thompson, Dennis Ritchie, Douglas McIlroy, and Joe Ossanna. First released in 1971, Unix was written in assembly language, as was common practice at the time. In a key pioneering approach in 1973, it was rewritten in the C programming language by Dennis Ritchie; the availability of a high-level language implementation of Unix made its porting to different computer platforms easier. Due to an earlier antitrust case forbidding it from entering the computer business, AT&T was required to license the operating system's source code to anyone who asked; as a result, Unix grew and became adopted by academic institutions and businesses. In 1984, AT&T divested itself of Bell Labs. The GNU Project, started in 1983 by Richard Stallman, had the goal of creating a "complete Unix-compatible software system" composed of free software; work began in 1984. Stallman started the Free Software Foundation in 1985 and wrote the GNU General Public License in 1989.
By the early 1990s, many of the programs required in an operating system were completed, although low-level elements such as device drivers and the kernel, called GNU Hurd, were stalled and incomplete. Linus Torvalds has stated that if the GNU kernel had been available at the time, he would not have decided to write his own. Although not released until 1992, due to legal complications, the development of 386BSD, from which NetBSD, OpenBSD, and FreeBSD descended, predated that of Linux. Torvalds has also stated that if 386BSD had been available at the time, he would not have created Linux. MINIX, created by computer science professor Andrew S. Tanenbaum, was released in 1987 as a minimal Unix-like operating system targeted at students and others who wanted to learn operating system principles. Although the complete source code of MINIX was available, its licensing terms prevented it from being free software until the licensing changed in April 2000. In 1991, while attending the University of Helsinki, Torvalds became curious about operating systems.
Frustrated by the licensing of MINIX, which at the time limited it to educational use only, he began to work on his own operating system kernel, which became the Linux kernel. Torvalds began the development of the Linux kernel on MINIX, and applications written for MINIX were also used on Linux. As Linux matured, further Linux kernel development took place on Linux systems. GNU applications replaced all MINIX components, because it was advantageous to use the freely available code from the GNU Project with the fledgling operating system. Torvalds initiated a switch from his original license, which prohibited commercial redistribution, to the GNU GPL. Developers worked to integrate GNU components with the Linux kernel, making a functional and free operating system. Linus Torvalds had wanted to call his invention "Freax", a portmanteau of "free", "freak", and "x" (as an allusion to Unix).
A personal computer is a multi-purpose computer whose size and price make it feasible for individual use. Personal computers are intended to be operated directly by an end user, rather than by a computer expert or technician. Unlike large, costly minicomputers and mainframes, time-sharing by many people at the same time is not used with personal computers. Institutional or corporate computer owners in the 1960s had to write their own programs to do any useful work with the machines. While personal computer users may develop their own applications, usually these systems run commercial software, free-of-charge software, or free and open-source software, provided in ready-to-run form. Software for personal computers is typically developed and distributed independently from the hardware or operating system manufacturers. Many personal computer users no longer need to write their own programs to make any use of a personal computer, although end-user programming is still feasible. This contrasts with mobile systems, where software is often only available through a manufacturer-supported channel and end-user program development may be discouraged by lack of support from the manufacturer.
Since the early 1990s, Microsoft operating systems and Intel hardware have dominated much of the personal computer market, first with MS-DOS and then with Microsoft Windows. Alternatives to Microsoft's Windows operating systems occupy a minority share of the industry; these include free and open-source Unix-like operating systems such as Linux. Advanced Micro Devices provides the main alternative to Intel's processors. The advent of personal computers and the concurrent Digital Revolution have affected the lives of people in all countries. "PC" is an initialism for "personal computer". The IBM Personal Computer incorporated the designation in its model name, and it is sometimes useful to distinguish personal computers of the "IBM Personal Computer" family from personal computers made by other manufacturers. For example, "PC" is used in contrast with "Mac", an Apple Macintosh computer. Since none of these Apple products were mainframes or time-sharing systems, they were all "personal computers" but not "PC" computers.
The "brain" may one day come down to our level and help with our income-tax and book-keeping calculations. But this is speculation and there is no sign of it so far. In the history of computing, early experimental machines could be operated by a single attendant. For example, ENIAC, which became operational in 1946, could be run by a single, albeit trained, person; this mode pre-dated the batch programming or time-sharing modes, with multiple users connected through terminals to mainframe computers. Computers intended for laboratory, instrumentation, or engineering purposes were also built that could be operated by one person in an interactive fashion. Examples include such systems as the Bendix G15 and LGP-30 of 1956, the Programma 101 introduced in 1964, and the Soviet MIR series of computers developed from 1965 to 1969. By the early 1970s, people in academic or research institutions had the opportunity for single-person use of a computer system in interactive mode for extended durations, although these systems would still have been too expensive to be owned by a single person.
In what was to be called the Mother of All Demos, SRI researcher Douglas Engelbart in 1968 gave a preview of what would become the staples of daily working life in the 21st century: e-mail, word processing, video conferencing, and the mouse. The demonstration required technical support staff and a mainframe time-sharing computer that were far too costly for individual business use at the time. The development of the microprocessor, with widespread commercial availability starting in the mid-1970s, made computers cheap enough for small businesses and individuals to own. Early personal computers, generally called microcomputers, were sold in kit form and in limited volumes, and were of interest mostly to hobbyists and technicians. Minimal programming was done with toggle switches to enter instructions, and output was provided by front panel lamps. Practical use required adding peripherals such as keyboards, computer displays, disk drives, and printers. The Micral N was the earliest commercial, non-kit microcomputer based on a microprocessor, the Intel 8008.
It was built starting in 1972, and a few hundred units were sold. It had been preceded by the Datapoint 2200 in 1970, for which the Intel 8008 had been commissioned, though not accepted for use; the CPU design implemented in the Datapoint 2200 became the basis for the x86 architecture used in the original IBM PC and its descendants. In 1973, the IBM Los Gatos Scientific Center developed a portable computer prototype called SCAMP based on the IBM PALM processor with a Philips compact cassette drive, a small CRT, and a full-function keyboard. SCAMP emulated an IBM 1130 minicomputer in order to run APL/1130. In 1973, APL was available only on mainframe computers, and most desktop-sized microcomputers such as the Wang 2200 or HP 9800 offered only BASIC. Because SCAMP was the first to emulate APL/1130 performance on a portable, single-user computer, PC Magazine in 1983 designated SCAMP a "revolutionary concept" and "the world's first personal computer". This seminal, single-user portable computer now resides in the Smithsonian Institution, Washington, D.C. Successful demonstrations of the 1973 SCAMP prototype led to the IBM 5100 portable microcomputer launched in 1975, with the ability to be programmed in both APL and BASIC for engineers, analysts, and other business problem-solvers. In the late 1960s such a machine would have been nearly as large as two desks and would have weighed about half a ton.
Survival horror is a subgenre of video games inspired by horror fiction that focuses on the survival of the character as the game tries to frighten players with either horror graphics or a scary ambience. Although combat can be part of the gameplay, the player is made to feel less in control than in typical action games through limited ammunition, health and vision, or through various obstructions of the player's interaction with the game mechanics. The player is also challenged to find items that unlock the path to new areas and to solve puzzles to proceed in the game. Games make use of strong horror themes, like dark maze-like environments and unexpected attacks from enemies. The term "survival horror" was first used for the original Japanese release of Resident Evil in 1996, which was influenced by earlier games with a horror theme such as 1989's Sweet Home and 1992's Alone in the Dark. The name has since been used for games with similar gameplay, and has been retroactively applied to earlier titles. Starting with the release of Resident Evil 4 in 2005, the genre began to incorporate more features from action games and more traditional first-person and third-person shooter games.
This has led game journalists to question whether long-standing survival horror franchises and more recent franchises have abandoned the genre and moved into a distinct genre referred to as "action horror". Survival horror refers to a subgenre of action-adventure video games in which the player character is vulnerable and under-armed, which puts emphasis on puzzle-solving and evasion rather than on the player taking an offensive strategy. Games challenge the player to manage their inventory and ration scarce resources such as ammunition. Another major theme throughout the genre is that of isolation; these games contain few non-player characters and, as a result, tell much of their story second-hand through the use of journals, texts, or audio logs. While many action games feature lone protagonists versus swarms of enemies in a suspenseful environment, survival horror games are distinct from otherwise horror-themed action games; they tend to de-emphasize combat in favor of challenges such as hiding or running from enemies and solving puzzles.
Still, it is not unusual for survival horror games to draw upon elements from first-person shooters, action-adventure games, or role-playing games. According to IGN, "Survival horror is different from typical game genres in that it is not defined by specific mechanics, but subject matter, tone and design philosophy." Survival horror games are a subgenre of horror games in which the player is unable to fully prepare or arm their avatar. The player encounters several factors that make combat unattractive as a primary option, such as a limited number of weapons or invulnerable enemies; if weapons are available, their ammunition is sparser than in other games, and powerful weapons such as rocket launchers are rare, if available at all. Thus, players are more vulnerable than in action games, and the hostility of the environment sets up a narrative where the odds are weighed decisively against the avatar. This shifts gameplay away from direct combat, and players must learn to evade enemies or turn the environment against them.
Games try to enhance the experience of vulnerability by making the game single-player rather than multiplayer, and by giving the player an avatar more frail than the typical action game hero. The survival horror genre is also known for other non-combat challenges, such as solving puzzles at certain locations in the game world and collecting and managing an inventory of items. Areas of the game world will be off limits until the player gains certain items. Levels are often designed with alternative routes, and challenge players with maze-like environments that test the player's navigational skills. Levels are designed to be dark and claustrophobic to challenge the player and provide suspense, although some games in the genre make use of enormous spatial environments. A survival horror storyline involves the investigation and confrontation of horrific forces, and thus many games transform common elements from horror fiction into gameplay challenges. Early releases used camera angles seen in horror films, which allowed enemies to lurk in areas concealed from the player's view.
Many survival horror games make use of off-screen sound or other warning cues to notify the player of impending danger. This feedback assists the player, but also creates feelings of anxiety and uncertainty. Games feature a variety of monsters with unique behavior patterns. Enemies can appear unexpectedly, and levels are often designed with scripted sequences where enemies drop from the ceiling or crash through windows. Survival horror games, like many action-adventure games, are structured around the boss encounter, where the player must confront a formidable opponent in order to advance to the next area. These boss encounters draw elements from antagonists seen in classic horror stories, and defeating the boss will advance the story of the game. The origins of the survival horror game can be traced back to earlier horror fiction. Archetypes have been linked to the books of H. P. Lovecraft, which include investigative narratives or journeys through the depths. Comparisons have been made between Lovecraft's Great Old Ones and the boss encounters seen in many survival horror games.
Themes of survival have also been traced to the slasher film subgenre, where the protagonist endures a confrontation with the ultimate antagonist. Another major influence on the genre is Japanese horror, including classical Noh theatre and the books of Edogawa Rampo.
Cory Efram Doctorow is a Canadian-British blogger and science fiction author who serves as co-editor of the blog Boing Boing. He is an activist in favour of liberalising copyright laws and a proponent of the Creative Commons organization, using some of their licences for his books. Some common themes of his work include digital rights management, file sharing, and post-scarcity economics. Doctorow was born in Ontario; his father was born in a refugee camp in Azerbaijan. Although he is an admirer of acclaimed novelist E. L. Doctorow, the two are of no known relation, contrary to popular belief. In elementary school, Doctorow befriended Tim Wu. He received his high school diploma from the SEED School and attended four universities without attaining a degree. He served on the board of directors for the Grindstone Island Co-operative in Big Rideau Lake in Ontario. In June 1999, he co-founded the free software P2P company Opencola with Grad Conn. The company was sold to the Open Text Corporation of Waterloo during the summer of 2003.
Doctorow relocated to London and worked as European Affairs Coordinator for the Electronic Frontier Foundation for four years, helping to establish the Open Rights Group, before leaving the EFF to pursue writing full-time in January 2006. Upon his departure, Doctorow was named a Fellow of the Electronic Frontier Foundation. He was also named the 2006–2007 Canadian Fulbright Chair for Public Diplomacy at the USC Center on Public Diplomacy, sponsored jointly by the Royal Fulbright Commission, the Integrated Media Systems Center, and the USC Center on Public Diplomacy. The professorship included a one-year writing and teaching residency at the University of Southern California in Los Angeles, United States. He then returned to London, but remained a frequent public speaker on copyright issues. In 2009, Doctorow became the first Independent Studies Scholar in Virtual Residence at the University of Waterloo in Ontario; he had been a student in the program during 1993–94, but left without completing a thesis.
Doctorow is a Visiting Professor at the Open University in the United Kingdom; in 2012 he was awarded an honorary doctorate from The Open University. Doctorow married Alice Taylor in October 2008; together they have one daughter, Poesy Emmeline Fibonacci Nautilus Taylor Doctorow, born in 2008. Doctorow became a British citizen by naturalisation on 12 August 2011. In 2015, Doctorow decided to leave London and move to Los Angeles, feeling disappointed by what he saw as London's "death" under Britain's choice of Conservative government. He wrote on his blog, "But London is a city whose two priorities are being a playground for corrupt global elites who turn neighbourhoods into soulless collections of empty safe-deposit boxes in the sky, encouraging the feckless criminality of the finance industry. These two facts are not unrelated." He rejoined the EFF in January 2015 to campaign for the eradication of digital rights management. He had earlier served as Canadian Regional Director of the Science Fiction and Fantasy Writers of America in 1999.
Together with the Austrian art group monochrom, he initiated the Instant Blitz Copy Fight project, which asks people from all over the world to take flash pictures of copyright warnings in movie theaters. On October 31, 2005, Doctorow was involved in a controversy concerning digital rights management with Sony BMG, as told in Wikinomics. As a user of the Tor anonymity network for more than a decade during his global travels, Doctorow publicly supports the network. Doctorow began selling fiction when he was 17 years old and sold several stories, followed by publication of his story "Craphound" in 1998. Down and Out in the Magic Kingdom, Doctorow's first novel, was published in January 2003 and was the first novel released under one of the Creative Commons licences, allowing readers to circulate the electronic edition as long as they neither made money from it nor used it to create derived works. The electronic edition was released simultaneously with the print edition. In March 2003, it was re-released with a different Creative Commons licence that allowed derivative works such as fan fiction, but still prohibited commercial usage.
It was nominated for a Nebula Award and won the Locus Award for Best First Novel in 2004. A semi-sequel short story named Truncat was published on Salon.com in August 2003. His novel Someone Comes to Town, Someone Leaves Town, published in June 2005, was chosen to launch the Sci-Fi Channel's book club, Sci-Fi Essentials. Doctorow's other novels have been released with Creative Commons licences that allow derived works and prohibit commercial usage, and he has used the model of making digital versions available, without charge, at the same time that print versions are published. His Sunburst Award-winning short story collection A Place So Foreign and Eight More was published in 2004; "0wnz0red" from this collection was nominated for the 2004 Nebula Award for Best Novelette. Doctorow released the bestselling novel Little Brother in 2008 with a Creative Commons Attribution-Noncommercial-ShareAlike licence. It was nominated for a Hugo Award for Best Novel in 2009 and won the 2009 Prometheus Award, the Sunburst Award, and the 2009 John W. Campbell Memorial Award.
His novel Makers was released in October 2009 and was serialised for free on the Tor Books website. Doctorow released another young adult novel, For the Win, in May 2010; the novel is available free on the author's website as a Creative Commons
M.U.L.E. is a seminal multiplayer video game written for the Atari 8-bit family by Ozark Softscape and published in 1983 by Electronic Arts. It was ported to the Commodore 64, Nintendo Entertainment System, and IBM PCjr; Japanese versions exist for the PC-8801, the Sharp X1, and MSX 2 computers. While it plays like a strategy game, it incorporates real-time aspects as well. Set on the fictional planet Irata, the game is an exercise in supply-and-demand economics involving competition among four players, with computer opponents automatically filling in for any missing players. Players are provided with several different choices for the race of their colonist, providing different advantages and disadvantages that can be paired to their respective strategies. To win, players not only compete against each other to amass the largest amount of wealth, but must also cooperate for the survival of the colony. Central to the game is the acquisition and use of "M.U.L.E."s to harvest resources from the player's real estate. Depending on how it is outfitted, a M.U.L.E. can be configured to harvest Energy, Food, Smithore, or Crystite. Players must balance the supply and demand of these resources, buying what they need and selling what they don't. Players may exploit or create shortages by refusing to sell to other players or to the "store", which raises the price of the resource on the following turns. Scheming between players is encouraged by allowing collusion, which initiates a mode allowing a private transaction. Crystite is the one commodity not influenced by supply-and-demand considerations, being deemed to be sold "off world", so the strategy with this resource is somewhat different: a player may attempt to maximize production without fear of having too much supply for the demand. Each resource is required to do certain things on each turn. For instance, if a player is short on Food, there will be less time to take one's turn. If a player is short on Energy, some land plots won't produce any output, while a shortage of Smithore will raise the price of M.U.L.E.s in the store and prevent the store from manufacturing new M.U.L.E.s to make use of one's land. Players must also deal with periodic random events such as runaway M.U.L.E.s, sunspot activity, theft by space pirates, and a meteorite strike, with both destructive and beneficial effects. The game features a balancing system for random events that impact only a single player, such that favorable events never happen to the player in first place, while unfavorable events never happen to the player in last place; this same "leveling of the playfield" is applied. The players can also hunt the mountain wampus for a cash reward.
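The shortage mechanic described above lends itself to a toy sketch. The following Python snippet is only a hedged illustration of the general principle, that withholding a resource from the market raises its price on following turns while a surplus lowers it; the function name and the linear adjustment rate are assumptions for illustration, not M.U.L.E.'s actual pricing formula.

```python
def update_price(price, supply, demand, rate=0.25):
    """Toy supply-and-demand price update: a shortage (demand > supply)
    raises the price on the following turn; a surplus lowers it.
    The linear rate is an illustrative assumption, not the game's rule."""
    imbalance = demand - supply
    new_price = price * (1 + rate * imbalance / max(demand, supply, 1))
    return max(1, round(new_price))  # prices stay positive integers

# Withholding Smithore from the store creates a shortage, and the
# price compounds upward over successive turns:
price = 50
for _ in range(3):
    price = update_price(price, supply=2, demand=8)
print(price)
```

Run repeatedly, as in the loop above, a sustained shortage compounds the price rise, which is the feedback players exploit when they corner a resource.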
Monopoly also inspired the different species, analogous to Monopoly's different player tokens, and the random events affecting each individual player were similar to "Chance" cards. Additional game features, such as claim jumping and crystite depletion, were discarded for adding complexity without enhancing gameplay. The setting was inspired by Robert A. Heinlein's Time Enough for Love, wherein galactic colonization is in the style of the American Old West: a few pioneers with drive and primitive tools. The M.U.L.E. itself is based on the idea of the genetically modified animal in Heinlein's novel, given the appearance of a Star Wars Imperial Walker. Another Heinlein novel, The Moon Is a Harsh Mistress, provided the decision to not have any government or external authority. Originally all land was sold by auction, but this caused a feedback loop in which the wealthiest player had the most land and thus made the most money. Ozark Softscape developed the game for the Atari 8-bit family first because of its policy of developing for the most advanced computers and then porting to other platforms, removing or altering features such as sprites as necessary.
Bunten stated that Ozark did not port the game to the Apple II series because "M.U.L.E. can't be done for an Apple". The PC port of M.U.L.E. was developed by K-Byte Software, an affiliate of Electronic Arts, and published by IBM as part of their venture into the home market with the PCjr, but it sold poorly due to being released in 1985, after the latter had been discontinued. No copies existed on the Internet and it was considered "lost" until 2012, when Vince Bray found an original disk, archived by Jeff Leyda and Jim Leonard. M.U.L.E. sold only 30,000 copies but was lauded by players and reviewers. Computer Gaming World described it as a "fascinating and enjoyable game which comes to its best point with four human players". Minor criticisms included the lack of a savegame feature. Softline called M.U.L.E.
Fortran is a general-purpose, compiled imperative programming language, especially suited to numeric computation and scientific computing. Developed by IBM in the 1950s for scientific and engineering applications, FORTRAN came to dominate this area of programming early on and has been in continuous use for over half a century in computationally intensive areas such as numerical weather prediction, finite element analysis, computational fluid dynamics, computational physics, and computational chemistry. It is a popular language for high-performance computing and is used for programs that benchmark and rank the world's fastest supercomputers. Fortran encompasses a lineage of versions, each of which evolved to add extensions to the language while retaining compatibility with prior versions. Successive versions have added support for structured programming and processing of character-based data, array programming, modular programming and generic programming, high performance Fortran, and object-oriented and concurrent programming.
Fortran's design was the basis for many other programming languages. Among the better known is BASIC, which is based on FORTRAN II with a number of syntax cleanups, notably better logical structures, and other changes to work more easily in an interactive environment. The names of earlier versions of the language through FORTRAN 77 were conventionally spelled in all-capitals; the capitalization has been dropped in referring to newer versions beginning with Fortran 90, and the official language standards now refer to the language as "Fortran" rather than all-caps "FORTRAN". In late 1953, John W. Backus submitted a proposal to his superiors at IBM to develop a more practical alternative to assembly language for programming their IBM 704 mainframe computer. Backus' historic FORTRAN team consisted of programmers Richard Goldberg, Sheldon F. Best, Harlan Herrick, Peter Sheridan, Roy Nutt, Robert Nelson, Irving Ziller, Lois Haibt, and David Sayre. Its concepts included easier entry of equations into a computer, an idea developed by J. Halcombe Laning and demonstrated in the Laning and Zierler system of 1952.
A draft specification for The IBM Mathematical Formula Translating System was completed by November 1954. The first manual for FORTRAN appeared in October 1956, with the first FORTRAN compiler delivered in April 1957. This was the first optimizing compiler, because customers were reluctant to use a high-level programming language unless its compiler could generate code with performance comparable to that of hand-coded assembly language. While the community was skeptical that this new method could outperform hand-coding, it reduced the number of programming statements necessary to operate a machine by a factor of 20, and it gained acceptance. John Backus said during a 1979 interview with Think, the IBM employee magazine, "Much of my work has come from being lazy. I didn't like writing programs, so, when I was working on the IBM 701, writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs." The language was adopted by scientists for writing numerically intensive programs, which encouraged compiler writers to produce compilers that could generate faster and more efficient code.
The inclusion of a complex number data type in the language made Fortran especially suited to technical applications such as electrical engineering. By 1960, versions of FORTRAN were available for the IBM 709, 650, 1620, and 7090 computers. The increasing popularity of FORTRAN spurred competing computer manufacturers to provide FORTRAN compilers for their machines, so that by 1963 over 40 FORTRAN compilers existed. For these reasons, FORTRAN is considered to be the first widely used programming language supported across a variety of computer architectures. The development of Fortran paralleled the early evolution of compiler technology, and many advances in the theory and design of compilers were motivated by the need to generate efficient code for Fortran programs. The initial release of FORTRAN for the IBM 704 contained 32 statements, including: DIMENSION and EQUIVALENCE statements; assignment statements; the three-way arithmetic IF statement, which passed control to one of three locations in the program depending on whether the result of the arithmetic statement was negative, zero, or positive; and IF statements for checking exceptions.
The arithmetic IF statement was reminiscent of a three-way comparison instruction available on the 704. The statement provided the only way to compare numbers – by testing their difference, with an attendant risk of overflow. This deficiency was overcome by the "logical" facilities introduced in FORTRAN IV. The FREQUENCY statement was used to give branch probabilities for the three branch cases of the arithmetic IF statement; the first FORTRAN compiler used this weighting to perform at compile time a Monte Carlo simulation of the generated code, the results of which were used to optimize the placement of basic blocks in memory.
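The three-way branch of the arithmetic IF can be mimicked for readers unfamiliar with the construct. In FORTRAN, `IF (expr) 10, 20, 30` transfers control to statement label 10, 20, or 30 as the expression is negative, zero, or positive; the sketch below is illustrative Python, not FORTRAN, and the function and label names are assumptions for demonstration.

```python
def arithmetic_if(expr, neg_label, zero_label, pos_label):
    """Mimic FORTRAN's arithmetic IF: IF (expr) 10, 20, 30
    transfers control to label 10, 20, or 30 depending on
    whether expr is negative, zero, or positive."""
    if expr < 0:
        return neg_label
    if expr == 0:
        return zero_label
    return pos_label

# Early FORTRAN could only compare A and B by testing their
# difference, with the attendant risk of overflow noted above:
a, b = 3, 5
print(arithmetic_if(a - b, "less", "equal", "greater"))  # less
```

The need to form `a - b` explicitly is exactly the overflow hazard the text mentions, which the "logical" IF of FORTRAN IV later removed.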
The HP-41C series is a line of programmable, continuous-memory handheld RPN calculators made by Hewlett-Packard from 1979 to 1990. The original model, the HP-41C, was the first of its kind to offer alphanumeric display capabilities. Later came the HP-41CV and HP-41CX, offering more memory and functionality. The alphanumeric LCD screen of the HP-41C revolutionized the way a pocket calculator could be used, providing user friendliness and expandability. By using an alphanumeric display, the calculator could tell the user what was going on: it could display meaningful error messages instead of a blinking zero. Earlier calculators needed a key, or key combination, for every available function; the HP-67, for example, had three shift keys. Hewlett-Packard was constrained by its one-byte-only instruction format; the more flexible storage format for programs in the TI-59 allowed combining more keystrokes into one instruction. The longest instruction required eleven keypresses; the TI-59 used the Op key followed by two digits to access another 40 functions, but the user had to remember the codes for them.
A more convenient and flexible method of executing the calculator's instructions was needed. The HP-41C had a small keyboard and only one shift key, but provided hundreds of functions: every function not assigned to a key could be invoked through the XEQ key and spelled out in full, e.g. XEQ FACT for the factorial function. The calculator also had a special user mode in which the user could assign any function to any key if the default assignments provided by HP were not suited to a specific application. For this mode, the HP-41C came with blank keyboard templates. Hewlett-Packard even sold a version of the calculator on which hardly any keys had function names printed on them, meant for users who would be using the HP-41C for custom calculations only. The alphanumeric display greatly eased editing programs, as functions were spelled out in full. Numeric-only calculators displayed programming steps as a list of numbers, each number mapped to a key on the keyboard via row and column coordinates. Encoding functions to the corresponding numeric codes, and vice versa, was left to the user, who had to look up the function–code combinations in a reference guide.
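The two invocation styles described above, spelling a function out in full versus binding it to a key in user mode, can be sketched as a small dispatch table. This is a hypothetical Python model; the names FUNCTIONS, assign, xeq, and press are illustrative, not HP APIs:

```python
import math

# Catalog of named functions, invocable by spelled-out name (like XEQ FACT).
FUNCTIONS = {
    "FACT": lambda x: float(math.factorial(int(x))),
    "SQRT": math.sqrt,
}

key_assignments = {}  # user mode: key -> function name

def assign(key, name):
    """Model of ASN: bind a catalog function to a key in user mode."""
    key_assignments[key] = name

def xeq(name, x):
    """Model of XEQ: execute a function by its spelled-out name."""
    return FUNCTIONS[name](x)

def press(key, x):
    """In user mode, a keypress runs whatever the user assigned to it."""
    return xeq(key_assignments[key], x)

assign("A", "FACT")  # now the A key computes factorials
```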
The busy programmer learned most of the codes, but having to learn them intimidated beginners. In addition, the user had to mentally keep function codes separate from numeric constants in the program listing. The HP-41C displayed each character in a block consisting of 14 segments that could be turned on or off, and it used a liquid-crystal display instead of the ubiquitous LED displays of the era to reduce power consumption. While this allowed the display of uppercase letters and a few punctuation characters, some character shapes came out awkwardly contorted and lowercase letters were unreadable. HP's competitor Sharp, when introducing the PC-1211, used a dot matrix of 5×7 dots and displayed characters essentially as we see them today on computer screens. Many users filled all four expansion ports with memory modules, so HP designed the Quad Memory Module with four times the amount of memory, providing the maximum available memory while leaving three ports empty; the HP-41CV included this memory module on the main board, thus providing five times the memory of the HP-41C with all four slots still available.
The internal architecture prohibited the addition of more main memory, so HP designed an extended memory module that could be seen as secondary storage: the data in it could not be accessed directly. To the calculator, data located in extended memory looked the way files on a modern hard disk look to a PC. The final HP-41 model, the HP-41CX, included extended memory, a built-in time module, and extended functions; it was introduced in 1983 and discontinued in 1990. The HP-41C is keystroke programmable, meaning that it can remember and execute sequences of keystrokes to solve particular problems of interest to the user. These keystroke programs, in addition to performing any operation available on the keyboard, can make use of conditional and unconditional branching and looping instructions, allowing programs to perform repetitive operations and make decisions. The HP-41C also supports indirect addressing, with which it is possible to implement a universal Turing machine; the programming model of the HP-41C can therefore be considered Turing complete.
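Indirect addressing, the feature that makes the programming model so flexible, means a register's value can serve as the address of another register. A minimal hypothetical Python model of the idea (the names sto, sto_ind, and rcl_ind echo the calculator's STO, STO IND, and RCL IND instructions but are illustrative, not the real firmware):

```python
# A small bank of numbered data registers, as on the HP-41C.
regs = [0.0] * 16

def sto(nn, value):
    """Model of STO nn: store a value directly into register nn."""
    regs[nn] = value

def sto_ind(nn, value):
    """Model of STO IND nn: register nn holds the target address."""
    regs[int(regs[nn])] = value

def rcl_ind(nn):
    """Model of RCL IND nn: recall via the address held in register nn."""
    return regs[int(regs[nn])]

sto(0, 5.0)        # register 0 now points at register 5
sto_ind(0, 42.0)   # stores 42 into register 5, through the pointer
```

Because the target address is itself data, a program can step a pointer through registers in a loop, which is the mechanism behind the Turing-completeness claim above.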
Here is a sample program that computes the factorial of an integer between 1 and 69.
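One way to write such a program with standard HP-41C instructions is the following sketch; it is a plausible reconstruction, not necessarily the original listing, and the text after each step is annotation rather than program content. With n in the X register, XEQ "FACT" leaves n! in X (69 is the largest n whose factorial fits the calculator's 10^100 range):

```
01 LBL "FACT"    n (1–69) in the X register on entry
02 STO 01        loop counter in data register 01
03 1             running product starts at 1
04 LBL 01
05 RCL 01
06 *             product := product * counter
07 DSE 01        decrement counter; skip next step when it reaches zero
08 GTO 01
09 RTN           n! left in the X register
```

The DSE (decrement and skip if equal or less) instruction is the calculator's standard loop-control primitive, so the GTO at step 08 repeats the multiply until the counter runs out.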