Unix is a family of multitasking, multiuser computer operating systems that derive from the original AT&T Unix, whose development started in the 1970s at the Bell Labs research center by Ken Thompson, Dennis Ritchie, and others. Initially intended for use inside the Bell System, AT&T licensed Unix to outside parties in the late 1970s, leading to a variety of both academic and commercial Unix variants from vendors including the University of California, Microsoft, IBM, and Sun Microsystems. In the early 1990s, AT&T sold its rights in Unix to Novell, which sold its Unix business to the Santa Cruz Operation in 1995; the UNIX trademark passed to The Open Group, a neutral industry consortium, which allows the use of the mark for certified operating systems that comply with the Single UNIX Specification. As of 2014, the Unix version with the largest installed base is Apple's macOS. Unix systems are characterized by a modular design, sometimes called the "Unix philosophy": the operating system provides a set of simple tools that each perform a limited, well-defined function, with a unified filesystem as the main means of communication and a shell scripting and command language to combine the tools into complex workflows.
Unix distinguishes itself from its predecessors as the first portable operating system: almost the entire operating system is written in the C programming language, allowing Unix to reach numerous platforms. Unix was meant to be a convenient platform for programmers developing software to be run on it and on other systems, rather than for non-programmers; the system grew larger as it spread in academic circles and users added their own tools to the system and shared them with colleagues. At first, Unix was not designed to be multi-tasking; it gained portability, multi-tasking, and multi-user capabilities in a time-sharing configuration. Unix systems are characterized by various concepts: the use of plain text for storing data, a hierarchical file system, treating devices and certain types of inter-process communication as files, and the use of many small programs that can be strung together through a command-line interpreter using pipes. These concepts are collectively known as the "Unix philosophy". Brian Kernighan and Rob Pike summarize this in The Unix Programming Environment as "the idea that the power of a system comes more from the relationships among programs than from the programs themselves".
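The idea that power comes from relationships among programs can be sketched with a minimal pipeline. The example below uses Python's subprocess module to chain three standard tools (`printf`, `sort`, and `head`, assumed to be available as on any Unix-like system); each program does one small job, and a pipe carries the output of one into the input of the next:

```python
import subprocess

# Three small tools combined via pipes, the way a shell pipeline would:
#   printf 'pear\napple\nfig\n' | sort | head -2
gen = subprocess.Popen(["printf", "pear\\napple\\nfig\\n"],
                       stdout=subprocess.PIPE)
srt = subprocess.Popen(["sort"], stdin=gen.stdout, stdout=subprocess.PIPE)
gen.stdout.close()  # let `printf` receive SIGPIPE if `sort` exits early
out = subprocess.check_output(["head", "-2"], stdin=srt.stdout)
srt.stdout.close()

result = out.decode().splitlines()
print(result)  # ['apple', 'fig']
```

None of the three programs knows anything about the others; the shell (here, the script) supplies the plumbing, which is exactly the division of labor the Unix philosophy describes.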
In an era when a standard computer consisted of a hard disk for storage and a data terminal for input and output, the Unix file model worked quite well, as I/O was linear. In the 1980s, non-blocking I/O and the set of inter-process communication mechanisms were augmented with Unix domain sockets, shared memory, message queues, and semaphores, and network sockets were added to support communication with other hosts. As graphical user interfaces developed, the file model proved inadequate to the task of handling asynchronous events such as those generated by a mouse. By the early 1980s, users began seeing Unix as a potential universal operating system, suitable for computers of all sizes; the Unix environment and the client–server program model were essential elements in the development of the Internet and the reshaping of computing as centered in networks rather than in individual computers. Both Unix and the C programming language were developed by AT&T and distributed to government and academic institutions, which led to both being ported to a wider variety of machine families than any other operating system.
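The readiness-based model that supplemented plain linear file I/O can be sketched with Python's selectors module (a portable wrapper over select()/epoll()). In this illustrative example, two pipes stand in for independent event sources such as a network socket and a mouse device:

```python
import os
import selectors

# Two pipes act as independent event sources; select() reports which
# ones are ready instead of blocking on a single linear stream.
r1, w1 = os.pipe()
r2, w2 = os.pipe()

sel = selectors.DefaultSelector()
sel.register(r1, selectors.EVENT_READ, data="source-1")
sel.register(r2, selectors.EVENT_READ, data="source-2")

os.write(w2, b"event")  # only source-2 produces an event
ready = sorted(key.data for key, _ in sel.select(timeout=1.0))
print(ready)  # ['source-2']
```

A single process can thus wait on many descriptors at once and react to whichever becomes ready first, which is what asynchronous input devices require.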
Under Unix, the operating system consists of many libraries and utilities along with the master control program, the kernel. The kernel provides services to start and stop programs, handles the file system and other common "low-level" tasks that most programs share, and schedules access to avoid conflicts when programs try to access the same resource or device simultaneously. To mediate such access, the kernel has special rights, reflected in the division between user space and kernel space, although in microkernel implementations, like MINIX or Redox, functions such as network protocols may run in user space. The origins of Unix date back to the mid-1960s, when the Massachusetts Institute of Technology, Bell Labs, and General Electric were developing Multics, a time-sharing operating system for the GE-645 mainframe computer. Multics featured several innovations but presented severe problems. Frustrated by the size and complexity of Multics, but not by its goals, individual researchers at Bell Labs started withdrawing from the project.
The last to leave were Ken Thompson, Dennis Ritchie, Douglas McIlroy, and Joe Ossanna, who decided to redo the work on a much smaller scale. The new operating system was initially without organizational backing and without a name, and it was a single-tasking system. In 1970, the group coined the name Unics, for Uniplexed Information and Computing Service, as a pun on Multics, which stood for Multiplexed Information and Computer Services. Brian Kernighan takes credit for the idea, but adds that "no one can remember" the origin of the final spelling Unix. Dennis Ritchie, Doug McIlroy, and Peter G. Neumann also credit Kernighan. The operating system was originally written in assembly language, but in 1973, Version 4 Unix was rewritten in C. Version 4 Unix, however, still had much PDP-11-dependent code and was not suitable for porting; the first port to another platform was made five years later.
BASIC is a family of general-purpose, high-level programming languages whose design philosophy emphasizes ease of use. In 1964, John G. Kemeny and Thomas E. Kurtz designed the original BASIC language at Dartmouth College; they wanted to enable students in fields other than mathematics to use computers. At the time, nearly all use of computers required writing custom software, something only scientists and mathematicians tended to learn. In addition to the language itself, Kemeny and Kurtz developed the Dartmouth Time Sharing System, which allowed multiple users to edit and run BASIC programs at the same time; this general model became popular on minicomputer systems like the PDP-11 and Data General Nova in the late 1960s and early 1970s. Hewlett-Packard produced an entire computer line for this method of operation, introducing the HP2000 series in the late 1960s and continuing sales into the 1980s. Many early video games trace their history to one of these versions of BASIC; the emergence of early microcomputers in the mid-1970s led to the development of the original Microsoft BASIC in 1975.
Due to the tiny main memory available on these machines, often only 4 kB, a variety of Tiny BASIC dialects were created. BASIC was available for practically any system of the era and became the de facto programming language for the home computer systems that emerged in the late 1970s; these machines nearly always had a BASIC interpreter installed by default, either in the machine's firmware or sometimes on a ROM cartridge. BASIC fell from use during the 1980s as newer machines with far greater capabilities came to market and other programming languages became tenable. In 1991, Microsoft released Visual Basic, combining an updated version of BASIC with a visual forms builder; this reignited use of the language, and "VB" remains a major programming language in the form of VB.NET. John G. Kemeny was the math department chairman at Dartmouth College, and largely on his reputation as an innovator in math teaching, in 1959 the school won an Alfred P. Sloan Foundation award of $500,000 to build a new department building. Thomas E. Kurtz had joined the department in 1956, and by the early 1960s the two agreed on the need for programming literacy among students outside the traditional STEM fields.
Kemeny noted that "Our vision was that every student on campus should have access to a computer, and any faculty member should be able to use a computer in the classroom whenever appropriate. It was as simple as that." Kemeny and Kurtz had made two previous experiments with simplified languages, DARSIMCO and DOPE, but these did not progress past a single freshman class. New experiments using Fortran and ALGOL followed, but Kurtz concluded these languages were too tricky for what they desired. As Kurtz noted, Fortran had numerous oddly-formed commands, notably an "almost impossible-to-memorize convention for specifying a loop: 'DO 100, I = 1, 10, 2'. Is it '1, 10, 2' or '1, 2, 10', and is the comma after the line number required or not?" Moreover, the lack of any sort of immediate feedback was a key problem. Kurtz suggested that time-sharing offered a solution: a single machine could divide its processing time among many users, and small programs would return results in a few seconds. This led to increasing interest in a system using time-sharing and a new language for use by non-STEM students. Kemeny wrote the first version of BASIC.
The acronym BASIC comes from the name of an unpublished paper by Thomas Kurtz. The new language was patterned on FORTRAN II, but the syntax was changed. For instance, the difficult-to-remember DO loop was replaced by the much easier FOR I = 1 TO 10 STEP 2, with the line number used in the DO instead indicated by the matching NEXT I. The cryptic IF statement of Fortran, whose syntax matched a particular instruction of the machine on which it was written, became the simpler IF I=5 THEN GOTO 100. These changes made the language much less idiosyncratic while still having an overall structure and feel similar to the original FORTRAN. The project received a $300,000 grant from the National Science Foundation, used to purchase a GE-225 computer for processing and a Datanet-30 realtime processor to handle the Teletype Model 33 teleprinters used for input and output. A team of a dozen undergraduates worked on the project for about a year, writing both the DTSS system and the BASIC compiler. The main CPU was later replaced by a GE-235, and later still by a GE-635. The first version of the BASIC language was released on 1 May 1964.
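As an illustration of the loop above (written in Python rather than period BASIC), FOR I = 1 TO 10 STEP 2 visits 1, 3, 5, 7, 9. One wrinkle in the translation: BASIC's TO bound is inclusive, while Python's range() bound is exclusive, so TO 10 becomes range(1, 11, 2):

```python
# BASIC original:
#   FOR I = 1 TO 10 STEP 2
#     PRINT I
#   NEXT I
values = []
for i in range(1, 11, 2):  # range() excludes its bound; TO includes it
    values.append(i)
print(values)  # [1, 3, 5, 7, 9]
```

Compared with Fortran's 'DO 100, I = 1, 10, 2', both forms make the start, bound, and step readable at a glance, which was exactly the point of the redesign.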
One of the graduate students on the implementation team was Sr. Mary Kenneth Keller, one of the first people in the United States to earn a Ph.D. in computer science and the first woman to do so. BASIC concentrated on supporting straightforward mathematical work, with matrix arithmetic support from its initial implementation as a batch language and character string functionality added by 1965. Wanting use of the language to become widespread, its designers made the compiler available free of charge; they made it available to high schools in the Hanover, New Hampshire area and put considerable effort into promoting the language.
Microsoft Windows is a group of several graphical operating system families, all of which are developed and sold by Microsoft. Each family caters to a certain sector of the computing industry. Active Windows families include Windows NT and Windows Embedded; defunct Windows families include Windows Mobile and Windows Phone. Microsoft introduced an operating environment named Windows on November 20, 1985, as a graphical operating system shell for MS-DOS in response to the growing interest in graphical user interfaces. Microsoft Windows came to dominate the world's personal computer market with over 90% market share, overtaking Mac OS, introduced in 1984. Apple came to see Windows as an unfair encroachment on their innovation in GUI development as implemented on products such as the Lisa and Macintosh. On PCs, Windows is still the most popular operating system. However, in 2014, Microsoft admitted losing the majority of the overall operating system market to Android, because of the massive growth in sales of Android smartphones.
In 2014, the number of Windows devices sold was less than 25% that of Android devices sold. This comparison, however, may not be relevant, as the two operating systems traditionally target different platforms. Still, numbers for server use of Windows show one third market share, similar to that for end user use. As of October 2018, the most recent version of Windows for PCs, tablets and embedded devices is Windows 10; the most recent version for server computers is Windows Server 2019. A specialized version of Windows runs on the Xbox One video game console. Microsoft, the developer of Windows, has registered several trademarks, each of which denotes a family of Windows operating systems that target a specific sector of the computing industry. As of 2014, the following Windows families are being developed: Windows NT: started as a family of operating systems with Windows NT 3.1, an operating system for server computers and workstations. It now consists of three operating system subfamilies that are released at the same time and share the same kernel: Windows: the operating system for mainstream personal computers and smartphones.
The latest version is Windows 10; the main competitors of this family are macOS by Apple for personal computers and Android for mobile devices. Windows Server: the operating system for server computers; the latest version is Windows Server 2019. Unlike its client sibling, it has adopted a strong naming scheme; the main competitor of this family is Linux. Windows PE: a lightweight version of its Windows sibling, meant to operate as a live operating system, used for installing Windows on bare-metal computers and for recovery or troubleshooting purposes; the latest version is Windows PE 10. Windows IoT: initially, Microsoft developed Windows CE as a general-purpose operating system for every device too resource-limited to be called a full-fledged computer. Windows CE was later renamed Windows Embedded Compact and was folded under the Windows Embedded trademark, which also consists of Windows Embedded Industry, Windows Embedded Professional, Windows Embedded Standard, Windows Embedded Handheld and Windows Embedded Automotive.
The following Windows families are no longer being developed: Windows 9x: an operating system that targeted the consumer market. It was discontinued because of suboptimal performance; Microsoft now caters to the consumer market with Windows NT. Windows Mobile: the predecessor to Windows Phone, it was a mobile phone operating system; the first version was called Pocket PC 2000, and the last version was Windows Mobile 6.5. Windows Phone: an operating system sold only to manufacturers of smartphones; the first version was Windows Phone 7, followed by Windows Phone 8, with the last version being Windows Phone 8.1. It was succeeded by Windows 10 Mobile. The term Windows collectively describes any or all of several generations of Microsoft operating system products. The history of Windows dates back to 1981, when Microsoft started work on a program called "Interface Manager"; it was announced in November 1983 under the name "Windows", but Windows 1.0 was not released until November 1985.
Windows 1.0 achieved little popularity. It is not a complete operating system; its shell is a program known as the MS-DOS Executive. Components included Calculator, Cardfile, Clipboard viewer, Control Panel, Paint, Reversi and Write. Windows 1.0 does not allow overlapping windows; instead all windows are tiled, and only modal dialog boxes may appear over other windows. Microsoft sold Windows Development libraries with the C development environment, which included numerous windows samples. Windows 2.0 was released in December 1987 and was more popular than its predecessor. It features several improvements to the user interface and memory management. Windows 2.03 changed the OS from tiled windows to overlapping windows; this change led to Apple Computer filing a suit against Microsoft alleging infringement of Apple's copyrights.
The Z80 CPU is an 8-bit microprocessor. It was introduced by Zilog in 1976 as the startup company's first product. The Z80 was conceived by Federico Faggin in late 1974 and developed by him and his then-11 employees at Zilog from early 1975 until March 1976, when the first working samples were delivered. With the revenue from the Z80, the company built its own chip factories and grew to over a thousand employees over the following two years. The Zilog Z80 was a software-compatible extension and enhancement of the Intel 8080 and, like it, was aimed at embedded systems. According to the designers, the primary targets for the Z80 CPU were products like intelligent terminals, high-end printers and advanced cash registers, as well as telecom equipment, industrial robots and other kinds of automation equipment. The Z80 was introduced on the market in July 1976 and came to be used in general desktop computers running CP/M and other operating systems as well as in the home computers of the 1980s.
It was also common in military applications, musical equipment such as synthesizers, and the computerized coin-operated video games of the late 1970s and early 1980s, i.e. arcade machines or video game arcade cabinets. The Z80 was one of the most used CPUs in the home computer market from the late 1970s to the mid-1980s. Zilog licensed the Z80 to the US-based Synertek and Mostek, which had helped them with initial production, as well as to a European second-source manufacturer, SGS; the design was also copied by several Japanese, East European and Soviet manufacturers. This won the Z80 acceptance in the world market, since large companies like NEC, Toshiba and Hitachi started to manufacture the device. In recent decades Zilog has refocused on the ever-growing market for embedded systems, and the most recent Z80-compatible microcontroller family, the pipelined 24-bit eZ80 with a linear 16 MB address range, has been introduced alongside the simpler Z180 and Z80 products. The Z80 came about when physicist Federico Faggin left Intel at the end of 1974 to found Zilog with Ralph Ungermann.
At Fairchild Semiconductor, and later at Intel, Faggin had been working on fundamental transistor and semiconductor manufacturing technology. He developed the basic design methodology used for memories and microprocessors at Intel and led the work on the Intel 4004, the 8080 and several other ICs. Masatoshi Shima, the principal logic and transistor-level designer of the 4004 and the 8080 under Faggin's supervision, joined the Zilog team. By March 1976, Zilog had developed the Z80 as well as an accompanying assembler-based development system for its customers, and by July 1976 this was formally launched onto the market. Early Z80s were manufactured by Synertek and Mostek before Zilog had its own manufacturing factory ready in late 1976; these companies were chosen because they could do the ion implantation needed to create the depletion-mode MOSFETs that the Z80 design used as load transistors in order to cope with a single 5-volt power supply. Faggin designed the instruction set to be binary compatible with the Intel 8080 so that most 8080 code, notably the CP/M operating system and Intel's PL/M compiler for the 8080, would run unmodified on the new Z80 CPU.
Masatoshi Shima designed most of the microarchitecture as well as the gate and transistor levels of the Z80 CPU, assisted by a small number of engineers and layout people. CEO Federico Faggin was himself heavily involved in the chip layout work, together with two dedicated layout people; according to Faggin, he worked 80 hours a week in order to meet the tight schedule given by the financial investors. The Z80 offered many improvements over the 8080: an enhanced instruction set including single-bit addressing, shifts/rotates on memory and registers other than the accumulator, rotate instructions for BCD number strings in memory, program looping, program-counter-relative jumps, block copy, block input/output, and byte search instructions; better support for signed 8- and 16-bit arithmetic; new IX and IY index registers with instructions for direct base+offset addressing; and a better interrupt system, with a more automatic and general vectorized interrupt system (mode 2) intended for Zilog's line of counter/timers, DMA and communications controllers, as well as a fixed-vector interrupt system (mode 1) for simple systems with minimal hardware.
Further improvements included: a non-maskable interrupt, which can be used to respond to power-down situations or other high-priority events; two separate register files, which could be quickly switched, to speed up response to interrupts such as fast asynchronous event handlers or a multitasking dispatcher (although they were not intended as extra registers for general code, they were used that way in some applications); and less hardware required for power supply, clock generation and interface to memory and I/O, thanks to a single 5-volt power supply, a single-phase 5 V clock, a built-in DRAM refresh mechanism, and non-multiplexed buses. A special reset function which clears only the program counter so that a single Z80 CPU could be used in a
The user interface, in the industrial design field of human–computer interaction, is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, whilst the machine feeds back information that aids the operators' decision-making process. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls, and process controls. The design considerations applicable when creating user interfaces are related to, or involve, such disciplines as ergonomics and psychology. The goal of user interface design is to produce a user interface which makes it easy and enjoyable to operate a machine in the way which produces the desired result; this means that the operator needs to provide minimal input to achieve the desired output, and that the machine minimizes undesired outputs to the human. User interfaces are composed of one or more layers, including a human–machine interface (HMI) that interfaces machines with physical input hardware such as keyboards and game pads, and output hardware such as computer monitors and printers.
A device that implements an HMI is called a human interface device. Other terms for human–machine interfaces are man–machine interface and, when the machine in question is a computer, human–computer interface. Additional UI layers may interact with one or more human senses, including: tactile UI, visual UI, auditory UI, olfactory UI, equilibrial UI and gustatory UI. Composite user interfaces (CUIs) are UIs that interact with two or more senses; the most common CUI is a graphical user interface (GUI), composed of a tactile UI and a visual UI capable of displaying graphics. When sound is added to a GUI, it becomes a multimedia user interface. There are three broad categories of CUI: standard, virtual and augmented. Standard composite user interfaces use standard human interface devices like keyboards and computer monitors. When the CUI blocks out the real world to create a virtual reality, the CUI is virtual and uses a virtual reality interface. When the CUI does not block out the real world and creates augmented reality, the CUI is augmented and uses an augmented reality interface.
When a UI interacts with all human senses, it is called a qualia interface, named after the theory of qualia. CUIs may also be classified by how many senses they interact with, as either an X-sense virtual reality interface or an X-sense augmented reality interface, where X is the number of senses interfaced with. For example, a Smell-O-Vision is a 3-sense standard CUI with visual display, sound and smells. The user interface or human–machine interface is the part of the machine that handles the human–machine interaction. Membrane switches, rubber keypads and touchscreens are examples of the physical part of the human–machine interface which we can see and touch. In complex systems, the human–machine interface is typically computerized; the term human–computer interface refers to this kind of system. In the context of computing, the term extends as well to the software dedicated to controlling the physical elements used for human–computer interaction. The engineering of human–machine interfaces is enhanced by considering ergonomics.
The corresponding disciplines are human factors engineering and usability engineering, part of systems engineering. Tools used for incorporating human factors in the interface design are developed based on knowledge of computer science, such as computer graphics, operating systems and programming languages. Nowadays, we use the expression graphical user interface for the human–machine interface on computers, as nearly all of them now use graphics. There is a difference between a user interface and an operator interface or a human–machine interface. The term "user interface" is often used in the context of computer systems and electronic devices, where a network of equipment or computers is interlinked through an MES or Host to display information. A human–machine interface is local to one machine or piece of equipment; it is the interface method between the human and the equipment/machine. An operator interface is the interface method by which multiple pieces of equipment that are linked by a host control system are accessed or controlled.
The system may expose several user interfaces to serve different kinds of users. For example, a computerized library database might provide two user interfaces, one for library patrons and the other for library personnel. The user interface of a mechanical system, a vehicle or an industrial installation is sometimes referred to as the human–machine interface. HMI is a modification of the original term MMI. In practice, the abbreviation MMI is still used, although some may claim that MMI stands for something different now. Another abbreviation is HCI, but this is more used for human–computer interaction. Other terms used are operator interface console and operator interface terminal. However it is abbreviated, the terms refer to the 'layer' that separates a human operating a machine from the machine itself. Without a clean and usable interface, humans would not be able to interact with the machines they operate.
Read-only memory (ROM) is a type of non-volatile memory used in computers and other electronic devices. Data stored in ROM can only be modified slowly, with difficulty, or not at all, so it is used to store firmware or application software in plug-in cartridges. Strictly, read-only memory refers to memory that is hard-wired, such as a diode matrix or a mask ROM, which cannot be changed after manufacture. Although discrete circuits can be altered in principle, integrated circuits cannot, and are useless if the data is bad or requires an update. That such memory can never be changed is a disadvantage in many applications, as bugs and security issues cannot be fixed and new features cannot be added. More recently, ROM has come to include memory that is read-only in normal operation but can still be reprogrammed in some way. Erasable programmable read-only memory (EPROM) and electrically erasable programmable read-only memory (EEPROM) can be erased and re-programmed, but this can only be done at slow speeds, may require special equipment, and is only possible a certain number of times.
IBM used Capacitor Read-Only Storage and Transformer Read-Only Storage to store microcode for the smaller System/360 models, the 360/85, and the initial two models of the S/370. On some models there was also a Writeable Control Store for additional diagnostics and emulation support. The simplest type of solid-state ROM is as old as semiconductor technology itself: combinational logic gates can be joined manually to map an n-bit address input onto arbitrary values of m-bit data output, forming a look-up table. With the invention of the integrated circuit came mask ROM. Mask ROM consists of a grid of word lines and bit lines, selectively joined together with transistor switches, and can represent an arbitrary look-up table with a regular physical layout and predictable propagation delay. In mask ROM, the data is physically encoded in the circuit, so it can only be programmed during fabrication. This leads to a number of serious disadvantages: it is only economical to buy mask ROM in large quantities, since users must contract with a foundry to produce a custom design.
The turnaround time between completing the design for a mask ROM and receiving the finished product is long, for the same reason. Mask ROM is impractical for R&D work, since designers need to modify the contents of memory as they refine a design. And if a product is shipped with faulty mask ROM, the only way to fix it is to recall the product and physically replace the ROM in every unit shipped. Subsequent developments have addressed these shortcomings. PROM, invented in 1956, allowed users to program its contents exactly once by physically altering its structure with the application of high-voltage pulses. This addressed problems 1 and 2 above, since a company can order a large batch of fresh PROM chips and program them with the desired contents at its designers' convenience. The 1971 invention of EPROM solved problem 3, since EPROM can be reset to its unprogrammed state by exposure to strong ultraviolet light. EEPROM, invented in 1983, went a long way to solving problem 4, since an EEPROM can be programmed in-place if the containing device provides a means to receive the program contents from an external source.
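The look-up-table view of ROM described above can be sketched in a few lines. This is an illustrative model, not any particular chip: an n-bit address selects one of 2^n fixed m-bit words, the contents are set once at "fabrication" (definition time), and there is no write path.

```python
# Illustrative model of a mask ROM: n = 2 address bits select one of
# 2**2 = 4 pre-programmed 4-bit words. A tuple is used because, like
# mask ROM, its contents cannot be modified after creation.
ROM_CONTENTS = (0b1010, 0b0111, 0b0000, 0b1111)

def rom_read(address: int) -> int:
    """Combinational read: decode the 2 address bits, output the stored word."""
    return ROM_CONTENTS[address & 0b11]

print(rom_read(1))  # 0b0111 == 7
```

A PROM would correspond to filling in the tuple once in the field rather than at definition time; an EPROM or EEPROM would additionally allow the table to be erased and rebuilt.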
Flash memory, invented at Toshiba in the mid-1980s and commercialized in the early 1990s, is a form of EEPROM that makes efficient use of chip area and can be erased and reprogrammed thousands of times without damage. All of these technologies improved the flexibility of ROM, but at a significant cost per chip, so that in large quantities mask ROM would remain an economical choice for many years. Rewriteable technologies were nevertheless envisioned as replacements for mask ROM. The most recent development is NAND flash, also invented at Toshiba. Its designers explicitly broke from past practice, stating plainly that "the aim of NAND flash is to replace hard disks," rather than pursuing the traditional use of ROM as a form of non-volatile primary storage. As of 2007, NAND has achieved this goal by offering throughput comparable to hard disks, higher tolerance of physical shock, extreme miniaturization, and much lower power consumption. Every stored-program computer may use a form of non-volatile storage to store the initial program that runs when the computer is powered on or otherwise begins execution.
Every non-trivial computer needs some form of mutable memory to record changes in its state as it executes. Forms of read-only memory were employed as non-volatile storage for programs in most early stored-program computers, such as ENIAC after 1948. Read-only memory was simpler to implement since it needed only a mechanism to read stored values, not to change them in-place, thus could be implemented with crude electromechanical devices. With the advent of integrated circuits in the 1960s, both ROM and its mutable counterpart static RAM were implemented as arrays of transistors in silicon chips.
The TRS-80 Micro Computer System is a desktop microcomputer launched in 1977 and sold by Tandy Corporation through their RadioShack stores. The name is an abbreviation of Tandy Radio Shack, Z-80 microprocessor; it is one of the earliest mass-marketed retail home computers. The TRS-80 has a full-stroke QWERTY keyboard, the Zilog Z80 processor, 4 KB of DRAM standard memory, a small size and desk footprint, a floating-point BASIC programming language, a standard 64-character/line video monitor, and a starting price of US$600. An extensive line of upgrades and add-on hardware peripherals for the TRS-80 was developed and marketed by Tandy/RadioShack; the basic system can be expanded with up to 48 KB of RAM and up to four floppy disk drives and/or hard disk drives. Tandy/RadioShack provided full-service support, including upgrade and training services, in their thousands of stores worldwide. By 1979, the TRS-80 had the largest selection of software in the microcomputer market; until 1982, the TRS-80 was the best-selling PC line, outselling the Apple II series by a factor of five according to one analysis.
In mid-1980, the broadly compatible TRS-80 Model III was released. The Model I was discontinued shortly thereafter due to stricter FCC regulations on radio-frequency interference to nearby electronic devices. In April 1983 the Model III was succeeded by the compatible Model 4. Following the original Model I and its compatible descendants, the TRS-80 name became a generic brand used on other, technically unrelated computer lines sold by Tandy, including the TRS-80 Model II, TRS-80 Model 2000, TRS-80 Model 100, TRS-80 Color Computer and TRS-80 Pocket Computer. In the mid-1970s, Tandy Corporation's RadioShack division was a successful American chain of more than 3,000 electronics stores. After buyer Don French purchased a MITS Altair kit computer, he began designing his own computer and showed it to vice president of manufacturing John Roach. Although the design did not impress Roach, the idea of selling a microcomputer did; when the two men visited National Semiconductor in California in mid-1976, Steve Leininger's expertise on the SC/MP microprocessor impressed them.
National executives refused to provide Leininger's contact information when French and Roach wanted to hire him as a consultant, but they found Leininger working part-time at Byte Shop, and he and French began working together in June 1976. The company envisioned a kit, but Leininger persuaded the others that, because "too many people can't solder", a preassembled computer would be better. Tandy had 11 million customers who might buy a microcomputer, but it would be much more expensive than the US$30 median price of a RadioShack product, a great risk for the conservative company. Executives feared losing money as Sears did with Cartrivision, and many opposed the project. As the popularity of CB radio—at one point comprising more than 20% of RadioShack's sales—declined, the company sought new products. In December 1976 French and Leininger received official approval for the project, but were told to emphasize cost savings. In February 1977 they showed their prototype, running a simple tax-accounting program, to Charles Tandy, head of Tandy Corporation.
The program crashed because the computer could not handle the US$150,000 figure that Tandy typed in as his salary, and the two men added support for floating-point math to its Tiny BASIC to prevent a recurrence. After the demonstration Tandy revealed that he had leaked the computer's existence to the press, so the project was approved. MITS had sold 1,000 Altairs in February 1975 and was selling 10,000 a year. Leininger and French suggested that RadioShack could sell 50,000 computers, but others disagreed and suggested 1,000 to 3,000 per year at the target US$199 price. Roach persuaded Tandy to agree to build 3,500—the number of RadioShack stores—so that each store could use a computer for inventory purposes if they did not sell. Having spent less than US$150,000 on development, RadioShack announced the TRS-80 at a New York City press conference on August 3, 1977; the price included a RadioShack tape recorder as datacassette storage. The company hoped that the new computer would help RadioShack sell higher-priced products and improve its "schlocky" image among customers.
Small businesses were the primary target market, followed by educators, consumers and hobbyists. Although the press conference did not receive much media attention because of a terrorist bombing elsewhere in the city, the computer received much more publicity at the Personal Computer Faire in Boston two days later. A front-page Associated Press article discussed the novelty of a large consumer-electronics company selling a home computer that could "do a payroll for up to 15 people in a small business, teach children mathematics, store your favorite recipes or keep track of an investment portfolio, it can play cards." Six sacks of mail arrived at Tandy headquarters asking about the computer, over 15,000 people called to purchase a TRS-80—paralyzing the company switchboard—and 250,000 joined the waiting list with a $100 deposit. Despite the internal skepticism, RadioShack aggressively entered the market.