NeXT, Inc. was an American computer and software company founded in 1985 by Apple Computer co-founder Steve Jobs. Its name was pronounced "Next". Based in Redwood City, California, the company developed and manufactured a series of computer workstations intended for the higher education and business markets. NeXT introduced the first NeXT Computer in 1988 and the smaller NeXTstation in 1990; the NeXT computers experienced limited sales, with estimates of about 50,000 units shipped in total. Nevertheless, their innovative object-oriented NeXTSTEP operating system and development environment were highly influential. The first major outside investment came from Ross Perot, who invested after seeing a segment about NeXT on the documentary The Entrepreneurs; in 1987, he invested $20 million in exchange for 16 percent of NeXT's stock and joined the board of directors in 1988. NeXT later released much of the NeXTSTEP system as a programming environment standard called OpenStep. NeXT withdrew from the hardware business in 1993 to concentrate on marketing OPENSTEP for Mach, its own OpenStep implementation, to several original equipment manufacturers.
NeXT developed WebObjects, one of the first enterprise web application frameworks. WebObjects never became widely popular because of its initial high price of $50,000, but it remains a prominent early example of a web server based on dynamic page generation rather than on static content. Apple purchased NeXT in 1997 for $429 million plus 1.5 million shares of Apple stock. As part of the agreement, Steve Jobs, chairman and CEO of NeXT Software, returned to Apple, the company he had co-founded in 1976. Jobs promised to merge software from NeXT with Apple's hardware platforms, an effort that eventually resulted in macOS, iOS, watchOS, and tvOS; these operating systems are based upon the NeXTSTEP and OPENSTEP foundation. In 1985, Apple co-founder Steve Jobs led Apple's SuperMicro division, responsible for the development of the Macintosh and Lisa personal computers; the Macintosh had been successful on university campuses because of the Apple University Consortium, which allowed students and institutions to buy the computers at a discount.
The consortium had earned more than $50 million on computers by February 1984. While chairman, Jobs visited university departments and faculty members to sell Macintosh computers. Jobs met Paul Berg, a Nobel laureate in chemistry, at a luncheon held in Silicon Valley to honor François Mitterrand, the President of France. Berg was frustrated by the expense of teaching students about recombinant DNA from textbooks instead of in wet laboratories, which are used for the testing and analysis of chemicals and other materials or biological matter. Wet labs were prohibitively expensive for lower-level courses and too complex to be simulated on the personal computers of the time. Berg suggested that Jobs use his influence at Apple to create a "3M computer" workstation for higher education, featuring at least one megabyte of random-access memory, a megapixel display, and one megaFLOP of performance, hence the name "3M". Jobs was intrigued by Berg's concept of a workstation and contemplated starting a higher education computer company in the fall of 1985, amid increasing turmoil at Apple.
Jobs' division did not release upgraded versions of the Macintosh or much of the Macintosh Office system. As a result, sales plummeted, and Apple was forced to write off millions of dollars in unsold inventory. Apple's chief executive officer John Sculley ousted Jobs from his day-to-day role at Apple, replacing him with Jean-Louis Gassée in 1985; later that year, Jobs began a power struggle to regain control of the company. The board of directors sided with Sculley while Jobs was on a business visit to Western Europe and the Soviet Union on behalf of Apple. After several months of being sidelined, Jobs resigned from Apple on September 13, 1985. He told the board he was leaving to set up a new computer company and that he would be taking several Apple employees from the SuperMicro division with him. He also told the board that his new company would not compete with Apple and might consider licensing its designs back to Apple to market under the Macintosh brand. Jobs named his new company Next, Inc. A number of former Apple employees followed him to Next, including Joanna Hoffman, Bud Tribble, George Crow, Rich Page, Susan Barnes, Susan Kare, and Dan'l Lewin.
After consulting with major educational buyers from around the country, including a follow-up meeting with Paul Berg, a tentative specification for the workstation was drawn up. It was designed to be powerful enough to run wet lab simulations and cheap enough for college students to use in their dormitory rooms. Before the specifications were finished, Apple sued Next, alleging "nefarious schemes" to take advantage of insider information. Jobs remarked, "It is hard to think that a $2 billion company with 4,300-plus people couldn't compete with six people in blue jeans." The suit was dismissed before trial. In 1986, Jobs recruited the famous graphic designer Paul Rand to create a brand identity at a cost of $100,000. Jobs recalled, "I asked him if he would come up with a few options, and he said, 'No, I will solve your problem for you and you will pay me. You don't have to use the solution. If you want options, go talk to other people.'" Rand created a 20-page brochure detailing the brand identity, including the precise angle used for the logo and a new spelling of the company name, NeXT.
NeXT changed its business plan in mid-1986: the company decided to develop both computer hardware and software, instead of only a low-end workstation.
A menu bar is a graphical control element which contains drop-down menus. The menu bar's purpose is to supply a common housing for window- or application-specific menus which provide access to such functions as opening files, interacting with an application, or displaying help documentation or manuals. Menu bars are typically present in graphical user interfaces that display documents and representations of files in windows and windowing systems, but menus can also be used in command line interface programs like text editors or file managers, where the drop-down menu is activated with a shortcut key or key combination. Through the evolution of user interfaces, the menu bar has been implemented in different ways by different user interfaces and application programs. In the Macintosh operating systems, the menu bar is a horizontal "bar" anchored to the top of the screen. In macOS, the left side contains the Apple menu, the Application menu, and the focused application's menus. The right side contains menu extras, for example the system clock, volume control, the fast user switching menu, and the Spotlight icon.
All of these menu extras can be moved horizontally by dragging them left or right. If an icon is dragged and dropped vertically, it disappears with a puff of smoke, much like icons removed from the Dock. In the Classic Mac OS, the right side contains the application menu, allowing the user to switch between open applications. In Mac OS 8.5 and later, this menu can be dragged downwards, causing it to be represented on screen as a floating palette. There is only one menu bar, so the application menus displayed are those of the currently focused application. For example, if the System Preferences application is focused, its menus are in the menu bar; if the user then clicks on the Desktop, which is part of the Finder application, the menu bar displays the Finder menus. Apple's experiments in GUI design for the Lisa project used multiple menu bars anchored to the bottom of windows, but this was dropped in favor of the current arrangement, as it proved slower to use. The idea of separate menus in each window or document was implemented in Microsoft Windows and is the default representation in most Linux desktop environments.
Before the advent of the Macintosh, the universal graphical menu bar appeared in the Apple Lisa in 1983. It was a feature of all versions of the Classic Mac OS from the first Macintosh release in 1984 and is still used today in macOS. The menu bar in Microsoft Windows is anchored to the top of a window, under the title bar. Menus in the menu bar can be accessed through shortcuts involving the Alt key and the mnemonic letter that appears underlined in the menu title. Additionally, pressing Alt or F10 brings the focus to the first menu of the menu bar. KDE and GNOME allow users to turn Macintosh-style and Windows-style menu bars off, and KDE can have both types in use at the same time. The standard GNOME desktop uses a menu bar at the top of the screen, but this menu bar only contains Applications and System menus and status information. The Unity desktop shell shipped with Ubuntu Linux since version 11.04 uses a Macintosh-style menu bar. Other window managers and desktop environments use a similar scheme, where programs have their own menus, but clicking one or more of the mouse buttons on the root window brings up a menu containing, for example, commands to launch various applications or to log out.
Window manager menus in Linux are configurable by editing text files, by using a desktop-environment-specific Control Panel applet, or both. The Amiga used a menu-bar style similar to that of the Macintosh, with the exception that the machine's custom graphics chips allowed each program to have its own "screen", with its own resolution and colour settings, which could be dragged down to reveal the screens of other programs. The title/menu bar would sit at the top of the screen and could be accessed by pressing the right mouse button, revealing the names of the various menus. When the right mouse button was not pressed, the menu/title bar would display the name of the program which owned the screen, along with other information such as the amount of memory used. When accessing menus with the right mouse button pressed, one could select multiple menu entries by clicking the left mouse button; when the right mouse button was released, all actions selected in the menus would be performed in the order they were selected.
This was known as multiselect. The Workbench screen title bar would display the Workbench version and the amount of free Chip RAM and Fast RAM. An unusual feature of the Amiga menu system was that the Workbench screen would display a "Workbench" menu instead of a "File" or "Apple" menu, while conforming applications would display "Project" and "Tools" menus. Keyboard shortcuts could be accessed by pressing the "right Amiga" key along with a normal alphanumeric key; the filled-in and hollowed-out designs of the left- and right-Amiga keys are similar to the closed-Apple and open-Apple keys of Apple II keyboards.
Lisa is a desktop computer developed by Apple and released on January 19, 1983. It was one of the first personal computers to offer a graphical user interface in a machine aimed at individual business users. Development of the Lisa began in 1978, and it underwent many changes before shipping at US$9,995 with a 5 MB hard drive. The Lisa was hampered by its high price, insufficient performance, a limited software library, a crash-prone operating system, unreliable Apple FileWare floppy disks, and the release soon afterward of the cheaper and faster Macintosh, yielding lifetime sales of only 100,000 units in two years. In 1982, after Steve Jobs was forced out of the Lisa project, he took over the existing Macintosh project, which Jef Raskin had conceived in 1979 and led with the aim of developing a text-based appliance computer. Jobs redefined the Macintosh as a cheaper and more usable version of the graphical Lisa. The Macintosh was launched in January 1984, quickly surpassing Lisa sales and absorbing increasing numbers of Lisa staff.
Newer Lisa models were introduced that addressed its faults and lowered its price, but the platform failed to achieve favorable sales compared to the much less expensive Mac. The final model, the Lisa 2/10, was modified and sold as the high end of the Macintosh series, the Macintosh XL. Considered a commercial failure but given some technical acclaim, the Lisa introduced a number of advanced features that would not reappear on the Macintosh for many years, including an operating system with a more document-oriented workflow. The hardware overall is more advanced than that of the Macintosh, with a hard drive, support for up to 2 megabytes of RAM, expansion slots, and a larger, higher-resolution display. The main exception is the processor: the 68000 in the Macintosh is clocked at 7.89 MHz, while the Lisa's runs at 5 MHz. The complexity of the Lisa operating system and its associated programs overtaxed the slower processor enough that users perceived the machine as sluggish, and the workstation-tier price and lack of a technical application library made it unviable for the technical workstation market.
Though the documentation shipped with the original Lisa refers to it only as "The Lisa", Apple stated the name was an acronym for "Locally Integrated Software Architecture", or "LISA". Because Steve Jobs's first daughter was named Lisa Nicole Brennan, it was widely inferred that the name had a personal association and that the acronym was a backronym invented later to fit the name. Andy Hertzfeld states the acronym was reverse-engineered from the name "Lisa" in late 1982 by the Apple marketing team, after they had hired a marketing consultancy firm to come up with names to replace "Lisa" and "Macintosh" and had rejected all of the suggestions. Hertzfeld and the other software developers used "Lisa: Invented Stupid Acronym", a recursive backronym, while computer industry pundits coined the phrase "Let's Invent Some Acronym" to fit the Lisa's name. Decades later, Jobs told his biographer Walter Isaacson: "Obviously it was named for my daughter." The project began in 1978 as an effort to create a more modern version of the then-conventional design epitomized by the Apple II.
A ten-person team occupied its first dedicated office, nicknamed "the Good Earth building" and located at 20863 Stevens Creek Boulevard, next to the restaurant named Good Earth. Initial team leader Ken Rothmuller was soon replaced by John Couch, under whose direction the project evolved into the "window-and-mouse-driven" form of its eventual release. Trip Hawkins and Jef Raskin contributed to this change in design, and Apple's cofounder Steve Jobs was involved in the concept. At Xerox's Palo Alto Research Center, research had been underway for several years to create a new, humanized way to organize the computer screen, known today as the desktop metaphor. Steve Jobs visited Xerox PARC in 1979 and was absorbed and excited by the revolutionary mouse-driven GUI of the Xerox Alto. By late 1979, Jobs had negotiated a payment of Apple stock to Xerox in exchange for his Lisa team receiving two demonstrations of ongoing research projects at Xerox PARC; when the Apple team saw the demonstration of the Alto computer, they were able to see in action the basic elements of what constituted a workable GUI.
The Lisa team put a great deal of work into making the graphical interface a mainstream commercial product. The Lisa was a major project at Apple, which spent more than $50 million on its development. More than 90 people participated in the design, plus more in the sales and marketing effort to launch the machine. BYTE credited Wayne Rosing with being the most important person in the development of the computer's hardware until the machine went into production, at which point he became technical lead for the entire Lisa project. The hardware development team was headed by Robert Paratore. The industrial design, product design, and mechanical packaging were headed by Bill Dresselhaus, the principal product designer of the Lisa, with his team of internal product designers and contract product designers from the firm that became IDEO. Bruce Daniels was in charge of applications development, and Larry Tesler was in charge of system software. The user interface was designed in a six-month period, after which the hardware, operating system, and applications were all created in parallel.
Apple Inc. is an American multinational technology company headquartered in Cupertino, California, that designs, develops, and sells consumer electronics, computer software, and online services. It is considered one of the Big Four of technology, along with Amazon, Google, and Facebook. The company's hardware products include the iPhone smartphone, the iPad tablet computer, the Mac personal computer, the iPod portable media player, the Apple Watch smartwatch, the Apple TV digital media player, and the HomePod smart speaker. Apple's software includes the macOS and iOS operating systems, the iTunes media player, the Safari web browser, the iLife and iWork creativity and productivity suites, and professional applications like Final Cut Pro, Logic Pro, and Xcode. Its online services include the iTunes Store, the iOS App Store, the Mac App Store, Apple Music, Apple TV+, iMessage, and iCloud. Other services include the Apple Store, Genius Bar, AppleCare, Apple Pay, Apple Pay Cash, and Apple Card. Apple was founded by Steve Jobs, Steve Wozniak, and Ronald Wayne in April 1976 to develop and sell Wozniak's Apple I personal computer, though Wayne sold his share back within 12 days.
It was incorporated as Apple Computer, Inc. in January 1977, and sales of its computers, including the Apple II, grew quickly. Within a few years, Jobs and Wozniak had hired a staff of computer designers and had a production line. Apple went public in 1980 to instant financial success. Over the next few years, Apple shipped new computers featuring innovative graphical user interfaces, such as the original Macintosh in 1984, and Apple's marketing advertisements for its products received widespread critical acclaim. However, the high price of its products and a limited application library caused problems, as did power struggles between executives. In 1985, Wozniak departed Apple amicably and remained an honorary employee, while Jobs and others resigned to found NeXT. As the market for personal computers expanded and evolved through the 1990s, Apple lost market share to the lower-priced duopoly of Microsoft Windows on Intel PC clones. The board recruited CEO Gil Amelio for what would be a 500-day charge to rehabilitate the financially troubled company, reshaping it with layoffs, executive restructuring, and product focus.
In 1997, he led Apple to buy NeXT, solving the company's failed operating system strategy and bringing Jobs back. Jobs regained leadership status, becoming CEO in 2000. Apple swiftly returned to profitability under the revitalizing Think different campaign, as Jobs rebuilt Apple's status by launching the iMac in 1998, opening the retail chain of Apple Stores in 2001, and acquiring numerous companies to broaden the software portfolio. In January 2007, Jobs renamed the company Apple Inc., reflecting its shifted focus toward consumer electronics, and launched the iPhone to great critical acclaim and financial success. In August 2011, Jobs resigned as CEO due to health complications, and Tim Cook became the new CEO. Two months later, Jobs died, marking the end of an era for the company. Apple is well known for its size and revenues; its worldwide annual revenue totaled $265 billion for the 2018 fiscal year. Apple is the world's largest information technology company by revenue and the world's third-largest mobile phone manufacturer after Samsung and Huawei.
In August 2018, Apple became the first public U.S. company to be valued at over $1 trillion. As of 2018, the company employs 123,000 full-time employees and maintains 504 retail stores in 24 countries, and it operates the iTunes Store, the world's largest music retailer. As of January 2018, more than 1.3 billion Apple products are in use worldwide. The company has a high level of brand loyalty and is ranked as the world's most valuable brand. However, Apple receives significant criticism regarding the labor practices of its contractors, its environmental practices, and unethical business practices, including anti-competitive behavior, as well as the origins of source materials. Apple Computer Company was founded on April 1, 1976, by Steve Jobs, Steve Wozniak, and Ronald Wayne. The company's first product was the Apple I, a computer designed and hand-built by Wozniak and first shown to the public at the Homebrew Computer Club. The Apple I was sold as a motherboard alone, a base-kit concept that would not be marketed as a complete personal computer today.
The Apple I went on sale in July 1976 and was market-priced at $666.66. Apple Computer, Inc. was incorporated on January 3, 1977, without Wayne, who had left and sold his share of the company back to Jobs and Wozniak for $800 only twelve days after having co-founded Apple. Multimillionaire Mike Markkula provided essential business expertise and funding of $250,000 during the incorporation of Apple. During the first five years of operations, revenues grew exponentially, doubling about every four months. Between September 1977 and September 1980, yearly sales grew from $775,000 to $118 million, an average annual growth rate of 533%. The Apple II, invented by Wozniak, was introduced on April 16, 1977, at the first West Coast Computer Faire. It differed from its major rivals, the TRS-80 and Commodore PET, because of its character cell-based color graphics and open architecture. While early Apple II models used ordinary cassette tapes as storage devices, they were superseded by the introduction of a 5 1⁄4-inch floppy disk drive and interface called the Disk II.
The Apple II was chosen to be the desktop platform for the first "killer app" of the business world: VisiCalc, a spreadsheet program. VisiCalc created a business market for the Apple II and gave home users an additional reason to buy one: compatibility with the office. Before VisiCalc, Apple had been a distant third-place competitor.
Mac OS memory management
The classic Mac OS used a form of memory management that has fallen out of favor in modern systems, and criticism of this approach was one of the key areas addressed by the change to Mac OS X. The original problem for the engineers of the Macintosh was how to make optimum use of the 128 KB of RAM with which the machine was equipped, on Motorola 68000-based computer hardware that did not support virtual memory. Since at that time the machine could run only one application program at a time, and there was no fixed secondary storage, the engineers implemented a simple scheme that worked well within those particular constraints. That design choice did not scale well with the development of the machine, creating various difficulties for both programmers and users. The primary concern of the original engineers appears to have been fragmentation: the repeated allocation and deallocation of memory through pointers can leave many small isolated areas of memory that cannot be used because they are individually too small, even though the total free memory may be sufficient to satisfy a particular request.
To solve this, Apple engineers used the concept of a relocatable handle: a reference to memory which allowed the actual data referred to to be moved without invalidating the handle. Apple's scheme was simple: a handle was a pointer into a table of further pointers, which in turn pointed to the data. If a memory request required compaction of memory, this was done and the table, called the master pointer block, was updated. The machine itself implemented two areas in memory available for this scheme: the system heap and the application heap. As long as only one application at a time was run, the system worked well, and since the entire application heap was dissolved when the application quit, fragmentation was minimized. However, the memory management system had weaknesses. The handle-based approach opened up a source of programming errors: pointers to data within such relocatable blocks could not be guaranteed to remain valid across calls that might cause memory to move. This was a real problem for every system API that existed.
Because of the transparency of system-owned data structures at the time, the APIs could do little to solve this. The onus was thus on the programmer not to create such pointers, or at least to manage them carefully by re-dereferencing all handles after every such API call. Since many programmers were not familiar with this approach, early Mac programs suffered from faults arising from it. Palm OS and 16-bit Windows use a similar scheme for memory management, but the Palm and Windows versions make programmer error more difficult. For instance, in Mac OS, to convert a handle to a pointer, a program just de-references the handle directly, but if the handle is not locked, the pointer can become invalid quickly, and calls to lock and unlock handles are not balanced: ten calls to HLock are undone by a single call to HUnlock. In Palm OS and Windows, handles are an opaque type and must be de-referenced with MemHandleLock on Palm OS or Global/LocalLock on Windows; when a Palm or Windows application is finished with a handle, it calls MemHandleUnlock or Global/LocalUnlock. Palm OS and Windows keep a lock count for blocks.
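The hazard is easiest to see in code. The following is a minimal, self-contained C sketch that models the master-pointer scheme described above; the names NewHandleSim, CompactSim, HLockSim, and HUnlockSim are illustrative stand-ins for the real Toolbox calls (NewHandle, HLock, HUnlock), and compaction is simulated by copying blocks to new addresses:

    /* Model of the classic Mac OS handle scheme: a handle is a pointer
       into a table of master pointers, which point at relocatable blocks.
       Illustrative only; not the real Toolbox Memory Manager API. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    typedef char **Handle;              /* pointer to a master pointer */

    #define MAX_BLOCKS 16
    static char *master[MAX_BLOCKS];    /* the master pointer block */
    static int   locked[MAX_BLOCKS];
    static int   used;

    static Handle NewHandleSim(size_t size) {
        master[used] = calloc(1, size);
        return &master[used++];
    }

    static void HLockSim(Handle h)   { locked[h - master] = 1; }
    static void HUnlockSim(Handle h) { locked[h - master] = 0; }

    /* Compaction moves every unlocked block and updates its master
       pointer; the handle stays valid, but raw pointers go stale.
       (Block-size bookkeeping is omitted for brevity.) */
    static void CompactSim(void) {
        for (int i = 0; i < used; i++)
            if (master[i] && !locked[i]) {
                char *moved = strdup(master[i]);
                free(master[i]);
                master[i] = moved;
            }
    }

    int main(void) {
        Handle h = NewHandleSim(32);
        strcpy(*h, "hello");

        char *raw = *h;    /* de-referenced without locking: risky  */
        CompactSim();      /* block may move; 'raw' now dangles     */
        printf("via handle: %s\n", *h);   /* still correct          */

        HLockSim(h);       /* pin the block, as HLock would         */
        raw = *h;          /* safe to hold while the block is locked */
        CompactSim();      /* locked block does not move            */
        printf("via locked pointer: %s\n", raw);
        HUnlockSim(h);
        return 0;
    }

Re-dereferencing the handle after any call that can move memory, or locking it around the unsafe region, is exactly the discipline the scheme demanded of Mac programmers.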
The problem of nested locks and unlocks can be addressed in various ways, but these intrude upon the readability of the associated code and require awareness and discipline on the part of the coder, both to avoid memory "leaks" and to avoid references to stale handles after release. The situation worsened with the advent of Switcher, a way for a Mac with 512 KB or more of memory to run multiple applications at once. This was a necessary step forward for users, who found the one-app-at-a-time approach limiting. Because Apple was now committed to its memory management model, as well as to compatibility with existing applications, it was forced to adopt a scheme in which each application was allocated its own heap from the available RAM. The amount of RAM allocated to each heap was set by a value coded into the metadata of each application by the programmer. Sometimes this value was not enough for particular kinds of work, so the setting had to be exposed to the user to allow them to tweak the heap size to suit their own requirements.
While popular among "power users", this exposure of a technical implementation detail went against the grain of the Mac user philosophy. Apart from exposing users to esoteric technicalities, it was inefficient, since an application would grab all of its allotted RAM even if it left most of it unused. Another application might be memory-starved but unable to utilize the free memory "owned" by another application. While an application could not beneficially utilize a sister application's heap, it could certainly destroy it by inadvertently writing to a nonsense address. An application accidentally treating a fragment of text or image, or an unassigned location, as a pointer could overwrite the code or data of other applications or of the OS, leaving "lurkers" even after the program was exited. Such problems could be difficult to analyze and correct. Switcher evolved into MultiFinder.
Unix is a family of multitasking, multiuser computer operating systems that derive from the original AT&T Unix, developed beginning in the 1970s at the Bell Labs research center by Ken Thompson, Dennis Ritchie, and others. Initially intended for use inside the Bell System, Unix was licensed by AT&T to outside parties in the late 1970s, leading to a variety of both academic and commercial Unix variants from vendors including the University of California, Microsoft, IBM, and Sun Microsystems. In the early 1990s, AT&T sold its rights in Unix to Novell, which then sold its Unix business to the Santa Cruz Operation in 1995. The UNIX trademark passed to The Open Group, a neutral industry consortium, which allows the use of the mark for certified operating systems that comply with the Single UNIX Specification. As of 2014, the Unix version with the largest installed base is Apple's macOS. Unix systems are characterized by a modular design that is sometimes called the "Unix philosophy": the operating system provides a set of simple tools, each of which performs a limited, well-defined function, with a unified filesystem as the main means of communication and a shell scripting and command language to combine the tools to perform complex workflows.
Unix distinguishes itself from its predecessors as the first portable operating system: almost the entire operating system is written in the C programming language, allowing Unix to reach numerous platforms. Unix was meant to be a convenient platform for programmers developing software to be run on it and on other systems, rather than for non-programmers. The system grew larger as it spread in academic circles and as users added their own tools to the system and shared them with colleagues. At first, Unix was not designed to be portable or multi-tasking; it later gained portability, multi-tasking, and multi-user capabilities in a time-sharing configuration. Unix systems are characterized by various concepts: the use of plain text for storing data, a hierarchical file system, treating devices and certain types of inter-process communication as files, and the use of a large number of small software tools that can be strung together through a command-line interpreter using pipes. These concepts are collectively known as the "Unix philosophy". Brian Kernighan and Rob Pike summarize this in The Unix Programming Environment as "the idea that the power of a system comes more from the relationships among programs than from the programs themselves".
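A short POSIX C sketch makes those "relationships among programs" concrete: the pipe(), fork(), and exec() primitives below wire two ordinary tools together, exactly as the shell pipeline `ls | wc -l` would. The choice of ls and wc is illustrative; any two programs that read standard input and write standard output compose the same way.

    /* Minimal sketch of a Unix pipeline built from primitives:
       equivalent to running `ls | wc -l` in a shell. */
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(void) {
        int fd[2];
        if (pipe(fd) == -1) { perror("pipe"); return 1; }

        pid_t writer = fork();
        if (writer == 0) {                 /* child 1: runs `ls` */
            dup2(fd[1], STDOUT_FILENO);    /* stdout -> pipe write end */
            close(fd[0]); close(fd[1]);
            execlp("ls", "ls", (char *)NULL);
            perror("execlp ls"); _exit(127);
        }

        pid_t reader = fork();
        if (reader == 0) {                 /* child 2: runs `wc -l` */
            dup2(fd[0], STDIN_FILENO);     /* stdin <- pipe read end */
            close(fd[0]); close(fd[1]);
            execlp("wc", "wc", "-l", (char *)NULL);
            perror("execlp wc"); _exit(127);
        }

        close(fd[0]); close(fd[1]);        /* parent keeps no ends open,
                                              so wc sees EOF when ls exits */
        waitpid(writer, NULL, 0);
        waitpid(reader, NULL, 0);
        return 0;
    }

Each tool stays small and single-purpose; the composition, not the individual programs, does the work.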
In an era when a standard computer consisted of a hard disk for storage and a data terminal for input and output, the Unix file model worked quite well, as I/O was generally linear. In the 1980s, non-blocking I/O was introduced and the set of inter-process communication mechanisms was augmented with Unix domain sockets, shared memory, message queues, and semaphores; network sockets were added to support communication with other hosts. As graphical user interfaces developed, the file model proved inadequate to the task of handling asynchronous events such as those generated by a mouse. By the early 1980s, users began seeing Unix as a potential universal operating system, suitable for computers of all sizes. The Unix environment and the client–server program model were essential elements in the development of the Internet and the reshaping of computing as centered in networks rather than in individual computers. Both Unix and the C programming language were developed by AT&T and distributed to government and academic institutions, which led to both being ported to a wider variety of machine families than any other operating system.
Under Unix, the operating system consists of many libraries and utilities along with the master control program, the kernel. The kernel provides services to start and stop programs, handles the file system and other common "low-level" tasks that most programs share, and schedules access to avoid conflicts when programs try to use the same resource or device simultaneously. To mediate such access, the kernel has special rights, reflected in the division between user space and kernel space, although in microkernel implementations like MINIX or Redox, functions such as network protocols may run in user space. The origins of Unix date back to the mid-1960s, when the Massachusetts Institute of Technology, Bell Labs, and General Electric were developing Multics, a time-sharing operating system for the GE-645 mainframe computer. Multics featured several innovations but presented severe problems. Frustrated by the size and complexity of Multics, but not by its goals, individual researchers at Bell Labs started withdrawing from the project.
The last to leave were Ken Thompson, Dennis Ritchie, Douglas McIlroy, and Joe Ossanna, who decided to reimplement their experiences in a new project of smaller scale. This new operating system was initially without organizational backing and without a name, and it was a single-tasking system. In 1970, the group coined the name Unics, for Uniplexed Information and Computing Service, as a pun on Multics, which stood for Multiplexed Information and Computer Services. Brian Kernighan takes credit for the idea, but adds that "no one can remember" the origin of the final spelling Unix; Dennis Ritchie, Doug McIlroy, and Peter G. Neumann also credit Kernighan. The operating system was originally written in assembly language, but in 1973, Version 4 Unix was rewritten in C. Version 4 Unix, however, still had much PDP-11-dependent code and was not suitable for porting; the first port to another platform was made five years later.
Usability is the ease of use and learnability of a human-made object such as a tool or device. In software engineering, usability is the degree to which software can be used by specified consumers to achieve quantified objectives with effectiveness, efficiency, and satisfaction in a quantified context of use. The object of use can be a software application, website, book, machine, vehicle, or anything a human interacts with. A usability study may be conducted as a primary job function by a usability analyst or as a secondary job function by designers, technical writers, marketing personnel, and others. Usability applies to consumer electronics, to communication and knowledge-transfer objects, and to mechanical objects such as a door handle or a hammer. Usability includes methods of measuring usability, such as needs analysis, and the study of the principles behind an object's perceived efficiency or elegance. In human-computer interaction and computer science, usability studies the elegance and clarity with which the interaction with a computer program or a web site is designed.
Usability considers user satisfaction and utility as quality components and aims to improve user experience through iterative design. The primary notion of usability is that an object designed with the users' generalized psychology and physiology in mind is, for example: more efficient to use (it takes less time to accomplish a particular task), easier to learn (operation can be learned by observing the object), and more satisfying to use. Complex computer systems are finding their way into everyday life at the same time as the market is saturated with competing brands. This has made usability more popular and more widely recognized in recent years, as companies see the benefits of researching and developing their products with user-oriented rather than technology-oriented methods. By understanding and researching the interaction between product and user, the usability expert can provide insight that is unattainable by traditional company-oriented market research. For example, after observing and interviewing users, the usability expert may identify needed functionality or design flaws that were not anticipated.
A method called contextual inquiry does this in the naturally occurring context of the users' own environment. In the user-centered design paradigm, the product is designed with its intended users in mind at all times. In the user-driven or participatory design paradigm, some of the users become actual or de facto members of the design team. The term user friendly is often used as a synonym for usable, though it may also refer to accessibility. Usability describes the quality of user experience across websites, software, products, and environments. There is no consensus about the relation between the terms ergonomics and usability: some think of usability as the software specialization of the larger topic of ergonomics, while others view the topics as tangential, with ergonomics focusing on physiological matters and usability on psychological matters. Usability is also important in website development. According to Jakob Nielsen, "Studies of user behavior on the Web find a low tolerance for difficult designs or slow sites. People don't want to wait.
And they don't want to learn. There's no manual for a Web site. People have to be able to grasp the functioning of the site after scanning the home page—for a few seconds at most." Otherwise, most casual users simply leave the site and browse or shop elsewhere. ISO defines usability as "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use". The word "usability" also refers to methods for improving ease-of-use during the design process. Usability consultant Jakob Nielsen and computer science professor Ben Shneiderman have written about a framework of system acceptability, in which usability is a part of "usefulness" and is composed of: Learnability: How easy is it for users to accomplish basic tasks the first time they encounter the design? Efficiency: Once users have learned the design, how quickly can they perform tasks? Memorability: When users return to the design after a period of not using it, how easily can they re-establish proficiency?
Errors: How many errors do users make, how severe are these errors, and how easily can they recover from them? Satisfaction: How pleasant is it to use the design? Usability is associated with the functionalities of the product, in addition to being a characteristic of the user interface. For example, in the context of mainstream consumer products, an automobile lacking a reverse gear could be considered unusable according to the former view, and lacking in utility according to the latter view. When evaluating user interfaces for usability, the definition can be as simple as "the perception of a target user of the effectiveness and efficiency of the interface". Each component may be measured subjectively against criteria, e.g. the Principles of User Interface Design, to provide a metric expressed as a percentage. It is important to distinguish between usability testing and usability engineering: usability testing is the measurement of ease of use of a piece of software, while usability engineering is the research and design process that ensures a product with good usability.
Usability is a non-functional requirement. As with other non-functional requirements, usability cannot be measured directly; it must be quantified by means of indirect measures or observable attributes.