1.
Classic Mac OS
–
This is a list of macOS components: features that are included in the current Mac operating system.

Automator (latest version 2.7) is an application developed by Apple Inc. Automator enables the repetition of tasks across a variety of programs, including Finder, Safari, Calendar, and Contacts. It can also work with third-party applications such as Microsoft Office. The icon features a robot holding a pipe, a reference to pipelines, a computer-science term for connected data workflows. Automator was first released with Mac OS X Tiger. It provides a graphical user interface for automating tasks without knowledge of programming or scripting languages. Tasks can be recorded as they are performed by the user or can be selected from a list, and the output of the previous action can become the input to the next action. Automator comes with a library of Actions that act as individual steps in a Workflow document; a Workflow document is used to carry out repetitive tasks. Workflows can be saved and reused, and Unix command-line scripts and AppleScripts can also be invoked as Actions. The Actions are linked together in a Workflow, which can be saved as an application, a Workflow file, or a contextual menu item. Options can be set when the Workflow is created or when it is run; a workflow file created in Automator is saved in /Users//Library/Services.

Calculator (latest version 10.8) is a calculator application made by Apple Inc. It has three modes: basic, scientific, and programmer. Basic includes a number pad, buttons for adding, subtracting, multiplying, and dividing, as well as memory keys. Scientific mode supports exponents and trigonometric functions, and programmer mode gives the user access to more options related to computer programming. Apple currently ships a separate graphing application called Grapher. Calculator has Reverse Polish notation support and can speak the buttons pressed. The Calculator first appeared as a desk accessory in the first version of the Macintosh System Software for the 1984 Macintosh 128K. Its design, with the basic math operations, was maintained until the final release of the classic Mac OS in 2002. A Dashboard Calculator widget is included in all versions of macOS from Mac OS X Tiger onwards; it has only the basic mode of its desktop counterpart. Since the release of OS X Yosemite, there is also a simple calculator widget available in the notifications area, and since the release of Mac OS X Leopard, simple arithmetic can be calculated from the Spotlight feature.
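Reverse Polish notation places each operator after its operands, so no parentheses or precedence rules are needed. As a minimal sketch of how such input can be evaluated with a stack (my own illustration, not Apple's Calculator implementation):

```python
def eval_rpn(tokens):
    """Evaluate a Reverse Polish notation expression, e.g. ['3', '4', '+', '2', '*'] -> 14."""
    stack = []
    ops = {
        '+': lambda a, b: a + b,
        '-': lambda a, b: a - b,
        '*': lambda a, b: a * b,
        '/': lambda a, b: a / b,
    }
    for tok in tokens:
        if tok in ops:
            b = stack.pop()           # right operand is on top of the stack
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))  # operands are pushed until an operator arrives
    return stack.pop()

print(eval_rpn("3 4 + 2 *".split()))  # (3 + 4) * 2 = 14.0
```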
2.
Software developer
–
A software developer is a person concerned with facets of the software development process, including the research, design, programming, and testing of computer software. Other job titles used with similar meanings are programmer and software analyst. According to developer Eric Sink, the differences between system design, software development, and programming are more apparent; even more so when developers become systems architects, those who design the multi-leveled architecture or component interactions of a large software system. In a large company, there may be employees whose sole responsibility consists of only one of the phases above. In smaller development environments, a few people or even a single individual might handle the complete process.

The word "software" was coined as a prank as early as 1953; before this time, computers were programmed either by customers or by the few commercial computer vendors of the time, such as UNIVAC and IBM. The first company founded to provide software products and services was Computer Usage Company in 1955. The software industry expanded in the early 1960s, almost immediately after computers were first sold in mass-produced quantities, as universities, government, and business customers created a demand for software. Many of these programs were written in-house by full-time staff programmers, and some were distributed freely between users of a particular machine for no charge. Others were produced on a commercial basis, and other firms, such as Computer Sciences Corporation, started to grow. The computer/hardware makers started bundling operating systems, systems software, and programming environments with their machines; new software was built for microcomputers, and other manufacturers, including IBM, quickly followed DEC's example, resulting in the IBM AS/400 amongst others. The industry expanded greatly with the rise of the personal computer in the mid-1970s, which in the following years created a growing market for games and applications. DOS, Microsoft's first operating system product, was the dominant operating system at the time.

By 2014 the role of cloud developer had been defined; in this context, one definition of a developer in general was published: "Developers make software for the world to use. The job of a developer is to crank out code -- fresh code for new products, code fixes for maintenance, code for business logic."
3.
Apple Inc.
–
Apple is an American multinational technology company headquartered in Cupertino, California, that designs, develops, and sells consumer electronics, computer software, and online services. Apple's consumer software includes the macOS and iOS operating systems, the iTunes media player, and the Safari web browser. Its online services include the iTunes Store, the iOS App Store and Mac App Store, and Apple Music. Apple was founded by Steve Jobs, Steve Wozniak, and Ronald Wayne in April 1976 to develop and sell personal computers. It was incorporated as Apple Computer, Inc. in January 1977. Apple joined the Dow Jones Industrial Average in March 2015. In November 2014, Apple became the first U.S. company to be valued at over US$700 billion, in addition to being the largest publicly traded corporation in the world by market capitalization. The company employs 115,000 full-time employees as of July 2015, and it operates the online Apple Store and iTunes Store, the latter of which is the world's largest music retailer. Consumers use more than one billion Apple products worldwide as of March 2016. Apple's worldwide annual revenue totaled $233 billion for the fiscal year ending in September 2015, accounting for approximately 1.25% of the total United States GDP. The corporation receives significant criticism regarding the labor practices of its contractors and its environmental and business practices, including the origins of source materials.

Apple was founded on April 1, 1976, by Steve Jobs, Steve Wozniak, and Ronald Wayne. The Apple I kits were computers single-handedly designed and hand-built by Wozniak and first shown to the public at the Homebrew Computer Club. The Apple I was sold as a motherboard, which is less than what is now considered a complete personal computer. It went on sale in July 1976 and was market-priced at $666.66. Apple was incorporated on January 3, 1977, without Wayne, who sold his share of the company back to Jobs and Wozniak for $800. Multimillionaire Mike Markkula provided essential business expertise and funding of $250,000 during the incorporation of Apple. During the first five years of operations, revenues grew exponentially, doubling about every four months. Between September 1977 and September 1980, yearly sales grew from $775,000 to $118 million. The Apple II, also invented by Wozniak, was introduced on April 16, 1977, at the first West Coast Computer Faire. It differed from its rivals, the TRS-80 and Commodore PET, because of its character cell-based color graphics. While early Apple II models used ordinary cassette tapes as storage devices, they were superseded by the introduction of a 5 1/4 inch floppy disk drive and interface called the Disk II. The Apple II was chosen to be the platform for the first killer app of the business world, the VisiCalc spreadsheet. VisiCalc created a market for the Apple II and gave home users an additional reason to buy one. Before VisiCalc, Apple had been a distant third-place competitor to Commodore; by the end of the 1970s, Apple had a staff of computer designers and a production line.
4.
Macintosh operating systems
–
The family of Macintosh operating systems is developed by Apple Inc. In 1984, Apple debuted the operating system that is now known as the classic Mac OS with its release of the original Macintosh System Software. The system, rebranded Mac OS in 1996, was preinstalled on every Macintosh until 2002. Noted for its ease of use, it was also criticized for its lack of modern technologies compared to its competitors. The current Mac operating system is macOS, originally named Mac OS X until 2012. The current macOS is preinstalled on every Mac and is updated annually, and it is the basis of Apple's current system software for its other devices, including iOS and watchOS. Apple's effort to expand upon and develop a replacement for its classic Mac OS in the 1990s led to a few cancelled projects, code-named Star Trek and Taligent, among others; the Macintosh is credited with having popularized the graphical user interface.

The classic Mac OS is the original Macintosh operating system that was introduced in 1984 alongside the first Macintosh and remained in primary use on Macs through 2001. Apple released the original Macintosh on January 24, 1984; its system software was partially based on the Lisa OS and the Xerox PARC Alto computer. It was originally named System Software, or simply System; Apple rebranded it as Mac OS in 1996, due in part to its Macintosh clone program, which ended a year later. Mac OS is characterized by its monolithic design. Nine major versions of the classic Mac OS were released. The name "Classic" that now signifies the system as a whole is a reference to a compatibility layer that helped ease the transition to Mac OS X.

Although the current system was marketed as simply version 10 of Mac OS, its history is largely independent of the classic Mac OS. Precursors to the release of Mac OS X include OpenStep and Apple's Rhapsody project. macOS makes use of the BSD codebase and the XNU kernel. The first desktop version of the system was released on March 24, 2001, supporting the Aqua user interface. Since then, several more versions adding newer features and technologies have been released; since 2011, new releases have been offered on an annual basis. It was followed by several more official server-based releases; server functionality has instead been offered as an add-on for the desktop system since 2011.

The first version of the system was ready for use in February 1988. In 1988, Apple released its first Unix-based OS, A/UX, which was a Unix operating system with the Mac OS look and feel. It was not very competitive for its time, due in part to the crowded Unix market, and A/UX had most of its success in sales to the U.S. government, where POSIX compliance was a requirement that Mac OS could not meet. The Macintosh Application Environment (MAE) was a software package introduced by Apple in 1994 that allowed users of certain Unix-based computer workstations to run Apple Macintosh application software; MAE used the X Window System to emulate a Macintosh Finder-style graphical user interface.
5.
Software release life cycle
–
Usage of the alpha/beta test terminology originated at IBM. As long ago as the 1950s, IBM used similar terminology for its hardware development: an "A" test was the verification of a new product before public announcement, a "B" test was the verification before releasing the product to be manufactured, and a "C" test was the final test before general availability of the product. Martin Belsky, a manager on some of IBM's earlier software projects, claimed to have invented the terminology. IBM dropped the alpha/beta terminology during the 1960s, but by then it had received fairly wide notice. The usage of "beta test" to refer to testing done by customers was not done at IBM; rather, IBM used the term "field test".

Pre-alpha refers to all activities performed during the project before formal testing. These activities can include requirements analysis, software design, and software development. In typical open-source development, there are several types of pre-alpha versions; milestone versions include specific sets of functions and are released as soon as the functionality is complete.

The alpha phase of the release life cycle is the first phase to begin software testing. In this phase, developers generally test the software using white-box techniques; additional validation is then performed using black-box or gray-box techniques, by another testing team. Moving to black-box testing inside the organization is known as alpha release. Alpha software can be unstable and could cause crashes or data loss, and it may not contain all of the features that are planned for the final version. In general, external availability of alpha software is uncommon for proprietary software, while open-source software often has publicly available alpha versions. The alpha phase usually ends with a feature freeze, indicating that no more features will be added to the software. At this time, the software is said to be feature complete.

Beta, named after the second letter of the Greek alphabet, is the software development phase following alpha. Software in the beta stage is also known as betaware. The beta phase generally begins when the software is feature complete but likely to contain a number of known or unknown bugs. Software in the beta phase will generally have many more bugs in it than completed software, as well as speed/performance issues. The focus of beta testing is reducing impacts on users, often incorporating usability testing. The process of delivering a beta version to the users is called beta release, and this is typically the first time that the software is available outside of the organization that developed it. Beta version software is useful for demonstrations and previews within an organization.
6.
Kernel (operating system)
–
The kernel is a computer program that is the core of a computer's operating system, with complete control over everything in the system. It is the first program loaded on start-up, and it handles the rest of start-up as well as input/output requests from software, translating them into data-processing instructions for the central processing unit. It also handles memory and peripherals like keyboards, monitors, and printers. The critical code of the kernel is usually loaded into a protected area of memory, which prevents it from being overwritten by applications or other, more minor parts of the operating system. The kernel performs its tasks, such as running processes and handling interrupts, in kernel space; in contrast, everything a user does is in user space: writing text in a text editor, running programs in a GUI, and so on. This separation prevents user data and kernel data from interfering with each other and causing instability. The kernel's interface is an abstraction layer. When a process makes a request of the kernel, the request is called a system call. Kernel designs differ in how they manage these system calls and resources: a monolithic kernel runs all the operating system instructions in the same address space, while a microkernel runs most processes in user space, for modularity.

The kernel takes responsibility for deciding at any time which of the running programs should be allocated to the processor or processors. Random-access memory (RAM) is used to store both program instructions and data; typically, both need to be present in memory in order for a program to execute. Often multiple programs will want access to memory, frequently demanding more memory than the computer has available; the kernel is responsible for deciding which memory each process can use. Input/output (I/O) devices include such peripherals as keyboards, mice, disk drives, printers, network adapters, and display devices; the kernel allocates requests from applications to perform I/O to an appropriate device. Key aspects necessary in resource management are the definition of an execution domain and the protection mechanism used to mediate access to the resources within a domain. Kernels also usually provide methods for synchronization and communication between processes, called inter-process communication (IPC). Finally, a kernel must provide running programs with a method to make requests to access these facilities.

The kernel has full access to the system's memory and must allow processes to safely access this memory as they require it. Often the first step in doing this is virtual addressing, usually achieved by paging and/or segmentation. Virtual addressing allows the kernel to make a given physical address appear to be another address, the virtual address. This allows every program to behave as if it is the only one running.
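A system call is the mechanism a user-space process uses to ask the kernel to do privileged work on its behalf. As a rough illustration, assuming a Unix-like system (the file path is only an example), the Python sketch below reads a file through the low-level open/read/close system-call wrappers rather than through buffered library I/O:

```python
import os

# Each of these calls crosses from user space into the kernel, which performs
# the privileged work (talking to device drivers, the page cache, the disk)
# and returns the result to the process.
fd = os.open("/etc/hostname", os.O_RDONLY)   # open(2) system call
data = os.read(fd, 4096)                     # read(2) system call
os.close(fd)                                 # close(2) system call

print(data.decode())
```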
7.
Monolithic kernel
–
A monolithic kernel is an operating system architecture in which the entire operating system works in kernel space and runs alone in supervisor mode. The monolithic model differs from other operating system architectures in that it defines a high-level virtual interface over computer hardware. A set of primitives or system calls implements all operating system services, such as process management and concurrency. Device drivers can be added to the kernel as modules; this modularity of the operating system is at the binary level and not at the architecture level. Modular monolithic operating systems are not to be confused with the level of modularity inherent in server-client operating systems, which use microkernels. Practically speaking, dynamically loading modules is simply a more flexible way of handling the operating system image at runtime, as opposed to rebooting with a different operating system image; the modules allow easy extension of the operating system's capabilities as required. Dynamically loadable modules incur a small overhead when compared to building the module into the system image; namely, an unloaded module need not be stored in random access memory.
8.
Software license
–
A software license is a legal instrument governing the use or redistribution of software. Under United States copyright law, all software is copyright protected, in source code as well as object code form; the only exception is software in the public domain. Most distributed software can be categorized according to its license type. Two common categories for software under copyright law, and therefore with licenses which grant the licensee specific rights, are proprietary software and free and open-source software. Unlicensed software outside copyright protection is either public-domain software or software which is non-distributed, non-licensed, and handled as an internal business trade secret. Contrary to popular belief, distributed unlicensed software is copyright protected; examples of this are unauthorized software leaks or software projects which are placed on public software repositories like GitHub without a specified license. As voluntarily handing software into the public domain is problematic in some international law domains, there are also licenses granting PD-like rights.

The owner of a copy of software is legally entitled to use that copy. Hence, if the end-user of software is the owner of the respective copy, the end-user may legally use the software, as many proprietary licenses only enumerate the rights that the user already has under 17 U.S.C. § 117, and yet proclaim to take those rights away from the user. Proprietary software licenses often proclaim to give software publishers more control over the way their software is used by keeping ownership of each copy of the software with the software publisher. The form of the relationship determines whether it is a lease or a purchase, as in, for example, UMG v. Augusto or Vernor v. Autodesk. The ownership of goods like software applications and video games is challenged by licensed, rather than sold, distribution models; the Swiss-based company UsedSoft innovated the resale of business software. This feature of proprietary software licenses means that certain rights regarding the software are reserved by the software publisher. Therefore, it is typical of EULAs to include terms which define the permitted uses of the software. The most significant effect of this form of licensing is that, if ownership of the software remains with the software publisher, then the end-user must accept the software license; in other words, without acceptance of the license, the end-user may not use the software at all. One example of such a proprietary software license is the license for Microsoft Windows.

The most common licensing models are per single user or per user in the appropriate volume-discount level. Licensing per concurrent/floating user also occurs, where all users in a network have access to the program, but only a specific number at the same time. Another license model is licensing per dongle, which allows the owner of the dongle to use the program on any computer. Licensing per server, CPU, or points, regardless of the number of users, is common practice, as are site or company licenses.
9.
System 1
–
System 1, originally named Macintosh System Software, was the first Apple Macintosh operating system and the beginning of the classic Mac OS series. It ran on the Motorola 68000 microprocessor. System 1 was released on January 24, 1984, along with the original Macintosh, the first in the Macintosh family of personal computers. It received one update, System 1.1, on May 5, 1984. The features of the operating system included the Finder and the menu bar; in addition, it popularized the graphical user interface and the desktop metaphor. Items in the Trash were permanently deleted when the computer was shut down or an application was loaded.

The menu bar was a new and revolutionary part of the OS. Similar to the one found in the Lisa OS, the Macintosh menu bar had five basic headers when on the desktop: the Apple menu, File, Edit, View, and Special. When in an application, the menus would change to better fit that application's uses. While within the Finder, the Apple menu contained the "About the Finder" information, File had drop-downs such as Open, Eject, and Close, and Edit had drop-downs for cutting, copying, and pasting. Special was responsible for managing the hardware and other system functions, and was always the rightmost entry on the menu bar in the Finder; in System 1, the Special menu had items related to emptying the Trash and cleaning up the desktop.

System 1 came with multiple desk accessories (DAs), including an Alarm Clock, Calculator, Control Panel, Key Caps, Note Pad, and Puzzle. The difference between the desk accessories and a normal application was that multiple desk accessories could be run at once, as opposed to applications, of which only one could run at a time. Along with that, the desk accessories could run on top of an application. Alarm Clock — this DA could be used just like an alarm clock: the computer would beep and the menu bar would flash when the alarm's set time was reached. It could also be used as a way to change or set the time; when opened, it would show the time and date set on the computer. Calculator — a basic calculator capable of addition, subtraction, multiplication, and division, featuring the basic 10 buttons for input. Control Panel — the Control Panel was used to adjust some of the settings on the computer. What made it unique from other Mac OS control panels was the intended absence of any text; this was chosen to demonstrate the graphical user interface, with representation achieved by using symbols. It could be used to adjust settings such as volume, double-click speed, and mouse sensitivity. On the Macintosh 128K, Macintosh 512K, and the Macintosh Plus, the screen brightness was controlled by a mechanical adjustment wheel beneath the screen. Key Caps — a DA used to show the layout of the original Macintosh keyboard.
10.
System 7
–
System 7 is a single-user, graphical-user-interface-based operating system for Macintosh computers, and was part of the classic Mac OS line of operating systems. It was introduced on May 13, 1991, by Apple Computer. It succeeded System 6 and was the main Macintosh operating system until it was succeeded by Mac OS 8 in 1997. Features added with the System 7 release included virtual memory, personal file sharing, QuickTime, and QuickDraw 3D. "System 7" is often used generically to refer to all 7.x versions; with the release of version 7.6 in 1997, Apple officially renamed the operating system Mac OS. System 7 was developed for Macs that used the Motorola 680x0 line of processors, but was ported to the PowerPC after Apple adopted the new processor.

The development of the Macintosh System up to System 6 followed a smooth progression with the addition of new features. Major additions were fairly limited, notably Color QuickDraw and MultiFinder in System 6. Some perspective on the scope of the changes can be seen by examining the official system documentation, Inside Macintosh. This initially shipped in three volumes, adding another to describe the changes introduced with the Mac Plus, and another for the Mac II. These limited changes meant that the original Mac system remained largely as it was when initially introduced: the machine was geared towards a single user. However, many of the assumptions of this model were no longer appropriate. Most notable among these was the single-tasking model, the replacement of which had first been examined in 1986's Switcher. Running MultiFinder normally required a larger amount of RAM and a hard drive. While additions had been relatively limited, so had fixes to some of the underlying oddities of the system architecture. If the system were able to support multiple tasks, this one-off solution would no longer be needed; desk accessories could simply be small programs. Yet, as MultiFinder was still optional, such a step had not been taken. Numerous examples of this sort of problem could be found throughout the system. Finally, the adoption of hard drives and local area networks led to any number of new features being requested by users and developers. By the late 1980s, the list of new upgrades and suggested changes to the existing model was considerable.

In March 1988, shortly after the release of System 6, development of the ideas contained on the "blue" and "pink" cards was to proceed in parallel, and at first the two projects were known simply as Blue and Pink. Apple intended to have the Blue team release an updated version of the existing Macintosh operating system in the 1990–1991 timeframe. As Blue was aimed at relatively simple upgrades, the feature list reads to some degree as a follow-on to System 6. A new Sound Manager API, version 2.0, replaced the older ad hoc APIs; the new APIs featured significantly improved hardware abstraction as well as higher-quality playback. Although technically not a new feature for System 7, Sound Manager 2.0 was the first widespread implementation of this technology to reach most Mac users.
11.
Graphical user interface
–
GUIs were introduced in reaction to the perceived steep learning curve of command-line interfaces, which require commands to be typed on a computer keyboard. The actions in a GUI are usually performed through direct manipulation of the graphical elements. Beyond computers, GUIs are used in many handheld mobile devices such as MP3 players, portable media players, gaming devices, and smartphones, as well as smaller household, office, and industrial controls. Designing the visual composition and temporal behavior of a GUI is an important part of application programming in the area of human–computer interaction. Its goal is to enhance the efficiency and ease of use of the logical design of a stored program. Methods of user-centered design are used to ensure that the language introduced in the design is well tailored to the tasks. The visible graphical interface features of an application are sometimes referred to as chrome or GUI.

Typically, users interact with information by manipulating visual widgets that allow for interactions appropriate to the kind of data they hold. The widgets of an interface are selected to support the actions necessary to achieve the goals of users. A model–view–controller architecture allows a flexible structure in which the interface is independent of, and indirectly linked to, application functions; this allows users to select or design a different skin at will. Good user interface design relates more to users and less to system architecture. Large widgets, such as windows, usually provide a frame or container for the main presentation content, such as a web page; smaller ones usually act as user-input tools. A GUI may be designed for the requirements of a particular market as an application-specific graphical user interface. By the 1990s, cell phones and handheld game systems also employed application-specific touchscreen GUIs; newer automobiles use GUIs in their navigation systems and multimedia centers, or navigation/multimedia center combinations.

A GUI uses a combination of technologies and devices to provide a platform that users can interact with. A series of elements conforming to a visual language have evolved to represent information stored in computers; this makes it easier for people with few computer skills to work with and use computer software. The most common combination of such elements in GUIs is the windows, icons, menus, pointer (WIMP) paradigm, especially in personal computers. The WIMP style of interaction uses a virtual input device to represent the position of a pointing device, most often a mouse. Available commands are compiled together in menus, and actions are performed by making gestures with the pointing device. A window manager facilitates the interactions between windows, applications, and the windowing system; the windowing system handles hardware devices such as pointing devices and graphics hardware. Window managers and other software combine to simulate the desktop environment with varying degrees of realism. Smaller mobile devices such as personal digital assistants and smartphones typically use the WIMP elements with different unifying metaphors, due to constraints in space.
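To make the widget and event-loop ideas above concrete, here is a minimal sketch using Python's tkinter toolkit (chosen purely for illustration; it is not related to any of the systems described in this article). A window acts as the container widget, a label and a button act as smaller output and input widgets, and the event loop dispatches pointer events to them:

```python
import tkinter as tk

root = tk.Tk()                 # top-level window widget (the container)
root.title("Minimal GUI")

label = tk.Label(root, text="0 clicks")
label.pack()

count = 0
def on_click():
    global count
    count += 1
    label.config(text=f"{count} clicks")   # the view updates in response to user input

tk.Button(root, text="Click me", command=on_click).pack()

root.mainloop()                # event loop: waits for events and dispatches them to widgets
```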
12.
Operating system
–
An operating system (OS) is system software that manages computer hardware and software resources and provides common services for computer programs. All computer programs, excluding firmware, require an operating system to function. Operating systems are found on many devices that contain a computer, from cellular phones to supercomputers. The dominant desktop operating system is Microsoft Windows with a market share of around 83.3%; macOS by Apple Inc. is in second place, and the varieties of Linux are collectively in third place. Linux distributions are dominant in the server and supercomputing sectors. Other specialized classes of operating systems, such as embedded and real-time systems, exist for many applications.

A single-tasking system can run only one program at a time. Multi-tasking may be characterized in preemptive and co-operative types. In preemptive multitasking, the operating system slices the CPU time and dedicates a slot to each of the programs; Unix-like operating systems, e.g. Solaris and Linux, support preemptive multitasking. Cooperative multitasking is achieved by relying on each process to provide time to the other processes in a defined manner: 16-bit versions of Microsoft Windows used cooperative multi-tasking, while 32-bit versions of both Windows NT and Win9x used preemptive multi-tasking. Single-user operating systems have no facilities to distinguish users, but may allow multiple programs to run in tandem. A distributed operating system manages a group of distinct computers and makes them appear to be a single computer. The development of networked computers that could be linked and communicate with each other gave rise to distributed computing; distributed computations are carried out on more than one machine. When computers in a group work in cooperation, they form a distributed system. The technique is used both in virtualization and cloud computing management, and is common in large server warehouses. Embedded operating systems are designed to be used in embedded computer systems; they are designed to operate on small machines like PDAs with less autonomy, and they are able to operate with a limited number of resources. They are very compact and extremely efficient by design; Windows CE and Minix 3 are some examples of embedded operating systems. A real-time operating system is an operating system that guarantees to process events or data by a specific moment in time. A real-time operating system may be single- or multi-tasking, but when multitasking it uses specialized scheduling algorithms so that deterministic behavior is achieved.

Early computers were built to perform a series of single tasks, like a calculator. Basic operating system features were developed in the 1950s, such as resident monitor functions that could run different programs in succession to speed up processing.
13.
Macintosh
–
The Macintosh (/ˈmækᵻntɒʃ/ MAK-in-tosh) is a series of personal computers designed, developed, and marketed by Apple Inc. Steve Jobs introduced the original Macintosh computer on January 24, 1984; this was the company's first mass-market personal computer featuring an integral graphical user interface and mouse. This first model was later renamed the Macintosh 128K for uniqueness among a populous family of subsequently updated models based on Apple's same proprietary architecture. Since 1998, Apple has largely phased out the Macintosh name in favor of "Mac". Macintosh systems still found success in education and desktop publishing and kept Apple as the second-largest PC manufacturer for the next decade. In the 1990s, improvements in the rival Wintel platform, notably with the introduction of Windows 3.0 and then Windows 95, gradually took market share from the more expensive Macintosh systems. The performance advantage of 68000-based Macintosh systems was eroded by Intel's Pentium, and even after the transition to the superior PowerPC-based Power Macintosh line in 1994, the falling prices of commodity PC components and the release of Windows 95 saw the Macintosh user base decline. In 1998, after the return of Steve Jobs, Apple consolidated its multiple consumer-level desktop models into the all-in-one iMac G3. Since their transition to Intel processors in 2006, the complete lineup has been entirely based on those processors and associated systems. The current lineup comprises three desktops and three laptops; the Xserve server was discontinued in 2011 in favor of the Mac Mini and Mac Pro. Apple also develops the operating system for the Mac, currently macOS version 10.12 Sierra. Macs are currently capable of running non-Apple operating systems such as Linux, OpenBSD, and Microsoft Windows with the aid of Boot Camp or third-party software. Apple does not license macOS for use on non-Apple computers, though it did license previous versions of the classic Mac OS through its Macintosh clone program from 1995 to 1997.

The Macintosh project was begun in 1979 by Jef Raskin, an Apple employee who envisioned an easy-to-use, low-cost computer for the average consumer. In 1978 Apple had begun to organize the Apple Lisa project, aiming to build a next-generation machine similar to an advanced Apple III or the yet-to-be-introduced IBM PC. In 1979, Steve Jobs learned of the work on graphical user interfaces taking place at Xerox PARC. He arranged a deal in which Xerox received Apple stock options in return for which Apple would license their designs. The basic layout of the Lisa was largely complete by 1982, at which point Jobs's continual suggestions for improvements led to him being kicked off the project. At the same time that the Lisa was becoming a GUI machine in 1979, the Macintosh design was for a low-cost, easy-to-use machine for the average consumer. Raskin was authorized to start hiring for the project in September 1979, and his initial team would eventually consist of himself, Howard, Joanna Hoffman, Burrell Smith, and Bud Tribble. Smith's design used fewer RAM chips than the Lisa, which made production of the board significantly more cost-efficient. Though there were no memory slots, its RAM was expandable to 512 kB by means of soldering in sixteen IC sockets to accept 256 kb RAM chips in place of the factory-installed chips. The final product's screen was a 9-inch, 512x342 pixel monochrome display. Burrell's innovative design, combining the low production cost of an Apple II with the computing power of the Lisa's Motorola 68000 CPU, began to receive Jobs's attention.
14.
Computer multitasking
–
In computing, multitasking is the concept of performing multiple tasks over a certain period of time by executing them concurrently. As a result, a computer executes segments of multiple tasks in an interleaved manner, while the tasks share common processing resources such as central processing units (CPUs). Multitasking does not necessarily mean that multiple tasks are executing at exactly the same time; even on multiprocessor or multicore computers, which have multiple CPUs/cores so more than one task can be executed at once, multitasking allows many more tasks to be run than there are CPUs. In the case of a computer with a single CPU, only one task is said to be running at any point in time; multitasking solves this problem by scheduling which task may be the one running at any given time, and when another waiting task gets a turn. The act of reassigning a CPU from one task to another one is called a context switch.

Multiprogramming systems are designed to maximize CPU usage. In time-sharing systems, the running task is required to relinquish the CPU, either voluntarily or by an external event such as a hardware interrupt; time-sharing systems are designed to allow several programs to execute apparently simultaneously. In real-time systems, some waiting tasks are guaranteed to be given the CPU when an event occurs; real-time systems are designed to control devices such as industrial robots. The term multitasking has become an international term, as the same word is used in many other languages such as German, Italian, Dutch, and Danish.

In the early days of computing, CPU time was expensive. When the computer ran a program that needed access to a peripheral, the central processing unit would have to stop executing program instructions while the peripheral processed the data. The first computer using a multiprogramming system was the British Leo III, owned by J. Lyons. During batch processing, several different programs were loaded in memory, and when the first program reached an instruction waiting for a peripheral, the context of this program was stored away and another program in memory was given a chance to run. The process continued until all programs finished running. Multiprogramming doesn't give any guarantee that a program will run in a timely manner; indeed, the very first program may very well run for hours without needing access to a peripheral. As there were no users waiting at an interactive terminal, this was no problem: users handed in a deck of punched cards to an operator. Multiprogramming greatly reduced wait times when multiple batches were being processed. The expression "time sharing" usually designated computers shared by interactive users at terminals, such as IBM's TSO and VM/CMS.
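As a toy illustration of the cooperative model described above (my own sketch, not tied to any particular operating system), each task below runs until it voluntarily yields control, and a simple round-robin scheduler decides which task runs next; reassigning the "CPU" to the next task is the context switch:

```python
from collections import deque

def task(name, steps):
    """A cooperative task: it does a little work, then yields the CPU back."""
    for i in range(steps):
        print(f"{name}: step {i}")
        yield  # voluntary yield point; a task that never yields would starve the others

def run(tasks):
    """Round-robin scheduler: a context switch is just moving to the next generator."""
    ready = deque(tasks)
    while ready:
        current = ready.popleft()
        try:
            next(current)           # let the task run until its next yield
            ready.append(current)   # still has work: put it back in the queue
        except StopIteration:
            pass                    # task finished

run([task("A", 3), task("B", 2), task("C", 1)])
```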
15.
Macro (computer science)
–
A macro in computer science is a rule or pattern that specifies how a certain input sequence should be mapped to a replacement output sequence according to a defined procedure. The mapping process that instantiates a macro use into a specific output sequence is known as macro expansion. A facility for writing macros may be provided as part of a software application or as part of a programming language. In the former case, macros are used to make tasks that use the application less repetitive; in the latter case, they are a tool that allows a programmer to enable code reuse or even to design domain-specific languages.

Macros are used to make a sequence of computing instructions available to the programmer as a single program statement, making the programming task less tedious. The term derives from "macro instruction", and such expansions were originally used in generating assembly language code. Keyboard macros and mouse macros allow short sequences of keystrokes and mouse actions to transform into other, usually more time-consuming, sequences of keystrokes and mouse actions; in this way, frequently used or repetitive sequences of keystrokes and mouse movements can be automated. Separate programs for creating these macros are called macro recorders. These programs were based on the TSR (terminate-and-stay-resident) mode of operation and applied to all keyboard input. Keyboard macros have in more recent times come to life as a method of exploiting the economy of massively multiplayer online role-playing games (MMORPGs): by tirelessly performing a boring, repetitive, but low-risk action, a player running a macro can accumulate in-game currency. This effect is even larger when a macro-using player operates multiple accounts simultaneously, or operates the accounts for a large amount of time each day. As this money is generated without human intervention, it can upset the economy of the game; for this reason, use of macros is a violation of the TOS or EULA of most MMORPGs.

Keyboard and mouse macros that are created using an application's built-in macro features are sometimes called application macros. They are created by carrying out the sequence once and letting the application record the actions. An underlying macro programming language, most commonly a scripting language, with direct access to the features of the application, may also exist. The programmers' text editor Emacs follows this idea to a conclusion; in effect, most of the editor is made of macros. Emacs was originally devised as a set of macros in the editing language TECO; it was later ported to dialects of Lisp. Another programmers' text editor, Vim, also has a full implementation of macros: it can record into a register what a person types on the keyboard, and the recording can be replayed or edited, just like VBA macros for Microsoft Office. Vim also has a scripting language called Vimscript for creating macros. Visual Basic for Applications (VBA) is a programming language included in Microsoft Office up to Office 2013; its function has evolved from, and replaced, the macro languages that were originally included in some of these applications.
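As a minimal sketch of macro expansion in the sense defined above, a rule mapping an input pattern to a replacement output sequence, here is my own illustration in Python; the SQUARE and MAX rules are hypothetical, loosely modeled on C-preprocessor-style function macros:

```python
import re

# Each macro is a rule: an input pattern plus a template for the replacement text.
MACROS = {
    "SQUARE": (re.compile(r"SQUARE\((\w+)\)"), r"((\1) * (\1))"),
    "MAX":    (re.compile(r"MAX\((\w+),\s*(\w+)\)"), r"((\1) > (\2) ? (\1) : (\2))"),
}

def expand(source):
    """Expand every macro use in the source text into its replacement sequence."""
    for pattern, template in MACROS.values():
        source = pattern.sub(template, source)
    return source

print(expand("y = SQUARE(x) + MAX(a, b);"))
# -> y = ((x) * (x)) + ((a) > (b) ? (a) : (b));
```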
16.
Tape recorder
–
In its present-day form, the tape recorder records a fluctuating signal by moving the tape across a tape head that polarizes the magnetic domains in the tape in proportion to the audio signal. Tape-recording devices include the reel-to-reel tape deck and the cassette deck. The use of magnetic tape for sound recording originated around 1930. Magnetizable tape revolutionized both the radio broadcast and music recording industries: it gave artists and producers the power to record and re-record audio with minimal loss in quality, as well as to edit and rearrange recordings with ease. The alternative recording technologies of the era were transcription discs and wire recorders. Since some early refinements improved the fidelity of the reproduced sound, magnetic tape has been the highest quality analog sound recording medium available. As of the first decade of the 21st century, analog magnetic tape has been replaced by digital recording technologies for consumer purposes.

Some individuals and organizations developed innovative uses for magnetic wire recorders while others investigated variations of the technology. One particularly important variation was the application of an oxide powder to a long strip of paper; this German invention was the start of a string of innovations that have led to present-day magnetic tape recordings.

The earliest known audio tape recorder was a non-magnetic, non-electric version invented by Alexander Graham Bell's Volta Laboratory and patented in 1886. It employed a 3⁄16-inch-wide strip of wax-covered paper that was coated by dipping it in a solution of beeswax and paraffin and then had one side scraped clean. The machine was of sturdy wood and metal construction and hand-powered by means of a knob fastened to the flywheel. The tape was taken up on the other reel. The sharp recording stylus, actuated by a vibrating mica diaphragm, cut into the wax coating as the tape passed; in playback mode, a dull, loosely mounted stylus, attached to a rubber diaphragm, carried the reproduced sounds through an ear tube to its listener. Both recording and playback heads, mounted alternately on the two posts, could be adjusted vertically so that several recordings could be cut on the same 3⁄16-inch-wide strip. While the machine was never developed commercially, it was an ancestor of the modern magnetic tape recorder, which it somewhat resembled in design. The tapes and machine created by Bell's associates, examined at one of the Smithsonian Institution's museums, became brittle, and the machine's playback head was also missing; otherwise, with some reconditioning, they could be placed into working condition.

In a later paper-tape system, during the recording process the tape moved through a pair of electrodes which immediately imprinted the modulated sound signals as visible black stripes onto the paper tape's surface. On 13 August 1931, Duston filed USPTO Patent Application #556,743 for "Method Of And Apparatus For Electrically Recording And Reproducing Sound And Other Vibrations", which was renewed in 1934.

Magnetic recording was conceived as early as 1877 by the American engineer Oberlin Smith. Analog magnetic wire recording, and its successor, magnetic tape recording, involve the use of a magnetizable medium which moves at a constant speed past a recording head. An electrical signal, which is analogous to the sound that is to be recorded, is fed to the recording head, inducing a pattern of magnetization similar to the signal.
17.
Microsoft
–
Microsoft's best known software products are the Microsoft Windows line of operating systems, the Microsoft Office office suite, and the Internet Explorer and Edge web browsers. Its flagship hardware products are the Xbox video game consoles and the Microsoft Surface tablet lineup. As of 2016, it was the world's largest software maker by revenue, and one of the world's most valuable companies. Microsoft was founded by Paul Allen and Bill Gates on April 4, 1975, to develop and sell BASIC interpreters for the Altair 8800. It rose to dominate the personal computer operating system market with MS-DOS in the mid-1980s, followed by Microsoft Windows. The company's 1986 initial public offering, and subsequent rise in its share price, created substantial wealth among Microsoft employees. Since the 1990s, it has increasingly diversified from the operating system market and has made a number of corporate acquisitions. In May 2011, Microsoft acquired Skype Technologies for $8.5 billion, and in June 2012, Microsoft entered the personal computer production market for the first time with the launch of the Microsoft Surface, a line of tablet computers.

The word "Microsoft" is a portmanteau of "microcomputer" and "software". Paul Allen and Bill Gates, childhood friends with a passion for computer programming, sought to make a successful business utilizing their shared skills. In 1972 they founded their first company, named Traf-O-Data, which offered a computer that tracked and analyzed automobile traffic data. Allen went on to pursue a degree in computer science at Washington State University. The January 1975 issue of Popular Electronics featured Micro Instrumentation and Telemetry Systems's (MITS) Altair 8800 microcomputer, and Allen suggested that they could program a BASIC interpreter for the device. After a call from Gates claiming to have a working interpreter, MITS requested a demonstration. Since they didn't actually have one, Allen worked on a simulator for the Altair while Gates developed the interpreter. They officially established Microsoft on April 4, 1975, with Gates as the CEO, and Allen came up with the name "Micro-Soft", as recounted in a 1995 Fortune magazine article. In August 1977 the company formed an agreement with ASCII Magazine in Japan, resulting in its first international office, and the company moved to a new home in Bellevue, Washington in January 1979.

Microsoft entered the OS business in 1980 with its own version of Unix, called Xenix; however, it was MS-DOS that solidified the company's dominance. For its deal with IBM, Microsoft purchased a CP/M clone called 86-DOS from Seattle Computer Products, branding it as MS-DOS. Following the release of the IBM PC in August 1981, Microsoft retained ownership of MS-DOS. Since IBM copyrighted the IBM PC BIOS, other companies had to reverse engineer it in order for non-IBM hardware to run as IBM PC compatibles. Due to various factors, such as MS-DOS's available software selection, Microsoft's operating system eventually became dominant. The company expanded into new markets with the release of the Microsoft Mouse in 1983, as well as with a publishing division named Microsoft Press. Paul Allen resigned from Microsoft in 1983 after developing Hodgkin's disease. While jointly developing a new OS with IBM in 1984, OS/2, Microsoft released Microsoft Windows, a graphical extension for MS-DOS, on November 20, 1985. Once Microsoft informed IBM of Windows NT, the OS/2 partnership deteriorated. In 1990, Microsoft introduced its office suite, Microsoft Office.
18.
AppleScript
–
AppleScript is a scripting language created by Apple Inc. and built into the classic Mac OS since System 7 and into all versions of macOS. The term "AppleScript" may refer to the scripting system itself or to an individual script written in the AppleScript language. AppleScript is primarily a scripting language developed by Apple to do inter-application communication using Apple Events. AppleScript is related to, but different from, Apple Events: Apple Events are designed to exchange data between and control other applications in order to automate repetitive tasks. AppleScript has some limited processing abilities of its own, in addition to sending and receiving Apple Events to applications: it can do basic calculations and complex text processing, and it is extensible, allowing the use of scripting additions that add new functions to the language. Mainly, however, AppleScript relies on the functionality of applications. As a structured command language, AppleScript can be compared to Unix shells, the Microsoft Windows Script Host, or IBM REXX in its functionality, but it is distinct from all three. Essential to its functionality is the fact that Macintosh applications publish dictionaries of addressable objects.

The AppleScript project was an attempt to consolidate a proliferation of scripting languages created and maintained by different groups and products at Apple; HyperTalk, for example, could be used by novices to program a HyperCard stack. AppleScript was released in October 1993 as part of System 7.1.1. QuarkXPress was one of the first major software applications that supported AppleScript, and this in turn led to AppleScript being widely adopted within the publishing and prepress world, often tying together complex workflows. This was a key factor in retaining the Macintosh's dominant position in publishing and prepress, even after QuarkXPress and other publishing applications were ported to Microsoft Windows. After some uncertainty about the future of AppleScript on Apple's next-generation OS, it was carried forward into Mac OS X. Cocoa applications allow application developers to implement basic scriptability for their apps with minimal effort, broadening the number of applications that are directly scriptable. At the same time, the shift to Unix underpinnings allowed AppleScript to interact with command-line tools. AppleScript Studio, released with Mac OS X 10.2 as part of Xcode, and later the AppleScriptObjC framework, released in Mac OS X 10.6, allow users to build native Cocoa applications using AppleScript.

In October 2016, longtime AppleScript product manager and automation evangelist Sal Soghoian left Apple when his position was eliminated for business reasons. Veterans in the Mac community generally responded with concern, questioning Apple's commitment to the developer community and pro users. Apple senior vice president of Software Engineering Craig Federighi responded to an email, saying that "We have every intent to continue our support for the great automation technologies in macOS," though The Mac Observer felt it did little to assuage skepticism about the future of Apple automation in general. For the time being, AppleScript remains one component of macOS automation technologies, along with Services, Automator, and shell scripting.

AppleScript was designed to be used as an accessible end-user scripting language, offering users an intelligent mechanism to control applications. AppleScript uses Apple Events, a set of standardized data formats that the Macintosh operating system uses to send information to applications. Apple Events allow a script to work with multiple applications simultaneously. For example, an AppleScript to create a simple web gallery might open a photo in a photo-editing application, tell the application to manipulate the image, and then tell it to save the modified image in a file in a different folder.
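As a small hedged sketch of driving an application through Apple Events (assuming a macOS system; the Finder query is only an example, not from the article), the snippet below hands a one-line AppleScript to the standard osascript command-line runner:

```python
import subprocess

# A tiny AppleScript that sends an Apple Event to the Finder asking for the
# names of the items on the desktop. The osascript tool (shipped with macOS)
# compiles and runs the script and prints the result.
script = 'tell application "Finder" to get the name of every item of the desktop'

result = subprocess.run(
    ["osascript", "-e", script],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())   # e.g. "Report.pdf, Screenshots, notes.txt"
```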
19.
Andy Hertzfeld
–
Andy Hertzfeld is an American computer scientist and inventor who was a member of the original Apple Macintosh development team during the 1980s. After buying an Apple II in January 1978, he went to work for Apple Computer from August 1979 until March 1984. Since leaving Apple, he has co-founded three companies: Radius in 1986, General Magic in 1990, and Eazel in 1999. In 2002, he helped Mitch Kapor promote open source software with the Open Source Applications Foundation. Hertzfeld joined Google in 2005, and in 2011 he was the key designer of the Circles user interface in Google+.

After graduating from Brown University with a Computer Science degree in 1975, Hertzfeld attended graduate school at the University of California. In 1978, he bought an Apple II computer and soon began developing software for it. In the early 1980s, he invited his school friend, artist Susan Kare, to join Apple. Hertzfeld was a member of the Apple Macintosh design team: after a shakeup in the Apple II team, and at Hertzfeld's request, Apple co-founder Steve Jobs added him to the nearly two-year-old Macintosh team in February 1981. Hertzfeld's business card at Apple listed his title as "Software Wizard". Since leaving Apple in 1984, Hertzfeld has co-founded three new companies: Radius, General Magic, and Eazel. At Eazel, he helped to create the Nautilus file manager for Linux's GNOME desktop. He volunteered for the Open Source Applications Foundation in 2002 and 2003, writing early prototypes of Chandler, their information manager. In 1996, Hertzfeld was interviewed by Robert Cringely for the television documentary Triumph of the Nerds. In early 2004, he started folklore.org, a web site devoted to collective storytelling that contains dozens of anecdotes about the development of the original Macintosh; the stories have been collected in an O'Reilly book, Revolution in the Valley. In August 2005, Hertzfeld joined Google. On June 28, 2011, Google announced Google+, its latest attempt at social networking; Hertzfeld was the key designer of the Google+ Circles component user interface, but not of the entire project, as has been mistakenly claimed. Hertzfeld was portrayed by Elden Henson in the 2013 film Jobs. He retired from Google in 2013.
20.
Random-access memory
–
Random-access memory (RAM) is a form of computer data storage which stores frequently used program instructions to increase the general speed of a system. A random-access memory device allows data items to be read or written in almost the same amount of time irrespective of the physical location of the data inside the memory. RAM contains multiplexing and demultiplexing circuitry to connect the data lines to the addressed storage for reading or writing the entry; usually more than one bit of storage is accessed by the same address. In today's technology, random-access memory takes the form of integrated circuits. RAM is normally associated with volatile types of memory, where stored information is lost if power is removed. Other types of non-volatile memory exist that allow random access for read operations; these include most types of ROM and a type of flash memory called NOR-Flash.

Integrated-circuit RAM chips came onto the market in the early 1970s, beginning with the first commercially available DRAM chip. Early computers used relays, mechanical counters, or delay lines for main memory functions. Ultrasonic delay lines could only reproduce data in the order it was written. Drum memory could be expanded at relatively low cost, but efficient retrieval of memory items required knowledge of the physical layout of the drum to optimize speed. Latches built out of vacuum tube triodes, and later out of transistors, were used for smaller and faster memories such as registers; such registers were relatively large and too costly to use for large amounts of data. The first practical form of random-access memory was the Williams tube, starting in 1947. It stored data as electrically charged spots on the face of a cathode ray tube; since the electron beam of the CRT could read and write the spots on the tube in any order, memory was random access. The capacity of the Williams tube was a few hundred to around a thousand bits, but it was smaller and faster than memory built from individual vacuum tube latches. In fact, rather than the Williams tube memory being designed for the SSEM, the SSEM served as a testbed to demonstrate the reliability of the memory. Magnetic-core memory was invented in 1947 and developed up until the mid-1970s. It became a widespread form of random-access memory, relying on an array of magnetized rings; by changing the sense of each ring's magnetization, data could be stored with one bit per ring. Since every ring had a combination of address wires to select and read or write it, access to any memory location in any sequence was possible. Magnetic core memory was the dominant form of memory system until displaced by solid-state memory in integrated circuits. Data was stored in the tiny capacitance of each transistor and had to be periodically refreshed every few milliseconds before the charge could leak away.
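To make the idea of addressed storage concrete, here is a toy model (my own sketch, not how any real chip is organized) in which an address is split into row and column fields, mimicking the row/column demultiplexing a RAM device performs, and in which any address takes the same number of steps to reach:

```python
class ToyRAM:
    """A 256-byte memory addressed as a 16x16 grid of cells."""

    def __init__(self):
        self.cells = [[0] * 16 for _ in range(16)]

    def _decode(self, address):
        # "Demultiplexing": the high 4 bits select the row line,
        # the low 4 bits select the column line.
        return (address >> 4) & 0xF, address & 0xF

    def write(self, address, value):
        row, col = self._decode(address)
        self.cells[row][col] = value & 0xFF

    def read(self, address):
        # Random access: every address is reached by the same decode step,
        # unlike a delay line or drum that must wait for data to come around.
        row, col = self._decode(address)
        return self.cells[row][col]

ram = ToyRAM()
ram.write(0xA7, 42)
print(ram.read(0xA7))   # -> 42
```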
21.
ImageWriter
–
The ImageWriter is a product line of dot matrix printers formerly manufactured by Apple Computer and designed to be compatible with their entire line of computers. There were three different models introduced over time, and they were popular among Apple II and Macintosh owners. The first ImageWriter is a serial-based dot matrix printer introduced by Apple Computer in late 1983; the printer was essentially a re-packaged 9-pin dot matrix printer from C. Itoh Electronics, released the same year. It was introduced as a replacement for the earlier parallel-based Apple Dot Matrix Printer (DMP) and, while intended for the Apple II, was also adopted across Apple's other machines. The ImageWriter could produce images as well as text, up to a resolution of 144 DPI and a speed of about 120 CPS. In text mode, the printer was logic-seeking, meaning it would print with the print head moving in both directions, while it would print only in one direction for graphics and Near Letter Quality. The ImageWriter was also supported by the original Macintosh computer, the Macintosh 128K; Apple wanted a graphical printer for the Mac, and had introduced the ImageWriter primarily to support the new machine. This permitted it to produce WYSIWYG output from the screen of the computer. The ImageWriter could be supported by Microsoft Windows-based PCs by using the included C. Itoh 8510 compatible driver. The ImageWriter was succeeded by the ImageWriter II in late 1985. A wider version of the ImageWriter, sold as the ImageWriter 15, was introduced in January 1984. It allowed printing on 12-inch-wide as well as 15-inch-wide paper, and this version of the ImageWriter remained in production for more than a year after the ImageWriter II was introduced. Production was eventually discontinued in January 1987. In 1984 Thunderware introduced the ThunderScan, an optical scanner that was installed in place of the ImageWriter ribbon cartridge. With support for the Apple II and the Mac, the ThunderScan provided low-cost grayscale scanning with moderate resolution and speed. The ImageWriter II is a serial-based dot matrix printer that was manufactured by Apple Computer. It had several optional add-ons available, including a plug-in network card, a buffer memory card, and a motorized sheet feeder. It also supported color printing with an appropriate ribbon, and it was particularly well known for being extremely sturdy: ImageWriter IIs were still in common use for forms printing a decade after they were produced. Compute! reported in 1989, however, that some users believed the ImageWriter II was inferior to its predecessor; the magazine stated that the first ImageWriter was sturdier and handled paper better. The ImageWriter II, like its predecessor, used a 9-pin C. Itoh mechanism. However, the ImageWriter II was significantly faster in draft mode. Color images and text could be produced by using a color ribbon, and eight colors were supported by the original version of QuickDraw on the Macintosh despite running on a monochrome platform. On the Apple II, complex full-color graphics could be printed; used with the Apple IIGS, the ImageWriter II could produce color photographs with hundreds of simulated colors.
22.
PostScript
–
PostScript is a page description language used in the electronic publishing and desktop publishing business. It is a typed, concatenative programming language and was created at Adobe Systems by John Warnock, Charles Geschke, Doug Brotz, Ed Taft and Bill Paxton. The concepts of the PostScript language were seeded in 1976 when John Warnock was working at Evans & Sutherland; at that time Warnock was developing an interpreter for a large three-dimensional graphics database of New York harbor. Warnock conceived the Design System language to process the graphics. Concurrently, researchers at Xerox PARC had developed the first laser printer and had recognized the need for a standard means of defining page images. In 1975-76 Bob Sproull and William Newman developed the Press format, but Press, a data format rather than a language, lacked flexibility, and PARC mounted the Interpress effort to create a successor. In 1978 Evans & Sutherland asked Warnock to move from the San Francisco Bay Area to their headquarters in Utah; instead, he joined Xerox PARC to work with Martin Newell, and they rewrote Design System to create J & M, which was used for VLSI design and the investigation of type and graphics printing. This work later evolved and expanded into the Interpress language. Warnock left with Chuck Geschke and founded Adobe Systems in December 1982. They, together with Doug Brotz, Ed Taft and Bill Paxton, created a language similar to Interpress, called PostScript. At about this time they were visited by Steve Jobs, who urged them to adapt PostScript to be used as the language for driving laser printers. In March 1985, the Apple LaserWriter was the first printer to ship with PostScript; the combination of technical merits and widespread availability made PostScript a language of choice for graphical output for printing applications. For a time an interpreter for the PostScript language was a common component of laser printers. However, the cost of implementation was high: computers output raw PS code that would be interpreted by the printer into an image at the printer's natural resolution, and this required high-performance microprocessors and ample memory. The LaserWriter used a 12 MHz Motorola 68000, making it faster than any of the Macintosh computers to which it attached. When the laser printer engines themselves cost over a thousand dollars, the added cost of PS was marginal. The first version of the PostScript language was released to the market in 1984; the term Level 1 was added when Level 2 was introduced. PostScript 3 came at the end of 1997 and, along with many new dictionary-based versions of older operators, introduced better color handling and new filters. Prior to the introduction of PostScript, printers were designed to print character output given the text, typically in ASCII, as input. This changed to some degree with the increasing popularity of dot matrix printers: the characters on these systems were drawn as a series of dots, and dot matrix printers also introduced the ability to print raster graphics.
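To make the idea of a page description language concrete, here is a rough sketch in Python that writes a minimal PostScript program to a file. The operators used inside the string (findfont, scalefont, setfont, moveto, show, showpage) are standard PostScript, but the file name, font choice and page coordinates are arbitrary assumptions for illustration; a PostScript interpreter, for example inside a printer, would execute these stack-based operators to render the page.

    # Sketch: emit a minimal PostScript page description from Python.
    minimal_ps = """%!PS
    /Helvetica findfont 24 scalefont setfont  % choose and scale a font
    72 720 moveto                             % 1 inch from the left, near the top
    (Hello from PostScript) show              % paint the string at the current point
    showpage                                  % emit the finished page
    """

    with open("hello.ps", "w") as f:          # "hello.ps" is a hypothetical name
        f.write(minimal_ps)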
23.
Laser printing
–
Laser printing is an electrostatic digital printing process. It produces high-quality text and graphics by repeatedly passing a laser beam back and forth over a negatively charged cylinder called a drum to define a differentially charged image; the drum then selectively collects electrically charged powdered ink (toner) and transfers the image to paper, which is then heated in order to permanently fuse the text and/or imagery. As with digital photocopiers, laser printers employ a xerographic printing process; however, laser printing differs from analog photocopiers in that the image is produced by the direct scanning of the medium across the printer's photoreceptor. This enables laser printing to copy images more quickly than most photocopiers. Invented at Xerox PARC in the 1970s, laser printers were introduced for the office and then home markets in subsequent years by IBM, Canon, Xerox, Apple, Hewlett-Packard and many others. Over the decades, quality and speed have increased as price has fallen. In the 1960s, the Xerox Corporation held a dominant position in the photocopier market. In 1969, Gary Starkweather, who worked in Xerox's product development department, had the idea of using a laser beam to draw an image of what was to be copied directly onto the copier drum. After transferring to the recently formed Palo Alto Research Center in 1971, Starkweather continued this work, and in 1972 he worked with Butler Lampson and Ronald Rider to add a control system and character generator, resulting in a printer called EARS, which later became the Xerox 9700 laser printer. The first commercial implementation of a laser printer was the IBM 3800 in 1976. It was designed for data centers, where it replaced line printers attached to mainframe computers. The IBM 3800 was used for high-volume printing on continuous stationery, and achieved speeds of 215 pages per minute at a resolution of 240 dots per inch. Over 8,000 of these printers were sold. The Xerox 9700 was brought to market in 1977. Unlike the IBM 3800, the Xerox 9700 was not targeted at replacing any particular existing printer; it excelled at printing high-value documents on paper with varying content. In 1979, inspired by the Xerox 9700's commercial success, the Japanese camera and optics company Canon developed a low-cost, desktop laser printer. Canon then began work on a much-improved print engine, the Canon CX, resulting in the LBP-CX printer. Lacking experience in selling to computer users, Canon sought partnerships with three Silicon Valley companies: Diablo Data Systems, Hewlett-Packard, and Apple Computer. The first laser printer designed for office use reached market in 1981: the Xerox Star 8010. The system used a desktop metaphor, but although it was innovative, the Star workstation was an expensive system, affordable only to a fraction of the businesses it was aimed at. The first laser printer intended for mass-market sales was the HP LaserJet, released in 1984; it used the Canon CX engine. The LaserJet was quickly followed by printers from Brother Industries, IBM, and others. First-generation machines had large photosensitive drums, of circumference greater than the length of the loaded paper.
24.
AppleTalk
–
AppleTalk was a proprietary suite of networking protocols developed by Apple Inc. for their Macintosh computers. AppleTalk includes a number of features that allow local area networks to be connected with no setup or the need for a centralized router or server of any sort. Connected AppleTalk-equipped systems automatically assign addresses and update the distributed namespace. AppleTalk was released in 1985 and was the primary protocol used by Apple devices through the 1980s and 1990s. Versions were also released for the IBM PC and compatibles and the Apple IIGS. AppleTalk support was also available in most networked printers, some file servers, and a number of routers. The rise of TCP/IP during the 1990s led to a reimplementation of most of these types of support on that protocol; many of AppleTalk's more advanced autoconfiguration features have since been introduced in Bonjour, while Universal Plug and Play serves similar needs. After the release of the Apple Lisa computer in January 1983, Apple began work on a networking system known as AppleNet. It was based on the seminal Xerox XNS protocol stack but ran on a custom 1 Mbit/s coaxial cable system rather than Xerox's 2.94 Mbit/s Ethernet. AppleNet was announced early in 1983 with an introduction at the target price of $500 for plug-in AppleNet cards for the Lisa. At that time, early LAN systems were just coming to market, including Ethernet and Token Ring; networking was a topic of major commercial effort, dominating shows like the National Computer Conference in Anaheim in May 1983. All of the systems were jockeying for position in the market, and it was at this show that Steve Jobs asked Gursharan Sidhu a seemingly innocuous question: why has networking not caught on? Four months later, in October, AppleNet was cancelled. At the time, Apple announced that it had built and used AppleNet in-house but had realized that it was not in the business of creating a networking system. In January, Jobs announced that they would instead be supporting IBM's Token Ring. Through this period, Apple was deep in development of the Macintosh computer. During development, engineers had made the decision to use the Zilog 8530 serial controller chip (SCC) instead of the lower-cost and more common UART to provide serial port connections. The SCC cost about $5 more than a UART, but offered much higher speeds of up to 250 kilobits per second; the SCC was also chosen because it would allow multiple devices to be attached to the port. Peripherals equipped with similar SCCs could communicate using the built-in protocols, and this would eliminate the need for more ports on the back of the machine and allowed for the elimination of expansion slots for supporting more complex devices. The initial concept was known as AppleBus, envisioning a system controlled by the host Macintosh polling dumb devices in a fashion similar to the modern Universal Serial Bus. The Macintosh team had also begun work on what would become the LaserWriter. A series of memos from Bob Belleville clarified these concepts, outlining the Mac, the LaserWriter and a file server. By late 1983 it was clear that IBM's Token Ring would not be ready in time for the launch of the Mac, and might miss the launch of these other products as well.
25.
Macintosh II
–
The Apple Macintosh II is the first personal computer model of the Macintosh II series in the Apple Macintosh line, and the first Macintosh to support a color display. A basic system with a 20 MB drive and monitor cost US$5,498; with a 13-inch color monitor and an 8-bit display card the price was around US$7,145. This price placed it in competition with workstations from Silicon Graphics and Sun Microsystems. The Macintosh II was designed by hardware engineers Michael Dhuey and Brian Berkeley and industrial designer Hartmut Esslinger. Two common criticisms of the Macintosh from its introduction in 1984 were the closed architecture and the lack of color. Initially referred to as the Little Big Mac, it was codenamed Milwaukee after Dhuey's hometown; after Jobs was fired from Apple in September 1985, the project could proceed openly. All previous Macintosh computers used an all-in-one design with a built-in black-and-white CRT; the Macintosh II instead used a modular case with internal drive bays for a hard disk and floppy drives. It, along with the Macintosh SE, was the first Macintosh computer to use the Apple Desktop Bus, introduced with the Apple IIGS, for its keyboard and mouse interface. The primary improvement in the Mac II was Color QuickDraw in ROM; among the many innovations in Color QuickDraw were an ability to handle any display size, up to 8-bit color depth, and multiple monitors. Because Color QuickDraw was included in the Mac II's ROM and relied on new 68020 instructions, it could not be used on earlier 68000-based Macintosh models. The Mac II featured a Motorola 68020 processor operating at 16 MHz teamed with a Motorola 68881 floating point unit. The machine shipped with a socket for an MMU, but the Apple HMMU chip that was installed did not implement virtual memory. Standard memory was 1 megabyte, expandable to 8 MB. The Mac II had eight 30-pin SIMM slots, and memory was installed in groups of four. A 5.25-inch 40 MB internal SCSI hard disk was optional, as was a second internal 800-kilobyte 3.5-inch floppy disk drive. Six NuBus slots were available for expansion, and it is possible to connect as many as six displays to a Macintosh II by filling all of the NuBus slots with graphics cards. Another option for expansion was the Mac286 card, which included an Intel 80286 chip. The original ROMs in the Macintosh II contained a bug which prevented the system from recognizing more than one megabyte of memory address space on a NuBus card. Every Macintosh II manufactured up until about November 1987 had this defect; this happened because the Slot Manager was not 32-bit clean. Apple offered a well-publicized recall of the faulty ROMs and released a program to test whether a particular Macintosh II had the defect; as a result, it is rare to find a Macintosh II with the original ROMs. The Macintosh II and Macintosh SE were the first Apple computers since the Apple I to be sold without a keyboard; instead the customer was offered the choice of the new ADB Apple Keyboard or the Apple Extended Keyboard as a separate purchase. Dealers could bundle a third-party keyboard or attempt to upsell a customer to the more expensive Extended Keyboard. The Macintosh II was followed by a series of related models including the Macintosh IIx and Macintosh IIfx, all of which used the Motorola 68030 processor.
26.
Radius (hardware company)
–
Radius was an American computer hardware firm founded in May 1986 by Burrell Smith, Andy Hertzfeld, Mike Boich, Matt Carter, Alain Rossmann and other members of the original Mac team. The company specialized in Macintosh peripherals and accessory equipment, and it completed its IPO in 1990. The first Radius product was the Radius Full Page Display, the first large-screen display available for any personal computer. First available for the Macintosh Plus, it pioneered the concept of putting multiple screens in a single coordinate space, allowing users to drag windows between multiple screens; this was a concept that Apple later incorporated into the Macintosh II. The second Radius product was the Radius Accelerator, an add-on card that quadrupled the speed of the Macintosh by adding a Motorola 68020 processor. Another product was the Pivot Display, a display that rotated between landscape and portrait orientation with real-time remapping of the menus, mouse and screen drawing. The award-winning product design was by Terry Oyama, former industrial design lead at Apple Computer. By late 1992, the company faced hard times. In 1993, following the company's first round of layoffs, the strategy was to live off the professional graphics market. The company's first acquisition was VideoFusion, as Radius sought a toehold in the world of video production software. The company's engineering management was given the opportunity to partner with or acquire After Effects but passed on it; thus the company missed the chance to own a product that would come to define the first decade of digital video. In 1994, Radius acquired rival SuperMac and shifted headquarters into the latter's building. The SuperMac acquisition netted Radius the Cinepak video compression codec, which was still supported by most encoders and almost all media players by the early 2000s. The acquisitions continued with Pipeline Digital and its timecode products. The advent of Macintosh computers with PCI expansion slots in 1995 saw the end of vendors that made expansion cards exclusively for Macintosh computers: with minor tweaks and new firmware, PC expansion card vendors were able to produce expansion cards for Mac OS computers, and with their far greater production volumes from the PC side of the business, vendors such as ATI and Matrox quickly displaced the Mac-only card makers. In March 1995, Radius became the first licensed Macintosh clone vendor and offered two new products, the Radius System 100 and the Radius 81/110. In its final direction, Radius licensed the brand name SuperMac to Umax in 1996 for its Mac OS clones. In August 1998, the Radius division and its trademark were acquired by miro Displays with the help of its shareholder, Korea Display Systems. In 1999, the company changed its name to Digital Origin and returned to making video editing hardware and software, including EditDV; in 2002, it was acquired by Media 100.
27.
Apple menu
–
The Apple menu has been a feature in Apple's Mac OS since its inception. It is the first drop-down item on the left-hand side of the menu bar. The Apple menu's role has changed throughout the history of Mac OS. In System 6.0.8 and earlier, the Apple menu featured a Control Panel manager, as well as Desk Accessories such as a Calculator. If MultiFinder was active, the Apple menu also allowed the user to switch between multiple running applications. The Macintosh user could add third-party Desk Accessories via the system utility Font/DA Mover; however, there was a limitation on the number of Desk Accessories that could be displayed in the Apple menu. Third-party shareware packages such as OtherMenu added a second customizable menu that allowed users to install Desk Accessories beyond Apple's limitations. System 7.0 introduced the Apple Menu Items folder in the System Folder, which allowed users to place aliases to their software and documents in the menu. Several third-party utilities provided a level of customization of the order of the items added to the Apple menu without having to rename each item. The Apple menu also featured a Shut Down command, implemented by a Desk Accessory; an alias to the Control Panels folder was also present. System 7.0 was also the first version to feature the rainbow-striped logo. System 7.0 featured built-in multitasking, so MultiFinder was removed as an option. The feature allowing users to switch between running applications as in System 6 was given its own menu on the opposite side of the menu bar; in this case, it ran as an application called Application Switcher. System 7.5 added an Apple Menu Options control panel, which added submenus to folders and disks in the Apple Menu, showing the contents of the folder or disk. Prior versions of System 7 showed only a standard menu entry that opened the folder in the Finder. Apple Menu Options also added Recent Applications, Recent Documents, and Recent Servers to the Apple Menu; the number of Recent Items was editable by the user. Mac OS X features a completely redesigned Apple menu, and system management functions from the Special menu have been merged into it. The Apple menu was missing entirely from the Mac OS X Public Beta, replaced by a nonfunctional Apple logo in the center of the menu bar, but the menu was restored in Mac OS X 10.0.6.
28.
Menu bar
–
A menu bar is a graphical control element which contains drop-down menus. Through the evolution of user interfaces, the menu bar has been implemented in different ways by different user interfaces. In the Macintosh operating systems, the menu bar is a horizontal bar anchored to the top of the screen. In macOS, the left side contains the Apple menu, the application menu, and the menus of the application that is currently in focus. The right side contains menu extras (for example the system clock, volume control, and the fast user switching menu); all of these menu extras can be moved horizontally by command-clicking and dragging left or right, and if an icon is dragged and dropped vertically it will disappear with a puff of smoke. In the classic Mac OS, the right side contains the application menu, allowing the user to switch between open applications; in Mac OS 8.5 and later, this menu can be dragged downwards to detach it as a floating palette. There is only one menu bar, so the application menus displayed are those of the application that is currently focused. The idea of separate menus in each window or document was implemented in Microsoft Windows and is the default representation in most Linux desktop environments. Even before the advent of the Macintosh, the graphical menu bar appeared in the Apple Lisa in 1983. It has been a feature of all versions of the classic Mac OS since the first Macintosh was released in 1984, and is still used in macOS. The menu bar in Microsoft Windows is usually anchored to the top of a window under the title bar. Menus in the menu bar can be accessed through shortcuts involving the Alt key and the mnemonic letter that appears underlined in the menu title; additionally, pressing Alt or F10 brings the focus to the first menu of the menu bar. KDE and GNOME allow users to turn Macintosh-style and Windows-style menu bars on and off, and KDE can have both types in use at the same time. Window manager menus in Linux are typically configurable either by editing text files, by using a desktop-environment-specific Control Panel applet, or both. On the Amiga, the title/menu bar would typically sit at the top of the screen; the Workbench screen title bar would typically display the Workbench version and the amount of free Chip RAM and Fast RAM. Keyboard shortcuts could be accessed by pressing the right Amiga key along with an alphanumeric key; the left and right Amiga keys carried filled-in and hollowed-out logo designs, respectively. The NeXTstep OS for the NeXT machines would display a menu palette, by default at the top left of the screen. Clicking on the entries in the menu list would display submenus of the commands in the menu, and the contents of the menu change depending on whether the user is in the Workspace Manager or an application. The menus and the sub-menus can easily be torn off and moved around the screen as individual palette windows, and power users would often switch off the always-on menu, leaving it to be displayed at the mouse pointer's location when the right mouse button was pressed.
29.
Megabyte
–
The megabyte is a multiple of the unit byte for digital information. Its recommended unit symbol is MB, but sometimes MByte is used. The unit prefix mega is a multiplier of 1,000,000 in the International System of Units. Therefore, one megabyte is one million bytes of information, and this definition has been incorporated into the International System of Quantities. However, in the computer and information technology fields, several other definitions are used that arose for historical reasons of convenience. A common usage has been to designate one megabyte as 1,048,576 bytes. However, most standards bodies have deprecated this usage in favor of a set of binary prefixes. Less common is a convention that used the megabyte to mean 1000×1024 bytes. The megabyte is commonly used to mean either 1000² bytes or 1024² bytes. The interpretation using base 1024 originated as compromise technical jargon for byte multiples that needed to be expressed by powers of 2 but lacked a convenient name; as 1024 approximates 1000, roughly corresponding to the SI prefix kilo-, the SI prefixes were reused for binary multiples. In 1998 the International Electrotechnical Commission proposed standards for binary prefixes requiring the use of megabyte to strictly denote 1000² bytes and mebibyte to denote 1024² bytes. By the end of 2009, the IEC standard had been adopted by the IEEE, EU and ISO. The Mac OS X 10.6 file manager is a notable example of this usage in software: since Snow Leopard, file sizes are reported in decimal units. Base 2: 1 MB = 1,048,576 bytes is the definition used by Microsoft Windows in reference to computer memory, such as RAM; this definition is synonymous with the binary prefix mebibyte. Mixed: 1 MB = 1,024,000 bytes is the definition used to describe the formatted capacity of the 1.44 MB 3.5-inch HD floppy disk. Semiconductor memory doubles in size for each address line added to an integrated circuit package. The capacity of a disk drive is the product of the sector size, number of sectors per track, number of tracks per side, and the number of disk platters in the drive; changes in any of these factors would not usually double the size. Sector sizes were set as powers of two for convenience in processing, and it was a natural extension to give the capacity of a disk drive in multiples of the sector size, giving a mix of decimal and binary multiples. Depending on compression methods and file format, a megabyte of data can roughly be: a 4-megapixel JPEG image with normal compression; approximately 1 minute of 128 kbit/s MP3-compressed music; 6 seconds of uncompressed CD audio; or a typical English book volume in plain text format. The human genome consists of DNA representing about 800 MB of data.
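The three definitions above differ only in arithmetic. The short Python sketch below makes the decimal, binary and mixed conventions explicit, including the 1.44 MB floppy case mentioned in the text; the printed values are simple unit conversions, not new data.

    # Decimal, binary and "mixed" megabyte definitions described above.
    MB_DECIMAL = 1000 ** 2        # 1,000,000 bytes (SI)
    MB_BINARY  = 1024 ** 2        # 1,048,576 bytes (mebibyte)
    MB_MIXED   = 1000 * 1024      # 1,024,000 bytes (floppy-disk convention)

    # The "1.44 MB" high-density floppy actually holds 1440 * 1024 bytes.
    floppy_bytes = 1440 * 1024
    print(floppy_bytes / MB_DECIMAL)  # about 1.47 "decimal" megabytes
    print(floppy_bytes / MB_BINARY)   # about 1.41 mebibytes
    print(floppy_bytes / MB_MIXED)    # exactly 1.44 in the mixed convention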
30.
Virtual memory
–
In computing, virtual memory is a memory management technique that is implemented using both hardware and software. It maps memory addresses used by a program, called virtual addresses, into physical addresses in computer memory; main storage as seen by a process or task appears as a contiguous address space or collection of contiguous segments. The operating system manages virtual address spaces and the assignment of real memory to virtual memory. Address translation hardware in the CPU, often referred to as a memory management unit or MMU, automatically translates virtual addresses to physical addresses. Memory virtualization can be considered a generalization of the concept of virtual memory. Virtual memory is an integral part of a modern computer architecture; implementations usually require hardware support, typically in the form of a memory management unit built into the CPU. While not strictly necessary, emulators and virtual machines can employ hardware support to increase performance of their virtual memory implementations. During the 1960s and early 70s, computer memory was very expensive. The introduction of virtual memory provided an ability for software systems with large memory demands to run on computers with less real memory, and the savings from this provided a strong incentive to switch to virtual memory for all systems. The additional capability of providing virtual address spaces added another level of security and reliability. Most modern operating systems that support virtual memory also run each process in its own dedicated address space; each program thus appears to have sole access to the virtual memory. However, some operating systems, even modern ones, are single address space operating systems that run all processes in a single address space composed of virtualized memory. Virtual memory is also commonly omitted in embedded systems, because hardware costs are kept low by implementing all such operations with software rather than with dedicated hardware. In the 1940s and 1950s, all larger programs had to contain logic for managing primary and secondary storage; virtual memory was therefore introduced not only to extend primary memory, but to make such an extension as easy as possible for programmers to use. To allow for multiprogramming and multitasking, many early systems divided memory between multiple programs without virtual memory, such as models of the PDP-10 via registers. Paging was pioneered on the Atlas computer: the first Atlas was commissioned in 1962, but working prototypes of paging had been developed by 1959. In 1961, the Burroughs Corporation independently released the first commercial computer with virtual memory, the B5000, with segmentation rather than paging. Before virtual memory could be implemented in operating systems, many problems had to be addressed. Dynamic address translation required expensive and difficult-to-build specialized hardware, and there were worries that new system-wide algorithms utilizing secondary storage would be less effective than previously used application-specific algorithms. The first minicomputer to introduce virtual memory was the Norwegian NORD-1; during the 1970s, other minicomputers implemented virtual memory as well. Virtual memory was introduced to the x86 architecture with the protected mode of the Intel 80286 processor, but its segment swapping technique scaled poorly to larger segment sizes. The Intel 80386 introduced paging support underneath the existing segmentation layer; however, loading segment descriptors was an expensive operation, causing operating system designers to rely strictly on paging rather than a combination of paging and segmentation.
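The address translation an MMU performs can be sketched in a few lines of Python. The page size and page-table contents below are invented purely for illustration; a real MMU does this in hardware, with additional permission checks and a translation lookaside buffer, so this is only a toy model of the mapping described above.

    # Toy model of paged virtual-to-physical address translation.
    PAGE_SIZE = 4096                     # assumed page size (4 KiB)

    # Hypothetical page table: virtual page number -> physical frame number.
    page_table = {0: 7, 1: 3, 2: 11}

    def translate(virtual_addr: int) -> int:
        page, offset = divmod(virtual_addr, PAGE_SIZE)
        if page not in page_table:
            raise MemoryError("page fault: page %d not resident" % page)
        return page_table[page] * PAGE_SIZE + offset

    print(hex(translate(0x1234)))        # virtual page 1 -> frame 3 -> 0x3234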
31.
Hard disk drive
–
A hard disk drive (HDD) stores data on rotating, magnetically coated platters. The platters are paired with magnetic heads, usually arranged on a moving actuator arm, which read and write data to the platter surfaces. Data is accessed in a random-access manner, meaning that individual blocks of data can be stored or retrieved in any order. HDDs are a type of non-volatile storage, retaining stored data even when powered off. Introduced by IBM in 1956, HDDs became the dominant secondary storage device for computers by the early 1960s. Continuously improved, HDDs have maintained this position into the modern era of servers and personal computers. More than 200 companies have produced HDDs historically, though after extensive industry consolidation most current units are manufactured by Seagate, Toshiba and Western Digital. As of 2016, HDD production is growing, although unit shipments and sales revenues are declining. While SSDs have higher cost per bit, SSDs are replacing HDDs where speed, power consumption, small size or durability are important. The primary characteristics of an HDD are its capacity and performance. Capacity is specified in unit prefixes corresponding to powers of 1000. The two most common form factors for modern HDDs are 3.5-inch, for desktop computers, and 2.5-inch, primarily for laptops. HDDs are connected to systems by standard interface cables such as PATA, SATA, USB or SAS cables. Hard disk drives were introduced in 1956, as data storage for an IBM real-time transaction processing computer, and were developed for use with general-purpose mainframe and minicomputers. The first IBM drive, the 350 RAMAC in 1956, was approximately the size of two medium-sized refrigerators and stored five million six-bit characters on a stack of 50 disks. In 1962 the IBM 350 RAMAC disk storage unit was superseded by the IBM 1301 disk storage unit; cylinder-mode read/write operations were supported, and the heads flew about 250 micro-inches above the platter surface. Motion of the head array depended upon a binary system of hydraulic actuators which assured repeatable positioning. The 1301 cabinet was about the size of three home refrigerators placed side by side, storing the equivalent of about 21 million eight-bit bytes; access time was about a quarter of a second. Also in 1962, IBM introduced the model 1311 disk drive, which used removable disk packs; users could buy additional packs and interchange them as needed, much like reels of magnetic tape. Later models of removable pack drives, from IBM and others, became the norm in most computer installations, and non-removable HDDs were called fixed disk drives. Some high-performance HDDs were manufactured with one head per track so that no time was lost physically moving the heads to a track; known as fixed-head or head-per-track disk drives, they were very expensive and are no longer in production. In 1973, IBM introduced a new type of HDD code-named Winchester; its primary distinguishing feature was that the disk heads were not withdrawn completely from the stack of disk platters when the drive was powered down. Instead, the heads were allowed to land on a dedicated area of the disk surface upon spin-down.
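The capacity relationship mentioned in the Megabyte entry above (capacity as the product of sector size, sectors per track, tracks per side and platter count) can be written out directly. The sketch below extends that product with the number of recording sides per platter; every figure is invented for illustration and does not describe any particular drive.

    # Capacity as a product of geometry parameters (all values hypothetical).
    SECTOR_BYTES      = 512
    SECTORS_PER_TRACK = 63
    TRACKS_PER_SIDE   = 16383
    SIDES_PER_PLATTER = 2
    PLATTERS          = 4

    capacity_bytes = (SECTOR_BYTES * SECTORS_PER_TRACK * TRACKS_PER_SIDE
                      * SIDES_PER_PLATTER * PLATTERS)
    print(capacity_bytes / 1000**3)   # capacity in decimal gigabytes (about 4.2)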
32.
Gigabyte
–
The gigabyte is a multiple of the unit byte for digital information. The prefix giga means 10⁹ in the International System of Units; the unit symbol for the gigabyte is GB. However, the term is also used in some fields of computer science and information technology to denote 1,073,741,824 bytes. The use of gigabyte may thus be ambiguous. To address this ambiguity, the International System of Quantities standardizes the binary prefixes, which denote a series of integer powers of 1024; with these prefixes, a memory module that is labeled as having the size 1 GB has one gibibyte of storage capacity. The term gigabyte is commonly used to mean either 1000³ bytes or 1024³ bytes. The latter binary usage originated as compromise technical jargon for byte multiples that needed to be expressed in a power of 2 but lacked a convenient name; as 1024 is approximately 1000, roughly corresponding to SI multiples, the SI prefix was reused. In 1998 the International Electrotechnical Commission published standards for binary prefixes, requiring that the gigabyte strictly denote 1000³ bytes and the gibibyte denote 1024³ bytes. By the end of 2007, the IEC standard had been adopted by the IEEE, EU, and NIST, and the decimal definition is the one recommended by the International Electrotechnical Commission. The file manager of Mac OS X version 10.6 and later versions is an example of this usage in software. The binary definition uses powers of the base 2, as is the architectural principle of binary computers. This usage is widely promulgated by some operating systems, such as Microsoft Windows in reference to computer memory; this definition is synonymous with the unambiguous unit gibibyte. Since the first disk drive, the IBM 350, disk drive manufacturers have expressed hard drive capacities using decimal prefixes. With the advent of gigabyte-range drive capacities, manufacturers based most consumer hard drive capacities in certain size classes expressed in decimal gigabytes, such as 500 GB. The exact capacity of a given model is usually slightly larger than the class designation. Practically all manufacturers of disk drives and flash-memory disk devices continue to define one gigabyte as 1,000,000,000 bytes. Some operating systems, such as OS X, express hard drive capacity or file size using decimal multipliers, while others, such as Microsoft Windows, use binary multipliers; this discrepancy causes confusion, as a disk with an advertised capacity of, for example, 400 GB might be reported by the operating system as 372 GB, meaning 372 GiB. The JEDEC memory standards use IEEE 100 nomenclature, which quotes the gigabyte as 1,073,741,824 bytes; this means that a 300 GB hard disk might be indicated variously as 300 GB, 279 GB or 279 GiB, depending on the operating system. As storage sizes increase and larger units are used, these differences become even more pronounced. Some legal challenges have been waged over this confusion, such as a lawsuit against drive manufacturer Western Digital; Western Digital settled the challenge and added explicit disclaimers to products stating that the usable capacity may differ from the advertised capacity.
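The 400 GB versus 372 GiB discrepancy noted above is a direct unit conversion; the short Python sketch below simply reuses the example figure from the text.

    # Advertised (decimal) capacity versus the same capacity in binary units.
    advertised_gb = 400                       # drive marketed as "400 GB"
    capacity_bytes = advertised_gb * 1000**3  # manufacturers use 1 GB = 10^9 bytes

    gib = capacity_bytes / 1024**3            # gibibytes, as some OSes report it
    print(round(gib, 1))                      # -> 372.5, shown as "372 GB" by such OSes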
33.
Dialog box
–
The graphical control element dialog box is a small window that communicates information to the user and prompts them for a response. Dialog boxes are classified as modal or modeless, depending on whether they block interaction with the software that initiated the dialog; the type of dialog box displayed is dependent upon the desired user interaction. An example of a dialog box is the about box found in many software programs, which usually displays the name of the program and its version number. Non-modal or modeless dialog boxes are used when the requested information is not essential to continue. In general, good software design calls for dialogs to be of this type where possible. An example might be a dialog of settings for the current document, e.g. the background and text colors; the user can continue adding text to the main window whatever color it is. Usability practitioners generally regard modal dialogs as bad design solutions, since they are prone to produce mode errors. Dangerous actions should be undoable wherever possible; a modal alert dialog that appears unexpectedly, or which is dismissed automatically, will not protect from the dangerous action. A modal dialog also interrupts the main workflow. This effect has either been sought by the developer, because it focuses on the completion of the task at hand, or rejected, because it prevents the user from changing to a different task when needed. The concept of a document-modal dialog has recently been used, most notably in OS X, where such dialogs are shown as sheets attached to a parent window. These dialogs block only that window until the user dismisses the dialog, permitting work in other windows to continue, even within the same application. In OS X, sheet dialogs appear to emanate from a slot in their parent window, which helps to let the user understand that the dialog is attached to the parent window, not just shown in front of it. A sheet blocks the parent window, preventing the user from referring to it while interacting with the dialog; this may require the user to close the dialog to access the necessary information, then re-open the dialog box to continue.
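As a concrete, if minimal, illustration of modal behaviour, the sketch below uses Python's standard tkinter toolkit. The message text and window handling are arbitrary, and this is not how any particular operating system's native sheets or alerts are implemented; it only shows that a modal dialog blocks the caller until dismissed.

    # Minimal modal dialog using tkinter's standard messagebox.
    import tkinter as tk
    from tkinter import messagebox

    root = tk.Tk()
    root.withdraw()                      # hide the empty main window for this demo

    # The call blocks here until the user dismisses the dialog: the "modal"
    # behaviour described in the text above.
    messagebox.showinfo(
        title="About This Sketch",       # hypothetical text
        message="This dialog is modal: the script waits until it is closed.",
    )
    print("Dialog dismissed, execution continues.")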
34.
Desk accessory
–
A desk accessory (DA) in computing is a small transient or auxiliary application that can be run concurrently in a desktop environment with any other application on the system. The purpose of this model was to permit very small helper-type applications to be run concurrently with any application on the system; this provided a degree of multitasking on a system that initially did not have any other multitasking ability. DAs were implemented as a special class of device driver: a DA was installed in the driver queue and given time periodically and co-operatively as a result of the host application calling SystemTask within its main loop. A DA was permitted to have a user interface as long as it was confined to one main window. A special window frame with a black title bar and rounded corners was reserved for the use of DAs so that the user could distinguish them from the windows of the hosting application. Typical early DAs included the Calculator and Alarm Clock; the Control Panel, Chooser, and Scrapbook were initially implemented as DAs, and third-party DAs such as spelling checkers could be purchased. It was considered hard to write a DA; especially early on, there was little in the way of developer tools. However, since drivers on the early Mac OS did not have any special privileges, writing a DA was, with practice, manageable. A special Font/DA Mover utility was used to change the configuration of DAs, because DAs were not installed or launched in the way that ordinary applications were; instead, they were installed as resources, either in the System file or within an individual application. If installed within an application, such as MacWrite, their functionality would be accessible only when that application was running; that is, a desk accessory installed as a resource within an application would appear on the Apple menu as a desk accessory only when that application was active. It could then be activated while the application was running and would disappear when the application was terminated through the Quit function. With the advent of System 7, which included a standard co-operative multitasking feature, the need for DAs diminished greatly, but the system continued to run DAs for backward compatibility. Under System 7 and later, DAs could be moved and renamed using the Finder like normal applications, removing the need for Font/DA Mover; when a DA was run under System 7, it always executed in the Finder's address space. The icon for a desk accessory program under System 7 and later is roughly a reversed version of the application icon. A similar mechanism to allow small utility programs to run along with regular applications was also present in the operating system for the Apple IIGS. Under Digital Research's GEM, from a programming point of view, desk accessories were implemented, like other GEM applications, as DOS .EXE files carrying an .ACC extension; each .ACC file could support multiple accessories, and all three of the standard GEM accessories were provided by CALCLOCK.ACC.
35.
NeXTSTEP
–
NeXTSTEP is a discontinued object-oriented, multitasking operating system based on UNIX. Although relatively unsuccessful at the time, it attracted interest from computer scientists, and it was also the platform on which Tim Berners-Lee created the first web browser. After the purchase of NeXT by Apple, it became the source of the operating systems macOS, iOS, watchOS and tvOS. Many bundled macOS applications, such as TextEdit, Mail and Chess, are descendants of NeXTSTEP applications. The toolkits offer considerable power and are the canonical development system for all of the software on the machine. NeXTSTEP's user interface is considered to be refined and consistent, and it introduced the idea of the Dock and the Shelf. Additional kits were added to the product line to make the system more attractive. These include Portable Distributed Objects, which allow easy remote invocation, and Enterprise Objects Framework, an object-relational mapping system. The kits made the system particularly interesting to custom application programmers, and NeXTSTEP had a long history in the financial programming community. A preview release of NeXTSTEP was shown with the launch of the NeXT Computer on October 12, 1988; the first full release, NeXTSTEP 1.0, shipped on September 18, 1989. NeXTSTEP was later modified to separate the underlying operating system from the higher-level object libraries. The result was the OpenStep API, which ran on multiple underlying operating systems, including NeXT's own OPENSTEP, Windows NT and Sun Solaris. NeXTSTEP's legacy stands today in the form of its direct descendants, macOS, iOS, watchOS and tvOS. From day one, the operating system of NeXTSTEP was built upon Mach/BSD. It was built on 4.3BSD Tahoe; it changed to 4.3BSD Reno after the release of NeXTSTEP 3.0, and to 4.4BSD during the development of Rhapsody. The first web browser, WorldWideWeb, and the first ever app store were all invented on the NeXTSTEP platform. Some features and keyboard shortcuts now commonly found in web browsers can be traced back to NeXTSTEP conventions, and the basic layout options of HTML 1.0 and 2.0 are attributable to those features available in NeXT's Text class. id Software developed Doom on NeXT hardware; other games based on the Doom engine, such as Heretic and its sequel Hexen by Raven Software, as well as Strife by Rogue Entertainment, were also developed on NeXT hardware using id's tools. Altsys made a NeXTSTEP application called Virtuoso, version 2 of which was ported to Mac OS. The modern Notebook interface for Mathematica and the advanced spreadsheet Lotus Improv were developed using NeXTSTEP. The software that controlled MCI's Friends and Family calling plan program was developed using NeXTSTEP. About the time of the release of NeXTSTEP 3.2, NeXT partnered with Sun Microsystems to develop OpenStep, the product of an effort to separate the underlying operating system from the higher-level object libraries to create a cross-platform object-oriented API standard derived from NeXTSTEP. The OpenStep API targets multiple underlying operating systems, including NeXT's own OPENSTEP. Implementations of that standard were released for Sun's Solaris and Windows NT; NeXT's implementation is called OPENSTEP for Mach, and its first release superseded NeXTSTEP 3.3 on NeXT, Sun and Intel IA-32 systems.
36.
Microsoft Excel
–
Microsoft Excel is a spreadsheet developed by Microsoft for Windows, macOS, Android and iOS. It features calculation, graphing tools, pivot tables, and a programming language called Visual Basic for Applications. It has been a very widely applied spreadsheet for these platforms, especially since version 5 in 1993, and Excel forms part of Microsoft Office. Microsoft Excel has the basic features of all spreadsheets, using a grid of cells arranged in numbered rows and letter-named columns to organize data manipulations like arithmetic operations. It has a battery of supplied functions to answer statistical, engineering and financial needs. In addition, it can display data as line graphs, histograms and charts, and with a very limited three-dimensional graphical display. It allows sectioning of data to view its dependencies on various factors for different perspectives. Excel was not designed to be used as a database. Microsoft allows for a number of optional command-line switches to control the manner in which Excel starts. The Windows version of Excel supports programming through Microsoft's Visual Basic for Applications (VBA), which is a dialect of Visual Basic. Programming with VBA allows spreadsheet manipulation that is awkward or impossible with standard spreadsheet techniques. Programmers may write code directly using the Visual Basic Editor, which includes a window for writing code, debugging code, and a code module organization environment. A common and easy way to generate VBA code is by using the Macro Recorder, which records actions of the user and generates VBA code in the form of a macro. These actions can then be repeated automatically by running the macro. The macros can also be linked to different trigger types like keyboard shortcuts, a command button or a graphic; the actions in the macro can be executed from these triggers or from the generic toolbar options. The VBA code of the macro can also be edited in the VBE. Advanced users can employ user prompts to create an interactive program, or react to events such as sheets being loaded or changed. Macro-recorded code may not be compatible between Excel versions; some code that is used in Excel 2010 cannot be used in Excel 2003, and a macro that changes cell colors or makes changes to other aspects of cells may not be backward compatible. User-created VBA subroutines execute these actions and operate like macros generated using the Macro Recorder. From its first version Excel supported end-user programming of macros and user-defined functions. Beginning with version 5.0 Excel recorded macros in VBA by default, though recording in the older XLM macro language remained an option; after version 5.0 that option was discontinued. All versions of Excel, including Excel 2010, are capable of running an XLM macro. Excel supports charts, graphs, or histograms generated from specified groups of cells. The generated graphic component can either be embedded within the current sheet or added as a separate object, and these displays are dynamically updated if the content of cells changes.
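The grid-of-cells model described above can also be driven programmatically from outside Excel. The sketch below uses the third-party openpyxl Python library, which is an assumption here (it is not part of Excel or VBA), to place values and a formula into lettered columns and numbered rows; the file name is hypothetical.

    # Sketch: building a tiny spreadsheet file with openpyxl (pip install openpyxl).
    # Cell addresses follow Excel's letter-column / number-row convention.
    from openpyxl import Workbook

    wb = Workbook()
    ws = wb.active
    ws["A1"] = 10
    ws["A2"] = 32
    ws["A3"] = "=SUM(A1:A2)"      # stored as a formula; Excel evaluates it on open
    wb.save("demo.xlsx")          # hypothetical file name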
37.
Microsoft Works
–
Microsoft Works is a discontinued office suite developed by Microsoft. Works was smaller, less expensive, and had fewer features than Microsoft Office or other major office suites. Its core functionality included a word processor, a spreadsheet and a database management system. Later versions had a calendar application and a dictionary, while older releases included a terminal emulator. Works was available as a standalone program, and as part of a namesake home productivity suite. Because of its low cost, companies frequently pre-installed Works on their low-cost machines. Microsoft Works originated as MouseWorks, an integrated spreadsheet, word processor and database program, designed for the Macintosh by ex-Apple employee Don Williams and Rupert Lissner. Williams was planning to emulate the success of AppleWorks, a similar product for Apple II computers. However, Bill Gates and his Head of Acquisitions, Alan M. Boyd, brought the product to Microsoft. Initially it was to be a scaled-down version of Office for the small laptops, such as the Radio Shack TRS-80 Model 100, which Microsoft was developing. As laptops grew in power, however, Microsoft Works, as it was to be called, evolved as a product in its own right. On September 14, 1987, Microsoft unveiled Works for DOS. Through version 4.5a, Works used a monolithic program architecture whereby the Works Word Processor and Spreadsheet/Database documents ran in windows of the same program interface. This resulted in a small memory and disk footprint, which enabled it to run on slower computers with requirements as low as 6 MB of RAM and 12 MB of free disk space. Works 2000 switched to an architecture which opens each document as a separate instance. In late 2009, Microsoft announced it was discontinuing Works and replacing it with Office 2010 Starter Edition. Newer versions include task panes but do not include significantly updated features; even in the final version, the Windows 95-era icons and toolbars were not updated to make them consistent with later application software. Version 4.5a is particularly noted in this respect. A Works Portfolio utility offers Microsoft Binder-like functionality. The Works Calendar can store appointments and integrates with the Windows Address Book, as well as Address Book's successor, Windows Contacts, and it supports importing and exporting iCalendar files. It does not, however, support subscribing to iCalendar files or publishing them online via WebDAV. Up to version 8, using the Works Task Launcher, the calendar and contacts from the Windows Address Book could be synchronized with portable devices; in Works 9.0, the sync capability has been removed. Microsoft makes file format converter filters for Microsoft Word for opening and saving to the Works Word Processor format.
38.
Macintosh 128K
–
The Macintosh 128K, originally released as the Apple Macintosh, is the original Apple Macintosh personal computer. Its beige case contained a 9-inch CRT monitor, and it came with a keyboard and mouse; a handle built into the top of the case made it easier for the computer to be lifted and carried. It had a selling price of US$2,495. The Macintosh was introduced by the now-famous US$370,000 television commercial "1984", directed by Ridley Scott. Sales of the Macintosh were strong from its initial release on January 24, 1984, and reached 70,000 units on May 3, 1984. Upon the release of its successor, the Macintosh 512K, it was rebranded as the Macintosh 128K. The centerpiece of the machine was a Motorola 68000 microprocessor running at 7.8336 MHz, connected to 128 KB of RAM shared by the processor and the display controller. The boot procedure and some operating system routines were contained in an additional 64 KB ROM chip. Apple did not offer RAM upgrades, and, unlike the Apple II, no source code listings of the Macintosh system ROMs were offered. The RAM in the Macintosh consisted of sixteen 4164 64k×1 DRAMs. Such an arrangement reduced the performance of the CPU by as much as 35% for most code, as the display logic often blocked the CPU's access to RAM; this made the machine appear to run more slowly than several of its competitors. The built-in display was a one-bit, black-and-white, 9-inch CRT with a fixed resolution of 512×342 pixels, establishing the desktop publishing standard of 72 PPI. Expansion and networking were achieved using two non-standard RS-422 DE-9 serial ports named "printer" and "modem"; they did not support hardware handshaking. An external floppy disk drive could be added using a proprietary connector. The keyboard and mouse used simple proprietary protocols, allowing some third-party upgrades. The original keyboard had no arrow keys, numeric keypad or function keys. Later, Apple would make a numeric keypad available for the Macintosh 128K; the keyboard sold with the newer Macintosh Plus model would include the numeric keypad and arrow keys, but still no function keys. As with the Apple Lisa before it, the mouse had a single button. Standard headphones could also be connected to a monaural jack. Apple also offered their 300 and 1200 bit/s modems originally released for the Apple II line. Initially, the only printer available was the Apple ImageWriter, a dot matrix printer which was designed to produce 144 dpi WYSIWYG output from the Mac's 72 dpi screen; eventually, the LaserWriter and other printers were capable of being connected using AppleTalk. The Macintosh contained a single 400 KB, single-sided 3.5-inch floppy disk drive, dedicating no space to other internal mechanical storage. The Mac OS was disk-based from the beginning, as RAM had to be conserved; one floppy disk was sufficient to store the System Software, an application and the data files created with the application. Indeed, the 400 KB drive capacity was larger than the PC XT's 360 KB 5.25-inch drive. However, the more sophisticated work environments of the time required separate disks for documents and the system installation. Due to the memory constraints of the original Macintosh, and the fact that the floppies could hold only 400 KB, users frequently had to swap disks in and out of the floppy drive.
39.
Macintosh 512K
–
The Macintosh 512K Personal Computer is the second of a long line of Apple Macintosh computers, and was the first update to the original Macintosh 128K. It was virtually identical to the previous Mac, differing primarily in the amount of built-in memory. Soon after Apple introduced the Macintosh 128K, they realized that the Macintosh would need more internal memory. Eight months later, on September 10, 1984, Apple introduced the Macintosh 512K; with quadrupled RAM, the Macintosh was able to become a more business-capable computer and gained the ability to run more software. The Mac 512K originally shipped with Macintosh System 1.1 but was able to run everything from Macintosh System 1.0 all the way up to System 4.1. When the Macintosh Plus was introduced in 1986, the Macintosh 512K was discontinued, on April 14, 1986, and all support for the Mac 512K was discontinued on September 1, 1998. Like the 128K Macintosh before it, the 512K contained a Motorola 68000 connected to 512 kB of DRAM by a 16-bit data bus. Though the memory had been quadrupled, it could not be upgraded; this large increase earned the machine the nickname Fat Mac. A 64 kB ROM chip boosts the effective memory to 576 kB, but this is offset by the display's 22 kB framebuffer, and this shared arrangement reduces CPU performance by up to 35%. It shared a revised logic board with the re-badged Macintosh 128K. The resolution of the display was the same, at 512×342, and the applications MacPaint and MacWrite were still bundled with the Mac. Soon after this model was released, several other applications became available, including MacDraw, MacProject and Macintosh Pascal. In particular, Microsoft Excel, which was written specifically for the Macintosh, required a minimum of 512 kB of RAM. Models with the enhanced ROM also supported Apple's Switcher, allowing cooperative multitasking among applications. The LaserWriter printer became available shortly after the 512K's introduction, as did accessories such as the numeric keypad, microphone, tablet, keyboard and mouse. It utilized Apple's built-in networking scheme, LocalTalk, which allows sharing of devices among several users, and the 512K was the first Macintosh capable of supporting Apple's built-in AppleShare file sharing network when it was introduced in 1987. The expanded memory in the 512K allowed it to handle larger word-processing documents and make better use of the graphical user interface. An updated version replaced the Mac 512K and debuted as the Macintosh 512K enhanced in April 1986; it differed from the original 512K in that it had an 800 kB floppy disk drive and the same improved ROM as the Macintosh Plus. With the exception of the new name, the two models were otherwise cosmetically identical. Apple offered an upgrade kit for the original 512K which replaced the ROM and the floppy disk drive. One further OEM upgrade replaced the logic board and the case entirely with those of the Macintosh Plus.