IBM 8514 is an IBM computer graphics display standard supporting a resolution of 1024×768 pixels with 256 colors at 43.5 Hz (interlaced), or 640×480 at 60 Hz. 8514 usually refers to the display controller hardware; however, IBM also sold a companion CRT monitor carrying the same designation. The 8514 used a standardised programming interface called the "Adapter Interface" or AI. This interface is also used by XGA, the IBM Image Adapter/A, and clones of the 8514/A and XGA such as the ATI Technologies Mach 32 and IIT AGX; it allows computer software to offload common 2D drawing operations onto the 8514 hardware, freeing the host CPU for other tasks and speeding up screen redraws. The 8514 was introduced with the IBM Personal System/2 computers in April 1987 as an optional upgrade to the Video Graphics Array of Micro Channel architecture based PS/2s, and was delivered within three months of the PS/2's introduction. Although not the first PC video card to support hardware acceleration, IBM's 8514 is credited as the first mass-market fixed-function accelerator for the PC.
Until the 8514's introduction, PC graphics acceleration was confined to expensive workstation-class graphics coprocessor boards, designed around programmable CPU or digital signal processor chips; many of these coprocessor boards were based on the Texas Instruments TMS34010. Fixed-function accelerators such as the 8514 sacrificed programmability for a better cost/performance ratio. Though the 8514 was never a best-seller, it created a market for fixed-function PC graphics accelerators that grew rapidly in the early 1990s; the ATI Mach 8 and Mach 32 chips were popular clones, and several companies designed graphics accelerator chips that were not register compatible but were conceptually similar to the 8514/A. The 8514 was superseded by the IBM XGA. Software that supported the standard included OS/2, Windows 2.1, Windows 3.x, Windows 95, XFree86 2.1.1, AutoCAD 10 and QuikMenu. Third-party graphics suppliers did not clone IBM's 8514/A as extensively as the VGA.
This may be because the 8514/A was a graphics processor, not just a graphics array like the VGA. IBM's 8514/A boards were not cheap, and the market was limited by the fact that IBM's boards were made only for MCA systems at a time when ISA systems were the most common. In the late 1980s, several companies cloned the 8514/A for the ISA bus; notable among these were Western Digital Imaging's PWGA-1, the Chips & Technologies 82C480, and ATI's Mach 8 and Mach 32 chips. In one way or another the clones were all better than the original, with more speed, enhanced drawing functionality and a wider selection of video modes. Clone support for non-interlaced modes at resolutions like 800×600 and 1280×1024 was typical, and all clones had longer command queues for increased performance. Notable clone adapter cards included the ATI Technologies Mach8, Mach32, Graphics Vantage and 8514/Ultra; the Chips & Technologies F82C480 and F82C481; the EIZO AA40; the Miro Magic Plus; the Matrox MG-108; the Paradise Systems Plus-A; the Renaissance Rendition II; the Desktop Computing AGA 1024; and the NEC Multisync Graphics Engine. The IIT AGX and Tseng ET4000 are also referenced as being IBM 8514 compatible.
The 8514 offered: 640×480 in 256 colors out of 262,144; 1024×768 in 256 colors out of 262,144; a 640×480 text mode with 80×34 characters; and 1024×768 text modes with 85×38 or 146×51 characters. Later clone boards offered additional resolutions: 800×600 and 1280×1024, each with 16-bit and 24-bit color depths. This article is based on material taken from the Free On-line Dictionary of Computing prior to 1 November 2008 and incorporated under the "relicensing" terms of the GFDL, version 1.3 or later.
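As a rough illustration of what the display modes listed above imply for video memory, the framebuffer size of each mode is simple arithmetic (the figures below are computed, not taken from IBM documentation):

```python
# Rough framebuffer sizes for the display modes listed in the text.
# 256 colors means 8 bits per pixel; the clone modes add 16- and 24-bit depths.

def framebuffer_bytes(width, height, bits_per_pixel):
    """Bytes needed to store one full frame at the given color depth."""
    return width * height * bits_per_pixel // 8

modes = [
    (640, 480, 8),     # 8514, 256 colors
    (1024, 768, 8),    # 8514, 256 colors (interlaced at 43.5 Hz)
    (800, 600, 16),    # later clone boards
    (1280, 1024, 24),  # later clone boards
]

for w, h, bpp in modes:
    kib = framebuffer_bytes(w, h, bpp) / 1024
    print(f"{w}x{h} @ {bpp} bpp: {kib:.0f} KiB")
# 1024x768 at 8 bpp works out to 768 KiB per frame
```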
Intel Corporation is an American multinational technology company headquartered in Santa Clara, California, in Silicon Valley. It is the world's second-largest and second-highest-valued semiconductor chip manufacturer by revenue, having been overtaken by Samsung, and is the inventor of the x86 series of microprocessors, the processors found in most personal computers. Intel ranked No. 46 in the 2018 Fortune 500 list of the largest United States corporations by total revenue. Intel supplies processors to computer system manufacturers such as Apple, Lenovo, HP and Dell, and also manufactures motherboard chipsets, network interface controllers, integrated circuits, flash memory, graphics chips, embedded processors and other devices related to communications and computing. Intel Corporation was founded on July 18, 1968, by semiconductor pioneers Robert Noyce and Gordon Moore, and is associated with the executive leadership and vision of Andrew Grove. The company's name was conceived as a portmanteau of the words integrated and electronics, co-founder Noyce having been a key inventor of the integrated circuit.
The fact that "intel" is also the term for intelligence information made the name appropriate. Intel was an early developer of SRAM and DRAM memory chips, which represented the majority of its business until 1981. Although Intel created the world's first commercial microprocessor chip in 1971, it was not until the success of the personal computer that this became its primary business. During the 1990s, Intel invested heavily in new microprocessor designs, fostering the rapid growth of the computer industry. During this period Intel became the dominant supplier of microprocessors for PCs and was known for aggressive and anti-competitive tactics in defense of its market position against Advanced Micro Devices, as well as a struggle with Microsoft for control over the direction of the PC industry. The Open Source Technology Center at Intel hosts PowerTOP and LatencyTOP, and supports other open-source projects such as Wayland, Mesa3D, Intel Array Building Blocks, Threading Building Blocks and Xen. Client Computing Group – 55% of 2016 revenues – produces hardware components used in desktop and notebook computers.
Data Center Group – 29% of 2016 revenues – produces hardware components used in server and storage platforms. Internet of Things Group – 5% of 2016 revenues – offers platforms designed for retail, industrial and home use. Non-Volatile Memory Solutions Group – 4% of 2016 revenues – manufactures NAND flash memory and 3D XPoint (branded as Optane) products used in solid-state drives. Intel Security Group – 4% of 2016 revenues – produces security software, including antivirus software. Programmable Solutions Group – 3% of 2016 revenues – manufactures programmable semiconductors. In 2017, Dell accounted for about 16% of Intel's total revenues, Lenovo for 13% and HP Inc. for 11%. According to IDC, while Intel enjoyed the biggest market share in both the overall worldwide PC microprocessor market and the mobile PC microprocessor market in the second quarter of 2011, its shares decreased by 1.5% and 1.9% respectively compared to the first quarter of 2011. In the 1980s, Intel was among the top ten sellers of semiconductors in the world.
In 1992, Intel became the biggest chip maker by revenue and has held the position since. Other top semiconductor companies include TSMC, Advanced Micro Devices, Texas Instruments, Toshiba and STMicroelectronics. Competitors in PC chipsets include Advanced Micro Devices, VIA Technologies, Silicon Integrated Systems and Nvidia. Intel's competitors in networking include NXP Semiconductors, Broadcom Limited, Marvell Technology Group and Applied Micro Circuits Corporation; competitors in flash memory include Spansion, Qimonda, Toshiba, STMicroelectronics and SK Hynix. The only major competitor in the x86 processor market is Advanced Micro Devices, with which Intel has had full cross-licensing agreements since 1976: each partner can use the other's patented technological innovations without charge after a certain time. However, the cross-licensing agreement is canceled in the event of a takeover. Some smaller competitors, such as VIA Technologies, produce low-power x86 processors for small form factor computers and portable equipment.
However, the advent of mobile computing devices has in recent years led to a decline in PC sales. Since over 95% of the world's smartphones use processors designed by ARM Holdings, ARM has become a major competitor in Intel's processor market, and ARM is planning to make inroads into the PC and server markets. Intel has been involved in several disputes regarding violation of antitrust laws, which are noted below. Intel was founded in Mountain View, California, in 1968 by Gordon E. Moore, a chemist, and Robert Noyce, a physicist and co-inventor of the integrated circuit, both of whom had left Fairchild Semiconductor to found the company. Venture capitalist Arthur Rock helped them find investors, though he was not an employee; the initial investment in Intel from Rock was $10,000. Just two years later, Intel became a public company via an initial public offering, raising $6.8 million. Intel's third employee was Andy Grove, a chemical engineer, who ran the company through much of the 1980s and the high-growth 1990s.
The Intel 80386, also known as the i386 or just the 386, is a 32-bit microprocessor introduced in 1985. The first versions had 275,000 transistors and were the CPU of many workstations and high-end personal computers of the time. As the original implementation of the 32-bit extension of the 80286 architecture, the 80386's instruction set, programming model and binary encodings are still the common denominator for all 32-bit x86 processors, termed the i386 architecture, x86, or IA-32, depending on context. The 32-bit 80386 can execute most code intended for the earlier 16-bit processors such as the 8086 and 80286 that were ubiquitous in early PCs. Over the years, successively newer implementations of the same architecture have become several hundred times faster than the original 80386; a 33 MHz 80386 was measured to operate at about 11.4 MIPS. The 80386 was introduced in October 1985, while manufacturing of the chips in significant quantities commenced in June 1986. Mainboards for 80386-based computer systems were cumbersome and expensive at first, but manufacturing was rationalized upon the 80386's mainstream adoption.
The first personal computer to make use of the 80386 was designed and manufactured by Compaq, and marked the first time a fundamental component in the IBM PC compatible de facto standard was updated by a company other than IBM. In May 2006, Intel announced that 80386 production would stop at the end of September 2007. Although it had long been obsolete as a personal computer CPU, Intel and others had continued making the chip for embedded systems; such systems using an 80386 or one of its many derivatives are common in aerospace technology and electronic musical instruments, among others. Some mobile phones also used the 80386 processor, such as the BlackBerry 950 and Nokia 9000 Communicator. The processor was a significant evolution in the x86 architecture, extending a long line of processors that stretched back to the Intel 8008. The predecessor of the 80386 was the Intel 80286, a 16-bit processor with a segment-based memory management and protection system. The 80386 added a 32-bit architecture and a paging translation unit, which made it much easier to implement operating systems that used virtual memory.
It also offered hardware support for debugging. The 80386 featured three operating modes: real mode, protected mode and virtual 8086 mode. Protected mode, which debuted in the 286, was extended to allow the 386 to address up to 4 GB of memory. The all-new virtual 8086 mode made it possible to run one or more real mode programs in a protected environment, although some programs were not compatible. The ability of the 386 to be set up to act as though it had a flat memory model in protected mode, despite the fact that it uses a segmented memory model in all modes, was arguably the most important feature change for the x86 processor family until AMD released x86-64 in 2003. Several new instructions were added in the 386: BSF, BSR, BT, BTS, BTR, BTC, CDQ, CWDE, LFS, LGS, LSS, MOVSX, MOVZX, SETcc, SHLD, SHRD. Two new segment registers, FS and GS, were added for general-purpose programs, and the single Machine Status Word of the 286 grew into eight control registers, CR0–CR7. Debug registers DR0–DR7 were added for hardware breakpoints; new forms of the MOV instruction are used to access them.
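The semantics of a few of these new instructions can be sketched in Python (models of the architectural definitions only, not of any particular encoding; flag effects are omitted):

```python
# Python models of the semantics of some instructions introduced with the 386.
# Function names are the x86 mnemonics; flag updates are intentionally omitted.

def bsf(value):
    """BSF: index of the least-significant set bit (undefined if value == 0)."""
    assert value != 0
    return (value & -value).bit_length() - 1

def bsr(value):
    """BSR: index of the most-significant set bit (undefined if value == 0)."""
    assert value != 0
    return value.bit_length() - 1

def movzx(byte):
    """MOVZX: zero-extend an 8-bit value to a wider register."""
    return byte & 0xFF

def movsx(byte):
    """MOVSX: sign-extend an 8-bit value to a wider register."""
    b = byte & 0xFF
    return b - 0x100 if b & 0x80 else b

print(bsf(0b101000))  # 3
print(bsr(0b101000))  # 5
print(movsx(0xF0))    # -16
```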
Chief architect in the development of the 80386 was John H. Crawford; he was responsible for extending the 80286 architecture and instruction set to 32 bits, and led the microprogram development for the 80386 chip. The 80486 and P5 Pentium lines of processors were descendants of the 80386 design. The following data types are directly supported and thus implemented by one or more 80386 machine instructions: 8-, 16-, 32- and 64-bit integers, either signed or unsigned; offsets, 16- or 32-bit displacements referring to memory locations; pointers, a 16-bit selector together with a 16- or 32-bit offset; characters; strings, sequences of 8-, 16- or 32-bit words; BCD, decimal digits represented by unpacked bytes; and packed BCD, two BCD digits in one byte. A classic 80386 assembly example is a subroutine named _strtolower that copies a null-terminated ASCIIZ character string from one location to another, converting all alphabetic characters to lower case.
The string is copied one byte at a time. The example uses the EBP register to establish a call frame, an area on the stack that contains all of the parameters and local variables for the execution of the subroutine; this kind of calling convention supports reentrant and recursive code and has been used by Algol-like languages since the late 1950s. A flat memory model is assumed, in which the DS and ES segments address the same region of memory. In 1988, Intel introduced the 80386SX, most often referred to as the 386SX, a cut-down version of the 80386 with a 16-bit data bus, intended for lower-cost PCs aimed at the home and small-business markets, while the 386DX remained the high-end variant used in workstations and other demanding tasks. The CPU remained fully 32-bit internally, but the 16-bit external bus reduced cost at the expense of memory bandwidth.
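The behavior described for _strtolower can be sketched in Python (the original is 80386 assembly; this model only mirrors the byte-at-a-time copy and ASCII case folding, and the function name is kept purely for illustration):

```python
# A Python sketch of the behavior the text ascribes to _strtolower:
# copy a NUL-terminated byte string one byte at a time, folding the
# ASCII letters 'A'..'Z' to lower case.

def strtolower(src: bytes) -> bytes:
    dst = bytearray()
    for b in src:
        if b == 0:             # NUL terminator ends the string
            break
        if 0x41 <= b <= 0x5A:  # 'A'..'Z'
            b += 0x20          # fold into 'a'..'z'
        dst.append(b)
    dst.append(0)              # copy the terminator as well
    return bytes(dst)

print(strtolower(b"Hello, 386!\x00"))  # b'hello, 386!\x00'
```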
In computing, multitasking is the concurrent execution of multiple tasks over a certain period of time. New tasks can interrupt started ones before they finish, instead of waiting for them to end; as a result, a computer executes segments of multiple tasks in an interleaved manner, while the tasks share common processing resources such as central processing units and main memory. Multitasking automatically interrupts the running program, saving its state and loading the saved state of another program, then transferring control to it. This "context switch" may be initiated at fixed time intervals, or the running program may be coded to signal to the supervisory software when it can be interrupted. Multitasking does not require parallel execution of multiple tasks at the same time. On multiprocessor computers, multitasking allows many more tasks to be run than there are CPUs. Multitasking is a common feature of computer operating systems, as it allows more efficient use of the computer hardware. In a time-sharing system, multiple human operators use the same processor as if it were dedicated to their use, while behind the scenes the computer serves many users by multitasking their individual programs.
In multiprogramming systems, a task runs until it must wait for an external event or until the operating system's scheduler forcibly swaps the running task out of the CPU. Real-time systems, such as those designed to control industrial robots, require timely processing. Multitasking operating systems include measures to change the priority of individual tasks, so that important jobs receive more processor time than those considered less significant. Depending on the operating system, a task might be as large as an entire application program, or might be made up of smaller threads that carry out portions of the overall program. A processor intended for use with multitasking operating systems may include special hardware to securely support multiple tasks, such as memory protection and protection rings that ensure the supervisory software cannot be damaged or subverted by user-mode program errors. The term "multitasking" has become an international term, as the same word is used in many other languages, such as German, Dutch and Norwegian.
In the early days of computing, CPU time was expensive and peripherals were slow. When the computer ran a program that needed access to a peripheral, the central processing unit would have to stop executing program instructions while the peripheral processed the data; this was very inefficient. The first computer using a multiprogramming system was the British Leo III, owned by J. Lyons and Co. During batch processing, several different programs were loaded into the computer memory, and the first one began to run; when the first program reached an instruction waiting for a peripheral, the context of this program was stored away, and the second program in memory was given a chance to run. The process continued until all programs finished running. The use of multiprogramming was enhanced by the arrival of virtual memory and virtual machine technology, which enabled individual programs to make use of memory and operating system resources as if other concurrently running programs were, for all practical purposes, non-existent and invisible to them. Multiprogramming gives no guarantee that a program will run in a timely manner.
Indeed, the first program might well run for hours without needing access to a peripheral. As there were no users waiting at an interactive terminal, this was no problem: users handed in a deck of punched cards to an operator and came back a few hours later for printed results. Multiprogramming greatly reduced wait times when multiple batches were being processed. Early multitasking systems consisted of applications that voluntarily ceded time to one another; this approach, supported by many computer operating systems, is known today as cooperative multitasking. Although it is now rarely used in larger systems, except for specific applications such as CICS or the JES2 subsystem, cooperative multitasking was once the only scheduling scheme employed by Microsoft Windows and Classic Mac OS to enable multiple applications to run simultaneously. Cooperative multitasking is still used today on RISC OS systems. As a cooperatively multitasked system relies on each process giving up time to other processes on the system, one poorly designed program can consume all of the CPU time for itself, either by performing extensive calculations or by busy waiting.
In a server environment, this is a hazard. Preemptive multitasking allows the computer system to more reliably guarantee each process a regular "slice" of operating time, and it allows the system to deal with important external events, like incoming data, which might require the immediate attention of one or another process. Operating systems were developed to take advantage of hardware interrupt and timer capabilities and run multiple processes preemptively. Preemptive multitasking was implemented in the PDP-6 Monitor and MULTICS in 1964, in OS/360 MFT in 1967 and in Unix in 1969, and was available in some operating systems for computers as small as DEC's PDP-8.
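The cooperative scheme described above, in which each task runs until it voluntarily yields, can be sketched with Python generators (a minimal illustration; a real operating system saves full register state on a context switch rather than using language coroutines):

```python
# A minimal sketch of cooperative multitasking: each task runs until it
# voluntarily yields, and a tiny round-robin scheduler performs the
# "context switch" by resuming the next runnable task.

from collections import deque

def task(name, steps):
    for i in range(steps):
        print(f"{name}: step {i}")
        yield  # voluntarily give up the CPU (cooperative multitasking)

def run(tasks):
    ready = deque(tasks)
    while ready:
        t = ready.popleft()
        try:
            next(t)          # resume the task until its next yield
            ready.append(t)  # still runnable: back of the queue
        except StopIteration:
            pass             # task finished; drop it

run([task("A", 2), task("B", 3)])
# Interleaved output: A and B alternate until A finishes, then B runs alone
```

Note that, exactly as the text warns, a task that never yields would monopolize this scheduler forever; preemptive multitasking removes that reliance on well-behaved tasks.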
The Intel 80186, also known as the iAPX 186 or just the 186, is a microprocessor and microcontroller introduced in 1982. It was based on the Intel 8086 and, like it, had a 16-bit external data bus multiplexed with a 20-bit address bus; it was also available as the 80188, with an 8-bit external data bus. The 80186 series was intended for embedded systems, as microcontrollers with external memory. Therefore, to reduce the number of integrated circuits required, it included features such as a clock generator, an interrupt controller, a wait state generator, DMA channels and external chip select lines. The initial clock rate of the 80186 was 6 MHz, but due to more hardware being available to the microcode for address calculation, many individual instructions ran faster than on an 8086 at the same clock frequency. For instance, the common register+immediate addressing mode was faster than on the 8086 when a memory location was both an operand and the destination. Multiply and divide showed great improvement, being several times as fast as on the original 8086, and multi-bit shifts were done about four times as fast as on the 8086.
A few new instructions were introduced with the 80186: ENTER/LEAVE, PUSHA/POPA and INS/OUTS. A useful immediate mode was added for the PUSH and multi-bit shift instructions; these instructions were also included in the contemporary 80286 and in successor chips. The CMOS version, the 80C186, introduced DRAM refresh, a power-save mode and a direct interface to the 8087 or 80187 floating-point coprocessor. The 80186 would have been a natural successor to the 8086 in personal computers. However, because its integrated hardware was incompatible with the hardware used in the original IBM PC, the 80286 was used as the successor instead in the IBM PC/AT. A few notable personal computers used the 80186, among them the Australian Dulmont Magnum laptop, one of the first laptops. Acorn created a plug-in co-processor for the BBC Master range of computers containing an 80186-10 with 512 KB of RAM, sold as the BBC Master 512 system. In addition to these stand-alone implementations of the 80186 for personal computers, there was at least one "add-in" accelerator card implementation: the Orchid Technology PC Turbo 186, released in 1985.
It was intended for use with the original Intel 8088-based IBM PC. The Intel 80186 is intended to be embedded in electronic devices that are not computers. For example, the 80186 was used to control the Microtek 8086 in-circuit emulator, and its offshoot, the Intel 80188, was embedded inside the Intel 14.4EX modem released in 1991; the 16 MHz processor was used to perform the complex algorithms needed for forward error correction, trellis modulation and echo cancellation in the modem. In May 2006, Intel announced that production of the 186 would cease at the end of September 2007; pin- and instruction-compatible replacements might still be manufactured by various third-party sources. Attribution: This article is based on material taken from the Free On-line Dictionary of Computing prior to 1 November 2008 and incorporated under the "relicensing" terms of the GFDL, version 1.3 or later.
The Intel 8088 microprocessor is a variant of the Intel 8086. Introduced on July 1, 1979, the 8088 had an 8-bit external data bus instead of the 16-bit bus of the 8086; the 16-bit registers and the one-megabyte address range were, however, unchanged. In fact, according to the Intel documentation, the 8086 and 8088 have the same execution unit; only the bus interface unit is different. The original IBM PC was based on the 8088. The 8088 was designed at Intel's laboratory in Haifa, Israel, as were a large number of Intel's processors. The 8088 was targeted at economical systems by allowing the use of an 8-bit data path and 8-bit support and peripheral chips. The prefetch queue of the 8088 was shortened to four bytes from the 8086's six bytes, and the prefetch algorithm was modified to adapt to the narrower bus; these modifications of the basic 8086 design were among the first jobs assigned to Intel's then-new design office and laboratory in Haifa. Variants of the 8088 with more than 5 MHz maximal clock frequency include the 8088-2, fabricated using Intel's new enhanced nMOS process called HMOS and specified for a maximal frequency of 8 MHz.
It was followed by the 80C88, a fully static CHMOS design that could operate at clock speeds from 0 to 8 MHz. There were also several other, more or less similar, variants from other manufacturers. For instance, the NEC V20 was a pin-compatible and faster variant of the 8088, designed and manufactured by NEC; successive NEC 8088-compatible processors would run at up to 16 MHz. In 1984, Commodore International signed a deal to manufacture the 8088 for use in a licensed Dynalogic Hyperion clone, in a move regarded as signaling a major new direction for the company. When announced, the list price of the 8088 was US$124.80. The 8088 is architecturally similar to the 8086; the main difference is the external data bus, which is 8 bits wide instead of 16. All of the other pins of the device perform the same function as they do on the 8086, with two exceptions. First, pin 34 is no longer BHE; instead it outputs a status signal, SS0, which combined with the IO/M and DT/R signals allows the bus cycles to be decoded. Second, the pin that signals whether a memory access or input/output access is being made has had its sense reversed.
The pin on the 8088 is IO/M; on the 8086 it is M/IO. The reason for the reversal is that it makes the 8088 compatible with the 8085. Depending on the clock frequency, the number of memory wait states, and the characteristics of the particular application program, the average performance of the Intel 8088 ranged from 0.33 to 1 million instructions per second. Meanwhile, the MOV reg,reg and ALU reg,reg instructions, taking two and three cycles respectively, yielded an absolute peak performance of between 1⁄3 and 1⁄2 MIPS per MHz, that is, somewhere in the range of 3–5 MIPS at 10 MHz. The speed of the execution unit and the bus of the 8086 CPU were well balanced; cutting the bus down to eight bits made it a serious bottleneck in the 8088. With the speed of instruction fetch reduced by 50% in the 8088 as compared to the 8086, a sequence of fast instructions can quickly drain the four-byte prefetch queue; when the queue is empty, an instruction takes as long to fetch as it does to execute. Both the 8086 and 8088 take four clock cycles to complete a bus cycle.
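The peak-MIPS figures quoted above follow from simple arithmetic, which can be checked directly (a sketch; the 2- and 3-cycle instruction timings are those given in the text):

```python
# Back-of-the-envelope check of the peak-throughput figures quoted above:
# register-to-register MOV and ALU instructions take 2 and 3 clock cycles,
# which bounds peak throughput per MHz, and hence peak MIPS at 10 MHz.

def peak_mips(clock_mhz, cycles_per_instruction):
    cycles_per_second = clock_mhz * 1_000_000
    return cycles_per_second / cycles_per_instruction / 1_000_000

fastest = peak_mips(10, 2)  # 2-cycle instructions: 5.0 MIPS at 10 MHz
slowest = peak_mips(10, 3)  # 3-cycle instructions: about 3.33 MIPS at 10 MHz
print(f"{slowest:.2f} to {fastest:.2f} MIPS at 10 MHz")
```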
Therefore, for example, a two-byte shift or rotate instruction, which takes the EU only two clock cycles to execute, takes eight clock cycles to complete if it is not in the prefetch queue. A sequence of such fast instructions prevents the queue from being filled as fast as it is drained. In general, because so many basic instructions execute in fewer than four clocks per instruction byte (including all the ALU and data-movement instructions on register operands, and some of these on memory operands), it is impossible to avoid idling the EU of the 8088 at least a quarter of the time while executing useful real-world programs, and it is not hard to idle it half the time. In short, an 8088 runs at about half the speed of an 8086 clocked at the same rate, because of the bus bottleneck. A side effect of the 8088 design, with its slow bus and small prefetch queue, is that the speed of code execution can depend on instruction order. When programming the 8088 for CPU efficiency, it is vital to interleave long-running instructions with short ones whenever possible.
For example, a repeated string operation or a shift by three or more bits takes long enough to allow time for the 4-byte prefetch queue to refill. If short instructions are placed between slower instructions like these, the short ones can execute at full speed out of the queue. If, on the other hand, the slow instructions are executed sequentially, back to back, then after the first of them the bus unit will be forced to idle because the queue will be full, with the consequence that more of the faster instructions that follow will suffer fetch delays that might have been avoidable. As some instructions, such as single-bit-position shifts and rotates, take four times as long to fetch as to execute, the overall effect of poor instruction ordering can be a significant slowdown.
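The fetch-versus-execute bottleneck described above can be sketched with a simplified lower-bound model (an illustration, not a cycle-accurate simulation; the 2-byte, 3-cycle ALU instruction figures come from the text above, and the per-byte fetch rates are approximations):

```python
# A toy lower-bound model of why the 8088's 8-bit bus roughly halves
# throughput on fast code: over a long run, execution time is at least the
# larger of total EU cycles and total fetch cycles. The 8088 needs about
# 4 clocks per fetched instruction byte; the 8086, with its 16-bit bus,
# about 2. (Queue dynamics and instruction overlap are ignored here.)

def run_time(program, cycles_per_byte):
    """program: list of (instruction_length_bytes, eu_execute_cycles)."""
    eu_cycles = sum(cycles for _, cycles in program)
    fetch_cycles = sum(length for length, _ in program) * cycles_per_byte
    return max(eu_cycles, fetch_cycles)  # whichever unit is the bottleneck

# 100 two-byte ALU reg,reg instructions, 3 EU cycles each
fast_code = [(2, 3)] * 100
print(run_time(fast_code, cycles_per_byte=4))  # 8088: 800 cycles, fetch-bound
print(run_time(fast_code, cycles_per_byte=2))  # 8086: 400 cycles
```

The 2:1 ratio on this fetch-bound sequence matches the text's observation that an 8088 runs fast code at about half the speed of an 8086 at the same clock.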
Graphical user interface
The graphical user interface (GUI) is a form of user interface that allows users to interact with electronic devices through graphical icons and visual indicators such as secondary notation, instead of text-based user interfaces, typed command labels or text navigation. GUIs were introduced in reaction to the perceived steep learning curve of command-line interfaces, which require commands to be typed on a computer keyboard. The actions in a GUI are performed through direct manipulation of the graphical elements. Beyond computers, GUIs are used in many handheld mobile devices such as MP3 players, portable media players, gaming devices, and smaller household and industrial controls. The term GUI tends not to be applied to other lower-display-resolution types of interfaces, such as video games, or to interfaces not based on flat screens, like volumetric displays, because the term is restricted to the scope of two-dimensional display screens able to describe generic information, in the tradition of the computer science research at the Xerox Palo Alto Research Center.
Designing the visual composition and temporal behavior of a GUI is an important part of software application programming in the area of human–computer interaction. Its goal is to enhance the efficiency and ease of use for the underlying logical design of a stored program, a design discipline named usability. Methods of user-centered design are used to ensure that the visual language introduced in the design is well tailored to the tasks. The visible graphical interface features of an application are sometimes referred to as chrome or GUI. Users interact with information by manipulating visual widgets that allow for interactions appropriate to the kind of data they hold; the widgets of a well-designed interface are selected to support the actions necessary to achieve the goals of users. A model–view–controller architecture allows flexible structures in which the interface is independent of, and indirectly linked to, application functions, so the GUI can be customized easily. This allows users to select or design a different skin at will, and eases the designer's work to change the interface as user needs evolve.
Good user interface design relates more to users, less to system architecture. Large widgets, such as windows, provide a frame or container for the main presentation content such as a web page, email message or drawing; smaller ones act as user-input tools. A GUI may be designed for the requirements of a vertical market as an application-specific graphical user interface. Examples include automated teller machines, point-of-sale touchscreens at restaurants, self-service checkouts used in retail stores, airline self-ticketing and check-in, information kiosks in public spaces like train stations or museums, and monitors or control screens in embedded industrial applications which employ a real-time operating system. Cell phones and handheld game systems also employ application-specific touchscreen GUIs. Newer automobiles use GUIs in their navigation systems and multimedia centers, or navigation-multimedia center combinations. A GUI uses a combination of technologies and devices to provide a platform that users can interact with, for the tasks of gathering and producing information.
A series of elements conforming to a visual language have evolved to represent information stored in computers. This makes it easier for people with few computer skills to use computer software. The most common combination of such elements in GUIs is the windows, icons, menus, pointer (WIMP) paradigm of personal computers. The WIMP style of interaction uses a virtual input device to represent the position of a pointing device, most commonly a mouse, and presents information organized in windows and represented with icons. Available commands are compiled together in menus, and actions are performed by making gestures with the pointing device. A window manager facilitates the interactions between windows and the windowing system; the windowing system handles hardware devices such as pointing devices and graphics hardware, as well as the positioning of the pointer. In personal computers, all these elements are modeled through a desktop metaphor to produce a simulation called a desktop environment, in which the display represents a desktop on which documents and folders of documents can be placed.
Window managers and other software combine to simulate the desktop environment with varying degrees of realism. Smaller mobile devices such as personal digital assistants and smartphones use the WIMP elements with different unifying metaphors, due to constraints in space and available input devices. Applications for which WIMP is not well suited may use newer interaction techniques, collectively termed post-WIMP user interfaces. As of 2011, some touchscreen-based operating systems such as Apple's iOS and Android use the class of GUIs named post-WIMP. These support styles of interaction using more than one finger in contact with a display, which allows actions such as pinching and rotating, which are unsupported by one pointer and mouse. Human interface devices for efficient interaction with a GUI include a computer keyboard, used together with keyboard shortcuts, and pointing devices for cursor control: mouse, pointing stick, trackball, as well as virtual keyboards and head-up displays. There are also actions performed by programs that affect the GUI.
For example, there are components like inotify or D-Bus to facilitate communication between computer programs. Ivan Sutherland developed Sketchpad in 1963, widely held as the first graphical computer-aided design program.