Xeon is a brand of x86 microprocessors designed and marketed by Intel, targeted at the non-consumer workstation, server, and embedded system markets. It was introduced in June 1998. Xeon processors are based on the same architecture as regular desktop-grade CPUs, but have advanced features such as support for ECC memory, higher core counts, support for larger amounts of RAM, larger cache memory, and extra provision for enterprise-grade reliability, availability and serviceability (RAS) features responsible for handling hardware exceptions through the Machine Check Architecture. Thanks to these RAS features, Xeon processors can, depending on the type and severity of the Machine Check Exception, safely continue execution where a normal processor cannot. Some support multi-socket systems with two, four, or eight sockets through use of the QuickPath Interconnect bus. Shortcomings that make Xeon processors unsuitable for most consumer-grade desktop PCs include lower clock rates at the same price point, the absence of an integrated GPU, and lack of support for overclocking.
Despite such disadvantages, Xeon processors have long been popular with desktop users, owing to their higher core-count potential and their higher performance-to-price ratio versus the Core i7 in terms of the total computing power of all cores. Since most Intel Xeon CPUs lack an integrated GPU, systems built with these processors require a discrete graphics card or a separate GPU if computer monitor output is desired. Intel Xeon is a distinct product line from the similarly named Intel Xeon Phi: the first-generation Xeon Phi is a different type of device, more comparable to a graphics card, while in its second generation the Xeon Phi evolved into a main processor more similar to the Xeon, and it is x86-compatible. The Xeon brand has been maintained over several generations of x86-64 processors. Older models added the Xeon moniker to the end of the name of the corresponding desktop processor, but more recent models use the name Xeon on its own. Xeon CPUs have more cache than their desktop counterparts, in addition to multiprocessing capabilities.
The first Xeon-branded processor was the Pentium II Xeon, released in 1998. The Pentium II Xeon was a "Deschutes" Pentium II with a 512 kB, 1 MB, or 2 MB L2 cache. The L2 cache was implemented with custom 512 kB SRAMs developed by Intel; the number of SRAMs depended on the amount of cache: a 512 kB configuration required one SRAM, a 1 MB configuration two, and a 2 MB configuration four SRAMs, mounted on both sides of the PCB. Each SRAM was a 12.90 mm by 17.23 mm die fabricated in a 0.35 µm four-layer-metal CMOS process and packaged in a cavity-down wire-bonded land grid array. The additional cache required a larger module, and thus the Pentium II Xeon used a larger slot, Slot 2. It was supported by the 440GX dual-processor workstation chipset and the 450NX quad- or octo-processor chipset. In 1999, the Pentium II Xeon was replaced by the Pentium III Xeon. Reflecting the incremental changes from the Pentium II "Deschutes" core to the Pentium III "Katmai" core, the first Pentium III Xeon, named "Tanner", was identical to its predecessor except for the addition of Streaming SIMD Extensions and a few cache-controller improvements.
The product codes for Tanner mirrored those of Katmai. The second version, named "Cascades", was based on the Pentium III "Coppermine" core. The "Cascades" Xeon used a 133 MHz bus and a small 256 kB on-die L2 cache, resulting in the same capabilities as the Slot 1 Coppermine processors, which were capable of dual-processor but not quad-processor operation. To improve this situation, Intel released another version, also named "Cascades" but referred to as "Cascades 2 MB", which came with 2 MB of L2 cache; its bus speed was fixed at 100 MHz. The product codes for Cascades mirrored those of Coppermine. In mid-2001, the Xeon brand was introduced. The initial variant, which used the new NetBurst microarchitecture and was codenamed "Foster", was different from the desktop Pentium 4. It was a decent chip for workstations, but for server applications it was consistently outperformed by the older Cascades cores with a 2 MB L2 cache and by AMD's Athlon MP. Combined with the need to use expensive Rambus DRAM, Foster's sales were somewhat unimpressive.
At most two Foster processors could be accommodated in a symmetric multiprocessing system built with a mainstream chipset, so a second version was introduced with a 1 MB L3 cache and the "Jackson" Hyper-Threading capability. This improved performance, but not enough to lift it out of third place, and it was priced much higher than the dual-processor versions. Foster shared the 80528 product code with Willamette. In 2002, Intel released a 130 nm version of the Xeon-branded CPU, codenamed "Prestonia", with a 512 kB L2 cache; it was based on the "Northwood" Pentium 4 core. A new server chipset, the E7500, was released to support this processor in servers, and soon the bus speed was boosted.
The Intel-based iMac is a family of Macintosh desktop computers designed and sold by Apple Inc. since 2006. Pre-2009 iMac models featured either a white polycarbonate enclosure or an aluminum enclosure; the October 2009 iMac model introduced a unibody aluminum enclosure, a version of which can still be seen on the current model. The current iMacs, released since October 2012, feature a much thinner display, with the edge measuring just 5 mm. At the Macworld Conference and Expo on January 10, 2006, Steve Jobs announced that the new iMac would be the first Macintosh to use an Intel CPU, the Core Duo. The introduction of the new iMac along with the Intel-based MacBook Pro marked the start of Apple's transition to Intel processors. In the following months, the other Mac products followed: the Intel Core-powered Mac mini on February 28, 2006, the MacBook consumer line of laptop computers on May 16, 2006, the Mac Pro on August 7, 2006, and the Xserve in November 2006, completing the transition. The features and case design remained unchanged from the iMac G5.
The processor speed, according to tests run by Apple using SPEC, was declared to be two to three times that of the iMac G5. Alongside the MacBook Pro, the iMac Core Duo was Apple's first computer to feature an Intel processor instead of a PowerPC processor, and it retained the style and features of the iMac G5. In early February 2006, Apple confirmed reports of video display problems on the new Intel-based iMacs: when playing video in Apple's Front Row media browser, some 20-inch iMacs showed random horizontal lines, video tearing, and other problems. The problem was fixed with a software update. In late 2006, Apple introduced a new version of the iMac with a Core 2 Duo chip and a lower price. Apple also added a new 24-inch model with an IPS display and a resolution of 1920 × 1200, making it the first iMac able to display 1080p content at its full resolution, as well as a VESA Flat Display Mounting Interface. Except for the 17-inch 1.83 GHz processor model, this version included an 802.11n draft card.
In August 2007, Apple introduced a complete redesign of the iMac, featuring an aluminum and plastic enclosure. There is only one visible screw on the entire computer, located at the base of the iMac for accessing the memory slots; the black plastic backplate is not user-removable. The 17-inch model was removed from the lineup. In March 2009, Apple released a minor refresh of the iMac line. Changes included a fourth USB port, the replacement of the two FireWire 400 ports with one FireWire 800 port, the replacement of mini-DVI with Mini DisplayPort, and a redesigned, thinner base; the exterior design was otherwise identical to the older Intel-based iMacs. The models comprised one 20-inch configuration and three 24-inch configurations. Apple doubled the default RAM and hard-disk size on all models and moved the RAM to the DDR3 specification. This revision introduced a new, more compact Apple Keyboard that excluded the numeric keypad and forward-delete key in favor of the fn + Delete keyboard shortcut by default. Users could replace this version with a more traditional, full-size model with a numeric keypad by asking Apple to build their machine to order through its online store.
In October 2009, 16:9 aspect ratio screens were introduced in 21.5" and 27" models, replacing the 20" and 24" 16:10 aspect ratio screens of the previous generation. The back became a continuation of the aluminum body from the front and sides instead of a separate plastic backplate. Video card options switched to AMD, save for the standard onboard Nvidia card in the base smaller model. The iMac's processor selection saw a significant increase: the Intel Core i-series chips were introduced to the Mac for the first time on the higher-spec 27-inch models. Default RAM was increased across the iMac range, and with the advent of the larger screens, Apple doubled the number of memory slots from two to four. The maximum memory capacity was doubled, and for the Intel Core i-series models quadrupled, to 32 GB. The 27-inch models of the line became the first to offer Target Display Mode, allowing the iMac to be used as an external display for another Mac computer when connected via Mini DisplayPort, a feature later extended to the 21.5-inch models with the introduction of Thunderbolt.
The Late 2011 Unibody iMac was the last model to include an internal SuperDrive. In October 2012, a new iMac model was introduced that featured a smaller body depth than the previous models, measuring 5 mm at its thinnest point, and no internal SuperDrive. This was achieved by using a process called full lamination, in which the display and glass are laminated together. The 21.5 in and 27 in screens remained at their previous resolutions, 1920 × 1080 and 2560 × 1440 respectively. As with the 2009 model, the memory was upgraded. It was reported that the 21.5 in iMac would have non-replaceable soldered memory, similar to the MacBook Air and Retina display MacBook Pro, though tear-downs show that it uses removable memory; accessing the modules, however, requires ungluing the screen and removing the motherboard. The 27 in version features an access port for upgrading memory without disassembling the display. Apple upgraded the computers' processors to Intel's Ivy Bridge microarchitecture-based Core i5 and Core i7 microprocessors.
Video cards are now Nvidia as standard. USB 3.0 ports are now included for the first time. The 2012 iMac also
The Mac mini is a desktop computer made by Apple Inc., one of four desktop computers in the current Macintosh lineup alongside the iMac, iMac Pro, and Mac Pro. It uses many components featured in laptops to achieve its small size. The current Mac mini, introduced in October 2018, is the fourth generation of the product. First released in 2005, the Mac mini is Apple's only consumer desktop computer since 1998 to ship without a display, keyboard, or mouse. Apple marketed it as BYODKM (Bring Your Own Display, Keyboard, and Mouse), pitching it to users switching from a traditional Windows PC. In 2010, the third-generation Mac mini became Apple's first computer with an HDMI video port to connect to a television or other display, further positioning the unit as a home theater alternative to the Apple TV. A server version of the Mac mini, bundled with the Server edition of the OS X operating system, was offered from 2009 to 2014. A small form factor computer had been speculated about and requested long before the release of the Mac mini. Rumors predicted that the "headless iMac" would be small, include no display, and be positioned as Apple's entry-level desktop computer.
On January 10, 2005, the Mac mini was announced alongside the iPod shuffle at the Macworld Conference & Expo and was described by Apple CEO Steve Jobs at the time as "the cheapest, most affordable Mac ever". Its case measured 2.0 × 6.5 × 6.5 inches. The Mac mini is an entry-level computer intended for budget-minded customers; until the 2011 release, it had much less processing power than the other computers in the Macintosh lineup. Unlike regular desktop computers, which use standard-sized components such as 3.5-inch hard drives and full-size DIMMs, Apple uses lower-power laptop components in the Mac mini to fit all the necessary parts into the small case and to prevent overheating. With the choice of components in the older models, the machine was considered somewhat slower than standard desktop computers, and it had less storage and memory than comparable desktops. However, the 2011 upgrade addressed many of these complaints. In general, the Mac mini has been praised as an affordable computer with a solid range of features.
However, many agree that it is costly for a computer aimed at the lower segment of the market: it is possible to buy small computers at the same price with faster processors, better graphics cards, more memory, and more storage. The small size has made the Mac mini popular as a home theater solution, and its size and reliability have helped keep resale values high. On October 22, 2009, Apple introduced a new server version of the Mac mini along with revisions of the computer; this model had a second hard drive in place of the optical drive and was marketed as an affordable server for small businesses and schools. On June 15, 2010, Apple introduced the third-generation Mac mini. The new model was thinner, with a unibody aluminum case designed to be opened for RAM access, and incorporated upgraded hardware, such as an HDMI port and Nvidia GeForce 320M graphics. It also included an internal power supply. An update announced on July 20, 2011, dropped the internal CD/DVD optical drive from all versions and introduced a Thunderbolt port, Intel Core i5 processors, and either Intel HD Graphics 3000 integrated graphics or AMD Radeon HD 6630M dedicated graphics.
The Server model was upgraded to a quad-core Intel Core i7 processor, and quad-core i7 CPUs were also used in the late-2012 desktop Mac mini computers. In October 2014, Apple refreshed the line, adding Haswell CPUs, improving the graphics, and lowering the base-model price by $100. The only change to the body was the removal of the two holes used to open the case, as the RAM was no longer upgradable, being soldered to the logic board. On October 30, 2018, after four years, the Mac mini received a refresh, bringing major specification upgrades, new colors, and a switch to all-flash storage. The RAM was increased to a baseline of 8 GB, with a maximum of 64 GB of SO-DIMM DDR4, showing Apple's trend back toward user-upgradeability in its desktop models. The storage was changed to a baseline of 128 GB of flash storage, with a maximum of 2 TB. It has optional 10 Gb Ethernet, HDMI 2.0, a headphone jack, two USB 3.1 ports, and four USB-C Thunderbolt 3 ports. Bluetooth was upgraded to the 5.0 standard, and the Mac itself was made available in space gray.
The baseline retail price is $799. Missing from the 2018 model are the SD card reader, SATA drive bay, IR receiver, optical S/PDIF audio out, and audio in. The most notable feature of the Mac mini is its size: the original design measured only 2.0 × 6.5 × 6.5 inches. The exterior of the original Mac mini was made of aluminum capped with polycarbonate plastic on the top and bottom, and the original design was not meant to be upgraded by the user. The back of the machine contains the I/O ports and vents for the cooling system, and it had an external power supply rated at 85 W or 110 W. The Mac mini updated on June 15, 2010, was redesigned: slimmer than the prior models at only 1.4 inches tall, but wider at 7.7 inches on a side. The weight rose from 2.9 to 3.0 pounds. The power supply became internal as opposed to external, and the chassis no longer has the polycarbonate plastic on the bottom. The newer model, introduced July 20, 2011, has the same physical dimensions
PCI Express, abbreviated as PCIe or PCI-e, is a high-speed serial computer expansion bus standard designed to replace the older PCI, PCI-X, and AGP bus standards. It is the common motherboard interface for personal computers' graphics cards, hard drives, SSDs, and Wi-Fi and Ethernet hardware connections. PCIe has numerous improvements over the older standards, including higher maximum system bus throughput, lower I/O pin count and smaller physical footprint, better performance scaling for bus devices, a more detailed error detection and reporting mechanism, and native hot-swap functionality. More recent revisions of the PCIe standard provide hardware support for I/O virtualization. Defined by its number of lanes, the PCI Express electrical interface is also used in a variety of other standards, most notably the laptop expansion card interface ExpressCard and the computer storage interfaces SATA Express, U.2, and M.2. Format specifications are maintained and developed by the PCI-SIG, a group of more than 900 companies that also maintains the conventional PCI specifications.
Conceptually, the PCI Express bus is a high-speed serial replacement of the older PCI/PCI-X bus. One of the key differences between the PCI Express bus and the older PCI is the bus topology: because of its shared bus topology, access to the older PCI bus is arbitrated and limited to one master at a time, in a single direction. Furthermore, the older PCI clocking scheme limits the bus clock to the slowest peripheral on the bus. In contrast, PCI Express is based on a point-to-point topology, with separate serial links connecting every device to the root complex, and a PCI Express bus link supports full-duplex communication between any two endpoints, with no inherent limitation on concurrent access across multiple endpoints. In terms of bus protocol, PCI Express communication is encapsulated in packets; the work of packetizing and de-packetizing data and status-message traffic is handled by the transaction layer of the PCI Express port. Radical differences in electrical signaling and bus protocol require the use of a different mechanical form factor and expansion connectors.
At the software level, PCI Express preserves backward compatibility with PCI. The PCI Express link between two devices can vary in size from one to 32 lanes. In a multi-lane link, the packet data is striped across lanes, and peak data throughput scales with the overall link width. The lane count is automatically negotiated during device initialization and can be restricted by either endpoint. For example, a single-lane PCI Express card can be inserted into a multi-lane slot, and the initialization cycle auto-negotiates the highest mutually supported lane count. The link can also dynamically down-configure itself to use fewer lanes, providing failure tolerance in case bad or unreliable lanes are present. The PCI Express standard defines link widths of ×1, ×4, ×8, ×12, ×16, and ×32; this allows the PCI Express bus to serve both cost-sensitive applications where high throughput is not needed and performance-critical applications such as 3D graphics and enterprise storage. Slots and connectors are only defined for a subset of these widths, with link widths in between using the next larger physical slot size.
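The width negotiation described above amounts to choosing the widest lane count that both endpoints support. The following Python sketch illustrates that idea under simplified assumptions; real link training is performed in hardware by the link training state machine, and the function name and width lists here are purely illustrative:

```python
# Simplified illustration of PCI Express link-width negotiation: the link
# trains to the widest lane count supported by both endpoints.
# (Illustrative only; real training is done by hardware state machines.)

def negotiate_width(endpoint_a_widths, endpoint_b_widths):
    """Return the highest link width supported by both endpoints."""
    common = set(endpoint_a_widths) & set(endpoint_b_widths)
    if not common:
        raise ValueError("no mutually supported link width")
    return max(common)

# A x1 card in a x16 slot trains to x1, the highest width both sides support.
print(negotiate_width([1], [1, 4, 8, 16]))         # -> 1
# A x16 graphics card in a x16 slot trains to the full x16.
print(negotiate_width([1, 8, 16], [1, 4, 8, 16]))  # -> 16
```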
As a point of reference, a PCI-X device and a PCI Express 1.0 device using four lanes have the same peak single-direction transfer rate of 1064 MB/s. The PCI Express bus has the potential to perform better than the PCI-X bus in cases where multiple devices are transferring data simultaneously, or if communication with the PCI Express peripheral is bidirectional. PCI Express devices communicate via a logical connection called a link. A link is a point-to-point communication channel between two PCI Express ports, allowing both of them to send and receive ordinary PCI requests and interrupts. At the physical level, a link is composed of one or more lanes. Low-speed peripherals use a single-lane link, while a graphics adapter uses a much wider and therefore faster 16-lane link. A lane is composed of two differential signaling pairs, with one pair for receiving data and the other for transmitting; thus, each lane is composed of four signal traces. Conceptually, each lane is used as a full-duplex byte stream, transporting data packets in eight-bit "byte" format in both directions between the endpoints of a link.
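Because throughput scales linearly with lane count, peak link bandwidth is simple arithmetic: transfer rate per lane, times line-code efficiency (8b/10b for PCIe 1.0/2.0, 128b/130b for 3.0), times the number of lanes. The sketch below computes approximate peak rates for the early generations; the figures follow from the published per-generation transfer rates, not from this article:

```python
# Approximate peak single-direction throughput of a PCIe link, accounting
# for line-code overhead (8b/10b for 1.0/2.0, 128b/130b for 3.0).
# Rates are in MB/s with 1 MB = 10**6 bytes.

GENERATIONS = {
    # version: (transfer rate in GT/s, payload bits per transmitted bit)
    "1.0": (2.5, 8 / 10),
    "2.0": (5.0, 8 / 10),
    "3.0": (8.0, 128 / 130),
}

def lane_throughput_mb_s(gen):
    """Usable MB/s of a single lane for the given PCIe generation."""
    rate_gt_s, efficiency = GENERATIONS[gen]
    # GT/s * efficiency = usable Gbit/s; /8 -> GB/s; *1000 -> MB/s
    return rate_gt_s * efficiency / 8 * 1000

def link_throughput_mb_s(gen, lanes):
    """Usable MB/s of a link: per-lane rate times lane count."""
    return lane_throughput_mb_s(gen) * lanes

print(link_throughput_mb_s("1.0", 4))   # -> 1000.0, the PCI-X ballpark
print(link_throughput_mb_s("3.0", 16))  # roughly 15.75 GB/s
```

A PCIe 1.0 ×4 link thus lands at about 1 GB/s per direction, comparable to the 1064 MB/s peak of a 64-bit, 133 MHz PCI-X bus cited above.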
Physical PCI Express links may contain from one to 32 lanes, more precisely 1, 2, 4, 8, 12, 16, or 32 lanes. Lane counts are written with an "×" prefix, with ×16 being the largest size in common use. Lane sizes are also referred to via the terms "width" or "by"; e.g., an eight-lane slot could be referred to as a "by 8" or as "8 lanes wide". For mechanical card sizes, see below. The bonded serial bus architecture was chosen over the traditional parallel bus because of inherent limitations of the latter, including half-duplex operation, excess signal count
A desktop computer is a personal computer designed for regular use at a single location on or near a desk or table, due to its size and power requirements. The most common configuration has a case that houses the power supply and disk storage; the case may be oriented horizontally or vertically and placed either underneath, beside, or on top of a desk. Prior to the widespread use of microprocessors, a computer that could fit on a desk was considered remarkably small; early computers took up the space of a whole room, and minicomputers fit into one or a few refrigerator-sized racks. It was not until the 1970s that programmable computers appeared that could fit on top of a desk. 1970 saw the introduction of the Datapoint 2200, a "smart" computer terminal complete with keyboard and monitor that was designed to connect to a mainframe computer, though that did not stop owners from using its built-in computational abilities as a stand-alone desktop computer. The HP 9800 series, which started out as programmable calculators in 1971 but was programmable in BASIC by 1972, used a smaller version of a minicomputer design based on ROM memory, had small one-line LED alphanumeric displays, and displayed graphics with a plotter.
The Wang 2200 of 1973 had cassette tape storage. The IBM 5100 in 1975 had a small CRT display and could be programmed in BASIC and APL; these were expensive, specialized computers sold for business or scientific uses. The Apple II, TRS-80, and Commodore PET were first-generation personal home computers launched in 1977, aimed at the consumer market rather than at businessmen or computer hobbyists. Byte magazine referred to these three as the "1977 Trinity" of personal computing. Throughout the 1980s and 1990s, desktop computers became the predominant type, the most popular being the IBM PC and its clones, followed by the Apple Macintosh, with the third-placed Commodore Amiga having some success in the mid-1980s but declining by the early 1990s. Early personal computers, like the original IBM Personal Computer, were enclosed in a "desktop case", horizontally oriented to have the display screen placed on top, thus saving space on the user's actual desk; these cases had to be sturdy enough to support the weight of the CRT displays that were widespread at the time.
Over the course of the 1990s, desktop cases became less common than the more accessible tower cases, which may be located on the floor under or beside a desk rather than on it. Not only do these tower cases have more room for expansion, they also free up desk space for monitors, which were becoming larger every year. Desktop cases, particularly the compact form factors, remain popular for corporate computing environments and kiosks; some computer cases can be interchangeably positioned either horizontally or upright. Influential games such as Doom and Quake during the 1990s pushed gamers and enthusiasts to upgrade to the latest CPUs and graphics cards for their desktops in order to run these applications, though this has slowed since the late 2000s as the growing popularity of Intel integrated graphics forced game developers to scale back. Creative Technology's Sound Blaster series was the de facto standard for sound cards in desktop PCs during the 1990s until the early 2000s, when it was reduced to a niche product, as OEM desktop PCs came with sound boards integrated directly onto the motherboard.
While desktops have long been the most common configuration for PCs, by the mid-2000s growth shifted from desktops to laptops. Notably, while desktops were produced in the United States, laptops had long been produced by contract manufacturers based in Asia, such as Foxconn. This shift led to the closure of many desktop assembly plants in the United States by 2010. Another trend around this time was the increasing proportion of inexpensive base-configuration desktops being sold, hurting PC manufacturers such as Dell, whose build-to-order customization of desktops relied on upselling added features to buyers. Battery-powered portable computers had just 2% worldwide market share in 1986; however, laptops have since become popular for both business and personal use. Around 109 million notebook PCs shipped worldwide in 2007, a growth of 33% compared to 2006. In 2008, it was estimated that 145.9 million notebooks were sold and that the number would grow in 2009 to 177.7 million. The third quarter of 2008 was the first time worldwide notebook PC shipments exceeded desktops, with 38.6 million units versus 38.5 million units.
The sales breakdown of the Apple Macintosh has seen sales of desktop Macs stay roughly constant while being surpassed by those of Mac notebooks, whose sales rate has grown considerably. The change in sales of form factors is due to the desktop iMac moving from affordable to upscale, with subsequent releases considered premium all-in-ones. By contrast, the MSRPs of the MacBook laptop lines have dropped through successive generations such that the MacBook Air and MacBook Pro constitute the lowest price of entry to a Mac, with the exception of the more inexpensive Mac mini (albeit with
A microprocessor is a computer processor that incorporates the functions of a central processing unit on a single integrated circuit, or at most a few integrated circuits. The microprocessor is a multipurpose, clock-driven, register-based digital integrated circuit that accepts binary data as input, processes it according to instructions stored in its memory, and provides results as output. Microprocessors contain sequential digital logic and operate on symbols represented in the binary number system. The integration of a whole CPU onto a single or a few integrated circuits reduced the cost of processing power: integrated circuit processors are produced in large numbers by automated processes, resulting in a low unit price, and single-chip processors increase reliability because there are many fewer electrical connections that could fail. As microprocessor designs improve, the cost of manufacturing a chip stays roughly the same, in accordance with Rock's law. Before microprocessors, small computers had been built using racks of circuit boards with many medium- and small-scale integrated circuits.
Microprocessors combined this into a few large-scale ICs. Continued increases in microprocessor capacity have since rendered other forms of computers almost completely obsolete, with one or more microprocessors used in everything from the smallest embedded systems and handheld devices to the largest mainframes and supercomputers. The complexity of an integrated circuit is bounded by physical limitations: the number of transistors that can be put onto one chip, the number of package terminations that can connect the processor to other parts of the system, the number of interconnections it is possible to make on the chip, and the heat that the chip can dissipate. Advancing technology makes more powerful chips feasible to manufacture. A minimal hypothetical microprocessor might include only an arithmetic logic unit (ALU) and a control logic section. The ALU performs addition and operations such as AND or OR. Each operation of the ALU sets one or more flags in a status register, which indicate the results of the last operation.
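The ALU-plus-status-flags idea described above can be sketched in a few lines of code. The following is a simplified illustrative model, not any specific processor's instruction set; the operation names and the choice of flags (zero and carry) are assumptions made for the example:

```python
# Minimal sketch of an ALU with a status register: each operation produces
# a result and sets flags describing that result. Hypothetical 8-bit ALU.

def alu(op, a, b, width=8):
    """Perform one ALU operation; return (result, flags)."""
    mask = (1 << width) - 1        # e.g. 0xFF for an 8-bit word
    if op == "ADD":
        full = a + b
    elif op == "AND":
        full = a & b
    elif op == "OR":
        full = a | b
    else:
        raise ValueError(f"unknown operation {op!r}")
    result = full & mask
    flags = {
        "zero": result == 0,       # set when the result is zero
        "carry": full > mask,      # set when addition overflows the word
    }
    return result, flags

result, flags = alu("ADD", 0xFF, 0x01)
print(hex(result), flags)  # -> 0x0 {'zero': True, 'carry': True}
```

Later instructions (a conditional branch, for instance) would consult these flags to decide what to do next, which is exactly the role of the status register in a real processor.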
The control logic retrieves instruction codes from memory and initiates the sequence of operations required for the ALU to carry out the instruction. A single operation code might affect many individual data paths and other elements of the processor. As integrated circuit technology advanced, it became feasible to manufacture more and more complex processors on a single chip. The size of data objects became larger, and additional features were added to the processor architecture. Floating-point arithmetic, for example, was not available on 8-bit microprocessors but had to be carried out in software. Integration of the floating-point unit, first as a separate integrated circuit and later as part of the same microprocessor chip, sped up floating-point calculations. Physical limitations of integrated circuits sometimes made such practices as a bit-slice approach necessary: instead of processing all of a long word on one integrated circuit, multiple circuits in parallel processed subsets of each data word. While this required extra logic to handle, for example, carry and overflow within each slice, the result was a system that could handle, say, 32-bit words using integrated circuits with a capacity for only four bits each.
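The bit-slice technique can be illustrated by building a 32-bit addition out of 4-bit pieces, chaining the carry from one slice into the next, just as a bit-slice design would wire the carry-out of one 4-bit ALU chip to the carry-in of the next. This is an illustrative software model, not a description of any particular bit-slice chip:

```python
# Sketch of the bit-slice approach: add two 32-bit words four bits at a
# time, propagating the carry between slices as chained 4-bit ALUs would.

SLICE_BITS = 4
WORD_BITS = 32

def sliced_add(a, b):
    """Add two 32-bit words in 4-bit slices; return (sum, carry_out)."""
    slice_mask = (1 << SLICE_BITS) - 1
    carry = 0
    result = 0
    for i in range(WORD_BITS // SLICE_BITS):
        shift = i * SLICE_BITS
        # One 4-bit slice: add the two nibbles plus the incoming carry.
        slice_sum = ((a >> shift) & slice_mask) + ((b >> shift) & slice_mask) + carry
        result |= (slice_sum & slice_mask) << shift
        carry = slice_sum >> SLICE_BITS   # carry-out feeds the next slice
    return result & ((1 << WORD_BITS) - 1), carry

s, c = sliced_add(0xFFFFFFFF, 1)
print(hex(s), c)  # -> 0x0 1  (wraps around with the final carry set)
s, c = sliced_add(0x12345678, 0x11111111)
print(hex(s), c)  # -> 0x23456789 0
```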
The ability to put large numbers of transistors on one chip makes it feasible to integrate memory on the same die as the processor. This CPU cache has the advantage of faster access than off-chip memory and increases the processing speed of the system for many applications. Processor clock frequency has increased more rapidly than external memory speed, so cache memory is necessary if the processor is not to be delayed by slower external memory. A microprocessor is a general-purpose entity, and several specialized processing devices have followed from it: a digital signal processor is specialized for signal processing; graphics processing units are processors designed for realtime rendering of images; other specialized units exist for video processing and machine vision. Microcontrollers integrate a microprocessor with peripheral devices for embedded systems, and systems on chip integrate one or more microprocessor or microcontroller cores. Microprocessors can be selected for differing applications based on their word size, a measure of their complexity.
Longer word sizes allow each clock cycle of a processor to carry out more computation, but correspond to physically larger integrated circuit dies with higher standby and operating power consumption. 4-, 8-, or 12-bit processors are integrated into microcontrollers operating embedded systems. Where a system is expected to handle larger volumes of data or require a more flexible user interface, 16-, 32-, or 64-bit processors are used. An 8- or 16-bit processor may be selected over a 32-bit processor for system-on-a-chip or microcontroller applications that require low-power electronics, or are part of a mixed-signal integrated circuit with noise-sensitive on-chip analog electronics such as high-resolution analog-to-digital converters, or both. Running 32-bit arithmetic on an 8-bit chip could end up using more power, as the chip must execute software with multiple instructions. Thousands of items that were traditionally not computer-related inc