Desktop publishing is the creation of documents using page layout skills on a personal computer, primarily for print. Desktop publishing software can generate layouts and produce typographic-quality text and images comparable to traditional typography and printing; this technology allows individuals, businesses, and other organizations to self-publish a wide range of printed matter. Desktop publishing is also the main reference for digital typography; when used skillfully, desktop publishing allows the user to produce a wide variety of materials, from menus to magazines and books, without the expense of commercial printing. Desktop publishing combines a personal computer and WYSIWYG page layout software to create publication documents for either large-scale publishing or small-scale local multifunction peripheral output and distribution. Desktop publishing methods provide more control over design and typography than word processing. However, word processing software has evolved to include some, though by no means all, capabilities previously available only with professional printing or desktop publishing.
The same DTP skills and software used for common paper and book publishing are sometimes used to create graphics for point-of-sale displays, promotional items, trade show exhibits, retail package designs, and outdoor signs. Although what is classified as "DTP software" is usually limited to print and PDF publications, DTP skills are not limited to print; the content produced by desktop publishers may also be exported and used for electronic media. Job descriptions that include "DTP", such as DTP artist, may require skills in producing e-books, web content, and web pages, which may involve web design or user interface design for a graphical user interface. Desktop publishing was first developed at Xerox PARC in the 1970s. A contradictory claim states that desktop publishing began in 1983 with a program developed by James Davise at a community newspaper in Philadelphia; the program, Type Processor One, ran on a PC using a graphics card for a WYSIWYG display and was offered commercially by Bestinfo in 1984.
The Macintosh computer platform was introduced by Apple with much fanfare in 1984, but at the beginning, the Mac lacked DTP capabilities. The DTP market exploded in 1985 with the introduction in January of the Apple LaserWriter printer and in July of PageMaker software from Aldus, which became the standard software application for desktop publishing. With its advanced layout features, PageMaker relegated word processors like Microsoft Word to the mere composition and editing of purely textual documents. The term "desktop publishing" is attributed to Aldus founder Paul Brainerd, who sought a marketing catchphrase to describe the small size and relative affordability of this suite of products, in contrast to the expensive commercial phototypesetting equipment of the day. Before the advent of desktop publishing, the only option available to most people for producing typed documents was a typewriter, which offered only a handful of typefaces and one or two font sizes. Indeed, one popular desktop publishing book was entitled The Mac is not a typewriter, and it had to explain how a Mac could do so much more than a typewriter.
The ability to create WYSIWYG page layouts on screen and print pages containing text and graphical elements at crisp 300 dpi resolution was revolutionary for both the typesetting industry and the personal computer industry. Early 1980s desktop publishing was a primitive affair. Users of the PageMaker-LaserWriter-Macintosh 512K system endured frequent software crashes, a cramped display on the Mac's tiny 512 x 342 1-bit monochrome screen, the inability to control letter-spacing and other typographic features, and discrepancies between the screen display and printed output. However, it was a revolutionary combination at the time and was received with considerable acclaim. Behind-the-scenes technologies developed by Adobe Systems set the foundation for professional desktop publishing applications; the LaserWriter and LaserWriter Plus printers included high-quality, scalable Adobe PostScript fonts built into their ROM memory. The LaserWriter's PostScript capability allowed publication designers to proof files on a local printer and then print the same file at DTP service bureaus using optical-resolution 600+ ppi PostScript printers such as the Linotronic imagesetters.
The subsequently released Macintosh II was much more suitable for desktop publishing because of its greater expandability, support for large color multi-monitor displays, and its SCSI storage interface, which allowed fast high-capacity hard drives to be attached to the system. Macintosh-based systems continued to dominate the market into 1986, when the GEM-based Ventura Publisher was introduced for MS-DOS computers. PageMaker's pasteboard metaphor simulated the process of creating layouts manually, but Ventura Publisher automated the layout process through its use of tags and style sheets and automatically generated indices and other body matter; this made it suitable for manuals and other long-format documents. Desktop publishing moved into the home market in 1986 with Professional Page for the Amiga, Publishing Partner for the Atari ST, GST's Timeworks Publisher on the PC and Atari ST, and Calamus for the Atari TT030. Software was also published for 8-bit home computers.
Apple Inc. is an American multinational technology company headquartered in Cupertino, California, that designs, develops, and sells consumer electronics, computer software, and online services. It is considered one of the Big Four of technology along with Amazon, Google, and Facebook. The company's hardware products include the iPhone smartphone, the iPad tablet computer, the Mac personal computer, the iPod portable media player, the Apple Watch smartwatch, the Apple TV digital media player, and the HomePod smart speaker. Apple's software includes the macOS and iOS operating systems, the iTunes media player, the Safari web browser, and the iLife and iWork creativity and productivity suites, as well as professional applications like Final Cut Pro, Logic Pro, and Xcode. Its online services include the iTunes Store, the iOS App Store, the Mac App Store, Apple Music, Apple TV+, iMessage, and iCloud. Other services include Apple Store, Genius Bar, AppleCare, Apple Pay, Apple Pay Cash, and Apple Card. Apple was founded by Steve Jobs, Steve Wozniak, and Ronald Wayne in April 1976 to develop and sell Wozniak's Apple I personal computer, though Wayne sold his share back within 12 days.
It was incorporated as Apple Computer, Inc. in January 1977, and sales of its computers, including the Apple II, grew quickly. Within a few years, Jobs and Wozniak had hired a staff of computer designers and had a production line. Apple went public in 1980 to instant financial success. Over the next few years, Apple shipped new computers featuring innovative graphical user interfaces, such as the original Macintosh in 1984, and Apple's marketing advertisements for its products received widespread critical acclaim. However, the high price of its products and limited application library caused problems, as did power struggles between executives. In 1985, Wozniak departed Apple amicably and remained an honorary employee, while Jobs and others resigned to found NeXT. As the market for personal computers expanded and evolved through the 1990s, Apple lost market share to the lower-priced duopoly of Microsoft Windows on Intel PC clones. The board recruited CEO Gil Amelio for what would be a 500-day effort to rehabilitate the financially troubled company, reshaping it with layoffs, executive restructuring, and product focus.
In 1997, he led Apple to buy NeXT, solving the failed operating system strategy and bringing Jobs back. Jobs gradually regained leadership status, becoming CEO in 2000. Apple swiftly returned to profitability under the revitalizing Think different campaign, as Jobs rebuilt Apple's status by launching the iMac in 1998, opening the retail chain of Apple Stores in 2001, and acquiring numerous companies to broaden the software portfolio. In January 2007, Jobs renamed the company Apple Inc., reflecting its shifted focus toward consumer electronics, and launched the iPhone to great critical acclaim and financial success. In August 2011, Jobs resigned as CEO due to health complications, and Tim Cook became the new CEO. Two months later, Jobs died, marking the end of an era for the company. Apple is well known for its size and revenues; its worldwide annual revenue totaled $265 billion for the 2018 fiscal year. Apple is the world's largest information technology company by revenue and the world's third-largest mobile phone manufacturer after Samsung and Huawei.
In August 2018, Apple became the first public U.S. company to be valued at over $1 trillion. The company employs 123,000 full-time employees and maintains 504 retail stores in 24 countries as of 2018; it operates the iTunes Store, the world's largest music retailer. As of January 2018, more than 1.3 billion Apple products are in use worldwide. The company enjoys a high level of brand loyalty and is ranked as the world's most valuable brand. However, Apple receives significant criticism regarding the labor practices of its contractors, its environmental practices, and unethical business practices, including anti-competitive behavior, as well as the origins of source materials. Apple Computer Company was founded on April 1, 1976, by Steve Jobs, Steve Wozniak, and Ronald Wayne. The company's first product was the Apple I, a computer designed and hand-built by Wozniak and first shown to the public at the Homebrew Computer Club. The Apple I was sold as a motherboard only, a base-kit concept that would not be marketed as a complete personal computer today.
The Apple I went on sale in July 1976 and was market-priced at $666.66. Apple Computer, Inc. was incorporated on January 3, 1977, without Wayne, who had left and sold his share of the company back to Jobs and Wozniak for $800 only twelve days after having co-founded Apple. Multimillionaire Mike Markkula provided essential business expertise and funding of $250,000 during the incorporation of Apple. During the first five years of operations, revenues grew exponentially, doubling about every four months. Between September 1977 and September 1980, yearly sales grew from $775,000 to $118 million, an average annual growth rate of 533%. The Apple II, invented by Wozniak, was introduced on April 16, 1977, at the first West Coast Computer Faire. It differed from its major rivals, the TRS-80 and Commodore PET, because of its character cell-based color graphics and open architecture. While early Apple II models used ordinary cassette tapes as storage devices, they were superseded by the introduction of a 5 1⁄4-inch floppy disk drive and interface called the Disk II.
The Apple II was chosen to be the desktop platform for the first "killer app" of the business world: VisiCalc, a spreadsheet program. VisiCalc created a business market for the Apple II and gave home users an additional reason to buy one: compatibility with the office. Before VisiCalc, Apple had been a distant third-place contender to Commodore and Tandy.
Subpixel rendering is a way to increase the apparent resolution of a computer's liquid crystal display (LCD) or organic light-emitting diode (OLED) display by rendering pixels to take into account the screen type's physical properties. It takes advantage of the fact that each pixel on a color LCD is composed of individual red, green, and blue subpixels (or subpixels of other colors) to anti-alias text with greater detail or to increase the resolution of all image types on layouts which are designed to be compatible with subpixel rendering. A single pixel on a color subpixelated display is made of several color primaries, typically three colored elements, ordered either as blue, green, and red, or as red, green, and blue. Some displays have more than three primaries, often called MultiPrimary, such as the combination of red, green, blue, and yellow; red, green, blue, and white; or red, green, blue, yellow, and cyan. These pixel components, sometimes called subpixels, appear as a single color to the human eye because of blurring by the optics and spatial integration by nerve cells in the eye.
The components are easily visible when viewed with a small magnifying glass, such as a loupe. Over a certain resolution threshold the colors in the subpixels are not visible, but the relative intensity of the components shifts the apparent position or orientation of a line. Subpixel rendering is better suited to some display technologies than others; the technology is well-suited to LCDs and other technologies where each logical pixel corresponds directly to three or more independent colored subpixels, but less so for CRTs. In a CRT the light from the pixel components spreads across pixels, and the outputs of adjacent pixels are not independent. If a designer knew the exact properties of the display's electron beams and aperture grille, subpixel rendering might have some advantage, but the properties of the CRT components, coupled with the alignment variations that are part of the production process, make subpixel rendering less effective for these displays. The technique should have good application to organic light-emitting diodes and other display technologies that organize pixels the same way as LCDs.
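As an illustration of the idea, here is a minimal sketch (a hypothetical example, not any vendor's actual algorithm) of coverage-based rendering on an RGB-stripe LCD: the shape is sampled three times per pixel horizontally, once at the center of each subpixel, effectively tripling the horizontal sampling resolution.

    # Minimal subpixel-rendering sketch for an RGB-stripe LCD.
    # Hypothetical example: coverage() and the stripe layout are assumptions.

    def coverage(x: float, y: float) -> float:
        """Toy scene: a diagonal half-plane, 1.0 inside and 0.0 outside."""
        return 1.0 if x + 3.0 * y < 12.0 else 0.0

    def render_subpixel(width: int, height: int):
        """One coverage sample per subpixel: R at x+1/6, G at x+3/6, B at x+5/6."""
        image = []
        for py in range(height):
            row = []
            for px in range(width):
                r = coverage(px + 1 / 6, py + 0.5)
                g = coverage(px + 3 / 6, py + 0.5)
                b = coverage(px + 5 / 6, py + 0.5)
                row.append((r, g, b))
            image.append(row)
        return image

A production renderer such as ClearType would additionally run a color-balancing filter across neighboring subpixels to keep the finer positioning from appearing as color fringes.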
The origin of subpixel rendering as used today remains controversial. Apple, IBM, and Microsoft patented various implementations with certain technical differences owing to the different purposes their technologies were intended for. Microsoft has several patents in the United States on subpixel rendering technology for text rendering on RGB stripe layouts; the patents 6,219,025, 6,239,783, 6,307,566, 6,225,973, 6,243,070, 6,393,145, 6,421,054, 6,282,327, and 6,624,828 were filed between 1998-10-07 and 1999-10-07, and thus should expire by 2019-10-07. This caused FreeType, the library used by most current software on the X Window System, to disable this functionality by default, though most modern Linux distributions turn it back on. Apple was able to use it in Mac OS X due to a patent cross-licensing agreement. It is sometimes claimed, following an analysis by Steve Gibson, that the Apple II, introduced in 1977, supports an early form of subpixel rendering in its high-resolution graphics mode. However, the method Gibson describes can be viewed as a limitation of the way the machine generates color, rather than as a technique intentionally exploited by programmers to increase resolution.
David Turner of the FreeType project criticized Gibson's theory as to the invention, at least as far as patent law is concerned, in the following way: "For the record, the Wozniak patent is explicitly referenced in them, and the claims are worded to avoid colliding with it." Turner further explains his view: under the current US regime, any minor improvement to a previous technique can be considered an "invention" and "protected" by a patent under the right circumstances, and if we look at the Microsoft patents, we see that the Wozniak patent covering the Apple II's display technique is listed first in the patents' citations. This shows that both Microsoft and the patent examiner who granted the patents were aware of this "prior art". The bytes that comprise the Apple II high-resolution screen buffer contain seven visible bits and a flag bit used to select between the purple/green or blue/orange color sets. Each pixel, since it is represented by a single bit, is either on or off. Color is instead created as an artifact of the NTSC color encoding scheme, determined by horizontal position: pixels with even horizontal coordinates are always purple, while odd pixels are always green.
Two lit pixels next to each other are always white, regardless of whether the pair is even/odd or odd/even, and irrespective of the value of the flag bit. The foregoing is only an approximation of the true interplay of the digital and analog behavior of the Apple's video output circuits on one hand and the properties of real NTSC monitors on the other. However, this approximation is what most programmers of the time would have had in mind while working with the Apple's high-resolution mode. Gibson's example claims that because two adjacent bits make a white block, there are in fact two bits per pixel: one which activates the purple left half of the pixel, and the other which activates the green right half of the pixel. If the programmer instead activates the green right half of a pixel and the purple left half of the next pixel, the result is a white block shifted half a pixel to the right.
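The even-purple/odd-green rule described above is simple enough to simulate. The following sketch is a toy model (it ignores the flag bit and real NTSC signal behavior) that maps one row of hi-res bits to the approximate colors a programmer of the era would have expected:

    # Toy simulation of the Apple II hi-res artifact-color rule described
    # above: even-column lit pixels show purple, odd-column lit pixels show
    # green, and two adjacent lit pixels both show white.

    def artifact_colors(bits: list[int]) -> list[str]:
        colors = []
        for x, bit in enumerate(bits):
            if not bit:
                colors.append("black")
            elif (x > 0 and bits[x - 1]) or (x + 1 < len(bits) and bits[x + 1]):
                colors.append("white")  # adjacent lit pixels fuse to white
            else:
                colors.append("purple" if x % 2 == 0 else "green")
        return colors

    print(artifact_colors([1, 0, 1, 1, 0, 1]))
    # ['purple', 'black', 'white', 'white', 'black', 'green']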
Morphing is a special effect in motion pictures and animations that changes one image or shape into another through a seamless transition. Most often it is used to depict one person or thing turning into another, or as part of a surreal sequence. Traditionally such a depiction would be achieved through cross-fading techniques on film. Since the early 1990s, this has been replaced by computer software to create more realistic transitions. Long before digital morphing, several techniques were used for similar image transformations; some of those techniques are closer to a matched dissolve, a gradual change between two pictures without warping the shapes in the images, while others did change the shapes in between the start and end phases of the transformation. Known since at least the end of the 16th century, the Tabula scalata is a type of painting with two images divided over a corrugated surface; each image is only visible from a certain angle. If the pictures are matched properly, a primitive type of morphing effect occurs when changing from one viewing angle to the other.
Around 1790 the French shadow play showman François Dominique Séraphin used a metal shadow figure with jointed parts to have the face of a young woman change into that of a witch. Some 19th-century mechanical magic lantern slides produced changes to the appearance of figures; for instance, a nose could grow to enormous size by sliding away a piece of glass with black paint that masked part of another glass plate with the picture. In the first half of the 19th century, "dissolving views" were a popular type of magic lantern show, showing landscapes dissolving from a day to a night version or from summer to winter. Other uses are known; for instance, the 1910 short film Narren-grappen shows a dissolve transformation of the clothing of a female character. Maurice Tourneur's 1915 film Alias Jimmy Valentine featured a subtle dissolve transformation of the main character from respected citizen Lee Randall into his criminal alter ego Jimmy Valentine. The Peter Tchaikovsky Story, a 1959 TV-series episode of Disneyland, features a swan automaton transforming into a real ballet dancer.
In 1985, Godley & Creme created a "morph" effect using analogue cross-fades on parts of different faces in the video for "Cry". In animation, the morphing effect was created long before the introduction of cinema. A phenakistiscope designed by its inventor Joseph Plateau and/or painter Jean-Baptiste Madou, printed around 1835, shows the head of a woman changing into a witch and then into a monster. Émile Cohl's 1908 animated film Fantasmagorie featured much morphing of characters and objects drawn in simple outlines. In the early 1990s, computer techniques that produced more convincing results began to be used; these involved distorting one image at the same time that it faded into another, through marking corresponding points and vectors on the "before" and "after" images used in the morph. For example, one would morph one face into another by marking key points on the first face, such as the contour of the nose or the location of an eye, and marking where these same points existed on the second face; the computer would then distort the first face to take on the shape of the second face at the same time that it faded between the two faces.
To compute the transformation of image coordinates required for the distortion, the algorithm of Beier and Neely can be used. In or before 1986, computer graphics company Omnibus created a digital animation for a Tide commercial with a Tide detergent bottle smoothly morphing into the shape of the United States; the effect was programmed by Bob Hoffman. Omnibus re-used the technique in the movie Flight of the Navigator, which featured a shape-shifting spaceship. The plaster cast of a model of the spaceship was scanned and digitally modified with techniques that included a reflection mapping technique, also developed by programmer Bob Hoffman. The 1986 movie The Golden Child implemented rather crude digital morphing effects from animal to human and back. Willow featured a more detailed digital morphing sequence with a person changing into different animals. A similar process was used a year later in Indiana Jones and the Last Crusade to create Walter Donovan's gruesome demise. Both effects were created by Industrial Light & Magic using grid warping techniques developed by Tom Brigham and Doug Smythe.
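To make the point-and-vector description concrete, here is a minimal sketch of a Beier-Neely-style morph, simplified to a single corresponding feature line per image, grayscale numpy arrays, and nearest-neighbor sampling (real implementations use many weighted line pairs and smoother interpolation); the image arrays and line endpoints are hypothetical inputs:

    import numpy as np

    def perp(v):
        """Rotate a 2-D vector by 90 degrees."""
        return np.array([-v[1], v[0]])

    def warp(src, p_dst, q_dst, p_src, q_src):
        """Single-line-pair Beier-Neely warp: for each destination pixel,
        find the corresponding source location and copy its value."""
        h, w = src.shape
        out = np.zeros_like(src)
        d_dst, d_src = q_dst - p_dst, q_src - p_src
        len_dst = np.linalg.norm(d_dst)
        len_src = np.linalg.norm(d_src)
        for y in range(h):
            for x in range(w):
                xv = np.array([x, y], dtype=float)
                u = (xv - p_dst) @ d_dst / len_dst**2    # position along the line
                v = (xv - p_dst) @ perp(d_dst) / len_dst # signed distance from it
                s = p_src + u * d_src + v * perp(d_src) / len_src
                sx = int(np.clip(round(s[0]), 0, w - 1))
                sy = int(np.clip(round(s[1]), 0, h - 1))
                out[y, x] = src[sy, sx]
        return out

    def morph(img_a, img_b, line_a, line_b, t):
        """Warp both images toward the feature line interpolated at time
        t in [0, 1], then cross-dissolve the warped results."""
        (pa, qa), (pb, qb) = line_a, line_b
        p_t = (1 - t) * pa + t * pb
        q_t = (1 - t) * qa + t * qb
        warped_a = warp(img_a, p_t, q_t, pa, qa)
        warped_b = warp(img_b, p_t, q_t, pb, qb)
        return (1 - t) * warped_a + t * warped_b

Evaluating morph at a sequence of t values from 0 to 1 yields the frames of the transition: the feature line drags both images toward a shared intermediate shape while the cross-dissolve blends their brightness.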
In 1991, morphing appeared notably in the Michael Jackson music video "Black or White" and in the movies Terminator 2: Judgment Day and Star Trek VI: The Undiscovered Country. The first application for personal computers to offer morphing was Gryphon Software Morph on the Macintosh. Other early morphing systems included ImageMaster, MorphPlus, and CineMorph, all of which premiered for the Commodore Amiga in 1992. Other programs became available within a year, and for a time the effect became common to the point of cliché. For high-end use, Elastic Reality saw its first feature film use in In the Line of Fire and was used in Quantum Leap. At VisionArt, Ted Fay used Elastic Reality to morph Odo for Star Trek: Deep Space Nine. Elastic Reality, having become the de facto system of choice and been used in many hundreds of films, was purchased by Avid; the technology behind Elastic Reality earned two Academy Awards in 1996 for Scientific and Technical Achievement, going to Garth Dickie and Perry Kivolowitz. The effect is technically called a "spatially warped cross-dissolve".
The first social network designed for user-generated morph examples to be posted online was Galleries by Morpheus. In Taiwan, Aderans, a hair loss solutions provider, aired a TV commercial featuring a morphing sequence.
In digital signal processing, spatial anti-aliasing is a technique for minimizing the distortion artifacts known as aliasing when representing a high-resolution image at a lower resolution. Anti-aliasing is used in digital photography, computer graphics, digital audio, and many other applications. Anti-aliasing means removing signal components that have a higher frequency than the recording device is able to properly resolve; this removal is done before sampling at a lower resolution. When sampling is performed without removing this part of the signal, it causes undesirable artifacts such as the black-and-white noise near the top of figure 1-a below. In signal acquisition and audio, anti-aliasing is done using an analogue anti-aliasing filter to remove the out-of-band component of the input signal prior to sampling with an analogue-to-digital converter. In digital photography, optical anti-aliasing filters made of birefringent materials smooth the signal in the spatial optical domain; the anti-aliasing filter blurs the image in order to reduce the resolution to or below that achievable by the digital sensor.
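The "remove high frequencies, then sample" idea can be shown in one dimension. The sketch below is a deliberately crude example (a real system would use a proper low-pass filter rather than a plain box average): it averages each window of samples before decimating.

    # Tiny 1-D illustration of "filter, then sample" (hypothetical example):
    # averaging neighboring samples (a crude low-pass filter) before keeping
    # one value per window suppresses frequencies the lower rate cannot carry.

    def downsample(signal, factor=4):
        out = []
        for i in range(0, len(signal) - factor + 1, factor):
            window = signal[i:i + factor]
            out.append(sum(window) / factor)  # low-pass: average the window
        return out

    # A fast-alternating signal would alias badly if sampled naively, but
    # it averages to its mean when filtered first.
    print(downsample([0, 1, 0, 1, 0, 1, 0, 1]))  # [0.5, 0.5]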
In computer graphics, anti-aliasing improves the appearance of polygon edges, so they are not "jagged" but are smoothed out on the screen. However, it uses more video memory and computation; the level of anti-aliasing determines how smooth the result is and how large that cost becomes. Figure 1-a illustrates the visual distortion. Near the top of the image, where the checker-board is small, the image is both difficult to recognise and not aesthetically appealing. In contrast, Figure 1-b shows an anti-aliased version of the scene; the checker-board near the top blends into grey, the desired effect when the resolution is insufficient to show the detail. Near the bottom of the image, the edges appear much smoother in the anti-aliased image. Figure 1-c shows another anti-aliasing algorithm, based on the sinc filter, which is considered better than the algorithm used in 1-b. Figure 2 shows magnified portions of Figure 1-c for comparison. In Figure 1-c, anti-aliasing has interpolated the brightness of the pixels at the boundaries to produce grey pixels since the space is occupied by both black and white tiles.
These grey pixels help make the transitions appear smooth. In Figure 3, anti-aliasing was used to blend the boundary pixels of a sample graphic. Anti-aliasing is applied in rendering text on a computer screen to suggest smooth contours that better emulate the appearance of text produced by conventional ink-and-paper printing. With fonts displayed on typical LCD screens, it is common to use subpixel rendering techniques like ClearType. Sub-pixel rendering requires special colour-balanced anti-aliasing filters to turn what would be severe colour distortion into barely noticeable colour fringes. Equivalent results can be had by making individual sub-pixels addressable as if they were full pixels and supplying a hardware-based anti-aliasing filter, as is done in the OLPC XO-1 laptop's display controller. Pixel geometry affects all of this, whether the anti-aliasing and sub-pixel addressing are done in software or hardware. The most basic approach to anti-aliasing a pixel is determining what percentage of the pixel is occupied by a given region in the vector graphic - in this case a pixel-sized square, possibly transposed over several pixels - and using that percentage as the colour.
A basic plot of a single, white-on-black anti-aliased point using that method can be done as follows:

    import math

    def plot_antialiased_point(x: float, y: float):
        """Distribute a point's brightness over the up-to-four pixels it touches."""
        for rounded_x in range(math.floor(x), math.ceil(x) + 1):
            for rounded_y in range(math.floor(y), math.ceil(y) + 1):
                percent_x = 1 - abs(x - rounded_x)  # horizontal overlap
                percent_y = 1 - abs(y - rounded_y)  # vertical overlap
                percent = percent_x * percent_y     # fraction of pixel covered
                # draw_pixel must accumulate brightness; see the note below
                draw_pixel((rounded_x, rounded_y), brightness=percent)

This method is best suited for simple graphics, such as basic lines or curves, and for applications that would otherwise have to convert absolute coordinates to pixel-constrained coordinates, such as 3-D graphics. It is a fast function, but it is low-quality and gets slower as the complexity of the shape increases. For purposes requiring high-quality graphics or complex vector shapes, this will not be the best approach. Note: the draw_pixel routine above cannot blindly set the colour value to the percent calculated; it must add the new value to the existing value at that location, up to a maximum of 1. Otherwise, the brightness of each pixel will be equal to the darkest value calculated in time for that location, which produces a bad result.
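The accumulation requirement in that note takes only a few lines to satisfy; in this sketch the framebuffer dictionary and function name are assumptions made for illustration:

    framebuffer = {}  # (x, y) -> brightness in [0.0, 1.0]

    def draw_pixel(coords, brightness):
        """Add new coverage to any existing coverage, saturating at 1."""
        framebuffer[coords] = min(1.0, framebuffer.get(coords, 0.0) + brightness)

    draw_pixel((5, 5), 0.90)
    draw_pixel((5, 5), 0.05)
    print(framebuffer[(5, 5)])  # 0.95: the second point brightens, not overwrites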
For example, if one point sets a brightness level of 0.90 for a given pixel and another point calculated later barely touches that pixel and has a brightness of 0.05, the final value set for that pixel should be 0.95, not 0.05. For more sophisticated shapes, the algorithm may be generalized as rendering the shape to a pixel grid with a higher resolution than the target display surface and then using bicubic interpolation to determine the average intensity of each real pixel on the display surface. In this approach, the ideal image is regarded as a signal; the image displayed on the screen is taken as samples, at each pixel position, of a filtered version of the signal.
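A minimal sketch of that generalization follows, using plain box-filter averaging of subsamples in place of the bicubic interpolation mentioned above (the scene function is a made-up example):

    # Supersampling sketch: evaluate the scene at factor x factor subsample
    # positions inside each pixel and use the average coverage as the grey level.

    def inside_circle(x: float, y: float) -> bool:
        """Toy scene: a disc of radius 8 centered at (10, 10)."""
        return (x - 10.0) ** 2 + (y - 10.0) ** 2 <= 64.0

    def render_supersampled(width: int, height: int, factor: int = 4):
        image = []
        for py in range(height):
            row = []
            for px in range(width):
                hits = sum(
                    inside_circle(px + (i + 0.5) / factor, py + (j + 0.5) / factor)
                    for i in range(factor)
                    for j in range(factor)
                )
                row.append(hits / factor**2)  # average coverage = grey level
            image.append(row)
        return image

Pixels fully inside or outside the disc come out 1.0 or 0.0, while boundary pixels receive intermediate grey values proportional to how much of the disc they contain, which is exactly the smoothing effect described for Figure 1-c.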
Font hinting is the use of mathematical instructions to adjust the display of an outline font so that it lines up with a rasterized grid. At low screen resolutions, hinting is critical for producing legible text, and it can be accompanied by subpixel rendering for further clarity. For the purpose of on-screen text display, font hinting designates which primary pixels are interpolated to more clearly render a font. Hints are created in a font editor during the typeface design process and embedded in the font. A font can be hinted automatically or set manually. Most font editors are able to do automatic hinting, and this approach is suitable for many fonts. However, high-quality commercial fonts are often manually hinted to provide the sharpest appearance on computer displays. Verdana is one example of a font that contains a large amount of hinting data, much of it accomplished manually by type engineer Tom Rickner. One popular and recognizable form of hinting is found in the TrueType font format, released in 1991 by Apple Inc. Hinting in TrueType invokes tables of font data used to render fonts properly on screen.
One aspect of TrueType hinting is grid-fitting, which modifies the height and width of font characters so that they line up with the pixel grid of the screen display. The open-source FreeType 2 font rendering engine uses an auto-hinter when such hinting data are not present or their use is restricted by a software patent; as of 2011, the FreeType Web site states that the relevant font hinting patents have all expired, and hinting is now enabled in FreeType by default. According to the TrueType Reference Manual, font instructors must balance the following two constraints when hinting a font: at small sizes, chance effects should not be allowed to magnify small differences in the original outline design of a glyph; at large sizes, the subtlety of the original design should emerge. The Manual suggests that particular attention be paid to the cap height, x-height, and baseline, so that the font retains its normal character while not producing exaggerated effects at small sizes.
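As a toy illustration of grid-fitting (not TrueType's actual instruction set; the function and values below are invented for the example), consider snapping a vertical stem's edges to whole-pixel boundaries so the stem renders as crisp solid columns:

    # Hypothetical grid-fitting sketch: round a stem's edges to the pixel
    # grid, keeping the width at a minimum of one pixel.

    def grid_fit_stem(left, right):
        width = max(1, round(right - left))  # snap width, never below 1px
        snapped_left = round(left)           # snap position
        return snapped_left, snapped_left + width

    # A stem from x=3.3 to x=4.9 (width 1.6px) becomes x=3..5 (width 2px),
    # rendering as two solid pixel columns instead of a blurry smear.
    print(grid_fit_stem(3.3, 4.9))  # (3, 5)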
See also: Kell factor

References and external links:
- "TrueType Hinting". Microsoft Corporation. June 30, 1997. Retrieved November 6, 2007.
- An online font hinting tool.
- The burden of locked grids & blooming dots: a short video introduction to hinting by Geraldine Wade et al.
- Beat Stamm. "The Raster Tragedy at Low-Resolution Revisited: Opportunities and Challenges beyond 'Delta-Hinting'". March 2011. A revised and extended version of the original 1998 article, covering anti-aliasing (including sub-pixel rendering), opportunities made possible by anti-aliasing, challenges in the rasterizer and elsewhere, and a discussion of font hinting in the context of these opportunities and challenges.
- "FreeType and Patents". Tutorial on the DejaVu font wiki.
- "Texts Rasterization Exposures". Article from the Anti-Grain Geometry project.
MacOS is a series of graphical operating systems developed and marketed by Apple Inc. since 2001. It is the primary operating system for Apple's Mac family of computers. Within the market of desktop and home computers, by web usage, it is the second most used desktop OS, after Microsoft Windows. macOS is the second major series of Macintosh operating systems. The first is colloquially called the "classic" Mac OS, introduced in 1984, the final release of which was Mac OS 9 in 1999. The first desktop version, Mac OS X 10.0, was released in March 2001, with its first update, 10.1, arriving later that year. After this, Apple began naming its releases after big cats, which lasted until OS X 10.8 Mountain Lion. Since OS X 10.9 Mavericks, releases have been named after locations in California. Apple shortened the name to "OS X" in 2012 and changed it to "macOS" in 2016, adopting the nomenclature it was using for its other operating systems, iOS, watchOS, and tvOS. The latest version is macOS Mojave, publicly released in September 2018.
Between 1999 and 2009, Apple sold a separate series of operating systems called Mac OS X Server. The initial version, Mac OS X Server 1.0, was released in 1999 with a user interface similar to Mac OS 8.5. After this, new versions were introduced concurrently with the desktop version of Mac OS X. Beginning with Mac OS X 10.7 Lion, the server functions were made available as a separate package on the Mac App Store. macOS is based on technologies developed between 1985 and 1997 at NeXT, a company that Apple co-founder Steve Jobs created after leaving Apple. The "X" in Mac OS X and OS X is pronounced "ten", as it is the Roman numeral for 10. The X was a prominent part of the operating system's brand identity and marketing in its early years, but receded in prominence since the release of Snow Leopard in 2009. UNIX 03 certification was achieved for the Intel version of Mac OS X 10.5 Leopard, and all releases from Mac OS X 10.6 Snow Leopard up to the current version have UNIX 03 certification. macOS shares its Unix-based core, named Darwin, and many of its frameworks with iOS, tvOS, and watchOS.
A modified version of Mac OS X 10.4 Tiger was used for the first-generation Apple TV. Releases of Mac OS X from 1999 to 2005 ran on the PowerPC-based Macs of that period. After Apple announced that it was switching to Intel CPUs from 2006 onwards, versions were released for 32-bit and 64-bit Intel-based Macs. Versions from Mac OS X 10.7 Lion onward run only on 64-bit Intel CPUs (in contrast to the ARM architecture used on iOS and watchOS devices) and do not support PowerPC applications. The heritage of what would become macOS originated at NeXT, a company founded by Steve Jobs following his departure from Apple in 1985. There, the Unix-like NeXTSTEP operating system was developed and launched in 1989. The kernel of NeXTSTEP is based upon the Mach kernel, developed at Carnegie Mellon University, with additional kernel layers and low-level user space code derived from parts of BSD. Its graphical user interface was built on top of an object-oriented GUI toolkit using the Objective-C programming language. Throughout the early 1990s, Apple had tried to create a "next-generation" OS to succeed its classic Mac OS through the Taligent, Copland, and Gershwin projects, but all of them were abandoned.
This led Apple to purchase NeXT in 1996, allowing NeXTSTEP, by then called OPENSTEP, to serve as the basis for Apple's next-generation operating system. The purchase also led to Steve Jobs returning to Apple as interim and then permanent CEO, shepherding the transformation of the programmer-friendly OPENSTEP into a system that would be adopted by Apple's primary market of home users and creative professionals. The project was first code-named "Rhapsody" and then officially named Mac OS X. Mac OS X was presented as the tenth major version of Apple's operating system for Macintosh computers. Previous Macintosh operating systems were named using Arabic numerals, as with Mac OS 8 and Mac OS 9. The letter "X" in Mac OS X's name refers to the Roman numeral for ten, and it is therefore pronounced "ten" in this context. However, it is also commonly pronounced like the letter "X". The first version of Mac OS X, Mac OS X Server 1.0, was a transitional product featuring an interface resembling the classic Mac OS, though it was not compatible with software designed for the older system.
Consumer releases of Mac OS X included more backward compatibility: Mac OS applications could be rewritten to run natively via the Carbon API. The consumer version of Mac OS X was launched in 2001 with Mac OS X 10.0. Reviews were variable, with extensive praise for its sophisticated, glossy Aqua interface but criticism of its sluggish performance. With Apple's popularity at a low, the makers of several classic Mac applications such as FrameMaker and PageMaker declined to develop new versions of their software for Mac OS X. Ars Technica columnist John Siracusa, who reviewed every major OS X release up to 10.10, described the early releases in retrospect as "dog-slow, feature poor" and Aqua as "unbearably slow and a huge resource hog". Apple developed several new releases of Mac OS X in quick succession. Siracusa's review of version 10.3 noted: "It's strange to have gone from years of uncertainty and vaporware to a steady annual supply of major new operating system releases." Version 10.4 Tiger reportedly shocked executives at Microsoft by offering a number of features, such as fast file searching.