Crash was a magazine dedicated to games for the ZX Spectrum home computer. It was published by Newsfield Publications Ltd from 1984 until the company's liquidation in 1991, and then by Europress until 1992. The magazine was launched to cater for the booming Spectrum games market, and was popular owing to the quality of its writing and the distinctive, though sometimes controversial, artwork of Oliver Frey. By 1986 it had become the biggest-selling British computer magazine, with over 100,000 copies sold monthly, but it struggled towards the end of the decade after rival magazines began mounting cassettes of games on their front covers. In the 2010s, a number of retrospective issues were produced via Kickstarter campaigns. Crash originated in 1983 in Shropshire with Roger Kean, Oliver Frey and Franco Frey; the trio had met the previous year while working for newspaper publisher Alan Purnell, learning how to write and produce a magazine from scratch. Franco Frey, who had worked for an electronics company, had been asked by one of his business contacts if he could get hold of video games.
Kean remembered that "The High Street was ignorant of computer games", and they wanted to source titles and sell them. They set up a mail order catalogue called Crash Micro Games Action and advertised in contemporary computer magazines such as Computer and Video Games. It was successful, so by late 1983 they decided to launch a dedicated magazine, forming the company Newsfield to do so. Kean and Oliver Frey wanted a catchy title and chose "Crash" after J. G. Ballard's novel of the same name. Though he had played video games throughout the 1970s, the middle-aged Kean realised that the target market for the magazine was teenagers and young men, and that the writing needed to accommodate this. He hired teenage staff writer Matthew Uffindel, and the pair recruited local schoolchildren to review the games, including Ben Stone and Robin Candy. To produce screenshots, a camera was set up to directly capture the television set or monitor that the Spectrum was plugged into; the film was processed in-house and delivered to a local print shop to prepare the final page.
The first issue was intended to be published in November 1983, in time for the pre-Christmas trade, but owing to a conflict with the retailer WH Smith it was published in February the following year. The magazine kept its focus squarely on Spectrum gaming and was an instant hit thanks to Kean's writing, assisted by Uffindel. Kean and the Frey brothers would continue to be involved with the magazine throughout its lifetime. Reviewers gave their direct opinions on whether a game was good or not, regardless of advertising or any pressure from software houses. Though publishers sometimes tried to bribe the magazine's editors to give games good reviews, the young reviewers could not be swayed, and once gave a game a score as low as 9%; this honesty gave Crash a good reputation and made it influential in the games industry. If a game was awarded a "Crash Smash", the industry believed it was genuinely good and would sell well. Notable Crash Smashes included Sabre Wulf and Head over Heels, and a games compilation, "Four Crash Smashes", was produced.
In October 1986, Crash reported sales of over 100,000 copies. Its ABC figure of 101,483 copies a month for the period of January to June was claimed by the magazine to be higher than that of any other British computer magazine. By 1989, rival Spectrum magazine Your Sinclair came with a free cassette attached to the cover that contained a complete game and various demos. Crash, which had not featured cassettes on the cover, began to lag in circulation; it was relaunched that June with a free cover-mounted cassette carrying a number of complete games, which continued as a regular feature. This came at the expense of page count and editorial content, both of which were reduced. Kean was annoyed at having to put tapes on the cover to keep up with the competition, as it increased costs and obscured Frey's cover artwork. Newsfield was suffering increasing financial difficulties by the early 1990s, and the last edition of Crash published by the company appeared in September 1991. Following the company's liquidation, the magazine was relaunched by Europress that December, continuing until the final issue in April 1992.
After this, Crash was bought by the publisher of Sinclair User, who merged the two magazines. In practice, this meant little more than the appearance of the Crash logo on the front cover. In May 2016, No. 2 King Street, Ludlow was awarded a blue plaque as the premises from which Newsfield published Crash and ZZap!64 from 1984 to 1989, hiring pupils from Ludlow Church of England School alongside professional journalists. In 2017, the magazine was commemorated in a special exhibition at Ludlow Buttercross Museum documenting Newsfield's contribution to the local industry; the same year, a special edition of the magazine was issued following a Kickstarter campaign that raised £12,000. Kean, Oliver Frey and Nick Roberts all returned to contribute to this issue, and the following year a similar campaign led to the 2019 Crash annual, issue 100. Crash featured distinctive cover art drawn by Oliver Frey, much of which was published in book form for the first time in 2006. The cover of issue 18 (July 1985), which depicted a scantily clad sorceress with a man on his knees in collar and chains, was considered provocative by some shops, which moved it to the top shelf.
Issue 31 (August 1986) was criticised for a front cover featuring staff writer Hannah Smith in a swimsuit, mud-wrestling with an alien. The cover of issue 41 (June 1987) was a violent image depicting two barbarians fighting, with one about to slit the other's throat.
Bomb
A bomb is an explosive weapon that uses the exothermic reaction of an explosive material to provide a sudden and violent release of energy. Detonations inflict damage principally through ground- and atmosphere-transmitted mechanical stress, the impact and penetration of pressure-driven projectiles, pressure damage, and explosion-generated effects. Bombs have been utilized since the 11th century, starting in East Asia. The term bomb is not applied to explosive devices used for civilian purposes such as construction or mining, although the people using the devices may sometimes refer to them as a "bomb". The military use of the term "bomb", or more specifically aerial bomb action, refers to airdropped, unpowered explosive weapons most commonly used by air forces and naval aviation. Other military explosive weapons not classified as "bombs" include shells, depth charges and land mines. In unconventional warfare, other names can refer to a range of offensive weaponry; for instance, in recent Middle Eastern conflicts, homemade bombs called "improvised explosive devices" have been employed by insurgent fighters to great effect.
The word comes from the Latin bombus, which in turn comes from the Greek βόμβος, an onomatopoeic term meaning "booming" or "buzzing". Explosive bombs were used by a Jurchen Jin army against a Chinese Song city, and bombs built using bamboo tubes appear in the 11th century. Bombs made of cast-iron shells packed with explosive gunpowder date to 13th-century China; the term "thunder-crash bomb" was coined for this weapon during a Jin dynasty naval battle of 1231 against the Mongols. The History of Jin 《金史》 states that in 1232, as the Mongol general Subutai descended on the Jin stronghold of Kaifeng, the defenders had a "thunder-crash bomb" which "consisted of gunpowder put into an iron container... when the fuse was lit there was a great explosion the noise whereof was like thunder, audible for more than thirty miles, the vegetation was scorched and blasted by the heat over an area of more than half a mou. When hit, iron armour was quite pierced through." The Song dynasty official Li Zengbo wrote in 1257 that arsenals should have several hundred thousand iron bomb shells available, and that when he was in Jingzhou, about one to two thousand were produced each month for dispatch of ten to twenty thousand at a time to Xiangyang and Yingzhou.
The Ming Dynasty text Huolongjing describes the use of poisonous gunpowder bombs, including the "wind-and-dust" bomb. During the Mongol invasions of Japan, the Mongols used explosive "thunder-crash bombs" against the Japanese. Archaeological evidence of these bombs has been discovered in an underwater shipwreck off the shore of Japan by the Kyushu Okinawa Society for Underwater Archaeology, and X-rays by Japanese scientists of the excavated shells confirmed that they contained gunpowder. Explosive shock waves can cause injuries such as body displacement, internal bleeding and ruptured eardrums. Shock waves produced by explosive events have two distinct components, the positive and the negative wave: the positive wave shoves outward from the point of detonation, followed by the trailing vacuum space "sucking back" towards the point of origin as the shock bubble collapses. The greatest defense against shock injuries is distance from the source of the shock; as a point of reference, the overpressure at the Oklahoma City bombing was estimated to be in the range of 28 MPa.
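The two-phase structure described above is commonly idealised with the Friedlander waveform, in which overpressure decays exponentially and crosses into the negative "suction" phase at the end of the positive-phase duration. A minimal sketch in Python (the function name and the numeric values are illustrative, not taken from any standard library or from the incidents discussed here):

```python
import math

def friedlander_overpressure(t, peak, t_pos):
    """Idealised blast overpressure at time t (s) after the shock front arrives.

    peak  -- peak overpressure at t = 0 (Pa)
    t_pos -- positive-phase duration (s); pressure crosses zero at t = t_pos
             and is negative afterwards (the "sucking back" phase)
    """
    return peak * (1.0 - t / t_pos) * math.exp(-t / t_pos)

# Illustrative values: the wave starts at its peak, decays to zero at
# t_pos, then goes negative as the shock bubble collapses.
p_start = friedlander_overpressure(0.0, peak=2.8e7, t_pos=0.01)   # equals peak
p_after = friedlander_overpressure(0.02, peak=2.8e7, t_pos=0.01)  # negative phase
```

The single expression captures both phases, which is why the model is a common first approximation in blast-injury and structural-loading estimates.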
A thermal wave is created by the sudden release of heat caused by an explosion, and military bomb tests have documented temperatures of up to 2,480 °C. While capable of inflicting severe to catastrophic burns and causing secondary fires, thermal wave effects are considered limited in range compared to shock and fragmentation. This rule has been challenged, however, by military development of thermobaric weapons, which employ a combination of negative shock wave effects and extreme temperature to incinerate objects within the blast radius, an effect that would be fatal to humans. Fragmentation is produced by the acceleration of shattered pieces of bomb casing and adjacent physical objects. The use of fragmentation in bombs dates to the 14th century and appears in the Ming Dynasty text Huolongjing; these fragmentation bombs were filled with iron pellets and pieces of broken porcelain. Once the bomb explodes, the resulting shrapnel is capable of piercing the skin and blinding enemy soldiers. While conventionally viewed as small metal shards moving at supersonic and hypersonic speeds, fragmentation can occur on an enormous scale and travel for extensive distances.
When the SS Grandcamp exploded in the Texas City disaster on April 16, 1947, one fragment of the blast was a two-ton anchor hurled nearly two miles inland to embed itself in the parking lot of the Pan American refinery. Fragmentation should not be confused with shrapnel, which relies on the momentum of a shell to cause damage. For people close to a blast incident, such as bomb disposal technicians, soldiers wearing body armor, deminers, or individuals wearing little to no protection, there are four types of blast effect on the human body: overpressure, fragmentation, impact and heat. Overpressure refers to the sudden and drastic rise in ambient pressure that can damage the internal organs, leading to permanent damage or death. Fragmentation includes the shrapnel described above but can also include sand and vegetation from the area surrounding the blast source; this is common in anti-personnel mine blasts. The projection of materials poses a lethal threat caused by cuts in soft tissue.
MS-DOS
MS-DOS is an operating system for x86-based personal computers developed by Microsoft. Collectively, MS-DOS, its rebranding as IBM PC DOS, and some operating systems attempting to be compatible with MS-DOS are sometimes referred to as "DOS". MS-DOS was the main operating system for IBM PC compatible personal computers during the 1980s and the early 1990s, when it was superseded by operating systems offering a graphical user interface, in particular the various generations of the graphical Microsoft Windows operating system. MS-DOS grew out of an existing operating system whose rights Microsoft acquired and adapted to meet IBM's specifications; IBM released it on August 12, 1981 as PC DOS 1.0 for use in its PCs. Although MS-DOS and PC DOS were developed in parallel by Microsoft and IBM, the two products diverged after twelve years, in 1993, with recognizable differences in compatibility and capabilities. During its lifetime, several competing products were released for the x86 platform, and MS-DOS itself went through eight versions until development ceased in 2000.
MS-DOS was targeted at Intel 8086 processors running on computer hardware using floppy disks to store and access not only the operating system, but application software and user data as well. Progressive version releases delivered support for other mass storage media in greater sizes and formats, along with added feature support for newer processors and evolving computer architectures. It was the key product in Microsoft's growth from a programming language company to a diverse software development firm, providing the company with essential revenue and marketing resources. It was also the underlying basic operating system on which early versions of Windows ran as a GUI; it is a flexible operating system and consumes negligible installation space. MS-DOS was a renamed form of 86-DOS, owned by Seattle Computer Products and written by Tim Paterson. Development of 86-DOS took only six weeks, as it was a clone of Digital Research's CP/M, ported to run on 8086 processors and with two notable differences compared to CP/M.
This first version was shipped in August 1980. Microsoft, which needed an operating system for the IBM Personal Computer, hired Tim Paterson in May 1981 and bought 86-DOS 1.10 for $75,000 in July of the same year. Microsoft kept the version number but renamed it MS-DOS, and licensed MS-DOS 1.10/1.14 to IBM, which in August 1981 offered it as PC DOS 1.0 as one of three operating systems for the IBM 5150, or the IBM PC. Within a year Microsoft licensed MS-DOS to over 70 other companies. It was designed to be an OS that could run on any 8086-family computer: each computer would have its own distinct hardware and its own version of MS-DOS, similar to the situation that existed for CP/M, with MS-DOS emulating the same solution as CP/M to adapt for different hardware platforms. To this end, MS-DOS was designed with a modular structure: internal device drivers, minimally for primary disk drives and the console, integrated with the kernel and loaded by the boot loader, and installable device drivers for other devices, loaded and integrated at boot time.
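From MS-DOS 2.0 onward, the installable-driver half of this modular structure was visible to users through the CONFIG.SYS text file, which lists device drivers for the kernel to load and integrate at boot time. A minimal illustrative fragment (the paths are typical examples, and MOUSE.SYS stands in for any vendor-supplied driver, not a specific product):

```
DEVICE=C:\DOS\ANSI.SYS
DEVICE=C:\DRIVERS\MOUSE.SYS
FILES=30
BUFFERS=20
```

Internal drivers for the console and primary disk drives, by contrast, were built into the kernel image and needed no such entry.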
The OEM would use a development kit provided by Microsoft to build a version of MS-DOS with their basic I/O drivers and a standard Microsoft kernel, which they would supply on disk to end users along with the hardware. Thus, there were many different versions of "MS-DOS" for different hardware, and there was a major distinction between an IBM-compatible machine and an MS-DOS machine; some machines, like the Tandy 2000, were MS-DOS compatible but not IBM-compatible, so they could run software written for MS-DOS without dependence on the peripheral hardware of the IBM PC architecture. This design would have worked well for compatibility if application programs had only used MS-DOS services to perform device I/O, and indeed the same design philosophy is embodied in Windows NT. However, in MS-DOS's early days, the greater speed attainable by programs through direct control of hardware was of particular importance for games, which pushed the limits of their contemporary hardware. Soon an IBM-compatible architecture became the goal, and before long all 8086-family computers closely emulated IBM's hardware, so that only a single version of MS-DOS for a fixed hardware platform was needed for the market.
This version is the version of MS-DOS discussed here, as the dozens of other OEM versions of "MS-DOS" were only relevant to the systems they were designed for, and in any case were similar in function and capability to some standard version for the IBM PC—often the same-numbered version, but not always, since some OEMs used their own proprietary version numbering schemes—with a few notable exceptions. Microsoft omitted multi-user support from MS-DOS because its Unix-based operating system, Xenix, was fully multi-user; the company planned, over time, to improve MS-DOS so it would be indistinguishable from single-user Xenix, or XEDOS, which would also run on the Motorola 68000, the Zilog Z8000 and the LSI-11. Microsoft advertised MS-DOS and Xenix together, listing the shared features of its "single-user OS" and "the multi-user, multi-tasking, UNIX-derived operating system", and promising easy porting between the two.
Fortification
A fortification is a military construction or building designed for the defense of territories in warfare, and is also used to solidify rule in a region during peacetime. The term is derived from the Latin fortis ("strong") and facere ("to make"). From early history to modern times, walls have been necessary for cities to survive in an ever-changing world of invasion and conquest. Some settlements in the Indus Valley Civilization were the first small cities to be fortified, and in ancient Greece, large stone walls had been built in Mycenaean Greece, such as at the ancient site of Mycenae. A Greek phrourion was a fortified collection of buildings used as a military garrison, and is the equivalent of the Roman castellum or English fortress. These constructions served the purpose of a watch tower, guarding certain roads and lands against enemies who might threaten the kingdom. Though smaller than a real fortress, they acted as a border guard rather than a real strongpoint to watch and maintain the border. The art of setting out a military camp or constructing a fortification has traditionally been called "castrametation" since the time of the Roman legions.
Fortification is divided into two branches, permanent fortification and field fortification, with an intermediate branch known as semi-permanent fortification. Castles are fortifications regarded as distinct from the generic fort or fortress in that they are the residence of a monarch or noble and command a specific defensive territory. Roman forts and hill forts were the main antecedents of castles in Europe, which emerged in the 9th century in the Carolingian Empire, and the Early Middle Ages saw the creation of some towns built around castles. Medieval-style fortifications were made obsolete by the arrival of cannons in the 14th century. Fortifications in the age of black powder evolved into much lower structures with greater use of ditches and earth ramparts that would absorb and disperse the energy of cannon fire. Walls exposed to direct cannon fire were vulnerable, so they were sunk into ditches fronted by earth slopes to improve protection; the arrival of explosive shells in the 19th century led to yet another stage in the evolution of fortification.
Star forts did not fare well against the effects of high explosive, and the intricate arrangements of bastions, flanking batteries and carefully constructed lines of fire for the defending cannon could be disrupted by explosive shells. Steel-and-concrete fortifications were common during the 19th and early 20th centuries; however, the advances in modern warfare since World War I have made large-scale fortifications obsolete in most situations. Demilitarized zones along borders are arguably another type of fortification, although a passive kind, providing a buffer between hostile militaries. Many US military installations are known as forts; indeed, during the pioneering era of North America, many outposts on the frontiers, even non-military ones, were referred to generically as forts, while larger military installations may be called fortresses. The word fortification can also refer to the practice of improving an area's defense with defensive works; city walls, for example, are fortifications but are not called fortresses.
The art or science of laying siege to a fortification and of destroying it is called siegecraft or siege warfare and is formally known as poliorcetics, though in some texts this latter term applies to the art of building a fortification. Fortification is divided into two branches: permanent fortification and field fortification. Permanent fortifications are erected at leisure, with all the resources that a state can supply of constructive and mechanical skill, and are built of enduring materials. Field fortifications—for example breastworks—also known as fieldworks or earthworks, are extemporized by troops in the field, assisted by such local labour and tools as may be procurable and with materials that do not require much preparation, such as earth, light timber and sandbags. An example of field fortification was the construction of Fort Necessity by George Washington in 1754. There is an intermediate branch known as semi-permanent fortification; this is employed when, in the course of a campaign, it becomes desirable to protect some locality with the best imitation of permanent defences that can be made in a short time, ample resources and skilled civilian labour being available.
An example of this is the construction of Roman forts in England and in other Roman territories, where camps were set up with the intention of staying for some time, but not permanently. Castles are fortifications regarded as distinct from the generic fort or fortress in that a castle is the residence of a monarch or noble and commands a specific defensive territory; an example is the massive medieval castle of Carcassonne. From early history to modern times, walls have been a necessity for many cities. In Bulgaria, near the town of Provadia, a walled fortified settlement today called Solnitsata dates from 4700 BC; it had a diameter of about 300 feet, was home to 350 people living in two-storey houses, and was encircled by a fortified wall. The huge walls around the settlement, built of stone blocks 6 feet high and 4.5 feet thick, make it one of the earliest walled settlements in Europe, though it is younger than the walled town of Sesklo in Greece, dating from 6800 BC.
Uruk in ancient Sumer is one of the world's oldest known walled cities.
Nintendo Entertainment System
The Nintendo Entertainment System is an 8-bit home video game console developed and manufactured by Nintendo. It is a remodeled export version of the company's Family Computer platform, known in Japan as the Famicom, which launched on July 15, 1983. The NES was launched through test markets in New York City and Los Angeles in 1985, before being given a wide release in the rest of North America and parts of Europe in 1986, followed by Australia and other European countries in 1987. Brazil saw only unlicensed clones until the official local release in 1993. In South Korea, it was packaged as the Hyundai Comboy and distributed by SK Hynix, then known as Hyundai Electronics. The best-selling gaming console of its time, the NES helped revitalize the US video game industry following the North American video game crash of 1983. With the NES, Nintendo introduced a now-standard business model of licensing third-party developers, authorizing them to produce and distribute titles for Nintendo's platform.
It was succeeded by the Super Nintendo Entertainment System. Following a series of arcade game successes in the early 1980s, Nintendo made plans to create a cartridge-based console called the Famicom, short for Family Computer, with Masayuki Uemura designing the system. Original plans called for an advanced 16-bit system which would function as a full-fledged computer with a keyboard and floppy disk drive, but Nintendo president Hiroshi Yamauchi rejected this and instead decided to go for a cheaper, more conventional cartridge-based game console, as he believed that features such as keyboards and disks were intimidating to non-technophiles. A test model was constructed in October 1982 to verify the functionality of the hardware, after which work began on programming tools. Because 65xx CPUs had not been manufactured or sold in Japan up to that time, no cross-development software was available and it had to be produced from scratch. Early Famicom games were written on a system that ran on an NEC PC-8001 computer, and LEDs on a grid were used with a digitizer to design graphics, as no software design tools for this purpose existed at that time.
The code name for the project was "GameCom", but Masayuki Uemura's wife proposed the name "Famicom", arguing that "In Japan, 'pasokon' is used to mean a personal computer, but it is neither a home nor a personal computer. We could say it is a family computer." Meanwhile, Hiroshi Yamauchi decided that the console should use a red and white theme after seeing a billboard for DX Antenna which used those colors. During the creation of the Famicom, the ColecoVision, a video game console made by Coleco to compete against the Atari 2600 in the United States, was a huge influence. Takao Sawano, chief manager of the project, brought a ColecoVision home to his family, who were impressed by the system's capability to produce smooth graphics, which contrasted with the flickering and slowdown seen in Atari 2600 games. Uemura, head of Famicom development, stated that the ColecoVision set the bar that influenced how he would approach the creation of the Famicom. Original plans called for the Famicom's cartridges to be the size of a cassette tape, but they ended up being twice as big.
Careful attention was paid to the design of the cartridge connectors, since loose and faulty connections plagued arcade machines. As the cartridge needed 60 connection lines for the memory and expansion, Nintendo decided to produce its own connectors in-house rather than use ones from an outside supplier. The controllers were hard-wired to the console with no connectors, for cost reasons. The game pad controllers were more or less copied directly from the Game & Watch machines, although the Famicom design team originally wanted to use arcade-style joysticks, even taking apart ones from American game consoles to see how they worked. However, there were concerns regarding the durability of the joystick design, and fears that children might step on joysticks left on the floor. Katsuya Nakawaka attached a Game & Watch D-pad to the Famicom prototype and found that it was easy to use and caused no discomfort. Ultimately, though, they installed a 15-pin expansion port on the front of the console so that an optional arcade-style joystick could be used.
Uemura also added an eject lever to the cartridge slot, which was not strictly necessary, but he believed that children could be entertained by pressing it. He added a microphone to the second controller with the idea that it could be used to make players' voices sound through the TV speaker. The console was released on July 15, 1983 as the Family Computer for ¥14,800, alongside three ports of Nintendo's successful arcade games Donkey Kong, Donkey Kong Jr. and Popeye. The Famicom was slow to gather momentum, but following a product recall and a reissue with a new motherboard, its popularity soared, and it became the best-selling game console in Japan by the end of 1984. Encouraged by this success, Nintendo turned its attention to the North American market, entering into negotiations with Atari to release the Famicom under Atari's name as the Nintendo Advanced Video Gaming System. The deal was set to be finalized and signed at the Summer Consumer Electronics Show in June 1983. However, Atari discovered at that show that its competitor Coleco was illegally demonstrating its Coleco Adam computer with Nintendo's Donkey Kong game.
This violation of Atari's exclusive license with Nintendo to publish the game for its own computer systems delayed the implementation of Nintendo's game console marketing contract with Atari. Atari's CEO Ray Kassar was fired the next month, so the deal went nowhere, and Nintendo decided to market its system on its own.
Amstrad CPC
The Amstrad CPC is a series of 8-bit home computers produced by Amstrad between 1984 and 1990. It was designed to compete in the mid-1980s home computer market dominated by the Commodore 64 and the Sinclair ZX Spectrum, where it established itself in the United Kingdom, France and the German-speaking parts of Europe. The series spawned a total of six distinct models: the CPC464, CPC664 and CPC6128 were successful competitors in the European home computer market, while the plus models, the 464plus and 6128plus, efforts to prolong the system's lifecycle with hardware updates, were less successful, as was the attempt to repackage the plus hardware into a game console as the GX4000. The CPC models' hardware is based on the Zilog Z80A CPU, complemented with either 64 or 128 KB of RAM. Their computer-in-a-keyboard design prominently features an integrated storage device, either a compact cassette deck or a 3-inch floppy disk drive; the main units were only sold bundled with either a colour monitor or a green-screen monochrome monitor that doubles as the main unit's power supply.
Additionally, a wide range of first- and third-party hardware extensions, such as external disk drives and memory expansions, was available. The CPC series was pitched against other home computers used to play video games and enjoyed a strong supply of game software. The comparatively low price for a complete computer system with a dedicated monitor, its high-resolution monochrome text and graphics capabilities, and the possibility of running CP/M software also rendered the system attractive to business users, reflected in a wide selection of application software. During its lifetime, the CPC series sold three million units. The philosophy behind the CPC series was twofold. Firstly, the concept was that of an “all-in-one”, where the computer and its data storage device were combined in a single unit and sold with its own dedicated display monitor. Most home computers at that time, such as Sinclair’s ZX series, the Commodore 64 and the BBC Micro, relied on the use of a domestic television set and a separately connected tape recorder or disk drive.
In itself, the all-in-one concept was not new, having been seen before on business-oriented machines and the Commodore PET, but in the home computer space it predated the Apple Macintosh by a year. Secondly, Amstrad founder Alan Sugar wanted the machine to resemble a “real computer, similar to what someone would see being used to check them in at the airport for their holidays”, and not to look like “a pregnant calculator” – a reference to the Sinclair ZX81 and ZX Spectrum with their low-cost, membrane-type keyboards. The CPC 464, which featured an internal cassette tape deck, was introduced in June 1984 in the UK and sold more than two million units. Initial suggested retail prices for the CPC464 were GBP£249.00/DM899.00 with a green screen and GBP£359.00/DM1398.00 with a colour monitor. Following the introduction of the CPC6128 in late 1985, suggested retail prices for the CPC464 were cut by GBP£50.00/DM100.00. In 1990, the 464plus replaced the CPC 464 in the model line-up, and production of the CPC 464 was discontinued.
The CPC664 features 64 KB of RAM and an internal 3-inch floppy disk drive. It was introduced in May 1985 in the UK. Initial suggested retail prices for the CPC664 were GBP£339.00/DM1198.00 with a green screen and GBP£449.00/DM1998.00 with a colour monitor. After the successful release of the CPC464, consumers were asking for two improvements: more memory and an internal disk drive. For Amstrad, the latter was easier to realize. At the deliberately low-key introduction of the CPC664 in May 1985, the machine was positioned not only as the lowest-cost disk system but also as the lowest-cost CP/M 2.2 machine. In the Amstrad CPC product range the CPC664 complemented the CPC464, which was neither discontinued nor reduced in price. Compared to the CPC464, the CPC664's main unit was redesigned, not only to accommodate the floppy disk drive but also with a redesigned keyboard area. Touted as "ergonomic" by Amstrad's promotional material, the keyboard is noticeably tilted to the front, with MSX-style cursor keys above the numeric keypad.
Compared to the CPC464's multicoloured keyboard, the CPC664's keys are kept in a much quieter grey and pale blue colour scheme. The back of the CPC664 main unit features the same connectors as the CPC464, with the exception of an additional 12V power lead. Unlike the CPC464's cassette tape drive, which could be powered off the main unit's 5V supply, the CPC664's floppy disk drive requires an additional 12V supply; this voltage had to be separately provided by an updated version of the bundled green-screen/colour monitor. The CPC664 was only produced for six months: in late 1985, when the CPC6128 was introduced in Europe, Amstrad decided not to keep three models in the line-up, and production of the CPC664 was discontinued. The CPC6128 also features an internal 3-inch floppy disk drive. Aside from various hardware and firmware improvements, one of the CPC6128's most prominent features is its compatibility with the CP/M+ operating system, which rendered it attractive for business uses. The CPC6128 was released in August 1985 and initially only sold in the US.
Imported and distributed by Indescomp, Inc. of Chicago, it was the first Amstrad product to be sold in the United States, a market that at the time was traditionally hostile towards European computer manufacturers. By the end of 1985, it had replaced the CPC664 in the CPC model line-up. Initial suggested retail prices for the CPC6128 were US$699.00/£299.00/DM1598.00 wit
The Apple IIc, the fourth model in the Apple II series of personal computers, was Apple Computer's first endeavor to produce a portable computer. The result was a 7.5 lb notebook-sized version of the Apple II that could be transported from place to place. The c in the name stood for compact, referring to the fact that it was a complete Apple II computer setup squeezed into a small notebook-sized housing. While sporting a built-in floppy drive and new rear peripheral expansion ports integrated onto the main logic board, it lacks the internal expansion slots and direct motherboard access of earlier Apple II models, making it a closed system like the Macintosh. However, this was the intended direction for the model: a more appliance-like machine, ready to use out of the box, requiring no technical know-how or experience to hook up, and therefore attractive to first-time users. The Apple IIc was released in April 1984 at an Apple-held event called Apple II Forever. With that motto, Apple proclaimed the new machine was proof of the company's long-term commitment to the Apple II series and its users, despite the recent introduction of the Macintosh.
The IIc was seen as the company's response to IBM's new PCjr; Apple hoped to sell 400,000 units by the end of 1984. Although essentially an Apple IIe in a smaller case, it was not a successor but rather a portable version to complement it: one Apple II machine would be sold to users who required the expandability of slots, the other to those wanting the simplicity of a plug-and-play machine with portability in mind. The machine introduced Apple's Snow White design language, notable for its case styling and modern look; created by Hartmut Esslinger, it became the standard for Apple equipment and computers for nearly a decade. The Apple IIc also introduced a unique off-white coloring known as "Fog," chosen to enhance the Snow White design style; the IIc and some of its peripherals were the only Apple products to use this color. While lightweight and compact, the Apple IIc was not a true portable, as it lacked a built-in battery and display. Codenames for the machine while under development included Lollie, ET, Teddy, VLC, IIb, and IIp.
Technically, the Apple IIc was an Apple IIe in a smaller case: more portable and easier to use, but less expandable. The IIc used the CMOS-based 65C02 microprocessor, which added 27 new instructions to the 6502 instruction set but was incompatible with programs that relied on the 6502's undocumented "illegal" opcodes. The new ROM firmware allowed Applesoft BASIC to recognize lowercase characters, worked better with an 80-column display, and fixed several bugs from the IIe ROM. In terms of video, the text display added 32 unique character symbols called "MouseText" which, when placed side by side, could display simple icons and menus, creating a graphical user interface out of text, similar in concept to the box-drawing characters of IBM code page 437 or PETSCII. A year later, the Apple IIe would benefit from these improvements in the form of a four-chip upgrade called the Enhanced IIe. The equivalent of five expansion cards was built into the Apple IIc motherboard: an Extended 80 Column Card, two Apple Super Serial Cards, a Mouse Card, and a floppy disk drive controller card.
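The idea behind MouseText, and behind similar sets such as CP437's box-drawing glyphs, can be illustrated with a short sketch. The Unicode box-drawing characters below merely stand in for the actual MouseText symbols, which are not reproduced here; the point is that a framed "window" or menu can be assembled purely from text cells.

```python
# Sketch: building a "graphical" menu purely out of text characters,
# the same idea MouseText enabled on the Apple IIc. Unicode box-drawing
# glyphs stand in for the actual MouseText symbols.

def draw_menu(title, items, width=24):
    """Return a framed text menu as a list of strings."""
    top = "┌" + "─" * width + "┐"
    bottom = "└" + "─" * width + "┘"
    lines = [
        top,
        "│" + title.center(width) + "│",
        "├" + "─" * width + "┤",  # separator between title and items
    ]
    for item in items:
        lines.append("│ " + item.ljust(width - 2) + " │")
    lines.append(bottom)
    return lines

if __name__ == "__main__":
    for line in draw_menu("FILE", ["Open", "Save", "Quit"]):
        print(line)
```

Each row is exactly the same width, so adjacent glyphs join visually into continuous lines, which is what made side-by-side MouseText characters read as icons and window borders.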
This meant the Apple IIc had 128 KB of RAM, 80-column text, and Double Hi-Resolution graphics built in and available right out of the box, unlike its older sibling, the Apple IIe. There was less need for slots, as the most popular peripheral add-on cards were already built in, ready for devices to be plugged into the rear ports of the machine; the built-in cards were mapped to "phantom" slots so that software written for slot-based Apple II models would know where to find them. The entire Apple Disk II controller card, used for controlling floppy drives, had been shrunk down into a single chip called the IWM, which stood for "Integrated Woz Machine". At the rear of the machine were the expansion ports providing access to its built-in cards; the standard DE-9 joystick connector doubled as a mouse interface, compatible with the same mice used by the Lisa and early Macintosh computers. Two serial ports supported a printer and a modem, and a floppy port connector supported a single external 5.25-inch drive. A Video Expansion port provided rudimentary signals for add-on adapters but could not directly generate a video signal.
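The phantom-slot trick works because, on slot-based Apple II models, the firmware for the card in slot n is visible at memory page $Cn00; the IIc's built-in hardware answers at those same pages, so slot-aware software finds it where a plug-in card would be. A minimal sketch of that address computation follows; the specific slot-to-card assignments on the IIc are documented in Apple's reference material and are deliberately not asserted here.

```python
# Sketch: computing the firmware page address for an Apple II "slot".
# On the IIc these pages belong to phantom slots: the built-in
# hardware responds at the addresses a plug-in card would occupy.

def slot_rom_base(slot: int) -> int:
    """Return the base address of the firmware page for slot 1-7."""
    if not 1 <= slot <= 7:
        raise ValueError("Apple II slots are numbered 1-7")
    return 0xC000 + slot * 0x100  # slot n firmware lives at $Cn00

if __name__ == "__main__":
    for n in range(1, 8):
        print(f"slot {n}: ${slot_rom_base(n):04X}")
```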
A port connector tied into an internal 12 V power converter allowed batteries to be attached. The same composite video port found on earlier Apple II models remained present. The Apple IIc had a built-in 5.25-inch floppy drive along the right side of the case, making it the first Apple II model to include such a feature. Along the left side of the case were a dial to control the volume of the internal speaker and a 1⁄8-inch monaural audio jack for headphones or an external speaker. A fold-out carrying handle doubled as a way to prop up the back end of the machine, angling the keyboard for typing if desired. The keyboard layout mirrored that of the Apple IIe. Two toggle switches were located in the same area: an "80/40"-column switch for software