Nvidia Corporation is an American technology company based in Santa Clara, California. It designs graphics processing units (GPUs) for the gaming and professional markets, as well as system-on-a-chip units for the mobile computing and automotive markets. Its primary GPU product line, labeled GeForce, is in direct competition with Advanced Micro Devices' (AMD) Radeon products. Nvidia expanded its presence in the gaming industry with its handheld SHIELD Portable and SHIELD Tablet. Since 2014, Nvidia has shifted to become a platform company focused on four markets – gaming, professional visualization, data centers, and automotive. In addition to GPU manufacturing, Nvidia provides parallel processing capabilities to researchers, and these are deployed in supercomputing sites around the world. More recently, it has moved into the mobile computing market. In addition to AMD, its competitors include Intel, and Nvidia is now also focused on artificial intelligence. The company's name comes from Invidia in Roman mythology, who corresponds to the Greek Nemesis. The release of the RIVA TNT in 1998 solidified Nvidia's reputation for capable graphics hardware.
Autumn 1999 saw the release of the GeForce 256, most notably introducing on-board transformation and lighting (T&L). Running at 120 MHz and featuring four pixel pipelines, it implemented advanced video acceleration, motion compensation and hardware sub-picture alpha blending. The GeForce outperformed existing products by a wide margin. Due to the success of its products, Nvidia won the contract to develop the graphics hardware for Microsoft's Xbox game console, which earned Nvidia a $200 million advance. However, the project drew the time of many of its best engineers away from other projects; in the short term this did not matter, and the GeForce2 GTS shipped in the summer of 2000. In December 2000, Nvidia reached an agreement to acquire the assets of its one-time rival 3dfx. The acquisition process was finalized in April 2002. In July 2002, Nvidia acquired Exluna for an undisclosed sum. Exluna made software rendering tools, and its personnel were merged into the Cg project. In August 2003, Nvidia acquired MediaQ for approximately US$70 million.
On April 22, 2004, Nvidia acquired iReady, a provider of high-performance TCP/IP and iSCSI offload solutions. In December 2004, it was announced that Nvidia would assist Sony with the design of the graphics processor in the PlayStation 3 game console. In March 2006, it emerged that Nvidia would deliver RSX to Sony as an IP core, and that under the agreement Nvidia would provide ongoing support to port the RSX to Sony's fabs of choice, as well as die shrinks to 65 nm. This practice contrasted with its business arrangement with Microsoft, in which Nvidia managed production. Meanwhile, in May 2005 Microsoft chose to license a design by ATI and to make its own manufacturing arrangements for the Xbox 360 graphics hardware, as had Nintendo for the Wii console.
ARoS Aarhus Kunstmuseum
The ARoS Aarhus Kunstmuseum is an art museum in Aarhus, Denmark. The museum was established in 1859 and is the oldest public art museum in Denmark outside of Copenhagen. Today ARoS is one of the largest art museums in northern Europe, with a total of 816,468 visitors in 2015. This figure includes visitors who in one way or another have redeemed a ticket and people who have visited the ARoS Shop; ARoS features a shop, café and restaurant. The architectural vision of the museum was completed in 2011, with the addition of the circular skywalk Your rainbow panorama by Ólafur Elíasson. The installation has helped boost attendance, making ARoS the second most visited museum in Denmark. ARoS has an art collection with works from the Golden Age until today, including artists such as Andersen, Bill Viola and Wim Wenders. In the present building, the first themed exhibitions presented a series of main works by pop artists like Andy Warhol and Roy Lichtenstein. Like many other art galleries and museums, ARoS pays great tribute to architecture and architects.
Located in the basement is The 9 Spaces, a gallery with art from artists such as James Turrell and Shirin Neshat. The number 9 refers to Dante Alighieri's The Divine Comedy and the 9 circles of hell; the rooms are painted black to contrast with the bright white exterior. The roof terrace represents the divine light you enter from hell, and in this way the whole museum is part of the journey from hell to heaven. This movement is emphasised by the spiral staircase in the main museum streetscape. The roof of the museum is dominated by the installation Your rainbow panorama by Ólafur Elíasson; this circular skywalk has windows in the colors of the rainbow, showing the panorama of Aarhus in different colors depending on the location of the viewer. The installation cost DKK 60 million to construct, was sponsored by the Realdania foundation, and was inaugurated on May 28, 2011. A second section is a lounge that guides visitors from the main museum building to Your rainbow panorama.
The museum was established in 1859 and is the oldest public art museum in Denmark outside of Copenhagen. The art-collecting activities were initiated some years earlier, in 1847, by the local art association Århus Kunstforening af 1847, and the first public exhibition was presented on 6 January 1859 in Aarhus's old Town Hall, located at the Cathedral. The present building, next to the Concert Halls, is the fourth locality of the art museum, which opened here in 2004. In January 2009, ARoS Aarhus Kunstmuseum celebrated its 150-year anniversary with an exhibition displaying the same works as the very first exhibition in 1859.
PowerPC is a RISC instruction set architecture created by the 1991 Apple–IBM–Motorola alliance, known as AIM. PowerPC was the cornerstone of AIM's PReP and Common Hardware Reference Platform initiatives in the 1990s. It has since become niche in personal computers, but remains popular as an embedded and high-performance processor architecture, and its use in game consoles and embedded applications provided an array of uses. In addition, PowerPC CPUs are still used in AmigaOne and third-party AmigaOS 4 personal computers. The history of RISC began with IBM's 801 research project, on which John Cocke was the lead developer and where he developed the concepts of RISC in 1975–78. 801-based microprocessors were used in a number of IBM embedded products, and the IBM RT was a rapid design implementing the RISC architecture. The result was the POWER instruction set architecture, introduced with the RISC System/6000 in early 1990. The original POWER microprocessor, one of the first superscalar RISC implementations, was a high-performance, multi-chip design. IBM soon realized that a single-chip microprocessor was needed in order to scale its RS/6000 line from lower-end to high-end machines.
Work began on a one-chip POWER microprocessor, designated the RSC. In early 1991, IBM realized its design could potentially become a high-volume microprocessor used across the industry. IBM approached Apple with the goal of collaborating on the development of a family of single-chip microprocessors based on the POWER architecture, and this three-way collaboration became known as the AIM alliance, for Apple, IBM, Motorola. In 1991, the PowerPC was just one facet of an alliance among these three companies; the PowerPC chip was one of several joint ventures involving the three in their efforts to counter the growing Microsoft–Intel dominance of personal computing. For Motorola, POWER looked like an unbelievable deal: it allowed the company to sell a widely tested and powerful RISC CPU for little design cash of its own. It also maintained ties with an important customer and seemed to offer the possibility of adding IBM as well. At this point Motorola already had its own RISC design in the form of the 88000, which was doing poorly in the market.
Motorola was doing well with its 68000 family, and the majority of its funding was focused on this, so the 88000 effort was somewhat starved for resources. However, the 88000 was already in production: Data General was shipping 88000 machines, and the 88000 had achieved a number of embedded design wins in telecom applications. The result of the various requirements was the PowerPC specification. The differences between the earlier POWER instruction set and PowerPC are outlined in Appendix E of the manual for PowerPC ISA v.2.02. When the first PowerPC products reached the market, they were met with enthusiasm. In addition to Apple, both IBM and the Motorola Computer Group offered systems built around the processors; Microsoft released Windows NT 3.51 for the architecture, which was used in Motorola's PowerPC servers, and Sun Microsystems offered a version of its Solaris OS.
The Motorola 68000 is a 32-bit CISC microprocessor with a 16-bit external data bus, introduced by Motorola Semiconductor Products Sector. After 38 years in production, the 68000 architecture is still in use. The 68000 grew out of the MACSS project, begun in 1976 to develop an entirely new architecture without backward compatibility. It would be a higher-powered sibling complementing the existing 8-bit 6800 line rather than a compatible successor. In the end, the 68000 did retain a bus-protocol compatibility mode for existing 6800 peripheral devices, and a version with an 8-bit data bus was produced. However, the designers focused on the future, or forward compatibility. For instance, the CPU registers are 32 bits wide, though few self-contained structures in the processor itself operate on 32 bits at a time. The MACSS team drew heavily on the influence of minicomputer processor design, such as the PDP-11 and VAX systems. In the mid-1970s, the 8-bit microprocessor manufacturers raced to introduce the 16-bit generation.
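The mismatch described above, 32-bit registers behind a 16-bit external data bus, means a 32-bit memory transfer takes two bus cycles. A minimal sketch of that split (the function name is illustrative, not taken from any Motorola documentation):

```python
def bus_words(value32):
    """Split a 32-bit value into the two 16-bit transfers the 68000
    makes over its 16-bit external data bus (high-order word first,
    since the 68000 is big-endian)."""
    value32 &= 0xFFFFFFFF          # constrain to 32 bits, as in a CPU register
    high_word = (value32 >> 16) & 0xFFFF
    low_word = value32 & 0xFFFF
    return [high_word, low_word]

# One 32-bit register write becomes two 16-bit bus cycles:
print([hex(w) for w in bus_words(0x12345678)])  # ['0x1234', '0x5678']
```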
National Semiconductor had been first with its IMP-16 and PACE processors in 1973–1975; Intel had worked on its advanced 16/32-bit Intel iAPX 432 since 1975 and its Intel 8086 since 1976. Arriving late to the 16-bit arena afforded the new processor more transistors and 32-bit macroinstructions. The original MC68000 was fabricated using an HMOS process with a 3.5 µm feature size. Formally introduced in September 1979, initial samples were released in February 1980. Initial speed grades were 4, 6, and 8 MHz; 10 MHz chips became available during 1981, and 12.5 MHz chips by June 1982. The 16.67 MHz 12F version of the MC68000, the fastest version of the original HMOS chip, was not produced until the late 1980s. Tom Gunter, retired Corporate Vice President at Motorola, is known as the Father of the 68000. The 68000 was used in Microsoft Xenix systems as well as an early NetWare Unix-based server. The 68000 was also used in the first generation of desktop laser printers, including the original Apple Inc. LaserWriter.
In 1982, the 68000 received an update to its ISA allowing it to support virtual memory and to conform to the Popek and Goldberg virtualization requirements. The updated chip was called the 68010. A further extended version, which exposed 31 bits of the address bus, was produced in small quantities as the 68012. To support lower-cost systems and control applications with smaller memory sizes, Motorola introduced the 8-bit compatible MC68008. This was a 68000 with an 8-bit data bus and a smaller address bus. After 1982, Motorola devoted more attention to the 68020 and 88000 projects. Several other companies were second-source manufacturers of the HMOS 68000. These included Hitachi, who shrank the feature size to 2.7 µm for their 12.5 MHz version, as well as Rockwell and Thomson/SGS-Thomson. Toshiba was a maker of the CMOS 68HC000.
USB (Universal Serial Bus) is currently developed by the USB Implementers Forum. USB was designed to standardize the connection of peripherals to personal computers, and it has become commonplace on other devices, such as smartphones and PDAs. USB has effectively replaced a variety of earlier interfaces, such as serial ports and parallel ports, as well as separate power chargers for portable devices. There are several modes of USB data transfer; in order of increasing bandwidth, they include Low Speed, Full Speed, High Speed and SuperSpeed. USB devices have some choice of implemented modes, and the USB version is not a reliable statement of implemented modes. Modes are identified by their names and icons. Unlike other data buses, USB connections are directed, with both upstream and downstream ports emanating from a single host; this applies to power as well, with only downstream-facing ports providing power. Thus, USB cables have different ends, A and B, and therefore, in general, each different format requires four different connectors: a plug and a receptacle for each of the A and B ends. USB cables have the plugs, and the corresponding receptacles are on the computers or electronic devices. In common practice, the A end is usually the standard format, and the B side varies over standard and micro.
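The transfer modes listed above correspond to nominal signaling rates, which can be used for rough back-of-the-envelope transfer-time estimates. A small sketch (real throughput is lower because of protocol overhead, so these figures are upper bounds):

```python
# Nominal signaling rates for the main USB transfer modes, in Mbit/s.
USB_MODES = {
    "Low Speed": 1.5,    # USB 1.0
    "Full Speed": 12,    # USB 1.1
    "High Speed": 480,   # USB 2.0
    "SuperSpeed": 5000,  # USB 3.0
}

def transfer_time_s(size_mbytes, mode):
    """Ideal seconds to move `size_mbytes` megabytes at a mode's raw
    signaling rate (8 bits per byte; overhead ignored)."""
    return (size_mbytes * 8) / USB_MODES[mode]

# Moving 60 MB at High Speed's raw 480 Mbit/s takes about one second:
print(transfer_time_s(60, "High Speed"))  # 1.0
```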
The mini and micro formats provide for USB On-The-Go with a hermaphroditic AB receptacle; the micro format is the most durable from the point of view of designed insertion lifetime. The standard and mini connectors have a design lifetime of 1,500 insertion-removal cycles. Likewise, the components of the retention mechanism, the parts that provide the required gripping force, were moved into the plugs on the cable side. A group of seven companies began the development of USB in 1994: Compaq, DEC, IBM, Intel, Microsoft, NEC and Nortel. A team including Ajay Bhatt worked on the standard at Intel, and the first integrated circuits supporting USB were produced by Intel in 1995. The original USB 1.0 specification was introduced in January 1996, and Microsoft Windows 95 OSR2.1 provided OEM support for USB devices. The first widely used version of USB was 1.1; its 12 Mbit/s data rate was intended for higher-speed devices such as disk drives, and the lower 1.5 Mbit/s rate for low-data-rate devices such as joysticks. Apple Inc.'s iMac was the first mainstream product with USB, and following Apple's design decision to remove all legacy ports from the iMac, many PC manufacturers began building legacy-free PCs, which led to the broader PC market using USB as a standard.
The USB 2.0 specification was released in April 2000 and was ratified by the USB Implementers Forum at the end of 2001. The USB 3.0 specification was published on 12 November 2008. Its main goals were to increase the transfer rate, decrease power consumption and increase power output. USB 3.0 includes a new, higher-speed bus called SuperSpeed operating in parallel with the USB 2.0 bus; for this reason, the new version is also called SuperSpeed.
The Amiga 1200, or A1200, is Commodore International's third-generation Amiga computer, aimed at the home computer market. It was launched on October 21, 1992, at a price of £399 in the United Kingdom. Like its form-factor predecessors, the Amiga 500 and 600, the A1200 is an all-in-one design incorporating the CPU and keyboard in a single physical unit. The A1200 has a hardware architecture similar to Commodore's Amiga CD32 game console, and is technically close to the Atari Falcon. Initially, only 30,000 A1200s were available at the UK launch. During the first year of its life the system reportedly sold well, but Commodore ran into cash-flow problems and filed for bankruptcy. Worldwide sales figures for the A1200 are unknown, but 95,000 systems were sold in Germany before Commodore's bankruptcy. After Commodore's demise in 1994, the A1200 almost disappeared from the market but was relaunched by Escom in 1995. The new Escom A1200 was priced at £399 and came bundled with two games, seven applications and AmigaOS 3.1; it was initially criticized for being priced 150 pounds higher than the Commodore variant that had been sold for two years prior.
It came with a modified PC floppy disk drive that is incompatible with some Amiga software, and the A1200 was finally discontinued in 1996 as the parent company folded. The A1200 offers a number of advantages over earlier lower-budget Amiga models. Specifically, it is a 32-bit design: the 68EC020 microprocessor is faster than the 68000, and the machine has 2 MB of RAM as standard. The AGA chipset used in the A1200 is a significant improvement. AGA increases the color palette from 4096 colors to 16.8 million colors, with up to 256 on-screen colors normally and an improved HAM mode allowing 262,144 on-screen colors. The graphics hardware features improved sprite capacity and faster graphics performance, mainly due to faster video memory. Additionally, compared to the A600 the A1200 offers greater expansion possibilities. Although it is a significant upgrade, the A1200 did not sell as well as the 500 and proved to be Commodore's last lower-budget model before the company filed for bankruptcy in 1994. This is mainly because the A1200 failed to repeat the technological advantage that the first Amiga systems had held over competitors; the AGA chipset was something of a disappointment.
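The palette figures above follow directly from color depth: 4096 colors is 12-bit RGB (4 bits per channel), 16.8 million is 24-bit RGB (8 bits per channel), and HAM8's 262,144 colors is 2^18. A small worked check:

```python
def palette_size(bits_per_channel):
    """Distinct colors in an RGB palette with the given number of bits
    for each of the three channels (R, G, B)."""
    return 2 ** (3 * bits_per_channel)

print(palette_size(4))  # 4096      -- pre-AGA: 4 bits per channel
print(palette_size(8))  # 16777216  -- AGA: 8 bits per channel (16.8 million)
print(2 ** 18)          # 262144    -- on-screen colors in AGA's HAM8 mode
```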
While AGA is not notably less capable than its competition when compared to VGA and its emerging extensions, the Amiga's custom chips cost more to produce than the increasingly ubiquitous commodity chips used in PCs, making the A1200 more expensive. Some industry commentators felt that the 68020 microprocessor was already outdated. Another issue was that the A1200 never supported high-density floppy disks without a special external drive or unreliable hacks, despite the PC HD drive in Escom models. As a result, fewer retailers carried the A1200, especially in North America, and the A1200 received bad press for being incompatible with a number of Amiga 500 games. Further criticism was directed at the A1200's power supply, which is inadequate in expanded systems. Due to lower sales and a short lifetime, far fewer games were produced for the A1200 than for earlier generations of Amiga computers.
A personal computer is a multi-purpose electronic computer whose size and price make it feasible for individual use. PCs are intended to be operated directly by an end-user, rather than by an expert or technician. In the 2010s, PCs are typically connected to the Internet, allowing access to the World Wide Web; personal computers may also be connected to a local area network, either by a cable or a wireless connection. In the 2010s, a PC may be a multi-component desktop computer designed for use in a fixed location, a laptop computer designed for easy portability, or a tablet computer. In the 2010s, PCs run an operating system, such as Microsoft Windows or Linux. The very earliest microcomputers, equipped with a front panel, required hand-loading of a bootstrap program to load programs from external storage. Before long, automatic booting from permanent read-only memory became universal. In the 2010s, users have access to a wide range of commercial software, free software and free and open-source software, which are provided in ready-to-run or ready-to-compile form.
Since the early 1990s, Microsoft operating systems and Intel hardware have dominated much of the personal computer market, first with MS-DOS and later with Windows. Alternatives to Microsoft's Windows operating systems occupy a minority share of the industry; these include Apple's OS X and free open-source Unix-like operating systems such as Linux and the Berkeley Software Distribution. Advanced Micro Devices provides the main alternative to Intel's processors. PC is an initialism for personal computer; some PCs, including the OLPC XO, are equipped with x86 or x64 processors but are not designed to run Microsoft Windows. PC is also used in contrast with Mac, an Apple Macintosh computer; this sense of the word is used in the Get a Mac advertisement campaign that ran between 2006 and 2009, as well as its rival, the I'm a PC campaign, that appeared in 2008. Since Apple's transition to Intel processors starting in 2005, all Macintosh computers are now PCs. An early prediction held that the "brain" may one day come down to our level and help with our income-tax and book-keeping calculations.
But this is speculation and there is no sign of it so far. In the history of computing there were many examples of computers designed to be used by one person, as opposed to terminals connected to mainframe computers. Using the narrow definition of operated by one person, the first personal computer was the ENIAC, which became operational in 1946, though it did not meet further definitions of affordable or easy to use. An example of an early single-user computer was the LGP-30, created in 1956 by Stan Frankel and used for science; it came with a retail price of $47,000, equivalent to about $414,000 today. Introduced at the 1965 New York World's Fair, the Programma 101 was a programmable calculator described in advertisements as a desktop computer. It was manufactured by the Italian company Olivetti and invented by the Italian engineer Pier Giorgio Perotto. The Soviet MIR series of computers was developed from 1965 to 1969 in a group headed by Victor Glushkov.
Open-source software may be developed in a collaborative public manner. According to scientists who studied it, open-source software is a prominent example of open collaboration, and a 2008 report by the Standish Group states that adoption of open-source software models has resulted in savings of about $60 billion per year for consumers. In the early days of computing, programmers and developers shared software in order to learn from each other, but the open-source notion eventually moved to the wayside with the commercialization of software in the years 1970–1980. In 1997, Eric Raymond published The Cathedral and the Bazaar; partly in response, Netscape released the source code of its Navigator browser, and this source code subsequently became the basis behind SeaMonkey, Mozilla Firefox and KompoZer. Netscape's act prompted Raymond and others to look into how to bring the Free Software Foundation's free software ideas to the commercial software industry. The new term they chose was open source, which was soon adopted by Bruce Perens, publisher Tim O'Reilly, Linus Torvalds, and others. The Open Source Initiative was founded in February 1998 to encourage use of the new term. A Microsoft executive publicly stated in 2001 that open source is an intellectual property destroyer.
He added, "I can't imagine something that could be worse than this for the software business." Nevertheless, IBM, Oracle and State Farm are just a few of the companies with a serious public stake in today's competitive open-source market, and there has been a significant shift in corporate philosophy concerning the development of FOSS. The free software movement was launched in 1983. In 1998, a group of individuals advocated that the term free software should be replaced by open-source software as an expression which is less ambiguous. Software developers may want to publish their software with an open-source license; the Open Source Definition presents an open-source philosophy and further defines the terms of usage and redistribution of open-source software. Software licenses grant rights to users which would otherwise be reserved by law to the copyright holder. Several open-source software licenses have qualified within the boundaries of the Open Source Definition. The open source label came out of a strategy session held on April 7, 1998 in Palo Alto, in reaction to Netscape's January 1998 announcement of a source code release for Navigator.
They used the opportunity before the release of Navigator's source code to clarify a potential confusion caused by the ambiguity of the word free in English. Many people claimed that the birth of the Internet, since 1969, started the open source movement. The Free Software Foundation, started in 1985, intended the word free to mean freedom to distribute and not freedom from cost; since a great deal of free software already was free of charge, such software became associated with zero cost. The Open Source Initiative was formed in February 1998 by Eric Raymond and Bruce Perens. They sought to bring a higher profile to the practical benefits of freely available source code, and they wanted to bring major software businesses and other high-tech industries into open source. Perens attempted to register open source as a service mark for the OSI. The Open Source Initiative's definition is recognized by governments internationally as the standard or de facto definition, and OSI uses the Open Source Definition to determine whether it considers a software license open source.
x86-64 is the 64-bit version of the x86 instruction set. It supports vastly larger amounts of virtual memory and physical memory than is possible on its 32-bit predecessors. x86-64 provides 64-bit general-purpose registers and numerous other enhancements, and it is fully backward compatible with 16-bit and 32-bit x86 code. The original specification, created by AMD and released in 2000, has been implemented by AMD, Intel and VIA. The AMD K8 processor was the first to implement the architecture; this was the first significant addition to the x86 architecture designed by a company other than Intel. Intel was forced to follow suit and introduced a modified NetBurst family which was fully software-compatible with AMD's design. VIA Technologies introduced x86-64 in their VIA Isaiah architecture, with the VIA Nano. The x86-64 specification is distinct from the Intel Itanium architecture, which is not compatible on the native instruction set level with the x86 architecture. AMD64 was created as an alternative to the radically different IA-64 architecture; the first AMD64-based processor, the Opteron, was released in April 2003.
AMD's processors implementing the AMD64 architecture include Opteron, Athlon 64, Athlon 64 X2, Athlon 64 FX, Athlon II, Turion 64, Turion 64 X2, Phenom, Phenom II, FX, Fusion and Ryzen. The primary defining characteristic of AMD64 is the availability of 64-bit general-purpose processor registers and 64-bit integer arithmetic and logical operations, but the designers took the opportunity to make other improvements as well. Some of the most significant changes are described below. Pushes and pops on the stack default to 8-byte strides, and pointers are 8 bytes wide. Additional registers: in addition to increasing the size of the general-purpose registers, their number is increased from 8 to 16. AMD64 still has fewer registers than many common RISC instruction sets or VLIW-like machines such as the IA-64; however, an AMD64 implementation may have far more internal registers than the number of architectural registers exposed by the instruction set. Additional XMM registers: similarly, the number of 128-bit XMM registers is increased from 8 to 16. Larger virtual address space: the AMD64 architecture defines a 64-bit virtual address format, of which the low-order 48 bits are used in current implementations; this allows up to 256 TB of virtual address space.
The architecture definition allows this limit to be raised in future implementations to the full 64 bits; this compares to just 4 GB for 32-bit x86. This means that very large files can be operated on by mapping the entire file into the process's address space, rather than having to map regions of the file into the address space as needed. Larger physical address space: the original implementation of the AMD64 architecture implemented 40-bit physical addresses; current implementations extend this to 48-bit physical addresses and can therefore address up to 256 TB of RAM. The architecture permits extending this to 52 bits in the future. For comparison, 32-bit x86 processors are limited to 64 GB of RAM in Physical Address Extension mode, or 4 GB of RAM without PAE mode. Any implementation therefore allows the same physical address limit as under long mode.
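The address-space figures above are simple powers of two: n address bits reach 2^n bytes. A quick worked check of each limit quoted in this section (binary prefixes, so 1 TB here means 2^40 bytes):

```python
def addressable(address_bits, unit_bits):
    """Bytes reachable with `address_bits` address lines, expressed in
    units of 2**unit_bits bytes (unit_bits=30 -> GB, 40 -> TB)."""
    return 2 ** address_bits // 2 ** unit_bits

print(addressable(48, 40))  # 256  TB -- 48-bit virtual/physical addresses
print(addressable(40, 40))  # 1    TB -- original 40-bit physical implementation
print(addressable(52, 40))  # 4096 TB -- architectural 52-bit physical limit
print(addressable(36, 30))  # 64   GB -- 32-bit x86 with PAE (36-bit addresses)
print(addressable(32, 30))  # 4    GB -- 32-bit x86 without PAE
```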
The Amiga is a family of personal computers sold by Commodore in the 1980s and 1990s. The Amiga provided a significant upgrade from earlier 8-bit home computers. The Amiga 1000 was officially released in July 1985, but a series of production problems meant it did not become widely available until early 1986. The best-selling model, the Amiga 500, was introduced in 1987 and became one of the leading home computers of the late 1980s. The A3000, introduced in 1990, started the second generation of Amiga systems, followed by the A500+; finally, as the third generation, the A1200 and the A4000 were released in late 1992. The platform became particularly popular for gaming and for programming demos, and it found a prominent role in the desktop video, video production and show control businesses, leading to video editing systems such as the Video Toaster. The Amiga's native ability to play back multiple digital sound samples made it a popular platform for early tracker music software. It was a less expensive alternative to the Apple Macintosh.
Initially, the Amiga was developed alongside various Commodore PC clones. Commodore ultimately went bankrupt in April 1994 after the Amiga CD32 model failed in the marketplace. Since the demise of Commodore, various groups have marketed successors to the original Amiga line, including Genesi, Eyetech and ACube Systems Srl, and AmigaOS has influenced replacements and compatible systems such as MorphOS, AmigaOS 4 and AROS. The Amiga was so far ahead of its time that almost nobody—including Commodore's marketing department—could fully articulate what it was all about. Today, it is obvious the Amiga was the first multimedia computer, but in those days it was derided as a game machine because few people grasped the importance of advanced graphics, sound and video. Nine years later, vendors are still struggling to make systems that work like 1985 Amigas. Jay Miner joined Atari in the 1970s to develop custom integrated circuits, and led development of the Atari 2600's TIA. Almost as soon as its development was complete, the team began developing a much more sophisticated set of chips, CTIA, ANTIC and POKEY.
With the 8-bit line's launch in 1979, Miner again started looking at a next-generation chipset. Miner wanted to start work with the new Motorola 68000, but management was only interested in another MOS 6502-based system. Miner left the company, and the industry, shortly thereafter. In 1982, Larry Kaplan was approached by a number of investors who wanted to develop a new game platform, and Kaplan hired Miner to run the hardware side of the newly formed company. The system was code-named Lorraine in keeping with Miner's policy of giving systems female names, in this case after the company president's wife. When Kaplan left the company late in 1982 to rejoin Atari, Miner was promoted to head engineer.
Internet protocol suite
The Internet protocol suite is the conceptual model and set of communications protocols used on the Internet and similar computer networks. It is commonly known as TCP/IP because the foundational protocols in the suite are the Transmission Control Protocol (TCP) and the Internet Protocol (IP). It is occasionally known as the Department of Defense model, because the development of the model was funded by DARPA. The Internet protocol suite provides end-to-end data communication, specifying how data should be packetized, transmitted and received. This functionality is organized into four abstraction layers, which are used to sort all related protocols according to the scope of networking involved. Technical standards specifying the Internet protocol suite and many of its constituent protocols are maintained by the Internet Engineering Task Force. The Internet protocol suite model is a simpler model that was developed prior to the OSI model. The suite resulted from research and development conducted by the Defense Advanced Research Projects Agency in the late 1960s; after initiating the pioneering ARPANET in 1969, DARPA started work on a number of other data transmission technologies.
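The four abstraction layers mentioned above (application, transport, internet, link) can be illustrated with a toy encapsulation sketch: each layer wraps the data from the layer above with its own header before passing it down. This is a deliberately simplified model, not real packet formats:

```python
# Lower three layers of the suite, in the order they wrap application data.
LOWER_LAYERS = ["Transport", "Internet", "Link"]

def encapsulate(application_data):
    """Toy per-layer encapsulation: each layer prepends a placeholder
    header to the payload handed down from the layer above."""
    payload = application_data
    for layer in LOWER_LAYERS:
        payload = f"[{layer} header]{payload}"
    return payload

print(encapsulate("GET / HTTP/1.1"))
# [Link header][Internet header][Transport header]GET / HTTP/1.1
```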
In 1972, Robert E. Kahn joined the DARPA Information Processing Technology Office, and Vinton Cerf later joined him to work on internetworking; Cerf credits Hubert Zimmermann and Louis Pouzin, designer of the CYCLADES network, with important influences on the design. The protocol was implemented as the Transmission Control Program, first published in 1974. Initially, the TCP managed both datagram transmissions and routing, but as the protocol grew, other researchers recommended a division of functionality into protocol layers; Jon Postel stated, "we are screwing up in our design of Internet protocols by violating the principle of layering". Encapsulation of different mechanisms was intended to create an environment where the upper layers could access only what was needed from the lower layers. A monolithic design would be inflexible and lead to scalability issues, so the Transmission Control Program was split into two distinct protocols, the Transmission Control Protocol and the Internet Protocol. The new suite replaced all protocols used previously; this design is known as the end-to-end principle. Using this design, it became possible to connect almost any network to the ARPANET, irrespective of the local characteristics.
One popular expression is that TCP/IP, the eventual product of Cerf and Kahn's work, will run over "two tin cans and a string". A computer called a router is provided with an interface to each network, and it forwards packets back and forth between them. Originally a router was called a gateway, but the term was changed to avoid confusion with other types of gateways. From 1973 to 1974, Cerf's networking research group at Stanford worked out details of the idea, resulting in the first TCP specification. A significant technical influence was the early networking work at Xerox PARC. DARPA then contracted with BBN Technologies, Stanford University, and University College London to develop operational versions of the protocol on different hardware platforms. Four versions were developed: TCP v1, TCP v2, a split into TCP v3 and IP v3, and then TCP/IP v4; the last protocol is still in use today. In 1975, a two-network TCP/IP communications test was performed between Stanford and University College London, and in November 1977, a three-network TCP/IP test was conducted between sites in the US, the UK and Norway.
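The layering described in this section is what makes a TCP connection look like a plain byte stream to applications: the operating system's TCP/IP stack handles packetizing, ordering and retransmission transparently. A minimal loopback sketch using Python's standard socket module (the host names and message are illustrative):

```python
import socket
import threading

def serve(server_sock, results):
    """Accept one connection and collect everything sent until EOF."""
    conn, _ = server_sock.accept()
    with conn:
        chunks = []
        while True:
            data = conn.recv(1024)
            if not data:            # empty read: peer closed the connection
                break
            chunks.append(data)
        results.append(b"".join(chunks))

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))       # port 0: let the OS pick a free port
server.listen(1)

received = []
t = threading.Thread(target=serve, args=(server, received))
t.start()

# The client sees only a reliable byte stream; TCP and IP do the rest.
with socket.create_connection(server.getsockname()) as client:
    client.sendall(b"hello, internet protocol suite")

t.join()
server.close()
print(received[0])  # b'hello, internet protocol suite'
```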
The original Raspberry Pi model became far more popular than anticipated, selling outside its target market for uses such as robotics. Peripherals are not included with the Raspberry Pi, though some accessories have been included in several official and unofficial bundles. According to the Raspberry Pi Foundation, over 5 million Raspberry Pis had been sold by February 2015, and by 9 September 2016 they had sold 10 million. Several generations of Raspberry Pis have been released; the first generation was released in February 2012. It was followed by the simpler and cheaper Model A. In 2014, the foundation released a board with an improved design, the Raspberry Pi 1 Model B+. These boards are approximately credit-card sized and represent the standard mainline form factor; improved A+ and B+ models were released a year later. The Raspberry Pi 2, which added more RAM, was released in February 2015, and the Raspberry Pi 3 Model B, released in February 2016, is bundled with on-board Wi-Fi, Bluetooth and USB boot capabilities.
As of January 2017, the Raspberry Pi 3 Model B is the newest mainline Raspberry Pi, and Raspberry Pi boards are priced between US$5 and $35. On 28 February 2017, the Raspberry Pi Zero W was launched, which is identical to the Raspberry Pi Zero but adds Wi-Fi and Bluetooth. All models feature a Broadcom system on a chip, which includes an ARM-compatible central processing unit and an on-chip graphics processing unit. CPU speed ranges from 700 MHz to 1.2 GHz for the Pi 3. Secure Digital cards are used to store the operating system and program memory, in either the SDHC or MicroSDHC sizes. Most boards have between one and four USB slots, HDMI and composite video output, and a 3.5 mm phone jack for audio; lower-level output is provided by a number of GPIO pins which support common protocols like I²C. The B-models have an 8P8C Ethernet port, and the Pi 3 and Pi Zero W have on-board 802.11n Wi-Fi and Bluetooth. The Foundation provides Raspbian, a Debian-based Linux distribution, for download, as well as third-party Ubuntu, Windows 10 IoT Core and RISC OS. It promotes Python and Scratch as the main programming languages, with support for many other languages.
The default firmware is closed source, while an unofficial open-source version is available. The Raspberry Pi hardware has evolved through several versions that feature variations in memory capacity and peripheral-device support; this block diagram depicts Models A, B, A+, and B+. Model A, A+, and the Pi Zero lack the Ethernet and USB hub components; the Ethernet adapter is internally connected to an additional USB port. In Model A, A+, and the Pi Zero, the USB port is connected directly to the system on a chip. On the Pi 1 Model B+ and later models, the USB/Ethernet chip contains a five-port USB hub. On the Pi Zero, the USB port is also connected directly to the SoC, but it uses a micro USB port.