Ethernet is a family of computer networking technologies used in local area networks, metropolitan area networks and wide area networks. It was commercially introduced in 1980 and first standardized in 1983 as IEEE 802.3; it has since retained a good deal of backward compatibility and been refined to support higher bit rates and longer link distances. Over time, Ethernet has largely replaced competing wired LAN technologies such as Token Ring, FDDI and ARCNET. The original 10BASE5 Ethernet used coaxial cable as a shared medium, while newer Ethernet variants use twisted-pair and fiber-optic links in conjunction with switches. Over the course of its history, Ethernet data transfer rates have increased from the original 2.94 megabits per second to 400 gigabits per second. The Ethernet standards comprise several wiring and signaling variants of the OSI physical layer. Systems communicating over Ethernet divide a stream of data into shorter pieces called frames; each frame contains source and destination addresses as well as error-checking data so that damaged frames can be detected and discarded.
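The frame layout just described can be illustrated with a short sketch. This is an illustrative example rather than standards text: it splits out the 14-byte Ethernet II header (6-byte destination MAC, 6-byte source MAC, 2-byte EtherType) from a made-up frame, and it omits the trailing 4-byte CRC-32 frame check sequence that real frames carry for error checking.

```python
import struct

def parse_ethernet_frame(frame: bytes) -> dict:
    """Split a frame into the Ethernet II header fields and payload.
    Real frames also end in a 4-byte CRC-32 frame check sequence,
    which this simplified sketch ignores."""
    if len(frame) < 14:
        raise ValueError("frame too short for an Ethernet II header")
    dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])
    mac = lambda b: ":".join(f"{octet:02x}" for octet in b)
    return {"dst": mac(dst), "src": mac(src),
            "ethertype": ethertype, "payload": frame[14:]}

# A made-up frame: broadcast destination, EtherType 0x0800 (IPv4).
frame = (bytes.fromhex("ffffffffffff")      # destination MAC
         + bytes.fromhex("020000000001")    # source MAC (locally administered)
         + struct.pack("!H", 0x0800)        # EtherType
         + b"payload bytes")                # payload
info = parse_ethernet_frame(frame)
```

A receiver uses the destination address to decide whether to accept the frame and the EtherType to hand the payload to the right upper-layer protocol.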
As per the OSI model, Ethernet provides services up to and including the data link layer. Features such as the 48-bit MAC address and the Ethernet frame format have influenced other networking protocols, including the Wi-Fi wireless networking technology. Ethernet is used in homes and industry; the Internet Protocol is commonly carried over Ethernet, so it is considered one of the key technologies that make up the Internet. Ethernet was developed at Xerox PARC between 1973 and 1974; it was inspired by ALOHAnet. The idea was first documented in a memo that Robert Metcalfe wrote on May 22, 1973, in which he named it after the luminiferous aether once postulated to exist as an "omnipresent, completely-passive medium for the propagation of electromagnetic waves." In 1975, Xerox filed a patent application listing Metcalfe, David Boggs, Chuck Thacker and Butler Lampson as inventors. In 1976, after the system was deployed at PARC, Metcalfe and Boggs published a seminal paper; that same year, Ron Crane, Bob Garner and Roy Ogus facilitated the upgrade from the original 2.94 Mbit/s protocol to the 10 Mbit/s protocol, which was released to the market in 1980.
Metcalfe left Xerox in June 1979 to form 3Com. He convinced Digital Equipment Corporation, Intel and Xerox to work together to promote Ethernet as a standard; as part of that process, Xerox agreed to relinquish its 'Ethernet' trademark. The first standard was published in September 1980 as "The Ethernet, A Local Area Network. Data Link Layer and Physical Layer Specifications"; this so-called DIX standard specified 10 Mbit/s Ethernet, with 48-bit destination and source addresses and a 16-bit EtherType field. Version 2 was published in November 1982 and defines what has become known as Ethernet II. Formal standardization efforts proceeded at the same time and resulted in the publication of IEEE 802.3 on June 23, 1983. Ethernet initially competed with Token Ring and other proprietary protocols, but it was able to adapt to market realities and shift to inexpensive thin coaxial cable and, later, ubiquitous twisted-pair wiring. By the end of the 1980s, Ethernet was the dominant network technology. In the process, 3Com became a major company.
3Com shipped its first 10 Mbit/s Ethernet NIC, the 3C100, in March 1981, and that same year started selling adapters for PDP-11s and VAXes, as well as for Multibus-based Intel and Sun Microsystems computers. This was followed by DEC's Unibus-to-Ethernet adapter, which DEC sold and used internally to build its own corporate network, which reached over 10,000 nodes by 1986, making it one of the largest computer networks in the world at that time. An Ethernet adapter card for the IBM PC was released in 1982, and by 1985 3Com had sold 100,000 of them. Parallel-port-based Ethernet adapters were also produced, with drivers for DOS and Windows. By the early 1990s, Ethernet had become so prevalent that it was a must-have feature for modern computers, and Ethernet ports began to appear on some PCs and most workstations. This process was sped up by the introduction of 10BASE-T and its small modular connector, at which point Ethernet ports appeared even on low-end motherboards. Since then, Ethernet technology has evolved to meet new bandwidth and market requirements.
In addition to computers, Ethernet is now used to interconnect appliances and other personal devices. As Industrial Ethernet it is used in industrial applications, and it is replacing legacy data transmission systems in the world's telecommunications networks. By 2010, the market for Ethernet equipment amounted to over $16 billion per year. In February 1980, the Institute of Electrical and Electronics Engineers (IEEE) started project 802 to standardize local area networks. The "DIX group", with Gary Robinson, Phil Arst and Bob Printis, submitted the so-called "Blue Book" CSMA/CD specification as a candidate for the LAN specification. In addition to CSMA/CD, Token Ring and Token Bus were considered as candidates for a LAN standard. Competing proposals and broad interest in the initiative led to strong disagreement over which technology to standardize. In December 1980, the group was split into three subgroups, and standardization proceeded separately for each proposal. Delays in the standards process put at risk the market introduction of the Xerox Star workstation and 3Com's Ethernet LAN products.
With such business implications in mind, David Liddle an
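The CSMA/CD access method at the heart of the "Blue Book" proposal resolves collisions between stations with truncated binary exponential backoff. The sketch below is an illustrative simplification, not text from the specification; the constant names and the helper function are my own, though the limits of 10 doublings and 16 attempts match the classic 10 Mbit/s parameters.

```python
import random

BACKOFF_LIMIT = 10   # the random range stops doubling after 10 collisions
MAX_ATTEMPTS = 16    # a frame is discarded after 16 failed attempts

def backoff_slots(collisions: int) -> int:
    """After the n-th collision on a frame, a station waits a random
    number of slot times drawn uniformly from [0, 2**min(n, 10) - 1]."""
    if collisions >= MAX_ATTEMPTS:
        raise RuntimeError("excessive collisions: frame discarded")
    k = min(collisions, BACKOFF_LIMIT)
    return random.randrange(2 ** k)
```

Randomizing the wait makes it unlikely that two colliding stations retry at the same instant, and doubling the range on repeated collisions adapts the load to however many stations are contending for the shared medium.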
SRI International is an American nonprofit scientific research institute and organization headquartered in Menlo Park, California. The trustees of Stanford University established SRI in 1946 as a center of innovation to support economic development in the region; the organization was founded as the Stanford Research Institute. SRI formally separated from Stanford University in 1970 and became known as SRI International in 1977. SRI performs client-sponsored research and development for government agencies, commercial businesses and private foundations; it also licenses its technologies, forms strategic partnerships, sells products and creates spin-off companies. SRI's annual revenue in 2014 was $540 million. SRI's headquarters are located near the Stanford University campus. William A. Jeffrey has served as SRI's president and CEO since September 2014. SRI employs about 2,100 people. Sarnoff Corporation, a wholly owned subsidiary of SRI since 1988, was integrated into SRI in January 2011. SRI's focus areas include biomedical sciences and materials, computing and space systems, economic development and learning, energy and environmental technology and national defense, as well as sensing and devices.
SRI holds more than 4,000 patents and patent applications worldwide. In the 1920s, Stanford University professor Robert E. Swain proposed creating a research institute in the Western United States. Herbert Hoover, a trustee of Stanford University, was an early proponent of an institute, but became less involved with the project after he was elected president of the United States. The development of the institute was delayed by the Great Depression in the 1930s and World War II in the 1940s, with three separate attempts leading to its formation in 1946. In August 1945, Maurice Nelles, Morlan A. Visel and Ernest L. Black of Lockheed made the first attempt to create the institute with the formation of the "Pacific Research Foundation" in Los Angeles. A second attempt was made by Henry T. Heald, president of the Illinois Institute of Technology. In 1945, Heald wrote a report recommending a research institute on the West Coast, with a close association with Stanford University and an initial grant of $500,000.
A third attempt was made by Frederick Terman, Stanford University's dean of engineering. Terman's proposal followed Heald's, but focused on faculty and student research more than contract research. The trustees of Stanford University voted to create the organization in 1946. It was structured so that its goals were aligned with the charter of the university—to advance scientific knowledge and to benefit the public at large, not just the students of Stanford University. The trustees were named as the corporation's general members and elected SRI's directors. Research chemist William F. Talbot became the first director of the institute. Stanford University president Donald Tresidder instructed Talbot to avoid work that would conflict with the interests of the university and federal contracts that might attract political pressure; the drive to find work and the lack of support from Stanford faculty caused the new research institute to violate this directive within six months, through the pursuit of a contract with the Office of Naval Research.
This and other issues, including frustration with Tresidder's micromanagement of the new organization, caused Talbot to offer his resignation, which Tresidder accepted. Talbot was replaced by Jesse Hobson, who had previously led the Armour Research Foundation; the pursuit of contract work continued under him. SRI's first research project investigated whether the guayule plant could be used as a source of natural rubber. During World War II, rubber was imported into the U.S. and was subject to strict rationing. From 1942 to 1946, the United States Department of Agriculture supported a project to create a domestic source of natural rubber. Once the war ended, the United States Congress cut funding for the program. SRI's first economic study was for the United States Air Force. In 1947, the Air Force wanted to determine the expansion potential of the U.S. aircraft industry. In 1948, SRI began research and consultation with Chevron Corporation to develop an artificial substitute for tallow and coconut oil in soap production.
Procter & Gamble used the substance as the basis for Tide laundry detergent. The institute performed much of the early research on air pollution and the formation of ozone in the lower atmosphere. SRI sponsored the First National Air Pollution Symposium in Pasadena, California, in November 1949. Experts gave presentations on pollution research, exchanged ideas and techniques, and stimulated interest in the field; the event was attended by 400 scientists, business executives and civic leaders from the U.S. SRI co-sponsored subsequent events on the subject. In April 1953, Walt and Roy Disney hired SRI to consult on their proposal for establishing an amusement park in Burbank, California. SRI provided information on location, attendance patterns and economic feasibility. SRI selected a larger site in Anaheim, prepared reports about operation, provided on-site administrative support for Disneyland and acted in an advisory role as the park expanded. In 1955, SRI was c
The Boeing Company is an American multinational corporation that designs, manufactures and sells airplanes, rockets and missiles worldwide. The company also provides leasing and product support services. Boeing is among the largest global aircraft manufacturers, and Boeing stock is included in the Dow Jones Industrial Average. Boeing was founded by William Boeing on July 15, 1916, in Seattle, Washington; the present corporation is the result of the merger of Boeing with McDonnell Douglas on August 1, 1997. Boeing's former chair and CEO Philip M. Condit continued as the chair and CEO of the new Boeing, while Harry Stonecipher, former CEO of McDonnell Douglas, became the president and chief operating officer of the newly merged company. The Boeing Company has its corporate headquarters in Illinois. The company is led by CEO Dennis Muilenburg. Boeing is organized into five primary divisions, including Boeing Commercial Airplanes and Boeing Defense, Space & Security. In 2017, Boeing recorded $93.3 billion in sales and was ranked 24th on the Fortune magazine "Fortune 500" list, 64th on the "Fortune Global 500" list and 19th on the "World's Most Admired Companies" list.
In March 1910, William E. Boeing bought Heath's shipyard in Seattle on the Duwamish River, which became his first airplane factory. Boeing was incorporated in Seattle by William Boeing on July 15, 1916, as "Pacific Aero Products Co"; the company was later reincorporated in Delaware. Boeing, who studied at Yale University, had worked in the timber industry, where he became wealthy and learned about wooden structures; this knowledge proved invaluable in his subsequent assembly of airplanes. The company stayed in Seattle to take advantage of the local supply of spruce wood. One of the two "B&W" seaplanes, built with the assistance of George Conrad Westervelt, a U.S. Navy engineer, took its maiden flight on June 15, 1916. Boeing and Westervelt decided to build the B&W seaplane after having flown in a Curtiss aircraft. Boeing had bought a Glenn Martin "Flying Birdcage" seaplane and was taught to fly by Glenn Martin himself. Boeing soon crashed the Birdcage, and when Martin informed Boeing that replacement parts would not become available for months, Boeing realized he could build his own plane in that amount of time.
He and his friend Cdr. G. C. Westervelt soon produced the B&W Seaplane; this first Boeing airplane was assembled in a lakeside hangar located on the northeast shore of Seattle's Lake Union. Many of Boeing's early planes were seaplanes. On April 6, 1917, the U.S. declared war on Germany and entered World War I. On May 9, 1917, the company became the "Boeing Airplane Company". With the U.S. entering the war, Boeing knew that the U.S. Navy needed seaplanes for training, so Boeing shipped two new Model Cs to Pensacola, where the planes were flown for the Navy; the Navy ordered 50 more. The company moved its operations to a larger former shipbuilding facility known as Boeing Plant 1, located on the lower Duwamish River in Washington state. When World War I ended in 1918, a large surplus of cheap, used military planes flooded the commercial airplane market, preventing aircraft companies from selling any new airplanes and driving many out of business. Others, including Boeing, started selling other products: Boeing built dressers and furniture, along with flat-bottomed boats called Sea Sleds.
In 1919 the Boeing B-1 flying boat made its first flight. It accommodated two passengers and some mail. Over the course of eight years, it made international airmail flights from Seattle to Victoria, British Columbia. On May 24, 1920, the Boeing Model 8 made its first flight; it was the first plane to fly over Mount Rainier. In 1923, Boeing entered competition against Curtiss to develop a pursuit fighter for the U.S. Army Air Service. Although Curtiss finished its design first and was awarded the contract, Boeing continued to develop its PW-9 fighter; that plane, along with the Boeing P-12/F4B fighter, made Boeing a leading manufacturer of fighters over the course of the next decade. In 1925, Boeing built its Model 40 mail plane for the U.S. government to use on airmail routes. In 1927, an improved version of this plane was built, the Model 40A, which won the U.S. Post Office's contract to deliver mail between San Francisco and Chicago; the 40A had a passenger cabin that accommodated two. That same year, Boeing created an airline named Boeing Air Transport, which merged a year later with Pacific Air Transport and the Boeing Airplane Company.
The first airmail flight for the airline was on July 1, 1927. In 1929 the company merged with Pratt & Whitney, Hamilton Aero Manufacturing Company and Chance Vought under the new title United Aircraft and Transport Corporation; the merger was followed by the acquisition of the Sikorsky Manufacturing Corporation, Stearman Aircraft Corporation and Standard Metal Propeller Company. United Aircraft then purchased National Air Transport in 1930. On July 27, 1928, the 12-passenger Boeing 80 biplane made its first flight. With three engines, it was Boeing's first plane built with the sole intention of being a passenger transport. An upgraded version, the 80A, carrying eighteen passengers, made its first flight in September 1929. In the early 1930s Boeing became a leader in all-metal aircraft construction, in the design revolution t
X Development LLC is an American semi-secret research and development facility and organization founded by Google in January 2010, which now operates as a subsidiary of Alphabet Inc. X has its headquarters about a mile and a half from Alphabet's corporate headquarters, the Googleplex, in Mountain View, California. Work at X is overseen by entrepreneur and scientist Astro Teller, as CEO and "Captain of Moonshots". The lab started with the development of Google's self-driving car. On October 2, 2015, after the restructuring of Google into Alphabet, Google X became an independent Alphabet company and was renamed X. On October 25, 2018, The New York Times published an exposé entitled "How Google Protected Andy Rubin, the 'Father of Android'"; the company subsequently announced that "48 employees have been fired over the last two years" for sexual misconduct. A week after the article appeared, Google X executive Rich DeVaul resigned following a complaint of sexual harassment. X's mission is to invent and launch "moonshot" technologies that aim to make the world a radically better place.
A moonshot is defined by X as the intersection of a big problem, a radical solution and breakthrough technology. Project Glass is a research and development program by Google to develop an augmented-reality head-mounted display. The intended purpose of Project Glass products would be the hands-free display of information available to most smartphone users, allowing for interaction with the Internet via natural-language voice commands. One Google Glass unit cost $1,500. Makani, acquired by X in May 2013, is a project designed to produce wind energy using kites; the T-shaped planes carry eight turbines and are tethered to the ground. Compared to conventional wind turbines, Makani's kites require 90% less material. In December 2016, Makani's kite became the first energy kite in the world to generate electricity. X is also conducting tests of free-space optical communication, which uses light beams, in rural areas of India. As of December 2017, X had set up 2,000 of these units in India through a partnership with Andhra Pradesh State FiberNet Limited.
In October 2013, the existence of four Google barges was revealed, with the vessels registered under the dummy corporation By And Large. Two of the barges have a superstructure whose construction has been kept under the utmost secrecy; speculation suggested they could be used as floating marketing spaces or retail stores for products such as Google Glass, though this was never confirmed. Waymo began as a self-driving car project by Google. In December 2016, Google transitioned the project into a new company called Waymo, housed under Google's parent company Alphabet. The project was led by Google engineer Sebastian Thrun, director of the Stanford Artificial Intelligence Laboratory and co-inventor of Google Street View. Thrun's team at Stanford created the robotic vehicle Stanley, which won the 2005 DARPA Grand Challenge and its US$2 million prize from the United States Department of Defense. The team developing the system consisted of 15 engineers working for Google, including Chris Urmson, Mike Montemerlo and Anthony Levandowski, who had worked on the DARPA Grand and Urban Challenges.
The U.S. state of Nevada passed a law in June 2011 concerning the operation of driverless cars in the state, for which Google had been lobbying; the first license was issued in May 2012 to a Toyota Prius modified with Google's experimental driverless technology. As of March 2016, Google had test-driven its fleet of vehicles in autonomous mode a total of 1,498,214 mi. Project Loon was a project of X that aimed to bring internet access to everyone by creating a network of balloons flying through the stratosphere; it used wireless routers in balloons flying above the weather to give internet access to those who could not otherwise reach it or were in need of help. In July 2018, Loon was made a subsidiary of Alphabet. Project Wing was a project of X that aimed to deliver products across a city using flying vehicles, similar to the Amazon Prime Air concept. At the time of the announcement on August 28, 2014, it had been in development secretly at Google for about two years, with full-scale testing carried out in Australia.
The flying vehicles take off vertically, then rotate to a horizontal position for flying. For delivery, the vehicle winches packages down to the ground. At the end of the tether is a small bundle of electronics which detects that the package has hit the ground, detaches from the delivery and is pulled back up into the body of the vehicle. Dropping the cargo or landing to deliver it was found to be unfeasible, as users approaching the vehicle compromised safety. Malta was started in July 2017 to develop renewable-energy storage systems utilizing tanks of molten salt; the system works by transforming electrical energy into heat energy for storage, based on research by Robert B. Laughlin. Malta Inc. graduated from X in December 2018 with plans to develop a large-scale test of the technology for future commercial applications. The Google Contact Lens, a smart contact lens that aims to assist people with diabetes by measuring the glucose levels in their tears, was announced by Google on January 16, 2014; this project, along with the nanodiagnostics project to develop a cancer-detecting pill and other life-sciences efforts, is now being carried out by Verily.
Google Brain is now a deep learning research project at Google. Considered one of X's biggest successes, this one project has produced enough value for Google to more than cover the total costs of X, according to Astro Teller. Dande
Very Large Scale Integration
Very-large-scale integration (VLSI) is the process of creating an integrated circuit by combining millions of transistors or devices into a single chip. VLSI began in the 1970s, when complex semiconductor and communication technologies were being developed; the microprocessor is a VLSI device. Before the introduction of VLSI technology, most ICs had a limited set of functions they could perform. An electronic circuit might consist of a CPU, ROM, RAM and other glue logic; VLSI lets IC designers combine all of these into one chip. The history of the transistor dates to the 1920s, when several inventors attempted to build devices that were intended to control current in solid-state diodes and convert them into triodes. Success came after World War II, when the use of silicon and germanium crystals as radar detectors led to improvements in fabrication and theory. Scientists who had worked on radar returned to solid-state device development. With the invention of the transistor at Bell Labs in 1947, the field of electronics shifted from vacuum tubes to solid-state devices.
With the small transistor in hand, electrical engineers of the 1950s saw the possibility of constructing far more advanced circuits. However, as the complexity of circuits grew, problems arose. One problem was the size of the circuit: a complex circuit like a computer depended on speed, and if the components were large, the wires interconnecting them had to be long; the electric signals took time to traverse the circuit, slowing the computer. The invention of the integrated circuit by Jack Kilby and Robert Noyce solved this problem by making all the components and the chip out of the same block of semiconductor material. The circuits could be made smaller, and the manufacturing process could be automated. This led to the idea of integrating all components on a single-crystal silicon wafer, which led to small-scale integration in the early 1960s, medium-scale integration in the late 1960s, and large-scale integration as well as VLSI in the 1970s and 1980s, with tens of thousands of transistors on a single chip. The first semiconductor chips held two transistors each.
Subsequent advances added more transistors and, as a consequence, more individual functions or systems were integrated over time. The first integrated circuits held only a few devices, perhaps as many as ten diodes, transistors and capacitors, making it possible to fabricate one or more logic gates on a single device. Now known retrospectively as small-scale integration, this was followed by improvements in technique that led to devices with hundreds of logic gates, known as medium-scale integration. Further improvements led to large-scale integration, i.e. systems with at least a thousand logic gates. Current technology has moved far past this mark, and today's microprocessors have many millions of gates and billions of individual transistors. At one time, there was an effort to name and calibrate various levels of large-scale integration above VLSI. Terms like ultra-large-scale integration were used, but the huge number of gates and transistors available on common devices has rendered such fine distinctions moot. Terms suggesting greater-than-VLSI levels of integration are no longer in widespread use.
In 2008, billion-transistor processors became commercially available; this became more commonplace as semiconductor fabrication advanced from the then-current generation of 65 nm processes. Current designs, unlike the earliest devices, use extensive design automation and automated logic synthesis to lay out the transistors, enabling higher levels of complexity in the resulting logic functionality. Certain high-performance logic blocks, like the SRAM cell, are still designed by hand to ensure the highest efficiency. Structured VLSI design is a modular methodology originated by Carver Mead and Lynn Conway for saving microchip area by minimizing the interconnect fabric's area. This is obtained by a repetitive arrangement of rectangular macro blocks which can be interconnected using wiring by abutment. An example is partitioning the layout of an adder into a row of equal bit-slice cells. In complex designs this structuring may be achieved by hierarchical nesting. Structured VLSI design was popular in the early 1980s, but lost its popularity with the advent of placement and routing tools, which wasted a great deal of area on routing; this was tolerated because of the progress of Moore's law.
When introducing the hardware description language KARL in the mid-1970s, Reiner Hartenstein coined the term "structured VLSI design", echoing Edsger Dijkstra's structured-programming approach of procedure nesting to avoid chaotic, spaghetti-structured programs. As microprocessors become more complex due to technology scaling, microprocessor designers have encountered several challenges which force them to think beyond the design plane and look ahead to post-silicon issues: Process variation – as photolithography techniques get closer to the fundamental limits of optics, achieving high accuracy in doping concentrations and etched wires is becoming more difficult and prone to errors due to variation. Designers now must simulate across multiple fabrication process corners before a chip is certified ready for production, or use system-level techniques for dealing with the effects of variation. Stricter design rules – due to lithography and etch issues with scaling, design rules for layout have become increasingly stringent.
Designers must keep in mind an increasing list of rules when laying out custom circuits. The overhead for custom design is now reaching a tipping point, with many design houses opting to switch to electronic design automation tools to automate their design process. Timing/design clo
In computing, the desktop metaphor is an interface metaphor, a set of unifying concepts used by graphical user interfaces to help users interact more easily with the computer. The desktop metaphor treats the computer monitor as if it were the top of the user's desk, upon which objects such as documents and folders of documents can be placed. A document can be opened into a window, which represents a paper copy of the document placed on the desktop. Small applications called desk accessories are also available, such as a desk calculator or notepad. The desktop metaphor itself has been extended and stretched by various implementations of desktop environments, since access to features and usability of the computer are more important than maintaining the 'purity' of the metaphor. Hence we find trash cans on the desktop, as well as network volumes. Other features, such as menu bars and taskbars, have no counterpart on a real-world desktop. The desktop metaphor was first introduced by Alan Kay at Xerox PARC in 1970 and elaborated in a series of innovative software applications developed by PARC scientists throughout the ensuing decade.
The first computer to use an early version of the desktop metaphor was the experimental Xerox Alto; the first commercial computer to adopt this kind of interface was the Xerox Star. The use of window controls to contain related information predates the desktop metaphor, with a primitive version appearing in Douglas Engelbart's "Mother of All Demos", though the concept was later incorporated by PARC in the environment of the Smalltalk language. One of the first desktop-like interfaces on the market was a program called Magic Desk I. Built as a cartridge for the Commodore 64 home computer in 1983, this primitive GUI presented a low-resolution sketch of a desktop, complete with telephone, calculator and other accessories. The user made their choices by moving a sprite depicting a pointing hand, using the same joystick the user may have used for video gaming; onscreen options were chosen by pushing the fire button on the joystick. The Magic Desk I program featured a graphically emulated typewriter, complete with audio effects.
Other applications included a calculator, a Rolodex-style organiser and a terminal emulator. Files could be archived into the drawers of the desktop, and a trashcan was present. The first computer to popularise the desktop metaphor, using it as a standard feature over the earlier command-line interface, was the Apple Macintosh in 1984. The desktop metaphor is ubiquitous in modern-day personal computing. BeOS observed the desktop metaphor more strictly than many other systems. For example, external hard drives appeared on the 'desktop', while internal ones were accessed by clicking on an icon representing the computer itself. By comparison, the Mac OS places all drives on the desktop itself by default, while in Windows the user can access the drives through an icon labelled "Computer". Amiga terminology for its desktop metaphor was taken directly from workshop jargon: the desktop was called Workbench, programs were called tools, small applications were utilities, directories were drawers, and so on. Icons of objects were animated, and directories were shown as drawers represented either open or closed.
As in the classic Mac OS and macOS desktop, an icon for a floppy disk or CD-ROM would appear on the desktop when the disk was inserted into the drive, as a virtual counterpart of a physical floppy disk or CD-ROM on the surface of a workbench. The paper paradigm refers to the office-paper model adopted by the interfaces of many operating systems; it consists of black text on a white background, files within folders, and a "desktop". The paper paradigm was created by many individuals and organisations, such as Douglas Engelbart, Xerox PARC and Apple Computer, and was an attempt to make computers more user-friendly by making them resemble the common workplace of the time. It was first presented to the public by Engelbart in 1968, in what is now referred to as "The Mother of All Demos". From John Siracusa: Back in 1984, explanations of the original Mac interface to users who had never seen a GUI before included an explanation of icons that went something like this: "This icon represents your file on disk." But to the surprise of many, users quickly discarded any semblance of indirection.
This icon is my file. My file is this icon. One is not a "representation of" or an "interface to" the other; such relationships were foreign to most people, and constituted unnecessary mental baggage when there was a much more simple and direct connection to what they knew of reality. Since then, many aspects of computers have wandered away from the paper paradigm by implementing features such as "shortcuts" to files and non-spatial file browsing. A shortcut and hypertext have no real-world equivalent. Non-spatial file browsing may likewise confuse novice users, as they can have more than one window representing the same folder open at the same time, something impossible in reality. These and other departures from real-world equivalents are violations of the pure paper paradigm. See also: Desktop environment, File browser, History of the GUI, Interface metaphor, Operating system, Skeuomorph, Tiling window manager, Virtual desktop, WIMP.
Xerox Corporation is an American global corporation that sells print and digital document products and services in more than 160 countries. Xerox is headquartered in Norwalk, Connecticut, though its largest population of employees is based around Rochester, New York, the area in which the company was founded. The company purchased Affiliated Computer Services for $6.4 billion in early 2010, and it is listed among the Fortune 500 companies. On December 31, 2016, Xerox separated its business process service operations into a new publicly traded company, Conduent. Xerox focuses on its document technology and document outsourcing business and continues to trade on the NYSE. On January 31, 2018, Xerox announced that it would sell a controlling stake to Fujifilm, with which it has maintained a joint venture in the Asia-Pacific region known as Fuji Xerox. Researchers at Xerox and its Palo Alto Research Center invented several important elements of personal computing, such as the desktop metaphor GUI, the computer mouse and desktop computing.
These concepts were frowned upon by the board of directors, who ordered the Xerox engineers to share them with Apple technicians. The concepts were subsequently adopted by Apple and Microsoft. With the help of these innovations, Apple and Microsoft came to dominate the personal computing revolution of the 1980s, whereas Xerox was not a major player. Xerox was founded in 1906 in Rochester, New York, as The Haloid Photographic Company, which manufactured photographic paper and equipment. In 1938, Chester Carlson, a physicist working independently, invented a process for printing images using an electrically charged photoconductor-coated metal plate and dry powder "toner". However, it would take more than 20 years of refinement before the first automated machine to make copies was commercialized, using a document feeder, scanning light and a rotating drum. Joseph C. Wilson, credited as the "founder of Xerox", took over Haloid from his father; he saw the promise of Carlson's invention and, in 1946, signed an agreement to develop it as a commercial product.
Wilson remained President and CEO of Xerox until 1967 and served as Chairman until his death in 1971. Looking for a term to differentiate its new system, Haloid coined the word xerography from two Greek roots meaning "dry writing". Haloid subsequently changed its name to Haloid Xerox in 1958 and to Xerox Corporation in 1961. Before releasing the 914, Xerox tested the market by introducing a developed version of its prototype hand-operated equipment, known as the Flat-plate 1385. The 1385 was not a viable office copier because of its slow speed of operation. As a consequence, it was sold as a platemaker for the Addressograph-Multigraph Multilith 1250 and related sheet-fed offset printing presses in the offset lithography market. It was little more than a high-quality, commercially available plate camera mounted as a horizontal rostrum camera, complete with photo-flood lighting and timer, in which the glass film plate had been replaced with a selenium-coated aluminum plate. Clever electrics turned this into a reusable substitute for film.
A skilled user could quickly produce metal printing plates of a higher quality than any other method. Having started as a supplier to the offset lithography duplicating industry, Xerox now set its sights on capturing some of offset's market share. The 1385 was followed by the first automatic xerographic printer, the Copyflo, in 1955. The Copyflo was a large microfilm printer that could produce positive prints on roll paper from any type of microfilm negative. Following the Copyflo, the process was scaled down to produce the 1824 microfilm printer. At about half the size and weight, this still-sizable machine printed onto hand-fed, cut-sheet paper, pulled through the process by one of two gripper bars. A scaled-down version of this gripper-feed system was to become the basis for the 813 desktop copier. The company came to prominence in 1959 with the introduction of the Xerox 914, "the most successful single product of all time." The 914, the first plain-paper photocopier, was developed by John H. Dessauer.
The product was marketed with an innovative ad campaign showing that even monkeys could make copies at the touch of a button; such simplicity would become the foundation of future Xerox products and user interfaces. Revenues leaped to over $500 million by 1965. In the 1960s, Xerox held a dominant position in the photocopier market, and the company expanded rapidly, making millionaires of some long-suffering investors who had nursed it through the slow research and development phase of the product. In 1960, a xerography research facility called the Wilson Center for Research and Technology was opened in Webster, New York. In 1961, the company changed its name to Xerox Corporation. Xerox common stock was listed on the New York Stock Exchange in 1961 and on the Chicago Stock Exchange in 1990. In 1963, Xerox introduced the Xerox 813, the first desktop plain-paper copier, realizing Carlson's vision of a copier that could fit on anyone's office desk. Ten years later, in 1973, a basic color copier based on the 914 followed.
The 914 itself was sped up to become the 420 and 720. The 813 was developed into the 330 and 660 products, and also the 740 desktop microfiche printer. Xerox's first foray into duplicating, as distinct from copying, was with the Xerox 2400, introduced in 1966; the model number denoted the number of prints produced per hour. Although not as fast as offset printing, this machine introduced the industry's first automatic document feeder, paper slitter and perforator, and collator.