Magnetic-core memory was the predominant form of random-access computer memory for 20 years, between about 1955 and 1975. Such memory is often just called core memory, or, informally, core. Core memory uses toroids of a hard magnetic material as transformer cores, where each wire threaded through a core serves as a transformer winding. Three or four wires pass through each core, and each core stores one bit of information. A core can be magnetized in either the clockwise or the counter-clockwise direction; the value of the bit stored in a core is zero or one according to the direction of that core's magnetization. Electric current pulses in some of the wires through a core allow the direction of the magnetization in that core to be set in either direction, thus storing a one or a zero. Another wire through each core, the sense wire, is used to detect whether the core changed state; the process of reading the core causes the core to be reset to a zero, erasing its previous contents. This is called destructive readout. When not being read or written, the cores maintain the last value they had, even if the power is turned off.
They are therefore a type of non-volatile memory. Using smaller cores and wires, the memory density of core increased steadily; by the late 1960s a density of about 32 kilobits per cubic foot was typical. However, reaching this density required careful manufacture, which was almost always carried out by hand in spite of repeated major efforts to automate the process. The cost declined over this period from about $1 per bit to about 1 cent per bit. The introduction of the first semiconductor memory chips, static RAM (SRAM), in the late 1960s began to erode the market for core memory; the first successful dynamic RAM (DRAM), the Intel 1103, which arrived in quantity in 1972 at 1 cent per bit, marked the beginning of the end for core memory. Improvements in semiconductor manufacturing led to rapid increases in storage capacity and decreases in price per kilobyte, while the cost and performance of core memory changed little. Core memory was driven from the market between 1973 and 1978. Although core memory is long obsolete, computer memory is still sometimes called "core" by people who had worked with machines that used real core memory, even though it is now made of semiconductors.
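The read-then-rewrite cycle that destructive readout implies can be made concrete with a short Python sketch. This is a minimal illustrative model, not the circuitry of any actual machine; the class and helper names are invented for the example.

    class Core:
        """One magnetic core: holds a single bit as a magnetization direction."""
        def __init__(self):
            self.bit = 0  # 0 or 1, i.e. one of the two magnetization directions

        def write(self, value):
            # A full-strength current pulse flips the core to the desired direction.
            self.bit = value

        def read(self):
            # Reading drives the core toward the 0 state. If it held a 1, the flip
            # induces a pulse on the sense wire; if it was already 0, it does not.
            sensed = self.bit == 1
            self.bit = 0          # destructive readout: the core is now 0
            return 1 if sensed else 0

    def read_and_restore(core):
        # Because readout is destructive, a practical memory cycle immediately
        # rewrites the value that was just read.
        value = core.read()
        core.write(value)
        return value

    c = Core()
    c.write(1)
    assert read_and_restore(c) == 1   # the value survives because it was rewritten
    assert c.bit == 1

The restore step is the reason a core memory cycle is quoted as a read-write pair: every read must be followed by a write if the contents are to be preserved.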
The files that result from saving the entire contents of memory to disk for debugging purposes when a major error occurs are likewise still called "core dumps". The basic concept of using the square hysteresis loop of certain magnetic materials as a storage or switching device was known from the earliest days of computer development. Much of this knowledge had developed from the study of transformers, which allowed amplification and switch-like performance when built using certain materials. The stable switching behavior was well known in the electrical engineering field, and its application in computer systems was immediate. For example, J. Presper Eckert and Jeffrey Chuan Chu had done some development work on the concept in 1945 at the Moore School during the ENIAC efforts. Frederick Viehe applied for various patents on the use of transformers for building digital logic circuits in place of relay logic beginning in 1947. A patent on a fully developed core system was granted in 1947, and later purchased by IBM in 1956.
This development was little-known, however, and the mainstream development of core is associated with three independent teams. Substantial work in the field was carried out by the Shanghai-born American physicists An Wang and Way-Dong Woo, who created the pulse transfer controlling device in 1949; the name referred to the way that the magnetic field of the cores could be used to control the switching of current. Wang and Woo were working at Harvard University's Computation Laboratory at the time, but the university was not interested in promoting inventions created in its labs, so Wang was able to patent the system on his own. The MIT Whirlwind computer required a fast memory system for real-time aircraft tracking. At first, Williams tubes—a storage system based on cathode ray tubes—were used, but these devices were always temperamental and unreliable. Several researchers in the late 1940s conceived the idea of using magnetic cores for computer memory, but it was Jay Forrester who received the principal patent for his invention of the coincident-current core memory that enabled the 3D storage of information.
William Papian of Project Whirlwind cited one of these efforts, Harvard's "Static Magnetic Delay Line", in an internal memo. The first core memory, of 32 x 32 x 16 bits, was installed on Whirlwind in the summer of 1953. Papian reported: "Magnetic-Core Storage has two big advantages: greater reliability with a consequent reduction in maintenance time devoted to storage". Forrester, recalling Wang's earlier work, put it this way: "the Wang memory was complicated. As I recall, which may not be correct, it used two cores per binary bit and was a delay line that moved a bit forward. To the extent that I may have focused on it, the approach was not suitable for our purposes." He described the invention and the associated events in 1975. Forrester has since observed, "It took us about seven years to convince the industry that random-access magnetic-core memory was the solution to a missing link in computer technology. We spent the following seven years in the patent courts convincing them that they had not all thought of it first." A third developer involved in the early development of core was Jan A. Rajchman at RCA.
The LINC is a 12-bit, 2048-word transistorized computer, considered by some a forerunner to the personal computer. Originally named the "Linc", suggesting the project's origins at MIT's Lincoln Laboratory, it was renamed LINC after the project moved from the Lincoln Laboratory. The LINC was designed in 1962 by Wesley A. Clark and Charles Molnar at Lincoln Laboratory for NIH researchers; the LINC and other "MIT Group" machines were designed at MIT and built by Digital Equipment Corporation and Spear Inc. of Waltham, Massachusetts. The LINC sold for more than $40,000 at the time. A typical configuration included an enclosed 6-foot-by-20-inch rack, four boxes holding tape drives, a small display, a control panel, and a keyboard. Although the LINC's instruction set was small, the machine interfaced well with laboratory experiments: analog inputs and outputs were part of the basic design. The LINC's design was in the public domain, making it unique in the history of computers.
The number of LINCs and who built them is a minor subject of debate in the 12-bit-word community. One account states that a dozen LINC computers were assembled by their eventual biologist users in a 1963 summer workshop at MIT, and that Digital Equipment Corporation and Spear Inc. of Waltham, Massachusetts manufactured them commercially. DEC's pioneer C. Gordon Bell states that the LINC project began in 1961, with first delivery in March 1962, and that the machine was not formally withdrawn until December 1969. A total of 50 were built, most at Lincoln Labs, housing the desktop instruments in four wooden racks; the first LINC included two oscilloscope displays. Twenty-one were sold by DEC at $43,600, delivered in the Production Model design. In these, the tall cabinet sitting behind a white Formica-covered table held two somewhat smaller metal boxes holding the same instrumentation: a Tektronix display oscilloscope over the "front panel" on the user's left, a bay for interfaces over two LINC-Tape drives on the user's right, and a chunky keyboard between them.
The standard program development software was designed by Mary Allen Wilkes. The LINC had 2048 12-bit words of memory in two sections. Only the first 1024 words were usable for program execution; the second section of memory could only be used for data. The LINC had a 12-bit accumulator, a one-bit link register, and a 10-bit instruction location register. The first sixteen locations in program memory had special functions: location 0 supported the single level of subroutine call, automatically being updated with a return address on every jump instruction, and the next fifteen locations could be directly addressed by certain instructions to hold data or to work as index registers. Special read-only registers were provided to read the control panel switches, and a read/write register controlled six relays intended for use by external instruments. A later version of the LINC added a 12-bit Z register to facilitate extended-precision arithmetic and to support interrupt handling. A single interrupt was provided, with the address of the interrupt routine stored in memory location 17.
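The location-0 subroutine linkage can be illustrated with a rough Python sketch. This is only a model of the mechanism described above; the helper names are invented, and the LINC's actual instruction encoding and timing are not represented.

    # Minimal model of the LINC's single-level subroutine linkage.
    memory = [0] * 2048   # 2048 12-bit words; locations 0-15 are special
    pc = 0                # the 10-bit instruction location register

    def jump(target):
        """Every jump stores the return address in location 0 before branching."""
        global pc
        memory[0] = (pc + 1) & 0o1777   # return address, 10 bits
        pc = target & 0o1777

    def return_from_subroutine():
        """A subroutine returns via the address saved in location 0."""
        global pc
        pc = memory[0]

    pc = 0o100                 # suppose the caller is at octal 100
    jump(0o400)                # call a subroutine at octal 400
    assert memory[0] == 0o101  # return address was saved automatically
    return_from_subroutine()
    assert pc == 0o101

Because every jump overwrites location 0, only one level of subroutine call can be outstanding at a time, which is why the text describes this as a single-level mechanism.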
Alphanumeric input/output devices included a dedicated keyboard and the ability to display text on the attached bit-mapped CRT. A teleprinter could be connected for printed output. Arithmetic was ones' complement, which meant that there were representations for both "plus zero" and "minus zero" (illustrated in the sketch following the instruction list below). The original LINC required 8 microseconds for each instruction. The natural notation used for the LINC was octal; in this section, all numbers are given as base ten. The LINC instruction set was designed for ease of use with scientific instruments or custom experimental apparatus, and fell into several classes:

Miscellaneous class, no address - halt, clear accumulator, enable tape mark write gate, transfer accumulator to relay register, read relay register to accumulator, no operation, complement accumulator.

Shift class, no address - rotate left, rotate right, scale right.

Full address class - two-word instructions, with the immediate address given in the second 12-bit word of the opcode: add and clear accumulator, jump. Only the first 1024 words of memory can be accessed.

Skip class - skips the next instruction; can test for a set or clear condition. Conditions are: external logic line, key struck, one of five sense switches, accumulator positive, link bit zero, or active tape unit in an interblock zone. Later models added skip on bit 0 of the Z register, skip on overflow, and skip on interrupt paused.

Index class - these instructions had a second word that held the immediate operand, that specified the operand address, or that specified one of the registers 01 through 15 as holding the address of the operand, in which case the address was incremented. These instructions included load or add to accumulator, add accumulator to memory, add accumulator with carry to memory, skip if equal and rotate, bit clear, bit set, and bit complement. Another instruction in this group displayed a bit map, representing a character or other data, on the built-in CRT display screen.

Half-word class - instructions operating on the lower or upper six bits of a word; these included load half, store half, and skip if halves are different.

Set - moves data from any memory location
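The ones' complement arithmetic mentioned above, with its distinct "plus zero" and "minus zero", can be shown concretely. The following is a generic illustration of 12-bit ones' complement encoding, not LINC-specific code; the function names are invented for the example.

    WIDTH = 12
    MASK = (1 << WIDTH) - 1   # 0o7777, all twelve bits set

    def encode(n):
        """Encode a small integer as a 12-bit ones' complement word."""
        return n & MASK if n >= 0 else (~(-n)) & MASK

    def decode(word):
        """Decode a 12-bit ones' complement word back to a Python integer."""
        return word if word < (1 << (WIDTH - 1)) else -((~word) & MASK)

    assert encode(5) == 0o0005
    assert encode(-5) == 0o7772        # bitwise complement of 0o0005
    assert encode(0) == 0o0000         # "plus zero"
    minus_zero = MASK                  # 0o7777: the second representation of zero
    assert decode(minus_zero) == 0     # both words decode to zero
    assert decode(encode(-5)) == -5

Negation in this system is a simple bitwise complement, which is why two distinct zero words exist: complementing all-zeros yields all-ones.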
A computer is a device that can be instructed to carry out sequences of arithmetic or logical operations automatically via computer programming. Modern computers have the ability to follow generalized sets of operations, called programs; these programs enable computers to perform a wide range of tasks. A "complete" computer, including the hardware, the operating system, and the peripheral equipment required and used for "full" operation, can be referred to as a computer system; this term may also be used for a group of computers that are connected and work together, in particular a computer network or computer cluster. Computers are used as control systems for a wide variety of industrial and consumer devices; this includes simple special-purpose devices like microwave ovens and remote controls, factory devices such as industrial robots and computer-aided design systems, and general-purpose devices like personal computers and mobile devices such as smartphones. The Internet is run on computers and it connects hundreds of millions of other computers and their users.
Early computers were only conceived as calculating devices. Since ancient times, simple manual devices like the abacus aided people in doing calculations. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century; the first digital electronic calculating machines were developed during World War II. The speed and versatility of computers have been increasing ever since. Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU), and some form of memory. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices, output devices, and input/output devices that perform both functions. Peripheral devices allow information to be retrieved from an external source and they enable the result of operations to be saved and retrieved.
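The division of labor just described, with memory holding both instructions and data, a processing element executing them in sequence, and a control unit able to alter that sequence, can be sketched as a toy stored-program machine. The three-instruction set below is invented purely for illustration.

    # A toy stored-program machine: memory holds instructions and data side by
    # side, the processor executes instructions in order, and a conditional
    # jump lets stored information change the order of operations.
    def run(memory):
        acc, pc = 0, 0
        while True:
            op, arg = memory[pc]
            pc += 1
            if op == "LOAD":       # copy a value from memory into the accumulator
                acc = memory[arg]
            elif op == "ADD":      # arithmetic: add a memory value to the accumulator
                acc += memory[arg]
            elif op == "JNZ":      # control: jump if the accumulator is non-zero
                if acc != 0:
                    pc = arg
            elif op == "HALT":
                return acc

    program = [
        ("LOAD", 5),   # acc = memory[5]  (data stored alongside the program)
        ("ADD", 6),    # acc += memory[6]
        ("HALT", 0),
        None, None,
        2,             # memory[5]: data
        3,             # memory[6]: data
    ]
    assert run(program) == 5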
According to the Oxford English Dictionary, the first known use of the word "computer" was in 1613 in a book called The Yong Mans Gleanings by English writer Richard Braithwait: "I haue read the truest computer of Times, the best Arithmetician that euer breathed, he reduceth thy dayes into a short number." This usage of the term referred to a human computer, a person who carried out calculations or computations. The word continued with the same meaning until the middle of the 20th century. During the latter part of this period women were hired as computers because they could be paid less than their male counterparts. By 1943, most human computers were women. From the end of the 19th century the word began to take on its more familiar meaning, a machine that carries out computations; the Online Etymology Dictionary gives the first attested use of "computer" in the 1640s, meaning "one who calculates". The Online Etymology Dictionary states that the use of the term to mean "'calculating machine' is from 1897."
The Online Etymology Dictionary indicates that the "modern use" of the term, to mean "programmable digital electronic computer", dates from 1945. Devices have been used to aid computation for thousands of years, beginning with one-to-one correspondence with fingers; the earliest counting device was a form of tally stick. Record keeping aids throughout the Fertile Crescent included calculi, which represented counts of items, probably livestock or grains, sealed in hollow unbaked clay containers; the use of counting rods is another example. The abacus was used for arithmetic tasks; the Roman abacus was developed from devices used in Babylonia as early as 2400 BC. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money. The Antikythera mechanism is believed to be the earliest mechanical analog "computer", according to Derek J. de Solla Price.
It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to c. 100 BC. Devices of a level of complexity comparable to that of the Antikythera mechanism would not reappear until a thousand years later. Many mechanical aids to calculation and measurement were constructed for astronomical and navigation use; the planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in the early 11th century. The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BC and is attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer and gear-wheels was invented by Abi Bakr of Isfahan, Persia in 1235. Abū Rayhān al-Bīrūnī invented the first mechanical geared lunisolar calendar astrolabe, an early fixed-wired knowledge processing machine with a gear train and gear-wheels, c. 1000 AD.
The sector, a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions, such as squares and cube roots, was developed in
Mainframe computers or mainframes are computers used by large organizations for critical applications. They are larger and have more processing power than some other classes of computers, such as minicomputers, servers and personal computers. The term originally referred to the large cabinets, called "main frames", that housed the central processing unit and main memory of early computers; it was later used to distinguish high-end commercial machines from less powerful units. Most large-scale computer system architectures were established in the 1960s, but continue to evolve. Mainframe computers are often used as servers. Modern mainframe design is characterized less by raw computational speed and more by:
- Redundant internal engineering resulting in high reliability and security
- Extensive input-output facilities with the ability to offload to separate engines
- Strict backward compatibility with older software
- High hardware and computational utilization rates through virtualization to support massive throughput
- Hot-swapping of hardware, such as processors and memory
Their high stability and reliability enable these machines to run uninterrupted for long periods of time, with mean time between failures measured in decades. Mainframes have high availability, one of the primary reasons for their longevity, since they are used in applications where downtime would be costly or catastrophic; the term reliability, availability and serviceability (RAS) is a defining characteristic of mainframe computers, although proper planning and implementation are required to realize these features. In addition, mainframes are more secure than other computer types: the NIST vulnerabilities database, US-CERT, rates traditional mainframes such as IBM Z, Unisys Dorado and Unisys Libra as among the most secure, with vulnerabilities in the low single digits as compared with thousands for Windows, UNIX and Linux. Software upgrades usually require setting up the operating system or portions thereof, and are non-disruptive only when using virtualizing facilities such as IBM z/OS and Parallel Sysplex, or Unisys XPCL, which support workload sharing so that one system can take over another's application while it is being refreshed.
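To put figures like "mean time between failures measured in decades" in perspective, availability can be estimated from mean time between failures (MTBF) and mean time to repair (MTTR). The numbers below are hypothetical, chosen only to show the arithmetic:

    # Availability = MTBF / (MTBF + MTTR). Hypothetical figures: a machine
    # with an MTBF of 20 years and a 4-hour mean time to repair.
    mtbf_hours = 20 * 365 * 24   # 20 years expressed in hours
    mttr_hours = 4
    availability = mtbf_hours / (mtbf_hours + mttr_hours)
    downtime_min_per_year = (1 - availability) * 365 * 24 * 60
    print(f"availability = {availability:.6%}")               # ~99.9977%
    print(f"expected downtime = {downtime_min_per_year:.1f} min/year")  # ~12 min

Even with these generous repair assumptions, a 20-year MTBF corresponds to roughly twelve minutes of unplanned downtime per year.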
In the late 1950s, mainframes had only a rudimentary interactive interface and used sets of punched cards, paper tape, or magnetic tape to transfer data and programs. They operated in batch mode to support back-office functions such as payroll and customer billing, most of which were based on repeated tape-based sorting and merging operations followed by line printing to preprinted continuous stationery. When interactive user terminals were introduced, they were used exclusively for applications rather than program development. Typewriter and Teletype devices were common control consoles for system operators through the early 1970s, although ultimately supplanted by keyboard/display devices. By the early 1970s, many mainframes acquired interactive user terminals operating as timesharing computers, supporting hundreds of users along with batch processing. Users gained access through keyboard/typewriter terminals and specialized text terminal CRT displays with integral keyboards, or later from personal computers equipped with terminal emulation software.
By the 1980s, many mainframes supported graphic display terminals and terminal emulation, but not graphical user interfaces. This form of end-user computing became obsolete in the 1990s due to the advent of personal computers provided with GUIs. After 2000, modern mainframes partially or entirely phased out classic "green screen" and color display terminal access for end-users in favour of Web-style user interfaces. The infrastructure requirements were drastically reduced during the mid-1990s, when CMOS mainframe designs replaced the older bipolar technology. IBM claimed that its newer mainframes reduced data center energy costs for power and cooling, and reduced physical space requirements compared to server farms. Modern mainframes can run multiple different instances of operating systems at the same time; this technique of virtual machines allows applications to run as if they were on physically distinct computers. In this role, a single mainframe can replace many smaller servers.
While mainframes pioneered this capability, virtualization is now available on most families of computer systems, though not always to the same degree or level of sophistication. Mainframes can add or hot swap system capacity without disrupting system function, with a specificity and granularity not available with most server solutions. Modern mainframes, notably the IBM zSeries, System z9 and System z10 servers, offer two levels of virtualization: logical partitions (LPARs) and virtual machines. Many mainframe customers run two machines: one in their primary data center and one in their backup data center—fully active/active, or on standby—in case there is a catastrophe affecting the first building. Test, development and production workloads for applications and databases can run on a single machine, except for very large demands where the capacity of one machine might be limiting. Such a two-mainframe installation can support continuous business service, avoiding both planned and unplanned outages.
In practice many customers use multiple mainframes linked either by Parallel Sysplex and shared DASD, or with shared, geographically dispersed storage provided by EMC
The National Museum of American History: Kenneth E. Behring Center collects and displays the heritage of the United States in the areas of social, cultural and military history. Among the items on display is the original Star-Spangled Banner. The museum is part of the Smithsonian Institution and located on the National Mall at 14th Street and Constitution Avenue NW in Washington, D.C. The museum opened in 1964 as the Museum of History and Technology; it was one of the last structures designed by the renowned architectural firm McKim, Mead & White. In 1980, the museum was renamed the National Museum of American History to represent its mission: the collection, care and interpretation of objects that reflect the experience of the American people. In May 2012, John Gray became the new director; he retired from the post in May 2018 and was succeeded by Anthea M. Hartig, chief executive of the California Historical Society. The museum underwent an $85 million renovation from September 5, 2006 to November 21, 2008, during which time it was closed.
Skidmore, Owings & Merrill provided the architecture and interior design services for the renovation, led by Gary Haney. Major changes made during the renovation include: a new, five-story sky-lit atrium, surrounded by displays of artifacts that showcase the breadth of the museum's collection; a new grand staircase that links the museum's first and second floors; a new welcome center, with the addition of six landmark objects to orient visitors; new galleries, such as the Jerome and Dorothy Lemelson Hall of Invention; and an environmentally controlled chamber to protect the original Star-Spangled Banner flag. In 2012, the museum began a $37 million renovation of the west wing to add new exhibition spaces, public plazas and an education center. The renovation included panoramic windows overlooking the National Mall on all three floors and new interactive features in the exhibits. The first floor of the west wing reopened on July 1, 2015, with the second and third floors of the west wing reopening in 2016 and 2017, respectively.
Each wing of the museum's three exhibition floors is anchored by a landmark object to highlight the theme of that wing. These include the John Bull locomotive, the Greensboro, North Carolina lunch counter, and a one-of-a-kind draft wheel. Landmarks from pre-existing exhibits include the 1865 Vassar Telescope, a George Washington statue, a Red Cross ambulance, and a car from Disneyland's Dumbo Flying Elephant ride. Artifact walls, 275 feet of glass-fronted cases, line the second floor center core; the artifact walls are organized around themes including the arts. The lower level of the museum displays Taking America to Lunch, which celebrates the history of American lunch boxes; the museum's food court, the Stars and Stripes Café, and ride simulators are also located here. The first floor's East Wing has exhibits that feature technology; the John Bull locomotive is the signature artifact. The exhibits in the West Wing address innovation; they include Science in American Life, Robots on the Road, and Bon Appétit!
Julia Child's Kitchen. Spark!Lab is a hands-on exhibit of the Lemelson Center for the Study of Invention and Innovation; the Vassar Telescope is the signature artifact. A café and the main museum store are located on the first floor, which also contains the Constitution Avenue lobby as well as a space for temporary exhibits. The exhibitions in 2 East, the east wing of the second floor, consider American ideals and include the Albert Small Documents Gallery, which features rotating exhibitions; from November 21, 2008 through January 4, 2009, an original copy of the Gettysburg Address, on loan from the White House, was on display there. The Greensboro lunch counter is the signature artifact for this section of the museum. Located in the center of the second floor is the original Star-Spangled Banner flag that inspired Francis Scott Key's poem; the newly conserved flag, the centerpiece of the renovated museum, is displayed in a climate-controlled room at the heart of the museum. An interactive display by Potion Design, just across the room from the flag, features a full-size digital reproduction of the flag that allows patrons to learn more about it by touching different areas of the reproduction.
The George Washington statue, created in 1840 for the centennial of Washington's birth, is the signature artifact for 2 West, the west wing of the second floor of the museum. The second floor also houses the museum's new welcome center and a store; the second floor lobby leads out to the National Mall. Exhibits in the east wing of the third floor, 3 East, are focused on the United States at war; the Clara Barton Red Cross ambulance is the signature artifact. The center of the third floor, 3 Center, presents The American Presidency: A Glorious Burden, which explores the personal and public lives of the men who have held that office. It also features the popular permanent exhibit First Ladies of America, which presents their contributions and changing roles, and displays their dresses as a mark of changing times. The third-floor west wing, 3 West, has exhibits that feature entertainment and music; these include Thanks for the Memories: Music and Entertainment History, the Hall of Musical Instruments, and The Dolls' House.
A car from Disneyland's Dumbo the Flying Elephant ride is the signature artifact.
A transistor is a semiconductor device used to amplify or switch electronic signals and electrical power. It is composed of semiconductor material with at least three terminals for connection to an external circuit. A voltage or current applied to one pair of the transistor's terminals controls the current through another pair of terminals; because the controlled power can be higher than the controlling power, a transistor can amplify a signal. Today, some transistors are packaged individually, but many more are found embedded in integrated circuits. The transistor is the fundamental building block of modern electronic devices, and is ubiquitous in modern electronic systems. Julius Edgar Lilienfeld patented a field-effect transistor in 1926, but it was not possible to construct a working device at that time. The first working device was a point-contact transistor invented in 1947 by American physicists John Bardeen, Walter Brattain and William Shockley. The transistor revolutionized the field of electronics and paved the way for smaller and cheaper radios and computers, among other things.
The transistor is on the list of IEEE milestones in electronics, and Bardeen, Brattain and Shockley shared the 1956 Nobel Prize in Physics for their achievement. Most transistors are made from pure silicon or germanium, but certain other semiconductor materials can be used. A transistor may have only one kind of charge carrier, as in a field-effect transistor, or may have two kinds of charge carriers, as in bipolar junction transistor devices. Compared with the vacuum tube, transistors are smaller and require less power to operate; however, certain vacuum tubes have advantages over transistors at high operating frequencies or high operating voltages. Many types of transistors are made to standardized specifications by multiple manufacturers. The thermionic triode, a vacuum tube invented in 1907, enabled amplified radio technology and long-distance telephony, but the triode was a fragile device that consumed a substantial amount of power. In 1909 physicist William Eccles discovered the crystal diode oscillator. Physicist Julius Edgar Lilienfeld filed a patent for a field-effect transistor in Canada in 1925, intended as a solid-state replacement for the triode.
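The amplification described above can be made concrete for a bipolar junction transistor, where a small base current controls a much larger collector current (Ic ≈ β·Ib). The component values below are hypothetical, chosen only to illustrate the relationship:

    # Illustration of BJT current amplification (hypothetical values).
    beta = 100          # current gain, typical of a small-signal transistor
    i_base = 20e-6      # 20 microamps into the base: the controlling current
    i_collector = beta * i_base           # the controlled, much larger current
    i_emitter = i_base + i_collector      # Kirchhoff: the emitter carries both
    print(f"collector current = {i_collector * 1e3:.2f} mA")  # 2.00 mA
    print(f"emitter current   = {i_emitter * 1e3:.2f} mA")    # 2.02 mA

A 20 microamp input thus controls a 2 milliamp output, a hundredfold current gain, which is the sense in which the controlled power exceeds the controlling power.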
Lilienfeld filed identical patents in the United States in 1926 and 1928. However, Lilienfeld did not publish any research articles about his devices, nor did his patents cite any specific examples of a working prototype; because the production of high-quality semiconductor materials was still decades away, Lilienfeld's solid-state amplifier ideas would not have found practical use in the 1920s and 1930s even if such a device had been built. In 1934, German inventor Oskar Heil patented a similar device in Europe. From November 17, 1947, to December 23, 1947, John Bardeen and Walter Brattain at AT&T's Bell Labs in Murray Hill, New Jersey performed experiments and observed that when two gold point contacts were applied to a crystal of germanium, a signal was produced with the output power greater than the input. Solid State Physics Group leader William Shockley saw the potential in this and, over the next few months, worked to expand the knowledge of semiconductors. The term transistor was coined by John R. Pierce as a contraction of the term transresistance.
According to Lillian Hoddeson and Vicki Daitch, authors of a biography of John Bardeen, Shockley had proposed that Bell Labs' first patent for a transistor should be based on the field effect and that he be named as the inventor. Having unearthed Lilienfeld's patents, which had gone into obscurity years earlier, lawyers at Bell Labs advised against Shockley's proposal because the idea of a field-effect transistor that used an electric field as a "grid" was not new. Instead, what Bardeen and Brattain invented in 1947 was the first point-contact transistor. In acknowledgement of this accomplishment, Shockley, Bardeen and Brattain were jointly awarded the 1956 Nobel Prize in Physics "for their researches on semiconductors and their discovery of the transistor effect". In 1948, the point-contact transistor was independently invented by German physicists Herbert Mataré and Heinrich Welker while working at the Compagnie des Freins et Signaux, a Westinghouse subsidiary located in Paris. Mataré had previous experience in developing crystal rectifiers from silicon and germanium in the German radar effort during World War II.
Using this knowledge, he began researching the phenomenon of "interference" in 1947. By June 1948, witnessing currents flowing through point-contacts, Mataré produced consistent results using samples of germanium produced by Welker, similar to what Bardeen and Brattain had accomplished earlier in December 1947. Realizing that Bell Labs' scientists had invented the transistor before them, the company rushed to get its "transistron" into production for amplified use in France's telephone network. The first bipolar junction transistor was invented by Bell Labs' William Shockley, who applied for a patent on June 26, 1948. On April 12, 1950, Bell Labs chemists Gordon Teal and Morgan Sparks produced a working bipolar NPN junction amplifying germanium transistor, and Bell Labs announced the discovery of this new "sandwich" transistor in a press release on July 4, 1951. The first high-frequency transistor was the surface-barrier germanium transistor developed by Philco in 1953, capable of operating at up to 60 MHz.
These were made by etching depressions into an N-type germanium base from both sides with jets of indium sulfate until it was a few ten-thousandths of an inch thick; indium electroplated into the depressions formed the emitter. The first "prototype" pocket transistor radio was shown by I
A 19-inch rack is a standardized frame or enclosure for mounting multiple electronic equipment modules. Each module has a front panel that is 19 inches wide; the 19-inch dimension includes the edges, or "ears", that protrude on each side and allow the module to be fastened to the rack frame with screws. Common uses include computer servers, broadcast video, lighting and scientific lab equipment. Equipment designed to be placed in a rack is described as rack-mount, a rack-mount instrument, a rack-mounted system, a rack-mount chassis, rack-mountable, or simply shelf. The height of the electronic modules is standardized as multiples of 1.752 inches, or one rack unit or U; the industry-standard rack cabinet is 42U tall. The term relay rack appeared first in the world of telephony, and by 1911 it was also being used in railroad signaling. There is little evidence that the dimensions of these early racks were standardized. The 19-inch rack format with rack units of 1.75 inches was established as a standard by AT&T around 1922 in order to reduce the space required for repeater and termination equipment for toll cables.
The earliest repeaters, from 1914, were installed in ad-hoc fashion on shelves and in wooden boxes and cabinets. Once serial production started, they were built into custom-made racks, one per repeater; but in light of the rapid growth of the toll network, the engineering department of AT&T undertook a systematic redesign, resulting in a family of modular factory-assembled panels all "designed to mount on vertical supports spaced 19½ inches between centers. The height of the different panels will vary... but... in all cases to be a whole multiple of 1¾ inches". By 1934, it was an established standard with holes tapped for 12-24 screws with alternating spacings of 1.25 inches and 0.5 inches. The EIA standard was revised again in 1992 to comply with the 1988 public law 100-418, setting the standard U as 15.9 mm + 15.9 mm + 12.7 mm, making each "U" 44.50 millimetres. The 19-inch rack format has remained constant while the technology mounted within it has changed and the set of fields to which racks are applied has expanded.
The 19-inch standard rack arrangement is used throughout the telecommunication, audio, video and other industries, though the Western Electric 23-inch standard, with holes on 1-inch centers, is still used in legacy ILEC/CLEC facilities. Nineteen-inch racks in two-post or four-post form hold most equipment in modern data centers, ISP facilities, and professionally designed corporate server rooms; they allow for dense hardware configurations without occupying excessive floorspace or requiring shelving. Nineteen-inch racks are also often used to house professional audio and video equipment, including amplifiers, effects units, headphone amplifiers, and small-scale audio mixers. A third common use for rack-mounted equipment is industrial power and automation hardware. A piece of equipment being installed has a front panel height 1⁄32 inch less than the allotted number of Us; thus, a 1U rackmount computer is 1.721 inches tall, and a 2U unit is 3.473 inches instead of 3.504 inches. This gap allows a bit of room above and below an installed piece of equipment so it may be removed without binding on the adjacent equipment.
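The height arithmetic above is easy to verify. The helper below is a minimal sketch using the post-1992 value of 1.752 inches (44.50 mm) per U and the 1⁄32-inch reduction described in the text:

    # Front-panel height for rack-mount equipment: the allotted height in rack
    # units (1.752 in each under the post-1992 standard) minus a 1/32 in gap.
    U_INCHES = 1.752        # one rack unit, i.e. 44.50 mm
    GAP_INCHES = 1 / 32     # clearance so units can be removed without binding

    def panel_height(units):
        return units * U_INCHES - GAP_INCHES

    print(f"1U panel: {panel_height(1):.3f} in")   # 1.721 in
    print(f"2U panel: {panel_height(2):.3f} in")   # 3.473 in

Note that the 1⁄32-inch gap is subtracted once per panel, not once per U, so taller units lose proportionally less of their allotted height.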
Originally, the mounting holes were tapped with a particular screw thread. When rack rails are too thin to tap, rivnuts or other threaded inserts can be used, and when the particular class of equipment to be mounted is known in advance, some of the holes can be omitted from the mounting rails. Threaded mounting holes in racks where the equipment is frequently changed are problematic because the threads can be damaged or the mounting screws can break off, and tapping large numbers of holes that may never be used is expensive. Nevertheless, tapped-hole racks are still in use, typically for hardware that rarely changes; examples include telephone exchanges, network cabling panels, broadcast studios and some government and military applications. The tapped-hole rack was first replaced by the clearance-hole rack, whose holes are large enough to permit a bolt to be inserted through without binding, with the bolts fastened in place using cage nuts. In the event of a nut being stripped out or a bolt breaking, the nut can be removed and replaced with a new one. Production of clearance-hole racks is also less expensive because tapping the holes is eliminated and replaced with fewer, less expensive cage nuts.
The next innovation in rack design was the square-hole rack. Square-hole racks allow boltless mounting, such that the rack-mount equipment only needs to be inserted through and hooked down into the lip of the square hole. Installation and removal of hardware in a square-hole rack is easy and boltless: the weight of the equipment and small retention clips are all that is necessary to hold the equipment in place. Older equipment meant for round-hole or tapped-hole racks can still be used, with the use of cage nuts made for square-hole racks. Rack-mountable equipment is traditionally mounted by bolting or clipping its front panel to the rack. Within the IT industry, it is common for network/communications equipment to have multiple mounting positions, including table-top and wall mounting, so rack-mountable equipment will often feature L-brackets that must be screwed or bolted to the equipment prior to mounting in a 19-inch rack. With the prevalence of 23-