Computer data storage
Computer data storage, often called storage or memory, is a technology consisting of computer components and recording media that are used to retain digital data. It is a core function and fundamental component of computers; the central processing unit (CPU) of a computer is what manipulates data by performing computations. In practice, almost all computers use a storage hierarchy, which puts fast but expensive and small storage options close to the CPU and slower but larger and cheaper options farther away; the fast volatile technologies are referred to as "memory", while slower persistent technologies are referred to as "storage". In the von Neumann architecture, the CPU consists of two main parts: the control unit and the arithmetic logic unit. The former controls the flow of data between the CPU and memory, while the latter performs arithmetic and logical operations on data. Without a significant amount of memory, a computer would merely be able to perform fixed operations and immediately output the result; it would have to be reconfigured to change its behavior. This is acceptable for devices such as desk calculators, digital signal processors, and other specialized devices.
Von Neumann machines differ in having a memory in which they store their operating instructions and data. Such computers are more versatile in that they do not need to have their hardware reconfigured for each new program, but can simply be reprogrammed with new in-memory instructions. Most modern computers are von Neumann machines. A modern digital computer represents data using the binary numeral system. Text, pictures, and nearly any other form of information can be converted into a string of bits, or binary digits, each of which has a value of 1 or 0. The most common unit of storage is the byte, equal to 8 bits. A piece of information can be handled by any computer or device whose storage space is large enough to accommodate the binary representation of the piece of information, or simply data. For example, the complete works of Shakespeare, about 1250 pages in print, can be stored in about five megabytes with one byte per character. Data are encoded by assigning a bit pattern to each character, digit, or multimedia object.
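The back-of-the-envelope Shakespeare figure above can be checked in a few lines of Python. This is a minimal sketch; the characters-per-page density is an assumed illustrative value, not a measured one:

```python
# One byte per character under an ASCII-style encoding.
text = "To be, or not to be, that is the question"
data = text.encode("ascii")
assert len(data) == len(text)  # exactly one byte per character

# Each byte is 8 bits; render the first character as its bit pattern.
bits = format(data[0], "08b")
assert len(bits) == 8

# Rough storage estimate for ~1250 printed pages:
pages = 1250
chars_per_page = 4000            # assumed density, for illustration only
total_bytes = pages * chars_per_page
assert total_bytes == 5_000_000  # about five megabytes
```

At one byte per character, the estimate lands at roughly five million bytes, matching the "about five megabytes" figure in the text.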
Many standards exist for encoding. By adding bits to each encoded unit, redundancy allows the computer to both detect errors in coded data and correct them based on mathematical algorithms. Errors generally occur with low probability due to random bit-value flipping, "physical bit fatigue" (loss of the physical bit in storage of its ability to maintain a distinguishable value), or errors in inter- or intra-computer communication. A random bit flip is typically corrected upon detection. A bit, or a group of malfunctioning physical bits, is automatically fenced out (taken out of use by the device) and replaced with another functioning equivalent group in the device, where the corrected bit values are restored. The cyclic redundancy check (CRC) method is typically used in communications and storage for error detection; a detected error is then retried. Data compression methods allow, in many cases, a string of bits to be represented by a shorter bit string and reconstructed when needed; this uses less storage for many types of data, at the cost of more computation.
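Both ideas in the paragraph above, error detection by cyclic redundancy check and lossless compression, can be illustrated with Python's standard `zlib` module. This is a sketch of the general technique, not a description of any particular storage device's firmware:

```python
import zlib

message = b"data kept in storage must remain distinguishable"
checksum = zlib.crc32(message)

# Flip a single bit: the CRC no longer matches, so the error is detected.
corrupted = bytes([message[0] ^ 0b00000001]) + message[1:]
assert zlib.crc32(corrupted) != checksum

# Compression trades computation for space: a repetitive byte string
# shrinks dramatically and is reconstructed exactly on demand.
original = b"abab" * 1000
packed = zlib.compress(original)
assert len(packed) < len(original)
assert zlib.decompress(packed) == original
```

CRC detects the flipped bit but cannot correct it on its own; correction requires the extra redundancy (for example, error-correcting codes) described above.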
Analysis of the trade-off between storage cost savings and the costs of related computations and possible delays in data availability is done before deciding whether to keep certain data compressed or not. For security reasons, certain types of data may be kept encrypted in storage to prevent the possibility of unauthorized information reconstruction from chunks of storage snapshots. The lower a storage is in the hierarchy, the lower its bandwidth and the greater its access latency from the CPU. This traditional division of storage into primary, secondary and off-line storage is also guided by cost per bit. In contemporary usage, "memory" is usually semiconductor storage, such as read-write random-access memory (DRAM) or other forms of fast but temporary storage. "Storage" consists of storage devices and their media not directly accessible by the CPU (hard disk drives, optical disc drives, and other devices slower than RAM but non-volatile). Historically, memory has been called core memory, main memory, real storage or internal memory, while non-volatile storage devices have been referred to as secondary storage, external memory or auxiliary/peripheral storage.
Primary storage, often referred to simply as memory, is the only storage directly accessible to the CPU. The CPU continuously reads instructions stored there and executes them as required. Any data actively operated on is also stored there in a uniform manner. Early computers used delay lines, Williams tubes, or rotating magnetic drums as primary storage. By 1954, those unreliable methods were mostly replaced by magnetic core memory. Core memory remained dominant until the 1970s, when advances in integrated circuit technology allowed semiconductor memory to become economically competitive; this led to modern random-access memory (RAM).
An isolation transformer is a transformer used to transfer electrical power from a source of alternating current power to some equipment or device while isolating the powered device from the power source for safety reasons. Isolation transformers provide galvanic isolation and are used to protect against electric shock, to suppress electrical noise in sensitive devices, or to transfer power between two circuits which must not be connected. A transformer sold for isolation is built with special insulation between primary and secondary, and is specified to withstand a high voltage between windings. Isolation transformers block transmission of the DC component in signals from one circuit to the other, but allow AC components in signals to pass. Transformers that have a ratio of 1 to 1 between the primary and secondary windings are used to protect secondary circuits and individuals from electrical shocks between energized conductors and earth ground. Suitably designed isolation transformers block interference caused by ground loops.
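The 1:1 ratio mentioned above follows from the ideal-transformer relation, which ties the secondary voltage to the turns ratio (a standard idealization that neglects losses and leakage flux):

```latex
\frac{V_s}{V_p} = \frac{N_s}{N_p}, \qquad N_s = N_p \;\Rightarrow\; V_s = V_p
```

Power is transferred magnetically rather than through a conductive path, so even though the secondary voltage equals the primary voltage, the secondary circuit floats with respect to earth ground.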
Isolation transformers with electrostatic shields are used for power supplies for sensitive equipment such as computers, medical devices, or laboratory instruments. Sometimes the term is used to emphasize that a device is not an autotransformer, whose primary and secondary circuits are electrically connected. Power transformers that merely have specified insulation between primary and secondary are not described as "isolation transformers"; the term is reserved for transformers whose primary purpose is to isolate circuits. Isolation transformers are designed with attention to capacitive coupling between the two windings: the capacitance between primary and secondary windings would otherwise couple AC current from the primary to the secondary. A grounded Faraday shield between the primary and the secondary reduces the coupling of common-mode noise; this may be a metal strip surrounding a winding. Differential noise can still magnetically couple from the primary to the secondary of an isolation transformer and must be filtered out if it causes a problem.
Some small transformers are used for isolation in pulse circuits. In electronics testing and servicing, an isolation transformer is a 1:1 power transformer used for safety. Without it, exposed live metal in a device under test is at a hazardous voltage relative to grounded objects such as a heating radiator or an oscilloscope ground lead. With the transformer, as there is no conductive connection between the transformer secondary and earth, there is no danger in touching a live part of the circuit while another part of the body is earthed. Electrical isolation is considered especially important on medical equipment, and special standards apply; the system must additionally be designed so that fault conditions do not interrupt power, but instead generate a warning. Isolation transformers are also used for the power supply of devices not at ground potential. An example is the Austin transformer for the power supply of air-traffic obstacle warning lamps on radio antenna masts. Without the isolation transformer, the lighting circuits on the mast would conduct radio-frequency energy to ground through the power supply.
See also: Austin transformer, Balun, Galvanic isolation, Power quality, Transformer types, Zigzag transformer.
General Services Administration
The General Services Administration, an independent agency of the United States government, was established in 1949 to help manage and support the basic functioning of federal agencies. GSA supplies products and communications for U.S. government offices, provides transportation and office space to federal employees, and develops government-wide cost-minimizing policies, among other management tasks. GSA employs about 12,000 federal workers and has an annual operating budget of $20.9 billion. GSA oversees $66 billion of procurement annually and contributes to the management of about $500 billion in U.S. federal property, divided chiefly among 8,700 owned and leased buildings and a 215,000-vehicle motor pool. Among the real estate assets managed by GSA are the Ronald Reagan Building and International Trade Center in Washington, D.C. – the largest U.S. federal building after the Pentagon – and the Hart-Dole-Inouye Federal Center. GSA's business lines include the Federal Acquisition Service and the Public Buildings Service, as well as several Staff Offices including the Office of Government-wide Policy, the Office of Small Business Utilization, and the Office of Mission Assurance.
As part of FAS, GSA's Technology Transformation Services helps federal agencies improve delivery of information and services to the public. Key initiatives include FedRAMP, Cloud.gov, the USAGov platform, Data.gov, Performance.gov, and Challenge.gov. GSA is a member of the Procurement G6, an informal group leading the use of framework agreements and e-procurement instruments in public procurement. In 1947, President Harry Truman asked former President Herbert Hoover to lead what became known as the Hoover Commission to make recommendations to reorganize the operations of the federal government. One of the recommendations of the commission was the establishment of an "Office of the General Services." This proposed office would combine the responsibilities of the following organizations: the U.S. Treasury Department's Bureau of Federal Supply; the U.S. Treasury Department's Office of Contract Settlement; the National Archives Establishment; all functions of the Federal Works Agency, including the Public Buildings Administration and the Public Roads Administration; and the War Assets Administration. GSA became an independent agency on July 1, 1949, after the passage of the Federal Property and Administrative Services Act.
General Jess Larson, Administrator of the War Assets Administration, was named GSA's first Administrator. The first job awaiting Administrator Larson and the newly formed GSA was a complete renovation of the White House; the structure had fallen into such a state of disrepair by 1949 that one inspector of the time said the historic structure was standing "purely from habit." Larson explained the nature of the total renovation in depth by saying, "In order to make the White House structurally sound, it was necessary to dismantle, I mean dismantle, everything from the White House except the four walls, which were constructed of stone. Everything, except the four walls without a roof, was stripped down, that's where the work started." GSA worked with President Truman and First Lady Bess Truman to ensure that the new agency's first major project would be a success. GSA completed the renovation in 1952. In 1986 GSA headquarters, the U.S. General Services Administration Building, located at Eighteenth and F Streets, NW, was listed on the National Register of Historic Places, at the time serving as Interior Department offices.
In 1960, GSA created the Federal Telecommunications System, a government-wide intercity telephone system. In 1962, the Ad Hoc Committee on Federal Office Space created a new building program to address obsolete office buildings in Washington, D.C., resulting in the construction of many of the offices that now line Independence Avenue. In 1970, the Nixon administration created the Consumer Product Information Coordinating Center, now part of USAGov. In 1972, GSA established the Automated Data and Telecommunications Service, which later became the Office of Information Resources Management. In 1973, GSA created the Office of Federal Management Policy. In 1974, the Federal Buildings Fund was initiated, allowing GSA to issue rent bills to federal agencies. GSA's Office of Acquisition Policy centralized procurement policy in 1978. GSA was responsible for emergency preparedness and stockpiling strategic materials to be used in wartime until these functions were transferred to the newly created Federal Emergency Management Agency in 1979.
In 1984, GSA introduced the federal government to the use of charge cards, known as the GSA SmartPay system. The National Archives and Records Administration was spun off into an independent agency in 1985; the same year, GSA began to provide government-wide policy oversight and guidance for federal real property management as a result of an Executive Order signed by President Ronald Reagan. In 2003 the Federal Protective Service was moved to the Department of Homeland Security. In 2005 GSA reorganized to merge the Federal Supply Service and Federal Technology Service business lines into the Federal Acquisition Service. On April 3, 2009, President Barack Obama nominated Martha N. Johnson to serve as GSA Administrator. After a nine-month delay, the United States Senate confirmed her nomination on February 4, 2010. On April 2, 2012, Johnson resigned in the wake of a management-deficiency report that detailed improper payments for a 2010 "Western Regions" training conference put on by the Public Buildings Service in Las Vegas.
In July 1991 GSA contractors began the excavation of what is now the Ted Weiss Federal Building in New York City. The planning for that buildin
Ethernet physical layer
The Ethernet physical layer is the physical layer functionality of the Ethernet family of computer network standards. The physical layer defines the electrical or optical properties of the physical connection between a device and the network or between network devices; it is complemented by the logical link layer. The Ethernet physical layer has evolved over its existence, starting in 1980, and encompasses multiple physical media interfaces and several orders of magnitude of speed, from 1 Mbit/s to 400 Gbit/s. The physical medium ranges from bulky coaxial cable to optical fiber. In general, network protocol stack software will work on all physical layers. Many Ethernet adapters and switch ports support multiple speeds by using autonegotiation to set the speed and duplex to the best values supported by both connected devices. While this can be relied on for Ethernet over twisted pair, few optical-fiber ports support multiple speeds. If autonegotiation fails, some multiple-speed devices sense the speed used by their partner, but this may result in a duplex mismatch.
With rare exceptions, a 100BASE-TX port supports 10BASE-T, a 1000BASE-T port supports 10BASE-T and 100BASE-TX, and a 10GBASE-T port also supports 1000BASE-T. 10 Gigabit Ethernet was used in both enterprise and carrier networks by 2007, with 40 Gbit/s and 100 Gigabit Ethernet ratified subsequently. In 2017, the fastest additions to the Ethernet family ran at 400 Gbit/s. Layers are named by their specifications: 10, 100, 1000, 10G... – the nominal, usable speed at the top of the physical layer, excluding line codes but including other physical layer overhead. Most twisted-pair layers use unique encoding, so most often just -T is used. The reach for optical connections is defined as the maximum achievable link length, guaranteed to work when all channel parameters are met. With better channel parameters, a longer, stable link length can be achieved; conversely, a link with worse channel parameters can work, but only over a shorter distance. Reach and maximum distance have the same meaning. The following sections provide a brief summary of official Ethernet media types.
In addition to these official standards, many vendors have implemented proprietary media types for various reasons—often to support longer distances over fiber optic cabling. Early Ethernet standards used Manchester coding so that the signal was self-clocking and not adversely affected by high-pass filters. All Fast Ethernet variants use a star topology and 4B5B line coding. All Gigabit Ethernet variants also use a star topology; 1000BASE-X variants use 8b/10b PCS encoding. Half-duplex mode was included in the gigabit standard but has since been abandoned; few devices support gigabit speed in half-duplex. 2.5GBASE-T and 5GBASE-T are scaled-down variants of 10GBASE-T; these physical layers support twisted-pair copper cabling only. 10 Gigabit Ethernet defines a version of Ethernet with a nominal data rate of 10 Gbit/s, ten times as fast as Gigabit Ethernet. In 2002, the first 10 Gigabit Ethernet standard was published as IEEE Std 802.3ae-2002. Subsequent standards encompass media types for single-mode fibre, multi-mode fibre, copper backplane and copper twisted pair.
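The self-clocking property of Manchester coding used by early Ethernet, mentioned above, is easy to see in a toy encoder. This is a sketch following the IEEE 802.3 convention, where a logical 1 is sent as a low-to-high transition and a 0 as high-to-low:

```python
def manchester_encode(bits: str) -> str:
    """Encode a string of '0'/'1' bits into half-bit symbols.

    IEEE 802.3 convention: 1 -> low-high ("01"), 0 -> high-low ("10"),
    so every bit period contains a mid-bit transition the receiver can
    use to recover the clock from the signal itself.
    """
    return "".join("01" if b == "1" else "10" for b in bits)

encoded = manchester_encode("1011")
assert encoded == "01100101"

# Even a long run of identical bits never produces a flat (DC) signal:
assert "00" not in set()  # trivially true; the real check is below
assert all(pair in ("01", "10")
           for pair in [manchester_encode("0000")[i:i+2] for i in range(0, 8, 2)])
```

The cost is also visible: every data bit occupies two symbol periods, doubling the signaling rate, which is one reason Fast Ethernet moved to the more efficient 4B5B code noted above.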
All 10-gigabit standards were consolidated into IEEE Std 802.3-2008. Most 10-gigabit variants use 64b/66b PCS encoding. As of 2009, 10 Gigabit Ethernet was predominantly deployed in carrier networks, where 10GBASE-LR and 10GBASE-ER enjoy significant market shares. Single-lane 25-gigabit Ethernet is based on one 25.78125 GBd lane of the four from the 100 Gigabit Ethernet standard developed by task force P802.3by. 25GBASE-T over twisted pair was approved alongside 40GBASE-T within IEEE 802.3bq. 40 Gigabit Ethernet was standardized in June 2010 as IEEE 802.3ba along with the first 100 Gbit/s generation, with an addition in March 2011 as IEEE 802.3bg; the fastest yet twisted-pair variant followed in IEEE 802.3bq-2016. The IEEE 802.3cd Task Force has developed 50 Gbit/s along with next-generation 100 and 200 Gbit/s standards using 50 Gbit/s lanes. The first generation of 100G Ethernet using 10 and 25 Gbit/s lanes was standardized in June 2010 as IEEE 802.3ba alongside 40 Gbit/s.
The second generation using 50 Gbit/s lanes has been developed by the IEEE 802.3cd Task Force along with 50 and 200 Gbit/s standards. The third generation using a single 100 Gbit/s lane is being developed by the IEEE 802.3ck Task Force along with 200 and 400 Gbit/s PHYs and attachment unit interfaces using 100 Gbit/s lanes. First-generation 200 Gbit/s PHYs have been defined by the IEEE 802.3bs Task Force.
Prentice Hall is a major educational publisher owned by Pearson plc. Prentice Hall publishes print and digital content for the higher-education market, and distributes its technical titles through the Safari Books Online e-reference service. On October 13, 1913, law professor Charles Gerstenberg and his student Richard Ettinger founded Prentice Hall. Gerstenberg and Ettinger took their mothers' maiden names—Prentice and Hall—to name their new company. Prentice Hall was acquired by Gulf+Western in 1984 and became part of that company's publishing division, Simon & Schuster. Publication of trade books ended in 1991. Simon & Schuster's educational division, including Prentice Hall, was sold to Pearson by G+W successor Viacom in 1998. At least two authors found that their books had gone missing. One book, 'The Roof Builder's Handbook', was still being sold in 2018 for as much as $230 per new copy, but its author, William C. McElroy, was told by Pearson that all new copies were either destroyed or went missing in 1995.
Some 2,385 copies are missing. Prentice Hall is the publisher of Magruder's American Government, as well as Biology by Ken Miller and Joe Levine. Their artificial intelligence series includes Artificial Intelligence: A Modern Approach by Stuart J. Russell and Peter Norvig, and ANSI Common Lisp by Paul Graham. They also published the well-known computer programming book The C Programming Language by Brian Kernighan and Dennis Ritchie, and Operating Systems: Design and Implementation by Andrew S. Tanenbaum. Other titles include Dennis Nolan's Big Pig, Monster Bubbles: A Counting Book, Wizard McBean and his Flying Machine, Witch Bazooza, Llama Beans, and The Joy of Chickens. A Prentice Hall subsidiary, Reston Publishing, was in the foreground of technical-book publishing when microcomputers were first becoming available. It was still unclear who would be buying and using "personal computers," and the scarcity of useful software and instruction created a publishing market niche whose target audience had yet to be defined.
In the spirit of the pioneers who made PCs possible, Reston Publishing's editors addressed non-technical users with the reassuring, mildly experimental Computer Anatomy for Beginners by Marlin Ouverson of People's Computer Company. They followed with a collection of books, by and for programmers, building a stalwart list of titles relied on by many in the first generation of microcomputer users. External links: Prentice Hall International Series in Computer Science; Prentice Hall website; Prentice Hall School website; Prentice Hall Higher Education website; Prentice Hall Professional Technical Reference website.
Telecommunication is the transmission of signs, messages, writings and sounds, or information of any nature, by wire, optical or other electromagnetic systems. Telecommunication occurs when the exchange of information between communication participants includes the use of technology. Information is transmitted either electrically over physical media, such as cables, or via electromagnetic radiation. Such transmission paths are often divided into communication channels, which afford the advantages of multiplexing. Since the Latin term communicatio is considered the social process of information exchange, the term telecommunications is often used in its plural form because it involves many different technologies. Early means of communicating over a distance included visual signals, such as beacons, smoke signals, semaphore telegraphs, signal flags and optical heliographs. Other examples of pre-modern long-distance communication included audio messages, such as coded drumbeats, lung-blown horns and loud whistles. 20th- and 21st-century technologies for long-distance communication involve electrical and electromagnetic technologies, such as telegraph and teleprinter, radio, microwave transmission, fiber optics and communications satellites.
A revolution in wireless communication began in the first decade of the 20th century with the pioneering developments in radio communications by Guglielmo Marconi, who won the Nobel Prize in Physics in 1909, and by other notable pioneering inventors and developers in the field of electrical and electronic telecommunications. These included Charles Wheatstone and Samuel Morse, Alexander Graham Bell, Edwin Armstrong and Lee de Forest, as well as Vladimir K. Zworykin, John Logie Baird and Philo Farnsworth. The word telecommunication is a compound of the Greek prefix tele, meaning distant, far off, or afar, and the Latin communicare, meaning to share. Its modern use is adapted from the French, because its written use was first recorded in 1904 by the French engineer and novelist Édouard Estaunié. Communication was first used as an English word in the late 14th century; it comes from Old French comunicacion, from Latin communicationem, a noun of action from the past participle stem of communicare, "to share, divide out."
Homing pigeons have been used throughout history by different cultures. Pigeon post had Persian roots and was used by the Romans to aid their military. Frontinus said that Julius Caesar used pigeons as messengers in his conquest of Gaul; the Greeks conveyed the names of the victors at the Olympic Games to various cities using homing pigeons. In the early 19th century, the Dutch government used the system in Sumatra, and in 1849, Paul Julius Reuter started a pigeon service to fly stock prices between Aachen and Brussels, a service that operated for a year until the gap in the telegraph link was closed. In the Middle Ages, chains of beacons were used on hilltops as a means of relaying a signal. Beacon chains suffered the drawback that they could only pass a single bit of information, so the meaning of the message, such as "the enemy has been sighted", had to be agreed upon in advance. One notable instance of their use was during the Spanish Armada, when a beacon chain relayed a signal from Plymouth to London. In 1792, Claude Chappe, a French engineer, built the first fixed visual telegraphy system between Lille and Paris.
However, semaphore suffered from the need for skilled operators and expensive towers at intervals of ten to thirty kilometres. As a result of competition from the electrical telegraph, the last commercial line was abandoned in 1880. On 25 July 1837, the first commercial electrical telegraph was demonstrated by English inventor Sir William Fothergill Cooke and English scientist Sir Charles Wheatstone. Both inventors viewed their device as "an improvement to the electromagnetic telegraph", not as a new device. Samuel Morse independently developed a version of the electrical telegraph that he unsuccessfully demonstrated on 2 September 1837; his code was an important advance over Wheatstone's signaling method. The first transatlantic telegraph cable was completed on 27 July 1866, allowing transatlantic telecommunication for the first time. The conventional telephone was invented independently by Alexander Bell and Elisha Gray in 1876. Antonio Meucci had invented the first device that allowed the electrical transmission of voice over a line in 1849.
However, Meucci's device was of little practical value because it relied upon the electrophonic effect and thus required users to place the receiver in their mouth to "hear" what was being said. The first commercial telephone services were set up in 1878 and 1879 on both sides of the Atlantic, in the cities of New Haven and London. Starting in 1894, Italian inventor Guglielmo Marconi began developing wireless communication using the newly discovered phenomenon of radio waves, showing by 1901 that radio signals could be transmitted across the Atlantic Ocean. This was the start of wireless telegraphy by radio. Voice and music transmissions had little early success. World War I accelerated the development of radio for military communications. After the war, commercial radio AM broadcasting began in the 1920s and became an important mass medium for entertainment and news. World War II again accelerated development of radio for the wartime purposes of aircraft and land communication, radio navigation and radar. Development of stereo FM broadcasting of radio
In electrical engineering, two conductors are said to be inductively coupled or magnetically coupled when they are configured such that a change in current through one wire induces a voltage across the ends of the other wire through electromagnetic induction. A changing current through the first wire creates a changing magnetic field around it by Ampère's circuital law; the changing magnetic field induces an electromotive force in the second wire by Faraday's law of induction. The amount of inductive coupling between two conductors is measured by their mutual inductance. The coupling between two wires can be increased by winding them into coils and placing them close together on a common axis, so the magnetic field of one coil passes through the other coil. Coupling can also be increased by placing a magnetic core of a ferromagnetic material, like iron or ferrite, inside the coils, which increases the magnetic flux. The two coils may be physically contained in a single unit, as in the primary and secondary windings of a transformer, or may be separated.
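The coupling described above is quantified by the mutual inductance M: by Faraday's law, the open-circuit voltage induced in the second conductor is proportional to the rate of change of current in the first. These are standard textbook relations, with k the dimensionless coupling coefficient and the sign fixed by the chosen reference directions:

```latex
v_2(t) = -M \, \frac{di_1(t)}{dt}, \qquad M = k \sqrt{L_1 L_2}, \qquad 0 \le k \le 1
```

Winding the coils on a common ferromagnetic core raises k toward 1, which is exactly what the transformer arrangement in the paragraph above exploits.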
Coupling may also be unintentional. Unintentional inductive coupling can cause signals from one circuit to be induced into a nearby circuit; this is called cross-talk and is a form of electromagnetic interference. An inductively coupled transponder consists of a solid-state transceiver chip connected to a large coil that functions as an antenna. When brought within the oscillating magnetic field of a reader unit, the transceiver is powered up by energy inductively coupled into its antenna and transfers data back to the reader unit inductively. Magnetic coupling between two magnets can also be used to mechanically transfer power without contact, as in the magnetic gear. Inductive coupling is used throughout electrical technology. For example, when a metallic pipeline is installed parallel to a high-voltage power line, the pipeline, which is a conductor and is insulated from the earth by its protective coating, can develop voltages which are hazardous to personnel operating valves or otherwise contacting the pipeline