In electronics, an analog-to-digital converter (ADC) is a system that converts an analog signal, such as a sound picked up by a microphone or light entering a digital camera, into a digital signal. An ADC may also provide an isolated measurement, such as an electronic device that converts an input analog voltage or current to a digital number representing the magnitude of the voltage or current. Typically the digital output is a two's complement binary number proportional to the input, but there are other possibilities. There are several ADC architectures. Due to the complexity and the need for precisely matched components, all but the most specialized ADCs are implemented as integrated circuits. A digital-to-analog converter (DAC) performs the reverse function. An ADC converts a continuous-time, continuous-amplitude analog signal to a discrete-time, discrete-amplitude digital signal. The conversion involves quantization of the input, so it necessarily introduces a small amount of error or noise. Furthermore, instead of continuously performing the conversion, an ADC does the conversion periodically, sampling the input, which limits the allowable bandwidth of the input signal.
The performance of an ADC is primarily characterized by its bandwidth and signal-to-noise ratio (SNR). The bandwidth of an ADC is characterized by its sampling rate; the SNR of an ADC is influenced by many factors, including the resolution and accuracy, aliasing and jitter. The SNR of an ADC is often summarized in terms of its effective number of bits (ENOB), the number of bits of each measure it returns that are on average not noise. An ideal ADC has an ENOB equal to its resolution. ADCs are chosen to match the bandwidth and required SNR of the signal to be digitized. If an ADC operates at a sampling rate greater than twice the bandwidth of the signal, then per the Nyquist–Shannon sampling theorem, perfect reconstruction is possible. The presence of quantization error limits the SNR of even an ideal ADC. However, if the SNR of the ADC exceeds that of the input signal, its effects may be neglected, resulting in an essentially perfect digital representation of the analog input signal. The resolution of the converter indicates the number of discrete values it can produce over the range of analog values.
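For an ideal N-bit converter, SNR and ENOB are linked by the standard relation SNR_dB ≈ 6.02·N + 1.76. A minimal sketch inverting that relation (the function name `enob` is illustrative, not a standard API):

```python
# Estimate the effective number of bits (ENOB) from a measured SNR,
# using the standard ideal-ADC relation SNR_dB = 6.02*N + 1.76.
def enob(snr_db: float) -> float:
    """Invert SNR_dB = 6.02*N + 1.76 to get the effective bit count."""
    return (snr_db - 1.76) / 6.02

# An ideal 12-bit ADC has SNR ≈ 6.02*12 + 1.76 = 74.0 dB:
print(round(enob(74.0), 2))  # → 12.0
```

A real converter's ENOB is below its nominal resolution because noise, distortion and jitter reduce the measured SNR.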
The resolution determines the magnitude of the quantization error and therefore determines the maximum possible average signal-to-noise ratio for an ideal ADC without the use of oversampling. The values are usually stored electronically in binary form, so the resolution is usually expressed as the audio bit depth. In consequence, the number of discrete values available is usually a power of two. For example, an ADC with a resolution of 8 bits can encode an analog input to one of 256 different levels; the values can represent different ranges depending on the application. Resolution can also be defined electrically and expressed in volts. The change in voltage required to guarantee a change in the output code level is called the least significant bit (LSB) voltage. The resolution Q of the ADC is equal to the LSB voltage. The voltage resolution of an ADC is equal to its overall voltage measurement range divided by the number of intervals: Q = EFSR / 2^M, where M is the ADC's resolution in bits and EFSR is the full-scale voltage range.
EFSR is given by EFSR = VRefHi − VRefLow, where VRefHi and VRefLow are the upper and lower extremes of the voltages that can be coded. The number of voltage intervals is given by N = 2^M, where M is the ADC's resolution in bits; that is, one voltage interval is assigned between each pair of consecutive code levels. Example: coding scheme as in figure 1; full-scale measurement range = 0 to 1 volt; ADC resolution is 3 bits: 2^3 = 8 quantization levels; ADC voltage resolution Q = 1 V / 8 = 0.125 V. In many cases, the useful resolution of a converter is limited by the signal-to-noise ratio and other errors in the overall system, expressed as an ENOB. Quantization error is introduced by quantization in an ideal ADC; it is a rounding error between the analog input voltage and the output digitized value. The error is signal-dependent. In an ideal ADC, where the quantization error is uniformly distributed between −1/2 LSB and +1/2 LSB and the signal has a uniform distribution covering all quantization levels, the signal-to-quantization-noise ratio is given by SQNR = 20 log10(2^Q) ≈ 6.02 ⋅ Q dB, where Q is the number of quantization bits.
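These formulas can be checked numerically. A short sketch (the helper names are illustrative, not a standard API), reproducing the 3-bit worked example and the SQNR relation:

```python
import math

def lsb_voltage(v_ref_hi: float, v_ref_lo: float, bits: int) -> float:
    """Q = EFSR / 2^M: the voltage step of one code level."""
    return (v_ref_hi - v_ref_lo) / (2 ** bits)

def sqnr_db(bits: int) -> float:
    """SQNR = 20*log10(2^Q) ≈ 6.02*Q dB for a full-scale uniform signal."""
    return 20 * math.log10(2 ** bits)

# 3-bit ADC over 0–1 V, as in the worked example:
print(lsb_voltage(1.0, 0.0, 3))   # → 0.125
# 16-bit ADC: quantization noise ≈ 96.3 dB below full scale
print(round(sqnr_db(16), 1))      # → 96.3
```

The second result is the 96.3 dB figure quoted below for a 16-bit converter.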
For example, for a 16-bit ADC, the quantization error is 96.3 dB below the maximum level. Quantization error is distributed from DC to the Nyquist frequency; consequently, if part of the ADC's bandwidth is not used, as is the case
General Services Administration
The General Services Administration (GSA), an independent agency of the United States government, was established in 1949 to help manage and support the basic functioning of federal agencies. GSA supplies products and communications for U.S. government offices, provides transportation and office space to federal employees, develops government-wide cost-minimizing policies, and performs other management tasks. GSA employs about 12,000 federal workers and has an annual operating budget of roughly $20.9 billion. GSA oversees $66 billion of procurement annually and contributes to the management of about $500 billion in U.S. federal property, divided chiefly among 8,700 owned and leased buildings and a 215,000-vehicle motor pool. Among the real estate assets managed by GSA are the Ronald Reagan Building and International Trade Center in Washington, D.C. – the largest U.S. federal building after the Pentagon – and the Hart-Dole-Inouye Federal Center. GSA's business lines include the Federal Acquisition Service (FAS) and the Public Buildings Service, as well as several Staff Offices, including the Office of Government-wide Policy, the Office of Small Business Utilization, and the Office of Mission Assurance.
As part of FAS, GSA's Technology Transformation Services helps federal agencies improve delivery of information and services to the public. Key initiatives include FedRAMP, Cloud.gov, the USAGov platform, Data.gov, Performance.gov, and Challenge.gov. GSA is a member of the Procurement G6, an informal group leading the use of framework agreements and e-procurement instruments in public procurement. In 1947, President Harry Truman asked former President Herbert Hoover to lead what became known as the Hoover Commission to make recommendations to reorganize the operations of the federal government. One of the recommendations of the commission was the establishment of an "Office of the General Services." This proposed office would combine the responsibilities of the following organizations:

- U.S. Treasury Department's Bureau of Federal Supply
- U.S. Treasury Department's Office of Contract Settlement
- National Archives Establishment
- All functions of the Federal Works Agency, including the Public Buildings Administration and the Public Roads Administration
- War Assets Administration

GSA became an independent agency on July 1, 1949, after the passage of the Federal Property and Administrative Services Act.
General Jess Larson, Administrator of the War Assets Administration, was named GSA's first Administrator. The first job awaiting Administrator Larson and the newly formed GSA was a complete renovation of the White House; the structure had fallen into such a state of disrepair by 1949 that one inspector of the time said the historic structure was standing "purely from habit." Larson explained the nature of the total renovation in depth by saying, "In order to make the White House structurally sound, it was necessary to dismantle, I mean dismantle, everything from the White House except the four walls, which were constructed of stone. Everything, except the four walls without a roof, was stripped down, that's where the work started." GSA worked with President Truman and First Lady Bess Truman to ensure that the new agency's first major project would be a success. GSA completed the renovation in 1952. In 1986, GSA's headquarters, the U.S. General Services Administration Building, located at Eighteenth and F Streets, NW, was listed on the National Register of Historic Places, at the time serving as Interior Department offices.
In 1960 GSA created the Federal Telecommunications System, a government-wide intercity telephone system. In 1962 the Ad Hoc Committee on Federal Office Space created a new building program to address obsolete office buildings in Washington, D.C., resulting in the construction of many of the offices that now line Independence Avenue. In 1970 the Nixon administration created the Consumer Product Information Coordinating Center, now part of USAGov. In 1972 GSA established the Automated Data and Telecommunications Service, which later became the Office of Information Resources Management. In 1973 GSA created the Office of Federal Management Policy. In 1974 the Federal Buildings Fund was initiated, allowing GSA to issue rent bills to federal agencies. GSA's Office of Acquisition Policy centralized procurement policy in 1978. GSA was responsible for emergency preparedness and for stockpiling strategic materials to be used in wartime until these functions were transferred to the newly created Federal Emergency Management Agency in 1979.
In 1984 GSA introduced the federal government to the use of charge cards, known as the GSA SmartPay system. The National Archives and Records Administration was spun off into an independent agency in 1985; the same year, GSA began to provide government-wide policy oversight and guidance for federal real property management as a result of an Executive Order signed by President Ronald Reagan. In 2003 the Federal Protective Service was moved to the Department of Homeland Security. In 2005 GSA reorganized, merging the Federal Supply Service and Federal Technology Service business lines into the Federal Acquisition Service. On April 3, 2009, President Barack Obama nominated Martha N. Johnson to serve as GSA Administrator. After a nine-month delay, the United States Senate confirmed her nomination on February 4, 2010. On April 2, 2012, Johnson resigned in the wake of a management-deficiency report that detailed improper payments for a 2010 "Western Regions" training conference put on by the Public Buildings Service in Las Vegas.
In July 1991 GSA contractors began the excavation of what is now the Ted Weiss Federal Building in New York City. The planning for that buildin
Quantization (signal processing)
Quantization, in mathematics and digital signal processing, is the process of mapping input values from a large set to output values in a smaller set with a finite number of elements. Rounding and truncation are typical examples of quantization processes. Quantization is involved to some degree in nearly all digital signal processing, as the process of representing a signal in digital form ordinarily involves rounding. Quantization also forms the core of all lossy compression algorithms. The difference between an input value and its quantized value is referred to as quantization error. A device or algorithmic function that performs quantization is called a quantizer. An analog-to-digital converter is an example of a quantizer. Because quantization is a many-to-few mapping, it is an inherently non-linear and irreversible process. The set of possible input values may be infinitely large, and may be continuous and therefore uncountable; the set of possible output values may be finite or countably infinite. The input and output sets involved in quantization can be defined in a rather general way.
For example, vector quantization is the application of quantization to multi-dimensional input data. An analog-to-digital converter can be modeled as two processes: sampling and quantization. Sampling converts a time-varying voltage signal into a discrete-time signal, a sequence of real numbers. Quantization replaces each real number with an approximation from a finite set of discrete values. Most commonly, these discrete values are represented as fixed-point words. Though any number of quantization levels is possible, common word-lengths are 8-bit, 16-bit and 24-bit. Quantizing a sequence of numbers produces a sequence of quantization errors, sometimes modeled as an additive random signal called quantization noise because of its stochastic behavior; the more levels a quantizer uses, the lower its quantization noise power. Rate–distortion optimized quantization is encountered in source coding for lossy data compression algorithms, where the purpose is to manage distortion within the limits of the bit rate supported by a communication channel or storage medium.
The analysis of quantization in this context involves studying the amount of data used to represent the output of the quantizer, and studying the loss of precision introduced by the quantization process. As an example, rounding a real number x to the nearest integer value forms a basic type of quantizer – a uniform one. A typical uniform quantizer with a quantization step size equal to some value Δ can be expressed as Q(x) = Δ ⋅ ⌊x/Δ + 1/2⌋, where the notation ⌊ ⌋ denotes the floor function. The essential property of a quantizer is that it has a countable set of possible output values with fewer members than the set of possible input values. The members of the set of output values may have integer, rational, or real values. For simple rounding to the nearest integer, the step size Δ is equal to 1. With Δ = 1, or with Δ equal to any other integer value, this quantizer has real-valued inputs and integer-valued outputs. When the quantization step size is small relative to the variation in the signal being quantized, it is relatively simple to show that the mean squared error produced by such a rounding operation will be approximately Δ²/12.
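A minimal numerical sketch of this uniform quantizer and the Δ²/12 approximation, assuming uniformly distributed inputs (function names are illustrative):

```python
import math
import random

# Mid-tread uniform quantizer Q(x) = Δ·⌊x/Δ + 1/2⌋, as in the text.
def quantize(x: float, delta: float) -> float:
    return delta * math.floor(x / delta + 0.5)

# Empirically check the Δ²/12 mean-squared-error approximation
# using uniformly distributed inputs (illustrative sketch only).
random.seed(0)
delta = 0.25
samples = [random.uniform(-10, 10) for _ in range(100_000)]
mse = sum((x - quantize(x, delta)) ** 2 for x in samples) / len(samples)
print(abs(mse - delta**2 / 12) < 1e-3)  # → True (Δ²/12 ≈ 0.00521)
```

With Δ = 1 this reduces to ordinary rounding to the nearest integer, matching the example in the text.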
This mean squared error is also called the quantization noise power. Adding one bit to the quantizer halves the value of Δ, which reduces the noise power by the factor 1/4. In terms of decibels, the noise power change is 10 ⋅ log10(1/4) ≈ −6 dB. Because the set of possible output values of a quantizer is countable, any quantizer can be decomposed into two distinct stages, which can be referred to as the classification stage and the reconstruction stage, where the classification stage maps the input value to an integer quantization index k and the reconstruction stage maps the index k to the reconstruction value y_k, the output approx
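The two-stage decomposition described above can be sketched as follows for the uniform case, where the reconstruction value is y_k = k·Δ (function names are illustrative, not a standard API):

```python
import math

def classify(x: float, delta: float) -> int:
    """Classification stage: map an input value to an integer index k."""
    return math.floor(x / delta + 0.5)

def reconstruct(k: int, delta: float) -> float:
    """Reconstruction stage: map index k to the representative y_k = k*Δ."""
    return k * delta

delta = 0.5
x = 1.3
k = classify(x, delta)        # 1.3/0.5 + 0.5 = 3.1, floor → 3
y = reconstruct(k, delta)     # 3 * 0.5 = 1.5
print(k, y)  # → 3 1.5
```

Composing the two stages reproduces the uniform quantizer Q(x) = Δ·⌊x/Δ + 1/2⌋; only the integer index k needs to be stored or transmitted, which is what makes this decomposition useful in coding applications.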
In communication systems, signal processing, and electrical engineering, a signal is a function that "conveys information about the behavior or attributes of some phenomenon". In its most common usage, in electronics and telecommunication, this is a time-varying voltage, current or electromagnetic wave used to carry information. A signal may also be defined as an "observable change in a quantifiable entity". In the physical world, any quantity exhibiting variation in time or variation in space is a signal that might provide information on the status of a physical system, or convey a message between observers, among other possibilities. The IEEE Transactions on Signal Processing states that the term "signal" includes audio, speech, communication, sonar, radar and musical signals. In a later effort to redefine a signal, anything that is only a function of space, such as an image, is excluded from the category of signals; it is also stated that a signal may or may not contain any information. In nature, signals can take the form of any action by one organism able to be perceived by other organisms, ranging from the release of chemicals by plants to alert nearby plants of the same type of a predator, to sounds or motions made by animals to alert other animals of the presence of danger or of food.
Signaling occurs in organisms all the way down to the cellular level, with cell signaling. Signaling theory, in evolutionary biology, proposes that a substantial driver for evolution is the ability for animals to communicate with each other by developing ways of signaling. In human engineering, signals are typically provided by a sensor, and often the original form of a signal is converted to another form of energy using a transducer. For example, a microphone converts an acoustic signal to a voltage waveform, and a speaker does the reverse. The formal study of the information content of signals is the field of information theory. The information in a signal is usually accompanied by noise. The term noise means an undesirable random disturbance, but is often extended to include unwanted signals conflicting with the desired signal. The prevention of noise is covered in part under the heading of signal integrity. The separation of desired signals from a background is the field of signal recovery, one branch of which is estimation theory, a probabilistic approach to suppressing random disturbances.
Engineering disciplines such as electrical engineering have led the way in the design and implementation of systems involving transmission and manipulation of information. In the latter half of the 20th century, electrical engineering itself separated into several disciplines, specialising in the design and analysis of systems that manipulate physical signals. Definitions specific to sub-fields are common. For example, in information theory, a signal is a codified message, that is, the sequence of states in a communication channel that encodes a message. In the context of signal processing, signals are analog and digital representations of analog physical quantities. In terms of their spatial distributions, signals may be categorized as point source signals and distributed source signals. In a communication system, a transmitter encodes a message to create a signal, carried to a receiver by the communications channel. For example, the words "Mary had a little lamb" might be the message spoken into a telephone.
The telephone transmitter converts the sounds into an electrical signal. The signal is transmitted to the receiving telephone by wires. In telephone networks, signaling, for example common-channel signaling, refers to phone number and other digital control information rather than the actual voice signal. Signals can be categorized in various ways. The most common distinction is between the discrete and continuous spaces that the functions are defined over, for example discrete and continuous time domains. Discrete-time signals are often referred to as time series in other fields. Continuous-time signals are often referred to as continuous signals. A second important distinction is between discrete-valued and continuous-valued signals. In digital signal processing, a digital signal may be defined as a sequence of discrete values associated with an underlying continuous-valued physical process. In digital electronics, digital signals are the continuous-time waveform signals in a digital system, representing a bit-stream. Another important property of a signal is its information content.
Two main types of signals encountered in practice are analog and digital. The figure shows a digital signal that results from approximating an analog signal by its values at particular time instants. Digital signals are quantized, while analog signals are continuous. An analog signal is any continuous signal for which the time-varying feature of the signal is a representation of some other time-varying quantity, i.e. analogous to another time-varying signal. For example, in an analog audio signal, the instantaneous voltage of the signal varies continuously with the pressure of the sound waves. It differs from a digital signal, in which the continuous quantity is a representation of a sequence of discrete values which can only take on one of a finite number of values. The term analog signal usually refers to electrical signals. An analog signal uses some property of the medium to convey the signal's information. For ex
Telecommunication is the transmission of signs, messages, writings and sounds or information of any nature by wire, optical or other electromagnetic systems. Telecommunication occurs when the exchange of information between communication participants includes the use of technology. Information is transmitted either electrically over physical media, such as cables, or via electromagnetic radiation. Such transmission paths are often divided into communication channels, which afford the advantages of multiplexing. Since the Latin term communicatio is considered the social process of information exchange, the term telecommunications is often used in its plural form because it involves many different technologies. Early means of communicating over a distance included visual signals, such as beacons, smoke signals, semaphore telegraphs, signal flags and optical heliographs. Other examples of pre-modern long-distance communication included audio messages, such as coded drumbeats, lung-blown horns and loud whistles. 20th- and 21st-century technologies for long-distance communication involve electrical and electromagnetic technologies, such as telegraph and teleprinter, radio, microwave transmission, fiber optics and communications satellites.
A revolution in wireless communication began in the first decade of the 20th century with the pioneering developments in radio communications by Guglielmo Marconi, who won the Nobel Prize in Physics in 1909, and by other notable pioneering inventors and developers in the field of electrical and electronic telecommunications. These included Charles Wheatstone and Samuel Morse, Alexander Graham Bell, Edwin Armstrong and Lee de Forest, as well as Vladimir K. Zworykin, John Logie Baird and Philo Farnsworth. The word telecommunication is a compound of the Greek prefix tele, meaning distant, far off, or afar, and the Latin communicare, meaning to share. Its modern use is adapted from the French, because its written use was recorded in 1904 by the French engineer and novelist Édouard Estaunié. Communication was first used as an English word in the late 14th century. It comes from Old French comunicacion, from Latin communicationem, a noun of action from the past participle stem of communicare, "to share, divide out."
Homing pigeons have been used throughout history by different cultures. Pigeon post had Persian roots, and was later used by the Romans to aid their military. Frontinus said that Julius Caesar used pigeons as messengers in his conquest of Gaul. The Greeks also conveyed the names of the victors at the Olympic Games to various cities using homing pigeons. In the early 19th century, the Dutch government used the system in Sumatra, and in 1849, Paul Julius Reuter started a pigeon service to fly stock prices between Aachen and Brussels, a service that operated for a year until the gap in the telegraph link was closed. In the Middle Ages, chains of beacons were used on hilltops as a means of relaying a signal. Beacon chains suffered the drawback that they could only pass a single bit of information, so the meaning of the message, such as "the enemy has been sighted", had to be agreed upon in advance. One notable instance of their use was during the Spanish Armada, when a beacon chain relayed a signal from Plymouth to London. In 1792, Claude Chappe, a French engineer, built the first fixed visual telegraphy system between Lille and Paris.
However, semaphore suffered from the need for skilled operators and expensive towers at intervals of ten to thirty kilometres. As a result of competition from the electrical telegraph, the last commercial semaphore line was abandoned in 1880. On 25 July 1837 the first commercial electrical telegraph was demonstrated by English inventor Sir William Fothergill Cooke and English scientist Sir Charles Wheatstone. Both inventors viewed their device as "an improvement to the electromagnetic telegraph", not as a new device. Samuel Morse independently developed a version of the electrical telegraph that he unsuccessfully demonstrated on 2 September 1837; his code was an important advance over Wheatstone's signaling method. The first transatlantic telegraph cable was completed on 27 July 1866, allowing transatlantic telecommunication for the first time. The conventional telephone was invented independently by Alexander Bell and Elisha Gray in 1876. Antonio Meucci had invented the first device that allowed the electrical transmission of voice over a line in 1849.
However, Meucci's device was of little practical value because it relied upon the electrophonic effect and thus required users to place the receiver in their mouth to "hear" what was being said. The first commercial telephone services were set up in 1878 and 1879 on both sides of the Atlantic, in the cities of New Haven and London. Starting in 1894, Italian inventor Guglielmo Marconi began developing wireless communication using the newly discovered phenomenon of radio waves, showing by 1901 that they could be transmitted across the Atlantic Ocean; this was the start of wireless telegraphy by radio. Early transmissions of voice and music had little success. World War I accelerated the development of radio for military communications. After the war, commercial radio AM broadcasting began in the 1920s and became an important mass medium for entertainment and news. World War II again accelerated development of radio for the wartime purposes of aircraft and land communication, radio navigation and radar. Development of stereo FM broadcasting of radio