1.
Embedded system
–
An embedded system is a computer system with a dedicated function within a larger mechanical or electrical system, often with real-time computing constraints. It is embedded as part of a complete device, often including hardware and mechanical parts. Embedded systems control many devices in common use today; ninety-eight percent of all microprocessors are manufactured as components of embedded systems. Examples of properties of typical embedded computers, when compared with general-purpose counterparts, are low power consumption, small size, rugged operating ranges, and low per-unit cost. This comes at the price of limited processing resources, which make them more difficult to program; for example, intelligent techniques can be designed to manage the power consumption of embedded systems.

Modern embedded systems are often based on microcontrollers, but ordinary microprocessors are also common. In either case, the processor used may range from general-purpose to those specialised in a certain class of computations. A common standard class of dedicated processors is the digital signal processor. Since the embedded system is dedicated to specific tasks, design engineers can optimize it to reduce the size and cost of the product and increase its reliability. Some embedded systems are mass-produced, benefiting from economies of scale. Complexity varies from low, with a single microcontroller chip, to very high, with multiple units, peripherals and networks mounted inside a large chassis or enclosure.

One of the very first recognizably modern embedded systems was the Apollo Guidance Computer. An early mass-produced embedded system was the Autonetics D-17 guidance computer for the Minuteman missile, released in 1961. When the Minuteman II went into production in 1966, the D-17 was replaced with a new computer that was the first high-volume use of integrated circuits. Since these early applications in the 1960s, embedded systems have come down in price and there has been a rise in processing power.
An early microprocessor, the Intel 4004, for example, was designed for calculators and other small systems but still required external memory. By the early 1980s, memory, input and output system components had been integrated into the same chip as the processor, forming a microcontroller. Microcontrollers find applications where a general-purpose computer would be too costly. A comparatively low-cost microcontroller may be programmed to fulfill the same role as a large number of separate components.
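To make the idea of a dedicated-function program concrete, here is a minimal sketch (in Python, not real firmware) of the kind of control loop an embedded system runs continuously. The thermostat function, its thresholds, and the sensor readings are all hypothetical, chosen only to illustrate a single-purpose control task.

```python
# Illustrative sketch of an embedded-style dedicated control loop.
# All names and values here are hypothetical, not from any real device.

def thermostat_step(temp_c, heater_on, low=19.0, high=21.0):
    """One iteration of a hysteresis control loop: turn the heater on
    below `low`, off above `high`, otherwise keep the current state."""
    if temp_c < low:
        return True
    if temp_c > high:
        return False
    return heater_on

# Simulate a few successive sensor readings, as the main loop would.
state = False
for reading in [18.5, 19.5, 21.5, 20.0]:
    state = thermostat_step(reading, state)
```

On a real microcontroller this logic would read a hardware sensor register and drive an output pin, but the structure, a small loop dedicated to one task, is the same.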

2.
Acoustic music
–
Acoustic music is music that solely or primarily uses instruments that produce sound through acoustic means, as opposed to electric or electronic means. The retronym acoustic music appeared after the advent of electric instruments, such as the electric guitar, electric violin and electric organ. It has its origins in the music of the 1960s. The trend has also been dubbed acoustic rock in some cases.

3.
Bandwidth (signal processing)
–
Bandwidth is the difference between the upper and lower frequencies in a continuous set of frequencies. It is typically measured in hertz, and may refer to passband bandwidth or sometimes to baseband bandwidth. Passband bandwidth is the difference between the upper and lower cutoff frequencies of, for example, a band-pass filter or a communication channel. In the case of a low-pass filter or baseband signal, the bandwidth is equal to its upper cutoff frequency.

A key characteristic of bandwidth is that any band of a given width can carry the same amount of information, regardless of where the band is located in the frequency spectrum. For example, a 3 kHz band can carry a telephone conversation whether that band is at baseband or modulated to some higher frequency. Bandwidth is a key concept in many telecommunications applications. In radio communications, for example, bandwidth is the frequency range occupied by a modulated carrier signal. An FM radio receiver's tuner spans a limited range of frequencies. A government agency may apportion the regionally available bandwidth to broadcast license holders so that their signals do not mutually interfere; each transmitter owns a slice of bandwidth.

For different applications there are different precise definitions, which are different for signals than for systems. One definition of bandwidth, for a system, could be the range of frequencies over which the system produces a specified level of performance. A less strict and more practically useful definition refers to the frequencies beyond which the frequency response is small. Here, small could mean less than 3 dB below the maximum value, or more rarely 10 dB below it, or it could mean below a certain absolute value. As with any definition of the width of a function, many definitions are suitable for different purposes. In some contexts, the signal bandwidth in hertz refers to the frequency range in which the signal's spectral density is nonzero or above a small threshold value.
That definition is used in calculations of the lowest sampling rate that will satisfy the sampling theorem. The threshold value is often defined relative to the maximum value, and is most commonly the 3 dB point, that is, the point where the spectral density is half its maximum value.

The word bandwidth applies to signals as described above, but it can also apply to systems. To say that a system has a certain bandwidth means that the system can process signals of that bandwidth, or that the system reduces the bandwidth of a white noise input to that bandwidth. If the maximum gain is 0 dB, the 3 dB bandwidth is the range of frequencies where the gain is more than −3 dB. This is also the range of frequencies where the gain is above 70.7% of the maximum amplitude gain.
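The decibel and sampling-rate relationships above can be sketched numerically. This is a small illustrative example, not a definitive tool: the component values for the RC filter are hypothetical, chosen so the cutoff lands near 1 kHz.

```python
import math

def db_to_amplitude_ratio(db):
    """Convert a gain in decibels to an amplitude (voltage) ratio."""
    return 10 ** (db / 20)

# -3 dB corresponds to roughly 70.7% of the maximum amplitude,
# i.e. half the maximum power, as stated in the text.
ratio = db_to_amplitude_ratio(-3)          # about 0.708

def rc_cutoff_hz(r_ohms, c_farads):
    """Upper cutoff (-3 dB) frequency of a first-order RC low-pass:
    f_c = 1 / (2 * pi * R * C). For a baseband filter this equals
    its 3 dB bandwidth."""
    return 1 / (2 * math.pi * r_ohms * c_farads)

bw = rc_cutoff_hz(1_000, 159e-9)           # about 1 kHz for R = 1 kOhm, C = 159 nF

def nyquist_rate_hz(bandwidth_hz):
    """Lowest sampling rate satisfying the sampling theorem for a
    baseband signal of the given bandwidth: twice the bandwidth."""
    return 2 * bandwidth_hz
```

For instance, the 3 kHz telephone band mentioned above would need a sampling rate of at least 6 kHz under this rule.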

4.
Hertz
–
The hertz is the unit of frequency in the International System of Units and is defined as one cycle per second. It is named for Heinrich Rudolf Hertz, the first person to provide proof of the existence of electromagnetic waves. Hertz are commonly expressed in SI multiples: kilohertz, megahertz, gigahertz and terahertz, where kilo means thousand, mega million, giga billion and tera trillion. Some of the unit's most common uses are in the description of waves and musical tones, particularly those used in radio and audio applications. It is also used to describe the clock speeds at which computers and other electronics are driven.

The hertz is equivalent to cycles per second, i.e., 1/second or s⁻¹. In English, hertz is also used as the plural form. As an SI unit, Hz can be prefixed; commonly used multiples are kHz, MHz, GHz and THz. One hertz simply means one cycle per second, 100 Hz means one hundred cycles per second, and so on. The unit may be applied to any periodic event; for example, a clock might be said to tick at 1 Hz. The rate at which aperiodic or stochastic events occur is expressed in reciprocal seconds or inverse seconds in general or, in the specific case of radioactive decay, in becquerels. Whereas 1 Hz is 1 cycle per second, 1 Bq is 1 aperiodic radionuclide event per second. The conversion between a frequency f measured in hertz and an angular velocity ω measured in radians per second is ω = 2πf and f = ω/(2π).

This SI unit is named after Heinrich Hertz. As with every International System of Units unit named for a person, the first letter of its symbol is upper case. Note that "degree Celsius" conforms to this rule because the "d" is lowercase (based on The International System of Units). The hertz is named after the German physicist Heinrich Hertz, who made important scientific contributions to the study of electromagnetism. The name was established by the International Electrotechnical Commission in 1930, and the term cycles per second was largely replaced by hertz by the 1970s.
One hobby magazine, Electronics Illustrated, declared their intention to stick with the traditional kc., Mc., etc. units. Sound is a traveling longitudinal wave which is an oscillation of pressure; humans perceive the frequency of sound waves as pitch. Each musical note corresponds to a particular frequency which can be measured in hertz. An infant's ear is able to perceive frequencies ranging from 20 Hz to 20,000 Hz. The range of ultrasound, infrasound and other physical vibrations such as molecular and atomic vibrations extends from a few femtohertz into the terahertz range and beyond. Electromagnetic radiation is described by its frequency, the number of oscillations of the perpendicular electric and magnetic fields per second, expressed in hertz. Radio frequency radiation is measured in kilohertz, megahertz, or gigahertz.
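The frequency and angular-velocity conversions given above can be written as a short sketch. The function names are made up for illustration; only the formulas ω = 2πf and f = ω/(2π) come from the text.

```python
import math

def hz_to_rad_per_s(f):
    """Angular velocity from ordinary frequency: omega = 2 * pi * f."""
    return 2 * math.pi * f

def rad_per_s_to_hz(omega):
    """Ordinary frequency from angular velocity: f = omega / (2 * pi)."""
    return omega / (2 * math.pi)

def period_s(f):
    """A periodic event at f hertz repeats every 1/f seconds."""
    return 1 / f

# Concert pitch A4 is 440 Hz, well inside the audible range above:
omega_a4 = hz_to_rad_per_s(440)    # roughly 2764.6 rad/s
```

The two conversions are inverses of each other, so converting to radians per second and back returns the original frequency.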

5.
Algorithm
–
In mathematics and computer science, an algorithm is a self-contained sequence of actions to be performed. Algorithms can perform calculation, data processing and automated reasoning tasks. An algorithm is an effective method that can be expressed within a finite amount of space and time and in a well-defined formal language for calculating a function. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input. Giving a formal definition of algorithms, corresponding to the intuitive notion, remains a challenging problem.

In English, the word was first used in about 1230 and then by Chaucer in 1391. English adopted the French term, but it wasn't until the late 19th century that algorithm took on the meaning that it has in modern English. Another early use of the word is from 1240, in a manual titled Carmen de Algorismo composed by Alexandre de Villedieu. It begins thus: Haec algorismus ars praesens dicitur, in qua / Talibus Indorum fruimur bis quinque figuris, which translates as: Algorism is the art by which at present we use those Indian figures, which number two times five. The poem is a few hundred lines long and summarizes the art of calculating with the new style of Indian dice, or Talibus Indorum, or Hindu numerals.

An informal definition could be a set of rules that precisely defines a sequence of operations, which would include all computer programs, including programs that do not perform numeric calculations. Generally, a program is only an algorithm if it stops eventually. But humans can do something equally useful in the case of certain enumerably infinite sets: they can give explicit instructions for determining the nth member of the set, for arbitrary finite n. An enumerably infinite set is one whose elements can be put into one-to-one correspondence with the integers. The concept of algorithm is also used to define the notion of decidability.
That notion is central for explaining how formal systems come into being, starting from a set of axioms. In logic, the time that an algorithm requires to complete cannot be measured. From such uncertainties, which characterize ongoing work, stems the unavailability of a definition of algorithm that suits both concrete and abstract usage of the term.

Algorithms are essential to the way computers process data; thus, an algorithm can be considered to be any sequence of operations that can be simulated by a Turing-complete system. Although this may seem extreme, the arguments in its favor are hard to refute. According to Gurevich, Turing's informal argument in favor of his thesis justifies a stronger thesis; according to Savage, an algorithm is a computational process defined by a Turing machine.

Typically, when an algorithm is associated with processing information, data can be read from an input source, written to an output device, and stored for further processing. Stored data are regarded as part of the internal state of the entity performing the algorithm. In practice, the state is stored in one or more data structures. For such a computational process, the algorithm must be rigorously defined: specified in the way it applies in all possible circumstances that could arise. That is, any conditional steps must be dealt with case by case.
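A classic concrete instance of the definition above is Euclid's algorithm for the greatest common divisor: a finite, well-defined sequence of steps that halts on every valid input, with a fully deterministic state transition at each step. The sketch below is a standard textbook formulation, not drawn from this text.

```python
def gcd(a, b):
    """Greatest common divisor of two non-negative integers by
    Euclid's algorithm. The state (a, b) evolves deterministically
    each iteration, and b strictly decreases, so the loop halts."""
    while b != 0:
        a, b = b, a % b
    return a
```

For example, gcd(48, 18) proceeds through the states (48, 18), (18, 12), (12, 6), (6, 0) and returns 6.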