Qualcomm Incorporated is an American multinational semiconductor and telecommunications equipment company that designs and markets wireless telecommunications products and services. It derives most of its revenue from chipmaking and the bulk of its profit from its patent licensing business. The company is headquartered in San Diego, California, United States, and has 224 worldwide locations. The parent company is Qualcomm Incorporated, which includes the Qualcomm Technology Licensing division. Qualcomm's wholly owned subsidiary, Qualcomm Technologies, Inc., operates all of Qualcomm's R&D activities. Qualcomm was created in July 1985 by seven former Linkabit employees led by Irwin Jacobs; the company was named Qualcomm for “QUALity COMMunications.” It started as a contract research and development center for government and defense projects. Qualcomm merged with Omninet in 1988 and raised $3.5 million in funding in order to produce the Omnitracs satellite communications system for trucking companies. Qualcomm grew from eight employees in 1986 to 620 employees in 1991.
By 1989, Qualcomm had $32 million in revenues, 50 percent of which came from an Omnitracs contract with Schneider National. Omnitracs profits helped fund Qualcomm's research and development into code-division multiple access (CDMA) technologies for cell phone networks. Qualcomm was operating at a loss in the 1990s due to its investment in CDMA research. To obtain funding, the company filed an initial public offering in September 1991, raising $68 million. An additional $486 million was raised in 1995 through the sale of 11.5 million more shares. The second funding round was done to raise money for the mass manufacturing of CDMA-based phones, base stations, and equipment, after most US-based cellular networks announced they would adopt the CDMA standard; the company had $383 million in annual revenue in 1995 and $814 million by 1996. In 1991, Qualcomm acquired Eudora, an email client for the PC that could be used with the OmniTRACS system; the acquisition associated a widely used email client with a company that was little-known at the time.
In 1998, Qualcomm was restructured. Its cell-phone manufacturing business was spun off so the company could focus on its higher-margin patents business; the following year, Qualcomm was the fastest-growing stock on the market, with 2,621 percent growth over one year. By 2000, Qualcomm had grown to 6,300 employees, $3.2 billion in revenues, and $670 million in profit. 39 percent of its sales were from CDMA technology, followed by licensing and other products. Around this time, Qualcomm established offices in Europe, Asia Pacific, and Latin America. By 2001, 65 percent of Qualcomm's revenues originated from outside the United States, with 35 percent coming from South Korea. In 2005, Paul E. Jacobs, son of Qualcomm founder Dr. Irwin Jacobs, was appointed as Qualcomm's new CEO. Whereas Irwin Jacobs focused on CDMA patents, Paul Jacobs refocused much of Qualcomm's new research and development on projects related to the internet of things. In December 2013, Qualcomm announced that Steven Mollenkopf would succeed Paul Jacobs as CEO.
Mollenkopf said he would expand Qualcomm's focus to wireless technology for cars, wearable devices, and other new markets. On January 24, 2018, the European Commission fined Qualcomm €997 million for abuse of a dominant market position. On March 16, 2018, Qualcomm removed executive chairman Paul Jacobs after he "broached a long-shot bid" for a buyout earlier that week. In 2018, Qualcomm filed a lawsuit against Intel. "After several meet-and-confers and exchanges of written correspondence, on May 18, Intel appeared willing to cooperate, offering a 'limited supplemental production of technical materials relating to relevant components designed for 2018 iPhone models' in exchange for Qualcomm's agreement that the limited production would satisfy certain requests in the document subpoena," the US federal court filing states. On March 15, 2019, a US court ruled that Apple owed Qualcomm damages for infringing three patents related to mobile technologies; the jury awarded $1.41 per iPhone that used the company's technology without authorization.
For fiscal year 2017, Qualcomm reported earnings of US$2.5 billion, with annual revenue of US$22.3 billion, a decline of 5.4% over the previous fiscal cycle. Qualcomm's shares traded at over $55 per share, and its market capitalization was valued at over US$91.9 billion in September 2018. The company is ranked 133rd on the Fortune 500 list of the largest United States corporations by revenue. Qualcomm pioneered the commercialization of the cdmaOne standard for wireless cellular communications, following up with CDMA2000, an early standard for third-generation mobile networks. Today, the company is the leading patent holder in advanced 3G mobile technologies, including CDMA2000 1xEV-DO and its evolutions; the license streams from the patents on these inventions and related products are a major component of Qualcomm's business. In June 2011, Qualcomm announced that it would release a set of application programming interfaces geared to give Web-based applications deeper links into hardware. Beginning in 1991, Qualcomm participated in the development of the Globalstar satellite system along with Loral Space & Communications.
It uses a low Earth orbit satellite constellation consisting of 44 active satellites. The system is used for voice telephony via hand-held satellite phones, asset tracking, and data transfer using mobile satellite modems; the system was designed as a normal IS-95 system that used the satellite as a "
Metadata is "data that provides information about other data". Many distinct types of metadata exist, among these descriptive metadata, structural metadata, administrative metadata, reference metadata and statistical metadata. Descriptive metadata describes a resource for purposes such as identification, it can include elements such as title, abstract and keywords. Structural metadata is metadata about containers of data and indicates how compound objects are put together, for example, how pages are ordered to form chapters, it describes the types, versions and other characteristics of digital materials. Administrative metadata provides information to help manage a resource, such as when and how it was created, file type and other technical information, who can access it. Reference metadata describes the contents and quality of statistical data Statistical metadata may describe processes that collect, process, or produce statistical data. Metadata was traditionally used in the card catalogs of libraries until the 1980s, when libraries converted their catalog data to digital databases.
In the 2000s, as digital formats were becoming the prevalent way of storing data and information, metadata was used to describe digital data using metadata standards. The first description of "meta data" for computer systems is purportedly noted by MIT's Center for International Studies experts David Griffel and Stuart McIntosh in 1967: "In summary we have statements in an object language about subject descriptions of data and token codes for the data. We have statements in a meta language describing the data relationships and transformations, ought/is relations between norm and data." There are different metadata standards for each different discipline. Describing the contents and context of data or data files increases their usefulness. For example, a web page may include metadata specifying what software language the page is written in, what tools were used to create it, what subjects the page is about, and where to find more information about the subject; this metadata can automatically improve the reader's experience and make it easier for users to find the web page online.
A CD may include metadata providing information about the musicians and songwriters whose work appears on the disc. A principal purpose of metadata is to help users discover resources. Metadata helps to organize electronic resources, provide digital identification, and support the archiving and preservation of resources. Metadata assists users in resource discovery by "allowing resources to be found by relevant criteria, identifying resources, bringing similar resources together, distinguishing dissimilar resources, giving location information." Metadata of telecommunication activities, including Internet traffic, is widely collected by various national governmental organizations. This data can be used for mass surveillance. In many countries, the metadata relating to emails, telephone calls, web pages, video traffic, IP connections and cell phone locations is stored by government organizations. Metadata means "data about data". Although the "meta" prefix means "after" or "beyond", it is used to mean "about" in epistemology.
Metadata is defined as data providing information about one or more aspects of the data. Some examples include: the means of creation of the data, the purpose of the data, the time and date of creation, the creator or author of the data, the location on a computer network where the data was created, the standards used, the file size, the data quality, the source of the data, and the process used to create the data. For example, a digital image may include metadata that describes how large the picture is, the color depth, the image resolution, when the image was created, the shutter speed, and other data. A text document's metadata may contain information about how long the document is, who the author is, when the document was written, and a short summary of the document. Metadata within web pages can contain descriptions of page content, as well as key words linked to the content; these links are called "metatags", and they were used as the primary factor in determining order for a web search until the late 1990s. Reliance on metatags in web searches decreased in the late 1990s because of "keyword stuffing".
Metatags were being misused to trick search engines into thinking some websites had more relevance in the search than they actually did. Metadata can be stored and managed in a database, called a metadata registry or metadata repository. However, without context and a point of reference, it might be impossible to identify metadata just by looking at it. For example: by itself, a database containing several numbers, all 13 digits long, could be the results of calculations or a list of numbers to plug into an equation. Without any other context, the numbers themselves can be perceived as the data. But given the context that this database is a log of a book collection, those 13-digit numbers may now be identified as ISBNs - information that refers to the book, but is not itself the information within the book. The term "metadata" was coined in 1968 by Philip Bagley in his book "Extension of Programming Language Concepts", where it is clear that he uses the term in the ISO 11179 "traditional" sense of "structural metadata", i.e. "data about the containers of data".
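As a rough illustration of the examples above, the Python sketch below shows descriptive metadata for a digital image held as a simple key-value record, plus the ISBN-13 checksum rule that lets a 13-digit number be recognized as a book identifier once the book-collection context is known. The field names (width_px, shutter_speed_s, and so on) are illustrative placeholders, not part of any formal metadata standard.

# A minimal sketch of descriptive metadata for a digital image,
# using illustrative field names (not tied to any specific standard).
image_metadata = {
    "width_px": 4032,                  # how large the picture is
    "height_px": 3024,
    "color_depth_bits": 24,            # color depth
    "resolution_dpi": 300,             # image resolution
    "created": "2018-06-05T14:32:00",  # when the image was created
    "shutter_speed_s": 1 / 250,        # camera setting recorded at capture
}

def is_valid_isbn13(digits: str) -> bool:
    """Check a 13-digit string against the ISBN-13 checksum rule:
    alternating weights of 1 and 3, total must be divisible by 10."""
    if len(digits) != 13 or not digits.isdigit():
        return False
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits))
    return total % 10 == 0

# Without context, "9780306406157" is just a 13-digit number; knowing the
# database is a log of a book collection lets us interpret it as an ISBN.
print(is_valid_isbn13("9780306406157"))  # True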
GarageBand is a line of digital audio workstations for macOS and iOS devices that allows users to create music or podcasts. GarageBand is developed and sold by Apple for macOS and is part of the iLife software suite. Its music and podcast creation system enables users to create multiple tracks with pre-made MIDI keyboards, pre-made loops, an array of various instrumental effects, and voice recordings. GarageBand was developed by Apple under the direction of Dr. Gerhard Lengeling, who came from the German company Emagic, makers of Logic Audio. Steve Jobs announced the application in his keynote speech at the Macworld Conference & Expo in San Francisco on January 6, 2004. Musician John Mayer assisted with its demonstration. Apple announced GarageBand 2 at the 2005 Macworld Conference & Expo on January 11, 2005; it shipped, as announced, around January 22, 2005. Notable new features included the ability to edit music in musical notation, to record up to eight tracks at once, and to fix the timing and pitch of recordings.
Apple also added automation of track pan position, master volume, and the master pitch, along with transposition of both audio and MIDI and the ability to import MIDI files. GarageBand 3, announced at 2006's Macworld Conference & Expo, includes a 'podcast studio' with the ability to use more than 200 effects and jingles, and integration with iChat for remote interviews. GarageBand 4, known as GarageBand '08, is part of iLife '08. It incorporates the ability to record sections of a song separately, such as bridges and chorus lines. Additionally, it provides support for the automation of tempos and instruments, the creation and export of iPhone ringtones, and a "Magic GarageBand" feature which includes a virtual jam session with a complete 3D view of the electric instruments. GarageBand 5 is part of the iLife '09 package; it includes music instruction and allows the user to buy instructional videos by contemporary artists. It contains new features for electric guitar players, including a dedicated 3D Electric Guitar Track containing a virtual stompbox pedalboard, and virtual amplifiers with spring reverb and tremolo.
GarageBand 5 includes a cleaner, redesigned user interface as well as Project Templates. GarageBand 6, known as GarageBand '11, is part of the iLife '11 package, which Apple released on October 20, 2010. This version brings new features such as a tool to adjust the rhythm of a recording. It includes the ability to match the tempo of one track with another, additional guitar amps and stompboxes, 22 new lessons for guitar and piano, and "How Did I Play?", a tool to measure the accuracy and progress of a piano or guitar performance in a lesson. Apple released GarageBand 10 along with OS X 10.9 Mavericks in October 2013. This version lost the podcast functionality. Apple updated GarageBand 10 for Mac on March 20, 2014. Version 10.0.2 adds the ability to export tracks in MP3 format as well as a new drummer module, but removed support for podcasting. GarageBand was updated to version 10.0.3 on October 16, 2014. This version included myriad bug fixes and several new features, including a dedicated Bass Amp Designer, the introduction of global track effects, and dynamic track resizing.
Apple released GarageBand 10.2 on June 5, 2017. The latest version is GarageBand 10.3.2, released on December 10, 2018. GarageBand is a digital audio workstation and music sequencer that can record and play back multiple tracks of audio. Built-in audio filters that use the AU (Audio Units) standard allow the user to enhance the audio track with various effects, including reverb and distortion, amongst others. GarageBand offers the ability to record at both 16-bit and 24-bit audio resolution, but at a fixed sample rate of 44.1 kHz. An included tuning system helps with pitch correction and can imitate the Auto-Tune effect when tuned to the maximum level. GarageBand has a large array of preset effects to choose from, with an option for users to create their own effects. GarageBand includes a large selection of realistic, sampled instruments and software-modeled synthesizers; these can be used to create original compositions or play music live through the use of a USB MIDI keyboard connected to the computer. An on-screen virtual keyboard is available, as is the use of a standard QWERTY keyboard with the "musical typing" feature.
The synthesizers were broken into two groups: analog and digital. Each synthesizer has a wide variety of adjustable parameters, including richness, cutoff, and standard attack, decay and release. In addition to the standard tracks, GarageBand allows for guitar-specific tracks that can use a variety of simulated amplifiers and effects processors; these imitate popular hardware from companies including Marshall Amplification, Orange Music Electronic Company, and Fender Musical Instruments Corporation. Up to five simulated effects can be layered on top of the virtual amplifiers, which feature adjustable parameters including tone and volume. Guitars can be connected to Macs using a USB interface. GarageBand can import MIDI files and offers piano-roll or notation-style editing and playback. By complying with the MIDI Standard, a user can edit many different aspects of a recorded note, including pitch and duration. Pitch is settable to 1/128 of a semitone, on a scale of 0–127. Velocity, which determines amplitude, can
In computing, floating-point arithmetic is arithmetic using a formulaic representation of real numbers as an approximation so as to support a trade-off between range and precision. For this reason, floating-point computation is often found in systems which include very small and very large real numbers and which require fast processing times. A number is, in general, represented to a fixed number of significant digits and scaled using an exponent in some fixed base. A number that can be represented is of the following form: significand × base^exponent, where the significand is an integer, the base is an integer greater than or equal to two, and the exponent is an integer. For example, 1.2345 = 12345 × 10^−4, with significand 12345, base 10, and exponent −4. The term floating point refers to the fact that a number's radix point can "float"; this position is indicated by the exponent component, and thus the floating-point representation can be thought of as a kind of scientific notation. A floating-point system can be used to represent, with a fixed number of digits, numbers of different orders of magnitude: e.g. the distance between galaxies or the diameter of an atomic nucleus can be expressed with the same unit of length.
The result of this dynamic range is that the numbers that can be represented are not uniformly spaced. Over the years, a variety of floating-point representations have been used in computers. In 1985, the IEEE 754 Standard for Floating-Point Arithmetic was established, and since the 1990s the most commonly encountered representations are those defined by the IEEE. The speed of floating-point operations, commonly measured in terms of FLOPS, is an important characteristic of a computer system for applications that involve intensive mathematical calculations. A floating-point unit (FPU) is a part of a computer system specially designed to carry out operations on floating-point numbers. A number representation specifies some way of encoding a number as a string of digits. There are several mechanisms by which strings of digits can represent numbers. In common mathematical notation, the digit string can be of any length, and the location of the radix point is indicated by placing an explicit "point" character there. If the radix point is not specified, the string implicitly represents an integer and the unstated radix point would be off the right-hand end of the string, next to the least significant digit.
In fixed-point systems, a position in the string is specified for the radix point. So a fixed-point scheme might be to use a string of 8 decimal digits with the decimal point in the middle, whereby "00012345" would represent 0001.2345. In scientific notation, the given number is scaled by a power of 10, so that it lies within a certain range, typically between 1 and 10, with the radix point appearing after the first digit; the scaling factor, as a power of ten, is indicated separately at the end of the number. For example, the orbital period of Jupiter's moon Io is 152,853.5047 seconds, a value that would be represented in standard-form scientific notation as 1.528535047×10^5 seconds. Floating-point representation is similar in concept to scientific notation. Logically, a floating-point number consists of a signed digit string of a given length in a given base; this digit string is referred to as the significand, mantissa, or coefficient. The length of the significand determines the precision; the radix point position is assumed always to be somewhere within the significand, often just after or just before the most significant digit, or to the right of the rightmost digit.
This article follows the convention that the radix point is set just after the most significant digit. The second component is a signed integer exponent. To derive the value of the floating-point number, the significand is multiplied by the base raised to the power of the exponent, equivalent to shifting the radix point from its implied position by a number of places equal to the value of the exponent: to the right if the exponent is positive or to the left if the exponent is negative. Using base-10 as an example, the number 152,853.5047, which has ten decimal digits of precision, is represented as the significand 1,528,535,047 together with 5 as the exponent. To determine the actual value, a decimal point is placed after the first digit of the significand and the result is multiplied by 10^5 to give 1.528535047×10^5, or 152,853.5047. In storing such a number, the base need not be stored, since it will be the same for the entire range of supported numbers and can thus be inferred. Symbolically, this final value is s / b^(p−1) × b^e, where s is the significand (ignoring any implied decimal point), p is the precision (the number of digits in the significand), b is the base, and e is the exponent.
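The decimal example above can be checked with a short Python sketch. The helper below is hypothetical (the name float_value is not from any standard library) and simply evaluates s / b^(p−1) × b^e; the math.frexp call then shows the analogous significand/exponent split that Python performs for binary floats.

import math

def float_value(s: int, b: int, p: int, e: int) -> float:
    """Value of a floating-point number: s / b**(p - 1) * b**e,
    with the radix point placed just after the most significant digit."""
    return s / b ** (p - 1) * b ** e

# Significand 1,528,535,047 (p = 10 digits), base 10, exponent 5
# reconstructs the orbital period of Io used in the text.
print(float_value(1_528_535_047, 10, 10, 5))  # 152853.5047

# For binary floats, math.frexp splits a value into a fraction m in
# [0.5, 1) and an integer exponent such that m * 2**exp == x.
m, exp = math.frexp(152853.5047)
print(m, exp, m * 2 ** exp)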
macOS is a series of graphical operating systems developed and marketed by Apple Inc. since 2001. It is the primary operating system for Apple's Mac family of computers. Within the market of desktop and home computers, by web usage, it is the second most used desktop OS, after Microsoft Windows. macOS is the second major series of Macintosh operating systems. The first is colloquially called the "classic" Mac OS, introduced in 1984, the final release of which was Mac OS 9 in 1999. The first desktop version, Mac OS X 10.0, was released in March 2001, with its first update, 10.1, arriving later that year. After this, Apple began naming its releases after big cats, which lasted until OS X 10.8 Mountain Lion. Since OS X 10.9 Mavericks, releases have been named after locations in California. Apple shortened the name to "OS X" in 2012 and changed it to "macOS" in 2016, adopting the nomenclature it was using for its other operating systems: iOS, watchOS, and tvOS. The latest version is macOS Mojave, publicly released in September 2018.
Between 1999 and 2009, Apple sold a separate series of operating systems called Mac OS X Server. The initial version, Mac OS X Server 1.0, was released in 1999 with a user interface similar to Mac OS 8.5. After this, new versions were introduced concurrently with the desktop version of Mac OS X. Beginning with Mac OS X 10.7 Lion, the server functions were made available as a separate package on the Mac App Store. macOS is based on technologies developed between 1985 and 1997 at NeXT, a company that Apple co-founder Steve Jobs created after leaving Apple. The "X" in Mac OS X and OS X is pronounced "ten", as it is the Roman numeral for 10; the X was a prominent part of the operating system's brand identity and marketing in its early years, but receded in prominence since the release of Snow Leopard in 2009. UNIX 03 certification was achieved for the Intel version of Mac OS X 10.5 Leopard, and all releases from Mac OS X 10.6 Snow Leopard up to the current version have UNIX 03 certification. macOS shares its Unix-based core, named Darwin, and many of its frameworks with iOS, tvOS and watchOS.
A modified version of Mac OS X 10.4 Tiger was used for the first-generation Apple TV. Releases of Mac OS X from 1999 to 2005 ran on the PowerPC-based Macs of that period. After Apple announced that it was switching to Intel CPUs from 2006 onwards, versions were released for 32-bit and 64-bit Intel-based Macs. Versions from Mac OS X 10.7 Lion onwards run only on 64-bit Intel CPUs, in contrast to the ARM architecture used on iOS and watchOS devices, and do not support PowerPC applications. The heritage of what would become macOS originated at NeXT, a company founded by Steve Jobs following his departure from Apple in 1985. There, the Unix-like NeXTSTEP operating system was developed and launched in 1989. The kernel of NeXTSTEP is based upon the Mach kernel, developed at Carnegie Mellon University, with additional kernel layers and low-level user-space code derived from parts of BSD. Its graphical user interface was built on top of an object-oriented GUI toolkit using the Objective-C programming language. Throughout the early 1990s, Apple had tried to create a "next-generation" OS to succeed its classic Mac OS through the Taligent and Gershwin projects, but all of them were abandoned.
This led Apple to purchase NeXT in 1996, allowing NeXTSTEP, then called OPENSTEP, to serve as the basis for Apple's next-generation operating system. This purchase also led to Steve Jobs returning to Apple as interim, and then permanent, CEO, shepherding the transformation of the programmer-friendly OPENSTEP into a system that would be adopted by Apple's primary market of home users and creative professionals. The project was first code-named "Rhapsody" and later officially named Mac OS X. Mac OS X was presented as the tenth major version of Apple's operating system for Macintosh computers. Previous Macintosh operating systems were named using Arabic numerals, as with Mac OS 8 and Mac OS 9. The letter "X" in Mac OS X's name refers to the number 10, written as a Roman numeral. It is therefore pronounced "ten" in this context. However, it is commonly pronounced like the letter "X". The first version of Mac OS X, Mac OS X Server 1.0, was a transitional product, featuring an interface resembling the classic Mac OS, though it was not compatible with software designed for the older system.
Consumer releases of Mac OS X included more backward compatibility. Mac OS applications could be rewritten to run natively via the Carbon API. The consumer version of Mac OS X was launched in 2001 with Mac OS X 10.0. Reviews were variable, with extensive praise for its sophisticated, glossy Aqua interface but criticism of its sluggish performance. With Apple's popularity at a low, the makers of several classic Mac applications such as FrameMaker and PageMaker declined to develop new versions of their software for Mac OS X. Ars Technica columnist John Siracusa, who reviewed every major OS X release up to 10.10, described the early releases in retrospect as 'dog-slow, feature poor' and Aqua as 'unbearably slow and a huge resource hog'. Apple developed several new releases of Mac OS X. Siracusa's review of version 10.3 noted, "It's strange to have gone from years of uncertainty and vaporware to a steady annual supply of major new operating system releases." Version 10.4, Tiger, shocked executives at Microsoft by offering a number of features, such as fast file s
A personal computer is a multi-purpose computer whose size and price make it feasible for individual use. Personal computers are intended to be operated directly by an end user, rather than by a computer expert or technician. Unlike large, costly minicomputers and mainframes, personal computers are not shared among many people through time-sharing. Institutional or corporate computer owners in the 1960s had to write their own programs to do any useful work with the machines. While personal computer users may develop their own applications, these systems usually run commercial software, free-of-charge software, or free and open-source software, provided in ready-to-run form. Software for personal computers is developed and distributed independently from the hardware or operating system manufacturers. Many personal computer users no longer need to write their own programs to make any use of a personal computer, although end-user programming is still feasible. This contrasts with mobile systems, where software is only available through a manufacturer-supported channel, and end-user program development may be discouraged by lack of support by the manufacturer.
Since the early 1990s, Microsoft operating systems and Intel hardware have dominated much of the personal computer market, first with MS-DOS and later with Microsoft Windows. Alternatives to Microsoft's Windows operating systems occupy a minority share of the industry; these include free and open-source Unix-like operating systems such as Linux. Advanced Micro Devices provides the main alternative to Intel's processors. The advent of personal computers and the concurrent Digital Revolution have affected the lives of people in all countries. "PC" is an initialism for "personal computer". The IBM Personal Computer incorporated the designation in its model name, and it is sometimes useful to distinguish personal computers of the "IBM Personal Computer" family from personal computers made by other manufacturers. For example, "PC" is used in contrast with "Mac", an Apple Macintosh computer. Since none of these Apple products were mainframes or time-sharing systems, they were all "personal computers" but not "PC" computers.
The "brain" may one day come down to our level and help with our income-tax and book-keeping calculations. But this is speculation and there is no sign of it so far. In the history of computing, early experimental machines could be operated by a single attendant. For example, ENIAC which became operational in 1946 could be run by a single, albeit trained, person; this mode pre-dated the batch programming, or time-sharing modes with multiple users connected through terminals to mainframe computers. Computers intended for laboratory, instrumentation, or engineering purposes were built, could be operated by one person in an interactive fashion. Examples include such systems as the Bendix G15 and LGP-30of 1956, the Programma 101 introduced in 1964, the Soviet MIR series of computers developed from 1965 to 1969. By the early 1970s, people in academic or research institutions had the opportunity for single-person use of a computer system in interactive mode for extended durations, although these systems would still have been too expensive to be owned by a single person.
In what was to be called the Mother of All Demos, SRI researcher Douglas Engelbart in 1968 gave a preview of features that would become the staples of daily working life in the 21st century: e-mail, word processing, video conferencing, and the mouse. The demonstration required technical support staff and a mainframe time-sharing computer that were far too costly for individual business use at the time. The development of the microprocessor, with widespread commercial availability starting in the mid-1970s, made computers cheap enough for small businesses and individuals to own. Early personal computers, generally called microcomputers, were often sold in kit form and in limited volumes, and were of interest mostly to hobbyists and technicians. Minimal programming was done with toggle switches to enter instructions, and output was provided by front panel lamps. Practical use required adding peripherals such as keyboards, computer displays, disk drives, and printers. Micral N was the earliest commercial, non-kit microcomputer based on a microprocessor, the Intel 8008.
It was built starting in 1972, and a few hundred units were sold. This had been preceded by the Datapoint 2200 in 1970, for which the Intel 8008 had been commissioned, though not accepted for use; the CPU design implemented in the Datapoint 2200 became the basis for the x86 architecture used in the original IBM PC and its descendants. In 1973, the IBM Los Gatos Scientific Center developed a portable computer prototype called SCAMP based on the IBM PALM processor, with a Philips compact cassette drive, small CRT, and full-function keyboard. SCAMP emulated an IBM 1130 minicomputer in order to run APL/1130. In 1973, APL was available only on mainframe computers, and most desktop-sized microcomputers such as the Wang 2200 or HP 9800 offered only BASIC. Because SCAMP was the first to emulate APL/1130 performance on a portable, single-user computer, PC Magazine in 1983 designated SCAMP a "revolutionary concept" and "the world's first personal computer". This seminal, single-user portable computer now resides in the Smithsonian Institution in Washington, D.C. Successful demonstrations of the 1973 SCAMP prototype led to the IBM 5100 portable microcomputer, launched in 1975 with the ability to be programmed in both APL and BASIC for engineers, analysts and other business problem-solvers. In the late 1960s such a machine would have been nearly as large as two desks and would have weigh
Digital audio is sound that has been recorded in, or converted into, digital form. In digital audio, the sound wave of the audio signal is encoded as numerical samples in a continuous sequence. For example, in CD audio, samples are taken 44,100 times per second, each with 16-bit sample depth. Digital audio is also the name for the entire technology of sound recording and reproduction using audio signals that have been encoded in digital form. Following significant advances in digital audio technology during the 1970s, it replaced analog audio technology in many areas of audio engineering and telecommunications in the 1990s and 2000s. In a digital audio system, an analog electrical signal representing the sound is converted with an analog-to-digital converter (ADC) into a digital signal using pulse-code modulation; this digital signal can then be recorded, edited and copied using computers, audio playback machines, and other digital tools. When the sound engineer wishes to listen to the recording on headphones or loudspeakers, a digital-to-analog converter (DAC) performs the reverse process, converting the digital signal back into an analog signal, which is then sent through an audio power amplifier to a loudspeaker.
Digital audio systems may include compression, storage and transmission components. Conversion to a digital format allows convenient manipulation, storage and retrieval of an audio signal. Unlike analog audio, in which making copies of a recording results in generation loss and degradation of signal quality, digital audio allows an infinite number of copies to be made without any degradation of signal quality. Digital audio technologies are used in the recording, mass production and distribution of sound, including recordings of songs, instrumental pieces, sound effects, and other sounds. Modern online music distribution depends on digital recording and data compression; the availability of music as data files, rather than as physical objects, has reduced the costs of distribution. Before digital audio, the music industry distributed and sold music by selling physical copies in the form of records and cassette tapes. With digital audio and online distribution systems such as iTunes, companies sell digital sound files to consumers, which the consumer receives over the Internet.
An analog audio system converts physical waveforms of sound into electrical representations of those waveforms by use of a transducer, such as a microphone. The sounds are then stored on an analog medium such as magnetic tape, or transmitted through an analog medium such as a telephone line or radio. The process is reversed for reproduction: the electrical audio signal is amplified and then converted back into physical waveforms via a loudspeaker. Analog audio retains its fundamental wave-like characteristics throughout its storage, transformation and amplification. Analog audio signals are susceptible to noise and distortion, due to the innate characteristics of electronic circuits and associated devices. Disturbances in a digital system do not result in error unless the disturbance is so large as to result in a symbol being misinterpreted as another symbol or disturb the sequence of symbols. It is therefore possible to have an error-free digital audio system in which no noise or distortion is introduced between conversion to digital format and conversion back to analog.
A digital audio signal may optionally be encoded for correction of any errors that might occur in the storage or transmission of the signal. This technique, known as channel coding, is essential for broadcast or recorded digital systems to maintain bit accuracy. Eight-to-fourteen modulation is a channel code used in the audio compact disc. A digital audio system starts with an ADC, which converts the analog signal at a known sampling rate and bit resolution. CD audio, for example, has a sampling rate of 44.1 kHz and 16-bit resolution for each stereo channel. Analog signals that have not already been bandlimited must be passed through an anti-aliasing filter before conversion, to prevent the aliasing distortion caused by audio signals with frequencies higher than the Nyquist frequency (half the sampling rate). A digital audio signal may then be stored or transmitted. Digital audio can be stored on a CD, a digital audio player, a hard drive, a USB flash drive, or any other digital data storage device; the digital signal may be altered through digital signal processing, where it may be filtered or have effects applied.
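As a minimal sketch of the CD-audio parameters just described, the Python snippet below synthesizes one second of a 440 Hz tone, samples it 44,100 times per second, quantizes each sample to 16 bits, and writes the result with the standard-library wave module. The tone frequency, duration and output file name are arbitrary choices made for illustration.

# Generate one second of a 440 Hz sine tone at CD-quality settings
# (44.1 kHz sampling rate, 16-bit signed samples, mono for simplicity).
import math
import struct
import wave

SAMPLE_RATE = 44_100   # samples per second (CD audio)
BITS = 16              # sample depth; 16-bit signed range is -32768..32767
DURATION_S = 1.0       # illustrative duration
FREQ_HZ = 440.0        # illustrative tone frequency

frames = bytearray()
for n in range(int(SAMPLE_RATE * DURATION_S)):
    value = math.sin(2 * math.pi * FREQ_HZ * n / SAMPLE_RATE)
    frames += struct.pack("<h", int(value * 32767))  # quantize to 16 bits

with wave.open("tone.wav", "wb") as wav:
    wav.setnchannels(1)            # one channel (mono)
    wav.setsampwidth(BITS // 8)    # bytes per sample
    wav.setframerate(SAMPLE_RATE)  # sampling rate
    wav.writeframes(bytes(frames))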
Sample-rate conversion, including upsampling and downsampling, may be used to conform signals that have been encoded with a different sampling rate to a common sampling rate prior to processing. Audio data compression techniques, such as MP3, Advanced Audio Coding, Ogg Vorbis, or FLAC, are employed to reduce the file size. Digital audio can be carried over digital audio interfaces such as AES3 or MADI. Digital audio can also be carried over a network using audio over Ethernet, audio over IP or other streaming media standards and systems. For playback, digital audio must be converted back to an analog signal with a DAC, which may use oversampling. Pulse-code modulation was invented by British scientist Alec Reeves in 1937 and was used in telecommunications applications long before its first use in commercial broadcast and recording. Commercial digital recording was pioneered in Japan by NHK and Nippon Columbia, and their Denon brand, in the 1960s; the first commercial digital recordings were released in 1971.
The BBC began to experiment with digital audio in the 1960s. By the early 1970s, it had developed a 2-channel recorder