The Macintosh Quadra is a family of personal computers designed and sold by Apple Computer, Inc. from October 1991 to October 1995. The Quadra, named for its Motorola 68040 central processing unit, replaced the Macintosh II family as the high-end Macintosh line; the first models were the Quadra 700 and Quadra 900, both introduced in October 1991. The Quadra 800, 840AV and 605 were added through 1993, and the Macintosh Centris line was merged into the Quadra in October 1993, adding the 610, 650 and 660AV to the range. After the introduction of the Power Macintosh line in early 1994, Apple continued to produce and sell new Quadra models. The product manager for the Quadra family was Frank Casanova, who had also been product manager for the Macintosh IIfx. The first computers bearing the Macintosh Quadra name were the Quadra 700 and Quadra 900, both introduced in 1991 with a central processing unit speed of 25 MHz; the 700 was a compact model using the same case dimensions as the Macintosh IIci, with a single Processor Direct Slot (PDS), while the 900 used a newly designed tower case with five NuBus expansion slots and one PDS slot.
The 900 was replaced by the Quadra 950 in 1992, with the CPU speed increased to 33 MHz. The line was joined by a number of "800-series" machines in a new minitower case design, starting with the Quadra 800, and by the "600-series" pizza-box desktop cases, starting with the Quadra 610. In 1993, the Quadra 840AV and 660AV were introduced at 40 and 25 MHz respectively; they included an AT&T digital signal processor (DSP) and S-Video and composite video input/output ports, as well as CD-quality microphone and audio output ports. The AV models introduced PlainTalk, consisting of the text-to-speech software MacinTalk Pro and speech recognition; however, all of these features were poorly supported in software, and the DSP was not installed in later AV Macs, which were based on the more powerful PowerPC 601 CPU, itself powerful enough to handle the coprocessor's duties on its own. Apple hired the marketing firm Lexicon Branding to come up with the name; Lexicon chose the name Quadra hoping to appeal to engineers by evoking technical terms like quadrant and quadriceps. The Quadra name was used for the successors to the Centris models that existed during 1993: the 610, the 650 and the 660AV.
Centris was a "mid-range" line of systems between the Quadra on the high end and the LC on the low end, but it was decided that there were too many product lines and the name was dropped. Some machines of this era, including the Quadra 605, were also sold as Performas. The last use of the name was for the Quadra 630, a variation of the LC 630 using a "full" Motorola 68040 instead of the LC's 68LC040, introduced alongside it in 1994. The 630 was the first Mac to use an IDE-based bus for the internal hard disk drive, whereas all earlier models had used SCSI. The first three Apple Workgroup Server models, the WGS 60, the WGS 80 and the WGS 95, were based on the Centris 610, the Quadra 800 and the Quadra 950, respectively. The transition to the Motorola 68040 was not as smooth as the previous transitions to the Motorola 68020 and 68030: due to the 68040's split instruction and data caches, the Quadra had compatibility problems with self-modifying code. Apple fixed this by having the basic Mac OS memory copy call flush the caches.
This solved the vast majority of stability problems, but negated much of the 68040's performance improvement, so Apple introduced a variant of the memory copy call that did not flush the caches; the new trap was defined in such a way that calling it on an older version of Mac OS would invoke the previous memory copy routine. The net effect was that many complex applications were slow or prone to crashing on the 68040, although developers gradually adapted to the new architecture by relying on Apple's memory copy routines rather than their own, using the non-flushing copy where appropriate.
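The mechanism is easy to see in a sketch. The C below is illustrative only, not Apple's implementation (the traps in question are generally identified as BlockMove and BlockMoveData); flush_cpu_caches is a hypothetical stand-in for the 68040's privileged cache-synchronization instructions.

    #include <string.h>

    /* Hypothetical hook: on a real 68040 this would be a supervisor-mode
       CPUSHA/CINVA sequence that synchronizes the split caches. */
    extern void flush_cpu_caches(void);

    /* BlockMove-style copy: safe even when the destination holds code,
       because the instruction cache is flushed after the data copy. */
    void block_move(const void *src, void *dst, unsigned long n) {
        memmove(dst, src, n);
        flush_cpu_caches();   /* slow, but copied code is now fetchable */
    }

    /* BlockMoveData-style copy: skips the flush; the caller promises the
       destination holds data, never code to be executed. */
    void block_move_data(const void *src, void *dst, unsigned long n) {
        memmove(dst, src, n);
    }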
An operating system (OS) is system software that manages computer hardware and software resources and provides common services for computer programs. Time-sharing operating systems schedule tasks for efficient use of the system and may include accounting software for cost allocation of processor time, mass storage and other resources. For hardware functions such as input and output and memory allocation, the operating system acts as an intermediary between programs and the computer hardware, although the application code is executed directly by the hardware and makes system calls to an OS function or is interrupted by it (see the short sketch after this paragraph). Operating systems are found on many devices that contain a computer – from cellular phones and video game consoles to web servers and supercomputers. The dominant desktop operating system is Microsoft Windows with a market share of around 82.74%; macOS by Apple Inc. is in second place, and the varieties of Linux are collectively in third place. In the mobile sector, Google's Android accounted for up to 70% of use in 2017; according to third-quarter 2016 data, Android on smartphones was dominant with 87.5 percent and a growth rate of 10.3 percent per year, followed by Apple's iOS with 12.1 percent and an annual decrease in market share of 5.2 percent, while other operating systems amounted to just 0.3 percent.
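As a concrete illustration of that division of labor, the minimal program below runs its own instructions directly on the CPU and crosses into the kernel only at the system call; POSIX's write is used here purely as a familiar example.

    #include <unistd.h>

    /* The program's instructions execute directly on the hardware; only
       the write() call traps into the operating system, which performs
       the privileged I/O on the program's behalf. */
    int main(void) {
        const char msg[] = "hello via a system call\n";
        write(STDOUT_FILENO, msg, sizeof msg - 1);  /* control passes to the OS */
        return 0;
    }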
Linux distributions are dominant in the supercomputing sector. Other specialized classes of operating systems, such as embedded and real-time systems, exist for many applications. A single-tasking system can only run one program at a time, while a multi-tasking operating system allows more than one program to run concurrently; this is achieved by time-sharing, where the available processor time is divided between multiple processes. These processes are each interrupted in time slices by a task-scheduling subsystem of the operating system. Multi-tasking may be characterized in preemptive and co-operative types. In preemptive multitasking, the operating system slices the CPU time and dedicates a slot to each of the programs. Unix-like operating systems, such as Solaris and Linux—as well as non-Unix-like ones, such as AmigaOS—support preemptive multitasking. Cooperative multitasking is achieved by relying on each process to yield time to the other processes in a defined manner. 16-bit versions of Microsoft Windows used cooperative multi-tasking.
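A minimal sketch of the cooperative model (an assumed example, not any particular system's scheduler): each task runs until it voluntarily returns control, so a task that never yields starves all the others; that is exactly the failure mode preemptive time slicing eliminates.

    #include <stdio.h>

    typedef void (*task_fn)(void);

    /* Each "task" does a short burst of work and then returns, which is
       its way of yielding the CPU back to the scheduler loop. */
    static void task_a(void) { puts("A: works briefly, then yields"); }
    static void task_b(void) { puts("B: works briefly, then yields"); }

    int main(void) {
        task_fn tasks[] = { task_a, task_b };
        for (int round = 0; round < 3; round++)   /* a few rounds for demo */
            for (int i = 0; i < 2; i++)
                tasks[i]();   /* nothing can preempt a task that loops forever */
        return 0;
    }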
32-bit versions of both Windows NT and Win9x used preemptive multi-tasking. Single-user operating systems have no facilities to distinguish users, but may allow multiple programs to run in tandem. A multi-user operating system extends the basic concept of multi-tasking with facilities that identify processes and resources, such as disk space, belonging to multiple users, and the system permits multiple users to interact with the system at the same time. Time-sharing operating systems schedule tasks for efficient use of the system and may include accounting software for cost allocation of processor time, mass storage and other resources to multiple users. A distributed operating system manages a group of distinct computers and makes them appear to be a single computer; the development of networked computers that could be linked and made to communicate with each other gave rise to distributed computing. Distributed computations are carried out on more than one machine; when computers in a group work in cooperation, they form a distributed system.
In an OS context, and in distributed and cloud computing, templating refers to creating a single virtual machine image as a guest operating system, then saving it as a template from which multiple running virtual machines can be created. The technique is used both in virtualization and in cloud computing management, and is common in large server warehouses. Embedded operating systems are designed to be used in embedded computer systems; they are designed to operate on small machines with less autonomy, such as PDAs. They are able to operate with a limited number of resources and are compact and efficient by design. Windows CE and Minix 3 are examples of embedded operating systems. A real-time operating system is an operating system that guarantees to process events or data by a specific moment in time. A real-time operating system may be single- or multi-tasking, but when multitasking, it uses specialized scheduling algorithms so that a deterministic nature of behavior is achieved. An event-driven system switches between tasks based on their priorities or external events, while time-sharing operating systems switch tasks based on clock interrupts.
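The difference between those two switching policies comes down to the dispatch decision itself; a hedged sketch (hypothetical task table, not any real RTOS API):

    #include <stddef.h>

    #define NTASKS 4

    typedef struct { int priority; int ready; } task_t;

    /* Event-driven policy: on each event, dispatch the highest-priority
       ready task (index 0 can double as the idle task if nothing is ready). */
    size_t pick_event_driven(const task_t t[NTASKS]) {
        size_t best = 0;
        for (size_t i = 1; i < NTASKS; i++)
            if (t[i].ready &&
                (!t[best].ready || t[i].priority > t[best].priority))
                best = i;
        return best;
    }

    /* Time-sharing policy: on each clock interrupt, simply move to the
       next task in round-robin order, regardless of priority. */
    size_t pick_time_sharing(size_t current) {
        return (current + 1) % NTASKS;
    }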
A library operating system is one in which the services that a typical operating system provides, such as networking, are supplied in the form of libraries and composed with the application and configuration code to construct a unikernel: a specialized, single-address-space machine image that can be deployed to cloud or embedded environments. Early computers were built to perform a series of single tasks, like a calculator. Basic operating system features were developed in the 1950s, such as resident monitor functions that could automatically run different programs in succession to speed up processing. Operating systems did not exist in their more complex forms until the early 1960s. Hardware features were added that enabled use of runtime libraries and parallel processing. When personal computers became popular in the 1980s, operating systems were made for them that were similar in concept to those used on larger computers. In the 1940s, the earliest electronic digital systems had no operating systems.
Electronic systems of this time were programmed on rows of mechanical switches or by jumper wires on plug boards. These were special-purpose systems that, for example, generated ballistics tables for the military or controlled the printing of payroll checks from data on punched paper cards.
The Apple II is an 8-bit home computer, one of the first successful mass-produced microcomputer products, designed by Steve Wozniak. It was introduced in 1977 at the West Coast Computer Faire by Steve Jobs and was the first consumer product sold by Apple Computer, Inc. It is the first model in a series of computers which were produced until Apple IIe production ceased in November 1993. The Apple II marks Apple's first launch of a personal computer aimed at a consumer market – branded towards American households rather than businessmen or computer hobbyists. Byte magazine referred to the Apple II, Commodore PET 2001 and TRS-80 as the "1977 Trinity". The Apple II had the defining feature of being able to display color graphics, and this capability was the reason the Apple logo was redesigned to have a spectrum of colors. By 1976, Steve Jobs had convinced the product designer Jerry Manock to create the "shell" for the Apple II – a smooth case inspired by kitchen appliances that would conceal the internal mechanics.
The earliest Apple IIs were assembled in Silicon Valley and later in Texas. The first computers went on sale on June 10, 1977 with a MOS Technology 6502 microprocessor running at 1.023 MHz, two game paddles, 4 KB of RAM, an audio cassette interface for loading programs and storing data, and the Integer BASIC programming language built into the ROMs. The video controller displays 24 lines by 40 columns of monochrome, uppercase-only text on the screen, with NTSC composite video output suitable for display on a TV monitor, or on a regular TV set by way of a separate RF modulator. The original retail price of the computer was US$1,298 with 4 KB of RAM and US$2,638 with the maximum 48 KB of RAM. To reflect the computer's color graphics capability, the Apple logo on the casing has rainbow stripes, which remained a part of Apple's corporate logo until early 1998. The Apple II was a catalyst for personal computers across many industries. In the May 1977 issue of Byte, Steve Wozniak published a detailed description of his design. The shared timing between the video scanner and the memory eliminated the need for a separate refresh circuit for the DRAM chips, as the video transfer accessed each row of the dynamic memory within the timeout period.
In addition, it did not require separate RAM chips for the video RAM, whereas the PET and TRS-80 used SRAMs for the video. Rather than use a complex analog-to-digital circuit to read the outputs of the game controller, Wozniak used a simple timer circuit whose period is proportional to the resistance of the game controller, and used a software loop to measure the timer. A single 14.31818 MHz master oscillator was divided by various ratios to produce all other required frequencies, including the microprocessor clock signals, the video transfer counters and the color-burst samples. The text and graphics screens have a complex arrangement; for instance, the scanlines are not stored in sequential areas of memory. This complexity was due to Wozniak's realization that the arrangement would allow for the refresh of the dynamic RAM as a side effect. The cost was that software had to calculate or look up the address of the required scanline, but this avoided the need for significant extra hardware. In the high-resolution graphics mode, color is determined by pixel position and thus can be implemented in software, saving Wozniak the chips needed to convert bit patterns to colors.
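Wozniak's paddle-reading trick translates almost directly into code. The sketch below is hypothetical C standing in for 6502 firmware; trigger_timer and timer_output_high are stand-ins for the soft-switch reads the real hardware used.

    #include <stdint.h>

    extern void trigger_timer(void);      /* fire the RC one-shot timer */
    extern int  timer_output_high(void);  /* sample the timer's output bit */

    /* The one-shot's period is proportional to the paddle's resistance,
       so counting loop iterations until the output drops is a crude but
       adequate analog-to-digital conversion, with no ADC chip at all. */
    uint8_t read_paddle(void) {
        uint8_t count = 0;
        trigger_timer();
        while (timer_output_high()) {
            if (++count == 255)   /* clamp at full scale */
                break;
        }
        return count;             /* larger count = higher resistance */
    }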
The position-dependent color scheme also allowed for subpixel font rendering, since orange and blue pixels appear half a pixel-width farther to the right on the screen than green and purple pixels. The Apple II at first used data cassette storage, like most other microcomputers of the time. In 1978, the company introduced an external 5 1⁄4-inch floppy disk drive, the Disk II, attached via a controller card that plugs into one of the computer's expansion slots. The Disk II interface, created by Wozniak, is regarded as an engineering masterpiece for its economy of electronic components; the approach taken in the Disk II controller is typical of Wozniak's designs. With a few small-scale logic chips and a cheap PROM, he created a functional floppy disk interface at a fraction of the component cost of standard circuit configurations. Steve Jobs pushed hard to give the Apple II a case that looked visually appealing and sellable to people outside of electronics hobbyists, rather than the generic wood and metal boxes typical of early microcomputers.
The result was a futuristic-looking molded white plastic case. Jobs paid close attention to the keyboard design and decided to use dark brown keycaps, as they contrasted well with the case; the first production Apple IIs had hand-molded cases. In addition, the initial case design ha
A clock is an instrument used to measure and indicate time. The clock is one of the oldest human inventions, meeting the need to measure intervals of time shorter than the natural units: the day, the lunar month and the year. Devices operating on several physical processes have been used over the millennia. Some predecessors to the modern clock may be considered "clocks" that are based on movement in nature: a sundial, for example, shows the time by displaying the position of a shadow on a flat surface. There is also a range of duration timers, a well-known example being the hourglass. Water clocks, along with sundials, are the oldest time-measuring instruments. A major advance occurred with the invention of the verge escapement, which made possible the first mechanical clocks around 1300 in Europe; these kept time with oscillating timekeepers like balance wheels. Traditionally in horology, the term clock was used for a striking clock, while a clock that did not strike the hours audibly was called a timepiece. In general usage today, a "clock" refers to any device for displaying the time.
Watches and other timepieces that can be carried on one's person are distinguished from clocks. Spring-driven clocks appeared during the 15th century. During the 15th and 16th centuries, clockmaking flourished; the next development in accuracy occurred after 1656 with the invention of the pendulum clock. A major stimulus to improving the accuracy and reliability of clocks was the importance of precise time-keeping for navigation. The electric clock was patented in 1840, and the development of electronics in the 20th century led to clocks with no clockwork parts at all. The timekeeping element in every modern clock is a harmonic oscillator, a physical object that vibrates or oscillates at a particular frequency; this object can be a pendulum, a tuning fork, a quartz crystal, or the vibration of electrons in atoms as they emit microwaves. Clocks have different ways of displaying the time: analog clocks indicate time with moving hands, while digital clocks display a numeric representation of time. Two numbering systems are in use, 12-hour and 24-hour notation.
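To make the harmonic-oscillator point concrete: for a pendulum clock, the period depends only on the pendulum's length and local gravity (a standard physics result, stated here as background rather than taken from the source):

    T = 2\pi \sqrt{L / g}

A "seconds pendulum" with T = 2 s therefore needs L = g(T/2\pi)^2 ≈ 9.81 × (1/\pi)^2 ≈ 0.994 m, which is why longcase clock pendulums are roughly a metre long; the same period-fixed-by-construction principle carries over to tuning forks and quartz crystals.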
Most digital clocks use electronic mechanisms and LCD, LED or VFD displays. For the blind, and for use over telephones, speaking clocks state the time audibly in words; there are also clocks for the blind with displays that can be read by touch. The study of timekeeping is known as horology. The word clock derives from the medieval Latin word for "bell". Clocks spread to England from the Low Countries, so the English word came from the Middle Low German and Middle Dutch Klocke. The apparent position of the Sun in the sky moves over the course of each day, reflecting the rotation of the Earth. Shadows cast by stationary objects move correspondingly, so their positions can be used to indicate the time of day. A sundial shows the time by displaying the position of a shadow on a flat surface which has markings that correspond to the hours. Sundials can be horizontal, vertical, or in other orientations. Sundials were used in ancient times; with knowledge of latitude, a well-constructed sundial can measure local solar time with reasonable accuracy, within a minute or two.
Sundials continued to be used to monitor the performance of clocks until the modern era. Many devices can be used to mark the passage of time without respect to reference time and can be useful for measuring duration or intervals; examples of such duration timers are candle clocks, incense clocks and the hourglass. Both the candle clock and the incense clock work on the same principle, wherein the consumption of resources is more or less constant, allowing reasonably precise and repeatable estimates of time passages. In the hourglass, fine sand pouring through a tiny hole at a constant rate indicates an arbitrary, predetermined passage of time; the resource is not re-used. Water clocks, also known as clepsydrae, along with sundials, are the oldest time-measuring instruments, the only exceptions being the vertical gnomon and the day-counting tally stick. Given their great antiquity, where and when they first existed is not known and perhaps unknowable. The bowl-shaped outflow is the simplest form of a water clock and is known to have existed in Babylon and in Egypt around the 16th century BC.
Other regions of the world, including India and China, have early evidence of water clocks, but the earliest dates are less certain. Some authors write about water clocks appearing as early as 4000 BC in these regions of the world. The Greek astronomer Andronicus of Cyrrhus supervised the construction of the Tower of the Winds in Athens in the 1st century BC. The Greek and Roman civilizations are credited with advancing water clock design to include complex gearing, which was connected to fanciful automata and resulted in improved accuracy. These advances were passed on through Byzantium and Islamic times, making their way back to Europe. Independently, the Chinese developed their own advanced water clocks (水鐘) in 725 AD, passing their ideas on to Korea and Japan. Some water clock designs were developed independently, and some knowledge was transferred through the spread of trade. Pre-modern societies did not have the same precise timekeeping requirements that exist in modern industrial societies, where every hour of work or rest is monitored and work may start or finish at any time regardless of external conditions.
Instead, water clocks in ancient societies were used mainly for astrological reasons. These early water clocks were calibrated with a sundial. While never reaching the level of accuracy of a modern timepiece, the water clock was the most accurate and commonly used timekeeping device for millennia, until it was replaced by the more accurate pendulum clock in 17th-century Europe.
Ethernet is a family of computer networking technologies used in local area networks, metropolitan area networks and wide area networks. It was commercially introduced in 1980 and first standardized in 1983 as IEEE 802.3, and has since retained a good deal of backward compatibility while being refined to support higher bit rates and longer link distances. Over time, Ethernet has replaced competing wired LAN technologies such as Token Ring, FDDI and ARCNET. The original 10BASE5 Ethernet uses coaxial cable as a shared medium, while the newer Ethernet variants use twisted pair and fiber optic links in conjunction with switches. Over the course of its history, Ethernet data transfer rates have been increased from the original 2.94 megabits per second to the latest 400 gigabits per second. The Ethernet standards comprise several wiring and signaling variants of the OSI physical layer. Systems communicating over Ethernet divide a stream of data into shorter pieces called frames; each frame contains source and destination addresses, and error-checking data so that damaged frames can be detected and discarded.
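A sketch of how those pieces fit together, using Ethernet II framing (the field layout matches the standard; the packing attribute is a GCC/Clang extension, shown only for illustration):

    #include <stdint.h>

    /* Ethernet II header: 48-bit destination and source MAC addresses,
       then a 16-bit type field identifying the payload protocol. */
    struct eth_header {
        uint8_t  dest[6];       /* destination MAC address */
        uint8_t  src[6];        /* source MAC address */
        uint16_t ethertype;     /* e.g. 0x0800 = IPv4 */
    } __attribute__((packed));

    /* A payload of 46 to 1500 bytes follows the header, then a 4-byte
       frame check sequence (a CRC-32 over header and payload); the
       receiver recomputes the CRC and discards the frame on mismatch. */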
As per the OSI model, Ethernet provides services up to and including the data link layer. Features such as the 48-bit MAC address and Ethernet frame format have influenced other networking protocols, including the Wi-Fi wireless networking technology. Ethernet is used both in homes and in industry; the Internet Protocol is commonly carried over Ethernet, and so it is considered one of the key technologies that make up the Internet. Ethernet was developed at Xerox PARC between 1973 and 1974, inspired by ALOHAnet. The idea was first documented in a memo that Robert Metcalfe wrote on May 22, 1973, where he named it after the luminiferous aether once postulated to exist as an "omnipresent, completely-passive medium for the propagation of electromagnetic waves". In 1975, Xerox filed a patent application listing Metcalfe, David Boggs, Chuck Thacker and Butler Lampson as inventors. In 1976, after the system was deployed at PARC, Metcalfe and Boggs published a seminal paper. Ron Crane, Bob Garner and Roy Ogus facilitated the upgrade from the original 2.94 Mbit/s protocol to the 10 Mbit/s protocol, which was released to the market in 1980.
Metcalfe left Xerox in June 1979 to form 3Com. He convinced Digital Equipment Corporation, Intel and Xerox to work together to promote Ethernet as a standard; as part of that process Xerox agreed to relinquish their 'Ethernet' trademark. The first standard was published in September 1980 as "The Ethernet, A Local Area Network. Data Link Layer and Physical Layer Specifications". This so-called DIX standard (for Digital, Intel, Xerox) specified 10 Mbit/s Ethernet, with 48-bit destination and source addresses and a global 16-bit Ethertype field. Version 2 was published in November 1982 and defines what has become known as Ethernet II. Formal standardization efforts proceeded at the same time and resulted in the publication of IEEE 802.3 on June 23, 1983. Ethernet initially competed with Token Ring and other proprietary protocols, but was able to adapt to market realities and shift to inexpensive thin coaxial cable and ubiquitous twisted-pair wiring. By the end of the 1980s, Ethernet was the dominant network technology; in the process, 3Com became a major company.
3Com shipped its first 10 Mbit/s Ethernet 3C100 NIC in March 1981, and that year started selling adapters for PDP-11s and VAXes, as well as Multibus-based Intel and Sun Microsystems computers. This was followed by DEC's Unibus to Ethernet adapter, which DEC sold and used internally to build its own corporate network, which reached over 10,000 nodes by 1986, making it one of the largest computer networks in the world at that time. An Ethernet adapter card for the IBM PC was released in 1982, and, by 1985, 3Com had sold 100,000. Parallel-port-based Ethernet adapters were produced with drivers for DOS and Windows. By the early 1990s, Ethernet became so prevalent that it was a must-have feature for modern computers, and Ethernet ports began to appear on some PCs and most workstations. This process was sped up with the introduction of 10BASE-T and its small modular connector, at which point Ethernet ports appeared even on low-end motherboards. Since then, Ethernet technology has evolved to meet new bandwidth and market requirements.
In addition to computers, Ethernet is now used to interconnect appliances and other personal devices. As Industrial Ethernet it is used in industrial applications, and it is replacing legacy data transmission systems in the world's telecommunications networks. By 2010, the market for Ethernet equipment amounted to over $16 billion per year. In February 1980, the Institute of Electrical and Electronics Engineers (IEEE) started project 802 to standardize local area networks. The "DIX-group", with Gary Robinson, Phil Arst and Bob Printis, submitted the so-called "Blue Book" CSMA/CD specification as a candidate for the LAN specification. In addition to CSMA/CD, Token Ring and Token Bus were also considered as candidates for a LAN standard. Competing proposals and broad interest in the initiative led to strong disagreement over which technology to standardize. In December 1980, the group was split into three subgroups, and standardization proceeded separately for each proposal. Delays in the standards process put at risk the market introduction of the Xerox Star workstation and 3Com's Ethernet LAN products.
With such business implications in mind, David Liddle (General Manager, Xerox Office Systems) and Metcalfe (3Com) strongly supported a proposal of Fritz Röscheisen (Siemens Private Networks) for an alliance in the emerging office communication market, including Siemens' support for the international standardization of Ethernet.
The Macintosh SE is a personal computer designed and sold by Apple Computer, Inc. from March 1987 to October 1990. It marked a significant improvement on the Macintosh Plus design and was introduced by Apple at the same time as the Macintosh II; the SE retains the same compact Macintosh form factor as the original Macintosh computer introduced three years earlier, and uses the same design language as the Macintosh II. An enhanced model, the SE/30, was introduced in January 1989; the Macintosh SE was updated in August 1989 to include a SuperDrive, with this updated version being called the "Macintosh SE FDHD" and the "Macintosh SE SuperDrive". The Macintosh SE was replaced with the Macintosh Classic, a similar model which retained the same central processing unit and form factor, but at a lower price point. The Macintosh SE was introduced at the AppleWorld conference in Los Angeles on March 2, 1987. "SE" is an acronym for "System Expansion". Its notable new features, compared to its similar predecessor the Macintosh Plus, were:

- First compact Macintosh with an internal drive bay for a hard disk or a second floppy drive.
- First compact Macintosh with an expansion slot.
- First Macintosh to support the Apple Desktop Bus, previously only available on the Apple IIGS, for keyboard and mouse connections.
- Improved SCSI support with a standard 50-pin internal SCSI connector.
- Better reliability and longer life expectancy due to the addition of a cooling fan.
- Upgraded video circuitry that results in a lower percentage of CPU time being spent drawing the screen; in practice this yields a 10-20 percent performance improvement.
- Additional fonts and kerning routines in the Toolbox ROM.
- Disk First Aid included on the system disk.

The SE and Macintosh II were the first Apple computers since the Apple I to be sold without a keyboard; instead the customer was offered the choice of the new ADB Apple Keyboard or the Apple Extended Keyboard. Apple produced ten SEs with transparent cases as prototypes for promotional use; they are rare and command a premium price among collectors. The Macintosh SE shipped with System 4.0 and Finder 5.4. The README file included with the installation disks for the SE and II is the first place Apple used the term "Macintosh System Software"; after 1998 these two versions were retroactively given the name "Macintosh System Software 2.0.1".
- Processor: Motorola 68000 at 8 MHz, with an 8 MHz system bus and a 16-bit data path.
- RAM: The SE came with 1 MB of RAM as standard and is expandable to 4 MB; the logic board has four 30-pin SIMM slots.
- Video: There is 256 KB of onboard video memory, enough for 512 × 384 monochrome; the built-in screen has a lower resolution of 512 × 342 pixels.
- Storage: The SE can accommodate either one or two floppy drives, or a floppy drive and a hard drive. After-market brackets were designed to allow the SE to accommodate two floppy drives as well as a hard drive, though this was not a configuration supported by Apple. In addition, an external floppy disk drive may be connected, making the SE the only Macintosh besides the Macintosh Portable and Macintosh II that could support three floppy drives, though its increased storage, RAM capacity and optional internal hard drive rendered external drives less of a necessity than for its predecessors. Single-floppy SE models featured a drive-access light in the spot where the second floppy drive would be. Hard-drive-equipped models came with a 20 MB SCSI hard disk.
- Battery: Located on the logic board is a 3.6 V lithium battery, which must be present for basic settings to persist between power cycles. Macintosh SE machines that have sat unused for a long time have experienced battery corrosion and leakage, resulting in a damaged case and logic board.
- Expansion: A Processor Direct Slot on the logic board allows expansion cards, such as accelerators, to be installed; the SE can be upgraded to more than 5 MB with the MicroMac accelerators, and other accelerators such as the Sonnet Allegro were available in the past. Since installing a card required opening the computer's case and exposing the user to high voltages from the internal CRT, Apple recommended that only authorized Apple dealers install the cards.
- Upgrades: After Apple introduced the Macintosh SE/30 in January 1989, a logic board upgrade was sold by Apple dealers as a high-cost upgrade for the SE, consisting of a new SE/30 motherboard, case front and internal chassis to accommodate the upgrade components.
- Easter egg: The Macintosh SE ROM size increased from 64 KB in the original Mac to 256 KB, which allowed the development team to include a hidden Easter egg in the ROMs: by jumping to address 0x41D89A (or reading the ROM chips directly) it is possible to display the four images of the engineering team.

Introduced March 2, 1987: Macintosh SE.
Introduced August 1, 1989: Macintosh SE FDHD: includes the new SuperDrive, a floppy disk drive that can handle 1.4 MB high-density floppy disks; FDHD is an acronym for "Floppy Disk High Density". High-density floppies would become the de facto standard on both Macintosh and PC computers from then on. An upgrade kit was sold for the original Macintosh SE, which included new ROM chips and a new disk controller chip to replace the originals. Macintosh SE 1/20: The name of the Macintosh SE FD
A crystal oscillator is an electronic oscillator circuit that uses the mechanical resonance of a vibrating crystal of piezoelectric material to create an electrical signal with a precise frequency. This frequency is used to keep track of time, as in quartz wristwatches, to provide a stable clock signal for digital integrated circuits, and to stabilize frequencies for radio transmitters and receivers. The most common type of piezoelectric resonator used is the quartz crystal, so oscillator circuits incorporating them became known as crystal oscillators, but other piezoelectric materials, including polycrystalline ceramics, are used in similar circuits. A crystal oscillator, particularly one using a quartz crystal, works by distorting the crystal with an electric field when a voltage is applied to an electrode near or on the crystal; this property is known as inverse piezoelectricity. When the field is removed, the quartz, which oscillates at a precise frequency, generates an electric field as it returns to its previous shape, and this can generate a voltage.
The result is that a quartz crystal behaves like an RLC circuit, but with a much higher Q. Quartz crystals are manufactured for frequencies from a few tens of kilohertz to hundreds of megahertz. More than two billion crystals are manufactured annually; most are used for consumer devices such as wristwatches, radios and cellphones, and quartz crystals are also found inside test and measurement equipment such as counters, signal generators and oscilloscopes. A crystal oscillator is an electronic oscillator circuit that uses a piezoelectric resonator, a crystal, as its frequency-determining element. Crystal is the common term used in electronics for the frequency-determining component, a wafer of quartz crystal or ceramic with electrodes connected to it; a more accurate term for it is piezoelectric resonator. Crystals are also used in other types of electronic circuits, such as crystal filters. Piezoelectric resonators are sold as separate components for use in crystal oscillator circuits.
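To make the RLC analogy concrete, the sketch below computes the series-resonant frequency and quality factor from a crystal's motional (equivalent-circuit) parameters; the component values are illustrative, typical of a crystal near 4 MHz, and are not taken from the source.

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        const double PI = 3.14159265358979;
        double L = 0.1;       /* motional inductance, henries */
        double C = 1.6e-14;   /* motional capacitance, farads */
        double R = 100.0;     /* motional resistance, ohms */

        double f = 1.0 / (2.0 * PI * sqrt(L * C));  /* series resonance */
        double Q = 2.0 * PI * f * L / R;            /* quality factor */

        printf("f = %.3f MHz, Q = %.0f\n", f / 1e6, Q);
        /* prints roughly f = 3.979 MHz and Q = 25000, orders of magnitude
           beyond a discrete LC tank; that high Q is why crystals make
           such stable frequency references */
        return 0;
    }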
Resonators are often incorporated in a single package with the crystal oscillator circuit. Piezoelectricity was discovered by Jacques and Pierre Curie in 1880. Paul Langevin first investigated quartz resonators for use in sonar during World War I. The first crystal-controlled oscillator, using a crystal of Rochelle salt, was built in 1917 and patented in 1918 by Alexander M. Nicholson at Bell Telephone Laboratories, although his priority was disputed by Walter Guyton Cady. Cady built the first quartz crystal oscillator in 1921. Other early innovators in quartz crystal oscillators include Louis Essen. Quartz crystal oscillators were developed for high-stability frequency references during the 1920s and 1930s. Prior to crystals, radio stations controlled their frequency with tuned circuits, which could drift off frequency by 3–4 kHz. Since broadcast stations were assigned frequencies only 10 kHz apart, interference between adjacent stations due to frequency drift was a common problem.
In 1925, Westinghouse installed a crystal oscillator in its flagship station KDKA, and by 1926 quartz crystals were used to control the frequency of many broadcasting stations and were popular with amateur radio operators. In 1928, Warren Marrison of Bell Telephone Laboratories developed the first quartz-crystal clock. With accuracies of up to 1 second in 30 years, quartz clocks replaced precision pendulum clocks as the world's most accurate timekeepers until atomic clocks were developed in the 1950s. Using the early work at Bell Labs, AT&T established their Frequency Control Products division, later spun off and known today as Vectron International. A number of firms started producing quartz crystals for electronic use during this time. Using what are now considered primitive methods, about 100,000 crystal units were produced in the United States during 1939. Through World War II, crystals were made from natural quartz crystal, virtually all of it from Brazil. Shortages of crystals during the war, caused by the demand for accurate frequency control of military and naval radios and radars, spurred postwar research into culturing synthetic quartz, and by 1950 a hydrothermal process for growing quartz crystals on a commercial scale had been developed at Bell Laboratories.
By the 1970s virtually all crystals used in electronics were synthetic. In 1968, Juergen Staudte invented a photolithographic process for manufacturing quartz crystal oscillators while working at North American Aviation, which allowed them to be made small enough for portable products like watches. Although crystal oscillators still most commonly use quartz crystals, devices using other materials are becoming more common, such as ceramic resonators. A crystal is a solid in which the constituent atoms, molecules or ions are packed in an ordered, repeating pattern extending in all three spatial dimensions. Almost any object made of an elastic material could be used like a crystal, with appropriate transducers, since all objects have natural resonant frequencies of vibration; for example, steel is elastic and has a high speed of sound, and it was used in mechanical filters before quartz. The resonant frequency depends on size, shape and the speed of sound in the material. High-frequency crystals are cut in the shape of a simple rectangle or circular disk.
Low-frequency crystals, such as those used in digital watches, are cut in the shape of a tuning fork. For applications not needing precise timing, a low-cost ceramic resonator is often used in place of a quartz crystal. When a crystal of quartz is properly cut and mounted, it can be made to distort in an electric field by applying a voltage to an electrode near or on the crystal.