Electronic design automation
Electronic design automation (EDA), also referred to as electronic computer-aided design (ECAD), is a category of software tools for designing electronic systems such as integrated circuits and printed circuit boards. The tools work together in a design flow that chip designers use to design and analyze entire semiconductor chips. Since a modern semiconductor chip can have billions of components, EDA tools are essential for their design; this article describes EDA with respect to integrated circuits. Before EDA, integrated circuits were designed by hand and manually laid out. Some advanced shops used geometric software to generate the tapes for the Gerber photoplotter, but even those copied digital recordings of mechanically drawn components. The process was fundamentally graphic, with the translation from electronics to graphics done manually; the best-known company from this era was Calma, whose GDSII format survives today. By the mid-1970s, developers started to automate circuit design in addition to drafting, and the first placement and routing tools were developed.
The proceedings of the Design Automation Conference cover much of this era. The next era began about the time of the publication of "Introduction to VLSI Systems" by Carver Mead and Lynn Conway in 1980; this groundbreaking text advocated chip design with programming languages that compile to silicon. The immediate result was a considerable increase in the complexity of the chips that could be designed, with improved access to design verification tools that used logic simulation. Chips became easier to lay out and more likely to function correctly, since their designs could be simulated more thoroughly prior to construction. Although the languages and tools have evolved, this general approach of specifying the desired behavior in a textual programming language and letting the tools derive the detailed physical design remains the basis of digital IC design today. The earliest EDA tools were produced academically. One of the most famous was the "Berkeley VLSI Tools Tarball", a set of UNIX utilities used to design early VLSI systems. Still widely used are the Espresso heuristic logic minimizer and the Magic layout tool.
Another crucial development was the formation of MOSIS, a consortium of universities and fabricators that developed an inexpensive way to train student chip designers by producing real integrated circuits. The basic concept was to use reliable, low-cost, relatively low-technology IC processes and to pack a large number of projects per wafer, with just a few copies of each project's chips. Cooperating fabricators either donated the processed wafers or sold them at cost, seeing the program as helpful to their own long-term growth. 1981 marks the beginning of EDA as an industry. For many years, the larger electronic companies, such as Hewlett-Packard and Intel, had pursued EDA internally. In 1981, managers and developers spun out of these companies to concentrate on EDA as a business. Daisy Systems, Mentor Graphics and Valid Logic Systems were all founded around this time and were collectively referred to as DMV. Within a few years there were many companies specializing in EDA, each with a different emphasis. The first trade show for EDA was held at the Design Automation Conference in 1984.
In 1981, the U.S. Department of Defense began funding the development of VHDL as a hardware description language. In 1986, Verilog, another popular high-level design language, was first introduced as a hardware description language by Gateway Design Automation. Simulators quickly followed these introductions, permitting direct simulation of chip designs: executable specifications. In a few more years, back-ends were developed to perform logic synthesis. Current digital flows are modular: the front ends produce standardized design descriptions that compile into invocations of "cells" without regard to the cell technology. Cells implement logic or other electronic functions using a particular integrated circuit technology. Fabricators provide libraries of components for their production processes, with simulation models that fit standard simulation tools. Analog EDA tools are far less modular, since many more functions are required, they interact more strongly, and the components are less ideal. EDA for electronics has increased in importance with the continuous scaling of semiconductor technology.
Some users are foundry operators, who operate the semiconductor fabrication facilities (or "fabs"), and design-service companies, who use EDA software to evaluate an incoming design for manufacturing readiness. EDA tools are also used for programming design functionality into FPGAs. The main categories of tools include:
High-level synthesis – conversion of a high-level design description into RTL.
Logic synthesis – translation of an RTL design description into a discrete netlist of logic gates.
Schematic capture – for standard cell digital, analog and RF designs, like Capture CIS in OrCAD by Cadence and ISIS in Proteus.
Layout – schematic-driven layout, like Layout in OrCAD by Cadence and ARES in Proteus.
Transistor simulation – low-level transistor simulation of a schematic/layout's behavior, accurate at device level.
Logic simulation – digital simulation of an RTL or gate-netlist's digital behavior, accurate at boolean level.
Behavioral simulation – high-level simulation of a design's architectural operation, accurate at cycle level or interface level.
Hardware emulation – use of special-purpose hardware to emulate the logic of a proposed design.
An emulated design can sometimes be plugged into a system in place of a yet-to-be-built chip. Technology CAD – simulation and analysis of the underlying process technology. Electrical prope
Optical proximity correction
Optical proximity correction (OPC) is a photolithography enhancement technique used to compensate for image errors due to diffraction or process effects. The need for OPC is seen mainly in the making of semiconductor devices and is due to the limitations of light in maintaining the edge placement integrity of the original design, after processing, in the etched image on the silicon wafer. The projected images appear with irregularities such as line widths that are narrower or wider than designed; these are amenable to compensation by changing the pattern on the photomask used for imaging. Other distortions, such as rounded corners, are driven by the resolution of the optical imaging tool and are harder to compensate for; such distortions, if not corrected, may alter the electrical properties of what is being fabricated. Optical proximity correction corrects these errors by moving edges or adding extra polygons to the pattern written on the photomask. This may be driven by pre-computed look-up tables based on the width and spacing between features (known as rule-based OPC) or by using compact models to dynamically simulate the final pattern and thereby drive the movement of edges, broken into sections, to find the best solution (known as model-based OPC).
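The rule-based flavor of correction can be sketched in a few lines. Below is a minimal, illustrative model of a look-up table keyed on feature width and spacing that supplies a per-edge bias; all table values, function names and bucket thresholds are invented for illustration and are not taken from any real process.

```python
# Hypothetical rule-based OPC bias table, keyed on (width, spacing)
# buckets in nanometers. Real tables are calibrated against measured
# or simulated wafer data; these numbers are made up.
BIAS_TABLE_NM = {
    ("narrow", "dense"):    2.0,   # narrow lines in dense regions print thin
    ("narrow", "isolated"): 4.0,   # isolated narrow lines need more bias
    ("wide",   "dense"):    0.5,
    ("wide",   "isolated"): 1.0,
}

def edge_bias(width_nm, spacing_nm):
    """Return the bias (nm) to apply to each edge of a mask polygon."""
    w = "narrow" if width_nm < 100 else "wide"
    s = "dense" if spacing_nm < 150 else "isolated"
    return BIAS_TABLE_NM[(w, s)]
```

Model-based OPC replaces the table with a compact optical model that simulates the printed contour and iteratively moves edge fragments until the simulated pattern matches the target.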
The objective is to reproduce on the semiconductor wafer, as well as possible, the original layout drawn by the designer. The two most visible benefits of OPC are correcting linewidth differences seen between features in regions of different density and correcting line-end shortening. For the former case, linewidth adjustments may be used together with resolution enhancement technologies such as scattering bars. For the latter case, "dog-ear" features may be generated at the line end in the design. OPC has a cost impact on photomask fabrication: the mask write time is related to the complexity of the mask and its data files, and mask inspection for defects takes longer, as the finer edge control requires a smaller spot size. The conventional diffraction-limited resolution is given by the Rayleigh criterion as 0.61 λ/NA, where NA is the numerical aperture and λ is the wavelength of the illumination source. It is common to compare the critical feature width to this value by defining a parameter, k1, such that the feature width equals k1 λ/NA.
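The Rayleigh relation above is easy to evaluate numerically. The sketch below computes the diffraction-limited resolution and the k1 value for a given feature width, using illustrative numbers (λ = 193 nm, NA = 1.35, typical of an ArF immersion scanner) that are assumptions, not values from the text.

```python
# Illustrative Rayleigh-criterion calculation (assumed tool parameters).
wavelength_nm = 193.0   # ArF excimer laser wavelength
na = 1.35               # numerical aperture of an immersion scanner

# Diffraction-limited resolution: 0.61 * lambda / NA  (~87 nm here)
rayleigh_nm = 0.61 * wavelength_nm / na

def k1(feature_nm):
    # feature width = k1 * lambda / NA  =>  k1 = feature * NA / lambda
    return feature_nm * na / wavelength_nm
```

By construction, a feature exactly at the Rayleigh limit has k1 = 0.61, and smaller features correspond to smaller, harder-to-print k1 values.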
Nested features with k1 < 1 benefit less from OPC than isolated features of the same size. The reason is that the spatial frequency spectrum of nested features contains fewer components than that of isolated features; as the feature pitch decreases, more components are truncated by the numerical aperture, resulting in greater difficulty in affecting the pattern in the desired fashion. The degree of coherence of the illumination source is determined by the ratio of its angular extent to the numerical aperture; this ratio is called the partial coherence factor, or σ. It affects the pattern quality and hence the application of OPC. The coherence distance in the image plane is given by 0.5 λ/(σ NA); two image points separated by more than this distance will be effectively uncorrelated, allowing a simpler OPC application. This distance is in fact close to the Rayleigh criterion for values of σ close to 1. A related point is that OPC cannot substitute for the correct choice of illumination: if off-axis illumination is required, OPC cannot be used to switch back to on-axis illumination, because with on-axis illumination the relevant imaging information is scattered outside the final aperture, preventing any imaging.
Aberrations in optical projection systems deform wavefronts, as can the spectrum or spread of illumination angles, and both effects can reduce the depth of focus. While the use of OPC can offer significant benefits to depth of focus, aberrations can more than offset these benefits. Good depth of focus requires diffracted light traveling at comparable angles to the optical axis, and this requires the appropriate illumination angle. Assuming the correct illumination angle, OPC can direct more diffracted light along the right angles for a given pitch, but without the correct illumination angle, such angles will not arise. As the k1 factor has been shrinking over the past technology generations, the anticipated requirement of moving to multiple exposures to generate circuit patterns becomes more real. This approach will affect the application of OPC, as one will need to take into account the sum of the image intensities from each exposure; this is the case for the complementary photomask technique, where the images of an alternating-aperture phase-shifting mask and a conventional binary mask are added together.
In contrast to multiple exposure of the same photoresist film, multiple layer patterning entails repeated photoresist coating and etching to pattern the same device layer. This gives an opportunity to use looser design rules to pattern the same layer. Depending on the lithography tool used to image at these looser design rules, the OPC will be
A photomask is an opaque plate with holes or transparencies that allow light to shine through in a defined pattern. Photomasks are commonly used in photolithography. Lithographic photomasks are typically transparent fused silica blanks covered with a pattern defined with a chromium metal absorbing film. Photomasks are used at wavelengths of 365 nm, 248 nm and 193 nm. Photomasks have also been developed for other forms of radiation such as 157 nm, 13.5 nm, X-rays and ions. A set of photomasks, each defining a pattern layer in integrated circuit fabrication, is fed into a photolithography stepper or scanner and individually selected for exposure. In double patterning techniques, a photomask would correspond to a subset of the layer pattern. In photolithography for the mass production of integrated circuit devices, the more correct term is photoreticle or simply reticle. In the case of a photomask, there is a one-to-one correspondence between the mask pattern and the wafer pattern; this was the standard for the 1:1 mask aligners that were succeeded by steppers and scanners with reduction optics.
As used in steppers and scanners, the reticle typically contains only one layer of the chip. The pattern is shrunk by four or five times when projected onto the wafer surface. To achieve complete wafer coverage, the wafer is "stepped" from position to position under the optical column until full exposure is achieved. Features 150 nm or below in size require phase-shifting to enhance the image quality to acceptable values. This can be achieved in many ways; the two most common methods are to use an attenuated phase-shifting background film on the mask to increase the contrast of small intensity peaks, or to etch the exposed quartz so that the edge between the etched and unetched areas can be used to image nearly zero intensity. In the second case, unwanted edges would need to be trimmed out with another exposure. The former method is known as attenuated phase-shifting and is considered a weak enhancement, requiring special illumination for the most benefit, while the latter method is known as alternating-aperture phase-shifting and is the most popular strong enhancement technique.
As leading-edge semiconductor features shrink, photomask features that are 4× larger must shrink as well. This could pose challenges, since the absorber film will need to become thinner and hence less opaque. A recent study by IMEC found that, with state-of-the-art photolithography tools, thinner absorbers degrade image contrast and therefore contribute to line-edge roughness. One possibility is to eliminate absorbers altogether and use "chromeless" masks, relying solely on phase-shifting for imaging. The emergence of immersion lithography also has a strong impact on photomask requirements: the commonly used attenuated phase-shifting mask is more sensitive to the higher incidence angles applied in "hyper-NA" lithography, due to the longer optical path through the patterned film. Leading-edge photomasks contain images of the final chip patterns magnified by four times; this magnification factor has been a key benefit in reducing pattern sensitivity to imaging errors. However, as features continue to shrink, two trends come into play: the first is that the mask error factor begins to exceed one, i.e. the dimension error on the wafer may be more than 1/4 the dimension error on the mask; the second is that, as the mask feature becomes smaller, the dimension tolerance approaches a few nanometers.
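The interplay between the magnification factor and the mask error factor (MEEF) can be written as a small calculation. The helper below is a sketch under the simple linear model wafer_error = MEEF × mask_error / magnification; the function name and defaults are my own, chosen for illustration.

```python
def mask_tolerance(wafer_tol_nm, magnification=4.0, meef=1.0):
    """Allowed mask dimension error (nm) for a given wafer tolerance.

    Assumes the linear model: wafer_error = meef * mask_error / magnification,
    so the allowed mask error is wafer_tol * magnification / meef.
    """
    return wafer_tol_nm * magnification / meef

# At 4x magnification and MEEF = 1, a 1.25 nm wafer tolerance
# corresponds to a 5 nm tolerance on the photomask; once MEEF
# exceeds one, the mask budget tightens proportionally.
budget_ideal = mask_tolerance(1.25)            # MEEF = 1
budget_tight = mask_tolerance(1.25, meef=2.0)  # MEEF = 2
```

This is why a mask error factor above one erodes the traditional benefit of reduction optics: the mask maker's error budget shrinks even though the mask features are still four times larger than the wafer features.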
For example, a 25 nm wafer pattern should correspond to a 100 nm mask pattern, but the wafer tolerance could be 1.25 nm, which translates into 5 nm on the photomask. The variation of electron-beam scattering in directly writing the photomask pattern can well exceed this. The term "pellicle" is used to mean "film", "thin film", or "membrane". Beginning in the 1960s, a thin film stretched on a metal frame, known as a "pellicle", was used as a beam splitter for optical instruments. It has been used in a number of instruments to split a beam of light without causing an optical path shift, owing to its small film thickness. In 1978, Shea et al. at IBM patented a process to use the pellicle as a dust cover to protect a photomask or reticle. In this context, "pellicle" means a thin-film dust cover that protects a photomask. Particle contamination can be a significant problem in semiconductor manufacturing. A photomask is protected from particles by a pellicle – a thin transparent film stretched over a frame that is glued over one side of the photomask.
The pellicle is far enough away from the mask patterns that moderate-to-small particles landing on the pellicle will be too far out of focus to print. Although they are designed to keep particles away, pellicles become a part of the imaging system, and their optical properties need to be taken into account. Pellicle membranes are commonly made of nitrocellulose and are produced for various transmission wavelengths. The SPIE Annual Conference on Photomask Technology reports the SEMATECH Mask Industry Assessment, which includes current industry analysis and the results of their annual photomask manufacturers survey. The following companies are listed in order of their global market share: Dai Nippon Printing, Toppan Photomasks, Photronics Inc., Hoya Corporation, Taiwan Mask Corporation and Compugraphics. Major chipmakers such as Intel, GlobalFoundries, IBM, NEC, TSMC, UMC and Micron Technology have their own large maskmaking facilities or joint ventures with the abovementioned companies. The worldwide photomask market was estimated at $3.2 billion in 2012 and $3.1 billion in 2013.
Half of the mark
GDSII stream format, common acronym GDSII, is a database file format that is the de facto industry standard for data exchange of integrated circuit (IC) layout artwork. It is a binary file format representing planar geometric shapes, text labels and other information about the layout in hierarchical form. The data can be used to reconstruct all or part of the artwork, for sharing layouts, transferring artwork between different tools, or creating photomasks. GDS stands for Graphic Data System. Initially, GDSII was designed as a format used to control integrated circuit photomask plotting. Despite its limited set of features and low data density, it became the industry's conventional format for transfer of IC layout data between design tools of different vendors, all of which otherwise operated with proprietary data formats. It was developed by Calma for its layout design software products, "Graphic Data System" and "GDSII". GDSII files are the final output product of the IC design cycle and are given to IC foundries for IC fabrication.
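The binary framing of a GDSII stream is simple enough to sketch: each record starts with a 4-byte header containing a big-endian 16-bit total length (header included), a record-type byte, and a data-type byte. The parser below is a minimal, illustrative reader of that framing only (the function name is my own; the record-type values 0x00 for HEADER and 0x04 for ENDLIB come from the published format manual).

```python
import struct

def parse_records(data: bytes):
    """Yield (record_type, data_type, payload) tuples from a GDSII stream.

    Each record header is 4 bytes: big-endian uint16 total length,
    then a record-type byte and a data-type byte; the payload is
    whatever follows up to the stated length.
    """
    offset = 0
    while offset + 4 <= len(data):
        (length,) = struct.unpack_from(">H", data, offset)
        if length < 4:      # zero-length words pad out tape blocks
            break
        rec_type = data[offset + 2]
        dat_type = data[offset + 3]
        payload = data[offset + 4 : offset + length]
        yield rec_type, dat_type, payload
        offset += length

# A minimal two-record stream: HEADER (0x00) carrying version 600 as a
# big-endian 2-byte integer (data type 0x02), followed by ENDLIB (0x04).
stream = b"\x00\x06\x00\x02\x02\x58" + b"\x00\x04\x04\x00"
records = list(parse_records(stream))
```

Real files add many record types (BGNLIB, BGNSTR, BOUNDARY, LAYER, XY and so on), but every one of them uses this same length/type framing, which is why generic GDSII utilities are straightforward to write.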
GDSII files were originally placed on magnetic tapes; this moment was fittingly called tape out. Objects contained in a GDSII file are grouped by assigning numeric attributes to them, including a "layer number", "datatype" or "texttype". While these attributes were designed to correspond to the "layers of material" used in manufacturing an integrated circuit, their meaning became more abstract to reflect the way that the physical layout is designed. As of October 2004, many EDA software vendors have begun to support a new format, OASIS, which may replace GDSII. Because the GDSII stream format is a de facto standard, it is supported by nearly all EDA software. Besides the commercial vendors there are plenty of free GDSII utilities; these free tools include editors, utilities to convert the 2D layout data into common 3D formats, utilities to convert the binary format to a human-readable ASCII format, and program libraries.
A prototype is an early sample, model, or release of a product built to test a concept or process or to act as a thing to be replicated or learned from. It is a term used in a variety of contexts, including semantics, design and software programming. A prototype is used to evaluate a new design to enhance precision by system analysts and users. Prototyping serves to provide specifications for a real, working system rather than a theoretical one. In some design workflow models, creating a prototype is the step between the formalization and the evaluation of an idea. The word prototype derives from the Greek πρωτότυπον prototypon, "primitive form", neuter of πρωτότυπος prototypos, "original, primitive", from πρῶτος protos, "first" and τύπος typos, "impression". Prototypes explore different aspects of an intended design: A Proof-of-Principle Prototype serves to verify some key functional aspects of the intended design, but does not have all the functionality of the final product. A Working Prototype represents all or nearly all of the functionality of the final product.
A Visual Prototype represents the size and appearance, but not the functionality, of the intended design. A Form Study Prototype is a preliminary type of visual prototype in which the geometric features of a design are emphasized, with less concern for color, texture, or other aspects of the final appearance. A User Experience Prototype represents enough of the appearance and function of the product that it can be used for user research. A Functional Prototype captures both the function and appearance of the intended design, though it may be created with different techniques and at a different scale from the final design. A Paper Prototype is a printed or hand-drawn representation of the user interface of a software product; such prototypes are used for early testing of a software design and can be part of a software walkthrough to confirm design decisions before more costly levels of design effort are expended. In general, the creation of prototypes will differ from creation of the final product in some fundamental ways: Material: The materials that will be used in a final product may be expensive or difficult to fabricate, so prototypes may be made from different materials than the final product.
In some cases, the final production materials may still be undergoing development themselves and not yet be available for use in a prototype. Process: Mass-production processes are unsuitable for making a small number of parts, so prototypes may be made using different fabrication processes than the final product. For example, a final product that will be made by plastic injection molding will require expensive custom tooling, so a prototype for this product may be fabricated by machining or stereolithography instead. Differences in fabrication process may lead to differences in the appearance of the prototype as compared to the final product. Verification: The final product may be subject to a number of quality assurance tests to verify conformance with drawings or specifications; these tests may involve custom inspection fixtures, statistical sampling methods and other techniques appropriate for ongoing production of a large quantity of the final product. Prototypes, in contrast, are made with much closer individual inspection and with the assumption that some adjustment or rework will be part of the fabrication process.
Prototypes may be exempted from some requirements that will apply to the final product. Engineers and prototype specialists attempt to minimize the impact of these differences on the intended role for the prototype. For example, if a visual prototype is not able to use the same materials as the final product, they will attempt to substitute materials with properties that simulate the intended final materials. Engineers and prototyping specialists seek to understand the limitations of prototypes in simulating the characteristics of their intended design. It is important to realize that, by their very definition, prototypes will represent some compromise from the final production design. Due to differences in materials and design fidelity, it is possible that a prototype may fail to perform acceptably whereas the production design may have been sound. A counter-intuitive idea is that prototypes may instead perform acceptably whereas the production design is flawed, since prototyping materials and processes may outperform their production counterparts.
In general, it can be expected that individual prototype costs will be greater than the final production costs due to inefficiencies in materials and processes. Prototypes are used to revise the design for the purposes of reducing costs through optimization and refinement. It is possible to use prototype testing to reduce the risk that a design may not perform as intended; however, prototypes cannot eliminate all risk. There are pragmatic and practical limitations to the ability of a prototype to match the intended final performance of the product, and some allowances and engineering judgement are required before moving forward with a production design. Building the full design is expensive and can be time-consuming when repeated several times: building the full design, figuring out what the problems are and how to solve them, then building another full design. As an alternative, rapid prototyping or rapid application development techniques are used for the initial prototypes, which implement part, but not all, of the complete design.
This allows designers and manufacturers to rapidly and inexpensively test the parts of the design that are most likely to have problems, solve those problems, and then build the full design. This counter-intuitive idea—that the quickest way to build something is, f
BoPET (biaxially-oriented polyethylene terephthalate) is a polyester film made from stretched polyethylene terephthalate and is used for its high tensile strength, dimensional stability, reflectivity, aroma-barrier properties and electrical insulation. A variety of companies manufacture boPET and other polyester films under different brand names. In the UK and US, the most well-known trade names are Mylar and Hostaphan. BoPET film was developed in the mid-1950s by DuPont, Imperial Chemical Industries and Hoechst. In 1955 Eastman Kodak used Mylar as a support for photographic film and called it "ESTAR Base"; the thin and tough film allowed 6,000-foot reels to be exposed on long-range U-2 reconnaissance flights. In 1964, NASA launched Echo II, a 40-metre-diameter balloon constructed from a 9-micrometre-thick Mylar film sandwiched between two layers of 4.5-micrometre-thick aluminium foil bonded together. The manufacturing process begins with a film of molten polyethylene terephthalate being extruded onto a chill roll, which quenches it into the amorphous state.
It is then biaxially oriented by drawing. The most common way of doing this is the sequential process, in which the film is first drawn in the machine direction using heated rollers and subsequently drawn in the transverse direction, i.e. orthogonally to the direction of travel, in a heated oven. It is also possible to draw the film in both directions simultaneously, although the equipment required for this is somewhat more elaborate. Draw ratios are around 3 to 4 in each direction. Once the drawing is completed, the film is "heat set" or crystallized under tension in the oven at temperatures above 200 °C; the heat-setting step prevents the film from shrinking back to its original unstretched shape and locks in the molecular orientation in the film plane. The orientation of the polymer chains is responsible for the high strength and stiffness of biaxially oriented PET film, which has a typical Young's modulus of about 4 GPa. Another important consequence of the molecular orientation is that it induces the formation of many crystal nuclei.
The crystallites that grow reach the boundary of the neighboring crystallite and so remain smaller than the wavelength of visible light. As a result, biaxially oriented PET film has excellent clarity, despite its semicrystalline structure. If it were produced without any additives, the surface of the film would be so smooth that layers would adhere to one another when the film is wound up, similar to the sticking of clean glass plates when stacked. To make handling possible, microscopic inert inorganic particles, such as silicon dioxide, are embedded in the PET to roughen the surface of the film. Biaxially oriented PET film can be metallized by vapor deposition of a thin film of evaporated aluminium, gold, or other metal onto it; the result is much less permeable to gases and reflects up to 99% of light, including much of the infrared spectrum. For some applications like food packaging, the aluminized boPET film can be laminated with a layer of polyethylene, which provides sealability and improves puncture resistance.
The polyethylene side of such a laminate appears dull and the PET side shiny. Other coatings, such as conductive indium tin oxide, can be applied to boPET film by sputter deposition. Uses for boPET polyester films include, but are not limited to:
- Laminates containing metallized boPET foil, which protect food against oxidation and aroma loss, achieving long shelf life; examples are pouches for convenience foods
- White boPET web substrate used as lidding for dairy goods such as yogurt
- Clear boPET web substrate used as lidding for frozen ready meals; due to its excellent heat resistance, it can remain on the package during oven heating
- Roasting bags
- Metallised films
- Laminated sheet metal used in the manufacture of cans
- A clear overlay on a map, on which notations, additional data, or copied data can be drawn without damaging the map
- A mirror-like decorative surface (metallized boPET) on some book covers, T-shirts and other flexible cloths
- Protective covering over buttons/pins/badges
- The glossy top layer of a Polaroid SX-70 photographic print
- A backing for fine sandpaper
- Bagging comic books, in order to best protect them during storage from environmental conditions that would otherwise cause the paper to deteriorate over time
This material is used for archival-quality storage of documents by the Library of Congress and several major library comic book research collections, including the Comic Art Collection at Michigan State University. While boPET is used in this archival sense, it is not immune to the effects of fire and heat and could melt, depending on the intensity of the heat source, causing further damage to the encased item. Trading card decks are packaged in pouches or sleeves made of metallized boPET, and it can be used to make the holographic artwork featured on some cards, known as "holos", "foils", or "holofoils". Further uses include protecting the spines of important documents such as medical records, electrical insulation, and insulation for houses and tents that reflects thermal radiation. Five layers of metallized boPET film in NASA's spacesuits make them radiation resistant and help regulate temperature, and metallized boPET film emergency blankets conserve a shock victim's bod
Semiconductor device fabrication
Semiconductor device fabrication is the process used to create the integrated circuits that are present in everyday electrical and electronic devices. It is a multiple-step sequence of photolithographic and chemical processing steps during which electronic circuits are created on a wafer made of pure semiconducting material. Silicon is almost always used, but various compound semiconductors are used for specialized applications. The entire manufacturing process, from start to packaged chips ready for shipment, takes six to eight weeks and is performed in highly specialized facilities referred to as foundries or fabs. In more advanced semiconductor devices, such as modern 14/10/7 nm nodes, fabrication can take up to 15 weeks, with 11–13 weeks being the industry average. Production in advanced fabrication facilities is automated and carried out in a hermetically sealed nitrogen environment to improve yield, with FOUPs and automated material handling systems taking care of the transport of wafers from machine to machine.
By industry standard, each generation of the semiconductor manufacturing process, known as a "technology node", is designated by the process's minimum feature size. Technology nodes, also known as "process technologies" or simply "nodes", are typically indicated by the size in nanometers of the process's gate length. As of 2019, 14 nanometer and 10 nanometer process chips are in mass production, with 7 nanometer process chips in mass production by TSMC and Samsung, although their node definition is similar to Intel's 10 nanometer process. Semiconductor device manufacturing has spread from Texas and California in the 1960s to the rest of the world, including Europe, the Middle East and Asia; it is a global business today. The leading semiconductor manufacturers have facilities all over the world. Intel, the second largest manufacturer, has facilities in Europe and Asia as well as the U.S.; Samsung, the world's largest manufacturer of semiconductors, has facilities in South Korea and the US; and TSMC, the world's largest pure-play foundry, has facilities in Taiwan, China and the US.
Qualcomm and Broadcom are among the biggest fabless semiconductor companies, outsourcing their production to companies like TSMC. When feature widths were far greater than about 10 micrometres, semiconductor purity was not as big an issue as it is today in device manufacturing; as devices became more integrated, cleanrooms became cleaner. Today, fabrication plants are pressurized with filtered air to remove even the smallest particles, which could come to rest on the wafers and contribute to defects. The workers in a semiconductor fabrication facility are required to wear cleanroom suits to protect the devices from human contamination. A typical wafer is made out of pure silicon, grown into mono-crystalline cylindrical ingots up to 300 mm in diameter using the Czochralski process. These ingots are then sliced into wafers about 0.75 mm thick and polished to obtain a regular and flat surface. In semiconductor device fabrication, the various processing steps fall into four general categories: deposition, removal, patterning and modification of electrical properties.
Deposition is any process that grows, coats, or otherwise transfers a material onto the wafer. Available technologies include physical vapor deposition, chemical vapor deposition, electrochemical deposition, molecular beam epitaxy and, more recently, atomic layer deposition, among others. Removal is any process that removes material from the wafer, such as etching or chemical-mechanical planarization. Patterning is the shaping or altering of deposited materials and is generally referred to as lithography. For example, in conventional lithography, the wafer is coated with a chemical called a photoresist, which is exposed to light through a photomask and then developed; after etching or other processing, the remaining photoresist is removed by plasma ashing. Modification of electrical properties has historically entailed doping transistor sources and drains; these doping processes are followed by furnace annealing or, in advanced devices, by rapid thermal annealing. Modification of electrical properties now also extends to the reduction of a material's dielectric constant in low-k insulators via exposure to ultraviolet light in UV processing. Modification is also achieved by oxidation, which can be carried out to create semiconductor-insulator junctions, such as in the local oxidation of silicon to fabricate metal-oxide-semiconductor field-effect transistors.
Modern chips have up to eleven metal levels produced in over 300 sequenced processing steps. FEOL (front-end-of-line) processing refers to the formation of the transistors directly in the silicon. The raw wafer is engineered by the growth of an ultrapure, virtually defect-free silicon layer through epitaxy. In the most advanced logic devices, prior to the silicon epitaxy step, tricks are performed to improve the performance of the transistors to be built. One method involves introducing a straining step wherein a silicon variant such as silicon-germanium is deposited. Once the epitaxial silicon is deposited, the crystal lattice becomes stretched somewhat, resulting in improved electronic mobility. Another method, called silicon-on-insulator technology, involve