Transistor–transistor logic (TTL) is a logic family built from bipolar junction transistors. Its name signifies that transistors perform both the logic function and the amplifying function. TTL integrated circuits were used in applications such as computers, industrial controls, test equipment and instrumentation, consumer electronics, and synthesizers. TTL-compatible logic levels are sometimes not associated directly with TTL integrated circuits; for example, they may be used at the inputs and outputs of electronic instruments. After their introduction in integrated circuit form in 1963 by Sylvania, TTL integrated circuits were manufactured by several semiconductor companies; the 7400 series by Texas Instruments became particularly popular. TTL manufacturers offered a wide range of logic gates, flip-flops, and other circuits. Variations of the original TTL circuit design offered higher speed or lower power dissipation to allow design optimization. TTL devices were originally made in ceramic and plastic dual in-line packages and in flat-pack form. TTL chips are now also made in surface-mount packages.
TTL became a foundation of computers and other digital electronics. Even after very-large-scale integration (VLSI) integrated circuits made multiple-circuit-board processors obsolete, TTL devices still found extensive use as glue logic interfacing between more densely integrated components. TTL was invented in 1961 by James L. Buie of TRW, which declared it "particularly suited to the newly developing integrated circuit design technology." The original name for TTL was transistor-coupled transistor logic. The first commercial integrated-circuit TTL devices were manufactured by Sylvania in 1963, called the Sylvania Universal High-Level Logic family; the Sylvania parts were used in the controls of the Phoenix missile. TTL became popular with electronic systems designers after Texas Instruments introduced the 5400 series of ICs, with military temperature range, in 1964 and the 7400 series, specified over a narrower temperature range and offered in inexpensive plastic packages, in 1966; the Texas Instruments 7400 family became an industry standard.
Compatible parts were made by Motorola, AMD, Intel, Signetics, Siemens, SGS-Thomson, National Semiconductor, and many other companies, even in the Eastern Bloc. Not only did others make compatible TTL parts, but compatible parts were made using many other circuit technologies as well. At least one manufacturer, IBM, produced non-compatible TTL circuits for its own use. The term "TTL" is applied to many successive generations of bipolar logic, with gradual improvements in speed and power consumption over about two decades. The last-introduced family, 74Fxx, is still sold today and was used into the late 1990s; 74AS/ALS Advanced Schottky was introduced in 1985. As of 2008, Texas Instruments continued to supply the more general-purpose chips in numerous obsolete technology families, albeit at increased prices. Typically, TTL chips integrate no more than a few hundred transistors each. Functions within a single package range from a few logic gates to a microprocessor bit-slice. TTL became important because its low cost made digital techniques economically practical for tasks previously done by analog methods.
The Kenbak-1, an ancestor of the first personal computers, used TTL for its CPU instead of a microprocessor chip, which was not available in 1971. The Datapoint 2200 from 1970 used TTL components for its CPU and was the basis for the 8008 and the x86 instruction set. The 1973 Xerox Alto and 1981 Star workstations, which introduced the graphical user interface, used TTL circuits integrated at the level of arithmetic logic units and bit-slices, respectively. Most computers used TTL-compatible "glue logic" between larger chips well into the 1990s; until the advent of programmable logic, discrete bipolar logic was used to prototype and emulate microarchitectures under development. TTL inputs are the emitters of bipolar transistors. In the case of NAND inputs, the inputs are the emitters of a multiple-emitter transistor, functionally equivalent to multiple transistors whose bases and collectors are tied together; the output is buffered by a common-emitter amplifier. With all inputs at logical one: when all the inputs are held at high voltage, the base–emitter junctions of the multiple-emitter transistor are reverse-biased.
Unlike in DTL, a small "collector" current is drawn by each of the inputs when they are held high; this is because the input transistor is then operating in reverse-active mode. An approximately constant current flows from the positive rail, through the resistor, and into the base of the multiple-emitter transistor; this current passes through the base–emitter junction of the output transistor, allowing it to conduct and pulling the output voltage low. With an input at logical zero: note that the base–collector junction of the multiple-emitter transistor and the base–emitter junction of the output transistor are in series between the bottom of the resistor and ground. If one input voltage becomes zero, the corresponding base–emitter junction of the multiple-emitter transistor is in parallel with these two junctions. A phenomenon called current steering means that when two voltage-stable elements with different threshold voltages are connected in parallel, the current flows through the path with the smaller threshold voltage; that is, current flows out of this input and into the zero-voltage source. As a result, no current flows through the base of the output transistor, so it stops conducting and the output voltage becomes high.
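As a behavioral summary of the gate just described, here is a minimal sketch, assuming the standard TTL input thresholds (at most 0.8 V reads as logical zero, at least 2.0 V as logical one) and typical output levels; the constant names and the ttl_nand function are illustrative, not from the source.

```python
# Behavioral (logic-level) model of an n-input TTL NAND gate.
# Thresholds are the standard TTL input levels; output voltages
# are typical values, not device-exact.

V_IL_MAX = 0.8   # highest input voltage still read as logical 0
V_IH_MIN = 2.0   # lowest input voltage read as logical 1
V_OL = 0.2       # typical output voltage for logical 0
V_OH = 3.4       # typical output voltage for logical 1

def ttl_nand(*input_volts: float) -> float:
    """Return the output voltage of an idealized TTL NAND gate."""
    for v in input_volts:
        if v <= V_IL_MAX:   # a low input steers current away from the
            return V_OH     # output transistor, so the output stays high
        if v < V_IH_MIN:    # forbidden region between the thresholds
            raise ValueError(f"input {v} V is between V_IL and V_IH")
    return V_OL             # all inputs high: output pulled low

print(ttl_nand(3.3, 3.3))   # 0.2 (all inputs high, output low)
print(ttl_nand(0.2, 3.3))   # 3.4 (one input low, output high)
```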
A finite-state machine (FSM), also called a finite-state automaton, finite automaton, or simply a state machine, is a mathematical model of computation. It is an abstract machine that can be in one of a finite number of states at any given time; the FSM can change from one state to another in response to external inputs, a change called a transition. An FSM is defined by a list of its states, its initial state, and the conditions for each transition. Finite-state machines are of two types: deterministic finite-state machines and non-deterministic finite-state machines. A deterministic finite-state machine can be constructed equivalent to any non-deterministic one. The behavior of state machines can be observed in many devices in modern society that perform a predetermined sequence of actions depending on a sequence of events with which they are presented. Simple examples are vending machines, which dispense products when the proper combination of coins is deposited; elevators, whose sequence of stops is determined by the floors requested by riders; traffic lights, which change sequence when cars are waiting; and combination locks, which require the input of combination numbers in the proper order.
The finite-state machine has less computational power than some other models of computation such as the Turing machine. This distinction means there are computational tasks that a Turing machine can do but an FSM cannot; this is because an FSM's memory is limited by the number of states it has. FSMs are studied in the more general field of automata theory. An example of a simple mechanism that can be modeled by a state machine is a turnstile. A turnstile, used to control access to subways and amusement park rides, is a gate with three rotating arms at waist height, one across the entryway. Initially the arms are locked, blocking the entry and preventing patrons from passing through. Depositing a coin or token in a slot on the turnstile unlocks the arms, allowing a single customer to push through. After the customer passes through, the arms are locked again. Considered as a state machine, the turnstile has two possible states: Locked and Unlocked. There are two possible inputs that affect its state: putting a coin in the slot (coin) and pushing the arm (push).
In the locked state, pushing on the arm has no effect. Putting a coin in, that is, giving the machine a coin input, shifts the state from Locked to Unlocked. In the unlocked state, putting additional coins in has no effect. However, a customer pushing through the arms, giving a push input, shifts the state back to Locked. The turnstile state machine can be represented by a state transition table, showing for each possible state the transitions between states and the outputs resulting from each input. It can also be represented by a directed graph called a state diagram. Each state is represented by a node, and edges show the transitions from one state to another; each arrow is labeled with the input that triggers it. An input that doesn't cause a change of state is represented by a circular arrow returning to the original state; the arrow into the Locked node from the black dot indicates that it is the initial state. A state is a description of the status of a system that is waiting to execute a transition. A transition is a set of actions to be executed when a condition is fulfilled or when an event is received.
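The turnstile's transition table maps directly onto a dictionary keyed by (state, input) pairs; the following is a minimal sketch, with names such as TRANSITIONS and run invented for illustration.

```python
# Minimal turnstile finite-state machine driven by a transition table.
# Keys are (current_state, input) pairs; values are the next state.
TRANSITIONS = {
    ("Locked",   "coin"): "Unlocked",  # coin unlocks the arms
    ("Locked",   "push"): "Locked",    # pushing a locked arm does nothing
    ("Unlocked", "coin"): "Unlocked",  # extra coins have no effect
    ("Unlocked", "push"): "Locked",    # passing through re-locks the arms
}

def run(events, state="Locked"):
    """Feed a sequence of inputs to the machine, starting in Locked."""
    for event in events:
        state = TRANSITIONS[(state, event)]
        print(f"{event:>4} -> {state}")
    return state

run(["push", "coin", "coin", "push", "push"])
# push -> Locked, coin -> Unlocked, coin -> Unlocked,
# push -> Locked, push -> Locked
```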
For example, when using an audio system to listen to the radio, receiving a "next" stimulus results in moving to the next station. When the system is in the "CD" state, the "next" stimulus results in moving to the next track. Identical stimuli trigger different actions depending on the current state. In some finite-state machine representations, it is also possible to associate actions with a state: an entry action, performed when entering the state, and an exit action, performed when exiting the state (a small sketch follows this paragraph). Several state transition table types are used; in the most common representation, the combination of current state and input shows the next state. The complete action information is not directly described in the table and can only be added using footnotes. An FSM definition including the full action information is possible using state tables. The Unified Modeling Language (UML) has a notation for describing state machines. UML state machines overcome the limitations of traditional finite-state machines while retaining their main benefits.
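To make entry/exit actions and state-dependent responses concrete, here is a small sketch of the audio-system example; the class and method names (AudioState, on_entry, and so on) are invented for illustration.

```python
# States carry entry/exit actions and interpret the same "next"
# stimulus differently depending on the current state (radio vs. CD).
class AudioState:
    def on_entry(self): pass   # entry action: runs when state is entered
    def on_exit(self): pass    # exit action: runs when state is left
    def next(self): raise NotImplementedError

class Radio(AudioState):
    def __init__(self):
        self.station = 1
    def on_entry(self):
        print("radio: tuner on")
    def next(self):
        self.station += 1
        print(f"radio: station {self.station}")

class CD(AudioState):
    def __init__(self):
        self.track = 1
    def on_entry(self):
        print("cd: spinning up disc")
    def on_exit(self):
        print("cd: stopping disc")
    def next(self):
        self.track += 1
        print(f"cd: track {self.track}")

class AudioSystem:
    def __init__(self, initial: AudioState):
        self.state = initial
        self.state.on_entry()
    def switch(self, new_state: AudioState):
        self.state.on_exit()    # exit action of the old state
        self.state = new_state
        self.state.on_entry()   # entry action of the new state
    def next(self):
        self.state.next()       # same stimulus, state-dependent action

system = AudioSystem(Radio())   # radio: tuner on
system.next()                   # radio: station 2
system.switch(CD())             # cd: spinning up disc
system.next()                   # cd: track 2
```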
UML state machines introduce the new concepts of hierarchically nested states and orthogonal regions, while extending the notion of actions. UML state machines have the characteristics of both Moore and Mealy machines: they support actions that depend on both the state of the system and the triggering event, as in Mealy machines, as well as entry and exit actions, which are associated with states rather than transitions, as in Moore machines. The Specification and Description Language (SDL) is a standard from the ITU that includes graphical symbols to describe actions in a transition: send an event, receive an event, start a timer, cancel a timer, start another concurrent state machine, and make a decision. SDL embeds basic data types called "Abstract Data Types", an action language, and an execution semantic in order to make the finite-state machine executable. There are a large number of other variants for representing an FSM. In addition to their use in modeling reactive systems, finite-state machines are significant in many other fields.
Open-source hardware consists of physical artifacts of technology designed and offered by the open-design movement. Both free and open-source software and open-source hardware are created by this open-source culture movement, which applies a like concept to a variety of components; the combination is sometimes referred to as FOSH (free and open-source hardware). The term means that information about the hardware is easily discerned so that others can make it, coupling it to the maker movement. The hardware design, in addition to the software that drives the hardware, is released under free/libre terms. The original sharer gains feedback and improvements on the design from the FOSH community. There is now significant evidence that such sharing can drive a high return on investment for the scientific community. Since the rise of reconfigurable programmable logic devices, sharing of logic designs has been a form of open-source hardware: instead of schematics, hardware description language (HDL) code is shared. HDL descriptions are used to set up system-on-a-chip designs either in field-programmable gate arrays or directly in application-specific integrated circuit designs.
HDL modules, when distributed, are called semiconductor intellectual property cores, or IP cores. Open-source hardware helps alleviate the issue of proprietary device drivers for the free and open-source software community; however, it is not a prerequisite for it, and it should not be confused with the concept of open documentation for proprietary hardware, which is already sufficient for writing FLOSS device drivers and complete operating systems. The difference between the two concepts is that OSH includes both the instructions on how to replicate the hardware itself and the information on the communication protocols that the software must use in order to communicate with the hardware, whereas open-source-friendly proprietary hardware would only include the latter without the former. The first hardware-focused "open source" activities were started around 1997 by Bruce Perens, creator of the Open Source Definition, co-founder of the Open Source Initiative, and a ham radio operator. He launched the Open Hardware Certification Program, which had the goal of allowing hardware manufacturers to self-certify their products as open.
Shortly after the launch of the Open Hardware Certification Program, David Freeman announced the Open Hardware Specification Project, another attempt at licensing hardware components whose interfaces are available publicly and at creating a new computing platform as an alternative to proprietary computing systems. In early 1999, Sepehr Kiani, Ryan Vallance, and Samir Nayfeh joined efforts to apply the open-source philosophy to machine design applications. Together they established the Open Design Foundation as a non-profit corporation and set out to develop an Open Design Definition, but most of these activities faded out after a few years. By the mid-2000s, open-source hardware again became a hub of activity due to the emergence of several major open-source hardware projects and companies, such as OpenCores, RepRap, Arduino, and SparkFun. In 2007, Perens reactivated the openhardware.org website. Following the Open Graphics Project, an effort to design and manufacture a free and open 3D graphics chip set and reference graphics card, Timothy Miller suggested the creation of an organization to safeguard the interests of the Open Graphics Project community.
Thus, Patrick McNamara founded the Open Hardware Foundation in 2007. The Tucson Amateur Packet Radio Corporation (TAPR), founded in 1982 as a non-profit organization of amateur radio operators with the goal of supporting R&D efforts in the area of amateur digital communications, created the first open hardware license, the TAPR Open Hardware License, in 2007. OSI president Eric S. Raymond expressed some concerns about certain aspects of the OHL and decided not to review the license. Around 2010, in the context of the Freedom Defined project, the Open Hardware Definition was created as the collaborative work of many and, as of 2016, is accepted by dozens of organizations and companies. In July 2011, CERN released an open-source hardware license, the CERN OHL. Javier Serrano, an engineer at CERN's Beams Department and the founder of the Open Hardware Repository, explained: "By sharing designs CERN expects to improve the quality of designs through peer review and to guarantee their users – including commercial companies – the freedom to study and manufacture them, leading to better hardware and less duplication of efforts".
While drafted to address CERN-specific concerns, such as tracing the impact of the organization's research, in its current form the license can be used by anyone developing open-source hardware. Following the 2011 Open Hardware Summit, and after heated debates on licenses and on what constitutes open-source hardware, Bruce Perens abandoned the OSHW Definition and the concerted efforts of those involved with it. Openhardware.org, led by Bruce Perens, promoted and identified practices that meet all the combined requirements of the Open Source Hardware Definition, the Open Source Definition, and the Four Freedoms of the Free Software Foundation; since 2014, openhardware.org has not been online and appears to have ceased activity. The Open Source Hardware Association at oshwa.org proposes its own definition of open-source hardware and acts as a hub of open-source hardware activity of all genres, while cooperating with other entities.
Technical drawing, also known as drafting or draughting, is the act and discipline of composing drawings that visually communicate how something functions or is constructed. Technical drawing is essential for communicating ideas in engineering. To make the drawings easier to understand, people use familiar symbols, units of measurement, notation systems, visual styles, and page layouts. Together, such conventions constitute a visual language and help to ensure that the drawing is unambiguous and easy to understand. Many of the symbols and principles of technical drawing are codified in an international standard called ISO 128. The need for precise communication in the preparation of a functional document distinguishes technical drawing from the expressive drawing of the visual arts. Artistic drawings are subjectively interpreted; technical drawings are understood to have one intended meaning. A drafter, draftsperson, or draughtsman is a person who makes a drawing. A professional drafter who makes technical drawings is sometimes called a drafting technician.
A sketch is a quickly executed, freehand drawing that is not intended as a finished work. In general, sketching is a quick way to record an idea for later use. Architects' sketches serve as a way to try out different ideas and establish a composition before undertaking a more finished work, especially when the finished work is expensive and time-consuming. Architectural sketches, for example, are a kind of diagram; these sketches, like metaphors, are used by architects as a means of communication in aiding design collaboration. This tool helps architects to abstract attributes of hypothetical provisional design solutions and summarize their complex patterns, thereby enhancing the design process. The basic drafting procedure is to place a piece of paper on a smooth surface with right-angle corners and straight sides, typically a drawing board. A sliding straightedge known as a T-square is then placed on one of the sides, allowing it to be slid across the side of the table over the surface of the paper. "Parallel lines" can be drawn by moving the T-square and running a pencil or technical pen along the T-square's edge.
The T-square is also used to hold other devices such as set squares or triangles. In this case, the drafter places one or more triangles of known angles on the T-square, which is itself at right angles to the edge of the table, and can then draw lines at any chosen angle to others on the page. Modern drafting tables come equipped with a drafting machine, supported on both sides of the table so that it can slide over a large piece of paper; because it is secured on both sides, lines drawn along the edge are guaranteed to be parallel. In addition, the drafter uses several technical drawing tools to draw curves and circles. Primary among these are the compasses, used for drawing simple arcs and circles, and the French curve, for drawing curves. A spline is a rubber-coated, articulated metal strip that can be manually bent to fit most curves. Drafting templates assist the drafter with creating recurring objects in a drawing without having to reproduce the object from scratch every time; this is especially useful when using common symbols. Templates are sold commercially by a number of vendors, usually customized to a specific task, but it is not uncommon for a drafter to create his own templates.
This basic drafting system requires an accurate table and constant attention to the positioning of the tools. A common error is to allow the triangles to push the top of the T-square down, thereby throwing off all angles. Even tasks as simple as drawing two angled lines meeting at a point require a number of moves of the T-square and triangles, and in general, drafting can be a time-consuming process. A solution to these problems was the introduction of the mechanical "drafting machine", an application of the pantograph which allowed the drafter to obtain an accurate right angle at any point on the page quite quickly; these machines included the ability to change the angle, thereby removing the need for the triangles as well. In addition to mastery of the mechanics of drawing lines and circles onto a piece of paper, the drafting effort requires, with respect to the detailing of physical objects, a thorough understanding of geometry and spatial comprehension, and in all cases demands precision, accuracy, and attention to detail of a high order.
Although drafting is sometimes accomplished by a project engineer, architect, or shop personnel, skilled drafters usually accomplish the task, and they are always in demand to some degree. Today the mechanics of the drafting task have been automated and accelerated through the use of computer-aided design (CAD) systems. There are two types of computer-aided design systems used for the production of technical drawings: two-dimensional and three-dimensional. 2D CAD systems such as AutoCAD or MicroStation replace the paper drawing discipline. The lines, circles, and curves are created within the software; it is down to the technical drawing skill of the user to produce the drawing. There is still much scope for error in the drawing when producing first- and third-angle orthographic projections, auxiliary projections, and cross sections. A 2D CAD system is essentially an electronic drawing board, and its greatest strength over direct-to-paper technical drawing is in the making of revisions: whereas a conventional hand-drawn technical drawing must be redrawn if a mistake is found or a modification is required, a CAD system allows a copy of the original to be modified, saving considerable time.
The drivetrain of a motor vehicle is the group of components that deliver power to the driving wheels, excluding the engine or motor that generates the power. In contrast, the powertrain is considered to include both the engine and the drivetrain. The function of the drivetrain is to couple the engine that produces the power to the driving wheels that use this mechanical power to rotate the axle. This connection involves physically linking the two components, which may be at opposite ends of the vehicle, requiring a long propeller shaft or drive shaft. The operating speeds of the engine and wheels are different and must be matched by the correct gear ratio. As the vehicle speed changes, the ideal engine speed must remain roughly constant for efficient operation, and so the gearbox ratio must be changed, either manually, automatically, or by automatic continuous variation; a worked numeric example follows this paragraph. The precise components of the drivetrain vary according to the type of vehicle. Some typical examples: with a manual gearbox and rear-wheel drive, a flywheel (a dual-mass flywheel is still rare), clutch, gearbox (an overdrive is only sometimes fitted), propeller shaft, and a rear axle with final drive and differential; with an automatic, a torque converter, transmission, propeller shaft, and a rear axle with final drive and differential; with front-wheel drive, a clutch and a transaxle combining gearbox, final drive, and differential, with drive shafts and constant-velocity joints to each wheel; with four-wheel drive, a clutch, gearbox, transfer box, transmission brake, propeller shafts to front and rear, front and rear axles, final drive, a locking differential, and portal gears. Related layouts and topics include two-wheel drive, four-wheel drive, 6×4, six-wheel drive, eight-wheel drive, H-drive, continuous track, the hybrid vehicle drivetrain (the drivetrain of hybrid vehicles), and the powertrain (the drivetrain plus the engine).
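To illustrate why the gearbox ratio must change with road speed, the sketch below computes the engine speed implied by a given road speed and overall ratio; the tire size and ratios are invented, illustrative figures, not from the source.

```python
import math

# Illustrative, invented figures for a small passenger car.
TIRE_DIAMETER_M = 0.63                   # rolling diameter of the tire
FINAL_DRIVE = 3.9                        # differential (final drive) ratio
GEAR_RATIOS = [3.5, 2.1, 1.4, 1.0, 0.8]  # gearbox ratios, 1st..5th

def engine_rpm(road_speed_kmh: float, gear: int) -> float:
    """Engine speed needed to sustain road_speed_kmh in the given gear."""
    # Wheel revolutions per minute from distance covered per minute.
    wheel_rpm = (road_speed_kmh * 1000 / 60) / (math.pi * TIRE_DIAMETER_M)
    return wheel_rpm * GEAR_RATIOS[gear - 1] * FINAL_DRIVE

# The same 90 km/h implies very different engine speeds per gear,
# which is why the ratio is changed as vehicle speed changes.
for gear in (2, 3, 4, 5):
    print(f"gear {gear}: {engine_rpm(90, gear):5.0f} rpm")
```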
Visualization or visualisation is any technique for creating images, diagrams, or animations to communicate a message. Visualization through visual imagery has been an effective way to communicate both abstract and concrete ideas since the dawn of humanity. Examples from history include cave paintings, Egyptian hieroglyphs, Greek geometry, and Leonardo da Vinci's revolutionary methods of technical drawing for engineering and scientific purposes. Visualization today has ever-expanding applications in science, engineering, interactive multimedia, and more. A typical visualization application is the field of computer graphics; the invention of computer graphics may be the most important development in visualization since the invention of central perspective in the Renaissance period. The development of animation also helped advance visualization. The use of visualization to present information is not a new phenomenon: it has been used in maps, scientific drawings, and data plots for over a thousand years. Examples from cartography include Ptolemy's Geographia, a map of China, and Minard's map of Napoleon's invasion of Russia made a century and a half ago.
Most of the concepts learned in devising these images carry over in a straightforward manner to computer visualization. Edward Tufte has written three critically acclaimed books that explain many of these principles. Computer graphics has from its beginning been used to study scientific problems. However, in its early days the lack of graphics power often limited its usefulness. The recent emphasis on visualization started in 1987 with the publication of Visualization in Scientific Computing, a special issue of Computer Graphics. Since then there have been several conferences and workshops, co-sponsored by the IEEE Computer Society and ACM SIGGRAPH, devoted to the general topic and to special areas in the field, for example volume visualization. Most people are familiar with the digital animations produced to present meteorological data during weather reports on television, though few can distinguish between those models of reality and the satellite photos that are also shown on such programs. TV also offers scientific visualizations when it shows computer-drawn and animated reconstructions of road or airplane accidents.
Some of the most popular examples of scientific visualizations are computer-generated images that show real spacecraft in action, out in the void far beyond Earth, or on other planets. Dynamic forms of visualization, such as educational animation or timelines, have the potential to enhance learning about systems that change over time. Apart from the distinction between interactive visualizations and animation, the most useful categorization is between abstract and model-based scientific visualizations. The abstract visualizations show conceptual constructs in 2D or 3D; these generated shapes are arbitrary. The model-based visualizations either place overlays of data on real or digitally constructed images of reality, or make a digital construction of a real object directly from the scientific data. Scientific visualization is usually done with specialized software, though there are a few exceptions, noted below. Some of these specialized programs have been released as open-source software, often having their origins in universities, within an academic environment where sharing software tools and giving access to the source code is common.
There are also many proprietary software packages of scientific visualization tools. Models and frameworks for building visualizations include the data-flow models popularized by systems such as AVS, IRIS Explorer, and the VTK toolkit, and data-state models in spreadsheet systems such as the Spreadsheet for Visualization and Spreadsheet for Images. As a subject in computer science, scientific visualization is the use of interactive, sensory representations, typically visual, of abstract data to reinforce cognition, hypothesis building, and reasoning. Data visualization is a related subcategory of visualization dealing with statistical graphics and geographic or spatial data abstracted in schematic form. Scientific visualization is the transformation, selection, or representation of data from simulations or experiments, with an implicit or explicit geometric structure, to allow the exploration and understanding of the data. It focuses on and emphasizes the representation of higher-order data using graphics and animation techniques.
It is an important part of visualization, and perhaps the first, as the visualization of experiments and phenomena is as old as science itself. Traditional areas of scientific visualization are flow visualization, medical visualization, astrophysical visualization, and chemical visualization. There are several different techniques to visualize scientific data, with isosurface reconstruction and direct volume rendering being the more common; a small 2D analogue is sketched after this paragraph. Educational visualization uses a simulation to create an image of something so it can be taught about; this is useful when teaching about a topic that is difficult to otherwise see, for example atomic structure, because atoms are far too small to be studied without expensive and difficult-to-use scientific equipment. Information visualization concentrates on the use of computer-supported tools to explore large amounts of abstract data; the term "information visualization" was coined by the User Interface Research Group at Xerox PARC, which included Jock Mackinlay. Practical application of information visualization in computer programs involves selecting and representing abstract data in a form that facilitates human interaction for exploration and understanding.
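As a minimal sketch of the isosurface idea in its 2D analogue, isocontour extraction, the following uses numpy and matplotlib to draw level sets of an invented scalar field; the field and isovalues are illustrative only.

```python
# 2D analogue of isosurface reconstruction: extracting isocontours
# (level sets) of a scalar field with matplotlib.
import numpy as np
import matplotlib.pyplot as plt

# Invented scalar field sampled on a regular grid.
x, y = np.meshgrid(np.linspace(-2, 2, 200), np.linspace(-2, 2, 200))
field = np.exp(-(x**2 + y**2)) + 0.5 * np.exp(-((x - 1)**2 + y**2))

fig, ax = plt.subplots()
cs = ax.contour(x, y, field, levels=[0.2, 0.4, 0.6, 0.8])  # isovalues
ax.clabel(cs, inline=True)   # label each contour with its level
ax.set_title("Isocontours of a scalar field")
plt.show()
```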
Electronic design automation (EDA), also referred to as electronic computer-aided design, is a category of software tools for designing electronic systems such as integrated circuits and printed circuit boards. The tools work together in a design flow that chip designers use to design and analyze entire semiconductor chips. Since a modern semiconductor chip can have billions of components, EDA tools are essential for their design; this article describes EDA with respect to integrated circuits. Before EDA, integrated circuits were designed by hand and manually laid out. Some advanced shops used geometric software to generate the tapes for the Gerber photoplotter, but even those merely copied digital recordings of mechanically drawn components. The process was fundamentally graphic, with the translation from electronics to graphics done manually; the best-known company from this era was Calma. By the mid-1970s, developers started to automate the design along with the drafting, and the first placement and routing tools were developed.
The proceedings of the Design Automation Conference cover much of this era. The next era began about the time of the publication of "Introduction to VLSI Systems" by Carver Mead and Lynn Conway in 1980. This groundbreaking text advocated chip design with programming languages. The immediate result was a considerable increase in the complexity of the chips that could be designed, with improved access to design verification tools that used logic simulation. The chips were easier to lay out and more likely to function correctly, since their designs could be simulated more thoroughly prior to construction. Although the languages and tools have evolved, this general approach of specifying the desired behavior in a textual programming language and letting the tools derive the detailed physical design remains the basis of digital IC design today. The earliest EDA tools were produced academically. One of the most famous was the "Berkeley VLSI Tools Tarball", a set of UNIX utilities used to design early VLSI systems. Still used today are the Espresso heuristic logic minimizer and Magic.
Another crucial development was the formation of MOSIS, a consortium of universities and fabricators that developed an inexpensive way to train student chip designers by producing real integrated circuits. The basic concept was to use reliable, low-cost, relatively low-technology IC processes and to pack a large number of projects on each wafer, with just a few copies of each project's chips. Cooperating fabricators either donated the processed wafers or sold them at cost, seeing the program as helpful to their own long-term growth. 1981 marks the beginning of EDA as an industry. For many years, the larger electronic companies, such as Hewlett-Packard and Intel, had pursued EDA internally. In 1981, managers and developers spun out of these companies to concentrate on EDA as a business. Daisy Systems, Mentor Graphics, and Valid Logic Systems were all founded around this time, and were collectively referred to as DMV. Within a few years there were many companies specializing in EDA, each with a slightly different emphasis. The first trade show for EDA was held at the Design Automation Conference in 1984.
In 1981, the U.S. Department of Defense began funding VHDL as a hardware description language. In 1986, Verilog, another popular high-level design language, was first introduced as a hardware description language by Gateway Design Automation. Simulators quickly followed these introductions, permitting direct simulation of chip designs: executable specifications. In a few more years, back-ends were developed to perform logic synthesis. Current digital flows are highly modular: the front ends produce standardized design descriptions that compile into invocations of "cells" without regard to the cell technology. Cells implement logic or other electronic functions using a particular integrated-circuit technology. Fabricators generally provide libraries of components for their production processes, with simulation models that fit standard simulation tools. Analog EDA tools are far less modular, since many more functions are required, they interact more strongly, and the components are, in general, less ideal. EDA for electronics has increased in importance with the continuous scaling of semiconductor technology.
Some users are foundry operators, who operate the semiconductor fabrication facilities ("fabs"), and design-service companies who use EDA software to evaluate an incoming design for manufacturing readiness. EDA tools are also used for programming design functionality into FPGAs. The major design steps include the following, with a toy logic-simulation sketch after the list:
High-level synthesis: a high-level design description is converted into RTL.
Logic synthesis: translation of an RTL design description into a discrete netlist of logic gates.
Schematic capture: for standard-cell digital, analog, and RF designs, like Capture CIS in OrCAD by Cadence and ISIS in Proteus.
Layout: schematic-driven layout, like Layout in OrCAD by Cadence and ARES in Proteus.
Transistor simulation: low-level transistor simulation of a schematic/layout's behavior, accurate at the device level.
Logic simulation: digital simulation of an RTL or gate-netlist's digital behavior, accurate at the boolean level.
Behavioral simulation: high-level simulation of a design's architectural operation, accurate at the cycle or interface level.
Hardware emulation: use of special-purpose hardware to emulate the logic of a proposed design.
Emulators can sometimes be plugged into a system in place of a yet-to-be-built chip.
Technology CAD: simulation and analysis of the underlying process technology, with electrical properties of devices derived directly from device physics.
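As a toy illustration of boolean-level logic simulation from the list above, here is a sketch that evaluates a small invented gate netlist (a half adder); the netlist format and function names are not from any real tool.

```python
# Toy gate-level logic simulator: evaluates a netlist of boolean gates.
GATES = {
    "AND":  lambda a, b: a & b,
    "OR":   lambda a, b: a | b,
    "NAND": lambda a, b: 1 - (a & b),
    "XOR":  lambda a, b: a ^ b,
}

# Half adder: sum = a XOR b, carry = a AND b.
NETLIST = [
    ("XOR", ("a", "b"), "sum"),
    ("AND", ("a", "b"), "carry"),
]

def simulate(netlist, inputs):
    """Evaluate each gate in order; nets hold 0/1 values."""
    nets = dict(inputs)
    for gate, (i0, i1), out in netlist:
        nets[out] = GATES[gate](nets[i0], nets[i1])
    return nets

for a in (0, 1):
    for b in (0, 1):
        nets = simulate(NETLIST, {"a": a, "b": b})
        print(f"a={a} b={b} -> sum={nets['sum']} carry={nets['carry']}")
```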