Java is an island of Indonesia, bordered by the Indian Ocean to the south and the Java Sea to the north. With a population of over 141 million (or about 145 million including the inhabitants of its surrounding islands), Java is home to 56.7 percent of the Indonesian population and is the world's most populous island. The Indonesian capital city, Jakarta, is located on its northwestern coast. Much of Indonesian history took place on Java: it was the center of powerful Hindu-Buddhist empires and Islamic sultanates, and the core of the colonial Dutch East Indies. Java was also the center of the Indonesian struggle for independence during the 1930s and 1940s. Java dominates Indonesia politically and culturally. Four of Indonesia's eight UNESCO World Heritage Sites are located on Java: Ujung Kulon National Park, Borobudur Temple, Prambanan Temple and the Sangiran Early Man Site. Formed as the result of volcanic eruptions from geologic subduction between the Sunda Plate and the Australian Plate, Java is the 13th largest island in the world and the fifth largest in Indonesia by landmass, at about 138,800 square kilometres.
A chain of volcanic mountains forms an east–west spine along the island. Three main languages are spoken on the island: Javanese, Sundanese and Madurese, of which Javanese is the most widely spoken. Furthermore, most residents are bilingual, speaking Indonesian as their second language. While the majority of the people of Java are Muslim, Java's population comprises people of diverse religious beliefs and cultures. Java is divided into four administrative provinces (West Java, Central Java, East Java and Banten) and two special regions (Jakarta and Yogyakarta). The origins of the name "Java" are not clear. One possibility is that the island was named after the jáwa-wut plant, said to be common on the island at the time, and that prior to Indianization the island had different names. There are other possible sources: the word jaú and its variations mean "beyond" or "distant", and, in Sanskrit, yava means barley, a plant for which the island was famous. "Yavadvipa" is mentioned in the Ramayana: Sugriva, the chief of Rama's army, dispatched his men to Yavadvipa, the island of Java, in search of Sita.
It was hence referred to in India by the Sanskrit name "yāvaka dvīpa". Java is mentioned in the ancient Tamil text Manimekalai by Chithalai Chathanar, which states that Java had a kingdom with a capital called Nagapuram. Another source states that the word "Java" is derived from a Proto-Austronesian root word, Iawa, meaning "home". The great island of Iabadiu or Jabadiu was mentioned in Ptolemy's Geographia, composed around 150 CE in the Roman Empire. Iabadiu is said to mean "barley island", to be rich in gold, and to have a silver town called Argyra at its west end. The name indicates Java and seems to be derived from the Sanskrit name Java-dvipa. The Chinese annals Songshu and Liangshu referred to Java as She-po; later sources called it He-ling, then She-po again, until the Yuan dynasty, when they began mentioning Zhao-Wa. According to Ma Huan's book, the Chinese called Java Chao-Wa, and the island had been called She-pó in the past. When John of Marignolli returned from China to Avignon, he stayed at the Kingdom of Saba for a few months, which he said had many elephants and was led by a queen.
Java lies between Sumatra to the west and Bali to the east. Borneo lies to the north and Christmas Island is to the south. Java is surrounded by the Java Sea to the north, the Sunda Strait to the west, the Indian Ocean to the south, and the Bali Strait and Madura Strait to the east. Java is entirely of volcanic origin; the highest volcano on Java is Mount Semeru, and the most active volcano on Java and in Indonesia is Mount Merapi. In total, Java boasts more than 150 mountains. These mountains and highlands split the interior into a series of relatively isolated regions suitable for wet-rice cultivation. Java was the first place where Indonesian coffee was grown, starting in 1699. Today, Coffea arabica is grown on the Ijen Plateau by larger plantations. The area of Java is about 150,000 square kilometres. It is up to 210 km wide. The island's longest river is the 600 km long Solo River, which rises from its source in central Java at the Lawu volcano and flows north and eastward to its mouth in the Java Sea near the city of Surabaya.
Other major rivers are the Brantas, Citarum and Serayu. The average temperature ranges from 22 °C to 29 °C; the northern coastal plains are hotter, averaging 34 °C during the day in the dry season. The south coast is cooler than the north, and highland areas inland are cooler still. The wet season ends in April; during it, rain falls mostly in the afternoons, and only intermittently during other parts of the year. The wettest months are January and February. West Java is wetter than East Java, and mountainous regions receive much higher rainfall: the Parahyangan highlands of West Java receive over 4,000 millimetres annually, while the north coast of East Java receives 900 millimetres annually.
BASIC is a family of general-purpose, high-level programming languages whose design philosophy emphasizes ease of use. In 1964, John G. Kemeny and Thomas E. Kurtz designed the original BASIC language at Dartmouth College; they wanted to enable students in fields other than mathematics to use computers. At the time, nearly all use of computers required writing custom software, something only scientists and mathematicians tended to learn. In addition to the language itself, Kemeny and Kurtz developed the Dartmouth Time Sharing System (DTSS), which allowed multiple users to edit and run BASIC programs at the same time. This general model became popular on minicomputer systems like the PDP-11 and Data General Nova in the late 1960s and early 1970s. Hewlett-Packard produced an entire computer line for this method of operation, introducing the HP2000 series in the late 1960s and continuing sales into the 1980s. Many early video games trace their history to one of these versions of BASIC. The emergence of early microcomputers in the mid-1970s led to the development of the original Microsoft BASIC in 1975.
Due to the tiny main memory available on these machines, often only 4 kB, a variety of Tiny BASIC dialects were created. BASIC was available for practically any system of the era and became the de facto programming language for the home computer systems that emerged in the late 1970s. These machines almost always had a BASIC installed by default, either in the machine's firmware or sometimes on a ROM cartridge. BASIC fell from use during the 1980s as newer machines with far greater capabilities came to market and other programming languages became tenable. In 1991, Microsoft released Visual Basic, combining an updated version of BASIC with a visual forms builder. This reignited use of the language, and "VB" remains a major programming language in the form of VB.NET. John G. Kemeny was the math department chairman at Dartmouth College, and largely on his reputation as an innovator in math teaching, in 1959 the school won an Alfred P. Sloan Foundation award of $500,000 to build a new department building. Thomas E. Kurtz had joined the department in 1956, and from the early 1960s they agreed on the need for programming literacy among students outside the traditional STEM fields.
Kemeny noted that "Our vision was that every student on campus should have access to a computer, and any faculty member should be able to use a computer in the classroom whenever appropriate. It was as simple as that." Kemeny and Kurtz had made two previous experiments with simplified languages, DARSIMCO and DOPE. These did not progress past a single freshman class. New experiments using Fortran and ALGOL followed, but Kurtz concluded these languages were too tricky for what they desired. As Kurtz noted, Fortran had numerous oddly-formed commands, notably an "almost impossible-to-memorize convention for specifying a loop: 'DO 100, I = 1, 10, 2'. Is it '1, 10, 2' or '1, 2, 10', and is the comma after the line number required or not?" Moreover, the lack of any sort of immediate feedback was a key problem. Kurtz suggested that time-sharing offered a solution: small programs would return results in a few seconds. This led to increasing interest in a system using time-sharing and a new language for use by non-STEM students. Kemeny wrote the first version of BASIC.
The acronym BASIC comes from the name of an unpublished paper by Thomas Kurtz. The new language was patterned on FORTRAN II, but the syntax was changed wherever it could be improved. For instance, the difficult-to-remember DO loop was replaced by the much easier FOR I = 1 TO 10 STEP 2, and the line number used in the DO was instead indicated by the NEXT I. Likewise, the cryptic IF statement of Fortran, whose syntax matched a particular instruction of the machine on which it was written, became the simpler IF I=5 THEN GOTO 100. These changes made the language much less idiosyncratic while still having an overall structure and feel similar to the original FORTRAN. The project received a $300,000 grant from the National Science Foundation, which was used to purchase a GE-225 computer for processing and a Datanet-30 realtime processor to handle the Teletype Model 33 teleprinters used for input and output. A team of a dozen undergraduates worked on the project for about a year, writing both the DTSS system and the BASIC compiler. The main CPU was later replaced by a GE-235, and later still by a GE-635. The first version of the BASIC language was released on 1 May 1964.
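For readers unfamiliar with the notation, the counted loop above can be sketched in another language. The Python below is purely an illustration of what FOR I = 1 TO 10 STEP 2 does (note that BASIC's TO bound is inclusive, while Python's range() excludes its stop value):

```python
# Python sketch of the BASIC loop  FOR I = 1 TO 10 STEP 2 ... NEXT I.
# BASIC's upper bound (10) is inclusive, so range() needs stop = 11.
total = 0
for i in range(1, 11, 2):  # visits 1, 3, 5, 7, 9
    total += i
print(total)  # 25
```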
One of the graduate students on the implementation team was Sr. Mary Kenneth Keller, one of the first people in the United States to earn a Ph.D. in computer science and the first woman to do so. BASIC concentrated on supporting straightforward mathematical work, with matrix arithmetic supported from its initial implementation as a batch language and character string functionality added by 1965. Wanting use of the language to become widespread, its designers made the compiler available free of charge. They also made it available to high schools in the Hanover, New Hampshire area and put considerable effort into promoting the language.
Machine code is a computer program written in machine language instructions that can be executed directly by a computer's central processing unit (CPU). Each instruction causes the CPU to perform a specific task, such as a load, a store, a jump, or an arithmetic logic unit (ALU) operation on one or more units of data in CPU registers or memory. Machine code is a strictly numerical language, intended to run as fast as possible, and may be regarded as the lowest-level representation of a compiled or assembled computer program, or as a primitive and hardware-dependent programming language. While it is possible to write programs directly in machine code, managing individual bits and calculating numerical addresses and constants manually is tedious and error-prone. For this reason, programs are rarely written directly in machine code in modern contexts, though it may still be done for low-level debugging, program patching, or assembly language disassembly. The overwhelming majority of practical programs today are written in higher-level languages or assembly language.
The source code is translated to executable machine code by utilities such as compilers and linkers, with the important exception of interpreted programs, which are not translated into machine code. However, the interpreter itself, which may be seen as an executor or processor performing the instructions of the source code, typically consists of directly executable machine code. Machine code is by definition the lowest level of programming detail visible to the programmer, but internally many processors use microcode or optimise and transform machine code instructions into sequences of micro-operations; this is not considered machine code per se. Every processor or processor family has its own instruction set. Instructions are patterns of bits that by physical design correspond to different commands to the machine; thus, the instruction set is specific to a class of processors using the same architecture. Successor or derivative processor designs often include all the instructions of a predecessor and may add additional instructions.
A successor design may also discontinue or alter the meaning of some instruction codes, affecting code compatibility to some extent. Systems may also differ in other details, such as memory arrangement, operating systems, or peripheral devices; because a program normally relies on such factors, different systems will typically not run the same machine code, even when the same type of processor is used. A processor's instruction set may have all instructions of the same length, or it may have variable-length instructions. How the patterns are organized varies with the particular architecture and with the type of instruction. Most instructions have one or more opcode fields, which specify the basic instruction type and the actual operation, and other fields that may give the type of the operand, the addressing mode, the addressing offset or index, or the actual value itself. Not all machines or individual instructions have explicit operands; an accumulator machine has a combined left operand and result in an implicit accumulator for most arithmetic instructions.
Other architectures have accumulator versions of common instructions, with the accumulator regarded as one of the general registers by longer instructions. A stack machine has all of its operands on an implicit stack. Special-purpose instructions also often lack explicit operands. This distinction between explicit and implicit operands is important in code generators, in their register allocation and live range tracking parts. A good code optimizer can track implicit as well as explicit operands, which may allow more frequent constant propagation, constant folding of registers, and other code enhancements. A computer program is a list of instructions; the CPU executes these instructions in order to solve a specific problem and thus accomplish a specific result. While simple processors execute instructions one after another, superscalar processors are capable of executing several instructions at once. Program flow may be influenced by special 'jump' instructions that transfer execution to an instruction other than the numerically following one.
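The implicit-operand style can be made concrete with a short sketch. The toy stack machine below (its opcodes are invented for illustration and belong to no real instruction set) keeps every operand on an implicit stack, so its arithmetic instructions carry no operand fields at all:

```python
# A toy stack machine: arithmetic instructions take no explicit operands;
# they pop their inputs from, and push their result onto, an implicit stack.
def run_stack(program):
    stack = []
    for instr in program:
        op = instr[0]
        if op == "PUSH":
            stack.append(instr[1])               # the only instruction with an operand
        elif op == "ADD":
            stack.append(stack.pop() + stack.pop())
        elif op == "MUL":
            stack.append(stack.pop() * stack.pop())
    return stack.pop()

# Computes (2 + 3) * 4 with no register names anywhere in the program.
print(run_stack([("PUSH", 2), ("PUSH", 3), ("ADD",), ("PUSH", 4), ("MUL",)]))  # 20
```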
Conditional jumps are taken or not taken depending on some condition. A much more readable rendition of machine language, called assembly language, uses mnemonic codes to refer to machine code instructions, rather than using the instructions' numeric values directly. For example, on the Zilog Z80 processor, the machine code 00000101, which causes the CPU to decrement the B processor register, would be represented in assembly language as DEC B. The MIPS architecture provides a specific example of a machine code whose instructions are always 32 bits long. The general type of instruction is given by the op field; J-type (jump) and I-type (immediate) instructions are fully specified by op, while R-type (register) instructions include an additional field, funct, to determine the exact operation.
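Because the MIPS layout is fixed at 32 bits, extracting the fields is a matter of shifts and masks. A sketch of decoding an R-type word (the field widths follow the standard MIPS documentation; the helper name is ours):

```python
def decode_r_type(word):
    """Split a 32-bit MIPS R-type instruction word into its fields."""
    return {
        "op":    (word >> 26) & 0x3F,  # 6-bit opcode (0 for R-type)
        "rs":    (word >> 21) & 0x1F,  # 5-bit first source register
        "rt":    (word >> 16) & 0x1F,  # 5-bit second source register
        "rd":    (word >> 11) & 0x1F,  # 5-bit destination register
        "shamt": (word >> 6)  & 0x1F,  # 5-bit shift amount
        "funct": word & 0x3F,          # 6-bit function code: the exact operation
    }

# add $t0, $t1, $t2 assembles to 0x012A4020; funct 0x20 selects "add".
print(decode_r_type(0x012A4020))
```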
Computer hardware includes the physical, tangible parts or components of a computer, such as the cabinet, central processing unit, keyboard, computer data storage, graphics card, sound card and motherboard. By contrast, software is the set of instructions that can be run by hardware. Hardware is so termed because it is rigid with respect to changes or modifications, whereas software is "soft" because it is easy to change. Intermediate between software and hardware is "firmware": software that is strongly coupled to the particular hardware of a computer system and is thus the most difficult to change, but also among the most stable with respect to consistency of interface. The progression from levels of "hardness" to "softness" in computer systems parallels a progression of layers of abstraction in computing. Hardware is directed by the software to execute any command or instruction. A combination of hardware and software forms a usable computing system, although other systems exist with only hardware components. The template for all modern computers is the Von Neumann architecture, detailed in a 1945 paper by Hungarian mathematician John von Neumann.
This describes a design architecture for an electronic digital computer with subdivisions of a processing unit consisting of an arithmetic logic unit and processor registers, a control unit containing an instruction register and program counter, a memory to store both data and instructions, external mass storage, and input and output mechanisms. The meaning of the term has since evolved to mean a stored-program computer in which an instruction fetch and a data operation cannot occur at the same time because they share a common bus. This is referred to as the Von Neumann bottleneck, and it limits the performance of the system. The personal computer, known as the PC, is one of the most common types of computer due to its versatility and low price. Laptops are very similar, although they may use lower-power or reduced-size components, and thus offer lower performance. The computer case encloses most of the components of the system. It provides mechanical support and protection for internal elements such as the motherboard, disk drives and power supplies, and it controls and directs the flow of cooling air over internal components.
The case is also part of the system used to control electromagnetic interference radiated by the computer, and it protects internal parts from electrostatic discharge. Large tower cases provide extra internal space for multiple disk drives or other peripherals and stand on the floor, while desktop cases provide less expansion room. All-in-one style designs include a video display built into the same case. Portable and laptop computers require cases that provide impact protection for the unit. A current development in laptop computers is a detachable keyboard, which allows the system to be configured as a touch-screen tablet. Hobbyists may decorate the cases with colored lights, paint, or other features, in an activity called case modding. A power supply unit (PSU) converts alternating current electric power to low-voltage DC power for the internal components of the computer. Laptops are capable of running from a built-in battery for a period of hours. The motherboard is the main component of a computer. It is a board with integrated circuitry that connects the other parts of the computer, including the CPU, the RAM and the disk drives, as well as any peripherals connected via the ports or the expansion slots.
Components directly attached to or part of the motherboard include: The CPU, which performs most of the calculations that enable a computer to function and is sometimes referred to as the brain of the computer. It is cooled by a heatsink and fan, or by a water-cooling system. Most newer CPUs include an on-die graphics processing unit (GPU). The clock speed of a CPU governs how fast it executes instructions and is measured in GHz. Many modern computers have the option to overclock the CPU, which enhances performance at the expense of greater thermal output and thus a need for improved cooling. The chipset, which includes the north bridge, mediates communication between the CPU and the other components of the system, including main memory. Random-access memory (RAM), which stores the code and data that are being actively accessed by the CPU. For example, when a web browser is opened on the computer it takes up memory. RAM usually comes on DIMMs in sizes of 2 GB, 4 GB, and 8 GB, but can be much larger. Read-only memory (ROM), which stores the BIOS that runs when the computer is powered on or otherwise begins execution, a process known as bootstrapping, "booting" or "booting up".
The BIOS includes boot firmware and power management firmware. Newer motherboards use Unified Extensible Firmware Interface (UEFI) instead of BIOS. Buses connect the CPU to various internal components and to expansion cards for graphics and sound. The CMOS battery, which powers the memory for date and time in the BIOS chip; this battery is generally a watch battery. The video card, which processes computer graphics. More powerful graphics cards are better suited to handle strenuous tasks, such as playing intensive video games. An expansion card in computing is a printed circuit board that can be inserted into an expansion slot of a computer motherboard or backplane to add functionality to the system.
In computing, an emulator is hardware or software that enables one computer system (the host) to behave like another computer system (the guest). An emulator enables the host system to run software or use peripheral devices designed for the guest system. Emulation refers to the ability of a computer program in an electronic device to emulate another program or device. Many printers, for example, are designed to emulate Hewlett-Packard LaserJet printers because so much software is written for HP printers. If a non-HP printer emulates an HP printer, any software written for a real HP printer will also run in the non-HP printer emulation and produce equivalent printing. Since at least the 1990s, many video game enthusiasts have used emulators to play classic arcade games from the 1980s using the games' original 1980s machine code and data, which is interpreted by a current-era system. A hardware emulator is an emulator which takes the form of a hardware device. Examples include the DOS-compatible card installed in some 1990s-era Macintosh computers, like the Centris 610 or Performa 630, that allowed them to run personal computer software programs, and FPGA-based hardware emulators.
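At its core, a software emulator of this kind is an interpreter loop over the guest's machine code: the host fetches each guest instruction, decodes it, and updates guest state held in host memory. A deliberately tiny sketch for a hypothetical guest machine (the three opcodes are invented for illustration and belong to no real system):

```python
# Minimal software emulator for a hypothetical one-accumulator guest machine.
# Invented guest opcodes: 1 = LOAD imm, 2 = ADD imm, 3 = HALT.
def emulate(program):
    acc, pc = 0, 0                    # guest state lives in host variables
    while True:
        op, arg = program[pc]         # fetch the guest instruction
        pc += 1
        if op == 1:
            acc = arg                 # LOAD: overwrite the accumulator
        elif op == 2:
            acc += arg                # ADD: arithmetic on guest state
        elif op == 3:
            return acc                # HALT: emulation finished

print(emulate([(1, 40), (2, 2), (3, 0)]))  # 42
```

Real emulators follow the same shape, with far larger instruction sets plus emulated memory, timing and peripherals.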
In a theoretical sense, the Church-Turing thesis implies that any operating environment can be emulated within any other environment. In practice, however, it can be quite difficult, particularly when the exact behavior of the system to be emulated is not documented and has to be deduced through reverse engineering; moreover, the thesis says nothing about timing constraints, so an emulation may run much more slowly than the original hardware did. Emulation is a strategy in digital preservation to combat obsolescence. It focuses on recreating an original computer environment, which can be time-consuming and difficult to achieve, but is valuable because of its ability to maintain a closer connection to the authenticity of the digital object. Emulation addresses the original hardware and software environment of the digital object and recreates it on a current machine. The emulator allows the user to have access to any kind of application or operating system on a current platform, while the software runs as it did in its original environment. Jeffrey Rothenberg, an early proponent of emulation as a digital preservation strategy, states, "the ideal approach would provide a single extensible, long-term solution that can be designed once and for all and applied uniformly, in synchrony to all types of documents and media".
He further states that this should not only apply to out-of-date systems, but also be upwardly mobile to future unknown systems. Practically speaking, when a certain application is released in a new version, rather than addressing compatibility issues and migration for every digital object created in the previous version of that application, one could create an emulator for the application, allowing access to all of said digital objects. Emulators can also offer better graphics quality than the original hardware, and additional features the original hardware didn't have. Emulators maintain the original look, feel and behavior of the digital object, which is just as important as the digital data itself. Despite the original cost of developing an emulator, it may prove to be the more cost-efficient solution over time. It reduces labor hours, because rather than continuing an ongoing task of continual data migration for every digital object, once the library of past and present operating systems and application software is established in an emulator, these same technologies are used for every document using those platforms.
Many emulators have been developed and released under the GNU General Public License through the open source environment, allowing for wide-scale collaboration. Emulators also allow software exclusive to one system to be used on another. For example, a PlayStation 2 exclusive video game could be played on a PC using an emulator; this is especially useful when the original system is difficult to obtain or incompatible with modern equipment. There are drawbacks as well. Intellectual property: many technology vendors implemented non-standard features during program development in order to establish their niche in the market, while applying ongoing upgrades to remain competitive. While this may have advanced the technology industry and increased vendors' market share, it has left users lost in a preservation nightmare with little supporting documentation, due to the proprietary nature of the hardware and software. Copyright laws are not yet in effect to address saving the documentation and specifications of proprietary software and hardware in an emulator module.
Emulators are often used as a copyright infringement tool, since they allow users to play video games without having to buy the console, and rarely make any attempt to prevent the use of illegal copies. This leads to a number of legal uncertainties regarding emulation, and leads to software being programmed to refuse to work if it can tell the host is an emulator. These protections make it more difficult to design emulators, since they must be accurate enough to avoid triggering the protections, whose effects may not be obvious. Emulators also typically require better hardware than the original system has.
A computer program is a collection of instructions that performs a specific task when executed by a computer. A computer requires programs to function. A computer program is usually written by a computer programmer in a programming language. From the program in its human-readable form of source code, a compiler can derive machine code—a form consisting of instructions that the computer can directly execute. Alternatively, a computer program may be executed with the aid of an interpreter. A collection of computer programs and related data is referred to as software. Computer programs may be categorized along functional lines, such as application software and system software. The underlying method used for some calculation or manipulation is known as an algorithm. The earliest programmable machines preceded the invention of the digital computer. In 1801, Joseph-Marie Jacquard devised a loom that would weave a pattern by following a series of perforated cards. Patterns could be repeated by arranging the cards.
In 1837, Charles Babbage was inspired by Jacquard's loom to attempt to build the Analytical Engine. The names of the components of the calculating device were borrowed from the textile industry, in which yarn was brought from the store to be milled. The device would have had a "store"—memory to hold 1,000 numbers of 40 decimal digits each. Numbers from the "store" would have been transferred to the "mill" for processing, with a "thread" being the execution of programmed instructions by the device. It was programmed using two sets of perforated cards—one to direct the operation and the other for the input variables. However, after more than 17,000 pounds of the British government's money had been spent, the thousands of cogged wheels and gears never worked together. During a nine-month period in 1842–43, Ada Lovelace translated the memoir of Italian mathematician Luigi Menabrea, which covered the Analytical Engine. The translation contained Note G, which detailed a method for calculating Bernoulli numbers using the Analytical Engine.
This note is recognized by some historians as the world's first written computer program. In 1936, Alan Turing introduced the Universal Turing machine—a theoretical device that can model every computation that can be performed on a Turing complete computing machine. It is a finite-state machine with an infinitely long read/write tape. The machine can move the tape back and forth, changing its contents as it performs an algorithm. The machine starts in the initial state, goes through a sequence of steps, and halts when it encounters the halt state. This machine is considered by some to be the origin of the stored-program computer—used by John von Neumann for the "Electronic Computing Instrument" that now bears the von Neumann architecture name. The Z3 computer, invented by Konrad Zuse in Germany, was a programmable computer. A digital computer uses electricity as the calculating component. The Z3 contained 2,400 relays to create the circuits, which provided a floating-point, nine-instruction computer. Programming the Z3 was through a specially designed keyboard and punched tape.
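The cycle described above—read the symbol under the head, consult a transition table, write, move, change state, halt on the halt state—can be sketched as a small simulator. Both the table format and the bit-flipping example machine below are invented for illustration:

```python
# A minimal Turing machine simulator. The transition table maps
# (state, symbol) -> (symbol_to_write, head_move, next_state).
def run(tape, rules, state="flip", pos=0, blank=" "):
    cells = dict(enumerate(tape))            # sparse tape: index -> symbol
    while state != "halt":
        symbol = cells.get(pos, blank)       # read the cell under the head
        write, move, state = rules[(state, symbol)]
        cells[pos] = write                   # write, then move the head
        pos += move                          # +1 = right, -1 = left
    return "".join(cells[i] for i in sorted(cells)).strip()

# A two-state example machine: flip each bit, halt on the first blank.
rules = {
    ("flip", "0"): ("1", 1, "flip"),
    ("flip", "1"): ("0", 1, "flip"),
    ("flip", " "): (" ", 0, "halt"),
}
print(run("0110", rules))  # "1001"
```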
The Electronic Numerical Integrator And Computer (ENIAC) was a Turing complete, general-purpose computer that used 17,468 vacuum tubes to create its circuits. At its core, it was a series of Pascalines wired together. Its 40 units weighed 30 tons, occupied 1,800 square feet, and consumed $650 per hour in electricity when idle. It had 20 base-10 accumulators. Programming the ENIAC took up to two months. Three function tables were on wheels and needed to be rolled to fixed function panels. Function tables were connected to function panels using heavy black cables; each function table had 728 rotating knobs. Programming the ENIAC also involved setting some of the 3,000 switches. Debugging a program took a week. The programmers of the ENIAC were women who were known collectively as the "ENIAC girls". The ENIAC featured parallel operations: different sets of accumulators could work on different algorithms. It used punched card machines for input and output, and it was controlled with a clock signal. It ran for eight years, calculating hydrogen bomb parameters, predicting weather patterns, and producing firing tables to aim artillery guns.
The Manchester Baby was a stored-program computer. Programming transitioned away from moving cables and setting dials; instead, a computer program was stored in memory as numbers. Only three bits of memory were available to store each instruction, so it was limited to eight instructions. 32 switches were available for programming. Computers manufactured until the 1970s had front-panel switches for programming; the computer program was written on paper for reference. An instruction was represented by a configuration of on/off settings. After setting the configuration, an execute button was pressed, and this process was repeated. Computer programs were also manually input via paper tape or punched cards. After the medium was loaded, the starting address was set via switches and the execute button was pressed. In 1961, the Burroughs B5000 was built specifically to be programmed in the ALGOL 60 language; the hardware featured circuits to ease the compile phase. In 1964, the IBM System/360 was a line of six computers, each having the same instruction set architecture. The Model 30 was the least expensive. Customers could retain the same application software. Each System/360 model featured multiprogramming.
With operating system support, multiple programs could be in memory at once. When one was waiting for input/output, another could compute. Each model could also emulate other computers. Customers could upgrade to the System/360 and retain their application software.
Central processing unit
A central processing unit (CPU), also called a central processor or main processor, is the electronic circuitry within a computer that carries out the instructions of a computer program by performing the basic arithmetic, logic and input/output operations specified by the instructions. The computer industry has used the term "central processing unit" at least since the early 1960s. Traditionally, the term "CPU" refers to a processor, more specifically to its processing unit and control unit (CU), distinguishing these core elements of a computer from external components such as main memory and I/O circuitry. The form and implementation of CPUs have changed over the course of their history, but their fundamental operation remains almost unchanged. Principal components of a CPU include the arithmetic logic unit (ALU) that performs arithmetic and logic operations, processor registers that supply operands to the ALU and store the results of ALU operations, and a control unit that orchestrates the fetching and execution of instructions by directing the coordinated operations of the ALU, registers and other components.
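The division of labor among control unit, registers and ALU can be sketched as the classic fetch-decode-execute loop. In the toy machine below (its mnemonics are invented for illustration), the loop plays the role of the control unit, a small list stands in for the registers, and the arithmetic in the ADD branch stands in for the ALU:

```python
# Sketch of the fetch-decode-execute cycle for an invented 3-instruction machine.
def cpu(memory):
    registers = [0] * 4                        # processor registers
    pc = 0                                     # program counter
    while True:
        op, a, b = memory[pc]                  # fetch the next instruction
        pc += 1
        if op == "LI":                         # load immediate into register a
            registers[a] = b
        elif op == "ADD":                      # "ALU" operation on two registers
            registers[a] += registers[b]
        elif op == "HLT":                      # halt; return register a
            return registers[a]

program = [("LI", 0, 7), ("LI", 1, 5), ("ADD", 0, 1), ("HLT", 0, 0)]
print(cpu(program))  # 12
```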
Most modern CPUs are microprocessors, meaning they are contained on a single integrated circuit (IC) chip. An IC that contains a CPU may also contain memory, peripheral interfaces, and other components of a computer. Some computers employ a multi-core processor, a single chip containing two or more CPUs called "cores". Array processors or vector processors have multiple processors that operate in parallel, with no unit considered central. There also exists the concept of virtual CPUs, which are an abstraction of dynamically aggregated computational resources. Early computers such as the ENIAC had to be physically rewired to perform different tasks, which caused these machines to be called "fixed-program computers". Since the term "CPU" is defined as a device for software execution, the earliest devices that could rightly be called CPUs came with the advent of the stored-program computer. The idea of a stored-program computer had already been present in the design of J. Presper Eckert and John William Mauchly's ENIAC, but was initially omitted so that the machine could be finished sooner.
On June 30, 1945, before ENIAC was made, mathematician John von Neumann distributed the paper entitled First Draft of a Report on the EDVAC. It was the outline of a stored-program computer that would eventually be completed in August 1949. EDVAC was designed to perform a certain number of instructions of various types. Significantly, the programs written for EDVAC were to be stored in high-speed computer memory rather than specified by the physical wiring of the computer. This overcame a severe limitation of ENIAC, namely the considerable time and effort required to reconfigure the computer to perform a new task. With von Neumann's design, the program that EDVAC ran could be changed simply by changing the contents of the memory. EDVAC, however, was not the first stored-program computer. Early CPUs were custom designs used as part of a larger and sometimes distinctive computer. However, this method of designing custom CPUs for a particular application has largely given way to the development of multi-purpose processors produced in large quantities. This standardization began in the era of discrete transistor mainframes and minicomputers and has accelerated with the popularization of the integrated circuit.
The IC has allowed increasingly complex CPUs to be designed and manufactured to tolerances on the order of nanometers. Both the miniaturization and standardization of CPUs have increased the presence of digital devices in modern life far beyond the limited application of dedicated computing machines. Modern microprocessors appear in electronic devices ranging from automobiles to cellphones, and sometimes even in toys. While von Neumann is most often credited with the design of the stored-program computer because of his design of EDVAC, and the design became known as the von Neumann architecture, others before him, such as Konrad Zuse, had suggested and implemented similar ideas. The so-called Harvard architecture of the Harvard Mark I, which was completed before EDVAC, also used a stored-program design, using punched paper tape rather than electronic memory. The key difference between the von Neumann and Harvard architectures is that the latter separates the storage and treatment of CPU instructions and data, while the former uses the same memory space for both.
Most modern CPUs are primarily von Neumann in design, but CPUs with the Harvard architecture are seen as well, especially in embedded applications. In the earliest machines, relays and vacuum tubes were used as switching elements, and the overall speed of a system is dependent on the speed of its switches. Tube computers like EDVAC tended to average eight hours between failures, whereas relay computers like the slower but earlier Harvard Mark I failed very rarely. In the end, tube-based CPUs became dominant because the significant speed advantages afforded generally outweighed the reliability problems. Most of these early synchronous CPUs ran at low clock rates compared to modern microelectronic designs. Clock signal frequencies ranging from 100 kHz to 4 MHz were very common at this time, limited largely by the speed of the switching devices with which they were built.