Power supply unit (computer)
A power supply unit converts mains AC to low-voltage regulated DC power for the internal components of a computer. Modern personal computers universally use switched-mode power supplies. Some power supplies have a manual switch for selecting input voltage, while others automatically adapt to the mains voltage. Most modern desktop computer power supplies conform to the ATX specification. While an ATX power supply is connected to the mains, it always provides a 5-volt standby voltage so that the standby functions of the computer remain powered. ATX power supplies are turned on and off by a signal from the motherboard, and they provide a signal to the motherboard to indicate when the DC voltages are in spec, so that the computer is able to safely power up and boot. The most recent ATX PSU standard is version 2.31 of mid-2008. The desktop computer power supply changes alternating current from a wall socket to low-voltage direct current to operate the processor and peripheral devices. Several direct-current voltages are required, and they must be regulated with some accuracy to provide stable operation of the computer. A power supply rail or voltage rail refers to a single voltage provided by a power supply unit.
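The requirement that each rail stay regulated within a tolerance band can be sketched in code. This is a minimal illustration, not part of any real monitoring tool; the ±5% figure used here is the tolerance commonly cited for the positive ATX rails, and the rail names and readings are made up for the example.

```python
# Sketch: checking measured PSU rail voltages against their nominal values.
# The +/-5% tolerance is the commonly cited figure for the positive ATX
# rails (an assumption here; consult the ATX specification for exact limits).

NOMINAL_RAILS = {"+3.3V": 3.3, "+5V": 5.0, "+12V": 12.0, "+5VSB": 5.0}
TOLERANCE = 0.05  # +/-5% of nominal

def rail_in_spec(rail: str, measured: float) -> bool:
    """Return True if a measured voltage is within tolerance of nominal."""
    nominal = NOMINAL_RAILS[rail]
    return abs(measured - nominal) <= nominal * TOLERANCE

# A power-good style check: all rails must be in spec before the system
# is allowed to boot (hypothetical readings).
readings = {"+3.3V": 3.28, "+5V": 5.1, "+12V": 12.3, "+5VSB": 4.99}
power_good = all(rail_in_spec(r, v) for r, v in readings.items())
print(power_good)  # True: every reading is within tolerance
```

The real power-good signal is generated in hardware, of course; the point is only that "in spec" means each rail lies inside a narrow band around its nominal voltage.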
Although the term is used in electronic engineering generally, many people, especially computer enthusiasts, encounter it in the context of personal computer power supplies. First-generation microcomputer and home computer power supply units used a heavy step-down transformer; modern computers use switched-mode power supplies with a ferrite-cored high-frequency transformer. The switched-mode supply is much lighter and less costly, and is more efficient. Computer power supplies may have short-circuit protection, overpower protection, overvoltage protection, undervoltage protection, overcurrent protection, and over-temperature protection. Recent power supplies have a standby voltage available to allow most of the system to be powered off. When the computer is powered down but the supply is still on, it can be started remotely via Wake-on-LAN. This standby voltage is generated by a smaller power supply inside the unit. In older PSU designs, it also supplied the voltage regulator, located on the secondary side of the transformer. The regulator controls the switching transistors, which are isolated by optocouplers or pulse transformers; for this reason, a detected short circuit may force the user to wait up to an hour before the unit can successfully power up again.
This waiting time lasts until the filter capacitors on the grid side have been discharged through their bleeder resistors. Power supplies designed for worldwide use were equipped with an input voltage selector switch that allowed the user to configure the unit for the local power grid. Set to the lower range, around 115 V, the switch converts the mains rectifier into a voltage doubler in the Delon circuit configuration. Connecting a unit configured for the lower range to a higher-voltage grid usually resulted in immediate, permanent damage.
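The effect of the doubler can be shown with a short calculation. In a Delon doubler, each of two series capacitors charges to the AC peak on alternate half-cycles, so the no-load output is roughly twice the peak voltage; this idealized sketch ignores ripple and load.

```python
import math

def delon_doubler_output(v_rms: float) -> float:
    """Ideal no-load DC output of a Delon full-wave voltage doubler.

    Each capacitor charges to the AC peak (sqrt(2) * Vrms) on alternate
    half-cycles; the series stack gives roughly twice the peak voltage.
    """
    v_peak = math.sqrt(2) * v_rms
    return 2 * v_peak

# With the selector in the 115 V position, the doubler recreates roughly
# the same DC bus voltage a 230 V grid gives a plain bridge rectifier:
print(round(delon_doubler_output(115), 1))  # ~325.3 V
print(round(math.sqrt(2) * 230, 1))         # ~325.3 V, the 230 V grid peak
```

This is also why the wrong switch position is destructive: doubling an already-high 230 V input would push the internal DC bus to roughly twice its design voltage.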
A computer monitor or computer display is an electronic visual display for computers. A monitor usually comprises the display device, circuitry, casing, and power supply. The display device in modern monitors is typically a thin-film-transistor liquid crystal display or a flat-panel LED display, while older monitors used cathode ray tubes. It can be connected to the computer via VGA, DVI, HDMI, DisplayPort, Thunderbolt, LVDS or other proprietary connectors. Originally, computer monitors were used for data processing while television receivers were used for entertainment. From the 1980s onwards, computers have been used for both data processing and entertainment, while televisions have implemented some computer functionality. The common aspect ratio of televisions and computer monitors has changed from 4:3 to 16:10 to 16:9. Early electronic computers were fitted with a panel of light bulbs where the state of each particular bulb would indicate the state of a particular register bit inside the computer. This allowed engineers operating the computer to monitor the internal state of the machine.
Early monitors were capable of displaying only a limited amount of information. Instead, a printer was the primary output device, while the monitor was limited to keeping track of the program's operation. Multiple technologies have been used for computer monitors. Until the 21st century most used cathode ray tubes, but they have since largely been superseded by LCD monitors. The first computer monitors used cathode ray tubes. High-resolution CRT displays were developed for specialized military and scientific applications, but they were far too costly for general use. Early home computers could be connected to the terminals of an ordinary color TV set or used with a purpose-made CRT color monitor for optimum resolution. In 1984 IBM introduced the Enhanced Graphics Adapter, which was capable of producing 16 colors and had a resolution of 640 × 350. By the end of the 1980s, color CRT monitors that could clearly display 1024 × 768 pixels were widely available and increasingly affordable. During the following decade maximum display resolutions gradually increased and prices continued to fall. CRT technology remained dominant in the PC monitor market into the new millennium, partly because it was cheaper to produce and offered viewing angles close to 180 degrees.
CRTs still offer some image quality advantages over LCDs, but improvements to the latter have made them much less obvious. Multiple technologies have been used to implement liquid crystal displays. Commonly, a laptop would be offered with an assortment of display options at increasing price points: monochrome, passive-matrix color, or active-matrix color.
In computing, memory refers to the computer hardware devices used to store information for immediate use in a computer; it is synonymous with the term primary storage. Computer memory operates at high speed, for example random-access memory, as a distinction from storage that provides slower-to-access program and data storage. If needed, contents of the memory can be transferred to secondary storage. An archaic synonym for memory is store. There are two main kinds of semiconductor memory: volatile and non-volatile. Examples of non-volatile memory are flash memory and ROM, PROM, EPROM and EEPROM memory. Most semiconductor memory is organized into memory cells or bistable flip-flops, each storing one bit. Flash memory organization includes both one bit per cell and multiple bits per cell. The memory cells are grouped into words of fixed word length. Each word can be accessed by a binary address of N bits, making it possible to store 2^N words in the memory. This implies that processor registers normally are not considered as memory, since they only store one word. Typical secondary storage devices are hard disk drives and solid-state drives.
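The relationship between address width and capacity described above can be sketched in a few lines. The function names here are illustrative, not from any real API.

```python
def addressable_words(address_bits: int) -> int:
    """Number of distinct words an N-bit binary address can select: 2**N."""
    return 2 ** address_bits

def capacity_bytes(address_bits: int, word_length_bits: int) -> int:
    """Total capacity in bytes of a memory with 2**N words of W bits each."""
    return addressable_words(address_bits) * word_length_bits // 8

# A 16-bit address selects 65,536 words; with 8-bit words that is 64 KiB.
print(addressable_words(16))   # 65536
print(capacity_bytes(16, 8))   # 65536 bytes = 64 KiB
```

Each extra address bit doubles the number of addressable words, which is why address bus width historically set a hard ceiling on memory size.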
In the early 1940s, memory technology often permitted a capacity of only a few bytes. The next significant advance in computer memory came with acoustic delay line memory, developed by J. Presper Eckert in the early 1940s. Delay line memory was limited to a capacity of up to a few hundred thousand bits to remain efficient. Two alternatives to the delay line, the Williams tube and Selectron tube, originated in 1946, both using electron beams in glass tubes as means of storage. Using cathode ray tubes, Fred Williams invented the Williams tube, which proved more capacious than the Selectron tube and less expensive, but also frustratingly sensitive to environmental disturbances. Efforts began in the late 1940s to find non-volatile memory. Jay Forrester, Jan A. Rajchman and An Wang developed magnetic core memory, which allowed for recall of memory after power loss. Magnetic core memory would remain the dominant form of memory until the development of transistor-based memory in the late 1960s. Developments in technology and economies of scale have since made possible so-called Very Large Memory computers.
The term memory, when used with reference to computers, generally refers to Random Access Memory, or RAM. Volatile memory is computer memory that requires power to maintain the stored information. Most modern semiconductor volatile memory is either static RAM (SRAM) or dynamic RAM (DRAM). SRAM retains its contents as long as the power is connected and is easier to interface with, but uses six transistors per bit. SRAM is therefore not worthwhile for desktop system memory, where DRAM dominates, but SRAM is commonplace in small embedded systems, which might only need tens of kilobytes or less. Forthcoming volatile memory technologies that aim at replacing or competing with SRAM and DRAM include Z-RAM and A-RAM. Non-volatile memory is computer memory that can retain the stored information even when not powered.
A computer mouse is a pointing device that detects two-dimensional motion relative to a surface. This motion is typically translated into the motion of a pointer on a display. Physically, a mouse consists of an object held in one's hand, with one or more buttons. Mice often feature other elements, such as touch surfaces and wheels. The earliest known publication of the term mouse as referring to a computer pointing device is in Bill English's July 1965 publication. The plural for the rodent is always mice in modern usage, while the plural for the computing device is either mouses or mice according to most dictionaries; the first recorded usage is mice, and the online Oxford Dictionaries cites a 1984 use of mouses. The trackball, a related pointing device, was invented in 1944 by Ralph Benjamin as part of a World War II-era fire-control radar plotting system called the Comprehensive Display System. Benjamin was working for the British Royal Navy Scientific Service. Benjamin's project used analog computers to calculate the future position of target aircraft based on several initial input points provided by a user with a joystick.
Benjamin felt that a more elegant input device was needed and invented what he called a roller ball for this purpose. The device was patented in 1947, but only a prototype using a ball rolling on two rubber-coated wheels was ever built, and the device was kept as a military secret. Another early trackball was built by British electrical engineer Kenyon Taylor in collaboration with Tom Cranston. Taylor was part of the original Ferranti Canada, working on the Royal Canadian Navy's DATAR system in 1952. DATAR was similar in concept to Benjamin's display. Its trackball used four disks to pick up motion, two each for the X and Y directions. When the ball was rolled, the pickup discs spun and contacts on their outer rim made periodic contact with wires; by counting the pulses, the physical movement of the ball could be determined. A digital computer calculated the tracks and sent the data to other ships in a task force using pulse-code modulation radio signals. This trackball used a standard Canadian five-pin bowling ball. It was not patented, since it was a secret military project.
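The pulse-counting idea above can be sketched in code. The source only says that pulses from two pickup discs per axis were counted; this sketch additionally assumes a quadrature-style arrangement (the two discs offset in phase), which is how later devices derived direction as well as distance, so treat the decoding details as illustrative.

```python
# Hypothetical sketch of per-axis pulse counting in a trackball. Two pickup
# contacts A and B produce on/off states as the ball rolls; with the discs
# offset in phase (an assumption, not stated in the source), the order of
# state changes reveals direction as well as distance.

# Quadrature state cycle for forward motion: (0,0)->(1,0)->(1,1)->(0,1)
FORWARD = [(0, 0), (1, 0), (1, 1), (0, 1)]

def decode_axis(states):
    """Count net movement ticks from a sequence of (A, B) contact states."""
    position = 0
    for prev, curr in zip(states, states[1:]):
        if curr == prev:
            continue
        i, j = FORWARD.index(prev), FORWARD.index(curr)
        if (i + 1) % 4 == j:      # one step forward in the cycle
            position += 1
        elif (i - 1) % 4 == j:    # one step backward
            position -= 1
    return position

forward_roll = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
print(decode_axis(forward_roll))        # 4: four ticks forward
print(decode_axis(forward_roll[::-1]))  # -4: same states reversed
```

DATAR's version fed such counts into a digital computer; modern optical mice do essentially the same decoding on signals from an image sensor instead of mechanical contacts.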
Douglas Engelbart of the Stanford Research Institute has been credited in published books by Thierry Bardini, Paul Ceruzzi, Howard Rheingold and others as the inventor of the computer mouse, and Engelbart was recognized as such in various obituary titles after his death in July 2013. Some years earlier, while attending a conference on computer graphics in Reno, Engelbart had begun to ponder how to adapt the underlying principles of the planimeter to X-Y coordinate input.
Hardness is a measure of how resistant solid matter is to various kinds of permanent shape change when a compressive force is applied. Some materials are harder than others. Hardness is dependent on ductility, elastic stiffness, strain, toughness and viscosity. Common examples of hard matter are ceramics, certain metals, and superhard materials. There are three main types of hardness measurements: scratch, indentation, and rebound. Within each of these classes of measurement there are individual measurement scales; for practical reasons, conversion tables are used to convert between one scale and another. Scratch hardness is the measure of how resistant a sample is to fracture or permanent plastic deformation due to friction from a sharp object. The principle is that an object made of a harder material will scratch an object made of a softer material. When testing coatings, scratch hardness refers to the force necessary to cut through the film to the substrate. The most common test is the Mohs scale, which is used in mineralogy. One tool to make this measurement is the sclerometer.
Another tool used to make these tests is the pocket hardness tester. This tool consists of a scale arm with graduated markings attached to a four-wheeled carriage. A scratch tool with a sharp rim is mounted at a predetermined angle to the testing surface. In order to use it, a weight of known mass is added to the scale arm at one of the graduated markings; the use of the weight and markings allows a known pressure to be applied without the need for complicated machinery. Indentation hardness measures the resistance of a sample to material deformation due to a constant compression load from an object; such tests are primarily used in engineering. The tests work on the premise of measuring the critical dimensions of an indentation left by a specifically dimensioned and loaded indenter. Common indentation hardness scales are Rockwell, Vickers, Shore, and Brinell. Rebound hardness, also known as dynamic hardness, measures the height of the bounce of a diamond-tipped hammer dropped from a fixed height onto a material. This type of hardness is related to elasticity. The device used to take this measurement is known as a scleroscope.
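The ordering principle behind scratch hardness, that a harder material scratches a softer one, can be shown with the ten standard Mohs reference minerals. This is a toy lookup, not a laboratory procedure.

```python
# Sketch of the scratch-hardness principle: a mineral scratches anything
# with a lower Mohs number. The values are the ten standard Mohs
# reference minerals.

MOHS = {
    "talc": 1, "gypsum": 2, "calcite": 3, "fluorite": 4, "apatite": 5,
    "orthoclase": 6, "quartz": 7, "topaz": 8, "corundum": 9, "diamond": 10,
}

def scratches(a: str, b: str) -> bool:
    """Return True if mineral a can scratch mineral b."""
    return MOHS[a] > MOHS[b]

print(scratches("quartz", "apatite"))  # True: 7 > 5
print(scratches("gypsum", "calcite"))  # False: 2 < 3
```

Note that the Mohs scale is purely ordinal: diamond at 10 is not "twice as hard" as apatite at 5, which is exactly why conversion tables between scales are approximate.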
Two scales that measure rebound hardness are the Leeb rebound hardness test and the Bennett hardness scale. There are five hardening processes: Hall-Petch strengthening, work hardening, solid solution strengthening, precipitation hardening, and martensitic transformation. Hardness in the elastic range (a small temporary change in shape for a given force) is known as stiffness in the case of a given object, or as a high elastic modulus in the case of a material. Solid materials can also exhibit plasticity: the ability to permanently change shape in response to the force, but remain in one piece.
A video card is an expansion card which generates a feed of output images to a display. Frequently, these are advertised as discrete or dedicated graphics cards. Standards such as MDA, CGA, HGC, Tandy, PGC, EGA, VGA, MCGA, 8514 or XGA were introduced from 1982 to 1990 and supported by a variety of hardware manufacturers. The majority of video cards are built with either AMD-sourced or Nvidia-sourced graphics chips. Until 2000, 3dfx Interactive was also an important, and often groundbreaking, manufacturer in the field. Most video cards offer various functions such as accelerated rendering of 3D scenes and 2D graphics, MPEG-2/MPEG-4 decoding, TV output, or the ability to connect multiple monitors. Some video cards have sound capabilities to output sound along with the video for connected TVs or monitors with integrated speakers. Early graphics cards were comparatively simple, since computers of the time lacked the capability to run the graphics-heavy games or high-resolution video that modern computers can. Within the industry, video cards are sometimes called graphics add-in boards, abbreviated as AIBs. As an alternative to the use of a video card, video hardware can be integrated into the motherboard or the CPU.
Both approaches can be called integrated graphics. Motherboard-based implementations are sometimes called on-board video, while CPU-based implementations are known as accelerated processing units (APUs). The ability to disable the integrated graphics sometimes allows the continued use of a motherboard on which the on-board video has failed. Sometimes both the integrated graphics and a dedicated graphics card can be used simultaneously to feed separate displays. The main advantages of integrated graphics include cost and simplicity; the performance disadvantage of integrated graphics arises because the graphics processor shares system resources with the CPU. A dedicated graphics card has its own random access memory and its own cooling system. Upgrading to a dedicated graphics card offloads work from the CPU and system RAM, so not only will graphics processing be faster, but the computer's overall performance may improve. Both of the dominant CPU makers, AMD and Intel, are moving to APUs; as of late 2013, the best APUs provide graphics processing approaching mid-range mobile video cards and are adequate for casual gaming.
Users seeking the highest video performance for gaming or other graphics-intensive uses should still choose computers with dedicated graphics cards. Beyond the enthusiast segment is the market for video cards for workstations used in the special effects industry, where Nvidia is a major player. As the processing power of video cards has increased, so has their demand for electrical power. Current high-performance video cards tend to consume a great deal of power; for example, the thermal design power for the GeForce GTX TITAN is 250 watts.
Optical disc drive
Some drives can only read from certain discs, but recent drives can both read and record; these are called burners or writers. Compact discs, DVDs, and Blu-ray discs are common types of media which can be read. Optical disc drives that are no longer in production include the CD-ROM drive, CD writer drive and combo drive. As of 2015, the DVD writer drive, supporting all existing recordable and rewritable DVD formats, is the most common drive for desktop PCs and laptops; there are also the DVD-ROM drive, BD-ROM drive and Blu-ray Disc combo drive. Optical drives are very commonly used in computers to read software and consumer media distributed on disc, and to record discs for archival and data exchange purposes. Floppy disk drives, with a capacity of 1.44 MB, have been made obsolete; USB flash drives, high-capacity and inexpensive, are suitable where read/write capability is required. Large backups are often made on external hard drives, as their price has dropped to a level making this viable. The first laser disc, demonstrated in 1972, was the Laservision 12-inch video disc; the video signal was stored in an analog format, as on a video cassette.
The first digitally recorded optical disc was a 5-inch audio compact disc in a format created by Sony. The CD-ROM format was developed by Sony and Denon and introduced in 1984 as an extension of Compact Disc Digital Audio; the CD-ROM has a storage capacity of 650 MB. Also in 1984, Sony introduced a LaserDisc data storage format, and in 1987 Sony demonstrated an erasable and rewritable 5.25-inch optical drive. The first Blu-ray prototype was unveiled by Sony in October 2000. Technically, Blu-ray Disc required a thinner cover layer to suit the narrower beam of the shorter-wavelength blue laser. The first BD-ROM players were shipped in mid-June 2006, and the first Blu-ray Disc titles were released by Sony and MGM on June 20, 2006. The first mass-market Blu-ray Disc rewritable drive for the PC was the BWU-100A. Initially, CD-type lasers with a wavelength of 780 nm were used; for DVDs, the wavelength was reduced to 650 nm. Two main servomechanisms are used: the first maintains the proper distance between lens and disc, to ensure the laser beam is focused as a small laser spot on the disc.
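The connection between wavelength and capacity can be sketched numerically: the focused spot diameter scales roughly with wavelength divided by the lens's numerical aperture. The 780 nm and 650 nm figures are from the text; the 405 nm Blu-ray wavelength and the numerical aperture values below are the commonly cited figures for each format, used here only to illustrate the trend.

```python
# Why a shorter-wavelength laser raises capacity: the focused spot size
# scales roughly with wavelength / numerical aperture (a rule-of-thumb
# approximation, not a full diffraction calculation).

FORMATS = {          # (wavelength in nm, lens numerical aperture)
    "CD":      (780, 0.45),
    "DVD":     (650, 0.60),
    "Blu-ray": (405, 0.85),
}

def spot_size_nm(wavelength_nm: float, na: float) -> float:
    """Approximate diffraction-limited spot diameter: lambda / NA."""
    return wavelength_nm / na

for name, (wl, na) in FORMATS.items():
    print(f"{name}: ~{spot_size_nm(wl, na):.0f} nm spot")
# Smaller spots allow tighter tracks and smaller pits, hence more data
# in the same disc area.
```

The same relationship explains the thinner Blu-ray cover layer: a high-aperture lens focusing through a thick layer would be too sensitive to disc tilt.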
The second servo moves the head along the disc's radius, keeping the beam on the track. Optical disc media are read beginning at the inner radius and proceeding to the outer edge. Light reflected from the disc is detected by photodiodes that create corresponding electrical signals. An optical disc recorder encodes data onto a recordable CD-R, DVD-R, DVD+R, or BD-R disc by selectively heating parts of an organic dye layer with a laser. This changes the reflectivity of the dye, thereby creating marks that can be read like the pits of a pressed disc. For recordable discs, the process is permanent and the media can be written to only once.
A computer case, also known as a computer chassis, system unit, base unit, or simply case, is the enclosure that contains most of the components of a computer. Cases are usually constructed from steel or aluminium; plastic is sometimes used, and other materials such as glass and even Lego bricks have appeared in home-built cases. Cases can come in many different sizes. The size and shape of a case is usually determined by the form factor of the motherboard. Consequently, personal computer form factors typically specify only the internal dimensions, while form factors for rack-mounted and blade servers may include precise external dimensions as well, since these cases must themselves fit in specific enclosures. For example, a case designed for an ATX motherboard and power supply may take on several external forms, such as a vertical tower. Full-size tower cases are larger in volume than desktop cases, with more room for drive bays. Desktop cases, and mini-tower cases under about 46 cm high, are popular in business environments where space is at a premium. The most popular form factor for desktop computers is ATX, although microATX and smaller form factors have become very popular for a variety of uses.
In the high-end segment, the unofficial and loosely defined XL-ATX specification appeared around 2009; it extends the length of the mainboard to accommodate four graphics cards with dual-slot coolers. Some XL-ATX mainboards increase the width as well, to allow more space around the CPU socket, memory slots and, in some cases, additional power circuitry. While the market share of these exotic high-end mainboards is very low, almost all high-end cases can accommodate them. Companies like In Win Development, Shuttle Inc. and AOpen originally popularized small cases, for which FlexATX was the most common motherboard size. As of 2010, Mini-ITX has widely replaced FlexATX as the most common small-form-factor mainboard standard. The latest Mini-ITX mainboards from Asus, Zotac and other manufacturers support standard desktop CPUs, use standard memory DIMM sockets and feature a full-size PCI-E 16× slot with support for the fastest graphics cards. This allows customers to build a fully fledged high-end computer in a smaller case. Apple Inc. has produced the Mac Mini computer, which is similar in size to a standard CD-ROM drive. Tower cases are often categorized as mini-tower, midi-tower, mid-tower or full-tower.
The terms are subjective and inconsistently defined by different manufacturers. Full-tower cases are typically 56 cm or more in height and intended to stand on the floor. They have anywhere from six to ten externally accessible drive bays. The ratio of external to internal bays is shifting, however, as computing technology moves from floppy disks and CD-ROMs to large-capacity hard drives, USB flash drives, and network-based solutions. The full-tower case was originally developed to house file servers, which would typically be tasked with serving data from expensive CD-ROM databases that held more data than the hard drives commonly available at the time.
Computer data storage
Computer data storage, often called storage or memory, is a technology consisting of computer components and recording media used to retain digital data. It is a core function and fundamental component of computers. The central processing unit of a computer is what manipulates data by performing computations. In practice, almost all computers use a storage hierarchy, which puts fast but expensive and small storage options close to the CPU and slower but larger and cheaper options farther away. In the von Neumann architecture, the CPU consists of two parts: the control unit and the arithmetic logic unit. The former controls the flow of data between the CPU and memory, while the latter performs arithmetic and logical operations on data. Without a significant amount of memory, a computer would merely be able to perform fixed operations and immediately output the result. It would have to be reconfigured to change its behavior; this is acceptable for devices such as desk calculators, digital signal processors, and other specialized devices.
Von Neumann machines differ in having a memory in which they store their operating instructions and data; most modern computers are von Neumann machines. A modern digital computer represents data using the binary numeral system. Text, pictures and nearly any form of information can be converted into a string of bits, or binary digits. The most common unit of storage is the byte, equal to 8 bits. A piece of information can be handled by any computer or device whose storage space is large enough to accommodate the binary representation of the piece of information, or simply data. For example, the complete works of Shakespeare, about 1250 pages in print, can be stored in about five megabytes with one byte per character. Data is encoded by assigning a bit pattern to each character or digit. By adding bits to each encoded unit, redundancy allows the computer to both detect errors in coded data and correct them based on mathematical algorithms. A random bit flip is typically corrected upon detection; the cyclic redundancy check method is typically used in communications and storage for error detection, and a detected error is then retried. Data compression methods make it possible in many cases to represent a string of bits by a shorter bit string and to reconstruct the original string when needed.
This utilizes substantially less storage for many types of data, at the cost of more computation; analysis of the trade-off between the storage cost saving and the costs of related computations and possible delays in data availability is done before deciding whether to keep certain data compressed or not. For security reasons, certain types of data may be encrypted in storage to prevent the possibility of unauthorized information reconstruction from chunks of storage snapshots. Generally, the lower a storage is in the hierarchy, the lesser its bandwidth and the greater its access latency from the CPU. This traditional division of storage into primary, secondary and off-line storage is also guided by cost per bit. In contemporary usage, memory is usually semiconductor storage read-write random-access memory, typically DRAM or other forms of fast but temporary storage.
An expansion bus is a computer bus which moves information between the internal hardware of a computer system and peripheral devices. It is a collection of wires and protocols that allows for the expansion of a computer. Even vacuum-tube based computers had modular construction, but individual functions for peripheral devices filled a cabinet, not just a printed circuit board. Processor, memory and I/O cards became feasible with the development of integrated circuits. Early minicomputers, starting with the PDP-8, were made of multiple cards, all powered by and communicating through a passive backplane. The first commercial microcomputer to feature expansion slots was the Micral N. Proprietary bus implementations for systems such as the Apple II co-existed with multi-manufacturer standards. IBM introduced what would retroactively be called the Industry Standard Architecture bus with the IBM PC in 1981; the IBM XT, introduced in 1983, used the same bus. The 8-bit PC and XT bus was extended with the introduction of the IBM AT, which used a second connector for extending the address and data bus over the XT, but was backward compatible; 8-bit cards were still usable in the AT's 16-bit slots.
Industry Standard Architecture became the designation for the IBM AT bus after other types were developed. IBM's MCA bus, developed for the PS/2 in 1987, was a competitor to ISA, also of IBM's design, but fell out of favor due to ISA's industry-wide acceptance and IBM's licensing of MCA. EISA, the 32-bit extended version of ISA championed by Compaq, was used on some PC motherboards until 1997. Proprietary local buses and then the VESA Local Bus standard were late-1980s expansion buses that were tied, but not exclusively, to the 80386 and 80486 CPU bus. The PC/104 bus is an embedded bus that copies the ISA bus. Intel launched their PCI bus chipsets along with the P5-based Pentium CPUs in 1993; the PCI bus had been introduced in 1991 as a replacement for ISA. The standard is found on PC motherboards to this day. The PCI standard supports bus bridging; as many as ten daisy-chained PCI buses have been tested. CardBus, using the PCMCIA connector, is a PCI format that attaches peripherals to the host PCI bus via a PCI-to-PCI bridge. CardBus is being supplanted by the ExpressCard format.
Intel introduced the AGP bus in 1997 as a dedicated video acceleration solution. AGP devices are logically attached to the PCI bus over a PCI-to-PCI bridge. Though termed a bus, AGP usually supports only a single card at a time. From 2005, PCI Express has been replacing both PCI and AGP; this standard, approved in 2004, implements the logical PCI protocol over a serial communication interface. PC/104 or Mini PCI are often added for expansion on small-form-factor boards such as Mini-ITX. For their 1000 EX and 1000 HX models, Tandy Computer designed the PLUS expansion interface, an adaptation of the XT bus supporting cards of a smaller form factor.
Computer speakers, or multimedia speakers, are speakers sold for use with computers, although usually capable of other audio uses, e.g. for an MP3 player. Most such speakers have an internal amplifier and consequently require a power source. The signal input connector is often a 3.5 mm jack plug; RCA connectors are also sometimes used, and battery-powered wireless Bluetooth speakers require no connections at all. Most computers have low-power, low-quality speakers built in; when external speakers are connected, they disable the built-in speakers. Altec Lansing claims to have created the computer speaker market in 1990. Computer speakers range widely in quality and in price. Computer speakers sometimes packaged with computer systems are small and have mediocre sound quality. Some computer speakers have equalization features such as bass and treble controls, and more sophisticated computer speakers can have a subwoofer unit to enhance bass output. The larger subwoofer enclosure usually contains the amplifiers for the subwoofer. Some computer displays have rather basic speakers built in.
Laptop computers have built-in integrated speakers, usually small and of restricted sound quality to conserve space. For better sound, a computer can instead be connected to any external sound system, typically a high-power, high-quality setup. Some USB-powered speaker designs store energy from the USB connection during quieter periods, delivering extra power for the peaks. Such a module is claimed to require less power most of the time, increasing laptop computer battery endurance and delivering clean, unclipped sound peaks.
A computer is a device that can be instructed to carry out an arbitrary set of arithmetic or logical operations automatically. The ability of computers to follow a sequence of operations, called a program, makes them highly versatile; such computers are used as control systems for a very wide variety of industrial and consumer devices. The Internet is run on computers, and it connects millions of other computers. Since ancient times, simple manual devices like the abacus aided people in doing calculations. Early in the Industrial Revolution, some mechanical devices were built to automate long tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century, and the first digital electronic calculating machines were developed during World War II. The speed and versatility of computers has increased continuously and dramatically since then. Conventionally, a modern computer consists of at least one processing element, typically a central processing unit, and some form of memory.
The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices, output devices, and input/output devices that perform both functions. Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved. Originally, the word computer referred to a person who carried out calculations or computations. The word continued with the same meaning until the middle of the 20th century; from the end of the 19th century the word began to take on its more familiar meaning, a machine that carries out computations. The Online Etymology Dictionary gives the first attested use of computer in the 1640s, meaning one who calculates, and states that the use of the term to mean calculating machine is from 1897. The Online Etymology Dictionary indicates that the modern use of the term, to mean a programmable digital electronic computer, dates from 1945 under this name, and theoretically from 1937, as the Turing machine. Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers.
The earliest counting device was probably a form of tally stick. Later record-keeping aids throughout the Fertile Crescent included calculi which represented counts of items, probably livestock or grains, sealed in hollow unbaked clay containers. The use of counting rods is one example. The abacus was initially used for arithmetic tasks; the Roman abacus was developed from devices used in Babylonia as early as 2400 BC. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money. The Antikythera mechanism is believed to be the earliest mechanical analog computer, according to Derek J. de Solla Price. It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC.