In computing, a printer is a peripheral device which makes a persistent representation of graphics or text on paper. While most output is human-readable, bar code printers are an example of an expanded use for printers. The first computer printer designed was a mechanically driven apparatus by Charles Babbage for his difference engine in the 19th century; the first electronic printer was the EP-101, invented by the Japanese company Epson and released in 1968. The first commercial printers used mechanisms from electric typewriters and Teletype machines, and the demand for higher speed led to the development of new systems for computer use. Among the systems used in the 1980s were daisy wheel systems similar to typewriters, line printers that produced similar output but at much higher speed, and dot matrix systems that could mix text and graphics but produced low-quality output; the plotter was used by those requiring high-quality line art such as blueprints. The introduction of the low-cost laser printer in 1984 with the first HP LaserJet, followed by the addition of PostScript in the next year's Apple LaserWriter, set off a revolution in printing known as desktop publishing.
Laser printers using PostScript mixed text and graphics, as dot-matrix printers could, but at quality levels formerly available only from commercial typesetting systems. By 1990, most simple printing tasks such as fliers and brochures were being created on personal computers and then laser printed. The HP Deskjet of 1988 offered the same advantages as the laser printer in terms of flexibility, but produced somewhat lower-quality output from much less expensive mechanisms, and inkjet systems displaced dot matrix and daisy wheel printers from the market. By the 2000s, high-quality printers of this sort had fallen under the $100 price point and become commonplace. The rapid adoption of internet email through the 1990s and into the 2000s has displaced the need for printing as a means of moving documents, and a wide variety of reliable storage systems means that a "physical backup" is of little benefit today. The desire for printed output for "offline reading" while on mass transit or aircraft has been displaced by e-book readers and tablet computers.
Today, traditional printers are used more for special purposes, such as printing photographs or artwork, and are no longer a must-have peripheral. Starting around 2010, 3D printing became an area of intense interest, allowing the creation of physical objects with the same sort of effort as an early laser printer required to produce a brochure; these devices have not yet become commonplace. Personal printers are designed to support individual users and may be connected to only a single computer. These printers are designed for low-volume, short-turnaround print jobs, requiring minimal setup time to produce a hard copy of a given document. However, they are slow devices, ranging from 6 to around 25 pages per minute, and the cost per page is high; this is offset by the on-demand convenience. Some printers can print documents stored on digital cameras and scanners. Networked or shared printers are "designed for high-volume, high-speed printing"; they are shared by many users on a network and can print at speeds of 45 to around 100 ppm.
The Xerox 9700 could achieve 120 ppm. A virtual printer is a piece of computer software whose user interface and API resemble those of a printer driver, but which is not connected to a physical printer. A virtual printer can be used to create a file containing an image of the data that would be printed, for archival purposes or as input to another program, for example to create a PDF or to transmit the output to another system or user. A barcode printer is a computer peripheral for printing barcode labels or tags that can be attached to, or printed directly on, physical objects; barcode printers are used to label cartons before shipment, or to label retail items with UPCs or EANs. A 3D printer is a device for making a three-dimensional object from a 3D model or other electronic data source through additive processes in which successive layers of material are laid down under computer control. It is called a printer by analogy with an inkjet printer, which produces a two-dimensional document by a similar process of depositing a layer of ink on paper.
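As a rough illustration of the virtual-printer idea, the sketch below (a hypothetical example, not modeled on any particular driver API) accepts the same kind of line-by-line output a program would send to a printer, but renders the pages to a plain-text file instead of hardware:

```python
# Hypothetical sketch of a "virtual printer": it exposes a print-like
# interface but renders pages to a text file instead of sending them
# to a physical device.
from pathlib import Path

class VirtualPrinter:
    def __init__(self, output_path, lines_per_page=60):
        self.output_path = Path(output_path)
        self.lines_per_page = lines_per_page
        self.pages = []        # each page is a list of text lines
        self.current = []

    def print_line(self, text):
        """Accept one line of output, starting a new page when full."""
        self.current.append(text)
        if len(self.current) >= self.lines_per_page:
            self.pages.append(self.current)
            self.current = []

    def finish(self):
        """Flush the last page and write an archival copy to disk."""
        if self.current:
            self.pages.append(self.current)
        with self.output_path.open("w", encoding="utf-8") as f:
            for number, page in enumerate(self.pages, start=1):
                f.write(f"--- page {number} ---\n")
                f.write("\n".join(page) + "\n")

# Usage: the "document" ends up in report.txt rather than on paper.
printer = VirtualPrinter("report.txt")
for i in range(3):
    printer.print_line(f"line {i}")
printer.finish()
```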
The choice of print technology has a great effect on the cost of the printer and the cost of operation, print speed, the permanence of documents, and noise. Some printer technologies do not work with certain types of physical media, such as carbon paper or transparencies. A second aspect of printer technology that is often forgotten is resistance to alteration: liquid ink, such as from an inkjet head or fabric ribbon, becomes absorbed by the paper fibers, so documents printed with liquid ink are more difficult to alter than documents printed with toner or solid inks, which do not penetrate below the paper surface. Cheques can be printed with liquid ink or on special cheque paper with toner anchorage so that alterations may be detected; the machine-readable lower portion of a cheque must be printed using MICR ink. Banks and other clearing houses employ automation equipment that relies on the magnetic flux from these specially printed characters to function properly. The following printing technologies are found in modern printers: A laser printer produces high-quality text and graphics.
As with digital photocopiers and multifunction printers, laser printers employ a xerographic printing process, but differ from analog photocopiers in that the image is produced by the direct scanning of a laser beam across the printer's photoreceptor.
A software bug is an error, failure or fault in a computer program or system that causes it to produce an incorrect or unexpected result, or to behave in unintended ways. The process of finding and fixing bugs is termed "debugging" and uses formal techniques or tools to pinpoint bugs; since the 1950s, some computer systems have been designed to deter, detect or auto-correct various computer bugs during operations. Most bugs arise from mistakes and errors made in either a program's source code or its design, or in components and operating systems used by such programs; a few are caused by compilers producing incorrect code. A program that contains a large number of bugs, and/or bugs that interfere with its functionality, is said to be buggy. Bugs can trigger errors, and may cause the program to crash or freeze the computer. Other bugs qualify as security bugs and might, for example, enable a malicious user to bypass access controls in order to obtain unauthorized privileges; some software bugs have been linked to disasters.
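As a concrete illustration of how a small mistake in source code becomes a bug, the hypothetical snippet below contains a classic off-by-one error: the loop is meant to sum every element of a list but skips the last one.

```python
def total(values):
    """Intended to return the sum of all values, but contains an
    off-by-one bug: range(len(values) - 1) stops one element early."""
    result = 0
    for i in range(len(values) - 1):   # bug: should be range(len(values))
        result += values[i]
    return result

# The incorrect result only appears for non-empty input, which is why
# bugs like this can survive casual testing.
print(total([1, 2, 3]))   # prints 3 instead of the expected 6
```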
Bugs in code that controlled the Therac-25 radiation therapy machine were directly responsible for patient deaths in the 1980s. In 1996, the European Space Agency's US$1 billion prototype Ariane 5 rocket had to be destroyed less than a minute after launch due to a bug in the on-board guidance computer program. In June 1994, a Royal Air Force Chinook helicopter crashed into the Mull of Kintyre, killing 29; this was dismissed as pilot error, but an investigation by Computer Weekly convinced a House of Lords inquiry that it may have been caused by a software bug in the aircraft's engine-control computer. In 2002, a study commissioned by the US Department of Commerce's National Institute of Standards and Technology concluded that "software bugs, or errors, are so prevalent and so detrimental that they cost the US economy an estimated $59 billion annually, or about 0.6 percent of the gross domestic product". The term "bug" to describe defects has been a part of engineering jargon since the 1870s and predates electronic computers and computer software.
For instance, Thomas Edison wrote the following words in a letter to an associate in 1878: It has been just so in all of my inventions. The first step is an intuition, and comes with a burst, then difficulties arise—this thing gives out and then that "Bugs"—as such little faults and difficulties are called—show themselves and months of intense watching and labor are requisite before commercial success or failure is reached. The Middle English word bugge is the basis for the terms "bugbear" and "bugaboo", both used for a monster. Baffle Ball, the first mechanical pinball game, was advertised as being "free of bugs" in 1931. Problems with military gear during World War II were referred to as bugs. In a book published in 1942, Louise Dickinson Rich, speaking of a powered ice cutting machine, said, "Ice sawing was suspended until the creator could be brought in to take the bugs out of his darling." Isaac Asimov used the term "bug" to relate to issues with a robot in his short story "Catch That Rabbit", published in 1944.
The term "bug" was used in an account by computer pioneer Grace Hopper, who publicized the cause of a malfunction in an early electromechanical computer. A typical version of the story is: In 1946, when Hopper was released from active duty, she joined the Harvard Faculty at the Computation Laboratory where she continued her work on the Mark II and Mark III. Operators traced an error in the Mark II to a moth trapped in a relay; this bug was removed and taped to the log book. Stemming from the first bug, today we call errors or glitches in a program a bug. Hopper did not find the bug, as she acknowledged; the date in the log book was September 9, 1947. The operators who found it, including William "Bill" Burke of the Naval Weapons Laboratory, Virginia, were familiar with the engineering term and amusedly kept the insect with the notation "First actual case of bug being found." Hopper loved to recount the story. This log book, complete with attached moth, is part of the collection of the Smithsonian National Museum of American History.
The related term "debug" appears to predate its usage in computing: the Oxford English Dictionary's etymology of the word contains an attestation from 1945, in the context of aircraft engines. The concept that software might contain errors dates back to Ada Lovelace's 1843 notes on the analytical engine, in which she speaks of the possibility of program "cards" for Charles Babbage's analytical engine being erroneous: "... an analysing process must have been performed in order to furnish the Analytical Engine with the necessary operative data. Granted that the actual mechanism is unerring in its processes, the cards may give it wrong orders." The first documented use of the term "bug" for a technical malfunction was by Thomas Edison. The Open Technology Institute, run by the group New America, released a report, "Bugs in the System", in August 2016 stating that U.S. policymakers should make reforms to help researchers address software bugs. The report "highlights the need for reform in the field of software vulnerability discovery and disclosure."
One of the report's authors said that Congress has not done enough to address cyber software vulnerability, though Congress has passed a number of bills to combat the larger issue of cyber security. Government researchers and cyber security experts are the people who discover software flaws
A computer file is a computer resource for recording data discretely in a computer storage device. Just as words can be written to paper, so can information be written to a computer file. Files can be transferred through the internet. There are different types of computer files, designed for different purposes. A file may be designed to store a picture, a written message, a video, a computer program, or a wide variety of other kinds of data; some types of files can store several types of information at once. By using computer programs, a person can open, change and close a computer file. Computer files may be reopened and copied an arbitrary number of times. Files are organised in a file system, which keeps track of where the files are located on disk and enables user access. The word "file" derives from the Latin filum. "File" was used in the context of computer storage as early as January 1940. In Punched Card Methods in Scientific Computation, W. J. Eckert stated, "The first extensive use of the early Hollerith Tabulator in astronomy was made by Comrie.
He used it for building a table from successive differences, for adding large numbers of harmonic terms". "Tables of functions are constructed from their differences with great efficiency, either as printed tables or as a file of punched cards." In February 1950, in a Radio Corporation of America advertisement in Popular Science Magazine describing a new "memory" vacuum tube it had developed, RCA stated: "the results of countless computations can be kept 'on file' and taken out again. Such a 'file' now exists in a 'memory' tube developed at RCA Laboratories. Electronically it retains figures fed into calculating machines, holds them in storage while it memorizes new ones - speeds intelligent solutions through mazes of mathematics." In 1952, "file" denoted information stored on punched cards. In early use, the underlying hardware, rather than the contents stored on it, was denominated a "file". For example, the IBM 350 disk drives were denominated "disk files". The introduction, circa 1961, by the Burroughs MCP and the MIT Compatible Time-Sharing System of the concept of a "file system" that managed several virtual "files" on one storage device is the origin of the contemporary denotation of the word.
Although the contemporary "register file" demonstrates the early concept of files, its use has decreased. On most modern operating systems, files are organized into one-dimensional arrays of bytes. The format of a file is defined by its content, since a file is a container for data, although on some platforms the format is indicated by its filename extension, which specifies the rules for how the bytes must be organized and interpreted meaningfully. For example, the bytes of a plain text file are associated with either ASCII or UTF-8 characters, while the bytes of image and audio files are interpreted otherwise. Most file types allocate a few bytes for metadata, which allows a file to carry some basic information about itself. Some file systems can store arbitrary file-specific data outside of the file format, but linked to the file, for example extended attributes or forks; on other file systems this can be done via software-specific databases. All those methods, however, are more susceptible to loss of metadata than are container and archive file formats.
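As an illustration of format being carried by a file's content rather than its name, the short sketch below reads the first bytes of a file and compares them against two well-known signatures (PNG and ZIP); the file names are hypothetical and the signature table is deliberately tiny:

```python
# Guess a file's format from its leading "magic" bytes rather than
# from its filename extension. Only two signatures are checked here.
SIGNATURES = {
    b"\x89PNG\r\n\x1a\n": "PNG image",
    b"PK\x03\x04": "ZIP archive",
}

def sniff(path):
    with open(path, "rb") as f:
        header = f.read(8)          # enough for the longest signature above
    for magic, name in SIGNATURES.items():
        if header.startswith(magic):
            return name
    return "unknown (not in the small table above)"

print(sniff("photo.png"))   # hypothetical file names
print(sniff("bundle.zip"))
```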
At any instant in time, a file might have a size, expressed as a number of bytes, that indicates how much storage is associated with the file. In most modern operating systems the size can be any non-negative whole number of bytes up to a system limit. Many older operating systems kept track only of the number of blocks or tracks occupied by a file on a physical storage device; in such systems, software employed other methods to track the exact byte count. The general definition of a file does not require that its size have any real meaning, unless the data within the file happens to correspond to data within a pool of persistent storage. A special case is a zero-byte file. For example, the file to which the link /bin/ls points in a typical Unix-like system has a defined size that seldom changes. Compare this with /dev/null, which is also a file, but whose size may be obscure. Information in a computer file can consist of smaller packets of information that are individually different but share some common traits. For example, a payroll file might contain information concerning all the employees in a company and their payroll details.
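A minimal sketch of querying a file's size on a Unix-like system (assuming /bin/ls and /dev/null exist, as they typically do):

```python
import os

# stat() reports the size in bytes that the file system associates with
# each file; /dev/null is a device node, so its reported size is 0.
for path in ("/bin/ls", "/dev/null"):
    size = os.stat(path).st_size
    print(f"{path}: {size} bytes")
```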
A text file may contain lines of text corresponding to printed lines on a piece of paper. Alternatively, a file may contain an arbitrary binary image or it may contain an executable. The way information is grouped into a file is up to how it is designed; this has led to a plethora of more or less standardized file structures for all imaginable purposes, from the simplest to the most complex. Most computer files are used by computer programs which create, modify or delete the files for their own use on an as-needed basis; the programmers who create the programs decide what files are needed, how they are to be used and their names. In some cases, computer pr
A programmer, coder, or software engineer is a person who creates computer software. The term computer programmer can refer to a specialist in one area of computers, or to a generalist who writes code for many kinds of software. One who practices, or professes, a formal approach to programming may be known as a programmer analyst. On the other hand, "code monkey" is a derogatory term for a programmer who writes code without any involvement in the design or specifications. A programmer's primary computer language is prefixed to these titles, and those who work in a web environment prefix their titles with web. A range of occupations that involve programming—including software developer, web developer, mobile applications developer, embedded firmware developer, software engineer, computer scientist, game programmer, game developer, and software analyst—also require a range of other skills; the use of the term programmer for these positions is sometimes considered an insulting or derogatory simplification.
British countess and mathematician Ada Lovelace is considered the first computer programmer, as she was the first to publish, in October 1842, an algorithm intended for implementation on Charles Babbage's analytical engine, for the calculation of Bernoulli numbers. Because Babbage's machine was never completed to a functioning standard in her time, she never saw this algorithm run. The first person to run a program on a functioning modern electronically based computer was computer scientist Konrad Zuse, in 1941. The ENIAC programming team, consisting of Kay McNulty, Betty Jennings, Betty Snyder, Marlyn Wescoff, Fran Bilas and Ruth Lichterman, were the first working programmers. International Programmers' Day is celebrated annually on 7 January. In 2009, the government of Russia decreed a professional annual holiday known as Programmers' Day to be celebrated on 13 September; it had been an unofficial international holiday before that. The word "software" did not appear in print until the 1960s.
Before this time, computers were programmed either by customers or by the few commercial computer vendors of the time, such as UNIVAC and IBM. The first company founded to provide software products and services was Computer Usage Company, in 1955; the software industry expanded in the early 1960s, immediately after computers were first sold in mass-produced quantities. Universities and business customers created a demand for software. Many of these programs were written in-house by full-time staff programmers; some were distributed between users of a particular machine for no charge. Others were produced on a commercial basis, and other firms, such as Computer Sciences Corporation, started to grow. The computer/hardware makers started bundling operating systems, system software and programming environments with their machines. The industry expanded with the rise of the personal computer in the mid-1970s, which brought computing to the desktop of the office worker. In the following years, the personal computer also created a growing market for games and utilities.
DOS, Microsoft's first operating system product, was the dominant operating system at the time. In the early years of the 21st century, another successful business model arose for hosted software, called software-as-a-service, or SaaS. From the point of view of producers of some proprietary software, SaaS reduces the concerns about unauthorized copying, since it can only be accessed through the Web and, by definition, no client software is loaded onto the end user's PC. By 2014, the role of cloud developer had been defined. Computer programmers write, test and maintain the detailed instructions, called computer programs, that computers must follow to perform their functions. Programmers conceive and test logical structures for solving problems by computer. Many technical innovations in programming — advanced computing technologies and sophisticated new languages and programming tools — have redefined the role of a programmer and elevated much of the programming work done today. Job titles and descriptions may vary, depending on the organization.
Programmers work in many settings, including corporate information technology departments, big software companies, small service firms and government entities of all sizes. Many professional programmers work for consulting companies at client sites as contractors. Licensing is not required to work as a programmer, although professional certifications are held by programmers. Programming is considered a profession. Programmers' work varies depending on the type of business for which they are writing programs. For example, the instructions involved in updating financial records are different from those required to duplicate conditions on an aircraft for pilots training in a flight simulator. Simple programs can be written in a few hours; more complex ones may require more than a year of work, while others are never considered 'complete' but rather are continuously improved as long as they stay in use. In most cases, several programmers work together as a team under a senior programmer's supervision.
Programmers write programs according to the specifications determined b
A computer network is a digital telecommunications network which allows nodes to share resources. In computer networks, computing devices exchange data with each other using connections between nodes; these data links are established over cable media such as wires or optic cables, or over wireless media such as Wi-Fi. Network computer devices that originate and terminate the data are called network nodes. Nodes are identified by network addresses and can include hosts such as personal computers and servers, as well as networking hardware such as routers and switches. Two such devices can be said to be networked together when one device is able to exchange information with the other device, whether or not they have a direct connection to each other. In most cases, application-specific communications protocols are layered over other more general communications protocols; this formidable collection of information technology requires skilled network management to keep it all running reliably. Computer networks support an enormous number of applications and services such as access to the World Wide Web, digital video, digital audio, shared use of application and storage servers and fax machines, and use of email and instant messaging applications, as well as many others.
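As a minimal, hypothetical illustration of two nodes exchanging data, with a trivial application-level convention layered over a general-purpose transport protocol (TCP), the sketch below runs a tiny echo server and a client in one process:

```python
# Minimal illustration of two "nodes" exchanging data over TCP on one
# machine; the application-level rule (echo the bytes back) is layered
# over the general-purpose transport protocol.
import socket
import threading

def echo_server(sock):
    conn, _addr = sock.accept()          # wait for one client
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)               # application behaviour: echo

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))            # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

threading.Thread(target=echo_server, args=(server,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello, network")
    print(client.recv(1024))             # b'hello, network'
```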
Computer networks differ in the transmission medium used to carry their signals, the communications protocols used to organize network traffic, the network's size, the traffic control mechanism, and organizational intent. The best-known computer network is the Internet. The chronology of significant computer-network developments includes: In the late 1950s, early networks of computers included the U.S. military radar system Semi-Automatic Ground Environment. In 1959, Anatolii Ivanovich Kitov proposed to the Central Committee of the Communist Party of the Soviet Union a detailed plan for the re-organisation of the control of the Soviet armed forces and of the Soviet economy on the basis of a network of computing centres, the OGAS. In 1960, the commercial airline reservation system Semi-Automatic Business Research Environment (SABRE) went online with two connected mainframes. In 1963, J. C. R. Licklider sent a memorandum to office colleagues discussing the concept of the "Intergalactic Computer Network", a computer network intended to allow general communications among computer users.
In 1964, researchers at Dartmouth College developed the Dartmouth Time Sharing System for distributed users of large computer systems. The same year, at the Massachusetts Institute of Technology, a research group supported by General Electric and Bell Labs used a computer to route and manage telephone connections. Throughout the 1960s, Paul Baran and Donald Davies independently developed the concept of packet switching to transfer information between computers over a network. Davies pioneered the implementation of the concept with the NPL network, a local area network at the National Physical Laboratory using a line speed of 768 kbit/s. In 1965, Western Electric introduced the first widely used telephone switch that implemented true computer control. In 1966, Thomas Marill and Lawrence G. Roberts published a paper on an experimental wide area network for computer time sharing. In 1969, the first four nodes of the ARPANET were connected using 50 kbit/s circuits between the University of California at Los Angeles, the Stanford Research Institute, the University of California at Santa Barbara, and the University of Utah.
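To make the packet-switching idea concrete, here is a toy sketch (purely illustrative, not any real protocol): a message is split into fixed-size numbered packets, the packets are delivered out of order, and the receiver reassembles them using the sequence numbers.

```python
# Toy model of packet switching: a message is cut into numbered packets
# that may arrive in any order; the receiver reorders them by sequence
# number to reconstruct the original message.
import random

def packetize(message, size=8):
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [(seq, chunk) for seq, chunk in enumerate(chunks)]

def reassemble(packets):
    return b"".join(chunk for _seq, chunk in sorted(packets))

message = b"packet switching sends data as independent packets"
packets = packetize(message)
random.shuffle(packets)                 # simulate out-of-order delivery
assert reassemble(packets) == message
print(f"{len(packets)} packets reassembled correctly")
```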
Leonard Kleinrock carried out theoretical work to model the performance of packet-switched networks, which underpinned the development of the ARPANET. His theoretical work on hierarchical routing in the late 1970s, with student Farouk Kamoun, remains critical to the operation of the Internet today. In 1972, commercial services using X.25 were deployed, and later used as an underlying infrastructure for expanding TCP/IP networks. In 1973, the French CYCLADES network was the first to make the hosts responsible for the reliable delivery of data, rather than this being a centralized service of the network itself. Also in 1973, Robert Metcalfe wrote a formal memo at Xerox PARC describing Ethernet, a networking system based on the Aloha network, developed in the 1960s by Norman Abramson and colleagues at the University of Hawaii. In July 1976, Robert Metcalfe and David Boggs published their paper "Ethernet: Distributed Packet Switching for Local Computer Networks" and collaborated on several patents received in 1977 and 1978.
In 1979, Robert Metcalfe pursued making Ethernet an open standard. In 1976, John Murphy of Datapoint Corporation created ARCNET, a token-passing network first used to share storage devices. In 1995, the transmission speed capacity for Ethernet increased from 10 Mbit/s to 100 Mbit/s. By 1998, Ethernet supported transmission speeds of a gigabit per second. Subsequently, higher speeds of up to 400 Gbit/s were added; the ability of Ethernet to scale is a contributing factor to its continued use. Computer networking may be considered a branch of electrical engineering, electronics engineering, telecommunications, computer science, information technology or computer engineering, since it relies upon the theoretical and practical application of the related disciplines. A computer network facilitates interpersonal communications, allowing users to communicate efficiently via various means: email, instant messaging, online chat, video telephone calls, and video conferencing. A network allows sharing of computing resources.
Users may access and use resources provided by devices on the network, such as printing a document on a shared network printer or using a shared storage device. A network allows sharing of files, and
A computer program is a collection of instructions that performs a specific task when executed by a computer. A computer requires programs to function. A computer program is written by a computer programmer in a programming language. From the program in its human-readable form of source code, a compiler can derive machine code—a form consisting of instructions that the computer can directly execute. Alternatively, a computer program may be executed with the aid of an interpreter. A collection of computer programs and related data is referred to as software. Computer programs may be categorized along functional lines, such as application software and system software. The underlying method used for some calculation or manipulation is known as an algorithm. The earliest programmable machines preceded the invention of the digital computer. In 1801, Joseph-Marie Jacquard devised a loom that would weave a pattern by following a series of perforated cards. Patterns could be repeated by arranging the cards.
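As a small illustration of the step from human-readable source code to lower-level instructions (using Python's own byte-code compiler rather than a machine-code compiler), the snippet below compiles a one-line program, shows the instructions the interpreter actually executes, and then runs them:

```python
# Translate a tiny piece of source code into the lower-level
# instructions that the Python virtual machine executes, then run it.
import dis

source = "print(2 + 3)"
code_object = compile(source, "<example>", "exec")

dis.dis(code_object)   # show the generated byte-code instructions
exec(code_object)      # execute them: prints 5
```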
In 1837, Charles Babbage was inspired by Jacquard's loom to attempt to build the Analytical Engine. The names of the components of the calculating device were borrowed from the textile industry, in which yarn was brought from the store to be milled. The device would have had a "store"—memory to hold 1,000 numbers of 40 decimal digits each. Numbers from the "store" would have been transferred to the "mill" for processing, with a "thread" being the execution of programmed instructions by the device. It was to be programmed using two sets of perforated cards—one to direct the operation and the other for the input variables. However, after more than £17,000 of the British government's money had been spent, the thousands of cogged wheels and gears never worked together. During a nine-month period in 1842–43, Ada Lovelace translated the memoir of Italian mathematician Luigi Menabrea, which covered the Analytical Engine. The translation contained Note G, which detailed a method for calculating Bernoulli numbers using the Analytical Engine.
This note is recognized by some historians as the world's first written computer program. In 1936, Alan Turing introduced the Universal Turing machine—a theoretical device that can model every computation that can be performed on a Turing-complete computing machine. It is a finite-state machine equipped with an infinitely long read/write tape. The machine can move the tape back and forth, changing its contents as it performs an algorithm; the machine starts in the initial state, goes through a sequence of steps, and halts when it encounters the halt state. This machine is considered by some to be the origin of the stored-program computer—used by John von Neumann for the "Electronic Computing Instrument" that now bears the von Neumann architecture name. The Z3 computer, invented by Konrad Zuse in Germany, was a programmable computer. A digital computer uses electricity as the calculating component; the Z3 contained 2,400 relays to create the circuits, which provided a floating-point, nine-instruction computer. Programming the Z3 was through a specially designed keyboard and punched tape.
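A minimal sketch of the kind of machine described above (a hypothetical transition table, not Turing's original formulation): the simulator starts in an initial state, reads and writes a tape, moves the head, and stops when it reaches the halt state. This example inverts a string of bits and then halts.

```python
# Tiny Turing-machine simulator: (state, tape symbol) -> new symbol,
# head movement, next state. The table below inverts a string of bits.
def run(tape, rules, state="start", halt="halt"):
    tape = list(tape)
    head = 0
    while state != halt:
        symbol = tape[head] if head < len(tape) else "_"   # "_" = blank
        new_symbol, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = new_symbol
        else:
            tape.append(new_symbol)
        head += 1 if move == "R" else -1
    return "".join(tape)

# Flip every bit while moving right; halt on the blank after the input.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run("10110", rules))   # prints 01001_ (trailing blank)
```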
The Electronic Numerical Integrator And Computer (ENIAC) was a Turing-complete, general-purpose computer that used 17,468 vacuum tubes to create the circuits. At its core, it was a series of Pascalines wired together. Its 40 units weighed 30 tons, occupied 1,800 square feet, and consumed $650 per hour in electricity when idle. It had 20 base-10 accumulators. Programming the ENIAC took up to two months. Three function tables needed to be rolled to fixed function panels, and the function tables were connected to the function panels using heavy black cables; each function table had 728 rotating knobs. Programming the ENIAC also involved setting some of the 3,000 switches, and debugging a program took a week. The programmers of the ENIAC were women who were known collectively as the "ENIAC girls." The ENIAC featured parallel operations: different sets of accumulators could work on different algorithms. It used punched card machines for input and output, and it was controlled with a clock signal. It ran for eight years, calculating hydrogen bomb parameters, predicting weather patterns, and producing firing tables to aim artillery guns.
The Manchester Baby was a stored-program computer, and programming transitioned away from setting dials. Only three bits of memory were available to store each instruction, so it was limited to eight instructions; 32 switches were available for programming. Computers manufactured until the 1970s had front-panel switches for manual programming, and the computer program was written on paper for reference. An instruction was represented by a configuration of on/off settings; after setting the configuration, an execute button was pressed, and this process was repeated. Computer programs were also manually input via paper tape or punched cards; after the medium was loaded, the starting address was set via switches and the execute button pressed. In 1961, the Burroughs B5000 was built to be programmed in the ALGOL 60 language; the hardware featured circuits to ease the compile phase. In 1964, the IBM System/360 was a line of six computers, each having the same instruction set architecture; the Model 30 was the least expensive. Customers could retain the same application software, and each System/360 model featured multiprogramming.
With operating system support, multiple programs could be in memory at once; when one was waiting for input/output, another could compute. Each model could also emulate other computers. Customers could upgrade to the System/360 and ret