A light pen is a computer input device in the form of a light-sensitive wand used in conjunction with a computer's cathode-ray tube (CRT) display. It allows the user to point to displayed objects or draw on the screen in a similar way to a touchscreen but with greater positional accuracy, and it can work with any CRT-based display. A light pen detects the change in brightness of nearby screen pixels when they are scanned by the cathode-ray tube's electron beam and communicates the timing of this event to the computer. Since a CRT scans the entire screen one pixel at a time, the computer can keep track of the expected time at which the beam scans each location on screen and infer the pen's position from the latest timestamp. The first light pen was created around 1955 as part of the Whirlwind project at MIT. During the 1960s, light pens were common on graphics terminals such as the IBM 2250, and were also available for the IBM 3270 text-only terminal. Light pen usage expanded in the early 1980s to music workstations such as the Fairlight CMI and personal computers such as the BBC Micro.
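The timing calculation described above can be sketched in a few lines. This is a toy model, not any real driver: the resolution and pixel-clock figures below are invented for illustration, and real hardware must also account for horizontal and vertical blanking intervals.

```python
# Sketch of inferring a light pen's screen position from the timestamp
# at which the pen saw the electron beam pass. All timing constants are
# illustrative assumptions, not real hardware values.

PIXELS_PER_LINE = 640                              # assumed horizontal resolution
LINES_PER_FRAME = 480                              # assumed vertical resolution
PIXEL_TIME_NS = 40                                 # assumed beam time per pixel
LINE_TIME_NS = PIXELS_PER_LINE * PIXEL_TIME_NS     # beam time per scan line

def pen_position(timestamp_ns, frame_start_ns):
    """Convert a pen-pulse timestamp into (x, y) pixel coordinates."""
    elapsed = timestamp_ns - frame_start_ns
    y, line_offset = divmod(elapsed, LINE_TIME_NS)  # whole lines -> y
    x = line_offset // PIXEL_TIME_NS                # remainder -> x
    return x, y

# Example: the pen fires 100 lines plus 320 pixels into the frame.
t = 100 * LINE_TIME_NS + 320 * PIXEL_TIME_NS
print(pen_position(t, 0))  # (320, 100)
```

The same subtraction-and-divide logic works regardless of resolution, as long as the pixel clock is known; this is why the technique is tied to the CRT's fixed scan pattern and does not carry over to flat panels.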
IBM PC compatible CGA, HGC and some EGA graphics cards featured a connector for a light pen, as did early Tandy 1000 computers, the Thomson MO5 computer family, and the Atari 8-bit, Commodore 8-bit and Amstrad PCW home computers. Because the user was required to hold their arm in front of the screen for long periods of time, or to use a desk that tilts the monitor, the light pen fell out of use as a general-purpose input device. See also: Light gun, Pen computing, Stylus.
A computing platform or digital platform is the environment in which a piece of software is executed. It may be the hardware or the operating system, a web browser and its associated application programming interfaces, or other underlying software, as long as the program code is executed with it. Computing platforms have different abstraction levels, including a computer architecture, an OS, or runtime libraries. A computing platform is the stage on which computer programs can run. A platform can be seen as a constraint on the software development process, in that different platforms provide different functionality and restrictions. For example, an OS may be a platform that abstracts the underlying differences in hardware and provides a generic command for saving files or accessing the network. Platforms may include: hardware alone, in the case of small embedded systems, which can access hardware directly without an OS; or a browser, in the case of web-based software, where the browser itself runs on a hardware-plus-OS platform that is not relevant to software running within the browser.
An application, such as a spreadsheet or word processor, may also be a platform when it hosts software written in an application-specific scripting language, such as an Excel macro; this can be extended to writing fully fledged applications with the Microsoft Office suite as a platform. Software frameworks are platforms, as are cloud computing and Platform as a Service offerings: extending the idea of a software framework, these allow application developers to build software out of components that are hosted not by the developer but by the provider, with internet communication linking them together. The social networking sites Twitter and Facebook are also considered development platforms. A virtual machine, such as the Java virtual machine or the .NET CLR, is a platform: applications are compiled into a format similar to machine code, known as bytecode, which is then executed by the VM. So is a virtualized version of a complete system, including virtualized hardware, OS and storage; these allow, for instance, a typical Windows program to run on a different host system. Some architectures have multiple layers, with each layer acting as a platform to the one above it.
In general, a component only has to be adapted to the layer beneath it. For instance, a Java program has to be written to use the Java virtual machine and associated libraries as a platform, but does not have to be adapted to run on the Windows, Linux or Macintosh OS platforms. However, the JVM, the layer beneath the application, does have to be built separately for each OS.

Operating systems (desktop and server) include: AmigaOS and AmigaOS 4; FreeBSD, NetBSD and OpenBSD; IBM i; Linux; Microsoft Windows; OpenVMS; Classic Mac OS and macOS; OS/2; Solaris; Tru64 UNIX; VM; QNX; and z/OS.

Mobile operating systems include: Android, Bada, BlackBerry OS, Firefox OS, iOS, Embedded Linux, Palm OS, Symbian, Tizen, WebOS and LuneOS, and Windows Mobile and Windows Phone.

Software frameworks include: Binary Runtime Environment for Wireless; Cocoa and Cocoa Touch; the Common Language Infrastructure, including Mono, the .NET Framework and Silverlight; Flash and AIR; GNU; the Java platform, including Java ME, Java SE, Java EE, JavaFX and JavaFX Mobile; LiveCode; Microsoft XNA; Mozilla Prism, XUL and XULRunner; the Open Web Platform; Oracle Database; Qt; SAP NetWeaver; Shockwave; Smartface; the Universal Windows Platform; Windows Runtime; and Vexi.

Hardware examples, ordered from more common types to less common types:

Commodity computing platforms: Wintel, that is, Intel x86 or compatible personal computer hardware with the Windows operating system; Macintosh, custom Apple Inc. hardware with the Classic Mac OS and macOS operating systems (originally 68k-based, then PowerPC-based, now migrated to x86); ARM architecture based mobile devices, such as iPhone smartphones and iPad tablet computers running iOS from Apple; Gumstix or Raspberry Pi full-function miniature computers with Linux; Newton devices running the Newton OS from Apple; x86 machines with Unix-like systems such as Linux or BSD variants; and CP/M computers based on the S-100 bus, maybe the earliest microcomputer platform.

Video game consoles of any variety, such as the 3DO Interactive Multiplayer, which was licensed to manufacturers, and the Apple Pippin, a multimedia player platform for video game console development.

RISC processor based machines running Unix variants, such as SPARC architecture computers running the Solaris or illumos operating systems, and DEC Alpha clusters running OpenVMS or Tru64 UNIX.

Midrange computers with their custom operating systems, such as IBM OS/400.

Mainframe computers with their custom operating systems, such as IBM z/OS.

Supercomputer architectures.

See also: Cross-platform, Platform virtualization, Third platform. External link: Ryan Sarver, "What is a platform".
The cathode-ray tube (CRT) is a vacuum tube that contains one or more electron guns and a phosphorescent screen, and is used to display images. It modulates and deflects electron beams onto the screen to create the images; the images may represent electrical waveforms, radar targets, or other phenomena. CRTs have also been used as memory devices, in which case the visible light emitted from the fluorescent material is not intended to have significant meaning to a visual observer. In television sets and computer monitors, the entire front area of the tube is scanned repetitively and systematically in a fixed pattern called a raster. In color devices, an image is produced by controlling the intensity of each of the three electron beams, one for each additive primary color (red, green and blue), with a video signal as a reference. In all modern CRT monitors and televisions, the beams are bent by magnetic deflection, a varying magnetic field generated by coils around the neck of the tube and driven by electronic circuits, although electrostatic deflection is used in oscilloscopes, a type of electronic test instrument.
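The raster pattern described above can be illustrated with a short sketch: the beam visits pixels left to right within each line and lines top to bottom, while the video signal supplies the intensity for each of the three beams at each point. This is a toy model only; real video signals also interleave horizontal and vertical blanking and sync intervals, which are ignored here.

```python
# Toy model of raster scanning: flatten an RGB framebuffer into the
# order in which a CRT's electron beams would visit the screen, with
# each sample carrying the three beam intensities (red, green, blue).

def raster_order(framebuffer):
    """Yield (x, y, (r, g, b)) samples in beam-visit order:
    left to right within a line, lines top to bottom."""
    for y, row in enumerate(framebuffer):
        for x, rgb in enumerate(row):
            yield x, y, rgb

# A 2x2 "screen": red, green on the top line; blue, white below.
fb = [[(255, 0, 0), (0, 255, 0)],
      [(0, 0, 255), (255, 255, 255)]]
for sample in raster_order(fb):
    print(sample)  # first sample is (0, 0, (255, 0, 0))
```

The point of the model is the fixed, predictable visiting order; it is this predictability that devices such as the light pen exploit.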
A CRT is constructed from a glass envelope that is large, deep, heavy, and fragile. The interior of a CRT is evacuated to between roughly 0.01 pascals and 133 nanopascals, evacuation being necessary to facilitate the free flight of electrons from the gun to the tube's face. The vacuum makes handling an intact CRT dangerous due to the risk of breaking the tube and causing a violent implosion that can hurl shards of glass at great velocity. As a matter of safety, the face is made of thick lead glass so as to be shatter-resistant and to block most X-ray emissions if the CRT is used in a consumer product. Since the late 2000s, CRTs have been superseded by newer "flat panel" display technologies such as LCD, plasma and OLED displays, which in the case of LCD and OLED displays have lower manufacturing costs and power consumption, as well as less weight and bulk. Flat panel displays can also be made in larger sizes. Cathode rays were discovered by Johann Wilhelm Hittorf in 1869 in primitive Crookes tubes. He observed that some unknown rays were emitted from the cathode which could cast shadows on the glowing wall of the tube, indicating the rays were traveling in straight lines.
In 1890, Arthur Schuster demonstrated that cathode rays could be deflected by electric fields, and William Crookes showed they could be deflected by magnetic fields. In 1897, J. J. Thomson succeeded in measuring the mass-to-charge ratio of cathode rays, showing that they consisted of negatively charged particles smaller than atoms, the first "subatomic particles", which were later named electrons. The earliest version of the CRT was known as the "Braun tube", invented by the German physicist Ferdinand Braun in 1897; it was a modification of the Crookes tube with a phosphor-coated screen. The first cathode-ray tube to use a hot cathode was developed by John B. Johnson and Harry Weiner Weinhart of Western Electric, and became a commercial product in 1922. In 1925, Kenjiro Takayanagi demonstrated a CRT television that received images with a 40-line resolution. By 1927, he had improved the resolution to 100 lines, unrivaled until 1931. By 1928, he was the first to transmit human faces in half-tones on a CRT display, and by 1935 he had invented an early all-electronic CRT television.
The CRT television display was named the "kinescope" in 1929 by inventor Vladimir K. Zworykin, who was influenced by Takayanagi's earlier work; RCA was granted a trademark for the term in 1932. The first commercially made electronic television sets with cathode-ray tubes were manufactured by Telefunken in Germany in 1934. Flat panel displays dropped in price and started displacing cathode-ray tubes in the 2000s, with LCD sales exceeding those of CRTs in 2008; the last known manufacturer of CRTs ceased production in 2015. In oscilloscope CRTs, electrostatic deflection is used rather than the magnetic deflection used with television and other large CRTs. The beam is deflected horizontally by applying an electric field between a pair of plates to its left and right, and vertically by applying an electric field to plates above and below. Televisions use magnetic rather than electrostatic deflection because the deflection plates would obstruct the beam when the deflection angle is as large as is required for tubes that are relatively short for their size. Various phosphors are available depending upon the needs of the display application.
The brightness and persistence of the illumination depend upon the type of phosphor used on the CRT screen. Phosphors are available with persistences ranging from less than one microsecond to several seconds. For visual observation of brief transient events, a long-persistence phosphor may be desirable. For events which are fast and repetitive, or high-frequency, a short-persistence phosphor is preferable. When displaying fast one-shot events, the electron beam must deflect quickly, with few electrons impinging on the screen, leading to a faint or invisible image on the display. Oscilloscope CRTs designed for fast signals can give a brighter display by passing the electron beam through a micro-channel plate just before it reaches
Engineering is the application of knowledge, in the form of science and empirical evidence, to the innovation, design, construction and maintenance of structures, materials, devices, systems and organizations. The discipline of engineering encompasses a broad range of more specialized fields of engineering, each with a more specific emphasis on particular areas of applied mathematics, applied science, and types of application. See the glossary of engineering. The term engineering is derived from the Latin ingenium, meaning "cleverness", and ingeniare, meaning "to contrive, devise". The American Engineers' Council for Professional Development has defined "engineering" as: "The creative application of scientific principles to design or develop structures, apparatus, or manufacturing processes, or works utilizing them singly or in combination." Engineering has existed since ancient times, when humans devised inventions such as the wedge, lever and pulley. The term engineering is derived from the word engineer, which itself dates back to 1390, when an engine'er referred to "a constructor of military engines."
In this context, now obsolete, an "engine" referred to a military machine, i.e. a mechanical contraption used in war. Notable examples of the obsolete usage which have survived to the present day are military engineering corps, e.g. the U.S. Army Corps of Engineers. The word "engine" itself is of even older origin, deriving from the Latin ingenium, meaning "innate quality, especially mental power, hence a clever invention." Later, as the design of civilian structures, such as bridges and buildings, matured as a technical discipline, the term civil engineering entered the lexicon as a way to distinguish between those specializing in the construction of such non-military projects and those involved in the discipline of military engineering. The pyramids in Egypt, the Acropolis and the Parthenon in Greece, the Roman aqueducts, the Via Appia and the Colosseum, Teotihuacán, and the Brihadeeswarar Temple of Thanjavur, among many others, stand as a testament to the ingenuity and skill of ancient civil and military engineers.
Other monuments, no longer standing, such as the Hanging Gardens of Babylon and the Pharos of Alexandria, were important engineering achievements of their time and were considered among the Seven Wonders of the Ancient World. The earliest civil engineer known by name is Imhotep; as one of the officials of the Pharaoh Djosèr, he designed and supervised the construction of the Pyramid of Djoser at Saqqara in Egypt around 2630–2611 BC. Ancient Greece developed machines in both civilian and military domains; the Antikythera mechanism, the first known mechanical computer, and the mechanical inventions of Archimedes are examples of early mechanical engineering. Some of Archimedes' inventions, as well as the Antikythera mechanism, required sophisticated knowledge of differential gearing or epicyclic gearing, two key principles in machine theory that helped design the gear trains of the Industrial Revolution and are still used today in diverse fields such as robotics and automotive engineering. Ancient Chinese, Greek and Hunnic armies employed military machines and inventions such as artillery, developed by the Greeks around the 4th century BC, the trireme, the ballista and the catapult.
In the Middle Ages, the trebuchet was developed. Before the development of modern engineering, mathematics was used by artisans and craftsmen, such as millwrights, clock makers, instrument makers and surveyors. Aside from these professions, universities were not believed to have much practical significance to technology. A standard reference for the state of the mechanical arts during the Renaissance is the mining engineering treatise De re metallica, which also contains sections on geology and chemistry. De re metallica remained the standard chemistry reference for the next 180 years. The science of classical mechanics, sometimes called Newtonian mechanics, formed the scientific basis of much of modern engineering. With the rise of engineering as a profession in the 18th century, the term became more narrowly applied to fields in which mathematics and science were applied to these ends. In addition to military and civil engineering, the fields then known as the mechanic arts became incorporated into engineering.
Canal building was an important engineering work during the early phases of the Industrial Revolution. John Smeaton was the first self-proclaimed civil engineer and is regarded as the "father" of civil engineering; he was an English civil engineer responsible for the design of bridges, canals and lighthouses. He was also a capable mechanical engineer and an eminent physicist. Using a model water wheel, Smeaton conducted experiments for seven years, determining ways to increase efficiency. He introduced iron gears to water wheels and made mechanical improvements to the Newcomen steam engine. Smeaton designed the third Eddystone Lighthouse, where he pioneered the use of 'hydraulic lime' and developed a technique involving dovetailed blocks of granite in the building of the lighthouse. He is important in the history, rediscovery and development of modern cement, because he identified the compositional requirements needed to obtain "hydraulicity" in lime.
A punched card or punch card is a piece of stiff paper that can be used to contain digital data represented by the presence or absence of holes in predefined positions. The data can be used for data processing applications or, as in earlier examples, to directly control automated machinery. Punched cards were used through much of the 20th century in the data processing industry, where specialized and increasingly complex unit record machines, organized into semiautomatic data processing systems, used punched cards for data input and storage. Many early digital computers used punched cards, prepared using keypunch machines, as the primary medium for input of both computer programs and data. While punched cards are now obsolete as a storage medium, as of 2012 some voting machines still used punched cards to record votes. Basile Bouchon developed the control of a loom by punched holes in paper tape in 1725. The design was improved by his assistant Jean-Baptiste Falcon and by Jacques Vaucanson. Although these improvements controlled the patterns woven, they still required an assistant to operate the mechanism.
In 1804, Joseph Marie Jacquard demonstrated a mechanism to automate loom operation. A number of punched cards were linked into a chain of any length; each card held the instructions for selecting the shuttle for a single pass. This is considered an important step in the history of computing hardware. Semyon Korsakov was reputedly the first to propose punched cards in informatics, for information storage and search; Korsakov announced his new method and machines in September 1832. Charles Babbage proposed the use of "Number Cards", "pierced with certain holes and stand opposite levers connected with a set of figure wheels... advanced they push in those levers opposite to which there are no holes on the cards and thus transfer that number together with its sign", in his description of the Calculating Engine's Store. In 1881, Jules Carpentier developed a method of recording and playing back performances on a harmonium using punched cards. The system was called the Mélographe Répétiteur and "writes down ordinary music played on the keyboard dans la langage de Jacquard", that is, as holes punched in a series of cards.
By 1887, Carpentier had separated the mechanism into the Melograph, which recorded the player's key presses, and the Melotrope, which played the music. At the end of the 1800s, Herman Hollerith invented the recording of data on a medium that could then be read by a machine. "After some initial trials with paper tape, he settled on punched cards...", developing punched card data processing technology for the 1890 US census. His tabulating machines read and summarized data stored on punched cards, and they began to be used for government and commercial data processing. These electromechanical machines only counted holes, but by the 1920s they had units for carrying out basic arithmetic operations. Hollerith founded the Tabulating Machine Company, one of four companies that were amalgamated to form a fifth company, the Computing-Tabulating-Recording Company, later renamed International Business Machines Corporation. Other companies entering the punched card business included The Tabulator Limited, Deutsche Hollerith-Maschinen Gesellschaft mbH, the Powers Accounting Machine Company, Remington Rand, and H.
W. Egli Bull. These companies and others manufactured and marketed a variety of punched cards and unit record machines for creating and tabulating punched cards, even after the development of electronic computers in the 1950s. Both IBM and Remington Rand tied punched card purchases to machine leases, a violation of the 1914 Clayton Antitrust Act. In 1932, the US government took both to court on this issue. Remington Rand settled quickly. IBM, which viewed its business as providing a service, fought all the way to the Supreme Court and lost in 1936. IBM had 32 presses at work in Endicott, N.Y., printing and stacking five to ten million punched cards every day. Punched cards were even used as legal documents, such as U.S. Government checks and savings bonds. During World War II, punched card equipment was used by the Allies in some of their efforts to decrypt Axis communications; see, for example, Central Bureau in Australia. At Bletchley Park in England, 2,000,000 punched cards were used each week for storing decrypted German messages.
Punched card technology developed into a powerful tool for business data processing. By 1950, punched cards had become ubiquitous in industry and government. "Do not fold, spindle or mutilate," a generalized version of the warning that appeared on some punched cards, became a motto for the post-World War II era. In 1955, IBM signed a consent decree requiring, amongst other things, that IBM would by 1962 have no more than one-half of the punched card manufacturing capacity in the United States. Tom Watson Jr.'s decision to sign this decree, in which IBM saw the punched card provisions as the most significant point, completed the transfer of power to him from Thomas Watson, Sr. The UNITYPER introduced magnetic tape for data entry in the 1950s. During the 1960s, the punched card was replaced as the primary means of data storage by magnetic tape, as better, more capable computers became available. Mohawk Data Sciences introduced a magnetic tape encoder in 1965, a system marketed as a keypunch replacement, which was somewhat successful.
Punched cards were still c
Vannevar Bush was an American engineer and science administrator who during World War II headed the U.S. Office of Scientific Research and Development (OSRD), through which all wartime military R&D was carried out, including important developments in radar and the initiation and early administration of the Manhattan Project. He emphasized the importance of scientific research to national security and economic well-being, and was chiefly responsible for the movement that led to the creation of the National Science Foundation. Bush joined the Department of Electrical Engineering at the Massachusetts Institute of Technology (MIT) in 1919, and founded the company now known as Raytheon in 1922. He became vice president of MIT and dean of the MIT School of Engineering in 1932, and president of the Carnegie Institution of Washington in 1938. During his career, Bush patented a string of his own inventions. He is known for his engineering work on analog computers and for the memex. Starting in 1927, Bush constructed a differential analyzer, an analog computer with some digital components that could solve differential equations with as many as 18 independent variables.
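The kind of problem the differential analyzer solved can be sketched digitally. The machine chained mechanical integrators together; the sketch below imitates two chained integrators with simple step-by-step numerical integration, solving y'' = -y, whose exact solution is cos(t). The step size and the choice of equation are illustrative assumptions, not details of Bush's machine.

```python
# Digital sketch of the differential-analyzer idea: solve y'' = -y by
# chaining two "integrators", each accumulating its input over time.
import math

def integrate_shm(steps=100_000, dt=1e-4):
    """Integrate y'' = -y with y(0) = 1, y'(0) = 0; exact answer cos(t)."""
    y, v, t = 1.0, 0.0, 0.0
    for _ in range(steps):
        v += -y * dt   # first integrator: accumulates y'' into y'
        y += v * dt    # second integrator: accumulates y' into y
        t += dt
    return t, y

t, y = integrate_shm()
print(round(y, 3), round(math.cos(t), 3))  # the two values agree closely
```

On the real machine the accumulation was done by wheel-and-disc integrators and the "chaining" by shafts and gears; the structure of the computation, one integrator feeding the next, is the same.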
An offshoot of the work at MIT by Bush and others was the beginning of digital circuit design theory. The memex, which he began developing in the 1930s, was a hypothetical adjustable microfilm viewer with a structure analogous to that of hypertext. The memex and Bush's 1945 essay "As We May Think" influenced generations of computer scientists, who drew inspiration from his vision of the future. Bush was appointed to the National Advisory Committee for Aeronautics in 1938, and soon became its chairman. As chairman of the National Defense Research Committee (NDRC), and later director of OSRD, Bush coordinated the activities of some six thousand leading American scientists in the application of science to warfare. Bush was a well-known policymaker and public intellectual during World War II, when he was in effect the first presidential science advisor. As head of NDRC and OSRD, he initiated the Manhattan Project and ensured that it received top priority from the highest levels of government. In Science, The Endless Frontier, his 1945 report to the President of the United States, Bush called for an expansion of government support for science, and he pressed for the creation of the National Science Foundation.
Vannevar Bush was born in Everett, Massachusetts, on March 11, 1890, the third child and only son of Perry Bush, the local Universalist pastor, and his wife Emma Linwood. He had two older sisters, Edith and Reba. He was named after John Vannevar, an old friend of the family who had attended Tufts College with Perry. The family moved to Chelsea, Massachusetts, in 1892, and Bush graduated from Chelsea High School in 1909. He attended Tufts, like his father before him. A popular student, he was vice president of his sophomore class and president of his junior class. During his senior year, he managed the football team. He became a member of the Alpha Tau Omega fraternity and dated Phoebe Clara Davis, who also came from Chelsea. Tufts allowed students to gain a master's degree in four years simultaneously with a bachelor's degree. For his master's thesis, Bush invented and patented a "profile tracer", a mapping device for assisting surveyors. It had two bicycle wheels and a pen that plotted the terrain over which it traveled, and it was the first of a string of inventions.
On graduation in 1913, he received both bachelor of science and master of science degrees. After graduation, Bush worked at General Electric (GE) in New York for $14 a week. As a "test man", his job was to assess equipment to ensure that it was safe. He transferred to GE's plant in Pittsfield, Massachusetts, to work on high-voltage transformers, but after a fire broke out at the plant, he and the other test men were suspended, and he returned to Tufts in October 1914 to teach mathematics. He spent the 1915 summer break working at the Brooklyn Navy Yard as an electrical inspector. Bush was awarded a $1,500 scholarship to study at Clark University as a doctoral student of Arthur Gordon Webster, but Webster wanted Bush to study acoustics, and Bush preferred to quit rather than study a subject that did not interest him. He subsequently enrolled in the Massachusetts Institute of Technology electrical engineering program. Spurred by the need for enough financial security to marry, he submitted his thesis, entitled Oscillating-Current Circuits: An Extension of the Theory of Generalized Angular Velocities, with Applications to the Coupled Circuit and the Artificial Transmission Line, in April 1916.
His adviser, Arthur Edwin Kennelly, tried to demand more work from him, but Bush refused, and Kennelly was overruled by the department chairman. Bush married Phoebe in August 1916; they had two sons, Richard Davis Bush and John Hathaway Bush. Bush accepted a job with Tufts, where he became involved with the American Radio and Research Corporation (AMRAD), which began broadcasting music from the campus on March 8, 1916. The station owner, Harold Power, hired him to run the company's laboratory, at a salary greater than that which Bush drew from Tufts. In 1917, following the United States' entry into World War I, he went to work with the National Research Council, where he attempted to develop a means of detecting submarines by measuring the disturbance in the Earth's magnetic field. His device worked. Bush left Tufts in 1919, although he remained employed by AMRAD, and joined the Department of Electrical Engineering at Massachusetts Institute of