The Moon is an astronomical body that orbits Earth and is Earth's only permanent natural satellite. It is the fifth-largest natural satellite in the Solar System and the largest among planetary satellites relative to the size of the planet it orbits; after Jupiter's satellite Io, it is the second-densest satellite among those whose densities are known. The Moon is thought to have formed not long after Earth; the most widely accepted explanation is that it formed from the debris left over after a giant impact between Earth and a Mars-sized body called Theia. The Moon is in synchronous rotation with Earth and thus always shows the same side, the near side, to Earth. The near side is marked by dark volcanic maria that fill the spaces between the bright ancient crustal highlands and the prominent impact craters. After the Sun, the Moon is the second-brightest celestial object regularly visible in Earth's sky. Its surface is actually dark, although it appears bright compared to the night sky, with a reflectance just higher than that of worn asphalt.
Its gravitational influence produces the ocean tides, body tides, and the slight lengthening of the day. The Moon's average orbital distance is 1.28 light-seconds, about thirty times the diameter of Earth. The Moon's apparent size in the sky is almost the same as that of the Sun, since the Sun is about 400 times as distant as the Moon and also about 400 times larger in diameter. As a result, the Moon covers the Sun nearly completely during a total solar eclipse. This matching of apparent size will not continue in the far future, because the Moon's distance from Earth is gradually increasing. The Moon was first reached in September 1959 by an unmanned Soviet spacecraft; the United States' NASA Apollo program achieved the only manned lunar missions to date, beginning with the first manned orbital mission by Apollo 8 in 1968, followed by six manned landings between 1969 and 1972, the first being Apollo 11. These missions returned lunar rocks which have been used to develop a geological understanding of the Moon's origin, internal structure, and later history. Since the Apollo 17 mission in 1972, the Moon has been visited only by unmanned spacecraft.
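The near-coincidence of apparent sizes can be checked with round figures. Below is a small sketch; the diameters and distances are approximate textbook values, assumed here for illustration:

```python
import math

# Rough figures (assumptions for illustration, not precise ephemeris values)
moon_diameter_km = 3474
moon_distance_km = 384_400        # average Earth–Moon distance
sun_diameter_km = 1_392_700
sun_distance_km = 149_600_000     # about one astronomical unit

def angular_size_deg(diameter_km, distance_km):
    """Apparent angular diameter, in degrees, of a distant sphere."""
    return math.degrees(2 * math.atan(diameter_km / (2 * distance_km)))

moon_deg = angular_size_deg(moon_diameter_km, moon_distance_km)
sun_deg = angular_size_deg(sun_diameter_km, sun_distance_km)

# Both come out near half a degree, which is why the Moon can almost
# exactly cover the Sun during a total solar eclipse.
print(f"Moon: {moon_deg:.3f} deg, Sun: {sun_deg:.3f} deg")
print(f"Sun/Moon distance ratio: {sun_distance_km / moon_distance_km:.0f}")
print(f"Sun/Moon diameter ratio: {sun_diameter_km / moon_diameter_km:.0f}")
```

Both ratios land near 400, so the two angular diameters nearly cancel; as the Moon recedes, its angular size shrinks and the match degrades.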
Both the Moon's natural prominence in the earthly sky and its regular cycle of phases as seen from Earth have provided cultural references and influences for human societies since time immemorial. Such influences can be found in language, lunar calendar systems, and mythology. The usual English proper name for Earth's natural satellite is "the Moon", which in nonscientific texts is often not capitalized. The noun moon is derived from Old English mōna, which stems from Proto-Germanic *mēnô, which in turn comes from Proto-Indo-European *mḗh₁n̥s "moon, month", from the root *meh₁- "to measure", the month being the ancient unit of time measured by the Moon. The name "Luna" is occasionally used as well: in science fiction it serves to distinguish Earth's moon from other moons, while in poetry it has been used to denote a personification of the Moon. The modern English adjective pertaining to the Moon is lunar, derived from the Latin word for the Moon, luna. The adjective selenic is so rarely used to refer to the Moon that this meaning is not recorded in most major dictionaries.
It is derived from the Ancient Greek word for the Moon, σελήνη, from which is also derived the prefix "seleno-", as in selenography, the study of the physical features of the Moon, as well as the element name selenium. Both the Greek goddess Selene and the Roman goddess Diana were alternatively called Cynthia. The names Luna and Selene are reflected in terminology for lunar orbits in words such as apolune and selenocentric. The name Diana comes from the Proto-Indo-European *diw-yo, "heavenly", from the root *dyeu- "to shine", which in many derivatives means "sky" or "god" and is the origin of Latin dies, "day".

The Moon formed 4.51 billion years ago, some 60 million years after the origin of the Solar System. Several formation mechanisms have been proposed, including the fission of the Moon from Earth's crust through centrifugal force, the gravitational capture of a pre-formed Moon, and the co-formation of Earth and the Moon together in the primordial accretion disk. None of these hypotheses, however, can account for the high angular momentum of the Earth–Moon system.
The prevailing hypothesis is that the Earth–Moon system formed after the impact of a Mars-sized body with the proto-Earth. The impact blasted material into Earth's orbit, and this material accreted and formed the Moon. The Moon's far side has a crust about 30 mi thicker than that of the near side; the giant impact hypothesis, although not perfect, best explains this and the other evidence. Eighteen months prior to an October 1984 conference on lunar origins, Bill Hartmann, Roger Phillips, and Jeff Taylor challenged fellow lunar scientists: "You have eighteen months. Go back to your Apollo data, go back to your computer, do whatever you have to, but make up your mind. Don't come to our conference unless you have something to say about the Moon's birth." At the 1984 conference at Kona, the giant impact hypothesis emerged as the most widely supported theory.
The Teletype Corporation, a part of American Telephone and Telegraph Company's Western Electric manufacturing arm from 1930, came into being in 1928 when the Morkrum-Kleinschmidt Company changed its name to that of its trademark equipment. Teletype Corporation, of Skokie, Illinois, was responsible for the research and manufacture of data and record communications equipment, but it is chiefly remembered for its electromechanical teleprinters. Because of the nature of its business, as stated in the corporate charter, Teletype Corporation was allowed a unique mode of operation within Western Electric: it was organized as a separate entity and contained all the elements necessary for a separate corporation. Teletype's charter permitted the sale of equipment to customers outside the AT&T Bell System, which explained its need for a separate sales force; the primary customer outside the Bell System was the United States Government. The Teletype Corporation continued in this manner until January 8, 1982, the date of settlement of United States v. AT&T, a 1974 United States Department of Justice antitrust suit against AT&T.
At that time, Western Electric was absorbed into AT&T as AT&T Technologies, and the Teletype Corporation became AT&T Teletype. The last vestiges of what had been the Teletype Corporation ceased in 1990, bringing to a close the dedicated teleprinter business. One of the three Teletype manufacturing buildings in Skokie remains in use as a parking garage for a shopping center; every other floor of the building has been removed. The other two buildings were demolished.

The Teletype Corporation had its roots in the Morkrum Company. In 1902, electrical engineer Frank Pearne approached Joy Morton, head of Morton Salt, seeking a sponsor for his research into the practicalities of developing a printing telegraph system. Morton needed to determine whether this was worthwhile, and so consulted mechanical engineer Charles Krum, vice president of the Western Cold Storage Company, which was run by Morton's brother Mark Morton. Krum was interested in helping Pearne, so space was set up in a laboratory in the attic of Western Cold Storage.
Frank Pearne lost interest in the project after a year and left to become a teacher. Krum was prepared to continue Pearne's work, and in August 1903 a patent was filed for a "typebar page printer". In 1904, Krum filed a patent for a "type wheel printing telegraph machine", which was issued in August 1907. In 1906, the Morkrum Company was formed, the company name combining the Morton and Krum names and reflecting the financial assistance provided by Joy Morton. It was around this time that Charles Krum's son, Howard Krum, joined his father in the work. It was Howard who developed and patented the start-stop synchronizing method for code telegraph systems, which made the practical teleprinter possible. In 1908, a working teleprinter was produced, called the Morkrum Printing Telegraph, and field tested with the Alton Railroad. In 1910, the Morkrum Company designed and installed the first commercial teletypewriter system on Postal Telegraph Company lines between Boston and New York City, using the "Blue Code Version" of the Morkrum Printing Telegraph.
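The essence of the start-stop method is to bracket each short character code between a start signal and a stop interval, so the receiver resynchronizes its mechanism on every character instead of relying on a continuously shared clock. A minimal sketch of this framing idea (the function names and bit conventions below are illustrative, not Morkrum's actual mechanism):

```python
def frame_character(code_bits):
    """Frame a 5-bit character code for start-stop transmission.

    The idle line rests at mark (1). A start bit (space, 0) tells the
    receiver a character is beginning; a stop bit (mark, 1) returns the
    line to idle so the next start bit can be detected.
    """
    assert len(code_bits) == 5, "expected a 5-bit character code"
    return [0] + list(code_bits) + [1]

def deframe_character(frame):
    """Recover the 5 data bits, checking the start and stop bits."""
    assert frame[0] == 0 and frame[-1] == 1, "framing error"
    return frame[1:-1]

# Example 5-bit code (arbitrary bits, for illustration only)
framed = frame_character([1, 0, 0, 0, 0])
print(framed)  # [0, 1, 0, 0, 0, 0, 1]
assert deframe_character(framed) == [1, 0, 0, 0, 0]
```

Because each frame carries its own synchronization, small speed differences between sender and receiver cannot accumulate beyond a single character, which is what made an unattended printing telegraph practical.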
In 1925, the Morkrum Company and the Kleinschmidt Electric Company merged to form the Morkrum-Kleinschmidt Company. In December 1928, the company changed its name to the less cumbersome "Teletype Corporation". In 1930, the Teletype Corporation was purchased by the American Telephone and Telegraph Company for $30,000,000 in stock and became a subsidiary of the Western Electric Company. While the other principals of the Teletype Corporation retired, Howard Krum stayed on as a consultant.

Morkrum Printing Telegraph – This was the first mechanically successful teleprinter, used in 1908 for the Alton Railroad trials. A "Blue Code Version" was used in 1910 as part of the first commercial teleprinter circuit, which ran on Postal Telegraph Company lines between Boston and New York City. In 1914, a "Green Code Version" was installed on Western Union Telegraph Company lines for the Associated Press and was used to distribute news to competing newspapers in New York City.

Morkrum Model 11 Tape Printer – The Model 11 Typewheel Tape Printer, at about 45 words per minute, was a bit faster than the Morkrum Printing Telegraph Blue and Green Code printers, and was modeled after the European Baudot Telegraph System printer.
The Model 11 was a tape printer which used gummed paper tape that could be pasted onto a telegram blank. This was the first teleprinter to operate from an airplane.

Morkrum Model GPE Perforator – The Morkrum Company Model GPE "Green Code" Perforator was designed about 1913, and a US patent was filed in 1914. This equipment continued to be produced for the next 50 years.

Morkrum Model 12 Typebar Page Printer – This equipment, known as the Model 12 Page Printer and based on an Underwood typewriter mechanism, was the first commercially viable machine. The printer was produced from 1922 to 1925 under the Morkrum Company name, from 1925 to 1929 under the Morkrum-Kleinschmidt name, and from 1929 to 1943 under the Teletype Corporation name. In 1916, Kleinschmidt had filed a patent application for a type-bar page printer. That printer utilized Baudot code but did not use the start-stop synchronization technology that Howard Krum had patented; the type-bar printer was intended for use on multiplex circuits, its printing controlled from a local segment on a receiving distributor of the sunflower type.
In 1919, Kleinschmidt appeared to be concerned chiefly with the development of multiplex transmitters for use with this printer.

10-A Printing Telegraph – The Western Electric Company made a line of printing telegraph equipment prior to acquiring the Teletype Corporation in 1930. The design for this equipment was provided by the Bell Telephone Laboratories.
Thanksgiving Day is a national holiday celebrated on various dates in Canada, the United States, some of the Caribbean islands, and Liberia. It began as a day of giving thanks and sacrifice for the blessing of the harvest and of the preceding year. Similarly named festival holidays occur in Germany and Japan. Thanksgiving is celebrated on the second Monday of October in Canada and on the fourth Thursday of November in the United States, and around the same part of the year in other places. Although Thanksgiving has historical roots in religious and cultural traditions, it has long been celebrated as a secular holiday as well. Prayers of thanks and special thanksgiving ceremonies are common among almost all religions after harvests and at other times. The Thanksgiving holiday's history in North America is rooted in English traditions dating from the Protestant Reformation. It also has aspects of a harvest festival, though the harvest in New England occurs well before the late-November date on which the modern Thanksgiving holiday is celebrated.
In the English tradition, days of thanksgiving and special thanksgiving religious services became important during the English Reformation in the reign of Henry VIII and in reaction to the large number of religious holidays on the Catholic calendar. Before 1536 there were 95 Church holidays, plus 52 Sundays, when people were required to attend church and forego work and sometimes pay for expensive celebrations; the 1536 reforms reduced the number of Church holidays to 27, but some Puritans wished to eliminate all Church holidays, including Christmas and Easter. The holidays were to be replaced by specially called Days of Fasting or Days of Thanksgiving, in response to events that the Puritans viewed as acts of special providence. Unexpected disasters or threats of judgement from on high called for Days of Fasting. Special blessings, viewed as coming from God, called for Days of Thanksgiving. For example, Days of Fasting were called on account of drought in 1611, floods in 1613, plagues in 1604 and 1622.
Days of Thanksgiving were called following the victory over the Spanish Armada in 1588 and following the deliverance of Queen Anne in 1705. An unusual annual Day of Thanksgiving began in 1606, following the failure of the Gunpowder Plot in 1605, and developed into Guy Fawkes Day on November 5. According to some historians, the first celebration of Thanksgiving in North America occurred during the 1578 voyage of Martin Frobisher from England in search of the Northwest Passage. Other researchers state that "there is no compelling narrative of the origins of the Canadian Thanksgiving day." The origins of Canadian Thanksgiving are also sometimes traced to the French settlers who came to New France in the 17th century and celebrated their successful harvests. The French settlers in the area held feasts at the end of the harvest season and continued throughout the winter, sharing food with the indigenous peoples of the area. As settlers arrived in Nova Scotia from New England after 1700, late autumn Thanksgiving celebrations became commonplace.
New immigrants into the country—such as the Irish and Germans—also added their own traditions to the harvest celebrations. Most of the US aspects of Thanksgiving were incorporated when United Empire Loyalists began to flee from the United States during the American Revolution and settled in Canada. Pilgrims and Puritans who emigrated from England in the 1620s and 1630s carried the tradition of Days of Fasting and Days of Thanksgiving with them to New England; the modern Thanksgiving holiday tradition is traced to a well-recorded 1619 event in Virginia and a sparsely documented 1621 celebration at Plymouth in present-day Massachusetts. The 1619 arrival of 38 English settlers at Berkeley Hundred in Charles City County, concluded with a religious celebration as dictated by the group's charter from the London Company, which required "that the day of our ships arrival at the place assigned... in the land of Virginia shall be yearly and perpetually kept holy as a day of thanksgiving to Almighty God."
The 1621 Plymouth feast and thanksgiving was prompted by a good harvest, which the Pilgrims celebrated with Native Americans who had helped them get through the previous winter by giving them food in a time of scarcity. Several days of thanksgiving were held in early New England history that have been identified as the "First Thanksgiving", including Pilgrim holidays in Plymouth in 1621 and 1623 and a Puritan holiday in Boston in 1631. According to historian Jeremy Bangs, director of the Leiden American Pilgrim Museum, the Pilgrims may have been influenced by watching the annual services of thanksgiving for the relief of the siege of Leiden in 1574, while they were staying in Leiden. Now called Oktober Feest, Leiden's autumn thanksgiving celebration in 1617 was the occasion for a sectarian disturbance that appears to have accelerated the Pilgrims' plans to emigrate to America. In Massachusetts, religious thanksgiving services were declared by civil leaders such as Governor Bradford, who planned the colony's thanksgiving celebration and fast in 1623.
The practice of holding an annual harvest festival did not become a regular affair in New England until the late 1660s. Thanksgiving proclamations were made by church leaders in New England up until 1682, and then by both state and church leaders until after the American Revolution. During the revolutionary period, political influences affected the issuance of Thanksgiving proclamations. Various proclamations were made by royal governors, John Hancock, General George Washington, and the Continental Congress, each giving thanks to God for events favorable to their causes. As President of the United States, George Washington proclaimed the first nationwide thanksgiving celebration in 1789.
San Antonio, officially the City of San Antonio, is the seventh-most populous city in the United States and the second-most populous city in both Texas and the Southern United States, with more than 1.5 million residents. Founded as a Spanish mission and colonial outpost in 1718, the city became the first chartered civil settlement in present-day Texas in 1731. At the time, the area was still part of the Spanish Empire; it later became part of the Mexican Republic. Today it is the state's oldest municipality. The city's deep history contrasts with its rapid growth during the past few decades: it was the fastest-growing of the top ten largest cities in the United States from 2000 to 2010, and the second fastest-growing from 1990 to 2000. Straddling the regional divide between South and Central Texas, San Antonio anchors the southwestern corner of an urban megaregion colloquially known as the "Texas Triangle". San Antonio serves as the seat of Bexar County. Since San Antonio was founded during the Spanish colonial era, it has a church at its center with the main civic plaza in front, a characteristic of many Spanish-founded cities and villages in Spain and Latin America.
As with many other urban centers in the Southwestern United States, areas outside the city limits are sparsely populated. San Antonio is the center of the San Antonio–New Braunfels metropolitan statistical area. Commonly called Greater San Antonio, the metro area has a population of 2,473,974 based on the 2017 U.S. Census estimate, making it the 24th-largest metropolitan area in the United States and the third-largest in Texas. Growth along the Interstate 35 and Interstate 10 corridors to the north and east suggests that the metropolitan area will continue to expand. San Antonio was named by a 1691 Spanish expedition for Saint Anthony of Padua, whose feast day is June 13. The city contains five 18th-century Spanish frontier missions, including the Alamo and San Antonio Missions National Historical Park, which together were designated a UNESCO World Heritage Site in 2015. Other notable attractions include the River Walk, the Tower of the Americas, SeaWorld, the Alamo Bowl, and Marriage Island. Commercial entertainment includes the Morgan's Wonderland amusement park.
According to the San Antonio Convention and Visitors Bureau, the city is visited by about 32 million tourists a year. It is home to the five-time NBA champion San Antonio Spurs and hosts the annual San Antonio Stock Show & Rodeo, one of the largest such events in the U.S. The U.S. Armed Forces have numerous facilities around San Antonio: Lackland Air Force Base, Randolph Air Force Base, the Lackland AFB/Kelly Field Annex, Camp Bullis, and Camp Stanley are outside the city limits. Kelly Air Force Base operated out of San Antonio until 2001, when the airfield was transferred to Lackland AFB; the remaining parts of the base were developed as Port San Antonio, an industrial/business park and aerospace complex. San Antonio is home to six Fortune 500 companies and the South Texas Medical Center, the only medical research and care provider in the South Texas region. At the time of European encounter, Payaya Indians lived near the San Antonio River Valley in the San Pedro Springs area; they called the vicinity Yanaguana, meaning "refreshing waters".
In 1691, a group of Spanish explorers and missionaries came upon the river and the Payaya settlement on June 13, the feast day of St. Anthony of Padua, and named the river "San Antonio" in his honor. Father Antonio de Olivares visited the site in 1709 and was determined to found a mission and civilian settlement there. The viceroy gave formal approval for a combined mission and presidio in late 1716, as he wanted to forestall any French expansion into the area from their colony of La Louisiane to the east, as well as to prevent illegal trading with the Payaya. He directed Martín de Alarcón, the governor of Coahuila y Tejas, to establish the mission complex. Differences between Alarcón and Olivares resulted in delays, and construction did not start until 1718. Olivares built, with the help of the Payaya Indians, the Misión de San Antonio de Valero, the Presidio San Antonio de Béxar, the bridge that connected both, and the Acequia Madre de Valero. The families who clustered around the presidio and mission were the start of the Villa de Béjar, destined to become the most important town in Spanish Texas.
On May 1, 1718, the governor transferred ownership of the Mission San Antonio de Valero to Fray Antonio de Olivares. On May 5, 1718, he commissioned the Presidio San Antonio de Béxar on the west side of the San Antonio River, one-fourth league from the mission. On February 14, 1719, the Marquis of San Miguel de Aguayo proposed to the king of Spain that 400 families be transported from the Canary Islands, Galicia, or Havana to populate the province of Texas. His plan was approved, and notice was given to the Canary Islanders to furnish 200 families. By June 1730, 25 families had reached Cuba, and 10 families had been sent to Veracruz before orders from Spain came to stop the resettlement. Under the leadership of Juan Leal Goraz, the group marched overland from Veracruz to the Presidio San Antonio de Béxar, where they arrived on March 9, 1731. Due to marriages along the way, the party now included a total of 56 persons. They joined the military community established in 1718, and the immigrants formed the nucleus of the villa of San Fernando de Béxar.
Transistor–transistor logic (TTL) is a logic family built from bipolar junction transistors. Its name signifies that transistors perform both the logic function and the amplifying function. TTL integrated circuits were used in applications such as computers, industrial controls, test equipment and instrumentation, consumer electronics, and synthesizers. Sometimes TTL-compatible logic levels are not associated directly with TTL integrated circuits; for example, they may be used at the inputs and outputs of electronic instruments. After their introduction in integrated circuit form in 1963 by Sylvania, TTL integrated circuits were manufactured by several semiconductor companies; the 7400 series by Texas Instruments became particularly popular. TTL manufacturers offered a wide range of logic gates, flip-flops, and other circuits. Variations of the original TTL circuit design offered higher speed or lower power dissipation to allow design optimization. TTL devices were originally made in ceramic and plastic dual-in-line packages and in flat-pack form. Some TTL chips are now made in surface-mount packages.
TTL became the foundation of computers and other digital electronics. Even after very-large-scale integration (VLSI) circuits made multiple-circuit-board processors obsolete, TTL devices still found extensive use as the glue logic interfacing between more densely integrated components. TTL was invented in 1961 by James L. Buie of TRW, which declared it "particularly suited to the newly developing integrated circuit design technology." The original name for TTL was transistor-coupled transistor logic. The first commercial integrated-circuit TTL devices were manufactured by Sylvania in 1963, called the Sylvania Universal High-Level Logic family; the Sylvania parts were used in the controls of the Phoenix missile. TTL became popular with electronic systems designers after Texas Instruments introduced the 5400 series of ICs, with military temperature range, in 1964, and the 7400 series, specified over a narrower range and with inexpensive plastic packages, in 1966. The Texas Instruments 7400 family became an industry standard.
Compatible parts were made by Motorola, AMD, Intel, Signetics, Siemens, SGS-Thomson, National Semiconductor, and many other companies, including some in the Eastern Bloc. Not only did others make compatible TTL parts, but compatible parts were made using many other circuit technologies as well. At least one manufacturer, IBM, produced non-compatible TTL circuits for its own use. The term "TTL" is applied to many successive generations of bipolar logic, with gradual improvements in speed and power consumption over about two decades. The most recently introduced family, 74Fxx, is still sold today and was used into the late 1990s. The 74AS/ALS Advanced Schottky family was introduced in 1985. As of 2008, Texas Instruments continues to supply the more general-purpose chips in numerous obsolete technology families, albeit at increased prices. TTL chips integrate no more than a few hundred transistors each. Functions within a single package range from a few logic gates to a microprocessor bit-slice. TTL became important because its low cost made digital techniques economically practical for tasks previously done by analog methods.
The Kenbak-1, an ancestor of the first personal computers, used TTL for its CPU instead of a microprocessor chip, which was not yet available in 1971. The Datapoint 2200 from 1970 used TTL components for its CPU and was the basis for the 8008 and later the x86 instruction set. The 1973 Xerox Alto and 1981 Star workstations, which introduced the graphical user interface, used TTL circuits integrated at the level of arithmetic logic units and bit slices, respectively. Most computers used TTL-compatible "glue logic" between larger chips well into the 1990s. Until the advent of programmable logic, discrete bipolar logic was used to prototype and emulate microarchitectures under development.

TTL inputs are the emitters of bipolar transistors. In the case of NAND inputs, the inputs are the emitters of a multiple-emitter transistor, functionally equivalent to multiple transistors whose bases and collectors are tied together. The output is buffered by a common-emitter amplifier. Consider first the case where all inputs are logical ones: when all the inputs are held at high voltage, the base–emitter junctions of the multiple-emitter transistor are reverse-biased.
Unlike in DTL, a small "collector" current is drawn by each of the inputs, because the multiple-emitter transistor is operating in its reverse-active region. An approximately constant current flows from the positive rail, through the resistor, and into the base of the multiple-emitter transistor; this current passes on through the base–emitter junction of the output transistor, allowing it to conduct and pulling the output voltage low.

Consider now the case of an input at logical zero. Note that the base–collector junction of the multiple-emitter transistor and the base–emitter junction of the output transistor are in series between the bottom of the resistor and ground. If one input voltage becomes zero, the corresponding base–emitter junction of the multiple-emitter transistor is in parallel with these two junctions. A phenomenon called current steering means that when two voltage-stable elements with different threshold voltages are connected in parallel, the current flows through the path with the smaller threshold voltage; that is, current flows out of this input and into the zero-voltage source. As a result, no current flows through the base of the output transistor, which stops conducting, and the output voltage becomes high (logical one).
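The behavior just described reduces to a NAND truth table: the output is low only when every input is high. The following behavioral sketch captures that input–output relation (it is not a circuit simulation, and the voltage figures are typical TTL levels assumed for illustration):

```python
def ttl_nand(*input_volts, v_th=1.4, v_oh=3.4, v_ol=0.2):
    """Behavioral sketch of a TTL NAND gate.

    Inputs above roughly 1.4 V read as logical one. If every input is
    high, base current reaches the output transistor, which saturates
    and pulls the output low (~0.2 V). If any input is low, current is
    steered out of that input instead, the output transistor turns off,
    and the output sits high (~3.4 V).
    """
    if all(v > v_th for v in input_volts):
        return v_ol  # output transistor conducting: logical zero out
    return v_oh      # output transistor off: logical one out

# Truth table for a 2-input gate, using nominal low/high input levels
for a in (0.2, 3.4):
    for b in (0.2, 3.4):
        print(f"A={a:.1f} V  B={b:.1f} V  ->  out={ttl_nand(a, b):.1f} V")
```

Only the last row of the printed table (both inputs high) produces a low output, matching the current-steering description above.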
A floppy disk, also known as a floppy, diskette, or simply disk, is a type of disk storage composed of a disk of thin and flexible magnetic storage medium, sealed in a rectangular plastic enclosure lined with fabric that removes dust particles. Floppy disks are read and written by a floppy disk drive. Floppy disks, initially as 8-inch media and later in 5 1⁄4-inch and 3 1⁄2-inch sizes, were a ubiquitous form of data storage and exchange from the mid-1970s into the first years of the 21st century. By 2006 computers were rarely manufactured with installed floppy disk drives; these formats are now handled mostly by older equipment. The prevalence of floppy disks in late-twentieth-century culture was such that many electronic devices and software programs still use an image of the floppy disk as the save icon. While floppy disk drives still have some limited uses with legacy industrial computer equipment, they have been superseded by data storage methods with much greater capacity, such as USB flash drives, flash storage cards, portable external hard disk drives, optical discs, cloud storage, and storage available through computer networks.
The first commercial floppy disks, developed in the late 1960s, were 8 inches in diameter. These disks and the associated drives were produced and improved upon by IBM and other companies such as Memorex, Shugart Associates, and Burroughs Corporation. The term "floppy disk" appeared in print as early as 1970, and although IBM announced its first media as the "Type 1 Diskette" in 1973, the industry continued to use the terms "floppy disk" or "floppy". In 1976, Shugart Associates introduced the 5 1⁄4-inch FDD; by 1978 there were more than 10 manufacturers producing such drives. There were competing floppy disk formats, with hard- and soft-sectored versions and encoding schemes such as FM, MFM, M2FM, and GCR. The 5 1⁄4-inch format displaced the 8-inch one for most applications, and the hard-sectored disk format disappeared. The most common capacity of the 5 1⁄4-inch format in DOS-based PCs was 360 KB, for the DSDD format using MFM encoding. In 1984 IBM introduced, with its PC/AT model, the 1.2 MB dual-sided 5 1⁄4-inch floppy disk, but it never became very popular.
IBM started using the 720 KB double-density 3 1⁄2-inch microfloppy disk on its Convertible laptop computer in 1986 and the 1.44 MB high-density version with the PS/2 line in 1987. These disk drives could also be added to older PC models. In 1988 IBM introduced a drive for 2.88 MB "DSED" diskettes in its top-of-the-line PS/2 models, but this was a commercial failure. Throughout the early 1980s, limitations of the 5 1⁄4-inch format had become clear. Designed to be more practical than the 8-inch format, it was itself too large. A number of solutions were developed, with drives at 2-, 2 1⁄2-, 3-, 3 1⁄4-, 3 1⁄2-, and 4-inch sizes offered by various companies. They all shared a number of advantages over the old format, including a rigid case with a sliding metal shutter over the head slot, which helped protect the delicate magnetic medium from dust and damage, and a sliding write-protection tab, far more convenient than the adhesive tabs used with earlier disks. The large market share of the well-established 5 1⁄4-inch format made it difficult for these diverse, mutually incompatible new formats to gain significant market share.
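The capacities quoted above follow directly from the disk geometry: sides × tracks per side × sectors per track × bytes per sector. A quick sketch using the standard IBM-compatible format parameters:

```python
def formatted_capacity_bytes(sides, tracks, sectors_per_track, bytes_per_sector=512):
    """Formatted capacity of a floppy disk from its geometry."""
    return sides * tracks * sectors_per_track * bytes_per_sector

# Standard PC floppy geometries (sides, tracks per side, sectors per track)
formats = {
    "5.25-inch DSDD (360 KB)": (2, 40, 9),
    "5.25-inch HD (1.2 MB)":   (2, 80, 15),
    "3.5-inch DD (720 KB)":    (2, 80, 9),
    "3.5-inch HD (1.44 MB)":   (2, 80, 18),
}
for name, (sides, tracks, spt) in formats.items():
    b = formatted_capacity_bytes(sides, tracks, spt)
    print(f"{name}: {b} bytes = {b // 1024} KiB")
```

Note the unit quirk in the "1.44 MB" label: the disk holds 1,474,560 bytes, i.e. 1,440 KiB, so the figure mixes decimal "1.44" with 1024-byte kilobytes rather than being 1.44 million bytes.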
A variant on the Sony design, introduced in 1982 by a large number of manufacturers, was then rapidly adopted. The term "floppy disk" persisted even though later-style floppy disks have a rigid case around an internal floppy medium. By the end of the 1980s, 5 1⁄4-inch disks had been superseded by 3 1⁄2-inch disks. During this transition, PCs commonly came equipped with drives of both sizes. By the mid-1990s, 5 1⁄4-inch drives had disappeared, as the 3 1⁄2-inch disk became the predominant floppy disk. The advantages of the 3 1⁄2-inch disk were its higher capacity, its smaller size, and its rigid case, which provided better protection from dirt and other environmental risks. If a person touches the exposed disk surface of a 5 1⁄4-inch disk through the drive hole, fingerprints may foul the disk, and the disk drive head if the disk is subsequently loaded into a drive; it is also easily possible to damage a disk of this type by folding or creasing it, rendering it at least partly unreadable. However, due to its simpler construction, the unit price of the 5 1⁄4-inch disk was lower throughout its history, in the range of a third to a half that of a 3 1⁄2-inch disk.
Floppy disks became commonplace during the 1980s and 1990s in their use with personal computers to distribute software, transfer data, and create backups. Before hard disks became affordable to the general population, floppy disks were often used to store a computer's operating system. Most home computers from that period have an elementary OS and BASIC stored in ROM, with the option of loading a more advanced operating system from a floppy disk. By the early 1990s, increasing software size meant that large packages like Windows or Adobe Photoshop required a dozen disks or more. In 1996, there were an estimated five billion standard floppy disks in use. Distribution of larger packages was gradually replaced by CD-ROMs, DVDs, and online distribution.
In computing, time-sharing is the sharing of a computing resource among many users by means of multiprogramming and multi-tasking at the same time. Its introduction in the 1960s and emergence as the prominent model of computing in the 1970s represented a major technological shift in the history of computing. By allowing a large number of users to interact concurrently with a single computer, time-sharing lowered the cost of providing computing capability, made it possible for individuals and organizations to use a computer without owning one, and promoted the interactive use of computers and the development of new interactive applications. The earliest computers were expensive devices, and slow in comparison to later models. Machines were dedicated to a particular set of tasks and operated by control panels, the operator manually entering small programs via switches in order to load and run a series of programs. These programs might take hours, or even weeks, to run. As computers grew in speed, run times dropped, and soon the time taken to start up the next program became a concern.
Batch processing methodologies evolved to decrease these "dead periods" by queuing up programs so that as soon as one program completed, the next would start. To support a batch processing operation, a number of comparatively inexpensive card punch or paper tape writers were used by programmers to write their programs "offline"; when typing was complete, the programs were submitted to the operations team, which scheduled them to be run. Important programs were started quickly; when the program run was completed, the output was returned to the programmer. The complete process might take days; the alternative of allowing the user to operate the computer directly was far too expensive to consider. This situation limited interactive development to those organizations that could afford to waste computing cycles: large universities for the most part. Programmers at the universities decried the behaviors that batch processing imposed, to the point that Stanford students made a short film humorously critiquing it.
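The batch workflow described above amounts to a first-in, first-out job queue in which important programs can jump to the front. A minimal sketch of that discipline (job names and run times are invented for illustration):

```python
from collections import deque

# Each job is (name, run_minutes); names and durations are invented.
queue = deque([("payroll", 90), ("thesis-sim", 240)])

def submit(name, minutes, important=False):
    """Important programs were scheduled to start quickly."""
    if important:
        queue.appendleft((name, minutes))
    else:
        queue.append((name, minutes))

def run_batch():
    """As soon as one program completes, the next starts."""
    clock, finished = 0, []
    while queue:
        name, minutes = queue.popleft()
        clock += minutes
        finished.append((name, clock))   # (job, completion time)
    return finished

submit("inventory", 30, important=True)
completions = run_batch()
print(completions)  # the important job jumps the queue and finishes first
```

Note that even the front-of-queue job only finishes after its full run; the programmer still waits for the whole run-and-return cycle, which is the turnaround problem time-sharing set out to fix.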
They experimented with new ways to interact directly with the computer, a field today known as human–computer interaction. Time-sharing was developed out of the realization that while any single user would make inefficient use of a computer, a large group of users together would not. This was due to the pattern of interaction: typically an individual user entered bursts of information followed by long pauses, but a group of users working at the same time would mean that the pauses of one user would be filled by the activity of the others. Given an optimal group size, the overall process could be efficient. Small slices of time spent waiting for disk, tape, or network input could be granted to other users. The concept is claimed to have been first described by John Backus in the 1954 summer session at MIT, and by Bob Bemer in his 1957 article "How to consider a computer" in Automatic Control Magazine. In a paper published in December 1958 by W. F. Bauer, he wrote that "The computers would handle a number of problems concurrently.
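The efficiency argument can be made concrete with a toy probability model: if each user independently demands the CPU only a small fraction p of the time, the chance the machine has work to do grows quickly with the number of users. The value p = 0.05 and the group sizes below are illustrative assumptions, not figures from the text:

```python
# Toy model of the time-sharing insight: one interactive user keeps
# the CPU busy only rarely, but with enough concurrent users the
# machine is almost never idle. Independence of users is assumed.

def busy_fraction(users: int, p: float) -> float:
    """Probability that at least one of `users` independent users
    is demanding the CPU, each active a fraction `p` of the time."""
    return 1 - (1 - p) ** users

for n in (1, 5, 20):
    print(n, round(busy_fraction(n, 0.05), 3))  # 0.05 -> 0.226 -> 0.642
```

Under these assumptions a single user leaves the machine idle 95% of the time, while twenty users together keep it busy roughly two-thirds of the time, which is the "optimal group size" effect the text describes.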
Organizations would have input-output equipment installed on their own premises and would buy time on the computer much the same way that the average household buys power and water from utility companies." Implementing a system able to take advantage of this was initially difficult. Batch processing was a methodological development on top of the earliest systems: since computers still ran single programs for single users at any time, the primary change with batch processing was the reduced time delay between one program and the next. Developing a system that supported multiple users at the same time was a completely different concept; the "state" of each user and their programs would have to be kept in the machine, and then switched between quickly. This would take up computer cycles, and on the slow machines of the era this was a concern. However, as computers improved in speed and in the size of the core memory in which users' states were retained, the overhead of time-sharing continually decreased, relatively speaking. The first project to implement time-sharing of user programs was initiated by John McCarthy at MIT in 1959, initially planned on a modified IBM 704, and later on an additionally modified IBM 709.
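The per-user "state" keeping and quick switching described here is, in essence, round-robin scheduling: each user's work is saved, requeued, and resumed after everyone else has had a turn. A minimal sketch (user names, work units, and the quantum are invented for illustration):

```python
from collections import deque

# Each entry is (user, remaining_work_units). The scheduler "restores"
# a user's state by dequeuing it, runs for at most one quantum, then
# "saves" the state by requeuing whatever work remains.
QUANTUM = 2

def round_robin(jobs):
    ready = deque(jobs)
    schedule = []                 # order in which users get the CPU
    while ready:
        user, remaining = ready.popleft()     # restore user state
        schedule.append(user)
        remaining -= min(QUANTUM, remaining)  # run one time slice
        if remaining:
            ready.append((user, remaining))   # save state, requeue
    return schedule

order = round_robin([("alice", 3), ("bob", 5)])
print(order)
```

Each switch between queue entries stands in for the state save/restore overhead the text mentions; the faster and roomier the machine, the cheaper each of these switches becomes relative to useful work.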
One of the deliverables of the project, known as the Compatible Time-Sharing System or CTSS, was demonstrated in November 1961. CTSS has a good claim to be the first time-sharing system and remained in use until 1973. Another contender for the first demonstrated time-sharing system was PLATO II, created by Donald Bitzer and shown at a public demonstration at Robert Allerton Park near the University of Illinois in early 1961, but this was a special-purpose system. Bitzer has long said that the PLATO project would have gotten the patent on time-sharing if only the University of Illinois had not lost the patent for two years. JOSS began time-sharing service in January 1964. The first commercially successful time-sharing system was the Dartmouth Time Sharing System. Throughout the late 1960s and the 1970s, computer terminals were multiplexed onto large institutional mainframe computers, which in many implementations sequentially polled the terminals to see whether any additional data was available or action was requested by the computer user.
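Sequential polling, as described here, amounts to the mainframe visiting each attached terminal in a fixed order and servicing any that have pending input. The Terminal class and its input buffer below are hypothetical, invented purely to illustrate the loop:

```python
# Toy model of a mainframe sequentially polling its terminals.
class Terminal:
    def __init__(self, name):
        self.name = name
        self.buffer = []        # input typed since the last poll

    def type(self, text):
        self.buffer.append(text)

def poll_cycle(terminals):
    """Visit every terminal in fixed order; service any with input."""
    serviced = []
    for t in terminals:
        if t.buffer:                          # data available?
            serviced.append((t.name, t.buffer.pop(0)))
    return serviced

tty1, tty2, tty3 = Terminal("tty1"), Terminal("tty2"), Terminal("tty3")
tty2.type("LIST")
serviced = poll_cycle([tty1, tty2, tty3])
print(serviced)   # only tty2 had input this cycle
```

The cost of this scheme is that every idle terminal is still visited on every cycle, which is why the interrupt-driven interconnections mentioned next were an improvement.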
Later technology in interconnections was interrupt driven, and some of these used parallel data trans