A punched card or punch card is a piece of stiff paper that can be used to contain digital data represented by the presence or absence of holes in predefined positions. The data can be used in data processing applications or, as in earlier examples, to directly control automated machinery. Punched cards were used through much of the 20th century in the data processing industry, where specialized and complex unit record machines, organized into semiautomatic data processing systems, used punched cards for data input and storage. Many early digital computers used punched cards, prepared using keypunch machines, as the primary medium for input of both computer programs and data. While punched cards are now obsolete as a storage medium, as of 2012 some voting machines still used punched cards to record votes. Basile Bouchon developed the control of a loom by punched holes in paper tape in 1725; the design was improved by his assistant Jean-Baptiste Falcon and by Jacques Vaucanson. Although these improvements controlled the patterns woven, they still required an assistant to operate the mechanism.
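The core idea, data encoded as holes at predefined positions, can be sketched in a few lines. The following Python model is illustrative only: it assumes a standard 80-column, 12-row card and encodes just the digits 0 through 9, each as a single punch in the row of the same number, in the style of Hollerith numeric coding. Real character codes (letters, symbols) used combinations of punches and are not modeled here.

```python
# Toy model of an 80-column punched card: each column is a set of
# punched row positions. Only digits are encoded, one punch per column.
COLUMNS = 80

def encode_digits(text):
    """Return a card: a list of 80 columns, each a set of punched rows.
    Digit d is punched in row d; non-digits leave the column blank."""
    card = [set() for _ in range(COLUMNS)]
    for col, ch in enumerate(text[:COLUMNS]):
        if ch.isdigit():
            card[col].add(int(ch))
    return card

def decode_digits(card):
    """Read back the digits from columns holding exactly one punch."""
    out = []
    for punches in card:
        if len(punches) == 1:
            out.append(str(next(iter(punches))))
    return "".join(out)

card = encode_digits("1890")
print(decode_digits(card))  # 1890
```

The same presence/absence principle, with richer punch combinations per column, underlies the card codes used by the unit record machines described below.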
In 1804 Joseph Marie Jacquard demonstrated a mechanism to automate loom operation. A number of punched cards were linked into a chain of any length; each card held the instructions for selecting the shuttle for a single pass. It is considered an important step in the history of computing hardware. Semyon Korsakov was reputedly the first to propose punched cards in informatics, for information storage and search; Korsakov announced his new method and machines in September 1832. Charles Babbage proposed the use of "Number Cards", "pierced with certain holes and stand opposite levers connected with a set of figure wheels... advanced they push in those levers opposite to which there are no holes on the cards and thus transfer that number together with its sign", in his description of the Calculating Engine's Store. In 1881 Jules Carpentier developed a method of recording and playing back performances on a harmonium using punched cards. The system, called the Mélographe Répétiteur, "writes down ordinary music played on the keyboard in the language of Jacquard", as holes punched in a series of cards.
By 1887 Carpentier had separated the mechanism into the Melograph, which recorded the player's key presses, and the Melotrope, which played the music. At the end of the 1800s Herman Hollerith invented the recording of data on a medium that could be read by a machine. "After some initial trials with paper tape, he settled on punched cards...", developing punched card data processing technology for the 1890 US census. His tabulating machines read and summarized data stored on punched cards, and they began to be used for government and commercial data processing. These electromechanical machines only counted holes, but by the 1920s they had units for carrying out basic arithmetic operations. Hollerith founded the Tabulating Machine Company, one of four companies that were amalgamated to form a fifth company, the Computing-Tabulating-Recording Company, later renamed the International Business Machines Corporation (IBM). Other companies entering the punched card business included The Tabulator Limited, Deutsche Hollerith-Maschinen Gesellschaft mbH, Powers Accounting Machine Company, Remington Rand, and H. W. Egli Bull. These companies, and others, manufactured and marketed a variety of punched cards and unit record machines for creating and tabulating punched cards, even after the development of electronic computers in the 1950s. Both IBM and Remington Rand tied punched card purchases to machine leases, a violation of the 1914 Clayton Antitrust Act. In 1932, the US government took both to court on this issue. Remington Rand settled quickly, but IBM, which viewed its business as providing a service, fought all the way to the Supreme Court and lost in 1936. IBM had 32 presses at work in Endicott, N. Y. printing and stacking five to 10 million punched cards every day. Punched cards were used as legal documents, such as U. S. Government checks and savings bonds. During WW II punched card equipment was used by the Allies in some of their efforts to decrypt Axis communications. See, for example, Central Bureau in Australia. At Bletchley Park in England, 2,000,000 punched cards were used each week for storing decrypted German messages.
Punched card technology developed into a powerful tool for business data processing. By 1950 punched cards had become ubiquitous in government. "Do not fold, spindle or mutilate," a generalized version of the warning that appeared on some punched cards, became a motto for the post-World War II era. In 1955 IBM signed a consent decree requiring, amongst other things, that IBM would by 1962 have no more than one-half of the punched card manufacturing capacity in the United States. Tom Watson Jr.'s decision to sign this decree, at a time when IBM saw the punched card provisions as its most significant point, completed the transfer of power to him from Thomas Watson, Sr. The UNITYPER introduced magnetic tape for data entry in the 1950s. During the 1960s, the punched card was replaced as the primary means of data storage by magnetic tape, as better, more capable computers became available. Mohawk Data Sciences introduced a magnetic tape encoder in 1965, a system marketed as a keypunch replacement that was somewhat successful.
Punched cards were still commonly used for both data entry and computer programming into the 1980s.
National Security Agency
The National Security Agency (NSA) is a national-level intelligence agency of the United States Department of Defense, under the authority of the Director of National Intelligence. The NSA is responsible for global monitoring and processing of information and data for foreign and domestic intelligence and counterintelligence purposes, specializing in a discipline known as signals intelligence (SIGINT); the NSA is also tasked with the protection of U. S. communications networks and information systems. The NSA relies on a variety of measures to accomplish its mission, the majority of which are clandestine. Originating as a unit to decipher coded communications in World War II, it was formed as the NSA by President Harry S. Truman in 1952. Since then, it has become the largest of the U. S. intelligence organizations in terms of personnel and budget. The NSA conducts worldwide mass data collection and has been known to physically bug electronic systems as one method to this end; the NSA has also been alleged to have been behind such attack software as Stuxnet, which damaged Iran's nuclear program.
The NSA, alongside the Central Intelligence Agency (CIA), maintains a physical presence in many countries across the globe through the joint Special Collection Service (SCS), whose collection tactics encompass "close surveillance, wiretapping and entering". Unlike the CIA and the Defense Intelligence Agency, both of which specialize in foreign human espionage, the NSA does not publicly conduct human-source intelligence gathering. The NSA is entrusted with providing assistance to, and the coordination of, SIGINT elements for other government organizations, which are prevented by law from engaging in such activities on their own. As part of these responsibilities, the agency has a co-located organization called the Central Security Service, which facilitates cooperation between the NSA and other U. S. defense cryptanalysis components. To further ensure streamlined communication between the signals intelligence community divisions, the NSA Director serves as the Commander of the United States Cyber Command and as Chief of the Central Security Service. The NSA's actions have been a matter of political controversy on several occasions, including its spying on anti-Vietnam-war leaders and the agency's participation in economic espionage.
In 2013, the NSA had many of its secret surveillance programs revealed to the public by Edward Snowden, a former NSA contractor. According to the leaked documents, the NSA intercepts and stores the communications of over a billion people worldwide, including United States citizens; the documents also revealed that the NSA tracks hundreds of millions of people's movements using cellphone metadata. Internationally, research has pointed to the NSA's ability to surveil the domestic Internet traffic of foreign countries through "boomerang routing".
The origins of the National Security Agency can be traced back to April 28, 1917, three weeks after the U. S. Congress declared war on Germany in World War I. A code and cipher decryption unit was established as the Cable and Telegraph Section, known as the Cipher Bureau. It was headquartered in Washington, D. C. and was part of the war effort under the executive branch without direct Congressional authorization. During the course of the war it was relocated in the army's organizational chart several times.
On July 5, 1917, Herbert O. Yardley was assigned to head the unit. At that point, the unit consisted of two civilian clerks; it absorbed the Navy's cryptanalysis functions in July 1918. After World War I ended on November 11, 1918, the army cryptographic section of Military Intelligence moved to New York City on May 20, 1919, where it continued intelligence activities as the Code Compilation Company under the direction of Yardley. After the disbandment of the U. S. Army cryptographic section of military intelligence, known as MI-8, in 1919, the U. S. government created the Cipher Bureau, known as the Black Chamber. The Black Chamber was the United States' first peacetime cryptanalytic organization. Jointly funded by the Army and the State Department, the Cipher Bureau was disguised as a New York City commercial code company; its true mission was to break the communications of other nations. Its most notable known success was at the Washington Naval Conference, during which it aided American negotiators by providing them with the decrypted traffic of many of the conference delegations, most notably the Japanese.
The Black Chamber persuaded Western Union, the largest U. S. telegram company at the time, as well as several other communications companies, to illegally give the Black Chamber access to the cable traffic of foreign embassies and consulates. Soon, these companies publicly discontinued their collaboration. Despite the Chamber's initial successes, it was shut down in 1929 by U. S. Secretary of State Henry L. Stimson, who defended his decision by stating, "Gentlemen do not read each other's mail". During World War II, the Signal Intelligence Service (SIS) was created to intercept and decipher the communications of the Axis powers. When the war ended, the SIS was reorganized as the Army Security Agency and placed under the leadership of the Director of Military Intelligence. On May 20, 1949, all cryptologic activities were centralized under a national organization called the Armed Forces Security Agency. This organization was established within the U. S. Department of Defense under the command of the Joint Chiefs of Staff.
Princeton, New Jersey
Princeton is a municipality with a borough form of government in Mercer County, New Jersey, United States, established in its current form on January 1, 2013, through the consolidation of the Borough of Princeton and Princeton Township. As of the 2010 United States Census, the municipality's population was 28,572, reflecting the former township's population of 16,265, along with the 12,307 in the former borough. Princeton was founded before the American Revolution. It is the home of Princeton University, which bears its name and moved to the community in 1756 from its previous location in Newark. Although its association with the university is what makes Princeton a college town, other important institutions in the area include the Institute for Advanced Study, Westminster Choir College, Princeton Plasma Physics Laboratory, Princeton Theological Seminary, Opinion Research Corporation, Bristol-Myers Squibb, Siemens Corporate Research, SRI International, FMC Corporation, The Robert Wood Johnson Foundation, Amrep, Church & Dwight, Berlitz International, and Dow Jones & Company.
Princeton is roughly equidistant from New York City and Philadelphia. It is close to many major highways that serve both cities and receives major television and radio broadcasts from each; it is also close to Trenton, New Jersey's capital city, and to Edison. The New Jersey governor's official residence has been in Princeton since 1945, when Morven, in what was then Princeton Borough, became the first Governor's mansion. It was later replaced by the larger Drumthwacket, a colonial mansion located in the former Township; Morven became a museum property of the New Jersey Historical Society. Princeton was ranked 15th of the top 100 towns in the United States to Live and Work In by Money Magazine in 2005. Throughout much of its history, the community was composed of two separate municipalities: a township and a borough; the central borough was surrounded by the township. The borough seceded from the township in 1894 in a dispute over school taxes. Princeton Borough contained Nassau Street, the main commercial street, and most of the University campus, and incorporated most of the urban area until postwar suburbanization.
The borough and township had equal populations. The Lenni Lenape Native Americans were the earliest identifiable inhabitants of the Princeton area. Europeans founded their settlement in the late 17th century; the first European to make his home within the boundaries of the future town was Henry Greenland, who built his house in 1683 along with a tavern. In this tavern, representatives of West Jersey and East Jersey met to set boundaries for the location of the township. Princeton was then known only as part of nearby Stony Brook. Nathaniel Fitz Randolph, a native of the town, attested in his private journal on December 28, 1758, that Princeton was named in 1724 upon the construction of the first house in the area by James Leonard, who first referred to the town as Princetown when describing the location of his large estate in his diary. The town bore a variety of names subsequently, including Princetown, Prince's Town and Princeton. Although there is no official documentary backing, the town is considered to be named after King William III, Prince William of Orange of the House of Nassau.
Another theory suggests that the name came from a large land-owner named Henry Prince, but no evidence backs this contention. A royal prince seems a more likely eponym for the settlement, as nearby towns bore similar names, such as Kingston and Princessville. When Richard Stockton, one of the founders of the township, died in 1709 he left his estate to his sons, who helped to expand the property and the population. Based on the 1880 United States Census, the population of the town comprised 3,209 persons, and the local population expanded from the nineteenth century onward. According to the 2010 Census, Princeton Borough had 12,307 inhabitants, while Princeton Township had 16,265; the numbers have since become largely stagnant. Aside from housing the university of the same name, the settlement was the site of the Battle of Princeton in 1777, when George Washington forced the British to evacuate southern New Jersey. After the victory, the town hosted the first Legislature under the State Constitution, to decide the State's seal and the organization of its government.
In addition, two of the original signers of the Declaration of Independence, Richard Stockton and John Witherspoon, lived in Princeton; Princetonians honored their legacy by naming two streets in the downtown area after them. On January 10, 1938 Henry Ewing Hale called for a group of citizens to discuss opening a "Historical Society of Princeton," and the Bainbridge House would be dedicated for this purpose. The house was used once for a meeting of the Continental Congress in 1783, and later served as a general office and as the Princeton Public Library. The House is owned by Princeton University and is leased to the Princeton Historical Society for one dollar per year. The house has kept its original staircase and paneled walls, and around 70% of the house is unaltered; aside from safety features such as wheelchair access and electrical work, the house has been restored to its original look. During the most stirring events in its history, Princeton was a wide spot in the road.
The Williams tube, or the Williams–Kilburn tube after inventors Freddie Williams and Tom Kilburn, is an early form of computer memory. It was the first random-access digital storage device and was used in several early computers. The Williams tube works by displaying a grid of dots on a cathode ray tube (CRT). Due to the way CRTs work, this creates a small charge of static electricity over each dot; the charge at the location of each of the dots is read by a thin metal sheet just in front of the display. Since the display faded over time, it was periodically refreshed. The Williams tube cycles faster than earlier acoustic delay line memory, at the speed of the electrons inside the vacuum tube, rather than at the speed of sound. However, the system was adversely affected by any nearby electrical fields and required constant alignment to keep it operational. Williams–Kilburn tubes were used on high-speed computer designs. Williams and Kilburn applied for British patents on 11 December 1946 and 2 October 1947, followed by United States patent applications on 10 December 1947 and 16 May 1949.
The Williams tube depends on an effect called secondary emission. When the electron beam strikes the phosphor that forms the display surface, it causes the phosphor to light up, and it also knocks some electrons out of the surface; these secondary electrons travel a short distance before being attracted back to the CRT surface and falling on it a short distance away. The overall effect is to cause a slight positive charge in the immediate region of the beam, where there is a deficit of electrons, and a slight negative charge around the dot, where those electrons land; the resulting charge well remains on the surface of the tube for a fraction of a second while the electrons flow back to their original locations. The lifetime depends on the size of the well. The process of creating the charge well is used as the write operation in a computer memory, storing a single binary digit, or bit. A collection of dots or spaces, often one horizontal row on the display, represents a computer word. There is a relationship between the size and spacing of the dots and their lifetime, as well as the ability to reject crosstalk with adjacent dots.
This places an upper limit on the memory density; each Williams tube could store about 1024 to 2560 bits of data. Because the electron beam is inertia-free and can be moved anywhere on the display, the computer can access any location, making it a random access memory. The computer would load the address as an X and Y pair into the driver circuitry and trigger a time base generator that would sweep the selected locations, reading from or writing to the internal registers implemented as flip-flops. Reading the memory took place via a secondary effect caused by the writing operation. During the short period when the write takes place, the redistribution of charges in the phosphor creates an electrical current that induces voltage in any nearby conductors; this is read by placing a thin metal sheet just in front of the display side of the CRT. During a read operation, the beam writes to the selected bit locations on the display; those locations that were previously written to are already depleted of electrons, so no current flows and no voltage appears on the plate.
This allows the computer to determine the value at that location: if no pulse appears on the plate, the location had been written to, while if the location had not been written to, the write process creates a well and a pulse is read on the plate, indicating a "0". Reading a memory location creates a charge well whether or not one was there before, destroying the original contents of that location, so any read has to be followed by a write to reinstate the original data. In some systems this was accomplished using a second electron gun inside the CRT that could write to one location while the other was reading the next. Since the display would fade over time, the entire display had to be periodically refreshed using the same basic method. However, as the data is read and immediately rewritten, this operation can be carried out by external circuitry while the central processing unit is busy carrying out other operations; this refresh operation is similar to the memory refresh cycles of DRAM in modern systems. Since the refresh process caused the same pattern to continually reappear on the display, there was a need to be able to erase written values.
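The read-rewrite-refresh discipline described above can be sketched as a toy simulation. This Python model is an illustrative abstraction, not a circuit description: it ignores the actual charge polarity and the pickup-plate pulse, and simply treats a "read" as an operation that disturbs the stored bit and must be followed by a rewrite, with a periodic refresh sweep over all locations, much like DRAM refresh.

```python
class WilliamsTubeModel:
    """Toy model of Williams-tube storage: raw reads are destructive,
    so every read is followed by a rewrite, and all locations are
    periodically refreshed before their charge decays."""

    def __init__(self, n_bits=1024):
        self.wells = [False] * n_bits   # True models a stored "1"

    def write(self, addr, bit):
        self.wells[addr] = bool(bit)

    def _destructive_read(self, addr):
        bit = self.wells[addr]
        self.wells[addr] = True         # the read beam leaves a well behind
        return bit

    def read(self, addr):
        bit = self._destructive_read(addr)
        self.write(addr, bit)           # reinstate the original data
        return bit

    def refresh(self):
        # Sweep every location: read-and-rewrite, as external refresh
        # circuitry did while the CPU carried out other operations.
        for addr in range(len(self.wells)):
            self.read(addr)

mem = WilliamsTubeModel(16)
mem.write(3, True)
mem.refresh()
print(mem.read(3), mem.read(4))  # True False
```

The key design point the model captures is that reads at the API level appear non-destructive only because the controller hides the mandatory rewrite, exactly the trick that DRAM controllers still use.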
This was accomplished by writing to the display just beside the original location; the electrons released by this new write would fall into the written well, filling it back in. The original systems produced this effect by writing a small dash, which was easy to accomplish without changing the master timers, by producing the write current for a longer period; the resulting pattern was a series of dashes. There was a considerable amount of research on more effective erasing systems, with some systems using out-of-focus beams or complex patterns. Some Williams tubes were made from radar-type cathode ray tubes with a phosphor coating that made the data visible, while other tubes were purpose-built without such a coating. The presence or absence of this coating had no effect on the operation of the tube, and was of no importance to the operators, since the face of the tube was covered by the pickup plate. If a visible output was needed, a second tube connected in parallel with the storage tube, with a phosphor coating but without a pickup plate, was used as a display device.
Developed at the University of Manchester in England, the Williams tube provided the medium on which the first electronically stored program was implemented, in the Manchester Baby computer, which first successfully ran a program in June 1948.
Fortran is a general-purpose, compiled imperative programming language, especially suited to numeric computation and scientific computing. Developed by IBM in the 1950s for scientific and engineering applications, FORTRAN came to dominate this area of programming early on and has been in continuous use for over half a century in computationally intensive areas such as numerical weather prediction, finite element analysis, computational fluid dynamics, computational physics and computational chemistry. It remains a popular language for high-performance computing and is used for programs that benchmark and rank the world's fastest supercomputers. Fortran encompasses a lineage of versions, each of which evolved to add extensions to the language while retaining compatibility with prior versions. Successive versions have added support for structured programming and processing of character-based data, array programming, modular programming and generic programming, high-performance Fortran, object-oriented programming and concurrent programming.
Fortran's design was the basis for many other programming languages. Among the better known is BASIC, based on FORTRAN II with a number of syntax cleanups, notably better logical structures, and other changes to work more easily in an interactive environment. The names of earlier versions of the language through FORTRAN 77 were conventionally spelled in all capitals; the capitalization has been dropped in referring to newer versions beginning with Fortran 90, and the official language standards now refer to the language as "Fortran" rather than all-caps "FORTRAN". In late 1953, John W. Backus submitted a proposal to his superiors at IBM to develop a more practical alternative to assembly language for programming their IBM 704 mainframe computer. Backus' historic FORTRAN team consisted of programmers Richard Goldberg, Sheldon F. Best, Harlan Herrick, Peter Sheridan, Roy Nutt, Robert Nelson, Irving Ziller, Lois Haibt and David Sayre. Its concepts included easier entry of equations into a computer, an idea developed by J. Halcombe Laning and demonstrated in the Laning and Zierler system of 1952.
A draft specification for The IBM Mathematical Formula Translating System was completed by November 1954. The first manual for FORTRAN appeared in October 1956, with the first FORTRAN compiler delivered in April 1957. This was the first optimizing compiler; optimization was essential because customers were reluctant to use a high-level programming language unless its compiler could generate code with performance comparable to that of hand-coded assembly language. While the community was skeptical that this new method could outperform hand-coding, it reduced the number of programming statements necessary to operate a machine by a factor of 20, and it gained acceptance. John Backus said during a 1979 interview with Think, the IBM employee magazine, "Much of my work has come from being lazy. I didn't like writing programs, so, when I was working on the IBM 701, writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs." The language was adopted by scientists for writing numerically intensive programs, which encouraged compiler writers to produce compilers that could generate faster and more efficient code.
The inclusion of a complex number data type in the language made Fortran well suited to technical applications such as electrical engineering. By 1960, versions of FORTRAN were available for the IBM 709, 650, 1620 and 7090 computers. The increasing popularity of FORTRAN spurred competing computer manufacturers to provide FORTRAN compilers for their machines, so that by 1963 over 40 FORTRAN compilers existed. For these reasons, FORTRAN is considered to be the first widely used programming language supported across a variety of computer architectures. The development of Fortran paralleled the early evolution of compiler technology, and many advances in the theory and design of compilers were motivated by the need to generate efficient code for Fortran programs. The initial release of FORTRAN for the IBM 704 contained 32 statements, including: DIMENSION and EQUIVALENCE statements; assignment statements; the three-way arithmetic IF statement, which passed control to one of three locations in the program depending on whether the result of the arithmetic expression was negative, zero, or positive; and IF statements for checking exceptions.
The arithmetic IF statement was reminiscent of a three-way comparison instruction available on the 704. The statement provided the only way to compare numbers – by testing their difference, with an attendant risk of overflow; this deficiency was later overcome by the "logical" facilities introduced in FORTRAN IV. The FREQUENCY statement was used to give branch probabilities for the three branch cases of the arithmetic IF statement; the first FORTRAN compiler used this weighting to perform, at compile time, a Monte Carlo simulation of the generated code, the results of which were used to optimize the placement of basic blocks in memory.
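The control flow of the arithmetic IF can be sketched in modern terms. The following Python function is an illustrative analogue, not Fortran itself: it dispatches on the sign of an expression much as `IF (expr) 10, 20, 30` jumped to one of three statement labels, and shows why comparing two numbers meant testing their difference.

```python
def arithmetic_if(expr, on_negative, on_zero, on_positive):
    """Emulate FORTRAN's three-way arithmetic IF: dispatch to one of
    three 'labels' depending on whether expr is <0, ==0, or >0."""
    if expr < 0:
        return on_negative()
    if expr == 0:
        return on_zero()
    return on_positive()

# Early FORTRAN could only compare a and b by testing a - b,
# with the attendant risk of overflow on fixed-width hardware:
a, b = 3, 7
outcome = arithmetic_if(a - b,
                        on_negative=lambda: "a < b",
                        on_zero=lambda: "a == b",
                        on_positive=lambda: "a > b")
print(outcome)  # a < b
```

The "logical" IF of FORTRAN IV replaced this sign-dispatch pattern with direct relational tests, removing the need to form the difference at all.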
The IBM 702 was IBM's response to the UNIVAC, the first mainframe computer using magnetic tapes. Because these machines had less computational power than the IBM 701 and ERA 1103, which were favored for scientific computing, the 702 was aimed at business computing. The 702 was announced September 25, 1953, and withdrawn October 1, 1954, but the first production model was not installed until July 1955. Fourteen 702s were built; the first one was used at IBM. Due to problems with the Williams tubes, the decision was made to switch to magnetic-core memory instead; the fourteenth 702 was built using magnetic-core memory, and the others were retrofitted with it. The successor to the 702 in the 700/7000 series was the IBM 705, which marked the transition to magnetic-core memory. The 702 was designed for business data processing, so the memory of the computer was oriented toward storing characters. The system used electrostatic storage: 14, 28, 42, 56, or 70 Williams tubes with a capacity of 1000 bits each for the main memory, giving a memory of 2,000 to 10,000 characters of seven bits each, plus 14 Williams tubes with a capacity of 512 bits each for the two 512-character accumulators.
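The quoted capacities are internally consistent, which a quick arithmetic check makes explicit. This snippet is not from the source; it simply verifies that 14 to 70 tubes of 1,000 bits, at 7 bits per character, give 2,000 to 10,000 characters, and that the 14 accumulator tubes of 512 bits hold exactly two 512-character accumulators.

```python
# Consistency check for the IBM 702 memory figures quoted above.
BITS_PER_CHAR = 7
MAIN_BITS_PER_TUBE = 1000
ACC_BITS_PER_TUBE = 512

def main_memory_chars(n_tubes):
    """Characters of main memory provided by n_tubes Williams tubes."""
    return n_tubes * MAIN_BITS_PER_TUBE // BITS_PER_CHAR

for n in (14, 28, 42, 56, 70):
    print(n, "tubes ->", main_memory_chars(n), "characters")

# Accumulator storage: 14 tubes x 512 bits = two 512-character
# accumulators at 7 bits per character.
acc_bits = 14 * ACC_BITS_PER_TUBE
print(acc_bits == 2 * 512 * BITS_PER_CHAR)  # True
```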
A complete system included the following units: the IBM 702 Central Processing Unit, IBM 712 Card Reader, IBM 756 Card Reader Control Unit, IBM 717 Printer, IBM 757 Printer Control Unit, IBM 722 Card Punch, IBM 758 Card Punch Control Unit, IBM 727 Magnetic Tape Unit, IBM 752 Tape Control Unit, and IBM 732 Magnetic Drum Storage Unit. Total weight: about 24,645 pounds.
John von Neumann
John von Neumann was a Hungarian-American mathematician, computer scientist and polymath. Von Neumann was regarded as the foremost mathematician of his time and said to be "the last representative of the great mathematicians". He made major contributions to a number of fields, including mathematics, economics and statistics. He was a pioneer of the application of operator theory to quantum mechanics in the development of functional analysis, and a key figure in the development of game theory and the concepts of cellular automata, the universal constructor and the digital computer. He published over 150 papers in his life: about 60 in pure mathematics, 60 in applied mathematics, 20 in physics, and the remainder on special mathematical subjects or non-mathematical ones. His last work, an unfinished manuscript written while he was in hospital, was published in book form as The Computer and the Brain. His analysis of the structure of self-replication preceded the discovery of the structure of DNA. In a short list of facts about his life he submitted to the National Academy of Sciences, he stated, "The part of my work I consider most essential is that on quantum mechanics, which developed in Göttingen in 1926, and subsequently in Berlin in 1927–1929.
My work on various forms of operator theory, Berlin 1930 and Princeton 1935–1939." During World War II, von Neumann worked on the Manhattan Project with theoretical physicist Edward Teller, mathematician Stanisław Ulam and others, solving key problems in the nuclear physics involved in thermonuclear reactions and the hydrogen bomb. He developed the mathematical models behind the explosive lenses used in the implosion-type nuclear weapon, and coined the term "kiloton" as a measure of the explosive force generated. After the war, he served on the General Advisory Committee of the United States Atomic Energy Commission, and consulted for a number of organizations, including the United States Air Force, the Army's Ballistic Research Laboratory, the Armed Forces Special Weapons Project and the Lawrence Livermore National Laboratory. As a Hungarian émigré, concerned that the Soviets would achieve nuclear superiority, he designed and promoted the policy of mutually assured destruction to limit the arms race.
Von Neumann was born Neumann János Lajos to a wealthy and non-observant Jewish family. After his arrival in the U. S. he was baptized a Roman Catholic prior to his marriage to his Catholic first wife. Von Neumann was born in Budapest, Kingdom of Hungary, then part of the Austro-Hungarian Empire, and was the eldest of three brothers. His father, Neumann Miksa, was a banker who had moved to Budapest from Pécs at the end of the 1880s. Miksa's father and grandfather were both born in Zemplén County, northern Hungary. John's mother was Kann Margit; three generations of the Kann family lived in spacious apartments above the Kann-Heller offices in Budapest. On February 20, 1913, Emperor Franz Joseph elevated his father to the Hungarian nobility for his service to the Austro-Hungarian Empire; the Neumann family thus acquired the hereditary appellation Margittai, although the family had no connection with the town of Margitta. Neumann János became margittai Neumann János, which he later changed to the German Johann von Neumann. Von Neumann was a child prodigy.
When he was 6 years old, he could divide two 8-digit numbers in his head and could converse in Ancient Greek. When the 6-year-old von Neumann caught his mother staring aimlessly, he asked her, "What are you calculating?" Children did not begin formal schooling in Hungary until they were ten years of age, but Max believed that knowledge of languages in addition to Hungarian was essential, so the children were tutored in English, French and Italian. By the age of 8, von Neumann was familiar with differential and integral calculus, but he was particularly interested in history; he read his way through Wilhelm Oncken's 46-volume Allgemeine Geschichte in Einzeldarstellungen, a copy of which was contained in the family's private library. One of the rooms in the apartment was converted into a library and reading room, with bookshelves from ceiling to floor. Von Neumann entered the Lutheran Fasori Evangélikus Gimnázium in 1911, where Eugene Wigner soon became his friend; this was one of the best schools in Budapest and was part of a brilliant education system designed for the elite.
Under the Hungarian system, children received all their education at the one gymnasium. The Hungarian school system produced a generation noted for intellectual achievement.