Computer science is the study of processes that interact with data and that can be represented as data in the form of programs. It enables the use of algorithms to manipulate, store and communicate digital information. A computer scientist studies the theory of computation and the practice of designing software systems. Its fields can be divided into theoretical and practical disciplines: computational complexity theory is highly abstract, while computer graphics emphasizes real-world applications. Programming language theory considers approaches to the description of computational processes, while computer programming itself involves the use of programming languages and complex systems. Human–computer interaction considers the challenges in making computers useful and accessible. The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks such as the abacus have existed since antiquity, aiding in computations such as multiplication and division.
Algorithms for performing computations have existed since antiquity, before the development of sophisticated computing equipment. Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623. In 1673, Gottfried Leibniz demonstrated a digital mechanical calculator called the Stepped Reckoner. Leibniz may be considered the first computer scientist and information theorist, for, among other reasons, documenting the binary number system. In 1820, Thomas de Colmar launched the mechanical calculator industry when he released his simplified arithmometer, the first calculating machine strong enough and reliable enough to be used daily in an office environment. Charles Babbage started the design of the first automatic mechanical calculator, his Difference Engine, in 1822, which eventually gave him the idea of the first programmable mechanical calculator, his Analytical Engine. He started developing this machine in 1834, and "in less than two years, he had sketched out many of the salient features of the modern computer".
"A crucial step was the adoption of a punched card system derived from the Jacquard loom" making it infinitely programmable. In 1843, during the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm to compute the Bernoulli numbers, considered to be the first computer program. Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information. In 1937, one hundred years after Babbage's impossible dream, Howard Aiken convinced IBM, making all kinds of punched card equipment and was in the calculator business to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine, which itself used cards and a central computing unit; when the machine was finished, some hailed it as "Babbage's dream come true". During the 1940s, as new and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors.
As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. In 1945, IBM founded the Watson Scientific Computing Laboratory at Columbia University in New York City; the renovated fraternity house on Manhattan's West Side was IBM's first laboratory devoted to pure science. The lab is the forerunner of IBM's Research Division, which today operates research facilities around the world; the close relationship between IBM and the university was instrumental in the emergence of a new scientific discipline, with Columbia offering one of the first academic-credit courses in computer science in 1946. Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s; the world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science degree program in the United States was formed at Purdue University in 1962.
Since practical computers became available, many applications of computing have become distinct areas of study in their own rights. Although many initially believed it was impossible that computers themselves could be a scientific field of study, in the late fifties it gradually became accepted among the greater academic population. It is the now well-known IBM brand that formed part of the computer science revolution during this time. IBM released the IBM 704 and later the IBM 709 computers, which were widely used during the exploration period of such devices. "Still, working with the IBM [computer] was frustrating... if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again". During the late 1950s, the computer science discipline was very much in its developmental stages, and such issues were commonplace. Time has seen significant improvements in the usability and effectiveness of computing technology. Modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals to a near-ubiquitous user base.
Computers were quite costly, and some degree of human aid was needed for efficient use—in part from professional computer operators. As computer adoption became more widespread and affordable, less human assistance was needed for common usage. Despite its short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society—in fact, along with electronics, it is
ArXiv is a repository of electronic preprints approved for posting after moderation, but not full peer review. It consists of scientific papers in the fields of mathematics, physics, astronomy, electrical engineering, computer science, quantitative biology, statistics, and mathematical finance and economics, which can be accessed online. In many fields of mathematics and physics, almost all scientific papers are self-archived on the arXiv repository. Begun on August 14, 1991, arXiv.org passed the half-million-article milestone on October 3, 2008, and hit a million by the end of 2014. By October 2016 the submission rate had grown to more than 10,000 per month. ArXiv was made possible by the compact TeX file format, which allowed scientific papers to be transmitted over the Internet and rendered client-side. Around 1990, Joanne Cohn began emailing physics preprints to colleagues as TeX files, but the number of papers being sent soon filled mailboxes to capacity. Paul Ginsparg recognized the need for central storage, and in August 1991 he created a central repository mailbox stored at the Los Alamos National Laboratory which could be accessed from any computer.
Additional modes of access were soon added: FTP in 1991, Gopher in 1992, and the World Wide Web in 1993. The term e-print was adopted to describe the articles. ArXiv began as a physics archive, called the LANL preprint archive, but soon expanded to include astronomy, mathematics, computer science, quantitative biology and, most recently, statistics. Its original domain name was xxx.lanl.gov. Due to LANL's lack of interest in the expanding technology, in 2001 Ginsparg changed institutions to Cornell University and changed the name of the repository to arXiv.org. It is now hosted principally by Cornell, with eight mirrors around the world. Its existence was one of the precipitating factors that led to the current movement in scientific publishing known as open access. Mathematicians and scientists upload their papers to arXiv.org for worldwide access and sometimes for reviews before they are published in peer-reviewed journals. Ginsparg was awarded a MacArthur Fellowship in 2002 for his establishment of arXiv. The annual budget for arXiv was $826,000 for 2013 to 2017, funded jointly by Cornell University Library, the Simons Foundation and annual fee income from member institutions.
This model arose in 2010, when Cornell sought to broaden the financial base of the project by asking institutions to make annual voluntary contributions based on the amount of download usage by each institution. Each member institution pledges a five-year funding commitment to support arXiv. Based on institutional usage ranking, the annual fees are set in four tiers from $1,000 to $4,400. Cornell's goal is to raise at least $504,000 per year through membership fees generated by about 220 institutions. In September 2011, Cornell University Library took overall administrative and financial responsibility for arXiv's operation and development. Ginsparg was quoted in the Chronicle of Higher Education as saying it "was supposed to be a three-hour tour, not a life sentence". However, Ginsparg remains on the arXiv Scientific Advisory Board and on the arXiv Physics Advisory Committee. Although arXiv is not peer reviewed, a collection of moderators for each area review the submissions; the lists of moderators for many sections of arXiv are publicly available, but moderators for most of the physics sections remain unlisted.
Additionally, an "endorsement" system was introduced in 2004 as part of an effort to ensure content is relevant and of interest to current research in the specified disciplines. Under the system, for categories that use it, an author must be endorsed by an established arXiv author before being allowed to submit papers to those categories. Endorsers are not asked to review the paper for errors, but to check whether the paper is appropriate for the intended subject area. New authors from recognized academic institutions receive automatic endorsement, which in practice means that they do not need to deal with the endorsement system at all. However, the endorsement system has attracted criticism for restricting scientific inquiry. A majority of the e-prints are submitted to journals for publication, but some work, including some influential papers, remain purely as e-prints and are never published in a peer-reviewed journal. A well-known example of the latter is an outline of a proof of Thurston's geometrization conjecture, including the Poincaré conjecture as a particular case, uploaded by Grigori Perelman in November 2002.
Perelman appears content to forgo the traditional peer-reviewed journal process, stating: "If anybody is interested in my way of solving the problem, it's all there – let them go and read about it". Despite this non-traditional method of publication, other mathematicians recognized this work by offering the Fields Medal and Clay Mathematics Millennium Prizes to Perelman, both of which he refused. Papers can be submitted in any of several formats, including LaTeX and PDF printed from a word processor other than TeX or LaTeX. The submission is rejected by the arXiv software if generating the final PDF file fails, if any image file is too large, or if the total size of the submission is too large. ArXiv now allows one to store and modify an incomplete submission and only finalize the submission when ready; the time stamp on the article is set when the submission is finalized. The standard access route is through one of several mirrors.
OCLC Online Computer Library Center, Incorporated, d/b/a OCLC, is an American nonprofit cooperative organization "dedicated to the public purposes of furthering access to the world's information and reducing information costs". It was founded in 1967 as the Ohio College Library Center. OCLC and its member libraries cooperatively produce and maintain WorldCat, the largest online public access catalog in the world. OCLC is funded by the fees that libraries pay for its services. OCLC also maintains the Dewey Decimal Classification system. OCLC began in 1967, as the Ohio College Library Center, through a collaboration of university presidents, vice presidents and library directors who wanted to create a cooperative computerized network for libraries in the state of Ohio. The group first met on July 5, 1967 on the campus of the Ohio State University to sign the articles of incorporation for the nonprofit organization, and hired Frederick G. Kilgour, a former Yale University medical school librarian, to design the shared cataloging system.
Kilgour wished to merge the latest information storage and retrieval system of the time, the computer, with the oldest, the library. The plan was to merge the catalogs of Ohio libraries electronically through a computer network and database in order to streamline operations, control costs and increase efficiency in library management, bringing libraries together to cooperatively keep track of the world's information so as to best serve researchers and scholars. The first library to do online cataloging through OCLC was the Alden Library at Ohio University, on August 26, 1971. This was the first online cataloging by any library worldwide. Membership in OCLC is based on use of services and contribution of data. Between 1967 and 1977, OCLC membership was limited to institutions in Ohio, but in 1978, a new governance structure was established that allowed institutions from other states to join. In 2002, the governance structure was again modified to accommodate participation from outside the United States.
As OCLC expanded services in the United States outside Ohio, it relied on establishing strategic partnerships with "networks", organizations that provided training and marketing services. By 2008, there were 15 independent United States regional service providers. OCLC networks played a key role in OCLC governance, with networks electing delegates to serve on the OCLC Members Council. During 2008, OCLC commissioned two studies to look at distribution channels. In early 2009, OCLC negotiated new contracts with the former networks and opened a centralized support center. OCLC provides bibliographic and full-text information to anyone. OCLC and its member libraries cooperatively produce and maintain WorldCat—the OCLC Online Union Catalog, the largest online public access catalog in the world. WorldCat has holding records from private libraries worldwide; the Open WorldCat program, launched in late 2003, exposed a subset of WorldCat records to Web users via popular Internet search and bookselling sites.
In October 2005, the OCLC technical staff began a wiki project, WikiD, allowing readers to add commentary and structured-field information associated with any WorldCat record; WikiD was later phased out. The Online Computer Library Center acquired the trademark and copyrights associated with the Dewey Decimal Classification System when it bought Forest Press in 1988. A browser for books with their Dewey Decimal Classifications was available until July 2013. Until August 2009, when it was sold to Backstage Library Works, OCLC owned a preservation microfilm and digitization operation called the OCLC Preservation Service Center, with its principal office in Bethlehem, Pennsylvania. The reference service QuestionPoint provides libraries with tools to communicate with users; this around-the-clock reference service is provided by a cooperative of participating global libraries. Starting in 1971, OCLC produced catalog cards for members alongside its shared online catalog. OCLC also commercially sells software, such as CONTENTdm for managing digital collections.
It offers the bibliographic discovery system WorldCat Discovery, which allows library patrons to use a single search interface to access an institution's catalog, database subscriptions and more. OCLC has been conducting research for the library community for more than 30 years. In accordance with its mission, OCLC makes its research outcomes known through various publications; these publications, including journal articles, reports and presentations, are available through the organization's website. OCLC Publications – Research articles from various journals including Code4Lib Journal, OCLC Research, Reference & User Services Quarterly, College & Research Libraries News, Art Libraries Journal and National Education Association Newsletter; the most recent publications are displayed first, and all archived resources, dating back to 1970, are available. Membership Reports – A number of significant reports on topics ranging from virtual reference in libraries to perceptions about library funding. Newsletters – Current and archived newsletters for the library and archive community.
Presentations – Presentations from both guest speakers and OCLC research from conferences and other events. The presentations are organized into five categories: Conference presentations, Dewey presentations, Distinguished Seminar Series, Guest presentations, Research staff
Quantum entanglement is a physical phenomenon that occurs when pairs or groups of particles are generated, interact, or share spatial proximity in ways such that the quantum state of each particle cannot be described independently of the state of the others, even when the particles are separated by a large distance. Measurements of physical properties such as position, momentum, spin and polarization performed on entangled particles are found to be correlated. For example, if a pair of particles is generated in such a way that their total spin is known to be zero, and one particle is found to have clockwise spin on a certain axis, then the spin of the other particle, measured on the same axis, will be found to be counterclockwise, as is to be expected due to their entanglement. However, this behavior gives rise to seemingly paradoxical effects: any measurement of a property of a particle performs an irreversible collapse on that particle and will change the original quantum state. In the case of entangled particles, such a measurement will be on the entangled system as a whole.
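The perfect anticorrelation of the spin-zero pair described above can be checked with a short numerical sketch. The state and operators below are the standard textbook singlet state and Pauli operator, not anything taken from the text:

```python
import numpy as np

# The spin-zero ("singlet") pair, written in the two-qubit up/down basis:
# |psi-> = (|01> - |10>)/sqrt(2)
singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

# Pauli Z measures spin along a chosen axis; +1 = "clockwise", -1 = "counterclockwise".
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

# Expectation value of measuring both particles along the same axis:
# <Z (x) Z> = -1 means the two outcomes are always opposite.
corr = singlet.conj() @ np.kron(Z, Z) @ singlet
print(corr)  # -1.0: perfect anticorrelation
```

Measuring both particles along any other common axis gives the same result for the singlet state, which is what makes the anticorrelation basis-independent.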
Such phenomena were the subject of a 1935 paper by Albert Einstein, Boris Podolsky and Nathan Rosen, and of several papers by Erwin Schrödinger shortly thereafter, describing what came to be known as the EPR paradox. Einstein and others considered such behavior to be impossible, as it violated the local realism view of causality, and argued that the accepted formulation of quantum mechanics must therefore be incomplete. However, the counterintuitive predictions of quantum mechanics were later verified experimentally in tests where the polarization or spin of entangled particles were measured at separate locations, statistically violating Bell's inequality. In earlier tests it could not be ruled out that the test result at one point could have been subtly transmitted to the remote point, affecting the outcome at the second location; however, so-called "loophole-free" Bell tests have since been performed in which the locations were separated such that communications at the speed of light would have taken longer—in one case 10,000 times longer—than the interval between the measurements.
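The locality-loophole argument above can be illustrated with a back-of-the-envelope light-travel-time calculation. The 1.3 km separation below is an assumed illustrative figure, not a number stated in the text:

```python
# Hypothetical numbers for illustration: two detectors 1.3 km apart,
# with each measurement completed in well under a microsecond.
c = 299_792_458.0       # speed of light, m/s
separation_m = 1300.0   # assumed detector separation

light_time = separation_m / c
print(light_time)       # ~4.3e-6 s

# Any signal limited to light speed needs ~4.3 microseconds to cross the
# separation, so measurements completed faster than that at both ends
# cannot have influenced each other without faster-than-light communication.
```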
According to some interpretations of quantum mechanics, the effect of one measurement occurs instantly. Other interpretations which don't recognize wavefunction collapse dispute that there is any "effect" at all. However, all interpretations agree that entanglement produces correlation between the measurements and that the mutual information between the entangled particles can be exploited, but that any transmission of information at faster-than-light speeds is impossible. Quantum entanglement has been demonstrated experimentally with photons, electrons, molecules as large as buckyballs, and even small diamonds. The utilization of entanglement in communication and computation is an active area of research. The counterintuitive predictions of quantum mechanics about correlated systems were first discussed by Albert Einstein in 1935, in a joint paper with Boris Podolsky and Nathan Rosen. In this study, the three formulated the EPR paradox, a thought experiment that attempted to show that quantum-mechanical theory was incomplete.
They wrote: "We are thus forced to conclude that the quantum-mechanical description of physical reality given by wave functions is not complete."However, the three scientists did not coin the word entanglement, nor did they generalize the special properties of the state they considered. Following the EPR paper, Erwin Schrödinger wrote a letter to Einstein in German in which he used the word Verschränkung "to describe the correlations between two particles that interact and separate, as in the EPR experiment."Schrödinger shortly thereafter published a seminal paper defining and discussing the notion of "entanglement." In the paper he recognized the importance of the concept, stated: "I would not call one but rather the characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought." Like Einstein, Schrödinger was dissatisfied with the concept of entanglement, because it seemed to violate the speed limit on the transmission of information implicit in the theory of relativity.
Einstein famously derided entanglement as "spukhafte Fernwirkung" or "spooky action at a distance." The EPR paper generated significant interest among physicists and inspired much discussion about the foundations of quantum mechanics, but produced little other published work. Despite the interest, the weak point in EPR's argument was not discovered until 1964, when John Stewart Bell proved that one of their key assumptions, the principle of locality, as applied to the kind of hidden-variables interpretation hoped for by EPR, was mathematically inconsistent with the predictions of quantum theory. Bell demonstrated an upper limit, seen in Bell's inequality, on the strength of correlations that can be produced in any theory obeying local realism, and showed that quantum theory predicts violations of this limit for certain entangled systems. His inequality is experimentally testable, and there have been numerous relevant experiments, starting with the pioneering work of Stuart Freedman and John Clauser in 1972 and Alain Aspect's experiments in 1982, all of which have shown agreement with quantum mechanics rather than the principle of local realism.
Until 2015, however, each of these experiments had left open at least one loophole by which it was possible to question the validity of the results. In 2015 an experiment was performed that closed both the detection and locality loopholes, and was heralded as "loophole-free".
Beryllium is a chemical element with symbol Be and atomic number 4. It is a relatively rare element in the universe, occurring as a product of the spallation of larger atomic nuclei that have collided with cosmic rays. Within the cores of stars, beryllium is depleted as it is fused into larger elements. It is a divalent element which occurs naturally only in combination with other elements in minerals. Notable gemstones which contain beryllium include beryl and chrysoberyl. As a free element it is a steel-gray, strong, lightweight and brittle alkaline earth metal. Beryllium improves many physical properties when added as an alloying element to aluminium, copper and nickel. Beryllium does not form oxides until it reaches very high temperatures. Tools made of beryllium copper alloys are strong and hard and do not create sparks when they strike a steel surface. In structural applications, the combination of high flexural rigidity, thermal stability, thermal conductivity and low density makes beryllium metal a desirable aerospace material for aircraft components, missiles and satellites.
Because of its low density and atomic mass, beryllium is relatively transparent to X-rays and other forms of ionizing radiation. The high thermal conductivities of beryllium and beryllium oxide have led to their use in thermal management applications. Commercial use of beryllium requires appropriate dust control equipment and industrial controls at all times, because of the toxicity of inhaled beryllium-containing dusts, which can cause a chronic, life-threatening allergic disease called berylliosis in some people. Beryllium is a steel-gray, hard metal that is brittle at room temperature and has a close-packed hexagonal crystal structure; it has a reasonably high melting point. The modulus of elasticity of beryllium is about 50% greater than that of steel. The combination of this modulus and a low density results in an unusually fast sound conduction speed in beryllium – about 12.9 km/s at ambient conditions. Other significant properties are high specific heat and thermal conductivity, which make beryllium the metal with the best heat dissipation characteristics per unit weight.
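The quoted sound speed follows directly from the modulus and density. Using rough handbook values (assumptions for illustration, not figures from the text), a thin-rod estimate lands close to the quoted number:

```python
import math

# Rough handbook values (assumed, not from the text):
E = 300e9     # Young's modulus of beryllium, Pa (~50% above steel's ~200 GPa)
rho = 1850.0  # density of beryllium, kg/m^3

# Thin-rod longitudinal sound speed: v = sqrt(E / rho)
v = math.sqrt(E / rho)
print(v / 1000)  # ~12.7 km/s, close to the ~12.9 km/s quoted above
```

The small discrepancy reflects the roughness of the assumed modulus and the difference between thin-rod and bulk sound speeds.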
In combination with the low coefficient of linear thermal expansion, these characteristics result in a unique stability under conditions of thermal loading. Naturally occurring beryllium, save for slight contamination by cosmogenic radioisotopes, is isotopically pure beryllium-9, which has a nuclear spin of 3/2. Beryllium has a large scattering cross section for high-energy neutrons, about 6 barns for energies above about 10 keV. Therefore, it works as a neutron reflector and neutron moderator, slowing neutrons to the thermal energy range of below 0.03 eV, where the total cross section is at least an order of magnitude lower – the exact value depends on the purity and size of the crystallites in the material. The single primordial beryllium isotope ⁹Be also undergoes a neutron reaction at neutron energies over about 1.9 MeV, producing ⁸Be, which almost immediately breaks into two alpha particles. Thus, for high-energy neutrons, beryllium is a neutron multiplier, releasing more neutrons than it absorbs; this nuclear reaction is ⁹₄Be + n → 2 ⁴₂He + 2n. Neutrons are also liberated when beryllium nuclei are struck by energetic alpha particles, producing the nuclear reaction ⁹₄Be + ⁴₂He → ¹²₆C + n, where ⁴₂He is an alpha particle and ¹²₆C is a carbon-12 nucleus.
Beryllium also releases neutrons under bombardment by gamma rays. Thus, natural beryllium bombarded either by alphas or by gammas from a suitable radioisotope is a key component of most radioisotope-powered nuclear reaction neutron sources for the laboratory production of free neutrons. Small amounts of tritium are liberated when ⁹₄Be nuclei absorb low-energy neutrons in the three-step nuclear reaction ⁹₄Be + n → ⁴₂He + ⁶₂He, ⁶₂He → ⁶₃Li + β⁻, ⁶₃Li + n → ⁴₂He + ³₁H. Note that ⁶₂He has a half-life of only 0.8 seconds, β⁻ is an electron, and ⁶₃Li has a high neutron absorption cross-section. Tritium is a radioisotope of concern in nuclear reactor waste streams. As a metal, beryllium is transparent to most wavelengths of X-rays and gamma rays, making it useful for the output windows of X-ray tubes and other such apparatus. Both stable and unstable isotopes of beryllium are created in stars, but the radioisotopes do not last long. It is believed that most of the stable beryllium in the universe was originally created in the interstellar medium when cosmic rays induced fission in heavier elements found in interstellar gas and dust.
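The energy released by the alpha-induced neutron source reaction mentioned above can be estimated from a mass balance. The atomic masses below are approximate reference values assumed for illustration, not figures from the text:

```python
# Q-value of the neutron source reaction 9Be + 4He -> 12C + n,
# from approximate atomic masses in unified atomic mass units (u).
u_to_MeV = 931.494   # energy equivalent of 1 u

m_Be9 = 9.0121831    # approximate mass of beryllium-9, u
m_He4 = 4.0026020    # approximate mass of helium-4 (alpha particle), u
m_C12 = 12.0000000   # carbon-12 defines the mass unit exactly
m_n   = 1.0086649    # approximate neutron mass, u

# Mass lost in the reaction appears as kinetic energy of the products.
Q = ((m_Be9 + m_He4) - (m_C12 + m_n)) * u_to_MeV
print(Q)  # ~5.7 MeV released
```

The positive Q-value is why mixing an alpha emitter with beryllium makes a convenient laboratory neutron source.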
Primordial beryllium contains only one stable isotope, ⁹Be, and therefore beryllium is a monoisotopic element. Radioactive cosmogenic ¹⁰Be is produced in the atmosphere of the Earth by the cosmic ray spallation of oxygen. ¹⁰Be accumulates at the soil surface, where its relatively long half-life permits a long residence time before it decays to boron-10. Thus, ¹⁰Be and its daughter products are used to examine natural soil erosion, soil formation and the development of lateritic soils, and as a proxy for measurement of the variations in solar activity and the age of ice cores. The production of ¹⁰Be is inversely proportional to solar activity, because increased solar wind during periods of high solar activity decreases the flux of galactic cosmic rays that reach the Earth. Nuclear explosions also form ¹⁰Be by the reaction of fast neutrons with ¹³C in the carbon dioxide in air; this is one of the indicators of past activity at nuclear weapon
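The residence-time argument above can be sketched with a simple exponential-decay calculation. The half-life figure used here is an assumed literature value (roughly 1.39 million years), not a number stated in the text:

```python
import math

# Fraction of cosmogenic 10Be remaining after t years, assuming a
# half-life of ~1.39 million years (assumed literature value).
HALF_LIFE_YR = 1.39e6

def fraction_remaining(t_years):
    """Standard radioactive decay: N/N0 = exp(-ln(2) * t / T_half)."""
    return math.exp(-math.log(2) * t_years / HALF_LIFE_YR)

# After a full million years, most of the 10Be is still present, which is
# what makes it useful for dating slow processes such as soil formation.
print(fraction_remaining(1.0e6))  # ~0.61
```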
An ion trap is a combination of electric or magnetic fields used to capture charged particles in a system isolated from an external environment. Ion traps have a number of scientific uses such as mass spectrometry, basic physics research and controlling quantum states. The two most common types of ion trap are the Penning trap, which forms a potential via a combination of static electric and magnetic fields, and the Paul trap, which forms a potential via a combination of static and oscillating electric fields. Penning traps can be used for precise magnetic measurements in spectroscopy. Studies of quantum state manipulation most often use the Paul trap; this work may lead to a trapped-ion quantum computer and has already been used to create the world's most accurate atomic clocks. Electron guns can use an ion trap to prevent degradation of the cathode by positive ions. An ion trap mass spectrometer may incorporate a Paul trap or the Kingdon trap. The Orbitrap, introduced in 2005, is based on the Kingdon trap. Other types of mass spectrometers may use a linear quadrupole ion trap as a selective mass filter.
A Penning trap stores charged particles using a strong homogeneous axial magnetic field to confine particles radially and a quadrupole electric field to confine the particles axially. The Penning trap was named after Frans Michel Penning by Hans Georg Dehmelt, who built the first such trap. Penning traps are well suited for measurements of the properties of ions and stable charged subatomic particles. Precision studies of the electron magnetic moment by Dehmelt and others are an important topic in modern physics. Penning traps can be used in quantum computation and quantum information processing, and are used at CERN to store antimatter. Penning traps also form the basis of Fourier transform ion cyclotron resonance mass spectrometry for determining the mass-to-charge ratio of ions. A Paul trap is a type of quadrupole ion trap that uses static direct current and radio frequency oscillating electric fields to trap ions. Paul traps are commonly used as components of mass spectrometers. The invention of the 3D quadrupole ion trap itself is attributed to Wolfgang Paul, who shared the Nobel Prize in Physics in 1989 for this work.
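Whether a Paul trap confines a given ion is usually summarized by the dimensionless Mathieu stability parameter q = 4eV/(m·r₀²·Ω²); with no DC offset, ions are stable roughly for q below 0.908. All operating values below are assumed, illustrative numbers, not taken from the text:

```python
import math

e = 1.602176634e-19     # elementary charge, C
u = 1.66053906660e-27   # atomic mass unit, kg

# Assumed example: a singly charged calcium-40 ion in a modest RF trap.
m = 40 * u              # ion mass, kg
V_rf = 200.0            # RF voltage amplitude, V
r0 = 3.0e-3             # characteristic trap radius, m
Omega = 2 * math.pi * 10e6  # RF drive frequency, rad/s

# Standard Mathieu q parameter for a quadrupole ion trap (no DC offset).
q = 4 * e * V_rf / (m * r0**2 * Omega**2)
print(q)  # ~0.054, comfortably inside the first stability region (q < ~0.908)
```

Heavier ions or weaker RF fields lower q; raising the RF amplitude pushes q toward the edge of the stability region, which is how quadrupole devices select by mass.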
The trap consists of two hyperbolic metal electrodes with their foci facing each other and a hyperbolic ring electrode halfway between the other two electrodes. Ions are trapped in the space between these three electrodes by the oscillating and static electric fields. A Kingdon trap consists of a thin central wire, an outer cylindrical electrode and isolated end cap electrodes at both ends. A static applied voltage results in a radial logarithmic potential between the electrodes. In a Kingdon trap there is no potential minimum to store the ions; instead, ions are stored with a finite angular momentum about the central wire. In 1981, Knight introduced a modified outer electrode that included an axial quadrupole term that confines the ions on the trap axis. The dynamic Kingdon trap has an additional AC voltage that uses strong defocusing to permanently store charged particles; it does not require the trapped ions to have angular momentum with respect to the filament. An Orbitrap is a modified Kingdon trap that is used for mass spectrometry. Though the idea had been suggested and computer simulations performed, neither the Kingdon nor the Knight configurations were reported to produce mass spectra, as the simulations indicated that mass resolving power would be problematic.
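The point that the Kingdon trap's logarithmic potential has no radial minimum by itself, yet still stores orbiting ions, can be sketched with an effective-potential calculation. The constants below are arbitrary illustrative values, not from the text:

```python
import numpy as np

# The electrostatic attraction toward the central wire gives U(r) ~ A*ln(r),
# which is monotonic: no minimum, so a stationary ion would fall into the wire.
# An ion with angular momentum L adds a centrifugal term L^2/(2 m r^2), and the
# sum (the effective potential) does have an interior minimum: a stable orbit.
A = 1.0           # strength of the logarithmic potential (arbitrary units)
L2_over_2m = 0.5  # centrifugal coefficient L^2/(2m) (arbitrary units)

r = np.linspace(0.1, 10.0, 1000)
U_electrostatic = A * np.log(r)                 # no minimum on its own
U_effective = U_electrostatic + L2_over_2m / r**2

i = np.argmin(U_effective)
print(r[i])  # interior minimum near r = 1: the stable orbit radius
```

Analytically, the minimum sits at r² = 2·(L²/2m)/A, which for these numbers is r = 1, matching the numerical result.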
Ion traps were used in television receivers prior to the introduction of aluminized CRT faces around 1958, to protect the phosphor screen from ions; the ion trap had to be delicately adjusted for maximum brightness. Some experimental work towards developing quantum computers uses trapped ions. Units of quantum information called qubits are stored in stable electronic states of each ion, and quantum information can be processed and transferred through the collective quantized motion of the ions, which interact via the Coulomb force. Lasers are applied to induce coupling between the qubit states or between the internal qubit states and external motional states.
The Bell states, a concept in quantum information science, are specific quantum states of two qubits that represent the simplest examples of quantum entanglement. The Bell states are a form of normalized basis vectors; this normalization implies that the overall probability of the particles being in one of the mentioned states is 1: ⟨Φ|Φ⟩ = 1. Entanglement is a basis-independent result of superposition. Due to this superposition, measurement of a qubit will collapse it into one of its basis states with a given probability. Because of the entanglement, measurement of one qubit will assign one of two possible values to the other qubit, where the value assigned depends on which Bell state the two qubits are in. Bell states can be generalized to represent specific quantum states of multi-qubit systems, such as the GHZ state for three subsystems. Understanding of the Bell states is essential in the analysis of quantum communication and quantum teleportation; the no-communication theorem prevents this behavior from transmitting information faster than the speed of light, because there is still a need for A to communicate information to B.
The Bell states are four specific maximally entangled quantum states of two qubits. They are in a superposition of 0 and 1 – that is, a linear combination of the two states. Their entanglement means the following: The qubit held by Alice can be 0 as well as 1. If Alice measured her qubit in the standard basis, the outcome would be random, with either possibility 0 or 1 having probability 1/2; but if Bob then measured his qubit, the outcome would be the same as the one Alice got. So, if Bob measured, he would also get a seemingly random outcome, but if Alice and Bob communicated, they would find out that, although their outcomes seemed random, they are correlated. This perfect correlation at a distance is special: maybe the two particles "agreed" in advance, when the pair was created, which outcome they would show in case of a measurement. Hence, following Einstein, Podolsky and Rosen in 1935 in their famous "EPR paper", there is something missing in the description of the qubit pair given above—namely this "agreement", called more formally a hidden variable.
In his famous paper of 1964, John S. Bell showed by simple probability theory arguments that these correlations cannot all be made perfect by the use of any "pre-agreement" stored in some hidden variables—but that quantum mechanics predicts perfect correlations. In a more formal and refined formulation known as the Bell–CHSH inequality, it is shown that a certain correlation measure cannot exceed the value 2 if one assumes that physics respects the constraints of local "hidden variable" theory, but certain systems permitted in quantum mechanics can attain values as high as 2√2. Thus, quantum theory violates the idea of local "hidden variables". Four specific two-qubit states attaining this maximal value of 2√2 are designated as "Bell states". They are known as the four maximally entangled two-qubit Bell states, and they form a maximally entangled basis, known as the Bell basis, of the four-dimensional Hilbert space for two qubits: |Φ⁺⟩ = (|00⟩ + |11⟩)/√2, |Φ⁻⟩ = (|00⟩ − |11⟩)/√2, |Ψ⁺⟩ = (|01⟩ + |10⟩)/√2, and |Ψ⁻⟩ = (|01⟩ − |10⟩)/√2. Although there are many possible ways to create entangled Bell states through quantum circuits, the simplest takes a computational basis state as the input and contains a Hadamard gate and a CNOT gate.
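The gap between the classical bound 2 and the quantum value 2√2 can be checked numerically for a Bell state. The measurement settings below are a standard textbook choice for saturating the CHSH bound, not settings specified in the text:

```python
import numpy as np

# CHSH check for the Bell state |Phi+> = (|00> + |11>)/sqrt(2).
X = np.array([[0.0, 1.0], [1.0, 0.0]])   # Pauli X
Z = np.array([[1.0, 0.0], [0.0, -1.0]])  # Pauli Z

phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

def E(a_op, b_op, state):
    """Correlation <state| a_op (x) b_op |state> for +/-1-valued observables."""
    return state.conj() @ np.kron(a_op, b_op) @ state

A, Ap = Z, X                   # Alice's two measurement settings
B = (Z + X) / np.sqrt(2)       # Bob's two settings, rotated 45 degrees
Bp = (Z - X) / np.sqrt(2)

S = E(A, B, phi_plus) + E(A, Bp, phi_plus) + E(Ap, B, phi_plus) - E(Ap, Bp, phi_plus)
print(S)  # ~2.828 = 2*sqrt(2), exceeding the local-hidden-variable bound of 2
```

Each of the four correlations contributes 1/√2 with the right sign, so S = 4/√2 = 2√2, the Tsirelson bound.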
As an example, the quantum circuit pictured takes the two-qubit input |00⟩ and transforms it into the first Bell state |Φ⁺⟩ = (|00⟩ + |11⟩)/√2.