Machine learning is the scientific study of algorithms and statistical models that computer systems use to perform a specific task without using explicit instructions, relying on patterns and inference instead. It is seen as a subset of artificial intelligence. Machine learning algorithms build a mathematical model of sample data, known as "training data", in order to make predictions or decisions without being explicitly programmed to perform the task. Machine learning algorithms are used in a wide variety of applications, such as email filtering and computer vision, where it is infeasible to develop an algorithm of specific instructions for performing the task. Machine learning is related to computational statistics, which focuses on making predictions using computers; the study of mathematical optimization delivers methods and application domains to the field of machine learning. Data mining is a field of study within machine learning, and focuses on exploratory data analysis through unsupervised learning.
In its application across business problems, machine learning is also referred to as predictive analytics. The name machine learning was coined in 1959 by Arthur Samuel. Tom M. Mitchell provided a widely quoted, more formal definition of the algorithms studied in the machine learning field: "A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P if its performance at tasks in T, as measured by P, improves with experience E." This definition of the tasks with which machine learning is concerned offers a fundamentally operational definition rather than defining the field in cognitive terms. This follows Alan Turing's proposal in his paper "Computing Machinery and Intelligence", in which the question "Can machines think?" is replaced with the question "Can machines do what we can do?". Turing's proposal explores the various characteristics that could be possessed by a thinking machine and the various implications of constructing one. Machine learning tasks are classified into several broad categories.
In supervised learning, the algorithm builds a mathematical model from a set of data that contains both the inputs and the desired outputs. For example, if the task were determining whether an image contained a certain object, the training data for a supervised learning algorithm would include images with and without that object, and each image would have a label designating whether it contained the object. In special cases, the input may be only partially available, or restricted to special feedback. Semi-supervised learning algorithms develop mathematical models from incomplete training data, where a portion of the sample inputs don't have labels. Classification algorithms and regression algorithms are types of supervised learning. Classification algorithms are used when the outputs are restricted to a limited set of values. For a classification algorithm that filters emails, the input would be an incoming email, and the output would be the name of the folder in which to file the email. For an algorithm that identifies spam emails, the output would be the prediction of either "spam" or "not spam", represented by the Boolean values true and false.
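The supervised spam-filtering example above can be sketched in a few lines of code. This is a minimal illustration, not any standard model: the training set, the word-frequency scoring rule, and all names here are invented for demonstration.

```python
# A toy supervised classifier: learn word frequencies from labeled emails,
# then predict the label whose training vocabulary best matches a new text.
from collections import Counter

def train(examples):
    """examples: list of (text, label) pairs; returns per-label word counts."""
    counts = {"spam": Counter(), "not spam": Counter()}
    for text, label in examples:
        counts[label].update(text.lower().split())
    return counts

def classify(counts, text):
    """Score each label by the relative frequency of the text's words."""
    words = text.lower().split()
    def score(label):
        total = sum(counts[label].values()) or 1
        return sum(counts[label][w] / total for w in words)
    return max(counts, key=score)

training_data = [
    ("win a free prize now", "spam"),
    ("cheap meds free offer", "spam"),
    ("meeting agenda for monday", "not spam"),
    ("lunch with the project team", "not spam"),
]
model = train(training_data)
print(classify(model, "free prize offer"))    # "spam" on this toy data
print(classify(model, "monday team meeting")) # "not spam" on this toy data
```

Real spam filters use far richer features and probabilistic models, but the shape is the same: labeled inputs in, a predictive model out.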
Regression algorithms are named for their continuous outputs, meaning they may have any value within a range. Examples of a continuous value are the length or price of an object. In unsupervised learning, the algorithm builds a mathematical model from a set of data which contains only inputs and no desired output labels. Unsupervised learning algorithms are used to find structure in the data, such as grouping or clustering of data points. Unsupervised learning can discover patterns in the data and can group the inputs into categories, as in feature learning. Dimensionality reduction is the process of reducing the number of "features", or inputs, in a set of data. Active learning algorithms access the desired outputs for a limited set of inputs based on a budget, and optimize the choice of inputs for which they will acquire training labels; when used interactively, these inputs can be presented to a human user for labeling. Reinforcement learning algorithms are given feedback in the form of positive or negative reinforcement in a dynamic environment, and are used in autonomous vehicles or in learning to play a game against a human opponent.
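The clustering mentioned above can be sketched with a tiny k-means loop. This is an illustrative pure-Python version on one-dimensional data; the initialization strategy and the sample points are arbitrary choices.

```python
# A minimal k-means sketch: alternately assign points to the nearest
# centroid, then move each centroid to the mean of its assigned points.
def kmeans(points, k, iters=20):
    centroids = points[:k]  # naive initialization: first k points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest centroid
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # move each centroid to the mean of its cluster
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

points = [1.0, 1.2, 0.8, 9.9, 10.1, 10.3]
centroids, clusters = kmeans(points, k=2)
print(sorted(round(c, 1) for c in centroids))  # [1.0, 10.1]
```

No labels are provided anywhere: the two groups emerge from the data alone, which is the defining trait of unsupervised learning.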
Other specialized algorithms in machine learning include topic modeling, where the computer program is given a set of natural language documents and finds other documents that cover similar topics. Machine learning algorithms can be used to find the unobservable probability density function in density estimation problems. Meta-learning algorithms learn their own inductive bias based on previous experience. In developmental robotics, robot learning algorithms generate their own sequences of learning experiences, known as a curriculum, to cumulatively acquire new skills through self-guided exploration and social interaction with humans; these robots use guidance mechanisms such as active learning, motor synergies, and imitation. Arthur Samuel, an American pioneer in the field of computer gaming and artificial intelligence, coined the term "machine learning" in 1959 while at IBM; as a scientific endeavour, machine learning grew out of the quest for artificial intelligence. In the early days of AI as an academic discipline, some researchers were interested in having machines learn from data.
They attempted to approach the problem with various symbolic methods, as well as what were then termed "neural networks". Probabilistic reasoning was also employed, especially in automated medical diagnosis.
Ralph C. Merkle is a computer scientist. He is one of the inventors of public key cryptography, the inventor of cryptographic hashing, and more recently a researcher and speaker on cryonics. While an undergraduate, Merkle devised Merkle's Puzzles, a scheme for communication over an insecure channel, as part of a class project; the scheme is now recognized to be an early example of public key cryptography. He co-invented the Merkle–Hellman knapsack cryptosystem, invented cryptographic hashing, and invented Merkle trees. While at Xerox PARC, Merkle designed the Khufu and Khafre block ciphers and the Snefru hash function. Merkle was the manager of compiler development at Elxsi from 1980. In 1988, he became a research scientist at Xerox PARC. In 1999 he became a nanotechnology theorist for Zyvex. In 2003 he became a Distinguished Professor at Georgia Tech, where he led the Georgia Tech Information Security Center. In 2006 he returned to the San Francisco Bay Area, where he has been a senior research fellow at IMM, a faculty member at Singularity University, and a board member of the Alcor Life Extension Foundation.
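The Merkle trees credited to him above can be sketched briefly: leaf data are hashed, then pairs of hashes are hashed together level by level until a single root remains, so one trusted root hash can authenticate a large data set. The choice of SHA-256 and the byte layout here are illustrative assumptions, not a specific standardized construction.

```python
# A minimal Merkle-root sketch: hash the leaves, then repeatedly hash
# adjacent pairs until one root hash remains.
import hashlib

def sha256(data):
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:            # duplicate the last hash on odd levels
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# Changing any leaf changes the root, which is what makes a single trusted
# root hash sufficient to verify the whole data set.
root = merkle_root([b"tx1", b"tx2", b"tx3", b"tx4"])
assert merkle_root([b"tx1", b"tx2", b"tx3", b"tx4"]) == root
assert merkle_root([b"tx1", b"tx2", b"tx3", b"txX"]) != root
```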
He was awarded the IEEE Richard W. Hamming Medal in 2010. Ralph Merkle is a grandnephew of baseball star Fred Merkle. Merkle is married to the video game designer best known for her game, River Raid. Merkle is on the Board of Directors of the cryonics organization Alcor Life Extension Foundation. Merkle appears in the science fiction novel The Diamond Age, which involves nanotechnology. His honors include the 1996 Paris Kanellakis Award for the invention of public key cryptography; the 1998 Feynman Prize in Nanotechnology for computational modeling of molecular tools for atomically precise chemical reactions; the 1999 IEEE Koji Kobayashi Computers and Communications Award; the 2000 RSA Award for Excellence in Mathematics for the invention of public key cryptography; fellowship of the International Association for Cryptologic Research in 2008 for the invention of public key cryptography; the 2010 IEEE Hamming Medal for the invention of public key cryptography; and a 2011 Computer History Museum Fellowship "for his work, with Whitfield Diffie and Martin Hellman, on public key cryptography."
In 2011 he was inducted into the National Inventors Hall of Fame for the invention of public key cryptography, and in 2012 into the National Cyber Security Hall of Fame.
Gary Miller (computer scientist)
Gary Lee Miller is a professor of Computer Science at Carnegie Mellon University in the United States. In 2003 he won the ACM Paris Kanellakis Award for the Miller–Rabin primality test; he was made an ACM Fellow in 2002 and won the Knuth Prize in 2013. Miller received his Ph.D. from the University of California, Berkeley in 1975 under the direction of Manuel Blum; his thesis was titled "Riemann's Hypothesis and Tests for Primality". Apart from computational number theory and primality testing, he has worked in the areas of computational geometry, scientific computing, parallel algorithms, and randomized algorithms. Among his Ph.D. students are Susan Landau, F. Thomson Leighton, Shang-Hua Teng, and Jonathan Shewchuk.
Yaakov Ziv is an Israeli electrical engineer who, along with Abraham Lempel, developed the LZ family of lossless data compression algorithms. Ziv was born in Tiberias, British-ruled Palestine, on 27 November 1931. He received the B.Sc., Dip. Eng., and M.Sc. degrees, all in electrical engineering, from the Technion – Israel Institute of Technology in 1954 and 1957, and the D.Sc. degree from the Massachusetts Institute of Technology in 1962. Ziv joined the Technion – Israel Institute of Technology in 1970 and is Herman Gross Professor of Electrical Engineering and a Technion Distinguished Professor. His research interests include data compression, information theory, and statistical communication theory. Ziv was Dean of the Faculty of Electrical Engineering from 1974 to 1976 and Vice President for Academic Affairs from 1978 to 1982. Since 1987 Ziv has spent three sabbatical leaves at the Information Research Department of Bell Laboratories in Murray Hill, New Jersey, USA. From 1955 to 1959, he was a Senior Research Engineer in the Scientific Department of the Israel Ministry of Defense, where he was assigned to the research and development of communication systems.
From 1961 to 1962, while studying for his doctorate at M.I.T., he joined the Applied Science Division of Inc., Watertown, MA, where he was a Senior Research Engineer doing research in communication theory. In 1962 he returned to the Scientific Department, Israel Ministry of Defense, as Head of the Communications Division, and was an Adjunct of the Faculty of Electrical Engineering at the Technion – Israel Institute of Technology. From 1968 to 1970 he was a Member of the Technical Staff of Inc. Ziv was the Chairman of the Israeli Universities Planning and Grants Committee from 1985 to 1991; he has been a member of the Israel Academy of Sciences and Humanities since 1981 and served as its president between 1995 and 2004. In 1993, Ziv was awarded the Israel Prize for exact sciences. In 1995 Ziv received the IEEE Richard W. Hamming Medal for "contributions to information theory, the theory and practice of data compression", and in 1998 a Golden Jubilee Award for Technological Innovation from the IEEE Information Theory Society.
Ziv is the recipient of the 1997 Claude E. Shannon Award from the IEEE Information Theory Society and the 2008 BBVA Foundation Frontiers of Knowledge Award in the category of Information and Communication Technologies; these prestigious awards are considered second only to the Nobel Prize in their monetary amount.
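The LZ family mentioned at the start of this section can be illustrated with a compact LZ78-style sketch: the input is parsed into phrases, each consisting of a previously seen phrase plus one new character, and emitted as (phrase index, character) pairs. This is a didactic simplification of the published schemes, not a faithful reimplementation of any specific LZ variant.

```python
# An LZ78-style round trip: phrases are learned greedily during
# compression and rebuilt in the same order during decompression.
def lz78_compress(text):
    dictionary = {"": 0}
    output, phrase = [], ""
    for ch in text:
        if phrase + ch in dictionary:
            phrase += ch                      # extend the current phrase
        else:
            output.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:                                # flush any trailing phrase
        output.append((dictionary[phrase[:-1]], phrase[-1]))
    return output

def lz78_decompress(pairs):
    phrases, out = [""], []
    for index, ch in pairs:
        phrase = phrases[index] + ch          # rebuild phrase from its index
        phrases.append(phrase)
        out.append(phrase)
    return "".join(out)

data = "abababababa"
pairs = lz78_compress(data)
assert lz78_decompress(pairs) == data
```

On repetitive input the phrase dictionary grows quickly, so later pairs stand for ever-longer substrings; this adaptivity is the key idea behind the family's strong compression of structured data.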
Michael O. Rabin
Michael Oser Rabin is an Israeli mathematician and computer scientist and a recipient of the Turing Award. Rabin was born in 1931 in Breslau, the son of a rabbi. In 1935, he emigrated with his family to Mandate Palestine. As a young boy, he was interested in mathematics, and his father sent him to the best high school in Haifa, where he studied under the mathematician Elisha Netanyahu, then a high school teacher. After high school, he was drafted into the army during the 1948 Arab–Israeli War. The mathematician Abraham Fraenkel, a professor of mathematics in Jerusalem, intervened with the army command, and Rabin was discharged to study at the university in 1949. He received an M.Sc. from the Hebrew University of Jerusalem in 1953 and a Ph.D. from Princeton University in 1956. Rabin became Associate Professor of Mathematics at the University of California, Berkeley, and at MIT. Before moving to Harvard University as Gordon McKay Professor of Computer Science in 1981, he was a professor at the Hebrew University. In the late 1950s, he was invited for a summer to do research for IBM at the Lamb Estate in Westchester County, New York, with other promising mathematicians and scientists.
It was there that he and Dana Scott wrote the paper "Finite Automata and Their Decision Problems". Soon, using nondeterministic automata, they were able to re-prove Kleene's result that finite state machines accept exactly the regular languages. As to the origins of what was to become computational complexity theory, the next summer Rabin returned to the Lamb Estate. John McCarthy posed a puzzle to him about spies and passwords, which Rabin studied, and soon after he wrote an article, "Degree of Difficulty of Computing a Function and Hierarchy of Recursive Sets." Nondeterministic machines have become a key concept in computational complexity theory, notably with the description of the complexity classes P and NP. Rabin returned to Jerusalem, researching logic and working on the foundations of what would become known as computer science. He was an associate professor and the head of the Institute of Mathematics at the Hebrew University at 29 years old, and a full professor by 33. Rabin recalls, "There was no appreciation of the work on the issues of computing.
Mathematicians did not recognize the emerging new field". In 1960, he was invited by Edward F. Moore to work at Bell Labs, where Rabin introduced probabilistic automata that employ coin tosses in order to decide which state transitions to take. He showed examples of regular languages that required a large number of states, but for which probabilistic automata give an exponential reduction in the number of states. In 1969, Rabin proved that the second-order theory of n successors is decidable. A key component of the proof implicitly showed determinacy of parity games, which lie in the third level of the Borel hierarchy. In 1975, Rabin finished his tenure as Rector of the Hebrew University of Jerusalem and went to the Massachusetts Institute of Technology in the USA as a visiting professor. Gary Miller was there and had his polynomial-time test for primality based on the extended Riemann hypothesis. While there, Rabin invented the Miller–Rabin primality test, a randomized algorithm that can quickly determine, with high probability, whether a number is prime.
Rabin's method was based on previous work of Gary Miller that solved the problem deterministically under the assumption that the generalized Riemann hypothesis is true, but Rabin's version of the test made no such assumption. Fast primality testing is key to the successful implementation of most public-key cryptography; in 2003 Miller, Rabin, Robert M. Solovay, and Volker Strassen were given the Paris Kanellakis Award for their work on primality testing. In 1976 he was invited by Joseph Traub to meet at Carnegie Mellon University and presented the primality test. After he gave that lecture, Traub said, "No, no, this is revolutionary, it's going to become important." In 1979, Rabin invented the Rabin cryptosystem, the first asymmetric cryptosystem whose security was proved equivalent to the intractability of integer factorization. In 1981, Rabin reinvented a weak variant of the technique of oblivious transfer invented by Wiesner under the name of multiplexing, allowing a sender to transmit a message to a receiver where the receiver has some probability between 0 and 1 of learning the message, with the sender being unaware whether the receiver was able to do so.
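The randomized test described above can be sketched compactly: write n - 1 as d · 2^s with d odd, then challenge n with random bases; a composite number fails a round with probability at least 3/4, so a handful of rounds gives high confidence. The small-prime pre-check and the round count here are conventional choices for illustration, not a specific published variant.

```python
# A Miller-Rabin sketch: random bases act as "witnesses" to compositeness.
import random

def is_probable_prime(n, rounds=20):
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):       # quick trial division
        if n % p == 0:
            return n == p
    # factor n - 1 as d * 2**s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)                  # a^d mod n
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)              # square up through a^(d*2^i)
            if x == n - 1:
                break
        else:
            return False                  # a witnesses that n is composite
    return True

print(is_probable_prime(2**61 - 1))  # True: a Mersenne prime
print(is_probable_prime(91))         # False: 91 = 7 * 13
```

Because only modular exponentiation is needed, the test runs quickly even on the very large numbers used in public-key cryptography.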
In 1987, together with Richard Karp, he created one of the best-known efficient string search algorithms, the Rabin–Karp string search algorithm, known for its rolling hash. Rabin's more recent research has concentrated on computer security. He is the Thomas J. Watson Sr. Professor of Computer Science at Harvard University and Professor of Computer Science at the Hebrew University. During the spring semester of 2007, he was a visiting professor at Columbia University teaching Introduction to Cryptography. Rabin is a foreign member of the United States National Academy of Sciences, a member of the French Academy of Sciences, and a foreign member of the Royal Society. In 1976, the Turing Award was awarded jointly to Rabin and Dana Scott for a paper written in 1959, the citation for which states that the award was granted: For their joint paper "Finite Automata and Their Decision Problems," which introduced the idea of nondeterministic machines, which has proved to be an enormously valuable concept.
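The rolling hash mentioned above is the heart of Rabin–Karp: the hash of each text window is updated in constant time when the window slides, and only hash matches are verified character by character. The base and modulus below are arbitrary illustrative choices.

```python
# A Rabin-Karp sketch: roll a polynomial hash across the text and confirm
# candidate positions by direct comparison to rule out collisions.
def rabin_karp(text, pattern):
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return []
    base, mod = 256, 1_000_000_007
    high = pow(base, m - 1, mod)          # weight of the outgoing character
    p_hash = t_hash = 0
    for i in range(m):
        p_hash = (p_hash * base + ord(pattern[i])) % mod
        t_hash = (t_hash * base + ord(text[i])) % mod
    matches = []
    for i in range(n - m + 1):
        # verify on hash match to rule out collisions
        if t_hash == p_hash and text[i:i + m] == pattern:
            matches.append(i)
        if i < n - m:
            # roll the hash: drop text[i], append text[i + m]
            t_hash = ((t_hash - ord(text[i]) * high) * base
                      + ord(text[i + m])) % mod
    return matches

print(rabin_karp("abracadabra", "abra"))  # [0, 7]
```

The same rolling-hash trick extends naturally to searching for many patterns at once, which is one reason the algorithm remains widely taught and used.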
Their classic paper has been a continuous source of inspiration for subsequent work in this field. In 1995, Rabin was awarded the Israel Prize in computer sciences. In 2010, Rabin was awarded the Tel Aviv University
Electronic design automation
Electronic design automation, also referred to as electronic computer-aided design, is a category of software tools for designing electronic systems such as integrated circuits and printed circuit boards. The tools work together in a design flow that chip designers use to design and analyze entire semiconductor chips. Since a modern semiconductor chip can have billions of components, EDA tools are essential for their design; this article describes EDA with respect to integrated circuits. Before EDA, integrated circuits were designed by hand and manually laid out. Some advanced shops used geometric software to generate the tapes for the Gerber photoplotter, but even those copied digital recordings of mechanically drawn components. The process was fundamentally graphic, with the translation from electronics to graphics done manually; the best known company from this era was Calma. By the mid-1970s, developers started to automate the design along with the drafting, and the first placement and routing tools were developed.
The proceedings of the Design Automation Conference cover much of this era. The next era began about the time of the publication of "Introduction to VLSI Systems" by Carver Mead and Lynn Conway in 1980; this groundbreaking text advocated chip design with programming languages. The immediate result was a considerable increase in the complexity of the chips that could be designed, with improved access to design verification tools that used logic simulation; the chips were easier to lay out and more likely to function correctly, since their designs could be simulated more thoroughly prior to construction. Although the languages and tools have evolved, this general approach of specifying the desired behavior in a textual programming language and letting the tools derive the detailed physical design remains the basis of digital IC design today. The earliest EDA tools were produced academically. One of the most famous was the "Berkeley VLSI Tools Tarball", a set of UNIX utilities used to design early VLSI systems. Still in use are the Espresso heuristic logic minimizer and Magic.
Another crucial development was the formation of MOSIS, a consortium of universities and fabricators that developed an inexpensive way to train student chip designers by producing real integrated circuits. The basic concept was to use reliable, low-cost, low-technology IC processes and pack a large number of projects per wafer, with just a few copies of each project's chips. Cooperating fabricators either donated the processed wafers or sold them at cost, seeing the program as helpful to their own long-term growth. 1981 marks the beginning of EDA as an industry. For many years, the larger electronic companies, such as Hewlett-Packard and Intel, had pursued EDA internally. In 1981, managers and developers spun out of these companies to concentrate on EDA as a business. Daisy Systems, Mentor Graphics, and Valid Logic Systems were all founded around this time, and are collectively referred to as DMV. Within a few years there were many companies specializing in EDA, each with a different emphasis. The first trade show for EDA was held at the Design Automation Conference in 1984.
In 1981, the U.S. Department of Defense began funding of VHDL as a hardware description language. In 1986, another popular high-level design language was first introduced as a hardware description language by Gateway Design Automation. Simulators followed these introductions, permitting direct simulation of chip designs: executable specifications. In a few more years, back-ends were developed to perform logic synthesis. Current digital flows are modular: the front ends produce standardized design descriptions that compile into invocations of "cells" without regard to the cell technology. Cells implement logic or other electronic functions using a particular integrated circuit technology. Fabricators provide libraries of components for their production processes, with simulation models that fit standard simulation tools. Analog EDA tools are far less modular, since many more functions are required, they interact more strongly, and the components are less ideal. EDA for electronics has increased in importance with the continuous scaling of semiconductor technology.
Some users are foundry operators, who operate the semiconductor fabrication facilities, or "fabs", and design-service companies who use EDA software to evaluate an incoming design for manufacturing readiness. EDA tools are also used for programming design functionality into FPGAs. Key tool categories include:
High-level synthesis – conversion of a high-level design description into RTL.
Logic synthesis – translation of an RTL design description into a discrete netlist of logic gates.
Schematic capture – for standard cell digital, analog, and RF designs; examples include Capture CIS in Orcad by Cadence and ISIS in Proteus.
Layout – schematic-driven layout, such as Layout in Orcad by Cadence and ARES in Proteus.
Transistor simulation – low-level transistor simulation of a schematic/layout's behavior, accurate at the device level.
Logic simulation – digital simulation of an RTL or gate-netlist's digital behavior, accurate at the Boolean level.
Behavioral simulation – high-level simulation of a design's architectural operation, accurate at the cycle or interface level.
Hardware emulation – use of special-purpose hardware to emulate the logic of a proposed design.
It can sometimes be plugged into a system in place of a yet-to-be-built chip. Technology CAD – analyzes the underlying process technology. Electrical prope
Martin Edward Hellman is an American cryptologist, best known for his invention of public key cryptography in cooperation with Whitfield Diffie and Ralph Merkle. Hellman is a longtime contributor to the computer privacy debate, has applied risk analysis to a potential failure of nuclear deterrence, and in 2016 wrote a book with his wife, Dorothie Hellman, that links creating love at home to bringing peace to the planet. Born to a Jewish family, Hellman graduated from the Bronx High School of Science. He went on to take his bachelor's degree in electrical engineering from New York University in 1966, and at Stanford University he received a master's degree and a Ph.D. in the discipline in 1967 and 1969. From 1968 to 1969 he worked at IBM's Thomas J. Watson Research Center in Yorktown Heights, New York, where he encountered Horst Feistel. From 1969 to 1971, he was an assistant professor of electrical engineering at the Massachusetts Institute of Technology. He joined Stanford University's electrical engineering department in 1971 as an assistant professor and served on the full-time faculty for twenty-five years before taking emeritus status as a full professor in 1996.
Hellman and Whitfield Diffie's paper "New Directions in Cryptography" was published in 1976. It introduced a radically new method of distributing cryptographic keys, which went far toward solving one of the fundamental problems of cryptography, key distribution. It has become known as Diffie–Hellman key exchange, although Hellman has argued that it ought to be called Diffie–Hellman–Merkle key exchange because of Merkle's separate contribution. The article stimulated the development of a new class of encryption algorithms, known variously as public key encryption and asymmetric encryption. Hellman and Diffie were awarded the Marconi Fellowship and accompanying prize in 2000 for work on public-key cryptography and for helping make cryptography a legitimate area of academic research; they were awarded the 2015 Turing Award for the same work. Hellman has been a longtime contributor to the computer privacy debate. He and Diffie were the most prominent critics of the short key size of the Data Encryption Standard in 1975.
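The key-distribution idea from the 1976 paper can be walked through numerically: each party combines its private exponent with the other's public value and arrives at the same shared secret, which is never transmitted. The prime and generator below are illustrative choices only; real deployments require large, carefully vetted parameters.

```python
# A toy Diffie-Hellman exchange using Python's built-in modular
# exponentiation. Parameters are for illustration, not real security.
import secrets

p = 0xFFFFFFFFFFFFFFC5          # a 64-bit prime (illustrative only)
g = 5                           # public generator (illustrative choice)

a = secrets.randbelow(p - 2) + 1    # Alice's private exponent
b = secrets.randbelow(p - 2) + 1    # Bob's private exponent

A = pow(g, a, p)                # Alice publishes A = g^a mod p
B = pow(g, b, p)                # Bob publishes B = g^b mod p

shared_alice = pow(B, a, p)     # Alice computes (g^b)^a mod p
shared_bob = pow(A, b, p)       # Bob computes (g^a)^b mod p
assert shared_alice == shared_bob   # both hold g^(ab) mod p
```

An eavesdropper sees only p, g, A, and B; recovering the shared secret from those is the discrete logarithm problem, which is believed to be intractable for well-chosen large parameters.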
An audio recording survives of their review of DES at Stanford in 1976 with Dennis Branstad of NBS and representatives of the National Security Agency. Their concern was well-founded: subsequent history has shown not only that NSA intervened with IBM and NBS to shorten the key size, but that the short key size enabled the kind of massively parallel key crackers that Hellman and Diffie sketched out. In response to RSA Security's DES Challenges starting in 1997, brute force crackers were built that could break DES, making it clear that DES was insecure and obsolete; as of 2012, a $10,000 commercially available machine could recover a DES key in days. Hellman served on the National Research Council's Committee to Study National Cryptographic Policy, whose main recommendations have since been implemented. Hellman has been active in researching international security since 1985. Hellman was involved in the original Beyond War movement, serving as the principal editor for the "BEYOND WAR: A New Way of Thinking" booklet.
In 1987 more than 30 scholars came together to produce Russian and English editions of the book Breakthrough: Emerging New Thinking: Soviet and Western Scholars Issue a Challenge to Build a World Beyond War. Anatoly Gromyko and Martin Hellman served as the chief editors. The authors of the book examine questions such as: How can we overcome the inexorable forces leading toward a clash between the United States and the Soviet Union? How do we build a common vision for the future? How can we restructure our thinking to synchronize with the imperative of our modern world? Hellman's current project in international security is to defuse the nuclear threat. In particular, he is studying the probabilities and risks associated with nuclear weapons and encouraging further international research in this area. His website NuclearRisk.org has been endorsed by a number of prominent individuals, including a former Director of the National Security Agency, Stanford's President Emeritus, and two Nobel Laureates. Hellman is a member of the Board of Directors of Daisy Alliance, a non-governmental organization based in Atlanta seeking global security through nuclear nonproliferation and disarmament.
In 1997 he was awarded The Franklin Institute's Louis E. Levy Medal, and in 1981 the IEEE Donald G. Fink Prize Paper Award. In 2000, he won the Marconi Prize, together with Whit Diffie, for the invention of public-key cryptography to protect privacy on the Internet. In 1998, Hellman received a Golden Jubilee Award for Technological Innovation from the IEEE Information Theory Society, and in 2010 the IEEE Richard W. Hamming Medal. In 2011, he was inducted into the National Inventors Hall of Fame. In 2011, Hellman was made a Fellow of the Computer History Museum for his work, with Whitfield Diffie and Ralph Merkle, on public key cryptography. Hellman won the Turing Award for 2015 together with Whitfield Diffie; the Turing Award is considered the most prestigious award in the field of computer science. The citation for the award was: "For fundamental contributions to modern cryptography. Diffie and Hellman's groundbreaking 1976 paper, "New Directions in Cryptography," introduced the ideas of public-key cryptography and digital signatures, which are the foundation for most regularly-used security protocols on the internet today."