1.
Authentication
–
Authentication is the act of confirming the truth of an attribute of a single piece of data claimed true by an entity. In other words, authentication often involves verifying the validity of at least one form of identification. Authentication is relevant to multiple fields. In art, antiques and anthropology, a common problem is verifying that a given artifact was produced by a certain person, or in a certain place or period of history. In computer science, verifying a person's identity is often required to allow access to confidential data or systems. The first type of authentication is accepting proof of identity given by a credible person who has first-hand evidence that the identity is genuine. With autographed sports memorabilia, this could involve someone attesting that they witnessed the object being signed. A vendor selling branded items implies authenticity, while he or she may not have evidence that every step in the supply chain was authenticated. The second type of authentication is comparing the attributes of the object itself to what is known about objects of that origin. For example, an art expert might look for similarities in the style of painting, check the location and form of a signature, or compare the object to an old photograph. The physics of sound and light, and comparison with a known physical environment, can be used to examine the authenticity of audio recordings and photographs. Documents can be verified as being created on ink or paper readily available at the time of the implied creation. Attribute comparison may be vulnerable to forgery. In art and antiques, certificates are of great importance for authenticating an object of interest and value. Certificates can, however, also be forged, and the authentication of these poses a problem. For instance, the son of Han van Meegeren, the well-known art forger, forged the work of his father and provided a certificate for its provenance as well; see the article on Jacques van Meegeren. 
Criminal and civil penalties for fraud, forgery, and counterfeiting can reduce the incentive for falsification; currency and other financial instruments commonly use this second type of authentication method. The third type of authentication relies on documentation or other external affirmations. In criminal courts, the rules of evidence often require establishing the chain of custody of evidence presented. This can be accomplished through a written evidence log, or by testimony from the police detectives who handled it. Some antiques are accompanied by certificates attesting to their authenticity, and signed sports memorabilia is usually accompanied by a certificate of authenticity. These external records have their own problems of forgery and perjury, and are also vulnerable to being separated from the artifact and lost. In computer science, a user can be given access to secure systems based on user credentials that imply authenticity. A network administrator can give a user a password, or provide the user with a key card or other access device to allow system access.
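The credential check described above is commonly implemented by storing only a salted hash of the password, never the password itself. A minimal sketch in Python using the standard library (the function names and parameter choices are illustrative, not taken from any particular system):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash; the system stores (salt, digest), never the password."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the hash for the claimed password and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)
```

Comparing with hmac.compare_digest rather than == avoids leaking information about the stored digest through timing differences.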
2.
Electronic signature
–
An electronic signature, or e-signature, refers to data in electronic form which is logically associated with other data in electronic form and which is used by the signatory to sign. This type of signature has the same legal standing as a handwritten signature as long as it adheres to the requirements of the specific regulation it was created under. Electronic signatures are a concept distinct from digital signatures, a cryptographic mechanism often used to implement electronic signatures. Standardization agencies like NIST or ETSI provide standards for their implementation. The concept itself is not new, with common law jurisdictions having recognized telegraph signatures as far back as the mid-19th century and faxed signatures since the 1980s. An electronic signature is intended to provide a secure and accurate identification method for the signatory to provide a seamless transaction. Definitions of electronic signatures vary depending on the applicable jurisdiction, and it is difficult to challenge the authorship of a statement signed with a qualified electronic signature: the statement is non-repudiable. Since well before the American Civil War began in 1861, Morse code was used to send messages electrically by telegraphy, and some of these messages were agreements to terms that were intended as enforceable contracts. An early acceptance of the enforceability of telegraphic messages as electronic signatures came from the New Hampshire Supreme Court in 1869. In the 1980s, many companies and even some individuals began using fax machines for high-priority or time-sensitive delivery of documents, although the original signature on the document was on paper. In 1996 the United Nations published the UNCITRAL Model Law on Electronic Commerce; Article 7 of that Model Law was highly influential in the development of electronic signature laws around the world, including in the US. 
In 2001, UNCITRAL concluded work on a follow-up text, the UNCITRAL Model Law on Electronic Signatures. PIPEDA's secure electronic signature regulations refine the definition as being a digital signature applied and verified in a specific manner. The current and applicable version of eIDAS was published by the European Parliament and the European Council on July 23, 2014. Following Article 25 of the regulation, an advanced electronic signature shall "not be denied legal effect"; however, it will reach a higher probative value when enhanced to the level of a qualified electronic signature. This level is regulated in the European Union and, similarly, through ZertES in Switzerland. A qualified electronic signature is not defined in the United States; there, an electronic signature may be an electronic transmission of the document which contains the signature, as in the case of facsimile transmissions, or it may be an encoded message, such as telegraphy using Morse code. The Uniform Electronic Transactions Act (UETA) was influenced by ABA committee white papers and was promulgated by NCCUSL. Under UETA, the term means an electronic sound, symbol, or process, attached to or logically associated with a record.
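A digital signature, the cryptographic mechanism most often used to implement electronic signatures, can be sketched with textbook RSA. The key below is a deliberately tiny toy (p = 61, q = 53); real systems use keys of thousands of bits plus padding schemes and are never implemented by hand:

```python
import hashlib

# Toy RSA parameters: n = 61 * 53 = 3233, public exponent e, private exponent d,
# chosen so that e * d is congruent to 1 mod phi(n), with phi(n) = 60 * 52 = 3120.
n, e, d = 3233, 17, 2753

def sign(message):
    """Signatory: hash the message, then apply the private exponent d."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message, signature):
    """Anyone: apply the public exponent e and compare against the message hash."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h
```

Any alteration of the signed statement changes its hash, so the signature no longer verifies; this is what makes such a statement hard to repudiate.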
3.
Turkey
–
Turkey, officially the Republic of Turkey, is a transcontinental country in Eurasia, mainly in Anatolia in Western Asia, with a smaller portion on the Balkan peninsula in Southeast Europe. Turkey is a democratic, secular, unitary, parliamentary republic with a diverse cultural heritage. The country is encircled by seas on three sides: the Aegean Sea is to the west, the Black Sea to the north, and the Mediterranean Sea to the south. The Bosphorus, the Sea of Marmara, and the Dardanelles separate the country's European and Asian parts. Ankara is the capital, while Istanbul is the country's largest city and main cultural and commercial centre. Approximately 70-80% of the country's citizens identify themselves as ethnic Turks; other ethnic groups include legally recognised and unrecognised minorities. Kurds are the largest ethnic minority group, making up approximately 20% of the population. The area of Turkey has been inhabited since the Paleolithic by various ancient Anatolian civilisations, as well as Assyrians, Greeks, Thracians, Phrygians, Urartians and Armenians. After Alexander the Great's conquest, the area was Hellenized, a process which continued under the Roman Empire. The Seljuk Sultanate of Rûm ruled Anatolia until the Mongol invasion in 1243. The Ottoman Empire that succeeded it reached the peak of its power in the 16th century, especially during the reign of Suleiman the Magnificent. During World War I, the Ottoman government committed genocides against its Armenian, Assyrian and Greek subjects. Following the war, the conglomeration of territories and peoples that formerly comprised the Ottoman Empire was partitioned into several new states. Turkey is a member of the UN and an early member of NATO. Turkey's growing economy and diplomatic initiatives have led to its recognition as a regional power, while its location has given it geopolitical and strategic importance. The name of Turkey is based on the ethnonym Türk. 
The first recorded use of the term Türk or Türük as an autonym is contained in the Old Turkic inscriptions of the Göktürks of Central Asia. The English name Turkey first appeared in the late 14th century and is derived from Medieval Latin Turchia. Similarly, the medieval Khazar Empire, a Turkic state on the shores of the Black and Caspian seas, was referred to as Tourkia in Byzantine sources. The medieval Arabs referred to the Mamluk Sultanate as al-Dawla al-Turkiyya, and the Ottoman Empire was sometimes referred to as Turkey or the Turkish Empire among its European contemporaries. The Anatolian peninsula, comprising most of modern Turkey, is one of the oldest permanently settled regions in the world. Various ancient Anatolian populations lived in Anatolia from at least the Neolithic period until the Hellenistic period. Many of these peoples spoke the Anatolian languages, a branch of the larger Indo-European language family; in fact, given the antiquity of the Indo-European Hittite and Luwian languages, some scholars have proposed Anatolia as the hypothetical centre from which the Indo-European languages radiated. The European part of Turkey, called Eastern Thrace, has also been inhabited since at least forty thousand years ago. Çatalhöyük, in southern Anatolia, is the largest and best-preserved Neolithic site found to date; the settlement of Troy started in the Neolithic Age and continued into the Iron Age.
4.
India
–
India, officially the Republic of India, is a country in South Asia. It is the seventh-largest country by area and the second-most populous country. It is bounded by the Indian Ocean on the south, the Arabian Sea on the southwest, and the Bay of Bengal on the southeast. It shares land borders with Pakistan to the west, and with China, Nepal, and Bhutan to the northeast. In the Indian Ocean, India is in the vicinity of Sri Lanka and the Maldives, and India's Andaman and Nicobar Islands share a maritime border with Thailand. The Indian subcontinent was home to the urban Indus Valley Civilisation of the 3rd millennium BCE; in the following millennium, the oldest scriptures associated with Hinduism began to be composed. Social stratification, based on caste, emerged in the first millennium BCE. Early political consolidations took place under the Maurya and Gupta empires, and the later peninsular Middle Kingdoms influenced cultures as far as Southeast Asia. In the medieval era, Judaism, Zoroastrianism, Christianity, and Islam arrived; much of the north fell to the Delhi Sultanate, while the south was united under the Vijayanagara Empire. The economy expanded in the 17th century under the Mughal Empire. In the mid-18th century, the subcontinent came under British East India Company rule, and in the mid-19th under British crown rule. A nationalist movement emerged in the late 19th century, which later, under Mahatma Gandhi, was noted for nonviolent resistance. In 2015, the Indian economy was the world's seventh-largest by nominal GDP and third-largest by purchasing power parity. Following market-based economic reforms in 1991, India became one of the fastest-growing major economies and is considered a newly industrialised country. However, it continues to face the challenges of poverty, corruption, and malnutrition. A nuclear weapons state and regional power, it has the third-largest standing army in the world and ranks sixth in military expenditure among nations. 
India is a constitutional republic governed under a parliamentary system. It is a pluralistic, multilingual and multi-ethnic society, and is home to a diversity of wildlife in a variety of protected habitats. The name India is derived from Indus, which originates from the Old Persian word Hindu; the latter term stems from the Sanskrit word Sindhu, which was the historical local appellation for the Indus River. The ancient Greeks referred to the Indians as Indoi, which translates as The people of the Indus. The geographical term Bharat, which is recognised by the Constitution of India as an official name for the country, is used by many Indian languages in its variations. Scholars believe it to be named after the Vedic tribe of Bharatas in the second millennium BCE, and it is also traditionally associated with the rule of the legendary emperor Bharata. Gaṇarājya is the Sanskrit/Hindi term for republic, dating back to ancient times. Hindustan is a Persian name for India dating back to the 3rd century BCE. It was introduced into India by the Mughals and has been widely used since then; its meaning has varied, referring either to a region that encompassed northern India and Pakistan, or to India in its entirety.
5.
Switzerland
–
Switzerland, officially the Swiss Confederation, is a federal republic in Europe. It consists of 26 cantons, and the city of Bern is the seat of the federal authorities. The country is situated in western-central Europe, and is bordered by Italy to the south, France to the west, Germany to the north, and Austria and Liechtenstein to the east. Switzerland is geographically divided between the Alps, the Swiss Plateau and the Jura, spanning an area of 41,285 km2. The establishment of the Old Swiss Confederacy dates to the medieval period, resulting from a series of military successes against Austria. Swiss independence from the Holy Roman Empire was formally recognized in the Peace of Westphalia in 1648. The country has a history of armed neutrality going back to the Reformation and has not been in a state of war internationally since 1815; nevertheless, it pursues an active foreign policy and is frequently involved in peace-building processes around the world. In addition to being the birthplace of the Red Cross, Switzerland is home to numerous international organisations. On the European level, it is a member of the European Free Trade Association; it is not a member of the European Union, but participates in the Schengen Area and the European Single Market through bilateral treaties. Spanning the intersection of Germanic and Romance Europe, Switzerland comprises four main linguistic and cultural regions: German, French, Italian and Romansh. Due to this diversity, Switzerland is known by a variety of native names: Schweiz, Suisse, Svizzera. On coins and stamps, Latin is used instead of the four living languages. Switzerland is one of the most developed countries in the world, with the highest nominal wealth per adult and the eighth-highest per capita gross domestic product according to the IMF. Zürich and Geneva have each been ranked among the top cities in the world in terms of quality of life, with the former ranked second globally, according to Mercer. 
The English name Switzerland is a compound containing Switzer, an obsolete term for the Swiss. The English adjective Swiss is a loan from French Suisse, also in use since the 16th century. The name Switzer is from the Alemannic Schwiizer, in origin an inhabitant of Schwyz and its associated territory. The Swiss began to adopt the name for themselves after the Swabian War of 1499, using it alongside the term for Confederates, Eidgenossen, in use since the 14th century. The data code for Switzerland, CH, is derived from Latin Confoederatio Helvetica. The toponym Schwyz itself was first attested in 972, as Old High German Suittes, ultimately perhaps related to swedan 'to burn'.
6.
European Union
–
The European Union is a political and economic union of 28 member states that are located primarily in Europe. It has an area of 4,475,757 km2. The EU has developed an internal single market through a standardised system of laws that apply in all member states. Within the Schengen Area, passport controls have been abolished. A monetary union was established in 1999 and came into full force in 2002; it is composed of 19 EU member states which use the euro currency. The EU operates through a system of supranational and intergovernmental decision-making. The EU traces its origins to the European Coal and Steel Community; the community and its successors have grown in size by the accession of new member states and in power by the addition of policy areas to their remit. While no member state has left the EU or its antecedent organisations, the Maastricht Treaty established the European Union in 1993 and introduced European citizenship. The latest major amendment to the constitutional basis of the EU, the Treaty of Lisbon, came into force in 2009. The EU as a whole is the largest economy in the world; additionally, 27 of the 28 EU countries have a very high Human Development Index, according to the United Nations Development Programme. In 2012, the EU was awarded the Nobel Peace Prize. Through the Common Foreign and Security Policy, the EU has developed a role in external relations and defence. The union maintains permanent diplomatic missions throughout the world and represents itself at the United Nations, the World Trade Organization and the G7. Because of its global influence, the European Union has been described as an emerging superpower. After World War II, European integration was seen as an antidote to the nationalism which had devastated the continent. 1952 saw the creation of the European Coal and Steel Community; the supporters of the Community included Alcide De Gasperi, Jean Monnet, Robert Schuman, and Paul-Henri Spaak. These men and others are credited as the founding fathers of the European Union. 
In 1957, Belgium, France, Italy, Luxembourg, the Netherlands and West Germany signed the Treaty of Rome, which created the European Economic Community (EEC); they also signed another pact creating the European Atomic Energy Community (Euratom) for co-operation in developing nuclear energy. Both treaties came into force in 1958. The EEC and Euratom were created separately from the ECSC, although they shared the same courts and the Common Assembly. The EEC was headed by Walter Hallstein and Euratom by Louis Armand; Euratom was to integrate sectors in nuclear energy while the EEC would develop a customs union among members. During the 1960s, tensions began to show, with France seeking to limit supranational power. Jean Rey presided over the first merged Commission. In 1973, the Communities enlarged to include Denmark, Ireland, and the United Kingdom. Norway had negotiated to join at the same time, but Norwegian voters rejected membership in a referendum.
7.
Asymmetric key algorithm
–
In a public key encryption system, any person can encrypt a message using the public key of the receiver, but such a message can be decrypted only with the receiver's private key. For this to work, it must be computationally easy for a user to generate a public and private key pair. The strength of a public key cryptography system relies on the degree of difficulty for a properly generated private key to be determined from its public key. Security then depends only on keeping the private key private. Public key algorithms, unlike symmetric key algorithms, do not require a secure channel for the exchange of a secret key between the parties. Because of the computational complexity of asymmetric encryption, it is usually used only for small blocks of data, such as the transfer of a symmetric encryption key. This symmetric key is then used to encrypt the rest of the potentially long message sequence; symmetric encryption/decryption is based on simpler algorithms and is much faster. In a public key signature system, a person can combine a message with a private key to create a short digital signature on the message, and the authenticity of the message can be demonstrated by verifying the signature with the corresponding public key. Public key algorithms are fundamental security ingredients in cryptosystems, applications and protocols. They underpin various Internet standards, such as Transport Layer Security, S/MIME and PGP. Some public key algorithms provide key distribution and secrecy, some provide digital signatures, and some provide both. Public key cryptography finds application in, among others, the information technology security discipline; information security is concerned with all aspects of protecting electronic information assets against security threats. Public key cryptography is used as a method of assuring the confidentiality, authenticity and non-repudiability of electronic communications. Two of the best-known uses of public key cryptography are public key encryption, in which a message is encrypted with a recipient's public key. 
The message cannot be decrypted by anyone who does not possess the matching private key, who is thus presumed to be the owner of that key; this is used in an attempt to ensure confidentiality. The second use is digital signatures, in which a message is signed with the sender's private key and can be verified by anyone who has access to the sender's public key; this verification proves that the sender had access to the private key. An analogy to public key encryption is that of a locked mailbox with a mail slot. The mail slot is exposed and accessible to the public; its location (the street address) is, in essence, the public key, and anyone knowing the street address can go to the door and drop a written message through the slot. However, only the person who possesses the key can open the mailbox and read the message. An analogy for digital signatures is the sealing of an envelope with a personal wax seal.
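The encrypt-with-public, decrypt-with-private relationship can be illustrated with textbook RSA and a deliberately tiny key (p = 61, q = 53); real systems use far larger keys together with padding schemes:

```python
# Toy RSA key pair: modulus n = 61 * 53 = 3233, public exponent e = 17,
# private exponent d = 2753 (the modular inverse of e mod phi(n) = 3120).
n, e, d = 3233, 17, 2753

def encrypt(m):
    """Anyone may encrypt a small integer message with the public key (n, e)."""
    assert 0 <= m < n, "message must be smaller than the modulus"
    return pow(m, e, n)

def decrypt(c):
    """Only the holder of the private exponent d can recover the message."""
    return pow(c, d, n)
```

In practice the small integer being encrypted would typically be a symmetric session key, which then encrypts the long message sequence, as described above.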
9.
Electronic mail
–
Email is a method of exchanging digital messages between people using digital devices such as computers and mobile phones. Email first entered use in the 1960s and by the mid-1970s had taken the form now recognized as email. Some early email systems required the author and the recipient to both be online at the same time, in common with instant messaging. Today's email systems are based on a store-and-forward model: email servers accept, forward, deliver, and store messages. Originally an ASCII text-only communications medium, Internet email was extended by Multipurpose Internet Mail Extensions (MIME) to carry text in other character sets. International email, with internationalized email addresses using UTF-8, has been standardized. The history of modern Internet email services reaches back to the early ARPANET, with standards for encoding email messages published as early as 1973. An email message sent in the early 1970s looks very similar to an email sent today. Email played an important part in creating the Internet, and the conversion from ARPANET to the Internet in the early 1980s produced the core of the current services. Historically, the term electronic mail was used generically for any electronic document transmission; for example, several writers in the early 1970s used the term to describe fax document transmission. As a result, it is difficult to find the first citation for the use of the term with the more specific meaning it has today. The spelling email appears in most dictionaries. Mail was the form used in the original protocol standard, RFC 524; the service is referred to as mail, and a piece of electronic mail is called a message. EMail is a form that has been used in RFCs for the Author's Address and is expressly required for historical reasons. E-mail is sometimes used, capitalizing the initial E as in similar abbreviations like E-piano, E-guitar and A-bomb. By 1968, AUTODIN had linked more than 300 sites in several countries. 
With the introduction of MIT's Compatible Time-Sharing System (CTSS) in 1961, multiple users could log in to a central system from remote dial-up terminals, and informal methods of using this to pass messages were developed and expanded. In 1965, MIT's CTSS gained its MAIL command. The Unix mail client was used to send messages between system users, and the concept was extended to communicate remotely over the Berkeley Network. In 1979 came EMAIL, an application written by Shiva Ayyadurai for the University of Medicine and Dentistry of New Jersey. Also in 1979, the MH Message Handling System developed at RAND provided several tools for managing electronic mail on Unix. Most of these systems only allowed communication between users logged into the same host or mainframe, although there might be hundreds or thousands of users within an organization. In the early 1980s, networked personal computers on LANs became increasingly important, and server-based systems similar to the earlier mainframe systems were developed.
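The store-and-forward model operates on structured messages of the kind these early tools produced; Python's standard library can build one (the addresses below are placeholders):

```python
from email.message import EmailMessage

# Build a minimal Internet-style message: headers plus a text body.
msg = EmailMessage()
msg["From"] = "alice@example.org"
msg["To"] = "bob@example.org"
msg["Subject"] = "Store-and-forward demo"
msg.set_content("The body travels with the headers from server to server.")

# Serialize to the textual wire format a mail server would accept and forward.
wire_format = msg.as_string()
```

Each server along the path reads the headers to decide where to forward the message, storing it until the next hop accepts it.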
10.
Contract
–
A contract is a voluntary arrangement between two or more parties that is enforceable by law as a binding legal agreement. Contract is a branch of the law of obligations in jurisdictions of the civil law tradition, and contract law concerns the rights and duties that arise from agreements. A contract arises when the parties agree that there is an agreement. Formation of a contract generally requires an offer, acceptance, consideration, and a mutual intent to be bound. Each party to a contract must have capacity to enter the agreement; minors, intoxicated persons, and those under a mental affliction may have insufficient capacity to enter a contract. Some types of contracts may require formalities, such as a memorialization in writing. At common law, the elements of a contract are offer, acceptance, intention to create legal relations, and consideration. Not all agreements are necessarily contractual, as the parties generally must be deemed to have an intention to be legally bound; a so-called gentlemen's agreement is one which is not intended to be legally enforceable, and which is binding in honour only. In order for a contract to be formed, the parties must reach mutual assent. This is typically reached through an offer and an acceptance which does not vary the offer's terms, which is known as the mirror image rule. An offer is a statement of the offeror's willingness to be bound should certain conditions be met. If a purported acceptance does vary the terms of an offer, it is not an acceptance but a counteroffer and, therefore, simultaneously a rejection of the original offer. The Uniform Commercial Code disposes of the mirror image rule in §2-207, although the UCC only governs transactions in goods in the USA. As a court cannot read minds, the intent of the parties is interpreted objectively from the perspective of a reasonable person. It is important to note that where an offer specifies a particular mode of acceptance, only an acceptance communicated via that method will be valid. 
Contracts may be bilateral or unilateral. A bilateral contract is an agreement in which each of the parties to the contract makes a promise or set of promises to the other, as, for example, in a contract for the sale of a home. Less common are unilateral contracts, in which one party makes a promise but the other side does not promise anything. In these cases, those accepting the offer are not required to communicate their acceptance to the offeror. In a reward contract, for example, a person who has lost a dog could promise a reward, through publication or orally, if the dog is found. The payment could additionally be conditioned on the dog being returned alive. Those who learn of the reward are not required to search for the dog, but if someone finds the dog and delivers it, the promisor is required to pay. The High Court of Australia has stated that the term unilateral contract is unscientific and misleading. In certain circumstances, an implied contract may be created. A contract is implied in fact if the circumstances imply that the parties have reached an agreement even though they have not done so expressly; quantum meruit claims are an example. In Carlill v Carbolic Smoke Ball Co, a firm advertised a smoke ball marketed as a wonder drug that would, according to the instructions, protect users from catching the flu.
12.
Uniform distribution (discrete)
–
Another way of describing the discrete uniform distribution is as a known, finite number of outcomes all equally likely to happen. A simple example of the uniform distribution is throwing a fair die: the possible values are 1, 2, 3, 4, 5, 6, each with probability 1/6. If two dice are thrown and their values added, the resulting distribution is no longer uniform, since not all sums have equal probability. The discrete uniform distribution itself is inherently non-parametric; it is convenient, however, to represent its values by an integer interval [a, b], so that a and b become the main parameters of the distribution. A common estimation task is to infer the unknown maximum N of the distribution from a sample of k observations. This problem is known as the German tank problem, following the application of maximum estimation to estimates of German tank production during World War II. The UMVU estimator for the maximum is N̂ = ((k + 1)/k) · m − 1 = m + m/k − 1, where m is the sample maximum and k is the sample size. This can be seen as a simple case of maximum spacing estimation. It has a variance of (1/k) · (N − k)(N + 1)/(k + 2) ≈ N²/k² for small samples k ≪ N, so a standard deviation of approximately N/k. The sample maximum itself is the maximum likelihood estimator for the population maximum, but it is biased. If samples are not numbered but are recognizable or markable, one can instead estimate population size via the capture-recapture method. See rencontres numbers for an account of the probability distribution of the number of fixed points of a uniformly distributed random permutation
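A quick simulation, with the made-up values N = 1000 and k = 10, illustrates that the estimator m + m/k − 1 is unbiased while the raw sample maximum is not:

```python
import random

def umvu_max_estimate(sample):
    # German tank estimator: N̂ = m + m/k − 1, where m is the sample
    # maximum and k the sample size.
    m, k = max(sample), len(sample)
    return m + m / k - 1

random.seed(1)
N, k, trials = 1000, 10, 2000
errors = []
for _ in range(trials):
    # Draw k distinct "serial numbers" uniformly from 1..N.
    sample = random.sample(range(1, N + 1), k)
    errors.append(umvu_max_estimate(sample) - N)

bias = sum(errors) / trials
# The average error hovers near zero (sd of the mean here is roughly
# (N/k)/sqrt(trials) ≈ 2), confirming the estimator is unbiased.
assert abs(bias) < 10
```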
13.
National Institute of Standards and Technology
–
The National Institute of Standards and Technology is a measurement standards laboratory and a non-regulatory agency of the United States Department of Commerce. Its mission is to promote innovation and industrial competitiveness. In 1821, John Quincy Adams had declared that "weights and measures may be ranked among the necessities of life to every individual of human society." From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures. President Theodore Roosevelt appointed Samuel W. Stratton as the first director of the new National Bureau of Standards. The budget for the first year of operation was $40,000; a laboratory site was constructed in Washington, DC, and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures, the Bureau developed instruments for electrical units. In 1905 a meeting was called that would become the first National Conference on Weights and Measures. Quality standards were developed for products including some types of clothing, automobile brake systems and headlamps, and antifreeze. During World War I, the Bureau worked on multiple problems related to war production, even operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. In 1948, financed by the Air Force, the Bureau began design and construction of SEAC, the Standards Eastern Automatic Computer; the computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diodes. About the same time the Standards Western Automatic Computer was built at the Los Angeles office of the NBS and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954. Due to a changing mission, the National Bureau of Standards became the National Institute of Standards and Technology in 1988.
Following 9/11, NIST conducted the investigation into the collapse of the World Trade Center buildings. NIST had a budget for fiscal year 2007 of about $843.3 million; its 2009 budget was $992 million, and it also received $610 million as part of the American Recovery and Reinvestment Act. NIST employs about 2,900 scientists, engineers, technicians, and support and administrative personnel, and about 1,800 NIST associates complement the staff. In addition, NIST partners with 1,400 manufacturing specialists and staff at nearly 350 affiliated centers around the country. NIST publishes Handbook 44, which provides the specifications, tolerances, and other technical requirements for weighing and measuring devices. The Congress of 1866 made use of the metric system in commerce a legally protected activity through the passage of the Metric Act of 1866. NIST is headquartered in Gaithersburg, Maryland, and operates a facility in Boulder, Colorado. NIST's activities are organized into laboratory programs and extramural programs. Effective October 1, 2010, NIST was realigned by reducing the number of NIST laboratory units from ten to six. NIST's Boulder laboratories are best known for NIST‑F1, an atomic clock that serves as the source of the official time. NIST also operates a neutron science user facility, the NIST Center for Neutron Research (NCNR), which provides scientists access to a variety of neutron scattering instruments used in many research fields
14.
Unary numeral system
–
The unary numeral system is the bijective base-1 numeral system. It is the simplest numeral system for representing natural numbers: to represent a number N, a symbol representing 1 is repeated N times. For example, the numbers 1, 2, 3, 4, 5 would be represented in this system as 1, 11, 111, 1111, 11111. These numbers should be distinguished from repunits, which are also written as sequences of ones but have their usual decimal numerical interpretation. This system is used in tallying; for example, using the tally mark |, the number 3 is represented as |||. In East Asian cultures, the number three is represented as "三", a character that is drawn with three strokes. Addition and subtraction are particularly simple in the unary system, as they involve little more than string concatenation. The Hamming weight or population count operation, which counts the number of nonzero bits in a sequence of binary values, may also be interpreted as a conversion from unary to binary numbers. However, multiplication is more cumbersome, and unary has often been used as a test case for the design of Turing machines. Compared to standard positional numeral systems, the unary system is inconvenient. It occurs in some decision problem descriptions in theoretical computer science, where a unary encoding makes the input exponentially longer than a positional one; therefore, while the run-time and space requirement in unary looks better as a function of the input size, it does not represent a more efficient solution. In computational complexity theory, unary numbering is used to distinguish strongly NP-complete problems from problems that are merely NP-complete; for a strongly NP-complete problem, there exist hard instances in which all parameter values are at most polynomially large. Unary is used as part of data compression algorithms such as Golomb coding. It also forms the basis for the Peano axioms for formalizing arithmetic within mathematical logic, and a form of unary notation called Church encoding is used to represent numbers within lambda calculus. See Sloane's A000042, "Unary representation of natural numbers", in the On-Line Encyclopedia of Integer Sequences
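The representation, its concatenation-as-addition property, and the unary reading of a Hamming weight can be shown in a few lines (the helper names are illustrative):

```python
def to_unary(n: int) -> str:
    # A number N is represented by N repetitions of the symbol for 1.
    return "1" * n

def from_unary(s: str) -> int:
    # The value of a unary numeral is just its length.
    return len(s)

# Addition is string concatenation; no carrying is ever needed.
assert to_unary(3) + to_unary(4) == to_unary(7)
assert from_unary(to_unary(13)) == 13

# The Hamming weight (population count) of a binary value reads its
# 1-bits as a unary numeral and reports them in binary:
assert bin(0b101101).count("1") == from_unary("1111")
```

The exponential size penalty is also visible here: `to_unary(n)` has length n, whereas the positional form `bin(n)` has length about log₂(n).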
15.
Whitfield Diffie
–
Bailey Whitfield "Whit" Diffie is an American cryptographer and one of the pioneers of public-key cryptography. Diffie and Martin Hellman's paper "New Directions in Cryptography" was published in 1976. It introduced a radically new method of distributing cryptographic keys, which went far toward solving one of the fundamental problems of cryptography: key distribution. The method has become known as Diffie–Hellman key exchange. The article also seems to have stimulated the almost immediate public development of a new class of encryption algorithms, the asymmetric key algorithms. Diffie was born in Washington, D.C., the son of Justine Louise, a writer and scholar, and Bailey Wallys Diffie, who taught Iberian history and culture at City College of New York. His interest in cryptography began at age 10. At MIT, he began to program computers in an effort to cultivate a practical skill set, while continuing to perceive computers as very low class: "I thought of myself as a mathematician and was very interested in partial differential equations and topology." From 1965 to 1969, he remained in Greater Boston as an assistant for the MITRE Corporation in Bedford; as MITRE was a defense contractor, this position enabled Diffie to avoid the draft. During this period, he helped to develop Mathlab and other non-military applications. Diffie left SAIL, the Stanford Artificial Intelligence Laboratory, to pursue independent research in cryptography from May 1973. He was assisted by his new girlfriend and future wife, Mary Fischer. A planned half-hour meeting between Diffie and Hellman extended over many hours as they shared ideas and information, and Hellman then hired Diffie as a grant-funded part-time research programmer for the 1975 spring term. Although it is unclear when he dropped out, Diffie remained employed in Hellman's lab as a research assistant through June 1978.
In 1975–76, Diffie and Hellman criticized the NBS's proposed Data Encryption Standard; an audio recording survives of their review of DES at Stanford in 1976 with Dennis Branstad of NBS and representatives of the National Security Agency. When brute-force DES-cracking machines were eventually built outside the classified world, they made it clear that DES was insecure: by 2012, a $10,000 commercially available machine could recover a DES key in days. In 1991 Diffie joined Sun Microsystems Laboratories in Menlo Park, California as a Distinguished Engineer, working primarily on public policy aspects of cryptography. Diffie remained with Sun, serving as its Chief Security Officer, and he was also a Sun Fellow. Diffie received a doctorate from the Swiss Federal Institute of Technology in 1992. He is also a fellow of the Marconi Foundation and a visiting fellow of the Isaac Newton Institute, and he has received various awards from other organisations. In July 2008, he was awarded a Degree of Doctor of Science by Royal Holloway
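The key exchange named after Diffie and Hellman can be sketched as follows; the prime used here is a toy choice for illustration, far smaller than the standardized groups real protocols require:

```python
import secrets

# Toy Diffie–Hellman exchange. The Mersenne prime below is far too small
# for real security; deployed systems use groups of 2048 bits or more.
p = 2**127 - 1   # public prime modulus
g = 3            # public generator

a = secrets.randbelow(p - 2) + 2   # Alice's secret exponent (never sent)
b = secrets.randbelow(p - 2) + 2   # Bob's secret exponent (never sent)

A = pow(g, a, p)   # Alice transmits A over the open channel
B = pow(g, b, p)   # Bob transmits B over the open channel

# Each side combines its own secret with the other's public value and
# arrives at the same shared secret, which was never transmitted.
assert pow(B, a, p) == pow(A, b, p)
```

An eavesdropper sees only p, g, A and B; recovering the shared secret from those is the discrete logarithm problem, which is what makes the exchange a solution to key distribution.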
16.
Martin Hellman
–
Martin Edward Hellman is an American cryptologist, best known for his invention of public key cryptography in cooperation with Whitfield Diffie and Ralph Merkle. Hellman graduated from the Bronx High School of Science. From 1968 to 1969 he worked at IBM's Thomas J. Watson Research Center in Yorktown Heights, New York, where he encountered Horst Feistel, and from 1969 to 1971 he was an assistant professor of engineering at the Massachusetts Institute of Technology. Hellman and Whitfield Diffie's paper "New Directions in Cryptography" was published in 1976. It introduced a radically new method of distributing cryptographic keys, which went far toward solving one of the fundamental problems of cryptography: key distribution. The method has become known as Diffie–Hellman key exchange, although Hellman has argued that it ought to be called Diffie–Hellman–Merkle key exchange because of Merkle's separate contribution. The article stimulated the development of a new class of encryption algorithms, known variously as public key encryption and asymmetric encryption. Hellman has been a contributor to the computer privacy debate: he and Diffie were the most prominent critics of the key size of the Data Encryption Standard in 1975, and an audio recording survives of their review of DES at Stanford in 1976 with Dennis Branstad of NBS. In response to RSA Security's DES Challenges starting in 1997, brute force crackers were built that could break DES, making it clear that DES was insecure and obsolete; in 2012, a $10,000 commercially available machine could recover a DES key in days. Hellman also served on the National Research Council's Committee to Study National Cryptographic Policy, whose main recommendations have since been implemented. Hellman has been active in researching international security since 1985. He was involved in the original Beyond War movement, serving as the principal editor for the booklet BEYOND WAR: A New Way of Thinking.
Anatoly Gromyko and Martin Hellman served as the chief editors of a later book whose authors examine questions such as: How can we overcome the inexorable forces leading toward a clash between the United States and the Soviet Union? How do we build a vision for the future? How can we restructure our thinking to synchronize with the imperatives of our modern world? Hellman's current project in international security is to defuse the nuclear threat; in particular, he is studying the probabilities and risks associated with nuclear weapons. His website NuclearRisk.org has been endorsed by a number of prominent individuals, including a former Director of the National Security Agency, Stanford's President Emeritus, and two Nobel Laureates. Hellman is a member of the Board of Directors for Daisy Alliance. In 2011, he was inducted into the National Inventors Hall of Fame; also in 2011, he was made a Fellow of the Computer History Museum for his work, with Whitfield Diffie and Ralph Merkle, on public key cryptography. Hellman won the Turing Award for 2015 together with Whitfield Diffie; the Turing Award is considered the most prestigious award in the field of computer science. The citation for the award was: "For fundamental contributions to modern cryptography"
17.
Ronald Rivest
–
Ronald Linn Rivest is a cryptographer and an Institute Professor at MIT. He is a member of MIT's Department of Electrical Engineering and Computer Science, and he was a member of the Election Assistance Commission's Technical Guidelines Development Committee, tasked with assisting the EAC in drafting the Voluntary Voting System Guidelines. Rivest is one of the inventors of the RSA algorithm. He is also the inventor of the symmetric key encryption algorithms RC2, RC4 and RC5, and co-inventor of RC6; the "RC" stands for "Rivest Cipher" or, alternatively, "Ron's Code". He also authored the MD2, MD4, MD5 and MD6 cryptographic hash functions. Rivest designed the ThreeBallot voting system; most importantly, this system does not rely on cryptography at all. Stating "our democracy is too important", he simultaneously placed ThreeBallot in the public domain. Rivest earned a Bachelor's degree in Mathematics from Yale University in 1969, and he is a co-author of Introduction to Algorithms, a standard textbook on algorithms, with Thomas H. Cormen, Charles E. Leiserson and Clifford Stein. He is a member of the MIT Computer Science and Artificial Intelligence Laboratory in the Theory of Computation Group, and he was a founder of RSA Data Security, Verisign, and Peppercoin. Professor Rivest has research interests in cryptography and computer and network security. Together with Adi Shamir and Len Adleman, he has been awarded the 2000 IEEE Koji Kobayashi Computers and Communications Award and the Secure Computing Lifetime Achievement Award; he also shared with them the Turing Award. Rivest has received an honorary degree from the Sapienza University of Rome; in 2005, he received the MITX Lifetime Achievement Award; he was named a 2007 Marconi Fellow, and on May 29, 2008 he gave the Chesley lecture at Carleton College. He was named an Institute Professor at MIT in June 2015.
18.
Len Adleman
–
Leonard Adleman is an American computer scientist. He is one of the creators of the RSA encryption algorithm, for which he received the 2002 Turing Award, and he is also known for the creation of the field of DNA computing. He grew up in San Francisco and attended the University of California, Berkeley; he was also the mathematical consultant on the movie Sneakers. He is a member of the National Academy of Engineering and the National Academy of Sciences. Adleman is also an amateur boxer and has sparred with James Toney. In 1994, his paper "Molecular Computation of Solutions to Combinatorial Problems" described the use of DNA as a computational system; in it, he solved a seven-node instance of the Hamiltonian path problem. While the solution to a seven-node instance is trivial, this paper is the first known instance of the successful use of DNA to compute an algorithm. DNA computing has been shown to have potential as a means to solve several other large-scale combinatorial search problems. In 2002, he and his research group managed to solve a nontrivial problem using DNA computation: specifically, they solved a 20-variable SAT problem having more than 1 million potential solutions, and they did it in a manner similar to the one Adleman used in his seminal 1994 paper. First, a mixture of DNA strands logically representative of the solution space was synthesized. This mixture was then operated upon algorithmically using biochemical techniques to winnow out the incorrect strands; analysis of the nucleotide sequence of the remaining strands revealed correct solutions to the original problem. He is one of the discoverers of the Adleman–Pomerance–Rumely primality test. Fred Cohen, in his 1984 paper "Experiments with Computer Viruses", credited Adleman with coining the term "virus", and Adleman is also widely referred to as the Father of DNA Computing. Currently, Adleman is working on the theory of Strata.
In addition, he is a Computer Science professor at the University of Southern California. Adleman was elected a Fellow of the American Academy of Arts and Sciences in 2006
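Adleman's seven-node instance highlights how small the DNA experiment's search space was. A brute-force sketch of the same combinatorial problem, over a made-up example graph rather than the one from his paper, shows the factorial blow-up that DNA's massive parallelism was meant to attack:

```python
from itertools import permutations

# A made-up seven-node undirected graph (illustrative, not the instance
# from Adleman's 1994 paper).
edges = {(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (1, 4), (2, 5)}

def hamiltonian_path(n, edges):
    # Try every ordering of the n vertices and return one in which every
    # consecutive pair is an edge. This is O(n!) orderings, which is why
    # an exponentially parallel medium like DNA looked attractive for
    # such search problems.
    for perm in permutations(range(n)):
        if all((u, v) in edges or (v, u) in edges
               for u, v in zip(perm, perm[1:])):
            return perm
    return None

print(hamiltonian_path(7, edges))  # → (0, 1, 2, 3, 4, 5, 6)
```

Seven nodes means only 5040 orderings, trivial for a laptop; the interest of the 1994 experiment was the medium, not the instance size.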
19.
RSA (algorithm)
–
RSA is one of the first practical public-key cryptosystems and is widely used for secure data transmission. In such a cryptosystem, the encryption key is public and differs from the decryption key, which is kept secret. In RSA, this asymmetry is based on the practical difficulty of factoring the product of two large prime numbers, the factoring problem. The acronym RSA is made of the initial letters of the surnames of Ron Rivest, Adi Shamir, and Leonard Adleman. Clifford Cocks, an English mathematician working for the UK intelligence agency GCHQ, had developed an equivalent system in 1973. A user of RSA creates and then publishes a public key based on two large prime numbers, along with an auxiliary value; the prime numbers must be kept secret. Breaking RSA encryption is known as the RSA problem; whether it is as hard as the factoring problem remains an open question. RSA is a relatively slow algorithm, and because of this it is less commonly used to directly encrypt user data. More often, RSA passes encrypted shared keys for symmetric key cryptography, which in turn can perform bulk encryption-decryption operations at much higher speed. The idea of an asymmetric public-private key cryptosystem is attributed to Whitfield Diffie and Martin Hellman, who also introduced digital signatures and attempted to apply number theory; their formulation used a shared secret key created from exponentiation of some number, modulo a prime number. However, they left open the problem of realizing a one-way function. Ron Rivest, Adi Shamir, and Leonard Adleman at MIT made several attempts over the course of a year to create a one-way function that is hard to invert. Rivest and Shamir, as computer scientists, proposed many potential functions, while Adleman, as a mathematician, was responsible for finding their weaknesses. They tried many approaches, including knapsack-based cryptosystems and permutation polynomials; for a time they thought what they wanted to achieve was impossible due to contradictory requirements.
In April 1977, they spent Passover at the house of a student. Rivest, unable to sleep, lay on the couch with a math textbook and started thinking about their one-way function; he spent the rest of the night formalizing his idea and had much of the paper ready by daybreak. The algorithm is now known as RSA, from the initials of their surnames in the same order as their paper. Clifford Cocks, the English mathematician at GCHQ who had arrived at an equivalent system in 1973, did not see his version deployed: given the relatively expensive computers needed to implement it at the time, it was mostly considered a curiosity and, as far as is publicly known, was never used. His discovery was not revealed until 1997 due to its secret classification. Kid-RSA is a simplified public-key cipher published in 1997, designed for educational purposes; some people feel that learning Kid-RSA gives insight into RSA and other public-key ciphers. U.S. Patent 4,405,829, for a "Cryptographic communications system and method" that used the algorithm, was granted on September 20, 1983
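The arithmetic at the heart of the algorithm can be walked through with toy primes (61 and 53 here, wholly insecure and for illustration only):

```python
# Toy RSA key generation with tiny primes; real keys use primes of
# roughly 1024 bits or more each.
p, q = 61, 53
n = p * q                       # public modulus
phi = (p - 1) * (q - 1)         # Euler's totient of n
e = 17                          # public exponent, coprime to phi
d = pow(e, -1, phi)             # private exponent: modular inverse of e
                                # (3-argument inverse needs Python 3.8+)

message = 65                    # a message, encoded as an integer < n
ciphertext = pow(message, e, n)           # encrypt with the public key
assert pow(ciphertext, d, n) == message   # decrypt with the private key

# The same asymmetry gives a signature: transform with d, verify with e.
signature = pow(message, d, n)
assert pow(signature, e, n) == message
```

Everything public is (n, e); recovering d without knowing the factors p and q is the RSA problem described above.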
20.
Lotus Notes
–
IBM Notes and IBM Domino are the client and server, respectively, of a collaborative client-server software platform sold by IBM. IBM Notes can also be used with other IBM Domino applications. IBM Notes 9 Social Edition removed integration with the office software package IBM Lotus Symphony, which had been integrated with the IBM Lotus Notes client in versions 8.x. Lotus Development Corporation originally developed Lotus Notes in 1989; IBM bought the Lotus Corporation in 1995, and it became known as the Lotus Development division of IBM. As of 2015 it forms part of the IBM Software and Systems Group under the name IBM Collaboration Solutions. IBM Notes is a desktop workflow application, commonly used in corporate environments for email, but it can also be used to access databases such as document libraries and custom applications. IBM Notes is a client-server, cross-platform application runtime environment that provides an interface to IBM Notes and Domino applications; IBM Notes can also be used as an email client without an IBM Domino server, for example as an IMAP client. IBM Notes and Domino provide email, calendars, instant messaging, discussions/forums and blogs, and they compete with products from other companies such as Microsoft, Google, Zimbra and others. Because of its application development abilities, IBM Notes and Domino is often compared to products like Microsoft SharePoint. The databases in IBM Notes and Domino can be replicated between servers and between server and client, thereby giving clients offline capabilities. IBM Domino, an application server as well as a messaging server, is compatible with both IBM Notes and web browsers: IBM Notes may be used to access any IBM Domino application, such as forums and document libraries. IBM Notes resembles a web browser in that it may run any application that the user has permission for. The .nsf file will normally contain both a design and its associated data.
IBM Notes can also access relational databases, for example through an add-on server called IBM Enterprise Integrator for Domino. As IBM Notes and Domino is a runtime environment, email and calendars operate as applications within IBM Notes; a Domino application developer can change or completely replace those applications, and IBM has released the base templates as open source as well. IBM Notes can be used for email, as a calendar, PIM, instant messaging and Web browsing, and Notes can access both local and server-based applications and data. IBM Notes can function as an IMAP and POP email client with non-Domino mail servers. Features include group calendars and schedules, SMTP/MIME-based email, NNTP-based news support, and automatic HTML conversion of all documents by the Domino HTTP task. IBM Notes can be used with IBM Sametime instant messaging to allow users to see other users online and chat with them; beginning with Release 6.5, this function has been freely available. Presence awareness is available in email and other IBM Domino applications for users in organizations that use both IBM Notes and IBM Sametime. Since version 7, Notes has provided a Web services interface
21.
Merkle tree
–
In cryptography and computer science, a hash tree or Merkle tree is a tree in which every non-leaf node is labelled with the hash of the labels or values of its child nodes. Hash trees allow efficient and secure verification of the contents of large data structures, and they are a generalization of hash lists and hash chains. The concept of hash trees is named after Ralph Merkle, who patented it in 1979. Hash trees can be used to verify any kind of data stored, handled and transferred in and between computers, and suggestions have been made to use hash trees in trusted computing systems. A hash tree is a tree of hashes in which the leaves are hashes of data blocks in, for instance, a file or set of files. Nodes further up in the tree are the hashes of their respective children; for example, hash 0 is the result of hashing the concatenation of hash 0-0 and hash 0-1. That is, hash 0 = hash(hash 0-0 + hash 0-1), where + denotes concatenation. Most hash tree implementations are binary (two child nodes under each node), but they can just as well use many more child nodes under each node. Usually, a cryptographic hash function such as SHA-2 is used for the hashing; if the hash tree only needs to protect against unintentional damage, unsecured checksums such as CRCs can be used. At the top of a hash tree there is a top hash. When the top hash is available from a trusted source, the hash tree itself can be received from any non-trusted source and verified against it. Similarly, the integrity of an individual data block, such as data block 3, can be verified using only the hashes on its path to the top, such as hash 1-1; this can be an advantage, since it is efficient to split files up into very small data blocks so that only small blocks have to be re-downloaded if they get damaged. If the hashed file is very big, such a hash tree or hash list becomes fairly big; but if it is a tree, one small branch can be downloaded quickly and the integrity of the branch can be checked. The Merkle hash root does not indicate the tree depth, enabling a second-preimage attack in which an attacker creates a document other than the original that has the same Merkle hash root.
For the example above, an attacker can create a new document containing two data blocks, where the first is hash 0-0 + hash 0-1 and the second is hash 1-0 + hash 1-1; the new document then has the same Merkle hash root. One simple fix, defined in Certificate Transparency, is to use distinct prefix bytes when computing leaf node hashes and internal node hashes. Limiting the hash tree size is a prerequisite of some formal security proofs, and helps in making some proofs tighter. The Tiger tree hash is a widely used form of hash tree; it uses a binary hash tree, usually has a data block size of 1024 bytes, and uses the cryptographically secure Tiger hash. Tiger tree hashes are used in the Gnutella, Gnutella2 and Direct Connect P2P file sharing protocols, and in file sharing applications such as Phex, BearShare, LimeWire, Shareaza and DC++. A Tiger tree hash is usually conveyed in base32 form, for example in a URN such as urn:tree:tiger:RBOEI7UYRYO5SUXGER5NMUOEZ5O6E4BHPP2MRFQ, which can also be embedded in a magnet link
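A minimal sketch of building and verifying a four-block binary hash tree, using SHA-256 rather than Tiger for convenience:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

# Leaves are hashes of the data blocks; inner nodes hash the
# concatenation of their two children, as in the text's example.
blocks = [b"block0", b"block1", b"block2", b"block3"]
leaves = [h(b) for b in blocks]            # hash 0-0, 0-1, 1-0, 1-1
hash0 = h(leaves[0] + leaves[1])
hash1 = h(leaves[2] + leaves[3])
top_hash = h(hash0 + hash1)

# Verify block 3 against the trusted top hash using only its audit path
# (hash 1-0 and hash 0): no other data blocks need to be downloaded.
recomputed = h(hash0 + h(leaves[2] + h(b"block3")))
assert recomputed == top_hash

# Any change to a block changes the root, so damage is detected.
assert h(hash0 + h(leaves[2] + h(b"tampered"))) != top_hash
```

Note this sketch hashes leaves and inner nodes identically, so it inherits exactly the second-preimage weakness the text describes; the Certificate Transparency fix would prefix the two cases with different bytes.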
22.
Shafi Goldwasser
–
Shafrira Goldwasser is an American-born Israeli computer scientist. She is a professor of engineering and computer science at MIT; she joined MIT in 1983, and in 1997 became the first holder of the RSA Professorship. In 1993 she also became a professor at the Weizmann Institute of Science, concurrent with her professorship at MIT. She is a member of the Theory of Computation group at the MIT Computer Science and Artificial Intelligence Laboratory. Goldwasser was a co-recipient of the 2012 Turing Award. Goldwasser's research areas include computational complexity theory, cryptography and computational number theory; her work in complexity theory includes the classification of approximation problems, showing that some problems in NP remain hard even when only an approximate solution is needed. Goldwasser has twice won the Gödel Prize in theoretical computer science, first in 1993. In 2001 she was elected to the American Academy of Arts and Sciences, in 2004 she was elected to the National Academy of Sciences, and she was selected as an IACR Fellow in 2007. Goldwasser received the 2008–2009 Athena Lecturer Award of the Association for Computing Machinery's Committee on Women in Computing, and she is the recipient of The Franklin Institute's 2010 Benjamin Franklin Medal in Computer and Cognitive Science. She received the IEEE Emanuel R. Piore Award in 2011, and was awarded the 2012 Turing Award along with Silvio Micali for their work in the field of cryptography
23.
Silvio Micali
–
Silvio Micali is a computer scientist at MIT whose research centers on the theory of cryptography and information security. Micali won the Gödel Prize in 1993. In 2007, he was selected to be a member of the National Academy of Sciences and a Fellow of the IACR, and he is also a member of the National Academy of Engineering and a Fellow of the American Academy of Arts and Sciences. He received the Turing Award for the year 2012 along with Shafi Goldwasser for their work in the field of cryptography. In 2015 the University of Salerno acknowledged his studies by awarding him an honoris causa degree in Computer Science
24.
Modular arithmetic
–
In mathematics, modular arithmetic is a system of arithmetic for integers, where numbers "wrap around" upon reaching a certain value, called the modulus. The modern approach to modular arithmetic was developed by Carl Friedrich Gauss in his book Disquisitiones Arithmeticae. A familiar use of modular arithmetic is in the 12-hour clock, in which the day is divided into two 12-hour periods. If the time is 7:00 now, then 8 hours later it will be 3:00. Usual addition would suggest that the later time should be 7 + 8 = 15, but clocks "wrap around" every 12 hours. Likewise, if the clock starts at 12:00 and 21 hours elapse, then the time will be 9:00 the next day. Because the hour number starts over after it reaches 12, this is arithmetic modulo 12; according to the definition below, 12 is congruent not only to 12 itself, but also to 0. Modular arithmetic can be handled mathematically by introducing a congruence relation on the integers that is compatible with the operations on integers: addition, subtraction, and multiplication. For a positive integer n, two integers a and b are said to be congruent modulo n, written a ≡ b (mod n), if their difference a − b is an integer multiple of n. The number n is called the modulus of the congruence. For example, 38 ≡ 14 (mod 12) because 38 − 14 = 24, which is a multiple of 12. The same rule holds for negative values: −8 ≡ 7 (mod 5), 2 ≡ −3 (mod 5), and −3 ≡ −8 (mod 5). Equivalently, a ≡ b (mod n) can also be thought of as asserting that the remainders of the division of both a and b by n are the same; for instance, 38 ≡ 14 (mod 12) because both 38 and 14 have the same remainder 2 when divided by 12, and it is also the case that 38 − 14 = 24 is a multiple of 12. A remark on the notation: it is common to consider several congruence relations for different moduli at the same time, which is why the modulus appears in the notation. In spite of the ternary notation, the congruence relation for a given modulus is binary; this would have been clearer if the notation a ≡n b had been used. The properties that make this relation a congruence relation are the following: if a1 ≡ b1 (mod n) and a2 ≡ b2 (mod n), then a1 + a2 ≡ b1 + b2 (mod n) and a1 − a2 ≡ b1 − b2 (mod n).
The above two properties would still hold if the theory were expanded to all real numbers, that is, if a1, a2, b1, b2 were not necessarily integers. The next property, compatibility with multiplication (a1 a2 ≡ b1 b2 (mod n)), would fail, however, if these variables were not all integers. The notion of modular arithmetic is related to that of the remainder in Euclidean division; the operation of finding the remainder is referred to as the modulo operation. For example, the remainder of the division of 14 by 12 is denoted by 14 mod 12; as this remainder is 2, we have 14 mod 12 = 2
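The congruences and compatibility properties above can be checked numerically; Python's % operator implements the modulo operation:

```python
# The worked examples from the text, checked with integer arithmetic.
n = 12
assert 38 % n == 14 % n == 2        # 38 ≡ 14 (mod 12)
assert (38 - 14) % n == 0           # equivalently, n divides the difference

# Congruence is compatible with addition, subtraction and multiplication
# (27 ≡ 3 (mod 12) is a second example pair):
a1, b1, a2, b2 = 38, 14, 27, 3
assert (a1 + a2) % n == (b1 + b2) % n
assert (a1 - a2) % n == (b1 - b2) % n
assert (a1 * a2) % n == (b1 * b2) % n

# Clock arithmetic: % always gives a representative in [0, n).
assert (7 + 8) % 12 == 3            # 8 hours after 7:00 reads 3:00
assert (12 + 21) % 12 == 9          # 21 hours after 12:00 reads 9:00
```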
25.
Euler's totient function
–
In number theory, Euler's totient function counts the positive integers up to a given integer n that are relatively prime to n. It is written using the Greek letter phi as φ(n) or ϕ(n), and it can be defined more formally as the number of integers k in the range 1 ≤ k ≤ n for which the greatest common divisor gcd(n, k) is equal to 1. The integers k of this form are referred to as totatives of n. For example, the totatives of n = 9 are the six numbers 1, 2, 4, 5, 7 and 8. They are all relatively prime to 9, but the three other numbers in this range, 3, 6 and 9, are not, because gcd(9, 3) = gcd(9, 6) = 3 and gcd(9, 9) = 9. As another example, φ(1) = 1, since for n = 1 the only integer in the range from 1 to n is 1 itself. Euler's totient function is a multiplicative function, meaning that if two numbers m and n are relatively prime, then φ(mn) = φ(m)φ(n). This function gives the order of the multiplicative group of integers modulo n. It also plays a key role in the definition of the RSA encryption system. Leonhard Euler introduced the function in 1763; however, he did not at that time choose any specific symbol to denote it. In a 1784 publication, Euler studied the function further, choosing the Greek letter π to denote it: he wrote πD for the multitude of numbers less than D that have no common divisor with it. This definition varies from the current definition for the totient function at D = 1 but is otherwise the same. The now-standard notation φ(A) comes from Gauss's 1801 treatise Disquisitiones Arithmeticae, although Gauss didn't use parentheses around the argument and wrote φA. Thus, it is often called Euler's phi function or simply the phi function. In 1879, J. J. Sylvester coined the term totient for this function, so it is also referred to as Euler's totient function. Jordan's totient is a generalization of Euler's. The cototient of n is defined as n − φ(n); it counts the number of positive integers less than or equal to n that have at least one prime factor in common with n. There are several formulas for computing φ(n). Euler's product formula states that φ(n) = n ∏_{p | n} (1 − 1/p), where the product is over the distinct prime numbers dividing n.
The proof of Euler's product formula depends on two important facts. The first is that the function is multiplicative: if gcd(m, n) = 1, then φ(mn) = φ(m)φ(n). The second is that if p is prime and k ≥ 1, then φ(p^k) = p^k − p^(k−1) = p^(k−1)(p − 1) = p^k(1 − 1/p). Proof: since p is a prime number, the only possible values of gcd(p^k, m) are 1, p, p^2, …, p^k
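Combining the two facts above gives the standard fast way to compute the totient: factor n by trial division and apply the factor (1 − 1/p) once for each distinct prime p. A minimal sketch:

```python
def phi(n: int) -> int:
    """Euler's totient via the product formula: phi(n) = n * prod_{p|n} (1 - 1/p)."""
    result = n
    m = n
    p = 2
    while p * p <= m:
        if m % p == 0:
            while m % p == 0:   # strip the full power p^k from m
                m //= p
            result -= result // p   # exact integer form of result *= (1 - 1/p)
        p += 1
    if m > 1:                   # m is now 1 or a single remaining prime factor
        result -= result // m
    return result

print(phi(9))    # 6
print(phi(12))   # 4: the totatives of 12 are 1, 5, 7, 11
```

The subtraction `result -= result // p` is exact because every distinct prime factor of n still divides `result` at the moment it is processed.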
26.
Cryptographic hash function
–
A cryptographic hash function is a special class of hash function that has certain properties which make it suitable for use in cryptography. It is an algorithm that maps data of arbitrary size to a bit string of a fixed size, and it is designed to also be a one-way function, that is, a function which is infeasible to invert. Bruce Schneier has called one-way hash functions the workhorses of modern cryptography. The input data is often called the message, and the output is often called the message digest or simply the digest. Most cryptographic hash functions are designed to take a string of any length as input. A cryptographic hash function must be able to withstand all known types of cryptanalytic attack; in particular, it should have the following properties. 
Pre-image resistance: given a hash value h, it should be difficult to find any message m such that h = hash(m). This concept is related to that of a one-way function; functions that lack this property are vulnerable to preimage attacks. 
Second pre-image resistance: given an input m1, it should be difficult to find a different input m2 such that hash(m1) = hash(m2). Functions that lack this property are vulnerable to second-preimage attacks. 
Collision resistance: it should be difficult to find two different messages m1 and m2 such that hash(m1) = hash(m2); such a pair is called a cryptographic hash collision. This property is referred to as strong collision resistance. It requires a hash value at least twice as long as that required for pre-image resistance. Collision resistance implies second pre-image resistance, but does not imply pre-image resistance. 
Informally, these properties mean that a malicious adversary cannot replace or modify the input data without changing its digest. Thus, if two strings have the same digest, one can be confident that they are identical. A function meeting these criteria may still have undesirable properties, such as susceptibility to length-extension, and this can be used to break naive authentication schemes based on hash functions. The HMAC construction works around these problems. In practice, collision resistance is insufficient for many practical uses. 
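The message-in, fixed-size-digest-out behaviour, and the way any change to the message yields an unrelated digest, can be seen with Python's standard hashlib (the particular messages are our own illustrative choice):

```python
import hashlib

# The "message" goes in; a fixed-size "digest" comes out.
msg = b"The quick brown fox jumps over the lazy dog"
digest = hashlib.sha256(msg).hexdigest()
print(len(digest))   # 64 hex characters = 256 bits, regardless of input length

# Changing a single character produces an unrelated digest (the avalanche
# effect), which is what the resistance properties above rely on in practice.
digest2 = hashlib.sha256(b"The quick brown fox jumps over the lazy cog").hexdigest()
print(digest == digest2)   # False
```

The same input always yields the same digest, so digests can be compared in place of the (possibly large) messages themselves.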
In particular, a cryptographic hash function should behave as much as possible like a random function while still being deterministic and efficiently computable. Checksum algorithms, such as CRC32 and other cyclic redundancy checks, are designed to meet much weaker requirements and are generally unsuitable as cryptographic hash functions. For example, a CRC was used for message integrity in the WEP encryption standard, where its linearity made forgeries easy. In these definitions, the meaning of "difficult" is somewhat dependent on the application, since the effort that a malicious agent may put into the task is usually proportional to his expected gain. However, the effort usually grows very quickly with the digest length, so even a large advantage in computing power can be neutralized by a modestly longer digest. For messages selected from a small set, for example passwords or other short messages, it can be feasible to invert a hash by trying every message in the set. In some theoretical analyses "difficult" has a specific meaning, such as not solvable in asymptotic polynomial time
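The gap between a checksum and a cryptographic hash can be made concrete. For equal-length inputs, CRC32 is an affine function over GF(2), so an XOR relationship between three messages carries over to their checksums, letting an attacker predict the checksum of a forged message; SHA-256 exhibits no such structure. A small sketch (the message strings are our own illustrative choice):

```python
import hashlib
import zlib

# Three equal-length messages and their bytewise XOR.
a, b, c = b"attack at dawn!!", b"attack at dusk!!", b"retreat at noon!"
x = bytes(i ^ j ^ k for i, j, k in zip(a, b, c))

# CRC32 is affine over GF(2): the three-way XOR relation between equal-length
# messages also holds between their checksums -- fatal for WEP-style integrity.
print(zlib.crc32(x) == zlib.crc32(a) ^ zlib.crc32(b) ^ zlib.crc32(c))  # True

# SHA-256 has no such linear structure; equality here is astronomically unlikely.
h = lambda m: int.from_bytes(hashlib.sha256(m).digest(), "big")
print(h(x) == h(a) ^ h(b) ^ h(c))  # False
```

This linearity is exactly the "much weaker requirement" that makes CRCs fine for detecting accidental corruption but useless against a deliberate adversary.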
27.
Permutation
–
In mathematics, a permutation of a set is an arrangement of its members into a sequence or linear order. Permutations differ from combinations, which are selections of some members of a set where order is disregarded. For example, written as tuples, there are six permutations of the three-element set {1, 2, 3}, namely (1, 2, 3), (1, 3, 2), (2, 1, 3), (2, 3, 1), (3, 1, 2) and (3, 2, 1), and these are all the possible orderings of this set. As another example, an anagram of a word, all of whose letters are different, is a permutation of its letters; in this example, the letters are already ordered in the original word and the anagram is a reordering of the letters. The study of permutations of finite sets is a topic in the field of combinatorics. Permutations occur, in more or less prominent ways, in almost every area of mathematics. For similar reasons permutations arise in the study of sorting algorithms in computer science. The number of permutations of n distinct objects is n factorial, usually written as n!, which means the product of all positive integers less than or equal to n. 
In algebra, and particularly in group theory, a permutation of a set S is defined as a bijection from S to itself; that is, it is a function from S to S for which every element occurs exactly once as an image value. This is related to the rearrangement of the elements of S in which each element s is replaced by the corresponding f(s). The collection of such permutations forms a group called the symmetric group of S. The key to this structure is the fact that the composition of two permutations results in another rearrangement. Permutations may act on structured objects by rearranging their components, or by certain replacements of symbols. In elementary combinatorics, the k-permutations, or partial permutations, are the ordered arrangements of k distinct elements selected from a set. When k is equal to the size of the set, these are the permutations of the set. Fabian Stedman in 1677 described factorials when explaining the number of permutations of bells in change ringing. 
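Both the n! orderings of a set and the k-permutations described above are available directly from Python's standard library, which makes the counts easy to verify:

```python
from itertools import permutations
from math import factorial

# All 3! = 6 orderings of a three-element set, written as tuples.
print(list(permutations([1, 2, 3])))
# [(1, 2, 3), (1, 3, 2), (2, 1, 3), (2, 3, 1), (3, 1, 2), (3, 2, 1)]

# k-permutations (partial permutations): ordered arrangements of k distinct
# elements selected from the set. For k = 2 out of n = 3 there are
# n!/(n-k)! = 6 of them.
print(list(permutations([1, 2, 3], 2)))

# The number of permutations of n distinct objects is n factorial.
print(len(list(permutations(range(4)))) == factorial(4))   # True
```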
Starting from two bells: "first, two must be admitted to be varied in two ways", which he illustrates by showing 12 and 21. He then explains that with three bells there are "three times two figures to be produced out of three", which again is illustrated. His explanation involves: "cast away 3, and 1.2 will remain; cast away 2, and 1.3 will remain; cast away 1, and 2.3 will remain". He then moves on to four bells and repeats the casting-away argument, showing that there will be four different sets of three; effectively this is a recursive process. He continues with five bells using the casting-away method and tabulates the resulting 120 combinations. At this point he gives up and remarks, "Now the nature of these methods is such…". In modern mathematics there are many similar situations in which understanding a problem requires studying certain permutations related to it. 
There are two equivalent common ways of regarding permutations, sometimes called the active and passive forms, or in older terminology substitutions and permutations; which form is preferable depends on the type of questions being asked in a given discipline. The active way to regard permutations of a set S is to view them as the bijections from S to itself. Thus, the permutations are thought of as functions which can be composed with each other, forming groups of permutations
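Stedman's casting-away argument is itself a recursive algorithm: to list the orderings of n bells, cast away each bell in turn and prefix it to every ordering of the remaining n − 1, so the count satisfies P(n) = n · P(n − 1) = n!. A minimal sketch of that recursion (the function name perms is ours):

```python
def perms(items):
    """List all orderings of items by Stedman's casting-away recursion."""
    if not items:
        return [[]]          # one way to order nothing: the empty ordering
    out = []
    for i, x in enumerate(items):
        rest = items[:i] + items[i + 1:]          # "cast away" x
        out.extend([x] + tail for tail in perms(rest))
    return out

print(perms([1, 2, 3]))
# [[1, 2, 3], [1, 3, 2], [2, 1, 3], [2, 3, 1], [3, 1, 2], [3, 2, 1]]

print(len(perms([1, 2, 3, 4, 5])))   # 120, Stedman's tabulated count for five bells
```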