Authentication is the act of confirming the truth of an attribute of a single piece of data claimed true by an entity. In contrast with identification, which refers to the act of stating or otherwise indicating a claim purportedly attesting to a person or thing's identity, authentication is the process of confirming that identity. It might involve confirming the identity of a person by validating their identity documents, verifying the authenticity of a website with a digital certificate, determining the age of an artifact by carbon dating, or ensuring that a product is what its packaging and labeling claim to be. In other words, authentication involves verifying the validity of at least one form of identification. Authentication is relevant to multiple fields. In art and anthropology, a common problem is verifying that a given artifact was produced by a certain person or in a certain place or period of history. In computer science, verifying a person's identity is required to allow access to confidential data or systems.
Authentication can be considered to be of three types. The first type is accepting proof of identity given by a credible person who has first-hand evidence that the identity is genuine. When authentication is required of art or physical objects, this proof could be a friend, family member, or colleague attesting to the item's provenance, perhaps by having witnessed the item in its creator's possession. With autographed sports memorabilia, this could involve someone attesting that they witnessed the object being signed. A vendor selling branded items implies authenticity, although he or she may have no evidence that every step in the supply chain was authenticated. Centralized authority-based trust relationships back most secure Internet communication through known public certificate authorities. The second type of authentication is comparing the attributes of the object itself to what is known about objects of that origin. For example, an art expert might look for similarities in the style of painting, check the location and form of a signature, or compare the object to an old photograph.
An archaeologist, on the other hand, might use carbon dating to verify the age of an artifact, perform a chemical and spectroscopic analysis of the materials used, or compare the style of construction or decoration to other artifacts of similar origin. The physics of sound and light, and comparison with a known physical environment, can be used to examine the authenticity of audio recordings, photographs, or videos. Documents can be verified as being created on ink or paper available at the time of the item's implied creation. Attribute comparison may be vulnerable to forgery. In general, it relies on the facts that creating a forgery indistinguishable from a genuine artifact requires expert knowledge, that mistakes are easily made, and that the amount of effort required to do so is greater than the amount of profit that can be gained from the forgery. In art and antiques, certificates are of great importance for authenticating an object of interest and value. Certificates can, however, be forged, and the authentication of these poses a problem of its own.
For instance, the son of Han van Meegeren, the well-known art forger, forged the work of his father and provided a certificate for its provenance as well. Criminal and civil penalties for fraud and counterfeiting can reduce the incentive for falsification, depending on the risk of getting caught. Currency and other financial instruments commonly use this second type of authentication method. Bills and cheques incorporate hard-to-duplicate physical features, such as fine printing or engraving, distinctive feel, and holographic imagery, which are easy for trained receivers to verify. The third type of authentication relies on documentation or other external affirmations. In criminal courts, the rules of evidence require establishing the chain of custody of evidence presented; this can be accomplished through a written evidence log, or by testimony from the police detectives and forensics staff that handled it. Some antiques are accompanied by certificates attesting to their authenticity. Signed sports memorabilia is usually accompanied by a certificate of authenticity.
These external records have their own problems of forgery and perjury, and are also vulnerable to being separated from the artifact and lost. In computer science, a user can be given access to secure systems based on user credentials that imply authenticity. A network administrator can give a user a password, or provide the user with a key card or other access device to allow system access. In this case, authenticity is implied but not guaranteed. Consumer goods such as pharmaceuticals and fashion clothing can use all three forms of authentication to prevent counterfeit goods from taking advantage of a popular brand's reputation. As mentioned above, having an item for sale in a reputable store implicitly attests to it being genuine, the first type of authentication. The second type of authentication might involve comparing the quality and craftsmanship of an item, such as an expensive handbag, to genuine articles. The third type of authentication could be the presence of a trademark on the item, a protected marking, or any other identifying feature which aids consumers in the identification of genuine brand-name goods.
Application software is software designed to perform a group of coordinated functions, tasks, or activities for the benefit of the user. Examples of an application include a word processor, a spreadsheet, an accounting application, a web browser, an email client, a media player, a file viewer, an aeronautical flight simulator, a console game, or a photo editor. The collective noun "application software" refers to all applications collectively. This contrasts with system software, which is involved with running the computer. Applications may be bundled with the computer and its system software or published separately, and may be coded as proprietary, open-source, or university projects. Apps built for mobile platforms are called mobile apps. In information technology, an application, application program, or software application is a computer program designed to help people perform an activity. An application thus differs from an operating system, a utility, and a programming tool. Depending on the activity for which it was designed, an application can manipulate text, audio, graphics, or a combination of these elements.
Some application packages focus on a single task, such as word processing. User-written software tailors systems to meet the user's specific needs. User-written software includes spreadsheet templates, word processor macros, scientific simulations, and audio and animation scripts. Email filters are a kind of user software. Users create this software themselves and often overlook how important it is. The delineation between system software such as operating systems and application software is not exact, however, and is occasionally the object of controversy. For example, one of the key questions in the United States v. Microsoft Corp. antitrust trial was whether Microsoft's Internet Explorer web browser was part of its Windows operating system or a separable piece of application software. As another example, the GNU/Linux naming controversy is, in part, due to disagreement about the relationship between the Linux kernel and the operating systems built over this kernel. In some types of embedded systems, the application software and the operating system software may be indistinguishable to the user, as in the case of software used to control a VCR, DVD player, or microwave oven.
The above definitions may exclude some applications that may exist on some computers in large organizations. For an alternative definition of an app, see Application Portfolio Management. The word "application", once used as an adjective, is not restricted to the "of or pertaining to application software" meaning. For example, concepts such as application programming interface, application server, application virtualization, application lifecycle management, and portable application apply to all computer programs alike, not just application software. Some applications are available in versions for several different platforms. Sometimes a new and popular application arises which only runs on one platform, increasing the desirability of that platform; this is called a killer app. For example, VisiCalc was the first modern spreadsheet software for the Apple II and helped sell the then-new personal computers into offices. For BlackBerry it was their email software. In recent years, the shortened term "app" has become popular to refer to applications for mobile devices such as smartphones and tablets, the shortened form matching their smaller scope compared to applications on PCs.
Increasingly, the shortened form is used for desktop application software as well. There are many different ways to classify application software. From a legal point of view, application software is classified with a black-box approach, in relation to the rights of its final end-users or subscribers. Software applications are also classified in respect of the programming language in which the source code is written or executed, and in respect of their purpose and outputs. Application software is usually distinguished among two main classes: closed-source versus open-source software applications, and free versus proprietary software applications. Proprietary software is placed under exclusive copyright, and a software license grants limited usage rights. The open-closed principle states that software may be "open only for extension, but not for modification"; such applications can only gain add-ons from third parties. Free and open-source software may be run, sold, or extended for any purpose and, being open, may be modified or reversed in the same way.
A security token is a physical device used to gain access to an electronically restricted resource. The token is used in addition to, or in place of, a password; it acts like an electronic key to access something. Examples include a wireless keycard opening a locked door, or, in the case of a customer trying to access their bank account online, the use of a bank-provided token to prove that the customer is who they claim to be. Some tokens may store cryptographic keys, such as a digital signature, or biometric data, such as fingerprint details. Some may store passwords. Some designs incorporate tamper-resistant packaging, while others may include small keypads to allow entry of a PIN, or a simple button to start a generating routine with some display capability to show a generated key number. Connected tokens utilize a variety of interfaces, including USB, near-field communication, radio-frequency identification, and Bluetooth. Some tokens have an audio capability designed for vision-impaired people.
All tokens contain some secret information that is used to prove identity. There are four different ways in which this information can be used:

Static password token: The device contains a password which is physically hidden but which is transmitted for each authentication. This type is vulnerable to replay attacks.

Synchronous dynamic password token: A timer is used to rotate through various combinations produced by a cryptographic algorithm. The token and the authentication server must have synchronized clocks.

Asynchronous password token: A one-time password is generated without the use of a clock, either from a one-time pad or a cryptographic algorithm.

Challenge-response token: Using public-key cryptography, it is possible to prove possession of a private key without revealing that key. The authentication server encrypts a challenge with a public key; the device proves possession of the matching private key by providing the decrypted challenge.

Time-synchronized one-time passwords change at a set time interval. To do this, some sort of synchronization must exist between the client's token and the authentication server.
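The synchronous dynamic scheme above can be sketched in code. Below is a minimal illustration in the spirit of the standardized time-based one-time password algorithm (TOTP, RFC 6238), where both token and server derive a short code from a shared secret and the current time; the function name and parameters are chosen for this example:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, interval: int = 30, digits: int = 6, now=None) -> str:
    """Derive a time-based one-time password from a shared secret."""
    # Both sides compute the same counter from their (synchronized) clocks.
    counter = int((now if now is not None else time.time()) // interval)
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)
```

Because the code depends only on the secret and the time step, a token with no connection to the server can display a value the server can independently recompute and check; an attacker who observes one code cannot reuse it after the interval expires.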
For disconnected tokens, this time-synchronization is done before the token is distributed to the client. Other token types do the synchronization when the token is inserted into an input device. The main problem with time-synchronized tokens is that they can, over time, drift out of synchronization. However, some such systems, such as RSA's SecurID, allow the user to resynchronize the server with the token, sometimes by entering several consecutive passcodes. Most also cannot have replaceable batteries and only last up to five years before having to be replaced, so there is additional cost. Another type of one-time password uses a complex mathematical algorithm, such as a hash chain, to generate a series of one-time passwords from a secret shared key. Each password is unguessable even when previous passwords are known. The open-source OATH algorithm is standardized. Each password is observably unpredictable and independent of previous ones, so an adversary would be unable to guess what the next password may be even with knowledge of all previous passwords. Tokens can contain chips with functions varying from simple to complex, including multiple authentication methods.
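The hash-chain approach can be sketched as a minimal Lamport-style scheme. This is an illustration, not a production design: SHA-256 is assumed as the hash, and the class and function names are invented for this example. The server stores only the most recently accepted link; the client reveals links in reverse order, and each accepted password becomes the new anchor, so an eavesdropped (already used) password cannot be replayed:

```python
import hashlib

def h(x: bytes) -> bytes:
    """One step of the hash chain."""
    return hashlib.sha256(x).digest()

def make_chain(seed: bytes, n: int) -> list:
    """Build the chain seed, h(seed), h(h(seed)), ... (n hashes)."""
    chain = [seed]
    for _ in range(n):
        chain.append(h(chain[-1]))
    return chain

class Server:
    """Stores only the last link; earlier links serve as one-time passwords."""
    def __init__(self, last_link: bytes):
        self.current = last_link

    def verify(self, otp: bytes) -> bool:
        # A valid one-time password hashes to the currently stored anchor.
        if h(otp) == self.current:
            self.current = otp   # accepted password becomes the new anchor
            return True
        return False
```

Inverting the hash to find the next (earlier) link is computationally infeasible, which is why knowing all previously used passwords does not help an adversary.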
The simplest security tokens do not need any connection to a computer. The tokens have a physical display; the authenticating user simply enters the displayed number to log in. Other tokens connect to the computer using wireless techniques, such as Bluetooth; these tokens transfer a key sequence to a nearby access point. Alternatively, another form of token that has been available for many years is a mobile device which communicates using an out-of-band channel. Still other tokens plug into the computer and may require a PIN. Depending on the type of the token, the computer OS will either read the key from the token and perform a cryptographic operation on it, or ask the token's firmware to perform this operation. A related application is the hardware dongle required by some computer programs to prove ownership of the software; the dongle is placed in an input device and the software accesses the I/O device in question to authorize the use of the software. Commercial solutions are provided by a variety of vendors, each with their own proprietary implementation of variously used security features.
Token designs meeting certain security standards are certified in the United States as compliant with FIPS 140, a federal security standard. Tokens without any kind of certification are sometimes viewed as suspect, as they may not meet accepted government or industry security standards, may not have been put through rigorous testing, and likely cannot provide the same level of cryptographic security as token solutions which have had their designs independently audited by third-party agencies. Disconnected tokens have no physical or logical connection to the client computer. They do not require a special input device; instead, they use a built-in screen to display the generated authentication data, which the user enters manually via a keyboard or keypad. Disconnected tokens are the most common type of security token used in two-factor authentication for online identification. Connected tokens are tokens that must be physically connected to the computer with which the user is authenticating. Tokens in this category automatically transmit the authentication information to the client computer once a physical connection is made, eliminating the need for the user to enter it manually.
The Internet is the global system of interconnected computer networks that use the Internet protocol suite to link devices worldwide. It is a network of networks that consists of private, academic, and government networks of local to global scope, linked by a broad array of electronic and optical networking technologies. The Internet carries a vast range of information resources and services, such as the inter-linked hypertext documents and applications of the World Wide Web, electronic mail, and file sharing. Some publications no longer capitalize "internet". The origins of the Internet date back to research commissioned by the federal government of the United States in the 1960s to build robust, fault-tolerant communication via computer networks. The primary precursor network, the ARPANET, served as a backbone for the interconnection of regional academic and military networks in the 1980s. The funding of the National Science Foundation Network as a new backbone in the 1980s, as well as private funding for other commercial extensions, led to worldwide participation in the development of new networking technologies and the merger of many networks.
The linking of commercial networks and enterprises by the early 1990s marked the beginning of the transition to the modern Internet and generated sustained exponential growth as generations of institutional and mobile computers were connected to the network. Although the Internet had been used by academia since the 1980s, commercialization incorporated its services and technologies into every aspect of modern life. Most traditional communication media, including telephony, television, paper mail, and newspapers, have been reshaped, redefined, or bypassed by the Internet, giving birth to new services such as email, Internet telephony, Internet television, online music, digital newspapers, and video streaming websites. Newspaper and other print publishing are adapting to website technology, or are being reshaped into blogging, web feeds, and online news aggregators. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking. Online shopping has grown exponentially both for major retailers and for small businesses and entrepreneurs, as it enables firms to extend their "brick and mortar" presence to serve a larger market or sell goods and services online.
Business-to-business and financial services on the Internet affect supply chains across entire industries. The Internet has no single centralized governance in either technological implementation or policies for access and usage. The overarching definitions of the two principal name spaces in the Internet, the Internet Protocol address space and the Domain Name System, are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers. The technical underpinning and standardization of the core protocols is an activity of the Internet Engineering Task Force, a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. In November 2006, the Internet was included on USA Today's list of New Seven Wonders. When the term Internet is used to refer to the specific global system of interconnected Internet Protocol networks, the word is a proper noun that should be written with an initial capital letter.
In common use and the media, it is often not capitalized, viz. the internet. Some guides specify that the word should be capitalized when used as a noun, but not capitalized when used as an adjective. The Internet is also often referred to as the Net, as a short form of network. As early as 1849, the word internetted was used uncapitalized as an adjective, meaning interconnected or interwoven. The designers of early computer networks used internet both as a noun and as a verb in shorthand form of internetwork or internetworking, meaning interconnecting computer networks. The terms Internet and World Wide Web are often used interchangeably in everyday speech. However, the World Wide Web, or the Web, is only one of a large number of Internet services. The Web is a collection of interconnected documents and other web resources, linked by hyperlinks and URLs. As another point of comparison, Hypertext Transfer Protocol, or HTTP, is the language used on the Web for information transfer, yet it is just one of many languages or protocols that can be used for communication on the Internet.
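The point that HTTP is just one text protocol layered over the Internet's transport can be made concrete with a small sketch. The function names below are invented for this illustration, and a real client would normally use an HTTP library rather than raw sockets:

```python
import socket

def build_request(host: str, path: str = "/") -> str:
    """HTTP/1.1 is plain text: a request line, headers, then a blank line."""
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            "Connection: close\r\n"
            "\r\n")

def fetch(host: str, path: str = "/", port: int = 80) -> str:
    """Send the request over TCP, the transport HTTP conventionally rides on."""
    with socket.create_connection((host, port), timeout=10) as s:
        s.sendall(build_request(host, path).encode("ascii"))
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("latin-1")
```

The same sockets could just as well carry SMTP for mail or FTP for file transfer; the Internet delivers the bytes, and the protocol spoken over them defines the service.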
The term Interweb is a portmanteau of Internet and World Wide Web, used sarcastically to parody a technically unsavvy user. Research into packet switching, one of the fundamental Internet technologies, started in the early 1960s in the work of Paul Baran and Donald Davies. Packet-switched networks such as the NPL network, the ARPANET, the Merit Network, CYCLADES, and Telenet were developed in the late 1960s and early 1970s. The ARPANET project led to the development of protocols for internetworking, by which multiple separate networks could be joined into a network of networks. ARPANET development began with two network nodes which were interconnected between the Network Measurement Center at the University of California, Los Angeles Henry Samueli School of Engineering and Applied Science, directed by Leonard Kleinrock, and the NLS system at SRI International, led by Douglas Engelbart in Menlo Park, California, on 29 October 1969. The third site was the Culler-Fried Interactive Mathematics Center at the University of California, Santa Barbara, followed by the University of Utah.
Electronic authentication is the process of establishing confidence in user identities electronically presented to an information system. Digital authentication, or e-authentication, may be used synonymously when referring to the authentication process that confirms or certifies a person's identity. When used in conjunction with an electronic signature, it can provide evidence of whether data received has been tampered with after being signed by its original sender. In a time when fraud and identity theft have become rampant, electronic authentication can be a more secure method of verifying that a person is who they say they are when performing transactions online. There are various e-authentication methods that can be used to authenticate a user's identity, ranging from a password to higher levels of security that utilize multifactor authentication. Depending on the level of security used, the user might need to prove his or her identity through the use of security tokens, challenge questions, or possession of a certificate from a third-party certificate authority that attests to their identity.
The American National Institute of Standards and Technology has developed a generic electronic authentication model that provides a basic framework for how the authentication process is accomplished regardless of jurisdiction or geographic region. According to this model, the enrollment process begins with an individual applying to a Credential Service Provider (CSP). The CSP will need to prove the applicant's identity before proceeding with the transaction. Once the applicant's identity has been confirmed by the CSP, he or she receives the status of "subscriber" and is given an authenticator, such as a token, and a credential, which may be in the form of a username. The CSP is responsible for managing the credential, along with the subscriber's enrollment data, for the life of the credential. The subscriber will be tasked with maintaining the authenticators. An example of this is when a user uses a specific computer to do their online banking. If he or she attempts to access their bank account from another computer, the authenticator will not be present.
In order to gain access, the subscriber would need to verify their identity to the CSP, which might take the form of answering a challenge question before being given access. The need for authentication has been prevalent throughout history. In ancient times, people would identify each other through physical appearance. The Sumerians in ancient Mesopotamia attested to the authenticity of their writings by using seals embellished with identifying symbols. As time moved on, the most common way to provide authentication became the handwritten signature. There are three accepted factors that are used to establish a digital identity for electronic authentication:

Knowledge factor: something that the user knows, such as a password, answers to challenge questions, ID numbers, or a PIN.

Possession factor: something that the user has, such as a mobile phone, PC, or token.

Biometric factor: something that the user is, such as his or her fingerprints, eye scan, or voice pattern.

Of the three factors, the biometric factor is the most convenient and convincing way to prove an individual's identity.
However, relying on this sole factor can be expensive to sustain. Although each factor has its own unique weaknesses, combining two or more factors allows for more reliable authentication; it is therefore generally recommended to use multifactor authentication. Authentication systems are categorized by the number of factors that they incorporate. The three factors considered the cornerstone of authentication are: something you know, something you have, and something you are. Multifactor authentication is more secure than single-factor authentication, but some multifactor approaches are still vulnerable to attacks such as man-in-the-middle and Trojan attacks. Common methods used in authentication systems are summarized below. Tokens, generically, are something the claimant possesses and controls that may be used to authenticate the claimant's identity. In e-authentication, the claimant authenticates to an application over a network. Therefore, a token used for e-authentication is a secret, and the token must be protected.
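The factor model above can be sketched in code. Below is a minimal, illustrative two-factor check that combines a knowledge factor (a salted password hash) with a possession factor (a one-time code assumed to have been delivered to, or generated by, the user's token); all names and the stored-record layout are assumptions for this example:

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes) -> bytes:
    """Knowledge factor: store only a salted, slow hash, never the password."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def verify_two_factor(stored_hash: bytes, salt: bytes, password: str,
                      expected_otp: str, submitted_otp: str) -> bool:
    """Both factors must pass; compare in constant time to avoid timing leaks."""
    knows = hmac.compare_digest(stored_hash, hash_password(password, salt))
    has = hmac.compare_digest(expected_otp, submitted_otp)
    return knows and has

# Illustrative enrollment: the server keeps (salt, record) for the subscriber.
salt = os.urandom(16)
record = hash_password("correct horse", salt)
```

Stealing the password alone, or intercepting the one-time code alone, is not sufficient; an attacker must compromise both factors at once, which is the security argument for multifactor authentication.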
The token may, for example, be a cryptographic key protected by encrypting it under a password. An impostor must learn the password to use the token. Passwords and PINs are categorized as "something you know". A combination of numbers and mixed cases is considered stronger than an all-letter password. The adoption of Transport Layer Security or Secure Sockets Layer features during the information transmission process will also create an encrypted channel for data exchange and further protect the information delivered. Most security attacks target password-based authentication systems. Public-key authentication has two parts: one is a public key, the other is a private key. A public key is available to any user or server; a private key is known by the user only. In symmetric-key authentication, by contrast, the user shares a unique secret key with an authentication server. When the user sends a randomly generated message encrypted under the secret key to the authentication server, and the message can be matched by the server using its copy of the shared secret key, the user is authenticated.
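The shared-secret exchange described above can be sketched as a challenge-response protocol. This sketch uses an HMAC over a server-chosen random challenge rather than raw encryption (a common standard construction serving the same purpose); the function names are invented for this illustration:

```python
import hashlib
import hmac
import os

def issue_challenge() -> bytes:
    """Server picks a fresh random nonce so responses cannot be replayed."""
    return os.urandom(16)

def client_response(shared_key: bytes, challenge: bytes) -> bytes:
    """Client proves knowledge of the shared key by keying an HMAC over it."""
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def server_verify(shared_key: bytes, challenge: bytes, response: bytes) -> bool:
    """Server recomputes the expected response with its copy of the key."""
    expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

The shared key itself never crosses the network: only the challenge and the keyed digest do, so an eavesdropper learns nothing that lets them answer the next (different) challenge.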
When implemented together with password authentication, this method provides a possible solution for two-factor authentication systems. The user receives the password by reading the message on his or her mobile phone and types it back to complete the authentication.