A password is a word or string of characters used for user authentication to prove identity or access approval to gain access to a resource, and is to be kept secret from those not allowed access. The use of passwords is known to be ancient. Sentries would challenge those wishing to enter an area to supply a password or watchword, and would only allow a person or group to pass if they knew it. In modern times, user names and passwords are used by people during a log in process that controls access to protected computer operating systems, mobile phones, cable TV decoders, automated teller machines, and so on. A typical computer user has passwords for many purposes: logging into accounts, retrieving e-mail, accessing applications, networks and web sites, and reading the morning newspaper online. Despite the name, there is no need for passwords to be actual words; some passwords are formed from multiple words and may more accurately be called a passphrase. The terms passcode and passkey are sometimes used when the secret information is purely numeric, such as the personal identification number used for ATM access.
Passwords are generally short enough to be memorized and typed, although they may be longer and more complex if the user wishes to be more secure. Most organizations specify a password policy that sets requirements for the composition and usage of passwords, dictating minimum length, required character categories and prohibited elements; some governments have national authentication frameworks that define requirements for user authentication to government services, including requirements for passwords. Passwords or watchwords have been used since ancient times. Polybius describes the system for the distribution of watchwords in the Roman military as follows: The way in which they secure the passing round of the watchword for the night is as follows: from the tenth maniple of each class of infantry and cavalry, the maniple which is encamped at the lower end of the street, a man is chosen who is relieved from guard duty, and he attends every day at sunset at the tent of the tribune, and receiving from him the watchword—that is, a wooden tablet with the word inscribed on it—takes his leave, and on returning to his quarters passes on the watchword and tablet before witnesses to the commander of the next maniple, who in turn passes it to the one next him.
All do the same until it reaches the first maniples, those encamped near the tents of the tribunes. These latter are obliged to deliver the tablet to the tribunes before dark, so that if all those issued are returned, the tribune knows that the watchword has been given to all the maniples and has passed through all on its way back to him. If any one of them is missing, he makes inquiry at once, as he knows by the marks from what quarter the tablet has not returned, and whoever is responsible for the stoppage meets with the punishment he merits. Passwords in military use evolved to include not just a password, but a password and a counterpassword. In the opening days of the Battle of Normandy, paratroopers of the U.S. 101st Airborne Division used a password—flash—which was presented as a challenge and answered with the correct response—thunder. The challenge and response were changed every three days. American paratroopers also famously used a device known as a "cricket" on D-Day in place of a password system as a temporarily unique method of identification. Passwords have been used with computers since the earliest days of computing.
The Compatible Time-Sharing System, an operating system introduced at MIT in 1961, was the first computer system to implement password login. CTSS had a LOGIN command that requested a user password. "After typing PASSWORD, the system turns off the printing mechanism, if possible, so that the user may type in his password with privacy." In the early 1970s, Robert Morris developed a system of storing login passwords in a hashed form as part of the Unix operating system. The system was based on a simulated Hagelin rotor crypto machine and first appeared in 6th Edition Unix in 1974. A version of his algorithm, known as crypt, used a 12-bit salt and invoked a modified form of the DES algorithm 25 times to reduce the risk of pre-computed dictionary attacks. In general, the easier a password is for the owner to remember, the easier it will be for an attacker to guess. However, passwords which are difficult to remember may also reduce the security of a system, because users might need to write down or electronically store the password, will need frequent password resets, and are more likely to re-use the same password.
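The salted, iterated hashing idea behind crypt can be sketched in a few lines. This is not the real crypt() algorithm (which used a DES variant); it substitutes SHA-256 purely for illustration, and the function name is hypothetical:

```python
import hashlib
import os

def hash_password(password: str, salt: bytes, rounds: int = 25) -> bytes:
    """Iterate a hash over salt + password. Illustrative analogue of crypt,
    with SHA-256 standing in for the modified-DES rounds of the original."""
    digest = salt + password.encode()
    for _ in range(rounds):
        digest = hashlib.sha256(digest).digest()
    return digest

# crypt's salt was only 12 bits; two random bytes here for illustration.
salt = os.urandom(2)
stored = hash_password("hunter2", salt)

# Verification recomputes the hash with the stored salt and compares;
# the plaintext password is never stored.
assert hash_password("hunter2", salt) == stored
assert hash_password("Hunter2", salt) != stored
```

Because the salt varies per user, an attacker cannot use a single pre-computed table of hashes for common passwords against every account.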
The more stringent the requirements for password strength, e.g. "have a mix of uppercase and lowercase letters and digits" or "change it monthly", the greater the degree to which users will subvert the system. Others argue that longer passwords provide more security than shorter passwords with a wide variety of characters. In The Memorability and Security of Passwords, Jeff Yan et al. examine the effect of advice given to users about a good choice of password. They found that passwords based on thinking of a phrase and taking the first letter of each word are just as memorable as naively selected passwords, and just as hard to crack as randomly generated passwords. Combining two or more unrelated words and altering some of the letters to special characters or numbers is another good method, but a single dictionary word is not.
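The first-letter-of-each-word technique that Yan et al. studied is mechanical enough to express directly; the helper name below is hypothetical:

```python
def first_letters(phrase: str) -> str:
    """Build a mnemonic password from the first letter of each word."""
    return "".join(word[0] for word in phrase.split())

# The phrase is easy to remember; the result is not a dictionary word.
pw = first_letters("the quick brown fox jumps over the lazy dog")
# -> "tqbfjotld"
```

In practice users would also mix in digits, capitals or punctuation, which this sketch omits.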
Telecommunication is the transmission of signs, signals, messages, words, writings, images and sounds, or information of any nature, by wire, radio, optical or other electromagnetic systems. Telecommunication occurs when the exchange of information between communication participants includes the use of technology; it is transmitted either electrically over physical media, such as cables, or via electromagnetic radiation. Such transmission paths are often divided into communication channels, which afford the advantages of multiplexing. Since the Latin term communicatio is considered the social process of information exchange, the term telecommunications is often used in its plural form because it involves many different technologies. Early means of communicating over a distance included visual signals, such as beacons, smoke signals, semaphore telegraphs, signal flags and optical heliographs. Other examples of pre-modern long-distance communication included audio messages such as coded drumbeats, lung-blown horns and loud whistles. 20th- and 21st-century technologies for long-distance communication involve electrical and electromagnetic technologies, such as telegraph and teleprinter, telephone, radio, microwave transmission, fiber optics and communications satellites.
A revolution in wireless communication began in the first decade of the 20th century with the pioneering developments in radio communications by Guglielmo Marconi, who won the Nobel Prize in Physics in 1909, and by other notable pioneering inventors and developers in the field of electrical and electronic telecommunications. These included Charles Wheatstone and Samuel Morse, Alexander Graham Bell, Edwin Armstrong and Lee de Forest, as well as Vladimir K. Zworykin, John Logie Baird and Philo Farnsworth. The word telecommunication is a compound of the Greek prefix tele-, meaning distant, far off, or afar, and the Latin communicare, meaning to share. Its modern use is adapted from the French, because its written use was first recorded in 1904 by the French engineer and novelist Édouard Estaunié. Communication was first used as an English word in the late 14th century; it comes from Old French comunicacion, from Latin communicationem, a noun of action from the past participle stem of communicare, "to share, divide out".
Homing pigeons have been used throughout history by different cultures. Pigeon post had Persian roots and was used by the Romans to aid their military; Frontinus said that Julius Caesar used pigeons as messengers in his conquest of Gaul, and the Greeks conveyed the names of the victors at the Olympic Games to various cities using homing pigeons. In the early 19th century, the Dutch government used the system in Sumatra, and in 1849, Paul Julius Reuter started a pigeon service to fly stock prices between Aachen and Brussels, a service that operated for a year until the gap in the telegraph link was closed. In the Middle Ages, chains of beacons were used on hilltops as a means of relaying a signal. Beacon chains suffered the drawback that they could only pass a single bit of information, so the meaning of the message, such as "the enemy has been sighted", had to be agreed upon in advance. One notable instance of their use was during the Spanish Armada, when a beacon chain relayed a signal from Plymouth to London. In 1792, Claude Chappe, a French engineer, built the first fixed visual telegraphy system, or semaphore line, between Lille and Paris.
However, semaphore suffered from the need for skilled operators and expensive towers at intervals of ten to thirty kilometres. As a result of competition from the electrical telegraph, the last commercial semaphore line was abandoned in 1880. On 25 July 1837 the first commercial electrical telegraph was demonstrated by English inventor Sir William Fothergill Cooke and English scientist Sir Charles Wheatstone. Both inventors viewed their device as "an improvement to the electromagnetic telegraph", not as a new device. Samuel Morse independently developed a version of the electrical telegraph that he unsuccessfully demonstrated on 2 September 1837; his code was an important advance over Wheatstone's signaling method. The first transatlantic telegraph cable was completed on 27 July 1866, allowing transatlantic telecommunication for the first time. The conventional telephone was invented independently by Alexander Bell and Elisha Gray in 1876. Antonio Meucci had invented the first device that allowed the electrical transmission of voice over a line in 1849.
However, Meucci's device was of little practical value because it relied upon the electrophonic effect, and thus required users to place the receiver in their mouth to "hear" what was being said. The first commercial telephone services were set up in 1878 and 1879 on both sides of the Atlantic, in the cities of New Haven and London. Starting in 1894, Italian inventor Guglielmo Marconi began developing a wireless communication system using the newly discovered phenomenon of radio waves, showing by 1901 that they could be transmitted across the Atlantic Ocean; this was the start of wireless telegraphy by radio. Voice and music had little early success. World War I accelerated the development of radio for military communications. After the war, commercial radio AM broadcasting began in the 1920s and became an important mass medium for entertainment and news. World War II again accelerated development of radio for the wartime purposes of aircraft and land communication, radio navigation and radar. Development of stereo FM broadcasting of radio
Time is the indefinite continued progress of existence and events that occur in apparently irreversible succession from the past, through the present, into the future. Time is a component quantity of various measurements used to sequence events, to compare the duration of events or the intervals between them, and to quantify rates of change of quantities in material reality or in the conscious experience. Time is often referred to as a fourth dimension, along with three spatial dimensions. Time has long been an important subject of study in religion and science, but defining it in a manner applicable to all fields without circularity has eluded scholars. Diverse fields such as business, sports, the sciences and the performing arts all incorporate some notion of time into their respective measuring systems. Time in physics is unambiguously operationally defined as "what a clock reads". See Units of Time. Time is one of the seven fundamental physical quantities in both the International System of Units and the International System of Quantities.
Time is used to define other quantities – such as velocity – so defining time in terms of such quantities would result in circularity of definition. An operational definition of time, wherein one says that observing a certain number of repetitions of one or another standard cyclical event constitutes one standard unit such as the second, is useful in the conduct of both advanced experiments and everyday affairs of life; the operational definition leaves aside the question whether there is something called time, apart from the counting activity just mentioned, that flows and that can be measured. Investigations of a single continuum called spacetime bring questions about space into questions about time, questions that have their roots in the works of early students of natural philosophy. Temporal measurement has occupied scientists and technologists, and was a prime motivation in navigation and astronomy. Periodic events and periodic motion have long served as standards for units of time. Examples include the apparent motion of the sun across the sky, the phases of the moon, the swing of a pendulum, and the beat of a heart.
The international unit of time, the second, is defined by measuring the electronic transition frequency of caesium atoms. Time is of significant social importance, having economic value as well as personal value, due to an awareness of the limited time in each day and in human life spans. Generally speaking, methods of temporal measurement, or chronometry, take two distinct forms: the calendar, a mathematical tool for organising intervals of time, and the clock, a physical mechanism that counts the passage of time. In day-to-day life, the clock is consulted for periods less than a day whereas the calendar is consulted for periods longer than a day. Personal electronic devices display both calendars and clocks simultaneously; the number that marks the occurrence of a specified event as to hour or date is obtained by counting from a fiducial epoch – a central reference point. Artifacts from the Paleolithic suggest that the moon was used to reckon time as early as 6,000 years ago. Lunar calendars were among the first to appear, with years of either 12 or 13 lunar months.
Without intercalation to add days or months to some years, seasons drift in a calendar based on twelve lunar months. Lunisolar calendars have a thirteenth month added to some years to make up for the difference between a full year and a year of just twelve lunar months; the numbers twelve and thirteen came to feature prominently in many cultures, at least partly due to this relationship of months to years. Other early forms of calendars originated in Mesoamerica, in ancient Mayan civilization; these calendars were religiously and astronomically based, with 18 months in a year and 20 days in a month, plus five epagomenal days at the end of the year. The reforms of Julius Caesar in 45 BC put the Roman world on a solar calendar; this Julian calendar was faulty in that its intercalation still allowed the astronomical solstices and equinoxes to advance against it by about 11 minutes per year. Pope Gregory XIII introduced a correction in 1582. During the French Revolution, a new clock and calendar were invented in an attempt to de-Christianize time and create a more rational system to replace the Gregorian calendar.
The French Republican Calendar's days consisted of ten hours of a hundred minutes of a hundred seconds, which marked a deviation from the 12-based duodecimal system used in many other devices by many cultures. The system was abolished in 1806. A large variety of devices have been invented to measure time; the study of these devices is called horology. An Egyptian device that dates to c. 1500 BC, similar in shape to a bent T-square, measured the passage of time from the shadow cast by its crossbar on a nonlinear rule. The T was oriented eastward in the mornings. At noon, the device was turned around so that it could cast its shadow in the evening direction. A sundial uses a gnomon to cast a shadow on a set of markings calibrated to the hour; the position of the shadow marks the hour in local time. The idea to separate the day into smaller parts is credited to the Egyptians because of their sundials, which operated on a duodecimal system; the importance of the number 12 is due to the number of lunar cycles in a year and the number of stars used to count the passage of night.
The most precise timekeeping device of the ancient
Password strength is a measure of the effectiveness of a password against guessing or brute-force attacks. In its usual form, it estimates how many trials an attacker who does not have direct access to the password would need, on average, to guess it correctly; the strength of a password is a function of length and unpredictability. Using strong passwords lowers the overall risk of a security breach, but strong passwords do not replace the need for other effective security controls. The effectiveness of a password of a given strength is strongly determined by the design and implementation of the authentication factors (knowledge, ownership, inherence); the first factor, knowledge, is the main focus of this article. The rate at which an attacker can submit guessed passwords to the system is a key factor in determining system security. Some systems impose a time-out of several seconds after a small number of failed password entry attempts. In the absence of other vulnerabilities, such systems can be effectively secured with relatively simple passwords. However, the system must store information about the user passwords in some form, and if that information is stolen, say by breaching system security, the user passwords can be at risk.
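The time-out behaviour described above can be sketched as a small server-side policy. The class and the policy values below are illustrative assumptions, not any particular system's implementation:

```python
import time

MAX_ATTEMPTS = 3        # hypothetical policy: failures allowed before lockout
LOCKOUT_SECONDS = 5.0   # hypothetical time-out after too many failures

class ThrottledLogin:
    """Reject all attempts for LOCKOUT_SECONDS after MAX_ATTEMPTS failures."""
    def __init__(self, secret: str):
        self.secret = secret
        self.failures = 0
        self.locked_until = 0.0

    def attempt(self, guess: str) -> bool:
        now = time.monotonic()
        if now < self.locked_until:
            return False          # locked out: the guess is not even checked
        if guess == self.secret:
            self.failures = 0     # success resets the failure counter
            return True
        self.failures += 1
        if self.failures >= MAX_ATTEMPTS:
            self.locked_until = now + LOCKOUT_SECONDS
            self.failures = 0
        return False
```

With even a short lockout, online guessing is limited to a handful of trials per lockout window, which is why simple passwords can survive an online-only attacker.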
Passwords are created either automatically or by a human. While the strength of randomly chosen passwords against a brute-force attack can be calculated with precision, determining the strength of human-generated passwords is challenging. Humans are asked to choose a password, sometimes guided by suggestions or restricted by a set of rules, when creating a new account for a computer system or Internet Web site. Only rough estimates of strength are possible, since humans tend to follow patterns in such tasks, and those patterns can assist an attacker. In addition, lists of commonly chosen passwords are available for use by password guessing programs; such lists include the numerous online dictionaries for various human languages, breached databases of plaintext and hashed passwords from various online business and social accounts, along with other common passwords. All items in such lists are considered weak. For some decades, investigations of passwords on multi-user computer systems have shown that 40% or more of user-chosen passwords can be guessed using only computer programs, and more can be found when information about a particular user is taken into account during the attack.
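The "calculated with precision" claim for random passwords is just an entropy computation: a password of L characters drawn uniformly from an alphabet of N symbols has L · log2(N) bits of entropy. A minimal sketch (function name hypothetical):

```python
import math

def entropy_bits(length: int, alphabet_size: int) -> float:
    """Bits of entropy for a uniformly random password:
    length * log2(alphabet_size)."""
    return length * math.log2(alphabet_size)

# 8 random characters from the 62 alphanumerics (a-z, A-Z, 0-9):
bits = entropy_bits(8, 62)   # ~47.6 bits, i.e. about 2**47.6 equally likely guesses
```

No comparable formula exists for human-chosen passwords, which is why their strength can only be estimated from observed guessing statistics.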
Systems that use passwords for authentication must have some way to check any password entered to gain access. If the valid passwords are stored in a system file or database, an attacker who gains sufficient access to the system will obtain all user passwords, giving the attacker access to all accounts on the attacked system, and possibly to other systems where users employ the same or similar passwords. One way to reduce this risk is to store only a cryptographic hash of each password instead of the password itself. Standard cryptographic hashes, such as the Secure Hash Algorithm series, are hard to reverse, so an attacker who gets hold of the hash value cannot directly recover the password. However, knowledge of the hash value lets the attacker test guesses offline. Password cracking programs are available that will test a large number of trial passwords against a purloined cryptographic hash. Improvements in computing technology keep increasing the rate at which guessed passwords can be tested. For example, in 2010, the Georgia Tech Research Institute developed a method of using GPGPU to crack passwords much faster.
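The offline guess-testing described above amounts to hashing candidates until one matches. A minimal sketch, deliberately using an unsalted SHA-256 hash and a tiny made-up wordlist to show why common passwords offer no protection:

```python
import hashlib

def sha256_hex(pw: str) -> str:
    return hashlib.sha256(pw.encode()).hexdigest()

# Simulated leaked, unsalted hash of a user's password.
stolen_hash = sha256_hex("letmein")

# A toy dictionary; real attacks use lists with millions of entries.
wordlist = ["password", "123456", "qwerty", "letmein", "dragon"]

recovered = next((w for w in wordlist if sha256_hex(w) == stolen_hash), None)
# The hash is never "reversed": each guess is hashed and compared.
```

Salting and key stretching (discussed below in the article) each make this attack more expensive, but neither helps if the password itself is on the attacker's list.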
Elcomsoft pioneered the use of common graphics cards for quicker password recovery in August 2007 and soon filed a corresponding patent in the US. As of 2011, commercial products are available that claim the ability to test up to 112,000 passwords per second on a standard desktop computer using a high-end graphics processor; such a device will crack a 6-letter single-case password in one day. Note that the work can be distributed over many computers for an additional speedup proportional to the number of available computers with comparable GPUs. Special key stretching hashes are available that take a long time to compute, reducing the rate at which guessing can take place. Although it is considered best practice to use key stretching, many common systems do not. Another situation where quick guessing is possible is when the password is used to form a cryptographic key. In such cases, an attacker can check to see if a guessed password decodes encrypted data. For example, one commercial product claims to test 103,000 WPA PSK passwords per second.
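One widely deployed key-stretching construction is PBKDF2, available in Python's standard library as hashlib.pbkdf2_hmac. A short sketch; the iteration count of 200,000 is an illustrative figure, not a recommendation:

```python
import hashlib
import os

salt = os.urandom(16)

# PBKDF2 applies HMAC-SHA-256 repeatedly; the iteration count directly
# scales the attacker's cost per guessed password.
key = hashlib.pbkdf2_hmac("sha256", b"correct horse", salt, 200_000)

# Same password, salt and iteration count -> same derived key.
assert key == hashlib.pbkdf2_hmac("sha256", b"correct horse", salt, 200_000)
# A different iteration count yields a completely different key.
assert key != hashlib.pbkdf2_hmac("sha256", b"correct horse", salt, 100_000)
```

Raising the iteration count slows each login by milliseconds for the server but multiplies the attacker's total cracking work by the same factor.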
If a password system only stores the hash of the password, an attacker can pre-compute hash values for common password variants and for all passwords shorter than a certain length, allowing rapid recovery of the password once its hash is obtained. Long lists of pre-computed password hashes can be efficiently stored using rainbow tables; this method of attack can be foiled by storing a random value, called a cryptographic salt, along with the hash. The salt is combined with the password when computing the hash, so an attacker precomputing a rainbow table would have to store for each password its hash with every possible salt value; this becomes infeasible if the salt has a big enough range. Many authentication systems in common use do not employ salts, and rainbow tables are available on the Internet for several such systems. It is usual in the computer industry to specify password strength in terms of information entropy, which is measured in bits and is a concept from information theory. Instead of the number of guesses needed to find the password with certainty, the base-2 logarithm of that number is given
In cryptography, a brute-force attack consists of an attacker submitting many passwords or passphrases with the hope of eventually guessing correctly. The attacker systematically checks all possible passwords and passphrases until the correct one is found. Alternatively, the attacker can attempt to guess the key, which is typically created from the password using a key derivation function; this is known as an exhaustive key search. A brute-force attack is a cryptanalytic attack that can, in theory, be used to attempt to decrypt any encrypted data; such an attack might be used when it is not possible to take advantage of other weaknesses in an encryption system that would make the task easier. When password-guessing, this method is fast when used to check all short passwords, but for longer passwords other methods such as the dictionary attack are used because a brute-force search takes too long. Longer passwords and keys have more possible values, making them exponentially more difficult to crack than shorter ones. Brute-force attacks can be made less effective by obfuscating the data to be encoded, making it more difficult for an attacker to recognize when the code has been cracked, or by making the attacker do more work to test each guess.
One of the measures of the strength of an encryption system is how long it would theoretically take an attacker to mount a successful brute-force attack against it. Brute-force attacks are an application of brute-force search, the general problem-solving technique of enumerating all candidates and checking each one. Brute-force attacks work by calculating every possible combination that could make up a password and testing it to see if it is the correct password; as the password's length increases, the amount of time it takes, on average, to find the correct password increases exponentially. The resources required for a brute-force attack therefore grow exponentially with increasing key size, not linearly. Although historical U.S. export regulations restricted key lengths to 56-bit symmetric keys, these restrictions are no longer in place, so modern symmetric algorithms typically use computationally stronger 128- to 256-bit keys. There is a physical argument that a 128-bit symmetric key is computationally secure against brute-force attack.
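The exhaustive enumeration described above can be written in a few lines. The helper below is purely illustrative and is kept to tiny search spaces; note how each extra character multiplies the work by the alphabet size:

```python
import itertools
import string

def brute_force(target: str, alphabet: str = string.ascii_lowercase,
                max_len: int = 4):
    """Enumerate every candidate in length order until the target is found.
    Returns the found password and the number of guesses tried."""
    tried = 0
    for length in range(1, max_len + 1):
        for combo in itertools.product(alphabet, repeat=length):
            tried += 1
            if "".join(combo) == target:
                return "".join(combo), tried
    return None, tried

found, guesses = brute_force("dog")
# The search space grows as 26**length: 26 one-letter candidates,
# 676 two-letter candidates, 17,576 three-letter candidates, and so on.
```

For a realistic 10-character mixed-case password the same loop would need on the order of 52**10 ≈ 1.4 × 10^17 guesses, which is why dictionary attacks are preferred for longer passwords.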
The so-called Landauer limit implied by the laws of physics sets a lower limit on the energy required to perform a computation of kT · ln 2 per bit erased in a computation, where T is the temperature of the computing device in kelvins, k is the Boltzmann constant, and the natural logarithm of 2 is about 0.693. No irreversible computing device can use less energy than this, even in principle. Thus, simply flipping through the possible values for a 128-bit symmetric key would require 2^128 − 1 bit flips on a conventional processor. If it is assumed that the calculation occurs near room temperature, the Von Neumann-Landauer limit can be applied to estimate the energy required as ~10^18 joules, which is equivalent to consuming 30 gigawatts of power for one year (30 × 10^9 W × 365 × 24 × 3600 s = 9.46 × 10^17 J, or 262.7 TWh). The full actual computation – checking each key to see if a solution has been found – would consume many times this amount. Furthermore, this is only the energy requirement for cycling through the key space.
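The arithmetic above is easy to check numerically. A sketch, assuming room temperature of about 300 K:

```python
import math

K_BOLTZMANN = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0                    # assumed room temperature, kelvins
bit_flips = 2**128 - 1       # flips to cycle through a 128-bit counter

# Landauer limit: kT * ln(2) joules per bit flip, times the number of flips.
energy_joules = bit_flips * K_BOLTZMANN * T * math.log(2)
# ~9.8e17 J, the same order as the article's 30 GW-for-one-year figure.

gigawatt_year = 1e9 * 365 * 24 * 3600   # joules in one gigawatt-year
years_at_30_gw = energy_joules / (30 * gigawatt_year)
# ~1 year at 30 GW, matching the estimate in the text.
```

This is only the thermodynamic floor for cycling the counter; actually testing each key against ciphertext would cost many times more.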
However, this argument assumes that the register values are changed using conventional set and clear operations, which inevitably generate entropy. It has been shown that computational hardware can be designed not to encounter this theoretical obstruction, though no such computers are known to have been constructed. As commercial successors of governmental ASIC solutions have become available, also known as custom hardware attacks, two emerging technologies have proven their capability in the brute-force attack of certain ciphers. One is modern graphics processing unit technology, the other is field-programmable gate array technology. GPUs benefit from their wide availability and price-performance benefit, FPGAs from their energy efficiency per cryptographic operation. Both technologies try to transport the benefits of parallel processing to brute-force attacks. In the case of GPUs, some hundreds of processing units are available; in the case of FPGAs, some thousands, making them much better suited to cracking passwords than conventional processors.
Various publications in the field of cryptographic analysis have demonstrated the energy efficiency of today's FPGA technology; for example, the COPACOBANA FPGA Cluster computer consumes the same energy as a single PC but performs like 2,500 PCs for certain algorithms. A number of firms provide hardware-based FPGA cryptographic analysis solutions, from a single FPGA PCI Express card up to dedicated FPGA computers. WPA and WPA2 encryption have been successfully brute-force attacked with the workload reduced by a factor of 50 compared to conventional CPUs, and by some hundreds in the case of FPGAs. AES permits the use of 256-bit keys. Breaking a symmetric 256-bit key by brute force requires 2^128 times more computational power than a 128-bit key. Fifty supercomputers that could check a billion billion (10^18) AES keys per second would, in theory, require about 3×10^51 years to exhaust the 256-bit key space. An underlying assumption of a brute-force attack is that the complete keyspace was used to generate keys, something that relies on an effective random number generator, and that there are no defects in the algorithm or its implementation.
General Services Administration
The General Services Administration, an independent agency of the United States government, was established in 1949 to help manage and support the basic functioning of federal agencies. GSA supplies products and communications for U.S. government offices, provides transportation and office space to federal employees, and develops government-wide cost-minimizing policies, among other management tasks. GSA employs about 12,000 federal workers and has an annual operating budget of $20.9 billion. GSA oversees $66 billion of procurement annually, and it contributes to the management of about $500 billion in U.S. federal property, divided chiefly among 8,700 owned and leased buildings and a 215,000-vehicle motor pool. Among the real estate assets managed by GSA are the Ronald Reagan Building and International Trade Center in Washington, D.C. – the largest U.S. federal building after the Pentagon – and the Hart-Dole-Inouye Federal Center. GSA's business lines include the Federal Acquisition Service and the Public Buildings Service, as well as several Staff Offices including the Office of Government-wide Policy, the Office of Small Business Utilization, and the Office of Mission Assurance.
As part of FAS, GSA's Technology Transformation Services helps federal agencies improve delivery of information and services to the public. Key initiatives include FedRAMP, Cloud.gov, the USAGov platform, Data.gov, Performance.gov, and Challenge.gov. GSA is a member of the Procurement G6, an informal group leading the use of framework agreements and e-procurement instruments in public procurement. In 1947 President Harry Truman asked former President Herbert Hoover to lead what became known as the Hoover Commission to make recommendations to reorganize the operations of the federal government. One of the recommendations of the commission was the establishment of an "Office of the General Services." This proposed office would combine the responsibilities of the following organizations: the U.S. Treasury Department's Bureau of Federal Supply; the U.S. Treasury Department's Office of Contract Settlement; the National Archives Establishment; all functions of the Federal Works Agency, including the Public Buildings Administration and the Public Roads Administration; and the War Assets Administration. GSA became an independent agency on July 1, 1949, after the passage of the Federal Property and Administrative Services Act.
General Jess Larson, Administrator of the War Assets Administration, was named GSA's first Administrator. The first job awaiting Administrator Larson and the newly formed GSA was a complete renovation of the White House; the structure had fallen into such a state of disrepair by 1949 that one inspector of the time said the historic structure was standing "purely from habit." Larson explained the nature of the total renovation in depth by saying, "In order to make the White House structurally sound, it was necessary to dismantle, I mean dismantle, everything from the White House except the four walls, which were constructed of stone. Everything, except the four walls without a roof, was stripped down, and that's where the work started." GSA worked with President Truman and First Lady Bess Truman to ensure that the new agency's first major project would be a success. GSA completed the renovation in 1952. In 1986 GSA headquarters, the U.S. General Services Administration Building, located at Eighteenth and F Streets NW, was listed on the National Register of Historic Places, at the time serving as Interior Department offices.
In 1960 GSA created the Federal Telecommunications System, a government-wide intercity telephone system. In 1962 the Ad Hoc Committee on Federal Office Space created a new building program to address obsolete office buildings in Washington, D.C., resulting in the construction of many of the offices that now line Independence Avenue. In 1970 the Nixon administration created the Consumer Product Information Coordinating Center, now part of USAGov. In 1972 GSA established the Automated Data and Telecommunications Service, which became the Office of Information Resources Management. In 1973 GSA created the Office of Federal Management Policy. In 1974 the Federal Buildings Fund was initiated, allowing GSA to issue rent bills to federal agencies. GSA's Office of Acquisition Policy centralized procurement policy in 1978. GSA was responsible for emergency preparedness and stockpiling strategic materials to be used in wartime until these functions were transferred to the newly created Federal Emergency Management Agency in 1979.
In 1984 GSA introduced the federal government to the use of charge cards, known as the GSA SmartPay system. The National Archives and Records Administration was spun off into an independent agency in 1985; the same year, GSA began to provide governmentwide policy oversight and guidance for federal real property management as a result of an Executive Order signed by President Ronald Reagan. In 2003 the Federal Protective Service was moved to the Department of Homeland Security. In 2005 GSA reorganized to merge the Federal Supply Service and Federal Technology Service business lines into the Federal Acquisition Service. On April 3, 2009, President Barack Obama nominated Martha N. Johnson to serve as GSA Administrator. After a nine-month delay, the United States Senate confirmed her nomination on February 4, 2010. On April 2, 2012, Johnson resigned in the wake of a management-deficiency report that detailed improper payments for a 2010 "Western Regions" training conference put on by the Public Buildings Service in Las Vegas.
In July 1991 GSA contractors began the excavation of what is now the Ted Weiss Federal Building in New York City. The planning for that buildin
In addition, the server invalidates any associations with the session, making any session-handle in the user's cookie store useless. This feature comes in handy if the user is using a public computer or a computer that is using a public wireless connection. As a security precaution, one should not rely on implicit means of logging out of a system, especially not on a public computer; instead, one should explicitly log out and wait for the confirmation that this request has taken place. Logging out of a computer when leaving it is a common security practice, preventing unauthorized users from tampering with it. Some users choose to have a password-protected screensaver set to activate after some period of inactivity, requiring the user to re-enter his or her login credentials to unlock the screensaver and gain access to the system. There can be different methods of logging in, such as via image or eye scan. The terms became common with the time-sharing systems of the 1960s and Bulletin Board Systems in the 1970s.
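The session-invalidation behaviour described above can be sketched with a tiny in-memory session store. All names below are hypothetical; real servers use a database or cache and expire sessions automatically:

```python
import secrets

# Server-side store mapping session handles to users (illustrative only).
sessions: dict[str, str] = {}

def log_in(username: str) -> str:
    """Create a session and return its handle (sent to the browser as a cookie)."""
    sid = secrets.token_hex(16)
    sessions[sid] = username
    return sid

def log_out(sid: str) -> None:
    """Invalidate the session: the server forgets the handle entirely."""
    sessions.pop(sid, None)

sid = log_in("alice")
assert sessions.get(sid) == "alice"
log_out(sid)
# The cookie value still exists in the browser, but it no longer maps
# to anything on the server, so it is useless to an attacker.
assert sessions.get(sid) is None
```

This is why an explicit logout matters: it is the server-side deletion, not clearing the cookie, that renders a stolen session handle harmless.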
Early home computers and personal computers did not require them until Windows NT, OS/2 and Linux in the 1990s. The noun login comes from the verb log in, by analogy with the verb to clock in. Computer systems keep a log of users' access to the system; the term "log" comes from the chip log used to record distance travelled at sea, recorded in a ship's log or log book. To sign in connotes the same idea, but is based on the analogy of manually signing a log or visitors' book. While there is no agreed difference in meaning between the three terms, different technical communities tend to prefer one over another, with Unix, Novell and Apple using login; Apple's style guide says "Users log in to a file server...". By contrast, Microsoft's style guides traditionally suggested the opposite and prescribed log on and logon. In the past, Microsoft reserved sign in for accessing the Internet, but from Windows 8 onwards it has moved to the sign in terminology for local authentication as well.