Bitcoin is a cryptocurrency, a form of electronic cash. It is a decentralized digital currency without a central bank or single administrator that can be sent from user to user on the peer-to-peer bitcoin network without the need for intermediaries. Transactions are verified by network nodes through cryptography and recorded in a public distributed ledger called a blockchain. Bitcoin was invented by an unknown person or group of people using the name Satoshi Nakamoto and released as open-source software in 2009. Bitcoins are created as a reward for a process known as mining; they can be exchanged for other currencies and services. Research produced by the University of Cambridge estimates that in 2017 there were 2.9 to 5.8 million unique users using a cryptocurrency wallet, most of them using bitcoin. Bitcoin has been criticized for its use in illegal transactions, its high electricity consumption, price volatility, thefts from exchanges, and the possibility that bitcoin is an economic bubble. Bitcoin has been used as an investment, although several regulatory agencies have issued investor alerts about bitcoin.
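The hash-chained ledger structure mentioned above can be illustrated with a toy sketch in Python. The block fields here are hypothetical; real Bitcoin blocks also carry timestamps, a Merkle root of the transactions, and a proof-of-work nonce.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

# Each block commits to its predecessor's hash, forming a chain:
genesis = {"index": 0, "prev_hash": "0" * 64, "data": "genesis"}
block1 = {"index": 1, "prev_hash": block_hash(genesis), "data": "tx: A->B 5"}
block2 = {"index": 2, "prev_hash": block_hash(block1), "data": "tx: B->C 2"}
chain = [genesis, block1, block2]

def verify(chain) -> bool:
    """Re-derive each link; tampering anywhere breaks every later link."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

print(verify(chain))      # True for the untampered chain
genesis["data"] = "forged"
print(verify(chain))      # False: the change invalidates block1's link
```

Because each hash depends on the previous block's entire contents, rewriting history requires recomputing every subsequent block, which is what makes the ledger tamper-evident.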
The domain name "bitcoin.org" was registered on 18 August 2008. On 31 October 2008, a link to a paper authored by Satoshi Nakamoto titled Bitcoin: A Peer-to-Peer Electronic Cash System was posted to a cryptography mailing list. Nakamoto implemented the bitcoin software as open-source code and released it in January 2009. Nakamoto's identity remains unknown. On 3 January 2009, the bitcoin network was created when Nakamoto mined the first block of the chain, known as the genesis block. Embedded in the coinbase of this block was the following text: "The Times 03/Jan/2009 Chancellor on brink of second bailout for banks." This note has been interpreted as both a timestamp and a comment on the instability caused by fractional-reserve banking. The receiver of the first bitcoin transaction was cypherpunk Hal Finney, who had created the first reusable proof-of-work system in 2004. Finney downloaded the bitcoin software on its release date, and on 12 January 2009 received ten bitcoins from Nakamoto. Other early cypherpunk supporters were creators of bitcoin predecessors: Wei Dai, creator of b-money, and Nick Szabo, creator of bit gold.
In 2010, the first known commercial transaction using bitcoin occurred when programmer Laszlo Hanyecz bought two Papa John's pizzas for 10,000 bitcoin. Nakamoto is estimated to have mined one million bitcoins before disappearing in 2010, when he handed the network alert key and control of the code repository over to Gavin Andresen. Andresen later became lead developer at the Bitcoin Foundation. Andresen sought to decentralize control; this left opportunity for controversy to develop over the future development path of bitcoin. After early "proof-of-concept" transactions, the first major users of bitcoin were black markets, such as Silk Road. During its 30 months of existence, beginning in February 2011, Silk Road accepted bitcoins as payment, transacting 9.9 million bitcoins, worth about $214 million. In 2011, the price started at $0.30 per bitcoin. The price rose to $31.50 on 8 June. Within a month the price fell to $11.00. The next month it fell to $7.80, and in another month to $4.77. Litecoin, an early bitcoin spin-off or altcoin, appeared in October 2011.
Many altcoins have been created since then. In 2012, bitcoin prices started at $5.27, growing to $13.30 for the year. By 9 January the price had risen to $7.38, but crashed by 49% to $3.80 over the next 16 days. The price rose to $16.41 on 17 August, but fell by 57% to $7.10 over the next three days. The Bitcoin Foundation was founded in September 2012 to promote bitcoin's uptake. In 2013, prices started at $13.30, rising to $770 by 1 January 2014. In March 2013 the blockchain temporarily split into two independent chains with different rules; the two blockchains operated for six hours, each with its own version of the transaction history. Normal operation was restored when the majority of the network downgraded to version 0.7 of the bitcoin software. The Mt. Gox exchange halted bitcoin deposits and the price dropped by 23% to $37 before recovering to the previous level of $48 in the following hours. The US Financial Crimes Enforcement Network established regulatory guidelines for "decentralized virtual currencies" such as bitcoin, classifying American bitcoin miners who sell their generated bitcoins as Money Service Businesses, which are subject to registration and other legal obligations.
In April, exchanges BitInstant and Mt. Gox experienced processing delays due to insufficient capacity, resulting in the bitcoin price dropping from $266 to $76 before returning to $160 within six hours; the bitcoin price had risen to $259 on 10 April, but crashed by 83% to $45 over the next three days. On 15 May 2013, US authorities seized accounts associated with Mt. Gox after discovering it had not registered as a money transmitter with FinCEN in the US. On 23 June 2013, the US Drug Enforcement Administration listed 11.02 bitcoins as a seized asset in a United States Department of Justice seizure notice pursuant to 21 U.S.C. § 881. This marked the first time a government agency had seized bitcoin. The FBI seized about 26,000 bitcoins in October 2013 from the dark web website Silk Road during the arrest of Ross William Ulbricht. Bitcoin's price crashed by 50% to $378 the same day. On 30 November 2013 the price reached $1,163 before starting a long-term crash, declining by 87% to $152 in January 2015. On 5 December 2013, the People's Bank of China prohibited Chinese financial institutions from using bitcoins.
After the announcement, the value of bitcoins dropped, and Baidu no longer accepted bitcoins for certain services. Buying real-world goods w
Authentication is the act of confirming the truth of an attribute of a single piece of data claimed true by an entity. In contrast with identification, which refers to the act of stating or otherwise indicating a claim purportedly attesting to a person or thing's identity, authentication is the process of confirming that identity. It might involve confirming the identity of a person by validating their identity documents, verifying the authenticity of a website with a digital certificate, determining the age of an artifact by carbon dating, or ensuring that a product is what its packaging and labeling claim to be. In other words, authentication involves verifying the validity of at least one form of identification. Authentication is relevant to multiple fields. In art and anthropology, a common problem is verifying that a given artifact was produced by a certain person or in a certain place or period of history. In computer science, verifying a person's identity is often required to allow access to confidential data or systems.
Authentication can be considered to be of three types. The first type of authentication is accepting proof of identity given by a credible person who has first-hand evidence that the identity is genuine. When authentication is required of art or physical objects, this proof could be a friend, family member or colleague attesting to the item's provenance, for example by having witnessed the item in its creator's possession. With autographed sports memorabilia, this could involve someone attesting that they witnessed the object being signed. A vendor selling branded items implies authenticity, even though he or she may not have evidence that every step in the supply chain was authenticated. Centralized authority-based trust relationships back most secure internet communication through known public certificate authorities. The second type of authentication is comparing the attributes of the object itself to what is known about objects of that origin. For example, an art expert might look for similarities in the style of painting, check the location and form of a signature, or compare the object to an old photograph.
An archaeologist, on the other hand, might use carbon dating to verify the age of an artifact, do a chemical and spectroscopic analysis of the materials used, or compare the style of construction or decoration to other artifacts of similar origin. The physics of sound and light, and comparison with a known physical environment, can be used to examine the authenticity of audio recordings, photographs, or videos. Documents can be verified as being created on ink or paper available at the time of the item's implied creation. Attribute comparison may be vulnerable to forgery. In general, it relies on the facts that creating a forgery indistinguishable from a genuine artifact requires expert knowledge, that mistakes are easily made, and that the amount of effort required to do so is considerably greater than the amount of profit that can be gained from the forgery. In art and antiques, certificates are of great importance for authenticating an object of interest and value. Certificates can, however, be forged, and the authentication of these poses a problem.
For instance, the son of Han van Meegeren, the well-known art forger, forged the work of his father and provided a certificate for its provenance as well. Criminal and civil penalties for fraud and counterfeiting can reduce the incentive for falsification, depending on the risk of getting caught. Currency and other financial instruments commonly use this second type of authentication method. Bills and cheques incorporate hard-to-duplicate physical features, such as fine printing or engraving, distinctive feel, and holographic imagery, which are easy for trained receivers to verify. The third type of authentication relies on documentation or other external affirmations. In criminal courts, the rules of evidence often require establishing the chain of custody of evidence presented; this can be accomplished through a written evidence log, or by testimony from the police detectives and forensics staff that handled it. Some antiques are accompanied by certificates attesting to their authenticity. Signed sports memorabilia is usually accompanied by a certificate of authenticity.
These external records have their own problems of forgery and perjury, and are also vulnerable to being separated from the artifact and lost. In computer science, a user can be given access to secure systems based on user credentials that imply authenticity. A network administrator can give a user a password, or provide the user with a key card or other access device to allow system access. In this case, authenticity is implied but not guaranteed. Consumer goods such as pharmaceuticals and fashion clothing can use all three forms of authentication to prevent counterfeit goods from taking advantage of a popular brand's reputation. As mentioned above, having an item for sale in a reputable store implicitly attests to it being genuine, the first type of authentication. The second type of authentication might involve comparing the quality and craftsmanship of an item, such as an expensive handbag, to genuine articles. The third type of authentication could be the presence of a trademark on the item, a protected marking, or any other identifying feature which aids consumers in the identification o
An authentication protocol is a type of computer communications protocol or cryptographic protocol designed for the transfer of authentication data between two entities. It allows the receiving entity to authenticate the connecting entity, as well as to authenticate itself to the connecting entity, by declaring the type of information needed for authentication as well as its syntax. It is the most important layer of protection needed for secure communication within computer networks. With the increasing amount of trustworthy information being accessible over the network, the need for keeping unauthorized persons from accessing this data emerged. Stealing someone's identity is easy in the computing world, so special verification methods had to be invented to find out whether the person or computer requesting data is who it claims to be. The task of the authentication protocol is to specify the exact series of steps needed for execution of the authentication. It has to comply with the main protocol principles: a protocol has to involve two or more parties, and everyone involved in the protocol must know the protocol in advance.
All the included parties have to follow the protocol. A protocol has to be unambiguous: each step must be defined precisely. A protocol must be complete: it must include a specified action for every possible situation. An illustration of password-based authentication using a simple authentication protocol: Alice and Bob are both aware of the protocol they agreed on using. Bob has Alice's password stored in a database for comparison. Alice sends Bob her password in a packet complying with the protocol rules. Bob checks the received password against the one stored in his database and then sends a packet saying "Authentication successful" or "Authentication failed" based on the result. This is an example of a very basic authentication protocol, vulnerable to many threats such as eavesdropping, replay attacks, man-in-the-middle attacks, dictionary attacks, and brute-force attacks. Most authentication protocols are more complicated. Authentication protocols are used, for example, by Point-to-Point Protocol (PPP) servers to validate the identity of remote clients before granting them access to server data.
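The Alice-and-Bob exchange above can be sketched in a few lines of Python. The names and values are hypothetical, and the sketch exists only to make the protocol's weakness concrete; real systems never store or transmit plain-text passwords.

```python
# Bob's stored credential (hypothetical; storing plain text is itself unsafe).
DATABASE = {"alice": "s3cret"}

def bob_check(username: str, password: str) -> str:
    """Bob compares the received password against his database."""
    if DATABASE.get(username) == password:
        return "Authentication successful"
    return "Authentication failed"

# Alice sends her credentials "in the clear": anyone eavesdropping on the
# channel, or replaying the captured packet later, authenticates as Alice.
print(bob_check("alice", "s3cret"))   # Authentication successful
print(bob_check("alice", "guess"))    # Authentication failed
```

Every threat listed above (eavesdropping, replay, man-in-the-middle, dictionary and brute-force guessing) applies directly to this exchange, which is why practical protocols add challenges, nonces, or encryption on top of the basic comparison.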
Most of them use a password as the cornerstone of the authentication. In most cases, the password has to be shared between the communicating entities in advance. Password Authentication Protocol (PAP) is one of the oldest authentication protocols. Authentication is initialized by the client sending a packet with credentials at the beginning of the connection, with the client repeating the authentication request until acknowledgement is received. It is insecure because credentials are sent "in the clear", making it vulnerable to even the simplest attacks, such as eavesdropping and man-in-the-middle attacks. Although still supported, it is specified that if an implementation offers a stronger authentication method, that method must be offered before PAP. Mixed authentication is not expected, as the CHAP authentication would be compromised by PAP sending the password in plain text. The authentication process in CHAP (Challenge-Handshake Authentication Protocol) is always initialized by the server/host and can be performed repeatedly at any time during the session.
The server sends a random string (the challenge). The client uses the password and the received string as parameters for the MD5 hash function and sends the result together with the username in plain text. The server compares the calculated and received hashes; if they do not match, authentication is unsuccessful. EAP (Extensible Authentication Protocol) was developed for PPP but is today used in IEEE 802.3, IEEE 802.11, and IEEE 802.16 as part of the IEEE 802.1X authentication framework. The latest version is standardized in RFC 5247. The advantage of EAP is that it is only a general authentication framework for client-server authentication; the specific way of authentication is defined in its many versions, called EAP-methods. More than 40 EAP-methods exist; the most common are EAP-MD5, EAP-TLS, EAP-TTLS, EAP-FAST, and EAP-PEAP. AAA protocols are complex protocols used in larger networks for verifying the user (Authentication), controlling access to server data (Authorization), and monitoring network resources and information needed for billing of services (Accounting). TACACS, the oldest AAA protocol, uses IP-based authentication without any encryption. Its version XTACACS added authorization and accounting.
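The challenge-response steps described above can be sketched as follows. This is a simplified illustration: real CHAP (RFC 1994) also hashes a packet identifier along with the secret and the challenge, and MD5 itself is no longer considered strong.

```python
import hashlib
import os

SHARED_SECRET = b"s3cret"   # password known to both sides in advance

def respond(challenge: bytes, secret: bytes) -> str:
    """Client side: hash the shared secret together with the challenge."""
    return hashlib.md5(secret + challenge).hexdigest()

def authenticate(challenge: bytes, response: str, secret: bytes) -> bool:
    """Server side: recompute the hash and compare with the response."""
    return respond(challenge, secret) == response

challenge = os.urandom(16)              # the random string the server sends
resp = respond(challenge, SHARED_SECRET)
print(authenticate(challenge, resp, SHARED_SECRET))   # True
print(authenticate(challenge, resp, b"wrong"))        # False
```

Because the password never crosses the wire and the challenge is fresh each time, an eavesdropper cannot simply replay a captured response, which is the key improvement over PAP.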
Both of these protocols were later replaced by TACACS+. TACACS+ separates the AAA components so that they can be handled on separate servers, and it encrypts the whole packet. TACACS+ is Cisco proprietary. Remote Authentication Dial-In User Service (RADIUS) is a full AAA protocol commonly used by ISPs. Credentials are based on a username-password combination; it relies on the Network Access Server concept and uses the UDP protocol for transport. Diameter evolved from RADIUS and brings many improvements, such as the use of the more reliable TCP or SCTP transport protocols and higher security thanks to TLS. Kerberos is a centralized network authentication system developed at MIT and available both as a free implementation from MIT and in many commercial products. It is the default authentication method in Windows 2000 and later. The authentication process itself is much more complicated than in the previous protocols: Kerberos uses symmetric-key cryptography, requires a trusted third party, and can use public-key cryptography.
Cryptography, or cryptology, is the practice and study of techniques for secure communication in the presence of third parties called adversaries. More generally, cryptography is about constructing and analyzing protocols that prevent third parties or the public from reading private messages. Modern cryptography exists at the intersection of the disciplines of mathematics, computer science, electrical engineering, communication science, and physics. Applications of cryptography include electronic commerce, chip-based payment cards, digital currencies, computer passwords, and military communications. Cryptography prior to the modern age was effectively synonymous with encryption, the conversion of information from a readable state to apparent nonsense; the originator of an encrypted message shares the decoding technique only with intended recipients to preclude access from adversaries. The cryptography literature often uses the names Alice for the sender, Bob for the intended recipient, and Eve for the adversary. Since the development of rotor cipher machines in World War I and the advent of computers in World War II, the methods used to carry out cryptology have become increasingly complex and their application more widespread.
Modern cryptography is heavily based on mathematical theory and computer science practice. It is theoretically possible to break such a system, but it is infeasible to do so by any known practical means; these schemes are therefore termed computationally secure. There exist information-theoretically secure schemes that provably cannot be broken even with unlimited computing power, an example being the one-time pad, but these schemes are more difficult to use in practice than the best theoretically breakable but computationally secure mechanisms. The growth of cryptographic technology has raised a number of legal issues in the information age. Cryptography's potential for use as a tool for espionage and sedition has led many governments to classify it as a weapon and to limit or prohibit its use and export. In some jurisdictions where the use of cryptography is legal, laws permit investigators to compel the disclosure of encryption keys for documents relevant to an investigation. Cryptography also plays a major role in digital rights management and copyright infringement disputes over digital media.
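The one-time pad mentioned above is simple to demonstrate: XOR-ing with a truly random, never-reused pad of the same length as the message both encrypts and decrypts. This is a minimal sketch; generating, distributing, and using each pad exactly once are the hard parts in practice.

```python
import os

def otp(data: bytes, key: bytes) -> bytes:
    """XOR each byte with the pad; the same operation encrypts and decrypts."""
    assert len(key) >= len(data), "the pad must be at least as long as the message"
    return bytes(b ^ k for b, k in zip(data, key))

message = b"attack at dawn"
pad = os.urandom(len(message))   # truly random, used once, never reused

ciphertext = otp(message, pad)
print(otp(ciphertext, pad))      # b'attack at dawn'
```

Because every possible plaintext of the same length corresponds to some pad, the ciphertext alone carries no information about the message, which is exactly the information-theoretic security the text describes.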
The first use of the term cryptograph dates back to the 19th century, originating from The Gold-Bug, a story by Edgar Allan Poe. Until modern times, cryptography referred exclusively to encryption, the process of converting ordinary information (called plaintext) into unintelligible form (called ciphertext). Decryption is the reverse, in other words, moving from the unintelligible ciphertext back to plaintext. A cipher is a pair of algorithms that carry out the encryption and the reversing decryption. The detailed operation of a cipher is controlled both by the algorithm and, in each instance, by a "key". The key is a secret, usually a short string of characters, which is needed to decrypt the ciphertext. Formally, a "cryptosystem" is the ordered list of elements of finite possible plaintexts, finite possible ciphertexts, finite possible keys, and the encryption and decryption algorithms which correspond to each key. Keys are important both formally and in actual practice, as ciphers without variable keys can be trivially broken with only the knowledge of the cipher used and are therefore useless for most purposes.
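As a concrete toy cryptosystem, the classic Caesar shift cipher shows the role of the key, and why a tiny key space makes a cipher trivially breakable:

```python
def caesar(text: str, key: int) -> str:
    """Shift each letter by `key` positions; a negative key decrypts."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + key) % 26 + base))
        else:
            out.append(ch)
    return "".join(out)

ciphertext = caesar("meet me at noon", 3)
print(ciphertext)                  # phhw ph dw qrrq
print(caesar(ciphertext, -3))      # meet me at noon

# With only 26 possible keys, an attacker who knows the cipher can simply
# try them all, which is why a fixed or tiny key space is useless:
candidates = [caesar(ciphertext, -k) for k in range(26)]
print("meet me at noon" in candidates)   # True
```

The brute-force loop at the end is a miniature instance of cryptanalysis: recovering the plaintext with knowledge of the cipher but not the key.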
Historically, ciphers were often used directly for encryption or decryption without additional procedures such as authentication or integrity checks. There are two kinds of cryptosystems: symmetric and asymmetric. In symmetric systems, the same key is used to encrypt and decrypt a message. Data manipulation in symmetric systems is faster than in asymmetric systems, partly because they use shorter key lengths. Asymmetric systems use a public key to encrypt a message and a private key to decrypt it. The use of asymmetric systems enhances the security of communication. Examples of asymmetric systems include RSA and ECC; symmetric models include the commonly used AES, which replaced the older DES. In colloquial use, the term "code" is often used to mean any method of encryption or concealment of meaning. However, in cryptography, code has a more specific meaning: the replacement of a unit of plaintext with a code word. Cryptanalysis is the term used for the study of methods for obtaining the meaning of encrypted information without access to the key normally required to do so. Some use the terms cryptography and cryptology interchangeably in English, while others use cryptography to refer to the use and practice of cryptographic techniques and cryptology to refer to the combined study of cryptography and cryptanalysis.
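A toy RSA-style computation makes the public/private key split concrete. The primes here are hypothetical and hopelessly insecure; real RSA keys use primes hundreds of digits long.

```python
# Toy RSA key generation with tiny primes (illustration only).
p, q = 61, 53
n = p * q                  # 3233, the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e (Python 3.8+)

message = 65
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key
plaintext = pow(ciphertext, d, n)  # only the private-key holder can decrypt
print(ciphertext)                  # 2790
print(plaintext)                   # 65
```

The public pair (n, e) can be published freely; recovering d requires factoring n, which is what makes large-key RSA computationally secure in the sense described earlier.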
English is more flexible than several other languages in which crypto
A timestamp is a sequence of characters or encoded information identifying when a certain event occurred giving date and time of day, sometimes accurate to a small fraction of a second. The term derives from rubber stamps used in offices to stamp the current date, sometimes time, in ink on paper documents, to record when the document was received. Common examples of this type of timestamp are a postmark on a letter or the "in" and "out" times on a time card. In modern times usage of the term has expanded to refer to digital date and time information attached to digital data. For example, computer files contain timestamps that tell when the file was last modified, digital cameras add timestamps to the pictures they take, recording the date and time the picture was taken. A timestamp is the time at which an event is recorded by a computer, not the time of the event itself. In many cases, the difference may be inconsequential: the time at which an event is recorded by a timestamp should be close to the time of the event.
This data is presented in a consistent format, allowing for easy comparison of two different records and tracking progress over time. The sequential numbering of events is sometimes called timestamping. Timestamps are typically used for logging events or recording a sequence of events (SOE), in which case each event in the log or SOE is marked with a timestamp. Practically all computer file systems store one or more timestamps in the per-file metadata. In particular, most modern operating systems support the POSIX stat() call, so each file has three timestamps associated with it: time of last access, time of last modification, and time of last status change. Some file archivers and some version control software, when they copy a file from a remote computer to the local computer, adjust the timestamps of the local file to show the date/time in the past when that file was created or modified on the remote computer, rather than the date/time when the file was copied to the local computer. Examples of timestamps: "Wed 01-01-2009 6:00"; "2005-10-30 T 10:45 UTC"; "2007-11-09 T 11:20 UTC"; "Sat Jul 23 02:16:57 2005"; "1256953732" (Unix time); "07:38, 11 December 2012"; "1985-102 T 10:15 UTC" (ordinal date); "1985-W15-5 T 10:15 UTC" (week date); "20180203073000". ISO 8601 standardizes the representation of dates and times.
These standard representations are used to construct timestamp values. The term timestamp can refer to: a time code; Unix time, the number of seconds since 00:00:00 UTC on January 1, 1970; an ICMP timestamp; a digitally signed timestamp, whose signer vouches for the existence of the signed document or content at the time given as part of the digital signature; the modification or access time of a file or directory in a computer file system or database; or a proof of authenticity of a person on sites such as 4chan. Related topics include Bates numbering, timestamping, timestamp-based concurrency control, trusted timestamping, decentralized trusted timestamping on the blockchain, and linked timestamping.
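A short sketch ties these notions together: converting Unix time to an ISO 8601 string and back, then reading a file's three POSIX timestamps from its metadata. The 2009-01-03 date is only an illustrative value.

```python
import os
import tempfile
import time
from datetime import datetime, timezone

# Unix time <-> ISO 8601:
dt = datetime.fromtimestamp(0, tz=timezone.utc)
print(dt.isoformat())          # 1970-01-01T00:00:00+00:00

example_day = datetime(2009, 1, 3, tzinfo=timezone.utc)
print(int(example_day.timestamp()))   # 1230940800

# The three per-file POSIX timestamps, read back via os.stat():
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello")
    path = f.name
st = os.stat(path)
print(time.ctime(st.st_atime))   # time of last access
print(time.ctime(st.st_mtime))   # time of last modification
print(time.ctime(st.st_ctime))   # time of last status change
os.unlink(path)
```

Note that, as the text observes, these are the times at which the operating system recorded the events, not necessarily the times of the events themselves.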
Email spam, also known as junk email, is unsolicited messages sent in bulk by email. The name comes from Spam luncheon meat by way of a Monty Python sketch in which Spam is ubiquitous and repetitive. Email spam has grown steadily since the early 1990s, and by 2014 it was estimated to make up around 90% of email messages sent. Since the expense of the spam is borne by the recipient, it is effectively postage-due advertising; this makes it an excellent example of a negative externality. The legal definition and status of spam varies from one jurisdiction to another, but nowhere have laws and lawsuits been particularly successful in stemming spam. Most email spam messages are commercial in nature. Whether commercial or not, many are not only annoying but also dangerous, because they may contain links that lead to phishing web sites or sites that are hosting malware, or include malware as file attachments. Spammers collect email addresses from chat rooms, customer lists, and viruses that harvest users' address books; these collected email addresses are sometimes also sold to other spammers.
At the beginning of the Internet, the sending of commercial email was prohibited. Gary Thuerk sent the first email spam message in 1978 to 600 people; he was reprimanded and told not to do it again. Now the ban on spam is enforced by the Terms of Service/Acceptable Use Policy of internet service providers and by peer pressure. Spam is sent by both reputable organizations and lesser companies; when spam is sent by otherwise reputable companies it is sometimes referred to as Mainsleaze. Mainsleaze makes up approximately 3% of the spam sent over the internet. The problem with mainsleaze is that it is mixed in with mail that the recipients asked for, and it is difficult to tell the difference using traditional mail filters. As a result, if a mail system filters out all the mail from a mainsleazer, it will get complaints from the people who signed up. As the scale of the spam problem has grown, ISPs and the public have turned to government for relief from spam, which has failed to materialize. Many spam emails contain URLs to a website or websites. According to a Cyberoam report in 2014, there are an average of 54 billion spam messages sent every day.
"Pharmaceutical products jumped up 45% from last quarter’s analysis, leading this quarter’s spam pack. Emails purporting to offer jobs with fast, easy cash come in at number two, accounting for 15% of all spam email. And, rounding off at number three are spam emails about diet products, accounting for 1%." Spam is also a medium for fraudsters to scam users into entering personal information on fake web sites, using emails forged to look as if they are from banks or other organizations, such as PayPal. This is known as phishing. Targeted phishing, where known information about the recipient is used to create forged emails, is known as spear-phishing. If a marketer has one database containing names and telephone numbers of customers, they can pay to have their database matched against an external database containing email addresses. The company then has the means to send email to people who have not requested email, which may include people who have deliberately withheld their email address. Image spam, or image-based spam, is an obfuscation method by which the text of the message is stored as a GIF or JPEG image and displayed in the email.
This prevents text-based spam filters from detecting and blocking such spam messages. Image spam was widely used in the mid-2000s to advertise "pump and dump" stocks. Often, image spam contains nonsensical, computer-generated text which simply annoys the reader. However, newer technology in some programs tries to read the images by attempting to find text in them. These programs are not very accurate, and sometimes filter out innocent images of products, such as a box that has words on it. A newer technique, however, is to use an animated GIF image that does not contain clear text in its initial frame, or to contort the shapes of letters in the image to avoid detection by optical character recognition tools. Blank spam is spam lacking a payload advertisement. Often the message body is missing altogether, as well as the subject line. Still, it fits the definition of spam because of its nature as unsolicited bulk email. Blank spam may originate in different ways, either intentionally or unintentionally: Blank spam can have been sent in a directory harvest attack, a form of dictionary attack for gathering valid addresses from an email service provider.
Since the goal in such an attack is to use the bounces to separate invalid addresses from the valid ones, spammers may dispense with most elements of the header and the entire message body, and still accomplish their goals. Blank spam may also occur when a spammer forgets or otherwise fails to add the payload when he or she sets up the spam run. Often blank spam headers appear truncated, suggesting that computer glitches, such as software bugs or other factors, may have contributed to this problem: anything from poorly written spam software to malfunctioning relay servers, or any problems that may truncate header lines from the message body. Some spam may appear to be blank when in fact it is not. An example of this is the VBS.Davinia.B email worm, which propagates through messages that have no subject line and appear blank, when in fact it uses HTML code to download other files. Backscatter is a side-effect of email spam and worms; it happens when email servers are misconfigured to send a bogus bounce message to the envelope sender when rejecting or quarantining email (rather than rejecting the attempt).
Hypertext Transfer Protocol
The Hypertext Transfer Protocol (HTTP) is an application protocol for distributed, hypermedia information systems. HTTP is the foundation of data communication for the World Wide Web, where hypertext documents include hyperlinks to other resources that the user can access, for example by a mouse click or by tapping the screen in a web browser. HTTP was developed to facilitate the World Wide Web. Development of HTTP was initiated by Tim Berners-Lee at CERN in 1989. Development of HTTP standards was coordinated by the Internet Engineering Task Force (IETF) and the World Wide Web Consortium (W3C), culminating in the publication of a series of Requests for Comments (RFCs). The first definition of HTTP/1.1, the version of HTTP in common use, occurred in RFC 2068 in 1997, although this was made obsolete by RFC 2616 in 1999 and then again by the RFC 7230 family of RFCs in 2014. A later version, the successor HTTP/2, was standardized in 2015 and is now supported by major web servers and browsers over Transport Layer Security (TLS) using the Application-Layer Protocol Negotiation (ALPN) extension, where TLS 1.2 or newer is required.
HTTP functions as a request–response protocol in the client–server computing model. A web browser, for example, may be the client and an application running on a computer hosting a website may be the server; the client submits an HTTP request message to the server. The server, which provides resources such as HTML files and other content, or performs other functions on behalf of the client, returns a response message to the client; the response contains completion status information about the request and may contain requested content in its message body. A web browser is an example of a user agent. Other types of user agent include the indexing software used by search providers, voice browsers, mobile apps, other software that accesses, consumes, or displays web content. HTTP is designed to permit intermediate network elements to improve or enable communications between clients and servers. High-traffic websites benefit from web cache servers that deliver content on behalf of upstream servers to improve response time.
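The request-response cycle described above can be demonstrated end-to-end with Python's standard library, running a minimal server locally and issuing a GET request against it. This is an illustrative sketch, not production code.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        """Server role: return a status line, headers, and a message body."""
        body = b"hello"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):   # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Hello)   # port 0: pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client role: submit a request message and read the response message.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
resp = conn.getresponse()
data = resp.read()
print(resp.status, data)   # 200 b'hello'
server.shutdown()
```

The response carries both the completion status (200) and the requested content in its message body, exactly the two elements the paragraph above describes.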
Web browsers cache previously accessed web resources and reuse them, when possible, to reduce network traffic. HTTP proxy servers at private network boundaries can facilitate communication for clients without a globally routable address, by relaying messages with external servers. HTTP is an application layer protocol designed within the framework of the Internet protocol suite. Its definition presumes an underlying and reliable transport layer protocol, and Transmission Control Protocol (TCP) is commonly used. However, HTTP can be adapted to use unreliable protocols such as the User Datagram Protocol (UDP), for example in HTTPU and the Simple Service Discovery Protocol (SSDP). HTTP resources are identified and located on the network by Uniform Resource Locators (URLs), using the Uniform Resource Identifier (URI) schemes http and https. URIs and hyperlinks in HTML documents form interlinked hypertext documents. HTTP/1.1 is a revision of the original HTTP. In HTTP/1.0 a separate connection to the same server is made for every resource request. HTTP/1.1 can reuse a connection multiple times, for example to download images, stylesheets, and other resources after the page has been delivered.
HTTP/1.1 communications therefore experience less latency, as the establishment of TCP connections presents considerable overhead. The term hypertext was coined by Ted Nelson in 1965 in the Xanadu Project, which was in turn inspired by Vannevar Bush's 1930s vision of the microfilm-based information retrieval and management "memex" system described in his 1945 essay "As We May Think". Tim Berners-Lee and his team at CERN are credited with inventing the original HTTP, along with HTML and the associated technology for a web server and a text-based web browser. Berners-Lee first proposed the "WorldWideWeb" project in 1989, now known as the World Wide Web. The first version of the protocol had only one method, namely GET, which would request a page from a server. The response from the server was always an HTML page. The first documented version of HTTP was HTTP V0.9. Dave Raggett led the HTTP Working Group in 1995 and wanted to expand the protocol with extended operations, extended negotiation, and richer meta-information, tied with a security protocol, which became more efficient by adding additional methods and header fields.
RFC 1945 officially introduced and recognized HTTP V1.0 in 1996. The HTTP WG planned to publish new standards in December 1995, and support for pre-standard HTTP/1.1 based on the then-developing RFC 2068 was adopted by the major browser developers in early 1996. By March that year, pre-standard HTTP/1.1 was supported in Arena, Netscape 2.0, Netscape Navigator Gold 2.01, Mosaic 2.7, Lynx 2.5, and in Internet Explorer 2.0. End-user adoption of the new browsers was rapid. In March 1996, one web hosting company reported that over 40% of browsers in use on the Internet were HTTP 1.1 compliant. That same web hosting company reported that by June 1996, 65% of all browsers accessing their servers were HTTP/1.1 compliant. The HTTP/1.1 standard as defined in RFC 2068 was officially released in January 1997. Improvements and updates to the HTTP/1.1 standard were released under RFC 2616 in June 1999. In 2007, the HTTPbis Working Group was formed, in part, to revise and clarify the HTTP/1.1 specification. In June 2014, the WG released an updated six-part specification obsoleting RFC 2616: RFC 7230, HTTP/1.1: Message Syntax and Routing; RFC 7231, HTTP/1.1: Semantics and Content; RFC 7232, HTTP/1.1: Conditional Requests; RFC 7233, HTTP/1.1: Range Requests; RFC 7234, HTTP/1.1: Caching; and RFC 7235, HTTP/1.1: Authentication.