File sharing is the practice of distributing or providing access to digital media, such as computer programs, documents, or electronic books. File sharing may be achieved in a number of ways. Common methods of storage and dispersion include manual sharing using removable media, centralized servers on computer networks, World Wide Web-based hyperlinked documents, and distributed peer-to-peer networking. Peer-to-peer file sharing is based on the peer-to-peer application architecture, in which shared files on the computers of other users are indexed on directory servers. P2P technology was used by popular services like LimeWire; the most popular protocol for P2P sharing is BitTorrent. Cloud-based file syncing and sharing services implement automated file transfers by updating files from a dedicated sharing directory on each user's networked devices. Files placed in this folder are accessible through a website and mobile app, and can be shared with other users for viewing or collaboration; such services have become popular via consumer-oriented file hosting services such as Dropbox and Google Drive.
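As a rough illustration of the sharing-directory model described above, here is a minimal Python sketch. The directory path, polling interval, and the upload() stub are all hypothetical placeholders; real services react to filesystem change notifications and transfer only file deltas rather than polling and re-uploading whole files.

```python
import os
import time

SYNC_DIR = "/home/user/Sync"   # hypothetical sharing directory
POLL_SECONDS = 5

def upload(path):
    """Placeholder for the provider-specific transfer step."""
    print(f"uploading {path} ...")

def watch(sync_dir):
    """Poll the sharing directory and upload new or modified files."""
    seen = {}  # path -> last observed modification time
    while True:
        for name in os.listdir(sync_dir):
            path = os.path.join(sync_dir, name)
            if not os.path.isfile(path):
                continue
            mtime = os.path.getmtime(path)
            if seen.get(path) != mtime:  # new file, or contents changed
                upload(path)
                seen[path] = mtime
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    watch(SYNC_DIR)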
Rsync is a more traditional program, released in 1996, which synchronizes files on a direct machine-to-machine basis. Data synchronization in general can use other approaches to share files, such as distributed filesystems, version control, or mirrors. Files were first exchanged on removable media. Computers were able to access remote files using filesystem mounting, bulletin board systems, and FTP servers. Internet Relay Chat and Hotline enabled users to communicate remotely through chat and to exchange files; the MP3 encoding, standardized in 1991, reduced the size of audio files and grew to widespread use in the late 1990s. In 1998, MP3.com and Audiogalaxy were established, the Digital Millennium Copyright Act was unanimously passed, and the first MP3 player devices were launched. In June 1999, Napster was released as an unstructured centralized peer-to-peer system, requiring a central server for indexing and peer discovery; it is credited as being the first peer-to-peer file sharing system. Gnutella, eDonkey2000, and Freenet were released in 2000, as MP3.com and Napster were facing litigation.
Gnutella, released in March, was the first decentralized file sharing network. In the gnutella network, all connecting software was considered equal, and therefore the network had no central point of failure. In July, Freenet became the first anonymity network. In September, the eDonkey2000 client and server software was released. In 2001, Kazaa and Poisoned for the Mac were released. Kazaa's FastTrack network was distributed, though unlike gnutella, it assigned more traffic to 'supernodes' to increase routing efficiency. The network was proprietary and encrypted, and the Kazaa team made substantial efforts to keep other clients, such as Morpheus, off of the FastTrack network. In July 2001, Napster was sued by several recording companies and lost in A&M Records, Inc. v. Napster, Inc. In the case of Napster, it was ruled that an online service provider could not use the "transitory network transmission" safe harbor in the DMCA if it had control of the network with a server. Shortly after its loss in court, Napster was shut down to comply with a court order.
This drove users to other P2P applications, and file sharing continued its growth. The Audiogalaxy Satellite client grew in popularity, and the LimeWire client and BitTorrent protocol were released. Until its decline in 2004, Kazaa was the most popular file sharing program despite bundled malware and legal battles in the Netherlands and the United States. In 2002, a Tokyo district court ruling shut down File Rogue, and the Recording Industry Association of America filed a lawsuit that shut down Audiogalaxy. From 2002 through 2003, a number of BitTorrent services were established, including Suprnova.org, isoHunt, TorrentSpy, and The Pirate Bay. In 2002, the RIAA was filing lawsuits against Kazaa users; as a result of such lawsuits, many universities added file sharing regulations to their school administrative codes. With the shutdown of eDonkey in 2005, eMule became the dominant client of the eDonkey network. In 2006, police raids took down the Razorback2 eDonkey server and temporarily took down The Pirate Bay. "The File Sharing Act was launched by Chairman Towns in 2009; this act prohibited the use of applications that allowed individuals to share federal information amongst one another.
On the other hand, only specific file sharing applications were made available to federal computers." In 2009, the Pirate Bay trial ended in a guilty verdict for the primary founders of the tracker; the decision was appealed, leading to a second guilty verdict in November 2010. In October 2010, LimeWire was forced to shut down following a court order in Arista Records LLC v. Lime Group LLC, but the gnutella network remains active through open source clients like FrostWire and gtk-gnutella. Furthermore, multi-protocol file sharing software such as MLDonkey and Shareaza adapted in order to support all the major file sharing protocols, so users no longer had to install and configure multiple file sharing programs. On January 19, 2012, the United States Department of Justice shut down the popular domain of Megaupload; the file sharing site had claimed over 50,000,000 visitors a day. Kim Dotcom was arrested with three associates in New Zealand on January 20, 2012 and is awaiting extradition. The case involving the downfall of the world's largest and most popular file sharing site was not well received, with hacktivists retaliating by attacking government and industry websites.
A blog is a discussion or informational website published on the World Wide Web consisting of discrete, often informal diary-style text entries. Posts are typically displayed in reverse chronological order, so that the most recent post appears first, at the top of the web page. Until 2009, blogs were usually the work of a single individual or of a small group, and often covered a single subject or topic. In the 2010s, "multi-author blogs" (MABs) emerged, featuring the writing of multiple authors and sometimes professional editing. MABs from newspapers, other media outlets, think tanks, advocacy groups, and similar institutions account for an increasing quantity of blog traffic. The rise of Twitter and other "microblogging" systems helps integrate MABs and single-author blogs into the news media. Blog can also be used as a verb, meaning to maintain or add content to a blog. The emergence and growth of blogs in the late 1990s coincided with the advent of web publishing tools that facilitated the posting of content by non-technical users who did not have much experience with HTML or computer programming.
A knowledge of such technologies as HTML and File Transfer Protocol had been required to publish content on the Web, so early Web users tended to be hackers and computer enthusiasts. In the 2010s, the majority of blogs were interactive Web 2.0 websites, allowing visitors to leave online comments; it is this interactivity that distinguishes them from other, static websites. In that sense, blogging can be seen as a form of social networking service. Indeed, bloggers not only produce content to post on their blogs, but often build social relations with their readers and other bloggers. However, there are high-readership blogs which do not allow comments. Many blogs provide commentary on a particular subject or topic, ranging from politics to sports. Others function as more personal online diaries, and others function more as online brand advertising for a particular individual or company. A typical blog combines text, digital images, and links to other blogs, web pages, and other media related to its topic. The ability of readers to leave publicly viewable comments and interact with other commenters is an important contribution to the popularity of many blogs.
However, blog owners or authors often moderate and filter online comments to remove hate speech or other offensive content. Most blogs are primarily textual, although some focus on art, videos, or audio. In education, blogs can be used as instructional resources; these blogs are referred to as edublogs. Microblogging is another type of blogging, featuring very short posts. As of 16 February 2011, there were over 156 million public blogs in existence. As of 20 February 2014, there were around 172 million Tumblr and 75.8 million WordPress blogs in existence worldwide. According to critics and other bloggers, Blogger is the most popular blogging service used today; however, Blogger does not offer public statistics. Technorati listed 1.3 million blogs as of February 22, 2014. The term "weblog" was coined by Jorn Barger on 17 December 1997; the short form, "blog", was coined by Peter Merholz, who jokingly broke the word weblog into the phrase we blog in the sidebar of his blog Peterme.com in April or May 1999. Shortly thereafter, Evan Williams at Pyra Labs used "blog" as both a noun and a verb and devised the term "blogger" in connection with Pyra Labs' Blogger product, leading to the popularization of the terms.
Before blogging became popular, digital communities took many forms, including Usenet, commercial online services such as GEnie, Byte Information Exchange (BIX) and the early CompuServe, e-mail lists, and Bulletin Board Systems. In the 1990s, Internet forum software created running conversations with "threads". Threads are topical connections between messages on a virtual "corkboard". From 14 June 1993, Mosaic Communications Corporation maintained their "What's New" list of new websites, updated daily and archived monthly; the page was accessible by a special "What's New" button in the Mosaic web browser. The earliest instance of a commercial blog was on the first business-to-consumer Web site, created in 1995 by Ty, Inc., which featured a blog in a section called "Online Diary". The entries were maintained by featured Beanie Babies that were voted for monthly by Web site visitors. The modern blog evolved from the online diary, where people would keep a running account of the events in their personal lives. Most such writers called themselves diarists, journalists, or journalers. Justin Hall, who began personal blogging in 1994 while a student at Swarthmore College, is recognized as one of the earlier bloggers, as is Jerry Pournelle.
Dave Winer's Scripting News is credited with being one of the older and longer-running weblogs. The Australian Netguide magazine maintained the Daily Net News on their web site from 1996. Daily Net News ran links and daily reviews of new websites in Australia. Another early blog was Wearable Wireless Webcam, an online shared diary of a person's personal life combining text, digital video, and digital pictures transmitted live from a wearable computer and EyeTap device to a web site in 1994. This practice of semi-automated blogging with live video together with text was referred to as sousveillance, and such journals were used as evidence in legal matters. Some early bloggers, such as The Misanthropic Bitch, who began in 1997, referred to their online presence as a zine before the term blog entered common usage. Early blogs were manually updated components of common Websites. In 1995, the "Online Diary" on the Ty, Inc. Web site was produced and updated manually before any blogging programs were available.
A podcast, or generically a netcast, is an episodic series of digital audio or video files which a user can download in order to listen to. It is often available for subscription, so that new episodes are automatically downloaded via web syndication to the user's own local computer, mobile application, or portable media player. The word was suggested by Ben Hammersley as a portmanteau of "iPod" and "broadcast". The files distributed are usually in audio format, but may sometimes include other file formats such as PDF or EPUB. Videos which are shared following a podcast model are sometimes called video podcasts or vodcasts. The generator of a podcast maintains a central list of the files on a server as a web feed that can be accessed through the Internet. The listener or viewer uses special client application software on a computer or media player, known as a podcatcher, which accesses this web feed, checks it for updates, and downloads any new files in the series. This process can be automated so that new files are downloaded automatically, which may seem to users as though new episodes are broadcast or "pushed" to them.
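A minimal podcatcher along the lines just described might look like the following Python sketch, built only on the standard library. The feed URL and download directory are hypothetical, and a real client would also track per-episode metadata, handle redirects, and schedule periodic checks.

```python
import os
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/podcast/feed.xml"  # hypothetical feed
DOWNLOAD_DIR = "episodes"

def fetch_new_episodes(feed_url, download_dir):
    """Read an RSS feed and download any enclosure not already on disk."""
    os.makedirs(download_dir, exist_ok=True)
    with urllib.request.urlopen(feed_url) as response:
        root = ET.parse(response).getroot()
    # In RSS 2.0, each <item> carries its media file in <enclosure url="...">.
    for enclosure in root.iter("enclosure"):
        url = enclosure.get("url")
        if not url:
            continue
        target = os.path.join(download_dir, url.rsplit("/", 1)[-1])
        if os.path.exists(target):
            continue  # only new episodes are fetched
        urllib.request.urlretrieve(url, target)
        print("downloaded", target)

if __name__ == "__main__":
    fetch_new_episodes(FEED_URL, DOWNLOAD_DIR)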
Files are stored locally on the user's device, ready for offline use. There are many different mobile applications available for subscribing to and listening to podcasts. Many of these applications allow users to download podcasts or to stream them on demand as an alternative to downloading. Many podcast players allow listeners to control the playback speed. Some have labeled podcasting as a converged medium bringing together audio, the web, and portable media players, as well as a disruptive technology that has caused some individuals in the radio business to reconsider established practices and preconceptions about audiences, consumption, and distribution. Podcasts are usually free of charge to listeners and can be created for little to no cost, which sets them apart from the traditional model of "gate-kept" media and production tools. Podcast creators can monetize their podcasts by allowing companies to purchase ad time, as well as via sites such as Patreon, which provides special extras and content to listeners for a fee.
Podcasting is very much a horizontal media form: producers are consumers, consumers may become producers, and both can engage in conversations with each other. "Podcast" is a portmanteau word, formed by combining "iPod" and "broadcast". The term "podcasting" as a name for the nascent technology was first suggested by The Guardian columnist and BBC journalist Ben Hammersley, who invented it in early February 2004 while "padding out" an article for The Guardian newspaper. Despite the etymology, the content can be accessed using any computer or similar device that can play media files. Use of the term "podcast" predated Apple's addition of formal support for podcasting to the iPod or its iTunes software. Other names for podcasting include "net cast", intended as a vendor-neutral term without the loose reference to the Apple iPod; this name is used by shows from the TWiT.tv network. Some sources have suggested the backronym "portable on demand", or "POD", for similar reasons. In 2004, former MTV video jockey Adam Curry, in collaboration with Dave Winer – co-author of the RSS specification – is credited with coming up with the idea to automate the delivery and syncing of textual content to portable audio players.
Podcasting, once an obscure method of spreading audio information, has become a recognized medium for distributing audio content, whether for corporate or personal use. Podcasts are similar to radio programs in form, but they exist as audio files that can be played at a listener's convenience, anytime and anywhere. The first application to make this process feasible was iPodderX, developed by August Trometer and Ray Slakinski. By 2007, audio podcasts were doing what had been accomplished via radio broadcasts, the source of radio talk shows and news programs since the 1930s. This shift occurred as a result of the evolution of internet capabilities along with increased consumer access to cheaper hardware and software for audio recording and editing. In October 2003, Matt Schichter launched The Backstage Pass. B. B. King, Third Eye Blind, Gavin DeGraw, The Beach Boys, and Jason Mraz were notable guests during the first season. The hour-long radio show was recorded live, then transcoded to 16 kbit/s audio for dial-up online streaming. Despite the lack of an accepted identifying name for the medium at the time of its creation, The Backstage Pass, which became known as Matt Schichter Interviews, is believed to be the first podcast to be published online.
In August 2004, Adam Curry launched his show Daily Source Code. It was a show focused on chronicling his everyday life, delivering news, and discussing the development of podcasting, as well as promoting new and emerging podcasts. Curry published it in an attempt to gain traction in the development of what would come to be known as podcasting and as a means of testing the software outside of a lab setting. The name Daily Source Code was chosen in the hope that it would attract an audience with an interest in technology. Daily Source Code started at a grassroots level of production and was directed at podcast developers. As its audience became interested in the format, these developers were inspired to create and produce their own projects and, as a result, they improved the code used to create podcasts. As more people learned how easy it was to produce podcasts, a community of pioneer podcasters appeared. In June 2005, Apple released iTunes 4.9, which added formal support for podcasts, thus negating the need to use a separate program in order to download and transfer them to a mobile device.
This made access to podcasts more convenient.
Internet Governance Forum
The Internet Governance Forum (IGF) is a multi-stakeholder forum for policy dialogue on issues of Internet governance. It brings together all stakeholders in the Internet governance debate, whether they represent governments, the private sector, or civil society, including the technical and academic community, on an equal basis and through an open and inclusive process. The establishment of the IGF was formally announced by the United Nations Secretary-General in July 2006, and it has held an annual meeting since then. The first phase of the World Summit on the Information Society (WSIS), held in Geneva in December 2003, failed to agree on the future of Internet governance, but did agree to continue the dialogue and requested the United Nations Secretary-General to establish a multi-stakeholder Working Group on Internet Governance (WGIG). Following a series of open consultations in 2004 and 2005, and after reaching a clear consensus among its members, the WGIG proposed the creation of the IGF as one of four proposals made in its final report.
Paragraph 40 of the WGIG report stated: "[T]he WGIG identified a vacuum within the context of existing structures, since there is no global multi-stakeholder forum to address Internet-related public policy issues. It came to the conclusion that there would be merit in creating such a space for dialogue among all stakeholders. This space could address these issues, as well as emerging issues, that are cross-cutting and multidimensional and that either affect more than one institution, are not dealt with by any institution or are not addressed in a coordinated manner". The WGIG report was one of the inputs to the second phase of the World Summit on the Information Society, held in Tunis in 2005. The idea of the Forum was proposed by Argentina, as stated in its proposal made during the last Prepcom 3 in Tunis: "In order to strengthen the global multistakeholder interaction and cooperation on public policy issues and developmental aspects relating to Internet governance we propose a forum. This forum should not replace existing mechanisms or institutions but should build on the existing structures on Internet governance, should contribute to the sustainability and robustness of the Internet by addressing appropriately public policy issues that are not otherwise being adequately addressed excluding any involvement in the day to day operation of the Internet.
It should be constituted as a neutral, non-duplicative and non-binding process to facilitate the exchange of information and best practices and to identify issues and make known its findings, to enhance awareness and build consensus and engagement. Recognizing the rapid development of technology and institutions, we propose that the forum mechanism periodically be reviewed to determine the need for its continuation." The second phase of WSIS, held in Tunis in November 2005, formally called for the creation of the IGF and set out its mandate. Paragraph 72 of the Tunis Agenda called on the UN Secretary-General to convene a meeting of the new multi-stakeholder forum, to be known as the IGF. The Tunis WSIS meeting did not reach agreement on any of the other WGIG proposals that focused on new oversight functions for the Internet, which would have reduced or eliminated the special role that the United States plays with respect to Internet governance through its contractual oversight of ICANN.
The US Government's position during the lead-up to the Tunis WSIS meeting was flexible on the principle of global involvement and strong on the principle of multi-stakeholder participation, but inflexible on the need for US control to remain for the foreseeable future in order to ensure the "security and stability of the Internet". The mandate for the IGF is contained in the 2005 WSIS Tunis Agenda. The IGF was mandated to be principally a discussion forum for facilitating dialogue between the Forum's participants. The IGF may "identify emerging issues, bring them to the attention of the relevant bodies and the general public, and, where appropriate, make recommendations," but does not have any direct decision-making authority. In this mandate, different stakeholders are encouraged to strengthen engagement, particularly those from developing countries. In paragraph 72, the mandate focused on capacity-building for developing countries and the drawing out of local resources. This particular effort, for instance, has been reinforced through the Diplo Foundation's Internet Governance Capacity Building Programme, which allowed participants from different regions to benefit from valuable resources with the help of regional experts in Internet governance.
The United Nations published its endorsement of a five-year mandate for the IGF in April 2006. There were two rounds of consultations regarding the convening of the first IGF. The first round was held in Geneva on 16-17 February 2006; transcripts of the two-day consultations are available on the IGF site. The second round, held on 19 May 2006, was open to all stakeholders and coordinated the preparations for the inaugural IGF meeting. The meeting chairman was Nitin Desai, the United Nations Secretary-General's Special Adviser for Internet Governance. The convening of the IGF was announced on 18 July 2006, with the inaugural meeting of the Forum to be held in Athens, Greece from 30 October to 2 November 2006. In the lead-up to the completion of the first five-year mandate of the IGF in 2010, the UN initiated a process of evaluating the continuation of the IGF, resulting in a United Nations General Assembly resolution to continue the IGF for a further five years.
In addition to the renewed mandate, another UN body, the Commission on Science and Technology for Development, established a Working Group on Improvements to the IGF.
The Internet is the global system of interconnected computer networks that use the Internet protocol suite to link devices worldwide. It is a network of networks that consists of private, academic, and government networks of local to global scope, linked by a broad array of electronic and optical networking technologies. The Internet carries a vast range of information resources and services, such as the inter-linked hypertext documents and applications of the World Wide Web, electronic mail, and file sharing. Some publications no longer capitalize "internet". The origins of the Internet date back to research commissioned by the federal government of the United States in the 1960s to build robust, fault-tolerant communication with computer networks. The primary precursor network, the ARPANET, served as a backbone for the interconnection of regional academic and military networks in the 1980s. The funding of the National Science Foundation Network as a new backbone in the 1980s, as well as private funding for other commercial extensions, led to worldwide participation in the development of new networking technologies and the merger of many networks.
The linking of commercial networks and enterprises by the early 1990s marked the beginning of the transition to the modern Internet and generated sustained exponential growth as generations of institutional and mobile computers were connected to the network. Although the Internet had been used by academia since the 1980s, commercialization incorporated its services and technologies into every aspect of modern life. Most traditional communication media, including telephony, television, paper mail, and newspapers, have been reshaped, redefined, or bypassed by the Internet, giving birth to new services such as email, Internet telephony, Internet television, online music, digital newspapers, and video streaming websites. Newspaper and other print publishing are adapting to website technology or are being reshaped into blogging, web feeds, and online news aggregators. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking. Online shopping has grown exponentially both for major retailers and for small businesses and entrepreneurs, as it enables firms to extend their "brick and mortar" presence to serve a larger market or to sell goods and services online.
Business-to-business and financial services on the Internet affect supply chains across entire industries. The Internet has no single centralized governance in either technological implementation or policies for access and usage. The overarching definitions of the two principal name spaces in the Internet, the Internet Protocol address space and the Domain Name System, are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. In November 2006, the Internet was included on USA Today's list of New Seven Wonders. When the term Internet is used to refer to the specific global system of interconnected Internet Protocol networks, the word is a proper noun that should be written with an initial capital letter.
In common use and in the media, however, it is often not capitalized, viz. the internet. Some guides specify that the word should be capitalized when used as a noun, but not capitalized when used as an adjective. The Internet is also often referred to as the Net, as a short form of network. As early as 1849, the word internetted was used uncapitalized as an adjective, meaning interconnected or interwoven. The designers of early computer networks used internet both as a noun and as a verb, in shorthand form of internetwork or internetworking, meaning interconnecting computer networks. The terms Internet and World Wide Web are often used interchangeably in everyday speech; however, the World Wide Web, or the Web, is only one of a large number of Internet services. The Web is a collection of interconnected documents and other web resources, linked by hyperlinks and URLs. As another point of comparison, Hypertext Transfer Protocol, or HTTP, is the language used on the Web for information transfer, yet it is just one of many languages or protocols that can be used for communication on the Internet.
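The layering just described can be made concrete with a short Python sketch using only the standard library: a DNS lookup and a raw TCP connection happen entirely below HTTP or any other Web protocol. The host example.com stands in for any reachable server.

```python
import socket

# The DNS maps a human-readable name onto the IP address space; the Web is
# just one application that runs on top of the resulting connections.
infos = socket.getaddrinfo("example.com", 80, proto=socket.IPPROTO_TCP)
for family, socktype, proto, canonname, sockaddr in infos:
    print("resolved address:", sockaddr[0])  # IPv4 or IPv6 address

# A raw TCP connection: established without speaking HTTP at all.
with socket.create_connection(("example.com", 80), timeout=5) as sock:
    print("connected to:", sock.getpeername())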
The term Interweb is a portmanteau of Internet and World Wide Web used sarcastically to parody a technically unsavvy user. Research into packet switching, one of the fundamental Internet technologies, started in the early 1960s in the work of Paul Baran and Donald Davies. Packet-switched networks such as the NPL network, the ARPANET, the Merit Network, CYCLADES, and Telenet were developed in the late 1960s and early 1970s. The ARPANET project led to the development of protocols for internetworking, by which multiple separate networks could be joined into a network of networks. ARPANET development began with two network nodes, which were interconnected between the Network Measurement Center at the University of California, Los Angeles Henry Samueli School of Engineering and Applied Science, directed by Leonard Kleinrock, and the NLS system at SRI International, directed by Douglas Engelbart in Menlo Park, California, on 29 October 1969. The third site was the Culler-Fried Interactive Mathematics Center at the University of California, Santa Barbara, followed by the University of Utah Graphics Department.
Internet service provider
An Internet service provider (ISP) is an organization that provides services for accessing, using, or participating in the Internet. Internet service providers may be organized in various forms, such as commercial, community-owned, non-profit, or otherwise privately owned. Internet services typically provided by ISPs include Internet access, Internet transit, domain name registration, web hosting, Usenet service, and colocation. The Internet was originally developed as a network between government research laboratories and participating departments of universities. Other companies and organizations joined by direct connection to the backbone, or by arrangements through other connected companies, sometimes using dialup tools such as UUCP. By the late 1980s, a process was set in place towards commercial use of the Internet; the remaining restrictions were removed by 1991, shortly after the introduction of the World Wide Web. During the 1980s, online service providers such as CompuServe and America Online began to offer limited capabilities to access the Internet, such as e-mail interchange, but full access to the Internet was not available to the general public.
In 1989, the first Internet service providers, companies offering the public direct access to the Internet for a monthly fee, were established in Australia and the United States. In Brookline, Massachusetts, The World became the first commercial ISP in the US; its first customer was served in November 1989. These companies offered dial-up connections, using the public telephone network to provide last-mile connections to their customers. The barriers to entry for dial-up ISPs were low, and many providers emerged. However, cable television companies and the telephone carriers already had wired connections to their customers and could offer Internet connections at much higher speeds than dial-up using broadband technology such as cable modems and digital subscriber line (DSL). As a result, these companies became the dominant ISPs in their service areas, and what was once a competitive ISP market became a monopoly or duopoly in countries with a commercial telecommunications market, such as the United States.
On 23 April 2014, the U.S. Federal Communications Commission (FCC) was reported to be considering a new rule that would permit ISPs to offer content providers a faster track to send content, thus reversing its earlier net neutrality position. A possible solution to net neutrality concerns may be municipal broadband, according to Professor Susan Crawford, a legal and technology expert at Harvard Law School. On 15 May 2014, the FCC decided to consider two options regarding Internet services: first, permit fast and slow broadband lanes, thereby compromising net neutrality; and second, reclassify broadband as a telecommunications service, thereby preserving net neutrality. On 10 November 2014, President Barack Obama recommended that the FCC reclassify broadband Internet service as a telecommunications service in order to preserve net neutrality. On 16 January 2015, Republicans presented legislation, in the form of a U.S. Congress H.R. discussion draft bill, that makes concessions to net neutrality but prohibits the FCC from accomplishing the goal or from enacting any further regulation affecting Internet service providers. On 31 January 2015, AP News reported that the FCC would present the notion of applying Title II of the Communications Act of 1934 to the Internet in a vote expected on 26 February 2015.
Adoption of this notion would reclassify Internet service from an information service to a telecommunications service and, according to Tom Wheeler, chairman of the FCC, ensure net neutrality. The FCC was expected to enforce net neutrality in its vote, according to The New York Times. On 26 February 2015, the FCC ruled in favor of net neutrality by applying Title II of the Communications Act of 1934 and Section 706 of the Telecommunications Act of 1996 to the Internet. The FCC Chairman, Tom Wheeler, commented, "This is no more a plan to regulate the Internet than the First Amendment is a plan to regulate free speech. They both stand for the same concept." On 12 March 2015, the FCC released the specific details of the net neutrality rules. On 13 April 2015, the FCC published the final rule on its new "Net Neutrality" regulations; these rules went into effect on 12 June 2015. Upon becoming FCC chairman in April 2017, Ajit Pai proposed an end to net neutrality, awaiting votes from the commission. On 21 November 2017, Pai announced that a vote would be held by FCC members on 14 December on whether to repeal the policy.
On 11 June 2018, the repeal of the FCC's network neutrality rules took effect. Access provider ISPs provide Internet access, employing a range of technologies to connect users to their network. Available technologies have ranged from computer modems with acoustic couplers to telephone lines, to television cable, Wi-Fi, and fiber optics. For users and small businesses, traditional options include copper wires providing dial-up, DSL (typically asymmetric digital subscriber line, ADSL), cable modem, or Integrated Services Digital Network (ISDN). Using fiber-optics to end users is called Fiber To The Home or similar names. Customers with more demanding requirements can use higher-speed DSL, metropolitan Ethernet, gigabit Ethernet, Frame Relay, ISDN Primary Rate Interface, ATM, and synchronous optical networking. Wireless access is another option, including satellite Internet access. A mailbox provider is an organization that provides services for hosting electronic mail domains with access to storage for mail boxes.
Hypertext Transfer Protocol
The Hypertext Transfer Protocol (HTTP) is an application protocol for distributed, collaborative, hypermedia information systems. HTTP is the foundation of data communication for the World Wide Web, where hypertext documents include hyperlinks to other resources that the user can access, for example by a mouse click or by tapping the screen in a web browser. HTTP was developed to facilitate the World Wide Web. Development of HTTP was initiated by Tim Berners-Lee at CERN in 1989. Development of HTTP standards was coordinated by the Internet Engineering Task Force and the World Wide Web Consortium, culminating in the publication of a series of Requests for Comments (RFCs). The first definition of HTTP/1.1, the version of HTTP in common use, occurred in RFC 2068 in 1997, although this was made obsolete by RFC 2616 in 1999 and again by the RFC 7230 family of RFCs in 2014. Its successor, HTTP/2, was standardized in 2015 and is now supported by major web servers and browsers over Transport Layer Security (TLS) using the Application-Layer Protocol Negotiation (ALPN) extension, where TLS 1.2 or newer is required.
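To make the ALPN mechanism concrete, the following Python sketch (standard library only) offers "h2" and "http/1.1" during a TLS handshake and prints whichever protocol the server selects; example.com stands in for any HTTPS host.

```python
import socket
import ssl

# Offer HTTP/2 ("h2") and HTTP/1.1 in preference order; the server picks
# one during the TLS handshake via the ALPN extension.
context = ssl.create_default_context()
context.set_alpn_protocols(["h2", "http/1.1"])

with socket.create_connection(("example.com", 443), timeout=5) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname="example.com") as tls:
        print("TLS version:", tls.version())
        print("negotiated protocol:", tls.selected_alpn_protocol())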
HTTP functions as a request–response protocol in the client–server computing model. A web browser, for example, may be the client, and an application running on a computer hosting a website may be the server. The client submits an HTTP request message to the server. The server, which provides resources such as HTML files and other content, or performs other functions on behalf of the client, returns a response message to the client. The response contains completion status information about the request and may contain requested content in its message body. A web browser is an example of a user agent. Other types of user agent include the indexing software used by search providers, voice browsers, mobile apps, and other software that accesses, consumes, or displays web content. HTTP is designed to permit intermediate network elements to improve or enable communications between clients and servers. High-traffic websites often benefit from web cache servers that deliver content on behalf of upstream servers to improve response time.
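The request–response exchange described above takes only a few lines with Python's standard http.client module; example.com is a stand-in host, not a prescribed endpoint.

```python
import http.client

# One request-response cycle: the client sends a GET request message and
# the server answers with a status line, headers, and a message body.
conn = http.client.HTTPSConnection("example.com")
conn.request("GET", "/")
response = conn.getresponse()
print(response.status, response.reason)  # completion status, e.g. 200 OK
body = response.read()                   # the requested content, if any
print(body[:80])
conn.close()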
Web browsers cache previously accessed web resources and reuse them, when possible, to reduce network traffic. HTTP proxy servers at private network boundaries can facilitate communication for clients without a globally routable address, by relaying messages with external servers. HTTP is an application layer protocol designed within the framework of the Internet protocol suite. Its definition presumes an underlying and reliable transport layer protocol, and Transmission Control Protocol (TCP) is commonly used. However, HTTP can be adapted to use unreliable protocols such as the User Datagram Protocol (UDP), for example in HTTPU and Simple Service Discovery Protocol. HTTP resources are identified and located on the network by Uniform Resource Locators (URLs), using the Uniform Resource Identifier (URI) schemes http and https. URIs and hyperlinks in HTML documents form interlinked hypertext documents. HTTP/1.1 is a revision of the original HTTP. In HTTP/1.0, a separate connection to the same server is made for every resource request. HTTP/1.1 can reuse a connection multiple times to download images, stylesheets, and other resources after the page has been delivered.
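A rough sketch of that connection reuse, again with http.client: one HTTPSConnection object carries several requests over a single TCP connection. The resource paths are hypothetical.

```python
import http.client

# HTTP/1.1 keep-alive: several requests share one TCP connection instead
# of paying a new handshake per resource, as HTTP/1.0 clients did.
conn = http.client.HTTPSConnection("example.com")
for path in ("/", "/styles.css", "/logo.png"):  # hypothetical resources
    conn.request("GET", path)
    response = conn.getresponse()
    response.read()  # drain the body before reusing the connection
    print(path, "->", response.status)
conn.close()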
HTTP/1.1 communications therefore experience less latency, as the establishment of TCP connections presents considerable overhead. The term hypertext was coined by Ted Nelson in 1965 in the Xanadu Project, which was in turn inspired by Vannevar Bush's 1930s vision of the microfilm-based information retrieval and management "memex" system, described in his 1945 essay "As We May Think". Tim Berners-Lee and his team at CERN are credited with inventing the original HTTP, along with HTML and the associated technology for a web server and a text-based web browser. Berners-Lee first proposed the "WorldWideWeb" project in 1989, now known as the World Wide Web. The first version of the protocol had only one method, namely GET, which would request a page from a server. The response from the server was always an HTML page. The first documented version of HTTP was HTTP V0.9. Dave Raggett led the HTTP Working Group in 1995 and wanted to expand the protocol with extended operations, extended negotiation, richer meta-information, and a tie-in with a security protocol, made more efficient by adding additional methods and header fields.
RFC 1945 officially introduced and recognized HTTP V1.0 in 1996. The HTTP WG planned to publish new standards in December 1995, and support for pre-standard HTTP/1.1, based on the developing RFC 2068, was adopted by the major browser developers in early 1996. By March of that year, pre-standard HTTP/1.1 was supported in Arena, Netscape 2.0, Netscape Navigator Gold 2.01, Mosaic 2.7, Lynx 2.5, and Internet Explorer 2.0. End-user adoption of the new browsers was rapid. In March 1996, one web hosting company reported that over 40% of browsers in use on the Internet were HTTP/1.1 compliant; the same company reported that by June 1996, 65% of all browsers accessing its servers were HTTP/1.1 compliant. The HTTP/1.1 standard as defined in RFC 2068 was released in January 1997. Improvements and updates to the HTTP/1.1 standard were released under RFC 2616 in June 1999. In 2007, the HTTPbis Working Group was formed, in part, to revise and clarify the HTTP/1.1 specification. In June 2014, the WG released an updated six-part specification obsoleting RFC 2616:
RFC 7230, HTTP/1.1: Message Syntax and Routing
RFC 7231, HTTP/1.1: Semantics and Content
RFC 7232, HTTP/1.1: Conditional Requests
RFC 7233, HTTP/1.1: Range Requests
RFC 7234, HTTP/1.1: Caching
RFC 7235, HTTP/1.1: Authentication