A website or Web site is a collection of related network web resources, such as web pages and multimedia content, that are identified by a common domain name and published on at least one web server. Notable examples are wikipedia.org, google.com and amazon.com. Websites can be accessed via a public Internet Protocol network, such as the Internet, or a private local area network, by means of a uniform resource locator (URL) that identifies the site. Websites can be used in various fashions and are typically dedicated to a particular topic or purpose, ranging from entertainment and social networking to providing news and education. All publicly accessible websites collectively constitute the World Wide Web, while private websites, such as a company's website for its employees, are part of an intranet. Web pages, the building blocks of websites, are documents composed in plain text interspersed with formatting instructions of Hypertext Markup Language (HTML); they may incorporate elements from other websites with suitable markup anchors.
Web pages are accessed and transported with the Hypertext Transfer Protocol (HTTP), which may optionally employ encryption to provide security and privacy for the user. The user's application, a web browser, renders the page content according to its HTML markup instructions onto a display terminal. Hyperlinking between web pages conveys to the reader the site structure and guides the navigation of the site, which often starts with a home page containing a directory of the site's web content. Some websites require user subscription to access content. Examples of subscription websites include many business sites, news websites, academic journal websites, gaming websites, file-sharing websites, message boards, web-based email, social networking websites, websites providing real-time stock market data, and sites providing various other services. End users can access websites on a range of devices, including desktop and laptop computers, tablet computers and smart TVs. The World Wide Web was created in 1990 by the British CERN physicist Tim Berners-Lee.
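As a rough illustration of this request-and-render cycle, the sketch below fetches a page over HTTP(S) with Python's standard library and prints the beginning of the HTML that a browser would render; the URL is only a placeholder for any publicly reachable page.

```python
# Minimal sketch: fetching a web page over HTTP(S), as a browser would
# before rendering the HTML markup it receives.
from urllib.request import urlopen

with urlopen("https://example.org/") as response:
    status = response.status                 # e.g. 200 for a successful request
    html = response.read().decode("utf-8")   # the raw HTML markup of the page

print(status)
print(html[:200])  # first 200 characters of the markup a browser would render
```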
On 30 April 1993, CERN announced that the World Wide Web would be free for anyone to use. Before the introduction of HTML and HTTP, other protocols such as the File Transfer Protocol and the Gopher protocol were used to retrieve individual files from a server; these protocols offer a simple directory structure which the user navigates and where they choose files to download. Documents were most often presented as plain text files without formatting, or were encoded in word processor formats. Websites can be used in various fashions. A website can be the work of an individual, a business or another organization, and is typically dedicated to a particular topic or purpose. Any website can contain a hyperlink to any other website, so the distinction between individual sites, as perceived by the user, can be blurred. Websites are written in, or converted to, HTML and are accessed using a software interface classified as a user agent. Web pages can be viewed or otherwise accessed from a range of computer-based and Internet-enabled devices of various sizes, including desktop computers, tablet computers and smartphones.
A website is hosted on a computer system known as a web server, also called an HTTP server. These terms can also refer to the software that runs on these systems and that retrieves and delivers the web pages in response to requests from the website's users. Apache is the most commonly used web server software, and Microsoft's IIS is also widely used; some alternatives, such as Nginx, Hiawatha or Cherokee, are fully functional and lightweight. A static website is one that has web pages stored on the server in the format in which they are sent to a client web browser; it is coded primarily in Hypertext Markup Language (HTML). Images are commonly used to achieve the desired appearance and as part of the main content. Audio or video might be considered "static" content if it plays automatically or is non-interactive. This type of website displays the same information to all visitors. Similar to handing out a printed brochure to customers or clients, a static website will provide consistent, standard information for an extended period of time. Although the website owner may make updates periodically, editing the text and other content is a manual process and may require basic website design skills and software.
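To make the "stored and sent as-is" idea concrete, here is a minimal sketch of a static file server using Python's built-in http.server module; it is only a stand-in for production server software such as Apache or Nginx, and the directory and port are arbitrary choices for the example.

```python
# Minimal static web server sketch: every file in the current directory
# (index.html, images, etc.) is delivered to browsers exactly as stored,
# which is the defining property of a static website.
from http.server import HTTPServer, SimpleHTTPRequestHandler

server = HTTPServer(("0.0.0.0", 8000), SimpleHTTPRequestHandler)  # port 8000 chosen arbitrarily
print("Serving static files on http://localhost:8000/")
server.serve_forever()
```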
Simple forms or marketing examples of websites, such as a classic website, a five-page website or a brochure website, are static websites, because they present pre-defined, static information to the user. This may include information about a company and its products and services through text, animations, audio/video and navigation menus. Static websites can be edited using several broad categories of software: text editors, such as Notepad or TextEdit, where content and HTML markup are manipulated directly within the editor program; WYSIWYG offline editors, such as Microsoft FrontPage and Adobe Dreamweaver, with which the site is edited using a GUI and the final HTML markup is generated automatically by the editor software; and WYSIWYG online editors, which create media-rich online presentations such as web pages, intros and blogs.
World Wide Web
The World Wide Web, commonly known as the Web, is an information space where documents and other web resources are identified by Uniform Resource Locators (URLs), may be interlinked by hypertext, and are accessible over the Internet. The resources of the WWW may be accessed by users through a software application called a web browser. English scientist Tim Berners-Lee invented the World Wide Web in 1989; he wrote the first web browser in 1990 while employed at CERN near Geneva, Switzerland. The browser was released outside CERN in 1991, first to other research institutions starting in January 1991 and to the general public in August 1991. The World Wide Web has been central to the development of the Information Age and is the primary tool billions of people use to interact on the Internet. Web resources may be any type of downloaded media, but web pages are hypertext media that have been formatted in Hypertext Markup Language; such formatting allows for embedded hyperlinks that contain URLs and permit users to navigate to other web resources.
In addition to text, web pages may contain images, video and software components that are rendered in the user's web browser as coherent pages of multimedia content. Multiple web resources with a common theme, a common domain name, or both, make up a website. Websites are stored in computers that are running a program called a web server that responds to requests made over the Internet from web browsers running on a user's computer. Website content can be provided by a publisher, or interactively where users contribute content or the content depends upon the users or their actions. Websites may be provided for a myriad of informative, commercial, governmental, or non-governmental reasons. Tim Berners-Lee's vision of a global hyperlinked information system became a possibility by the second half of the 1980s. By 1985, the global Internet began to proliferate in Europe and the Domain Name System came into being. In 1988 the first direct IP connection between Europe and North America was made and Berners-Lee began to discuss the possibility of a web-like system at CERN.
While working at CERN, Berners-Lee became frustrated with the inefficiencies and difficulties posed by finding information stored on different computers. On 12 March 1989, he submitted a memorandum, titled "Information Management: A Proposal", to the management at CERN for a system called "Mesh" that referenced ENQUIRE, a database and software project he had built in 1980, which used the term "web" and described a more elaborate information management system based on links embedded as text: "Imagine the references in this document all being associated with the network address of the thing to which they referred, so that while reading this document, you could skip to them with a click of the mouse." Such a system, he explained, could be referred to using one of the existing meanings of the word hypertext, a term that he says was coined in the 1950s. There is no reason, the proposal continues, why such hypertext links could not encompass multimedia documents including graphics and video, and so Berners-Lee goes on to use the term hypermedia.
With help from his colleague and fellow hypertext enthusiast Robert Cailliau, he published a more formal proposal on 12 November 1990 to build a "Hypertext project" called "WorldWideWeb" as a "web" of "hypertext documents" to be viewed by "browsers" using a client–server architecture. At this point HTML and HTTP had already been in development for about two months, and the first web server was about a month from completing its first successful test. This proposal estimated that a read-only web would be developed within three months and that it would take six months to achieve "the creation of new links and new material by readers, authorship becomes universal" as well as "the automatic notification of a reader when new material of interest to him/her has become available". While the read-only goal was met, accessible authorship of web content took longer to mature, with the wiki concept, WebDAV, Web 2.0 and RSS/Atom. The proposal was modelled after the SGML reader Dynatext by Electronic Book Technology, a spin-off from the Institute for Research in Information and Scholarship at Brown University.
The Dynatext system, licensed by CERN, was a key player in the extension of SGML ISO 8879:1986 to Hypermedia within HyTime, but it was considered too expensive and had an inappropriate licensing policy for use in the general high energy physics community, namely a fee for each document and each document alteration. A NeXT Computer was used by Berners-Lee as the world's first web server and to write the first web browser, WorldWideWeb, in 1990. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the first web browser and the first web server; the first web site, which described the project itself, was published on 20 December 1990. The first web page may be lost, but Paul Jones of UNC-Chapel Hill in North Carolina announced in May 2013 that Berners-Lee gave him what he says is the oldest known web page during a 1991 visit to UNC. Jones stored it on his NeXT computer. On 6 August 1991, Berners-Lee published a short summary of the World Wide Web project on the newsgroup alt.hypertext.
This date is sometimes confused with the public availability of the first web servers, which had occurred months earlier. As another example of such confusion, several news media reported that the first photo on the Web was published by Berners-Lee in 1992, an image of the CERN house band Les Horribles Cernettes taken by Silvano de Gennaro.
A CD-ROM is a pre-pressed optical compact disc that contains data. Computers can read—but not write to or erase—CD-ROMs, i.e. it is a type of read-only memory. During the 1990s, CD-ROMs were popularly used to distribute software and data for computers and fourth-generation video game consoles; some CDs, called enhanced CDs, hold both computer data and audio, with the latter capable of being played on a CD player, while the data is only usable on a computer. The CD-ROM format was developed by the Japanese company Denon in 1982; it was an extension of Compact Disc Digital Audio that adapted the format to hold any form of digital data, with a storage capacity of 553 MiB. CD-ROM was introduced by Denon and Sony at a Japanese computer show in 1984; the Yellow Book is the format's technical standard. One of a set of color-bound books that contain the technical specifications for all CD formats, the Yellow Book, standardized by Sony and Philips in 1983, specifies a format for discs with a maximum capacity of 650 MiB. CD-ROMs are identical in appearance to audio CDs, and data are stored and retrieved in a similar manner.
Discs are made from a 1.2 mm thick disc of polycarbonate plastic, with a thin layer of aluminium to make a reflective surface. The most common size of CD-ROM is 120 mm in diameter, though the smaller Mini CD standard with an 80 mm diameter, as well as shaped compact discs in numerous non-standard sizes and molds, are available. Data is stored on the disc as a series of microscopic indentations called pits, with the unindented areas between them called lands. A laser is shone onto the reflective surface of the disc to read the pattern of pits and lands; because the depth of the pits is one-quarter to one-sixth of the wavelength of the laser light used to read the disc, the reflected beam's phase is shifted in relation to the incoming beam, causing destructive interference and reducing the reflected beam's intensity. This pattern is converted into binary data. Several formats are used for data stored on compact discs, known as the Rainbow Books; the Yellow Book, published in 1988, defines the specifications for CD-ROMs, standardized in 1989 as the ISO/IEC 10149 / ECMA-130 standard.
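The quarter-to-sixth-wavelength figure can be made concrete with a small back-of-the-envelope calculation. The 780 nm laser wavelength and the roughly 1.55 refractive index of polycarbonate used below are standard CD figures assumed for illustration, not values taken from the text above.

```python
# Worked example of the pit-depth relationship described above. Inside the
# polycarbonate layer the laser's effective wavelength is shorter than in
# air, and a pit depth of roughly a quarter to a sixth of that wavelength
# shifts the reflected beam's phase enough to cause destructive interference.
laser_wavelength_nm = 780      # typical CD laser wavelength (assumed)
refractive_index = 1.55        # approximate index of polycarbonate (assumed)

wavelength_in_plastic = laser_wavelength_nm / refractive_index      # ~503 nm
pit_depth_range_nm = (wavelength_in_plastic / 6, wavelength_in_plastic / 4)

print(round(wavelength_in_plastic), [round(d) for d in pit_depth_range_nm])  # ~503 nm; ~84-126 nm
```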
The CD-ROM standard builds on top of the original Red Book CD-DA standard for CD audio. Other standards, such as the White Book for Video CDs, further define formats based on the CD-ROM specifications. The Yellow Book itself is not freely available, but standards with the corresponding content can be downloaded for free from ISO or ECMA. There are several standards that define how to structure data files on a CD-ROM. ISO 9660 defines the standard file system for a CD-ROM. ISO 13490 is an improvement on this standard which adds support for non-sequential write-once and re-writeable discs such as CD-R and CD-RW, as well as multiple sessions; the ISO 13346 standard was designed to address most of the shortcomings of ISO 9660, and a subset of it evolved into the UDF format, which was adopted for DVDs. The bootable CD specification, called El Torito, was issued in January 1995 to make a CD emulate a hard disk or floppy disk. Data stored on CD-ROMs follows the standard CD data encoding techniques described in the Red Book specification.
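As a concrete illustration of the ISO 9660 layout mentioned above, the sketch below reads the first volume descriptor of a disc image: descriptors start at logical sector 16 (with 2,048-byte logical sectors) and carry the standard identifier "CD001". The file name is a placeholder for any CD-ROM .iso image.

```python
# Minimal sketch: peeking at the ISO 9660 structures of a disc image.
ISO_SECTOR = 2048  # ISO 9660 logical sector size in bytes

with open("disc_image.iso", "rb") as f:      # placeholder file name
    f.seek(16 * ISO_SECTOR)                  # volume descriptors begin at sector 16
    descriptor = f.read(ISO_SECTOR)

descriptor_type = descriptor[0]              # 1 = Primary Volume Descriptor
standard_id = descriptor[1:6]                # b"CD001" for ISO 9660
volume_id = descriptor[40:72].decode("ascii", "replace").strip()

print(descriptor_type, standard_id, volume_id)
```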
This includes cross-interleaved Reed–Solomon coding, eight-to-fourteen modulation, and the use of pits and lands for coding the bits into the physical surface of the CD. The structures used to group data on a CD-ROM are also derived from the Red Book. Like audio CDs, a CD-ROM sector contains 2,352 bytes of user data, composed of 98 frames, each consisting of 33 bytes. Unlike audio CDs, the data stored in these sectors can correspond to any type of digital data, not audio samples encoded according to the audio CD specification. To structure and protect this data, the CD-ROM standard further defines two sector modes, Mode 1 and Mode 2, which describe two different layouts for the data inside a sector. A track inside a CD-ROM only contains sectors in the same mode, but if multiple tracks are present in a CD-ROM, each track can have its sectors in a different mode from the rest of the tracks; they can coexist with audio CD tracks as well, as in the case of mixed mode CDs. Both Mode 1 and Mode 2 sectors use the first 16 bytes for header information, but differ in the remaining 2,336 bytes due to the use of error correction bytes.
Unlike an audio CD, a CD-ROM cannot rely on error concealment by interpolation. To achieve improved error correction and detection, Mode 1, used for digital data, adds a 32-bit cyclic redundancy check code for error detection and a third layer of Reed–Solomon error correction using a Reed–Solomon Product-like Code. Mode 1 therefore contains 288 bytes per sector for error detection and correction, leaving 2,048 bytes per sector available for data. Mode 2, which is more appropriate for image or video data, contains no additional error detection or correction bytes, and therefore has 2,336 available data bytes per sector. Note that both modes, like audio CDs, still benefit from the lower layers of error correction at the frame level. Before being stored on a disc with the techniques described above, each CD-ROM sector is scrambled to prevent some problematic patterns from showing up; these scrambled sectors then follow the same encoding process described in the Red Book in order to be stored on the disc.
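The byte budgets just described can be checked with a little arithmetic. The sketch below reproduces the Mode 1 and Mode 2 figures from the text and, assuming the Red Book rate of 75 sectors per second, arrives at roughly the 650 MiB capacity quoted earlier for a 74-minute disc.

```python
# Worked example of the sector byte budgets described above.
SECTOR = 2352                      # raw bytes per CD sector
SYNC_AND_HEADER = 16               # first 16 bytes of every Mode 1 / Mode 2 sector

mode1_user_data = 2048
mode1_edc_ecc = SECTOR - SYNC_AND_HEADER - mode1_user_data    # 288 bytes of EDC/ECC
mode2_user_data = SECTOR - SYNC_AND_HEADER                    # 2,336 data bytes

# Usable Mode 1 capacity of a 74-minute disc at 75 sectors per second:
sectors = 74 * 60 * 75                                        # 333,000 sectors
capacity_mib = sectors * mode1_user_data / 2**20              # ~650 MiB

print(mode1_edc_ecc, mode2_user_data, round(capacity_mib, 1))
```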
Samizdat was a form of dissident activity across the Eastern Bloc in which individuals reproduced censored and underground publications by hand and passed the documents from reader to reader. This grassroots practice to evade official Soviet censorship was fraught with danger, as harsh punishments were meted out to people caught possessing or copying censored materials. Vladimir Bukovsky summarized it as follows: "Samizdat: I write it myself, edit it myself, censor it myself, publish it myself, distribute it myself, spend jail time for it myself." Etymologically, the word samizdat derives from sam ("self") and izdat (short for izdatel'stvo, "publishing house"), and thus means "self-published". The Ukrainian language has a similar term, samvydav, from sam ("self") and vydavnytstvo ("publishing house"). The Russian poet Nikolai Glazkov coined a version of the term as a pun in the 1940s when he typed copies of his poems and included the note Samsebyaizdat on the front page. Tamizdat refers to literature published abroad from smuggled manuscripts. Samizdat copies of texts, such as Mikhail Bulgakov's novel The Master and Margarita or Václav Havel's essay The Power of the Powerless, were passed around among trusted friends.
The techniques used to reproduce these forbidden texts varied. Several copies might be made using carbon paper, either by hand or on a typewriter. Before glasnost, the practice was dangerous, because copy machines, printing presses and even typewriters in offices were under the control of the organisation's First Department, i.e. the KGB: reference printouts from all of these machines were stored for subsequent identification purposes, should samizdat output be found. Samizdat distinguishes itself not only by the ideas and debates that it helped spread to a wider audience but also by its physical form; the hand-typed, blurry and wrinkled pages with numerous typographical errors and nondescript covers helped to separate and elevate Russian samizdat from Western literature. The physical form of samizdat arose from a simple lack of resources and the necessity to be inconspicuous. In time, dissidents in the USSR began to admire these qualities for their own sake, the ragged appearance of samizdat contrasting with the smooth, well-produced appearance of texts passed by the censor's office for publication by the State.
The form samizdat took gained precedence over the ideas it expressed and became a potent symbol of the resourcefulness and rebellious spirit of the inhabitants of the Soviet Union. In effect, the physical form of samizdat itself elevated the reading of samizdat to a prized clandestine act. Samizdat originated in the dissident movement of the Russian intelligentsia, and most samizdat was directed at a readership of Russian elites. While circulation of samizdat was low, at around 200,000 readers on average, many of these readers held positions of cultural power and authority. Furthermore, because of the presence of "dual consciousness" in the Soviet Union, the simultaneous censorship of information and the necessity of absorbing information in order to know how to censor it, many government officials became readers of samizdat. Although the general public at times came into contact with samizdat, most of the public lacked access to the few expensive samizdat texts in circulation and expressed discontent with the censored reading material made available by the state.
The purpose and methods of samizdat may be contrasted with the concept of copyright. Self-published and self-distributed literature has a long history in Russia, but samizdat is a phenomenon unique to the Soviet Union and other countries with similar systems. Faced with the police state's powers of censorship, society turned to underground literature for self-analysis and self-expression. Certain works that had been published by the State-controlled media but were impossible to find in bookshops and libraries also found their way into samizdat; the first full-length book to be distributed as samizdat was Boris Pasternak's 1957 novel Doctor Zhivago. Although the literary magazine Novy Mir had published ten poems from the book in 1954, the full text was later judged unsuitable for publication and entered samizdat circulation; the novel One Day in the Life of Ivan Denisovich by Aleksandr Solzhenitsyn had a similar fate and was distributed via samizdat. At the outset of the Khrushchev Thaw in the mid-1950s USSR, poetry became popular and the writings of a wide variety of known, repressed, as well as young and unknown poets circulated among the Soviet intelligentsia.
A number of samizdat publications began circulating that carried unofficial poetry: the Moscow samizdat magazine Sintaksis by writer Alexander Ginzburg, Vladimir Osipov's Boomerang, and Phoenix, produced by Yuri Galanskov and Alexander Ginzburg. The editors of these magazines were regulars at the impromptu public poetry readings held between 1958 and 1961 on Mayakovsky Square in Moscow; the gatherings did not last long, for soon the authorities began clamping down on them. In the summer of 1961, several meeting regulars were arrested and charged with "anti-Soviet agitation and propaganda", putting an end to most of the magazines. Not everything published in samizdat had political overtones. In 1963, Joseph Brodsky was charged with "social parasitism" and convicted for being nothing but a poet; his poems circulated in samizdat.
Antivirus software, or anti-virus software, also known as anti-malware, is a computer program used to prevent, detect and remove malware. Antivirus software was originally developed to detect and remove computer viruses, hence the name. However, with the proliferation of other kinds of malware, antivirus software started to provide protection from other computer threats as well. In particular, modern antivirus software can protect users from malicious browser helper objects, browser hijackers, keyloggers, rootkits, trojan horses, malicious LSPs, fraud tools and spyware. Some products also include protection from other computer threats, such as infected and malicious URLs, spam and phishing attacks, threats to online identity and privacy, online banking attacks, social engineering techniques, advanced persistent threats and botnet DDoS attacks. Although the roots of the computer virus date back as early as 1949, when the Hungarian scientist John von Neumann published the "Theory of self-reproducing automata", the first known computer virus appeared in 1971 and was dubbed the "Creeper virus".
This computer virus infected Digital Equipment Corporation's PDP-10 mainframe computers running the TENEX operating system. The Creeper virus was eventually deleted by a program created by Ray Tomlinson and known as "The Reaper"; some people consider "The Reaper" the first antivirus software ever written – that may be the case, but it is important to note that the Reaper was itself a virus specifically designed to remove the Creeper virus. The Creeper virus was followed by several other viruses; the first known virus that appeared "in the wild" was "Elk Cloner", in 1981, which infected Apple II computers. In 1983, the term "computer virus" was coined by Fred Cohen in one of the first published academic papers on computer viruses. Cohen used the term "computer virus" to describe a program that can "affect other computer programs by modifying them in such a way as to include a copy of itself." The first IBM PC compatible "in the wild" computer virus, and one of the first real widespread infections, was "Brain" in 1986. Since then, the number of viruses has grown exponentially.
Most of the computer viruses written in the early and mid-1980s were limited to self-reproduction and had no specific damage routine built into the code. That changed when more and more programmers became acquainted with computer virus programming and created viruses that manipulated or destroyed data on infected computers. Before internet connectivity was widespread, computer viruses were spread by infected floppy disks. Antivirus software came into use, but was updated infrequently. During this time, virus checkers had to check executable files and the boot sectors of floppy disks and hard disks. However, as internet usage became common, viruses began to spread online. There are competing claims for the innovator of the first antivirus product; the first publicly documented removal of an "in the wild" computer virus was performed by Bernd Fix in 1987. In 1987, Andreas Lüning and Kai Figge, who founded G Data Software in 1985, released their first antivirus product for the Atari ST platform. In 1987, the Ultimate Virus Killer was released.
This was the de facto industry standard virus killer for the Atari ST and Atari Falcon, the last version of which was released in April 2004. In 1987, in the United States, John McAfee founded the McAfee company and, at the end of that year, he released the first version of VirusScan. Also in 1987, Peter Paško, Rudolf Hrubý and Miroslav Trnka created the first version of NOD antivirus. In 1987, Fred Cohen wrote that there is no algorithm that can detect all possible computer viruses. At the end of 1987, the first two heuristic antivirus utilities were released: Flushot Plus by Ross Greenberg and Anti4us by Erwin Lanting. In his O'Reilly book Malicious Mobile Code: Virus Protection for Windows, Roger Grimes described Flushot Plus as "the first holistic program to fight malicious mobile code." However, the kinds of heuristics used by early AV engines were different from those used today. The first product with a heuristic engine resembling modern ones was F-PROT in 1991. Early heuristic engines were based on dividing the binary into its different sections: data section, code section, and so on.
Indeed, the initial viruses re-organized the layout of the sections, or overrode the initial portion of a section in order to jump to the end of the file where the malicious code was located, only then going back to resume execution of the original code. This was a very specific pattern, not used at the time by any legitimate software, which represented an elegant heuristic to catch suspicious code. Other kinds of more advanced heuristics were later added, such as suspicious section names, incorrect header size, regular expressions and partial in-memory pattern matching. In 1988, the growth of antivirus companies continued. In Germany, Tjark Auerbach released the first version of AntiVir. In Bulgaria, Dr. Vesselin Bontchev released his first freeware antivirus program. Frans Veldman released the first version of ThunderByte Antivirus, also known as TBAV. In Czechoslovakia, Pavel Baudiš and Eduard Kučera started avast! (at the time, ALWIL Software).
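To illustrate the entry-point heuristic described above, here is a self-contained toy check, not the code of any real scanner: it flags an executable whose entry point lies in the last section of the file, the layout that appended viruses of that era typically produced. The section tables are invented for the example; a real engine would parse them from the executable's headers.

```python
# Toy illustration of the entry-point heuristic: legitimate programs of the
# era started execution in their code section, while appended viruses
# redirected the entry point into extra code tacked onto the end of the file.

def entry_point_looks_suspicious(sections, entry_point):
    """sections: list of (name, start_offset, size), sorted by start_offset."""
    _, last_start, last_size = sections[-1]
    return last_start <= entry_point < last_start + last_size

clean = [(".text", 0x1000, 0x8000), (".data", 0x9000, 0x2000)]
infected = [(".text", 0x1000, 0x8000), (".data", 0x9000, 0x2000),
            (".evil", 0xB000, 0x400)]          # invented appended section

print(entry_point_looks_suspicious(clean, 0x1200))      # False: entry in the code section
print(entry_point_looks_suspicious(infected, 0xB010))   # True: entry in the appended section
```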
Web hosting service
A web hosting service is a type of Internet hosting service that allows individuals and organizations to make their website accessible via the World Wide Web. Web hosts are companies that provide space on a server owned or leased for use by clients, as well as providing Internet connectivity, typically in a data center. Web hosts can also provide data center space and connectivity to the Internet for other servers located in their data center, a service called colocation, also known as Housing in Latin America or France. Until 1991, the Internet was restricted to use only "...for research and education in the sciences and engineering..." and was used for email, telnet, FTP and USENET traffic, but only a tiny number of web pages. The World Wide Web protocols had only just been written, and not until the end of 1993 would there be a graphical web browser for Mac or Windows computers. Even after there was some opening up of internet access, the situation remained confused until 1995. To host a website on the internet, an individual or company would need their own server.
As not all companies had the budget or expertise to do this, web hosting services began to offer to host users' websites on their own servers, without the client needing to own the necessary infrastructure required to operate the website. The owners of the websites, called webmasters, would be able to create a website that would be hosted on the web hosting service's server and published to the web by the web hosting service. As the number of users on the World Wide Web grew, the pressure for companies, both large and small, to have an online presence grew. By 1995, companies such as GeoCities and Tripod were offering free hosting. The most basic service is web page and small-scale file hosting, where files can be uploaded via File Transfer Protocol (FTP) or a web interface. The files are delivered to the Web "as is" or with minimal processing. Many Internet service providers offer this service free to subscribers. Individuals and organizations may also obtain web page hosting from alternative service providers.
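A sketch of that basic upload workflow, using Python's standard ftplib: the host name, credentials and file name below are placeholders, and many present-day hosts require FTPS or SFTP rather than plain FTP.

```python
# Minimal sketch: uploading a single page to a web host over FTP.
from ftplib import FTP

with FTP("ftp.example-host.com") as ftp:          # placeholder host name
    ftp.login(user="username", passwd="password")  # placeholder credentials
    with open("index.html", "rb") as page:
        ftp.storbinary("STOR index.html", page)    # copy the local file to the server
```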
Free web hosting is offered by various companies; the services are limited when compared to paid hosting and are sometimes supported by advertisements. Single page hosting is generally sufficient for personal web pages. Personal web site hosting is typically free, advertisement-sponsored, or inexpensive. Business web site hosting often has a higher expense depending upon the size and type of the site. Many large companies that are not Internet service providers need to be permanently connected to the web to send email, files and other data to other sites. The company may use the computer as a website host to provide details of their goods and services and facilities for online orders. A complex site calls for a more comprehensive package that provides database support and application development platforms; these facilities allow customers to write or install scripts for applications like forums and content management. Secure Sockets Layer (SSL) is used for websites that wish to keep the data they transmit more secure. Internet hosting services can also run web servers.
The scope of web hosting services varies greatly. With shared web hosting, one's website is placed on the same server as many other sites, ranging from a few sites to hundreds of websites. All domains may share a common pool of server resources, such as RAM and the CPU; the features available with this type of service can be quite basic and are not flexible in terms of software and updates. Resellers often sell shared web hosting, and web companies often have reseller accounts to provide hosting for clients. Reseller web hosting allows clients to become web hosts themselves. Resellers could function, for individual domains, under any combination of these listed types of hosting, depending on who they are affiliated with as a reseller. Resellers' accounts may vary tremendously in size: they may have anything from their own virtual dedicated server to a colocated server. Many resellers provide a nearly identical service to their provider's shared hosting plan and provide the technical support themselves. A virtual dedicated server, also known as a Virtual Private Server (VPS), divides server resources into virtual servers, where resources can be allocated in a way that does not directly reflect the underlying hardware.
A VPS will be allocated resources based on a one-server-to-many-VPSs relationship; however, virtualisation may be done for a number of reasons, including the ability to move a VPS container between servers. The users may have root access to their own virtual space. Customers are sometimes responsible for patching and maintaining the server, or the VPS provider may perform server admin tasks for the customer. With dedicated hosting, the user gets his or her own web server and gains full control over it. One type of dedicated hosting is unmanaged; this is usually the least expensive of the dedicated plans. The user has full administrative access to the server, which means the client is responsible for the security and maintenance of his own dedicated server. With managed hosting, the user gets his or her own web server but is not allowed full control over it; the user is disallowed full control so that the provider can guarantee quality of service by not allowing the user to modify the server or create configuration problems. The user does not own the server; the server is leased to the client. Colocation is similar to the dedicated web hosting service, except that the server is owned by the customer and housed in the provider's data center.
Open-source software is a type of computer software in which source code is released under a license in which the copyright holder grants users the rights to study, change and distribute the software to anyone and for any purpose. Open-source software may be developed in a collaborative public manner. Open-source software is a prominent example of open collaboration. Open-source software development can generate a more diverse scope of design perspectives than any one company is capable of developing and sustaining long term. A 2008 report by the Standish Group stated that adoption of open-source software models has resulted in savings of about $60 billion per year for consumers. In the early days of computing, programmers and developers shared software in order to learn from each other and evolve the field of computing; the open-source notion was pushed to the wayside by the commercialization of software in the years 1970–1980. However, academics still developed software collaboratively: for example, Donald Knuth in 1979 with the TeX typesetting system, or Richard Stallman in 1983 with the GNU operating system.
In 1997, Eric Raymond published The Cathedral and the Bazaar, a reflective analysis of the hacker community and free-software principles. The paper received significant attention in early 1998 and was one factor in motivating Netscape Communications Corporation to release their popular Netscape Communicator Internet suite as free software; this source code subsequently became the basis behind SeaMonkey, Mozilla Firefox and KompoZer. Netscape's act prompted Raymond and others to look into how to bring the Free Software Foundation's free software ideas and perceived benefits to the commercial software industry. They concluded that FSF's social activism was not appealing to companies like Netscape, and looked for a way to rebrand the free software movement to emphasize the business potential of sharing and collaborating on software source code. The new term they chose was "open source", soon adopted by Bruce Perens, publisher Tim O'Reilly, Linus Torvalds and others; the Open Source Initiative was founded in February 1998 to encourage use of the new term and evangelize open-source principles.
While the Open Source Initiative sought to encourage the use of the new term and evangelize the principles it adhered to, commercial software vendors found themselves threatened by the concept of freely distributed software and universal access to an application's source code. A Microsoft executive publicly stated in 2001 that "open source is an intellectual property destroyer. I can't imagine something that could be worse than this for the software business and the intellectual-property business." However, while free and open-source software has historically played a role outside of the mainstream of private software development, companies as large as Microsoft have begun to develop official open-source presences on the Internet. IBM, Oracle and State Farm are just a few of the companies with a serious public stake in today's competitive open-source market, and there has been a significant shift in the corporate philosophy concerning the development of FOSS. The free-software movement was launched in 1983. In 1998, a group of individuals advocated that the term free software should be replaced by open-source software as an expression which is less ambiguous and more comfortable for the corporate world.
Software licenses grant rights to users which would otherwise be reserved by copyright law to the copyright holder. Several open-source software licenses have qualified within the boundaries of the Open Source Definition; the most prominent and popular example is the GNU General Public License (GPL), which "allows free distribution under the condition that further developments and applications are put under the same licence", and is thus also free. The open-source label came out of a strategy session held on April 7, 1998 in Palo Alto, in reaction to Netscape's January 1998 announcement of a source code release for Navigator. The group of individuals at the session included Tim O'Reilly, Linus Torvalds, Tom Paquin, Jamie Zawinski, Larry Wall, Brian Behlendorf, Sameer Parekh, Eric Allman, Greg Olson, Paul Vixie, John Ousterhout, Guido van Rossum, Philip Zimmermann, John Gilmore and Eric S. Raymond; they used the opportunity before the release of Navigator's source code to clarify a potential confusion caused by the ambiguity of the word "free" in English.
Many people claim that the birth of the Internet in 1969 started the open-source movement, while others do not distinguish between the open-source and free-software movements. The Free Software Foundation, founded in 1985, intended the word "free" to mean freedom to distribute, not freedom from cost.