A website or Web site is a collection of related network web resources, such as web pages and multimedia content, that are identified by a common domain name and published on at least one web server. Notable examples are wikipedia.org, google.com, and amazon.com. Websites can be accessed via a public Internet Protocol network, such as the Internet, or a private local area network, by a uniform resource locator that identifies the site. Websites are typically dedicated to a particular topic or purpose, ranging from entertainment and social networking to providing news and education. All publicly accessible websites collectively constitute the World Wide Web, while private websites, such as a company's website for its employees, are part of an intranet. Web pages, which are the building blocks of websites, are documents composed in plain text interspersed with formatting instructions of Hypertext Markup Language; they may incorporate elements from other websites with suitable markup anchors.
Web pages are accessed and transported with the Hypertext Transfer Protocol, which may optionally employ encryption to provide security and privacy for the user. The user's application, a web browser, renders the page content according to its HTML markup instructions onto a display terminal. Hyperlinking between web pages conveys to the reader the site structure and guides the navigation of the site, which usually starts with a home page containing a directory of the site's web content. Some websites require user subscription to access content; examples of subscription websites include many business sites, news websites, academic journal websites, gaming websites, file-sharing websites, message boards, web-based email, social networking websites, websites providing real-time stock market data, and sites providing various other services. End users can access websites on a range of devices, including desktop and laptop computers, tablet computers, and smart TVs. The World Wide Web was created in 1990 by the British CERN physicist Tim Berners-Lee.
On 30 April 1993, CERN announced that the World Wide Web would be free for anyone to use. Before the introduction of HTML and HTTP, other protocols such as the File Transfer Protocol and the gopher protocol were used to retrieve individual files from a server; these protocols offer a simple directory structure which the user navigates and where they choose files to download. Documents were most often presented as plain text files without formatting, or were encoded in word processor formats. Websites can be used in various fashions: they can be the work of an individual, a business, or another organization, and are typically dedicated to a particular topic or purpose. Any website can contain a hyperlink to any other website, so the distinction between individual sites, as perceived by the user, can be blurred. Websites are written in, or converted to, HTML and are accessed using a software interface classified as a user agent. Web pages can be viewed or otherwise accessed from a range of computer-based and Internet-enabled devices of various sizes, including desktop computers, tablet computers, and smartphones.
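As a rough illustration, the sketch below uses Python's standard library to act as a minimal user agent: it requests a page over HTTP and receives the HTML that a browser would render. The host example.com stands in for any real site.

```python
from urllib.request import urlopen

# Fetch a page the way any HTTP user agent does; the https scheme
# selects the TLS-encrypted variant of the protocol.
# example.com is a placeholder host.
with urlopen("https://example.com/") as response:
    html = response.read().decode("utf-8")

# A browser would render this HTML; here we just show the raw markup.
print(html[:200])
```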
A website is hosted on a computer system known as a web server, also called an HTTP server. These terms can also refer to the software that runs on these systems and that retrieves and delivers the web pages in response to requests from the website's users. Apache is the most commonly used web server software, and Microsoft's IIS is also widely used; some alternatives, such as Nginx, Hiawatha, or Cherokee, are fully functional and lightweight. A static website is one that has web pages stored on the server in the format in which they are sent to a client web browser; it is coded primarily in Hypertext Markup Language. Images are commonly used to effect the desired appearance and as part of the main content. Audio or video might be considered "static" content if it plays automatically or is non-interactive. This type of website displays the same information to all visitors. Similar to handing out a printed brochure to customers or clients, a static website will generally provide consistent, standard information for an extended period of time. Although the website owner may make updates periodically, it is a manual process to edit the text and other content, and it may require basic website design skills and software.
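A static site can be served with very little machinery. The sketch below uses Python's built-in http.server module to serve whatever files sit in a hypothetical local directory named site, returning the same stored pages to every visitor.

```python
# Minimal static web server: every visitor receives the same stored files.
# Assumes a local directory named "site" containing index.html and other assets.
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

handler = partial(SimpleHTTPRequestHandler, directory="site")
HTTPServer(("", 8000), handler).serve_forever()
```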
Simple forms or marketing examples of websites, such as a classic website, a five-page website, or a brochure website, are static websites, because they present pre-defined, static information to the user. This may include information about a company and its products and services through text, animations, audio/video, and navigation menus. Static websites can be edited using four broad categories of software:
- Text editors, such as Notepad or TextEdit, where content and HTML markup are manipulated directly within the editor program.
- WYSIWYG offline editors, such as Microsoft FrontPage and Adobe Dreamweaver, with which the site is edited using a GUI and the final HTML markup is generated automatically by the editor software.
- WYSIWYG online editors, which create media-rich online presentations such as web pages, intros, and blogs.
- Template-based editors, which allow users to create and upload web pages to a web server without detailed HTML knowledge, by picking a suitable ready-made template.
A database is an organized collection of data, stored and accessed electronically from a computer system. Where databases are more complex, they are often developed using formal design and modeling techniques. The database management system is the software that interacts with end users, applications, and the database itself to capture and analyze the data. The DBMS software additionally encompasses the core facilities provided to administer the database; the sum total of the database, the DBMS, and the associated applications can be referred to as a "database system". The term "database" is often used to loosely refer to any of the DBMS, the database system, or an application associated with the database. Computer scientists may classify database-management systems according to the database models that they support. Relational databases became dominant in the 1980s; these model data as rows and columns in a series of tables, and the vast majority use SQL for writing and querying data. In the 2000s, non-relational databases became popular, collectively referred to as NoSQL because they use different query languages.
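As a minimal sketch of the contrast, the example below uses Python's built-in sqlite3 module for the relational rows-and-columns model queried with SQL, and a plain dictionary to stand in for a NoSQL-style key-value lookup; the table, keys, and records are invented for illustration.

```python
import sqlite3

# Relational model: data as rows and columns in a table, queried with SQL.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE books (title TEXT, author TEXT, year INTEGER)")
con.execute("INSERT INTO books VALUES ('Dune', 'Frank Herbert', 1965)")
rows = con.execute("SELECT title FROM books WHERE year < 1970").fetchall()

# A NoSQL-style key-value store instead retrieves whole records by key,
# with no fixed schema and no SQL.
store = {"book:1": {"title": "Dune", "author": "Frank Herbert", "year": 1965}}
record = store["book:1"]
```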
Formally, a "database" refers to the way it is organized. Access to this data is provided by a "database management system" consisting of an integrated set of computer software that allows users to interact with one or more databases and provides access to all of the data contained in the database; the DBMS provides various functions that allow entry and retrieval of large quantities of information and provides ways to manage how that information is organized. Because of the close relationship between them, the term "database" is used casually to refer to both a database and the DBMS used to manipulate it. Outside the world of professional information technology, the term database is used to refer to any collection of related data as size and usage requirements necessitate use of a database management system. Existing DBMSs provide various functions that allow management of a database and its data which can be classified into four main functional groups: Data definition – Creation and removal of definitions that define the organization of the data.
- Update – Insertion, modification, and deletion of the actual data.
- Retrieval – Providing information in a form directly usable or for further processing by other applications. The retrieved data may be made available in a form essentially the same as it is stored in the database or in a new form obtained by altering or combining existing data from the database.
- Administration – Registering and monitoring users, enforcing data security, monitoring performance, maintaining data integrity, dealing with concurrency control, and recovering information that has been corrupted by some event such as an unexpected system failure.
Both a database and its DBMS conform to the principles of a particular database model. "Database system" refers collectively to the database model, the database management system, and the database. Physically, database servers are dedicated computers that hold the actual databases and run only the DBMS and related software. Database servers are usually multiprocessor computers, with generous memory and RAID disk arrays used for stable storage.
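The four functional groups listed above can be made concrete with SQL. The sketch below uses Python's built-in sqlite3 module as a stand-in for a full DBMS; the table and data are invented, and since SQLite has no user accounts, the administration group is only approximated.

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Data definition: create (or remove) the definitions that organize the data.
con.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

# Update: insert, modify, and delete the actual data.
con.execute("INSERT INTO users (name) VALUES ('alice')")
con.execute("DELETE FROM users WHERE name = 'mallory'")

# Retrieval: return data as stored, or in a new derived form.
total = con.execute("SELECT COUNT(*) FROM users").fetchone()[0]

# Administration: SQLite can at least check integrity; larger DBMSs add
# statements such as GRANT and REVOKE for user registration and security.
con.execute("PRAGMA integrity_check")
```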
RAID is used for recovery of data if any of the disks fail. Hardware database accelerators, connected to one or more servers via a high-speed channel, are also used in large-volume transaction processing environments. DBMSs are found at the heart of most database applications. DBMSs may be built around a custom multitasking kernel with built-in networking support, but modern DBMSs typically rely on a standard operating system to provide these functions. Since DBMSs comprise a significant market, computer and storage vendors often take DBMS requirements into account in their own development plans. Databases and DBMSs can be categorized according to the database model that they support, the type of computer they run on, the query language used to access the database, and their internal engineering, which affects performance, scalability, and security. The sizes and performance of databases and their respective DBMSs have grown by orders of magnitude. These performance increases were enabled by technological progress in the areas of processors, computer memory, computer storage, and computer networks.
The development of database technology can be divided into three eras based on data model or structure: navigational, SQL/relational, and post-relational. The two main early navigational data models were the hierarchical model and the CODASYL model. The relational model, first proposed in 1970 by Edgar F. Codd, departed from this tradition by insisting that applications should search for data by content, rather than by following links. The relational model employs sets of ledger-style tables, each used for a different type of entity. Only in the mid-1980s did computing hardware become powerful enough to allow the wide deployment of relational systems. By the early 1990s, relational systems dominated in all large-scale data processing applications, and as of 2018 they remain dominant: IBM DB2, Oracle, MySQL, and Microsoft SQL Server are the most searched DBMSs. The dominant database language, standardised SQL for the relational model, has influenced database languages for other data models. Object databases were developed in the 1980s to overcome the inconvenience of object-relational impedance mismatch, which led to the coining of the term "post-relational" and the development of hybrid object-relational databases.
In computing, a web application or web app is a client–server computer program in which the client runs in a web browser. Common web applications include webmail, online retail sales, and online auctions. The general distinction between a dynamic web page of any kind and a "web application" is unclear; the websites most likely to be referred to as "web applications" are those which have functionality similar to a desktop software application, or to a mobile app. HTML5 introduced explicit language support for making applications that are loaded as web pages, but can store data locally and continue to function while offline. Single-page applications are more application-like because they reject the more typical web paradigm of moving between distinct pages with different URLs. Single-page frameworks like Sencha Touch and AngularJS might be used to speed development of such a web app for a mobile platform. There are several ways of targeting mobile devices when making a web application: responsive web design can be used to make a web application, whether a conventional website or a single-page application, viewable on small screens and working well with touchscreens.
Progressive Web Apps are web applications that load like regular web pages or websites but can offer the user functionality such as working offline, push notifications, and device hardware access, traditionally available only to native mobile applications. Native apps or "mobile apps" run directly on a mobile device, just as a conventional software application runs directly on a desktop computer, without a web browser. Frameworks like React Native, Flutter, and FuseTools allow the development of native apps for all platforms using languages other than each standard native language. Hybrid apps embed a mobile web site inside a native app, using a hybrid framework like Apache Cordova and Ionic or Appcelerator Titanium; this allows development using web technologies while retaining certain advantages of native apps. In earlier computing models like client–server, the processing load for the application was shared between code on the server and code installed on each client locally. In other words, an application had its own pre-compiled client program which served as its user interface and had to be separately installed on each user's personal computer.
Douglas Crockford first popularized the JSON format. The acronym originated at State Software, a company co-founded by Crockford and others in March 2001. The co-founders agreed to build a system that used standard browser capabilities and provided an abstraction layer for Web developers to create stateful Web applications that had a persistent duplex connection to a Web server, by holding two HTTP connections open and recycling them before standard browser time-outs if no further data were exchanged. The co-founders had a round-table discussion and voted on whether to call the data format JSML or JSON, as well as under what license type to make it available. Crockford, inspired by the words of President Bush, came up with the "evil-doers" JSON license ("The Software shall be used for Good, not Evil") in order to open-source the JSON libraries while forcing corporate lawyers, or those who are overly pedantic, to pay for a license from State. Chip Morningstar developed the idea for the State Application Framework at State Software.
JSON's basic data types include:
- String: a sequence of zero or more Unicode characters. Strings support a backslash escaping syntax.
- Boolean: either of the values true or false.
- Array: an ordered list of zero or more values, each of which may be of any type. Arrays use square bracket notation with comma-separated elements.
- Object: an unordered collection of name–value pairs where the names are strings. Since objects are intended to represent associative arrays, it is recommended, though not required, that each key is unique within an object. Objects are delimited with curly brackets and use commas to separate each pair, while within each pair the colon ':' character separates the key or name from its value.
- Null: an empty value, using the word null.
Limited whitespace is allowed and ignored around or between syntactic elements. Only four specific characters are considered whitespace for this purpose: space, horizontal tab, line feed, and carriage return. In particular, the byte order mark must not appear at the start of a JSON text, though parsers may choose to ignore one.
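A short sketch with Python's standard json module, showing one value of each type listed above; the document text is invented for illustration.

```python
import json

# A raw string keeps the backslash, so \n below is a JSON escape sequence;
# whitespace between the tokens is ignored by the parser.
text = r'''
{
  "string":  "two\nlines",
  "boolean": true,
  "array":   [1, "two", null],
  "object":  {"nested": false},
  "nothing": null
}
'''
data = json.loads(text)
assert data["array"][2] is None      # JSON null maps to Python None
print(json.dumps(data, indent=2))    # serialize back to JSON text
```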
In the context of the World Wide Web, a bookmark is a Uniform Resource Identifier that is stored for later retrieval in any of various storage formats. All modern web browsers include bookmark features. Bookmarks are called favorites or Internet shortcuts in Internet Explorer; by virtue of that browser's large market share, these terms have been synonymous with bookmark since the first browser war. Bookmarks are accessed through a menu in the user's web browser, and folders are commonly used for organization. In addition to the bookmarking methods within most browsers, many external applications offer bookmark management. Bookmarks have been incorporated in browsers since the Mosaic browser in 1993. Bookmark lists were called Hotlists in previous versions of Opera. Other early web browsers such as ViolaWWW and Cello also had bookmarking features. With the advent of social bookmarking, shared bookmarks have become a means for users sharing similar interests to pool web resources, or to store their bookmarks in such a way that they are not tied to one specific computer or browser.
Live bookmarks are updated automatically.
Mozilla Firefox is a free and open-source web browser developed by the Mozilla Foundation and its subsidiary, the Mozilla Corporation. Firefox is available for the Microsoft Windows, macOS, Linux, BSD, illumos, and Solaris operating systems; its sibling, Firefox for Android, is also available. Firefox uses the Gecko layout engine to render web pages, which implements current and anticipated web standards. In 2017, Firefox began incorporating new technology under the code name Quantum to promote parallelism and a more intuitive user interface. An additional version, Firefox for iOS, was released on November 12, 2015. Due to platform restrictions, it uses the WebKit layout engine instead of Gecko, as with all other iOS web browsers. Firefox was created in 2002 under the codename "Phoenix" by Mozilla community members who desired a standalone browser, rather than the Mozilla Application Suite bundle. During its beta phase, Firefox proved to be popular with its testers and was praised for its speed and add-ons compared to Microsoft's then-dominant Internet Explorer 6.
Firefox was released on November 9, 2004, and challenged Internet Explorer's dominance with 60 million downloads within nine months. Firefox is the spiritual successor of Netscape Navigator, as the Mozilla community was created by Netscape in 1998 before their acquisition by AOL. Firefox usage grew to a peak of 32% at the end of 2009, with version 3.5 overtaking Internet Explorer 7, although not Internet Explorer as a whole. Usage then declined in competition with Google Chrome; as of January 2019, Firefox had a 9.5% usage share as a "desktop" browser, according to StatCounter, making it the second-most popular such web browser. Firefox is still the most popular desktop browser in a few countries, including Cuba and Eritrea, with 72.26% and 83.28% of the market share, respectively. According to Mozilla, in December 2014 there were half a billion Firefox users around the world. The project began as an experimental branch of the Mozilla project by Dave Hyatt, Joe Hewitt, and Blake Ross. They believed the commercial requirements of Netscape's sponsorship and developer-driven feature creep compromised the utility of the Mozilla browser.
To combat what they saw as the Mozilla Suite's software bloat, they created a stand-alone browser, with which they intended to replace the Mozilla Suite. On April 3, 2003, the Mozilla Organization announced that they planned to change their focus from the Mozilla Suite to Firefox and Thunderbird; the community-driven SeaMonkey was later formed and replaced the Mozilla Application Suite in 2005. The Firefox project has undergone several name changes. It was originally titled Phoenix, which carried the implication of the mythical firebird that rose triumphantly from the ashes of its dead predecessor, in this case from the "ashes" of Netscape Navigator after it had been killed off by Microsoft Internet Explorer in the "First Browser War". Phoenix was renamed Firebird due to trademark issues with Phoenix Technologies, but the new name in turn drew objections from the Firebird database project. In response, the Mozilla Foundation stated that the browser would always bear the name Mozilla Firebird to avoid confusion. After further pressure, on February 9, 2004, Mozilla Firebird became Mozilla Firefox.
The name Firefox was said to be derived from a nickname of the red panda, which became the mascot for the newly named project. For the abbreviation of Firefox, Mozilla prefers Fx or fx, though it is often abbreviated as FF. The Firefox project went through many versions before version 1.0 was released on November 9, 2004. In 2016, Mozilla announced a project known as Quantum, which sought to improve Firefox's Gecko engine and other components in order to improve Firefox's performance, modernize its architecture, and transition the browser to a multi-process model. These improvements came in the wake of decreasing market share to Google Chrome, as well as concerns that Firefox's performance was lapsing in comparison. These changes required existing add-ons for Firefox to be made incompatible with newer versions, in favor of a new extension system designed to be similar to that of Chrome and other recent browsers. Firefox 57, released in November 2017, was the first version to contain enhancements from Quantum, and has thus been named Firefox Quantum.
Firefox formerly supported add-ons using the XUL and XPCOM APIs, which allowed them to directly access and manipulate much of the browser's internal functionality. As they are not compatible with its multi-process architecture, such legacy add-ons are no longer supported in newer versions of Firefox.
Hypertext Transfer Protocol
The Hypertext Transfer Protocol is an application protocol for distributed, collaborative, hypermedia information systems. HTTP is the foundation of data communication for the World Wide Web, where hypertext documents include hyperlinks to other resources that the user can access, for example by a mouse click or by tapping the screen in a web browser. HTTP was developed to facilitate the World Wide Web. Development of HTTP was initiated by Tim Berners-Lee at CERN in 1989. Development of HTTP standards was coordinated by the Internet Engineering Task Force and the World Wide Web Consortium, culminating in the publication of a series of Requests for Comments. The first definition of HTTP/1.1, the version of HTTP in common use, occurred in RFC 2068 in 1997, although this was made obsolete by RFC 2616 in 1999 and again by the RFC 7230 family of RFCs in 2014. A later version, the successor HTTP/2, was standardized in 2015 and is now supported by major web servers and browsers over Transport Layer Security using the Application-Layer Protocol Negotiation extension, where TLS 1.2 or newer is required.
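As a minimal sketch of that negotiation, Python's standard ssl module can offer both protocol versions during the TLS handshake and report which one the server selected; example.com is a placeholder host.

```python
import socket
import ssl

# Offer HTTP/2 ("h2") and HTTP/1.1 via ALPN; the server picks one
# during the TLS handshake.
context = ssl.create_default_context()
context.set_alpn_protocols(["h2", "http/1.1"])

with socket.create_connection(("example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        print(tls.selected_alpn_protocol())  # "h2" if the server speaks HTTP/2
```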
HTTP functions as a request–response protocol in the client–server computing model. A web browser, for example, may be the client, and an application running on a computer hosting a website may be the server. The client submits an HTTP request message to the server. The server, which provides resources such as HTML files and other content, or performs other functions on behalf of the client, returns a response message to the client; the response contains completion status information about the request and may contain requested content in its message body. A web browser is an example of a user agent. Other types of user agent include the indexing software used by search providers, voice browsers, mobile apps, and other software that accesses, consumes, or displays web content. HTTP is designed to permit intermediate network elements to improve or enable communications between clients and servers. High-traffic websites often benefit from web cache servers that deliver content on behalf of upstream servers to improve response time.
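To make the request–response exchange described above concrete, the sketch below sends one hand-written HTTP/1.1 request message over a raw socket and prints the status line of the response message; example.com is a placeholder host.

```python
import socket

# An HTTP exchange is plain text: one request message, one response message.
request = (
    "GET / HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "Connection: close\r\n"
    "\r\n"
)

with socket.create_connection(("example.com", 80)) as sock:
    sock.sendall(request.encode("ascii"))
    response = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        response += chunk

# The status line carries the completion status; the body follows the headers.
print(response.split(b"\r\n", 1)[0])  # e.g. b'HTTP/1.1 200 OK'
```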
Web browsers cache previously accessed web resources and reuse them, when possible, to reduce network traffic. HTTP proxy servers at private network boundaries can facilitate communication for clients without a globally routable address, by relaying messages with external servers. HTTP is an application layer protocol designed within the framework of the Internet protocol suite. Its definition presumes an underlying and reliable transport layer protocol, and the Transmission Control Protocol is commonly used. However, HTTP can be adapted to use unreliable protocols such as the User Datagram Protocol, for example in HTTPU and the Simple Service Discovery Protocol. HTTP resources are identified and located on the network by Uniform Resource Locators, using the Uniform Resource Identifier schemes http and https. URIs and hyperlinks in HTML documents form interlinked hypertext documents. HTTP/1.1 is a revision of the original HTTP. In HTTP/1.0 a separate connection to the same server is made for every resource request; HTTP/1.1 can reuse a connection multiple times to download images, stylesheets, and other resources after the page has been delivered.
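A brief sketch of that reuse with Python's standard http.client module, fetching several resources over one persistent HTTP/1.1 connection; the host and paths are placeholders.

```python
import http.client

# HTTP/1.1 keeps the TCP connection open by default, so several resources
# can be fetched without a new handshake each time.
conn = http.client.HTTPSConnection("example.com")
for path in ("/", "/style.css", "/logo.png"):  # hypothetical resources
    conn.request("GET", path)
    reply = conn.getresponse()
    reply.read()          # drain the body so the socket can be reused
    print(path, reply.status)
conn.close()
```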
HTTP/1.1 communications therefore experience less latency, as the establishment of TCP connections presents considerable overhead. The term hypertext was coined by Ted Nelson in 1965 in the Xanadu Project, which was in turn inspired by Vannevar Bush's 1930s vision of the microfilm-based information retrieval and management "memex" system described in his 1945 essay "As We May Think". Tim Berners-Lee and his team at CERN are credited with inventing the original HTTP, along with HTML and the associated technology for a web server and a text-based web browser. Berners-Lee first proposed the "WorldWideWeb" project in 1989, now known as the World Wide Web. The first version of the protocol had only one method, namely GET, which would request a page from a server; the response from the server was always an HTML page. The first documented version of HTTP was HTTP V0.9. Dave Raggett led the HTTP Working Group in 1995 and wanted to expand the protocol with extended operations, extended negotiation, and richer meta-information, tied with a security protocol, which became more efficient by adding additional methods and header fields.
RFC 1945 introduced and recognized HTTP V1.0 in 1996. The HTTP WG planned to publish new standards in December 1995, and the support for pre-standard HTTP/1.1 based on the then-developing RFC 2068 was adopted by the major browser developers in early 1996. By March of that year, pre-standard HTTP/1.1 was supported in Arena, Netscape 2.0, Netscape Navigator Gold 2.01, Mosaic 2.7, Lynx 2.5, and Internet Explorer 2.0. End-user adoption of the new browsers was rapid. In March 1996, one web hosting company reported that over 40% of browsers in use on the Internet were HTTP/1.1 compliant; that same company reported that by June 1996, 65% of all browsers accessing their servers were HTTP/1.1 compliant. The HTTP/1.1 standard as defined in RFC 2068 was released in January 1997. Improvements and updates to the HTTP/1.1 standard were released under RFC 2616 in June 1999. In 2007, the HTTPbis Working Group was formed, in part, to revise and clarify the HTTP/1.1 specification. In June 2014, the WG released an updated six-part specification obsoleting RFC 2616:
- RFC 7230, HTTP/1.1: Message Syntax and Routing
- RFC 7231, HTTP/1.1: Semantics and Content
- RFC 7232, HTTP/1.1: Conditional Requests
- RFC 7233, HTTP/1.1: Range Requests
- RFC 7234, HTTP/1.1: Caching
- RFC 7235, HTTP/1.1: Authentication