Metadata is "data that provides information about other data". Many distinct types of metadata exist, among them descriptive, structural, administrative, reference, and statistical metadata. Descriptive metadata describes a resource for purposes such as identification; it can include elements such as title, abstract, and keywords. Structural metadata is metadata about containers of data and indicates how compound objects are put together, for example, how pages are ordered to form chapters; it describes the types, versions, and other characteristics of digital materials. Administrative metadata provides information to help manage a resource, such as when and how it was created, its file type and other technical information, and who can access it. Reference metadata describes the contents and quality of statistical data. Statistical metadata may describe processes that collect, process, or produce statistical data. Metadata was traditionally used in the card catalogs of libraries until the 1980s, when libraries converted their catalog data to digital databases.
In the 2000s, as digital formats became the prevalent way of storing data and information, metadata was used to describe digital data using metadata standards. The first description of "meta data" for computer systems is purportedly from MIT's Center for International Studies experts David Griffel and Stuart McIntosh in 1967: "In summary we have statements in an object language about subject descriptions of data and token codes for the data. We have statements in a meta language describing the data relationships and transformations, ought/is relations between norm and data." There are different metadata standards for each discipline. Describing the contents and context of data or data files increases their usefulness. For example, a web page may include metadata specifying what software language the page is written in, what tools were used to create it, what subjects the page is about, and where to find more information about the subject; this metadata can automatically improve the reader's experience and make it easier for users to find the web page online.
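As a sketch of how such page metadata can be read programmatically, the snippet below (Python standard library only; the example page is hypothetical) collects `name`/`content` pairs from HTML `meta` elements:

```python
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Collect <meta name=... content=...> pairs from an HTML page."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs, names lowercased
        if tag == "meta":
            d = dict(attrs)
            if "name" in d and "content" in d:
                self.meta[d["name"]] = d["content"]

page = ('<html><head>'
        '<meta name="keywords" content="metadata, tags">'
        '<meta name="description" content="An example page">'
        '</head></html>')
parser = MetaExtractor()
parser.feed(page)
```

After `feed`, `parser.meta` maps each metadata field name to its content string, which is the kind of description search engines historically read from the page source.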
A CD may include metadata providing information about the musicians and songwriters whose work appears on the disc. A principal purpose of metadata is to help users discover resources. Metadata helps to organize electronic resources, provide digital identification, and support the archiving and preservation of resources. Metadata assists users in resource discovery by "allowing resources to be found by relevant criteria, identifying resources, bringing similar resources together, distinguishing dissimilar resources, giving location information." Metadata of telecommunication activities, including Internet traffic, is widely collected by various national governmental organizations. This data can be used for mass surveillance. In many countries, metadata relating to emails, telephone calls, web pages, video traffic, IP connections, and cell phone locations is stored by government organizations. Metadata means "data about data". Although the "meta" prefix means "after" or "beyond", it is used to mean "about" in epistemology.
Metadata is defined as data providing information about one or more aspects of the data. Some examples include: the means of creation of the data, the purpose of the data, the time and date of creation, the creator or author of the data, the location on a computer network where the data was created, the standards used, the file size, data quality, the source of the data, and the process used to create the data. For example, a digital image may include metadata that describes how large the picture is, the color depth, the image resolution, when the image was created, the shutter speed, and other data. A text document's metadata may contain information about how long the document is, who the author is, when the document was written, and a short summary of the document. Metadata within web pages can contain descriptions of page content, as well as keywords linked to the content; these links are called "metatags", which were used as the primary factor in determining order for a web search until the late 1990s. Reliance on metatags in web searches decreased in the late 1990s because of "keyword stuffing".
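Several of the items listed above (file size, creation or modification time, file type) are exactly what a filesystem already records about every file. A minimal Python sketch (the function name is hypothetical) that collects them as a small administrative-metadata record:

```python
import os
import datetime

def file_metadata(path):
    """Gather basic administrative metadata for a file:
    size in bytes, last-modification time, and type (by extension)."""
    st = os.stat(path)
    return {
        "file_size_bytes": st.st_size,
        "modified": datetime.datetime.fromtimestamp(st.st_mtime).isoformat(),
        "extension": os.path.splitext(path)[1],
    }
```

Note that this record describes the file, not its contents: the same distinction the article draws between metadata and data.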
Metatags were being misused to trick search engines into thinking some websites had more relevance in the search than they did. Metadata can be stored and managed in a database called a metadata registry or metadata repository. However, without context and a point of reference, it might be impossible to identify metadata just by looking at it. For example: by itself, a database containing several numbers, all 13 digits long, could be the results of calculations or a list of numbers to plug into an equation. Without any other context, the numbers themselves can be perceived as the data. But given the context that this database is a log of a book collection, those 13-digit numbers may now be identified as ISBNs: information that refers to the book but is not itself the information within the book. The term "metadata" was coined in 1968 by Philip Bagley in his book "Extension of Programming Language Concepts", where it is clear that he uses the term in the ISO 11179 "traditional" sense of "structural metadata", i.e. "data about the containers of data".
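Context aside, ISBN-13 numbers do carry internal structure that an arbitrary 13-digit number would match only one time in ten: a check digit. A sketch of the standard ISBN-13 checksum in Python:

```python
def is_valid_isbn13(digits: str) -> bool:
    """Validate an ISBN-13 checksum: digits in odd positions (1st,
    3rd, ...) weigh 1, even positions weigh 3, and the weighted sum
    must be divisible by 10."""
    if len(digits) != 13 or not digits.isdigit():
        return False
    total = sum(int(d) * (1 if i % 2 == 0 else 3)
                for i, d in enumerate(digits))
    return total % 10 == 0
```

A check like this can only rule candidates out; confirming that a passing number really is an ISBN still requires the contextual knowledge the article describes.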
Computer science is the study of processes that interact with data and that can be represented as data in the form of programs. It enables the use of algorithms to manipulate and communicate digital information. A computer scientist studies the theory of computation and the practice of designing software systems; the field can be divided into theoretical and practical disciplines. Computational complexity theory is highly abstract, while computer graphics emphasizes real-world applications. Programming language theory considers approaches to the description of computational processes, while computer programming itself involves the use of programming languages and complex systems. Human–computer interaction considers the challenges in making computers useful and accessible. The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks, such as the abacus, have existed since antiquity, aiding in computations such as multiplication and division.
Algorithms for performing computations have existed since antiquity, long before the development of sophisticated computing equipment. Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623. In 1673, Gottfried Leibniz demonstrated a digital mechanical calculator called the Stepped Reckoner; he may be considered the first computer scientist and information theorist for, among other reasons, documenting the binary number system. In 1820, Thomas de Colmar launched the mechanical calculator industry when he released his simplified arithmometer, the first calculating machine strong enough and reliable enough to be used daily in an office environment. Charles Babbage started the design of the first automatic mechanical calculator, his Difference Engine, in 1822, which eventually gave him the idea of the first programmable mechanical calculator, his Analytical Engine. He started developing this machine in 1834, and "in less than two years, he had sketched out many of the salient features of the modern computer".
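Euclid's greatest-common-divisor procedure, from roughly 300 BC, is the standard illustration of an algorithm that predates computing machinery by millennia; in modern notation:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace the pair (a, b) with
    (b, a mod b) until the remainder vanishes; the survivor is the
    greatest common divisor."""
    while b:
        a, b = b, a % b
    return a
```

The procedure Euclid described used repeated subtraction rather than the remainder operation, but the two are equivalent; the remainder form simply collapses runs of subtractions.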
"A crucial step was the adoption of a punched card system derived from the Jacquard loom", making it infinitely programmable. In 1843, during the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm to compute the Bernoulli numbers, which is considered to be the first computer program. Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information. In 1937, one hundred years after Babbage's impossible dream, Howard Aiken convinced IBM, which was making all kinds of punched card equipment and was also in the calculator business, to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine, which itself used cards and a central computing unit. When the machine was finished, some hailed it as "Babbage's dream come true". During the 1940s, as new and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors.
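The Bernoulli numbers that Lovelace's note targeted can be generated from the standard recurrence sum over k from 0 to m of C(m+1, k) * B_k = 0; the Python sketch below uses exact rational arithmetic (her actual method, built around the Analytical Engine's operations, differed from this):

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return [B_0, ..., B_n] (convention B_1 = -1/2), computed from
    the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1."""
    B = [Fraction(1)]                 # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-acc / (m + 1))      # solve the recurrence for B_m
    return B
```

Exact fractions matter here: the Bernoulli numbers are rationals, and floating-point error would quickly corrupt the recurrence.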
As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. In 1945, IBM founded the Watson Scientific Computing Laboratory at Columbia University in New York City; the renovated fraternity house on Manhattan's West Side was IBM's first laboratory devoted to pure science. The lab is the forerunner of IBM's Research Division, which today operates research facilities around the world; the close relationship between IBM and the university was instrumental in the emergence of a new scientific discipline, with Columbia offering one of the first academic-credit courses in computer science in 1946. Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s; the world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science degree program in the United States was formed at Purdue University in 1962.
Since practical computers became available, many applications of computing have become distinct areas of study in their own right. Although many initially believed it impossible that computers themselves could constitute a scientific field of study, in the late fifties this view became accepted among the greater academic population. The now well-known IBM brand formed part of the computer science revolution during this time: IBM released the IBM 704 and the IBM 709 computers, which were widely used during the exploration period of such devices. "Still, working with the IBM was frustrating ... if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again." During the late 1950s, the computer science discipline was still in its developmental stages, and such issues were commonplace. Time has since seen significant improvements in the usability and effectiveness of computing technology. Modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals to a near-ubiquitous user base.
Computers were quite costly, and some degree of human aid was needed for efficient use—in part from professional computer operators. As computer adoption became more widespread and affordable, less human assistance was needed for common usage. Despite its short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society—in fact, along with electronics, it is
The 17-volume Macropædia is the third part of the Encyclopædia Britannica. The name Macropædia is a neologism coined by Mortimer J. Adler from the ancient Greek words for "large" and "instruction"; Adler's intention was that its articles treat their subjects in depth. The Macropædia was introduced in the 15th edition, with 19 volumes holding 4,207 articles. In the drastic reorganization of that edition in 1985, these articles were combined and condensed into 17 volumes with 700 articles, ranging in length from 2 to 310 pages; the longest article, on the United States, resulted from the merging of the 50 articles on the individual states. The articles of the Macropædia are written by named contributors and have references, in contrast to the 65,000 articles of the Micropædia, which have no named contributors and no references. However, some parts of the Macropædia were written by the editorial staff of the Britannica. Since its reorganization, the Macropædia has not remained constant: new articles are added, while older articles are sometimes split, absorbed into other articles, drastically shortened, or deleted.
An example of the latter is the article on Adhesives, which had its own 7-page article in the 1989 Macropædia but was reduced to a page within a different article in the 1991 edition.
In information systems, a tag is a keyword or term assigned to a piece of information. This kind of metadata helps describe an item and allows it to be found again by browsing or searching. Tags are chosen informally, by the item's creator or by its viewer, depending on the system, although they may also be chosen from a controlled vocabulary. Tagging was popularized by websites associated with Web 2.0 and is an important feature of many Web 2.0 services. It is now also part of other database systems, desktop applications, and operating systems. People use tags to aid classification, mark ownership, note boundaries, and indicate online identity. Tags may take the form of words, images, or other identifying marks. An analogous example of tags in the physical world is museum object tagging. People were using textual keywords to classify information and objects long before computers; computer-based tagging gained popularity with the growth of social bookmarking, image sharing, and social networking websites. These sites allow users to manage labels that categorize content using simple keywords.
Websites that include tags often display collections of tags as tag clouds, as do some desktop applications. On websites that aggregate the tags of all users, an individual user's tags can be useful both to them and to the larger community of the website's users. Tagging systems have sometimes been classified into two kinds: top-down and bottom-up. Top-down taxonomies are created by an authorized group of designers, whereas bottom-up taxonomies are created by all users. This distinction between "top down" and "bottom up" should not be confused with the distinction between a single hierarchical tree structure and multiple non-hierarchical sets. Some researchers and applications have experimented with combining hierarchical and non-hierarchical tagging to aid in information retrieval. Others combine top-down and bottom-up tagging, including in some large library catalogs such as WorldCat. When tags or other taxonomies have further properties such as relationships and attributes, they constitute an ontology. Metadata tags as described in this article should not be confused with the use of the word "tag" in some software to refer to an automatically generated cross-reference.
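A tag cloud is, at bottom, just a frequency table of tags aggregated across users, with each tag rendered at a size proportional to its count. A Python sketch with hypothetical data:

```python
from collections import Counter

# Hypothetical tagged items: (item id, tags its users assigned)
tagged_items = [
    ("photo1", ["sunset", "beach"]),
    ("photo2", ["beach", "family"]),
    ("photo3", ["beach", "sunset"]),
]

# Aggregate tag frequencies across all items; a cloud scales each
# tag's display size by its count.
cloud = Counter(tag for _, tags in tagged_items for tag in tags)
```

Here `cloud.most_common()` would give the ordering a cloud or "popular tags" list needs.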
The use of keywords as part of an identification and classification system long predates computers. Paper data storage devices, notably edge-notched cards, that permitted classification and sorting by multiple criteria were already in use prior to the twentieth century, and faceted classification has been used by libraries since the 1930s. In the late 1970s and early 1980s, the Unix text editor Emacs offered a companion software program called Tags that could automatically build a table of cross-references, called a tags table, that Emacs could use to jump between a function call and that function's definition. This use of the word "tag" did not refer to metadata tags, but was an early use of the word "tag" in software to refer to a word index. Online databases and early websites deployed keyword tags as a way for publishers to help users find content. In the early days of the World Wide Web, the keywords meta element was used by web designers to tell web search engines what the web page was about, but these keywords were only visible in a web page's source code and were not modifiable by users.
In 1997, the collaborative portal "A Description of the Equator and Some ØtherLands", produced by documenta X, used the folksonomic term Tag for its co-authors and guest authors on its Upload page. In "The Equator", the term Tag for user input was described as an abstract literal or keyword to aid the user. However, users defined singular tags and did not share tags at that point. In 2003, the social bookmarking website Delicious provided a way for its users to add "tags" to their bookmarks. Within a couple of years, the photo sharing website Flickr allowed its users to add their own text tags to each of their pictures, constructing flexible and easy metadata that made the pictures searchable. The success of Flickr and the influence of Delicious popularized the concept, and other social software websites—such as YouTube and Last.fm—also implemented tagging. In 2005, the Atom web syndication standard provided a "category" element for inserting subject categories into web feeds, and in 2007 Tim Bray proposed a "tag" URN.
Many blog systems allow authors to add free-form tags to a post, along with placing the post into a predetermined category. For example, a post may display the tags assigned to it; each of those tags is a web link leading to an index page listing all of the posts associated with that tag. The blog may have a sidebar listing all the tags in use on that blog, with each tag leading to an index page. To reclassify a post, an author edits its list of tags, and all connections between posts are automatically updated by the blog software. Some desktop applications an
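The blog behavior described above, where each tag leads to an index page and reclassifying a post updates every index automatically, is an inverted index from tags to posts. A minimal Python sketch (class and method names hypothetical):

```python
from collections import defaultdict

class TagIndex:
    """Blog-style tag index: each tag maps to the set of posts
    carrying it, so reclassifying a post just means re-registering
    it with a new tag list."""
    def __init__(self):
        self.index = defaultdict(set)   # tag -> set of post ids
        self.post_tags = {}             # post id -> its current tags

    def set_tags(self, post_id, tags):
        # Drop the post from tags it no longer carries...
        for old in self.post_tags.get(post_id, set()):
            self.index[old].discard(post_id)
        # ...then register it under its new tags.
        self.post_tags[post_id] = set(tags)
        for tag in tags:
            self.index[tag].add(post_id)

    def posts_for(self, tag):
        """Contents of the index page for one tag."""
        return sorted(self.index[tag])
```

Because `set_tags` first removes the post's old entries, a single edit keeps every tag's index page consistent, mirroring the automatic updating the article describes.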
Knowledge is a familiarity, awareness, or understanding of someone or something, such as facts, descriptions, or skills, acquired through experience or education by perceiving, discovering, or learning. Knowledge can refer to a theoretical or practical understanding of a subject; it can be implicit (as with practical skill) or explicit (as with theoretical understanding). In philosophy, the study of knowledge is called epistemology; several definitions of knowledge and theories to explain it exist. Knowledge acquisition involves complex cognitive processes: perception and reasoning. The eventual demarcation of philosophy from science was made possible by the notion that philosophy's core was "theory of knowledge," a theory distinct from the sciences because it was their foundation... Without this idea of a "theory of knowledge," it is hard to imagine what "philosophy" could have been in the age of modern science. The definition of knowledge is a matter of ongoing debate among philosophers in the field of epistemology. The classical definition, described but not endorsed by Plato, specifies that a statement must meet three criteria in order to be considered knowledge: it must be justified, true, and believed.
Some claim that these conditions are not sufficient, as Gettier case examples demonstrate. A number of alternatives have been proposed, including Robert Nozick's argument for a requirement that knowledge "tracks the truth" and Simon Blackburn's additional requirement that we do not want to say that those who meet any of these conditions "through a defect, flaw, or failure" have knowledge. Richard Kirkham suggests that our definition of knowledge requires that the evidence for the belief necessitates its truth. In contrast to this approach, Ludwig Wittgenstein observed, following Moore's paradox, that one can say "He believes it, but it isn't so," but not "He knows it, but it isn't so." He goes on to argue that these do not correspond to distinct mental states, but rather to distinct ways of talking about conviction. What is different here is not the mental state of the speaker, but the activity in which they are engaged. For example, on this account, to know that the kettle is boiling is not to be in a particular state of mind, but to perform a particular task with the statement that the kettle is boiling.
Wittgenstein sought to bypass the difficulty of definition by looking to the way "knowledge" is used in natural languages. He saw knowledge as a case of a family resemblance. Following this idea, "knowledge" has been reconstructed as a cluster concept that points out relevant features but is not adequately captured by any single definition. Symbolic representations can be thought of as a dynamic process; hence the transfer of a symbolic representation can be viewed as one ascription process whereby knowledge can be transferred. Other forms of communication include observation and imitation, verbal exchange, and audio and video recordings. Philosophers of language and semioticians construct and analyze theories of knowledge transfer or communication. While many would agree that one of the most universal and significant tools for the transfer of knowledge is writing and reading, argument over the usefulness of the written word exists nonetheless, with some scholars skeptical of its impact on societies. In his collection of essays Technopoly, Neil Postman demonstrates the argument against the use of writing through an excerpt from Plato's work Phaedrus.
In this excerpt, Socrates recounts the story of Thamus, the Egyptian king, and Theuth, the inventor of the written word. In this story, Theuth presents his new invention "writing" to King Thamus, telling Thamus that his new invention "will improve both the wisdom and memory of the Egyptians". King Thamus is skeptical of this new invention and rejects it as a tool of recollection rather than retained knowledge; he argues that the written word will infect the Egyptian people with fake knowledge, as they will be able to attain facts and stories from an external source and will no longer be forced to mentally retain large quantities of knowledge themselves. Classical early modern theories of knowledge, especially those advancing the influential empiricism of the philosopher John Locke, were based implicitly or explicitly on a model of the mind which likened ideas to words. This analogy between language and thought laid the foundation for a graphic conception of knowledge in which the mind was treated as a table, a container of content, that had to be stocked with facts reduced to letters, numbers, or symbols.
This created a situation in which the spatial alignment of words on the page carried great cognitive weight, so much so that educators paid close attention to the visual structure of information on the page and in notebooks. Major libraries today can hold millions of books of knowledge; it is only relatively recently that audio and video technologies for recording knowledge have become available, and their use still requires replay equipment and electricity. Verbal teaching and handing down of knowledge is limited to those who would have contact with the transmitter or someone who could interpret wr
Metaphysics is the branch of philosophy that examines the fundamental nature of reality, including the relationship between mind and matter, between substance and attribute, and between possibility and actuality. The word "metaphysics" comes from two Greek words that, together, mean "after or behind or among the natural". It has been suggested that the term might have been coined by a first-century CE editor who assembled various small selections of Aristotle's works into the treatise we now know by the name Metaphysics. Metaphysics studies questions related to what it is for something to exist and what types of existence there are. Metaphysics seeks to answer, in an abstract and general manner, the questions: What is there? What is it like? Topics of metaphysical investigation include existence, objects and their properties, space and time, cause and effect, and possibility. Metaphysical study is conducted using deduction from that which is known a priori. Like foundational mathematics, it tries to give a coherent account of the structure of the world, capable of explaining our everyday and scientific perception of the world, and being free from contradictions.
In mathematics, there are many different ways to define numbers; similarly, in metaphysics there are many different ways to define objects, properties, and the other entities claimed to make up the world. While metaphysics may, as a special case, study the entities postulated by fundamental science, such as atoms and superstrings, its core topic is the set of categories, such as object and causality, which those scientific theories assume. For example: claiming that "electrons have charge" is a scientific theory; asking what it means for electrons to be "objects" and charge to be a "property" is a task of metaphysics. There are two broad stances about what the "world" studied by metaphysics is. The strong, classical view assumes that the objects studied by metaphysics exist independently of any observer, so that the subject is the most fundamental of all sciences. The weak, modern view assumes that the objects studied by metaphysics exist inside the mind of an observer, so the subject becomes a form of introspection and conceptual analysis. Some philosophers, notably Kant, discuss both of these "worlds" and what can be inferred about each one. Some philosophers, such as the logical positivists, and many scientists reject the strong view of metaphysics as meaningless and unverifiable. Others reply that this criticism also applies to any type of knowledge, including hard science, which claims to describe anything other than the contents of human perception, and thus that the world of perception is the objective world in some sense.
Metaphysics itself assumes that some stance has been taken on these questions and that it may proceed independently of the choice—the question of which stance to take belongs instead to another branch of philosophy, epistemology. Ontology is the philosophical study of the nature of being, existence, or reality, as well as the basic categories of being and their relations. Traditionally listed as the core of metaphysics, ontology deals with questions concerning what entities exist or may be said to exist and how such entities may be grouped, related within a hierarchy, and subdivided according to similarities and differences. Identity is a fundamental metaphysical issue. Metaphysicians investigating identity are tasked with the question of what it means for something to be identical to itself, or — more controversially — to something else. Issues of identity arise in the context of time: what does it mean for something to be itself across two moments in time? How do we account for this? Another question of identity arises when we ask what our criteria ought to be for determining identity.
And how does the reality of identity interface with linguistic expressions? The metaphysical positions one takes on identity have far-reaching implications on issues such as the mind-body problem, personal identity, and law. The ancient Greeks took extreme positions on the nature of change: Parmenides denied change altogether, while Heraclitus argued that change was ubiquitous: "You cannot step into the same river twice." Identity, sometimes called numerical identity, is the relation that a "thing" bears to itself, and which no "thing" bears to anything other than itself. A modern philosopher who made a lasting impact on the philosophy of identity was Leibniz, whose Law of the Indiscernibility of Identicals is still in wide use today. It states that if some object x is identical to some object y, then any property that x has, y will have as well. Put formally: ∀x ∀y (x = y → ∀P (P(x) ↔ P(y))). However, it seems that objects can change over time. If one were to look at a tree one day, and the tree later lost a leaf, it would seem that one could still be looking at that same tree.
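One direction of Leibniz's law as stated above, that identicals share every property, is simple enough to state and machine-check in a proof assistant. A sketch in Lean 4 (the theorem name is hypothetical):

```lean
-- Indiscernibility of identicals: if x = y, then any property P
-- holding of x also holds of y. The proof rewrites along h.
theorem indiscernibility {α : Type} (x y : α) (P : α → Prop)
    (h : x = y) : P x → P y :=
  fun hp => h ▸ hp
```

The converse direction, the identity of indiscernibles, is the philosophically contested half and does not follow from rewriting alone.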
Two rival theories account for the relationship between change and identity: perdurantism, which treats the tree as a series of tree-stages, and endurantism, which maintains that the organism—the same tree—is present at every stage in its history. Objects appear to us in space and time, while abstract entities such as classes, r