A personal computer is a multi-purpose computer whose size and price make it feasible for individual use. Personal computers are intended to be operated directly by an end user, rather than by a computer expert or technician. Unlike large, costly minicomputers and mainframes, personal computers are not shared by many people at the same time through time-sharing. Institutional or corporate computer owners in the 1960s had to write their own programs to do any useful work with the machines. While personal computer users may develop their own applications, these systems typically run commercial software, free-of-charge software, or free and open-source software, provided in ready-to-run form. Software for personal computers is typically developed and distributed independently from the hardware and operating system manufacturers. Many personal computer users no longer need to write their own programs to make any use of a personal computer, although end-user programming is still feasible. This contrasts with mobile systems, where software is often only available through a manufacturer-supported channel and end-user program development may be discouraged by lack of support from the manufacturer.
Since the early 1990s, Microsoft operating systems and Intel hardware have dominated much of the personal computer market, first with MS-DOS and then with Microsoft Windows. Alternatives to Microsoft's Windows operating systems occupy a minority share of the industry; these include free and open-source Unix-like operating systems such as Linux. Advanced Micro Devices provides the main alternative to Intel's processors. The advent of personal computers and the concurrent Digital Revolution have affected the lives of people in all countries. "PC" is an initialism for "personal computer". The IBM Personal Computer incorporated the designation in its model name, and it is sometimes useful to distinguish personal computers of the "IBM Personal Computer" family from personal computers made by other manufacturers. For example, "PC" is used in contrast with "Mac", an Apple Macintosh computer. Since Apple's machines were not mainframes or time-sharing systems, they were all "personal computers", though not "PCs" in this narrower sense.
The "brain" may one day come down to our level and help with our income-tax and book-keeping calculations. But this is speculation and there is no sign of it so far. In the history of computing, early experimental machines could be operated by a single attendant. For example, ENIAC which became operational in 1946 could be run by a single, albeit trained, person; this mode pre-dated the batch programming, or time-sharing modes with multiple users connected through terminals to mainframe computers. Computers intended for laboratory, instrumentation, or engineering purposes were built, could be operated by one person in an interactive fashion. Examples include such systems as the Bendix G15 and LGP-30of 1956, the Programma 101 introduced in 1964, the Soviet MIR series of computers developed from 1965 to 1969. By the early 1970s, people in academic or research institutions had the opportunity for single-person use of a computer system in interactive mode for extended durations, although these systems would still have been too expensive to be owned by a single person.
In what was to be called the Mother of All Demos, SRI researcher Douglas Engelbart in 1968 gave a preview of what would become the staples of daily working life in the 21st century: e-mail, word processing, video conferencing, and the mouse. The demonstration required technical support staff and a mainframe time-sharing computer that were far too costly for individual business use at the time. The development of the microprocessor, with widespread commercial availability starting in the mid-1970s, made computers cheap enough for small businesses and individuals to own. Early personal computers, generally called microcomputers, were sold in kit form and in limited volumes, and were of interest mostly to hobbyists and technicians. Minimal programming was done with toggle switches to enter instructions, and output was provided by front panel lamps. Practical use required adding peripherals such as keyboards, computer displays, disk drives, and printers. The Micral N was the earliest commercial, non-kit microcomputer based on a microprocessor, the Intel 8008.
It was built starting in 1972, and a few hundred units were sold. It had been preceded by the Datapoint 2200 in 1970, for which the Intel 8008 had been commissioned, though not accepted for use; the CPU design implemented in the Datapoint 2200 became the basis for the x86 architecture used in the original IBM PC and its descendants. In 1973, the IBM Los Gatos Scientific Center developed a portable computer prototype called SCAMP, based on the IBM PALM processor, with a Philips compact cassette drive, small CRT, and full-function keyboard. SCAMP emulated an IBM 1130 minicomputer in order to run APL/1130. In 1973, APL was generally available only on mainframe computers, and most desktop-sized microcomputers such as the Wang 2200 or HP 9800 offered only BASIC. Because SCAMP was the first to emulate APL/1130 performance on a portable, single-user computer, PC Magazine in 1983 designated SCAMP a "revolutionary concept" and "the world's first personal computer". This seminal, single-user portable computer now resides in the Smithsonian Institution in Washington, D.C. Successful demonstrations of the 1973 SCAMP prototype led to the IBM 5100 portable microcomputer, launched in 1975 with the ability to be programmed in both APL and BASIC for engineers, analysts, and other business problem-solvers. In the late 1960s such a machine would have been nearly as large as two desks and would have weighed about half a ton.
Generation loss is the loss of quality between subsequent copies or transcodes of data. Anything that reduces the quality of the representation when copying, and that would cause a further reduction in quality when a copy of the copy is made, can be considered a form of generation loss. File size increases are a common result of generation loss, as the introduction of artifacts may increase the entropy of the data through each generation. In analog systems, generation loss is due to noise and bandwidth issues in cables, mixers, recording equipment, and anything else between the source and the destination. Poorly adjusted distribution amplifiers and mismatched impedances can make these problems worse. Repeated conversion between analog and digital can also cause loss. Generation loss was a major consideration in complex analog audio and video editing, where multi-layered edits were created by making intermediate mixes which were "bounced down" back onto tape. Careful planning was required to minimize generation loss and the resulting noise and poor frequency response.
One way of minimizing the number of generations needed was to use an audio mixing or video editing suite capable of mixing a large number of channels at once. The introduction of professional analog noise reduction systems such as Dolby A helped reduce the amount of audible generation loss, but these were superseded by digital systems, which vastly reduced generation loss. According to ATIS, "Generation loss is limited to analog recording because digital recording and reproduction may be performed in a manner that is free from generation loss." Properly used, digital technology can eliminate generation loss: copying a digital file gives an exact copy. This trait of digital technology has given rise to awareness of the risk of unauthorized copying. Before digital technology was widespread, a record label, for example, could be confident that unauthorized copies of its music tracks were never as good as the originals. Processing a lossily compressed file rather than an original results in more loss of quality than generating the same output from an uncompressed original.
For example, a low-resolution digital image for a web page is better if generated from an uncompressed raw image than from an already-compressed JPEG file of higher quality. In digital systems, several techniques, used because of other advantages, may introduce generation loss and must be used with caution. However, copying a digital file itself incurs no generation loss: the copied file is identical to the original, provided a perfect copying channel is used; this is easy to verify in practice, as sketched below. Some digital transforms are reversible, while others are not. Lossless compression is, by definition, reversible, while lossy compression throws away some data which cannot be restored. Many DSP processes are not reversible, so careful planning of an audio or video signal chain from beginning to end, and rearranging it to minimize multiple conversions, is important to avoid generation loss. Arbitrary choices of numbers of pixels and sampling rates for source and intermediates can degrade digital signals in spite of the potential of digital technology for eliminating generation loss completely.
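A minimal sketch of that verification in Python, assuming only that some placeholder file (here named source.wav) exists on disk:

```python
import hashlib
import shutil

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Copy the file and confirm the copy is byte-identical to the original.
shutil.copyfile("source.wav", "copy.wav")
assert sha256_of("source.wav") == sha256_of("copy.wav")
print("Digital copy is exact: no generation loss.")
```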
When using lossy compression, it will ideally only be done once, at the end of the workflow involving the file, after all required changes have been made. Converting between lossy formats – be it decoding and re-encoding to the same format, between different formats, or between different bitrates or parameters of the same format – causes generation loss. Even repeated applications of lossy compression and decompression within one format can cause generation loss if the parameters used are not consistent across generations. Ideally an algorithm will be both idempotent, meaning that if the signal is decoded and re-encoded with identical settings there is no loss, and scalable, meaning that if it is re-encoded with lower quality settings the result will be the same as if it had been encoded from the original signal – see Scalable Video Coding. More generally, transcoding between different parameters of a particular encoding will ideally yield the greatest common shared quality – for instance, converting from an image with 4 bits of red and 8 bits of green to one with 8 bits of red and 4 bits of green would ideally yield an image with 4 bits of red color depth and 4 bits of green color depth without further degradation.
Some lossy compression algorithms are much worse than others in this regard, being neither idempotent nor scalable, and introducing further degradation if parameters are changed. For example, with JPEG, changing the quality setting will cause different quantization constants to be used, causing additional loss. Further, as JPEG is encoded in blocks of 8×8 pixels (16×16 with chroma subsampling), cropping that does not fall on a block boundary shifts the encoding blocks, causing substantial degradation; similar problems happen on rotation. This can be avoided by using tools that crop and rotate losslessly on block boundaries. Digital resampling such as image scaling, and other DSP techniques, can introduce artifacts or degrade signal-to-noise ratio each time they are used, even if the underlying storage is lossless. Resampling causes aliasing, both blurring low-frequency components and adding high-frequency noise, causing jaggies, while rounding off computations to fit in finite precision introduces quantization error.
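Generation loss from repeated lossy re-encoding can be measured directly. The following sketch uses the Pillow imaging library (an assumed dependency; any JPEG encoder would do) to re-encode an image many times at a fixed quality and report how far the result drifts from the first generation. The input file photo.png is a hypothetical placeholder.

```python
from PIL import Image, ImageChops, ImageStat

GENERATIONS = 20
QUALITY = 75  # a typical mid-range JPEG quality setting

# Start from an uncompressed source image (hypothetical file name).
Image.open("photo.png").convert("RGB").save("gen.jpg", quality=QUALITY)
first = Image.open("gen.jpg").convert("RGB")

current = first
for _ in range(GENERATIONS - 1):
    # Decode the previous generation and re-encode it: one "copy of a copy".
    current.save("gen.jpg", quality=QUALITY)
    current = Image.open("gen.jpg").convert("RGB")

# Mean per-channel difference between the first and last generations.
# With identical settings the drift stays small (JPEG is nearly idempotent);
# changing QUALITY between generations makes it grow much faster.
diff = ImageChops.difference(first, current)
print("mean abs difference per channel:", ImageStat.Stat(diff).mean)
```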
The Amiga is a family of personal computers introduced by Commodore in 1985. The original model was part of a wave of 16- and 32-bit computers that featured 256 KB or more of RAM, mouse-based GUIs, and improved graphics and audio over 8-bit systems; this wave included the Atari ST (released the same year), Apple's Macintosh, and the Apple IIGS. Based on the Motorola 68000 microprocessor, the Amiga differed from its contemporaries through the inclusion of custom hardware to accelerate graphics and sound, including sprites and a blitter, and a pre-emptive multitasking operating system called AmigaOS. The Amiga 1000 was released in July 1985, but a series of production problems kept it from becoming widely available until early 1986. The best-selling model, the Amiga 500, was introduced in 1987 and became one of the leading home computers of the late 1980s and early 1990s, with four to six million sold. The A3000, introduced in 1990, started the second generation of Amiga systems, followed by the A500+ and the A600 in March 1992.
As the third generation, the A1200 and the A4000 were released in late 1992. The platform became popular for gaming and programming demos, and it found a prominent role in the desktop video, video production, and show-control businesses, leading to video editing systems such as the Video Toaster. The Amiga's native ability to play back multiple digital sound samples made it a popular platform for early tracker music software; the powerful processor and ability to access several megabytes of memory enabled the development of several 3D rendering packages, including LightWave 3D, Aladdin4D, TurboSilver, and Traces, a predecessor to Blender. Although early Commodore advertisements attempted to cast the computer as an all-purpose business machine when outfitted with the Amiga Sidecar PC compatibility add-on, the Amiga was most commercially successful as a home computer, with a wide range of games and creative software. Poor marketing and the failure of later models to repeat the technological advances of the first systems meant that the Amiga lost its market share to competing platforms, such as fourth-generation game consoles and ever-cheaper IBM PC compatibles, which gained 256-color VGA graphics in 1987.
Commodore went bankrupt in April 1994 after the Amiga CD32 model failed in the marketplace. Since the demise of Commodore, various groups have marketed successors to the original Amiga line, including Genesi, Eyetech, ACube Systems Srl, and A-EON Technology. AmigaOS has influenced replacements and compatible systems such as MorphOS, AmigaOS 4, and AROS. "The Amiga was so far ahead of its time that nobody—including Commodore's marketing department—could articulate what it was all about. Today, it's obvious the Amiga was the first multimedia computer, but in those days it was derided as a game machine because few people grasped the importance of advanced graphics and video. Nine years later, vendors are still struggling to make systems that work like 1985 Amigas." Jay Miner joined Atari in the 1970s to develop custom integrated circuits and led development of the Atari 2600's TIA. As soon as its development was complete, the team began developing a much more sophisticated set of chips, CTIA, ANTIC, and POKEY, that formed the basis of the Atari 8-bit family.
With the 8-bit line's launch in 1979, the team once again started looking at a next-generation chipset. Nolan Bushnell had sold the company to Warner Communications in 1976, and the new management was much more interested in the existing lines than in development of new products that might cut into their sales. Miner wanted to start work with the new Motorola 68000, but management was only interested in another 6502-based system. Miner left the company, and for a time the industry. In 1979, Larry Kaplan left Atari and co-founded Activision. In 1982, Kaplan was approached by a number of investors. Kaplan hired Miner to run the hardware side of the newly formed company, "Hi-Toro". The system was code-named "Lorraine" in keeping with Miner's policy of giving systems female names, in this case after the company president's wife, Lorraine Morse. When Kaplan left the company late in 1982, Miner was promoted to head engineer and the company relaunched as Amiga Corporation. A breadboard prototype was completed by late 1983 and shown at the January 1984 Consumer Electronics Show.
At the time, the operating system was not ready, so the machine was demonstrated with the Boing Ball demo. A further-developed version of the system was demonstrated at the June 1984 CES and shown to many companies in hopes of garnering further funding, but found little interest in a market still in the final stages of the North American video game crash of 1983. In March, Atari expressed a tepid interest in Lorraine for its potential use in a games console or home computer tentatively known as the 1850XLD, but the talks progressed slowly while Amiga was running out of money. A temporary arrangement in June led to a $500,000 loan from Atari to Amiga to keep the company going; the terms required the loan to be repaid at the end of the month, otherwise Amiga would forfeit the Lorraine design to Atari. During 1983, Atari lost over $1 million a week, due to the combined effects of the crash and the ongoing price war in the home computer market. By the end of the year, Warner was desperate to sell the company.
In January 1984, Jack Tramiel resigned from Commodore due to internal battles over the future direction of the company. A number of Commodore employees, including much of the senior technical staff, followed him to Tramiel Technology, where they began development of a 68000-based machine of their own.
In physics and related fields, a wave is a disturbance of a field in which a physical attribute oscillates at each point, or propagates from each point to neighboring points, or seems to move through space. The waves most studied in physics are mechanical and electromagnetic. A mechanical wave is a local deformation in some physical medium that propagates from particle to particle by creating local stresses that cause strain in neighboring particles too. For example, sound waves in air are variations of the local pressure that propagate by collisions between gas molecules. Other examples of mechanical waves are seismic waves, gravity waves, and shock waves. An electromagnetic wave consists of a combination of variable electric and magnetic fields that propagates through space according to Maxwell's equations. Electromagnetic waves can travel through vacuum. Other types of waves include gravitational waves, which are disturbances in a gravitational field that propagate according to general relativity.
Mechanical and electromagnetic waves may seem to travel through space; in mathematics and electronics, waves are studied as signals. On the other hand, some waves do not appear to move at all, like hydraulic jumps. Some, like the probability waves of quantum mechanics, may be completely static. A plane wave seems to travel in a definite direction and has constant value over any plane perpendicular to that direction. Mathematically, the simplest waves are the sinusoidal ones; complicated waves can be described as the sum of many sinusoidal plane waves. A plane wave can be transverse, if its effect at each point is described by a vector perpendicular to the direction of propagation or energy transfer, or longitudinal, if the vector is parallel to it. While mechanical waves can be both transverse and longitudinal, electromagnetic waves are transverse in free space. Consider a traveling transverse wave on a string, treating the string as having a single spatial dimension, and suppose the wave travels in the x direction in space: for example, let the positive x direction be to the right and the negative x direction be to the left.
Suppose also that the wave has constant amplitude u, constant velocity v (where v is independent of wavelength and independent of amplitude), and constant waveform, or shape. This wave can then be described by the two-dimensional functions u(x, t) = F(x − vt) or u(x, t) = G(x + vt), or, more generally, by d'Alembert's formula: u(x, t) = F(x − vt) + G(x + vt), representing two component waveforms F and G traveling through the medium in opposite directions. A generalized representation of this wave can be obtained as the partial differential equation (1/v²) ∂²u/∂t² = ∂²u/∂x². General solutions are based upon Duhamel's principle. The form or shape of F in d'Alembert's formula involves the argument x − vt. Constant values of this argument correspond to constant values of F, and these constant values occur if x increases at the same rate that vt increases; that is, the wave shaped like the function F will move in the positive x direction at velocity v. In the case of a periodic function F with period λ, that is, F(x + λ − vt) = F(x − vt), the periodicity of F in space means that a snapshot of the wave at a given time t finds the wave varying periodically in space with period λ.
In a similar fashion, this periodicity of F implies a periodicity in time as well: F(x − v(t + T)) = F(x − vt) provided vT = λ, so an observation of the wave at a fixed location x finds the wave undulating periodically in time with period T = λ/v. The amplitude of a wave may be constant, or may be modulated so as to vary with time and/or position; the outline of the variation in amplitude is called the envelope of the wave.
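As a sanity check on d'Alembert's formula above, one can evaluate u(x, t) = F(x − vt) + G(x + vt) numerically and confirm that it satisfies the wave equation. Below is a minimal sketch in Python with NumPy; the Gaussian pulse shapes chosen for F and G are arbitrary illustrative assumptions.

```python
import numpy as np

v = 2.0  # wave speed (arbitrary choice)

def F(s):
    return np.exp(-s**2)              # right-moving waveform (arbitrary)

def G(s):
    return 0.5 * np.exp(-(s - 3)**2)  # left-moving waveform (arbitrary)

def u(x, t):
    """d'Alembert solution: superposition of waveforms moving in opposite directions."""
    return F(x - v * t) + G(x + v * t)

# Verify (1/v^2) d2u/dt2 = d2u/dx2 at a sample point via central differences.
x, t, h = 0.7, 0.3, 1e-3
u_tt = (u(x, t + h) - 2 * u(x, t) + u(x, t - h)) / h**2
u_xx = (u(x + h, t) - 2 * u(x, t) + u(x - h, t)) / h**2
print(u_tt / v**2, u_xx)  # the two values agree to within discretization error
```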
SCISYS PLC is a pan-European computer software and services company based in the United Kingdom and Germany. SCISYS is a medium-sized software and IT services company; the AIM-listed company is a leading developer of information and communications technology services, e-business and mobile applications, editorial newsroom solutions, and advanced technology solutions. The SCISYS Group comprises SCISYS UK Ltd, SCISYS Deutschland GmbH, ANNOVA Systems GmbH, and Xibis Ltd. SCISYS was formed in 1980 as Science Systems, and its shares were listed on London's Alternative Investment Market in 1997. In 2000, Science Systems acquired CODA, an accounting software company, and in 2002 the group holding company was renamed CODASciSys plc. The CODA business was demerged as a separately listed company in 2006, following which the original, remaining business became SciSys plc. In 2007, SciSys purchased a private Bochum-based German company, VCS AG. VCS produced software, computer systems, and telecommunications systems for broadcasting, and services for public- and private-sector satellite operators.
In January 2012 the VCS business was renamed SCISYS Deutschland GmbH as part of a wider integration exercise. As part of this, in May 2012, the UK operations of SciSys were rebranded as SCISYS; these two changes brought the business divisions together, operating under the same name and branding. In October 2012, SCISYS purchased the space-market elements of the German company MakaluMedia. In December 2014, SCISYS announced the acquisition of Xibis Limited, a mobile- and web-development firm. In November 2016, SCISYS announced the acquisition of ANNOVA Systems GmbH, a leading provider of software solutions for media and broadcast, in particular its OpenMedia newsroom computer systems. In May 2017, SCISYS announced that the Media & Broadcast division of SCISYS Deutschland had moved from Bochum to new offices in nearby Dortmund; the Bochum location remains the home of its space division. In 2018, SCISYS moved its headquarters to Ireland, a move seen as a response to the then-pending Brexit.
SCISYS locations: Chippenham, Bristol, and Reading in the UK; Bochum, Dortmund, and Darmstadt in Germany. ANNOVA Systems locations: Munich, Germany. Xibis Ltd locations: Oadby, Leicester, UK. Websites: SCISYS (English), SCISYS Deutschland GmbH (German), ANNOVA Systems (English), Xibis Ltd.
The compact disc is a digital optical disc data storage format, co-developed by Philips and Sony and released in 1982. The format was originally developed to store and play only sound recordings but was later adapted for storage of data. Several other formats were further derived from these, including write-once audio and data storage, rewritable media, Video Compact Disc, Super Video Compact Disc, Photo CD, PictureCD, CD-i, and Enhanced Music CD. The first commercially available audio CD player, the Sony CDP-101, was released in October 1982 in Japan. Standard CDs have a diameter of 120 millimetres and can hold up to about 80 minutes of uncompressed audio or about 700 MiB of data. The Mini CD has various diameters ranging from 60 to 80 millimetres. At the time of the technology's introduction in 1982, a CD could store much more data than a personal computer hard drive, which would typically hold 10 MB. By 2010, hard drives offered as much storage space as a thousand CDs, while their prices had plummeted to commodity level. In 2004, worldwide sales of audio CDs, CD-ROMs, and CD-Rs reached about 30 billion discs.
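The audio and data capacities quoted above follow directly from the CD sector layout: a disc is read at 75 sectors per second, each sector carrying 2,352 bytes of audio but only 2,048 bytes of user data once extra error correction is added. A quick back-of-the-envelope check in Python:

```python
SECTORS_PER_SECOND = 75
AUDIO_BYTES_PER_SECTOR = 2352  # 1/75 s of 44,100 Hz, 16-bit, stereo audio
DATA_BYTES_PER_SECTOR = 2048   # remainder is used for extra error correction

minutes = 80
sectors = minutes * 60 * SECTORS_PER_SECOND

audio_bytes = sectors * AUDIO_BYTES_PER_SECTOR
data_bytes = sectors * DATA_BYTES_PER_SECTOR

print(f"audio: {audio_bytes / 1e6:.0f} MB")    # ~847 MB of raw audio samples
print(f"data:  {data_bytes / 2**20:.0f} MiB")  # ~703 MiB, the "700 MiB" figure
```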
By 2007, 200 billion CDs had been sold worldwide. From the early 2000s CDs were increasingly replaced by other forms of digital storage and distribution, with the result that by 2010 the number of audio CDs being sold in the U.S. had dropped about 50% from their peak. In 2014, revenues from digital music services matched those from physical format sales for the first time. American inventor James T. Russell has been credited with inventing the first system to record digital information on an optical transparent foil, lit from behind by a high-power halogen lamp. Russell's patent application was filed in 1966, and he was granted a patent in 1970. Following litigation, Sony and Philips licensed Russell's patents in the 1980s. The compact disc is an evolution of LaserDisc technology, where a focused laser beam is used that enables the high information density required for high-quality digital audio signals. Prototypes were developed by Philips and Sony independently in the late 1970s. Although dismissed by Philips Research management as a trivial pursuit, the CD became the primary focus for Philips as the LaserDisc format struggled.
In 1979, Sony and Philips set up a joint task force of engineers to design a new digital audio disc. After a year of experimentation and discussion, the Red Book CD-DA standard was published in 1980. After their commercial release in 1982, compact discs and their players were extremely popular: despite the players costing up to $1,000, over 400,000 were sold in the United States between 1983 and 1984. By 1988, CD sales in the United States surpassed those of vinyl LPs, and by 1992 CD sales surpassed those of prerecorded music cassette tapes. The success of the compact disc has been credited to the cooperation between Philips and Sony, which together agreed upon and developed compatible hardware. The unified design of the compact disc allowed consumers to purchase any disc or player from any company, and allowed the CD to dominate the at-home music market unchallenged. In 1974, Lou Ottens, director of the audio division of Philips, started a small group with the aim of developing an analog optical audio disc with a diameter of 20 cm and a sound quality superior to that of the vinyl record.
However, due to the unsatisfactory performance of the analog format, two Philips research engineers recommended a digital format in March 1974. In 1977, Philips established a laboratory with the mission of creating a digital audio disc; the diameter of Philips's prototype compact disc was set at 11.5 cm, the diagonal of an audio cassette. Heitaro Nakajima, who developed an early digital audio recorder within Japan's national public broadcasting organization NHK in 1970, became general manager of Sony's audio department in 1971; his team developed a digital PCM adaptor audio tape recorder using a Betamax video recorder in 1973. After this, in 1974, the leap to storing digital audio on an optical disc was made. Sony first publicly demonstrated an optical digital audio disc in September 1976. A year later, in September 1977, Sony showed the press a 30 cm disc that could play 60 minutes of digital audio using MFM modulation. In September 1978, the company demonstrated an optical digital audio disc with a 150-minute playing time, 44,056 Hz sampling rate, 16-bit linear resolution, and cross-interleaved error correction code—specifications similar to those settled upon for the standard compact disc format in 1980.
Technical details of Sony's digital audio disc were presented during the 62nd AES Convention, held on 13–16 March 1979, in Brussels. Sony's AES technical paper was published on 1 March 1979. A week later, on 8 March, Philips publicly demonstrated a prototype of an optical digital audio disc at a press conference called "Philips Introduce Compact Disc" in Eindhoven, Netherlands. Sony executive Norio Ohga, later CEO and chairman of Sony, and Heitaro Nakajima were convinced of the format's commercial potential and pushed further development despite widespread skepticism. As a result, in 1979, Sony and Philips set up a joint task force of engineers to design a new digital audio disc. Led by engineers Kees Schouhamer Immink and Toshitada Doi, the research pushed forward laser and optical disc technology. After a year of experimentation and discussion, the task force produced the Red Book CD-DA standard, first published in 1980.
Metadata is "data that provides information about other data". Many distinct types of metadata exist, among these descriptive metadata, structural metadata, administrative metadata, reference metadata and statistical metadata. Descriptive metadata describes a resource for purposes such as identification, it can include elements such as title, abstract and keywords. Structural metadata is metadata about containers of data and indicates how compound objects are put together, for example, how pages are ordered to form chapters, it describes the types, versions and other characteristics of digital materials. Administrative metadata provides information to help manage a resource, such as when and how it was created, file type and other technical information, who can access it. Reference metadata describes the contents and quality of statistical data Statistical metadata may describe processes that collect, process, or produce statistical data. Metadata was traditionally used in the card catalogs of libraries until the 1980s, when libraries converted their catalog data to digital databases.
In the 2000s, as digital formats were becoming the prevalent way of storing data and information, metadata was increasingly used to describe digital data using metadata standards. The first description of "meta data" for computer systems is purportedly noted by MIT's Center for International Studies experts David Griffel and Stuart McIntosh in 1967: "In summary we have statements in an object language about subject descriptions of data and token codes for the data. We have statements in a meta language describing the data relationships and transformations, ought/is relations between norm and data." There are different metadata standards for each different discipline. Describing the contents and context of data or data files increases their usefulness. For example, a web page may include metadata specifying what software language the page is written in, what tools were used to create it, what subjects the page is about, and where to find more information about the subject; this metadata can automatically improve the reader's experience and make it easier for users to find the web page online.
A CD may include metadata providing information about the musicians and songwriters whose work appears on the disc. A principal purpose of metadata is to help users discover resources. Metadata helps to organize electronic resources, provide digital identification, and support the archiving and preservation of resources. Metadata assists users in resource discovery by "allowing resources to be found by relevant criteria, identifying resources, bringing similar resources together, distinguishing dissimilar resources, giving location information." Metadata of telecommunication activities, including Internet traffic, is widely collected by various national governmental organizations. This data can be used for mass surveillance. In many countries, the metadata relating to emails, telephone calls, web pages, video traffic, IP connections, and cell phone locations is stored by government organizations. Metadata means "data about data". Although the "meta" prefix means "after" or "beyond", it is used to mean "about" in epistemology.
Metadata is defined as data providing information about one or more aspects of the data. Some examples include: the means of creation of the data, the purpose of the data, the time and date of creation, the creator or author of the data, the location on a computer network where the data was created, the standards used, the file size, the data quality, the source of the data, and the process used to create the data. For example, a digital image may include metadata that describes how large the picture is, the color depth, the image resolution, when the image was created, the shutter speed, and other data (a sketch of reading such metadata follows this paragraph). A text document's metadata may contain information about how long the document is, who the author is, when the document was written, and a short summary of the document. Metadata within web pages can contain descriptions of page content, as well as key words linked to the content; these links are called "metatags", which were used as a primary factor in determining order for a web search until the late 1990s. Reliance on metatags in web searches decreased in the late 1990s because of "keyword stuffing".
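Returning to the digital image example above, the sketch below uses the Pillow imaging library (an assumed dependency) to read the dimensions and embedded EXIF fields, such as capture time and shutter speed, of a hypothetical file photo.jpg:

```python
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("photo.jpg")  # hypothetical input file

# Basic structural metadata: pixel dimensions and color mode.
print("size:", img.size, "mode:", img.mode)

# EXIF metadata embedded by the camera, keyed by numeric tag IDs.
exif = img.getexif()
for tag_id, value in exif.items():
    name = TAGS.get(tag_id, tag_id)  # map numeric IDs to readable names
    print(f"{name}: {value}")
```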
Metatags were being misused to trick search engines into thinking some websites had more relevance in the search than they actually did. Metadata can be stored and managed in a database called a metadata registry or metadata repository. However, without context and a point of reference, it might be impossible to identify metadata just by looking at it. For example: by itself, a database containing several numbers, all 13 digits long, could be the results of calculations or a list of numbers to plug into an equation - without any other context, the numbers themselves can be perceived as the data. But given the context that this database is a log of a book collection, those 13-digit numbers may now be identified as ISBNs - information that refers to the book but is not itself the information within the book. The term "metadata" was coined in 1968 by Philip Bagley, in his book "Extension of Programming Language Concepts", where it is clear that he uses the term in the ISO 11179 "traditional" sense, "structural metadata", i.e. "data about the containers of data".
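One way to test the ISBN hypothesis in that example is the ISBN-13 check digit: the thirteen digits, weighted alternately by 1 and 3, must sum to a multiple of 10. A minimal sketch (the sample numbers are illustrative; the first is a well-known checksum example):

```python
def is_valid_isbn13(digits: str) -> bool:
    """Check the ISBN-13 checksum: the weighted digit sum must be divisible by 10."""
    if len(digits) != 13 or not digits.isdigit():
        return False
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits))
    return total % 10 == 0

# A 13-digit number that passes is merely *consistent with* being an ISBN;
# one that fails certainly is not.
print(is_valid_isbn13("9780306406157"))  # True
print(is_valid_isbn13("9780306406158"))  # False (last digit altered)
```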