Metadata is "data that provides information about other data". Many distinct types of metadata exist, among them descriptive, structural, administrative, reference and statistical metadata. Descriptive metadata describes a resource for purposes such as identification; it can include elements such as title, abstract and keywords. Structural metadata is metadata about containers of data and indicates how compound objects are put together, for example, how pages are ordered to form chapters; it describes the types, versions and other characteristics of digital materials. Administrative metadata provides information that helps manage a resource, such as when and how it was created, its file type and other technical information, and who can access it. Reference metadata describes the contents and quality of statistical data. Statistical metadata may describe processes that collect, process, or produce statistical data. Metadata was traditionally used in the card catalogs of libraries until the 1980s, when libraries converted their catalog data to digital databases.
In the 2000s, as digital formats became the prevalent way of storing data and information, metadata was increasingly used to describe digital data using metadata standards. The first description of "meta data" for computer systems is purportedly noted by MIT's Center for International Studies experts David Griffel and Stuart McIntosh in 1967: "In summary we have statements in an object language about subject descriptions of data and token codes for the data. We have statements in a meta language describing the data relationships and transformations, ought/is relations between norm and data." There are different metadata standards for each discipline. Describing the contents and context of data or data files increases their usefulness. For example, a web page may include metadata specifying what software language the page is written in, what tools were used to create it, what subjects the page is about, and where to find more information about those subjects; this metadata can automatically improve the reader's experience and make it easier for users to find the web page online.
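The kind of page-level metadata described above is exposed in HTML through meta tags. As a minimal sketch, the following uses Python's standard-library HTML parser to collect such tags from a page; the page content and tag values are invented for illustration.

```python
from html.parser import HTMLParser

class MetaTagParser(HTMLParser):
    """Collects <meta name="..." content="..."> pairs from an HTML page."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) tuples
        if tag == "meta":
            attrs = dict(attrs)
            if "name" in attrs and "content" in attrs:
                self.meta[attrs["name"]] = attrs["content"]

# A hypothetical page carrying descriptive metadata about itself
page = """<html><head>
<meta name="description" content="An introduction to metadata">
<meta name="keywords" content="metadata, cataloging, standards">
<meta name="generator" content="ExampleCMS 2.1">
</head><body>...</body></html>"""

parser = MetaTagParser()
parser.feed(page)
print(parser.meta["keywords"])  # metadata, cataloging, standards
```

Search engines and browsers do essentially this: they read the metadata without rendering or interpreting the page body itself.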
A CD may include metadata providing information about the musicians and songwriters whose work appears on the disc. A principal purpose of metadata is to help users discover resources. Metadata helps to organize electronic resources, provide digital identification, and support the archiving and preservation of resources. Metadata assists users in resource discovery by "allowing resources to be found by relevant criteria, identifying resources, bringing similar resources together, distinguishing dissimilar resources, giving location information." Metadata of telecommunication activities, including Internet traffic, is widely collected by various national governmental organizations. This data can be used for mass surveillance. In many countries, the metadata relating to emails, telephone calls, web pages, video traffic, IP connections and cell phone locations is stored by government organizations. Metadata means "data about data". Although the "meta" prefix means "after" or "beyond", it is used to mean "about" in epistemology.
Metadata is defined as data providing information about one or more aspects of the data. Some examples include: the means of creation of the data, the purpose of the data, the time and date of creation, the creator or author of the data, the location on a computer network where the data was created, the standards used, the file size, the data quality, the source of the data, and the process used to create the data. For example, a digital image may include metadata that describes how large the picture is, the color depth, the image resolution, when the image was created, the shutter speed, and other data. A text document's metadata may contain information about how long the document is, who the author is, when the document was written, and a short summary of the document. Metadata within web pages can contain descriptions of page content, as well as keywords linked to the content; these links are called "metatags", which were used as the primary factor in determining order for a web search until the late 1990s. The reliance on metatags in web searches was decreased in the late 1990s because of "keyword stuffing".
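Several of the examples above (file size, creation or modification time, file type) are exactly what a filesystem records about a file, separately from its content. A minimal sketch using only Python's standard library; the filename and helper function are invented for illustration:

```python
import os
import time

def describe_file(path):
    """Return a small administrative-metadata record for a file:
    size, timestamp and type are data *about* the file, not its content."""
    st = os.stat(path)
    return {
        "file_size_bytes": st.st_size,
        "modified": time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(st.st_mtime)),
        "file_type": os.path.splitext(path)[1] or "unknown",
    }

# Create a small document, then inspect its metadata rather than its content.
with open("notes.txt", "w") as f:
    f.write("Metadata is data about data.")

record = describe_file("notes.txt")
print(record["file_size_bytes"])  # 28
```

Note that nothing in the record reveals what the document says; the metadata describes the container, not the contents.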
Metatags were being misused to trick search engines into thinking some websites had more relevance in the search than they did. Metadata can be stored and managed in a database called a metadata registry or metadata repository. However, without context and a point of reference, it might be impossible to identify metadata just by looking at it. For example, by itself, a database containing several numbers, all 13 digits long, could be the results of calculations or a list of numbers to plug into an equation; without any other context, the numbers themselves can be perceived as the data. But given the context that this database is a log of a book collection, those 13-digit numbers may now be identified as ISBNs: information that refers to the book but is not itself the information within the book. The term "metadata" was coined in 1968 by Philip Bagley, in his book "Extension of Programming Language Concepts", where it is clear that he uses the term in the ISO 11179 "traditional" sense of "structural metadata", i.e. "data about the containers of data".
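The ISBN example has a useful property: once the context "these are ISBNs" is known, the numbers can even be checked for internal consistency, because ISBN-13 numbers carry a check digit. A small sketch of that published checksum rule (digits are weighted alternately 1 and 3, and the weighted sum must be divisible by 10):

```python
def is_valid_isbn13(digits: str) -> bool:
    """ISBN-13 check: weight digits alternately 1, 3, 1, 3, ...;
    the weighted sum must be divisible by 10."""
    if len(digits) != 13 or not digits.isdigit():
        return False
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits))
    return total % 10 == 0

print(is_valid_isbn13("9780306406157"))  # True
print(is_valid_isbn13("9780306406158"))  # False (last digit altered)
```

A plain 13-digit number in a spreadsheet says nothing; the same number, identified as an ISBN, becomes verifiable metadata about a book.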
Interoperability is a characteristic of a product or system whose interfaces are completely understood, allowing it to work with other products or systems, at present or in the future, in either implementation or access, without any restrictions. While the term was initially defined for information technology or systems engineering services to allow for information exchange, a broader definition takes into account the social and organizational factors that impact system-to-system performance. A related challenge is the task of building coherent services for users when the individual components are technically different and managed by different organizations. If two or more systems are capable of communicating with each other, they exhibit syntactic interoperability when using specified data formats and communication protocols. XML and SQL standards are among the tools of syntactic interoperability; this is also true for lower-level data formats, such as ensuring that alphabetical characters are stored in the same variant of ASCII or Unicode in all the communicating systems.
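The point about character encodings can be made concrete: the same text stored under two different encodings produces different byte sequences, and a receiver that assumes the wrong one garbles the data. A minimal sketch:

```python
# The same characters under two encodings yield different byte sequences;
# communicating systems must agree on one to achieve syntactic interoperability.
text = "café"

utf8_bytes = text.encode("utf-8")      # b'caf\xc3\xa9'  (4 chars -> 5 bytes)
latin1_bytes = text.encode("latin-1")  # b'caf\xe9'      (4 chars -> 4 bytes)

# A receiver that assumes the wrong encoding produces "mojibake":
print(utf8_bytes.decode("latin-1"))  # cafÃ©
```

Agreement on the encoding is part of the "specified data formats" that syntactic interoperability requires; higher-level formats such as XML build on this layer.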
Beyond the ability of two or more computer systems to exchange information, semantic interoperability is the ability to automatically interpret the information exchanged meaningfully, in order to produce useful results as defined by the end users of both systems. To achieve semantic interoperability, both sides must refer to a common information exchange reference model; the content of the information exchange requests is then unambiguously defined: what is sent is the same as what is understood. The possibility of promoting this result by user-driven convergence of disparate interpretations of the same information has been the object of study by research prototypes such as S3DB. Cross-domain interoperability involves multiple social, political and legal entities working together for a common interest and/or information exchange. Interoperability implies open standards ab initio, i.e. by definition. It implies exchanges between a range of products, or similar products from several different vendors, or between past and future revisions of the same product.
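The role of a common reference model can be sketched in a few lines: two systems that name the same concepts differently each map their local fields onto a shared vocabulary, so either side can interpret the other's records unambiguously. All field names and values below are invented for illustration.

```python
# A shared reference model: the agreed vocabulary both systems map onto.
REFERENCE_MODEL = {"patient_id", "birth_date", "diagnosis_code"}

# Each system's local field names, mapped to the reference vocabulary.
SYSTEM_A_TO_REF = {"pid": "patient_id", "dob": "birth_date",
                   "icd": "diagnosis_code"}
SYSTEM_B_TO_REF = {"PatientNumber": "patient_id", "BirthDate": "birth_date",
                   "DiagnosisICD10": "diagnosis_code"}

def to_reference(record, mapping):
    """Translate a system-local record into the shared reference vocabulary."""
    return {mapping[k]: v for k, v in record.items() if k in mapping}

a_record = {"pid": "A-17", "dob": "1980-05-04", "icd": "J45"}
shared = to_reference(a_record, SYSTEM_A_TO_REF)
print(shared["patient_id"])  # A-17
```

What is sent (system A's record) and what is understood (the reference-model view) now coincide by construction, which is exactly the property semantic interoperability asks for.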
Interoperability may also be developed post facto, as a special measure between two products, while excluding the rest, by using open standards. When a vendor is forced to adapt its system to a dominant system that is not based on open standards, it is not interoperability but only compatibility. Open standards rely on a broadly consultative and inclusive group, including representatives from vendors and others holding a stake in the development, that discusses and debates the technical and economic merits and feasibility of a proposed common protocol. After the doubts and reservations of all members are addressed, the resulting common document is endorsed as a common standard; this document is subsequently released to the public and henceforth becomes an open standard. It is published and available free or at a nominal cost to any and all comers, with no further encumbrances. Various vendors and individuals can use the standards document to make products that implement the common protocol defined in the standard and are thus interoperable by design, with no specific liability or advantage for any customer for choosing one product over another on the basis of standardised features.
The vendors' products compete on the quality of their implementation, user interface, ease of use, price, and a host of other factors, while keeping the customers' data intact and transferable if they choose to switch to another competing product for business reasons. Post facto interoperability may be the result of the absolute market dominance of a particular product in contravention of any applicable standards, or may arise when no effective standards were present at the time of that product's introduction. The vendor behind such a product can choose to ignore any forthcoming standards and not co-operate in any standardisation process at all, using its near-monopoly to insist that its product sets the de facto standard by its market dominance. This is not a problem if the product's implementation is open and minimally encumbered, but it may as well be both closed and encumbered. Because of the network effect, achieving interoperability with such a product is both critical for any other vendor that wishes to remain relevant in the market, and difficult to accomplish because of the lack of co-operation on equal terms with the original vendor, who may well see the new vendor as a potential competitor and threat.
The newer implementations then rely on clean-room reverse engineering, in the absence of technical data, to achieve interoperability. The original vendor may provide such technical data to others in the name of 'encouraging competition', but such data is invariably encumbered and may be of limited use. Availability of such data is not equivalent to an open standard, because the data is provided by the original vendor on a discretionary basis, and the vendor has every interest in blocking the effective implementation of competing solutions. It may subtly alter or change its product in newer revisions, so that competitors' implementations are almost, but not quite, interoperable, leading customers to consider them unreliable or of a lower quality; these changes can either not be passed on to other vendors at all, or passed on only after a strategic delay, maintaining the market dominance of the original vendor. The data itself may be encumbered, e.g. by patents or pricing, leading to a dependence of all competing solutions on the original vendor and channelling a revenue stream from the competitors' customers back to the original vendor.
This revenue stream is only a result of the original vendor's market dominance.
KDE is an international free software community developing free and open-source software. As a central development hub, it provides tools and resources that allow collaborative work on this kind of software. Well-known products include the Plasma Desktop, KDE Frameworks and a range of cross-platform applications like Krita or digiKam, designed to run on Unix and Unix-like desktops, Microsoft Windows and Android. As one of KDE's most recognized projects, the Plasma Desktop is the default desktop environment on many Linux distributions, such as openSUSE, Mageia, OpenMandriva, Kubuntu, KaOS and PCLinuxOS. The KDE community and its work can be measured in the following figures: KDE is one of the largest active free software communities. More than 2500 contributors participate in developing KDE software. About 20 new developers contribute their first code each month. KDE software consists of over 6 million lines of code. KDE software has been translated into over 108 languages. KDE software is available on more than 114 official FTP mirrors in over 34 countries.
A read-only mirror of all repositories can be found on GitHub. There are many free software projects maintained by the KDE community. The project known as KDE or KDE SC nowadays consists of three parts: KDE Plasma, a platform UI that provides the base for different workspaces like Plasma Desktop or Plasma Mobile; KDE Frameworks, a collection of more than 70 free-to-use libraries built on top of Qt; and KDE Applications. KDE Plasma is a user interface technology that can be adjusted to run on various form factors like desktops, netbooks, smartphones and embedded devices. The brand Plasma for the graphical workspaces was introduced from KDE SC 4.4 onwards. During the fourth series there were two additional workspaces besides the Plasma 4 Desktop, called Plasma Netbook and Plasma Active. The latest KDE Plasma 5 features the following workspaces: Plasma Desktop for mouse- or keyboard-driven computing devices like desktops or laptops, Plasma Mobile for smartphones, Plasma Minishell for embedded and touch-enabled devices such as IoT or automotive systems, and Plasma Media Center for TVs and set-top boxes. KDE Frameworks provides more than 70 free and open-source libraries built on top of Qt.
Starting with Qt 5, this platform was transformed into a set of modules, now referred to as KDE Frameworks. These modules, which include Solid, Phonon and others, are licensed under the LGPL, BSD license, MIT License or X11 license. KDE Applications is a bundle of software that is part of the official KDE Applications release. Applications like Okular, Dolphin or Kdenlive are built on KDE Frameworks and released on a four-month schedule, with version numbers of the form YY.MM. Software not part of the official KDE Applications bundle can be found in the "Extragear" section and carries its own version numbers. There are many standalone applications like KTorrent, Krita or Amarok that are designed to be portable between operating systems and deployable independently of a particular workspace or desktop environment; some brands consist of multiple applications, such as KDE Kontact. KDE neon is a software repository that aims to provide users with up-to-date Qt and KDE software, while updating the rest of the OS components from the Ubuntu repositories at the normal pace.
KDE maintains that it is not a "KDE distribution," but rather an up-to-date archive of KDE and Qt packages. There are two "Developer" editions of KDE neon. WikiToLearn, abbreviated WTL, is one of KDE's newer endeavors: a wiki that provides a platform to share open-source textbooks. The idea is to have a massive library of textbooks for anyone and everyone to create. Its roots lie in the University of Milan, where a group of physics majors wanted to share notes and then decided that the idea was for everyone and not just their internal friend group. It has become an official KDE project with several universities backing it. Like many free/open-source projects, developing KDE software is a volunteer effort, although various companies, such as Novell, Nokia, or Blue Systems, employ or employed developers to work on various parts of the project. A large number of individuals contribute to KDE in various ways (e.g. code).
OCLC Online Computer Library Center, Incorporated, d/b/a OCLC, is an American nonprofit cooperative organization "dedicated to the public purposes of furthering access to the world's information and reducing information costs". It was founded in 1967 as the Ohio College Library Center. OCLC and its member libraries cooperatively produce and maintain WorldCat, the largest online public access catalog in the world. OCLC is funded by the fees that libraries pay for its services. OCLC maintains the Dewey Decimal Classification system. OCLC began in 1967, as the Ohio College Library Center, through a collaboration of university presidents, vice presidents and library directors who wanted to create a cooperative computerized network for libraries in the state of Ohio. The group first met on July 5, 1967 on the campus of the Ohio State University to sign the articles of incorporation for the nonprofit organization, and hired Frederick G. Kilgour, a former Yale University medical school librarian, to design the shared cataloging system.
Kilgour wished to merge the latest information storage and retrieval system of the time, the computer, with the oldest, the library. The plan was to merge the catalogs of Ohio libraries electronically through a computer network and database in order to streamline operations, control costs and increase efficiency in library management, bringing libraries together to cooperatively keep track of the world's information so as to best serve researchers and scholars. The first library to do online cataloging through OCLC was the Alden Library at Ohio University, on August 26, 1971. This was the first online cataloging by any library worldwide. Membership in OCLC is based on use of services and contribution of data. Between 1967 and 1977, OCLC membership was limited to institutions in Ohio, but in 1978 a new governance structure was established that allowed institutions from other states to join. In 2002, the governance structure was again modified to accommodate participation from outside the United States.
As OCLC expanded services in the United States outside Ohio, it relied on establishing strategic partnerships with "networks", organizations that provided training and marketing services. By 2008, there were 15 independent United States regional service providers. OCLC networks played a key role in OCLC governance, with networks electing delegates to serve on the OCLC Members Council. During 2008, OCLC commissioned two studies to look at distribution channels. In early 2009, OCLC negotiated new contracts with the former networks and opened a centralized support center. OCLC provides bibliographic and full-text information to anyone. OCLC and its member libraries cooperatively produce and maintain WorldCat—the OCLC Online Union Catalog, the largest online public access catalog in the world. WorldCat has holding records from private libraries worldwide; the Open WorldCat program, launched in late 2003, exposed a subset of WorldCat records to Web users via popular Internet search and bookselling sites.
In October 2005, the OCLC technical staff began a wiki project, WikiD, allowing readers to add commentary and structured-field information associated with any WorldCat record; WikiD was later phased out. The Online Computer Library Center acquired the trademark and copyrights associated with the Dewey Decimal Classification System when it bought Forest Press in 1988. A browser for books with their Dewey Decimal Classifications was available until July 2013. Until August 2009, when it was sold to Backstage Library Works, OCLC owned a preservation microfilm and digitization operation called the OCLC Preservation Service Center, with its principal office in Bethlehem, Pennsylvania. The reference management service QuestionPoint provides libraries with tools to communicate with users; this around-the-clock reference service is provided by a cooperative of participating global libraries. Starting in 1971, OCLC produced catalog cards for members alongside its shared online catalog. OCLC commercially sells software, such as CONTENTdm for managing digital collections.
It offers the bibliographic discovery system WorldCat Discovery, which allows library patrons to use a single search interface to access an institution's catalog, database subscriptions and more. OCLC has been conducting research for the library community for more than 30 years. In accordance with its mission, OCLC makes its research outcomes known through various publications; these publications, including journal articles, reports and presentations, are available through the organization's website. OCLC Publications – research articles from various journals, including Code4Lib Journal, OCLC Research, Reference & User Services Quarterly, College & Research Libraries News, Art Libraries Journal and the National Education Association Newsletter; the most recent publications are displayed first, and all archived resources, starting in 1970, are available. Membership Reports – a number of significant reports on topics ranging from virtual reference in libraries to perceptions about library funding. Newsletters – current and archived newsletters for the library and archive community.
Presentations – Presentations from both guest speakers and OCLC research from conferences and other events. The presentations are organized into five categories: Conference presentations, Dewey presentations, Distinguished Seminar Series, Guest presentations, Research staff
GNOME is a free and open-source desktop environment for Unix-like operating systems. GNOME was originally an acronym for GNU Network Object Model Environment, but the acronym was dropped because it no longer reflected the vision of the GNOME project. GNOME is part of the GNU Project and developed by The GNOME Project, composed of both volunteers and paid contributors, the largest corporate contributor being Red Hat. It is an international project that aims to develop software frameworks for the development of software, to program end-user applications based on these frameworks, and to coordinate efforts for internationalization, localization and accessibility of that software. GNOME 3 is the default desktop environment on many major Linux distributions, including Fedora, Ubuntu, SUSE Linux Enterprise, Red Hat Enterprise Linux, CentOS, Oracle Linux, Scientific Linux, SteamOS, Kali Linux and Endless OS. MATE, the continued fork of the last GNOME 2 release, is the default on many distributions that target low usage of system resources.
GNOME was started on August 15, 1997 by Miguel de Icaza and Federico Mena as a free software project to develop a desktop environment and applications for it. It was founded in part because K Desktop Environment, then growing in popularity, relied on the Qt widget toolkit, which used a proprietary software license until version 2.0. In place of Qt, the GTK toolkit was chosen as the base of GNOME. GTK uses the GNU Lesser General Public License, a free software license that allows software linking to it to use a much wider set of licenses, including proprietary software licenses. GNOME itself is licensed under the LGPL for its libraries and the GNU General Public License for its applications. The name "GNOME" referred to the original intention of creating a distributed object framework similar to Microsoft's OLE. The California startup Eazel developed the Nautilus file manager from 1999 to 2001.
De Icaza and Nat Friedman founded Helix Code in 1999 in Massachusetts. During the transition to GNOME 2 around the year 2001 and shortly thereafter, there were brief talks about creating a GNOME Office suite. On September 15, 2003, GNOME-Office 1.0, consisting of AbiWord 2.0, GNOME-DB 1.0 and Gnumeric 1.2.0, was released. Although some release planning for GNOME Office 1.2 happened on the gnome-office mailing list, and Gnumeric 1.4 was announced as a part of it, the 1.2 release of the suite itself never materialized. As of May 4, 2014, the GNOME wiki only mentions "GNOME/Gtk applications that are useful in an office environment". GNOME 2 was similar to a conventional desktop interface, featuring a simple desktop in which users could interact with virtual objects, such as windows and files. GNOME 2 started out with Sawfish but switched to Metacity as its default window manager. The handling of windows and files in GNOME 2 is similar to that of contemporary desktop operating systems. In the default configuration of GNOME 2, the desktop has a launcher menu for quick access to installed programs and file locations.
However, these features can be moved to any position or orientation the user desires, replaced with other functions, or removed altogether. As of 2009, GNOME 2 was the default desktop for OpenSolaris. GNOME 1 and 2 followed the traditional desktop metaphor. GNOME 3, released in 2011, changed this with GNOME Shell, a more abstract metaphor where switching between different tasks and virtual desktops takes place in a separate area called the "Overview". Since Mutter replaced Metacity as the default window manager, the minimize and maximize buttons no longer appear by default, and the title bar, menu bar and tool bar are combined into one horizontal bar called the "header bar" via the client-side decoration mechanism. Adwaita replaced Clearlooks as the default theme. Many GNOME Core Applications went through redesigns to provide a more consistent user experience. The release of GNOME 3, notable for its move away from the traditional menu bar and taskbar, caused considerable controversy in the GNU and Linux community.
Many users and developers have expressed concerns about usability. A few projects have been initiated to continue development of GNOME 2.x or to modify GNOME 3.x to be more like the 2.x releases. GNOME 3 aims to provide a single interface for desktop computers and tablet computers; this means using only input techniques that work on all those devices, requiring abandonment of certain concepts to which desktop users were accustomed, such as right-clicking, or saving files on the desktop. These major changes evoked widespread criticism; the MATE desktop environment was forked from the GNOME 2 code-base with the intent of retaining the traditional GNOME 2 interface, whilst keeping compatibility with modern Linux technology, such as GTK 3. The Linux Mint team addressed the issue in another way by developing the "Mint GNOME Shell Extensions" that ran on top of GNOME Shell and allowed it to be used via the traditional desktop metaphor; this led to the creation of the Cinnamon user interface, forked from the GNOME 3 codebase.
Among those critical of the early releases of GNOME 3 is Linus Torvalds, the creator of the Linux kernel. Torvalds abandoned GNOME for a while in favor of other desktop environments before later returning to it.
Free University of Bozen-Bolzano
The Free University of Bozen-Bolzano is a university located in Bolzano, Italy. It is organized into five faculties. The university aims to offer students a multilingual, practice-oriented education that meets the demands of the local and the European labour market. Lectures and seminars are held in German and English; the only exception is the Faculty of Education, which offers German- and Ladin-speaking students separate training sections. The university offers study programmes ranging from bachelor's degrees to doctorates. The emphasis of teaching is on providing theoretically grounded, practice-orientated training. A large proportion of educational activities take place as seminars, lectures given by guest speakers, practical training and workshops. Within the framework of the academic exchange programme, students are encouraged to spend one or more semesters at universities abroad. The university has three campuses: at Bolzano, Brixen and Bruneck. The buildings in Bolzano were designed by the architects Matthias Bischoff and Roberto Azzola of Zurich, and those at Brixen by Regina Kohlmeyer and Jens Oberst from Stuttgart.
The latter won the 9th architecture prize of the city of Oderzo in 2005. The rectors of the university have been Alfred Steinherr, an economist from Luxembourg (1998–2003), the Swiss linguist Rita Franceschini (2004–2008), and the German sociologist Walter Lorenz (2008–2016); the Italian engineer Paolo Lugli has been rector since 2017. The Faculty of Economics and Management is based at Bruneck-Brunico and offers the following programmes:
- Bachelor in Economics and Management
- Bachelor in Economics and Social Sciences (PPE)
- Bachelor in Tourism and Event Management
- Master in Entrepreneurship and Innovation
- Master in Economics and Management of the Public Sector
- Master in Accounting and Finance
- PhD in Management and Economics
The research clusters of the faculty are: Tourism and Regional Development; Law and Institutions; Financial Markets and Regulations; Entrepreneurship and Innovation; Quantitative Methods and Economic Modelling. The Faculty of Education is in Brixen-Bressanone and the following courses are active:
- Bachelor in Social Work
- Bachelor in Social Education
- Bachelor in Communication Sciences and Culture
- Master in Primary Education
- Master in Innovation and Research for Social Work and Social Education
- Master in Musicology
- PhD in General Pedagogy, Social Pedagogy and General Education
The main research areas at the faculty are: educational and development projects and processes for different age groups and contexts; social dynamics, social cohesion, citizenship and solidarity systems; and languages and communication for a multicultural and multilingual society. The Faculty of Computer Science is based in Bozen-Bolzano and has the following courses:
- Bachelor in Computer Science and Engineering
- Master in Computer Science
- Master in Computational Data Science
- European Master in Computational Logic
- European Master on Software Engineering
- PhD in Computer Science
The faculty's research centres are the Research Centre for Information and Database Systems Engineering, the Research Centre for Knowledge and Data, and the Research Centre for Software and Systems Engineering. The Faculty of Science and Technology has the following active courses:
- Bachelor in Agricultural and Agro-Environmental Sciences
- Bachelor in Industrial and Mechanical Engineering
- Bachelor in Wood Engineering
- Master in Industrial Mechanical Engineering
- Master in Energy Engineering
- Master in Environmental Management of Mountain Areas
- International Master in Horticulture Science
- Master in Viticulture and Wine Marketing
- PhD in Mountain Environment and Agriculture
- PhD in Sustainable Energy and Technologies
- Food Engineering and Biotechnology
The main research areas of the faculty are: Agricultural Sciences; Energy Resources and Energy Efficiency; Food Sciences; Fundamental Sciences for Innovative Applications; Management and Technologies for Mountain Environments; and Industrial Engineering and Automation. The Faculty of Design and Art offers the following courses:
- Bachelor in Design and Art - Major in Design
- Bachelor in Design and Art - Major in Art
- Master in Eco-Social Design
The main research areas of the faculty are: visual culture and its impact on society; phenomena and results of three-dimensional projects; and theories and languages of design and visual culture. The faculty operates a fab lab named Bitz, which is also open to users not affiliated with unibz.
The Faculty of Computer Science at the Free University of Bozen-Bolzano is active in the Erasmus Mundus Programme and is among the research centres recognized by the European Union as a leader in this programme. Two European master's programmes in the area of computer science run at this university under the Erasmus Mundus Programme: the European Master Program in Computational Logic and the European Master on Software Engineering. Since 2011 there has been a Studium Generale course, which offers a wide range of lectures in fields of general interest. 917 projects of basic and applied research have been conducted within the university since 1998. The university has scientific and technological laboratories at each of its sites, at the NOI Techpark, a local technological and innovation hub, and at the Versuchszentrum Laimburg, an agronomy research institute. According to a library ranking issued by German library networks in 2009, the university has the second-best library amongst the German-speaking states (Germany, Switzerland and South Tyrol in Italy).
Asset Description Metadata Schema
The Asset Description Metadata Schema (ADMS) is a common metadata vocabulary to describe standards, so-called interoperability assets, on the Web. Used in concert with web syndication technology, ADMS helps people make sense of the complex multi-publisher environment around standards, in particular those which are semantic assets, such as ontologies, data models, data dictionaries, code lists, and XML and RDF schemas. In spite of their importance, standards are not easily discoverable on the web via search engines because little metadata about them is available. Navigating the websites of the different publishers of standards is not efficient either. A semantic asset is a specific type of standard which involves reusable metadata and/or reference data. Organisations use semantic assets to share information and knowledge. Semantic assets are very valuable and reusable elements for the development of information systems, in particular as part of machine-to-machine interfaces. As enablers of interoperable information exchange, semantic assets are typically created and maintained by standardisation bodies.
Nonetheless, ICT projects and groups of experts also create such assets. There are therefore many publishers of semantic assets, with different degrees of formalism. ADMS is a standardised metadata vocabulary created by the EU's Interoperability Solutions for European Public Administrations (ISA) Programme of the European Commission to help publishers of standards document what their standards are about and where they can be found on the Web. ADMS descriptions can be published on different websites while the standard itself remains on the website of its publisher. ADMS embraces the multi-publisher environment and, at the same time, provides the means for the creation of aggregated catalogues of standards and single points of access to them based on ADMS descriptions. The Commission will offer a single point of access to standards described using ADMS via its collaborative platform, Joinup. The federation service will increase the visibility of standards described with ADMS on the web; this will stimulate their reuse by pan-European initiatives.
More than 43 people from 20 EU Member States, as well as from the US and Australia, have participated in the ADMS Working Group. Most of them were experts from research centres and the EU Commission; the working group used a methodology based on W3C's methods. ADMS version 1 was released in April 2012. Version 1.00 of ADMS is available for download on Joinup: https://joinup.ec.europa.eu/asset/adms/release/100. ADMS is offered under ISA's Open Metadata Licence v1.1. The ADMS specification reuses existing metadata vocabularies and core vocabularies, including the Dublin Core Metadata Element Set, the Data Catalog Vocabulary, the Friend of a Friend Ontology and the vCard Ontology. ADMS v1.00 will be contributed to W3C's Government Linked Data (GLD) Working Group. This means that ADMS will be published by the GLD Working Group as a First Public Working Draft for further consultation within the context of the typical W3C standardization process; the desired outcome of that process is the publication of ADMS as a W3C Recommendation available under W3C's Royalty-Free License.
The ADMS RDFS Vocabulary has a w3.org namespace: http://www.w3.org/ns/adms#
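To make the namespace concrete, the following sketch generates a minimal ADMS description in Turtle. The asset URI, title and publisher URI are invented for illustration; `adms:Asset`, `dcterms:title` and `dcterms:publisher` are real terms from the ADMS and Dublin Core vocabularies that ADMS reuses.

```python
from string import Template

# Illustrative ADMS description as Turtle text; URIs below are hypothetical.
ADMS_TEMPLATE = Template("""\
@prefix adms:    <http://www.w3.org/ns/adms#> .
@prefix dcterms: <http://purl.org/dc/terms/> .

<$uri> a adms:Asset ;
    dcterms:title "$title" ;
    dcterms:publisher <$publisher> .
""")

description = ADMS_TEMPLATE.substitute(
    uri="http://example.org/assets/sample-code-list",
    title="Sample Code List",
    publisher="http://example.org/publisher",
)
print("adms:Asset" in description)  # True
```

A description like this can live on any website while the standard it describes stays with its publisher, which is the multi-publisher arrangement ADMS is designed for.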