The scientific method is an empirical method of acquiring knowledge that has characterized the development of science since at least the 17th century. It involves careful observation and rigorous skepticism about what is observed, given that cognitive assumptions can distort how one interprets the observation. It also involves formulating hypotheses, via induction, based on such observations. These are principles of the scientific method, as distinguished from a definitive series of steps applicable to all scientific enterprises. Though diverse models for the scientific method are available, there is in general a continuous process that includes observations about the natural world. People are inquisitive, so they come up with questions about things they see or hear and develop ideas or hypotheses about why things are the way they are; the best hypotheses lead to predictions. The most conclusive testing of hypotheses comes from reasoning based on controlled experimental data. Depending on how well additional tests match the predictions, the original hypothesis may require refinement, expansion or rejection.
If a particular hypothesis becomes well supported, a general theory may be developed. Although procedures vary from one field of inquiry to another, the underlying process is frequently the same from one field to another: the scientific method involves making conjectures, deriving predictions from them as logical consequences, and carrying out experiments or empirical observations based on those predictions. A hypothesis is a conjecture, based on knowledge obtained while seeking answers to the question; the hypothesis might be specific, or it might be broad. Scientists test hypotheses by conducting experiments or studies. A scientific hypothesis must be falsifiable, meaning that it is possible to identify a possible outcome of an experiment or observation that conflicts with predictions deduced from the hypothesis; the purpose of an experiment is to determine whether observations agree with or conflict with the predictions derived from a hypothesis. Experiments can take place anywhere from a garage to CERN's Large Hadron Collider.
There are difficulties in a formulaic statement of method, however. Though the scientific method is often presented as a fixed sequence of steps, it represents rather a set of general principles. Not all steps take place in every scientific inquiry, nor are they always in the same order; indeed, some philosophers and scientists have argued that there is no such thing as a single scientific method. Robert Nola and Howard Sankey remark that "For some, the whole idea of a theory of scientific method is yester-year's debate, the continuation of which can be summed up as yet more of the proverbial deceased equine castigation. We beg to differ." Important debates in the history of science concern rationalism, as advocated by René Descartes. The term "scientific method" emerged in the 19th century, when a significant institutional development of science was taking place and terminologies establishing clear boundaries between science and non-science, such as "scientist" and "pseudoscience", appeared. Throughout the 1830s and 1850s, when Baconianism was popular, naturalists like William Whewell, John Herschel and John Stuart Mill engaged in debates over "induction" and "facts" and were focused on how to generate knowledge.
In the late 19th and early 20th centuries, a debate over realism versus antirealism was conducted as powerful scientific theories extended beyond the realm of the observable. The term "scientific method" came into popular use in the twentieth century, appearing in dictionaries and science textbooks, although there was little scientific consensus over its meaning. Although its use grew through the middle of the twentieth century, by the end of that century numerous influential philosophers of science, such as Thomas Kuhn and Paul Feyerabend, had questioned the universality of the "scientific method" and in doing so replaced the notion of science as a homogeneous and universal method with that of a heterogeneous and local practice. In particular, Paul Feyerabend argued against there being any universal rules of science, and historian of science Daniel Thurs maintains that the scientific method is a myth or, at best, an idealization. As in other areas of inquiry, science can build on previous knowledge and develop a more sophisticated understanding of its topics of study over time.
This model can be seen to underlie the scientific revolution. The ubiquitous element in the model of the scientific method is empiricism, or more precisely, epistemological sensualism; this stands in opposition to stringent forms of rationalism: the scientific method embodies the position that reason alone cannot solve a particular scientific problem. A strong formulation of the scientific method is not always aligned with a form of empiricism in which the empirical data is put forward in the form of experience or other abstracted forms of knowledge.
Social science is a category of academic disciplines concerned with society and the relationships among individuals within a society. Social science as a whole has many branches; these include, but are not limited to: anthropology, communication studies, history, human geography, linguistics, political science, public health and sociology. The term is sometimes used to refer specifically to the field of sociology, the original "science of society", established in the 19th century. For a more detailed list of sub-disciplines within the social sciences, see: Outline of social science. Positivist social scientists use methods resembling those of the natural sciences as tools for understanding society, and so define science in its stricter modern sense. Interpretivist social scientists, by contrast, may use social critique or symbolic interpretation rather than constructing empirically falsifiable theories, and thus treat science in its broader sense. In modern academic practice, researchers are often eclectic, using multiple methodologies.
The term "social research" has acquired a degree of autonomy as practitioners from various disciplines share in its aims and methods. The history of the social sciences begins in the Age of Enlightenment after 1650, which saw a revolution within natural philosophy, changing the basic framework by which individuals understood what was "scientific". The social sciences came forth from the moral philosophy of the time and were influenced by the Age of Revolutions, including the Industrial Revolution and the French Revolution. They developed from the sciences, or the systematic knowledge-bases and prescriptive practices, relating to the social improvement of a group of interacting entities. The beginnings of the social sciences in the 18th century are reflected in the grand encyclopedia of Diderot, with articles from Jean-Jacques Rousseau and other pioneers; the growth of the social sciences is also reflected in other specialized encyclopedias. The modern period saw "social science" first used as a distinct conceptual field.
Social science was influenced by positivism, focusing on knowledge based on actual positive sense experience and avoiding metaphysical speculation. Auguste Comte used the term "science sociale" to describe the field, taking the idea from Charles Fourier. Following this period, five paths of development sprang forth in the social sciences, influenced by Comte's work in other fields. One route taken was the rise of social research: large statistical surveys were undertaken in various parts of the United States and Europe. Another route was initiated by Émile Durkheim, who studied "social facts", and Vilfredo Pareto, who opened metatheoretical ideas and individual theories. A third means developed, arising from the methodological dichotomy present, in which social phenomena were identified with and understood. The fourth route, based in economics, developed and furthered economic knowledge as a hard science. The last path was the correlation of knowledge and social values; in this route, theory and prescription were non-overlapping formal discussions of a subject.
Around the start of the 20th century, Enlightenment philosophy was challenged in various quarters. After the use of classical theories since the end of the scientific revolution, various fields substituted mathematical studies for experimental studies, examining equations to build a theoretical structure, and the development of social science subfields became quantitative in methodology. The interdisciplinary and cross-disciplinary nature of scientific inquiry into human behaviour, and the environmental factors affecting it, made many of the natural sciences interested in some aspects of social science methodology. Examples of boundary blurring include emerging disciplines like the social research of medicine, neuropsychology and the history and sociology of science. Quantitative and qualitative methods are being integrated in the study of human action and its implications and consequences. In the first half of the 20th century, statistics became a free-standing discipline of applied mathematics.
Statistical methods were used confidently. In the contemporary period, Karl Popper and Talcott Parsons influenced the furtherance of the social sciences. Researchers continue to search for a unified consensus on what methodology might have the power and refinement to connect a proposed "grand theory" with the various midrange theories that, with considerable success, continue to provide usable frameworks for massive, growing data banks. The social sciences will, for the foreseeable future, be composed of different zones in the research of, and sometimes distinct in approach toward, the field. The term "social science" may refer either to the specific sciences of society established by thinkers such as Comte, Durkheim and Weber, or more generally to all disciplines outside of "noble science" and the arts. By the late 19th century, the academic social sciences were constituted of five fields: jurisprudence and amendment of the law, education, health, economy and trade, and art. Around the start of the 21st century, the expanding domain of economics in the social sciences has been described as economic imperialism.
The social science disciplines are branches of knowledge taught and researched at the college or university level. Social science disciplines are defined and recognized by the academic journals in which research is published and the learned societies to which their practitioners belong.
Peter N. Peregrine
Peter N. Peregrine is an American anthropologist, registered professional archaeologist and academic, well known for his staunch defense of science in anthropology and for his popular textbook Anthropology. Peregrine did dissertation research on the evolution of the Mississippian culture of North America and fieldwork on Bronze Age cities in Syria. He is Professor of Anthropology and Museum Studies at Lawrence University and a Research Associate of the Human Relations Area Files at Yale University. From 2012 to 2018 he was an External Professor at the Santa Fe Institute. Peregrine developed a comprehensive data set and methodology for conducting diachronic cross-cultural research; this work produced the Atlas of Cultural Evolution and the Encyclopedia of Prehistory, and formed the organizational structure for the Human Relations Area Files eHRAF Archaeology. Peregrine has conducted archaeological fieldwork in North America and South America. Much of his fieldwork has involved the use of geophysical techniques to identify buried archaeological deposits.
In 2009 Peregrine started the Lawrence University Archaeological Survey, which focuses on using geophysical techniques to locate unmarked graves in early Wisconsin cemeteries. In 2011 Peregrine was elected a Fellow of the American Association for the Advancement of Science. Peregrine has published extensively on the Mississippian culture and on archaeological method and theory. He argued that Mississippian cultures should be seen as participants in a large system that integrated much of eastern North America in a single political economy. He employed world-systems theory to do this, arguing that large centers were cores of political and economic authority supported by peripheral regions through the exchange of objects used in rituals of social reproduction such as initiation and marriage. The Mississippian cores themselves competitively manufactured and traded these objects, linking them into what Peregrine called a prestige-goods system. Polities vied for power over exchange, and rose and fell as their ability to control prestige goods strengthened or waned.
The response to Peregrine’s view was mixed, with some calling it “exaggerationalist” and others adopting it into their own work. In the mid-1990s Peregrine and colleagues Richard Blanton, Gary M. Feinman and Steven Kowalewski developed “dual-processual” theory, which Peregrine applied to Mississippian polities. Dual-processual theory posits that political leaders adopt strategies for implementing power ranging along a continuum from exclusionary to inclusive, with exclusionary strategies concentrating power in individual leaders and inclusive strategies distributing it across corporate groups. While not without controversy, dual-processual theory has come to be seen as a valuable tool for understanding both Mississippian and Ancestral Puebloan polities. More recently, Peregrine and colleague Steven Lekson have argued that the Mississippian and Ancestral Puebloan worlds should be viewed as linked together, along with Early Postclassic Mesoamerica, in a continent-wide “oikoumene”; they argue that only such a continental perspective can allow archaeologists to understand broad processes of coordinated change such as the emergence of urban-like communities in many parts of North America around 900 CE.
Again, though not without controversy, Peregrine’s drive to promote a multi-regional perspective has been seen as useful for addressing some questions in North American archaeology. In addition to archaeology, Peregrine has made a number of contributions to cross-cultural studies; the focus of his work has been on developing archaeological correlates for various types of behavior, including warfare, postmarital residence and social stratification. Peregrine developed new methodologies for conducting diachronic cross-cultural research using archaeological cases. He is now using diachronic cross-cultural research to explore how ancient societies were able to build resilience to climate-related disasters, arguing that this work may help modern societies create policies to enhance resilience to the increasing frequency of such disasters caused by climate change. Peregrine lives in Appleton, Wisconsin, and is married with two daughters.
Digital humanities is an area of scholarly activity at the intersection of computing or digital technologies and the disciplines of the humanities. It includes the systematic use of digital resources in the humanities, as well as reflection on their application. DH can be defined as new ways of doing scholarship that involve collaborative, transdisciplinary, computationally engaged research and publishing. It brings digital tools and methods to the study of the humanities with the recognition that the printed word is no longer the main medium for knowledge production and distribution. By producing and using new applications and techniques, DH makes new kinds of teaching and research possible, while at the same time studying and critiquing how these impact cultural heritage and digital culture. Thus, a distinctive feature of DH is its cultivation of a two-way relationship between the humanities and the digital: the field both employs technology in the pursuit of humanities research and subjects technology to humanistic questioning and interrogation.
The definition of the digital humanities is continually being formulated by scholars and practitioners. Since the field is growing and changing, specific definitions can become outdated or unnecessarily limit future potential. The second volume of Debates in the Digital Humanities acknowledges the difficulty in defining the field: "Along with the digital archives, quantitative analyses, and tool-building projects that once characterized the field, DH now encompasses a wide range of methods and practices: visualizations of large image sets, 3D modeling of historical artifacts, 'born digital' dissertations, hashtag activism and the analysis thereof, alternate reality games, mobile makerspaces, and more. In what has been called 'big tent' DH, it can at times be difficult to determine with any specificity what digital humanities work entails." Historically, the digital humanities developed out of humanities computing and have become associated with other fields, such as humanistic computing, social computing and media studies.
In concrete terms, the digital humanities embrace a variety of topics, from curating online collections of primary sources to the data mining of large cultural data sets to topic modeling. Digital humanities incorporates both digitized and born-digital materials and combines the methodologies of traditional humanities disciplines and the social sciences with tools provided by computing and digital publishing. Related subfields of digital humanities have emerged, like software studies, platform studies and critical code studies. Fields that parallel the digital humanities include new media studies and information science, as well as media theory of composition, game studies (in areas related to digital humanities project design and production) and cultural analytics. Berry and Fagerjord have suggested that a way to reconceptualise digital humanities could be through a "digital humanities stack". They argue that "this type of diagram is common in computation and computer science to show how technologies are 'stacked' on top of each other in increasing levels of abstraction.
Here, we use the method in a more illustrative and creative sense of showing the range of activities, skills and structures that could be said to make up the digital humanities, with the aim of providing a high-level map." Indeed, the "diagram can be read as the bottom levels indicating some of the fundamental elements of the digital humanities stack, such as computational thinking and knowledge representation, with other elements that build on these." Digital humanities descends from the field of humanities computing, whose origins reach back to the 1930s and 1940s in the pioneering work of English professor Josephine Miles and Jesuit scholar Roberto Busa and the women they employed. In collaboration with IBM, they created a computer-generated concordance to Thomas Aquinas' writings known as the Index Thomisticus. Other scholars began using mainframe computers to automate tasks like word-searching and counting, much faster than processing information from texts with handwritten or typed index cards.
In the decades which followed, archaeologists, historians, literary scholars and a broad array of humanities researchers in other disciplines applied emerging computational methods to transform humanities scholarship. As Tara McPherson has pointed out, the digital humanities inherit practices and perspectives developed through many artistic and theoretical engagements with electronic screen culture beginning in the late 1960s and 1970s; these range from research developed by organizations such as SIGGRAPH to creations by artists such as Charles and Ray Eames and the members of E.A.T. (Experiments in Art and Technology). The Eameses and E.A.T. explored nascent computer culture and intermediality in creative works that dovetailed technological innovation with art. The first specialized journal in the digital humanities was Computers and the Humanities, which debuted in 1966. The Association for Literary and Linguistic Computing and the Association for Computers and the Humanities were founded in 1977 and 1978, respectively. Soon there was a need for a standardized protocol for tagging digital texts, and the Text Encoding Initiative (TEI) was developed.
The TEI project was launched in 1987 and published the first full version of the TEI Guidelines in May 1994. TEI helped shape the field of electronic textual scholarship.
"Big data" is a field that treats ways to analyze, systematically extract information from, or otherwise deal with data sets that are too large or complex to be dealt with by traditional data-processing application software. Data with many cases offer greater statistical power, while data with higher complexity may lead to a higher false discovery rate. Big data challenges include capturing data, data storage, data analysis, sharing, visualization, updating, information privacy and data source. Big data was originally associated with three key concepts: volume, variety and velocity. Other concepts later attributed to big data are veracity and value. Current usage of the term big data tends to refer to the use of predictive analytics, user behavior analytics, or certain other advanced data analytics methods that extract value from data, and seldom to a particular size of data set. "There is little doubt that the quantities of data now available are indeed large, but that's not the most relevant characteristic of this new data ecosystem."
Analysis of data sets can find new correlations to "spot business trends, prevent diseases, combat crime and so on." Scientists, business executives, practitioners of medicine and governments alike meet difficulties with large data sets in areas including Internet search, urban informatics and business informatics. Scientists encounter limitations in e-Science work, including meteorology, connectomics, complex physics simulations and environmental research. Data sets grow rapidly, in part because they are gathered by cheap and numerous information-sensing Internet of things devices such as mobile devices, software logs, microphones, radio-frequency identification (RFID) readers and wireless sensor networks; the world's technological per-capita capacity to store information has doubled every 40 months since the 1980s. An IDC report predicted that the global data volume would grow exponentially from 4.4 zettabytes to 44 zettabytes between 2013 and 2020. One question for large enterprises is determining who should own big-data initiatives that affect the entire organization.
Relational database management systems, desktop statistics and software packages used to visualize data have difficulty handling big data. The work may require "massively parallel software running on tens, hundreds, or thousands of servers". What qualifies as "big data" varies depending on the capabilities of the users and their tools, and expanding capabilities make big data a moving target. "For some organizations, facing hundreds of gigabytes of data for the first time may trigger a need to reconsider data management options. For others, it may take tens or hundreds of terabytes before data size becomes a significant consideration." The term has been in use since the 1990s, with some giving credit to John Mashey for popularizing it. Big data includes data sets with sizes beyond the ability of commonly used software tools to capture, curate and process data within a tolerable elapsed time. Big data philosophy encompasses unstructured, semi-structured and structured data; however, the main focus is on unstructured data.
Big data "size" is a moving target; as of 2012 it ranged from a few dozen terabytes to many exabytes of data. Big data requires a set of techniques and technologies with new forms of integration to reveal insights from datasets that are diverse and of a massive scale. A 2016 definition states that "Big data represents the information assets characterized by such a high volume and variety to require specific technology and analytical methods for its transformation into value". Kaplan and Haenlein define big data as "data sets characterized by huge amounts of updated data in various formats, such as numeric, textual, or images/videos." Additionally, a new V, veracity, is added by some organizations to describe it, a revisionism challenged by some industry authorities. The three Vs have been further expanded to other complementary characteristics of big data: machine learning (big data often doesn't ask why and simply detects patterns) and digital footprint (big data is often a cost-free byproduct of digital interaction). A 2018 definition states "Big data is where parallel computing tools are needed to handle data", and notes, "This represents a distinct and defined change in the computer science used, via parallel programming theories, and losses of some of the guarantees and capabilities made by Codd's relational model."
The growing maturity of the concept more starkly delineates the difference between "big data" and "Business Intelligence": Business Intelligence uses descriptive statistics with data of high information density to measure things, detect trends, and so on, while big data uses inductive statistics and concepts from nonlinear system identification to infer laws from large sets of data with low information density, to reveal relationships and dependencies, or to perform predictions of outcomes and behaviors. Big data can be described by the following characteristics. Volume: the quantity of generated and stored data; the size of the data determines the value and potential insight, and whether it can be considered big data or not. Variety: the type and nature of the data; this helps people who analyze it to use the resulting insight. Big data draws from text, audio and video.
The Seshat: Global History Databank is an international scientific research project of the nonprofit Evolution Institute. Founded in 2011, the Seshat: Global History Databank gathers data into a single, large database that can be used to test scientific hypotheses; the Databank consults directly with expert scholars to code what historical societies and their environments were like in the form of accessible datapoints and thus forms a digital storehouse for data on the political and social organization of all human groups from the early modern back to the ancient and neolithic periods. The organizers of this research project contend that the mass of data can be used to test a variety of competing hypotheses about the rise and fall of large-scale societies around the globe which may help science provide answers to global problems; the Seshat: Global History Databank claims to be a scientific approach to historical research and its large dataset, though compiled with the intention of being theory-neutral, is of interest to researchers of Cliodynamics.
The main goal of Cliodynamics researchers is to use the scientific method to produce the data necessary to empirically test competing theories. A large interdisciplinary and international team of experts helps the Seshat project to produce a database rigorous enough to study the past using well-established scientific techniques. Seshat data may be used with sociocultural or cultural evolutionary theory to identify long-term dynamics that may have had significant effects on the course of human history. The Seshat: Global History Databank is an umbrella organization for several research projects that examine different themes or facets of human life. Each project is led by members of the Seshat team in collaboration with a group of consultants and contributing experts. Themes include: the evolution of social complexity in early civilizations, the creation of prosociality, the role of ritual and religion in social cohesion, the causes of economic growth and its consequences for individuals' well-being, and many others.
The Seshat team is heavily engaged in improving the way that cutting-edge digital technologies can aid in research, with projects devoted to developing systems for collecting and distributing information with computer assistance. Several key research questions drive these projects, including the following: What mechanisms transform economic growth into improvements in quality of life for regular people? What roles do ritual activities and religion play in cultural development and group cohesion? How and under what conditions does prosocial behavior evolve in large societies? What is the impact of environmental and climatic factors on societal advance? To maximise their time and resources, the Seshat project has begun data collection with a representative sample of polities from around the globe and throughout human history, ranging from the late Neolithic to the early modern period; this is the World Sample-30, which provides the Seshat project with an initial sample of societies that vary along the dimension of social complexity from ten major regions around the globe.
Three natural geographic areas (NGAs) were selected within each region; one NGA in each world region was one that developed complex state-level societies comparatively early. Ian Morris praised two key aspects of the Seshat project: it emphasizes the collection of data related to shifts in cultural systems in addition to material elements, and it better situates extraordinary individuals in their geographic and historical context. Gary Feinman praised the Seshat project for helping to demolish the academic knowledge silos that have emerged with increases in specialisation over the last several decades. Critics of the Seshat project have noted that the coding of historical data is not a wholly objective enterprise and that concrete and transparent steps should be taken to minimize subjectivity in the coding process. The Seshat project uses multiple coders and experts and other techniques for ensuring data quality, but some have suggested that machine coding techniques hold great promise for further reducing biases and increasing the reliability of the data produced.
Funding for the Seshat: Global History Databank comes from the John Templeton Foundation, the Economic and Social Research Council, Horizon 2020, the Tricoastal Foundation and the Evolution Institute. The Seshat: Global History Databank is governed by an Editorial Board, which includes Prof. Peter Turchin, Prof. Harvey Whitehouse, Dr. Pieter François, Dr. Thomas E. Currie and Dr. Kevin C. Feeney.
International Standard Serial Number
An International Standard Serial Number (ISSN) is an eight-digit serial number used to uniquely identify a serial publication, such as a magazine. The ISSN is helpful in distinguishing between serials with the same title. ISSNs are used in ordering, interlibrary loans and other practices in connection with serial literature. The ISSN system was first drafted as an International Organization for Standardization (ISO) international standard in 1971 and published as ISO 3297 in 1975. ISO subcommittee TC 46/SC 9 is responsible for maintaining the standard. When a serial with the same content is published in more than one media type, a different ISSN is assigned to each media type. For example, many serials are published both in print and in electronic media; the ISSN system refers to these types as print ISSN and electronic ISSN, respectively. Additionally, as defined in ISO 3297:2007, every serial in the ISSN system is assigned a linking ISSN (ISSN-L), typically the same as the ISSN assigned to the serial in its first published medium, which links together all ISSNs assigned to the serial in every medium.
The format of the ISSN is an eight-digit code, divided by a hyphen into two four-digit numbers. As an integer number, it can be represented by the first seven digits; the last code digit, which may be 0-9 or an X, is a check digit. Formally, the general form of the ISSN code can be expressed as NNNN-NNNC, where N is a digit character (0-9) and C is a digit or the character X. The ISSN of the journal Hearing Research, for example, is 0378-5955, where the final 5 is the check digit, C=5. To calculate the check digit, the following algorithm may be used: calculate the sum of the first seven digits of the ISSN, each multiplied by its position in the number, counting from the right, that is, by 8, 7, 6, 5, 4, 3, 2, respectively: 0⋅8 + 3⋅7 + 7⋅6 + 8⋅5 + 5⋅4 + 9⋅3 + 5⋅2 = 0 + 21 + 42 + 40 + 20 + 27 + 10 = 160. The modulus 11 of this sum is then calculated; if the remainder is 0, the check digit is 0, otherwise the remainder is subtracted from 11 to give the check digit. Here, 160 mod 11 = 6, and 11 − 6 = 5, the check digit. An upper case X in the check digit position indicates a check digit of 10. To confirm the check digit, calculate the sum of all eight digits of the ISSN multiplied by its position in the number, counting from the right.
The modulus 11 of the sum must be 0. There is an online ISSN checker. ISSN codes are assigned by a network of ISSN National Centres located at national libraries and coordinated by the ISSN International Centre based in Paris; the International Centre is an intergovernmental organization created in 1974 through an agreement between UNESCO and the French government. The International Centre maintains a database of all ISSNs assigned worldwide, the ISDS Register, otherwise known as the ISSN Register. At the end of 2016, the ISSN Register contained records for 1,943,572 items. ISSN and ISBN codes are similar in concept. An ISBN might be assigned for particular issues of a serial, in addition to the ISSN code for the serial as a whole. An ISSN, unlike the ISBN code, is an anonymous identifier associated with a serial title, containing no information as to the publisher or its location. For this reason a new ISSN is assigned to a serial each time it undergoes a major title change. Since the ISSN applies to an entire serial, a new identifier, the Serial Item and Contribution Identifier, was built on top of it to allow references to specific volumes, articles, or other identifiable components.
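The check-digit arithmetic described above can be sketched in Python. This is a minimal illustration, not part of any ISSN tooling, and the function names are invented for the example:

```python
def issn_check_digit(first7: str) -> str:
    """Compute the ISSN check digit from the first seven digits.

    Each digit is weighted by its position counting from the right
    of the full eight-digit code (8 down to 2), summed, and reduced
    modulo 11; a result of 10 is written as 'X'.
    """
    total = sum(int(d) * w for d, w in zip(first7, range(8, 1, -1)))
    remainder = total % 11
    if remainder == 0:
        return "0"
    check = 11 - remainder
    return "X" if check == 10 else str(check)


def validate_issn(issn: str) -> bool:
    """Confirm an ISSN: the weighted sum over all eight digits
    (weights 8 down to 1, with 'X' counting as 10) must be 0 mod 11."""
    digits = issn.replace("-", "")
    if len(digits) != 8:
        return False
    total = sum((10 if c == "X" else int(c)) * w
                for c, w in zip(digits, range(8, 0, -1)))
    return total % 11 == 0


print(issn_check_digit("0378595"))  # -> 5 (Hearing Research example)
print(validate_issn("0378-5955"))   # -> True
```

For the Hearing Research example, the weighted sum of the first seven digits is 160, 160 mod 11 is 6, and 11 − 6 gives the check digit 5; including the check digit with weight 1 brings the total to 165, which is divisible by 11, so validation succeeds. A production validator would additionally reject an X anywhere other than the final position.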
Separate ISSNs are needed for serials in different media. Thus, the print and electronic media versions of a serial need separate ISSNs, and a CD-ROM version and a web version of a serial require different ISSNs since two different media are involved. However, the same ISSN can be used for different file formats of the same online serial. This "media-oriented identification" of serials made sense in the 1970s. In the 1990s and onward, with personal computers, better screens and the Web, it makes sense to consider only content, independent of media. This "content-oriented identification" of serials remained an unmet demand for a decade, but no ISSN update or initiative occurred. A natural extension of the ISSN, the unique identification of the articles in serials, was the main demanded application. An alternative model for serials' contents arrived with the indecs Content Model and its application, the digital object identifier (DOI), an ISSN-independent initiative consolidated in the 2000s. Only in 2007 was the ISSN-L defined, in the revised standard ISO 3297:2007, to link together the ISSNs assigned to a serial in its different media.