Thermodynamics is the branch of physics that deals with heat and temperature and their relation to energy, work, and the properties of bodies of matter. The behavior of these quantities is governed by the four laws of thermodynamics, irrespective of the specific composition of the material or system in question; the laws of thermodynamics are explained in terms of microscopic constituents by statistical mechanics. Thermodynamics applies to a wide variety of topics in science and engineering, including physical chemistry, chemical engineering and mechanical engineering. Thermodynamics developed out of a desire to increase the efficiency of early steam engines through the work of French physicist Nicolas Léonard Sadi Carnot, who believed that engine efficiency was the key that could help France win the Napoleonic Wars. Scots-Irish physicist Lord Kelvin was the first to formulate a concise definition of thermodynamics, in 1854, which stated: "Thermo-dynamics is the subject of the relation of heat to forces acting between contiguous parts of bodies, and the relation of heat to electrical agency."
The initial application of thermodynamics to mechanical heat engines was extended early on to the study of chemical compounds and chemical reactions. Chemical thermodynamics studies the nature of the role of entropy in the process of chemical reactions and has provided the bulk of expansion and knowledge of the field. Other formulations of thermodynamics emerged in the following decades. Statistical thermodynamics, or statistical mechanics, concerned itself with statistical predictions of the collective motion of particles from their microscopic behavior. In 1909, Constantin Carathéodory presented a purely mathematical approach to the field in his axiomatic formulation of thermodynamics, a description often referred to as geometrical thermodynamics. A description of any thermodynamic system employs the four laws of thermodynamics that form an axiomatic basis: the first law specifies that energy can be exchanged between physical systems as heat and work. The second law defines the existence of a quantity called entropy, which describes the direction in which a system can thermodynamically evolve, quantifies the state of order of a system, and can be used to quantify the useful work that can be extracted from the system.
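For a closed system, these two laws are commonly written in the following compact form (sign conventions vary between texts; this is the standard statement rather than anything specific to this passage):

$$\mathrm{d}U = \delta Q - \delta W, \qquad \mathrm{d}S \ge \frac{\delta Q}{T},$$

where U is the internal energy, δQ the heat supplied to the system, δW the work done by the system, S the entropy, and T the absolute temperature; equality in the second relation holds for reversible processes.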
In thermodynamics, interactions between large ensembles of objects are studied and categorized. Central to this are the concepts of the thermodynamic system and its surroundings. A system is composed of particles whose average motions define its properties, and those properties are in turn related to one another through equations of state. Properties can be combined to express internal energy and thermodynamic potentials, which are useful for determining conditions for equilibrium and spontaneous processes. With these tools, thermodynamics can be used to describe how systems respond to changes in their environment; this can be applied to a wide variety of topics in science and engineering, such as engines, phase transitions, chemical reactions, transport phenomena, and even black holes. The results of thermodynamics are essential for other fields of physics and for chemistry, chemical engineering, corrosion engineering, aerospace engineering, mechanical engineering, cell biology, biomedical engineering, materials science, and economics, to name a few.
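As a textbook illustration (not specific to this passage) of an equation of state and a thermodynamic potential: the ideal gas law relates pressure, volume and temperature, and the Helmholtz free energy is the potential whose minimum characterizes equilibrium at fixed temperature and volume:

$$PV = nRT, \qquad F = U - TS, \qquad \mathrm{d}F = -S\,\mathrm{d}T - P\,\mathrm{d}V .$$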
This article is focused on classical thermodynamics, which studies systems in thermodynamic equilibrium. Non-equilibrium thermodynamics is treated as an extension of the classical treatment, but statistical mechanics has brought many advances to that field; the history of thermodynamics as a scientific discipline begins with Otto von Guericke who, in 1650, built and designed the world's first vacuum pump and demonstrated a vacuum using his Magdeburg hemispheres. Guericke was driven to make a vacuum in order to disprove Aristotle's long-held supposition that 'nature abhors a vacuum'. Shortly after Guericke, the English physicist and chemist Robert Boyle had learned of Guericke's designs and, in 1656, in coordination with English scientist Robert Hooke, built an air pump. Using this pump, Boyle and Hooke noticed a correlation between pressure and volume. In time, Boyle's Law was formulated, which states that pressure and volume are inversely proportional. In 1679, based on these concepts, an associate of Boyle's named Denis Papin built a steam digester, a closed vessel with a tightly fitting lid that confined steam until a high pressure was generated.
Later designs implemented a steam release valve that kept the machine from exploding. By watching the valve rhythmically move up and down, Papin conceived of the idea of a piston and cylinder engine; he did not, however, follow through with his design. In 1697, based on Papin's designs, engineer Thomas Savery built the first engine, followed by Thomas Newcomen in 1712. Although these early engines were crude and inefficient, they attracted the attention of the leading scientists of the time; the fundamental concepts of heat capacity and latent heat, which were necessary for the development of thermodynamics, were developed by Professor Joseph Black at the University of Glasgow, where James Watt was employed as an instrument maker. Black and Watt performed experiments together, but it was Watt who conceived the idea of the external condenser, which resulted in a large increase in steam engine efficiency. Drawing on all the previous work led Sadi Carnot, the "father of thermodynamics", to publish Reflections on the Motive Power of Fire, a discourse on heat, power and engine efficiency.
The book outlined the basic energetic relations between the Carnot engine, the Carnot cycle, and motive power. It marked the start of thermodynamics as a modern science.
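For reference, the efficiency bound now associated with the Carnot cycle (a standard result, added here for context rather than quoted from the text) is

$$\eta_{\text{Carnot}} = 1 - \frac{T_C}{T_H},$$

where T_H and T_C are the absolute temperatures of the hot and cold reservoirs; for example, T_H = 500 K and T_C = 300 K give a maximum efficiency of 40%.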
Particle statistics is a particular description of multiple particles in statistical mechanics. Its core concept is a statistical ensemble that emphasizes properties of a large system as a whole at the expense of knowledge about parameters of separate particles; when an ensemble consists of particles with similar properties, their number is called the particle number and denoted by N. In classical mechanics, all particles in the system are considered distinguishable; this means that individual particles in the system can, in principle, be labeled and tracked. As a consequence, exchanging the positions of any two particles in the system leads to a different configuration of the entire system. Furthermore, there is no restriction on placing more than one particle in any given state accessible to the system; these characteristics of classical particles are called Maxwell–Boltzmann statistics. The fundamental feature of quantum mechanics that distinguishes it from classical mechanics is that particles of a particular type are indistinguishable from one another; this means that in an assembly consisting of similar particles, interchanging any two particles does not lead to a new configuration of the system.
In the case of a system consisting of particles of different kinds, the wave function of the system is invariant up to a phase separately for each assembly of similar particles. The applicable definition of a particle does not require it to be elementary or "microscopic", but it requires that all its degrees of freedom that are relevant to the physical problem considered shall be known. All quantum particles in the universe, such as leptons and baryons, have three translational degrees of freedom and one discrete degree of freedom, known as spin. Progressively more "complex" particles obtain progressively more internal freedoms; when the number of internal states that "identical" particles in an ensemble can occupy dwarfs their count, the effects of quantum statistics become negligible. That is why quantum statistics is useful when one considers, say, liquid helium or ammonia gas, but is of little use when applied to systems constructed of macromolecules. While this difference between classical and quantum descriptions of systems is fundamental to all of quantum statistics, quantum particles are divided into two further classes on the basis of the symmetry of the system.
The spin–statistics theorem binds two particular kinds of combinatorial symmetry with two particular kinds of spin symmetry, namely bosons and fermions; the corresponding counting rules are Bose–Einstein statistics (for bosons) and Fermi–Dirac statistics (for fermions).
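To make the difference in counting concrete, here is a small illustrative sketch (the particle and state counts are arbitrary choices, not taken from the text) that enumerates the microstates of two particles over three single-particle states under the three statistics named above:

```python
from itertools import product, combinations_with_replacement, combinations

M, N = 3, 2                 # three single-particle states, two particles
states = range(M)

# Maxwell–Boltzmann: distinguishable particles, unlimited occupancy.
mb = len(list(product(states, repeat=N)))                 # 3**2 = 9

# Bose–Einstein: indistinguishable particles, unlimited occupancy.
be = len(list(combinations_with_replacement(states, N)))  # C(4, 2) = 6

# Fermi–Dirac: indistinguishable particles, at most one per state.
fd = len(list(combinations(states, N)))                   # C(3, 2) = 3

print(mb, be, fd)  # 9 6 3
```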
Quantum mechanics, including quantum field theory, is a fundamental theory in physics which describes nature at the smallest scales of energy levels of atoms and subatomic particles. Classical physics, the physics existing before quantum mechanics, describes nature at ordinary scale. Most theories in classical physics can be derived from quantum mechanics as an approximation valid at large scale. Quantum mechanics differs from classical physics in that energy, angular momentum and other quantities of a bound system are restricted to discrete values. Quantum mechanics arose from theories to explain observations which could not be reconciled with classical physics, such as Max Planck's solution in 1900 to the black-body radiation problem, and from the correspondence between energy and frequency in Albert Einstein's 1905 paper which explained the photoelectric effect. Early quantum theory was profoundly re-conceived in the mid-1920s by Erwin Schrödinger, Werner Heisenberg, Max Born and others; the modern theory is formulated in various specially developed mathematical formalisms.
In one of them, a mathematical function, the wave function, provides information about the probability amplitude of position and other physical properties of a particle. Important applications of quantum theory include quantum chemistry, quantum optics, quantum computing, superconducting magnets, light-emitting diodes, the laser, the transistor and semiconductors such as the microprocessor and research imaging such as magnetic resonance imaging and electron microscopy. Explanations for many biological and physical phenomena are rooted in the nature of the chemical bond, most notably the macro-molecule DNA. Scientific inquiry into the wave nature of light began in the 17th and 18th centuries, when scientists such as Robert Hooke, Christiaan Huygens and Leonhard Euler proposed a wave theory of light based on experimental observations. In 1803, Thomas Young, an English polymath, performed the famous double-slit experiment that he described in a paper titled On the nature of light and colours.
This experiment played a major role in the general acceptance of the wave theory of light. In 1838, Michael Faraday discovered cathode rays; these studies were followed by the 1859 statement of the black-body radiation problem by Gustav Kirchhoff, the 1877 suggestion by Ludwig Boltzmann that the energy states of a physical system can be discrete, and the 1900 quantum hypothesis of Max Planck. Planck's hypothesis that energy is radiated and absorbed in discrete "quanta" matched the observed patterns of black-body radiation. In 1896, Wilhelm Wien empirically determined a distribution law of black-body radiation, known as Wien's law in his honor. Ludwig Boltzmann independently arrived at this result by considerations of Maxwell's equations. However, Wien's law was valid only at high frequencies and underestimated the radiance at low frequencies. Planck corrected this model using Boltzmann's statistical interpretation of thermodynamics and proposed what is now called Planck's law, which led to the development of quantum mechanics. Following Max Planck's solution in 1900 to the black-body radiation problem, Albert Einstein offered a quantum-based theory to explain the photoelectric effect.
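For reference, Planck's law for the spectral radiance of a black body at absolute temperature T (the standard form, added here for context rather than quoted from the text) is

$$B_\nu(\nu, T) = \frac{2 h \nu^3}{c^2}\,\frac{1}{e^{h\nu/(k_B T)} - 1},$$

which reproduces Wien's approximation at high frequencies while avoiding the low-frequency failure noted above.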
Around 1900–1910, the atomic theory and the corpuscular theory of light first came to be accepted as scientific fact. Among the first to study quantum phenomena in nature were Arthur Compton, C. V. Raman, Pieter Zeeman, each of whom has a quantum effect named after him. Robert Andrews Millikan studied the photoelectric effect experimentally, Albert Einstein developed a theory for it. At the same time, Ernest Rutherford experimentally discovered the nuclear model of the atom, for which Niels Bohr developed his theory of the atomic structure, confirmed by the experiments of Henry Moseley. In 1913, Peter Debye extended Niels Bohr's theory of atomic structure, introducing elliptical orbits, a concept introduced by Arnold Sommerfeld; this phase is known as old quantum theory. According to Planck, each energy element is proportional to its frequency: E = h ν, where h is Planck's constant. Planck cautiously insisted that this was an aspect of the processes of absorption and emission of radiation and had nothing to do with the physical reality of the radiation itself.
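As a quick numerical illustration of this relation (the frequency value is an arbitrary example):

$$E = h\nu \approx (6.626\times10^{-34}\,\mathrm{J\,s}) \times (5\times10^{14}\,\mathrm{Hz}) \approx 3.3\times10^{-19}\,\mathrm{J} \approx 2.1\,\mathrm{eV},$$

roughly the energy of a single photon of visible light.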
In fact, he considered his quantum hypothesis a mathematical trick to get the right answer rather than a substantial discovery. However, in 1905 Albert Einstein interpreted Planck's quantum hypothesis realistically and used it to explain the photoelectric effect, in which shining light on certain materials can eject electrons from the material; he won the 1921 Nobel Prize in Physics for this work. Einstein further developed this idea to show that an electromagnetic wave such as light could be described as a particle, with a discrete quantum of energy dependent on its frequency; the foundations of quantum mechanics were established during the first half of the 20th century by Max Planck, Niels Bohr, Werner Heisenberg, Louis de Broglie, Arthur Compton, Albert Einstein, Erwin Schrödinger, Max Born, John von Neumann, Paul Dirac, Enrico Fermi, Wolfgang Pauli, Max von Laue, Freeman Dyson, David Hilbert and others.
Ludwig Eduard Boltzmann was an Austrian physicist and philosopher whose greatest achievement was in the development of statistical mechanics, which explains and predicts how the properties of atoms determine the physical properties of matter. Boltzmann coined the word ergodic. Boltzmann was born in Vienna, the capital of the Austrian Empire; his father, Ludwig Georg Boltzmann, was a revenue official. His grandfather, who had moved to Vienna from Berlin, was a clock manufacturer, and Boltzmann's mother, Katharina Pauernfeind, was from Salzburg. He received his primary education from a private tutor at the home of his parents. Boltzmann attended high school in Upper Austria; when Boltzmann was 15, his father died. Boltzmann studied physics at the University of Vienna, starting in 1863. Among his teachers were Josef Loschmidt, Joseph Stefan, Andreas von Ettingshausen and Jozef Petzval. Boltzmann received his PhD degree in 1866 working under the supervision of Stefan. In 1867 he became a Privatdozent. After obtaining his doctorate degree, Boltzmann worked two more years as Stefan's assistant.
It was Stefan who introduced Boltzmann to Maxwell's work. In 1869, at age 25, thanks to a letter of recommendation written by Stefan, he was appointed full Professor of Mathematical Physics at the University of Graz in the province of Styria. In 1869 he spent several months in Heidelberg working with Robert Bunsen and Leo Königsberger, and in 1871 with Gustav Kirchhoff and Hermann von Helmholtz in Berlin. In 1873 Boltzmann joined the University of Vienna as Professor of Mathematics and there he stayed until 1876. In 1872, long before women were admitted to Austrian universities, he met Henriette von Aigentler, an aspiring teacher of mathematics and physics in Graz; she was refused permission to audit lectures unofficially, and Boltzmann advised her to appeal. On July 17, 1876 Ludwig Boltzmann married Henriette. Boltzmann went back to Graz to take up the chair of Experimental Physics. Among his students in Graz were Svante Arrhenius and Walther Nernst; he spent 14 happy years in Graz and it was there that he developed his statistical concept of nature.
Boltzmann was appointed to the Chair of Theoretical Physics at the University of Munich in Bavaria, Germany in 1890. In 1894, Boltzmann succeeded his teacher Joseph Stefan as Professor of Theoretical Physics at the University of Vienna. Boltzmann spent a great deal of effort in his final years defending his theories; he did not get along with some of his colleagues in Vienna, particularly Ernst Mach, who became a professor of philosophy and history of sciences in 1895. That same year Georg Helm and Wilhelm Ostwald presented their position on energetics at a meeting in Lübeck; they saw energy, not matter, as the chief component of the universe. Boltzmann's position carried the day among other physicists who supported his atomic theories in the debate. In 1900, Boltzmann went to the University of Leipzig on the invitation of Wilhelm Ostwald. Ostwald offered Boltzmann the professorial chair in physics, which became vacant when Gustav Heinrich Wiedemann died. After Mach retired due to bad health, Boltzmann returned to Vienna in 1902. In 1903, together with Gustav von Escherich and Emil Müller, Boltzmann founded the Austrian Mathematical Society.
His students included Paul Ehrenfest and Lise Meitner. In Vienna, Boltzmann taught physics and lectured on philosophy. Boltzmann's lectures on natural philosophy were popular and received considerable attention; his first lecture was an enormous success. Though the largest lecture hall had been chosen for it, people stood all the way down the staircase; because of the great success of Boltzmann's philosophical lectures, the Emperor invited him for a reception at the Palace. In 1906, Boltzmann's deteriorating mental condition forced him to resign his position, and he committed suicide on September 5, 1906, by hanging himself while on vacation with his wife and daughter in Duino, near Trieste. He is buried in the Viennese Zentralfriedhof; his tombstone bears the inscription of Boltzmann's entropy formula: S = k ⋅ log W. Boltzmann's kinetic theory of gases seemed to presuppose the reality of atoms and molecules, but many German philosophers and scientists, such as Ernst Mach and the physical chemist Wilhelm Ostwald, disbelieved their existence.
During the 1890s Boltzmann attempted to formulate a compromise position which would allow both atomists and anti-atomists to do physics without arguing over atoms. His solution was to use Hertz's theory that atoms were Bilder, that is, pictures. Atomists could think the pictures were the real atoms while the anti-atomists could think of the pictures as representing a useful but unreal model, but this did not satisfy either group. Furthermore, many defenders of "pure thermodynamics" were trying hard to refute the kinetic theory of gases and statistical mechanics because of Boltzmann's assumptions about atoms and molecules and his statistical interpretation of the second law of thermodynamics. Around the turn of the century, Boltzmann's science was being threatened by another philosophical objection; some physicists, including Mach's student Gustav Jaumann, interpreted Hertz to mean that all electromagnetic behavior is continuous, as if there were no atoms and molecules, and as if all physical behavior were ultimately continuous.
The Schrödinger equation is a linear partial differential equation that describes the wave function or state function of a quantum-mechanical system. It is a key result in quantum mechanics, its discovery was a significant landmark in the development of the subject; the equation is named after Erwin Schrödinger, who derived the equation in 1925, published it in 1926, forming the basis for the work that resulted in his Nobel Prize in Physics in 1933. In classical mechanics, Newton's second law is used to make a mathematical prediction as to what path a given physical system will take over time following a set of known initial conditions. Solving this equation gives the position and the momentum of the physical system as a function of the external force F on the system; those two parameters are sufficient to describe its state at each time instant. In quantum mechanics, the analogue of Newton's law is Schrödinger's equation; the concept of a wave function is a fundamental postulate of quantum mechanics.
Using these postulates, Schrödinger's equation can be derived from the fact that the time-evolution operator must be unitary and must therefore be generated by the exponential of a self-adjoint operator, the quantum Hamiltonian. This derivation is explained below. In the Copenhagen interpretation of quantum mechanics, the wave function is the most complete description that can be given of a physical system. Solutions to Schrödinger's equation describe not only molecular and subatomic systems, but also macroscopic systems, possibly even the whole universe. Schrödinger's equation is central to all applications of quantum mechanics, including quantum field theory, which combines special relativity with quantum mechanics. Theories of quantum gravity, such as string theory, do not modify Schrödinger's equation; the Schrödinger equation is not the only way to study quantum mechanical systems and make predictions. Other formulations of quantum mechanics include matrix mechanics, introduced by Werner Heisenberg, and the path integral formulation, developed chiefly by Richard Feynman.
Paul Dirac incorporated matrix mechanics and the Schrödinger equation into a single formulation. The form of the Schrödinger equation depends on the physical situation; the most general form is the time-dependent Schrödinger equation, which gives a description of a system evolving with time:

iℏ (d/dt) |Ψ(t)⟩ = Ĥ |Ψ(t)⟩,

where i is the imaginary unit, ℏ = h/(2π) is the reduced Planck constant, |Ψ(t)⟩ is the state vector of the quantum system, t is time, and Ĥ is the Hamiltonian operator. The position-space wave function of the quantum system is nothing but the components in the expansion of the state vector in terms of the position eigenvector |r⟩; it is a scalar function, expressed as Ψ(r, t) = ⟨r | Ψ⟩. The momentum-space wave function can be defined as Ψ̃(p, t) = ⟨p | Ψ⟩, where |p⟩ is the momentum eigenvector. The most famous example is the nonrelativistic Schrödinger equation for the wave function in position space Ψ(r, t) of a single particle subject to a potential V, such as that due to an electric field:

iℏ ∂Ψ/∂t = [ −ℏ²/(2m) ∇² + V(r, t) ] Ψ,

where m is the particle's mass and ∇² is the Laplacian.
This is a diffusion equation, but unlike the heat equation, it is also a wave equation, given the imaginary unit present in the transient term. The term "Schrödinger equation" can refer to both the general equation and the specific nonrelativistic version; the general equation is indeed quite general, used throughout quantum mechanics, for everything from the Dirac equation to quantum field theory, by plugging in diverse expressions for the Hamiltonian. The specific nonrelativistic version is an approximation to reality and yields accurate results in many situations, but only to a certain extent. To apply the Schrödinger equation, write down the Hamiltonian for the system, accounting for the kinetic and potential energies of the particles constituting the system, and insert it into the Schrödinger equation; the resulting partial differential equation is solved for the wave function, which contains information about the system. The time-dependent Schrödinger equation described above predicts that wave functions can form standing waves, called stationary states.
These states are important as their individual study simplifies the task of solving the time-dependent Schrödinger equation for any state. Stationary states can be described by a simpler form of the Schrödinger equation, the time-independent Schrödinger equation.
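As a minimal numerical sketch of the procedure described above (write down a Hamiltonian, then solve for stationary states), the following assumes a one-dimensional harmonic potential and a finite-difference grid; the potential, grid and units (ℏ = m = 1) are arbitrary choices for illustration, not anything prescribed by the text:

```python
import numpy as np

hbar = m = 1.0
x = np.linspace(-10.0, 10.0, 1000)
dx = x[1] - x[0]
V = 0.5 * x**2                              # harmonic oscillator potential

# Kinetic term -hbar^2/(2m) d^2/dx^2 as a tridiagonal finite-difference matrix.
laplacian = (np.diag(np.full(x.size, -2.0))
             + np.diag(np.ones(x.size - 1), 1)
             + np.diag(np.ones(x.size - 1), -1)) / dx**2
H = -(hbar**2 / (2.0 * m)) * laplacian + np.diag(V)

# Diagonalizing H gives the stationary states and their energies E_n.
energies, states = np.linalg.eigh(H)
print(energies[:4])                         # ≈ 0.5, 1.5, 2.5, 3.5 in units of hbar*omega
```

Each column of `states` is a discretized stationary wave function; the printed energies approximate the well-known harmonic-oscillator spectrum.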
Sociology is the scientific study of society, patterns of social relationships, social interaction, culture of everyday life. It is a social science that uses various methods of empirical investigation and critical analysis to develop a body of knowledge about social order and change or social evolution. While some sociologists conduct research that may be applied directly to social policy and welfare, others focus on refining the theoretical understanding of social processes. Subject matter ranges from the micro-sociology level of individual agency and interaction to the macro level of systems and the social structure; the different traditional focuses of sociology include social stratification, social class, social mobility, secularization, sexuality and deviance. As all spheres of human activity are affected by the interplay between social structure and individual agency, sociology has expanded its focus to other subjects, such as health, economy and penal institutions, the Internet, social capital, the role of social activity in the development of scientific knowledge.
The range of social scientific methods has also expanded. Social researchers draw upon a variety of qualitative and quantitative techniques; the linguistic and cultural turns of the mid-20th century led to interpretative and philosophic approaches towards the analysis of society. Conversely, the end of the 1990s and the beginning of the 2000s have seen the rise of new analytically and computationally rigorous techniques, such as agent-based modelling and social network analysis. Social research informs politicians and policy makers, planners, administrators, business magnates, social workers, non-governmental organizations, non-profit organizations, and people interested in resolving social issues in general. There is a great deal of crossover between social research, market research, and other statistical fields.
The origin of the survey, i.e. the collection of information from a sample of individuals, can be traced back to at least the Domesday Book in 1086, while ancient philosophers such as Confucius wrote about the importance of social roles. There is evidence of early sociology in medieval Arab writings; some sources consider Ibn Khaldun, a 14th-century Arab Islamic scholar from North Africa, to have been the first sociologist and father of sociology. The word sociology derives from both Latin and Greek origins: the Latin word socius, "companion", joined with the Greek-derived suffix -logy, "the study of". It was first coined in 1780 by the French essayist Emmanuel-Joseph Sieyès in an unpublished manuscript. Sociology was later defined independently by the French philosopher of science Auguste Comte in 1838 as a new way of looking at society. Comte had earlier used the term social physics, but that had subsequently been appropriated by others, most notably the Belgian statistician Adolphe Quetelet. Comte endeavoured to unify history and economics through the scientific understanding of the social realm.
Writing shortly after the malaise of the French Revolution, he proposed that social ills could be remedied through sociological positivism, an epistemological approach outlined in The Course in Positive Philosophy and A General View of Positivism. Comte believed a positivist stage would mark the final era, after conjectural theological and metaphysical phases, in the progression of human understanding. In observing the circular dependence of theory and observation in science, and having classified the sciences, Comte may be regarded as the first philosopher of science in the modern sense of the term. Comte gave a powerful impetus to the development of sociology, an impetus which bore fruit in the later decades of the nineteenth century. To say this is not to claim that French sociologists such as Durkheim were devoted disciples of the high priest of positivism, but by insisting on the irreducibility of each of his basic sciences to the particular science of sciences which it presupposed in the hierarchy, and by emphasizing the nature of sociology as the scientific study of social phenomena, Comte put sociology on the map.
To be sure, beginnings can be traced back well beyond Montesquieu, for example to Condorcet, not to speak of Saint-Simon, Comte's immediate predecessor. But Comte's clear recognition of sociology as a particular science, with a character of its own, justified Durkheim in regarding him as the father or founder of this science, in spite of the fact that Durkheim did not accept the idea of the three states and criticized Comte's approach to sociology. Both Auguste Comte and Karl Marx set out to develop scientifically justified systems in the wake of European industrialization and secularization, informed by various key movements in the philosophies of history and science. Marx rejected Comtean positivism but, in attempting to develop a science of society, came to be recognized as a founder of sociology as the word gained wider meaning. For Isaiah Berlin, Marx may be regarded as the "true father" of modern sociology, "in so far as anyone can claim the title", for having given clear and unified answers, in familiar empirical terms, to the theoretical questions that most occupied his contemporaries.
Probability theory is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms; these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space. Any specified subset of these outcomes is called an event. Central subjects in probability theory include discrete and continuous random variables, probability distributions, stochastic processes, which provide mathematical abstractions of non-deterministic or uncertain processes or measured quantities that may either be single occurrences or evolve over time in a random fashion. Although it is not possible to predict random events, much can be said about their behavior. Two major results in probability theory describing such behaviour are the law of large numbers and the central limit theorem.
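These two results can be illustrated with a small simulation sketch (the die example and sample sizes here are arbitrary choices, not taken from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# Law of large numbers: the sample mean of fair-die rolls approaches
# the expected value (1 + 2 + ... + 6) / 6 = 3.5 as the number of rolls grows.
rolls = rng.integers(1, 7, size=100_000)
print(rolls.mean())                        # ≈ 3.5

# Central limit theorem: sums of many independent rolls, standardized by the
# exact mean (100 * 3.5) and standard deviation (sqrt(100 * 35/12)),
# are approximately standard normal.
sums = rng.integers(1, 7, size=(10_000, 100)).sum(axis=1)
z = (sums - 100 * 3.5) / np.sqrt(100 * 35 / 12)
print(z.mean(), z.std())                   # ≈ 0 and ≈ 1
```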
As a mathematical foundation for statistics, probability theory is essential to many human activities that involve quantitative analysis of data. Methods of probability theory also apply to descriptions of complex systems given only partial knowledge of their state, as in statistical mechanics. A great discovery of twentieth-century physics was the probabilistic nature of physical phenomena at atomic scales, described in quantum mechanics; the mathematical theory of probability has its roots in attempts to analyze games of chance by Gerolamo Cardano in the sixteenth century, and by Pierre de Fermat and Blaise Pascal in the seventeenth century. Christiaan Huygens published a book on the subject in 1657, and in the 19th century Pierre Laplace completed what is today considered the classic interpretation. Initially, probability theory mainly considered discrete events, and its methods were mainly combinatorial. Eventually, analytical considerations compelled the incorporation of continuous variables into the theory; this culminated in modern probability theory, on foundations laid by Andrey Nikolaevich Kolmogorov.
Kolmogorov combined the notion of sample space, introduced by Richard von Mises, with measure theory, and presented his axiom system for probability theory in 1933. This became the undisputed axiomatic basis for modern probability theory. Most introductions to probability theory treat discrete probability distributions and continuous probability distributions separately; the measure theory-based treatment of probability covers the discrete case, the continuous case, a mix of the two, and more. Consider an experiment that can produce a number of outcomes; the set of all outcomes is called the sample space of the experiment. The power set of the sample space is formed by considering all different collections of possible results. For example, rolling an honest die produces one of six possible results. One collection of possible results corresponds to getting an odd number. Thus, the subset {1, 3, 5} is an element of the power set of the sample space of die rolls; these collections are called events. In this case, {1, 3, 5} is the event that the die falls on some odd number.
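For reference, Kolmogorov's axioms in the measure-theoretic form referred to above (a standard statement, not quoted from the text): a probability measure P on a sample space Ω with event σ-algebra 𝓕 satisfies

$$P(E) \ge 0 \ \text{for all } E \in \mathcal{F}, \qquad P(\Omega) = 1, \qquad P\Big(\bigcup_{i=1}^{\infty} E_i\Big) = \sum_{i=1}^{\infty} P(E_i)$$

for any countable collection of pairwise disjoint events E₁, E₂, …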
If the results that occur fall in a given event, that event is said to have occurred. Probability is a way of assigning every "event" a value between zero and one, with the requirement that the event made up of all possible results be assigned a value of one. To qualify as a probability distribution, the assignment of values must satisfy the requirement that, for any collection of mutually exclusive events, the probability that any of these events occurs is given by the sum of the probabilities of the events. For the die example, the probability that any one of the events {1, 6}, {3}, or {2, 4} will occur is 5/6. This is the same as saying that the probability of the event {1, 2, 3, 4, 6} is 5/6; this event encompasses the possibility of any number except five being rolled. The mutually exclusive event {5} has a probability of 1/6, and the event {1, 2, 3, 4, 5, 6} has a probability of 1, that is, absolute certainty. When doing calculations using the outcomes of an experiment, it is necessary that all those elementary events have a number assigned to them. This is done using a random variable.
A random variable is a function that assigns to each elementary event in the sample space a real number. This function is usually denoted by a capital letter. In the case of a die, the assignment of a number to certain elementary events can be done using the identity function; this does not always work. For example, when flipping a coin the two possible outcomes are "heads" and "tails". In this example, the random variable X could assign to the outcome "heads" the number "0" and to the outcome "tails" the number "1". Discrete probability theory deals with events that occur in countable sample spaces. Examples: throwing dice, experiments with decks of cards, random walk, and tossing coins. Classical definition: initially the probability of an event to occur was defined as the number of cases favorable for the event over the number of total outcomes possible in an equiprobable sample space: see Classical definition of probability. For example, if the event is "occurrence of an even number when a die is rolled", the probability is given by 3/6 = 1/2, since three of the six equally likely faces show an even number.
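A small sketch of these two ideas in code (the die and coin examples simply mirror the text; the variable names are illustrative choices):

```python
from fractions import Fraction

# Classical definition: favorable outcomes over total outcomes
# in an equiprobable sample space, for the die example above.
sample_space = {1, 2, 3, 4, 5, 6}
event_even = {outcome for outcome in sample_space if outcome % 2 == 0}
p_even = Fraction(len(event_even), len(sample_space))
print(p_even)                  # 1/2

# A random variable as a function on the sample space: the coin example,
# mapping "heads" to 0 and "tails" to 1.
X = {"heads": 0, "tails": 1}
print(X["tails"])              # 1
```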