Atomism is a natural philosophy that developed in several ancient traditions. References to the concept of atomism and its atoms appeared in both ancient Greek and ancient Indian philosophical traditions. The ancient Greek atomists theorized that nature consists of two fundamental principles: atom and void. Unlike their modern scientific namesake in atomic theory, philosophical atoms come in an infinite variety of shapes and sizes, each indestructible and surrounded by a void in which they collide with one another or hook together to form clusters. Clusters of different shapes and positions give rise to the various macroscopic substances in the world. The particles of chemical matter for which chemists and other natural philosophers of the early 19th century found experimental evidence were thought to be indivisible, and were therefore given the name "atom", long used by the atomist philosophy. Although the connection to historical atomism is at best tenuous, elementary particles have become a modern analog of philosophical atoms.
Philosophical atomism is a reductive argument. Atomism stands in contrast to a substance theory, wherein a prime material continuum remains qualitatively invariant under division. Indian Buddhists, such as Dharmakirti and others, developed distinctive theories of atomism, for example involving momentary atoms that flash in and out of existence. In the 5th century BCE, Leucippus and his pupil Democritus proposed that all matter was composed of small indivisible particles called atoms. Nothing whatsoever is known about Leucippus except that he was the teacher of Democritus. Democritus, by contrast, was a prolific writer, who wrote over eighty known treatises, none of which have survived to the present day complete. However, a massive number of fragments and quotations of his writings have survived; these are the main source of information on his teachings about atoms. Democritus's argument for the existence of atoms hinged on the idea that it is impossible to keep dividing matter forever, and that matter must therefore be made up of tiny indivisible particles.
Democritus believed that atoms are too small for human senses to detect, that they are infinitely many, that they come in infinitely many varieties, and that they have always existed. They float in a vacuum, which Democritus called the "void", and they vary in form and posture; some atoms, he maintained, are convex, others concave, some shaped like hooks, others like eyes. They are constantly moving and colliding with each other. Democritus wrote that atoms and void are the only things that exist and that all other things are merely said to exist by social convention. The objects humans see in everyday life are composed of many atoms united by random collisions, and their forms and materials are determined by what kinds of atoms make them up. Human perceptions are caused by atoms as well: bitterness, for example, is caused by small, jagged atoms passing across the tongue. Parmenides denied the existence of motion and void; he believed all existence to be a single, all-encompassing and unchanging mass, and that change and motion were mere illusions. This conclusion, as well as the reasoning that led to it, may indeed seem baffling to the modern empirical mind, but Parmenides explicitly rejected sensory experience as the path to an understanding of the universe and instead used purely abstract reasoning.
Firstly, he believed that there is no such thing as void, equating the void with non-being. This in turn meant that motion is impossible, because there is no void to move into. He also wrote that all that is must be an indivisible unity, for if it were manifold there would have to be a void that could divide it. And he stated that the all-encompassing Unity is unchanging, for the Unity already encompasses all that is and can be. Democritus accepted most of Parmenides' arguments, except for the idea that change is an illusion. He believed change was real, and if it was not then at least the illusion had to be explained. He thus supported the concept of void, and stated that the universe is made up of many Parmenidean entities that move around in the void; the void provides the space in which the atoms can pack or scatter differently. The different possible packings and scatterings within the void make up the shifting outlines and bulk of the objects that organisms feel, eat, hear and taste. While organisms may feel hot or cold, "hot" and "cold" have no real existence; they are sensations produced in organisms by the different packings and scatterings of the atoms in the void that compose the object that organisms sense as being "hot" or "cold".
The work of Democritus survives only in secondhand reports, some of which are unreliable or conflicting. Much of the best evidence of Democritus's theory of atomism is reported by Aristotle in his discussions of Democritus's and Plato's contrasting views on the types of indivisibles composing the natural world. Plato, if he had been familiar with the atomism of Democritus, would have objected to its mechanistic materialism: he argued that atoms just crashing into other atoms could never produce the beauty and form of the world.
False equivalence is a logical fallacy in which two opposing arguments appear to be logically equivalent when in fact they are not. This fallacy is categorized as a fallacy of inconsistency. A common way for this fallacy to be perpetuated is when one shared trait between two subjects is assumed to show equivalence, when equivalence is not the logical result. False equivalence is a common result when an anecdotal similarity is pointed out as equal, but the claim of equivalence does not hold up because the similarity is based on oversimplification or ignorance of additional factors. The pattern of the fallacy is as such: "If A is the set of c and d, and B is the set of d and e, then since they both contain d, A and B are equal". In fact, d is not even required to exist in both sets; only a passing similarity is required for this fallacy to be used (a short code sketch appears at the end of this entry). False equivalence arguments are used in journalism and in politics, where the minor flaws of one candidate may be compared to the major flaws of another. The following statements are examples of false equivalence:

"They're both living animals that metabolize chemical energy. There's no difference between a pet cat and a pet snail." The "equivalence" is in factors that are not relevant to the animals' suitability as pets.

"The Deepwater Horizon oil spill is no different from your neighbor dripping some oil on the ground when changing the oil in his car." The comparison is between things differing by many orders of magnitude: Deepwater Horizon spilled 210 million US gallons of oil, while a car's oil change involves a few quarts.

See also: Equivocation, False balance, False analogy, Tu quoque, Whataboutism, Wronger than wrong
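Returning to the set pattern given above, here is a minimal sketch in Python; the element names c, d and e are simply the placeholders from the schema, not anything from the source text. It shows that two sets sharing an element are not thereby equal.

```python
# Minimal sketch of the false-equivalence set pattern:
# A and B share the element "d", but that one shared trait
# does not make the two sets equal.
A = {"c", "d"}
B = {"d", "e"}

print(not A.isdisjoint(B))  # True: they share "d"
print(A == B)               # False: sharing one trait is not equivalence
```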
The sorites paradox is a paradox that arises from vague predicates. A typical formulation involves a heap of sand. Under the assumption that removing a single grain does not turn a heap into a non-heap, the paradox is to consider what happens when the process is repeated enough times: is a single remaining grain still a heap? If not, when did it change from a heap to a non-heap? The word "sorites" derives from the Greek word for heap; the paradox is so named because of its original characterization, attributed to Eubulides of Miletus. The paradox goes as follows: consider a heap of sand from which grains are individually removed. One might construct the argument, using premises, as follows:

Premise 1: 1,000,000 grains of sand is a heap of sand.
Premise 2: A heap of sand minus one grain is still a heap.

Repeated applications of Premise 2 force one to accept the conclusion that a heap may be composed of just one grain of sand. Read observes that "the argument is itself a heap, or sorites, of steps of modus ponens":

1,000,000 grains is a heap.
If 1,000,000 grains is a heap, then 999,999 grains is a heap.
So 999,999 grains is a heap.
If 999,999 grains is a heap, then 999,998 grains is a heap.
So 999,998 grains is a heap.
If ...
So 1 grain is a heap.

The tension between small changes and big consequences gives rise to the sorites paradox... There are many variations... consideration of the difference between being... and seeming.... Another formulation is to start with a grain of sand, which is not a heap, and to assume that adding a single grain of sand to something that is not a heap does not turn it into a heap. Inductively, this process can be repeated as much as one wants without ever constructing a heap. A more natural formulation of this variant is to assume a set of colored chips exists such that two adjacent chips vary in color too little for human eyesight to be able to distinguish between them. By induction on this premise, humans would not be able to distinguish between any colors. The removal of one drop from the ocean will not make it "not an ocean"; but since the volume of water in the ocean is finite, after enough removals even a litre of water left is still an ocean.
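The chain above is easy to mechanize. The following Python sketch is an illustration of the argument, not of any particular author's formalization: it applies Premise 2 repeatedly, one modus ponens step per grain removed, and ends by affirming the paradoxical conclusion.

```python
# Sorites chain as a loop: Premise 1 starts the chain, and each
# iteration is one modus ponens step using Premise 2.
def still_heap(is_heap: bool) -> bool:
    """Premise 2: removing one grain never turns a heap into a non-heap."""
    return is_heap

grains, is_heap = 1_000_000, True  # Premise 1: 1,000,000 grains is a heap
while grains > 1:
    grains -= 1
    is_heap = still_heap(is_heap)

print(grains, is_heap)  # 1 True -> the paradoxical "1 grain is a heap"
```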
This paradox can be reconstructed for a variety of predicates, for example with "tall", "rich", "old", "blue", "bald", and so on. Bertrand Russell argued that all of natural language, even logical connectives, is vague. Other similar paradoxes are the argument of the beard and the bald man paradox. On the face of it, there are some ways to avoid this conclusion. One may object to the first premise by denying that 1,000,000 grains of sand makes a heap. But 1,000,000 is just an arbitrarily large number, and the argument will go through with any such number. So the response must deny outright that there are such things as heaps. Peter Unger defends this solution. Alternatively, one may object to the second premise by stating that it is not true for all heaps of sand that removing one grain from it still makes a heap. A common first response to the paradox is to call any set of grains that has more than a certain number of grains in it a heap. If one were to set the "fixed boundary" at, say, 10,000 grains, one would claim that for fewer than 10,000 grains it is not a heap, while for 10,000 or more it is. However, such solutions are unsatisfactory, as there seems little significance to the difference between 9,999 grains and 10,000 grains.
The boundary, wherever it may be set, remains arbitrary, and so its precision is misleading. It is objectionable on both philosophical and linguistic grounds: the former on account of its arbitrariness, the latter on the ground that it is simply not how we use natural language. A second response attempts to find a fixed boundary that reflects common usage of the term. For example, a dictionary may define a "heap" as "a collection of things thrown together so as to form an elevation." This requires there to be enough grains that some grains rest atop others. Thus, adding one grain atop a single layer produces a heap, and removing the last grain above the bottom layer destroys the heap. Timothy Williamson and Roy Sorensen hold the approach that there are fixed boundaries but that they are necessarily unknowable. Supervaluationism is a semantics for dealing with irreferential singular terms and vagueness. It allows one to retain the usual tautological laws even when dealing with undefined truth values. As an example of a proposition about an irreferential singular term, consider the sentence "Pegasus likes licorice".
Since the name "Pegasus" fails to refer, no truth value can be assigned to the sentence. However, there are some statements about "Pegasus" which have definite truth values nevertheless, such as "Pegasus likes licorice or Pegasus doesn't like licorice". This sentence is an instance of the tautology "p ∨ ¬p", i.e. the valid schema "p or not-p". According to supervaluationism, it should be true regardless of whether or not its components have a truth value. Similarly, "1,000 grains of sand is a heap of sand" may be considered a border case having no truth value, but "1,000 grains of sand is a heap of sand, or 1,000 grains of sand is not a heap of sand" should be true. Let v be a classical valuation defined on every atomic sentence of the language L, and let A
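A small computational sketch may help here. It models supervaluationism for the heap predicate by quantifying over sharp cut-offs ("precisifications"); the threshold range and the function names are illustrative assumptions, not part of the formal semantics discussed above.

```python
# Supervaluationist sketch: a sentence is supertrue if it comes out
# true under every admissible sharpening ("precisification") of the
# vague predicate. Here each precisification is a sharp cut-off t:
# "n grains is a heap" iff n >= t. The range of cut-offs is assumed.
THRESHOLDS = range(2, 1_000_001)

def supervaluate(sentence) -> str:
    values = {sentence(t) for t in THRESHOLDS}
    if values == {True}:
        return "supertrue"
    if values == {False}:
        return "superfalse"
    return "no truth value (borderline)"

def heap(n, t):
    return n >= t

print(supervaluate(lambda t: heap(1_000, t)))
# -> "no truth value (borderline)": true under some cut-offs, false under others
print(supervaluate(lambda t: heap(1_000, t) or not heap(1_000, t)))
# -> "supertrue": the tautology "p or not-p" holds under every precisification
```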
Begging the question
Begging the question is an informal fallacy that occurs when an argument's premises assume the truth of the conclusion instead of supporting it. It is a type of circular reasoning: an argument that requires that the desired conclusion be true. This often occurs in an indirect way such that the fallacy's presence is hidden, or at least not easily apparent. The phrase begging the question originated in the 16th century as a mistranslation of the Latin petitio principii, which translates to "assuming the initial point". In modern vernacular usage, "begging the question" is often used to mean "raising the question" or "dodging the question". In contexts that demand strict adherence to a technical definition of the term, many consider these usages incorrect. Consider, for example, arguments of the form "Africa is the largest continent, because it has more area than any other continent" or "left-handed people are better painters, because left-handers have superior painting ability". Both these arguments are logically valid, in that if one assumes the initial premise is correct the conclusion logically follows, but they are flawed because assuming the initial premise is valid means assuming the conclusion is as well. The original phrase used by Aristotle from which begging the question descends is: τὸ ἐξ ἀρχῆς αἰτεῖν, "asking for the initial thing."
Aristotle's intended meaning is tied to the type of dialectical argument he discusses in his Topics, book VIII: a formalized debate in which the defending party asserts a thesis that the attacking party must attempt to refute by asking yes-or-no questions and deducing some inconsistency between the responses and the original thesis. In this stylized form of debate, the proposition that the answerer undertakes to defend is called "the initial thing", and one of the rules of the debate is that the questioner cannot simply ask for it. Aristotle discusses this in Sophistical Refutations and in Prior Analytics book II. The stylized dialectical exchanges Aristotle discusses in the Topics included rules for scoring the debate, and one important issue was precisely the matter of asking for the initial thing, which included not just making the actual thesis adopted by the answerer into a question, but also making a question out of a sentence that was too close to that thesis. The term was translated into English from Latin in the 16th century.
The Latin version, petitio principii, "asking for the starting point", can be interpreted in different ways. Petitio, in the post-classical context in which the phrase arose, means assuming or postulating, but in the older classical sense it means petition, request or beseeching. Principii, genitive of principium, means basis or premise. Petitio principii thus means "assuming the premise" or "assuming the original point". The Latin phrase comes from the Greek τὸ ἐν ἀρχῇ αἰτεῖσθαι in Aristotle's Prior Analytics II xvi 64b28–65a26:

Begging or assuming the point at issue consists in failing to demonstrate the required proposition. But there are several other ways in which this may happen. Now begging the question is none of these. If, however, the relation of B to C is such that they are identical, or that they are convertible, or that one applies to the other, then he is begging the point at issue.... Begging the question is proving what is not self-evident by means of itself... either because predicates which are identical belong to the same subject, or because the same predicate belongs to subjects which are identical.
Aristotle's distinction between apodictic science and other forms of non-demonstrative knowledge rests on an epistemology and metaphysics wherein appropriate first principles become apparent to the trained dialectician. Aristotle's advice in S. E. 27 for resolving fallacies of begging the question is brief: if one realizes that one is being asked to concede the original point, one should refuse to do so, even if the point being asked is a reputable belief. On the other hand, if one fails to realize that one has conceded the point at issue and the questioner uses the concession to produce the apparent refutation, then one should turn the tables on the sophistical opponent by oneself pointing out the fallacy committed. In dialectical exchange it is a worse mistake to be caught asking for the original point than to have inadvertently granted such a request. The answerer in such a position has failed to detect when different utterances mean the same thing. The questioner, if he did not realize he was asking the original point, has committed the same error.
But if he has knowingly asked for the original point, he reveals himself to be ontologically confused: he has mistaken what is non-self-explanatory to be something self-explanatory. In pointing this out to the false reasoner, one is not just pointing out a tactical psychological misjudgment by the questioner. It is not that the questioner falsely thought that the original point, if placed under the guise of a semantic equivalent, or a logical equivalent, or a covering universal, or divided up into exhaustive parts, would be more persuasive to the answerer. Rather, the questioner falsely thought that a non-self-explanatory fact about the world was an explanatory first principle. For Aristotle, that certain facts are self-explanatory while other
Classical mechanics describes the motion of macroscopic objects, from projectiles to parts of machinery, and of astronomical objects, such as spacecraft, planets and galaxies. If the present state of an object is known, it is possible to predict by the laws of classical mechanics how it will move in the future and how it has moved in the past. The earliest development of classical mechanics is often referred to as Newtonian mechanics. It consists of the physical concepts employed by, and the mathematical methods invented by, Isaac Newton, Gottfried Wilhelm Leibniz and others in the 17th century to describe the motion of bodies under the influence of a system of forces. Later, more abstract methods were developed, leading to the reformulations of classical mechanics known as Lagrangian mechanics and Hamiltonian mechanics. These advances, made predominantly in the 18th and 19th centuries, extend beyond Newton's work through their use of analytical mechanics. They are, with some modification, used in all areas of modern physics.
Classical mechanics provides accurate results when studying large objects that are not extremely massive and whose speeds do not approach the speed of light. When the objects being examined are about the size of an atom's diameter, it becomes necessary to introduce the other major sub-field of mechanics: quantum mechanics. To describe velocities that are not small compared to the speed of light, special relativity is needed. In cases where objects become extremely massive, general relativity becomes applicable. However, a number of modern sources do include relativistic mechanics in classical physics, which in their view represents classical mechanics in its most developed and accurate form. The following introduces the basic concepts of classical mechanics. For simplicity, it models real-world objects as point particles, objects with negligible size. The motion of a point particle is characterized by a small number of parameters: its position, mass and the forces applied to it. Each of these parameters is discussed in turn. In reality, the kind of objects that classical mechanics can describe always have a non-zero size.
Objects with non-zero size have more complicated behavior than hypothetical point particles, because of the additional degrees of freedom: a baseball, for example, can spin while it is moving. However, the results for point particles can be used to study such objects by treating them as composite objects, made of a large number of collectively acting point particles; the center of mass of a composite object behaves like a point particle. Classical mechanics uses common-sense notions of how matter and forces interact. It assumes that matter and energy have definite, knowable attributes such as location in space and speed. Non-relativistic mechanics also assumes that forces act instantaneously. The position of a point particle is defined in relation to a coordinate system centered on an arbitrary fixed reference point in space called the origin O. A simple coordinate system might describe the position of a particle P with a vector notated by an arrow labeled r that points from the origin O to point P. In general, the point particle does not need to be stationary relative to O.
In cases where P is moving relative to O, r is defined as a function of time. In pre-Einstein relativity, time is considered an absolute, i.e. the time interval that is observed to elapse between any given pair of events is the same for all observers. In addition to relying on absolute time, classical mechanics assumes Euclidean geometry for the structure of space. The velocity, or the rate of change of position with time, is defined as the derivative of the position with respect to time: v = dr/dt. In classical mechanics, velocities are directly additive and subtractive. For example, if one car travels east at 60 km/h and passes another car traveling in the same direction at 50 km/h, the slower car perceives the faster car as traveling east at 60 − 50 = 10 km/h. However, from the perspective of the faster car, the slower car is moving 10 km/h to the west, often denoted as −10 km/h, where the sign implies opposite direction. Velocities are directly additive as vector quantities. Mathematically, if the velocity of the first object in the previous discussion is denoted by the vector u = ud and the velocity of the second object by the vector v = ve, where u is the speed of the first object, v is the speed of the second object, and d and e are unit vectors in the directions of motion of each object, then the velocity of the first object as seen by the second object is u′ = u − v. Similarly, the first object sees the velocity of the second object as v′ = v − u.
When both objects are moving in the same direction, this equation can be simplified to u′ = (u − v)d. Or, by ignoring direction, the difference can be given in terms of speed only: u′ = u − v. The acceleration, or rate of change of velocity, is the derivative of the velocity with respect to time: a = dv/dt.
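A short numeric sketch of these rules follows. The car speeds come from the example above; the sample trajectory r(t) is an assumption made up for illustration.

```python
import numpy as np

# Relative velocity u' = u - v, using the two-car example
# (speeds in km/h, east is the +x direction).
u = np.array([60.0, 0.0])   # faster car: 60 km/h east
v = np.array([50.0, 0.0])   # slower car: 50 km/h east
print(u - v)                # [10.  0.] -> seen from the slower car: 10 km/h east
print(v - u)                # [-10.  0.] -> seen from the faster car: 10 km/h west

# Velocity and acceleration as derivatives, estimated by finite
# differences for an assumed trajectory r(t) = (t^2, 3t);
# exactly, v(t) = (2t, 3) and a(t) = (2, 0).
def r(t):
    return np.array([t**2, 3.0 * t])

dt = 1e-6
t0 = 2.0
v_est = (r(t0 + dt) - r(t0 - dt)) / (2 * dt)           # central difference: dr/dt
a_est = (r(t0 + dt) - 2 * r(t0) + r(t0 - dt)) / dt**2  # second derivative: dv/dt
print(v_est)  # approximately [4. 3.]
print(a_est)  # approximately [2. 0.]
```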
A fuzzy concept is a concept whose boundaries of application can vary according to context or conditions, instead of being fixed once and for all. This means the concept is vague in some way, lacking a fixed, precise meaning, without however being unclear or meaningless altogether. It has a definite meaning, which can be made more precise only through further elaboration and specification, including a closer definition of the context in which the concept is used. The study of the characteristics of fuzzy concepts and fuzzy language is called fuzzy semantics. The inverse of a "fuzzy concept" is a "crisp concept". A fuzzy concept is understood by scientists as a concept that is "to an extent applicable" in a situation; that means the concept has gradations of significance or unsharp boundaries of application. A fuzzy statement is a statement that is true "to some extent", and that extent can be represented by a scaled value. The best-known example of a fuzzy concept around the world is an amber traffic light, and indeed fuzzy concepts are used in traffic control systems.
The term is also used these days in a more general, popular sense, in contrast to its technical meaning, to refer to a concept that is "rather vague" for any kind of reason. In the past, the idea of reasoning with fuzzy concepts faced considerable resistance from academic elites who did not want to endorse the use of imprecise concepts in argumentation. Yet although people might not be aware of it, the use of fuzzy concepts has risen gigantically in all walks of life from the 1970s onward. That is due to advances in electronic engineering, fuzzy mathematics and digital computer programming. The new technology allows complex inferences about "variations on a theme" to be anticipated and fixed in a program. The new neuro-fuzzy computational methods make it possible to identify, to measure and to respond to fine gradations of significance with great precision. It means that useful concepts can be coded and applied to all kinds of tasks, even if these concepts are never precisely defined. Nowadays engineers and programmers represent fuzzy concepts mathematically, using fuzzy logic, fuzzy values, fuzzy variables and fuzzy sets.
Problems of vagueness and fuzziness have always existed in human experience. The boundary between different things can appear blurry. Sometimes people have to reason when they are not in the best frame of mind to do it, or they have to talk about something out there which just isn't sharply defined. Across time, however, philosophers and scientists began to reflect about those kinds of problems in much more systematic ways. The ancient sorites paradox first raised the logical problem of how we could define the threshold at which a change in quantitative gradation turns into a qualitative or categorical difference. With some physical processes this threshold is easy to identify. For example, water turns into steam at 100 °C or 212 °F. With many other processes and gradations, however, the point of change is much more difficult to locate, and remains somewhat vague. Thus, the boundaries between qualitatively different things may be unsharp: we know that there are boundaries, but we cannot define them exactly. According to the modern idea of the continuum fallacy, the fact that a statement is to an extent vague does not automatically mean that it is invalid.
The problem then becomes one of how we could ascertain the kind of validity that the statement does have. The Nordic myth of Loki's wager suggested that concepts that lack precise meanings or precise boundaries of application cannot be usefully discussed at all. However, the 20th-century idea of "fuzzy concepts" proposes that "somewhat vague terms" can be operated with, since we can explicate and define the variability of their application by assigning numbers to gradations of applicability. This idea sounds simple enough. The intellectual origins of fuzzy concepts as a logical category have been traced back to a diversity of famous and less well-known thinkers, including Eubulides, Cicero, Georg Wilhelm Friedrich Hegel, Karl Marx and Friedrich Engels, Friedrich Nietzsche, Hugh MacColl, Charles S. Peirce, Max Black, Jan Łukasiewicz, Emil Leon Post, Alfred Tarski, Georg Cantor, Nicolai A. Vasiliev, Kurt Gödel, Stanisław Jaśkowski and Donald Knuth. Across at least two and a half millennia, all of them had something to say about graded concepts with unsharp boundaries.
This suggests at least that the awareness of the existence of concepts with "fuzzy" characteristics, in one form or another, has a long history in human thought. Quite a few logicians and philosophers have tried to analyze the characteristics of fuzzy concepts as a recognized species, sometimes with the aid of some kind of many-valued logic or substructural logic. An early attempt in the post-WW2 era to create a theory of sets where set membership is a matter of degree was made by Abraham Kaplan and Hermann Schott in 1951; they intended to apply the idea to empirical research. Kaplan and Schott measured the degree of membership of empirical classes using real numbers between 0 and 1, and they defined corresponding notions of intersection, union and subset. However, at the time, their idea "fell on stony ground". J. Barkley Rosser Sr. published a treatise on many-valued logics in 1952, anticipating "many-valued sets". Another treatise was published in 1963 by Aleksandr A. Zinov'ev and others. In 1964, the American philosopher William Alston introduced the term "degree vagueness" to describe vagueness in an idea that results from the absence of a definite cut-off point along some dimension.
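Here is a minimal sketch of the Kaplan and Schott idea in Python. The membership values are invented for illustration, and the min/max definitions of intersection and union follow the now-standard fuzzy-set treatment, which is an assumption here rather than a claim about their original 1951 definitions.

```python
# Graded set membership: each element belongs to a class to a degree
# between 0 and 1, and intersection, union and subset are defined pointwise.
tall = {"ana": 0.9, "bo": 0.4, "cy": 0.0}
old  = {"ana": 0.6, "bo": 0.7, "cy": 0.1}

intersection = {x: min(tall[x], old[x]) for x in tall}  # tall AND old
union        = {x: max(tall[x], old[x]) for x in tall}  # tall OR old
tall_subset_of_old = all(tall[x] <= old[x] for x in tall)

print(intersection)        # {'ana': 0.6, 'bo': 0.4, 'cy': 0.0}
print(union)               # {'ana': 0.9, 'bo': 0.7, 'cy': 0.1}
print(tall_subset_of_old)  # False: ana's degree of "tall" exceeds her "old"
```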
Quantum mechanics, including quantum field theory, is a fundamental theory in physics which describes nature at the smallest scales of energy levels of atoms and subatomic particles. Classical physics, the physics existing before quantum mechanics, describes nature at ordinary (macroscopic) scale. Most theories in classical physics can be derived from quantum mechanics as an approximation valid at large (macroscopic) scale. Quantum mechanics differs from classical physics in that energy, momentum, angular momentum and other quantities of a bound system are restricted to discrete values (quantization). Quantum mechanics arose from theories to explain observations which could not be reconciled with classical physics, such as Max Planck's solution in 1900 to the black-body radiation problem, and from the correspondence between energy and frequency in Albert Einstein's 1905 paper which explained the photoelectric effect. Early quantum theory was profoundly re-conceived in the mid-1920s by Erwin Schrödinger, Werner Heisenberg, Max Born and others. The modern theory is formulated in various specially developed mathematical formalisms.
In one of them, a mathematical function, the wave function, provides information about the probability amplitude of position, momentum and other physical properties of a particle. Important applications of quantum theory include quantum chemistry, quantum optics, quantum computing, superconducting magnets, light-emitting diodes, the laser, the transistor and semiconductors such as the microprocessor, and medical and research imaging such as magnetic resonance imaging and electron microscopy. Explanations for many biological and physical phenomena are rooted in the nature of the chemical bond, most notably the macro-molecule DNA. Scientific inquiry into the wave nature of light began in the 17th and 18th centuries, when scientists such as Robert Hooke, Christiaan Huygens and Leonhard Euler proposed a wave theory of light based on experimental observations. In 1803, Thomas Young, an English polymath, performed the famous double-slit experiment that he later described in a paper titled On the nature of light and colours.
This experiment played a major role in the general acceptance of the wave theory of light. In 1838, Michael Faraday discovered cathode rays. These studies were followed by the 1859 statement of the black-body radiation problem by Gustav Kirchhoff, the 1877 suggestion by Ludwig Boltzmann that the energy states of a physical system can be discrete, and the 1900 quantum hypothesis of Max Planck. Planck's hypothesis that energy is radiated and absorbed in discrete "quanta" matched the observed patterns of black-body radiation. In 1896, Wilhelm Wien empirically determined a distribution law of black-body radiation, known as Wien's law in his honor. Ludwig Boltzmann independently arrived at this result by considerations of Maxwell's equations. However, Wien's law was valid only at high frequencies and underestimated the radiance at low frequencies. Planck later corrected this model using Boltzmann's statistical interpretation of thermodynamics and proposed what is now called Planck's law, which led to the development of quantum mechanics. Following Max Planck's solution in 1900 to the black-body radiation problem, Albert Einstein offered a quantum-based theory to explain the photoelectric effect.
Around 1900–1910, the atomic theory and the corpuscular theory of light first came to be widely accepted as scientific fact. Among the first to study quantum phenomena in nature were Arthur Compton, C. V. Raman and Pieter Zeeman, each of whom has a quantum effect named after him. Robert Andrews Millikan studied the photoelectric effect experimentally, and Albert Einstein developed a theory for it. At the same time, Ernest Rutherford experimentally discovered the nuclear model of the atom, for which Niels Bohr developed his theory of the atomic structure, later confirmed by the experiments of Henry Moseley. In 1913, Peter Debye extended Niels Bohr's theory of atomic structure, introducing elliptical orbits, a concept also introduced by Arnold Sommerfeld; this phase is known as old quantum theory. According to Planck, each energy element is proportional to its frequency: E = hν, where h is Planck's constant. Planck cautiously insisted that this was an aspect of the processes of absorption and emission of radiation and had nothing to do with the physical reality of the radiation itself.
In fact, he considered his quantum hypothesis a mathematical trick to get the right answer rather than a sizable discovery. However, in 1905 Albert Einstein interpreted Planck's quantum hypothesis realistically and used it to explain the photoelectric effect, in which shining light on certain materials can eject electrons from the material; he won the 1921 Nobel Prize in Physics for this work. Einstein further developed this idea to show that an electromagnetic wave such as light could also be described as a particle, with a discrete quantum of energy that was dependent on its frequency. The foundations of quantum mechanics were established during the first half of the 20th century by Max Planck, Niels Bohr, Werner Heisenberg, Louis de Broglie, Arthur Compton, Albert Einstein, Erwin Schrödinger, Max Born, John von Neumann, Paul Dirac, Enrico Fermi, Wolfgang Pauli, Max von Laue, Freeman Dyson, David Hilbert, Wi
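As a quick numerical illustration of Planck's relation E = hν quoted above, the following sketch computes the energy of one quantum of green light; the frequency chosen is a typical round value assumed for illustration, not taken from the source.

```python
# Photon energy from Planck's relation E = h * nu.
h = 6.626e-34        # Planck's constant in joule-seconds
nu = 5.6e14          # frequency of green light in hertz (illustrative value)

E = h * nu
print(E)              # ~3.7e-19 J per quantum
print(E / 1.602e-19)  # ~2.3 eV, the energy scale of the photoelectric effect
```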