Induced demand – related to latent demand and generated demand – is the phenomenon that, after supply increases, more of a good is consumed. This is consistent with the economic theory of supply and demand; in transport, this phenomenon, called induced traffic, is a contributing factor to urban sprawl. City planner Jeff Speck has called induced demand "the great intellectual black hole in city planning, the one professional certainty that everyone thoughtful seems to acknowledge, yet no one is willing to act upon." The inverse effect, reduced demand, is also observed. According to CityLab: Induced demand is used as a catch-all term for a variety of interconnected effects that cause new roads to fill up to capacity. In growing areas where roads were not designed for the current population, there may be a great deal of latent demand for new road capacity, which causes a flood of new drivers to take to the freeway once the new lanes open, clogging them up again; but these individuals were already living nearby.
They may have taken alternative modes of transport, traveled at off-peak hours, or not made those trips at all. That is why latent demand can be difficult to disentangle from generated demand – the new traffic that is a direct result of the new capacity. The technical distinction between the two terms, which are often used interchangeably, is that latent demand is travel that cannot be realized because of constraints and is thus "pent-up", while induced demand is demand that has been realized, or "generated", by improvements made to transportation infrastructure. Thus, improvements to infrastructure release as induced traffic the demand that was pent up as latent demand. Latent demand has been recognized by road traffic professionals for many decades, and was long referred to as "traffic generation". In the simplest terms, latent demand is demand that exists – for any number of reasons, most having to do with human psychology – but is suppressed by the inability of the system to handle it. Once additional capacity is added to the network, the latent demand materializes as actual usage.
The effect was recognized as early as 1930, when an executive of a St. Louis, Missouri electric railway company told a Transportation Survey Commission that widening streets produces more traffic and heavier congestion. In New York, it was seen in the highway-building program of Robert Moses, the "master builder" of the New York City area. As described by Moses' biographer, Robert Caro, in The Power Broker: During the last two or three years before, a few planners had...begun to understand that, without a balanced system, roads would not only not alleviate transportation congestion but would aggravate it. Watching Moses open the Triborough Bridge to ease congestion on the Queensborough Bridge, open the Bronx-Whitestone Bridge to ease congestion on the Triborough Bridge, and watching traffic counts on all three bridges mount until all three were as congested as one had been before, planners could hardly avoid the conclusion that "traffic generation" was no longer a theory but a proven fact: the more highways were built to alleviate congestion, the more automobiles would pour onto them and congest them and thus force the building of more highways – which would generate more traffic and become congested in their turn in an ever-widening spiral that contained the most awesome implications for the future of New York and of all urban areas.
The same effect had been seen earlier with the new parkways that Moses had built on Long Island in the 1930s and 40s, where...every time a new parkway was built, it became jammed with traffic, but the load on the old parkways was not relieved. The building of the Brooklyn-Battery Tunnel failed to ease congestion on the Queens-Midtown Tunnel and the three East River bridges, as Moses had expected it to. By 1942, Moses could no longer ignore the reality that his roads were not alleviating congestion in the way he expected them to, but his answer to the problem was not to invest in mass transit; it was to build more roads, in a vast program which would expand or newly create 200 miles of roads, including additional bridges such as the Throgs Neck Bridge and the Verrazano-Narrows Bridge. J. J. Leeming, a British road-traffic engineer and county surveyor between 1924 and 1964, described the phenomenon in his 1969 book, Road Accidents: Prevent or Punish?: Motorways and bypasses generate traffic, that is, produce extra traffic, partly by inducing people to travel who would not otherwise have done so by making the new route more convenient than the old, partly by people who go out of their direct route to enjoy the greater convenience of the new road, and partly by people who use the towns bypassed because they are more convenient for shopping and visits when through traffic has been removed.
Leeming went on to give an example of the observed effect following the opening of the Doncaster Bypass section of the A1 in 1961. By 1998, Donald Chen quoted the British Transport Minister as saying, "The fact of the matter is that we cannot tackle our traffic problem by building more roads." In Southern California, a 1989 study by the Southern California Association of Governments concluded that steps taken to alleviate traffic congestion, such as adding lanes or turning freeways into double-decked roads, would have nothing but a cosmetic effect on the problem.
J. Bradford DeLong
James Bradford "Brad" DeLong is an economic historian and professor of economics at the University of California, Berkeley. DeLong served as Deputy Assistant Secretary of the U.S. Department of the Treasury in the Clinton administration under Lawrence Summers. He is an active blogger whose "Grasping Reality with Both Invisible Hands" covers political and economic issues as well as criticism of their media coverage. According to the 2019 ranking of economists by Research Papers in Economics, DeLong is the 746th most influential economist. He graduated summa cum laude from Harvard University in 1982, followed by an M.A. and a PhD in economics, in 1985 and 1987 respectively, also from Harvard. After earning his PhD, he taught economics at universities in the Boston area, including MIT, Boston University, and Harvard University, from 1987 to 1993. He was a John M. Olin Fellow at the National Bureau of Economic Research in 1991–1992, and joined UC Berkeley as an associate professor in 1993. From April 1993 to May 1995, he served as Deputy Assistant Secretary for Economic Policy at the U.
S. Department of the Treasury in Washington, D.C. As an official in the Treasury Department in the Clinton administration, he worked on the 1993 federal budget, the unsuccessful health care reform effort, other policies, and several trade issues, including the Uruguay Round of the General Agreement on Tariffs and Trade and the North American Free Trade Agreement. He became a full professor at Berkeley in 1997 and has been there since. He has been a research associate of the National Bureau of Economic Research, a visiting scholar at the Federal Reserve Bank of San Francisco, and an Alfred P. Sloan Research Fellow. Along with Joseph Stiglitz and Aaron Edlin, DeLong is co-editor of The Economists' Voice, and has been co-editor of the Journal of Economic Perspectives. He is the author of a textbook, the second edition of which he coauthored with Martha Olney. He writes a monthly syndicated op-ed column for Project Syndicate. DeLong lives in Berkeley with his wife Ann Marie Marciarille, a professor of law at the University of Missouri-Kansas City.
DeLong considers himself a free-trade liberal. He has cited Adam Smith, John Maynard Keynes, Andrei Shleifer, Milton Friedman, and Lawrence Summers as the economists who have had the greatest influence on his views. In 1990 and 1991, DeLong and Lawrence Summers co-wrote two theoretical papers that were to become critical theoretical underpinnings for the financial deregulation put in place when Summers was Secretary of the Treasury under Bill Clinton. In March 2008, DeLong endorsed Barack Obama as the Democratic Party candidate for President. DeLong has been a critic of his Berkeley colleague John Yoo, a law professor who worked in the Office of Legal Counsel under President George W. Bush. Yoo authored the torture memos authorizing the Bush administration to use torture during the war on terror, and helped craft the unitary executive theory. DeLong wrote a letter to Berkeley Chancellor Robert Birgeneau in February 2009 calling for Yoo's dismissal. DeLong maintains a political commentary site, "Brad DeLong's Egregious Moderation", and contributed to Shrillblog, a blog critical of the Republican Party and the Bush administration.
The blog originated in a conversation among DeLong, Tyler Cowen, and Andrew Northrup regarding the use of the term "shrill" as a criticism of New York Times columnist and fellow academic economist Paul Krugman. According to his faculty webpage, his research interests include "comparative technological and industrial revolutions". His papers include "Noise Trader Risk in Financial Markets", "Equipment Investment and Economic Growth", "In Defense of Mexico's Rescue", "Princes and Merchants: European City Growth before the Industrial Revolution", "The Marshall Plan: History's Most Successful Structural Adjustment Programme" (doi:10.3386/w3899), "Between Meltdown and Moral Hazard: The International Monetary and Financial Policy of the Clinton Administration", "Review of Robert Skidelsky, John Maynard Keynes, volume 3, Fighting for Britain", "The Triumph of Monetarism?", "Asset Returns and Economic Growth", "Productivity Growth in the 2000s", "The New Economy: Background, Speculations", "Speculative Microeconomics for Tomorrow's Economy", "America's Peacetime Inflation", "Keynesianism, Pennsylvania-Avenue Style", "Productivity and Machinery Investment: A Long-Run Look, 1870-1980", and "The Stock Market Bubble of 1929".
Microeconomics is a branch of economics that studies the behaviour of individuals and firms in making decisions regarding the allocation of scarce resources, and the interactions among these individuals and firms. One goal of microeconomics is to analyze the market mechanisms that establish relative prices among goods and services and allocate limited resources among alternative uses. Microeconomics shows the conditions under which markets produce desirable allocations; it also analyzes market failure, where markets fail to produce efficient results. Microeconomics stands in contrast to macroeconomics, which involves "the sum total of economic activity, dealing with the issues of growth and unemployment and with national policies relating to these issues". Microeconomics deals with the effects of economic policies on microeconomic behavior and thus on the aforementioned aspects of the economy. In the wake of the Lucas critique, much of modern macroeconomic theory has been built upon microfoundations, i.e. based upon basic assumptions about micro-level behavior.
Microeconomic theory begins with the study of a single rational and utility-maximizing individual. To economists, rationality means an individual possesses stable preferences that are both complete and transitive; the technical assumption that preference relations are continuous is needed to ensure the existence of a utility function. Although microeconomic theory can continue without this assumption, it would make comparative statics impossible, since there is no guarantee that the resulting utility function would be differentiable. Microeconomic theory progresses by defining a competitive budget set, a subset of the consumption set. It is at this point that economists make the technical assumption that preferences are locally non-satiated (LNS). Without the assumption of LNS, there is no guarantee that a rational individual would maximize utility. With the necessary tools and assumptions in place, the utility maximization problem is developed; the utility maximization problem is the heart of consumer theory.
The utility maximization problem attempts to explain the action axiom by imposing rationality axioms on consumer preferences and mathematically modeling and analyzing the consequences. The utility maximization problem serves not only as the mathematical foundation of consumer theory but as a metaphysical explanation of it as well; that is, the utility maximization problem is used by economists to not only explain what or how individuals make choices but why individuals make choices as well. The utility maximization problem is a constrained optimization problem in which an individual seeks to maximize utility subject to a budget constraint. Economists use the extreme value theorem to guarantee that a solution to the utility maximization problem exists; that is, since the budget constraint is both bounded and closed, a solution to the utility maximization problem exists. Economists call the solution to the utility maximization problem a Walrasian demand function or correspondence; the utility maximization problem has so far been developed by taking consumer tastes as the primitive.
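As a concrete sketch, the utility maximization problem has a closed-form Walrasian demand for Cobb-Douglas preferences. The function names and parameter values below are illustrative assumptions, not from the source:

```python
# Sketch: utility maximization for Cobb-Douglas preferences
# U(x, y) = x**a * y**(1 - a), subject to the budget px*x + py*y = m.
# The Walrasian demand has the closed form x* = a*m/px, y* = (1-a)*m/py.

def walrasian_demand(a, px, py, m):
    """Utility-maximizing bundle under the budget constraint px*x + py*y = m."""
    return (a * m / px, (1 - a) * m / py)

def utility(x, y, a):
    return x ** a * y ** (1 - a)

a, px, py, m = 0.5, 2.0, 1.0, 100.0
x_star, y_star = walrasian_demand(a, px, py, m)
print(x_star, y_star)  # 25.0 50.0

# Sanity check: a brute-force grid search along the budget line should not
# beat the closed-form optimum (up to grid resolution).
best_utility, best_x = max(
    (utility(x, (m - px * x) / py, a), x)
    for x in (i * 0.1 for i in range(1, int(m / px / 0.1)))
)
assert abs(best_x - x_star) < 0.1
```

The grid search is only a verification device; in consumer theory the demand function is derived analytically from the first-order conditions.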
However, an alternative way to develop microeconomic theory is by taking consumer choice as the primitive. This model of microeconomic theory is referred to as revealed preference theory. The theory of supply and demand assumes that markets are competitive. This implies that there are many buyers and sellers in the market and none of them have the capacity to influence prices of goods and services. In many real-life transactions, the assumption fails because some individual buyers or sellers have the ability to influence prices. Quite often, a sophisticated analysis is required to understand the demand-supply equation of a good. However, the theory works well in situations meeting these assumptions. Mainstream economics does not assume a priori that markets are preferable to other forms of social organization. In fact, much analysis is devoted to cases where market failures lead to resource allocation that is suboptimal and creates deadweight loss. A classic example of suboptimal resource allocation is that of a public good.
In such cases, economists may attempt to find policies that avoid waste, either directly by government control, indirectly by regulation that induces market participants to act in a manner consistent with optimal welfare, or by creating "missing markets" to enable efficient trading where none had existed. This is studied in the field of public choice theory. "Optimal welfare" takes on a Paretian norm, a mathematical application of the Kaldor–Hicks method. This can diverge from the utilitarian goal of maximizing utility because it does not consider the distribution of goods between people. Market failure in positive economics is limited in implications without mixing the beliefs of the economist and their theory. The demand for various commodities by individuals is thought of as the outcome of a utility-maximizing process, with each individual trying to maximize their own utility under a budget constraint and a given consumption set. The study of microeconomics involves several "key" areas: Supply and demand is an economic model of price determination in a competitive market.
It concludes that in a competitive market with no externalities, per-unit taxes, or price controls, the unit price for a particular good is the price at which the quantity demanded by consumers equals the quantity supplied by producers. This price results in a stable economic equilibrium. Elasticity is the measurement of how responsive an economic variable is to a change in another.
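The equilibrium condition above can be illustrated numerically. The linear demand and supply schedules below are made-up examples, not from the source:

```python
# Sketch: price determination in a competitive market with hypothetical
# linear schedules:
#   Qd = a - b*P   (quantity demanded falls with price)
#   Qs = c + d*P   (quantity supplied rises with price)
# Setting Qd = Qs gives the market-clearing price P* = (a - c) / (b + d).

def equilibrium(a, b, c, d):
    p_star = (a - c) / (b + d)
    q_star = a - b * p_star
    return p_star, q_star

a, b, c, d = 120.0, 2.0, 0.0, 4.0
p_star, q_star = equilibrium(a, b, c, d)
print(p_star, q_star)  # 20.0 80.0

# Point price elasticity of demand at equilibrium: (dQd/dP) * (P/Q).
elasticity = -b * p_star / q_star
print(elasticity)  # -0.5
```

At this price, every consumer willing to pay at least P* and every producer willing to sell at most P* can trade, which is the stable equilibrium the text describes.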
In economics, an indifference curve connects points on a graph representing different quantities of two goods, points between which a consumer is indifferent. That is, the consumer has no preference for one combination or bundle of goods over a different combination on the same curve. One can refer to each point on the indifference curve as rendering the same level of utility for the consumer. In other words, an indifference curve is the locus of various points showing different combinations of two goods providing equal utility to the consumer. Utility is a device to represent preferences rather than something from which preferences come; the main use of indifference curves is in the representation of observable demand patterns for individual consumers over commodity bundles. There are infinitely many indifference curves: one passes through each combination. A collection of indifference curves, illustrated graphically, is referred to as an indifference map; the theory of indifference curves was developed by Francis Ysidro Edgeworth, who explained in his 1881 book the mathematics needed for their drawing.
The theory can be derived from William Stanley Jevons' ordinal utility theory, which posits that individuals can always rank any consumption bundles by order of preference. A graph of indifference curves for several utility levels of an individual consumer is called an indifference map. Points yielding different utility levels are each associated with distinct indifference curves, and these indifference curves on the indifference map are like contour lines on a topographical map; each point on a given curve represents the same elevation. If you move "off" an indifference curve traveling in a northeast direction, you are climbing a mound of utility; the higher you go, the greater the level of utility. The non-satiation requirement means that you will never reach the "top", or a "bliss point" – a consumption bundle preferred to all others. Indifference curves are typically assumed to be: Defined only in the non-negative quadrant of commodity quantities. Negatively sloped; that is, as the quantity consumed of one good increases, total satisfaction would increase if not offset by a decrease in the quantity consumed of the other good.
Equivalently, satiation, such that more of either good is equally preferred to no increase, is excluded. The negative slope of the indifference curve reflects the assumption of the monotonicity of consumer's preferences, which generates monotonically increasing utility functions, and the assumption of non-satiation. Because of monotonicity of preferences and non-satiation, a bundle with more of both goods must be preferred to one with less of both; thus the first bundle must yield a higher utility and lie on a different indifference curve at a higher utility level. The negative slope of the indifference curve implies that the marginal rate of substitution is always positive, so no two curves can intersect. Transitive with respect to points on distinct indifference curves; that is, if each point on I2 is preferred to each point on I1, and each point on I3 is preferred to each point on I2, then each point on I3 is preferred to each point on I1. A negative slope and transitivity exclude indifference curves crossing, since straight lines from the origin on both sides of where they crossed would give opposite and intransitive preference rankings.
Convex: convex preferences imply that the indifference curves cannot be concave to the origin, i.e. they will either be straight lines or bulge toward the origin. If the latter is the case, then as a consumer decreases consumption of one good in successive units, successively larger amounts of the other good are required to keep satisfaction unchanged. Preferences are complete: the consumer has ranked all available alternative combinations of commodities in terms of the satisfaction they provide him. Assume that there are two consumption bundles A and B, each containing two commodities x and y. A consumer can unambiguously determine that one and only one of the following is the case: A is preferred to B, formally written as A p B; B is preferred to A, formally written as B p A; or A is indifferent to B, formally written as A I B. This axiom precludes the possibility that the consumer cannot decide, and it assumes that a consumer is able to make this comparison with respect to every conceivable bundle of goods.
Preferences are reflexive: if A and B are identical in all respects, the consumer will recognize this fact and be indifferent in comparing A and B, i.e. A = B ⇒ A I B. Preferences are transitive: if A p B and B p C, then A p C; likewise, if A I B and B I C, then A I C. This is a consistency assumption. Preferences are continuous: if A is preferred to B and C is sufficiently close to B, then A is preferred to C; formally, A p B and C → B ⇒ A p C. "Continuous" means infinitely divisible – just as there are infinitely many numbers between 1 and 2, all bundles are infinitely divisible.
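The convexity property above can be made concrete numerically: along one indifference curve of a hypothetical utility function U(x, y) = x·y (an assumption for illustration, not from the source), the marginal rate of substitution diminishes as x grows:

```python
# Sketch: points on a single indifference curve of the hypothetical utility
# U(x, y) = x * y, and the marginal rate of substitution MRS = MUx/MUy = y/x
# along it. Convexity shows up as a diminishing MRS as x increases.

k = 100.0  # fixed utility level: the curve is y = k / x

for x in (2.0, 5.0, 10.0, 20.0):
    y = k / x
    mrs = y / x  # how much y the consumer gives up for one more unit of x
    print(x, y, mrs)
# MRS falls from 25.0 (at x=2) to 0.25 (at x=20):
# the curve bulges toward the origin.
```

The diminishing MRS is exactly the "successively larger amounts of the other good" behavior described above, seen from the opposite direction.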
Economics is the social science that studies the production, distribution, and consumption of goods and services. Economics focuses on the behaviour and interactions of economic agents. Microeconomics analyzes basic elements in the economy, including individual agents and markets, their interactions, and the outcomes of those interactions. Individual agents may include, for example, households, firms, buyers, and sellers. Macroeconomics analyzes the entire economy and the issues affecting it, including unemployment of resources, economic growth, and the public policies that address these issues. See also the glossary of economics. Other broad distinctions within economics include those between positive economics, describing "what is", and normative economics, advocating "what ought to be". Economic analysis can be applied throughout society: in business, health care, and government. Economic analysis is sometimes applied to such diverse subjects as crime, the family, politics, social institutions, war, and the environment. The discipline was renamed in the late 19th century, primarily due to Alfred Marshall, from "political economy" to "economics" as a shorter term for "economic science".
At that time, it became more open to rigorous thinking and made increased use of mathematics, which helped support efforts to have it accepted as a science and as a separate discipline outside of political science and other social sciences. There are a variety of modern definitions of economics. Scottish philosopher Adam Smith defined what was then called political economy as "an inquiry into the nature and causes of the wealth of nations", in particular as "a branch of the science of a statesman or legislator" whose objectives are to provide "a plentiful revenue or subsistence for the people" and "to supply the state or commonwealth with a revenue for the publick services". Jean-Baptiste Say, distinguishing the subject from its public-policy uses, defined it as the science of production and consumption of wealth. On the satirical side, Thomas Carlyle coined "the dismal science" as an epithet for classical economics, in this context linked to the pessimistic analysis of Malthus. John Stuart Mill defined the subject in a social context as: The science which traces the laws of such of the phenomena of society as arise from the combined operations of mankind for the production of wealth, in so far as those phenomena are not modified by the pursuit of any other object.
Alfred Marshall provides a still widely cited definition in his textbook Principles of Economics that extends analysis beyond wealth and from the societal to the microeconomic level: Economics is a study of man in the ordinary business of life. It enquires how he gets his income and how he uses it. Thus, it is on the one side the study of wealth and on the other, and more important side, a part of the study of man. Lionel Robbins developed the implications of what has been termed "[p]erhaps the most accepted current definition of the subject": Economics is a science which studies human behaviour as a relationship between ends and scarce means which have alternative uses. Robbins describes the definition as not classificatory, in that it does not "pick out certain kinds of behaviour", but rather analytical, in that it "focus[es] attention on a particular aspect of behaviour, the form imposed by the influence of scarcity." He affirmed that previous economists had centred their studies on the analysis of wealth: how wealth is created and consumed. But he said that economics can be used to study other things, such as war, that are outside its usual focus.
This is because war has winning as its goal and generates both costs and benefits. If the war is not winnable, or if the expected costs outweigh the benefits, the deciding actors may never go to war but rather explore other alternatives. We cannot define economics as the science that studies wealth, war, crime, and every other field to which economic analysis can be applied. Some subsequent comments criticized the definition as overly broad in failing to limit its subject matter to analysis of markets. From the 1960s, such comments abated as the economic theory of maximizing behaviour and rational-choice modelling expanded the domain of the subject to areas previously treated in other fields. There are other criticisms as well, such as scarcity not accounting for the macroeconomics of high unemployment. Gary Becker, a contributor to the expansion of economics into new areas, describes the approach he favours as "combin[ing the] assumptions of maximizing behaviour, stable preferences, and market equilibrium, used relentlessly and unflinchingly."
One commentary characterizes the remark as making economics an approach rather than a subject matter, but with great specificity as to the "choice process and the type of social interaction that [such] analysis involves." The same source reviews a range of definitions included in principles-of-economics textbooks and concludes that the lack of agreement need not affect the subject matter that the texts treat.
In economics, a shortage or excess demand is a situation in which the demand for a product or service exceeds its supply in a market. It is the opposite of an excess supply. In a perfect market, an excess of demand will prompt sellers to increase prices until demand at that price matches the available supply, establishing market equilibrium. In economic terminology, a shortage occurs when for some reason the price does not rise to reach equilibrium. In this circumstance, buyers want to purchase more at the market price than the quantity of the good or service available, and some non-price mechanism determines which buyers are served. So in a perfect market the only thing that can cause a shortage is price. In common use, the term "shortage" may refer to a situation where most people are unable to find a desired good at an affordable price, or where supply problems have increased the price. "Market clearing" happens when all buyers and sellers willing to transact at the prevailing price are able to find partners.
There are always willing buyers at a lower-than-market-clearing price. Shortages may be caused by: price ceilings, a type of price control which involves a government-imposed limit on the price of a product or service; anti-price-gouging laws; a government ban on the sale of a product or service, such as prostitution or certain recreational drugs; decisions by suppliers not to raise prices, for example to maintain friendly relationships with potential future customers during a supply disruption; and artificial scarcity. Decisions which result in a below-market-clearing price help some people and hurt others. In this case, shortages may be accepted because they theoretically enable a certain portion of the population to purchase a product that they couldn't afford at the market-clearing price; the cost falls on those who are willing to pay for a product and either can't, or experience greater difficulty in doing so. In the case of government intervention in the market, there is always a trade-off with positive and negative effects.
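The below-market-clearing-price case can be sketched numerically. All schedules and parameter values here are invented for illustration:

```python
# Sketch: a price ceiling below the market-clearing price produces excess
# demand (a shortage). Hypothetical linear demand and supply:
#   Qd = 100 - P,  Qs = 20 + 3*P  ->  market clears at P = 20.

def quantities(p, a=100.0, b=1.0, c=20.0, d=3.0):
    qd = a - b * p  # quantity demanded at price p
    qs = c + d * p  # quantity supplied at price p
    return qd, qs

p_clearing = 20.0  # where qd == qs for these parameters
ceiling = 10.0     # government-imposed maximum price, below p_clearing

qd, qs = quantities(ceiling)
shortage = qd - qs
print(qd, qs, shortage)  # 90.0 50.0 40.0
```

At the ceiling, 90 units are demanded but only 50 supplied; the 40-unit gap is the shortage, and some non-price mechanism (queuing, rationing, favoritism) decides which buyers get the 50 available units.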
For example, a price ceiling may cause a shortage, but it will enable a certain percentage of the population to purchase a product that they couldn't afford at market prices. Economic shortages caused by higher transaction costs and opportunity costs mean that the distribution process is wasteful. Both of these factors contribute to a decrease in aggregate wealth. Shortages may cause: black markets, illegal markets in which products that are unavailable in conventional markets are sold, or in which products with excess demand are sold at higher prices than in the conventional market; artificial controls of demand, such as queuing and rationing; non-monetary bargaining methods, such as time, nepotism, or violence; price discrimination; and the inability to purchase a product, with subsequent forced saving. Rationing in the United Kingdom occurred during and after the world wars. From 1920 to 1933, during Prohibition in the United States, a black market for liquor was created due to the low legal supply of alcoholic beverages.
During the 1973 oil crisis, long lines and rationing were used to control demand. In the former Soviet Union during the 1980s, prices were kept artificially low by fiat. Soviet citizens waited in line for various price-controlled goods and services such as cars, apartments, or some types of clothing. From the point of view of those waiting in line, such goods were in perpetual "short supply"; this method for determining the allocation of goods in short supply is known as "rationing". From the mid-2000s through the 2010s, shortages occurred in Venezuela due to the Venezuelan government's economic policies. As a result of such shortages, Venezuelans had to search for products and wait in lines for hours, and rationing was initiated, with the government allowing the purchase of a certain amount of products through fingerprint recognition. Whether an economic shortage of a certain good or service is beneficial or detrimental to society depends on one's ethical and political views. For instance, consider the shortage of recreational drugs discussed above, and the controversies around the use of such drugs.
Or consider the economic shortage of cars in the Soviet Union during the 1980s, in which people had to wait in line to buy a new car. Garrett Hardin emphasised that a shortage of supply can just as well be viewed as a "longage" of demand. For instance, a shortage of food can just as well be called a longage of people. Framing it this way, he felt, makes reducing demand as natural a remedy as increasing supply. In its narrowest definition, a labour shortage is an economic condition in which employers believe there are insufficient qualified candidates to fill the marketplace demand for employment at an employer-determined wage; such a condition is sometimes referred to by economists as "an in
In economics, average cost or unit cost is equal to total cost divided by the number of units of a good produced: AC = TC/Q. It is equal to the sum of average variable costs and average fixed costs. Average costs may be dependent on the time period considered. Average costs are a fundamental component of supply and demand. Short-run costs are those that vary with almost no time lag; labor cost and the cost of raw materials are short-run costs. An average cost curve can be plotted with cost on the vertical axis and quantity on the horizontal axis. Marginal costs are also shown on these graphs, with marginal cost representing the cost of the last unit produced at each point. A typical average cost curve has a U-shape, because fixed costs are all incurred before any production takes place and marginal costs are increasing because of diminishing marginal productivity. In this "typical" case, for low levels of production marginal costs are below average costs, so average costs are decreasing as quantity increases.
An increasing marginal cost curve intersects a U-shaped average cost curve at the latter's minimum, after which the average cost curve begins to slope upward. For further increases in production beyond this minimum, marginal cost is above average cost, so average costs are increasing as quantity increases. For example, for a factory designed to produce a specific quantity of widgets per period: below a certain production level, average cost is higher due to under-used equipment; above that level, production bottlenecks increase average cost. Long-run average cost is the unit cost of producing a certain output when all inputs, including physical capital, are variable. The behavioral assumption is that the firm will choose the combination of inputs that produces the desired quantity at the lowest possible cost. A long-run average cost curve is typically downward sloping at low levels of output and upward or downward sloping at high levels of output. Most often, the long-run average cost curve is U-shaped, by definition reflecting economies of scale where it is negatively sloped and diseconomies of scale where it is positively sloped.
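The relationship between marginal and average cost can be checked numerically. The total cost function below is a made-up example chosen for its U-shaped average cost curve:

```python
# Sketch with a hypothetical total cost function TC(q) = 100 + 2q + 0.5*q^2
# (fixed cost 100, rising marginal cost). It illustrates two claims from
# the text: AC = TC/Q, and the marginal cost curve crosses the average
# cost curve at the latter's minimum.

def tc(q):
    return 100 + 2 * q + 0.5 * q * q

def ac(q):
    return tc(q) / q  # average cost: total cost divided by quantity

def mc(q):
    return 2 + q      # marginal cost: derivative of tc with respect to q

# Locate the (approximate) minimum of AC on a grid of quantities.
grid = [i * 0.01 for i in range(1, 5000)]
q_min = min(grid, key=ac)
print(round(q_min, 2), round(ac(q_min), 2), round(mc(q_min), 2))
# AC reaches its minimum near q = 14.14 (exactly sqrt(200)),
# where AC and MC are both about 16.14.
```

For q below the minimum, mc(q) < ac(q) and AC is falling; above it, mc(q) > ac(q) and AC is rising, which is exactly the U-shape the text describes.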
If the firm is a perfect competitor in all input markets, so that the per-unit prices of all its inputs are unaffected by how much of the inputs the firm purchases, then it can be shown that at a particular level of output, the firm has economies of scale if and only if it has increasing returns to scale, the latter being a feature of the production function. Likewise, it has diseconomies of scale if and only if it has decreasing returns to scale, and has neither economies nor diseconomies of scale if it has constant returns to scale. With perfect competition in the output market, the long-run market equilibrium will involve all firms operating at the minimum point of their long-run average cost curves. If, however, the firm is not a perfect competitor in the input markets, the above conclusions are modified. For example, if there are increasing returns to scale in some range of output levels, but the firm is so big in one or more input markets that increasing its purchases of an input drives up the input's per-unit cost, then the firm could have diseconomies of scale in that range of output levels.
Conversely, if the firm is able to get bulk discounts on an input, it could have economies of scale in some range of output levels even if it has decreasing returns in production in that output range. In some industries, long-run average cost is always declining; this means that the largest firm tends to have a cost advantage, and the industry tends to become a monopoly, hence called a natural monopoly. Natural monopolies tend to exist in industries with high capital costs in relation to variable costs, such as water supply and electricity supply. When average cost is declining as output increases, marginal cost is less than average cost; when average cost is rising, marginal cost is greater than average cost; when average cost is neither rising nor falling, marginal cost equals average cost. Other special cases for average cost and marginal cost appear frequently. Constant marginal cost with high fixed costs: each additional unit of production is produced at constant additional expense per unit, and the average cost curve slopes down continuously.
An example is hydroelectric generation, which has no fuel expense, limited maintenance expenses, and a high up-front fixed cost. Industries with fixed marginal costs, such as electrical transmission networks, may meet the conditions for a natural monopoly, because once capacity is built, the marginal cost to the incumbent of serving an additional customer is always lower than the average cost for a potential competitor; the high fixed capital costs are a barrier to entry. Two popular pricing mechanisms are average cost pricing and marginal cost pricing. Under average cost pricing, a monopoly produces where its average cost curve meets the market demand curve, referred