New York Stock Exchange, Inc.
Securities and Exchange Commission Concept Release on
"Regulation of Market Information Fees and Revenues"
(Release No. 34-42208; File No. S7-28-99)
The Economic Perspective on Regulation of Market Data Prices
Prepared: Dr. Oliver Grawe, Principal
Reviewed: Alan Kolnik, Senior Vice President
PHB Hagler Bailly, Inc.
1776 Eye St, N.W.
Washington, DC 20006-3700
TABLE OF CONTENTS
THE ECONOMIC PERSPECTIVE ON REGULATION OF MARKET DATA PRICES
IS THERE A NEED TO REGULATE?
The Purpose of Regulation
The Current State of Fees for Market Data
WHAT IS THE NATURE OF REGULATION DESIGNED TO SERVE INVESTORS' WELL-BEING?
The Basis for Regulation
Regulation of Market Data Prices - The Economist's Perspective
Regulation That Mimics Competition - The Basis of Correct Pricing
WHAT WILL BE THE EFFECT OF ADDITIONAL REGULATION ON COMPETITION IN THE EXCHANGE ENVIRONMENT?
The Importance of Listing Location With Respect to Transactions
Market Data and Competition
Trading Off the Primary Exchanges
Impact of Across-the-Board Limits on Market Data Fees
APPENDIX 1: FULLY DISTRIBUTED COST
We discuss the SEC Concept Release No. 34-42208; File No. S7-28-99 in terms of economic theory and practice, following three lines of reasoning:
1. Is There a Need to Regulate?
Is there any demonstrated need to regulate market data fees, based upon the standard natural monopoly/essential facility rationale for intervening in the market process? We do not believe that this has been demonstrated or that it would be demonstrable.
2. What is The Nature of Regulation Designed to Serve Investors' Well-Being?
If price regulation, in any event, is deemed in the public interest, what broadly speaking would be the nature of regulation designed to maximize the well-being of investors, while assuring the viability of the exchanges?1 Economic theory makes one point quite clear. Price regulation designed to promote consumer welfare subject to the proviso that the supplier(s) are financially viable and that their viability depends only upon earnings generated from market-based sales of services will not result in prices being equated to the marginal (alternatively, incremental or direct) cost of making the service available.2 When regulation is required to control private market power arising from economies of scale or scope, compelling a regulated firm to price services at individual product marginal cost either results in insolvency or requires a tax-supported subsidy.
3. What will be the Effect of Additional Regulation on Competition in the Exchange Environment?
We believe regulation could adversely affect the competition for transactions between markets, competition that already strongly influences the prices of the set of four3 related primary services exchanges offer:
The Purpose of Regulation
Competition policy broadly defined includes antitrust and regulatory policy. Antitrust and regulation have been viewed as particularly American paths to the public control of real or perceived private market power.4 The need to replace or find alternatives to market institutions - whether public ownership, public subsidy and taxes, cooperatives, or regulation - has historically flowed from the perception that the market has failed. Traditional economic analysis has concluded that a competitive market fails either due to market power, arising from indivisible factors of production leading to substantial and significant5 economies of scale and scope, or from severe information asymmetries that cause markets to unravel or never form in the first place.6
Traditionally, the need to regulate a firm's prices - the widely known "public utility" examples of surface transportation, electric power, telecommunications, and defense industry firms - as the SEC proposes with respect to market data, has been based on the premise that such firms possess significant amounts of market power. Antitrust and regulatory policy, as they evolved during the 1970s and 1980s, have come to reflect a pro-consumer orientation.7 Business activities that make life difficult for rivals typically only violate the antitrust laws if they result in higher prices to final consumers, a reduction in product quality, or reduced innovation designed to lead to lower prices or better quality.8 However, there is no reason to regulate unless competitive markets do not exist or private contracting around the sources of market power is not feasible. Otherwise, regulation has nothing positive to contribute. As Judge Stephen Breyer has put it:
[R]egulation and antitrust are the opposite sides of the same coin. The classical way to put this point is that society does not seek either competition or regulation for their own sakes. Rather, both are but means for achieving a further objective. That objective basically is to improve people's lives in three specific ways. The first is called "economic efficiency," putting together the world's resources so as to maximize their potential for satisfying human needs. That has something to do with low prices. The second way is through achieving "X efficiency," which, roughly speaking, refers to technically more efficient production methods. The third way is through increased innovation.
Those are the three classical ends of the antitrust laws; they are also the classical ends of regulation. ... We sometimes tend to think that regulation is really inefficient, though it aims directly at our goals; but antitrust, though its aim is indirect, in fact more often helps move us in the right direction.9
Economists sum up at least the antitrust focus as being aimed at consumer welfare.10 Judge Breyer has claimed that the aim of regulation - its goal - is the same.
The SEC's emphasis is apparently the same. The SEC's Concept Release on Regulation of Market Information Fees and Revenues frequently refers to its goal in terms of the impact on "final consumers", the retail investors.11 For example, after noting that "most of the fees applicable to retail investors have been reduced in recent months by 50% to 80%," the SEC nonetheless stated that it "remains concerned that retail investor fees have not properly kept pace with changing technology and increased demand."12 According to the Concept Release "the most important function that the Commission can perform for retail investors is to ensure that they have access to the information they need to protect and further their own interests." Moreover, "broad access to real-time Market Data should be an affordable option for most retail investors, as it long has been for professional investors."13 Finally, the SEC's focus on retail investors, as opposed to professional investors whom the SEC apparently concludes have adequate access and currently pay "affordable" rates, includes the premise that market data fees for retail investors "must not be unreasonably discriminatory when compared with the fees charged to professional users."14
The Current State of Fees for Market Data
The reduction in fees for market data that went into effect in October 1999 is worthy of deeper consideration.
Market data provided by the Consolidated Tape Association ("CTA") is priced to vendors depending upon two factors:
(1) whether the third-party user is a professional or a non-professional, and
(2) whether the data delivered to a third party is real-time or delayed.
CTA does not charge vendors for the redistribution of delayed data. Hence, the SEC's Concept Release applies only to real-time data for which markets currently charge fees.
CTA provides data to, among others, both traditional full-service brokers and to on-line brokerage firms (or on-line departments of traditional firms). The traditional and on-line firms compete with one another for retail investor patronage. Traditional brokers have made the data they receive available to their customers in various ways, typically over the telephone or when investors visit the brokers' place of business. CTA charges these firms a monthly fee per data terminal.15 This fee permits the broker to respond to an unlimited number of inquiries for quotes.16 Given the way the traditional brokerage business operates, this contracting approach minimizes monitoring and other transaction costs.
However, under the current method by which market data fees have been determined, the price charged retail investors for real-time market data by profit-maximizing vendors is as low as zero. The real-time data available from many sources, including on-line brokers or even information services such as SmartMoney.com, is free.17 This is not entirely surprising, at least for NYSE market data, as the non-professional fee for real-time data set by NYSE is $0.50 - $1.00 per month. Examples of vendors that advertise free real-time data include RTC Thomson ("Real Time Quotes"),18 Medved Quote Tracker,19 freerealtime.com,20 and SmartMoney.21 This limited survey simply illustrates that investors are willing, under the prevailing method for setting market data fees, to back entrants into the financial marketplace who offer retail investors free stock and option quotes in real time. Whether any or all of these firms ultimately succeed - have their business models vindicated by meeting a market test - is not the point. Rather, current market data fees are not so high as to preclude firms from purchasing the data and then transmitting it at no charge.22
We do not believe that direct economic price regulation is likely to improve upon free real time data insofar as retail investors are concerned. Moreover, the public information available from even a casual Internet search indicates that free or low-cost real-time market data is very widely available for exchanges located in North America.
Other vendors sell market data as part of a package that includes software for manipulating data and access to brokerage reports and other qualitative information. Advertised market data fees appear in some cases to include a vendor mark-up above the charge made by NYSE, for example, and are, in every case, a small part of the vendors' monthly fees. Examples of firms that package real-time or delayed data with various software packages, news sources, and market analysis are set out in TABLE 1 below:
TABLE 1. Vendors Packaging Real Time or Delayed Data

    Package tier               Market Data Fee
    1 exchange included        $75.00 + $2.75/hr
    2 exchanges included       $125.00 + $1.50/hr
    3 exchanges included       $175.00 + $0.75/hr
    Exchange fees included     $200.00 + NC/hr
    NASDAQ II                  $100/month
For each of these vendors, monthly market data fees for the three major markets account for less than 25 percent of the price investors pay for the vendor's service package. This sampling of vendors indicates that only a small proportion of the fees that investors pay for these services returns to the exchanges in the form of payment for market data.
The cost of the market data provided by Network A is a fraction of vendor fees charged to consumers for data and the packages of software designed to manipulate it. Indeed, investors have opportunities to obtain real time data for free. We very much doubt that regulation, even perfectly efficient regulation carried out at low cost, if that were possible, can improve upon that. Moreover, we do not believe that the Concept Release contains any analysis sufficient to show why market data fees require regulation. The individual traditional exchanges are not natural monopolies - if they were, there would not be nine of them. The exchanges compete for transactions and, in doing so, shift the market data produced by transacting from one exchange to another as well. Hence, a compelling case for regulation, one demonstrating that the gains from applying price regulation specifically to market data would exceed its costs, has not been made.
The Basis for Regulation
As previously mentioned, economic theory and practice makes one point very clear. Regulation designed to serve investors' well-being would not limit rates for any regulated service to the marginal (incremental, direct) cost of making the service available. A regulator interested in maximizing consumer welfare but obliged to mediate between contending consumer groups arguing over how to allocate shared costs would allocate the bulk of the shared cost to the service(s) with the least elastic demand. And the service with the least elastic demand may well be market data.
The basis for enduring monopoly power flows from substantial economies of scale and scope29 or strong network externalities.30
Judge Breyer has described the best known rationale for regulation this way:
The most traditional and persistent rationale for governmental regulation of a firm's prices and profits is the existence of a "natural monopoly." Some industries, it is claimed, cannot efficiently support more than one firm. ... Rather than have three connecting phone companies laying separate cables where one would do, it may be more efficient to grant one firm a monopoly subject to governmental regulation of its prices and profits.31
Judge Breyer's point of reference is widely accepted. In 1950, Clemens observed: "Necessity and monopoly are almost prerequisites of public utility status."32
Judge Breyer notes that there have been other arguments supporting regulation, including an issue related to "fairness." As Judge Breyer notes, the "natural monopoly" reason to regulate can be undercut substantially when the regulated firm can price discriminate-the output restriction that is the hallmark of market or monopoly power disappears.
The "fairness" issue that arises does so because the gains from eliminating the output restriction-effectively the "bribe" paid to the monopolist to sell more-may be deemed to have been collected "unfairly" from the monopolist's customers. In that sense, the discrimination is deemed to be "undue" or "excessive" or "unreasonable" or "unfair." One point to bear in mind, however, is that regulatory policy has never eliminated economic discrimination-indeed some regulatory policies actually promote economic discrimination (e.g., the subsidy to local telephone customers generated from high long-distance charges). The issue is not discrimination per se but "excessive" discrimination, a term notoriously difficult to define.
This difficulty is especially acute when customers of the regulated entity themselves compete and use very different business models with different monitoring costs for each different use of the regulated service. For example, full-service broker-dealers provide information either face-to-face, over the telephone or via a person-to-person Internet link. For NYSE or any other exchange to successfully monitor each use of NYSE-supplied data would be costly. As a result, Network A provides market data to broker/dealers to allow their registered representatives to respond to their customers' inquiries for stock data. For this, broker/dealers pay a monthly per-device fee. Referring back to footnote 16, this contract option sets the fee as: CA = FA. Network A also provides data to broker/dealers and vendors for their provision of online services to non-professional subscribers. For this, Network A charges the broker/dealer or vendor $1.00 per month (or, if the broker/dealer or vendor has more than 250,000 subscribers, $0.50 per month). That fee entitles the broker/dealer or vendor to provide unlimited data to the subscriber for that month. In that case, CA = $1.00 (or $0.50). For both the broker-dealer and the retail investor, the marginal cost of another quote is zero, as both contract options provide for unlimited use once the fixed fee is paid.
Network A also offers an alternative to the per-device contract option to vendors. This option ties cost directly to use. Network A permits users to pay $0.0075 per quote for the first 20 million quotes obtained during a calendar month, with the fee schedule declining to $0.0025 a quote for all quotes above 40 million a month. In terms of the earlier model, CA = 0 + cA(Q)*Q. There is no fixed fee. The charge paid depends only upon use, with the per-quote cost, cA(Q), depending upon the number of quotes "purchased" during a given month.
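The declining per-quote schedule can be sketched numerically. The first-tier rate ($0.0075 up to 20 million quotes) and the top-tier rate ($0.0025 above 40 million) come from the text; the intermediate rate of $0.0050 between 20 and 40 million is an assumption made purely for illustration:

```python
# Illustrative sketch of Network A's declining per-quote fee schedule.
# The $0.0050 middle-tier rate is an assumption; the text gives only the
# first-tier and above-40-million rates.

def monthly_quote_charge(quotes: int) -> float:
    """Total monthly charge CA = cA(Q)*Q under the tiered schedule."""
    tiers = [
        (20_000_000, 0.0075),    # first 20 million quotes
        (40_000_000, 0.0050),    # 20-40 million (assumed rate)
        (float("inf"), 0.0025),  # all quotes above 40 million
    ]
    charge, prev_cap = 0.0, 0
    for cap, rate in tiers:
        if quotes <= prev_cap:
            break
        charge += (min(quotes, cap) - prev_cap) * rate
        prev_cap = cap
    return charge
```

Under this sketch a vendor taking 10 million quotes in a month would pay $75,000, while one taking 50 million would pay roughly $275,000 - the average per-quote cost falls as use rises.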
Bonbright, et al. also adopt Breyer's "natural monopoly" justification for economic price regulation:
The natural-monopoly theory of public utility regulation reflects an old and orthodox point of view. ... Properly qualified, we believe it to be sound.33
A natural monopoly is of particular interest to this study inasmuch as it is sine qua non of regulation according to the public interest theory of regulation.34
As the authors make clear, even with substantial economies of scale - a necessary condition for a "natural monopoly" - price regulation is not inevitably required35:
...regulation is a questionable substitute for competition under conditions of natural monopoly and is a very poor substitute indeed when an industry is naturally competitive. Regulation carries with it the potential for anti-competitive effects even when there is a true natural monopoly, and this is why economists have such a strong bias favoring competition.
The late Claire Wilcox put the point this way:
Regulation, at best, is a pallid substitute for competition. It cannot prescribe quality, force efficiency, or require innovation, because such action would invade the sphere of management. [Emphasis added] But when it leaves these matters to the discretion of industry, it denies consumers the protection that competition would afford. Regulation cannot set prices below an industry's costs however excessive they may be. Competition does so, and the high-cost company is compelled to discover means whereby its costs can be reduced. Regulation does not enlarge consumption by setting prices at the lowest level consistent with a fair return. Competition has this effect. Regulation fails to encourage performance in the public interest by offering rewards and penalties. Competition offers both.36
The natural monopoly, or "essential facilities",37 doctrine as a necessary pre-requisite for economic price regulation is well known and well accepted. The "essential facilities" doctrine was imported into regulation through antitrust decisions. Indeed, some of the antitrust decisions were the outgrowth of regulatory reluctance to permit entry into either regulated markets or markets adjacent to regulated markets.
Roughly, an essential facility is one access to which is necessary for a firm to be able to compete. For example, eighty years ago access to the NYSE may have been "essential" for a broker to do business as an intermediary between buyers and sellers of stock. Both the transactions and the information necessary for future transactions - the market data produced as a byproduct of current trades - were produced on NYSE's trading floor. As other exchanges developed and as the SEC promulgated rules requiring NYSE to make its market data available off the floor (provide access), the brokerage function between buyers and sellers no longer needed to take place at 11 Wall Street for NYSE-listed shares.
While the fraction of NYSE-listed shares and trades taking place on the NYSE has fallen, with respect to small orders, the fraction of block trades and their associated bids and offers that significantly contribute to price discovery itself has apparently remained high. Hence, some traders, especially traders handling smaller orders or liquidity traders, can avoid NYSE's floor entirely so long as they have access to the price discovery activity that takes place there. These traders in effect free-ride on the price discovery resulting from the NYSE-auction mechanism. In addition, competition to provide the place where price discovery itself occurs is not absent. For example, in the options arena, different exchanges provide primary price discovery for different securities - there is no evident need for price discovery for all securities to occur in one place even if there are strong economic reasons for centralized price discovery for individual securities.
However, it is not always easy to identify whether an enterprise is or is not "essential". As Bonbright, et al. note:
However, determining whether natural monopoly or competition characterizes a specific industry is a complicated task that depends on the supply (cost and technology) structure, the demand pattern, and the behavioral intentions (e.g., pricing strategies) between incumbent firms and potential entrants.38
Moreover, it is important to understand that an "essential facility" is more than an enterprise that may have some limited market power. Many firms may have some market power, defined as having some control over price resulting from a downward-sloped (residual) demand curve for the firm's products or services and leading to price set above marginal cost. These situations have generally not resulted in economic price regulation. What is more, no exchange constitutes an "essential facility" because competition, old and new, between exchanges for transactions (order flow) and for listings is substantial.39
We believe it is also very important to understand that even when economies of scale and scope give a single enterprise a substantial cost-advantage, government price-regulation may not be required. NYSE, for example, has a Board of Directors that includes representatives of many of the firms that use NYSE's services. NYSE also has a series of advisory committees that expands participation by institutions and individuals using NYSE services even more. NYSE does not represent the typical text book model of purely independent sellers dealing at arms-length with purely independent buyers. In a real sense, NYSE is a cooperative made up of the customers who share the facilities and services NYSE makes available. NYSE members, in effect, determine the way rates are tailored for the services they use. When NYSE's Board agrees to change the rate structure for the listing, transaction, and market data services NYSE provides, they are changing the structure of the fees they pay.
Regulation of Market Data Prices - The Economist's Perspective
Inter-market competition, especially competition for transactions (order flow), appears to be vigorous (see Section 3 of this paper), or at least sufficient to cast doubt on the proposition that primary markets are "essential facilities" or "natural monopolies" for all the services they provide.
Exchanges provide four related services paid for directly by investors and listed companies:
At the risk of oversimplifying, transactions produce market data as an unavoidable co-product. Originally, due to a technology for collecting, processing and disseminating data that made the process slow and essentially "local", exchanges provided a place for buyers and sellers to meet and transact through a double auction. When the transaction and the market data it produced were closely linked at one site - to obtain relevant price discovery information a trader needed physical access to the exchange - the cost of setting up and running the exchange could be defrayed by a fee on each transaction (or share of stock traded).41 The fact that there were no fees levied on "market data" did not matter much so long as the price discovery information produced by an exchange remained under the control, or for the use of, members of the exchange.
As technology changed, however, and as the SEC mandated the widespread distribution of market data to the investing public, traders outside the exchange gained access to the information byproduct of exchange transactions and bid/offer postings. To the extent that these traders did not use the exchange as a place to consummate trades they escaped paying exchange fees. Hence, for the simplified model of a traditional primary market the exchange's net income42 can be written as:
(1) π = pT*T + pD*D - C(T,D)
where π represents net income ("profits"), pT and pD represent the fees charged for transactions (T) and market data (D), respectively, and C(T,D) represents the total cost associated with an exchange having T transactions and D market data.43 The business exchanges are in, as we have already noted, inextricably links transactions and market data. We use the following commonly used form to recognize this interdependence:
(2) C(T,D) = c(T,D) + F
where F represents fixed (and possibly sunk) set-up costs required for the production of any service and c(T,D) represents the avoidable costs arising from producing T and/or D once capacity has been set-up.
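The model in equations (1) and (2) can be sketched numerically. The separable form chosen for c(T,D) and every coefficient below are illustrative assumptions, not exchange data:

```python
# Illustrative sketch of equations (1)-(2); all coefficients are assumed.

F = 100.0  # fixed (and possibly sunk) set-up cost

def cost(T: float, D: float) -> float:
    """Equation (2): C(T,D) = c(T,D) + F, here with c(T,D) = 0.5*T + 0.25*D."""
    return 0.5 * T + 0.25 * D + F

def net_income(pT: float, T: float, pD: float, D: float) -> float:
    """Equation (1): net income = pT*T + pD*D - C(T,D)."""
    return pT * T + pD * D - cost(T, D)
```

For example, with 200 transactions priced at $1.00 and 500 units of market data priced at $0.50, revenue of $450 less total cost of $325 leaves net income of $125; setting either fee to zero shifts the burden of covering F onto the other service.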
Regulation That Mimics Competition - The Basis of Correct Pricing
It is both correct and widely understood that undistorted effective competition is the best mechanism to assure that the public interest is served by a market. It is also correct and widely understood that economic price regulation, no matter how well intentioned and adroitly implemented, does not produce the same benefits for consumers as competition would provide were it feasible. Absent significant and durable market power, economic price regulation should be avoided.
When market power is significant and durable, economic price regulation seeks to ensure efficient supply of all the services, and at their associated quality levels, for which consumer benefits exceed the full cost of provision. The rates set should be no higher than necessary to cover the efficient costs of supply, but must be high enough to allow competitive rates of return on efficiently employed capital in order to assure that funds are made available for needed replacement, upgrading and expansion of facilities. It is important that the rate set and the rate-setting process itself do not take away incentives for producers to become more efficient by adopting better technology and by bargaining hard over the prices paid for the labor, capital, technology and other services and materials they use.44
But what regulatory rule should be used to guide rate-setting when (a) the regulatory concern is with consumer welfare - in this case the welfare of retail investors, (b) the regulated supplier must be viable, (c) the regulated firm's technology displays economies of scale and scope which give rise to shared costs, and (d) the government does not provide subsidies from general tax revenue?
The correct rule is that the price for each service, and for each bundle of services (including the bundle that consists of all services), must cover the incremental cost of making that service or bundle available. This rule provides a price floor that, when met, guarantees that no customer is being subsidized by any other customer. At the other end, the most that should be charged for any service or bundle of services is the stand-alone cost45 of making that service available. The logic of this cap is as follows. In a competitive market the most any individual or group can be charged is the cost of supplying that individual or group. If a supplier tries to charge more, the customer or group will simply take their business elsewhere. Competition forces the prices of any service, or bundle of services, to fall if and only if someone else can produce it more cheaply. And that happens if and only if the price of the good or bundle of goods exceeds its stand-alone cost.
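The floor-and-ceiling test can be sketched for a two-service exchange. All figures below (the shared cost, the service-specific costs, and the revenues) are hypothetical:

```python
# Hypothetical check that service revenues lie in the subsidy-free band:
# each bundle's revenue must cover its incremental cost (floor) and must
# not exceed its stand-alone cost (ceiling).
from itertools import combinations

F = 100.0                      # shared cost (assumed)
var = {"T": 50.0, "D": 10.0}   # service-specific variable costs (assumed)

def incremental_cost(bundle, all_services):
    """Cost added by the bundle, given the remaining services are supplied.
    For the all-inclusive bundle this is the firm's total cost."""
    ic = sum(var[s] for s in bundle)
    if set(bundle) == set(all_services):
        ic += F
    return ic

def stand_alone_cost(bundle):
    """Cost of supplying the bundle by itself, shared cost included."""
    return F + sum(var[s] for s in bundle)

def subsidy_free(rev):
    services = list(rev)
    for r in range(1, len(services) + 1):
        for bundle in combinations(services, r):
            total = sum(rev[s] for s in bundle)
            if not (incremental_cost(bundle, services)
                    <= total <= stand_alone_cost(bundle)):
                return False
    return True
```

With these assumed costs, revenues of $115 from transactions and $45 from market data pass the test (each service covers its own incremental cost, neither exceeds its stand-alone cost, and together they exactly cover total cost); loading the entire $160 onto transactions alone fails it.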
It must be clearly appreciated that when economies of scale and scope arise, the regulator cannot limit the fees a monopolist may charge to the direct or incremental cost of each service supplied. To do so inevitably condemns the supplier to insolvency.46 A simple example of an exchange shows why. The exchange provides two services, transaction services, T, and market data, D. The technology exchanges use has a shared cost element, F, as well as service-specific costs, cT(T) and cD(D), that vary with the quantity of the service provided. Total cost, then, is F + cT(T) + cD(D). Unless the exchange has a reasonable expectation that it can recover this total cost, it will exit the market. The "direct cost" or "incremental cost" criterion for pricing services frequently takes a form that limits the firm to setting rates that recover only the variable "incremental" element. Suppose we follow this advice and mandate that the fee for market data, pD, must be set so that total market data revenue, pD*D, equals only incremental market data variable costs, cD(D). Then for the exchange to remain in business, the total fees collected from transactions, pT*T, must cover not only the variable cost of transaction services, cT(T), but also all of the shared cost, F. But the principle that fees should only recover a service's incremental cost precludes that as well. Hence, the "direct" or "incremental" cost test applied one product at a time ensures that a natural monopoly (or any firm with a technology giving rise to shared costs) is guaranteed losses equal to the shared costs. There is no "expectation" - reasonable or otherwise - of earning a normal return.47
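The arithmetic of this two-service example can be restated in a few lines; all figures are hypothetical:

```python
# Hypothetical restatement of the two-service example: if each service's
# revenue is capped at its own incremental cost, the shared cost F is
# never recovered and the loss equals F exactly.

F = 100.0            # shared cost (assumed)
cT, cD = 50.0, 10.0  # incremental costs of transactions and market data

revenue_T = cT       # mandated: fees recover only incremental cost
revenue_D = cD
total_cost = F + cT + cD

profit = (revenue_T + revenue_D) - total_cost
print(profit)  # -100.0: a loss exactly equal to the shared cost F
```

No choice of quantities changes this outcome; so long as each service's revenue is held to its own incremental cost, the shortfall is the shared cost itself.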
The regulated firm is viable only when the sum of the rates charged for all the firm's services covers the incremental cost of providing all of them together, but that sum is the firm's total cost including the shared cost, F. Hence, the fees for each service must differ from incremental or direct cost.48 In a very real sense, regulation of a multiple service firm runs the risk of making the regulator the vehicle for negotiating how shared costs will be allocated between different groups of buyers, who may or may not be competitors of one another in their own output markets.
This effect is already observed in submissions by industry participants to the Commission, in which they attempt to sway the Commission towards regulatory actions that would favor their competitive model (inevitably, at the expense of their competitors, who would be correspondingly disadvantaged). Thus, when the Commission is called upon to rule on, say, new pricing mechanisms for market data, it will have to consider the impact of its actions not only on the purported beneficiaries (investors) but the new competitive situation that will arise in the industry. And upon ruling one way or another, it can expect to be subjected to new demands by disaffected competitors attempting to redress the damage they perceive has been done to their cause.
For situations where the regulatory goal is to maximize consumer welfare50 while, at the same time, ensuring that the supplier remains viable, in an economy where the government does not directly subsidize suppliers from general tax revenue,51 William Baumol has noted: "[I]t is now generally recognized that the correct (second-best) solution requires Ramsey pricing."52 Prices that maximize consumer welfare while meeting the supplier's viability constraint - Ramsey prices - depend upon both a firm's marginal (incremental) costs as well as the elasticities of demand for the services the firm sells. Unlike fully-distributed cost-allocation accounting schemes, discussed in Appendix 1, Ramsey prices cannot be deduced from cost data alone. Conversely, prices that are based only upon cost data will meet the regulatory goal of maximizing consumer welfare (and avoiding cross subsidy) only by chance.
The formulae for fixing Ramsey prices can be complex. But the simplest case, where demands are independent of one another, illustrates an important point. In that case, the service prices meet the so-called "inverse elasticity rule": price departs the most from marginal (incremental, direct) cost where demand is least elastic.53
This rule has two interpretations. First, it can be recast in terms of the regulatory concept of "value of service" pricing - price departed more from incremental cost where "value" was deemed high. When demand is inelastic - the service where the Ramsey price-cost margin is large - a given percentage increase in price, say 5 percent, results in a smaller percentage reduction in quantity demanded, say 2 percent. This means that at the original price, consumers tended to value what they bought more highly than the price they paid. Conversely, when demand is elastic a small percentage change in price results in a larger percentage reduction in purchases. The marginal consumers did not value what they bought much more than the price they paid - these consumers had good alternatives on which to spend their money and they took that option when confronted with a higher price. Ramsey pricing, then, tends to allocate most of the shared cost of making the service available to those customers who value the service most highly.54
Second, because Ramsey pricing allocates overhead to services where demand is inelastic - where an increase in price reduces output the least - it produces revenue sufficient to cover shared costs with the least average output restriction possible.
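The inverse elasticity rule can be made concrete with a small numerical sketch. The code below solves for the Ramsey markup factor with two independent, constant-elasticity demands; all demand parameters, marginal costs, and the shared fixed cost are invented for illustration and are not estimates for any exchange.

```python
# A minimal numerical sketch of Ramsey pricing with independent,
# constant-elasticity demands. All figures are hypothetical.

def demand(a, eps, p):
    """Constant-elasticity demand: q = a * p**(-eps)."""
    return a * p ** (-eps)

def ramsey_prices(services, fixed_cost, tol=1e-9):
    """Find the common markup factor lam such that
    (p_i - c_i)/p_i = lam/eps_i and total margin covers fixed_cost.
    Each service: dict with a (scale), eps (elasticity), c (marginal cost)."""
    def margin(lam):
        total = 0.0
        for s in services:
            # invert the inverse-elasticity rule for price
            p = s["c"] / (1.0 - lam / s["eps"])
            total += (p - s["c"]) * demand(s["a"], s["eps"], p)
        return total
    # margin(lam) rises from 0 (lam=0) to total monopoly profit (lam=1),
    # so bisection finds the breakeven markup factor
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if margin(mid) < fixed_cost:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return [s["c"] / (1.0 - lam / s["eps"]) for s in services]

# "market data" with inelastic demand (eps=1.5); "transactions" elastic (eps=4)
services = [
    {"a": 100.0, "eps": 1.5, "c": 1.0},   # market data
    {"a": 100.0, "eps": 4.0, "c": 1.0},   # transactions
]
p_data, p_trans = ramsey_prices(services, fixed_cost=30.0)
# The inelastic service bears the larger markup: p_data > p_trans > cost.
```

With equal marginal costs, the less elastic service (here labeled "market data") carries the larger share of the fixed cost, exactly as the inverse elasticity rule predicts.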
Implementing Ramsey pricing directly may involve a substantial data collection exercise.55 We are not suggesting that the SEC should or should not pursue that approach if regulation in the end is deemed essential.56 The more significant point: consumer-welfare maximizing prices set by a benign regulator would invariably depart from marginal (incremental or direct) costs and possibly by a significant amount. For NYSE, if transaction demand is much more elastic than is the demand for NYSE's market data, then a benign regulator would impose most of the shared costs associated with running the NYSE equities' auctions on market data fees. This, in turn, might well result in an increase in market data fees.
Investors do not need, under current SEC regulations, to access any specific exchange - or any exchange - to consummate trades once they have market data. Competition for transactions - for order flow - is substantial and places a cap on the revenue stream any exchange can earn from transaction fees. Maladroit price regulation on market data fees - limiting market data fee income beyond market-based limits - could well have unfortunate consequences, and potentially lead to unwanted changes in the composition of the securities industry.
The Importance of Listing Location With Respect to Transactions
SEC rules permit NYSE-listed stock to be transacted both over-the-counter and on other markets. If NYSE were truly "essential" for stock transaction purposes, then the SEC's policy permitting other exchanges to trade that stock would be meaningless. Nor is NYSE "essential" for companies seeking to list their equities to facilitate trading (liquidity) in their equity shares. As a pure numbers game, more companies list on NASDAQ than on NYSE.57 Moreover, a recent study by Prof. Hendrik Bessembinder of Emory University indicated that the 190 firms migrating from NASDAQ to NYSE in 1996 and 1997 reduced total transactions costs associated with equity trading by an average of $900,000 to $3,200,000, depending upon the measure of cost used.58
Market Data and Competition
The SEC requires that traditional exchanges make the "market data" each exchange produces for the "same" stocks available to the investing public. Following the 1975 changes to the Securities Exchange Act of 1934 the SEC required each exchange to have a plan59 in place that would permit consolidation of its market data with the market data from the other exchanges trading the same securities.60
Given the technology available at the time, the markets that traded listed equities ultimately chose, with SEC approval, to provide their market data to a single "consolidator" - SIAC - leading to the provision of data through two "pipes": Network A (NYSE-listed firm data) and Network B (AMEX-listed firm data). The market data provided to SIAC for Network A companies, for example, comes from NYSE, from a number of regional exchanges and from the OTC market. Similarly, market data for AMEX-listed companies consolidated on Network B comes from AMEX itself, a number of regional exchanges, and the OTC market.
On its face, requiring one market's data to be consolidated with data for the same equities from other markets seems to imply that Congress or the SEC believes data from Boston or Philadelphia is similar, if not identical, to data from NYSE or AMEX for the same stock (e.g. IBM) for the purpose of price discovery.
Suppose, however, that data quality varies significantly from one market to another. Even though market data is tagged for identification as to source, the current approach results in the same price being charged for each bit of market data regardless of source - there is one price for Network A consolidated data. Hence, any differences in quality, including the cost of providing higher quality data, are not reflected in market data prices.
If one market's data were the only truly relevant "market data," then adding in unlabeled data from other markets would merely add noise to the signal - it would not improve the signal's quality or utility but would degrade it. In an unregulated market, vendors and investors would simply disregard the low-quality information, but SEC rules preclude vendors' selective display of equities market data.61 In addition, requiring NYSE and AMEX to "share" market data revenue on, e.g., a per-transaction-reported basis with their rivals has little to commend it. It is an inefficient method of subsidizing less efficient markets.
This is not a theoretical issue. First, Reuters recently indicated that it will stop making options market data provided by certain options exchanges available to its customers, citing cost and quality as the reasons. This may also reflect Reuters' assessment that some markets' auto-quotes provide no value insofar as price discovery is concerned.62 Second, the current mismatches between data cost, data value, and average price may help explain rebates paid by some markets to encourage order flow to their floors.63 One way for an exchange to increase its share of transactions is to regularly produce the "best" bids and offers. However, there is another way. The exchange may offer its members rebates for bringing trades to that market. The economic incentive to provide rebates is a by-product of the distribution of market data revenues among markets on the basis of each market's share of total transactions.
As the process currently functions, Network A, for example, generates a pool of revenue that is divided among participating exchanges based upon the number of transactions each reports to CTA/SIAC. Any exchange that can increase its share of transactions captures a greater share of the Network A revenue pool. If a market's cost-per-transaction is less than the average revenue per transaction, it has an incentive to increase the number of transactions provided to Network A.64 Similarly, compensating market data providers on an average revenue basis may under-compensate markets providing information that is more costly to produce. If market data pricing reflected the value that investors and brokers attributed to the quality of data, e.g., based upon the data's source, lower quality/lower cost data would be sold at lower prices. The opportunity to pay a rebate would disappear if each market sold its data separately.65
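The revenue-pool arithmetic described above can be sketched numerically. All figures below (trade counts, pool revenue, per-trade cost) are hypothetical and chosen only to show how a below-average cost per transaction creates room for rebates.

```python
# Hypothetical sketch of the tape revenue-pool mechanics described above.
# All figures are invented for illustration.

def pool_shares(trades_by_market, pool_revenue):
    """Divide the tape revenue pool in proportion to reported trades."""
    total = sum(trades_by_market.values())
    return {m: pool_revenue * t / total for m, t in trades_by_market.items()}

trades = {"primary": 800_000, "regional": 200_000}
payouts = pool_shares(trades, pool_revenue=10_000_000)

avg_rev_per_trade = 10_000_000 / 1_000_000   # $10 of pool revenue per trade
regional_cost_per_trade = 4.0                # assumed cost per transaction

# A market whose cost per transaction is below the average revenue per
# transaction can profitably rebate up to the difference ($6 here) to
# attract additional order flow and a larger share of the pool.
max_rebate = avg_rev_per_trade - regional_cost_per_trade
```

Each additional trade a low-cost market attracts raises its payout by roughly the average revenue per trade, which is why the rebate incentive tracks the gap between that average and the market's own cost.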
Thus far the value of market data provided through Network A has not been considered. How investors will value Network A's data stream, as the content of the data stream changes, is not obvious. For example, investors might link the value of the total data stream to the value of the "average" bit - the value to potential buyers and sellers of the "average" reported transaction. Hence, each exchange reporting to Network A, under this scenario, would receive a share-weighted payment based upon the average information value of each Network A reported trade.
This is not the only answer. It is also possible, if the source of the data in Network A's stream is identified, for the total value of the Network A stream to be determined solely by the value of the "best" transaction data, possibly data coming from the largest market.66 Equating the value of a data stream to the value of the highest quality data within it, however, assumes that investors can filter out that data at low or no cost. The value of Network A data might decline if the volume of market data resulting from auto-quoted trades were to rise. This might happen if investors bear a cost for sorting through relatively uninformative data to find the bits that matter most, or if the temporal flow of data from markets where price discovery is most likely is interrupted or delayed to report auto-quotes as part of the real-time stream.
Trading Off the Primary Exchanges
Competition between markets occurs now, as the SEC has noted, and is significant for transactions. Inter-market competition for large blocks of stock traded by institutions is apparently intense. ECNs are playing an increasing role in inter-market competition. Three have filed with the Commission for recognition as national securities exchanges.
According to the SEC, in September 1999 25.6 percent of the trades (16.1 percent of the share volume) of NYSE-listed securities were traded outside the NYSE exchange. For AMEX, the fraction of off-exchange trades (off-exchange volume) was 31.3 percent (29.5 percent).
The available evidence indicates that a substantial amount of share trading occurs outside the primary organized markets (NYSE, NASDAQ, and AMEX). NYSE listed shares trade on six regional exchanges (Boston, Cincinnati, Chicago, CBOE, Pacific, and Philadelphia), in the OTC market, and on nine more recently developed ECNs. The markets also compete to list companies, but put that aside.
Investors who can transact anywhere (many places) do not need access to any specific market. Investors do, however, need access to high quality market data. In a real sense, transactions and market data have been unbundled from one another through earlier SEC and SRO rule changes. The cost of the physical act of buying and selling - transacting - once high quality market data has been obtained, may be relatively (very) low. Hence, once transacting and market data have been unbundled from one another, traditional primary markets cannot charge more for transactions (order flow) than the alternatives available to investors, large (Instinet) or small (ECNs). If the markets cannot charge for market data, or if market data fees are capped excessively, some markets may fail to recover total cost and, hence, exit. In the next section of this paper and in TABLE 3 below, we illustrate that eliminating market data fees entirely would have made every traditional exchange, except NYSE, "unprofitable" in 1994 and 1998.
The SEC has expressed concern that the markets may further fragment. This concern makes the SEC's market-data regulatory impulse little short of bizarre. Transactions and market data are true joint products67: every transaction gives rise to a bit of market data68. If transactions can easily be moved to other exchanges, physical or electronic, or can be moved "in house" by brokers matching buy-sell orders off the exchange floor, then market data moves off any given exchange's floor as well.
Because competition between exchanges for transaction activity is intense, so is the derived competition for market data. While market data under current SEC rules (assuming these rules do and will apply to all institutions, foreign and domestic, where trades in U.S. listed equities occur) will exist, the specific exchange from which the market data comes is not fixed and given, but depends upon competition. Hence, a perceived "need" to regulate market data also implies a "need" to regulate transaction fees. Conversely, and we believe correctly, the absence of any perceived need to regulate transaction fees implies that the need to regulate market data fees has not been demonstrated.
However, should the SEC conclude that the marketplace for transaction services is competitive, with highly elastic demand for transaction services either in the aggregate or at the level of each exchange separately, that implies that transaction fees themselves will be quite close to marginal cost. The fee that induces exchange patronage, and also creates easily captured market data, is the transaction fee. This is the only fee that distinguishes exchange use from trading off the exchange.
If regulation caps market data fees, assuming for the moment that market data demand is inelastic, this effectively forces the exchanges to try to recover overhead costs from transaction fees. And these are precisely the fees investors can more easily avoid.
Impact of Across-the-Board Limits on Market Data Fees.
The data set out in TABLE 2 indicates that market data fees, on average, accounted for 27.9 percent (31.1 percent) of the total expenses for the nine listed exchanges in 1998 (1994). The ratio of market data revenue to exchange expense varied significantly from exchange to exchange. For example, in 1998 (1994) AMEX recovered 38.1 percent (47.6 percent) of its expenses from market data fees (largely from Network B, which it administers), while NYSE recovered only 17 percent.69 CSE and CHX were especially dependent upon market data fees. According to the data provided as part of the Concept Release, market data revenues amounted to 46.2 percent of CSE's reported expenses and 51.7 percent of CHX's reported expenses in 1998.
The market data revenue contribution to each of the exchanges excludes the direct cost of CTA/CQ/SIAC. Hence, a requirement that market data revenue only cover the direct costs associated with CTA/CQ/SIAC would reduce market data contributions to each exchange to zero.70 The consequence for each of the exchanges, except NYSE, would have been significant losses in each year.
Revenue and expense data for each of the market centers and regional exchanges is set out in TABLE 3. TABLE 3 sets out for each exchange for 1994 and 1998 (1) total revenue from all sources, (2) total expenses, (3) annual net accounting income, and (4) total market data revenues earned from CTA's Networks A and B, OPRA, and NASDAQ.71 TABLE 3 also illustrates how each exchange would be affected if the SEC limited market data fees to specified fractions of total exchange expenses (20%, 30% and 40%, respectively). This is one relatively simple way to cap market data fees.72 As TABLE 3 illustrates, a 20 percent or 30 percent cap would have had an adverse impact on net income earned by the AMEX, Boston, and Chicago exchanges in 1998. A 20 percent to 40 percent cap would have adversely affected AMEX and Chicago in 1994. The 20 percent cap would have adversely affected Boston and NASDAQ in 1994 as well.
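Cap calculations of the kind underlying TABLE 3 can be illustrated with a short sketch. The exchange figures below are invented for illustration; the actual 1994 and 1998 data appear in the tables.

```python
# Sketch of the expense-fraction cap computation. Figures are invented.

def capped_net_income(revenue, expenses, data_revenue, cap_fraction):
    """Net income if market data revenue is capped at cap_fraction of expenses."""
    allowed = min(data_revenue, cap_fraction * expenses)
    lost = data_revenue - allowed          # revenue forgone under the cap
    return (revenue - lost) - expenses

# Hypothetical exchange: $100M total revenue, $95M expenses,
# $45M of that revenue from market data fees.
for cap in (0.20, 0.30, 0.40):
    ni = capped_net_income(100.0, 95.0, 45.0, cap)
    # At a 20% cap only $19M of data revenue is allowed, so net income
    # falls from +$5M to roughly -$21M; tighter dependence on data fees
    # means deeper deficits under any given cap.
```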
Moreover, NYSE was the only exchange that more than covered its annual costs in 1994 and 1998 without market data fees. Without market data fees, every other exchange would have run deficits. Assuming that transaction fees are affected by competition from both other traditional exchanges as well as on-line alternatives, it is not at all obvious that the exchanges could simply raise transaction fees to make up any shortfall due to regulatory reductions in market data revenue. Even if they did so, the net impact on the investor is far from clear - market data is cheaper, but that is offset by more expensive transactions. If the exchanges attempted to raise regulatory or listing fees, the result, again, may not benefit the investors who, after all, pay broker/dealer costs and own the bulk of the listed firms. Exchanges that would otherwise earn profits due to superior efficiency or to the superior quality of their services, including market surveillance and market data, should not be penalized.
If a firm is regulated with price tied to cost, it is well-recognized that any attempt to "over-regulate" the firm by forcing production and price to be where demand is inelastic creates a powerful incentive for the firm to incur unnecessary expenses by "cost padding," "gold-plating", or overpaying for inputs.73 This could apply to an exchange that was over-regulated.
Suppose the SEC could find a way to price-regulate the most profitable exchanges. The result may very well be to create an incentive to incur higher, rather than lower, costs. Since the price of market data, for example, would be tied to the cost of producing it, over-regulation could give such an exchange an incentive to increase costs and, hence, revenues. This would then invariably involve the SEC in inquiries regarding exchange costs and their prudence and reasonableness.
Regulation is widely recognized as "at best...a pallid substitute for competition." As we note elsewhere, regulation creates costs, and they may be far from trivial. If regulation is required, the principle used to guide the regulatory approach is one that seeks to mimic competition. As noted above, an inept regulation strategy may seek to put too much of the cost burden on services with more elastic, rather than less elastic, demand and result in a non-viable supplier. Under the "Ramsey approach" to regulation, rates reflect both cost and value (demand) for the services being offered. When economies of scale and scope, the basis for a need to regulate, are present, setting rates based on the direct (incremental, marginal) cost of each service separately ensures that the regulated firm(s) will not be viable. Simple, traditional fully allocated cost methodologies, while engaging the regulator in lengthy accounting exercises - including the need to establish a consistent cost-accounting methodology for each of the regulated exchanges - need not result in either efficient or "fair" prices - prices free from cross subsidies.
Economic price and access regulation has historically been reserved for cases where technology has resulted in economies of scale and scope so significant and enduring as to result in a "natural monopoly" or an "essential facility."74 It is far from obvious that primary markets display these features. Competition for order flow (transactions) and for listings is significant. Moreover, changes in technology, including changes permitting round-the-clock trading in primary-market listed equities - trading that occurs even when the listing exchange is closed - suggest that price discovery activity need not reside or occur in a single market. Today, retail and other investors can obtain delayed and real-time market data for very low prices, as low as zero from some vendors. This does not square well with an obvious and compelling need to regulate, as the potential consumer gains from doing so are minuscule at best.
The SEC agrees: "Ultimately, only fair and vigorous competition can be relied upon to set efficient prices." Notice of Filing of Proposed Rule Change to Rescind Rule 390, February 23, 2000, p. 18.
Fully Distributed Cost ("FDC") pricing refers to a set of approaches for allocating joint and common costs, also known as "shared costs", to specific services.75 Firms that produce multiple outputs exist because of economies of scope.76 Each of the various FDC approaches builds up prices by apportioning shared costs to each service without regard to service demand. As a result, FDC methods entail efficiency losses compared to second-best methods that begin with a well-defined consumer welfare maximization goal and then find the prices that maximize welfare subject to a qualification or constraint that the supplier be viable. Hence, the distinguishing feature of FDC pricing is that shared costs are allocated without much reference to economically meaningful criteria.
It is important to recognize that the process by which shared costs are apportioned can be viewed as a bargaining game. Consider the following very simple example with two services (A and B) and a simple technology that has fixed capital essential for producing either or both services. In addition to the shared capital, producing each of the two services entails a variable or "direct" cost per unit produced. Hence, the cost of production can be written as:
(1) C(A,B) = F + c_A A + c_B B
The total "incremental" or total "direct" cost of producing A, given that B is already being supplied, is simply c_A A. Similarly, the total "incremental" or total "direct" cost of making B available, given that A is already being supplied, is c_B B. If the supplier is confined to setting rates per unit of A and B equal only to c_A and c_B, respectively, the supplier will not recover any of the essential fixed cost, F:
(2) p_A A + p_B B = c_A A + c_B B < C(A,B)
Hence, recommendations that a supplier should be confined to setting rates that recover only product line incremental or direct cost will fail to be compensatory and the supplier will exit. This is a general result whenever there are shared costs and no product line diseconomies of scale.77
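A numeric illustration of the point: with any positive shared fixed cost, pricing each service at its unit direct cost leaves exactly F unrecovered. All numbers below are invented.

```python
# Illustration of the shortfall from direct-cost-only pricing.
# Invented figures: shared fixed cost F and unit direct costs c_A, c_B.

F, c_A, c_B = 100.0, 2.0, 3.0
A, B = 30.0, 10.0                             # output levels

total_cost = F + c_A * A + c_B * B            # C(A,B) = 100 + 60 + 30 = 190
revenue_at_direct_cost = c_A * A + c_B * B    # p_A = c_A, p_B = c_B -> 90

# Revenue falls short of total cost by exactly the shared fixed cost F,
# so a supplier held to direct-cost rates is not compensatory and exits.
shortfall = total_cost - revenue_at_direct_cost
```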
Since the supplier must be viable - total revenues must cover total (opportunity) costs including a normal return - the regulatory allocation of the shared costs, F, may be viewed as the outcome of a negotiation between the customers of service A and service B mediated by the regulatory agency. When regulation is viewed as largely a process by which different consumers bargain with one another regarding how joint and common costs will be shared, it should be clearly kept in mind that there is no economically justifiable basis for pure cost allocation.78 However, it is also quite clear that there is nothing inherently "fair" or "reasonable" about a cost allocation method, as such, that assigns none of the shared cost to a specific service.
Three methods for implementing FDC have been the most commonly used in regulatory proceedings: the relative output method ("ROM"), the gross revenue method ("GRM"), and the attributable cost method ("ACM"). All three methods can be written down as variants of the following deceptively simple revenue equation:
(1) R_i = DC_i + φ_i F,   φ_i < 1
In this equation, F represents total shared costs, φ_i is the fraction of shared costs paid by service i, DC_i represents the total direct cost attributable to service i, and R_i is the total revenue collected by service i.
The three cost allocation methods differ only in the way they determine the cost-sharing parameter, φ_i. Under the three approaches, this parameter, as noted, is not determined with any regard to efficiency. The three allocation schemes determine φ_i as follows:
ROM: φ_i = Q_i/[Q_1 + ... + Q_N]
GRM: φ_i = R_i/[R_1 + ... + R_N]
ACM: φ_i = DC_i/[DC_1 + ... + DC_N]79
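The three allocators can be written out in a few lines. The output, revenue, and direct-cost figures below are invented, and chosen so that the same two services receive very different shares of F under each rule, underscoring that the choice among FDC methods is arbitrary from an efficiency standpoint.

```python
# Sketch of the three FDC cost-sharing rules. Each returns the list of
# fractions phi_i that sum to one. All input figures are invented.

def rom(outputs):
    """Relative output method: share by physical output."""
    return [q / sum(outputs) for q in outputs]

def grm(revenues):
    """Gross revenue method: share by revenue."""
    return [r / sum(revenues) for r in revenues]

def acm(direct_costs):
    """Attributable cost method: share by direct cost."""
    return [dc / sum(direct_costs) for dc in direct_costs]

Q = [300.0, 100.0]      # output per service
R = [500.0, 500.0]      # revenue per service
DC = [60.0, 140.0]      # direct cost per service

# For the same two services, service 1's share of F is 75% under ROM,
# 50% under GRM, and only 30% under ACM.
```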
In practice, as we note elsewhere, FDC also involves the regulator in resolving a very large number of contentious or potentially contentious issues, including depreciation rate determination, asset valuations (book versus replacement cost), how to handle construction work in progress, and other matters. As Rogerson's analysis of the use of ACM in defense contracting indicates, FDC may also enmesh regulators in determining whether prices paid for specific inputs are "fair".80
If each of the service levels is fixed at a predetermined value, Q_i, similar to the regulatory approach to rate-setting based upon the use of "test years", we can divide "revenue", R_i, and FDC "cost" by the fixed service level and obtain an average revenue figure or "price":
(1A) R_i/Q_i = AR_i = P_i = [DC_i + φ_i F]/Q_i
So long as (1) cost has the separable form given by the sum of service-specific direct costs, DC_i, plus a shared cost element, F, and (2) the service quantities actually demanded remain at the preset values, then prices set by any of the FDC methodologies will be subsidy-free. However, when the prices arrived at in (1A) are actually put into use, quantities may very well change from their predetermined values. When that occurs, the breakeven constraint will not be met and the prices will not be subsidy-free ex post. When rates are not subsidy-free, they may be deemed "unfair" or "unreasonable." This, then, will provide grist for a new round of rate hearings and rate re-determination.81
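A short sketch of the ex post breakeven failure: prices that exactly cover cost at the test-year quantities produce a deficit once realized quantities fall short of the preset values. All figures below are invented.

```python
# FDC "test year" prices versus ex post outcomes. Figures are invented.

def fdc_price(direct_cost_total, phi, shared_cost, q):
    """Average-cost price at a preset test-year quantity q."""
    return (direct_cost_total + phi * shared_cost) / q

F = 100.0
# service 1: total direct cost 60 at test-year q=300; allocated phi=0.3 of F
p1 = fdc_price(60.0, 0.3, F, 300.0)    # 0.30 per unit
# service 2: total direct cost 140 at test-year q=100; allocated phi=0.7 of F
p2 = fdc_price(140.0, 0.7, F, 100.0)   # 2.10 per unit

# At the test-year quantities, revenue exactly covers total cost (300)...
rev_test = p1 * 300.0 + p2 * 100.0

# ...but if realized demand falls to (250, 80), revenue no longer covers
# cost: unit direct costs are 0.2 and 1.4, and F is unchanged.
rev_actual = p1 * 250.0 + p2 * 80.0
cost_actual = F + (60.0 / 300.0) * 250.0 + (140.0 / 100.0) * 80.0
deficit = cost_actual - rev_actual     # a $19 shortfall ex post
```

Because the shared cost F does not shrink when demand falls, any quantity shortfall opens a gap between FDC revenue and total cost, triggering the re-determination cycle described above.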
One alternative way to support FDC methods has been to start with a list of well-specified criteria that a FDC scheme "should" meet. One list of criteria, proposed by Mirman, Samet and Tauman is the following:
1. Cost sharing. An allocation mechanism should result in prices for a set of services and cost of supply so that total cost is covered.
2. Rescaling. If the way the services are measured changes (e.g., moving from an individual share to round lots), then a sensible allocation mechanism should rescale prices accordingly (e.g., the price for a round lot should be 100 times the price for a share).
3. Consistency. Take a subset of the full set of services. If the cost of producing this subset depends only on the total quantity produced, i.e., only on the sum of the outputs of the services in the subset, then the prices set for any two services in the subset should be the same.82
4. Positivity. Take two different cost functions, C1 and C2. If C1 > C2 at zero output (e.g., the fixed cost of C1 is larger than the fixed cost for C2) and this difference in cost increases as output increases, then prices associated with C1 should be higher than prices associated with C2. Regulated prices should be positively linked to cost.
5. Additivity of Allocations. Suppose the variable cost of providing a set of services can be separated into k "stages", G_k(S). Then total variable cost is C(S) = Σ_k G_k(S). Mirman, Samet and Tauman assert that the rate-setting process should assign a fraction, f_k, of the shared costs to each stage and the allocation should be added to each stage's variable cost. Finally, total allowed revenue should just equal the sum of each stage's variable cost plus its allocation of shared costs.
6. Correlation. For any two stages, h and k, if G_h > G_k, then the fraction of overhead allocated to h should be larger than the fraction allocated to k.
The first two and the fourth criteria are relatively innocuous. The third criterion links rates to marginal costs only and means this approach has no special claim to being efficient as it excludes demand considerations. The fifth and sixth criteria imply that shared cost allocations should be treated as add-ons so that the allocations of overheads are correlated with variable costs.83
Mirman, Samet, and Tauman go on to prove that only one FDC method meets the six criteria when the cost function takes the special form C = F + Σ_k c_k(Q_k), a form commonly used in empirical work in economics because it is empirically reasonable. The FDC method that meets this test is the ACM. Hence, to the extent these criteria are appealing, they provide a basis for one FDC method that is not completely arbitrary (or any more arbitrary than the underlying criteria).
|1||By "assuring...viability" we do not mean that regulation should necessarily guarantee that any specific exchange is profitable in any given year. The problem with many regulatory proposals, however, is that implementing them would guarantee that the regulated firm would either be insolvent or, in the case of regulation limited to a subset of the enterprise's services, require the enterprise to alter unregulated prices in order to remain economically viable.|
|2||Frank P. Ramsey, "A Contribution to the Theory of Taxation," Economic J., Vol. 37 (1927): 47-61; William Baumol and David Bradford, "Optimal Departures from Marginal Cost Pricing," Amer. Econ. Rev., Vol. 60, June 1970: 265-283; William Baumol, "Ramsey Pricing," in: The New Palgrave, J. Eatwell, M. Milgate, and P. Newman (eds.), Vol. IV (Macmillan, 1987): 49-51; Marcel Boiteux, "On the Management of Public Monopolies Subject to Budgetary Constraints," Journal of Econ. Theory, 3, 1972: 219-240 (first published as "Sur la gestion des monopoles publics astreints à l'équilibre budgétaire," Econometrica, Vol. 24, 1956: 22-40); Stephen J. Brown and David S. Sibley, The Theory of Public Utility Pricing (Cambridge University Press, 1986); Robert D. Willig, "The Theory of Network Access Pricing," in H.M. Trebing, (ed.), Issues in Public Utility Regulation (Michigan State University Press, 1979): 109-152; Gerald R. Faulhaber and James Boyd, "Optimal New-Product Pricing in Regulated Industries," J. of Regulatory Economics, Vol. 1 (1989): 341-358.|
|3||For the purposes of this paper, we will focus on transactions and market data as being most relevant to the issues raised by Commission in its Concept Release.|
|4||The alternative path, taken in many other countries, has been outright public ownership and operation. The perceived need to regulate or nationalize enterprises gave rise to a substantial body of economic analysis of "optimal" pricing and rate-making. For example, the history of whether price should be set at "marginal" or "incremental" or "direct" cost, or whether price should systematically depart from these cost concepts, goes back more than 150 years. Robert B. Ekelund, Jr. and Robert F. Hebert, Secret Origins of Modern Microeconomics: Dupuit and the Engineers (University of Chicago, 1999). For sympathetic discussions of the strict application of "marginal" or "direct cost" pricing to nationalized enterprises and of marginal (direct) cost pricing coupled with subsidies paid from general tax revenue for regulated utilities, see Burnham Putnam Beckwith, Marginal-Cost Price-Output Control (Columbia University Press, 1955), and Harold Hotelling, "The General Welfare in Relation to Problems of Taxation and of Railway and Utility Rates," Econometrica, Vol. 6, 1938: 242-269. Hotelling credited Dupuit as an earlier proponent of marginal cost-based price regulation. However, as Ekelund and Hebert make quite clear, Hotelling's reliance upon general tax revenue to cover the deficits he fully expected to result from his marginal (incremental, direct) cost-based regulatory proposal had been considered, and rejected as manifestly unfair, by Dupuit nearly 100 years earlier.|
|5||The qualifiers "substantial" and "significant" are important. For example, the multi-product or multi-service firm endemic in nearly every market would be much less likely to exist absent economies of scope. Baumol, Panzar and Willig, Contestable Markets and Industry Structure (Harcourt, Brace, Jovanovich, 1982), p. 71.|
|6|| Michael Magill and Martine Quinzii, Theory of Incomplete Markets, vol. 1 (MIT, 1996); Andreas Papandreou, Externality and Institutions (Oxford University Press, 1994). However, the fact that the institution of the anonymous market may unravel does not inevitably imply that public regulation should be imposed. Other, private, institutions may develop to address the problem. Consider the two simplest examples where markets unravel due to imperfect information and adverse selection: the sale of used cars of unknown quality or the provision of insurance to individuals with different health (genetic) status. Adverse selection may arise when one party (the owner of the used car, the potential insured) knows its true characteristics while the other party (the auto buyer, the insurance company) does not. If used car prices or health insurance premiums are set based on the average quality of the pool of potential used car sellers or insurance buyers, there is an incentive for those with high quality cars or high health status to drop out, reducing the average quality of the pool. This may occur repeatedly and the market unravels. This need not require public regulation to fix. For example, warranties or health examinations may help those with higher characteristics differentiate their product from the rest. A single anonymous market in the end may be replaced with multiple markets, each reflecting different quality or health status.
Failure of markets to form, or to fragment once formed, is especially relevant in this discussion. The SEC has expressed concerns over "market fragmentation" by which the SEC appears to mean the dispersal of trading activity across a number of unlinked or weakly linked exchanges or market centers. We discuss this issue and its connection to how exchanges are funded below.
Second, market data is information and as such has special production and use properties. Information may be much more costly to collect (find, discover) initially than to replicate (copy, imitate). This property of information makes contracts to sell data difficult. Selling information to someone can create a rival fully capable of competing directly, and this feature differentiates information from products generally. When GM sells a Buick, it does not create a full-line automotive rival. When Microsoft sells a copy of Windows, or when NYSE sells real-time stock data, they provide the buyer with a necessary input to enter the marketplace, provided replication costs are low and property rights are imperfectly defined. When imitation (copying) is inexpensive, when verifiable contracts are costly to negotiate, and when legal protection (copyright, software patents) is also imperfect, the vendor's optimal price is reduced because the elasticity of demand for the product from that vendor is higher than the elasticity of market demand for the service.|
|7||See Bork, The Antitrust Paradox: A Policy At War With Itself (Basic Books, 1978) for why antitrust policy should take this approach and the problems that arise when it has failed to do so. Some antitrust statutes, notably the Robinson-Patman amendments to Section 2 of the Clayton Act (1936), are difficult to square with a policy aimed at promoting competition rather than protecting competitors. Regulatory policy, as implemented, has also historically raised issues of regulatory capture, incumbent protection, and suppression of competition. Stephen Breyer, Regulation (Harvard, 1982). The deregulatory wave of the 1970s, beginning with the dismantling of the Civil Aeronautics Board ("CAB") and the scaling back of the Interstate Commerce Commission ("ICC"), and moving to electric power and telecommunications in the U.S. and the U.K., reflected the reaction to perverse regulatory outcomes and costs.|
|8||Competitive markets, by their very nature, are unkind to the inefficient or to those that are slow to innovate. Firms that pick the wrong "business model" - the plan by which they intend to cover their costs of doing business - decline or disappear entirely. Hence, regulation designed to protect competitors will produce anti-competitive results unless a compelling case can be made that ongoing competitor protection - protection the antitrust laws cannot provide adequately - is inextricably and positively linked to consumer welfare.|
|9||Stephen Breyer, "Luncheon Address," Seminar, The Cutting Edge of Antitrust: Lessons from Deregulation, reprinted, Antitrust Law Journal, Vol. 57, Issue 3, 1988: 771.|
|10||This goal dates back to Adam Smith who observed that consumption is the aim of production. We do not want competitors for their own sake but only insofar as active competitors expand production of goods or services and thereby reduce price or improve quality.|
|11|| A self-regulated organization ("SRO") such as NYSE considers the interests of other stakeholders as well. Traditional models of regulation, simplified to include "consumers" on the one hand and "producers" on the other hand, also result in rates and other relevant terms that balance consumer and producer interests. As a starting point, suppose the regulator may be deemed to act in the public interest, where the public interest may be defined as maximizing a weighted sum of the benefits consumers and producers obtain from trading with one another. More precisely, welfare, W, is the weighted sum of consumers' surplus, CS, and producers' surplus or profits, Π:
W = CS + αΠ
where 0 < α < 1. When the weight, α, placed on producer income is less than one, the regulator "cares" more about consumers. This is a very general approach that can be extended to weight the interests of different consumers in different ways (e.g., "retail investors" and "institutional investors"). Baumol and Bradford, op cit.; J. J. Laffont and J. Tirole, A Theory of Incentives in Procurement and Regulation (MIT, 1993), Chapter 1. While the body of this report follows the SEC Concept Release's stated concern about retail investors, public interest regulation, whether formally done through a government agency or informally through SROs, invariably takes a broader range of interests into account.
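The trade-off embedded in this objective can be illustrated with a small numeric sketch (Python; the linear demand curve, cost figures, and weight are entirely hypothetical, not figures from this report):

```python
# Hypothetical linear demand p(q) = A - B*q, constant marginal cost MC,
# and a fixed cost FC.  Welfare is W = CS + alpha * PS with 0 < alpha < 1,
# so the regulator weights producer profit less than consumer surplus.

A, B, MC, FC, ALPHA = 10.0, 1.0, 2.0, 6.0, 0.5

def surpluses(p):
    q = (A - p) / B                 # quantity demanded at price p
    cs = 0.5 * (A - p) * q          # consumer surplus triangle
    ps = (p - MC) * q - FC          # profit net of the fixed cost
    return cs, ps

# Unconstrained welfare peaks at p = MC, but there the firm loses FC:
cs_at_mc, ps_at_mc = surpluses(MC)

# Second best: maximize W subject to the firm breaking even (ps >= 0).
# With one product and alpha < 1, W falls as p rises above MC, so the
# constrained optimum is the lowest price at which ps >= 0.
grid = [p / 100 for p in range(200, 1001)]          # prices 2.00 .. 10.00
p_second_best = min(p for p in grid if surpluses(p)[1] >= 0)
```

This is the "second-best" logic discussed in the text: marginal-cost pricing leaves the firm insolvent by exactly the fixed cost, so the welfare-preferred regulated price sits just far enough above marginal cost to restore viability.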
|12||SEC Concept Release, Regulation of Market Information Fees and Revenues, No. 34-42208, File No. S7-28-99, p. 5|
|13||SEC Concept Release, op cit., p. 3.|
|14||SEC Concept Release, op cit., p. 3.|
|15||It is important to recognize that the current fee structures evolved in the early-to-mid 1990s through NYSE pilot programs with the investing community, members of whom control NYSE through its Board of Directors, "voting with their dollars" for the arrangement they preferred.|
|16|| The general fee structure used by the CTA and NASD is a two-part tariff. The simplest two-part tariff consists of an access fee, F [Note: the term "access" is used here in its economic sense, and is not meant to be equated to the "access fees" charged for physical network connections], that does not vary with use (but may vary across classes of customers or with the size of the network requesting connection), and a per-unit variable fee, c. Hence, the cost incurred by user A can be written as:
C_A = F_A + c_A × Q_A
where Q_A represents the quantity user A actually purchases in a given month. The CTA and NASD set c_A equal to zero for traditional brokers and set F_A equal to zero for on-line brokers. The two-part tariff consists of two prices, one for access or participation (F_A) and the other for consumption or use (c_A). Viewed in this way, access and consumption are complements. When the firm increases the price of access - raises F_A - customers on the margin drop out and so consumption declines. Conversely, if the firm raises the price of consumption, some marginal customers no longer find participation attractive and so the number of participants declines. For a more detailed discussion of how to set optimal two-part tariffs, see S.J. Brown and D.S. Sibley, The Theory of Public Utility Pricing (Cambridge, 1986), Chapter 4. These authors show that under some, but by no means every, condition on customer behavior, it may be optimal to set the consumption price below marginal cost. For more on multipart tariffs and other forms of nonlinear pricing, see Robert B. Wilson, Nonlinear Pricing (Oxford, 1993). For earlier discussions of the efficiency properties of two-part tariffs, see A.M. Henderson, "The Pricing of Public Utility Undertakings," Manchester School, Vol. XV, September 1947: 223-250, and especially Ronald Coase, "The Marginal Cost Controversy," Economica, Vol. XIII, August 1946: 168-182.
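The two schedules described above - a pure access fee for traditional brokers and a pure usage fee for on-line brokers - can be sketched as follows (Python; the dollar figures are hypothetical illustrations, not the actual CTA/NASD fees):

```python
# Monthly cost to a user under the two-part tariff C = F + c * Q.
# As in the footnote, one schedule sets c = 0 (flat access fee only)
# and the other sets F = 0 (per-quote fee only).  Fees are hypothetical.

def monthly_cost(F, c, Q):
    return F + c * Q

FLAT_FEE = 127.25          # hypothetical access fee, traditional broker terminal
PER_QUOTE = 0.01           # hypothetical per-quote charge, on-line broker

def cheaper_schedule(Q):
    trad = monthly_cost(FLAT_FEE, 0.0, Q)     # access fee only
    online = monthly_cost(0.0, PER_QUOTE, Q)  # usage fee only
    return "flat" if trad < online else "per-quote"

# Light users prefer per-quote pricing; heavy users prefer the flat fee,
# which is one way differentiated two-part tariffs can sort customers.
```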
|17||Delayed data has been provided free by the exchanges to everyone for some time. This is important to keep in mind. It is not obvious that all or even most retail investors need "real-time data" in order to manage their portfolios. At least one Internet-based data vendor provides a delayed quote package, including software, for $10.00 per month and advertises the product as being suitable for "long-term" retail investors who do not need real-time data. WWQuote, http://www.wwquote.com.|
|18||RTC Thomson "Real Time Quotes", February 14, 2000 (http://rtq.thomsoninvest.net/index.sht) ("Once you have completed your real time stock quote registration, you will be eligible to upgrade your account to receive FREE real time options." ... "As your agreement for the receipt and use of market data provides, the securities markets (1) reserve all rights to the market data that they make available; (2) do not guarantee that data; and (3) shall not be liable for any loss due either to their negligence or to any cause beyond their reasonable control." ... "A free service of Thomson Investors Network offering: Quote (Up to 100 real-time stock and options quotes)...")|
|19||Medved Quote Tracker - FREE Real-Time Quotes, Charts, News (http://www.medved.net/QuoteTracker/main.htm) ("Note: Now it's FREE. February 07, 2000: version 1.3.1 is now available. STREAMING QUOTES!! Quote Tracker can now get streaming realtime quotes from DATEK and ISLAND. We also added Candlestick and OHCL intraday charting options as well as a number of other enhancements and bug fixes.")|
|20||freerealtime.com (http://quotes.freerealtime.com/frontpage), February 14, 2000 ("Now YOU have access to the SAME information for which professionals pay hundreds of dollars a month! Freerealtime.com members get FREE unlimited real-time quotes, great symbol specific research, convenient Watchlist tracking features, popular StockTalk message boards, FREE market news, valuable investment analysis, and FREE web-based e-mail. Did we mention FREE!?")|
|21||These vendors are not charities. They have patterned their Internet business models after over-the-air ("free") radio and television and newspapers and magazines. These media each provide content for free or nearly for free in order to build up an audience with selected demographic characteristics. The right to contact this assembled audience is then "sold" to advertisers. Hence, providing real-time market data for free as a vehicle for selling advertising space is just the latest variation on an old, established business model.|
|22||Various brokerage companies have bundled real-time data together with trading. For example, Accutrade, Ameritrade, and Charles Schwab provide retail investors with 100 "free" real-time quotes for each trade the brokerage executes. Brown & Co. supplies 100 "free" real-time quotes each day, regardless of trading activity, to investors maintaining at least a $15,000 account. Castle Online provides NASDAQ Level II quotes for $150 per month, or "for free" if the investor executes 100 or more trades per month. Datek provides "free" real-time quotes to investors who maintain a $2,000 minimum account. Dreyfus and E*Trade also provide "free" real-time quotes apparently to investors maintaining a $2,000 minimum account. For Fidelity Online, the minimum account is $5,000. This list was pulled from a longer list provided by Internet InvestingTM, Online and Discount Brokers (http://www.afterhourstrading.com/brokers.htm), March 8, 2000.|
|23||Package price exclusive of exchange fees.|
|24||This is the lowest cost WWQuote data package including real time data. WWQuote also offers a package, called the Investor Basic, for $10.00 per month that includes delayed data. WWQuote also provides a service, the Daytrader Complete, for $54.95 per month, excluding exchange fees. http://www.wwquote.com.|
|25||The listed NYSE exchange fee is 50% higher than the $1.00 per month actually charged by NYSE for non-professional investors. WWQuote also lists exchange fees for "Canadian investors" for the Tokyo Stock Exchange ($5.20/month); the CDN ($6.20/month); and the CDNX ($13.80/month).|
|27||For an additional $20.00/month, DTN.IQ provides real time options quotes. DTN.IQ (http://www.dtniq.com)|
|28||RealTimeQuotes "Upgrade" (http://www.rtquotes.com/pricing.htm) The first two packages listed also have a one-time start-up fee of $300.00, while the last two have a $500.00 fee. The packages differ based on the amount of Internet access provided. The four plans, respectively, provide as part of the base fee 20 hours/month, 60 hours/month, 140 hours/month and unlimited access. The plans include delayed tick-by-tick trading information, real-time news headlines, watchlists, symbol and news searches, e-mail, chat sessions, alerts, "and other features." The listed NYSE "non-pro fee" of $5.25/month is 425 percent higher than the fee Network A currently charges vendors, and was more than 20 percent higher than the non-pro fee prior to October 1999.|
|29|| Daniel F. Spulber, Regulation and Markets (MIT, 1989), p. 3:
Natural monopoly is due to economies of scale or economies of multiple-output production.
Economies of scale arise when the incremental (marginal) cost of producing successive units of output is lower than the incremental cost of prior units. Hence, the marginal cost of a specific unit of output depends upon the existing scale - how many total units are being produced. Economy of scale is a strong form of cost sub-additivity. Cost is sub-additive up to some output level, Q, when there is no feasible way of subdividing Q among separate firms without incurring higher total costs. For a one-product firm, economy of scale implies sub-additive costs but not the converse. Baumol, Panzar, and Willig, Contestable Markets and the Theory of Industry Structure (Harcourt Brace Jovanovich, 1982).
When firms produce multiple outputs, economies of scope may arise. Economies of scope are the multiple-output analog to economies of scale. Economies of scope arise when the incremental cost of producing one output, A, depends negatively upon the amount of another product, B (or of all other products), being produced. As more B is produced, the incremental cost of producing A declines (does not rise).
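Sub-additivity, as defined in this footnote, is easy to verify numerically for the simplest declining-average-cost technology - a fixed cost plus a constant marginal cost (Python; figures are illustrative):

```python
# Minimal check of cost sub-additivity: with a fixed cost FC and constant
# marginal cost m, C(q) = FC + m*q, so one firm producing Q is always
# cheaper than splitting Q across two firms, each paying FC again.

FC, m = 100.0, 2.0

def C(q):
    return FC + m * q if q > 0 else 0.0   # no output, no cost

Q = 50.0
single_firm = C(Q)                                  # 100 + 2*50 = 200
two_firm_splits = [C(q) + C(Q - q) for q in (10.0, 25.0, 40.0)]
# Every split duplicates the fixed cost, so each split costs 300, not 200.
```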
|30||Network economies arise when the value to one consumer from a given product increases with the number of other consumers who buy a compatible product or service. Telephony provides the archetypal example. The value of network access to an individual depends positively upon the number of other individuals who access the same network. The analyses of industry-wide standards, software platforms, telecommunications, and language itself have all been shaped by the notion of "network effects" or "network externalities." Lord Keynes described stock selection as an odd form of beauty contest: the wise investor did not pick stocks because she thought they were beautiful but because she thought others believed they were. Networks have a reverse Yogi Berra effect. Mr. Berra allegedly remarked about a restaurant: "Nobody goes there anymore because it's too crowded." With networks, individuals select networks that are "crowded" (but not congested) - they go where they expect to find others with whom to transact.|
|31||Breyer, Regulation, op cit. p. 15.|
|32||Economics and Public Utilities (Appleton-Century-Crofts, 1950), p. 25. The "necessity" Clemens referred to is the need for regulation, not the "necessity" of the product.|
|33|| Bonbright, Danielsen, and Kamerschen, Principles of Public Utility Rates, 2d ed. (Public Utility Reports, Inc., 1988), p. 17.
The SEC agrees: "Ultimately, only fair and vigorous competition can be relied upon to set efficient prices." Notice of Filing of Proposed Rule Change to Rescind Rule 390, February 23, 2000, p. 18.
|34||Bonbright, et al, op cit., p. 34|
|35||Bonbright, et al, op cit., p. 30|
|36||Public Policies Toward Business (Irwin, 1966), p. 476|
|37||The principle cases include Terminal Railroad (224 U.S. 383, 1912) (railroad terminal service/ICC jurisdiction); Otter Tail Power (410 U.S. 366, 1973) (wholesale electricity wheeling/Federal Power Commission jurisdiction); Hush-A-Phone Corp. v. AT&T (20 FCC 391, 1955, rev'd)|
|38||Bonbright, et al, op cit., p. 34.|
|39|| This competition continues to become more intense. Recently, the Pacific Exchange combined with an electronic communications network, Archipelago, to create an electronic stock exchange for U.S. equities. According to a Financial Times' front-page story:
The Archipelago exchange is expected to trade stocks listed on the New York Stock Exchange, NASDAQ and the American Stock Exchange.
It will compete with NASDAQ and the NYSE for new share listings, especially in technology start-ups. In the longer term, the partners expect the exchange to form the basis of a 24-hour global trading system. ...
For investors, the partnership combination is likely to bring further reductions in trading spreads, especially in the trading of NYSE-listed stocks.
The Pacific Exchange will become a stakeholder in Archipelago, along with its current backers, including Goldman Sachs, J.P. Morgan and Reuters' Instinet subsidiary. John Labate, "Alliance to create electronic U.S. stock exchange," Financial Times, March 15, 2000, p. 1.
|40||Not all "market data" may have the same value as part of the ongoing price discovery process. For example, offers to buy or sell at the market price may be much less valuable than limit bids and offers away from the current market price, especially when coupled with bid or offer size. A price quote, bid or offer, that merely mimics or copies bid and offer data from other exchanges does not necessarily add anything to the price discovery process because it adds no new information the original bid or offer did not provide. The quality of an exchange's market surveillance program may also affect the perceived quality of market data from that source. Finally, the quality of market data may be affected by how "thick" or "liquid" the market is at that stock trading post.|
|41||In principle, the exchange could have funded itself by "taxing" the market data produced by each transaction as it was assimilated by each trader. Of course this approach would have been impractical because the exchange could not verify how much "information" regarding any specific trade, bid or offer, any individual trader became aware of, let alone acted upon. Transactions were linked to market data and were very much easier to verify, so for sound economic reasons the transaction became the basis for collecting exchange fees.|
|42||We ignore listing and regulatory fees in the interest of clarity.|
|43||This simplified model assumes the exchange chooses to charge a constant amount for transactions and market data regardless of volume. This is an admitted simplification. As we discussed earlier, the exchanges have experimented with and currently use pricing arrangements for data where the charge varies with use. The tariffs in use imply that average costs are different but marginal costs are the same.|
|44|| It is quite important when proposing economic price regulation supervised by a federal agency to keep in mind that many features of the administrative process surrounding public regulation can impose, and have imposed, substantial social costs, especially in an innovative market. The regulatory process invariably slows down the speed with which prices adjust, up or down, to new circumstances and also subjects each new rate and service proposal by regulated firms to lengthy, costly, and competitively revealing review. This is the antithesis of a competitive process.
This issue is exemplified by the lengthy process that was encountered when CTA first filed its request for the $0.01 per quote price in September 1997. It was not until October 1999 that the new rate was made effective by the SEC.
|45||The "stand alone cost" of serving any buyer or group of buyers is the total cost that would be incurred if the suppliers of these customers were to produce without simultaneously producing any other items or any additional quantities of any of the services these customers use. The stand alone cost of producing a service S is the amount it would cost to produce S if its production were deprived of all further economies of scale and economies of scope.|
|46||This is a well-recognized point and has been applied to the FCC's "reverse Ramsey" telecommunications pricing policy for access by J. Gregory Sidak and Daniel Spulber, Regulatory Takings and the Regulatory Contract (Cambridge, 1997): especially pp. 339-342.|
|47||The failure of the incremental cost rule applied one product at a time applies whether the test uses short-run or long-run incremental costs.|
|48||The problem is compounded when, as in the present case, the total costs of providing the services are incurred across corporate or organization boundaries. For example, NYSE provides data to CTA/SIAC for consolidation. The cost of receiving, processing, consolidating and transmitting data that has already been collected and put in the proper format may be very low. Limiting market data fees to cover only the cost of SIAC and CTA would simply ignore the bulk of the costs associated with market data that are incurred within the exchanges. NYSE could avoid this problem by expressly charging SIAC/CTA for the data provided to them, making the full data collection costs obvious. However, that would only place the onus of regulation squarely on the exchanges and the fees they would be permitted to charge SIAC/CTA or any other potential competitor for the data-consolidation function.|
|49||Ramsey pricing derives its name from Frank P. Ramsey, "A Contribution to the Theory of Taxation," Economic Journal, Vol. 37 (1927). For technical discussions of Ramsey pricing, see Baumol, Superfairness, op cit.; Brown and Sibley, op cit.; Bonbright, et al, op cit.; Baumol, Panzar and Willig, op cit.; or Sidak and Spulber, op cit. The Interstate Commerce Commission adopted Ramsey pricing in 1983 to regulate surface transport. The FCC, however, expressly rejected Ramsey pricing because that approach was deemed, by the FCC, to conflict with the Telecommunications Act of 1996. FCC, First Report and Order, 11 F.C.C. Rcd. at 15,853 ¶¶ 696-698. In effect, the FCC's initial approach required local exchange carriers to recover their forward-looking shared costs by raising prices on their most price-sensitive network elements, guaranteeing maximum consumer defection (bypass) to other (possibly higher cost) alternatives.|
|50||William Baumol and David Bradford, "Optimal Departures from Marginal Cost Pricing", Amer. Econ. Rev. Vol. 60, June 1970: pp. 265-283.|
|51||Firms are obliged to earn their way from market sales.|
|52||William Baumol, Superfairness (MIT, 1986), p. 143. The term "second-best" should be understood as follows. Economists have long accepted that when price equals correctly measured marginal cost all mutually beneficial transactions occur - the value (measured by price) of the marginal or last unit of service provided just equals its cost. This is the "first-best" outcome, and one that markets capable of supporting competition provide. However, when technology dictates substantial economies of scale and scope, setting price to just equal marginal cost would, absent some other action, make the supplier insolvent. The revenue generated from prices equal to marginal cost would fail to cover the supplier's total costs. Pricing rules designed to maximize social welfare while recognizing the constraint that the supplier be viable are known as "second best" pricing rules.|
|53|| The formula is:
[p_k - MC_k]/p_k = λ/ε_k
where ε_k represents the elasticity of demand for service k, MC_k represents marginal cost, p_k represents price, and λ is the Ramsey proportionality constant that is set to ensure the firm earns a normal return. When services are not independent of one another in consumption - they are complements or substitutes - but marginal costs remain constant so they do not vary with the output of any specific service, the formula can be replaced with another that replaces the individual service's demand elasticity, ε_k, with an expression, dubbed the "superelasticity," that includes own-price and cross-price elasticities for all the services the firm provides. J.H. Rohlfs, "A Theory of Interdependent Demand for a Communications Service," Bell Journal of Economics and Management Science, Vol. 5(1), 1974: 16-37.
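The inverse-elasticity rule can be illustrated numerically. The sketch below (Python) uses two hypothetical services with constant-elasticity demands and solves for the proportionality constant that lets the firm just cover a shared fixed cost; every number here is an invented illustration:

```python
# Ramsey (inverse-elasticity) rule:
#   (p_k - MC_k) / p_k = lam / eps_k   =>   p_k = MC_k / (1 - lam/eps_k).
# lam is tuned so markup revenue just covers a shared fixed cost.

def ramsey_price(mc, eps, lam):
    return mc / (1.0 - lam / eps)

def net_profit(lam, services, fixed_cost):
    total = 0.0
    for mc, eps, demand in services:
        p = ramsey_price(mc, eps, lam)
        total += (p - mc) * demand(p)
    return total - fixed_cost

# Two services with constant-elasticity demand q(p) = 100 * p**(-eps):
services = [(1.0, 1.5, lambda p: 100.0 * p ** -1.5),   # inelastic demand
            (1.0, 4.0, lambda p: 100.0 * p ** -4.0)]   # elastic demand

# Bisect on lam until the firm exactly breaks even on a fixed cost of 20.
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2.0
    if net_profit(mid, services, 20.0) < 0.0:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2.0

p_inelastic = ramsey_price(1.0, 1.5, lam)
p_elastic = ramsey_price(1.0, 4.0, lam)
# The price-insensitive service bears the larger percentage markup, which
# is the defining feature of Ramsey pricing discussed in the text.
```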
|54||This may or may not be deemed "fair." Examples can be constructed that point in either direction. Our more limited point is simply to suggest that Ramsey pricing can be interpreted as consistent with "value of service" rate-making.|
|55||As we indicate elsewhere, however, every form of serious price regulation ultimately involves substantial data collection, analysis and review. Ramsey pricing differs from other forms of price regulation because it directly involves estimating demand elasticities in order to produce consumer welfare-optimal prices.|
|56|| An alternative form of regulation that has been employed by telecommunications regulators in the United Kingdom involves fixing an initial price for each regulated service and then capping the rates the regulated firm is allowed to charge through a formula. The formula used in the United Kingdom linked future prices to the current price by:
P_t = P_0 × (1 + RPI - X)^t
where P_t is the allowed price t periods after the base period, P_0 is the base price, RPI is the change in a price index deemed to be beyond the control of the regulated firm (in the U.K. example it was the retail price index), and X is a target productivity factor. Under one form of price-cap regulation, the prices chosen by the regulated firm approximate Ramsey prices over time. Ingo Vogelsang and Jorg Finsinger, "A Regulatory Adjustment Process for Optimal Pricing by Multiproduct Monopoly Firms," Bell Journal of Economics, Vol. 10, Spring 1979: 157-171. Hence, regulators who opt for price caps may be "embracing Ramsey pricing" without knowing it.
It should be clear that price-capping will involve the regulatory agency at the outset in a potentially extended set of hearings regarding the proper base price, the proper index (RPI), and the proper productivity target (X). The hearings will also invariably involve discussions of how the price-cap regulatory process can be made "credible" in the sense that future regulatory commissions will be bound to abide by the process. Price-cap regulation for electricity was put in place by the Arizona Corporation Commission in the early 1980s, but the process collapsed due, in large part, to subsequent renegotiation. Mark R. Isaac, "Price Cap Regulation: A Case Study of Some Pitfalls of Implementation," Journal of Regulatory Economics, Vol. 3, No. 2, June 1991: 193-210.
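The mechanics of an RPI - X cap are simple to illustrate (Python; the base price, inflation rate, and productivity target are hypothetical):

```python
# RPI - X price cap: each year the allowed price may rise by at most the
# inflation rate (RPI) less a productivity offset (X), compounding from
# the base price.

def capped_price(p0, rpi, x, years):
    return p0 * (1.0 + rpi - x) ** years

# With 3% inflation and a 5% productivity target, nominal prices must
# fall about 2% a year: a $100 base price drops to roughly $90.39 in
# five years, without the regulator ever examining costs period by period.
p_after_5 = capped_price(100.0, 0.03, 0.05, 5)
```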
|57||Recently NYSE relaxed Rule 500, which did not preclude competition between exchanges for company listings but erected a "barrier to exit," or a form of "lock-in." When firms have listing options, an exchange that makes "exit" difficult is also likely to be less attractive as an exchange to "enter." Hence, firms may have been less likely to list on NYSE - or may have waited longer to list once they met NYSE's eligibility requirements - because it would then be more difficult for these firms, once listed, to adjust to changed circumstances by, e.g., choosing to voluntarily de-list. However, firms have chosen to list on NYSE even with its difficult "exit" policy. NYSE's elimination or weakening of the "exit" cost is equivalent to a reduction in the expected cost of a NYSE listing.|
|58||H. Bessembinder, "On Assessing the Costs and Benefits of Exchange Listing," NYSE Working Paper 2000-01, January 2000. Dr. Bessembinder considered (1) listing fee differences, (2) commission differences, (3) bid-ask spread differences, and (4) differences in payments to "market professionals" (market-makers). The estimated cost-reduction from migrating to NYSE may have declined following NASDAQ's 1997 change in order handling rules. The average cost saving to liquidity suppliers declined in the sample of firms Bessembinder examined from $980,000 to -$4,000, and the number of migrating firms with an estimated cost reduction also declined, from 89.8 percent to 69.6 percent. The gain by public investors from exchange migration at the expense of market professionals (market-makers) declined from $3.39 million to $1.94 million. Whether these figures represent a real reduction or an artifact of the sample of firms selected for the study is not entirely clear. It is very important to recognize, as Bessembinder notes, that the savings estimated for firms that choose to migrate do not necessarily reflect savings that would be available to "typical" NASDAQ-listed firms. Presumably, firms with the largest estimated saving are those more likely to move. The more important point is that when cost-savings appear to arise, firms take the opportunity and shift. Finally, the estimated transaction savings varied widely across firms.|
|59||"No exchange or member thereof shall make available or disseminate, on a current and continuing basis, transaction reports or last sale data with respect to transactions in any reported security executed through the facilities of such exchange except pursuant to an effective transaction reporting plan filed by such exchange (either individually or jointly with other persons)". SEC Rule 11Aa3-1(c)(2)|
|60||The negotiation history surrounding the formation of the CTA, Networks A and B, and the process by which the exchanges have been required to provide capacity-demand estimates to SIAC indicates that not all of the exchanges, at least, would have voluntarily agreed to provide consolidated data on their own.|
|61||Moreover, if collecting, distributing and displaying low-quality data imposes costs - e.g. requires CTA/SIAC and the vendor community to make larger data-processing investments than they otherwise would - someone (retail investors) will pay for the equivalent of a WPA project (one group digs holes in the morning that another group fills in at night).|
|62|| For a more thorough discussion of free-riding and the need for firm property-rights to market data, see J. Harold Mulherin, J. M. Netter, and J. A. Overdahl, "Who Owns the Quotes? A Case Study into the Definition and Enforcement of Property Rights at the Chicago Board of Trade," The Review of Futures Markets, Vol. 10, 1991: 108-129. This paper was written while Mulherin and Overdahl were with the SEC and first appeared as an SEC Working Paper. The authors examined the impact of a then-new information dissemination technology, telegraphy, on the operation of the CBOT in the 19th century. As with the current development of the Internet, high-speed telegraphy meant that trading no longer needed to occur at the CBOT. Hence, while the Internet differs in many ways from telegraphy, some of the same fundamental economic issues are associated with both. So long as traders, including brokers and other exchanges, outside Chicago had timely access to CBOT price-discovery, they could assure themselves and trading partners that price quotes reflected a broader market. As the authors note (pp. 115-116), telegraphy made it much easier for nonmembers of the exchange to free-ride on the information produced by the CBOT:
The bucket shops (which included competing exchanges, brokerage houses not actually dealing on the CBOT, and outright gambling establishments), therefore, had no intended delivery and were not engaged in actual price discovery, but instead established business based on the quotes of the CBOT. With no delivery or price creation, the bucket shops experienced lower costs than the CBOT and could, therefore, charge lower fees to their customers. ... For our purposes, the primary importance of bucket shops is their potential for free-riding on the price quotes of the CBOT.
|63|| For a simple example, an exchange reporting transactions to Network A receives the following revenue for each transaction occurring within that exchange:
Total per Transaction Fee = T + NET_A × s_k/t_k
In this formula, T represents the usual per transaction fee, NET_A represents Network A's net revenue available for distribution, s_k represents the share of all reported transactions that exchange k provides, and t_k represents the absolute number of transactions exchange k reports. Suppose the transaction element is competitively set and is the same for all exchanges.
The cost of providing a transaction-cum-market data can be written as:
Cost per transaction = c_k(t_k, Q_k) = h(t_k) + g(Q_k)
This formula recognizes that per transaction cost may depend upon transaction volume, t_k, and also upon the quality of transactions and market data, Q_k. The simplification assumes that the cost of quantity, h, and the cost of quality, g, are independent of one another. A normal assumption would be that cost varies positively with quality.
|64|| When transaction fees are competitively set and identical, the only difference between one exchange and another is the quality, and cost, of providing transaction services cum market data. Suppose for the highest quality exchange:
T + NET_A/t = h(t) + g(Q)
This means that the exchange offering the highest quality data just breaks even. Exchanges offering lower quality data, however, earn more from their transactions and their share of Network A revenue than it costs them to produce both. Myopically, each exchange other than the one producing high-quality data has an incentive to increase transactions in order to increase its share of Network A revenue.
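The incentive problem in footnotes 63 and 64 can be sketched numerically (Python; all fee, volume, and cost figures are invented for illustration):

```python
# Each exchange k earns T + NET_A * s_k / t_k per transaction; since
# s_k = t_k / TOTAL_T, the Network A term equals NET_A / TOTAL_T for
# every exchange, regardless of its data quality.  Per-transaction cost
# h + g(Q) rises with quality Q.

T, NET_A = 1.00, 600_000.0
TOTAL_T = 1_000_000.0           # total transactions reported to Network A

def revenue_per_txn(t_k):
    s_k = t_k / TOTAL_T
    return T + NET_A * s_k / t_k        # simplifies to T + NET_A / TOTAL_T

def cost_per_txn(quality):
    return 0.40 + 1.20 * quality        # h constant, g linear in quality

margin_high = revenue_per_txn(400_000.0) - cost_per_txn(1.0)   # high quality
margin_low = revenue_per_txn(100_000.0) - cost_per_txn(0.2)    # low quality
# The high-quality exchange just breaks even while the low-quality
# exchange pockets a surplus, so each exchange has an incentive to chase
# reported volume rather than data quality.
```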
|65||Current technology permits each market to either continue to provide data under separate terms and conditions to an industry consolidator such as CTA/SIAC, or to provide data directly to end users for a fee with the end-users (vendors) picking the data feeds they desire and then consolidating them. Competition between vendors for investor patronage would dictate the extent to which consolidation occurs.|
|66||Some evidence that this may be true is the recent Reuters' proposal to provide options data only from the market where the largest number of options in a given security are traded. Investors seeking data from other, smaller, markets will be required to pay an additional fee.|
|67||Economists use the terms "joint", "common", or "shared" costs to describe a situation like this. "Joint" costs are costs shared by two (or more) services in fixed proportions. The classic examples are "mutton and wool," "beef and hides," and railroad back-haul services. "Common" costs are costs shared by two or more services in variable proportions. For example, the NYSE building itself could represent a common cost: it is used to produce an array of services necessary for transacting stock, but each service may use the building to a different degree. When economists, regulators, or business people discuss a technology that has both "joint" and "common" costs, they sometimes use the term "shared" costs.|
|68||This is strictly true only for transactions taking place through markets or market-like institutions required by the SEC to report. For any trades occurring outside the ambit of the SEC's reporting jurisdiction (e.g., outside the US), the market-data residual may not be reported to CTA/CQ and made available for consolidation.|
|69||Assume the SEC intends to limit market data fees to a fraction of cost, tied directly in some yet-to-be-determined manner to the production of market data, including market surveillance. According to a rough NYSE estimate, NYSE spent approximately $300 million in 1998 on the systems that produce market data, transactions and market surveillance. This figure excludes overheads. [The $300 million estimate is just that; it is not audited data.] It includes only the costs associated with NYSE's various systems for receiving, routing and processing orders. The output of these systems, market surveillance aside, is a transaction and two types of market data. NYSE revenues from Network A amounted to $111,000,000, reduced to $93,000,000 after payment of Network A expenses. It is important to understand that the expenses incurred to operate Network A and SIAC represent only a small part of the costs associated with creating "market data": they include only costs incurred after market data has been created within the exchanges. Based upon NYSE's estimate, NYSE market data net revenue covered less than 33 percent of NYSE's costs related to market data production in 1998. A market-data revenue cap limiting market data fees to 30-40 percent of the shared costs associated with market data production would have had little or no impact on NYSE in 1994 or 1998. Such a limit, in all likelihood, would have had a more pronounced adverse financial impact on the other eight traditional exchanges.|
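As a quick arithmetic check, the coverage ratio stated in this footnote follows directly from the rough, unaudited estimates it quotes:

```python
# Arithmetic check using only the rough NYSE estimates quoted in the footnote.
shared_costs = 300_000_000     # 1998 systems producing data, transactions, surveillance
network_a_gross = 111_000_000  # NYSE revenue from Network A
network_a_net = 93_000_000     # after payment of Network A expenses

coverage = network_a_net / shared_costs
print(f"Net market-data revenue covers {coverage:.0%} of estimated shared costs")
# -> 31%, i.e. "less than 33 percent" as stated in the footnote
```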
|70||Such a proposal obviously misunderstands the way market data is produced, because it considers only the minor part of market data production costs that arise after the market data has been produced. It is akin to requiring automobiles to be sold at prices only sufficient to recover the cost of a dealership, but not the cost of producing automobiles in the first place.|
|71||This data was reported as part of the SEC's Concept Release. The Concept Release data is evidently based upon audited reports for each of the four data networks, and hence may differ from data that could be obtained from each exchange's annual reports.|
|72||If demands for all services, including market data, and future total expenses were known with certainty and assuming this form of regulation would not discourage efficient operation, the exchange could set fees to recover any specified fraction of total expense. When demand and future expenses are not known with certainty, the exchange can only set fees based upon forecasts that may prove in error.|
|73||Elizabeth Bailey, Economic Theory of Regulatory Constraints (Lexington, 1973); H. Averch and L.J. Johnson, "Behavior of the Firm under Regulatory Constraint," Amer. Econ. Rev. Vol. 52, December 1962: 1053-1069.|
|74||There have been exceptions but the outcome has been widely recognized as inefficient and inequitable. For example, motor carriers were regulated partially in response to pressure from already-regulated railroads. The result was to distort inter-modal competition and to regulate an industry, trucking, that did not share characteristics normally associated with market power. In the end, surface transportation was substantially deregulated in the 1970s and 1980s.|
|75||"Joint costs" arise when two products are produced in technically fixed proportions. The classic examples include agricultural products such as "mutton and wool" or "beef and hides". Other, more modern, examples include the production of a desired good, say steel, and the production, given the technology, of an unwanted byproduct, say pollution. Over time, technical change - improved agricultural techniques or steelmaking technology - can alter the ratio of one product (e.g., mutton) to the other (e.g., wool). But the important consideration is that, given the available technology, the output of one product automatically entails the production of the other, and vice versa.
"Common costs" arise when two, or more, products or services can flow from the same plant (technology) but the mix can vary. For example, a railroad connects Pittsburgh and New York City. The amount of freight carried from Pittsburgh to New York City does not strictly depend upon the number of passengers carried over the same route, nor does the number of first or second class passengers depend upon the number of third-class passengers carried in fixed proportions. The cost of the right of way and the engine are "common" to freight and passengers, and to different classes of passengers, however.|
|76||Baumol, Panzar & Willig, Contestable Markets and the Theory of Industry Structure (Harcourt, Brace, Jovanovich, 1982): p. 71. Economies of scope are cost-saving externalities between product lines (e.g., the production of good A reduces the production cost of good B). Economies of scope are most often linked to joint and common costs.|
|77||Although economies of scope are consistent with diseconomies of scale for each product line, this situation, while giving rise to multiple-product firms, would be unlikely to give rise to a need to regulate.|
|78||This is a principle that goes back at least as far as Sune Carlson, Pure Theory of Cost and Production (Kegan-Paul, Ltd., 1936). The accounting literature also documents this problem. Arthur L. Thomas, The Allocation Problem (Sarasota, Fla: American Accounting Association, 1974). Cost allocation exercises have also employed game theory and have limited allocations to those that assign costs to each customer based upon the Shapley Value of a cost game. The Shapley Value is a concept from cooperative game theory under which each member of a cooperative receives at least as much benefit, or avoids at least as much cost, as she would by departing from the cooperative and acting independently. For a more rigorous treatment, see Martin J. Osborne & Ariel Rubinstein, A Course in Game Theory (MIT, 1994): Chpt. 14.4. For some examples of this approach to assigning costs, see H. Peyton Young (ed.), Fair Allocation (American Mathematical Society, 1985); H. Peyton Young (ed.), Cost Allocation: Methods, Principles, Applications (Elsevier Science Publishers, 1985); and L. J. Mirman, D. Samet, and Y. Tauman, "Axiomatic Approach to the Allocation of a Fixed Cost Through Prices," Bell Journal of Economics and Management Science, Vol. 14 (1), 1983: 139-151. Allocations based upon the Shapley Value, however, suffer as do all purely cost-based allocation schemes because they do not take demand characteristics adequately into account.|
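The Shapley Value allocation described in this footnote can be made concrete with a small sketch. The cost game below is hypothetical and purely illustrative; it demonstrates the property stated above, namely that each participant is allocated no more than its stand-alone cost of acting independently.

```python
from itertools import permutations
from math import factorial

# Hypothetical cost game (illustrative only, not from the report): three
# services A, B, C share a facility; cost[S] is the stand-alone cost of
# serving coalition S.
cost = {
    frozenset(): 0,
    frozenset("A"): 100, frozenset("B"): 80, frozenset("C"): 60,
    frozenset("AB"): 150, frozenset("AC"): 140, frozenset("BC"): 120,
    frozenset("ABC"): 180,
}
players = "ABC"

def shapley(cost, players):
    """Average each player's marginal cost over all orders of arrival."""
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            phi[p] += cost[coalition | {p}] - cost[coalition]
            coalition = coalition | {p}
    return {p: v / factorial(len(players)) for p, v in phi.items()}

alloc = shapley(cost, players)
# Allocations sum to the grand-coalition cost c(ABC) = 180, and no service
# pays more than its stand-alone cost -- the core-like property noted above.
assert abs(sum(alloc.values()) - cost[frozenset(players)]) < 1e-9
assert all(alloc[p] <= cost[frozenset(p)] for p in players)
print(alloc)
```

Note that the allocation uses only the cost data: as the footnote observes, a purely cost-based scheme like this takes no account of demand characteristics.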
|79||If the firm is regulated to earn only a normal return on capital (the "zero economic profit" constraint), then the GRM and ACM methods are equivalent.|
|80||William Rogerson, Overhead Allocation and Incentives for Cost Minimization in Defense Procurement, RAND Corporation, National Defense Research Institute, R-4013-PA&E, 1992.|
|81||Logically, cross-subsidy should exist only when the elimination of a service, or elimination of service to a set of customers, benefits other users. FDC methodology cannot, as it stands, be used to test for cross-subsidy. This is because the FDC methodology deals only with the service set as it has been or is operated. It does not make the incremental comparisons needed to determine whether eliminating a service, and its associated costs and revenues, would make consumers of other services better or worse off. Hence, the FDC methodology cannot be used to determine the "fairness" or "reasonableness" of a pricing regime when cross-subsidy is the issue.|
|82||The Consistency Criterion is not generally compatible with Ramsey welfare optimal prices because it ignores highly relevant differences in demand elasticities.|
|83||These two criteria are also generally inconsistent with Ramsey pricing rules.|