Remarks of Commissioner Mark T. Uyeda at 2024 Conference on Financial Market Regulation: The Analytical Challenges of Regulating Today’s Financial Markets

Washington, D.C.

May 10, 2024

Thank you for the opportunity to deliver the lunch keynote address at the Commission’s 11th Annual Conference on Financial Market Regulation. The SEC’s Division of Economic and Risk Analysis, led by Chief Economist Jessica Wachter, has devoted much time and energy to organizing this conference. I also appreciate the contributions of our partners at the University of Maryland’s Center for Financial Policy and Lehigh University’s Center for Financial Services. Today’s gathering provides a platform that pulls together viewpoints, insights, and analysis from academia, industry, and the Commission on contemporary policy issues facing the financial markets. My remarks today reflect my individual views as a Commissioner and do not necessarily reflect the views of the full Commission or my fellow Commissioners.

By profession, I am a lawyer and not an economist. However, as an undergraduate student, I studied finance. I distinctly recall, as a junior in the fall of 1990, my finance professor coming into class and beaming with excitement at the news that financial economists Harry Markowitz, Merton Miller, and William Sharpe had been awarded the Nobel Prize in Economic Sciences. Little did I realize how their work, such as Markowitz’s modern portfolio theory, would help shape my views on issues that I face in my current role as Commissioner.

The Commission was created ninety years ago in the midst of the greatest economic calamity that the United States had ever faced. The stock market crash had commenced in 1929 before bottoming out by 1933, with an overall decrease in value that approached 90 percent.[1] In 1933, the unemployment rate was about 25%[2] and real GDP was down 27%.[3] At the same time, thousands of banks across the country were failing. The Glass-Steagall Act of 1933,[4] the Securities Act of 1933 (Securities Act),[5] and the Securities Exchange Act of 1934 (Exchange Act)[6] were passed by Congress during this period.

My remarks will describe a regulatory philosophy as applied to the financial markets. It is driven by the obligation to act in accordance with the U.S. Constitution, which lays out the separation of legislative, executive, and judicial powers. Staying within the parameters of the Commission’s statutory authority under the laws enacted by Congress is mandatory, not optional. The Commission also must act in furtherance of its mission to protect investors, to maintain fair, orderly, and efficient markets, and to facilitate capital formation. What is the role that economics can play in this mission?

Given the assumption of perfect information, along with a few others, neoclassical economics holds that competitive markets generally work in promoting the economic welfare of the public. One founder of neoclassical economics, W. Stanley Jevons, thought these assumptions reasonably approximated reality. In 1871, Jevons wrote in his “Theory of Political Economy”:

The theoretical conception of a perfect market is more or less completely carried out in practice. It is the work of brokers in any extensive market to organize exchange, so that every purchase shall be made with the most thorough acquaintance with the conditions of trade. Each broker strives to gain the best knowledge of the conditions of supply and demand, and the earliest intimation of any change.[7]

While assumptions may be a useful starting point for analytical and predictive purposes, experience has revealed that markets are not necessarily characterized by perfect information, and that distinction is an important one. The problem of asymmetrical information between a buyer and a seller may result in what economists refer to as “agency cost.” This asymmetry can lead to sub-optimal outcomes, as economist George Akerlof described in his Nobel-winning paper “The Market for Lemons.”[8] Fraud can be an aspect of this asymmetry. History also teaches that, if left unpoliced, the financial markets are susceptible to these failures, which can cause them to implode.[9]

Law professor Bernard Black, in applying the Lemons Principle to the securities markets, observed that “[jurisdictions that fail to adequately address these market failures have] fallen into what insurance companies call a ‘death spiral,’ in which information asymmetry and adverse selection combine to drive almost all honest issuers out of the market and to drive share prices to zero.”[10] One of the Commission’s responsibilities is to prevent that outcome from happening, and the work of economists assists those efforts.
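To make the death-spiral mechanics concrete, here is a minimal Python sketch of Akerlof-style adverse selection. The parameters are illustrative assumptions, not figures from Akerlof or Black: quality is uniform, buyers can only price the average quality still on offer, and sellers whose goods are worth more than that offer withdraw.

```python
# Minimal sketch of Akerlof-style adverse selection ("lemons") unraveling.
# Illustrative assumptions: seller quality q lies on a uniform grid in (0, 1];
# buyers value a good at 1.5 * q but cannot observe q, so they bid
# 1.5 * (average quality still for sale); a seller accepts only if the
# offer covers q. High-quality sellers exit first, dragging the average down.

def lemons_unraveling(num_rounds: int = 25, buyer_premium: float = 1.5) -> None:
    qualities = [i / 100 for i in range(1, 101)]
    for round_num in range(1, num_rounds + 1):
        avg_quality = sum(qualities) / len(qualities)
        offer = buyer_premium * avg_quality  # buyers price the *average* good
        remaining = [q for q in qualities if q <= offer]
        print(f"round {round_num}: offer={offer:.3f}, sellers left={len(remaining)}")
        if len(remaining) == len(qualities):
            print("no one else exits: only a sliver of low-quality sellers remains")
            break
        qualities = remaining

lemons_unraveling()
```

Because a premium of 1.5 is not enough to compensate the best sellers for being pooled with the worst, each round of exit drags down the average quality, the offer, and the number of honest sellers, which is the unraveling toward zero that Black describes.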

By any measure, the initial set of federal securities laws passed by Congress ninety years ago during the Great Depression has held up well. What the financial markets needed during that period was a concerted effort by the government to set disclosure standards and provide legal tools to remove bad actors from the market. Congress’s efforts resulted in a regulatory framework that avoided the death spiral implied by the Lemons Principle.

In passing the Securities Act and the Exchange Act, Congress could have followed the merit-based approach featured in many contemporaneous state securities laws, but it did not. Instead, Congress chose to rely on market mechanisms, while addressing informational asymmetries through disclosure. This approach provided the foundation for the subsequent success of the U.S. financial markets to date, which facilitate capital formation and the resulting prosperity from economic growth, job creation, and innovation.

Congress gave additional direction through the National Securities Markets Improvement Act of 1996,[11] which requires the Commission, when engaged in rulemaking, to consider, “in addition to the protection of investors, whether the action will promote efficiency, competition, and capital formation.”[12] This obligation often requires the Commission to consider potential trade-offs. In so doing, the Commission should pursue analytical and evidence-based methods. In other words, the Commission should be scientific in its approach to the maximum extent possible. But what does that mean and what does it entail?

As a general matter, scientific discovery has provided modern conveniences to our economy and society, many of which are taken for granted by the public today — whether automobiles and aircraft; heating, ventilation, and air conditioning systems; mobile phones with applications and worldwide connectivity; clean water; medicines, antibiotics, and medical devices; rockets, satellites, and space travel; or electronic trading.

But how can science apply to the study of the financial markets? What does it mean to apply the scientific method in that context? Are there aspects of the scientific method that apply differently? Are there limitations on what can be learned about financial markets through the scientific method? With what confidence can the results of such analysis be applied? By applying methodological tools to the study of the financial markets, economists can make an important contribution. Hopefully, these efforts will provide insight into efficiency, competition, and capital formation, and gauge any tradeoffs among them.

Let’s divide this discussion into three parts. First, a brief summary of a few philosophers of science and what they have stated regarding what science can and cannot do.

Second, a discussion on the analytical challenges presented by today’s financial markets, which might limit the ability to understand the effects and possible unintended consequences of regulation.

Third, a description of a regulatory approach in light of the analytical challenges inherent in the financial markets. This approach focuses on steady, evidence-based gradualism that recognizes the limits of regulator knowledge, maximizes the scope of decentralized innovation through the markets and market intermediaries, and embraces economic analysis as a key tool for navigating this complex terrain.

While it may not be surprising, a gradualist, evidence-based approach to regulating the financial markets is grounded in traditional methodological fundamentals.

Methodological Insights from the Philosophy of Science

What are the methodological challenges in applying economics to the financial markets? Let’s start with a fundamental insight described by Scottish philosopher and economist David Hume, who died in 1776, the same year as the publication of Adam Smith’s Wealth of Nations and the signing of the American Declaration of Independence.[13] Hume argued that one cannot establish truth through induction—that is, through repeated observation. No matter how many times a set of conditions appears to result in a particular outcome, it does not necessarily mean that the same will be true the next time. As Hume stated in A Treatise of Human Nature, “there can be no demonstrative arguments to prove, that those instances, of which we have had no experience, resemble those, of which we have had experience.”[14] The reason is that no matter how many times a potentially causal relationship is observed, it is unknown whether there might be another causal element of which one is unaware. Or, as securities lawyers put it, past performance is no guarantee of future results. This is known as the Problem of Induction.[15]

British philosopher Bertrand Russell made an amusing point in his book The Problems of Philosophy.[16] An inductive chicken notices that when the door to the chicken coop opens and the farmer appears, chicken feed soon appears. After observing this numerous times, the chicken concludes that this is a causal truth. When the chicken hears the door open, it runs out expecting to be fed. One day, a causal condition changes of which the chicken was unaware, and that destroys the predictive content of its inductively derived truth. The change of condition is that the farmer is hungry. Thus, the farmer grabs that first chicken and has it for dinner. So much for inductive reasoning.

If one cannot discern the truth from observation, how can one know it at all? Milton Friedman offers an answer in his Essays in Positive Economics. He argues that while one cannot know whether a model is true, one can know whether it has predictive content. Friedman posits that the task of “positive economics” is:

to provide a system of generalizations that can be used to make correct predictions about the consequences of any change in circumstances. Its performance is to be judged by the precision, scope, and conformity with experience of the predictions it yields.[17]

According to Friedman, what matters is the model’s predictive power as “theory is to be judged by its predictive power for the class of phenomena which it is intended to explain.”[18] With this approach, Friedman attempts to avoid the problem of whether the model is true by pointing out that the ultimate criterion should be whether the model is useful in terms of predictive content. Friedman’s approach, however, still leaves the problem of knowing which model to select based on the empirical evidence—what might be called the problem of theory choice—and what to do if there are counterexamples with respect to the predictive content of the model.

Philosopher Karl Popper, who was at the London School of Economics for a time, had an answer.[19] Popper noted that there was a fundamental asymmetry between corroboration and falsification. With corroboration, irrespective of how many times a seemingly causal connection is observed, it cannot be proven. In contrast, with falsification, it takes only one counterexample to disprove a proposition.

As Aristotle notes, one would think that one could conclude that all swans are white if enough of them were observed.[20] But it turns out that induction does not even work for Aristotle’s famous example, which was disproven by a single observation in Australia of a black swan.[21] That is Popper’s point: the potential for falsification lies at the heart of good science and appropriate methodology. As Popper describes, “it is only in searching for refutations that science can hope to learn and to advance. It is only in considering how its various theories stand up to tests that it can distinguish between better and worse theories and so find a criterion of progress.”[22]

Popper asks: if proof cannot be accomplished through the process of induction, then what is the nature of science? He based his concept of good science on the asymmetry between corroboration and falsification.[23] Popper argues that the essence of science is generating hypotheses that can be maintained tentatively as possibly true in that they have predictive content consistent with what has been observed. At the same time, those hypotheses must not have been falsified by a counterexample, even though they are framed in a manner that renders them susceptible to falsification. Thus, good science consists of tentatively held theories that have survived in the face of the potential for counterexamples. Good science is not so much about corroborating content as about testing it for falsification against potential counterexamples. In other words, good scientists should try to disprove their own work.[24]

For example, if a person hypothesized that water boils at 100 degrees centigrade, he or she might hire research assistants with Bunsen burners to corroborate the proposition over and over again. But knowledge might advance faster if one tried to disprove the proposition, in which case one might try boiling water on a mountaintop and find that the hypothesis does not hold there. In that case, one might consider modifying the hypothesis to factor in air pressure.[25]
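The modified hypothesis can even be written down explicitly. As a standard piece of textbook thermodynamics (offered here as an illustration, not anything asserted in the remarks), the Clausius-Clapeyron relation ties a liquid’s vapor pressure to temperature, and boiling occurs when the vapor pressure reaches the ambient pressure:

```latex
\ln\frac{P_2}{P_1} = \frac{L_v}{R}\left(\frac{1}{T_1} - \frac{1}{T_2}\right)
```

where \(L_v\) is the latent heat of vaporization and \(R\) is the gas constant. At a mountaintop’s lower ambient pressure (\(P_2 < P_1\)), the boiling temperature \(T_2\) must fall below 100 degrees centigrade, so the refined, pressure-aware hypothesis accommodates the very observation that falsified the original.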

In sum, Popper views science as a process of bold, falsifiable hypotheses combined with ruthless efforts to falsify them, with the surviving hypotheses maintained as possibly true and tentatively held. This is not too different from the methodological perspective espoused by Friedman. As Friedman states, “[f]actual evidence can never ‘prove’ a hypothesis; it can only fail to disprove it, which is what we generally mean when we say, somewhat inexactly, that the hypothesis has been ‘confirmed’ by experience.”[26]

There are a few problems with Popper’s view. First, most scientists do not spend extensive time trying to disprove their own work. Second, an “observation” that might constitute a counterexample may itself be wrong, which is a potentially significant problem. Third, in actual scientific practice, a theory that confronts a counterexample is not necessarily discarded.[27] If that theory has some uses where it has predictive power, it may be retained until a better theory comes along, even if the original theory has been disproven in some applications. Also, a theory that appears to be falsified through a potential counterexample might be modified in a manner where the counterexample is deemed a mere “anomaly.” For example, if a planet is not quite where the theory says it should be, one might hypothesize a small but thus far unobserved body just beyond it that is pulling it out of orbit.

Philosopher Imre Lakatos, who was a student of Popper, addressed some of these critiques. Lakatos described science as a set of competing research programs.[28] A “research program” consists of an interrelated set of hypotheses that is tested by identifying anomalies and transforming them into corroborated content by adjusting the model or theory. Within a given discipline, that is often deemed the real test of the scientist: can the scientist take a research program and extend its predictive content through refining adjustments that address the seeming anomalies presented? The result of such a research program is not a growth of proven content, but a growth in predictive content and perhaps a growth in knowledge.

While Lakatos’ approach might not result in an absolute truth, it can be used as a means of improving predictive capabilities. Friedman agreed with this methodological insight. Friedman referred to such promise within a competing theory as “fruitfulness.” He explained: “[a] theory is … more ‘fruitful’ the more precise the resulting prediction, the wider the area in which the theory yields predictions, and the more additional lines for research it suggests.”[29] If you are a junior scientist entering the field, you would want to work on a “fruitful” research program to make your mark.

Thus, good science entails a process by which the predictive content of research programs increases across time by transforming anomalies into corroborated content, even if unexplained anomalies remain. The professionalization of economics as a discipline reflects in part the power of the Lakatosian approach. In response to an anomaly, if an adjustment is made to a theory that results in both corroborated excess empirical content and the retention of unrefuted content, then it is rational to accept the new adjusted theory and reject the old theory. The reason is not that the new adjusted theory is proven, but rather that it is better than the old theory.

Now, having described methodological insights from the philosophy of science, how does one apply them to the financial markets?

The Analytical Challenges Inherent in Financial Markets

From a methodological perspective, fully capturing concerns about investor protection, efficiency, competition, and capital formation in the financial markets is challenging. One reason is that the financial markets have developed, and continue to evolve, at a rapid pace. Because of the complexity of this ecosystem, developing a theoretical framework with corroborated content across time is very challenging. The Lakatosian approach may prove difficult to apply if what one is trying to model keeps changing. In the financial markets, constant developments and innovations result from the interaction of technological, regulatory, infrastructural, and demographic changes, which, in turn, shift incentives and drive further evolution.

The fundamental problem is that we are not examining investor protection, efficiency, competition, and capital formation in a steady-state ecosystem. Rather, the ecosystem is fast-changing, with multiple drivers of change that are in constant flux. The effect of a particular driver may vary over time, and the drivers keep shifting relative to one another, often in nuanced and sometimes unpredictable ways. The systematic and diligent application of a research program to enhance the corroborated content of models takes time. But what happens when the markets being studied move so fast that what previously constituted corroborated content is rendered false by the evolution of that ecosystem?

From a methodological point of view, this matters. The “problem of induction” becomes more severe in this context. Model content that may be accurate and corroborated at one point in time may become false, and falsifiable, in the next period of time. Does this mean that economic analysis is doomed from the start? I think not.

Such a fast-moving system means that more economic analysis is needed, not less. Economic analysis must also be conducted on a continual basis because the likelihood of changing conditions altering the findings is high. Given that the Commission is trying to understand policy decisions across a complex ecosystem in relatively short order — often meaning less than a year or so, not a period of years — the contribution of economists to the Commission’s decision-making process is all the more vital. Because many of the emerging changes to the ecosystem are imposed by the Commission itself in the form of regulatory reforms, the Commission has an obligation to engage its economists to analyze the overall impact of these reforms. It is not sufficient to analyze each reform only in isolation.

One substantial constraint is that economics largely is not, and cannot be, an experimental science. As Friedman describes:[30]

Economics as a positive science is a body of tentatively accepted generalizations about economic phenomena that can be used to predict the consequences of changes in circumstances. Progress in expanding this body of generalizations, strengthening our confidence in their validity, and improving the accuracy of the predictions they yield is hindered not only by the limitations of human ability that impede all search for knowledge but also by obstacles that are especially important for the social sciences in general and economics in particular, though by no means peculiar to them. … The necessity of relying on uncontrolled experience rather than on controlled experiment makes it difficult to produce dramatic and clear-cut evidence to justify the acceptance of tentative hypotheses.

Friedman’s insights hit hard with respect to regulatory policymaking in the financial markets. If the Commission is engaged in rulemaking, how does it know what the impact of that rule will be on investor protection, efficiency, competition, and capital formation if it has never observed a world with such a rule in it and generally cannot do so before adopting it? The Commission is generally not in a position to conduct an experiment to test the effects of a rule. A prior attempt by the Commission to engage in a pilot program as a means of gathering data was successfully opposed in court.[31] Moreover, the relatively short tenures of Commission leadership often leave little incentive to conduct long-term research. Without experiments, the Commission cannot hold other variables constant when evaluating potential policy choices.

Of course, this challenge does not mean that the Commission should refrain from estimating the effects of its proposals. However, the Commission’s ability to do so with a high level of confidence is limited. But this is only the beginning of the methodological challenges.

Other challenges include the fact that the financial system is fraught with positive and negative feedback loops, which produce many non-linear responses and may result in non-normal probability distributions.[32] There can be sudden and unanticipated mass shifts in behavior as markets receive new information. For example, market variables that normally may not be correlated with one another become correlated in the midst of a panic.[33] These shifts in correlations during a panic often render recent data taken from a period not characterized by panic irrelevant and even misleading. However, even if one were to examine a previous historical panic to obtain data, the system, and its causal interrelationships, may have evolved since that earlier period, also rendering that data potentially misleading. This can make it difficult to predict potential unintended consequences of a change in the regulatory framework.
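The instability of correlations can be made concrete with a small simulation. The sketch below uses purely synthetic data with made-up parameters (an illustration, not an empirical claim): two return series share only a weak common factor in a “calm” regime and a large common shock in a “panic” regime, so a correlation estimated from the calm period badly misstates co-movement during the panic.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def simulate_returns(n_days: int, common_weight: float) -> tuple[np.ndarray, np.ndarray]:
    """Two asset-return series: a shared market factor plus idiosyncratic noise.
    common_weight controls how strongly the shared shock drives both series."""
    common = rng.normal(0.0, 1.0, n_days)
    a = common_weight * common + rng.normal(0.0, 1.0, n_days)
    b = common_weight * common + rng.normal(0.0, 1.0, n_days)
    return a, b

# Calm regime: the common factor is weak, so measured correlation is low.
calm_a, calm_b = simulate_returns(n_days=500, common_weight=0.3)
# Panic regime: a large common shock dominates both series.
panic_a, panic_b = simulate_returns(n_days=60, common_weight=3.0)

print(f"correlation estimated in calm markets:  {np.corrcoef(calm_a, calm_b)[0, 1]:.2f}")
print(f"correlation realized during the panic:  {np.corrcoef(panic_a, panic_b)[0, 1]:.2f}")
```

With these illustrative parameters, the calm-period estimate is near 0.1 while the panic-period correlation approaches 0.9, even though nothing about either asset changed except the regime generating the data.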

In the midst of these challenges, how likely is the occurrence of Hume’s problem — that there is a variable that matters to a particular prediction but was not observed through empirical efforts? The many variables and changing correlations make the financial markets difficult to analyze. As with history, what holds in one moment may not hold in the next. There is also the risk that one can be “fooled by randomness.”[34]

Additional complexities are introduced by the types of principal-agent conflicts that characterize many aspects of the financial markets. One can partition principal-agent conflicts into three categories: (1) simple conflicts, such as where the agent favors itself ahead of clients or is embezzling money from clients, (2) conflicts where the agent favors some clients over others, such as allocating profitable trades to preferred clients, and (3) conflicts favoring some lines of business over others, such as unloading bad initial public offerings on retail brokerage customers to support an investment banking client. Various components of financial services can be bundled and unbundled over time, resulting in shifts in business models. With these new business models, the potential for new principal-agent conflicts arises.

When a financial services firm has multiple lines of business, they often have different categories of customers. In the normal course of business, some business lines will be more profitable than others. With respect to customers, some are more sophisticated and better at monitoring potential principal-agent issues than others, and some customers have the leverage to bargain for contractual terms that tend to mitigate those issues. In such circumstances, there is a temptation to favor the more profitable lines of business at the expense of the less profitable lines of business, especially if the customers of the more profitable businesses are less sophisticated. The bundling of some functions into one entity, and the outsourcing of other functions, may sometimes be driven by such opportunities.

Adding to the complexity of the issues facing regulators are the continuing advances in information technology and connectivity. Financial services is an information-intensive industry, and the digital era continues to promote rapid changes in business models. In particular, technology has had a significant impact on the cost structures of various components of financial services, thereby prompting various forms of unbundling, rebundling, outsourcing, disintermediation, and reintermediation. Reductions in transaction costs can change the optimal organizational boundaries of any given financial services firm, since it may be cheaper to outsource and monitor the quality of certain functions than to provide them internally.[35]

Although the Exchange Act is nearly 90 years old, Congress, in its great wisdom, embedded a degree of nimbleness by authorizing the Commission to facilitate innovation through rules and exemptive applications. The Commission also has discretion as to how to enforce the federal securities laws. The Commission’s staff, while not empowered to act on behalf of the Commission absent an express delegation of authority, has issued no-action letters and guidance. These efforts are attempts to respond to changing market conditions in a manner that benefits market participants. One particularly effective method is to draft principles-based regulations that anticipate innovation and avoid prescriptive measures that might become outdated. Effective regulators provide timely guidance to the industry when faced with new technological approaches. More importantly, effective regulators can play a significant role in helping market participants consider different perspectives on the interaction of technology and regulation.

Amidst these complexities, the financial markets are fraught with network externalities. An economist might describe a network externality as arising in any framework, standard, system, or technology where the more people use it, the greater the benefit to each user. The classic example is the telephone system. When only a small number of persons had access to the telephone system, it offered little utility. As more persons were added, it became exponentially more useful for all of them. In a way, this is the opposite of the “scarcity” assumption that drives so much of economic analysis.
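To see the arithmetic behind this dynamic, consider how the number of possible connections grows with the user base. The sketch below is a minimal illustration of the standard Metcalfe-style approximation (an assumption offered for illustration, not a claim from these remarks), under which the value of a network grows roughly with the square of its users while the cost of adding a user grows only linearly.

```python
def possible_connections(num_users: int) -> int:
    """Count the distinct pairs that can transact in a network of num_users."""
    return num_users * (num_users - 1) // 2

# Each new user adds a potential connection for every existing user,
# so value grows roughly with n^2 -- the demand-side economies of scale
# that Shapiro and Varian describe below.
for n in (2, 10, 100, 10_000):
    print(f"{n:>6} users -> {possible_connections(n):>12,} possible connections")
```

Doubling the user base roughly quadruples the number of possible connections, which is the self-reinforcing dynamic behind “liquidity begets liquidity.”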

The trading of securities, and the phenomenon of liquidity itself, involve network externalities. The more people who participate in a trading venue, the more likely they are to find counterparties, and the more liquidity there will be for all participants. As market participants frequently say, “liquidity begets liquidity.”

Network externalities, when combined with traditional economies of scale, can make the financial markets susceptible to oligopolistic or monopolistic tendencies. For example, there might be a propensity for one exchange platform to dominate. Should that result occur, there could be negative ramifications, such as a reduction in innovation and more rent-seeking behavior.

In their book Information Rules, economists Carl Shapiro and Hal Varian describe the economic pattern inherent in a network as follows:[36]

Technologies subject to strong network effects tend to exhibit long lead times followed by explosive growth. The pattern results from positive feedback: as the installed base of users grows, more and more users find adoption worthwhile. Eventually, the product achieves critical mass and takes over the market. . . . But network externalities are not confined to communications networks. They are also powerful in “virtual” networks . . . Because these virtual networks of compatible users generate network externalities, popular hardware and software systems enjoy a significant competitive advantage over less popular systems. As a result, growth is a strategic imperative, not just to achieve the usual production side economies of scale but to achieve the demand side economies of scale generated by network effects. . . . Once you have a large enough customer base, the market will build itself.

In a networked context, regulation may play an important role in facilitating competition, especially given the contrary dynamic provided by network externalities. For example, in railroad law there is “common carrier” regulation, which is an attempt to ensure some competitive uses of that network.[37] Similarly, the Telecommunications Act of 1996 attempted to promote competition through ensuring competitive access to what was known as the “last mile.”[38]

The point here is a simple one. An important objective of securities regulation is to promote competition. However, because of the power of network externalities, it can be difficult to achieve the proper balance. Even if the regulator can strike a balance that succeeds in promoting competition, technological change can throw off that balance by altering cost structures, and the entire network can be affected by changing types of information connections and flows. Thus, a competitive ecosystem can be fragile.

In short, the financial markets are characterized by asymmetries of information, evolving issues of agency cost, positive and negative feedback loops, network externalities, and informational intensity. The financial markets are highly susceptible to changes in technology and communications, which can quickly change the economic incentives for the outsourcing and insourcing of functions. Thus, the financial markets constitute a very noisy system that is hard to fully grasp in real time, evolves rapidly, and, at best, presents a moving target for developing useful models to predict the impact of regulatory changes.

The Vital Contribution of Economics to Securities Regulation

In the midst of these challenges, how can economists contribute to the discussion – even if the complexities of the financial markets make it difficult to develop corroborated content of a model? My view is that economists are important to the Commission’s policy considerations, particularly in light of these analytic challenges. Economists serve as gatekeepers, helping to inform the Commission as to whether a proposed regulatory action is consistent with its statutory requirement to consider whether the action will promote efficiency, competition, and capital formation. It is imperative that the agency’s economists conduct their analyses in an objective manner and that the Division’s cost-benefit analysis is not simply a post hoc exercise to justify a predetermined policy outcome.

First, economists have a comparative advantage in analyzing the impacts of incentive structures, and estimating the consequences, including unintended consequences. The results of that analytic and empirical work, as applied to financial markets, can be nuanced and often unexpected. In particular, we need economists to employ their analytic and evidence-based skills in tracing and measuring the impact of shifting conflicts of interest that can potentially undermine the markets’ ability to facilitate capital formation and harm investors.

Second, economists, with their various models, can bring to bear insights and hypotheses that may differ from those of lawyers and accountants. There needs to be competition among a wide variety of hypotheses, and evidence-based criticism thereof, which is an activity in which economists are well positioned to engage. It is vital that multiple perspectives are brought to bear on the costs and benefits of the Commission’s rules.

Third, economists not only need to do quality cost-benefit analysis prior to the adoption of a rule, but they need to regularly conduct ex post cost-benefit analysis as well. The Commission should be regularly reviewing its rules and regulations across time to identify those which are excessively costly in light of the benefits they bring and, more importantly, those which may be counterproductive in terms of unintended consequences.

Fourth, the Commission’s economists need to interact on a regular basis with other interested economists through both internal and external research efforts. These interactions cannot be limited solely to economists in academia; they must also include economists in industry and in government. Today’s gathering is a testament to that interaction.

Precisely because of the challenges in financial regulation, the Commission needs exposure to as many schools of thought pertinent to its mission as possible – and all of them need to be constantly pressure-tested through evidence-based criticism.

To the greatest extent possible, financial regulators such as the Commission should engage in evidence-based policymaking. However, given the analytical challenges inherent in the securities markets, this process is inevitably judgment laden. Exercising that judgment is the responsibility of the five Commissioners and they need to be provided with the full range of arguments supported by the most reliable and indicative evidence. Economists—with their toolset and skills—can play an important role in that process.

Friedman had some interesting comments on the need for judgment in the application of positive economic analysis. He noted that, regardless of how successful economists might be in developing a model, there will inevitably remain room for judgment in applying it. He further stated:[39]

The capacity to judge that these are or are not to be disregarded, that they should or should not affect what observable phenomena are to be identified with what entities in the model, is something that cannot be taught; it can be learned but only by experience and exposure in the “right” scientific atmosphere, not by rote. It is at this point that the “amateur” is separated from the “professional” in all sciences and that the thin line is drawn which distinguishes the “crackpot” from the scientist.

Policy Insights Based on the Analytical Challenges of Regulating Financial Markets

As a final note, I would like to return to articulating how the Commission should approach regulation. Gradualism is a key component. At any point in time, there is a delicate balance of factors and drivers supporting the financial market ecosystem. The Commission is not in a position to fully grasp it, because it is in constant flux. This is precisely why the Commission should not try to control the capital formation process through any form of merit regulation, but rather provide a regulatory framework that facilitates a competitive and decentralized knowledge process that constitutes the markets themselves. This is the essence of capitalism, and it is driven by the recognition that a centralized planning system lacks sufficient information and knowledge to optimize the allocation of scarce resources.

Gradualism also is pragmatic in light of the often unforeseen and unintended effects of regulatory policy on the financial markets. It is reminiscent of the quote from Prussian military theorist Carl von Clausewitz: “war is the realm of uncertainty; three quarters of the factors on which action is based are wrapped in a fog of greater or lesser uncertainty.”[40] His observation could be equally pertinent to financial regulation.

Recognizing the limits of its knowledge, the Commission should remain focused on addressing key market externalities and not attempt to re-order the decentralized pricing system that has served the capital markets so efficiently and is vital to the nation’s continued prosperity. In regulating the financial markets, the Commission should consider refraining from addressing theoretical concerns with respect to aspects of the markets that are generally working well. The financial markets are a delicate ecosystem, and the danger of unintended consequences—the unknown unknowns—always looms large.

Economic analysis and insight should be part of every stage of the rulemaking process. It is needed to identify the issues in the financial markets that might need to be addressed from a regulatory perspective. It is needed to contemplate alternative ways of addressing the issue. It is needed as the Commission develops rule proposals. It is needed when the Commission considers rule adoptions. It is needed within the Commission’s enforcement and examination programs. And it is certainly needed ex post—to look back at the Commission’s various rules, regulations, and forms, including the interaction among them—to ensure that they are effective and cost-efficient.

This is a long-winded way of saying thank you for the great work that economists are doing to advance understanding of the financial markets and to thank you for this conference.


[1] See Harold Bierman, Jr., The 1929 Stock Market Crash, Economic History Association, available at https://eh.net/encyclopedia/the-1929-stock-market-crash/.

[2] Great Depression Facts, FDR Presidential Library & Museum (National Archives), available at https://www.fdrlibrary.org/great-depression-facts.

[4] Banking Act of 1933, 12 U.S.C. § 227.

[5] 15 U.S.C. § 77a et seq.

[6] 15 U.S.C. § 78a et seq.

[7] W. Stanley Jevons, The Theory of Political Economy, 2nd ed. (London: Macmillan, 1879), (1st ed., 1871), at 86.

[8] See George A. Akerlof, The Market for ‘Lemons’: Quality Uncertainty and the Market Mechanism, Quarterly Journal of Economics, Vol. 84, No. 3 (1970) at 490 (given asymmetrical information, “[i]t has been seen that the good cars may be driven out of the market by the lemons. But in a more continuous case with different grades of goods, even worse pathologies can exist. For it is quite possible to have the bad driving out the not-so-bad driving out the medium driving out the not-so-good driving out the good in such a sequence of events that no market exists at all.”).

[9] See, e.g., John C. Coffee, Jr., Law and the Market: The Impact of Enforcement, 156 U. Pa. L. Rev. 229, 230 (2007) (“higher enforcement intensity gives the U.S. economy a lower cost of capital and higher securities valuations”). For an excellent discussion, see also Ziven Scott Birdwell, The Key Elements for Developing a Securities Market to Drive Economic Growth: A Roadmap for Emerging Markets, Ga. J. Int’l & Comp. L. (Spring 2011).

[10] Bernard Black, The Core Institutions that Support Strong Securities Markets, 55 Bus. Law. 1565, 1567-68 (2000).

[11] Public Law No. 104-290 (1996).

[12] 15 U.S.C. § 77b (applying “whenever pursuant to this title the Commission is engaged in rulemaking and is required to consider or determine whether an action is necessary or appropriate in the public interest”). The U.S. Court of Appeals for the D.C. Circuit has interpreted this requirement to mean that – if the Commission fails to consider a rule’s economic consequences – then the adoption of the rule is arbitrary and capricious under the Administrative Procedure Act. See, e.g., Bus. Roundtable v. SEC, 647 F.3d 1144 (D.C. Cir. 2011), available at https://casetext.com/case/business-rountble-v-sectis-ex-comm-10-1305-dc-cir-7-22-2011. In light of these requirements, the Commission’s internal guidance provides that “[r]ulewriting staff should work with [the agency’s] economists to identify relevant potential benefits and costs of [a] proposed rule…” See Memorandum from the Div. of Risk, Strategy, and Fin. Innovation (RSFI) and the Office of Gen. Counsel, Current Guidance on Economic Analysis in SEC Rulemakings (Mar. 16, 2012), available at https://www.sec.gov/divisions/riskfin/rsfi-guidance-econ-analy-secrulemaking.pdf.

[13] A number of the methodological insights and examples in the following subsection are articulated in Robert M. Fisher, Field of Gourds: A Guide to Intellectual Rebellion (“Field of Gourds”), CreateSpace (2012).

[14] David Hume, A Treatise of Human Nature (1739). For an exposition on Hume’s view, see, e.g., “The Problem of Induction,” Stanford Encyclopedia of Philosophy, available at https://plato.stanford.edu/entries/induction-problem/.

[15] Id.

[16] Bertrand Russell, The Problems of Philosophy (1912), Chapter VI: On Induction.

[17] Milton Friedman, Essays in Positive Economics, University of Chicago Press (1953) at 4.

[18] Id. at 8.

[19] See Karl R. Popper, Conjectures and Refutations: The Growth of Scientific Knowledge (“Conjectures and Refutations”), Routledge and Kegan Paul (1963).

[20] Field of Gourds at 30, 31.

[21] Id.

[22] Conjectures and Refutations, at 113.

[23] Conjectures and Refutations at 36 (“[a] theory which is not refutable by any conceivable event is non-scientific”).

[24] See discussion on Popper in Field of Gourds at 73-86.

[25] Field of Gourds at 32-33.

[26] Milton Friedman, Essays in Positive Economics, University of Chicago Press (1953) at 9.

[27] For discussion, see chapter on “The Falsification of Falsificationism” in Field of Gourds at 93-105.

[28] See Imre Lakatos and Alan Musgrave, eds., Criticism and the Growth of Knowledge, Cambridge University Press (1970). See also Imre Lakatos, Proofs and Refutations, Cambridge University Press (1976). For a discussion of Lakatos’ approach as applied to economics, see Robert M. Fisher, The Logic of Economic Discovery: Neoclassical Economics and the Marginal Revolution, New York University Press (1986), at 10-33.

[29] Milton Friedman, Essays in Positive Economics, University of Chicago Press (1953) at 10.

[30] Id. at 39-40.

[31] See the tick-size pilot program struck down by the U.S. Court of Appeals for the D.C. Circuit in New York Stock Exchange LLC, et al. v. SEC, No. 19-1042 (D.C. Cir. June 2020), where the court stated that “[t]he Pilot Program was not a trial run of a new regulation. Rather, it was designed ‘to gather data’ so that the Commission might be able to determine in the future whether regulatory action was necessary,” at 2-3. Among other things, the court found that “nothing in the Commission’s rulemaking authority authorizes it to promulgate a ‘one-off’ regulation … merely to secure information that might indicate to the SEC whether there is a problem worthy of regulation,” at 6 (italics in original). Available at https://www.cadc.uscourts.gov/internet/opinions.nsf/BE5AD5AD3C0064408525858900537163/$file/19-1042-1847356.pdf.

[32] See, e.g., Nassim Nicholas Taleb, The Black Swan: The Impact of the Highly Improbable, Random House (2007). Much of Taleb’s tome entails a critique of normal distributions, at least in certain contexts. See, e.g., the discussion at pages 35 to 37.

[33] See, e.g., Gerardo Manzo and Jeffrey N. Saret, Asset Class Correlations: Return to Normalcy, Two Sigma (finding that correlations spiked at the onset of the financial crisis in late 2008 after a decade of relative stability, with pairwise equity correlations reaching nearly 70 percent from a pre-crisis level of approximately 40 percent), available at https://www.twosigma.com/articles/asset-class-correlations-return-to-normalcy/.

[34] See Nassim Nicholas Taleb, Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets, 2nd ed., Random House Trade Paperbacks (2005).

[35] See Oliver E. Williamson, The Economics of Organization: The Transaction Cost Approach, American Journal of Sociology (1981).

[36] Carl Shapiro and Hal R. Varian, Information Rules: A Strategic Guide to the Network Economy, Harvard Business School Press (1999).

[37] This body of law is complex and evolving. For a brief and useful discussion of the statutory history, see Report to Congress: Shared-Use of Railroad Rights-of-Way, Federal Railroad Administration, U.S. Department of Transportation (2019), at 3-4, available at https://railroads.dot.gov/sites/fra.dot.gov/files/fra_net/18863/Report%20to%20Congress%20Shared-Use%20of%20Railroad%20Rights-of-Way%20July%202019.pdf.

[38] For an excellent discussion, see Michael I. Meyerson, Ideas of the Marketplace: A Guide to the 1996 Telecommunications Act, Federal Communications Law Journal (1997).

[39] Milton Friedman, Essays in Positive Economics, University of Chicago Press (1953) at 25.

[40] Carl von Clausewitz, On War, trans. and eds. Michael Howard and Peter Paret (Princeton, NJ: Princeton University Press, 1976). For an in-depth discussion of Clausewitz, see, e.g., Martin Samuels, The “Finely-Honed Blade”: Clausewitz and Boyd on Friction and Moral Factors (2020), Official Website of the Marines, available at https://www.usmcu.edu/Outreach/Marine-Corps-University-Press/Expeditions-with-MCUP-digital-journal/The-Finely-Honed-Blade/.
