U.S. Securities and Exchange Commission

SEC Advisory Committee on Market Information Subcommittee on Alternative Models Summary Minutes

March 26, 2001 Meeting

The Subcommittee on Alternative Models ("Subcommittee") of the SEC Advisory Committee on Market Information ("Advisory Committee") held its first meeting on March 26, 2001. Professor Donald Langevoort, Chairman of the Subcommittee ("Chairman"), began the meeting by discussing the "deliverable" the Subcommittee would need to produce for the Advisory Committee in connection with its May 14 meeting. The May 14 Advisory Committee meeting will focus on alternative models and, at minimum, the Subcommittee will need to structure an agenda for that meeting. To the extent the Subcommittee can reach consensus on an alternative "competing consolidators" model, those views should be presented to the Advisory Committee as well.

The Chairman then turned to the five key "decision points" for the Subcommittee described in the agenda. He suggested that the first decision point - whether there should be some mandatory minimum of consolidated data - be taken off the table for the time being. That issue is being considered by the full Advisory Committee, and the Chairman's sense was that members' views on this would not be affected by whether or not there were competing consolidators. Members agreed, but some noted the need for more flexibility in the Display Rule and the importance of fair pricing and non-discriminatory access.

The Chairman also suggested that an in-depth discussion of the second decision point - the extent of appropriate price regulation - be deferred temporarily by the Subcommittee, as this issue will be considered by the full Advisory Committee at its next meeting.

Instead, the Chairman wished to begin with the last two decision points - whether or not, in a competing consolidators model, the SEC would need to: (1) set qualification standards for market data providers and monitor their capacity and performance on an ongoing basis; and (2) take a role in the design of protocols and conventions to facilitate the transmission of data and assure matters such as appropriate sequencing. Would the introduction of competing consolidators lead to concerns about the quality and reliability of market data and, if so, how could these concerns be minimized?

The discussion began with a presentation by Tom Demchak - the Vice President of SIAC responsible for the CTA, CQ, and OPRA market data systems - on four key operational considerations for ensuring the quality and reliability of data: (1) accurate sequencing of data; (2) capacity concerns; (3) protocols for data formats; and (4) validation criteria. As market data is received by SIAC, it is validated and then sequenced. A BBO is generated from the nine participating market centers, beginning at 9:30 a.m., using price/size/time priority. On the transaction side, a database is maintained with the open, high, low, and last sale data on a consolidated basis. The SEC, through its Automation Review Policy (ARP) program, annually reviews selected aspects of SIAC's operations. (But the ARP reviews are limited in scope, and focus only on certain operating standards - not technical standards.)
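
To illustrate the price/size/time priority mechanism described above, the following Python sketch shows one way a consolidator might derive a BBO from per-market quotes. The Quote fields, the tie-breaking order, and the sample figures are simplified assumptions for illustration, not a description of SIAC's actual systems.

    from dataclasses import dataclass

    @dataclass
    class Quote:
        market: str       # participating market center identifier
        bid: float
        bid_size: int
        offer: float
        offer_size: int
        timestamp: float  # receipt time, in seconds

    def consolidated_bbo(quotes: list[Quote]) -> tuple[Quote, Quote]:
        """Select the best bid and best offer across market centers,
        breaking ties by larger size, then by earlier timestamp."""
        best_bid = max(quotes, key=lambda q: (q.bid, q.bid_size, -q.timestamp))
        best_offer = min(quotes, key=lambda q: (q.offer, -q.offer_size, q.timestamp))
        return best_bid, best_offer

    # Hypothetical example: three market centers quoting the same security
    quotes = [
        Quote("A", 20.10, 500, 20.15, 300, 1.0),
        Quote("B", 20.10, 800, 20.16, 200, 2.0),
        Quote("C", 20.09, 900, 20.15, 400, 0.5),
    ]
    bid, offer = consolidated_bbo(quotes)
    print(bid.market, bid.bid, offer.market, offer.offer)  # B 20.1 C 20.15

In this sketch, markets A and B tie at the best bid price, so the larger size at B wins; a further tie would be resolved by the earlier timestamp.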

SIAC applies a time stamp (to the second) when data hits its system, and guarantees sequencing from that time. With competing consolidators, steps would need to be taken to assure accurate sequencing at each consolidator. For example, today, when vendors take data from SIAC's feed and convert it into their value-added products, sequencing errors can occur. In a competing consolidators environment, the risk of sequencing errors or gaps in data will increase, so steps will need to be taken to mitigate this. Mr. Ketchum pointed out that some responsibility for sequencing has to fall back on the generating market/data provider, and a role for the competing consolidator would be to watch for errors generally from the inputting markets. If necessary, Mr. Ketchum believed a consolidator should have the ability to drop an inputting market for a period of time until the quality of its data improves. Mr. Roiter noted that we will have to accept some level of error in the system, with a key decision being determining the appropriate tolerance level. Mr. Bernard (NYSE) believed that a real source of concern with competing consolidators is the potential for different sequencing created by differing computer systems. Mr. Demchak pointed out that one way to minimize this risk would be by establishing network-based protocols. More generally, Mr. Ketchum argued that, as long as minimum capability standards can be established for competing consolidators, the level of risk should be manageable.
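
The sequencing and gap concerns discussed above could, for illustration, be checked per inputting market at each consolidator. The sketch below is a hypothetical Python example; the tolerance level and the drop policy are the kinds of parameters the Subcommittee debated, not established standards.

    from collections import defaultdict

    class SequenceMonitor:
        """Track per-market message sequence numbers at a consolidator and flag
        any inputting market whose gap/out-of-order rate exceeds a tolerance."""

        def __init__(self, tolerance: float = 0.001):
            self.tolerance = tolerance        # acceptable error fraction (hypothetical)
            self.last_seq = {}                # market -> highest sequence number seen
            self.errors = defaultdict(int)    # market -> gaps or out-of-order messages
            self.total = defaultdict(int)     # market -> total messages received

        def on_message(self, market: str, seq: int) -> None:
            self.total[market] += 1
            last = self.last_seq.get(market)
            if last is not None and seq != last + 1:
                self.errors[market] += 1      # gap or out-of-order delivery
            self.last_seq[market] = seq if last is None else max(last, seq)

        def should_drop(self, market: str) -> bool:
            total = self.total[market]
            return total > 0 and self.errors[market] / total > self.tolerance

A consolidator using something like this could, as Mr. Ketchum suggested, temporarily drop an inputting market once should_drop returns True and restore it when its data quality improves.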

With respect to capacity concerns, Mr. Demchak pointed out that SIAC devotes substantial resources to this. With a competing consolidators model, it may be necessary to take steps to assure that each consolidator has sufficient capacity to meet marketplace standards. In addition, data format protocols, as well as benchmark validation criteria, may be advisable.

The Chairman then asked the Subcommittee for input on the extent to which government oversight would be necessary to ensure minimum quality standards are met by competing consolidators. Mr. Ketchum suggested that the Subcommittee survey vendors on concerns they might have with a competing consolidators model (e.g., the need to establish standard protocols). He also expressed concern that if the SROs were to take some responsibility for overseeing the quality of competing consolidators, and they cut off a competing consolidator for underperforming, it might lead to a "denial of access" proceeding. Mr. Feuer indicated that Reuters would be more comfortable with the SEC establishing minimum standards for competing consolidators, at least initially.

If market discipline were to be relied upon to ensure data quality (i.e., if we were to look primarily to the marketplace to effectively weed out underperforming consolidators), an important issue would be the ease with which users could switch from one consolidator to another. Mr. Tom Haley - the CTA/CQ Chair and the NYSE Vice President in charge of Network A contract administration - explained the process through which users today connect to the consolidated SIAC feed. Prospective data users are sent paperwork to review, including a contract, the rate structure, technical specifications, and an instruction package. Before service can begin, the contract must be executed, there must be conformance to the technical specifications, and appropriate arrangements must be made with a telecommunications provider. (Making the telecom arrangements usually is the most time-consuming step.)

Several members suggested that a new class of regulated consolidator might be created to help assure certain minimum quality standards are met. The existing standards for regulating exclusive SIPs, which apply to SIAC and Nasdaq today, could serve as a starting point for this, and be expanded to cover non-exclusive SIPs. But there was some recognition that these relatively light standards would need to be enhanced. Ms. Dwyer argued that any regulatory standards should apply only to the extent a consolidator is distributing the mandatory minimum level of data. Mr. Ketchum believed that either the SEC or the SROs would have to assure that competing consolidators have operating systems that meet basic quality standards, so that there is a reasonable likelihood they will produce reliable streams of appropriately sequenced data.

Mr. Roiter suggested that the regulation of competing consolidators impose relatively light initial registration requirements, so as to minimize the barriers to entry. Analogizing to broker-dealer registration, Mr. Roiter believed that the regulatory emphasis should be on the ongoing oversight of competing consolidators, with the SEC having the ability to go after underperformers. Perhaps the SEC could prescribe initial performance-based standards (e.g., minimum capacity requirements, maximum error rate in sequencing, maximum outage rate) that would be tested by the SEC after a three- to six-month period. Thereafter, the SEC might perform some sort of annual "fiscal check-up" of the consolidator. Mr. Roiter emphasized that the market would play a significant role in determining which consolidators stay in business - the role of the SEC simply would be to assure that certain baseline standards are maintained. And the SEC would rely as much as possible on self-reporting and self-audit by the consolidators.
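
For illustration only, the performance-based standards Mr. Roiter described (minimum capacity, maximum sequencing error rate, maximum outage rate) could be expressed as checkable thresholds along the following lines. The figures and field names in this Python sketch are hypothetical placeholders, not proposed values.

    from dataclasses import dataclass

    @dataclass
    class BaselineStandards:
        # Hypothetical placeholder thresholds; actual figures would be set by the SEC.
        min_capacity_msgs_per_sec: int = 50_000
        max_sequencing_error_rate: float = 0.001   # fraction of messages mis-sequenced
        max_outage_rate: float = 0.0005            # fraction of trading time unavailable

    def meets_standards(capacity: int, error_rate: float, outage_rate: float,
                        std: BaselineStandards = BaselineStandards()) -> bool:
        """Check a consolidator's measured performance against each threshold."""
        return (capacity >= std.min_capacity_msgs_per_sec
                and error_rate <= std.max_sequencing_error_rate
                and outage_rate <= std.max_outage_rate)

Such a check might be run after the initial three- to six-month period and again at each annual review, relying as much as possible on the consolidator's own self-reported measurements.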

The Chairman then asked the Subcommittee whether there was consensus that there exists a "quality control" role for the SEC in the competing consolidators model, along the lines of that described by Mr. Roiter. While some noted that the role of the SEC should be as low-maintenance as possible, there did seem to be such a consensus.

After the lunch break, the Chairman shifted the discussion from quality control issues to pricing and access issues. He began by asking the Subcommittee whether, in a competing consolidators model, there would be an excessive level of risk either with respect to market data pricing or access.

Mr. Roiter began by referencing the NYSE's alternative model - where each SRO could separately price and sell its own data - and asking how competition in the pricing of data would be generated among SROs. Mr. Bernard responded that the potential for excessive pricing would be constrained in the same way it is today, including through the broad membership of the SRO boards - in effect, the users of data themselves set the prices. Mr. Nicoll noted that, in practice, this may not be an effective check. Mr. Bernard added that permitting the markets to separately price their data would have the positive effect of eliminating cross-subsidies from the system (as evidenced, for example, by the ability of certain SROs to use "excess" market data fees to pay for order flow). Ms. Dwyer pointed out that this showed there was too much money sloshing around the system. As to the concern about adequately funding the self-regulatory function, she suggested it would be better to unbundle these SRO costs and fairly allocate them among a market's users rather than trying to recover them indirectly through market data fees.

Mr. Nicoll stressed that, so long as the requirement for a mandatory minimum of consolidated market data exists, users will be unable to bargain effectively and promote competition among the SROs supplying the data. He strongly advocated the elimination of the Display Rule, and instead would rely on appropriate disclosures about the unconsolidated nature of the data to protect users.

Mr. Feuer suggested that one way to check the power of SROs to charge monopoly rents would be to permit alternative sources of market data - in particular, broker-dealers should be allowed to sell their data directly. Some members argued, however, that there would be significant practical problems with this. Ms. Dwyer believed there should be a requirement for "most favored nation" pricing by the SROs with respect to similarly-situated users. (Mr. Bernard noted that the NYSE believes it currently practices MFN pricing.) The Subcommittee also discussed, as an alternative way to constrain the pricing power of the SROs, the possibility of limiting the requirement to offer consolidated data to the time immediately prior to the point at which an order is placed.

The Chairman then asked each member of the Subcommittee to summarize the day's discussion. Mr. Roiter said that there appeared to be no significant technological barriers to moving toward a competing consolidators model. With respect to regulatory matters, there seemed to be agreement on a role for the SEC in assuring a baseline level of quality of service and integrity of data. But regulation should be conducted in a manner that minimizes barriers to entry by competing consolidators. As to matters of pricing, no consensus seemed to have developed. Perhaps members would agree that, to the extent there is a mandatory minimum requirement for consolidated data, each market center retains some pricing leverage. Also, there appears to be consensus that pricing should be non-discriminatory among similarly-situated users.

Mr. Feuer agreed with Mr. Roiter's conclusions, but thought the Subcommittee should strive to figure out a way to ensure that prices set through competing consolidators are economically valid (i.e., not reflective of monopoly rents). Ms. Drake (Archipelago) added that the Subcommittee should revisit the continued need for the Display Rule. Ms. Dwyer agreed with Mr. Roiter's comments, and suggested that a productive way forward might be to ask each member to outline, in one page, a competing consolidators model that addresses each of the issues in the agenda.

Mr. Ketchum concurred, and noted that there seemed to be consensus on the "doability" of a competing consolidators model, and that the Subcommittee ought to be able to develop a rational non-discrimination standard. In addition, it would be worthwhile to try to find a middle ground on the need for consolidated data, perhaps limiting the requirement to discrete points in time, such as when an order is placed. Mr. Bernard was attracted to that idea intellectually - people shouldn't be forced to take more data than is needed. At minimum, the Display Rule should be made more flexible.

Mr. Quick believed the current model has worked remarkably well and would not support moving away from the single consolidator model. However, he might be willing to consider competitive bidding for the single consolidator function. Mr. O'Kelly argued that the consolidation function currently is being performed at a very low cost, and it is unlikely that competing consolidators will lower those costs materially. To the extent there are concerns about "special deals" being made, these are factual matters that can be addressed simply by disclosure. Mr. Nicoll again stressed that the Display Rule should be revisited. Concerns about monopoly rents will not be put to rest as long as there is a requirement to provide an NBBO - the only way to achieve fair pricing is to eliminate the mandatory minimum.

The Chairman concluded the meeting by asking the members to prepare the one-page proposal suggested by Ms. Dwyer. He will circulate an e-mail shortly with details of this and other follow-up matters. Finally, the Chairman indicated that alternative dates for the second Subcommittee meeting may be reviewed.
