This document is an HTML formatted version of a printed document. The printed document may contain agency comments, charts, photographs, appendices, footnotes and page numbers which may not be reproduced in this electronic version. If you require a printed version of this document contact the United States Securities and Exchange Commission, Office of Inspector General, Mail Stop 11-7, 450 Fifth Street N.W., Washington, D.C. 20549 or call (202) 942-4460.

CATS2000 DATA

Audit No. 331
January 30, 2002

EXECUTIVE SUMMARY

The Office of Inspector General conducted an audit of CATS data. We found that the timeliness, accuracy and completeness of data in the Case Activity Tracking System (CATS) need improvement. Our recommendations include issuing guidance to staff, periodically reviewing system data, and correcting data errors.

During the audit, the Division of Enforcement (Division) established a Steering Committee to improve CATS data reliability. Also, the Division, the Office of Information Technology (OIT), and the Office of the Secretary (OS) are considering updating CATS with some data from OS's computer systems, through an automated interface. The interface (i.e., linkage) could eliminate redundant data entry and improve CATS data reliability. Lastly, during the audit, the Division of Enforcement began to implement our recommendations.

We commend the Division of Enforcement, OIT, and OS for their proactive efforts.

AUDIT OBJECTIVES AND SCOPE

Our audit objective was to evaluate the reliability (i.e., the timeliness, accuracy, and completeness) of selected CATS data. The audit scope included an analysis of data related to Matters Under Inquiry (MUI), investigations, administrative proceedings (APs), and civil actions. We did not test internal controls of the CATS system or the interface between the Name Relationship Search Index (NRSI) and CATS.1

We used the Commission's NRSI computer system to perform our audit work instead of the CATS system. As explained in the Background Section, CATS data are accessed through NRSI.

During the audit, we interviewed Commission staff and performed tests of data reliability. We reviewed supporting documentation for several judgmental samples of Commission civil actions and APs (described in the Appendix), as well as a sample of MUIs.

The audit was performed between March and August 2001 in accordance with generally accepted government auditing standards.

BACKGROUND

The CATS computer system is used by the Division of Enforcement to record Enforcement data and to create management reports. The system tracks MUIs, investigations, actions filed, related party information, and other enforcement-related data. The current system, CATS2000, was implemented in June 1999 primarily because the prior CATS system was not "Y2K" compliant.

The Division of Enforcement has created standardized data entry forms for CATS. The Enforcement attorney working on a case completes a form and gives it to administrative staff for data entry. However, if CATS is linked to the Office of the Secretary's systems, some data will no longer need to be recorded by Enforcement staff.

CATS interfaces with NRSI, which acts as a search engine. When staff request information, NRSI searches CATS and various non-enforcement computer systems (e.g., Workload, the Entity Filing and Fee system) and gathers the information.
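
For illustration only, the search pattern described above resembles a simple federated query: a single front end fans a name search out to several back-end systems and merges the results. The Python sketch below is a generic model of that pattern; the system names, record fields, and query functions are assumptions, not actual NRSI or CATS internals.

    # Generic model of a federated name search like the one NRSI performs.
    # The systems, fields, and results below are hypothetical.
    from typing import Callable

    Record = dict
    SearchFn = Callable[[str], list[Record]]

    def search_cats(name: str) -> list[Record]:
        # Stand-in for a query against the enforcement (CATS) data.
        return [{"system": "CATS", "name": name, "type": "investigation"}]

    def search_workload(name: str) -> list[Record]:
        # Stand-in for a query against a non-enforcement system.
        return [{"system": "Workload", "name": name, "type": "examination"}]

    SYSTEMS: dict[str, SearchFn] = {
        "CATS": search_cats,
        "Workload": search_workload,
    }

    def federated_search(name: str) -> list[Record]:
        # Fan the query out to every registered system and merge the results.
        results: list[Record] = []
        for query in SYSTEMS.values():
            results.extend(query(name))
        return results

    print(federated_search("Example Corp."))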

AUDIT RESULTS

We found that the timeliness, accuracy and completeness of data in CATS need improvement. Our testing indicated that in some cases, data recorded in CATS were not timely or accurate, and some data fields were not completed. The Appendix describes our findings in more detail, except for the audit work performed on the MUI process.

During the audit, the Division of Enforcement established a Steering Committee to improve the data reliability of CATS. At the same time, the Division of Enforcement, OIT, and OS began exploring the feasibility of linking the CATS system with some OS computer systems, since the Enforcement Program and OS both record certain data. Their goals were to eliminate redundant data entry and improve data reliability.

The linkage project between CATS and OS's computer systems will be studied in phases. Possible links include formal order information, and data on civil actions and administrative proceedings. However, regardless of the project's success, Enforcement staff will still have to enter data on investigations and actions.

DATA RELIABILITY ISSUES

Some Enforcement attorneys do not complete the data entry forms timely. Also, the definitions of certain CATS data fields are apparently misunderstood, causing inaccurate data to be recorded.

Enforcement management indicated that reliable CATS data are important. Management uses the data to assess the effectiveness and efficiency of the Enforcement program. Also, other Commission staff rely on CATS data for information about Enforcement cases. Unreliable data could lead to poor decisions.

Recommendation A

The Division of Enforcement should explain to all Enforcement staff why reliable CATS data is important, and consider periodically reviewing the reliability of this data. It should hold the staff accountable for data reliability.

The Division has issued guidance to the staff explaining the importance of reliable CATS data.

Recommendation B

The Division of Enforcement should issue guidance to all Enforcement staff explaining how to determine the closing date for an action.

Recommendation C

The Division of Enforcement should correct the data entry errors (described in the Appendix) we found.

According to the Division of Enforcement, it has corrected these data errors.

TRACKING DATA ENTRY

When our audit began, CATS did not track when data were recorded, making it more difficult to monitor data entry and to identify delays and other problems. During the audit, an audit file function that records this information was developed.
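
For illustration, an audit file of this kind typically stamps each change to a record with the field changed, the old and new values, and the date and time of entry, which makes data entry delays measurable. The Python sketch below is a generic model of such a mechanism, not the actual CATS design; the record identifier and fields are invented.

    # Generic model of an "audit file": each change to a record is stamped
    # with the field, old and new values, and the time of entry.
    # The record identifier and fields are hypothetical.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class AuditEntry:
        record_id: str
        field_name: str
        old_value: object
        new_value: object
        entered_at: datetime

    @dataclass
    class AuditedRecord:
        record_id: str
        data: dict = field(default_factory=dict)
        audit_log: list = field(default_factory=list)

        def set(self, field_name: str, value: object) -> None:
            # Log the change and when it was entered, then apply it.
            self.audit_log.append(AuditEntry(
                self.record_id, field_name,
                self.data.get(field_name), value,
                datetime.now(timezone.utc),
            ))
            self.data[field_name] = value

    mui = AuditedRecord("MUI-0001")  # hypothetical identifier
    mui.set("status", "Open")
    mui.set("status", "Closed")
    for entry in mui.audit_log:
        print(entry.field_name, entry.new_value, entry.entered_at.isoformat())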

MUI OPENING AND CLOSING

Matters Under Inquiry enable Enforcement management to review all the readily available information about a matter in order to determine whether an investigation should be opened. Recording of MUI data in CATS provides notice to Commission staff that the Enforcement program has an interest in a matter.

While reviewing the reliability of CATS data, we noted problems in the opening and closing of MUIs.2 According to the Division of Enforcement's procedures (dated March 16, 1998),

It is important to open a MUI as soon as possible to provide notice to the rest of the Division and the Commission (through NRSI) that an inquiry is being conducted.

A MUI should be terminated after 80 hours...or after it has been opened for two months.

We reviewed a judgmental sample of 50 MUIs opened after October 1, 2000 by the Division of Enforcement and the field offices. In 12 instances (24%), the MUI was opened more than 30 days after the potential violation was initially identified.

We also reviewed a report of all open MUIs as of February 22, 2001. We found that approximately 460 of 580 MUIs (79%) had been open for more than 60 days, exceeding the two-month guideline.3
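
For illustration, a test of this kind can be reproduced by comparing each open MUI's opening date with the report date and counting those open longer than the 60-day (two-month) guideline. The dates in the Python sketch below are invented for the example.

    # Sketch of the aging test: count open MUIs that have been open more
    # than 60 days as of the report date. The dates below are invented.
    from datetime import date

    open_mui_dates = [date(2000, 11, 15), date(2001, 1, 5), date(2001, 2, 10)]
    report_date = date(2001, 2, 22)

    aged = [d for d in open_mui_dates if (report_date - d).days > 60]
    pct = 100 * len(aged) // len(open_mui_dates)
    print(f"{len(aged)} of {len(open_mui_dates)} open MUIs exceed 60 days ({pct}%)")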

Recommendation D

The Division of Enforcement should remind all Enforcement attorneys of the importance of timely opening and closing MUIs, and hold them accountable for complying with the Division's procedures.

Recommendation E

The Division of Enforcement should consider whether the existing procedures for opening and closing MUIs need revision.

MUI ORIGINS

In Audit Memorandum No. 18 (dated July 1999), we found that NRSI in some instances lacked reliable information on the origin of investigations (e.g., from an investor complaint, a referral from a self-regulatory organization or the examination program, a news article). In response to our recommendations, the Division of Enforcement reviewed the list of available case origins, and issued clarifying guidance to Enforcement staff.

We sought to determine whether the reliability of the case origin information in NRSI had improved. Based on our review of the 50 MUIs, we found 11 instances (22%) where the case origin appeared inaccurate. Also, in 5 instances (10%), the case origin was not recorded in CATS. Apparently, additional guidance is needed.

Recommendation F

The Division of Enforcement should provide additional guidance to Enforcement staff on case origins, including the handling of potentially ambiguous situations (e.g., when an investor complaint results in an examination, which then leads to an enforcement referral).

APPENDIX

RESULTS OF AUDIT TESTING

PURPOSE: We reviewed recent actions (e.g., Administrative Proceedings, Civil Actions, and appeals of initial decisions from the Commission's Administrative Law Judges) in order to evaluate the reliability of the CATS data.

METHODOLOGY: In order to evaluate the reliability of the data, different samples were judgmentally selected. The samples represent actions that occurred after September 2000. However, some events related to an action may have occurred prior to September 2000. For instance:

  • An Administrative Proceeding (AP) may have been instituted in September 2000, but the Formal Order could have been obtained prior to September 2000; or
     
  • A Civil Action may have been closed in September 2000, but the complaint could have been filed prior to September 2000.

RESULTS:

(A) We reviewed 150 actions, and found the following exceptions (as of when the audit work was performed):4

  • 12 instances where the actions were recorded as open. These actions should also have been closed. More specifically, in
     
    • 6 instances the actions have since been closed. It took an average of approximately 154 days to record the action as closed. Also, 4 of the 6 actions have a wrong closing date.
       
    • 6 instances the actions have not yet been recorded as closed.5 On average, it has been approximately 266 days since the actions should have been closed.
       
  • 10 instances where the actions were not recorded as open, but should have been. More specifically, in
     
    • 8 instances the actions were eventually opened. It took an average of approximately 123 days to record the action. Also in one of these instances, when the action was subsequently opened, it was also closed. However, it should not have been closed.
       
    • 2 instances the actions have not yet been opened. On average, it has been approximately 229 days since the opening should have been recorded.

  • 19 instances where the actions were not recorded as open or closed. The actions were simultaneously initiated and settled. More specifically, in

    • 4 instances the actions were eventually properly recorded. It took an average of approximately 59 days to record the opening and closing.6
       
    • 4 instances the actions were eventually opened, but they have not yet been closed. On average, it has been approximately 207 days since the action should have been closed.
       
    • 6 instances the actions were eventually opened, but the closing date was wrong. It took an average of approximately 127 days to record the opening and wrong closing date.
       
    • 5 instances the actions have not yet been opened or closed. On average, it has been approximately 228 days since the opening and closing should have been recorded.
       
  • 2 instances where the actions were not recorded as open or closed, but should have been. The actions were not simultaneously initiated and settled. The actions were subsequently opened and closed. It took an average of approximately 53 days to record the opening and closing. In addition, one of these actions has a wrong closing date.
     
  • 9 instances (not included in any of the instances described above) where the actions were recorded as open and/or closed. However, the wrong date was used.
     
  • 3 instances where the actions were recorded as "Authorized". However, the actions should have been recorded as open. More specifically, in
     
    • 2 instances the actions were eventually opened. It took approximately 36 days to record the action as open.
       
    • 1 instance the action has not yet been opened. It has been approximately 253 days since the opening should have been recorded.

(B) We reviewed 100 actions, and found the following exceptions (as of when the audit work was performed):7

  • 4 instances where the civil docket or AP number was wrong.
     
  • 38 instances where the Formal Order information was wrong. More specifically, in
     
    • 12 instances the Formal Order was issued; however, it was not recorded.
       
    • 17 instances the Formal Order was not properly recorded (i.e., wrong date).
       
    • 9 instances a Formal Order was recorded. However, it appears that a Formal Order was never actually obtained.

(C) We found numerous instances of actions not being properly described in CATS. The actions were described as "Active"; however, CATS has codes that would have better described the action (an illustrative version of this kind of consistency check is sketched after the list below). More specifically,

  • We reviewed the 50 APs (see footnote 4). We found 12 instances (as of March 11, 2001) where the AP was not settled, and was on the docket of the Administrative Law Judges (ALJ). The AP was recorded in CATS as "Active". However, there is a specific code for APs that are being litigated.
     
  • We reviewed all 32 pending appeals (as of March 16, 2001) of initial decisions from the ALJs. We found 31 instances where the AP was recorded in CATS as "Active". However, there is a specific code for APs that are being appealed. We also found one instance in which an AP was closed in CATS; however, the initial decision was being appealed.
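
For illustration, the status-code test in (C) amounts to a cross-check between the status recorded in CATS and the action's actual posture (on the ALJ docket, or on appeal). The Python sketch below models that check; the status codes, postures, and records are assumptions, not actual CATS codes.

    # Illustrative cross-check for (C): flag APs whose recorded CATS status
    # does not match their actual posture. Codes and records are hypothetical.
    aps = [
        {"ap": "AP-001", "cats_status": "Active", "posture": "on ALJ docket"},
        {"ap": "AP-002", "cats_status": "Litigated", "posture": "on ALJ docket"},
        {"ap": "AP-003", "cats_status": "Active", "posture": "on appeal"},
        {"ap": "AP-004", "cats_status": "Closed", "posture": "on appeal"},
    ]

    # Expected CATS status code for each posture (hypothetical mapping).
    EXPECTED = {"on ALJ docket": "Litigated", "on appeal": "On Appeal"}

    for rec in aps:
        expected = EXPECTED[rec["posture"]]
        if rec["cats_status"] != expected:
            print(f'{rec["ap"]}: recorded "{rec["cats_status"]}", '
                  f'expected "{expected}" ({rec["posture"]})')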

1 Enforcement staff indicated that occasionally the interface does not work properly. For instance, a deleted item in CATS may not be deleted in NRSI. According to OIT, the problem has since been fixed. The problem did not adversely affect our audit results.

2 A prior audit (No. 322) found that MUIs were not always timely opened after a referral from the examination program. Prior audits (Nos. 196 and 322) have found that MUIs were not always timely closed.

3 We did not determine whether the MUIs exceeded 80 hours worked.

4 The 150 actions consist of: 50 APs that were instituted, 50 complaints filed in civil actions, and 50 civil actions in which consent injunctions were entered.

5 Throughout this Appendix, we refer to actions as not being recorded "yet". This refers to not being recorded as of August 31, 2001.

6 With respect to two of these actions, we are unsure of when the action was closed. We used the earliest possible date in order to compute the average.

7 The 100 actions represent the actions described above, except for the 50 civil actions in which a consent injunction was entered.
