U.S. SECURITIES AND EXCHANGE COMMISSION

CYBERSECURITY ROUNDTABLE

Wednesday, March 26, 2014
9:30 a.m.

U.S. Securities and Exchange Commission
100 F Street, N.E., Washington, D.C.
Station Place 1 Auditorium

PARTICIPANTS:

Mary Jo White, Chair
Luis Aguilar, Commissioner
Daniel Gallagher, Commissioner
Kara Stein, Commissioner
Michael Piwowar, Commissioner
Cyrus Amir-Mokri
Thomas Bayer
Peter Beshar
Andrew Bowden
David Burg
James Burns
Mark Clancy
John Denning
Todd Furney
Mary E. Galligan
Mark Graff
David Grim
Keith Higgins
Roberta Karmel
Jonas Kron
Jimmie Lenz
Mark Manley
Douglas Meal
Javier Ortiz
Marcus Prendergast
Katheryn Rosen
Andy Roth
Karl Schimmeck
Ari Schwartz
Adam Sedgewick
Daniel M. Sibears
Thomas Sinnott
John Reed Stark
Craig Thomas
Leslie T. Thornton
David Tittsworth
Aaron Weissenfluh
Larry Zelvin

C O N T E N T S

Opening Remarks
Panel 1: Cybersecurity Landscape
Panel 2: Public Company Disclosure
Panel 3: Market Systems
Panel 4: Broker-Dealers, Investment Advisors, and Transfer Agents
Closing Remarks

P R O C E E D I N G S

MR. HIGGINS: Good morning. My name is Keith Higgins, and I'm the director of the Division of Corporation Finance here at the SEC. Beside me is Tom Bayer, Chief Information Officer at the SEC. To my left is Jim Burns, who is the deputy director in our Division of Trading and Markets. Later today we'll be joined by David Grim, who is the deputy director in Investment Management here at the Commission, as well as Andrew Bowden, who is the director of the Office of Compliance, Inspections, and Examinations.
Today it's our pleasure to moderate a staff roundtable discussion about the cybersecurity landscape and the cybersecurity issues faced by exchanges and other key market systems, broker-dealers, investment advisors, transfer agents, and public companies.

Before we go any further, I need to note that the views expressed by Tom, Jim, David, Drew, and me, as SEC moderators throughout the course of the day, are our views alone and don't necessarily reflect the views of the Commission, any of the Commissioners, or any other members of the staff.

Indeed, as moderators, we may at times ask questions or make statements that don't even necessarily reflect our own views but are offered up for the purpose of eliciting a spirited dialogue. We hope that our questions will contribute to a meaningful and constructive discussion. We're happy to have with us here this morning Chair Mary Jo White, Commissioners Aguilar, Gallagher -- and I think Commissioner Stein hopefully is on her way -- and Commissioner Michael Piwowar.

Let me start by saying welcome to those here in the audience as well as those who are attending by webcast. It's great to have you here. Thanks and welcome as well to our panelists, and a particular thank you to our speakers today who are lending their expertise and experience on this very important topic.

Cyber incidents appear to be escalating in frequency, duration, and complexity. Hardly a day goes by without a news story about another breach or a sophisticated mechanism for a cyber attack.
The participants we have here today reflect many diverse viewpoints and include representatives from the White House, the Department of Treasury, the Department of Homeland Security, the National Institute of Standards and Technology, the exchanges and other key market systems, broker-dealers, investment advisors, transfer agents, public companies, investors, law firms, consultants, academia, and trade associations.

I have little doubt that this group will lead us through a lively, thought-provoking, and informative discussion about the issues surrounding this very important topic that has generated a significant amount of public interest. Moreover, we hope this discussion today will be a catalyst for even more fruitful debate, and we urge those here as well as those listening in to join us and give us your thoughts by sending in comment letters.

As our press release indicated, we have both the web intake form as well as an email address by which you can submit comments, and we encourage everyone to use them. With that, I'd like to invite Chair White to make some opening remarks, after which each of the commissioners will have the opportunity to make some brief remarks.

Chair White?

CHAIR WHITE: Thank you very much, Keith. And let me say both good morning and add to Keith's welcome to everyone to today's roundtable on cybersecurity.

Cybersecurity threats come from many sources: criminal and hired hackers, terrorists, state-sponsored intruders, and even misguided computer experts testing what they are able to penetrate. Cyber threats also pose non-discriminating risks across our economy to all of our critical infrastructures, our financial markets, banks, intellectual property, and, as recent events have emphasized, the private data of the American consumer.

This is a global threat. Cyber threats are of extraordinary and long-term seriousness.
They are first on the Director of National Intelligence's list of global threats, even surpassing terrorism. And Jim Comey, director of the FBI, has testified that resources devoted to cyber-based threats are expected to eclipse resources devoted to terrorism. What emerges from this arresting view of the cybersecurity landscape is that the public and private sectors must be riveted, in lockstep, in addressing these threats.

The President's 2013 Cybersecurity Executive Order and the Cybersecurity Framework issued in 2014 by the National Institute of Standards and Technology are reflective of the compelling need for stronger partnerships between the government and the private sector.

The SEC's formal jurisdiction over cybersecurity is directly focused on the integrity of our market systems, customer data protection, and disclosure of material information. But it is incumbent on every government agency to be informed on the full range of cybersecurity risks and to actively engage to combat those risks in our respective spheres of responsibility. This roundtable is one aspect of the SEC's efforts to better inform ourselves, the marketplace, our fellow agencies, and the private sector as to what the risks are and how best to combat them.

As many of you know, we at the SEC have been focused on cybersecurity-related issues for some time. In connection with public company disclosures, in October 2011 our Division of Corporation Finance issued guidance on existing disclosure obligations related to cybersecurity risks and incidents to assist public companies in framing disclosures of cybersecurity issues. That guidance makes clear that material information regarding cybersecurity risks and cyber incidents is required to be disclosed.
Since we issued that guidance, our staff has continued to study the important and challenging issues that cybersecurity presents to public companies, market participants, and investors, including the intersection of our investor-focused disclosure requirements and the types of information those with national security responsibility need in order to better protect our critical infrastructure. I am looking forward to hearing the views on these topics.

Cybersecurity for SROs and large alternative trading systems also is a very important area of focus for our staff. Part of this focus involves the Commission's proposed rule on Regulation Systems Compliance and Integrity, which would require an entity covered by the rule to test its automated systems for vulnerabilities, test its business continuity and disaster recovery plans, notify the Commission of cyber intrusions, and recover its clearing and trading operations within specified time frames. I do expect the Commission to move ahead with Regulation SCI this year.

We have also focused on cybersecurity risk issues for registered investment advisers, broker-dealers, and funds, including, for example, data protection and identity theft vulnerabilities. In this area, the Commission last year adopted Regulation S-ID, which requires certain regulated financial institutions and creditors to adopt and implement identity theft programs. Regulation S-ID builds upon the SEC's existing rules for protecting customer data, in particular Regulation S-P.

I want to thank all of our panelists for participating today and sharing their views on these critical issues. There is no better way to proceed than by assembling the right people in the same room to discuss and share information, points of view, and best practices.
Each panel consists, as you will see, of a very impressive group of professionals who bring a great deal of expertise and a range of relevant perspectives.

In addition to our panelists, we are joined today by many others who are here in person or watching online. And, of course, we welcome your views as well. We have set up a comment file on our website for the public to submit views on cybersecurity issues or to respond to the questions addressed and the views expressed by our panelists. And I especially look forward to hearing the public's ideas and input. Your views are important to us. And we and the others here today will benefit immensely from hearing them as we study these issues.

Thank you and enjoy the roundtable.

MR. HIGGINS: Thank you, Chair White.

Commissioner Aguilar?

COMMISSIONER AGUILAR: Thank you, Keith. I would also like to start by welcoming each of the participants as well as the members of the audience joining us here in person today and those joining us by webcast. In recent months cybersecurity has become a top concern to American companies. Regulators and law enforcement agencies have also been spending a lot of time on this issue. This is, in part, because of the mounting evidence that the constant threat of a cyber attack is real, lasting, and cannot be ignored.

One of the most prominent examples of the wide-ranging and potentially devastating effects that could result from cyber attacks is the December 2013 data breach of Target Corporation. In addition, several large banks have repeatedly been the subject of denial of service attacks in which their public web sites have been knocked offline for hours at a time. And numerous government agencies have also experienced a series of cyber attacks.

Moreover, cyber attacks on financial institutions have become both more frequent and more sophisticated.
This is also true of cyber attacks on the infrastructure underlying the capital markets. For example, according to a 2012 global survey of securities exchanges, 89 percent identified cyber crime as a potentially systemic risk, and 53 percent reported experiencing a cyber attack in the previous year.

As an SEC Commissioner, I have become particularly concerned about the risk that cyber attacks pose to public companies and to the capital markets and their critical participants, including the exchanges, clearing agencies, transfer agents, broker-dealers, and investment advisers. Cyber attacks aimed at these market participants can have devastating effects on our economy, on individual consumers, and on the markets and investors that the SEC was created to safeguard.

There is no doubt that the SEC must play a role in this area. What is less clear is what that role should be. As many of you know, in 2011 the staff issued guidance to public companies about their disclosure obligations with respect to cybersecurity risks and cyber incidents. I hope that these disclosures have helped investors and public companies to focus on and assess cybersecurity issues. However, the increased pervasiveness and seriousness of the cybersecurity threat raises questions about whether more should be done to ensure the proper functioning of the capital markets and the protection of investors.

As I explored this issue, it became readily apparent to me that the Commission has much to learn about the specific risks that our regulated entities and public companies are facing. After conducting research into this area, I recommended that the Commission convene a roundtable so that we can begin to develop a better understanding of this growing problem. I am pleased that Chair White agreed with my recommendation and that she asked the staff to make this roundtable a reality.
The issues that will be discussed by today's four panels can roughly be broken down into two categories: issues potentially impacting public companies, and issues impacting the capital markets infrastructure and SEC-regulated entities.

With regard to the public company discussion, I am particularly interested in hearing whether the current disclosure regime under the 2011 guidance is working or how it could be improved.

The risks facing the capital market infrastructure and the risks potentially impacting regulated entities are a particular concern to the SEC. For instance, a cyber attack on an exchange or other critical market participant can have broad consequences that impact a large number of public companies and investors. Indeed, given the extent to which the capital markets have become increasingly dependent upon sophisticated and interconnected technological systems, there is a substantial risk that a cyber attack could cause significant and wide-ranging market disruption and investor harm.

I am hopeful that today's roundtable will engender significant discussion about the ways in which regulators and industry can work together to address these risks. One of the most important things that can develop from this roundtable is for the Commission to hear what we can do to help you find and respond to the growing cyber threat that is confronting our markets and our public companies. My expectation is for the Commission to analyze all the information we receive as a result of this roundtable and, with appropriate haste, consider what additional steps the Commission should take to address cyber threats.

It will be important to keep the dialogue momentum from today's event going. One immediate step that the Commission should take is to establish a cybersecurity task force.
This task force should be composed of representatives from each division who will regularly meet and communicate with one another to discuss these issues and, importantly, advise the Commission as appropriate.

In conclusion, I would like to thank all of our panelists for taking the time to be here today, and I want to thank the staff for organizing the roundtable. I look forward to a well-informed discussion about cyber attacks as well as the ways to prevent, respond to, and mitigate the risk of such attacks. And as a reminder, as Chair White said, there will be a public comment file on our web site, and I encourage everyone to write in. We need to hear your views. Thank you.

MR. HIGGINS: Thank you, Commissioner Aguilar.

Commissioner Gallagher?

COMMISSIONER GALLAGHER: Well, you'll be pleased to hear I have no formal remarks. I was actually betting on Commissioner Aguilar giving such good remarks that I could just agree with them.

And so you came through for me. Thank you, Luis; you made my job easier.

CHAIR WHITE: Not my remarks, you understand.

COMMISSIONER GALLAGHER: Oh boy. Now I have to parse things a bit.

No, I agree with everything that's been said, and I'm happy to have this roundtable today. I'm especially pleased at the caliber of participants on the panels. And, in particular, I want to thank Cyrus Amir-Mokri for coming. It's quite an honor to have someone at your level at Treasury.

And I think his presence here today points out how important these issues are. So I just look forward to learning, and I agree with Commissioner Aguilar emphatically that it's important for us to know from you, from this interaction today and from the comment file, what the Commission could be doing, what more we could be doing in this space. So, happy to be here.

MR. HIGGINS: Thank you, Commissioner Gallagher.

Commissioner Piwowar?

COMMISSIONER PIWOWAR: Thank you.
Unlike Commissioner Gallagher, I do have prepared remarks, so thanks, Keith.

I also want to extend my thanks and gratitude to all the participants who will be appearing on all the panels today, those of you that are up here on this panel and those that are out in the audience for the future panels.

Computer and electronic communication systems play a vital role in ensuring fair, orderly, and efficient financial markets. We simply cannot afford to become complacent when it comes to the security of these systems. I commend Chair White and Commissioner Aguilar for their leadership on this issue, and I appreciate the efforts of our staff to organize this roundtable.

Now, on a related note, on Monday I had the privilege of meeting a number of great students and faculty of the University of North Carolina Wilmington Cameron School of Business at the kickoff of their Business Week celebration.

In particular, I want to give a shout out to a wonderful group of students and faculty sponsors who are doing amazing things related to cybersecurity, the UNCW Cyber Defense Club. Among other things, they compete in something called the National Collegiate Cyber Defense Competition, which challenges students to defend computer network services against attacks by professional hackers. Teams are assessed not only on their ability to detect and respond to outside threats, but also on their ability to balance security needs against business needs.

I learned a lot about cybersecurity issues, including the importance of balancing security needs against business needs, from my discussion with these very bright and motivated students. I hope to learn even more from our roundtable discussion with our distinguished panelists today.
I look forward to hearing from our participants and from the general public through the comment file that Chair White and Commissioner Aguilar mentioned in their opening remarks. Thank you.

MR. HIGGINS: Thank you, Commissioner Piwowar.

Now I'd like to turn it over to Tom Bayer, who will moderate our first panel.

Tom?

MR. BAYER: Thank you.

Cybersecurity threats exploit the increased complexity and connectivity of our critical infrastructure systems, placing the nation's security, economy, and public safety at risk. In an increasingly connected world, cyber threats can impact an organization's ability to innovate and to protect its intellectual property and other online assets.

In this panel we will explore the current cyber landscape, including cybersecurity risks faced by market participants and the state of preparedness in mitigating these threats. We will also discuss how various regulatory agencies are responding to cyber threats and how they provide guidance and oversight to components of the U.S. critical infrastructure, including financial services.

Joining us on this panel, starting on my right, we have Cyrus Amir-Mokri, Assistant Secretary for Financial Institutions, Department of Treasury; Mary E. Galligan, Director, Cyber Risk Services, Deloitte & Touche; Ari Schwartz, Acting Senior Director for Cybersecurity Programs, National Security Council, The White House; Larry Zelvin, Director, National Cybersecurity and Communications Integration Center, U.S. Department of Homeland Security; Adam Sedgewick, Senior Information Technology Policy Advisor, National Institute of Standards and Technology; Andy Roth, Partner and Co-Chair, Global Privacy and Security Group, Dentons US; and Javier Ortiz, Vice President, Strategy and Global Head of Government Affairs, TaaSera.

Thank you for joining our discussion this morning.
With that, I'd like to turn it over to Mary and Cyrus to begin our dialogue through opening statements about the current cybersecurity landscape.

Mary?

MS. GALLIGAN: Thank you very much, Tom. It's a pleasure to be here today. I know many of my fellow panelists from my time at the FBI working cyber intrusions, and it's a true honor to be included with such a distinguished group of people.

To get a complete picture of the cyber threat landscape, we need to break it down into three areas: threat vectors, threat intelligence, and threats to the ability to be resilient.

By threat vectors we mean the actors: nation-states and spies who seek to steal our national security secrets or our intellectual property, organized criminals who use sophisticated cyber tools to steal our identity and our money, terrorists who want to attack our infrastructure, or hacktivists who are trying to make a social statement by stealing information and then publishing it to embarrass organizations.

Another part of this threat vector is the methods. Are we talking about destruction of data or hardware, as the world saw with Saudi Aramco or the banks in South Korea? Are we talking about the denial of service, referred to in earlier comments, that our financial institutions suffered over a period of months, or perhaps the recent rash of ransomware, where files are encrypted until ransom is paid? Or are we talking about theft, where identity and money are stolen, as we saw with the recent retail breaches?

For the past decades organizations have been securing their data from this threat matrix of actors and methods, and we have learned that securing the data is simply not enough and that the cyber threat landscape also needs threat intelligence.
How do organizations get distilled, pertinent, actionable intelligence about the threats that they face to allow them to remain vigilant in securing their data? I know this panel will discuss the sharing of threat information by the United States government with the private sector and vice versa, the numerous hurdles in that process, as well as how far the government has come in sharing information.

Obtaining this threat intelligence has a direct impact on the third area of the cyber threat landscape, and that is our ability to be resilient. The quicker an incident can be detected, the faster we can recover and remediate. And the fact that an incident has been discovered raises threats in and of itself: threats to daily business operations, threats to reputation based on the reactions to certain disclosures, and threats to the bottom line due to the cost of response and remediation.

So when an organization looks at the cyber threat landscape from the point of view of threat vectors, threat intelligence, and threats to the ability to be resilient, they see that the landscape is not uniform, that the terrain is different depending on where you're standing and what it is you're trying to protect, and that this threat is no longer a threat to technology but a threat to operations and business.

And I know, Tom, that the panel can expound on that.

MR. AMIR-MOKRI: Thanks very much, Tom, and thank you everyone. It's really an honor to be here to participate on this panel. I very much agree with what Mary just said. Let me just pause on a couple of the issues, how we look at things generally.

In understanding threats, I think it's very important to understand where the threat is coming from.
And so as you think about who the actors are and who the potential intruders into network systems are, I think it's very important to distinguish between the sources of the activity, and I agree with how Mary broke that down.

Second, not all actions are equal. You know, you have a lot of scanning. You have DDoS attacks. You have intrusions. They all fall under the category of cyber attack, but they're very, very different activities with different consequences and different remediation or prevention activities associated with them. And they tie back, incidentally, to the actor and the aims that they have. And as Mary said, they could range from, you know, things like defacement and reputational issues to corruption, destruction, wiping out contents and databases. So it's very important to distinguish between those issues in order to understand how you're going to protect yourself.

A couple of other observations I'll make, again tying back to what Mary just said, is that, in thinking about the whole area of cybersecurity -- and I've spoken about this publicly before, as late as last week -- we generally try to break it down into three large conceptual categories. And a lot of this, by the way, ties back to the substance of the President's executive order issued in February of 2013. And you can also map it back to the cybersecurity framework that NIST published about a month or so ago.

Basically we look at it not unlike how we think about prudential regulation more generally in the area of financial regulation. And that is, we start out with what we call resilience. How do firms generally protect themselves? What are the measures they ought to take? A lot of that is just basically technical IT activity, and a lot of that also ties back into another concept Mary mentioned, which is information sharing.
And that has both a private-to-private component and also a public sector to private sector component.

A second piece of it -- again, Mary alluded to it -- is incident management. And incident management, you know, depends on the nature of the incident. It could be kind of a small thing where, you know, the resources needed to manage it are not that great, or it could be a very serious event, in which case you might be talking about not just private sector but also public sector involvement. And as we think about potential incidents, those are the concepts that we try to keep in mind. And as we think about preparedness, those are the kinds of things that we deal with.

And then finally, we think about recovery. So what if something really bad happens? How do we deal with it afterward? Those are very important questions that are at the forefront of our thinking as we think through managing cybersecurity.

Just a couple of final points to tie everything together. As Chair White indicated in her remarks, we view this as a whole-of-government effort. We have different agencies with different sets of expertise and different jurisdictional responsibilities, and we feel that it's very important to bring everyone together. And Larry Zelvin at DHS will tell you that one of the main things that they work on is bringing the whole picture together.

And we work very closely with the private sector -- we, Treasury, as a sector-specific agency. But we try to be a conduit to the government as a whole. We believe it's a whole-of-government effort where financial regulators, independent regulators, the administration, and the various parts of the U.S. government have an essential part to play.

The second point -- and Chair White alluded to this as well -- is the private-public partnership.
Again, just as different areas of government have their responsibilities, the private sector, we have said and we believe, is the front line of defense. They are the ones who know their systems best. They often have bespoke systems, and they're in the best position to man and manage the front line defense.

In government we have certain capabilities and certain abilities to help out with technical assistance or information sharing, but we need to be mindful of the separation of functions along that axis as well.

And so my bottom line message is that this effort requires everyone to cooperate and coordinate, and that is what we work on every day. Thanks.

MR. BAYER: Thank you, Cyrus.

So, for our first topic, I would ask Ari Schwartz, Larry Zelvin, and Mary Galligan to reply to this issue. Please discuss: what are the most common cyber threats faced by market participants today; what challenges do market participants face in assessing those threats and potential vulnerabilities; and are certain industries more vulnerable than others?

Ari?

MR. SCHWARTZ: Well, in this space, Mary went over a pretty good list of the types of threats that we see on a regular basis. The issue is that list of actors that Mary put out there -- nation-states, criminal actors, the hacktivists, etcetera. Each has a different set of motivations.

Then you have the sets of tool kits that they use, where we see different kinds of attacks: the denial of service and destruction kinds of attacks, and the theft of information, be it personal information, financial information, or business information and intellectual property. And so we see such a wide range of attacks that businesses in this space really have to have a similarly wide range of expertise to be able to protect against them, and that's really a lot of the basis of this.
Where the President has focused a lot of energy has been on the critical infrastructure -- on energy, water, financial, medical, and the other truly critical industries that have been set forth in presidential directives going back a dozen years or so. That space is where the President feels the gravest national danger presents itself, and he has focused a lot of our national security efforts there.

MR. ZELVIN: Well, thank you. And I'd like to thank the Commission for having us here today and also the group for being here.

The threats, I agree, have been covered well. The one I would highlight that hasn't been mentioned as much is the insider threat. If you look at Snowden, if you look at Private Manning, those are extraordinarily serious threats facing cybersecurity, with devastating consequences.

We at DHS look across not only the nation but also the globe. There are 16 critical infrastructure sectors -- transportation, finance, water, defense; I won't go through them all -- but I will congratulate you. As you look at the 16 critical infrastructure sectors, finance probably wins the cybersecurity threat award. The second one is energy, and then it kind of goes from there. So you are a massive target, and you're a target for two reasons in my mind. First, because you're where the money is. Second, you also represent our nation. There was a time when nations used to focus on their militaries. They would focus potentially on their commerce overseas. Now they can focus on the commerce within your own nation.

Adversaries -- and Mary did an outstanding job of covering the threats, but let me switch it around a little bit -- the adversaries are actually looking at vulnerabilities. So before you get the threat, you have to find an opening. And they are doing an extraordinary job of finding those openings.
I will tell you they are actively looking every second of every minute of every day for an opening.

There is a water utility in the middle of our nation that you would drive by and never even notice, and it was the victim of a very significant cyber attack that emanated from the other side of the globe. They never really had cybersecurity on their radar because they figured, who would pick on them? And candidly, when my team told me about the threat, I said, "You've got to be kidding me. Who would pick on them?" But they had an industrial control system. It was a way for an adversary to learn more about it. They saw an advantage, and they took it.

So the thing I wish to highlight is that you not only have to look at the threats, you have to look at the vulnerabilities, because the adversaries are looking for any hole possible to come into your networks -- and once they're in, they're able to stay there persistently.

It's also no longer about drive-bys. They are really taking their time. They are being very thorough and methodical and patient in their attacks. So we need to raise awareness of these challenges as well. And I think -- and I think Ari would agree -- that's what the Cybersecurity Executive Order is trying to do when we talk about a baseline. I won't steal my NIST colleague's thunder, but I think it's a great start.

The other thing I would say to the financial sector, as my last point, is that you are way ahead of the rest of our nation in cybersecurity, the reason being that you're getting attacked a lot.
I'd encourage you, on information sharing, to share that information not only with the people you work with in business, both nationally and internationally, but also with government, because we have a lot of work to do with a number of sectors that you rely upon for your businesses, and we need to benefit from your experiences.

MS. GALLIGAN: Thank you, Tom. Since Ari and Larry mentioned the threats, let me go to your second question, which was, what are the challenges that market participants are facing? At Deloitte, looking at our clients across all industries, there are three things that are consistent.

The first is, how do I figure out what I really need to protect? I cannot protect everything on my network. I cannot, as Larry said, stop or close every vulnerability. So how do I do a true risk assessment that is specific to my sector? And I then need to know the baseline for my sector, among the sectors that Larry's referring to, and the actors and methods that I need to pay most attention to.

The second challenge is, how do I manage access -- not just, as Larry mentioned, insider access, but more and more that third-party access: the vendors, the professional services, the people who need to connect to my system and may or may not have the same type of cybersecurity that I have.

And the third one would be, how do I monitor the monitors? If I am collecting threat intelligence information and I have that monitoring, do I have the talent or the knowledge to zero in on what is very specific to my company, my industry, my sector?

And I would say, Tom, those are probably the three biggest challenges, and I know the panel can add to that.

MR. BAYER: Any other comments from our panel members?

(No response.)

MR. BAYER: Okay.
We'll move on to another question, which is, how do market participants manage cybersecurity risk? What role should senior management play, and how about the board of directors? What should directors and senior management be thinking about with regard to potential threats, and what should they be doing in response to those threats? I would ask Andy Roth and, again, Mary Galligan to respond.

MR. ROTH: Just building on what was said before, we've moved from a stage where you had point solutions to cybersecurity, and we're very much in a continuous monitoring mode where you have a multilayered approach. These issues and these vulnerabilities stem from business processes. So if you're sending information on employees to a vendor -- again, vendors have been highlighted -- if you are processing information internally, or you're processing credit card information, it's very important that we have a multi-stakeholder effort -- again, not to steal from the NIST multi-stakeholder effort.

This is not one person's job within an organization. This is a matter of senior management accountability. This is top-down: this is very important to us. And it takes preparation and coordination among all the key stakeholders so that when situations arise they can be dealt with quickly.

Also, the nature of the tool set that companies are using is changing. Traditionally, security went to the part of the company that managed physical security, and that may not have been the proper skill set or the right place within the organization. Certainly this is something that has bubbled up to audit committees, who are looking very closely at it, and it probably finds its home in the risk management function of the company, specifically operational risk management, where you try to drive accountability and controls around core business processes.
And I think the companies that do this best view it as an enterprise-wide effort that requires everybody's participation.

MS. GALLIGAN: And, Tom, building on that, it is true that boards of directors have many, many questions in this area. It's estimated that only one percent of all boards in America have someone on the board who is really proficient in cybersecurity and in technology.

So what should you do as a board of directors? It's knowing what questions to ask. As you're getting briefed, as you sit on that audit committee or that risk committee, are you asking the right questions? I would boil the many, many questions down to two. The first, along the lines of something Larry mentioned: how do I know what data is leaving my company? Whether it's leaving via an insider threat or leaving in packets of information, how are we at Company X monitoring that?

And the second -- and it's probably becoming more and more crucial -- is, do we have a cyber incident response plan? Is it up to date, and have we truly practiced it? You're seeing more and more companies in different sectors doing cyber war gaming, doing simulations. The reason is that, as soon as an incident happens, we move from a cyber threat and a cyber incident to what Cyrus mentioned: a business issue, an economic issue. So the companies and organizations with a robust cyber incident response plan do better, over and over, and minimize their risks.

MR. BAYER: For the next set of questions I would ask Ari Schwartz, Adam Sedgewick, and, again, Mary Galligan to answer. What is the magnitude of cybersecurity incidents, and how should a market participant evaluate cyber threats among the other risks that it may face?
And, again, if you could tie it into how a board or other senior management teams could provide oversight and guidance, that would be great. Thank you.

MR. SCHWARTZ: Well, I'll just start by saying I think risk is the right way to talk about this, and it's important to put that out front. A lot of times when you're discussing these issues with boards and with other senior management, there's sort of an attitude of, "What can we do to get past this problem?" And it's not really a problem that you're going to get past. It's a problem you're going to manage.

So I think the very starting point is to have that attitude in place: that this is a question of ongoing risk, of what we can do to mitigate risk, and of what market players can know about the risks faced by the companies they are dealing with.

MR. SEDGEWICK: Just expanding on that a bit, the work we did under the President's executive order in developing the cybersecurity framework was really a multi-stakeholder process in which we initially asked fundamental questions about the needs of our participants, those organizations that make up the critical infrastructure. And then the tasking of the framework was to build that set of best practices in a way that makes it easier for people to manage cybersecurity risk, building on what they're already doing.

So we were constantly going out and asking the very same sorts of questions that we're asking today: What are organizations doing? How are they managing this problem? And where are the real gaps?

One of the things that we heard early on was this conversation about the role of business executives within an organization and how you translate what's happening at a technical level all the way up so that you can have that line of communication.
I remember at one of our workshops that's exactly what they said: we have a lot of technical standards to deal with, a lot of technical issues, but our challenge is communicating that to senior executives.

So I think there are a lot more questions coming from boards, and a lot more questions coming from senior executives. But the challenge is, when those questions are asked, how do you actually communicate what needs to be done, and then how are those changes driven throughout an organization?

And I think the challenge that a lot of organizations have is that they think they've solved the problem because they've treated it like a technical issue, and they might be spending a lot of money on it.

So we can talk more about what the framework does, but one of its fundamental purposes was to say, "Let's take these existing sets of practices, these existing security capabilities, and map them in a way that lets you have that communication not only within your organization but with all those other stakeholders that you rely upon, realizing that more and more you might not own the risk that you have to be aware of."

MS. GALLIGAN: And, Tom, building on what Adam just said, the other role that senior management can play is creating a culture in the organization that literally says, "This cybersecurity issue starts at the keyboard. It starts with every single employee."

So how do I create a culture in the organization, as Adam said, that does not look at this as a tech issue that the IT guys will take care of, but as a business issue -- a culture where, if I see something, whether it's a spearphishing campaign where I get an email with a link that doesn't look right, do I report it? Do I understand my role in protecting the information of my company?
So I think it's very important that senior management also create that culture, the one that says, "This is everybody's responsibility."

CHAIR WHITE: Tom, may I ask a question?

MR. BAYER: Yes, ma'am.

CHAIR WHITE: Just briefly -- and this is really a board responsibility question -- what are you seeing in terms of where that responsibility resides at the board? Is it the risk committee? Is it the audit committee? Or, better yet, a cyber committee or a subcommittee?

And then I guess the second question I have, really coming from Mary's comment that one percent of boards have the expertise or a representative with the expertise: do the boards that really are exhibiting best practices actually get outside expertise to advise them, or are we just beginning to see that structure develop?

MS. GALLIGAN: In answer to your second question, Madam Chair: yes, you're seeing more and more boards getting that outside expertise, as they would with legal expertise or accounting expertise.

Where should it sit on the board? You could probably have another whole panel about that. You have some people having it sit with the audit committee, perhaps because they haven't really gone through, as Adam said and as has been mentioned, that risk assessment. About 50 percent of boards have a risk committee, so some will put cyber risk on the risk committee.

And then the last piece is, if technology is the company's entire backbone -- let's say you are the Amazon of the world -- then you're going to see boards having separate cyber risk committees. Whether that's the way of the future remains to be determined.

MR. ROTH: I would add that what's necessary is not just having accountability at the top level -- and that can be shared between an audit committee and an audit and risk committee -- but having an integrated policy and committee framework with defined escalation thresholds, so that there's a clear path for this to make its way to the top of the organization and be dealt with appropriately.

MR. ZELVIN: And maybe I could just follow up very quickly. This morning when I walked into my operations center and was briefed, there had been 230 incidents over the last 24 hours. Of the 230, they personally briefed me on five. And by the way, we look across DHS, the entire dot-gov, all the critical infrastructure, state, local, and international; I actually had a colleague in London reach out to me this morning as well.

Out of those five, there was only one that I needed to take to my next-level supervisor. But we have set thresholds for when I go to the Secretary, and when the Secretary goes to the President. I'll tell you, though, those thresholds had really not been well designed, and we've really had to remake them over the last few years.

I came from a military background. In the military you had very clear -- what they call commander's information requirements. Those are too linear for cyber. We had to get to something where people could really think about it and make other decisions, because you just can't cover everything.

So I think as you look at these things, you've really got to set a barometer of what you want reported, when, and how, and those barometers don't really exist very well today.

MR. BAYER: So let's take a deeper dive into cybersecurity preparedness, especially from a U.S. industry perspective and, in particular, financial services. I would ask Cyrus, Larry Zelvin, Javier Ortiz, and Ari Schwartz to respond initially. Here are the questions.
What is the current state of preparedness? Are data systems and information secured and protected generally? Do market participants know what they need to do in order to ensure that their preparedness is up to the task of managing this complex threat? And what role do your agencies and organizations play in facilitating cybersecurity preparedness?

Cyrus?

MR. AMIR-MOKRI: Let me start with the last one first. What we try to do, and what the industry has asked us to do -- they ask us for this more than almost anything else -- is to facilitate information sharing. As I mentioned, both the private sector and government gather information, and to the extent that we have information that's relevant, we try to facilitate providing that information in a timely way so that the private sector can improve its readiness and defense.

There are other things that we do as well. For instance, if there is an incident and there is a request for technical assistance, we try to receive that request. Usually we do it with the assistance of the front-line regulator -- sometimes that's the SEC, sometimes that's the banking regulators; it just depends on who the victim of the attack is. We then contact DHS, the FBI, and other expert agencies with technical assistance capacity to go in and help. And then, depending on what the situation is, that could lead to a criminal investigation, or it could lead to other follow-up.

In terms of your broader question about readiness, I'll go back to what Larry said a few moments ago. The financial services industry is probably one of the most advanced in terms of thinking about cybersecurity, and there is a reason for that: over the years, financial services firms have become technology firms.
When you look at transactions, whether in the institutional space or the retail space, and when you think about data and recordkeeping, a lot of it is on information technology systems. So just by the nature of things, financial institutions have had to cope with these kinds of issues for many, many years.

That being said, going back to what Mary was talking about a few moments ago, there are constant exercises and simulations, and you can never say you're completely prepared, because you have so-called zero-day exploits -- a technical term for malware or a method of intrusion or penetration that is not known to the community. So there's always something going on.

And I think about what Larry was saying a few moments ago when he gave you a sense of the magnitude of the scanning and the search for vulnerabilities. It's always an effort to stay a step ahead of the game, and that's how we have to think about it. We shouldn't think about it as, okay, we're ready, we're prepared, that's the end of the story.

Just to give you another example of how we sometimes think about it -- and you can see this in the literature as well -- you shouldn't assume that defense means keeping everyone out all the time. You're going to think, "All right. People are going to penetrate, and then what?" So you think about your readiness, your defenses, and your ability to manage incidents in terms of what happens when someone actually does penetrate. How do we make it difficult for them to maneuver, or how do we detect that very quickly and respond?

So it's going to be a reality that lives with us as long as we rely so heavily on information technology, which is the foreseeable future.

MR. ZELVIN: Let me talk a little globally very quickly, and then I'll talk about the financial sector. The analogy I use is that when I was much younger, it used to be me and my cousins in the back of a station wagon; we'd take a turn, and we'd all go flying the other way, and it was just a lot of fun getting slammed around the car.

If you did that today, you'd likely be arrested for child endangerment. There are air bags. There are seatbelts. There is reinforced steel where the people sit. We have taken a lot of measures to make cars safer, but regrettably, people can still die in cars. In cybersecurity, in my view, we are in the station wagon with the kids in the back seat. We're still building those requisite safety devices.

The financial services sector -- you all are doing extraordinary work. It's highly impressive. But the thing I worry about is that I had some very senior folks at an investment bank say to me, "So who do we contact? What do we buy to make this go away?" You can't do that. It's not going to work that way. There are people who can help you, but ultimately it's the companies themselves that are really going to have to manage this. You can't outsource it. You can't buy this away.

The second part is this: you can lock down your enterprise in this country and everything's great. But if you have major portions of your business done overseas, because it made good business sense, you're vulnerable like no tomorrow, because you are not only vulnerable to the people working on your networks in those countries, whom you may or may not know and who may or may not be getting screened, but you're also vulnerable to those governments' intrusion into those systems.
So if the United States then has a rise in tensions with that country, you have other actors with the ability to get into your networks and inflict harm on your company and potentially on our country.

I don't really have good answers for these things, but I do highlight them as things to think about. There was another panelist -- this might have been Mary -- who said, and I think it's spot on, that you can't just look at your own company. You've got to look at everybody who has access to your networks, because those are the holes, the vulnerabilities into your systems, as I was saying, that may make the difference between you succeeding and failing.

MR. ORTIZ: Thank you, Tom, for the opportunity to be here -- and Commissioners, good morning.

The financial services industry is doing a great job at the forefront of cybersecurity, and it is really setting the stage for what companies in other industries will be doing. A couple of thoughts that highlight that. For financial services organizations, security has become the holy grail of their business. And you were asking about boards: I think that at many financial services organizations, the boards and senior management have visibility into the cybersecurity postures that help protect their investors, customers, and others.

One way of thinking of it is that the DNA of a threat is never the same, because we never know where that threat is coming from. Because of that -- Andy touched on this, and, Adam, we had a chance to talk about this before -- how do we implement continuous monitoring tools and protocols inside organizations that will help let us know ahead of time what's going on?
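The continuous-monitoring idea raised here can be sketched as a toy alert-prioritization loop. Every field name, weight, and threshold below is a hypothetical illustration of "focusing scarce resources," not a description of any panelist's actual system:

```python
# Illustrative sketch: score incoming alerts so that corroborated hits on
# critical assets rise to the top, then triage within a limited analyst budget.
from dataclasses import dataclass

@dataclass
class Alert:
    source: str              # e.g. "ids", "endpoint", "threat-intel" (hypothetical labels)
    severity: int            # 1 (low) .. 5 (critical)
    asset_criticality: int   # 1 (low) .. 5 (crown jewels)
    corroborated: bool       # seen by more than one sensor?

def priority(alert: Alert) -> int:
    """Score an alert; multi-sensor corroboration doubles the score."""
    score = alert.severity * alert.asset_criticality
    if alert.corroborated:
        score *= 2
    return score

def triage(alerts: list[Alert], budget: int) -> list[Alert]:
    """Given limited analyst capacity, return the `budget` highest-priority alerts."""
    return sorted(alerts, key=priority, reverse=True)[:budget]
```

The design choice is the one the panel keeps returning to: you cannot investigate everything, so the scoring rule decides where the scarce attention goes.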
At the end of the day, connecting those dots helps expedite sifting through the vast volumes of threat data and helps focus the resources of the security professionals and the business risk managers, so they can say, "In the scarce-resource environment that we're in, here is where we need to be focusing all of our attention," because of what's happening at that point in time.

Mary also raised a good point, which is: do we practice? Do we know how we would handle a cybersecurity intrusion? That is a critical component of how we respond, because it puts everyone in the organization in the frame of mind of protecting the critical assets: information, money, and other things. So financial services as an industry is doing a great job, as are the individual companies in that space.

MR. BAYER: Thank you very much. Now I would like to go into a discussion about government and industry collaboration in protecting our assets, in particular the critical infrastructure, including our networks and systems. How can we facilitate a more productive dialogue among interested constituencies? I would also like to weave in how the 2013 cybersecurity executive order and the recently issued NIST cybersecurity framework play into this calculus.

For this I would like to have Adam Sedgewick, Ari Schwartz, Larry Zelvin, Javier, and Andy respond, please.

MR. SCHWARTZ: Let me start with how we got to the executive order, because I think that's a good place to kick this off. As we've discussed, there really has to be coordination between the private sector and government in this space.
And that's been apparent from the beginning in the cybersecurity space: working the whole-of-government approach that Cyrus discussed, where you have leadership from the different agencies involved, as well as the sector-specific agencies and independent agencies, and then all the different sectors of the private economy working together.

That has made it a real challenge to come up with these overarching policies, and we've seen that from Congress as well. It's very clear that we need Congress to act in this space, and you hear this in a real bipartisan way, in a non-partisan way, from Congress. But still, the difficulty of all these different pieces makes it a big challenge to address these issues.

In 2010, Majority Leader Reid asked the White House to come forward with a proposal, which we did soon afterwards. For the next two years we worked on that proposal in earnest with Congress, but Congress was not able to turn it into something that could pass at that time. And there was a feeling that some pieces of it could be done without legislation, through executive action.

So while that's not everything we wanted to do, we felt it was important to move forward with what we could, and we took two of the most important pieces where we could act very quickly. The first was information sharing from the government to the private sector: moving that forward in a way that protects privacy, so that we can get more information to the private sector and they can take action on known threats. The second was the idea of promoting voluntary cybersecurity standards for critical infrastructure.
In that space we really relied on NIST and DHS to move forward with these ideas of how we could promote standards, and NIST took up the role of building the cybersecurity framework through a multi-stakeholder process.

MR. SEDGEWICK: Sure. I'll go next, just to expand on that multi-stakeholder process, the effort of building the cybersecurity framework, and where we are today.

The effort under the cybersecurity framework, as I always summarize it, was really to do three things. The first was to identify the existing standards and best practices that are out there today and that industry is already using, relying in that on international consensus standards. That's consistent with U.S. government policy going back quite some time, and the role of NIST is to help coordinate some of that work.

The second was to create a structure, the framework, in order to elevate the use of those standards that have been proven to be effective.

And the third area, which is the period that we're in today, is to identify where the gaps are. Where are the things for which there might not be standards or best practices? Where are the things that our stakeholders, through this multi-stakeholder process, are telling us we still have a lot more work to do on? Then we can work with industry to develop solutions to those problems. So there are some areas -- cybersecurity workforce, supply chain, privacy -- that we're doing work in now.

That process was done through open engagement: we had five workshops throughout the country. We were constantly asking the stakeholders what we could do that they would actually use, that would help them manage the risk. We would do analysis; we would take that data in, and we'd present it back.
So bit by bit, we built this, and we did it in a public manner. NIST has always said that it saw its role as being a convener: it would take these views together, apply some technical expertise to do analysis, and then present it back and ask, "Is this consistent with what you would use?"

So what is the framework itself? I'll just spend a couple of minutes on this, and I'll start with what it's not. Consistent with the conversation today, it is not a checklist. It is not a compliance framework. You do not get a grade at the end of the day that says you are set to go. You do not get to be "framework compliant."

What it is, really, is a way of setting out those existing standards that makes them usable for organizations: to help with some of the challenges we're talking about today in terms of communicating within your organization and communicating with your partners, and to help with some of the problems we've seen in the past, where there has been fracturing and there have been stovepipes. Different sectors and different industries rely on different things, and they have difficulty communicating about what they're doing. So it's a tool to help with that communication, and really to allow industry to hold each other accountable.

It does that with three pieces. The first is something called the framework core, which is that body of work. It is presented as security capabilities; it goes from very high level to very detailed; and it creates a structure that allows for innovation. We were told in the executive order to be technology-neutral and flexible. So it sets that structure, which allows the market -- allows innovators -- to come in and help develop the solutions to meet those problems.
At the highest level of the structure, it sets those existing capabilities, those existing standards, into five larger buckets: identify, protect, detect, respond, and recover. That becomes a communication tool, and it also becomes a way that people can see and understand how to build strong cybersecurity programs.

One of the challenges that we had -- again, reflective of the conversation today -- is that not everyone is at the same place in the market. Not everyone is just getting started, and not everyone has advanced cybersecurity capabilities. So we had to create something that would appeal to all of those audiences: from the people who don't have a cybersecurity program in place and just rely on the capabilities of the information technology and control systems that they have, to the people who are spending a lot of money and have advanced analytics. That's what the framework core provides.

Then there are the other pieces of the framework -- and this is all reflected in the document that we made publicly available on the 13th. One is the framework profile, which, again, is a tool to help organizations grow their programs. You assess where you are today. You assess the technology that you rely upon. You assess your regulatory and legal environment. And then you develop a target profile to help you achieve the outcomes that you're looking to achieve within your organization.

It might also be a way to communicate what you expect some of your partners to achieve. Or you might take that profile and say, "I'm going to treat this part of my organization differently than other parts, so I'm going to set up multiple profiles" within your organization. It really allows that flexible and dynamic approach.
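The current-versus-target profile comparison just described can be sketched as a simple gap analysis over the framework's five functions. The maturity scores and the gap rule below are hypothetical illustrations of the idea, not part of the framework document itself:

```python
# Illustrative sketch: compare a current profile to a target profile,
# organized under the framework's five functions.
FUNCTIONS = ["identify", "protect", "detect", "respond", "recover"]

def profile_gaps(current: dict[str, int], target: dict[str, int]) -> dict[str, int]:
    """Return, per function, how far the current profile falls short of the target."""
    return {fn: max(target[fn] - current[fn], 0) for fn in FUNCTIONS}

# Hypothetical maturity scores (0 = nothing in place, 5 = advanced).
current = {"identify": 3, "protect": 2, "detect": 1, "respond": 2, "recover": 3}
target  = {"identify": 3, "protect": 4, "detect": 4, "respond": 3, "recover": 3}

# The largest gaps indicate where to direct investment first.
gaps = profile_gaps(current, target)
# e.g. gaps["detect"] == 3, gaps["identify"] == 0
```

In this toy example the organization would prioritize its detection capability, which mirrors the panel's point that a target profile turns the framework from a catalog of standards into a prioritization and communication tool.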
4 And then the third thing is something we call 5 the framework implementation tiers, which is a way for 6 organizations to assess where they are and, again, where 7 they would like to be. It's not intended to be a measure 8 of goodness. Not every organization is going to strive 9 to be at the highest tier because that wouldn't be cost- 10 effective. It wouldn't be appropriate. Some of the 11 financial organizations that are in the room are going to 12 think about risk management very differently than that 13 water utility that Larry talked about, both before they 14 were informed and after. 15 So taken together, that's what the framework 16 is. We're still in the early days. We realize that the 17 success of this initiative is not going to be based on 18 the strength of the document or the strength of the 19 process that created it but really on two things, which 20 is did it fit our intent of having something that 21 organizations are able to pick up and use and then, more 22 important than that, does it meet the intent of the 23 executive order in terms of improving cybersecurity 24 across critical infrastructure. 25 So that's why we're here today. That's why 0054 1 we'll continue these discussions with our stakeholders. 2 That's why we will look to see how we can improve this 3 process, and it really is a process. And we'll continue 4 to have workshops and look at these various areas and 5 kind of grow that public-private partnership, realizing 6 that there is a broad stakeholder community that we have 7 to work with, and all of them need to be involved. All 8 of them need to be thinking about this problem. 9 So that's where we are. And now I think Larry 10 will talk about the information-sharing piece. 11 MR. ZELVIN: So I'll be very brief. I am 12 cognizant of the time. 13 So you can't have information security and 14 information technology without information sharing. 15 Information sharing is absolutely critical. 
Within the 16 center I run we had 240,000 incidents reported to us last 17 year. We put out 12,000 actionable reports. That was a 18 60 percent increase in the year previous. People asked, 19 "Is that because you were getting more reports or there 20 are more incidents?" And the answer is simple. Yes. 21 One of the best partnerships -- public-private 22 partnerships I've ever witnessed is between the Financial 23 Services Information Sharing and Analysis Center, the 24 FSISAC, and government. It is absolutely extraordinary, 25 and it's absolutely critical. It serves not only a 0055 1 horizontal function within the financial services but 2 also a vertical function between financial services and 3 government. 4 We get information from intelligence. We get 5 information from law enforcement. We get information 6 from our own sensors, and we get some of the best 7 information, bar none, from the Financial Services 8 Information Sharing and Analysis Centers. We also get 9 great information from banks based on the individual 10 relationship we have and the trust there. 11 The challenge before us is that there is no 12 statutory clarity on what information companies can share 13 with government. So we will meet with the C Suite. We 14 will meet with the network defenders, and they will say, 15 "We will share with you. Not a problem." And then it 16 gets to the general counsel's office. And I am not 17 lawyer-bashing. I like lawyers. They keep you out of 18 jail. And they advise their clients that "hey, there is 19 nothing here that says we can do this. You're assuming 20 risk. We shouldn't maybe do this." And now the company 21 starts thinking, and now all of a sudden we're not 22 information-sharing. 23 It is very important to have clarity so 24 companies can share what is appropriate respectful of 25 privacy, respectful of civil liberties. But the 0056 1 technical data you need to defend the networks can do 2 that. 3 MR. 
ORTIZ: I will say that the cybersecurity 4 order from the White House and the NIST framework have 5 helped the private sector, very technical people and 6 business people to come together and better understand 7 what are the actions that they should be taking 8 proactively to help address the ever-growing threat of 9 cybersecurity. Because they can't disconnect from the 10 internet, because they need to continue to operate their 11 businesses, this framework has really facilitated the 12 internal discussions. 13 And personally what I hope it will do is it 14 will is it will help folks like Larry better integrate 15 with private sector companies where they can break down 16 some of the barriers of information-sharing, and there 17 may be other ways of doing that through other private 18 sector entities that maybe aggregate threat information 19 by industry sector and can then share that information in 20 anonymous way but actionable way with Larry, his team, 21 and other folks that are interested in protecting 22 different sectors of our economy. 23 So we hear from our clients and prospective 24 clients that the framework and the executive order have 25 been very helpful in spearing that conversation 0057 1 internally to get the techies and the business folks to 2 work together. 3 MR. ROTH: So I come from a -- when I was in 4 the private sector -- a bank holding company environment. 5 And one of the things that was -- matured very quickly 6 during the challenges we faced was the operational risk 7 management function within those organizations. And, 8 again, to bring this back to business process risk, I 9 think that the executive order and the NIST framework -- 10 they are helping drive risk management discipline from 11 financial services to other companies. 12 And you have lots of different risks. You have 13 market liquidity risk. You have reputational risk. 
But 14 operational risk is a core function that all companies 15 should embrace because ultimately you run a business, and 16 you have businesses processes, and you have lots of 17 different folks that are involved in that. It's become 18 economically advantageous to outsource a lot of the 19 pieces of your company to somebody who can do that better 20 and can do that better and can do it at scale. 21 I think it's also important to understand that 22 the nature of the threat has really changed, and so even 23 the best are challenged now. And if you talk to -- we 24 haven't gotten too technical, and I will try not to get 25 too nerdy, but the nature of the threat is different. Up 0058 1 until now we've used traditionally signature-based fraud 2 detection, which is I know this looks bad because I've 3 seen it before. I'm going to stop it. And that doesn't 4 help you with the things that you haven't seen yet. 5 And so there is a whole new world of tools that 6 are available. Those tools are not a point solution. 7 Again, to bring it back to what Ari mentioned before, 8 this is about risk management. Larry also mentioned this 9 is prioritizing; how do you separate signal from the 10 noise. And I think to be truly effective you have to 11 have a comprehensive operational risk management approach 12 to cybersecurity. 13 MR. BURNS: Thanks. Thanks, Tom. Just wanted 14 to throw in it's a pity we lost Cyrus. We're very 15 appreciative that he came in and gave us part of his 16 morning. 17 Larry, you were talking for a moment there 18 about the Financial Services Sector Coordinating 19 Committee and the important work they're doing. That's a 20 good public/private partnership that's going on. 21 Just on the government side of things, of 22 course, we've got the FBIIC, the Financial and Banking 23 Information Infrastructure Committee, which is fostering 24 a lot of good information sharing. 
And Cyrus, as sector 25 lead, has been terrific, he and his team, at making sure 0059 1 we get information, which enables us to share technical 2 expertise when we can provide it and obtain it when we 3 need it. 4 But we're running into some of the same 5 challenges you're describing, where are the roadblocks; 6 where are the doors shut, how do we -- and so the 7 question for you and for others on the panel, how can we 8 go about taking concrete steps to improve that process 9 that's, just in the time I've been watching things gotten 10 better, but we still have miles to go. 11 MR. ZELVIN: Yeah. One -- you know, it's 12 really up to the industry to decide how it wants to 13 organize and how it can -- you know, it wants to engage 14 with government. And there's -- and I mentioned the 15 Financial Services ISAC just because they've been 16 just very active. And, as a matter of fact, there's an 17 FSISAC person on our operations floor, a gentleman by the 18 name of John Suber, and John's been absolutely 19 phenomenal. 20 You know, it's also unclear to the internet 21 service providers and also the managed security service 22 providers, those folks that you contract out to do the 23 security for your networks, you know, what information 24 you need. So we finally asked them. We said, "What 25 information do you need? What do you need from us? We 0060 1 have available and we'd like to share with you the seven 2 criteria that they have asked from internet protocol 3 addresses, ASNs, ports, things of that nature. And, 4 again, I don't want to get too technical as well. 5 But if you could share that widely with the 6 industry, if they could work that through not only with 7 the leadership risk committees but also their lawyers and 8 be able to bring that under whatever form you wish and 9 give it to us, we have the ability to share that out with 10 all the internet service providers. And there's 11 hundreds, if not thousands. 
12 We have the ability to share it within the 13 federal government to protect the dot gov, state, local, 14 tribal, territorial, across the 16 critical 15 infrastructures. And we also have the ability to get out 16 to 200 certs around the world. Our law enforcement can 17 use that to go after the bad people, and also 18 intelligence can use it to collect -- to see where the 19 adversary may be going next and who is actually doing it. 20 So it is a whole government approach, but, 21 again, it all really comes to the trust that you can give 22 government the information; it will be used 23 appropriately; and that we can go out and better defend 24 ourselves. 25 MR. SCHWARTZ: So I think that on information- 0061 1 sharing threats, you know, we -- I think people tend to 2 focus on the way that they end up sharing. But I think 3 there are -- it's good to take a step back and think 4 about all the different kinds of information-sharing in a 5 -- for a panel like this. So I've discussed sharing from 6 government out to the private sector that we tackle in 7 the EO. Larry's been discussing from the private sector 8 into government. 9 There's also private to private, which is 10 challenged on a lot of the barriers there that they face 11 from private to private. And there is also within 12 government, and there's also within -- among governments, 13 especially from federal to states where there have been 14 some challenges as well. 15 And, you know, one of the things that we're 16 really focused on right now is trying to identify all of 17 the different barriers that each of these sets of sharing 18 run into and try and knock down the barriers really one 19 at a time. The goal of doing it through legislation was 20 a great effort, and we hope that we can get it to move 21 forward in a way that protects privacy and civil 22 liberties, that protects other challenges that we have to 23 face out there. But it's been a difficult road to do 24 that in legislation. 
25 So the question now is what can -- what barriers can 0062 1 we knock down that we're facing today. And that's 2 getting the general counsels to be a little bit more open 3 about -- let's get beyond no, right? Once they say no, 4 what is the exact reason that you're telling me no, and 5 is there something that we can do from a policy side to 6 address that particular barrier. 7 MR. BAYER: So I would ask Mary, Javier, and 8 Andy to give us their perspective on what private sector 9 entities would want from government entities about 10 protecting against cyber threats or when responding to an 11 attack or breach. What information do you need from 12 government entities and you want us to share with you 13 with respect to the incidents of cyber attacks or 14 breaches? 15 Mary? 16 MS. GALLIGAN: Well, Tom, I saw it both when I 17 was on the FBI side working with Larry's folks as well as 18 now with Deloitte -- is that what it boils down to after 19 you get past the general counsel hurdle -- is what 20 information can you give me that is timely and 21 actionable. 22 So as we talk about building this for the 23 future of our nation, what is that infrastructure going 24 to look like? Do you share it by email? Do you call me 25 on the phone? And the government has to protect their 0063 1 techniques and sources. So over and over you hear 2 clients say, "How do I get specific actionable 3 intelligence, especially outside the financial services 4 industry where it may not be as mature. 5 MR. ORTIZ: I think that the private sector is 6 looking for as much -- Mary is absolutely right -- as 7 much actionable information as they can in order to be 8 able to address whatever the issue is at that moment. 9 Industries -- cyber attacks sometimes actually go in a 10 cycle that focuses for a short period of time on a 11 particular industry. 
So if the financial services 12 organizations are under attack, it will be very important 13 for that information to get very quickly to the other 14 participants in that industry. 15 And more broadly, the -- because of the nature 16 of the threat, the -- what Mary said is absolutely right 17 -- actionable. What is it that is happening that I need 18 to be worried about? And the reason why that's really 19 important is that all the systems that are in place today 20 that protect the perimeter, the connection to the 21 internet, are inundated with alerts. So we don't really 22 know what really matters until we study that information, 23 and that could take weeks, months, and sometimes years. 24 We've seen that recently in the press with a 25 retailer who had a lot -- the information was there. 0064 1 They just couldn't distill it fast enough in order to 2 take action. So actionable intelligence -- and then the 3 use of the tools that will allow that information to be 4 quickly distilled and take action -- you know, take 5 action on that. 6 MR. ROTH: So I think one of the biggest 7 challenges in information security is what's called 8 information asymmetry where one side has much more 9 information than the other. And right now the bad guys 10 are very good at sharing information. And so when there 11 is a known vulnerability with a third-party cloud 12 provider, something like that, the word spreads very 13 quickly, there's a robust marketplace for these zero 14 days, and there are tool kits, as Ari mentioned, and 15 defined exploits that people who don't necessarily have 16 the technical knowledge themselves to perform it now can. 17 I am a vigilant defender of personal privacy, 18 and I think that has to be a paramount consideration 19 here. At the same time, the most damaging threats are 20 the ones that people don't know about yet because you can 21 defend pretty well against the things that you've seen in 22 other contexts. 
And I think that the government is in a 23 unique position that it's standing between lots of these 24 different entities and stakeholders that it can 25 facilitate sharing forensic information on a new type of 0065 1 malware, that up until now, has gone undetected. And the 2 speed that you can disseminate that information can mean 3 the difference between a really damaging event and 4 something that is contained. And I think having that be 5 an ongoing dynamic process, this dialogue being probably 6 the first step in that process, is the key to sustainable 7 information security. 8 MR. ZELVIN: Can I just offer one -- another 9 thing for maybe you all to think on? One of the 10 challenges is that who does the government share 11 information with in these companies. I've been in many 12 forum where folks have said, "You haven't shared this 13 with my company." And I ask, "What company are you 14 with," and they tell me, and I say, "No, we have. We 15 just didn't share it with you." 16 Who in these companies do we reach out to? You 17 know, in New York where Mary worked for the FBI, I mean 18 they've got a really good relationship. But as you go 19 out this sector so much bigger nationwide, who is it that 20 we need to reach out to in the companies identified? We 21 need to get them security clearances, and that is part of 22 the executive order because government can only bring it 23 down so far. Secret clearance is about right and then as 24 you get higher, probably a little bit more, and same 25 thing with the Commission. 0066 1 We regularly brief the NRC, the FBI, NSA, and 2 DHS. We'd welcome the opportunity, if you wish, to come 3 over and brief you as well. But, again, these things are 4 best done at the very high security level, not to give 5 you the full knowledge if possible. 
But I would 6 recommend who, get them clearances, and then in the 7 absence of that, we are working very closely with the FBI 8 in that we will do read-ins at a classified level, bring 9 you to the FBI field office. We will give you the 10 information, and then magically you will forget it after 11 you take action. But we could and should do better. And 12 the executive order does address that. 13 MR. BAYER: Thank you. And I would turn this 14 portion over to the Commissioners for any follow-up 15 questions. 16 COMMISSIONER AGUILAR: I do have one question. 17 By the way, this -- I don't want to wave this, but I've 18 been doing this for like 10 minutes. You didn't catch 19 it. So if I need to scream I will. 20 But I have a question for any members of the 21 panel, but it may be more apt for Larry or Ari or Adam. 22 The last set of comments talked a lot about information- 23 sharing, market intelligence, and trying to get an 24 understanding of what is happening in the market. 25 Couple years ago the Commission issued guidance 0067 1 on when public companies should disclose cyber events. 2 And companies, I think, have been more reporting since 3 then. Is that information -- does that get to you in the 4 normal course of events? Do you have to go -- is it 5 pushed to you, or do you have to go pull it to you? 6 Is there a way to take better advantage of this 7 increasing disclosure that public companies are making so 8 that one company can learn from another company so that 9 NIST and Homeland Security and other government agencies 10 have a better ability to more readily grasp the 11 information and therefore look at the forest, if not 12 necessarily the trees, in terms of trends and what's out 13 there. 14 Is the information useful to you, or is it a 15 just a drop in the ocean and it's just noise that's not 16 of any relevance to you? 17 MR. 
SCHWARTZ: I'm going to -- I'll start, and 18 Larry can pick up on it -- which is I'd say, at least 19 from my experience, different sectors have reacted 20 differently from what I've seen. There have been some 21 sectors where we get a lot more information, where we 22 know a lot more of what's going on. And even with the 23 kinds of disclosure, the new disclosure information 24 that's out there, we still hear better from some sectors 25 than others. 0068 1 MR. ZELVIN: So, Commissioner, again, my 2 opinion, it is more pull than push. If I get push, my 3 personal feeling is it's -- usually the company's about 4 three or four days into it, and they're really gotten to 5 the point of "boy, this is really kicking our behind," 6 or, "this is about to go public," or there's something 7 else that is making them, you know, say, "We really 8 should contact government or go out to the industry. 9 I think the financial sector is way ahead of 10 others because they've realized -- I mean in this area 11 it's -- in cybersecurity you're not competitors. There 12 are other places where it is a very competitive 13 environment. So if you're disclosing you've had 14 problems, you're maybe bringing down your reputation. 15 Your competitor could go, "Well, look at all the problems 16 they have had." That hasn't occurred here, so that's a 17 good thing. But companies are still very fearful of 18 disclosing. And there is no requirement to disclose, and 19 they'll do it only as a last resort. 20 I was on a panel at Georgetown Law not to far 21 from here where it was simulation of a CEO, a CIO, and a 22 CISO. And for the first hour they were talking about 23 nothing other than how do they control this and how they 24 do not get the information out. And then at the next 25 hour an FBI agent from the Washington Field Office and I 0069 1 showed up. And he said, "Hi, we're here from the 2 government, and we're here to help." 
As soon as we show 3 up, it's public, and I've already talked to a whole bunch 4 of people. 5 So encouraging more sharing with out the fear - 6 - and one last point, if I may. We have the ability in 7 government to protect the information of the companies 8 that are reporting to us. We have statutory ways we can 9 do it. We have policy ways. And in many ways we'll just 10 do it because we don't want to betray. We're not 11 regulators. We're not law enforcement. We're not 12 intelligence in DHS. You can just let us know, and we'll 13 work it through. So we've got a lot of work to do there. 14 MR. BAYER: Yes. Commissioner Piwowar. 15 COMMISSIONER PIWOWAR: This has been an 16 excellent panel. I probably have like 50 questions, and 17 40 of them are probably too nerdy to ask, so I'll 18 probably do that offline. 19 Just a very basic question -- when I think of 20 identities of cyber attackers in the financial services 21 industry, right -- so we've learned that, you know, on 22 the one hand the financial services industry is ahead of 23 the curve compared to other industries, right? On the 24 other hand, it's where the money is, right? Are the 25 types of cyber attackers that we get in our space the 0070 1 same as sort of -- and in the other spaces, right? So I 2 don't see like an Edward Snowden/Bradley Manning/ 3 WikiLeaks. Maybe we do. 4 Do I think of like cyber gangs or criminal 5 organizations? Do I think of hostile governments? Do I 6 think of terrorist organizations, all of the above? And 7 are there any trends in terms of the type of attackers 8 that we're seeing in our industry? 9 MR. ZELVIN: I'll kick and ask others. 10 Commissioner, the answer to your question is yes, no, and 11 maybe. You know, to be fair, it's all of the above. You 12 know, the nation-states are coming after us because they 13 see that is the heart of America. You know, it is you 14 really are the representation of our country. So if 15 they're mad at us, they use you all to come after us. 
16 There are folks out there that are hacktivists 17 that are -- that, you know, are attacking you just 18 because, again, they perceive that as American capitalism 19 and going after it that way. I worry about the insider 20 threat a great deal. You know, for some reason 21 somebody's just not happy and they want to do something. 22 So it is a full spectrum. 23 And I think it was Javier that said, "What 24 really worries me most about the financial sectors is the 25 adversaries adapting against you in ways that we're not 0071 1 seeing in other places." So in many other sectors -- 2 and, again, we look across the 16 critical 3 infrastructures, it is normally using the standard set of 4 things that Mary was kind of covering, you know, spear 5 phishing, DDoS. In your sector they're getting really 6 creative because they have to because you're adapting so 7 fast. And they're adapting at the same speed you are, 8 but the problem is there's more of them than there are of 9 you. 10 So it's a huge challenge, and Mary covered it. 11 It's really more about how do you plan for that bad day? 12 What are you going to do about it -- because you can't 13 count on it not coming. You've got to have a plan that 14 you can adapt to and be able to react to to give you that 15 time to go ahead and take care of -- respond to that 16 threat and recover from it and then reconstitute. 17 MR. ORTIZ: Commissioner, I would add to 18 Larry's comment that we don't know where the threat is. 19 We don't know who they are. The folks that we thought 20 about yesterday are completely different three weeks from 21 now. So the most important part is to go back to the 22 basics that Mary articulated. Are we ready? Are we 23 prepared? Are we thinking of our business in a way that 24 is nimble enough and connects our business owners and our 25 technical owners in a way that they can get to the 0072 1 problem the fastest way possible and hopefully before it 2 becomes an issue. 
3 And that is a critical component of -- you 4 mentioned the University of North Carolina and some of 5 the work that they're doing. It's a great way of 6 starting connecting techies and business folks to be 7 thinking of the world in a different way. So as they 8 enter the workforce and they go work for financial 9 services firms, they already come with some of that 10 knowledge but certainly that -- this position to work 11 together in business and technology to address some of 12 those issues. 13 MR. ROTH: I think it's also a little bit 14 defined by the value of the information. Specifically, 15 when you look at credit card and payment information, 16 that is incredibly valuable to bad guys. And so we are 17 seeing -- at least I've seen more concerted activity, 18 more coordinated activity, more organized crime in those 19 areas, a more developed marketplace for exploits to be 20 able to take that information and what Javier and others 21 have alluded to, the rapid evolution of malware that's 22 able to copy this information and exfiltrate it and send 23 it and to be dynamic and cloak itself and disappear. 24 And so it's -- in some ways it's been likened 25 to a game of Whack-A-Mole, you know, when you harden your 0073 1 defenses at one side, then you have an issue potentially 2 somewhere else. But I would say the payment card 3 networks are really facing the bleeding edge of these 4 advanced persistent threats. 5 MR. BAYER: So all good things must come to an 6 end, and this concludes our panel. I want to thank all 7 of the panelists for your outstanding input to this 8 roundtable. And I also want to say that Keith Higgins 9 will be chairing the second panel. We're going to take a 10 five-minute mini break at the most to change out the 11 panel members. I want to notify the audience that we 12 will be taking an organized break after the conclusion of 13 panel two. Thank you. 14 CHAIR WHITE: Thank you all very much. This is 15 a terrific panel. Thank you. 
16 (A brief recess was taken.) 17 MR. HIGGINS: We're back. Before we get 18 started though, I'd like to first acknowledge 19 Commissioner Kara Stein who has joined -- who joined us 20 much earlier and give her the opportunity to say hello 21 and give a few brief remarks if you'd like. 22 COMMISSIONER STEIN: Thank you, Keith. 23 I'm happy to be here today, and I think it's a 24 testament to the fact that cybersecurity, as people were 25 mentioning on the first panel, has moved away from being 0074 1 an IT issue and sort of a back room operations issue to 2 one that's on the forefront of both, you know, managing a 3 company and helping investors as well. 4 So I look forward to the rest of the panel. I 5 think from my talking to industry folks feel like they're 6 constantly out on the forefront and trying to evolve as 7 quickly as the bad guys are. And I think at least from 8 the first panel the dynamic nature of this process, which 9 I think requires regulators to be dynamic as well in 10 figuring out how to help industry be resilient and robust 11 and able to withstand what's going to, I think, be a 12 constant challenge. So I look forward to the rest of 13 today and hearing from our second panel on corporate 14 disclosures. Thank you. 15 MR. HIGGINS: Thanks, Commissioner Stein. 16 Well, we've had the opportunity to get an 17 overview of the cybersecurity landscape on this morning's 18 panel. And if you're not already scared out of your 19 wits, we'll move on to disclosure. 20 We'll focus on what public companies are 21 currently disclosing about their cybersecurity threats 22 and breaches, both potential and those that have already 23 occurred, and how they determine the appropriate 24 disclosure, the timing of that disclosure, and what 25 information about cybersecurity investors need to know 0075 1 and need to know in order to make informed voting and 2 investment decisions. 
We really have a terrific panel to 3 join us today who will lead us through these topics, and 4 let me introduce them. 5 Starting on my right we have Roberta Karmel, 6 who is the Centennial Professor of Law at Brooklyn Law 7 School and a former SEC commissioner. Next to Roberta is 8 Doug Meal, who is a partner at Ropes & Gray whose 9 practice specializes in representing companies in 10 connection with cyber incidents and data breaches. 11 Next to Doug is Jonas Kron, who is senior vice 12 president and director of Shareholder Advocacy at 13 Trillium Asset Management. Next to Jonas is David Burg, 14 who is the global and U.S. advisor cybersecurity leader 15 for PricewaterhouseCoopers. Next to David is Leslie 16 Thornton, who is vice president and general counsel of 17 WGL Holdings and its subsidiary Washington Gas Light 18 Company, and then finally Peter Beshar, who is executive 19 vice president and general counsel for Marsh & McClennan 20 Companies. 21 We'd like to have Leslie and Peter start off 22 this morning by talking about how cybersecurity risk fits 23 into the overall disclosure framework for public 24 companies and how they've -- particularly how they've 25 seen that issue evolve over the last half-dozen years. 0076 1 So, Leslie, could I ask you to kick it off? 2 MS. THORNTON: Sure. So in my world I think 3 the cybersecurity landscape has really changed in the 4 last two or three years. So, you know, in February 2013 5 -- some of this will be repetitive, I'm sure. But in 6 February 2013, President Obama, as you know, issued an 7 executive order for better cyber protections for the 8 nation's critical infrastructure companies. And in 9 response, the NIST organization issued a cybersecurity 10 framework that is sort of -- I think it's still open for 11 public comment. 12 Congress tried in 2012 and '13 to pass 13 legislation that would have had a very direct impact on 14 disclosure requirements for organizations. 
Those pieces 15 of legislation didn't pass, but one of them, I think is 16 still pending. 17 State attorney generals are issuing inquiry 18 letters to companies in financial industries, asking for 19 details about sort of their cyber preparedness plans. 20 And public services commissions are likely moving the 21 issue up on their what's important to talk to our 22 utilities about list. 23 In addition, boards are being advised by 24 organizations like the National Association of Corporate 25 Directors and other very, very highly reputed 0077 1 organizations that cyber should be way higher on their 2 priority list than they are so far. And they're sort of 3 laying out for them what they should be asking, that they 4 should be asking harder questions, you know, do they have 5 enough expertise on the board to deal with those type of 6 questions. So it's a different sort of world, I think, 7 in the last few years. 8 In the energy world I sort of break it down 9 into cyber and then cyber. So cyber, in my world, is 10 Target, Neimans, companies like that, but it's also, you 11 know, half a dozen agents from the FBI or DHS or both 12 showing up at your company to tell you that your critical 13 infrastructure has been penetrated by a nation-state 14 actor "and we're briefing the President on it every day". 15 That's a different level of cyber. 16 What does all that have to do with disclosure? 17 So in my world -- I think in the world that's emerging, 18 there's materiality and then there's materiality. There 19 is -- you would think that people in the federal 20 government coming up to your office and scaring you in 21 the middle of the night would be material, but the 22 considerations on that kind of cyber risk or event are 23 different, right? 
I mean you wouldn't necessarily disclose a nation-state actor trying to do harm in an industry that's very vulnerable for all the reasons that sort of make sense and I probably don't have to describe, particularly if in that situation you don't have a customer base or an employee base that has been compromised, because that's not what they're after, right? You don't have issues related to 100,000 credit cards being compromised. You have a more discrete issue. So I -- that's sort of my framework of disclosure and what's material and what's material and what the world looks like. It's just different.

MR. HIGGINS: It is. Thank you, Leslie. Peter, anything you want to add to that before we move on to the next topic?

MR. BESHAR: Great. Thank you. Chair White, Commissioner Stein, Commissioner Aguilar, and Director Higgins, I want to thank you for the opportunity to participate in this fascinating series of panels. If you notice, Leslie and I were slinking farther and farther down in our seats during the opening panel.

And I wanted to try to frame two questions and see if we can at least begin providing some answers. The first is, is the private sector getting the message? And the second is, should there be a unique disclosure requirement and regime around cybersecurity risk, and -- the question, Chair White, that you posed -- should boards of directors have a unique cyber committee of the board; is that the appropriate way to grapple with this risk?

And I think clearly government has been out front of most of industry and most of the private -- the non-profit sector in trying to identify the risk posed by cybersecurity and then prodding the private sector to do more to understand and appreciate the nature of the risk. You hear it when senior intelligence officials talk about the risk of a cyber 9/11 or a cyber Pearl Harbor.
There was a report two days ago that the FBI and the Department of Homeland Security have notified 3,000 companies that their systems have been breached just in the course of 2013. The NIST framework that was just put out last month, and obviously the work that the Commission and the staff have done -- that really government is ahead of the curve.

The private sector is trying to catch up, and the message that the government is sending, I think, is very much being taken on by the private sector within senior management teams for Leslie's company and for ours. Cybersecurity is clearly a tier one ERM risk for the enterprise. Boards of directors are far more attuned to the significance of the risk as a result of work that the National Association of Corporate Directors has done and simply from reading the headlines.

And then for us at Marsh & McLennan, in our capacity as an insurance broker, we see another indicator that the private sector is responding with speed, in the form of take-up rates of cyber insurance within the industry. So just in the last year, 20 percent more of Marsh clients purchased standalone cyber insurance than just a year prior.

And the industries where the take-up rates are the highest -- people spoke in the prior panel about the financial industry -- clearly ahead of the game -- the healthcare sector, with the significance of HIPAA as a statute and what that means for the protection of data. The take-up rates for our clients in the healthcare sector are 45 percent, so really quite substantial. And interestingly, in the education arena many universities have been impacted by the threat, and so the message is very much getting through.
Now, whether or not for disclosure purposes cybersecurity should be treated as a unique threat, it's quite interesting to see the approach that the Commission and the staff have taken in the guidance and in some of the comment letters that have been promulgated in the past years, because it's a somewhat different approach than has been taken as to other substantial risks that public companies face.

So, for example, in the comment letters, the idea that companies are being guided to disclose prior incidents even if they're not deemed to be material is a difference in practice from the way you would typically approach most risks, for example, around litigation. You wouldn't typically disclose or be urged to disclose non-material litigation matters or financial risks and the like. Similarly, the staff doesn't typically urge companies and registrants to identify risk mitigants. And so in the cyber guidance that the staff put out, there was a reference to cyber insurance being a risk mitigant.

So I think we're all trying to react, Commissioner Stein, as you said, to an incredibly dynamic world. And what is so, I think, positive about the NIST framework is the inherent flexibility of that standard, that it can apply and adjust to this very much evolving threat. And what's clear is that government and business and the non-profit world need to partner together to figure out what's the best way to respond to that, because we surely know that our adversaries are adjusting their tactics as we speak.

MR. HIGGINS: Peter, if I could ask, in your company's writing or, you know, getting insurance or having companies buy insurance, I assume that there must be some underwriting process by which you assess what the risks are. And are the frameworks available to be -- for an insurer obviously you're writing it, so you must be able to do it.
And is that helpful to companies to -- in other words, the fact of buying insurance means that they need to up their game in order to qualify for the insurance.

MR. BESHAR: It's a terrific point, Keith. The very process of applying for insurance has a protocol associated with it which is not wildly different from the NIST framework. You have to conduct a gap analysis of sorts -- there are industry standards and best practices -- and try to assess where you are in relation to those standards and what steps you might take to present yourself as a better insured. We at Marsh & McLennan -- we're not actually the insurer.

MR. HIGGINS: Yeah, right. I understand.

MR. BESHAR: But we are the broker trying to assist the clients in trying to position themselves so that they can get the best rates and be perceived as a good risk as opposed to a poor risk.

MR. HIGGINS: Thanks.

Moving on to sort of the board level, one of the things we want to talk about is we heard the earlier panel talk a little bit about how boards are more involved. How, in your experience -- and actually I'd like to turn to David if -- David, if you don't mind. Have you experienced, as a consultant and being involved with companies, more board-level attention to this? Are you spending more time with boards of directors? How are companies organizing the way that they look at cybersecurity?

MR. BURG: Yeah. First of all, thank you for the invitation to be here today. I really do appreciate the opportunity.

Yeah. It's very interesting, you know, Keith, your question surrounding board members' focus and attention on this issue. You know, at PricewaterhouseCoopers we have about 2,000 professionals around the world who focus on helping our clients respond to breach events but also spend a lot of time consulting with their clients to really -- to transform or change their security programs.
And what we've seen from a trend perspective over the last -- really over the last two years is a significant uptick and increase in the level of involvement at the board level, where board members have a very keen interest in understanding the nature, the extent, the consequence of breaches but are also asking questions around, you know, why did this happen; why were we targeted; and what are the strategic implications of the breach.

And I think that last point around the strategic implications is very important because most corporations have some dimension of a global footprint, are technically multinational corporations. And when thinking about strategy and thinking about where and why various attacks may occur, board members are stepping back and really realizing that it's important to think about cyber and enterprise risk management really as being one and the same.

And so we see board members, interestingly, reacting and diving very deep into understanding the details of the attack, how it occurred, who the attackers were, but then also starting to focus on the competency of the security program and the professionals and the use of technology inside the organization as well.

I think one of the other important points around board focus in terms of dealing with breach response events is that the board today, in many cases around serious cyber breaches, remains focused on not only the short-term response of the corporation, but really will stay focused on the mid-term and the long-term performance of the company as it executes its response program, as it deals with communication to customers, to consumers, to business partners, again, in some cases, you know, clearly with a significant focus in the United States, but also taking into consideration all the implications around the world.

MR. HIGGINS: Doug, let me ask you the question.
You spend a lot of time on incidents, data breaches, and the like. How often are you advising boards of directors, and are boards involved in the actual, you know, monitoring of the progress of an event?

MR. MEAL: Well, again, I appreciate so much the opportunity to be here, so thanks for having me.

Well, yes, if the issue didn't catch the board's attention before the fact, when there is a major breach, they are all over it, absolutely. And so we, in the work that we do in working the breaches that we work, are regularly advising boards on everything related to the breach, including their disclosure obligations around the breach.

I think I would say -- and a lot of the companies that we've worked for on breaches are public companies. I would say that I really can't think of a case -- and we've worked a lot -- where the disclosure thinking or analysis was driven by the securities law issues, frankly.

Basically there are other state laws, other situations that are going to create a disclosure obligation, and that's what drives it. And I think, just to be someone speaking from the trenches in terms of the reality of what really happens, there is a tremendous disincentive to disclose a breach.

And so if the breach isn't otherwise going to become public, if you suffered a breach, you know that if the breach were to become public, you are now going to be a target of a lot of class action plaintiffs, of consumer protection regulators, who will not look at you as the victim of the breach the way folks were talking in the earlier panel, but will look at you as almost the perpetrator of the breach.
And so if a company can conclude that it doesn't otherwise have a disclosure obligation, probably it's going to be very easy for the company, if it is a public company, to conclude that it doesn't have a material situation that would generate a securities law disclosure obligation.

So while it's always in the back of the mind of the board and senior management, and when -- certainly when it is going to become public, then we work hard in terms of crafting an appropriate securities law disclosure. But it is not the driver. When you're in the trenches dealing with a real breach, what's really happening doesn't drive the thinking.

MR. HIGGINS: Moving on, then, Jonas, to turn to you, from an investor's point of view, what do investors want to know about cybersecurity, both on the preparedness front as well as upon incidents occurring?

MR. KRON: Sure. And I'd also like to add my thanks for the opportunity to share our thoughts today.

I think one of the things that hasn't been spoken about very much and might be useful to sort of introduce into the conversation -- and this is actually something that was identified in the SEC guidance in 2011, which, if I could just quote from it -- is "discussion of aspects of the registrant's business or operations that give rise to material cybersecurity risks."

The information that companies collect about their users, about their customers, how they collect that information, what they do with that information, how long they keep that information -- you know, in the social networking world that we have these days there tends to be an over-collection of information.

And companies disclosing what -- providing more disclosure about how they're collecting that information, what they're doing with that information, how long they're keeping that information, so that we know whether they're creating a greater risk profile for themselves.
If that risk profile, if that risk exposure is larger, then they're going to be a bigger target. And so having a level of disclosure around that could be very helpful for us.

MR. HIGGINS: I see. So disclosure specifically with respect to companies that take in personally identifying information from customers or consumers with whom they deal?

MR. KRON: Exactly. You know, there's -- you know, in some ways we're sort of in the Wild West. You know, we're collecting -- you know, media companies, retail companies -- they're all collecting an enormous amount of information right now, and they're sharing it with other folks, and they're seeing it as marketing opportunities and for advertising.

And that's all good, and there's a lot of innovation there, and there's a lot of opportunities for sort of providing social goods related to that. But there's a risk that comes with it as well, especially if you're over-collecting and if you're keeping that information for too long in ways that aren't really necessary for their core business. And that can be a differentiator between companies: those companies that have really decided, "We're going to rein in some of this. We're not going to push the boundaries so much," versus companies that are just trying everything out and creating risk for themselves that way.

MR. HIGGINS: One thing about risk disclosure is we have historically said over the years that we don't really want companies to be disclosing a risk that everybody faces, that if it's going to be generic boilerplate-type disclosure, we really don't want to see it. And I think companies often have a difficult time deciding, you know, what level is the right level to disclose, and it does sound a lot like it's everybody else's problem. Is personally identifying information different?
And, Leslie, would a company that's involved -- maybe a critical infrastructure company -- be different? How do you think about that?

MS. THORNTON: I think a critical infrastructure company is different in that. So in the energy world, the utility world, you might have a bad actor that is not China interested in intellectual property, not Russia interested in customer data, but some other nation-state actor that is targeting your operating system, your system that moves gas, to see if it can cause -- see if it can over-pressurize the lines, under-pressurize the lines, just -- you know. And for companies like that in an area like this, right, where there are lines under the White House and under the SEC and under the Congress, it's a different calculation, right, because they're not interested in what our customers pay or what our customers do. They're interested in something else. And so the calculation is a hundred percent different.

One of the things that we did last year -- our chairman did last year, being very pressing about this -- was reach out to the cyber community for a sort of cyber expert that we could seat on our board. So last spring we seated on our board Linda Gooden, who was retiring from Lockheed Martin as its head of technology infrastructure for many, many years, with some levels of clearance that don't even have, like, letters. And so she's been on our board for a year, and there is nothing she doesn't know about this world, about this space.

And so for us, when we have conversations with the board, we have someone on the board now who understands the language, who can ask the right questions when our CIO is in front of them explaining where we are and how protected our system is or isn't. It's just been a great help. And it doesn't just help management sort of manage the issue. It helps the other board members, right.
It helps them have a real sort of comfort level that somebody there knows it in a level of detail that -- you know, she was in that job for 30 years. That's a lot of experience. We're really happy to have her.

MR. HIGGINS: Let me follow up on that, and, Roberta, let me ask you the question. There was some talk on the panel about needing to make sure that -- this was on the other panel -- that the board of directors had someone who really has expertise in cyber. And, you know, we went through a time when you had to have an audit committee financial expert, and you have to have -- you know, then compensation became very important -- important to have somebody with that expertise. Roberta is a scholar of corporate law and securities law and the like.

What do you think about -- I mean the board's job is oversight of the organization. They manage the affairs of the corporation. What about having those expertises, and when does it become the board actually running the company as opposed to overseeing the management?

MS. KARMEL: Well, I don't think the board should be running the company as opposed to overseeing the management. I think it probably depends somewhat on the nature of the company's business. For example, a very large international financial institution might want someone on the board who has that kind of cybersecurity expertise. I'm not sure all other companies would feel this is as urgent. On the other hand, there are a lot of issues that boards have to think about. And a lot of times it's about whether the board is sufficiently up to speed so they can ask the right questions.

I was on the board of a public company when there was this Y2K threat. And it seems to me that every single board meeting for about two years we had a report on what the company was doing with regard to the Y2K threat.
I'm not on the board of a public company now, but my guess is this is a question that comes up now at all of the board meetings: what is our cybersecurity preparedness; are we in compliance with the various protocols that apply to us, depending on what the company is.

So I mean a board of directors is supposed to be composed of people who are generalists to some extent. And to have a special cybersecurity committee, as opposed to a person or responsibility by the audit committee or the risk management committee for cybersecurity matters -- I would hate to see that become a requirement. I think it's something that should be determined from company to company.

MR. HIGGINS: David, let me ask you the question. In your experience is there any best practice that you've seen, or does one size fit all, or is it really going to depend on the company?

MR. BURG: Yeah. I think it really does depend on the company, the industry in which the company operates, and also the recognition or the awareness at the management level of the kinds of risks that are, in fact, attendant because of cybersecurity and because of the reliance on technology to fuel innovation.

I mean I think that at the board and CEO level there are a couple of interesting trends that are occurring right now. I mean right now we see that 86 percent of U.S. CEOs view technology as an enabler of innovation that will allow businesses to grow in size over the next five years. And at the same time close to 70 percent of U.S. CEOs expressed concerns related to cybersecurity. And, in fact, when you look at even the top five risks this year at Davos that were discussed, cyber was number five on the list of issues to potentially manifest with the greatest amount of damage.
And so something -- the awareness factor seems to have risen to a point where many CEOs and boards certainly are aware today of the kinds of risks that come from operating in a world where we're highly interconnected. We have this global business ecosystem. We heard in the first panel that, when we think of enterprise risk, it is not comprised only of the risks inherent in securing the business enterprise itself; we must also assume, because of these extended business supply chains, that the risks actually extend outside of the boundaries of the company's control.

And so I think the reality is that today, because we do face incredibly sophisticated threat actors that are out there, including those that were highlighted in the first panel discussion -- but I mean I cannot overstate the level of attack or sophistication that we see in the real world, in very real cases where it's not just the vulnerability that is exploited, the perimeter that is able to be penetrated, but we see attackers going very deep inside business processes, inside applications, completely manipulating or controlling or making irrelevant the security control environment.

And so with the very real kinds of risks that we see, whether the cases are made public or not, we are seeing that, among the -- you know, the large sophisticated corporations at the CEO and board level, there's -- again, there is this awareness. And so in some cases there may be efforts undertaken to place someone on the board who has tremendous depth in the area of technology and cyber, as Leslie pointed out.

So, you know, I think that the short answer to your question is that it does depend.
But there definitely, I would say, is a -- there is a lot of energy right now that's focused on making sure that at the senior management and board level the risks associated with operating in today's business environment are very well understood and then that investments to mitigate or to reduce risks are appropriate. And we can -- so maybe talk more about, you know, some of those activities, you know, later in the panel discussion.

MR. HIGGINS: Great. Thanks.

You know, one thing -- a lot of responsibilities are put on the audit committee. Any thoughts on whether the audit committee -- I mean, you know, it seems they're heaped on for quite a lot of things. Peter, any thoughts on is the audit committee the right place to have this task?

MR. BESHAR: Certainly for our company, Keith, the audit committee agenda tends to be the longest of any of the committees. It'll last three or four hours, and not infrequently. So it is a very full plate that they're being asked to do.

That said, I think it's probably a good place for there to be a regular discussion about logical security and what the company is doing in terms of preparedness and resilience. And then in addition, most public companies have an ERM framework, and there is a regular reporting on what are perceived as the top risks into the full board of directors. And that's another mechanism to ensure that cybersecurity is regularly assessed.

MR. HIGGINS: On the other front, I would note that the Center for Audit Quality just put out a member alert in the last few days, which really details what their view is of the role of the independent auditor. And I won't go into it today, but it was an interesting take that they have on the role of the independent auditor in the cybersecurity landscape.
Moving to disclosure, you know, you always look at your -- after an event happens, you always go back and look at your disclosure. And you might think, "Gee, I wonder -- what should have been disclosed?" So what are companies disclosing right now about their risks? Have we learned anything from the various incidents that have happened about what they should be disclosing, and what's a good way to go at it? Anybody -- anyone want to start with it? Jonas, you want to start with it?

MR. KRON: Sure. I think, you know, what we've seen is -- well, it was one of the responses to the SEC guidance, and I think we see a lot of the language that was there being actually mirrored in the disclosures. And unfortunately I think we're seeing a lot of boilerplate, and I think that's really, you know, the honest truth of what we're seeing, and I think that's really unfortunate.

And I think that's -- you know, what the disclosures are supposed to do is to be able to create differentiation, to be able to say, you know, "Yes, all companies may have exposure to this, but why is my risk profile different from my peers," or, "This is the level of expectation in the industry for security. This is where we are, and this is how we're going to get up to that level of disclosure -- or up to that level of security."

And in some respects everybody is exposed to the risks that are out there. There's not necessarily a differentiation there. The differentiation comes from within the business practices of the company and its operations. And it sort of goes back to what I was saying before: what decisions the company is making about this information, the information that they're collecting, the information they possess -- that creates risk for them. Sometimes the business model is the risk creator. And I think being clear about that is really important for companies to do.
And to come back just to the board piece of it, I think it's important for the boards and particularly the committee structures to simply spell out where the responsibilities lie. I don't know if it's necessarily important for it to be with an audit committee. Some companies have technology and e-commerce committees. I have seen companies that have public policy and regulatory committees. I'm not sure it's necessarily important to say which one it belongs in, but let's be clear about where it is and what the responsibilities are so that the market can make a decision about that.

MR. HIGGINS: Let me ask you a question, and the boilerplate comment, I think, is a fair one. If you take boilerplate on the one hand and on the far side you take a look at the specific road map of the company's vulnerabilities and what the consequences of those vulnerabilities could be, where do you find the balance? How do you -- is there somewhere in the middle that will be helpful to investors while at the same time not harmful to companies? And, Jonas, if you want to take this -- but obviously I'll kick it out to all the other panelists as well.

MR. KRON: Sure. You know, yes, I think there definitely needs to be a balance point that's reached. And I think you sort of outlined the two poles of that pretty nicely. And unfortunately, I think we're just way too far over on the boilerplate, you know. I don't think we're anywhere close to having conversations about "oh, geez, companies are providing too much information." I think we're -- you know, we're pretty far from that situation right now. So I think there's definitely a lot more room to come towards the middle.

And some of that can be, you know, still done in more general terms and can be done through an audit process.
You know, if you have third parties coming in to provide audits of companies' practices, then you can create -- you can create sort of a shield, I suppose, from sort of providing that exceptional level of detail that's going to be problematic, but nevertheless providing investors with a point of reference where they can make a determination.

MR. HIGGINS: Let me ask this question. And I actually don't know the answer to it. Is there a recognized framework, like the COSO framework is for internal control over financial reporting? Is there a recognized framework by which cyber preparedness can be assessed, evaluated, and you can determine whether there are significant weaknesses or material -- you know, material weaknesses? Doug, do you know?

MR. MEAL: Well, there are. I mean, for example, in the payment card arena there is the Payment Card Industry Data Security Standard, and that's a framework that any merchant who handles payment card data is supposed to comply with, and in other areas there are as well.

But I can't -- I can't really envision a scenario where somebody undergoes a security assessment against one of those standards and then in its disclosure says, "We were found to be in violation of 1.2.1 and 2.2 and 3.3, and we have this and that and the other vulnerability." That might be meaningful disclosure, but that would also be the exact sort of opening-the-door type of disclosure to your adversaries that you don't want to do.

So I don't know that there really is that middle ground that is going to provide the level of detail that would really be meaningful as a differentiator on the one hand but wouldn't, in fact, go too far in terms of what you'd be saying. But there are standards that are out there.
In fact, one of the big problems in the area is that many of the companies that had suffered these breaches had been certified as fully compliant with the relevant standard and suffered a breach anyway.

MS. KARMEL: Just to add in about financial institutions, I believe there are standards and that the bank examiners examine as to whether or not the banks are in compliance with these standards. I think probably the last thing the examiners would want disclosed is exactly what the examination standard is and exactly what breaches they have found. I mean this goes along with a lot of other kinds of dynamics of highly regulated industries. And I think what, you know, we should want is companies that are taking these cyber threats seriously and where the government is helping the companies to overcome them. So this may be an area where more disclosure is not really in the public interest.

MR. HIGGINS: Chair White?

CHAIR WHITE: Just to go back to the -- I guess the triggers for disclosure, obviously materiality being the legal test.

And, Doug, I think you mentioned that staff guidance and the securities law may not be what's driving the disclosure, that you have disclosure obligations under state laws and various other laws. I guess I'd be interested in just a little bit of an elaboration on what those other laws' triggers are -- that is one question that I have. I worry a little bit that, you know, as you described, that the materiality -- you know, is it being applied as it should be? Is it the quicker trigger that it -- you know, should it be a quicker trigger than it is under the securities laws is one question.

And then, Leslie, you commented on -- and I think I took away -- well, I'll tell you what I took away from it, and you'll tell me if it's wrong. But there are events, and there are events. There's cyber, and there's cyber.
And so if somebody from DHS and the FBI 13 comes to visit you -- that's not a usual event hopefully 14 every day in your company's life. But it doesn't 15 necessarily mean that, even though they're there for the 16 purpose of informing you of something that's, you know, a 17 cyber breach, potential cyber breach, that may or may not 18 be material I mean, I think, is what you were saying. 19 And then my other question is do you ever 20 get -- is there ever an issue where you're visited 21 usefully, hopefully, by DHS or the FBI and they inform 22 you of something and they say, "But you can't disclose 23 that?" One more. I'll do one more and then I really 24 will stop, okay -- which is that -- and it's something 25 I'm trying to sort out in my own mind. 0103 1 I mean obviously, you know, in terms of 2 investors and what they need, I take your point, Jonas, 3 for -- on the boilerplate that needs to be useful to the 4 investors. Obviously you have the sensitivity on the 5 other end. Some of the information, at least as a 6 country, to sort of protecting companies, if it's 7 provided to let us say it may not be material to 8 investors, but it would sure be useful, you know, to DHS 9 or whomever it is on a confidential basis to further, you 10 know, protect companies and investors. You know, how do 11 we sort out all those issues? And I will say you can 12 answer two of the four, whatever you'd like. 13 MS. THORNTON: You know, I never said thank you 14 for having us, so thank you for having us. Sorry about 15 that. 16 So there are circumstances where federal 17 government agencies will show up and say, "This is what's 18 going on. It's classified so you can't talk about it." 19 So how do you balance the SEC's interest in disclosure 20 issues with another part of the federal government 21 telling you that you can't talk about whatever's going 22 on? 
23 So the way most companies, I think, in -- 24 particularly in the energy industry would respond to that 25 would be to cooperate with those agencies until whatever 0104 1 level of incident is resolved and then sort of make a 2 determination whether there's -- it's the right time to 3 disclose something. I mean you have to -- you know, 4 obviously shareholders want to know if something's going 5 on that's going to materially affect, you know, the value 6 of their holdings. 7 But on the other side what you don't want to do 8 is disclose something that they can't interpret correctly 9 because they have this much information about it or they 10 just don't understand well enough to sort of make the 11 right decisions. It's a very complicated sort of set of 12 circumstances. 13 I would hope that in a situation like that the 14 agencies would understand each other's sort of priorities 15 and not penalize a company for taking one approach or the 16 other. Happily, those things don't happen very often, 17 but my concern is generally that they're going to happen 18 more often, that nation-state actors in different places 19 than we are already aware of are getting more 20 sophisticated and figuring out that they can do things 21 more than just steal a bunch of credit cards and create 22 new credit cards. They can actually create physical 23 harm. 24 And then what do you do with that? What do you 25 do in that scenario? I think in most -- I think most 0105 1 companies in that scenario sort of follow the lead of the 2 agencies that they're working with and just take those 3 leads. 4 I assume that people know about or I hope 5 people know about -- it's a joint task force, and it's 6 called the National Cyber Investigative Joint Task Force. 7 And it's made up of, I think, about 13 agencies, and it's 8 NSA and FBI and DHS and CIA and all the ones that you 9 would think that go in there.
And their job is to work 10 with companies when something happens, to figure it out, 11 fix it, make sure it doesn't happen again, and cooperate 12 in a way, I think, that's productive. 13 That kind of cooperation, I think, addresses 14 some of the broader issues of how you protect a company's 15 value, right? I think that's one of the things that is a 16 by-product of that kind of cooperation. And so I like 17 the fact that they exist. I like the fact that you can 18 get into consent agreements with them and actually work 19 together with those experts and create a safer 20 environment for your company because in the end that's 21 your shareholder value. Did I answer your question at 22 all? 23 CHAIR WHITE: Oh, it does, and I think it's 24 also -- and this goes back to Jonas' point about the 25 boilerplate too. I mean what you worry about, you know, 0106 1 is rational thinking that may not lead to meaningful 2 investor disclosure that actually -- you know, on a 3 material subject that does need to be made as well. I 4 mean I think -- I think that's kind of on the other side 5 of the coin. I mean obviously the more we're protected 6 across the board, you know, for shareholder value, 7 company value, you know, that clearly leads in that 8 direction, but it doesn't quite answer the -- what's the 9 company obligated to disclose and when and how -- I think 10 is the -- but I get all the complexities of that. 11 MR. MEAL: I can address the first question, 12 which is what are the laws that are driving the 13 disclosure. And I think what we're really talking about 14 there is the state laws that mandate, at a very, very low 15 threshold, disclosure of breaches that result in access 16 to personal information. But I think that illustrates 17 the point. 18 Think of -- what are the breaches you hear 19 about? Well, they're always the breaches that involve 20 personal information. You never hear about the breaches 21 that involve theft of IP.
You never hear about the 22 breaches that involve theft of somebody's plan to launch 23 a takeover of another company, right? And that's what 24 I'm saying, is that all of those are being sort of ruled 25 out of the disclosure game on materiality grounds. And 0107 1 the only ones that rise to the surface are the ones where 2 there is this much, much, much lower threshold of 3 disclosure. 4 MR. HIGGINS: Commissioner Stein? 5 COMMISSIONER STEIN: I just wanted to follow up 6 because I think this is sort of begging the question. Is 7 -- has the Commission given issuers enough guidance, 8 going back to 2011, to actually make where that line is 9 clearer, right? Do we need some type of minimum 10 standards? Is that possible, you know, given the broad - 11 - the breadth, right, of the public companies that we're 12 overseeing. Or is there a way for us to be as dynamic in 13 our, you know, requirements as the landscapes and the 14 threats are, right? 15 I mean I think what I got from the first panel 16 is we all need to be dynamic. We all need to keep 17 moving. We all need to be looking in a multifaceted way 18 at this and how we can help each other because at the end 19 of the day we want our issuers to be strong and not 20 harmed, and we want our investors to have confidence in 21 the companies they're investing in. And I think at the 22 end of the day we want people to go to the higher common 23 denominator. 24 We want to be nudging, right? This is a little 25 bit, I think, from the shareholder perspective, is how do 0108 1 I nudge my company towards making better decisions, you 2 know, on the cybersecurity front so that my investment is 3 not subject to, you know, the whims of another country's 4 attack on us. 5 So, again, in the spirit of that -- and I think 6 this is what this entire day is about. How can we be 7 more dynamic? How can we help you? 
If the materiality 8 standard isn't working in this particular situation in 9 the way it might in others, what should we be talking 10 about? Should it be principles based, or should there be 11 a floor, and should that vary from industry to industry? 12 That's a big set of issues, but I think that's what 13 we're really, you know, struggling with. 14 MR. HIGGINS: That's terrific. And if all of 15 you can kind of -- if somebody could take the lead on it 16 -- Peter, do you want to start? 17 MR. BESHAR: Sure. Commissioner Stein, I think 18 that the Commission and the staff have really struck the 19 right balance up to this point of trying to prod 20 registrants, raise awareness of the importance of the 21 issue, and at the same time be sufficiently flexible so 22 that you're giving us different data points. So do you 23 have personally identifiable information of a substantial 24 sort in your operations, credit card numbers, social 25 security information, personal health data? When a breach 0109 1 happens, did your organization identify that breach, or 2 was it some third party that came and brought that to 3 your attention? 4 So I think the idea of additional guidance from 5 the staff and from the Commission would be immensely 6 helpful actually as opposed to trying to do it solely 7 through comment letters, which is helpful, but at the 8 same time to consolidate it and pull it together of what 9 are the best practices. 10 One of the issues that I think reflects the 11 dynamism in the world right now is there's a perception 12 from some of the research that this is not really a stock 13 price event as a general matter; it's much more of a 14 reputational and brand impact. And that's a little bit 15 less significant or less interesting to investors. 16 Everybody obviously cares about the reputation of the 17 company.
But some of this data has suggested that in 80 18 or 90 percent of data incidents, the stock price has 19 either not gone down at all, or it has recovered really 20 quite quickly. 21 I suspect that dynamic's going to change over 22 the next year or two. As the nature of the threat 23 intensifies, the impact on the operations will be more 24 significant. And if therefore it becomes much more of a 25 quantitative impact on the company rather than a 0110 1 qualitative, then that's something that would be quite 2 helpful, obviously, to be properly disclosing. 3 MS. THORNTON: I do think it would be helpful 4 to have -- I agree with all that. I do think it would be 5 helpful to have a way to look at different types of 6 industries, particular types of threats. So -- and I 7 don't know how -- there's so many different types of 8 industries, and the bad guys get more and more 9 sophisticated but -- that maybe you do have a different 10 analysis for critical infrastructure companies than 11 Target. I mean maybe you just have a different sort of 12 way of looking at it or at least a framework that allows 13 a different type of dialogue about it. 14 MR. KRON: Sure. If I could just sort of add 15 in a couple of thoughts there, you know, is one is, you 16 know, there tends to be this sort of discussion about 17 shareholders in this monolithic way. And the fact of the 18 matter is that different shareholders have different 19 levels of risk tolerance, and so that's sort of the 20 important thing, is being able to determine, you know, 21 what the -- we want companies to have risk. That's part 22 of innovation. That's part of creating value, but it's 23 compensated risk. And what level of compensation of risk 24 and return -- you know, what's that balance point that 25 you're looking at? 0111 1 So having companies help us understand the 2 differentiation between themselves and their peers in 3 terms of level of risk can be a really important piece of 4 information.
5 The other is that there's sort of the 6 buy/sell/hold point of view and, you know, whether you're 7 going to affect share price because all of a sudden 8 everybody wants to get out of the company. But there's - 9 - a lot of shareholders are long-term shareholders. And 10 the question is really the exercise of shareholder rights 11 and who you're voting for for the board of directors and 12 what those conversations you're having with the company 13 are going to be going forward as a shareholder that's 14 going to be there for the long term. So providing that 15 kind of information -- you know, providing the 16 information with that in mind, I think, is equally 17 important. 18 MR. HIGGINS: Thanks, Jonas. 19 David or Doug or Roberta, anything -- that was 20 a good thing actually to wrap up on, Commissioner Stein, 21 because we actually are coming past our appointed hour, 22 and we do owe you an opportunity for lunch. 23 But David, Doug, Roberta, anything else? 24 MR. MEAL: I'll just add a little bit of a 25 contrary view in that the idea of sort of -- what we sort 0112 1 of posited here is we ought to figure out a way to, from 2 a securities law perspective, force disclosure of more 3 breaches, essentially. 4 COMMISSIONER STEIN: I didn't say that. 5 MR. MEAL: Or think about that. 6 COMMISSIONER STEIN: But I -- you know, I think 7 we need to think about risk in a more dynamic way. And 8 to go back to it's not a point of conduct; it actually 9 might not be a breach. It might be how effective I am at 10 the perimeter; how am I effective once it's been -- my 11 company's been infiltrated; how resilient I am in pushing 12 back because the attacks are going to happen. 13 So, again, I want to broaden out our way we're 14 all looking at this because I think that's what I got 15 from the first panel. And -- 16 MR. MEAL: Okay. 17 COMMISSIONER STEIN: -- you know, let's get out 18 of the black and white because I don't think it's black 19 and white. This is very gray.
And how we don't hurt a 20 company, right -- and at the same time people understand 21 what the risks are, and we're helping, by disclosure of 22 that risk, moving the company forward to a better place. 23 MR. MEAL: The -- I mean I think back to the 24 Albert Gonzalez era. He was the hacker who hacked dozens 25 of companies in the 2007, '08, '09 era. A couple of 0113 1 those companies, a couple of my clients, disclosed the 2 breaches that they suffered. They were rewarded with all 3 kinds of litigation, regulatory investigations, and 4 enormous litigation expense and burden. Many of the 5 companies concluded, clearly on materiality grounds, not 6 to disclose their breaches, and it came out years later 7 that they had suffered breaches that were even larger 8 than some of the ones that were disclosed, and they never 9 suffered any of that. 10 Now from a materiality point of view, from a 11 securities law point of view, you could make an argument 12 per the stock movement issue that none of those breaches 13 were material. Even TJX, which suffered one of the 14 hugest breaches in history -- its stock price did not 15 move when its breach was announced. 16 So I guess I would sort of, you know, in a bit 17 of a contrary view say I don't see that it's going to 18 help investors to push a standard that would lead to a 19 disclosure of immaterial breaches in terms of breaches 20 that don't really affect the way the investors evaluate 21 the company. 22 MS. KARMEL: Well, you know, there are steps 23 the SEC could take to require more disclosure about 24 cybersecurity matters, and if I were on the staff, I 25 could think of a number of new rules. You could make 0114 1 this a line item disclosure. You could have a disclosure 2 as to exactly what our responsibility on the board is for 3 cybersecurity matters.
4 But I have a resistance to the idea that, when 5 a matter becomes a really important matter of public 6 policy, the SEC should be tasked with doing something 7 about it. When environmental matters were very much on 8 the top burner -- not that they aren't today -- but then 9 the SEC was told by legislation actually, "Well, you have 10 to make companies disclose all environmental incidents 11 whether they're material or not." 12 I think that that kind of an initiative just 13 adds to the prolixity and length of disclosure documents 14 without being really that helpful to investors. And at a 15 time when hopefully the SEC is going to look at 16 disclosure policy and try and simplify it, I don't think 17 the Commission should be going overboard in another 18 direction of putting in new regulatory requirements with 19 regard to cybersecurity disclosure. 20 I think the 2011 guidance was good. It led to 21 a lot of interesting disclosure although much of it does 22 seem to be boilerplate. Maybe that could be refined and 23 a new such guidance could come out, but I'm not sure the 24 SEC is the agency that really should be pushing companies 25 to do more by requiring more disclosure of breaches and 0115 1 other kinds of information that aren't material. 2 MR. HIGGINS: Let me just say a little bit like 3 a radio talk show host, well, radio audience, what do you 4 think? I mean this is what the comment file is for. And 5 obviously we've set up the two poles that we can -- that 6 are there. And there's obviously somewhere in between 7 that needs to be found, and that's what we hope our 8 commenters will look at. 9 Well, thanks so -- 10 MS. THORNTON: Can I add one quick thing? 11 MR. HIGGINS: Oh, sure, Leslie. 12 MS. 
THORNTON: If you're going to look at 13 things and try to be more dynamic, one issue I would 14 introduce as an important part of being dynamic would be 15 the direction that board members sort of understand is 16 their obligation to do or not do because one of the things 17 I'm concerned about is when you're -- I sit on a public 18 company board in my spare time. And one of the things 19 that, you know, we think about is what is our obligation. 20 You know, the audit chair is always concerned "if I don't 21 push management on a disclosure issue, is this going to 22 come down on me". And that's not a position I would -- 23 it's not a comfortable position to be in, particularly in 24 such a sensitive environment. 25 So I would just keep in mind the fact that directors 0116 1 have to weigh in on that and what language will be in 2 anything new that would impact their evaluation. 3 MR. HIGGINS: Well, thanks again to the panel. 4 You've done a terrific job. I appreciate the time that 5 -- we appreciate the time that you have taken and thanks 6 so much. 7 The lunch break is a little bit shorter than we 8 hoped it would be. We want to get back at 12:45. The 9 good news is that fast food heaven is only steps away, 10 and so you can just go over to Union Station and grab a 11 quick bite. And we're going to try to convene at 12:45, 12 so thanks for your patience. 13 CHAIR WHITE: Thank you to all the panelists. 14 COMMISSIONER STEIN: Thank you. 15 CHAIR WHITE: Terrific panel. Thank you. 16 (A brief recess was taken.) 17 MR. BURNS: Good afternoon and welcome back to 18 the Cybersecurity Roundtable. My name is Jim Burns. I'm 19 the deputy director of the Commission's Division of 20 Trading and Markets, and I'm pleased to have the 21 opportunity to moderate the next panel.
22 This session will focus on the cybersecurity 23 issues facing market participants who operate key market 24 systems, including the protection of customer and client 25 information, the integrity of primary systems such as 0117 1 trading systems, as well as secondary systems and the 2 impact of common cybersecurity attacks such as denial of 3 service. This session will also consider some leading 4 thoughts on approaches to cybersecurity. 5 Because some of you may have only now joined us 6 after the break, I'm going to echo remarks that were made 7 at the very beginning by one of my colleagues that the 8 views you might hear me expressing now or by any 9 moderators throughout these remaining two panels are our 10 own thoughts and don't necessarily reflect the views of 11 the Commission or of our colleagues on the staff. In 12 fact, as a moderator, you might hear me ask questions at 13 times that are diametrically opposed to what I might 14 happen to think about something, not that that and 50 15 cents will buy you a cup of coffee. 16 And I'm pleased to see on the stage our Chair, 17 who has been with us throughout the morning, Chair Mary 18 Jo White, and her colleague Commissioner Kara Stein. 19 Throughout the day Commissioners Aguilar, Gallagher, and 20 Piwowar have been coming in and monitoring, and their 21 staffs are with us. 22 I would also like to remind our audience -- 23 again, you've heard this a number of times today, but for 24 those who've just joined us, we have an open comment file 25 associated with this roundtable, and we strongly 0118 1 encourage you to submit your comments on the issues that 2 might be raised by this panel as well as the other panels 3 today. Because of the limited time at our disposal, 4 we're not going to be hearing real time questions from 5 the audience. 
6 A lot to cover in this session so let me very 7 briefly introduce our panelists whose very impressive 8 biographies you can read about on our web site -- and not 9 in the order that they're sitting but in my cheat sheet - 10 - Katheryn Rosen, the deputy assistant secretary of the 11 Office of Financial Institutions Policy at the U.S. 12 Department of the Treasury, Mark Graff, Chief Information 13 Security Officer of NASDAQ OMX, Tom Sinnott, Managing 14 Director, Global Information Security, with the CME 15 Group, Aaron Weissenfluh, Chief Information Security 16 Officer at BATS Global Markets, Todd Furney, Vice 17 President, Systems Security at the Chicago Board Options 18 Exchange or CBOE, as we call it around here, and Mark 19 Clancy, Managing Director and Corporate Information 20 Security Officer at the Depository Trust and Clearing 21 Corporation or DTCC. 22 Our panelists bring with them a wealth of 23 knowledge and experience related to cybersecurity issues, 24 specifically those that are facing our markets today, and 25 we are extremely grateful to them for taking the time to 0119 1 come and see us. 2 If I could at the very outset -- and we're 3 privileged to have Katheryn Rosen here with us whose 4 colleague Cyrus Amir-Mokri addressed us in the first 5 panel. And we'd like to open the floor for her to 6 offer a few opening remarks. 7 MS. ROSEN: Thank you, Jim. And thank you 8 Chair White and Commissioner Stein. 9 I'm really happy to be here today to help 10 explore cybersecurity in the context of exchanges and 11 other market infrastructure. As deputy assistant 12 secretary for Financial Institutions Policy where our 13 responsibilities include our office of Critical 14 Infrastructure Protection, I sit at the nexus of broad 15 matters facing financial market utilities.
16 Earlier in the session Jim mentioned that the 17 Treasury serves as the government sector-specific agency 18 for financial services on critical infrastructure, which 19 addresses threats to critical infrastructure both 20 physical and cyber. 21 Treasury also wears the hat as the chair of the 22 Financial Stability Oversight Council. And this is where 23 operational risk and, in particular, cybersecurity and 24 threats to cybersecurity have risen to the top of the 25 priorities of the agenda for addressing threats in the 0120 1 overall system. Last year the council made a 2 recommendation to financial regulators for heightened 3 risk management and supervisory attention to this matter. 4 Our core areas of focus have really been around data 5 integrity, market and consumer confidence, and the risk 6 of contagion or systemic risks stemming from a cyber 7 incident. 8 As we have this discussion today, it's really 9 important to remember that our financial firms are 10 becoming technology firms, and those that operate our 11 financial pipes have become or are increasingly becoming 12 financial firms -- technology firms. With an emphasis on 13 central clearing and exchange-traded transactions and the 14 growing prevalence of retail mobile applications, the 15 automation of our marketplace is ever-increasing. 16 So given these market systems are central to 17 payment flows and, in many cases, to trade execution, 18 cyber hygiene is important. And some of the areas we 19 subscribe to are front line protections to build 20 resilience, incident management protocols that involve 21 both information technologists, who understand the bits, 22 but also business leaders who understand the enterprise 23 risks -- these two folks need to be speaking together -- 24 and recovery plans that have been stressed and tested. 25 These elements are critical to the health and confidence 0121 1 of the broader financial sector and to the economy. 
2 Let me add one more stream of thought before I 3 throw it back to Jim, and that is the global financial 4 regulatory community has recognized the threat of cyber 5 attacks and is taking it up in a serious and focused way. 6 We want to thank the SEC for their efforts around 7 Regulation SCI. We think this is an important step 8 forward in terms of requiring SEC registrants to 9 maintain information security policies and procedures, 10 conduct business continuity testing, and provide 11 certain notifications in the event of disruptions. We 12 know they're working towards a final rule, and we thank 13 them for that effort. 14 The banking regulators, through the FFIEC, have 15 organized a working group to look very closely at these 16 matters as it relates to supervisory attention. A lot of 17 the examination work has been naturally focused on 18 physical events. These areas need to be updated for 19 cyber, and the FFIEC is working really hard to do that. 20 And we can't forget the work that IOSCO is doing on a 21 global basis around principles for financial market 22 infrastructure, really looking to business continuity and 23 resolution issues and recovery, which fall right in the 24 tenets of our cybersecurity risk and remediation there. 25 Well, we very much appreciate these efforts, as 0122 1 I mentioned, and the Treasury Department, in particular, 2 looks forward to working with these different communities 3 on these elements going forward. 4 But I'll also mention the administration's 5 efforts before I close, and that is the efforts around 6 the President's executive order where NIST, 7 the National Institute of Standards and Technology, put 8 forward a framework for how firms large and small can 9 think about cybersecurity, cyber hygiene and best 10 practices.
These firms have been -- especially those 11 here on the stage -- have been extremely helpful in 12 having that framework be applicable to firms within 13 the sector as well as cross-sector, and I think we'll 14 talk much more about best practices here going forward 15 and where that framework can help, so thank you. 16 MR. BURNS: Katheryn, thank you very much. 17 With that, unless there's anything at the outset either 18 Commissioner would like to mention, let's begin the 19 discussion of some of the most common cyber threats that 20 are faced by exchanges and SROs. And, Katheryn, if I 21 could put you on the spot again after that opening, would 22 you be willing to start us off with a brief explanation 23 of the biggest cybersecurity threats to our securities 24 markets infrastructure today? And if you could, maybe 25 give us a sense of what your view of the goal of such 0123 1 attacks is? 2 MS. ROSEN: Well, I'd say in -- I can't really 3 -- I won't speak to necessarily the actors or the 4 techniques or the technical issues. I'll leave that to 5 the folks who are on the front lines doing that each and 6 every day. 7 I'll say that one of the most important things 8 that we need to do to combat such threats is share 9 information. And this is becoming -- I think it's almost 10 the mantra of cybersecurity as we talk amongst not only 11 the financial sector and the government but across 12 sectors. And as we think about sharing information, it's 13 really not only private sector to the government, but 14 government to private sector and the government to 15 government and the private sector to private sector. 16 There's a lot of nodes on this information 17 chain that we need to share information. I think getting 18 folks like the CISOs and business leaders in the 19 financial services sector the information they need is 20 important and we need to be able to declassify this 21 information. 
And we need to be able to get private sector 22 folks clearances so they can actually read the information 23 we're not able to fully declassify. 24 So in terms of how do we think about cyber 25 threats and how do we think about providing financial 0124 1 services sectors what they need, those are really the key 2 elements. 3 MR. BURNS: Mark, could you give us a sense 4 from your perspective what specific or unique challenges 5 you see for exchanges in assessing the threats that are 6 out there, the vulnerabilities that exist? 7 MR. CLANCY: Sure. So let me build on what 8 Katheryn started with and -- 9 MR. BURNS: Oh, I beg your pardon. I was 10 looking at Mark Graff, but I want to -- no, you're off to 11 the races there, Mark. Go ahead. 12 MR. CLANCY: So I'll -- sorry -- jump out in 13 front and let Mark bat clean-up. 14 So just to build on what Katheryn said, you 15 know, we've been thinking about this as a community. And 16 when I get to this panel, we all know each other well. 17 We share information about what we see in our 18 environments literally every day. But when we look at 19 the threats that we face, we kind of bucketize it into 20 four groups. And I steal Richard Clarke's summary of the 21 four groups, the criminals, hacktivists, espionage 22 actors, and war-like actors. 23 And if you think about it, you have the 24 criminal element, nothing new, the financial sector 25 people, who want to steal money or things that allow them 0125 1 to steal money. I'm not really surprised. But the other 2 groups are ones that maybe folks aren't as familiar with. 3 So you have activists, hacktivists, who are trying to 4 express some political objective. Maybe they're 5 protesting, you know, detainment of somebody who's 6 leaking information or those types of groups. And so 7 their objective is not to steal but to further their 8 political objective. 9 You have espionage-like actors.
That might be 10 other nations or other groups who are trying to steal 11 secrets to improve their economy or their nation or their 12 competitive advantage, and then you have war-like actors, 13 who are those trying to disrupt or degrade the 14 functioning of the market infrastructure. 15 And when you look at it from our viewpoint, we 16 have to worry about all four of those groups. But what 17 they're trying to do and the means that they go through 18 and the lengths which they go through are not the same 19 for all four. And on the financial market infrastructure 20 side, we see less activity in the criminal space and more 21 activity in the other three groups. 22 MR. GRAFF: Well, I agree with everything the 23 other Mark said also. I think a point I like to make 24 sometimes is, you know, there are real world analogies 25 that can help us reason through some of these things. 0126 1 And if you think of the example of trying to protect your 2 house and the people and the things in it, you don't 3 always really care whether somebody who might attack you 4 would -- you know, what neighborhood they come from or 5 what their last name is or even what they're wearing. 6 What you really care about is what would they try to do 7 to you. 8 And so I think we do a lot of threat modeling 9 exercises, and we do try to keep well informed by talking 10 to other members of the industry and especially the 11 federal government. The FBI in particular has been very 12 helpful. But what I try to focus on is, you know, what 13 bad thing can happen, not necessarily who would try to do 14 it, but what kind of disruption are we worried about, 15 what kind of loss of confidentiality we're trying to 16 worry about. We try to nail those things down and get 17 pretty specific in terms of loss scenarios and then from 18 that construct defensive measures and ways of detecting 19 whether or not some of those attempts have been made. 20 MS. ROSEN: Jim, I'll actually just jump in 21 real quick. 
I'm sorry. 22 MR. BURNS: Oh, go ahead. Yeah, please do. 23 No, do. 24 MS. ROSEN: You asked for interactive, so I'm 25 going to go for it. 0127 1 I think just to comment on Mark's -- we call it 2 a tear line, right, where the information comes in. And 3 this is what I'm speaking about in terms of 4 declassification, making sure that these firms can get 5 the information on what is happening, not necessarily the 6 who because they -- as you articulated, you don't care. 7 You may care in terms of thinking about your modeling 8 exercises from a long-term perspective but really making 9 sure these firms have information so they can actually 10 build defense from a technical standpoint. So I just 11 wanted to -- that was amplifying my declassification 12 point. 13 MR. BURNS: And building on what Mark Graff was 14 just mentioning, threat modeling, organizations are 15 having to consider how to manage, how to look for, how to 16 tackle risk. How do you take that on from an enterprise 17 perspective, Tom? 18 MR. SINNOTT: So as any risk, whether it's 19 operational, market, financial, we have an information 20 security risk committee that reports in to both the audit 21 committee and into the enterprise risk management 22 program. And really it is looking at the various threat 23 actors and their capabilities and their desires and then 24 trying to determine what layers of our defense and that 25 strategy address those, and if there's any gaps, identify 0128 1 those gaps, take those risks to senior management for 2 decisions about how to resolve those gaps. 3 MR. BURNS: This is one I'd love to dig a 4 little deeper with some of the other exchange 5 representatives because this is obviously something we're 6 thinking about ourselves as we're working on SCI and 7 otherwise. How are you tackling this from an enterprise 8 perspective, Aaron? 9 MR. WEISSENFLUH: To piggyback on that, one of 10 the things that we take pride in doing is we do quite a 11 bit of threat modeling.
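[The committee process Mr. Sinnott describes, mapping threat actors against defensive layers and escalating any uncovered gaps to senior management, can be illustrated with a minimal sketch. The actor categories follow Mr. Clancy's earlier taxonomy; the control names are hypothetical placeholders, not any firm's actual inventory.]

```python
# Minimal sketch of the gap analysis described above: map each threat
# actor category to the defensive layers believed to address it, and
# surface any category with no covering control for escalation.
# All control names are hypothetical.

CONTROL_MAP = {
    "criminal": {"fraud-monitoring", "endpoint-hardening"},
    "hacktivist": {"ddos-mitigation"},
    "espionage": {"data-loss-prevention", "network-segmentation"},
    "war-like": set(),  # nothing mapped yet -- a gap to escalate
}

def coverage_gaps(control_map):
    """Return threat-actor categories with no defensive layer assigned."""
    return sorted(actor for actor, layers in control_map.items() if not layers)

print(coverage_gaps(CONTROL_MAP))  # ['war-like']
```

[In practice the inventory would come from the firm's risk register, but the shape of the check is the same: every actor category should map to at least one defensive layer, and anything unmapped goes to senior management for a decision.]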
We -- on the front end we take 12 the intelligence that we receive both from the government 13 and the paid sources, and we put together our internal, 14 external exercises, live fire exercises, to -- like Tom 15 said, we check to see where those would hit our weak 16 points and how we would react to such threats. So that's 17 a constant for us. We're always testing. It's something 18 we do just day in and day out. 19 MR. GRAFF: And I think another way we approach 20 this is with a set of structured risk assessments. There 21 are decisions to make when you're operating a business. 22 And I think my colleagues here at the table would agree 23 that there -- every day you're confronted with various 24 choices, you know, do we want to do it this way or that 25 way. And so NASDAQ OMX has a well-developed risk 0129 1 assessment methodology where we take a look at the 2 various implementation possibilities, design 3 possibilities and try to balance the risks implicit in 4 various approaches with the business need. 5 And I'll give you an example I think SEC would 6 be very interested in. Obviously we have a requirement 7 to implement kill switches, and my team was involved in 8 analyzing various technical approaches that were 9 considered by the developers. And there was a lot of 10 really good back and forth, so we did analyses. "Well, 11 if we did it this way, here are some of the risks that 12 would accrue. Here are some of the countermeasures we 13 could develop, some of the defensive measures we could 14 develop and so forth." And I think that's a very 15 important part of the equation as you move through and 16 advance the business interest, always taking a look, you 17 know, what do these risks amount to and how can we 18 balance them. 19 MR. FURNEY: We've made cybersecurity an 20 integral part of the enterprise risk management program 21 as well.
We're giving -- we have specific line items for 22 things like insider threat and advanced persistent 23 threat, taking a look at, you know, what are our risks in 24 that area and what are we doing to mitigate those risks 25 and are we doing it enough to do that. 0130 1 We give updates to the board on our 2 cybersecurity program. They've taken a very active 3 interest. It might have been at the audit committee 4 level before, and now it's at the full board level, so -- 5 MR. BURNS: You know, a question that got 6 raised in the first panel -- I think the Chair raised -- 7 was are you -- are these sort of issues being internally 8 resourced? Are you going outside seeking assistance from 9 other third parties who provide expertise in these areas? 10 Is it a combination? 11 MR. CLANCY: I'm happy to start. So it's very 12 much a combination. I mean -- and to echo on Todd's, you 13 know, conversation, in 2010 this was a topic that was on 14 the board agenda once per year for 10 minutes. And in 15 2014 it's on every single board agenda and multiple board 16 committees, all right. So that governance focus has 17 changed. That's changed the level of investment and 18 resources that we've placed against it given the critical 19 importance of the market infrastructure. 20 And the investments that we make are a 21 combination of growing our internal staff and 22 capabilities, leveraging service providers, and 23 leveraging expertise both in the private sector and in 24 government. So it's sort of all hands on deck approach, 25 and we use resources as they fit the -- what we're trying 0131 1 to do because the expertise is fairly diffused, and 2 there's not a lot of it out there in the industry. 3 MR. SINNOTT: So I would add that I think, 4 again, to Mark's point, a couple years, three years ago, 5 the insider threat was probably a bigger responsibility 6 for us, and that certainly changed completely. Now it's 7 the external threat.
And since those threats are so 8 massive, you need to bring everybody to the table. And 9 so we certainly look for external expertise. We look for 10 government assistance. We're meeting, as Mark indicated, 11 with the FBI quite often just to look at our own risk 12 profile. And it's all of those things combined that 13 really provide us the assurance that we're adequately 14 addressing our risks. 15 MR. BURNS: Another point that came up on the 16 first panel -- there's a tremendous amount of focus on 17 external threats, parties that would do us harm from 18 outside. And it was brought up that, you know, various 19 stakeholders were describing efforts to ensure that 20 threats from within don't somehow compromise the systems 21 of critical market participants. 22 And I wonder -- not asking you to talk about 23 any particular incidents that may have occurred but the 24 steps that you have to take to just ensure that employees 25 or other agents working on your behalf and just that 0132 1 you've got good controls within to prevent internal 2 intrusions if I can coin a phrase -- terrible phrase but 3 there it is. How do you go about that? What kind of 4 attention do you provide there? And I throw that up to 5 any of you. 6 MR. WEISSENFLUH: Sure. If I could jump in, 7 we're in a very fortunate situation. We're in Kansas 8 obviously. We have a smaller employee base. And I've 9 really never worked at a place where the interviewing 10 process on the front end was so absolutely rigorous. So 11 from the front end you'll meet with who would be your 12 reporting manager. You meet with the entire team that 13 you will work with. And keep in mind we only have a 14 hundred and some employees. But then from there on 15 you'll meet with the HR director. You'll meet with the 16 CFO, the CEO, and the COO. 
17 So from that point you've got a large committee 18 of different people who have different points of view 19 that will test what you know and how you react to 20 different situations. And then, of course, once you pass 21 that -- pass through that gauntlet, you're into the 22 organization, but you've got to go through rigorous 23 background investigations, and those are continuous. So 24 we do that. And again, working with the government, 25 we're working to secure clearances for those that are in 0133 1 information and physical security. 2 MR. GRAFF: Couple of points on this. At 3 NASDAQ OMX we have a mature process of personnel security, 4 and it's got a good track record there. I thought I 5 could make a couple of points though about the insider 6 threat. 7 First of all, it's not enough these days to vet 8 your own employees or check them again periodically 9 because we are in trust relationships with so many other 10 companies and so many other vendors. I always suggest to 11 CISOs, if they want to get an idea what's wrong with just 12 thinking about the castle and moat model we talk about 13 all the time, sit in your own corporate headquarters for 14 an hour. Just sit in the lobby and watch as everybody 15 comes in and out and keep track of what companies they 16 work for. 17 You know, you'll see customers coming into your 18 facility. You'll see, you know, people who are supplying 19 bread to the cafeteria. You'll see somebody's going to 20 come in and water the flowers and change the light bulbs. 21 All of those things happen on your network as well. And 22 if you're not taking steps to protect yourself against 23 malefactors amongst your vendors and in those third party 24 relationships, then you haven't addressed the entirety of 25 the problem.
0134 1 The other thing I'd just mention when we talk 2 about insiders -- I do want to make the point real 3 quickly that, because our networks are in trust 4 relationship with other elements of the trading 5 infrastructure, the trading ecosystem, our network 6 systems are in, to some degree, trust relationships with 7 the brokerage houses and other exchanges that we, you 8 know, circulate information with. 9 So one of the things we have to think about 10 when we talk about insider threat is are we covering the 11 scenarios where perhaps a small brokerage house is 12 compromised and then sends invalid packets or invalid 13 trades towards the exchanges. That is a kind of insider 14 threat I think we need to very seriously think about in 15 addition to the classic personnel security steps we are 16 already taking. 17 MR. SINNOTT: I think the one big thing you have 18 to add to the component -- of course, we're doing all of 19 the things that Aaron and Mark indicated. But I think 20 Mary Galligan hit it in panel one, and that was the whole 21 idea of a security-oriented culture, right? The culture 22 has to be the -- embracive. It has to be passionate 23 about security in order for that front line to really be 24 defending you. 25 MR. CLANCY: Two small things to add. So one 0135 1 is, you know, when we've looked at it, the sort of 2 difference between the external threat and the internal 3 threat is the external threat is trying to become like an 4 insider, right. And so when you see it in your systems 5 and in your logs, you can't tell whether it's an insider 6 or an outsider doing something bad to you. So you need 7 to make sure you've addressed that. 8 The practical difference is the insiders know a 9 lot more about how your business works, what your rules 10 are, what your systems are, and how they work. And so in 11 that sense they pose a greater risk impact because their 12 knowledge of how your operations works is not something 13 they have to discover.
It's something they already know. 14 And so we use all the techniques that were 15 described, but when we've focused on that, we've started 16 to up our game even more on that particular group, 17 particularly those, you know, excellent employees, get 18 high performance reviews, pass all the background checks. 19 If you ask what happens in the intelligence world, those 20 are all the people who betray their country, right? So 21 you kind of have that problem of the more vetting you do 22 gives them access, which creates more opportunity. 23 So what we really looked at is how can we put 24 structures in place so that one person can't hurt us or 25 two people can't hurt us, which has an offset in 0136 1 operational complexity, you know, nimbleness. You know, 2 we have an outage; we have to recover things. We've got 3 to go through procedures to, you know, break glass, pull 4 credentials out of the vault, that kind of stuff. And so 5 there's a risk balance that needs to be struck there. 6 But it's very much -- it's part of our analysis 7 of, you know, what are the things that can hurt us the 8 most. And we've looked particularly in the IT community 9 at those system admins, who have lots of privileges, who 10 know exactly how everything works -- and that's what 11 they're supposed to do. How can we put controls in place 12 to essentially hedge the risk that's posed? We still 13 want to be able to react to, you know, system problems 14 and what not and have timely resolutions of issues, but 15 we also don't want to create opportunities where we could 16 suffer great harm if they abuse that access. 17 MR. 
FURNEY: One additional point I'd like to 18 make is the -- having a solid -- when it comes to the 19 insider threat of stealing data, having a solid data loss 20 prevention plan in place is a benefit not only for 21 insider threat but also for -- if you are compromised, 22 it's a good way to see how information might be flowing 23 out of your organization and as a means of detecting the 24 compromise. 25 MR. BURNS: Thank you very much. It sounds 0137 1 like strong internal controls around this point is de 2 rigueur for all of you guys. That's great to hear. 3 Now in the event that a cyber attack or, heaven 4 forbid, a breach occurs, can you give us a sense of what 5 you -- what sort of information do -- or would you expect 6 that your members would want to see with respect to your 7 efforts to protect or mitigate against cyber threats. Or 8 just more broadly what are -- should the reasonable 9 expectations of your members be about what you're doing 10 and what they're going to be hearing from you about those 11 efforts? 12 MR. CLANCY: So this is a tricky problem, 13 right, and so you have what I call the race to the top 14 challenge, right. So something happens. You have this 15 tension between knowing exactly what the problem is in 16 its full scope and its completeness, which you only know 17 at the end of an investigative process, which could take 18 a very long period of time, and the immediate need to 19 know how does this impact me right now. And those two 20 needs are in tension with each other through any incident 21 response, right. 22 And what firms like ours have done is we have 23 tried to strike balance where we have to collect some 24 basic mastery of the facts as to what has happened or 25 what do we think has happened, which is more probably 0138 1 what has. And then we start providing notices to our 2 clients about the event that occurred with what we know 3 at the time. And the challenge is those facts continue 4 to evolve. 
So the earlier you notify your client, the 5 less certainty you have you actually know what happened. 6 And if you looked at any of the patterns of all these 7 other intrusions, right, the first report rarely meets 8 what the actual end state is. 9 And so we have to recognize as we're managing 10 through incidents we have this sort of almost conflicted 11 view of, you know, telling early and telling completely 12 because it's not possible to tell completely early. 13 And so where do you strike the balance? And 14 that's what our -- you know, our company, our risk 15 management discussion is about, what our guidance from 16 our management report has been. And so you have to set 17 sort of a bias point. I will say historically our, you 18 know, 2010 version is we only talked about things at the 19 end when investigations concluded and all facts were 20 known and disbursed. 21 And now we notify much earlier, and it's 22 created a lot of burden on the team responding to the 23 incident because you're mitigating the incident; you're 24 communicating with external stakeholders. The external 25 stakeholders, be they customers or regulators or law 0139 1 enforcement, want other things, which then expands the 2 scope needed for your response, which complicates your 3 response. And so you've got a process in place to manage 4 that on incident response. 5 You also have to set your objectives as a 6 corporation. What is your number one objective? Ours is 7 restoring the service that the financial markets operate. 8 It might not be pursuing criminals or pursuing 9 intelligence leads, which might be the primary interest 10 of a government agency involved. So you've got to manage 11 all of those sort of tensions, and you've got to sort of 12 play this out. 13 So one of the things all the firms here do is 14 we drill this. So if we don't have an incident, we run 15 through our playbook. We say, "What are we going to do 16 operationally? 
What are we going to do on the 17 communications side? What are we going to do with, you 18 know, our legal folks? What are we going to do with our 19 reach-out to government? How are we going to request 20 assistance if we need government assistance, have all 21 those things sort of pre-exercised so that if a 22 significant event happens, it's not the first time you're 23 figuring it out. 24 MR. BURNS: Let's shift the discussion a little 25 bit though we will pick up on a string which you were 0140 1 just talking about, Mark, and talk a bit about how market 2 systems approach cybersecurity. 3 And, Todd, I'll ask you first to start us out, 4 but I'm very interested in what others have to say. 5 Could you provide just a general outline of your current 6 practices with regard to testing and assessing the 7 security of your key systems? For instance, how 8 frequently are you running your own tests? What kinds of 9 testing do you generally perform? 10 MR. FURNEY: Sure. The vulnerability scans 11 that we do are performed on a weekly basis across our 12 various environments, and that's the corporate 13 environment, the production environment, the back office 14 environment. And we do maintain a very rigid separation 15 of those environments. But sometimes those are different 16 kinds of scans that are performed. They may be 17 credentialed scans on internet-facing systems or 18 uncredentialed scans on the production or back office 19 systems, but they are all scanned once a week. 20 And then we have penetration testing that's 21 performed at least on an annual basis. And those -- when 22 we have our third-party pen testing, we encourage those 23 testers to use all of the tools that would be used in a - 24 - by a real world attacker, and those could include 25 things like vulnerability scanning. They could be 0141 1 phishing emails to our employees. 
They could be phony 2 help desk calls to reset passwords, leaving USB drives 3 that contain malware sitting at strategic locations and 4 seeing who picks those up and puts them in their PC. 5 So we don't -- we try not to take anything off 6 the table for the types of tests that they do. And it's 7 helped out significantly because when someone does fall 8 for one of those phishing messages that, you know, are 9 part of the test, we're able to follow up with them in 10 person, and, you know, it makes a difference. They're 11 much less likely to do that in the future. 12 MR. BURNS: Jeepers. I'm kind of thinking 13 that's pretty rigorous, leaving USB -- 14 CHAIR WHITE: Todd is smiling as he's going 15 through these tests. I just want to note that for the -- 16 MR. BURNS: I want to -- this is an important 17 point to linger on here and then maybe, Tom, would you 18 mind kind of following up from your own perspective? 19 Others are welcome too, but you guys particularly. 20 MR. WEISSENFLUH: Sure. We've -- we do all the 21 same things that Todd does, and we've spoken several 22 times on the phone and through email during different 23 scenarios and incidents that are going on. 24 Quantum Dawn was a big thing for us over the 25 last year. The entire industry participated. It was -- 0142 1 MR. BURNS: Could you spell that out a little 2 bit for people who might not be familiar with that 3 exercise? 4 MR. WEISSENFLUH: Sure. And feel free to jump 5 in anyone else if I misstate this. 6 But Quantum Dawn was an industry-wide test of 7 incident response. We had many different scenarios. 8 Basically the world fell apart, and it was an automated 9 application that we all worked with and through. And we 10 set up a representation of the exchanges and dealt with 11 different scenarios. We communicated with each other by 12 phone, email. I don't think anybody faxed anyone else. 13 But we were in close communication. 
We ran through 14 several automated exercises, and even the SEC was 15 involved. That was a pretty interesting test. 16 But along with that, one of the things we've 17 done -- and this is -- we've put some goofy names on it. But 18 we involved all the technical folks in our organization 19 with quarterly tests. And I won't tell you what the name 20 is because it's a little goofy. But we've brought in our 21 technical folks. We set up our labs, and we give them 22 tools, education, and let them go wild in the lab. 23 I'm sure other organizations do it, but it does 24 bring everybody together. It helps us bond a bit, and it 25 gives us some new information on angles of how people 0143 1 would attack both from the inside and the outside. And 2 fortunately we've got technical folks in many different 3 locations. So we can run our internal/external pen 4 testing, which sounds a little odd. But for us as a -- 5 it is a constant effort to test both our systems 6 internally and externally. 7 MR. SINNOTT: So I think that, you know, those 8 three types are what we all practice, right, industry- 9 level testing, tabletops, as Mark identified, and then 10 just standard operating procedure testing. And then the 11 other side of it is the bad actors don't take any days 12 off, so they're testing this every day.
13 But, you know, we basically test the layers of 14 defense all the way from the perimeter down to the 15 desktop to make sure that the components that we have in 16 place are doing what they're supposed to be, be it 17 blocking firewall rule access, identifying signatures in 18 your IDS, making sure your web application firewalls are 19 doing their job, all the way -- and a key component is 20 actually at the application side, right, so we make sure 21 that -- and within our system development life cycle in 22 our application development that we embed security in 23 that we don't find out once that application goes into 24 production that it has a cross-site scripting 25 vulnerability or an injection type of vulnerability. 0144 1 So I think all of those pieces tie into the 2 layer of defense and testing them accordingly to make 3 sure they're working. 4 MR. GRAFF: Just to continue the thought from 5 what Tom was saying, we use the same testing methods. 6 I'm particularly fond of the fake phishing emails. We 7 started in 2013, and people actually at this point seem 8 to enjoy it remarkably. 9 But I wanted to make the point that for me one 10 of the main purposes of testing is to determine whether 11 the -- the system is working the way you designed it. So 12 design is such a fundamental part. Good architecture is 13 such a fundamental part of cybersecurity. And -- for 14 example, I think you'd find that amongst the exchanges 15 you have here that the market systems are carefully 16 insulated from exterior systems such as anything you 17 might find on the internet. There are multiple layers, 18 multiple barriers. 19 And one of the things we do in our testing is 20 vigorously try to penetrate not only our web sites, but 21 we test these barriers to see whether it's possible to 22 get through one because we have overlooked something. 
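[One concrete instance of the application-layer checks Mr. Sinnott describes, catching a reflected cross-site-scripting flaw inside the development life cycle rather than in production, might look like this toy sketch. The page-rendering function is a hypothetical stand-in for the application under test, not any exchange's actual code.]

```python
import html

# Classic reflected-XSS probe: submit a marker payload and verify the
# application never echoes it back unescaped.
MARKER = "<script>alert(1)</script>"

def render_search_page(query: str) -> str:
    """Hypothetical stand-in for the application under test.
    This version escapes user input, as a safe page should."""
    return "<p>Results for: " + html.escape(query) + "</p>"

def reflected_xss(render, payload: str = MARKER) -> bool:
    """True if the raw payload comes back unescaped -- a test failure."""
    return payload in render(payload)

print(reflected_xss(render_search_page))  # False: input is escaped
```

[A check like this would sit in the automated test suite so the vulnerability is found before the application goes live, which is the point of embedding security in the development life cycle rather than discovering it afterward.]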
23 And that's one of the things we're doing, is we're 24 looking at the design and testing what the reality is 25 against what we think, you know, the design would lead us 0145 1 to see whether we've really got what we think we have. 2 MR. BURNS: Mark Clancy, anything from DTCC's 3 perspective? 4 MR. CLANCY: Well, I would -- I was doing sort 5 of the plus one on all of those statements. So I think - 6 - and Mark hit on this. For us we fundamentally look at 7 building security in from the beginning. It makes the 8 whole of things that follow a lot easier. 9 You know, we do the same kind of thing. We do 10 source code testing. We do application testing. We do 11 infrastructure testing. We do vulnerability testing, and 12 it's a never-ending cycle. We've actually done a lot of 13 ours, so we have daily instrumentation. And where we've 14 translated it to the organization is we have defined risk 15 tolerance measures at a company level of what is the 16 normal noise we're working through in the portfolio we 17 can manage and what's the point we're running at elevated 18 risk. 19 And people I work with accuse me of using tons 20 of analogies, so I'll give you a simple one. You know, I 21 look at the vulnerabilities like we're the Forest Service 22 and we have to produce a daily report of how many acres 23 of forest are tinder dry. It's not a forecast for 24 lightning strikes and cigarette butts out of car windows 25 and camp fires, all right. But it tells you, if there is 0146 1 a fire, how much it could spread. 2 We use that as part of our discussion of the 3 risk tolerance. And we say, "Look. Our forest is this 4 big. We have a little bit of acreage. If we have a 5 problem there, it's containable. If it's too much 6 acreage, we need to accelerate our efforts to mitigate 7 those vulnerabilities and refocus efforts in the company 8 around that." 9 And unfortunately that is something that 10 literally changes every day.
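[Mr. Clancy's forest-fire analogy amounts to a single daily exposure score, rolled up from open vulnerability findings and compared against a company-level tolerance threshold. A minimal sketch, with entirely hypothetical severity weights, threshold, and host names:]

```python
from dataclasses import dataclass

@dataclass
class Finding:
    host: str
    severity: str  # "critical", "high", "medium", or "low"

# Hypothetical weights: how much "tinder-dry acreage" each open
# finding contributes, and a hypothetical company-level threshold.
SEVERITY_WEIGHT = {"critical": 10, "high": 5, "medium": 2, "low": 1}
RISK_TOLERANCE = 25

def daily_exposure(findings):
    """Roll open findings up into one daily exposure score."""
    return sum(SEVERITY_WEIGHT[f.severity] for f in findings)

def risk_report(findings):
    """Compare today's score against tolerance, as in a daily report."""
    score = daily_exposure(findings)
    status = "ELEVATED" if score > RISK_TOLERANCE else "within tolerance"
    return score, status

findings = [
    Finding("trade-gw-01", "critical"),
    Finding("web-02", "high"),
    Finding("corp-11", "medium"),
]
print(risk_report(findings))  # (17, 'within tolerance')
```

[As with the acreage report, the score says nothing about whether an attack will come; it measures how far one could spread today, which is what makes a daily cadence meaningful.]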
If you look at the whole 11 portfolio together, you know, we have this thing called 12 Super Tuesday from our friends out in Redmond. You can 13 see that on our graphs. Every month you can figure out 14 when the second Tuesday of the month is because of what 15 shows up on Wednesday, all right. So it's a continuous 16 cycle. 17 So my advice to people who are doing this once 18 in a while and the different types of tests, different 19 frequency, you have to get your tempo so that you 20 actually know where you are every single day, and that is 21 not easy. We're not completely there, but we've come a 22 long way from where we used to be. 23 MR. BURNS: Mark Clancy, I think you touched on 24 this a little bit before, but in developing these tests, 25 you're -- some of this is homegrown; some of this you're 0147 1 looking to assistance from the outside. Others -- 2 MR. CLANCY: Yeah, so -- so for example, you 3 know, we use as many things that are out in the public 4 domain as possible, so we're very big proponents of the 5 Security Content Automation Protocol that's developed in 6 the U.S. government for doing standard checking. You 7 know, we do that for vulnerability checking. We do that 8 for configuration checking. Those things mentioned 9 happened. 10 Daily we use commercial tools, and we have 11 experts in our own team who do this stuff all the time, 12 and then we bring experts from the outside. So if there 13 is a technology that we're not familiar with or there is 14 a system that we don't have the capacity to -- or 15 sometimes our clients want independent assurance. They 16 want to know how we've looked at application X. 
17 We'll bring in an outside expert firm and have 18 them, you know, beat on it, and we'll find whatever flaws 19 there might be and address them and provide some 20 transparency not at the bits and bytes level of the 21 detail, but at a summary so our clients can understand 22 we're using tried and true methods like the Open Web 23 Application Security Project Top Ten list, and we're 24 checking for all those kind of vulnerabilities. We're 25 using the SANS Top 20 controls and all those kind of 0148 1 measurable framework things as a way to make sure we got 2 all our bases covered. 3 MR. BURNS: Mark, you had -- Mark Graff, you 4 and I were talking a little bit before we came in about 5 the NIST standards in this area too. I wonder if you'd 6 like to talk about that. 7 MR. GRAFF: Yeah. Thank you. I've been -- in 8 my background I spent nine years at Lawrence Livermore 9 National Laboratory before I came to NASDAQ. And I 10 became really dependent on the federal guidelines in this 11 area, NIST special publications, the 800 series, and in 12 particular 800-53. And I helped the National Nuclear 13 Security Administration move in the direction of using 14 800-53 as a basis for guidance in cybersecurity programs. 15 And I -- I really did want to just congratulate 16 SEC on its recent move. I know that we've all been 17 talking about Reg SCI and making progress in this area, 18 and I want to congratulate you and thank you for this 19 recent move towards using 800-53 and similar guidelines and 20 standards as a basis of this so-called Table A in Reg 21 SCI. You know, your CISO and CIO, Tom Bayer and Todd 22 Scharf, met with us all, vetted that with us. I think 23 the response was enthusiastic, and I did want to, again, 24 thank you for moving in the direction of these kinds of 25 federal guidelines. They're very well established and 0149 1 extremely useful to us. 2 MR. BURNS: Mark, keeping you on the spot for a 3 moment -- thank you for that shout out to our colleagues 4 here.
And I will say it's been great effort within the 5 agency on the staff level as we think through SCI-related 6 issues between Trading and Markets, OCIE, which has taken 7 over the ARP program for us, and, of course, Tom and 8 Todd, our CIO and CISO, Tom Bayer and Todd Scharf. We're 9 very grateful to have them on board as we're thinking 10 through these things on the staff level. 11 But to keep you on the spot, I think that one 12 question that market participants may have is policies 13 and procedures in place -- and others are going to be 14 welcome to share the heat too. If we find ourselves in a 15 place where trades might be affected as a consequence of 16 a cyber attack of some sort, some -- or misinformation in 17 the market but issues are discovered only after 18 transactions of the sort are completed, have you guys 19 given any thought to those consequences and what you 20 might need to do? 21 MR. GRAFF: Yes. I've been involved in several 22 conversations with general counsel and also our -- the 23 authorities that are operating our markets. And we've 24 looked at this very closely. Let me -- as a non-attorney 25 let me give you just my sense of where we land on this. 0150 1 We -- my understanding is that we retain the 2 authority if we believe we can't operate the markets 3 fairly that -- because of a concerted significant cyber 4 attack. We believe we retain the authority to cease 5 trading until such time as we could operate it fairly. 6 The larger question and one which has generated 7 a lot of interest inside NASDAQ OMX is what to do if we 8 reach an understanding, get to a point where we think 9 there are trades, which either have been submitted and 10 not executed or have been executed but not reconciled, 11 that may have been the result of a cyber attack against a 12 brokerage house. 13 And I was asked to -- we were reviewing this, 14 and I was asked to make this point. 
According to our 15 understanding, the guidance we got from SEC on this a 16 couple of years ago was that in the case where a 17 brokerage house might have been compromised and an 18 improper trade had, in fact, been -- an order had, in 19 fact been issued from a compromised system, you know, our 20 question was, "What do we do about that?" 21 And we -- as we understand it, the guidance 22 currently is that since it's not an "erroneous" trade as 23 such but, in fact, it was a trade that had legitimate 24 earmarks but was not the true volition of the brokerage 25 house -- our understanding is that we're not empowered to 0151 1 break that trade. And so we'd love to have some 2 clarification from that from SEC not in the case of the 3 large catastrophe, which we think we comprehend pretty 4 well. But in the smaller issues, what would be the 5 current thinking? 6 MS. ROSEN: I'd love to jump in just from a 7 public policy standpoint. When we look at the landscape 8 in terms of cybersecurity, we put it into three buckets, 9 and we've talked about the first two a lot here. And 10 that's front line protections, creating resilience, the 11 incident management in the event that something happens, 12 and then there's this recovery concept. 13 And this recovery concept feels less explored 14 as a marketplace. And it's not just these firms but 15 banks, non-banks, credit unions, whatever it may be who 16 may have an intrusion. It's a really, really important 17 piece and really trying to figure out how to get this 18 right. There's a lot of legal questions, of course. 
19 And then there's that just general question of, 20 if we have a market close, how do we open it; who opens 21 it; what authoritative voice do we need because when it 22 comes down to issues of data integrity, consumer 23 confidence and really having folks show up to the 24 marketplace the next day, we really need to be organized 25 from a public and private partnership standpoint to 0152 1 really get this thing right. 2 So I just want to amplify your comments from a 3 public policy standpoint as -- that this is a very, very 4 important piece that we need to work on. 5 MR. CLANCY: I'll perhaps build on that a 6 little bit, right. So being at the clearinghouse side -- 7 and we see the volume from the NMS venues and the non-NMS 8 venues all come to the DTCC side. You know, one of the 9 questions we looked at is I think the rules for declaring 10 self-help are pretty well understood. What's a little 11 murkier and something we need to explore in this threat 12 domain that Katheryn alluded to is what happens if every 13 exchange declares self-help at the same time, right, 14 because of some pan-cyber attack across multiple 15 infrastructures. And I think those types of policy 16 issues are ones that the recovery aspects are -- they 17 need more focus. We need to think about that as market 18 infrastructures. 19 If there is a data corruption event in one part 20 of the ecosystem, how does that get unwound? You know, 21 all the markets we have are interlinked with each other, 22 so if you have a problem with one space and you have to 23 go back to an IT snapshot from many hours before, what 24 can everybody else do? Do you have parity in the markets 25 where 50 percent of the firms can go to a snapshot that's 0153 1 6 hours old and 50 percent can't? Do you still have a 2 functioning market then? 3 Those types of more strategic policy-level 4 systemic risk issues are the kinds of things that we've 5 been thinking about. I won't pretend we have the 6 answers.
We're starting to come up with the questions we 7 need to be asking. And that's a little more away from 8 the here and now problem, which we also face. And our 9 challenge is operating simultaneously at the here and now 10 tempo and the strategic tempo. And so that's one of the 11 things we've been trying to focus on at DTCC. 12 MR. FURNEY: Jim, I wanted to add that we too 13 have -- if that bad trade that you had referenced, if it 14 occurred at a bad price, we have -- obviously there are 15 rules that allow us to bust those trades. 16 And I'm not sure if all the exchanges have 17 this, but CBOE does have a provision that contemplates -- 18 in our rules that contemplates exchange action if there's 19 a bad trade that's caused by a -- some problem on our 20 side as well where we can bust trades. You know, I -- I 21 think it would need to be looked into further whether, 22 you know, like a cyber event falls into that category. 23 But -- 24 MR. GRAFF: So I enjoyed those remarks, and I 25 just want to point out that this is -- of course, it was 0154 1 a great thing for us to talk about in this context. And 2 also it's the sort of things like this that can get 3 discussed during the exercises like the Quantum Dawn 4 exercise. This is when these kind of real world problems 5 kind of bubble up, when you're all forced to think about 6 anomalous activities. So that's another good reason to 7 continue these industry-wide exercises, because some of 8 these issues will really only be surfaced in activities 9 like that. 10 MR. BURNS: Katheryn, I wanted to -- thank you 11 guys very much. I wanted to jump back to you for a 12 moment. There's a lot of talk. I'm kind of hearing it 13 from the panel a bit, and you were talking about it 14 earlier -- interest in disclosure of information about 15 breaches or challenges or vulnerabilities. Can you give 16 us -- share some of your thoughts on information 17 disclosure in the context of when breaches are occurring, 18 best practices? 
19 MS. ROSEN: Well, as you all were talking 20 earlier about particular communications to members, what 21 I was thinking about while that conversation was going on 22 was communication back to the financial sector and 23 communication back to the government because, if you 24 think about the different constituencies as Mark was 25 walking through them and then the investigation and 0155 1 going forward, a lot of what we see in the financial 2 sector is firms helping each other as they see intrusions 3 or other mechanics. They have good networks to speak to 4 each other, and, as you know, these folks all know each 5 other very well. 6 The second is giving information back to the 7 government, which is really important such that we can 8 act as a clearing house because we may see something 9 happening outside of the -- say, the financial market 10 infrastructure space -- could be happening in energy. It 11 could be happening in another part of the financial 12 sector, which is the same kind of intrusion. But we're 13 not able to connect those dots in a clearinghouse fashion 14 for information unless folks are feeding that in. So I 15 wanted to just come back to that. 16 I think that the disclosure issues in a breach 17 -- I think it raises a lot of questions. I think that we 18 need to have that debate. There is the question about 19 being able to protect information that may be needed, 20 whether it's for law enforcement purposes or other in the 21 context of an action, and making sure that consumers and 22 especially the retail consumers in other instances are 23 protected as well so they can actually safeguard their 24 own information. 25 So I throw this back to a debate. I think it's 0156 1 incredibly important, and I think there's a lot to be 2 investigated there. 3 MR. BURNS: Aaron, if I could throw it to you 4 for a moment -- but I'd be interested in others' thoughts 5 as well.
Can you talk about the processes that exchanges 6 and other SROs have right now for sharing or obtaining 7 from one another information about cybersecurity threats? 8 MR. WEISSENFLUH: Absolutely. I think I stated 9 earlier that I've spoken to at least three or -- at least 10 three on the panel here when we've been going through 11 different issues, seeing different things in the wild. 12 And we communicate through a forum that was set up through 13 the FSSCC, Financial Services Sector Coordinating 14 Council. That information is contained within that 15 group, and it's readily available. 16 Speaking to the group every -- I think we meet 17 quarterly, and with the CHEF, we have, obviously, portals 18 that we can log in and see information. And, again, I 19 make calls to these guys constantly. 20 MR. CLANCY: Yeah. I'd just add so one of the 21 mechanisms we mentioned is the Sector Coordinating 22 Council. We also use the Financial Services Information 23 Sharing and Analysis Center, FSISAC, which, in full 24 disclosure, I'm on the board of. And we created a sub- 25 community in 2011 for clearinghouses and exchanges 0157 1 because, when we looked at our community, we didn't see 2 exactly the same challenges that, you know, retail card 3 processors were seeing or banks were seeing. We saw a 4 different sliver of the puzzle as it related to 5 infrastructure. 6 And so we kind of bonded with the people who 7 look most like ourselves. We're part of this broader 8 community, and so information flows multi-directions. So 9 there has been -- no surprise to anyone -- a series of 10 denial of service attacks against the financial sector 11 broadly. We've used groups like CHEF and others and the 12 FSISAC to coordinate tactical information so that when 13 the first victim receives the evil traffic, potential 14 victims two through n have a clue as to what they're 15 likely to see and can prepare and tune their defenses to 16 be more effective.
17 And in my opinion, that information sharing at 18 a near real time operational level is the best tool we 19 have to improve the effectiveness of our defenses. And 20 so we're doing a lot of things around that space as a 21 community, and DTCC is as well. 22 MR. WEISSENFLUH: And if I could just add to 23 that real quick, several years ago we were all affected 24 by a similar attack. And, as Mark said, when we began 25 sharing information, within minutes we all knew that we 0158 1 were under the same attack and were able to mitigate and 2 remediate quickly. 3 MR. GRAFF: I wanted to just chip in too. 4 You've heard the word "CHEF". It's -- what is it, C-H-E- 5 F? It's Clearinghouse and Exchange Forum? Is that -- 6 yeah. But it's a subset inside FSISAC, and it's just 7 invaluable. We can get a note out in just a few minutes 8 and say, "I'm seeing this. Is anybody else seeing it," 9 and it's been fantastic. And there's news on that. 10 That was such a great success that I took that 11 model, and we've actually now applied it on an 12 international level, working with the World Federation of 13 Exchanges, which is a non-profit. So now we've extended 14 that, and the 65 or so international exchanges that are 15 members of WFE -- we now have a group. We didn't call it 16 CHEF. We called it GLEX, Global Exchange. But actually 17 I just -- 18 MR. BURNS: You might want to think through 19 that one. 20 MR. GRAFF: I know. I didn't put it to a vote. 21 I really liked it. And so I just got back. I went to 22 Mumbai last week. We got the charter officially accepted 23 by WFE. So I mean I sent the first message out Monday 24 morning. 25 So now we have the ability -- because I think 0159 1 that it -- your international boundaries don't count for 2 much when you're talking about the cyber threat and the 3 potential impact of attacks on all of us. So we're 4 looking to expand all that as we collaborate not just 5 inside the U.S. 6 MR. 
FURNEY: I wanted to add in too that we've 7 been -- I think over the last two or three years that DHS 8 and FBI have been really good at sharing information with 9 us. They will call us into the local field office and 10 give us a classified briefing if there's something that's 11 urgent, or they'll stop by CBOE offices -- and I'm sure 12 it's the same for the others -- when you're not able to 13 make it out to their office. They've really gone out of 14 their way to share information in a very timely manner. 15 MR. SINNOTT: Two things I'd like to add. One 16 is that the FSISAC -- the beauty about that is that you 17 can put data in there anonymously, right, so that it's 18 not attributable, which really helps. And then secondly 19 we meet, for example, quarterly with the FBI just to make 20 sure that we're in tune. 21 MR. BURNS: And you -- for your benefit you 22 should know that Treasury has been doing a terrific job 23 of sharing the information it can that the FBI is, in 24 turn, sharing with you. And I think that's one of those 25 areas we're looking to smooth out the rougher edges of 0160 1 and make sure there is -- to the extent we can -- there's 2 continuity in that sharing of information. 3 I want to make sure we're asking the question 4 more broadly though about best practices as we kind of 5 draw to a close. Are we looking -- you mentioned the 6 NIST standards. Do you have a sense -- among you all 7 there sounds like there's a lot of harmony in the way you 8 go about things. Is there consensus around best 9 practices in this area across exchanges and 10 clearinghouses? Is there a lot of -- or are there gaps 11 or distinctions? And how can we be helpful in that 12 regard? 13 MR. GRAFF: I think the 800-53 is very well 14 respected, and the other one I would recommend -- and I 15 think a lot of people would agree -- is the ISO/IEC 16 27000 series used internationally. That's an actual 17 standard as opposed to 800-53 -- more a set of guidelines.
18 Those have broad support around the world, I think, and 19 there are others that are extremely helpful. Mark 20 actually mentioned a couple that are not strict standards 21 but the OWASP list, the SANS Top 20 and so forth. 22 Again, there's really a common lingo here. I 23 think it's -- one of the things we're trying to do, I 24 think, is help establish a baseline. Just to take a 25 moment, the -- one of the things I see and important for 0161 1 my job -- one of the things I've been doing this year is 2 making a kind of a listening tour out to some of the 3 smaller exchanges. NASDAQ OMX has a lot of reach. I've 4 been to, you know, Singapore and Hong Kong and all over. 5 And I find that the larger exchanges all around the 6 world really have a -- pretty much a common understanding 7 of best practice. There's nothing formal yet. But 8 there's a good gist. 9 But the smaller exchanges really are not that 10 well informed, and I think that's something that we might 11 have a responsibility to help with -- is just lay this 12 out because it -- you know, all the folks on the panel 13 here have been doing this for a long, long time, and we 14 really are conversant with it. But some of the newer 15 smaller exchanges, really they're at just as much risk as 16 we are in many ways, but they don't have this reservoir 17 of information to draw on. So I would like to see a 18 little more in terms of a common set of documents and a 19 common set of guidelines for everybody to use. 20 MR. SINNOTT: I think we all practice a 21 defensive strategy for our cybersecurity program. I 22 think to Mark's point, one of the beauty parts about the 23 executive order and the NIST cybersecurity framework is 24 that it is one that simplifies it for those who aren't 25 more on the bleeding edge of protection and so that they 0162 1 can really -- I think if I look at it -- you know, we 2 took a very preventative, reactive approach to security 3 in the beginning, right.
And now we're trying to be much 4 more proactive in trying to be able to identify those 5 threats and be ahead of the game. 6 And I think what helps to do that is the whole 7 risk analysis, the understanding of the risk and how that 8 risk is going to impact your business. And by doing 9 that, then you can understand what's important to you and 10 what you need to protect first versus last and make sure 11 you've got the order correct. And I really think that 12 the framework does that for companies that aren't as far 13 advanced. 14 MS. ROSEN: I would just amplify Tom's comment 15 that the framework is intended also to be iterative. So 16 as you all apply it and you see at the baseline -- and 17 many financial services firms have already gone 18 beyond what's actually in the baseline set by the 19 framework. But that is not to say that, as it's used, 20 that there are others that -- Mark mentioned other 21 standards that come to bear that could be even more 22 globally appreciated -- could be incorporated into that 23 framework as well. 24 MR. BURNS: Our time is fast drawing to a 25 close. I want to make sure the Chair and Commissioners, 0163 1 if you have any questions -- 2 CHAIR WHITE: If I may just go back for a 3 second to what I'm calling the scenario exercises, seems 4 like that was pretty well received by -- in terms of 5 usefulness. Is there anything one should be doing 6 differently in terms of those exercises either in terms 7 of frequency, participants, you know, pieces and 8 components? 9 MS. ROSEN: I was just going to make one 10 comment that a really important exercise we need to do is 11 for the market open. Just the way that the Quantum Dawn 12 played out that -- and the time that we had to do it that 13 we just did not get to that step. So I think there's 14 already work underway to create a test that will do and 15 will focus on that and the communications it takes to 16 reopen the market. So I believe that's well underway. 17 MR.
CLANCY: I would add so those exercises are 18 very useful. I think one thing we struggled with in 19 building them is getting the audience within the firms 20 that covers the whole spectrum of operations. So they 21 tend to be very technology-focused, maybe business ops. 22 And you really need to also get into leadership, which is 23 quite difficult. I don't want to downplay that at all. 24 But I think that type of thing could be helpful because 25 the decisioning tree -- the decisions that people like 0164 1 myself may make are probably completely different than the 2 decisions our executive leadership team would make in 3 this type of scenario. 4 So I think that's the second factor, but 5 building on things like Quantum Dawn and having a 6 simulation environment really helps provide the data they 7 use for decisions. They don't use the packet capture. 8 They use the market ticker. So we have to have a way to 9 show a market ticker during a cyber exercise because then 10 people know how they're actually going to respond. 11 COMMISSIONER AGUILAR: Really Jim actually 12 asked this question, so I want to give him credit for it. 13 But he asked it in context of multiple questions, and I 14 didn't hear the last one being answered. And the 15 question is what can we do to help in the process? What 16 do you think should be the Commission's role going 17 forward? What can we do to facilitate information 18 sharing? What can we do to facilitate best practices? 19 MR. CLANCY: That's a fabulous question. 20 COMMISSIONER AGUILAR: Or do we just go away 21 and you pretend we don't exist? Does that -- 22 MR. CLANCY: No. I mean I would say -- you 23 know, the first thing I'd say is start with it with the 24 understanding that you do have, that the risk isn't 25 uniform, all right. So the risk of a DTCC versus a BATS 0165 1 versus a CME versus a CBOE -- it's not the same even in 2 the market infrastructure let alone the extended 3 financial community. 
4 So the first important thing you have to do is 5 make sure that everything you think about in terms of 6 guidance and how you do it is risk-focused. That said, 7 where we can really use a lot of help is getting a better 8 understanding of how to measure and quantify this family 9 of risk. 10 We have a tremendous amount of experience in 11 credit market and liquidity risk. And so I go to an 12 internal meeting. We have, you know, a two percent 13 exposure to this amount to this priority with this 14 confidence, and then I -- my chart is yellow, right. So 15 we don't have the tools in this risk discipline yet. And 16 I'm not sure that's something the Commission can solve on 17 its own, but that's the type of thing we need to do as a 18 broader community so that we can have better risk 19 discussions with more quantification so they can make 20 better investments in the things that we actually need to 21 go hedge the risk on. 22 MR. GRAFF: I had a suggestion too. You know, 23 I already spoke about how gratified I was by the shift in 24 Reg SCI towards the 800-53 structure. But the other thing 25 that occurs to me is, in talking with so many other 0166 1 exchanges -- I take your point. I've actually never 2 heard someone in my position complain about regulators in 3 terms of, you know, "We have to talk to them. They take 4 a lot of our time." Nobody ever -- that I've ever heard of 5 -- actually complains about that. 6 But I think there's a thirst for well informed 7 regulators and well trained regulators that understand 8 how our markets really operate and what we really do. 9 And I was just wondering whether we couldn't collaborate 10 on -- on perhaps a little curriculum where we do some 11 mutual training for -- in the cyber area. It's all so 12 new that maybe we could collaborate that way. 13 MR. SINNOTT: And if I could add one item, it's 14 that the -- all of us have multiple regulators that want 15 to come in and inquire about our programs.
And obviously 16 that's an important oversight, and we understand that. 17 But at the same time when we do an SEC ARP review and then 18 we do a CFTC review and then we do a Fed review and then 19 we do a Bank of England review, it gets time-consuming. 20 And so where we can collaborate and coordinate those, 21 that would be really helpful. 22 MR. GRAFF: And that might be -- I mean IOSCO 23 might get involved in some of that too. There might be a 24 basis for some collaboration there, right? Yeah. 25 MS. ROSEN: And I was just going to follow up 0167 1 by -- from a government's perspective you all are working 2 hand in hand with Treasury and the other financial 3 regulators in terms of incident management, technical 4 assistance, outreach and really working with the financial 5 sector hand-in-hand. So what you are doing is working 6 really well. 7 COMMISSIONER STEIN: I just had a quick 8 comment. I think the conversation today brings up the 9 idea of resiliency across the board, right, and not just 10 in cyber, the cyber situation. And, you know, again, I'm 11 -- that one of the themes today is how dynamic this all 12 is and that there's not a single breach or a single point 13 of contact or a -- the way we used to think about it, you 14 know, even five to 10 years ago. The moat and castle -- 15 you know, somebody was saying -- I was like, "Yeah. Oh 16 yeah. I get that." 17 But I think one of the issues everyone's 18 struggling with is updating, you know, firmware, you 19 know, having best practices and being out on the cutting 20 edge of this field that's evolving so quickly. And a lot 21 of the bad guys are evolving more quickly than some of 22 the other folks, regulators and/or industry. 23 So, again, to keep this conversation going in a 24 variety of contexts, help us be nimble -- we want to help 25 you be nimble. And bring your best ideas to us, and 0168 1 we'll all try to do that.
And I hope this forum, you 2 know, provides a sort of space for all of us to have that 3 conversation. 4 MR. BURNS: Thank you, Commissioner. 5 Anything else from the -- 6 CHAIR WHITE: Thank you very much. It was a 7 very, very helpful panel, very informative. 8 MR. BURNS: This is terrific. Before I let you 9 go I would -- just to keep to the schedule I want to just 10 make sure -- because this is such a fantastic group of 11 experts here. We barely scratched the surface. Have we 12 left unasked any questions that just -- for purposes of 13 starting the conversation, spurring writings, submissions 14 from people? Is there anything we -- I'm sure there's 15 plenty. 16 MR. CLANCY: I would just add -- and Katheryn 17 sort of hinted at this at the public policy side. I 18 think we do need to talk about what some of my colleagues 19 in the global market are calling this cyber contagion 20 and, if there is a systemic event, how would we handle 21 it. Fortunately that's not the kind of thing we're 22 seeing today. But, you know, as I sort of joked, 23 somewhere between three minutes and 30 years from now we 24 will. And we don't know are we closer to the 30 years or 25 the three minutes. 0169 1 And that problem for us is something that we 2 have urgency around in looking at with all of the 3 participants in the market of how do you even decompose 4 this type of problem so you could then figure out what 5 resilience is needed to address it. And I hope that we 6 are able to do that before we have to learn how that 7 works. 8 MR. SINNOTT: I think I would add, if you have 9 children, especially college age, cybersecurity is 10 definitely a good career choice. 11 MR. BURNS: That's the takeaway for me. 12 Thank you, panel, very much. We really 13 appreciate your time. Thanks for your participation 14 today. If I could invite you to walk off that way, we've 15 got another group that will be coming in shortly.
16 And we hope to just keep things rolling, so if 17 you'll bear with us. 18 (A brief recess was taken.) 19 MR. GRIM: Okay. Good afternoon everyone and 20 welcome to the final panel of our Cybersecurity 21 Roundtable. My name is Dave Grim, and I'm the deputy 22 director in the Division of Investment Management. 23 Beside me is Drew Bowden, the director of the Office of 24 Compliance, Inspections, and Examinations, and Jim Burns, 25 who you've seen a lot of today, the deputy director of 0170 1 the Division of Trading and Markets. And together we'll 2 be moderating this session. 3 In today's prior panels we explored the 4 cybersecurity landscape, public company disclosure 5 concerning cybersecurity threats and breaches, and the 6 cybersecurity preparedness of market participants that 7 operate key market systems. 8 We're going to now turn our focus to what 9 broker-dealers, investment advisors, and transfer agents 10 are currently doing in this area, particularly in the 11 area of identity theft and data protection as well as 12 cybersecurity risks within their respective industries. 13 We'll also discuss best practices. 14 In putting together this panel we made an 15 effort to bring together representatives from a wide 16 variety of interested groups to share their perspectives 17 and insights on the issues we are discussing today. They 18 represent a wealth of experience and knowledge in the 19 area of cybersecurity, and we're very fortunate that they 20 agreed to join us today. 21 So starting to the right of Jim there we have 22 Marcus Prendergast, who is the director and corporate 23 information officer at ITG. Next up is Jimmie Lenz, the 24 chief risk and credit officer at Wells Fargo Advisors 25 LLC. 
0171 1 Then we have Mark Manley, the deputy general 2 counsel and chief compliance officer at 3 AllianceBernstein, John Denning, Senior Vice President, 4 Operational Policy Integration, Development & Strategy, 5 Bank of America and Merrill Lynch, Daniel Sibears, 6 Executive Vice President, Regulatory Operations/Shared 7 Services at FINRA, David Tittsworth, Executive Director 8 and Executive Vice President, the Investment Adviser 9 Association, Craig Thomas, who is the chief information 10 security officer at Computershare, John Reed Stark, the 11 managing director at Stroz Friedberg, and finally down 12 there is Karl Schimmeck, the managing director, Financial 13 Services Operations at SIFMA. 14 We anticipate that this group will lead us 15 through a highly engaging, informative, and constructive 16 dialogue in the area of cybersecurity. Moreover, as we 17 mentioned a few times previously, we urge all of you to 18 join the debate by sending in your thoughts in the form 19 of comment letters. As you know, there's a -- both a web 20 intake form and an email address that we encourage 21 everyone to use. 22 So with the introductions all set, I'm going to 23 turn it over to Dan to begin our discussion by asking him 24 to talk about the nature of the cybersecurity risks that 25 he sees his members facing. 0172 1 MR. SIBEARS: Great. Thank you very much, and 2 I appreciate the opportunity to be on the panel today. 3 Thank you for the invitation. 4 So in preparing for the panel we thought that 5 it would be a good idea for me just to give a little bit 6 of background on what FINRA was doing before 7 we get into the actual risks. So I'll sort of 8 set up the risk discussion, and I'll be brief about the 9 background piece. 10 So this has been obviously a key area of 11 concern on the regulatory side for those of us that are 12 regulating the broker-dealer community, and we issue at 13 FINRA an annual priorities letter every year.
We 14 prominently noted in this year's letter that 15 cybersecurity was a key issue for us, particularly in the 16 area of firms' infrastructures and what firms were doing 17 relative to the safety and security of sensitive customer 18 data, particularly in the PII space. 19 We indicated in that letter that we were going 20 to be engaged in some initiatives in 2014 to better 21 define the risks that the broker-dealer community faced 22 in this area. And, in fact, we did just recently launch 23 a fairly extensive -- what we call a sweep -- and you can 24 get information about the sweep on our web site -- to 25 have a better understanding from a cross-section of firms 0173 1 about what kind of risk they're facing in this space. 2 And just very briefly, some of the goals of the 3 sweep are to better understand the types of threats and 4 risks that firms are facing, what their risk appetite is, 5 what their exposures and major areas of vulnerabilities 6 are and what the involvement is actually of their 7 governing structure and their IT folks in addressing and 8 trying to solve this problem. 9 We have just started to get the results in, so 10 this is very, very preliminary. But in terms of the 11 risks that we are already seeing evolve from the 12 information, there are three key areas that firms are 13 already reporting on. And, again, this is a cross- 14 section of the broker-dealer community. But we found 15 that half of the firms that are reporting back to us are 16 viewing operational risk as their top concern. And we'll 17 probably get into the details about operational risk up 18 here in the next hour. But they include areas such as 19 actions by people, system and technology failures, failed 20 internal processes and certain external events. 21 The next area that firms have indicated is a 22 top risk is insider risk by employees. And the third is 23 hackers penetrating their systems as the third -- the 24 highest-identified risk.
25 Now below those, the most common ones -- but we 0174 1 don't have a -- kind of an aggregation of critical mass in 2 terms of percentages are firms being concerned about 3 risks related to denial of service attacks. And the new 4 kind -- I would say phishing attacks and then also some 5 new kinds of phishing attacks. 6 We were just talking in the ready room here 7 about the phishing attacks we're most familiar with where 8 customer information is basically appropriated and then 9 trades and other instructions are given to the firm -- 10 generally results in wires being made out to some third 11 party at some bank that's usually offshore. 12 The newer phishing attacks that we have been 13 hearing about -- and for some of you folks this may not 14 be new, but it's coming through the survey that we're 15 doing -- is phishing attacks on the employees themselves 16 so that what appears to be actions by internal employees 17 are actually by outside folks who have penetrated and 18 actually co-opted the internal employees to share the 19 same kinds of information that we heretofore have seen 20 directed toward customers of the broker-dealer. And then 21 the final thing -- and I'll stop for the introduction -- 22 are risks related to malware infections. 23 MR. GRIM: Okay. Great. Thanks. Thanks for 24 that introduction. I'll turn it over to our industry 25 folks. I don't know if you want to go IA, TA. Fire 0175 1 away. Go ahead, David. 2 MR. TITTSWORTH: Great. Thank you. And thanks 3 for the opportunity to be here. I'm shocked that Chair 4 White and three of the other four Commissioners are here. 5 If I knew you were going to stick around all day, I 6 wouldn't have accepted this invitation. 7 CHAIR WHITE: Thank you, I think. 8 MR. TITTSWORTH: Yeah. Very similar to what 9 Dan said, I think, in talking to our members both large 10 and small as well as a number of custodians and some law 11 firms, compliance consultants, I'd put the risks into 12 three categories.
First, on the individual side where 13 you have -- where you're managing money for individuals, 14 wealth management, traditional investment counsel type of 15 activities, it's the account takeover that is the number 16 one risk. 17 And that seems to have grown in frequency a lot 18 just in the last year or two, so where somebody will get 19 somebody's identity and then pose as that end user client 20 and make a request to the investment advisor, "Send me 21 $64,000 to this Hong Kong account," or some variation on 22 that theme -- so account takeovers. 23 In the institutional money management context I 24 think it's more hacktivism, denial of service, state 25 sponsored terrorism, I guess, would be at the top of that 0176 1 list, but those type of systemic threats or risks. 2 And then, very similar to what Dan said as 3 well, cutting across whatever type of firm it is on the 4 investment advisor side, the internal risks -- I guess 5 the worst one would be a rogue employee. But certainly 6 where employees move -- you go to a different firm. 7 There are issues involved, what type of information does 8 each employee have access to. 9 Or human beings do dumb things from time to 10 time. So somebody leaves their laptop at Starbucks at 11 the train station, and how do you deal with those types 12 of threats. 13 MR. SCHIMMECK: This is Karl Schimmeck. And I 14 want to say thanks to the Commission for having us here 15 today. You probably heard a lot of details around 16 individual types of threats and risks that are out there 17 and, I think, kind of all just hit on a couple of high- 18 level topics, and I think the firms will be able to give 19 a little more specificity to it. 20 But from an external standpoint, I think what 21 we're seeing is the threat actors are becoming more 22 sophisticated. They're strengthening. 
They're 23 practicing what I'll call kind of the -- they're kind of 24 practicing information sharing on their parts to, you 25 know, figure out what's working well, what's not working 0177 1 well, and, you know, are much more advanced than they've 2 ever been. 3 That being said, the firms are actively working 4 these issues. They are doing -- they are spending 5 millions of dollars. They are dedicating, you know, 6 their best resources to this problem so that they can 7 manage this risk for their clients with the firm, you 8 know, as those trends continue to evolve. 9 Last thing on the external, you know, I think 10 what we've also seen is, you know, previously fraud has 11 been a big issue for the industry. And, you know, we've 12 seen that from the criminal side. And I think Mark 13 Clancy hit on it with the different threat actors. But 14 this whole idea of APT nation-state attackers is a big 15 change, and I think we're -- you now see a big move from 16 fraud and theft to destruction and to -- and putting 17 those systemic risks on the table as they're looking to 18 disrupt markets. 19 So I think that's where, from an external 20 standpoint -- and you obviously have the insider 21 component of that as well possibly that, you know, that, 22 I think, is one of the major risks posed. 23 Looking at it from the public standpoint, also 24 from a risk standpoint, I think, you know, we're 25 concerned about the trust and confidence that that will 0178 1 then put at risk for, you know, our clients and the way 2 the markets operate. 3 So, again, you know, this is something where we 4 can't -- you know, we're putting the best resources out 5 there on it because if the markets don't operate 6 properly, people don't have the confidence to transact. 7 They don't have the trust in their counterparties and 8 that those counterparties are actually who they think 9 they are. 
You know, that then puts the whole system and 10 the whole market, you know, at a possible risk of not 11 operating properly. 12 Lastly, kind of from an internal standpoint 13 from a risk, the -- you know, again, we're talking about 14 kind of the best resources that are out there. But I 15 think a risk is -- is that, you know, as we look to 16 protect everything, you know, we then spread ourselves too 17 thin. 18 So I think the risk is that, you know, with the 19 limited resources that are out there, whether it's time, 20 money, or individual skill sets, you know, we need to 21 focus on, you know, where is the systemic risk, put those 22 protections there, you know, be threat-informed, risk- 23 based in our approaches and doing, you know, anything 24 other than that and just trying to, you know, blanket 25 this issue with the idea of trying to remove it really is 0179 1 not going to work. It's not something that can be -- you 2 know, that you can put a regulation or a plan in place 3 and remove it. So it has to be actively managed over 4 time. 5 And in that -- you know, that is, to Mary 6 Galligan's point on one of the previous panels, it draws 7 to the culture that it has to change. And, you know, 8 maybe 10 years ago it was about everyone in a firm being 9 a risk manager. Now it's about security and managing the 10 security internally, externally and that being a core 11 competency for the membership. 12 MR. STARK: Thank you also to the Commissioners 13 and to the Chairman for allowing me to be here. I was 14 with the Commission for about 20 years as a staff member. 15 I used to be opposite you in front. It's -- I like 16 being on this side. I feel like I'm one of you, but I 17 know I'm not. 18 But I think -- I agree with everything that 19 Karl just said. My last 11 years here was spent as the 20 chief of the Office of Internet Enforcement. And now 21 I've been for five years at Stroz Friedberg.
And what we 22 are at Stroz Friedberg is just really high tech plumbers. 23 We come in when there's been one of these data breach 24 incidents, and we provide the expertise to dig in and 25 figure it out. 0180 1 I worked with Dan on quite a lot of these 2 account intrusions and account takeover matters during -- 3 they started in the '90s and then -- in the late '90s and 4 then in the 2000s. But I think Karl's on to something 5 there, that what we're seeing a lot at Stroz Friedberg 6 are these intrusions where the people are undetected like 7 APT, advanced persistent threat. 8 Picture it this way. You know, you come home, 9 and you think your house has been robbed. And nothing is 10 out of place, and nothing's missing. That's what a lot 11 of these attacks are like. We come in and we sort of dig 12 into the unallocated spaces of computers, the slack space 13 of computers, to look for artifacts and remnants and 14 fragments of what the intruders have done. 15 What are they looking for? You know, they may 16 be stealing now more intellectual property or inside 17 information or identities of people to use in other ways 18 than just financial ways in stealing that information. 19 And, you know, it's a very tough situation. 20 The one thing I would want to come away from 21 this roundtable was -- is that when you come in and make 22 these -- sort of do these investigations -- and that's 23 really what we are also, is internal investigators -- it 24 takes time. I think there's a certain impatience. 25 There are multiple constituencies when you 0181 1 arrive. You have the FBI, who is looking at you like 2 you're a victim because you've just experienced an 3 intrusion. And you want to provide everything you can to 4 them so that they can get the bad guy even though it's 5 usually rather hopeless if they're overseas. 
And then 6 you also have, you know, as many as 46 different 7 attorneys general contacting you, wanting you to tell 8 them, you know, what you're doing to protect the 9 information that has somehow been exfiltrated. 10 So you have those constituencies. Plus you 11 have the board and the audit committee and the -- your 12 customers and your shareholders. You've got your 13 disclosure obligations. And you're trying to give a 14 consistent message to all of them, but it's a real 15 challenge when, in the first few days, you're really just 16 preserving and digging into the information. Even if 17 you've got, you know, several dozen people working on it, 18 it's very difficult to have the -- to get the results 19 very quickly. 20 And it's -- like when I was chief -- and, 21 again, working with Dan on these matters, if I ran up to 22 the director's office and said, "I think I solved this," 23 after a day or two, you know, saying alpha, the next day 24 it's going to be omega, and I'm going to look like a 25 fool. 0182 1 So I think in dealing with those 2 constituencies, particularly the data breaches around 3 here -- I handle quite a few of them around here where 4 there's a lot of government contracts. So you're 5 reaching out to whoever it might be, the Air Force, the 6 military. And you're saying, "Okay. Some of your data 7 may have been exfiltrated to China, and we've got to 8 figure that out in the next couple days or in the next 9 couple weeks." And so usually it does take a lot of time 10 to do this. And I think the companies are struggling. 11 For investment advisors, I think in many ways - 12 - and BDs -- this sort of APT, the SQL injections, the 13 more -- more stealth where they aren't leaving any 14 evidence -- I think those are becoming -- going to be new 15 territory for them because they're so trained to go after 16 these account takeovers and protecting people's monies. 17 Now they've got new duties and responsibilities.
18 So -- but I do think it's terrific that you're 19 having the roundtable. I've seen the modules that Drew's 20 team has been sending to investment advisors and others 21 that -- in looking into cybersecurity. They're 22 extraordinarily impressive in terms of their depth and 23 breadth. I'm not sure how well received they're going to 24 be because sometimes, you know, you're dealing with two 25 people in a -- you know, in a firm who handle the IT. 0183 1 And sometimes it's the same person who's protecting as is 2 the person who's reporting the breach. 3 You know, one of the first things we advise 4 people to do is to have different people for that because 5 it's tough to report to your boss that you've possibly 6 made a mistake. 7 Well, I don't want to take up too much time, 8 but those are some of the things that come to my mind. 9 MR. MANLEY: Can I expand a moment on something 10 David said about the high risks for money management 11 firms? He mentioned the top risk was account takeover or 12 a hijacking of consumer emails. And, you know, I 13 mentioned to Drew before the panel that I'm used to doing 14 these panels with other legal and compliance folks. And 15 today I'm backed up with a lot of technology folks. 16 And I think about it, and I say that probably 17 10 years ago maybe this was viewed by some as an IT 18 problem that this was something that was a central focus 19 of your IT department. But for asset managers today and 20 broker-dealers and fund complexes this has to be a 21 central business imperative. And we look at it like that 22 today. 23 But the email is really a consumer-driven risk 24 that we look at and we say, you know, the customer still 25 notwithstanding events like Target and the publicity that 0184 1 centered around the Target disclosure still had a limited 2 appreciation of their role in cyber attacks and their 3 part in it. 
4 And, you know, not to put the burden on the 5 consumer but to try to educate the consumer is something 6 that we spend time with. And the volume of consumer 7 emails that come to us that are actually -- that have 8 been taken over by third parties and come to us 9 unauthorized -- and we reach back out to the client, and 10 we tell them, "Did you send us this email," and they say 11 no. They're astonished. And so that's -- it's a small 12 fraction of our client base, but it's a volume that is 13 increasing. And it's interesting that David speaking to 14 multiple money managers would all come up with that as 15 their top risk today. 16 MR. DENNING: So I want to thank the Commission 17 for the opportunity to be here today. The sector sees a 18 broad spectrum of threats that, you know, are posed 19 every single day. 20 Our teams work 24 by 7, 365. They are never 21 turned off. I think that from a risk perspective the 22 greatest risk in my mind is zero day malware that is 23 blended with more of the mundane threats that are out 24 there. And that probably poses the greatest risk and 25 probably the most difficult to defend against -- are 0185 1 those zero days because you need to get those into the 2 system and actually get them into your defensive 3 mechanisms. 4 Where the biggest challenge is, I think, 5 preplanning. And I think this is from small firms all 6 the way to large firms -- is having robust information 7 sharing mechanisms within the industry, with law 8 enforcement, with government agencies, to help reduce 9 risk and to speed up the time of identification of 10 threats so that you can put your countermeasures in place 11 or update your countermeasures in a time-efficient way.
12 So there are questions that I think every firm 13 deals with both internally and externally -- is who and 14 why are we sharing information with; what are we sharing; 15 what are the protections for the information that's 16 shared; are there legal aspects -- are all of our legal 17 aspects covered; and is our customer information 18 protected. 19 You know, these questions have to be answered 20 beforehand, and there needs to be a structure, an 21 umbrella that you operate underneath. I think Larry 22 Zelvin's point with DHS earlier today really echoed with 23 me -- is that the information sharing laws need to be 24 there, and you need to actually have those as muscle 25 memory not necessarily an ad hoc process every single 0186 1 time you go into an incident. 2 So I think that that's not only a -- you have 3 your risks, but there are challenges as well. I think 4 that the sector is well positioned, as you've heard along 5 the way today, especially through the Financial Services 6 Information Sharing and Analysis Center. We have a 7 robust capability to share information between firms and 8 then from the ISAC to the government, and it's well 9 tested. 10 But, again, the goal here is early warning, 11 sharing best practices for network defenses. It's the 12 only way that we're going to be able to efficiently 13 reduce risk to the sector and to our individual firms. 14 And that starts with information sharing. 15 MR. PRENDERGAST: Chair White and 16 Commissioners, thank you -- I'm Marcus Prendergast from 17 ITG -- for inviting us here today. 18 This conversation is the start of a journey, 19 and I'm very happy that you've launched this and to see 20 how we're approaching this very proactively. I think 21 it's very important. And I just want to build off what 22 we've heard so far. 23 So ITG is an agency broker, a technology 24 provider and ATS. 
And we've heard a lot on the retail 25 side about account takeover and malware and zero day and 0187 1 a lot of things that can impact retail investors. And 2 there's a whole other side on the institutional side that 3 I want to discuss a bit. 4 But just to build off their theme, to me the 5 greatest risk in cybersecurity today is simply keeping 6 current with and even trying to keep ahead of the 7 adversaries. They're moving very fast. We talked 8 already -- I know Mary Galligan had mentioned -- so I 9 feel I can mention our tool kit. We talked about those 10 earlier. But they do -- they are changing techniques at 11 lightning speed, and it's very difficult to keep ahead of 12 this. 13 And I oftentimes -- I always look at history to 14 look for answers to the future. And I think about Kaiser 15 Wilhelm II, who believed wholeheartedly at the turn of 16 the 20th century that his cavalry was the answer to any 17 war; he would be successful and victorious on any 18 battlefield. And he was, right? That was the history. 19 That's what his past had taught him -- was if he could 20 feed those horses, he would be victorious. 21 And in Somme in France, September 15, 1916, the 22 English introduced something new in their tool kit. They 23 had two things. The first thing was the tank, right? 24 And they also introduced the whole concept -- well, not 25 introduced but brought along with it the whole concept of 0188 1 trench warfare. And despite how well fed those horses 2 were, they couldn't keep up. 3 And so what I take this as -- we need to look 4 at -- keeping ahead of the challenges is very important. 5 We can't be -- just because something today -- a vendor 6 is promising a product that's going to solve your 7 security needs today or that the security you have in 8 place today or last week or last quarter. It doesn't 9 mean that tomorrow there might be something that comes 10 around and can penetrate your defenses and really make 11 your firm vulnerable. 
So it's to be very agile -- is the 12 message I have. 13 MR. LENZ: First of all, I want to thank you 14 all. Commissioners, I know you have a lot of passion 15 around this obviously because you all are -- been here 16 all day long. 17 But to add on to that a little bit, the one 18 thing that -- the industry, I think, has put all kinds of 19 resources behind it, human resources, dollars, technology. 20 And one of the things that we've tried to do and we've 21 tried to espouse is the understanding of what the future 22 will look like. So as we build, you know, imagine what 23 the technology environment will be in two or three or 24 four or five years. We're putting things in place right 25 now with 18- and 24-month time frames. But if you 0189 1 haven't imagined what the environment's going to look 2 like in 24 months, then you're outdated before you even 3 get there. 4 You know, if you want to look at it simply as 5 Moore's law, right, technology increases at exponential 6 rates. And we can -- even if we want to do it at a basic 7 level, we can say, "Well, if that's true" -- which it has 8 been since the '60s -- "then in the next few years we're 9 going to see computing power increase at this rate." And 10 we better be ready for that. While we may not know 11 exactly what we're going to be hit with, we can at least 12 understand, well, this is what's going to be available 13 and move to that side of it. 14 But I think the industry has been, you know, 15 very proactive. SIFMA has been -- this is one of the 16 things I've seen them move faster on than anything else, 17 not only to raise the awareness, but also the competence 18 of the members. And I thank you again for bringing 19 together, you know, a panel like this. 20 MR. BOWDEN: So interesting that most of our 21 representatives here today are from larger size 22 organizations. We heard about the risks that you see.
23 Do you know are the risks the same for organizations 24 within your business, whether it's BD or IA or transfer 25 agent or exchange that are medium or small size? Or do 0190 1 you think if they're medium or small size the risk is 2 less? And then the other would be do you have a view on 3 the state of preparedness of industry participants who 4 may not have the resources and the people and incentives 5 that you have. 6 MR. TITTSWORTH: Is it okay if I kick it off? 7 Investment advisors -- Drew, as you know, most of them 8 are small businesses. So 88 percent of SEC-registered 9 investment advisors have 50 or fewer employees. Fifty- 10 eight percent have 10 or fewer non-clerical employees. 11 By anybody's definition those are small businesses. 12 So -- and I think that the answer to your 13 question is that typically those smaller firms do not 14 have the resources the larger firms have. So, you know, 15 people here are -- some of our larger members that I was 16 talking to on the institutional front -- they're members 17 of FSISAC, and they're cooperating with the other big 18 player out there in financial services and the government 19 and having, you know, a robust dialogue about what's 20 coming next and all the threats and what you can do about 21 them. These smaller firms, there's nothing that is 22 equivalent to that. 23 I think we need to do more. I think you 24 convening this roundtable today is helpful, and just 25 awareness is obviously the first step before you can do 0191 1 anything. So anything the Commission can do to just 2 spread the word out there -- and maybe we should look for 3 some opportunities where we might be able to work 4 together to make sure that the smaller firms aren't 5 falling behind. I do think some of the threats are 6 different, right, and Mark and I agree that, again, on 7 the individual account side these account takeovers are 8 the number one risk. 
That doesn't necessarily correlate 9 to small business, but a lot of them -- that's who their 10 primary clients are. 11 MR. MANLEY: David, do most of the small IAs 12 outsource their network? 13 MR. TITTSWORTH: So I don't know the answer, 14 and I think we'll have a better appreciation that -- we 15 do an annual compliance testing survey of our members and 16 do it in cooperation with ACA and Old Mutual every year. 17 And we're going to ask a bunch of questions about 18 cybersecurity. We get several hundred responses, and we 19 make that publicly available. But honestly, Mark, I 20 don't know what the answer is. Great question. 21 MR. MANLEY: I believe that there -- John and I 22 were talking about this before the panel, and I still 23 have a belief that there's a number of intrusion attacks 24 and threats that hit network systems that are what some 25 would refer to as drive-bys. They are not targeted at 0192 1 the institution. They are targeted at anything that's 2 out there. And they are spaghetti sticking to a wall. 3 And those networks, small or large, are going 4 to have to respond to those threats. And oftentimes 5 they'll find outsourced solutions or oftentimes they'll 6 have, you know, the most advanced virus detection systems 7 that they can find, which are probably out of date three 8 days after they're installed. So -- yeah. 9 MR. DENNING: So I think that, just put 10 plainly, the cyber risk is high for any company, that 11 capabilities range greatly from small companies to large 12 companies. I think one of the advantages the financial 13 sector has is the Financial Services ISAC and, you know, 14 frankly, the financial sector doesn't see network 15 security or cybersecurity as a competitive issue. We see 16 it as a team effort, and it is one fight. 17 And then, you know, the bottom line and the key 18 is early warning. 
So the more advanced knowledge of 19 something coming down the pike provided to a small, 20 medium, or large company can make the difference between 21 having a minor problem and having a critical problem. 22 MR. SCHIMMECK: One point just to build off of 23 what John was saying, you know, I think the key thing is 24 we're not waiting for something to happen. There's a lot 25 of activity that's going on right now. And building on 0193 1 what the FSISAC is doing, you know, small- or medium- 2 sized firms don't have the resources to have a fully -- 3 let's say capable -- you know, threat intelligence 4 capability. So what can be automated? What can we do on 5 machine to machine? Where can they get, you know, the 6 benefits of a larger firm sharing it? 7 And that's going on right now where large firms 8 are committing time, money. There are specialists 9 through programs at the ISAC building a threat automation 10 program and then making that available free just to 11 members, to smaller firms, that they can plug into that 12 and then use it so that they -- you know, they have a 13 tool and a tool kit to address that. 14 And also just building on the small/medium 15 focus, it's -- you know, everybody in the industry 16 understands that, you know, that is at the soft 17 underbelly, is that, you know, that has to be brought up. 18 And it's the entire industry's job to make that happen. 19 It's not the regulators' job. It's not the 20 government's. It's everyone working together, public, 21 private together to make this happen. 22 And, you know, this is -- you know, but again, 23 it's -- you know, there's no magic bullet that's going to 24 put this in place. It's just a lot of incremental things 25 that will build up that capability and will have, you 0194 1 know, better protections. And, you know, to 2 Marcus' point, we'll be able to, you know, evolve and be 3 flexible as the threats change. 4 MR.
STARK: I think the risk to IAs in 5 particular is kind of scary because one data breach could 6 bring down an IA, I think, very quickly because of the 7 kind of notifications and the kind of relationships they 8 have with their clients and the integrity. There's 9 really a direct correlation as opposed to a retail data 10 breach where you may still shop there afterwards. But if 11 your money is in custody of someone and they're gambling 12 your wealth and suddenly it's at risk, you might feel 13 differently. 14 The other huge challenge that we've touched 15 upon is personnel. It's incredibly difficult to find 16 incident response people to come work for you. We are a 17 firm -- at Stroz Friedberg we have over 300 people. We 18 are constantly recruiting new incident response people 19 and trying to breed them on our own. There's no incident 20 response school you can go to. There's a few master's 21 programs around. You know, I taught at Georgetown for 15 22 years at the law school, a technology course. I always 23 say now, "Don't go to law school. Get some sort of 24 computer science degree. Turn it into incident response, 25 malware reverse engineering." 0195 1 And malware, by the way -- malware can be as 2 simple as a hammer and a nail being used to bang into a 3 house. So, you know, trying to -- it's not so easy to 4 tell whether something is malware because when you get 5 on-site, it might be a tool that normally the company or 6 the regulated entity uses. 7 So finding these people -- I sit in my chair 8 and try to interview them and get them to come work for 9 me. They may be in their 20s or 30s or wherever they 10 are. They've got me in the palm of their hand. We'll 11 give them anything to come work for us. So I don't know 12 what it's like for these IAs or BDs to find people 13 because, you know, it's a new breed of professional, and 14 there's a huge shortage among them.
15 So for -- to expect an IA to have some sort of 16 incident response infrastructure in place of personnel is 17 a big expectation even if they want to. You know, and 18 most of these people -- they -- a lot of our employees 19 want to work from home. They want to -- you know, they 20 travel the world as -- doing incident response 21 everywhere. It's just not so easy to hire them. And I 22 know the SEC has the same challenges when I was here to 23 try to find those people. 24 MR. THOMAS: I just wanted to -- what I've 25 heard so far is everyone thinks that we've got the 0196 1 network and our boundaries, our fortress built, and 2 everything else. And we're forgetting one thing, okay. 3 We've got a new generation of users, so we're now going 4 mobile, right? 5 So back in the days when BlackBerry was 6 actually brought out, security professionals worked with 7 BlackBerry to secure that device. BlackBerries are out 8 of date. We now have iPhones, Androids, no regulations 9 around how they build them from a security point of view, 10 and the businesses are pushing out applications to do business 11 online, right? So your boundaries just changed. So how 12 do you protect that? 13 MR. GRIM: I see Commissioner Aguilar over 14 there waving around his BlackBerry. 15 COMMISSIONER AGUILAR: Well, I just want to 16 figure out, if it's out of date, what am I doing with it? 17 MR. MANLEY: It's most secure. 18 MR. GRIM: So I think we've had a good 19 discussion of the risk side of things. Let's shift the 20 discussion a bit to sort of best practices, best 21 practices in response to the risk that you guys have been 22 laying out for us. I don't know who wants to start on 23 that, but I'm sure there's a number of thoughts here. Go 24 ahead and fire away. 25 MR. THOMAS: I want to jump in on that one. So 0197 1 not one single control will protect you, right? So it's 2 a defense in-depth approach.
And you've got to believe 3 that you are going to get attacked even though -- I think 4 I agree with what Karl was saying before, you know, 5 expect the -- don't wait for someone to tell you that 6 you're going to have -- there's a threat. It's too late. 7 If someone says something's going to happen, it's much 8 too late, right? You've got to be thinking ahead of the 9 game. 10 Technology moves faster than security. 11 Security is always trying to catch up with the new 12 technology. And it has always been the same, and it will 13 never change. So defense in-depth, slow the person down 14 as he comes in, track what he does, monitor, look at what 15 he's trying to do, and then stop it. And that's the only 16 way you can do it. 17 MR. PRENDERGAST: And in terms of the insider 18 threat, we've talked about that quite a bit. Our motto 19 is trust but verify. We use advanced technologies that 20 are now available to help you evaluate and detect insider 21 threats, looking for things, doing behavioral analytics 22 is something that firms should look to do because that is 23 just as risky as the outsider getting in. 24 MR. MANLEY: I think having an extensive 25 knowledge of your organization and where you have 0198 1 sensitive personal data or data or information that, if 2 accessed by a third party, is going to expose the 3 organization to either regulatory or commercial liability 4 or risk is important. It's also -- you know, one of the 5 challenges is our own staff training, making sure that 6 employees who understand that this information is 7 sensitive -- and it may be very well segregated in an 8 area that's been ring-fenced. So when that intrusion 9 does get in, does penetrate, you have the information 10 ring-fenced within your infrastructure so that the 11 intruder can't get actual access to it once inside. 12 The challenge is you have employees who take 13 that data out. They remove it. They copy it. They load 14 it into personal folders.
They put it in another place 15 as part of their own commercial enterprise of what -- not 16 maliciously but just as part of the job they're doing, 17 not being aware of the risks that they're creating, that 18 that information, which was ring-fenced -- part of it is 19 now not. 20 MR. LENZ: I think that what everybody's 21 describing in one way or another is the idea of a program 22 rather than, as was stated, simple controls, an ongoing 23 program that integrates information security with your 24 risk management because it has to be managed. It's going 25 to be somewhat dynamic but along with technology and the 0199 1 business. 2 Within our organization there are a lot of 3 unique businesses, and I think that information security 4 is -- has to align with the business. It can't operate 5 in a vacuum. We are really, really cognizant of that, 6 and we try to make sure that, when we are looking at 7 things, we look at the individual needs so we know what 8 you can and can't turn off but so we understand how to 9 apply it in those different venues. 10 MR. STARK: I agree with Jimmie. And Jimmie 11 and I were talking about this beforehand -- is taking the 12 -- what we see a lot at Stroz Friedberg -- we see a lot 13 of these companies have different ways of handling 14 whatever the risk is. But looking at IT security from a 15 risk perspective, from a holistic perspective rather than 16 from a solely IT perspective, which is -- you know, 17 Jimmie's the chief risk officer. That's a better way, in 18 my opinion, of looking at things. 19 And just look at -- and Marcus was talking 20 about -- we call them bad leavers, which is a British 21 term, somebody who leaves the company -- from our London 22 office -- somebody who leaves the company badly. And 23 you've got to go into a data breach thinking that that 24 might be a possibility. It could be an insider threat. 25 It could be a bad leaver. 
There could be some other 0200 1 problem than somebody -- some attacker from a foreign 2 country. So you've got to take this very holistic 3 approach. 4 And you've got to consider the business too. 5 If you think about law firms -- we have a lot of law firm 6 clients where we do risk and security assessments of 7 them. And that's a very tough one because lawyers want 8 to put in their biographies who they do business with. 9 So that makes them very vulnerable to a very 10 simple phishing attack that comes at 11:00 at night. You 11 know, you want to show your clients you're aware at 11:00 12 at night. And you get an email that says, "Hey, can you 13 click on this and take a quick look at this contract?" 14 And it looks like it's coming from the GC of one of your 15 clients, so you click on it. And the next thing you know 16 you've somehow injected a virus into your whole system at 17 the law firm. 18 Now should a law firm then not have biographies 19 about its people? Of course not. So you have to find 20 the right balance there, and you have to develop a risk- 21 based approach that's going to incorporate that. 22 And part of that is educating people because 23 you're only as strong as your weakest link. And most of 24 the links that we see in a lot of the attacks and the 25 different -- the different types of intrusions is just, 0201 1 again, someone who was either unsophisticated or someone 2 who was so -- what we found in a survey we did was the 3 higher level you were up in a company, the less you -- 4 the worse you were at computer security, the more you 5 took your devices home and they weren't protected the 6 right way, and the less you paid attention to the rules. 7 And that -- you're shaking your heads because you think 8 it's the same thing here? No. 9 COMMISSIONER AGUILAR: I -- 10 MR. STARK: Okay. Good. So, you know, I 11 appreciate Jimmie's approach and Wells Fargo's approach 12 to this risk-based model because I think that works the 13 best. 14 MR.
SCHIMMECK: And one other thing I think for 15 best practices is that, you know, we talk a lot about 16 protection and all the prevention and all the policies we 17 have in place. But, you know, invariably in this space 18 things go wrong, breaches occur, attacks happen. And 19 it's all the response exercising that goes on and making 20 that a part of -- you know, I think it goes to the -- you 21 know, this is a corporate issue. It's -- those exercises 22 don't just sit in IT. They sit on the business side. 23 They sit in risk. 24 It's a corporate issue. It's everyone's 25 responsibility within the firm for -- to maintain 0202 1 security and making them a part of the exercises that are 2 going on and then feeding that into that continuous 3 improvement loop. So I think it goes to the point of, 4 you know, where can we anticipate, where we exercise. We 5 try to -- you know, we try to project what's going to be 6 there in the future. We put that into our systems. We 7 make changes, and then we can advance. 8 So I think just kind of that always adding in 9 and making the continuous improvement and that change. 10 And that is always going to move as fast as technology, 11 probably faster than we can keep up from an 12 organizational standpoint, and probably faster than, you 13 know, any regulation standards. Those types of things 14 can -- it can change, so it's very much on, again, the 15 culture to encourage that and to make that a part of the 16 core competency of a firm. 17 MR. DENNING: And then I'd just add I think all 18 the comments so far have been spot on. But I would -- 19 I'd say that being an active and contributing member to 20 the Financial Services ISAC and actually doing things, 21 not just being a receiver of information, also 22 contributing information as well so that the collective 23 ability to defend is increased and that we are all 24 exposed as a sector to new ideas, new techniques, new 25 protocols. 
And that raises the bar for all of us to innovate, and it is a -- the more agile you are as a team, the better off you are in your ability to actually defend against a cyber attack.

MR. BOWDEN: Can I ask a follow-up question there? In terms of what are sort of best practices, right, and the industry groups and FSISAC -- do you -- it sounds like the response to an event is as equally important as preparation or prevention, right? In FSISAC do you do case studies? Do people actually share "this is what happened and here's what I did in response"? John, it sounds from yours -- an investigation going for weeks as you're trying to figure out what happened and what I do. "And I thought this, but I was wrong, and I looked here." Like do people actually share post mortems on events to help people --

MR. STARK: I mean --

MR. BOWDEN: -- that or a smarter response?

MR. STARK: You know, that's a great question. I don't think so. You know, it's certainly something -- for most of my clients, everything is very, very secretive because -- not for nefarious reasons but really just because you're working on something that's critical. You're -- but I think one thing I noticed in the cybersecurity guidance -- for example, that someone -- whoever was writing it was really thinking about this because they said, "Don't disclose the particulars of the data breach in your disclosure with the SEC because you're going to give someone a road map on how you do it, so you've got to be very careful."

The attack vector has expanded considerably here because you're really looking, again, at the devices that people bring home. You're looking at -- most companies do operate globally in some way, shape, or form, even the smaller IAs. So you're dealing with multiple dynamics.
And there isn't a lot of -- even our own case studies that -- at Stroz what we talk about is that we're really limited in what we can say because, again, if you start giving away that information, then you're really triggering -- you're giving someone a road map as to, you know, how to infiltrate your company again and again and again.

And the other dynamic is the FBI is usually coming and telling you, "Please don't tell anybody that -- what we're doing. We've got lots of sensitive things going on." So you're getting huge pressure from the bureau not to share that information even amongst your own employees.

And I do a lot of presentations to the board and usually to the audit committee. So you're presenting to the audit committee, and, again, you're trying to gather all this information during these initial weeks, but it's challenging.

MR. DENNING: So I think that there's an important distinction to be made. One is you have an active law enforcement investigation, which I think you've really touched on well. And then there's the day-to-day information sharing of threat activity that's going or malicious activity that's going on across the sector. I think at the operational level the FSISAC and all of the analysts that contribute on a daily basis by our listservs and portals and things like that -- it's very robust, the amount of chatter and activity there. And the granularity of the information is -- I don't think there's probably another sector that is quite like ours.

So I think that from an operation side of the house on a day-to-day basis it's robust. I think one of the metrics that any organization should be measured on is how often do you contribute to that dialogue. But from an incident -- a major criminal investigation, there's -- it's a completely different level of interaction.

MR. MANLEY: So to go back to -- Drew, before your question can be answered, I have a question, which is do we believe that most organizations below, say, those involved in this process intimately actually have formal written response plans to a data breach.

I mean most will -- if their systems are disrupted, they'll invoke business resumption and disaster recovery, the typical type of response. But if they lose customer data, does the common firm have an appreciation of how they're going to report to 49 different states? Which federal regulator are they going to call? I mean do we believe -- I see some heads shaking yes or no. Do we think that that's a common practice that firms have formal written response plans around a data breach incident?

MR. THOMAS: I would say not everyone has. We created playbooks so -- to make it simple. So it's going to happen. You want to have a structure to manage that incident. So we thought of as many different types of threats and different attacks and what the impact would be and what do we need to do. So you literally pull out the playbook, and you follow it through. At the end of that incident, you look at that playbook, and did it work. If it didn't work, fix it, just like BCP, right?

And we also do dry runs as well. So we have scenarios and we play that we are being attacked; what are you going to do. And we throw in all the information that they need to have to make informed decisions, whether they're going to inform their regulator; is it a breach within their regulatory -- and not just within the U.S. but obviously worldwide, right?

MR. STARK: I think another point to make is about the remediation that comes afterwards. It's a little tougher to -- oftentimes you've got to fix whatever this problem was. And usually it's not done with a switch.
For instance, if one of your major databases was compromised, you've got to figure out a way to turn it off without alienating every single customer you might have, and there might be some real urgencies to those customers' need for your database that you would be disrupting other things. So the remediation segment after the incident response is usually fairly lengthy and extensive as well, so I think it's very hard.

And as far as the regulated entities -- do they have incident response plans? Are they well organized for an attack? You know, generally I think the answer is no because I think, again, the attacks change so quickly. The new types -- it's difficult for me to keep up with everything, and I can't imagine what it is for someone who this isn't their core business. So I think it's hard.

Some companies that I've seen, some regulated entities have incredible set-ups. And the more consumer-oriented organizations have been doing this for a long time. I think for the regulated entities, the smaller BDs, the smaller IAs -- I think it's a much bigger challenge. Just to do a tabletop exercise, I think, is a challenge in and of itself. And you do learn huge things. Even when we respond to a data breach, we learn something new every time. Hey, this is a big thing that we better recommend to the rest of our clients to do because we saw this huge failure. So it's a constant learning process. I'm sure everyone would agree.

MR. DENNING: So I think that certainly having playbooks, processes, and procedures in place is an absolute must best standard, best practice for any regulated entity, frankly, any entity that is doing business on the internet or transacting business on the internet.

And so from a sector perspective, we have the all hazards playbook, which addresses both the physical and the cyber side of the house.
So I think the sector is trying to set the example for all of its members. In addition to that, I would venture to say that most of the top 100 companies certainly have playbooks that are well exercised and tested, and the controls are in place. But, again, I wouldn't want to over-generalize. But I do think that it is an absolute necessity and the best practice.

MR. GRIM: All right. Let's spend a few minutes on where do we go from here. And, in particular, I'm sure a number of you have some thoughts on what you think you might like to see the SEC do in this space. So now is the time.

MR. PRENDERGAST: If I could start out, just two things I'm hearing today, two themes. One is that we've got to be very cognizant of different risks. One is to the loss of PII data to the compromise of retail investor account information. And then we have a whole separate on the institutional side. So we've got to think about those two different areas.

On the institutional side -- and I'm going to use Commissioner Stein's words from earlier -- the SEC should provide principle-based guidance and avoid any attempt to issue prescriptive rules as it relates to cybersecurity controls. Simply for that reason we've talked about so many times is the constantly changing threat landscape. Any prescriptive rules would be outdated potentially by the time they were written and by the time they were put into place.

And we've seen this with Canadian regulators recently. The Office of the Superintendent of Financial Institutions recently issued their uncharacteristically more prescriptive rules. And complying with these rules may result in the use of time and resources without truly mitigating the current cybersecurity risks. So they're going to tell me how many horses to have, but it's not going to help me when I'm out there and I'm facing the tank.

MR. LENZ: I guess we can just go on down the line here. I would agree heartily with the principles-based approach. I think that, you know, if we -- if we do things that are too prescriptive, too parochial, one of two things is going to happen. Either it's going to be a major pressure on the industry and even quash some things. On the other side, we move to the lowest common denominator, which, again, I know Commissioner Stein mentioned the highest common denominator.

But I think that's what could very easily happen should we vary from a principles-based approach. Companies that -- even represented here are so different, the IAs, the different sized BDs, institutional versus retail versus agency. I think all of us are so unique that trying to put anything more prescriptive into place would be extremely difficult. And I think at the end of the day it probably wouldn't have the desired effect.

MR. MANLEY: Being a chief compliance officer, I think I would be shot if I was seen as asking for more regulation. But I will say that, with all the varied guidance that's out there, starting with Reg SP for a firm like ours and the layered guidance from subsequent regulation and even the proposed regulation amendments to Reg SP -- actually the Reg SP amendments, the proposed amendments are actually some of the best guidance out there, I think, structured guidance that gives asset managers and financial service firms, fund groups, some direction and some structured direction. And I won't ask the question, but they're still out there. They've been out there for a while, so --

MR. DENNING: So I'd touch on two points. And I actually think Treasury and DHS touched on them earlier today.
But one is -- will probably come as no surprise to you given my comments today -- is helping the sector to establish clear legal protections to facilitate the broader sharing of threat data to and from the government during an incident.

And then the second item would be work with the sector to help establish clear engagement lines and interjurisdictional guidelines that allow for rapid processing of cyber attacks and publishing a method on how and where to actually submit threat data.

MR. SIBEARS: So I will have a little different perspective since I'm not representing any of the firms up here. But I do have a comment on this, which is that John mentioned that the SEC was looking into this space on the IA side. I mentioned that we're looking into it at FINRA on the broker-dealer side. I think it's common we will get together and talk about what we're seeing in both of those financial services channels with the goal -- I would think at least on the FINRA side -- to push out some effective practices that may come out of these inquiries and these studies.

You know, whether it turns into a rule-based approach or not, I can't say whether it would be principle-based or more prescriptive -- premature to say. But clearly I think we recognize that this is a rapidly changing environment. So there has got to be a component that allows firms to be able to adapt and a recognition that the environment does change and we need to be cognizant of that.

MR. TITTSWORTH: So I agree with a lot of what's been said. The experts I talked to -- their number one thing was please resist the urge to impose rigid or prescriptive requirements. I heard that over and over and over again. There's a framework in place right now for investment advisors.
Mark mentioned Reg SP, Reg S-ID, business continuity planning flowing from an advisor's fiduciary duty, 46 state notification laws for better or for worse.

I think the BCP analogy is really good and compelling. And, Drew, I know that you all have in your NEP published earlier this year said you're going to go out and do some of the things that Dan's group is doing. I think gathering information could be very helpful. What you did last August post-Hurricane Sandy in a report published with FINRA and CFTC -- a very good four- or five-page document after you examined 40 firms and said, "Here are some best practices" -- that's really helpful.

And I think at the stages -- you know, listening to the panel, the first panel today of all those experts -- they were saying, "We're in the early days here. We all need to chip in." And so I'd urge you let's get -- work together. And thank you again for convening the roundtable. I do think just getting information out there is very helpful.

MR. THOMAS: It would be great if all the regulatory bodies I have to work with actually talked to each other. So if you guys could collaborate with the others so that we don't have conflicting controls, that would be really great.

MR. STARK: Yes. I don't -- I'm not really representing anyone. You know, our -- at Stroz Friedberg it's all about seeing the truth and coming in independently and trying to figure out what happened. But I looked at the OCIE module that they're using for cybersecurity. And I think about cybersecurity disclosure and that there probably weren't going to be many enforcement actions. I didn't really expect that when I was talking to clients in the sense. You know, there isn't a big appetite in Enforcement for non-fraud-related disclosure violations. Same thing when we did Reg FD -- you know, I didn't expect a lot of cases when that -- I think it was just the message.
But I look at the OCIE module, and I am worried that a lot of companies just won't be able to give you the satisfaction that you will feel that, hey, this company is doing anything; this regulated entity is doing everything they can.

And I hope you'll be judicious in your referrals to Enforcement because I think it's in all these companies' best interests to do the right thing and to protect the data. Otherwise, they're going to be out of business. So chances are if they're slack or they're not doing what they should, they need to be told what they should be doing and encouraged.

But the thought of an Enforcement referral -- one thing I've learned in the five years of private sector consulting is the thought of an Enforcement referral is really scary for a lot of these entities. And if it's not -- if you're not sensing a fraud in the sense when you're reporting this, hopefully you'll consider that when you're making these referrals because once Enforcement comes in, it's an entirely different dynamic.

MR. SCHIMMECK: And probably just -- and I agree with all the other points that are made but just a couple other ones. So one is regulation isn't bad because it does provide focus. It does provide the impetus especially when -- you know, from the top of the house for those individual -- individual practitioners to get the resources they need. So when you go down the list of the small and medium firms, well thought out regulation can be additive.

And I think that's the -- that's another piece -- is be additive in the regulations, a lot of stuff, as you heard, that's going on out there both that the industry is doing, the other regulators are doing, the federal government's doing that -- you know, there's a lot of activity. So where you can be -- you know, do no harm, be additive to the situation and, again, provide that focus.

One other thing.
I think the NIST -- the development of the NIST framework, I think, is a good example of, you know, two things. One is focus on the outcomes. Again, compliance as an outcome is not what we want. We want better protections. We want the industry protected, you know, in a better manner. And being in compliance -- the attacker doesn't care. It doesn't help. It -- you can be in compliance and still be vulnerable.

The second thing is the partnership piece. You know, I think anybody at this table stands ready to partner and to, you know, add information, provide context, point the Commission in the right direction in regards to what's additive, what's not additive, what could be helpful, what could be harmful. And we've done that over a year with NIST.

I think when that came out a year ago nobody thought 16 sectors could get together and come to some common ground in regards to what would be the correct outcomes around this. We have. We're going to put those into practice. That's going to be, I think, you know, very much the base and a catalyst for this year in regards to improving protections. So, you know, I think that's a great example of like how things could work.

Lastly, one other thing just kind of hitting on this is, as -- and this is less on the cybersecurity side but more just in general regulation, I think. Something to consider is, as regulations are being thought out -- and, you know, CAT could be an example. CARDS on the FINRA side is another example. Think through the security implications of what you're putting into practice. You know, we do a cost benefit analysis. We realize is this the best way, you know, is this reducing paperwork; is it saving money; you know, what's the cost going to be. But by centralizing data are we creating more risks out there?
So I think just something to always consider, always to be a part of, you know, any consideration as you're putting regulations together is what are the security implications? Are you putting the sector at greater risk by putting that regulation into place, or -- you know, is -- or, if you are, you know, how do we mitigate that, and how do we think through that process as we go?

MR. GRIM: So before I turn it over to Jim to wrap things up, I wanted to be sure -- Chair, Commissioners, do you have any questions you want to ask our panelists?

CHAIR WHITE: No. It's been very, very helpful. I mean very helpful.

COMMISSIONER AGUILAR: No. But before you wrap it up, since you're about to wrap it up, not only this panel but all the panels that we've had today have been fantastic, well informed, knowledgeable, articulate, and I've gotten a lot out of it. I think I speak for my fellow Commissioners and the Chair and others who got a lot out of it.

But I don't only want to thank the panels for that. I've got to thank the people who picked you because they could have picked others. So I want to thank the staff, and I want to thank my fellow Commissioners for helping to identify the people that were well informed, articulate, and knowledgeable. So kudos to the staff for setting up a great roundtable.

COMMISSIONER STEIN: I second that. But I want to thank all of you for your, you know, volunteer activity today, you know, coming in and talking to us. And I hope, again, this can be a dialogue that we continue and that we can try to stay dynamic as you try to stay dynamic and ahead of the curve.

MR. BURNS: Thank you. Thank you very much. And with that I believe my time is up. The reason they put 15 minutes for closing remarks is because they were counting on me to do it in about 30 seconds, so let me try.
Thanks have already been expressed to this panel, and we owe a debt of gratitude to all the others who came today. If one were trying to summarize this rich stream of dialogue, you could say that at the outset Tom and Keith tried to -- fostered from federal and private sector reps, touched base with them and learned from them, had emphasized for them something we all know very well, the importance of information sharing when we're responding to cybersecurity challenges.

In the second panel Keith took us through an exploration with public companies of how right now they're disclosing these issues -- some of these issues as they arise and how that disclosure might be enhanced.

After lunch there was a great discussion, kind of lousy moderating but a great discussion of cybersecurity issues facing entities that we at the Commission have viewed as part of our critical market infrastructure. And we had a chance to talk about some of those threats as well as the importance of a number of things, testing, whether individually, by firms, or across the industry, and there was some conversation about that in this panel -- then ensuring firms are focusing on risk management and good internal controls and, again, a theme we heard throughout the day, information sharing.

Finally, Drew and David and this terrific panel drilled down a bit more with particular market participants, got a better picture of what you are seeing, how you're planning, and what's keeping you up at night.
And kind of some bottom lines that seem to be coming up from you and earlier and which suggest potential next steps are things we all need to wrestle with, again, how we foster better information sharing and sharing of best practices, principles-based guidance, tailoring requirements so they can be adapted to firms of varying profiles and ensuring that small firms are part of that dialogue and that they're encouraged in the process and not discouraged; generally encouraging good planning, testing, communication to playbook. That's a term we heard quite a bit during the day.

And Katheryn touched upon it; others have too. Kind of the great frontier for us all, recovery planning seems like it's a big challenge that you're already all engaged in. And, you know, Quantum Dawn, other plans for testing -- those are very important steps. Obviously we had Superstorm Sandy that hit us, the SIP incident that hit us, and we were learning things from those in real time, how much better to be doing what you're all committed to doing, which is trying to do some testing in advance of that.

If there were a bottom line, I would say that I didn't get a sense from anybody that anyone's resting on their laurels or taking it easy or sitting back -- a real sense of the importance of being devoted to the challenge and not just vigilance but a dynamic kind of vigilance. And as an aside, I know what I'm going to go tell my kids they need to grow up and become -- certainly not a lawyer.

Thank you all very much again. And could I quickly just say -- I'll leave people out, but I'm sure on the Chair's behalf and the Commissioners' behalf some of the individuals who have helped us a tremendous amount today, Jennifer Riegel and Catherine Brown from Corporation Finance, David Joire from the Division of Investment Management, Christian Sabella and Cristie March, Shauna Sappington, and George Makris from my division, Trading and Markets. We, again, invite your input in the public comment file.

And may I say thank you to the Chair and the Commissioners for participating throughout the day. I think it speaks to the importance of the issue and the quality of the panelists that you've devoted so much time. And we appreciate your leadership and devotion to these issues. Thank you all very much.

(Whereupon, at 3:07 p.m., the roundtable was concluded.)

PROOFREADER'S CERTIFICATE

In the Matter of: CYBERSECURITY ROUNDTABLE
File Number: OS-326
Date: March 26, 2014
Location: Washington, D.C.

This is to certify that I, Nicholas Wagner, (the undersigned), do hereby swear and affirm that the attached proceedings before the U.S. Securities and Exchange Commission were held according to the record and that this is the original, complete, true and accurate transcript that has been compared to the reporting or recording accomplished at the hearing.

_______________________          _______________________
(Proofreader's Name)             (Date)

REPORTER'S CERTIFICATE

I, Beth Roots, reporter, hereby certify that the foregoing transcript of 221 pages is a complete, true and accurate transcript of the testimony indicated, held on March 26, 2014 at Washington, D.C. in the matter of: CYBERSECURITY ROUNDTABLE.

I further certify that this proceeding was recorded by me, and that the foregoing transcript has been prepared under my direction.
Date:__________________________

Official Reporter:__________________________

Diversified Reporting Services, Inc.