Podcast: R&G Insights Lab – Culture & Compliance Chronicles: Using Data Ethically
In this episode of R&G Insights Lab’s podcast series, Culture & Compliance Chronicles, litigation & enforcement attorney Tina Yu continues her discussion with four Ropes & Gray partners and counsel who focus on data, privacy and security across the firm’s enforcement and transactional practices. Picking up from a conversation about the role of data and its commoditization, Ed Black, Rohan Massey, Rosemarie Paul and Clare Sellars delve into how the regulators in the U.S. and UK are approaching this new world of data, where companies are trying to find their footing in navigating the possibilities stemming from growing data analytics capabilities, but where safe harbors may still be a ways off.
Tina Yu: Hi everyone, and welcome back to Culture & Compliance Chronicles, a podcast series focused on data analytics and the behavioral sciences approach to risk management, brought to you by the R&G Insights Lab. I’m Tina Yu, a litigation & enforcement associate at Ropes & Gray. In the second part of this two-part discussion, I’m once again joined by partners Rosemarie Paul, Rohan Massey and Ed Black, and counsel Clare Sellars. In part one, we talked about the role of data, its commoditization and risk management strategies. In part two, we’ll discuss regulators’ evolving viewpoints and navigating the new world of data. Circling back to where we started: data is everywhere – it's part of our daily lives, it's become a commodity, it's become a tool for many organizations and for the government. Rosemarie, this is a question for you: given the role that data plays in our lives, what are the regulators thinking? What are their views on this?
Rosemarie Paul: Well, it's interesting, and Ed covered it really well from the U.S. perspective as to how use of data is being analyzed and reviewed, not least in the context of whether or not it gives market participants an advantage that isn't in the traditional form of inside information or insider dealing, because this is data that doesn't fall within those traditional definitions. The UK Financial Conduct Authority is actually in the process of conducting a call for input on accessing and using wholesale data. A call for input is basically the regulator saying, "We'd like you to participate in this and give us some information so that we can take a view as to how we might approach regulation and enforcement in the future." The reason they feel they need to do this is the new forms of data and analytical techniques that are being used across wholesale financial markets – the fact that access to data is needed now to identify investment opportunities, evaluate positions and meet regulatory obligations. But they have observed that, as the demand for data increases, firms that actually generate the data can use or market their data in ways that, from a regulatory perspective, could be perceived as creating poor outcomes for users, and ultimately, have a knock-on effect on consumers. The example they give would be data generators that increase charges for, or limit the availability of, data. Considering what the barriers to accessing data are is really key to this.
Understanding how accessible technology is for analyzing the data, and whether or not that gives you an unfair advantage – these are new questions that, until the technology got to the stage it has in the last few years, regulators hadn't really had to grapple with. They are considering whether or not there are potential harms to market integrity, whether or not there are potential barriers to competition as a result of this inequality of access to data or access to analytics tools for data, and whether or not the wider use of algorithms could give rise to new types of market abuse or collusive behavior – and I think, interestingly, the regulator is also asking what ethical considerations should be taken into account. It feels to me like a very open conversation, a very open approach by the regulator, and it will be very interesting to see where they land on this. The call for input closes, I think, in October 2020 – it's been extended because of COVID. But it's going to be a discussion, I think, on both sides of the pond as to how the regulators approach this new style of use of data that is outside the insider trading scenario of a person who takes information received in the course of their employment and trades on their own account – that traditional insider analysis is going to change, I think.
Tina Yu: I just think it's so fascinating that this is an area where regulators are learning alongside all of us and really trying to find ways forward. While we're still grappling with all these major changes in our world, especially in the data space, what are some of the items that we can keep in mind, both ethically and from a regulatory standpoint, to make sure that we're doing the right thing? We've talked about effective compliance programs in the past, and a lot of that has been in the corruption context. So, what do we do about that in the data context? We did discuss earlier some ways to help mitigate risk, but what are some of the really key items – maybe not black letter law, but tips drawn from experience, like we share in the corruption context – on how to make this easier for people going forward and really help everybody find their footing in this very fast-changing world?
Ed Black: Well, the question of, “How to do the right thing?” is of course heavily loaded. We can't answer all of it, certainly not some of the philosophical aspects of that question, but I think that question is most keenly felt at the moment from a day-to-day business perspective inside those organizations that are simultaneously charged with handling data in a fashion that's not only legally compliant but reflects their corporate values, which, for most organizations, embrace fair dealing and honesty, while at the same time, facing immediate pressure inside and outside their organizations to be the smartest managers and traders that they can be. And of course, being a smart manager and trader in the current environment means making the best use of data in your analytics set. So, how do you balance those two pressures? Again, the complete answer is not available. One of the things that we're seeing in the U.S. that does address this particular tension is the development by trade associations of diligence standards for the acquisition of data from third-party vendors.
The way in which a lot of this data now moves around is through something that people have sometimes referred to as the “alternative data market.” These are data brokers who make data sets available that they've collected from all of these digital platforms and from, as you mentioned earlier, wholesale market digital platforms, and it can all be bought and sold and traded in the alternative data market. So, you have that pressure to be a smart manager – should you just race into the alternative data market and buy whatever's available? – while at the same time, you've got legal compliance and a corporate culture of honesty and transparency. Right at that inflection point, we're seeing a lot of private organizations actually say, “Our best practice should be to develop agreed-upon standards for diligence in pre-clearing and approving data that comes into our organization, and as industries, we should have a shared understanding of what that means.” Examples include standard diligence materials published by one organization called the Alternative Data Council; the Alternative Asset Management Association has similarly published diligence standards – these are attempting to manage that specific tension. Of course, the question about “What is ultimately right?” in some larger sense is still out there, but diligence and onboarding standards can be a way that, at least in the very practical, commercial version of that question, you can both try to do what's right and be a brilliant business decision-maker with the keen eye that you need in order to do your job.
Rosemarie Paul: That's so interesting, Ed. I think it ties in really squarely with what we discuss in behavioral science, but also, very much in the UK's financial services regulation, with its emphasis on the culture of the organization, having codes of conduct and values-based guidance. So, as you say, if you're under pressure to be the best manager, to make the worthiest decisions, but your firm's values apparently tell you something else, that's a problem. And as to what is the right thing, the appropriate thing – no one's got an answer, plain and simple, on that. But the basics that Rohan alluded to – not taking more data than you need, being proportionate, not keeping it for longer than you need, being transparent about how you're using it and what you're using it for – these are simple, but not easy, high-level concepts. I think they really end up being baked into the ethics of data use. Rohan, what do you think?
Rohan Massey: I think that's absolutely right, Rosemarie. As we look at data, we certainly look at the regulation of personal data, which is growing year after year, across the globe. Every jurisdiction is putting in its own restrictions and its own management on all of these areas. So, I think it's more and more important that you focus on the limitations of the data you're collecting and limitations for use to ensure that you can be compliant, because we also know that regulators around the world are increasingly stringent in their enforcement and are being given far greater powers, both monetary and in the ability to stop processing, which can have a really serious material impact on your business if you are not compliant with the regimes that you're subject to.
Ed Black: So, Rohan, in thinking about those compliance burdens, one of the questions that we commonly see coming into the firm at Ropes & Gray deals with derivative data. We've talked a lot about collecting data, and most of this conversation has focused on what might be called “primary data” – that is, data directly collected around consumer behavior or a particular transactional setting. But a lot of data that's out there is already one or two steps removed – it's already been aggregated in some way, already been analyzed in some way. The position that the alternative data market appears to be taking is that if you process the data enough – and it's a good question as to how much is enough; different companies differ – you can migrate the data out of compliance burdens. You can put it into a posture where it's no longer associated with health care, no longer associated with an issuer of publicly traded equities, where it's somehow free of compliance burdens. Now, is that true, and how do you see derivative data in relation to the compliance world?
Rohan Massey: A very challenging question. It is possible, I think, to de-identify data. Where we have a regulated space, again, it is usually linked to the personal aspects of data. So, if you can take it away from that aspect and make it purely statistical, and get the commercial value you need out of the data at that point, then yes, I think you can take it out of the compliance regime. But I think it becomes increasingly difficult when we look at the volume of data sets that we have and the ability of any specific organization to look right across its available data and interrogate it, because the more you are able to do that, the greater the likelihood of re-identifying the data. And if we take the European perspective, personal data is any data that has not been irrevocably anonymized. So, if you can bring together data sets A through Z, then there is a high chance that you will still be able to identify what may appear, on its face, to be anonymous or pseudonymized data – you can re-identify it because you can draw those aspects together. So, I think there is a challenge there. Some of it goes back to exactly what Rosemarie was saying – if you are only collecting very limited data sets, that will help you to make sure that you can stay in this unregulated area of statistical information that is not subject to the increasing burdens of the personal data protection and privacy regimes around the world.
Rosemarie Paul: I think it's also worth noting that we are perceiving data very much as an asset. And if you're holding a lot of data, that in itself can attract risk as well – through cyberattacks, through leakage, through loss of data – and the impact around that. I think that's one of the factors that has to be baked in as well, and the regulators are really grappling with it. My own view is that even if the data is synthesized, even if it's stripped back, if it can be perceived as giving a firm an advantage – for example, in a market trading environment – that may be something that the regulators decide to subject to scrutiny, but the jury's really still out on that.
Ed Black: So, Rosemarie, I think that's a great point. And one of the practical consequences of this trend you're talking about – that is, regulators trying to keep up with the data and having these things currently under analysis – is that, as a practical matter, when advising companies engaged in commercial transactions, we don't have the tools that we have in many other settings. In many other settings, exposure to liability, and even the ability to generate income or equity growth, can be managed within a clear legal environment that provides, essentially, a safe harbor alternative – that is, we can say to the client, “If you just put this language in the contract, if you just adopt this compliance checklist, if you just keep this kind of record for an audit trail, then we can be 99% confident that everything's going to go smoothly, you're in a safe harbor, and you can proceed with whatever your proposed business plan is.” In the current environment, we don't have safe harbors – we have benchmark practices that are being developed, for the most part, privately by commercial actors. And there is some hope that there's safety in numbers – that if the industry as a whole identifies certain practices and everyone follows them, the regulators will either go along or understand that they can't go counter to widely followed practices without giving notice, lead time and an ability to respond. But we don't have safe harbors, and as a result, the collection, storage and use of data is going to be subject to a certain amount of irreducible risk, probably for at least a few years to come.
Final note, Rosemarie, on your point about regulators responding if things just don't look good. In this space, in addition to all of the industry-specific regulations – securities law for traders, health care regulation for medical information, and so on – at the state level, every state has a general unfair business practices statute. And in each state, the attorney general is free to bring actions under those statutes if things are just fishy, and we've seen that used already in the data setting. There's a well-known case in New York, brought by the State Attorney General of New York, involving a data company that had a market-moving data set that was widely available and published at a certain time. They approached certain key market players and said, “If we drop this data set to you two seconds ahead of dropping it to the rest of the securities trading world, would you pay us some extra fees?” Of course, they did. The state attorney general heard about the practice, brought an unfair business practices action under the Martin Act, and shut the whole thing down. There was no need for a data privacy statute, no need for an industry-specific regulation – the action was taken simply as an unfair business practice based on harm to the market as perceived by the State Attorney General of New York. So, it's a world where the temperature is a little bit higher all the way around, and it will be until things get sufficiently settled that we have safe harbor-type tools to use.
Tina Yu: Ed, Rosemarie, Rohan and Clare – thank you all again for participating in this very insightful discussion. And thanks to you as well, our listeners, for tuning in to our Culture & Compliance Chronicles podcast series. For more information, please visit our website at www.ropesgray.com. And of course, if we can help you navigate any of the topics we discussed, please don't hesitate to get in touch with us. You can also subscribe to this series wherever you regularly listen to podcasts, including on Apple, Google and Spotify. Thanks for listening.