

Podcast: Data, Privacy and Cybersecurity Under the Biden-Harris Administration


Time to Listen: 24:30
Practices: Data, Privacy & Cybersecurity; Litigation; Regulatory Enforcement & Civil Litigation; Incident Response and Preparedness; Privacy & Cybersecurity Compliance and Counseling

This edition of Ropes & Gray’s Capital Insights series captures our latest thinking on the coming changes in privacy and cybersecurity. In this podcast, join Ed McNicholas, co-chair of the data, privacy & cybersecurity practice, and Fran Faircloth, an associate in the data, privacy and cybersecurity practice, both based in Washington, D.C., as they look around the corner and discuss the developments in the White House on privacy and cybersecurity issues and how those changes will echo in federal agencies, the states, and internationally.



Transcript:

Fran Faircloth: Hello, and thank you for joining us on this Ropes & Gray podcast. My name is Fran Faircloth and I am an associate in the data, privacy and cybersecurity practice here in Washington, D.C. I’m joined today by my colleague Ed McNicholas, co-chair of the data, privacy & cybersecurity practice. Ed represents technologically sophisticated clients in litigation, investigation and counseling matters related to complex data, privacy and cybersecurity issues. Ed has significant experience with investigations and class action litigation related to cybersecurity incidents. He also advises clients on federal, state, and foreign privacy and data security requirements, including in the areas of financial privacy, health care privacy, communications privacy, ad-tech, data analysis, cybersecurity, and national security.

Ed previously served as an Associate Counsel to President Clinton. In that capacity, he advised senior White House staff regarding various Independent Counsel, congressional and grand jury investigations. Ed has developed unique experience representing clients in the midst of media-driven legal challenges. His crisis management skills are particularly useful in coordinating the swirl of complex litigation, congressional hearings, and federal and state investigations that can follow from major privacy and cybersecurity incidents.

Our podcast today is part of our ongoing Capital Insights series where we capture our latest thinking on developments from the federal government that might affect our clients. We’ll discuss the changes in the White House on privacy and cybersecurity issues and how those changes will echo in federal agencies, the states, and internationally.

Ed—to start off, this past year has seen so much activity in the privacy and cybersecurity space. What do you see as the key issues facing the Biden administration in those areas right now?

Ed McNicholas: Fran, first, thanks for that wonderful introduction—it’s a pleasure to be here. Cybersecurity under the Biden administration will be the paramount challenge. We have seen extensive attacks from Russia, in particular, against the election infrastructure and against several pieces of the U.S. technology infrastructure. Microsoft has been attacked. Amazon Web Services has reportedly been used as part of those attacks. We’ve seen trusted pieces of software used against American companies and the government.

Now, I represent one of the key figures in the SolarWinds issue, so I’m not going to speak directly to that case. But I did want to say that there is no real dispute now that the Russians are using American infrastructure to attack American companies. And we have pretty clear evidence that the Chinese are doing the same.

The reason they’re using American infrastructure is that the monitoring for attack traffic that we could do overseas is not allowed in the U.S. because we don’t want our intelligence services surveilling U.S. servers in the same way that we can surveil overseas.

This is a bit ironic because we have this ongoing discussion with the Europeans about whether there is more privacy inside the U.S. or outside it, and in fact, in terms of privacy from surveillance, there are far greater protections inside the U.S. than outside. That’s why we see this sort of attack using infrastructure in the U.S.

It will be up to the Biden administration to figure out a way of fighting the Russians and the Chinese and others who would do us harm in the U.S., while at the same time, preserving privacy and civil liberties. General Nakasone has made clear that he considers that to be a blind spot for our military and intelligence communities, but we have to be careful not to trample civil liberties or disrupt consumer confidence in the cloud while eliminating that gap in our knowledge. The challenge here will be to ensure that U.S. companies can feel comfortable working with other U.S. companies and working globally without the threat of a cybersecurity attack from foreign nation states. 

One of the key things that will be part of any solution is to adopt the recommendations of the Cyberspace Solarium Commission. The Solarium Commission report is probably the leading document on cybersecurity strategy in the United States currently. One of the things that the Commission report emphasizes is the importance of defending forward and making sure we go after the attackers where they live. Second, there needs to be robust cooperation between the private and public sectors to make sure that companies can share information with the government, and that government will share information with the companies. Obviously, the government needs to protect sources and methods, but there needs to be a more robust interchange between the private sector and the government.

And also, that companies using ISACs and ISAOs—those are Information Sharing and Analysis Centers, and Information Sharing and Analysis Organizations—can come together without the fear of violating privilege or having antitrust issues creep in, so that they can share information about ongoing attacks and the methods that are successful in stopping them. Because if we have information sharing between companies and have information sharing with the government, and we have government sharing information back to the companies, we can fight together, and I think together we can stop these attacks on our shared infrastructure. Organizing that shared effort will be the key challenge of the Biden administration for cybersecurity.

Fran Faircloth: Who do you think will be the key players in regulating privacy and cybersecurity?

Ed McNicholas: One of the interesting things is I think the White House will be a key player in cybersecurity going forward. Over the past few years, we’ve not seen a lot of White House leadership on cybersecurity issues, on privacy issues, but I think that will change. The other leading player will be the Federal Trade Commission. In many ways, the Federal Trade Commission will be in the driver’s seat over the next few years. They will have the full backing of the White House, which is not something they’ve enjoyed recently. There will be a new Chair. Now, you can’t see this on the podcast, but I’ve got a crystal ball in front of me, and the two main contenders that appear in my crystal ball are the acting FTC Chair, Rebecca Slaughter, who, whether or not she becomes the full-time Chair, will remain a commissioner and a significant voice at the FTC, and the other image in my crystal ball is Attorney General Karl Racine, who’s attorney general of the District of Columbia currently. I’ve actually had the pleasure of working in different parts of my career with both of them, and know that they would both be excellent choices for Chair. We can’t go wrong here.

Commissioner Rebecca Slaughter, I think, would be particularly well-known to people because she was the commissioner who dissented from the Facebook fine, not because she thought it was excessive—it was a $5 billion fine—but because she thought that the Federal Trade Commission should have gone after Mr. Zuckerberg personally, and that the FTC should go after other executives who profit from what the FTC considers to be unfair trade practices. 

Attorney General Karl Racine and I served together in the Clinton White House on the investigations team. Now, he’s the attorney general for the District of Columbia and has seen D.C. launch its first data breach notification law, but there’s been a good amount of speculation that he might be moving over to the Federal Trade Commission—maybe not—he’s got a lot of opportunities in front of him. But I think both of these potential candidates would lead the Federal Trade Commission to enhance the focus on what I would call “dignitary harms.”

Let me explain that for a second—under Republican administrations, the Federal Trade Commission tends to focus more on cost-benefit analysis and to look for monetarily quantifiable harms. Under Democratic administrations traditionally—these are broad generalizations—the Federal Trade Commission has tended to look more for dignitary harms, that is, harms people suffer that don’t have any tangible dollar value. Now, the best common law example I can give of this is peeping tom cases—guys creeping about and looking into windows. There’s no monetary harm from that, but I think everyone would say there’s a privacy harm. That privacy harm is real, even though it doesn’t cost anyone a single dollar.

That’s one of the key tricks about privacy that will come into stark relief in the next four years. Much of what we consider to be a common-sense privacy harm is a dignitary harm. If you found out that someone had cancer or HIV, or even COVID, there’s normally no tangible harm from that. If they actually lost their job because of it, there would certainly be a cause of action. But generally, we’re focused on social stigma and a sense of creepiness.

We’ve also seen the Federal Trade Commission moving in a very interesting way to ensure that if companies commit an unfair or deceptive trade practice, and use that practice to gain information or access to data from which they derive algorithms or other insights, then, if the Federal Trade Commission catches them, it will not only force the company to pay a fine or operate under a consent decree, depending on the company, but also to give back the data and delete the algorithms, so the company keeps no profit from illegal access to data.

Over the next four years I think we’re going to see companies becoming much more aware of the need to effectively govern data—something we call “data stewardship” for organizations. And I think a trend from Europe that we’re going to see becoming much more popular here is having a code of ethics within the company that decides how they govern data, and I think that code of ethics will be crucial in convincing the Federal Trade Commission that any given company is doing the right thing with respect to data.

Fran Faircloth: How do you see the Biden administration repairing the rift with Europe over data protection and addressing the destruction of the Privacy Shield agreement by the European Court of Justice?

Ed McNicholas: Fran, that’s an excellent question. The President appears to be prioritizing the negotiation of a new data protection equivalence agreement with the European Commission. The core compliance infrastructure that was established under the Safe Harbor, and transitioned to the Privacy Shield, should work for the next agreement as well.

The concerns raised by the CJEU about bulk surveillance are entirely outside of the experience of the vast majority of companies, and the decision, I think, has been frustrating for them. Moreover, as part of the NATO alliance, the U.S. provides a good amount of its surveillance to the benefit of European security. And, quite obviously, European governments also conduct surveillance that is not dissimilar to U.S. surveillance. I think the Biden administration may have more success in reaching a common-sense agreement with Europeans because now there is more trust that surveillance under President Biden will be more consistent with the shared notions of human rights that are common to all of these constitutional republics. 

Substantively, U.S. businesses can operate quite well under reasonable versions of the EU privacy rules, and there is certainly considerable overlap between the requirements of the European approach and the movement we’ve seen in several U.S. states. So I think the president should be successful in getting a new agreement with Europe.

Fran Faircloth: You mentioned that movement by the states toward passing comprehensive privacy laws, with California leading the way with the CCPA and the CPRA, Virginia now joining them, and Washington, New York, Minnesota, and Florida, among others, considering similar comprehensive privacy bills. Do you think this activity by the states will finally prompt Congress to pass a comprehensive federal privacy law?

Ed McNicholas: You know, that’s a difficult question. If I were in my office, I could point to a binder that I have of privacy proposals over the years. One of them was the Kerry/McCain bill, and it was bipartisan—John Kerry, John McCain—as bipartisan as you get. Two well-respected senators came together, put this forward, and everyone thought it would surely pass. It did not. The history of U.S. privacy legislation has been scarred by the internecine scrum on Capitol Hill over the jurisdictions of the various committees. Right now, we have privacy set out in sector-specific laws, and the various committees have jurisdiction over these sectors, and they like to keep their jurisdiction because they think they are very effective in it.

To have a comprehensive U.S. privacy law, we need these congressmen to come together for the common good. They could have a select committee on privacy and cybersecurity. There are many ways to solve this. And I certainly want our legislature to have that sense of the greater good, such that they’d be willing to give up some power from the committees and go with an overall privacy bill that would be useful in terms of protecting privacy and enhancing cybersecurity. The basic principles are there, the bargain is there to be had, but it will take some real courage from Congress to get that bill through.

The overall outline of that bill would no doubt be notice and choice. It would respect the sectors that have been particularly heavily regulated, like medicine, financial services, and telecom—I think those will remain separate sectors under this bill. But the vast internet is not regulated by any federal bill, other than perhaps the Federal Trade Commission Act, and this new privacy and cybersecurity bill would implement a notice-and-choice mechanism where people would have some clear explanation of how their data is being collected and used, and they would have some opportunity for choice, maybe to opt out, perhaps some rights of access to that data, such as to correct what data a company has about them or even just to find out what data companies have about them. I think this will be very similar to the DSAR process, the Data Subject Access Rights process we see across Europe.

Now, one of the crucial questions in any legislative bill will be whether there’s a private right of action and whether a federal bill would preempt state bills. Businesses generally will not sign off on a bill unless it preempts state bills. If we have a federal standard, business will want one uniform federal standard, which seems like a fair ask. However, some of the states currently have a private cause of action for consumers to enforce their rights, oftentimes through class actions, and the plaintiffs’ lawyers will not sign off on a bill unless it has a private cause of action. So far, a bill that both preempts state bills and has a private cause of action has not been a viable political deal on Capitol Hill, but perhaps it will become one in the future.

Fran Faircloth: What if Congress can’t get its act together, do you think we’ll see more success in the states—will we end up with 54 different comprehensive privacy laws, the way we have 54 different breach laws in the 50 states and the various territories?

Ed McNicholas: We may well. I would love to say there was going to be one federal bill and that was going to keep it all very clean, but it seems that the states here are willing to act. And there are two trends that are emerging here—one of them is coming out of California, another one’s coming out of Virginia, where I’m sitting right now. They are both echoes of the GDPR, however, they are significantly different sorts of echoes. 

The California bill seems to have taken the text of the GDPR—the plain text—developed ideas from it about how privacy is regulated in Europe, and, working from that plain text, tried to put it into California law.

The Virginia example seems to have looked at the GDPR not as plain text, but as an expression of what has developed since we had a data protection directive 15 years ago, along with the 15 years or so of common law interpretation from the various data protection authorities. And so, although the text of the GDPR might say something one way, the way privacy is actually protected in Europe can often be a bit different. There’s a lot more nuance, proportionality and reasonableness—for instance, the notion of legitimate interests is essential to how privacy is practiced in Europe, although California didn’t pick it up at all. The actual European approach to privacy is more subtle, and there’s a lot more gray as to how data is actually protected in Europe. I think the Virginia bill does a better job of picking up the actual manner in which Europeans protect data.

Significantly, also, in Europe the main driver for enforcement is regulatory—it’s not set up for plaintiff lawyers to try to play gotcha on complicated technical requirements. California, by contrast, has a private cause of action and a highly textualist approach to data protection. The California privacy bill provides a private cause of action with respect to data breaches where there’s a lack of reasonable security, which of course is undefined, and California allows statutory, per-person damages. This creates a significant risk that the California bill will be enforced in a way that is vastly more aggressive than what we see in Europe.

In contrast, the Virginia law does not have a private cause of action. Virginia has attorney general enforcement and a much greater focus on ensuring that there is protection of reasonable expectations of privacy. It’s not focused on a technical gotcha approach to privacy, but it does force companies to actually engage in data governance. For instance, one of the things Virginia would require is a data protection impact assessment, by which they mean a consideration, set down in writing, that companies would prepare when they roll out sensitive systems, let’s say a new system involving sensitive data. It forces a company to put down, in writing, its consideration of the privacy interests involved and then to document how it is addressing those privacy interests. Significantly, this document would most times be prepared by counsel, but it could be shared with the attorney general, who would be the enforcer in Virginia, without vitiating the privilege, because privilege is a question of state law and the state privacy bill in Virginia protects the privilege even if you share the document with the attorney general. That’s a much more nuanced approach from Virginia than we’ve seen in California.

Of course, California just had a recent privacy initiative, and they are proud of being on the cutting edge of privacy, and I’m sure they will remain there. In fact, they will have the first full-fledged privacy enforcement agency, the California Privacy Protection Agency, which might wind up being something like a European data protection authority. It will be interesting to see whether, as the new agency is established and begins operations, more nuance slips into the California approach so that the privacy debate comes together.

Of course, state after state, we’re seeing these bills pop up, some of which are like California’s, some of which are like Virginia’s, some of which are a bit different. Legislatures across the country will be going back and forth deciding: do we want to be more like California or more like Virginia? And I wonder if we’re going to see one take hold.

Hopefully we won’t have 20 states with one approach, 20 states with another, and 10 with approaches of their own—that would be a complete mess. But maybe we’ll see the California approach take hold, and the CCPA will essentially become the law in 40 states. At that point, I think Congress would step in.

Although, in truth, many people thought this would happen with state data breach laws. Once we had about a dozen state data breach laws, Microsoft actually led a lobbying campaign to get a federal breach bill together. Think about that—this was a tech company going to Capitol Hill asking for federal legislation—rarely happens. But even with Microsoft going to Capitol Hill, asking for federal legislation, it was not successful, and so today we have 54 state data breach laws because we have D.C., Guam, Virgin Islands and Puerto Rico. And that same thing might happen with state privacy laws.

Fran Faircloth: Do you think the courts will have a role to play in this space in the next few years?

Ed McNicholas: To my mind, the Supreme Court is going to have a significant role to play on data privacy regulation. The First Amendment and its protections for commercial speech have been evolving rapidly, and the current Supreme Court is strongly in favor of expanding First Amendment protections. It’s a core guarantee of the Constitution, and very much in favor with the current Supreme Court.

For instance, there had been a move, almost 20 years ago now, by the Federal Trade Commission to have an opt-in as opposed to an opt-out privacy mechanism for uses of telephone information, called CPNI, and that was struck down by the Tenth Circuit on commercial free speech grounds.

Later, we’ve seen bills trying to protect physician privacy in New England struck down by the Supreme Court, also on commercial free speech grounds. And the core idea of European privacy is what I call the “purpose limitation”: if you’re going to collect information for a particular purpose, you have to use it only for that purpose. So let’s say I sell you a pair of golf shoes. I’m supposed to use that information only to send the golf shoes to you, and maybe for warranty claims or something like that, or maybe to sell you another pair of golf shoes later, but I can’t take that information and sell it to golf club manufacturers or people who sell golf balls or country club memberships so that they can send you marketing information from these unrelated third parties. Now, in fact, you may well appreciate getting a discount on golf clubs or golf balls if you buy golf shoes, but many of the provisions in the California approach to privacy would make it much more difficult to do that. This purpose limitation, though, is in significant tension with the First Amendment, and it could well be vulnerable to an attack.

Now, the Virginia approach to privacy, in truth, might restrict the data that you share with third parties to provide goods and services to them. It will be an interesting question whether the privacy protection becomes opt-out or opt-in. But if it becomes opt-in, I can see the Supreme Court saying, “If companies collect data in a free market, they are free to use that information because it is the business record of the company. They own that information. It’s their information and they can do with it as they like, subject to some restrictions for privacy interests.” But those privacy restrictions need to be accomplished by the least restrictive means that protects privacy. The European approach has chosen a method that is certainly not the least restrictive means. And to the extent that we try to take the European approach and import it wholesale into the U.S., we could well see the Supreme Court strike that down.

So I think as legislatures are developing privacy law, it’ll be important for them to keep in mind that there’s a commercial free speech backdrop here—the courts are not out of this debate—and I would not be surprised if we see some of the more aggressive state laws being struck down on First Amendment grounds. So in short, I think we’re going to have a lot more work to do here, Fran.

Fran Faircloth: Ed, thank you for joining me today for this discussion. And thank you to our listeners. For more information on the topics that we discussed or other topics of interest relating to the new administration, please visit our Capital Insights online hub on our website at www.ropesgray.com. And, of course, if we can help you navigate any of the topics we discussed, please don't hesitate to get in touch. You can also subscribe to this series wherever you regularly listen to podcasts, including on Apple, Google and Spotify. Thanks again for listening.
