The Data Day: Data Protection Laws in India—A Conversation with Sajai Singh of J. Sagar Associates

Podcast
November 16, 2023
25:37 minutes

Tune in to the fourth episode of Ropes & Gray's podcast series, The Data Day, brought to you by the firm’s data, privacy & cybersecurity practice. This series focuses on the day-to-day effects that data has on all of our lives as well as other exciting and interesting legal and regulatory developments in the world of data, and features a range of guests, including clients, regulators and colleagues. On this episode, hosts Fran Faircloth, a partner in Washington, D.C., and Edward Machin, counsel in London, are joined by London partner Rohan Massey, a leader of the firm’s data, privacy & cybersecurity practice, and special guest Sajai Singh, a partner at J. Sagar Associates in Bangalore, to discuss the Digital Personal Data Protection Act, India’s long-awaited comprehensive data protection law.


Transcript:

Edward Machin: Welcome, and thank you for joining us on the latest installment of The Data Day from Ropes & Gray, a podcast series brought to you by the data, privacy & cybersecurity practice at Ropes. In this podcast, we’ll discuss exciting and interesting developments in the world of data. We feature attorneys at Ropes & Gray as well as clients, regulators, and other industry leaders in conversations about what’s new in the world of data. I’m Edward Machin, counsel in Ropes’s data, privacy & cybersecurity practice—I’m based in our London office. And I’m joined by my colleague and cohost Fran Faircloth, a partner in our Washington, D.C. office.

Fran Faircloth: Thanks, Edward. On this installment, we have Rohan Massey, a partner in our London office and a head of our data, privacy & cybersecurity practice group, speaking with Sajai Singh, partner at J. Sagar Associates in Bangalore. Mr. Singh joined Rohan to talk about the Digital Personal Data Protection Act, India’s long-awaited comprehensive data protection law that I, for one, am eager to hear about.

Edward Machin: Me, too, Fran. I’m looking forward to the conversation.

Rohan Massey: Hi, and welcome. I am delighted to be joined today by Sajai Singh, a partner at JSA in India. Sajai is a leading partner on data protection issues in India. And today, we are going to talk about the new data protection laws in India. So, without further ado, welcome, Sajai. Just introduce yourself and tell us a little bit about you.

Sajai Singh: I have been practicing law now, Rohan, for 33 years, and I have been with this firm, JSA, for 31 years. I was a litigator before that, and once I joined JSA, I got into what is called “non-contentious work” or corporate work, and I’ve been with them ever since—since 1992. My practice has evolved as India’s economy has evolved. It started with the opening up of the Indian economy in 1991, when foreign investment started coming into India, and India stopped being inward looking, like Myanmar, and became more outward looking. So, the firm was set up to support foreign investment coming into the country. And with that, our laws, policies, everything has evolved, because the foreign investors that came in had a lot of say. That process continued until 2000, when, I guess, many foreign investors and many foreign governments wanted to know if India had any data protection law, anything to do with the booming IT/ITeS economy in India. So, we got what may be called an “information technology law” in 2000 that didn’t have anything on data protection—it covered things like e-signatures, digital signatures, and concepts of e-governance, a lot of things that were important but didn’t really cover this issue. Eleven years later, we had a secondary legislation that covered a part of what may be called “data protection.” Today, after all these years, in 2023, we finally have a data protection law. The reason I gave you this history is that much of the basis for its evolution, and for the law today, hasn’t come from an internal need—it has come from the opening up of the economy in 1991 and the influence that foreign investors in India and foreign governments interested in India have had on our lawmaking process.

Rohan Massey: Fantastic—thank you. Tell me a little about the law itself. What’s come into play? Is it a GDPR-style law, or is it something different?

Sajai Singh: It uses the same seven principles of data protection, or of the “data economy,” as we call it—I’ll talk about them in a minute—and therefore, there will be similarities, for sure. But there are differences, as well. This particular law is called the Digital Personal Data Protection Act of 2023. You would, of course, have picked up the key word, “personal”—so it’s only personal data, not non-personal data or anything else. The first word, “digital,” is very important, as well, because the Act covers digital data that you could have collected online or in digital form, or physical data that was digitized. So, it’s digital personal data protection that is covered here. With the seven principles that I mentioned, again, you will find similarities with the GDPR and the Brazilian law, and maybe the U.K. law, and various others.

  1. The first concept is the lawful, fair, and transparent usage of personal information, which is very important.
  2. Purpose limitation—you need to specify the purpose and collect it and use it only for the purpose that you collected the information.
  3. Data minimization is the next concept. You only collect data that you need—nothing else.
  4. Accuracy is an expectation from the person who collects the data—you call the person a “data controller,” we call the person a “data fiduciary”—and they have to keep the data accurate and up to date.
  5. Then, there is a storage limitation so that you don’t keep data perpetually.
  6. There’s a security expectation, that you keep the data secure, safe, and use reasonable security practices and procedures.
  7. And, finally, there’s an accountability expectation from the person who’s collecting and determining the use of the data.

So, these are the principles—you will find they ring very true. But as we go along in our conversation, I’ll highlight some concepts like the duties of a data principal, the concept of whitelisting that exists in the GDPR, and the concept of blacklisting for cross-border data transfers that exists in India. There are these differences, there are some unique features, so we’ll come to them as we go along.

Rohan Massey: Thank you. Interesting to hear. I suppose, one of the critical things to think about, as well, is the idea of penalties under the Act. Does it follow the GDPR on having very high levels of sanctions? Obviously, in Europe, we have 4% of annual turnover or 20 million euros. Is it the same punitive regime?

Sajai Singh: It isn’t a punitive regime. I was dealing with some matters for a Spanish client, and I realized that a lot of the Data Protection Authority’s penalties in Spain are now tending to be very small—smaller than those of, say, an Italian or a German DPA. The rationale that the client gave was that the focus is on reputational damage rather than the punitive impact of the fine. I would say India is going to head in that direction, because, as a concept, India doesn’t impose very large or huge fines on people—many people won’t be able to pay. And for those who will be able to pay, we have certain concepts built into our contract law—the two principles being the principle of mitigation and the principle of restitution—and both of those eventually end up forcing the authorities, like the courts, to impose a very nominal fine. But why is that nominal fine even relevant? It is hugely relevant because it’s not so much the fine—it’s the reputational damage. For Indians, as in many Asian countries, the loss of face is huge, and people try to avoid a loss of face in any situation. So, if your reputation is going to be questioned or challenged, or all that you built over the years is going to be lost or tarnished in any way, that’s going to be more detrimental to the individual than a penalty. That’s the broad comfort that I can give you.

The second comfort is that the law itself, the DPDP Act, actually prescribes guidelines for the data protection board—which will be a data protection authority—to follow when imposing penalties. And these guidelines—pretty much the principles of natural justice—have actually been written down; otherwise, they would be concepts that every court in India would follow anyway. Starting with the nature, gravity, and duration of the non-compliance. Then, the type and nature of the personal data that was affected. The repetitive nature of the non-compliance: was it a one-off case, or does it happen often with that data fiduciary? Did someone—presumably the data fiduciary—gain something because of the non-compliance, or was somebody put to a loss because of it? Whether the data fiduciary took steps to mitigate, and how effective and timely those steps were. Is the penalty proportionate and effective relative to the non-compliance? And, finally, something that is very important: the likely impact of the imposition of the financial penalty on that person or entity—this is where the reputation comes in, as well as the ability to pay (you may not have the ability to pay). So, these have been prescribed.

The maximum penalty today, approximately $30 million (U.S.), has been prescribed for a non-compliance called “failure to prevent a personal data breach”—that’s the highest. And then, there are other non-compliances for which penalties have been prescribed, including a penalty of about £100 if a data subject (we call them a “data principal”) does not follow their duties. So, the penalties, as you will notice, are much lower, in any event, than what you may be used to under the GDPR. But even though the maximum penalty has been prescribed at $30 million, I doubt it’s going to be imposed on anyone, because of two factors: the broader factor of how penalties are looked at in India and the specific guidelines the DPDP Act gives the board before it imposes a penalty.

Rohan Massey: Thank you. So, interestingly, in the DPDP, there are very clear guidelines on this, and they seem to be very narrow, very detailed. Is that same level of detail carried through with regard to obligations, as you just mentioned, on the individual as much as on the organization? Do we have that same level of clarity throughout the law, or is it going to be implemented by rules and regulations later?

Sajai Singh: The point you’re making is very appropriate, I would say, Rohan. This is the overarching “mother law,” as you may call it. This gives concepts. The details will come out in the rules and regulations. And I would say it’s not just secondary legislation that will supply the details. It will also be guidance that the board issues, because there will be a period of advocacy where the board explains the law to people a little more, giving clarity on implementation and compliance. So, all of that will come—I would say a lot of it will come from the board. The level of detail that I mentioned vis-à-vis the board is, I think, what was essential to put here, because setting up the board has gone into a lot of detail. Therefore, once it’s set up, which should be soon, how would it impose penalties? Breaches are happening, and non-compliance, obviously, will happen—then, what does the board do? So, I think that was important for the government to put in the DPDP Act. But everything else, I would say—including what the form of a notice to seek consent should be (because it’s very much a consent-based law), what the form of the consent should be, and what the data breach notification should look like (because notification is required to go to the board and every affected data principal)—will come out in the secondary legislation and the guidance that the board will issue. In many places in the DPDP Act, it is clearly specified that the government or the board has the right to further legislate or give clarity on the point.

Rohan Massey: If we’ve got a waiting period now for guidance on these issues from either the board or the government—we’ve obviously got a binding law in place (it came into place in August of this year)—what should companies be doing now, whilst they wait for that guidance to come in? How should they find themselves not to be caught between a rock and a hard place? How should they be going forward at the moment?

Sajai Singh: As of now, this particular law has replaced the previous law that existed on data protection, but at a practical level, until we have guidance, we still go back to the previous law. To give you a small example: what reasonable security practices and procedures, what management policies, should a data fiduciary put in place to take care of personal data? The previous law pointed to the ISO 27001 guidelines, and those are what we are asking clients to continue to follow until we have other guidelines or more clarity. So, right now is this period—though it’s not been called a “transition period”—where you can’t stop following what you were following under the previous law. I think the major change is that the previous law was focused on a two-tier level of compliance: one for personal data and one for a subset of personal data called “sensitive personal data.” That has gone away—now it’s just personal data. Under the previous law, you needed consent only for sensitive personal data, and you could theoretically process personal data without consent (there were exceptions), but that’s gone away. Now, for every piece of personal data, you have to follow through with a specified purpose, notice, and then consent. During this period, I would say clients may have to wait about a year for broadly 80% clarity, and then for fuller clarity—I’m not sure 100% clarity will ever be there, but a broader level of clarity. I think you are looking at a 24-month period, so things will keep getting clearer every few months.

I would say one thing that’s completely gone away where personal data is concerned in India is the concept of “opt-out.” So, if there are clients who have an opt-out concept vis-à-vis personal data, they need to get rid of it. It is now purely opt-in, so that’s fundamental. There are several rights that a data principal has, and many of them are rights like access to data, having information about the data, seeking grievance redressal vis-à-vis their personal data, and seeking erasure of their personal data. All of these are concepts that require a line of communication between the data principal and the data fiduciary—that is something that clients can already start looking at and establishing if it’s not there. A grievance redressal process should be put in place—it’s pretty clear that clients will need one—so that if someone comes up with a request or a grievance, you will redress it. By the same token, if someone seeks erasure of their personal data, you need to erase it—and if you have to erase it, you need to be able to prove that you erased it and that you don’t have a little bit of data lying in some archival material. So, these are things that clients can already start doing.

I’m suggesting that clients do a data mapping exercise to make sure they know what kind of personal data they’re collecting. Are you collecting data of children? Are you collecting data of people with disabilities? People with disabilities have been put in a category very similar to children—you need verifiable consent from a parent or lawful guardian. So, it’s a good idea to at least check what personal data you are collecting, because then you will know the areas where you may have to put a compliance mechanism in place. It is a good idea to have an SOP for data breaches: What would you do? Whom will you inform? How do you check? And while an audit may not be required for anyone other than significant data fiduciaries, it’s a good idea to go through the whole process of auditing your systems.

Then there are data processing agreements in India. If two entities, a data fiduciary and a data processor, have an agreement (many times they don’t), I’m telling clients to check it. Are you using a data processor, and if so, do you have a DPA? If you do have a data processing agreement, it’s very important to review it, because there are some obligations that you can pass on to the data processor, or whose compliance you can require, so that the data fiduciary is not non-compliant. So, that’s something that can be checked and verified.

Finally, a data principal has the ability to receive the notice for the collection of personal data, the consent they give, and probably also the privacy policy in their own language. India has 22 scheduled languages, and any data principal can ask for the notice in their language. When they do, you have to provide it—you don’t have to have everything ready in advance, but when there’s a request, you have to provide it. So, it’s a good idea to translate all of these data principal-facing documents—the privacy policy, the notice for collecting personal data, and the consent form or consent text—into some of the main languages of India. And if you don’t do that, then at least have translators ready who can translate these documents into the language of the data principal.

So, these are some things that clients can do now—whether they are in India processing personal data, or overseas providing goods and services to India or to Indians and collecting personal data in the process. Either way, they would need to comply, because the DPDP Act also has extraterritorial reach if you are providing goods and services in India.

Rohan Massey: Thank you very much. It sounds like there’s a lot to do, but hopefully a little bit of time left to do it in the next few months and years. So, thank you very much—that was a very insightful overview, and we are very grateful to you for coming in. Thanks, Sajai.

Sajai Singh: Not a problem—anytime. I hope as the law evolves, we can have more podcasts, and we can talk more in detail. It’s very likely that a lot of things we discussed today will look very different in due course.

Fran Faircloth: That was so interesting. Thank you, Rohan and Sajai. Now, Edward, I’m going to ask you the question we normally do at the end of this show: What’s the strangest, most interesting, or even the best thing that you’ve heard about in privacy in the last few weeks?

Edward Machin: Thanks, Fran. So, in the past couple of weeks, I’ve been on a family holiday in California. One of the interesting things I saw out there, which was certainly new to me and not something that I’ve seen done—certainly in the U.K. or Europe—is that when you would walk into a number of shops, they would have at the doorway a QR code and a sign telling you how to exercise your California privacy rights. Like I said, it’s something that I haven’t seen in the U.K., but quite a cool feature, and I wonder whether that kind of transparency is something that we may see coming across our side of the pond.

Fran Faircloth: That’s very interesting. I haven’t seen anything like that on the East Coast, either, and it will be interesting to see whether it spreads this way. One thing that I have seen recently is the White House dropped President Biden’s new Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence—something that has been long anticipated, and we’ve all been watching for. The Executive Order does many things. Among other things, it requires developers of the most powerful AI systems to do regular safety testing and share that. It also includes protections, including things like watermarking to help protect against AI-enabled fraud and deepfakes, which I thought was an interesting addition. And it even proposed to establish an advanced cybersecurity program to develop AI tools to find and fix vulnerabilities in critical software. So, the Executive Order is very broad—it covers a lot of ground, from making sure that AI is safe to using AI to help improve safety in cybersecurity.

Edward Machin: Thanks, Fran. The White House statement also comes hot on the heels of the G7 leaders’ agreement on an AI Code of Conduct for companies, so this is certainly a topic that we’ll be featuring on the podcast again, probably sooner rather than later. For now, that’s it for another episode. Thank you to everyone who tuned in to this episode of The Data Day from Ropes & Gray. And thank you, Sajai, for joining us. If you would like to join us for an episode or you know somebody we need to have on the show, please reach out to Fran or me by email, or we’re both on LinkedIn. If you enjoyed the show, please do subscribe. And you can listen to the series wherever you regularly listen to your podcasts, including on Apple and Spotify.
