You can call me AI: takeaways from the IAPP's DPC23

November 20, 2023
5 minutes

Last week saw more than 3,000 people descend on Brussels to attend the IAPP’s Data Protection Congress 2023.  Spending the week at a conference is a bit like going to a music festival: choosing between sessions, catching up with friends, and not getting much sleep.  But it’s undoubtedly time well spent.

And much like Glastonbury, Primavera or Coachella, it’s impossible to see everything and everybody that one wants.  For that reason, this article contains impressions of and takeaways from my DPC23.  If you had a different experience, I’d love to hear about it.


Unsurprisingly, the headline act was AI; the topic du jour was impossible to avoid wherever you went.  Besides a continued interest in generative technology, the focus of numerous sessions was AI governance — what it means and how to do it.  Interestingly, I overheard a few attendees discussing the need to move beyond buzzwords — “ethics”, “innovation”, “harm” — and towards concrete and practical solutions for developing and using AI in an ethical, safe and secure way.  That sentiment aligned with my experience of talking to businesses this year about the legal, regulatory and policy issues being raised by AI.  Putting frameworks (as well as policies and procedures) in place to address those issues will be a defining theme of 2024 for many organisations.

For me, though, the most interesting AI developments of the week took place as I was on the train back to London, and then again the following evening.

  • On Thursday, the UK Government confirmed that it won’t regulate AI “in the short term”, so as not to stymie innovation and growth in the industry.  This isn’t a change in policy, but it’s interesting to see the Government stand firm in the face of a growing global consensus — including President Biden’s recent Executive Order on AI — that some form of hard AI regulation is needed.  This time next year we will have a clearer idea of where governments stand on their approaches to AI legislation, for better or worse.
  • On Friday, Sam Altman, the co-founder of OpenAI (whose ChatGPT is largely responsible for introducing AI into the public consciousness), was removed from the company by its board.  Clearly, one business doesn’t make an industry — and AI has been, and will be, much more than chatbots.  But Altman’s departure (and subsequent developments over the weekend, including the possibility that he may be reinstated) shows that the industry’s fault lines and mission are both still very much a work in progress.  For most organisations that are curious about AI but unsure how to proceed, proceeding with caution is still the best approach.  Indeed, aside from a small number of big players, most folks I spoke with in Brussels are comfortable exploring AI from the comfort of the peloton, and that’s likely to be the case well into next year.


Although discussions around AI dominated the event, I would urge readers not to overlook the raft of EU data-related laws that (i) have already taken effect (e.g., the Data Governance Act), or (ii) will take effect in the coming 18 months (e.g., the Digital Operational Resilience Act, the NIS2 Directive, the Data Act).  They’re not making front-page news, but I’d venture that some combination of these laws will have a more concrete impact than AI on many businesses in 2024 and beyond.  I’ve been banging this drum for some time, so it was good to see a panel dedicated to the alphabet soup of these laws and also to speak to numerous people whose companies were at various stages of preparing for their introduction. 

There isn’t space here to describe the laws, but in my view the key takeaway is that many of them require the board/c-suite to have ultimate responsibility — and liability — for driving risk, security and compliance standards at their organisations.  To be clear, including an overview of (for example) a DORA compliance programme in the board’s papers isn’t going to cut it.  Instead, the next 12 months should be used to educate and engage senior management on the steps their organisations need to take to ensure they are compliant.  Civil and criminal liability and/or banning orders for board members are features of some of the laws, and should help you to focus the minds of your executives.


The closest I get to music festivals these days is watching from my sofa, but my recollections of the real thing are that the most interesting stuff is often found away from the main stages.  And so it proved at the DPC23, with two discussions in particular catching my attention.

  • The “pay-to-play” model introduced by Meta in response to the CJEU’s decision on its use of targeted behavioural advertising is currently being discussed by the EDPB, which is expected to publish a common position on the viability of the model in the coming months.  Given that consent is emerging (has emerged?) as the only lawful basis under the GDPR for targeted advertising, companies should now be thinking about how to (i) transparently describe their processing for behavioural advertising, and (ii) obtain granular consent from users who don’t pay for an ad-free service.
  • Bruno Gencarelli, the head of international data flows at the European Commission, confirmed that before the end of 2023 his department will release a report on the functioning of existing Commission adequacy decisions.  Although the focus on data flows out of Europe has lessened following the entry into force of the EU-U.S. Data Privacy Framework and accompanying UK-U.S. Data Bridge, businesses shouldn’t assume that the show is over.  Sending data to non-adequate countries still requires attention, and although there looks to be limited appetite among most EU DPAs for regulatory enforcement of international transfers, counterparties and potential acquirers will expect Schrems II requirements to be met.  The Commission’s report will therefore hopefully allow organisations to leverage the findings of existing adequacy decisions for their DTIAs and related assessments of third-country laws and practices.

Subscribe to Ropes & Gray Viewpoints by topic here.