UK Information Commissioner warns of risks around neurotechnology

Viewpoints
June 8, 2023
4 minute read

On 8 June 2023, the UK Information Commissioner's Office (ICO) issued a warning about the risk of discrimination arising from new technologies that monitor the brain.  The ICO is concerned that, if neurotechnologies are not developed and used properly, there is a significant risk that they will embed bias, with neurodivergent individuals in particular being in danger of discrimination.  To help mitigate these risks, the ICO plans to develop guidance for neurotech developers.

The ICO has produced a report entitled "ICO tech futures: neurotechnology".  This examines the collection, analysis and use of neurodata (that is, information directly produced by the brain and the nervous system), with personally identifiable neurodata being treated as personal data regardless of the purpose for which it is processed.  The report is intended as an introductory guide to the regulatory aspects of neurotechnology and explores the impact of such technologies on privacy.

The use of technology to observe neurodata is expected to increase significantly over the next ten years.  According to the report, approximately 39 UK-based companies are currently focused on neurotechnology, while global investment in neurotechnologies and the filing of related patents are both rising rapidly.  The ICO's fear is that if new neurotechnologies are not designed with certain types of individuals in mind, they may discriminate against particular groups of people.

Although once the preserve of science fiction, as the ICO notes, neurotech is already in use in the healthcare and research sectors, where it is strictly regulated.  Neurotech can help anticipate, recognise and treat complex diseases, both mental and physical, and influence patients' responses to certain illnesses, for example, dementia.

The ICO's concern is that neurotechnologies are also quickly being developed for uses outside of healthcare, for example in workplace hiring and employee monitoring (such as monitoring concentration levels), personal wellbeing, sports and marketing, in some cases to provide more personalised services.  If such systems are not created and trialled on a broad range of individuals, inaccurate data and inherent bias may become ingrained in these technologies, adversely affecting certain individuals and groups.

The report notes that processing neurodata can create serious risks to the information rights of individuals in three particular ways:

  • The intrinsic and involuntary nature of neurodata (which is generated subconsciously, so people have no direct control over the specific information that is disclosed);
  • The possibility for organisations to create large scale, complex data sets about individuals, which may facilitate the creation of detailed inferences about highly sensitive information, such as information concerning mental health; and
  • The possibility that neurotechnology not only monitors and collects neurodata but could also modulate neuropatterns and alter behaviour (this could heighten the risks around automated use of people's personal data and may result in a lack of transparency about how, and for what purposes, organisations are using it).

The report highlights a number of regulatory issues in connection with the processing of personally identifiable neurodata which are likely to require consideration.  These include issues in respect of: 

  • Regulatory definitions (including medical neurodata; neurodata as personal, but rarely special category biometric, data; classificatory neurodata; and high-risk neurodata);
  • Neurodiscrimination;
  • Consent and the appropriate bases for processing neurodata;
  • Closed-loop processing, which poses heightened risks around shifts in purpose and in automated processing;
  • Accuracy and data minimisation;
  • Neurodata and research (including in connection with transparency and data sharing); and 
  • Information rights (including the rights to erasure, rectification and portability in certain circumstances and the right of access).

The report notes that neurotechnologies can deliver significant benefits for society, organisations and individuals alike, such as advancing neuroscience and our understanding of the human brain, offering potential new treatments for neurodegenerative conditions and providing greater accessibility for people with disabilities.

While recognising that neurotechnologies can have societal benefits, the ICO's Executive Director of Regulatory Risk, Stephen Almond, noted that such technologies can collect sensitive personal information, including details of emotions and complex behaviour, often without individuals knowing this is happening, and warned of potentially serious ramifications around discrimination if such systems are created or used in unsuitable ways.

Discrimination in neurotechnologies may arise where inherently biased systems are developed, which could produce inaccurate data and inferences about individuals and groups.  Inaccurate data may result if neurotechnologies are not developed and tested on a broad assortment of individuals, with neurodivergent communities especially vulnerable to discrimination from systems trialled only on neuro-normative patterns.  There are also fears that unfair decisions could be reached even where accurate data is used.

The ICO observes that there is a danger that deploying neurotechnologies in the employment context could lead to inequitable outcomes.  For example, if specific neuropatterns or data come to be regarded negatively due to ingrained bias, then individuals with those patterns may be passed over for certain opportunities.

The plan is for the ICO to develop specific neurodata guidance by 2025.  This is likely to focus on explaining relevant legislative and technical neurological definitions, emphasising links to current ICO guidance, setting out the ICO's views on growing risks and outlining sector-specific case studies.  The report notes that a clear understanding of the terminology and technology helps individuals to understand their rights and assists organisations in complying with their transparency and other data protection obligations.  Without this, a number of data protection challenges are likely to arise, such as difficulties around obtaining valid consent to the processing, and the automated processing, of neurodata.

The ICO is proposing to work with key stakeholders as well as the public to produce the new guidance, which should bring greater regulatory clarity to this area.  While the proposed guidance is still some way off, it is clear that any organisation engaged in, or considering, the development or deployment of neurotechnologies, including in the workplace, should start addressing the data protection issues arising from the use of such technologies now.