Digital developments in focus

ICO tech report highlights privacy challenges for four key technologies

The ICO is encouraging developers to consider privacy at an early stage when implementing new technologies and, in its first annual Tech Horizons Report, examines the privacy implications of some of the most significant technological developments expected in the next two to five years. The report follows commitments made in the recent ICO25 Strategy for the ICO to set out its views on emerging technologies, with a view to reducing burdens on business, supporting innovation and preventing harms.

The Technologies

The ICO report focusses on applied technologies likely to impact privacy in the near future, rather than on the foundational technologies (such as AI or 5G) they are built on. The four applied technologies it identified are:

  1. Consumer healthtech: this covers wearable devices and software applications that help people assess their health and wellbeing (including electronic components in smart fabrics). The report distinguishes these from medical devices and digital therapeutics (DTx), which are not covered because, as it notes, such products are likely to have faced greater scrutiny and safeguards during the process of being designated as medical devices.

  2. Next-generation Internet of Things (IoT): these are physical objects that connect and share information, and which can also sense, respond to and interact with the external environment. While IoT is not new, it is currently evolving. Edge computing and improved hardware, software and interoperability will all enable the next generation of IoT to respond to people’s needs in real time.

  3. Immersive technology: while this encompasses a broad range of applications (called ‘extended reality’, or ‘XR’), the report focusses on augmented and virtual reality. This includes hardware (headsets etc.) that creates immersive experiences for users.

  4. Decentralised finance (DeFi): DeFi refers to financial systems that remove centralised intermediaries from transactions and financial products/services. The report looks at software that employs blockchain technology to support peer-to-peer financial transactions and notes that personal information is often embedded in public transaction records.

It is, however, worth noting that when selecting the technologies most likely to impact privacy, the ICO found that the “most significant of these” was in fact neurotechnology. This fifth area will therefore be the subject of a separate ICO ‘deep dive’ report in Spring 2023 (which will be similar to its other recent deep dive reports).

Other areas that, although still important for privacy, were considered to have a less immediate impact were quantum computing, digitised transport, generative AI, synthetic media, digital ID and behavioural analytics. The report provides a short description of each of these, and in the case of behavioural analytics, reminds us that the ICO cautioned organisations to assess the risks of using these technologies (see our blog).

Common Challenges 

As well as noting the significant opportunities presented by these different technologies, the ICO highlighted a number of common challenges:

  • Lack of transparency and control: a growing number of technologies are collecting personal information in ways that are not always transparent and over which people may not have meaningful control. For example, augmented reality devices may capture information about third parties (other than the intended user), and healthtech apps may give third parties access to the data collected (for example, if they use software development kits that allow people to log on through their accounts on other platforms, but which give that platform access to data without this being made clear to the user).

  • Complex data ecosystems impact people’s understanding/ability to exercise rights: complex ecosystems can make it hard for people to understand how organisations are processing their information (and hold them to account), and organisations must ensure that people can still exercise their information rights. Some technologies may also collect more information than they need for their primary purpose. For example, people could be tracked across consumer healthtech or virtual reality devices in ways that may not be transparent or necessary for the primary purpose.

  • Sensitive data may require additional safeguards: some technologies collect information about sensitive personal characteristics that may require additional safeguards. Organisations need to understand when this information is classified as special category data and put appropriate measures in place. For example, healthtech organisations must think carefully about whether they are collecting biometric or health data, and this may depend on factors such as the source of the information (e.g. was advice to exercise more given by a doctor or by a fitness wearable?). Organisations must also remember that observed or inferred data which is linked to a person may be personal information. For example, information collected by an IoT device and linked to a person’s account which shows when an application is switched on and off may suggest information about that person’s whereabouts.

Challenges were also identified around the accuracy of inferences made by some devices, bias, data minimisation and cybersecurity as well as some tech specific challenges. An example of the latter would be the potential difficulties that exist in exercising rectification and erasure rights if information is held on the blockchain. For a more detailed discussion on this point, see our March of the Blocks: GDPR and the Blockchain white paper.

What can organisations do? 

Privacy by design is key, and the report notes that some organisations are already exploring new and innovative ways to engineer privacy into the design of these technologies. An example it gives is manufacturers embedding redaction technology into extended reality devices to minimise the unintended processing of information about bystanders. The report also notes certain steps organisations can take. For example, it reminds organisations that they should:

  • carry out data protection impact assessments where needed, and try to minimise data collection (e.g. if an IoT device is always ‘on’ and collecting information);
  • revisit privacy notices to ensure they are still fit for purpose - e.g. a long, written policy designed for a 2D environment is unlikely to work in a 3D one;
  • keep on top of the various reports and guidance being issued by the ICO - e.g. where AI is involved, refer to the ICO’s various AI guidance. Similarly, if children may use the tech, refer to the Children’s code. New guidance, research and reports are also expected in areas such as IoT and neurotechnology;
  • check if they will need to comply with the new Product Security and Telecommunications Infrastructure Act, which is due to introduce basic security standards for IoT devices; and
  • consider applying for the ICO’s Sandbox scheme if they need support on a project related to the technologies in its report.
“Some organisations are exploring new and innovative ways to engineer privacy into the design of these technologies… Other organisations are failing to imagine how privacy could be engineered into their ideas. [The ICO] will not allow businesses that are doing the right thing to be outcompeted by businesses that fail to comply with data protection law.” (ICO Tech Horizons Report)
