Innovators have created digital tools that help navigate roads, pay bills, video chat with family, and much more. However, these tools depend on data, which can be breached or misused. The loss or misuse of data can reduce privacy and lead to harms ranging from identity theft to bias and discrimination.
Large tech companies continue to make headlines around limiting data collection and providing more disclosures. Regulators, both in the United States and abroad, are examining data-dependent technologies like artificial intelligence.
Governance and legal frameworks for the appropriate collection and use of data are being explored, but guidelines and oversight remain fragmented.
The debate over data collection and use in the United States continues amid the global COVID-19 pandemic and the associated economic recovery. Digital contact tracing to mitigate the spread of the virus depends on data collection, but many Americans have expressed hesitancy to provide information due to privacy and security concerns. As vaccinations occur, there is a new debate around how to digitally record, share, and secure proof of vaccination.
The debate around what data are appropriate to collect and what entities should be able to use them is important. But there are also opportunities to use innovation, policy, and technology itself to help protect privacy and enable appropriate and fair use of data.
How can we create products and services that are effective while also minimizing data-related risks such as breach and misuse?
Defining privacy enhancing technologies
Privacy enhancing technologies (PETs) are a group of systems, processes, and techniques developed to achieve this balance between use and protection.
PETs enable data to be processed and value to be derived from it while minimizing the privacy and security risks to individuals. PETs are not defined by a single technology, and there is not yet a consistent definition for the category. PETs may be used to help comply with new privacy-focused legislation, like the European Union’s General Data Protection Regulation (GDPR) or the California Consumer Privacy Act (CCPA), but using these systems, processes, and techniques does not by itself ensure privacy or compliance.
Entities may be motivated to adopt PETs to support information security and confidentiality, to help comply with regulation, and to differentiate their products for consumers looking for increased privacy protection.
To help understand the evolving landscape of PETs, the San Francisco Fed’s Fintech team has published a report that explores the current landscape of these systems, processes, and techniques: Privacy Enhancing Technologies: Categories, Use Cases, and Considerations.
The potential of PETs
PETs contribute to privacy and data protection in a variety of ways. The first category of PETs consists of tools that alter the data themselves, typically seeking to disrupt or break the connection between data and the individuals they are associated with. Another group of PETs focuses on hiding or shielding data rather than altering it; encryption is an example, since it changes the format of data but is intended to obscure it temporarily rather than alter it permanently. Finally, there is a broad category of PETs that represent new systems and data architectures for processing, managing, and storing data. Some of these systems break data apart for computation or storage, while others provide management layers to track and audit where information flows and for what purpose.
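As a concrete illustration of the first category (tools that alter the data themselves), the short Python sketch below shows one hypothetical way a record might be pseudonymized and generalized before storage. The field names, the keyed-hash pseudonym, and the birth-year generalization are illustrative assumptions for this sketch, not a prescribed or endorsed technique.

```python
# Minimal sketch: altering data so a stored record no longer directly
# identifies the individual. All field names and choices here are
# hypothetical examples, not a recommended standard.
import hashlib
import hmac
import secrets

# A secret key held separately from the data store; without it, the
# pseudonym is difficult to link back to the original identifier.
PSEUDONYM_KEY = secrets.token_bytes(32)

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (a pseudonym)."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def generalize_birthdate(birthdate: str) -> str:
    """Coarsen a full birthdate (YYYY-MM-DD) to just the year."""
    return birthdate.split("-")[0]

record = {"email": "customer@example.com", "birthdate": "1987-04-12", "balance": 1250.75}

protected = {
    "customer_id": pseudonymize(record["email"]),          # no raw email stored
    "birth_year": generalize_birthdate(record["birthdate"]),  # reduced precision
    "balance": record["balance"],                           # analytic value preserved
}

print(protected)
```

Because the key is held apart from the data store, the stored pseudonym alone does not directly identify the customer, though, as discussed below, such protections can still be compromised if the key or auxiliary data are exposed.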
PETs have the potential both to strengthen the security and confidentiality of data across use cases and to give individuals greater control over their data by providing transparency, choice, and auditability within data systems.
Current challenges with PETs
While the potential for innovation in this area is strong, there are also several challenges to the use and growth of PETs.
- Entities deploying PETs must have sufficient internal capacity and expertise. Given the breadth of different technologies and systems, it can also be challenging to understand what resources are needed to start using these tools.
- While promising, some techniques and systems are still in the early phases of development, and there is limited investment in ongoing research.
- The use of PETs does not ensure that firms are automatically more privacy-protective or in compliance with new laws. In many cases, even enhanced techniques can be reversed or compromised. PETs still need to be treated like any technical implementation, with oversight and management around use, access, and security.
- PETs need to be considered alongside new movements around data rights. If individuals request information about themselves, seek to move their information to new providers, or would like their data deleted entirely, entities need the capability to respond to those requests.
Privacy enhancing technologies—like all innovations—present both opportunities and challenges. But they are a promising option to enable both the use and protection of people’s data.
The SF Fed’s Fintech team looks forward to continuing to participate in research and dialogue around privacy enhancing technologies and their applications. For a deeper discussion of these issues, visit Fintech Edge.
Kaitlin Asrow is a fintech policy advisor at the Federal Reserve Bank of San Francisco.
The views expressed here do not necessarily reflect the views of the management of the Federal Reserve Bank of San Francisco or of the Board of Governors of the Federal Reserve System.