Can technology be part of the solution to enhance privacy?

Tech companies and governmental organisations exchanged views on the role of technology in privacy at CPDP2023

“Data is powering so much innovation in the computing space. But what’s also clearly front of mind is the inflection point of AI,” Anthony Chavez, Vice President of Product Management at Google, told CPDP2023 on the Privacy through Innovation panel.

“What’s also clear is the need to be equally focused on how we manage and handle personal data. How do we reduce tracking of user data across apps while supporting business? We can achieve both goals at the same time.

“Privacy and open access to information are things that should be universal. A blunt approach is restricting prevalent methods like third-party cookies. That fails on both dimensions – it harms publishers and content creators, and it also fails from a privacy perspective. Without viable solutions, it just drives worse forms of tracking.

“We need to develop new privacy-enhancing technologies (PETs).

“One of the core use cases is the ability to measure online ads. Measurement is central to the adtech ecosystem. We do more data processing on the device, and we provide guarantees that user data will never leave that enclave.”
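
Chavez’s description points at a broader family of privacy-enhancing techniques in which raw user data stays on the device and only noisy or aggregated signals are shared. The sketch below is a purely illustrative example of one such technique, randomised response, written in Python; it is not Google’s actual Privacy Sandbox API, and the function names and flip probability are hypothetical. Each device perturbs its own report, so no single report can be trusted, yet an unbiased aggregate can still be recovered.

import random

# Illustrative sketch of randomised response: each device perturbs its own
# conversion report locally, so no individual report is meaningful on its own,
# but an unbiased aggregate estimate can still be computed.

def noisy_report(converted: bool, flip_prob: float = 0.25) -> bool:
    """Report the true value most of the time, a random coin flip otherwise."""
    if random.random() < flip_prob:
        return random.random() < 0.5
    return converted

def estimate_conversions(reports, flip_prob: float = 0.25) -> float:
    """Correct for the injected noise to estimate the true conversion count."""
    observed_rate = sum(reports) / len(reports)
    true_rate = (observed_rate - flip_prob * 0.5) / (1 - flip_prob)
    return true_rate * len(reports)

if __name__ == "__main__":
    actual = [random.random() < 0.1 for _ in range(100_000)]
    reports = [noisy_report(c) for c in actual]
    print("actual:", sum(actual), "estimated:", round(estimate_conversions(reports)))

The trade-off is typical of PETs: a higher flip probability gives each individual stronger deniability, but requires more reports before the aggregate estimate becomes reliable.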

 

Consent as a commercial exchange

The European Commission’s Directorate General for Justice (DG JUST) encompasses both the GDPR and consumer protection. Marie-Paule Benassi, Head of Enforcement of Consumer Law and Redress at DG JUST, said: “I would like to reflect on how we can make the process smoother for consumers, or even automate it, so that consumers could set their own preferences.

“Why do consumers need to do that? It is not only about privacy when they decide on their data protection preferences; they go onto websites and use apps without paying. But there is an exchange: as a consumer, when you start navigating online, you accept being presented with advertising. This is not understood by most consumers. It can be found in the T&Cs, the small print, but it is mostly not explained. When consumers can, they reject cookies, particularly for targeted advertising.

“Why do consumers need more information on the business model? It’s already so complex to consent to cookies if you want to do your work correctly. What has been forgotten is the fact that the consumer is in a commercial relationship.”

 

Demystifying privacy-enhancing technologies

Christian Reimsbach-Kounatze, Information Economist / Policy Analyst at the OECD Directorate for Science, Technology & Industry, said: “Our work goes beyond advertising, towards the use of PETs for non-personal data. Why did we work on this?

“It was important to us to highlight that tech would be part of the solution and not always part of the problem. PETs have been prominent in G7 statements. Our work aimed to demystify the potential of PETs and to convey the barriers to adoption.

“It’s wider than privacy and consumer protection. It also depends on what our concept of privacy is. If the notion of confidentiality is what matters, you get one kind of solution; if your concept focuses more on autonomy and agency, you’ll have a different solution.

“If you look at the supply side, developing PETs is expensive, so we see countries providing financial support for R&D in this area. On the demand side, does regulation prohibit or encourage the use of PETs? One of the challenges is the legal uncertainty that exists.

“You also have to care not only about the big companies but also about SMEs, which don’t have the resources, and there is a problem: they don’t have the in-house capacity to put those technologies in place.”

 

Privacy-preserving adtech

Speaking on another CPDP2023 panel, on Privacy Preserving Advertising, Martin Thompson, Distinguished Engineer at Mozilla, said: “The problem with privacy has been recognised in the advertising industry for many years. We’re seeing a trend towards privacy-preserving advertising technologies. One approach is trust-based: ‘we won’t do anything bad with your info’.

“A lot of systems manifest as a centralised clearing house for information, which is trusted in some way. That clearing house is then responsible for managing the use of your information.

“All these things amount to nothing more than a ‘pinkie promise’ that is vulnerable to breaches.

“Google’s work here is exemplary: systems that provide guarantees about access to information, allowing it to be processed without being released.

“Advertising is extraordinarily ineffectual as a tool for changing people’s minds – it only does so in aggregate, and only if you’re doing good targeting. The goal of the ad industry is to improve its efficiency.

“We’re looking at providing ways to build ad systems with the use of information. We provide the ability to do measurement, for example – we recognise that this is extremely important for the industry – without revealing information about individuals on the website. We’re working with Meta on a proposal that shows some promise.
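
Thompson does not detail the proposal here, but the general pattern behind such aggregate measurement schemes can be sketched with additive secret sharing, a common building block for aggregation without a trusted clearing house: the browser splits each value into random shares sent to separate helper parties, none of which sees an individual’s data, yet the shares still sum to the correct total. The Python below is a generic, hypothetical illustration, not the actual Mozilla–Meta protocol.

import secrets

# Illustrative sketch of additive secret sharing: each per-user value is split
# into two random shares, one per helper party. Neither share reveals the
# value on its own, but the helpers' summed shares combine into the true total.

MODULUS = 2 ** 32

def split(value):
    """Split a value into two shares that only reveal it when recombined."""
    share_a = secrets.randbelow(MODULUS)
    share_b = (value - share_a) % MODULUS
    return share_a, share_b

def aggregate(shares_a, shares_b):
    """Each helper sums its own shares; combining the totals yields the sum."""
    total_a = sum(shares_a) % MODULUS
    total_b = sum(shares_b) % MODULUS
    return (total_a + total_b) % MODULUS

if __name__ == "__main__":
    conversions = [1, 0, 3, 2, 1]  # per-user values, never sent in the clear
    shares = [split(v) for v in conversions]
    helper_a = [a for a, _ in shares]
    helper_b = [b for _, b in shares]
    print("true total:", sum(conversions), "aggregated:", aggregate(helper_a, helper_b))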

“Consenting is a messy problem; we’re moving towards automatic cookie labelling in our products.”

This article was written by RAID’s Editorial & Conference Director, Ben Avison. To get involved in the RAID 2023 conference in Brussels on 26 September, contact ben.avison@cavendishgroup.co.uk