Ensuring children’s privacy online in a fragmented policy landscape

How best to protect children’s privacy online is a huge challenge for policymakers, regulatory authorities, parents and educators, technology companies and children themselves. Jurisdictions have various policies at different stages of development, with little national, regional or international alignment between them. Consequently, data controllers and processors are left struggling to understand the role they may or should play and, equally, their rights and obligations and how to exercise or meet them.

 

National regulatory approaches

A group of expert lawyers trying to make sense of this for their clients addressed an audience at the CPDP2023 conference, on a panel moderated by Stewart Dresner of Privacy Laws & Business.

One of the first regulatory instruments to specifically address children’s privacy online was the UK Children’s Code (formally the Age Appropriate Design Code), which is now proving very influential internationally. Created by the Information Commissioner’s Office (ICO) and brought into effect in 2020, it focuses on online services that have a commercial element.

“The UK Children’s Code is a statutory code, so both the ICO and the courts have to have regard to it”, said Ruth Boardman, Partner and co-head of the International Privacy and Data Protection Group at Bird & Bird, pointing out that this is not the case with guidance in other jurisdictions. “Points in the UK Code have been picked up by the California Age-Appropriate Design Code Act”.

The UK Children’s Code applies to all users under the age of 18. “Children means children,” she said.

But other countries have different age thresholds when it comes to regulating children’s privacy online, which is a challenge for companies.

Diletta de Cicco, Counsel at Squire Patton Boggs, pointed out how the approach of the Garante, the Italian data protection authority, compares with the ICO code. “They are similar approaches: although the Italian DPA does not provide guidance on the topic, they mention the ‘accountability’ requirement, which is in line with the risk-based approach provided by the ICO. The difference is that the Italian DPA moved directly into action, so we can learn some lessons from its decisions, but we do not have guidance yet.”

“The age of the user also impacts the legal basis for processing data. Children under 18 are not eligible to sign a legal contract.”

Other challenges companies face include the difficulty of ascertaining how likely a service is to be used by a child, and of identifying a child’s age.

AI is one possible tool for age estimation. “Identifying patterns of behaviour, during sign-in for example, is a way to mitigate the consequences,” said Joke Bodewits, Partner at Hogan Lovells, who works with clients in the gaming industry – the majority of whose users are children.

Another issue is how to inform children about how their data is being used. “GDPR provides for a right to transparency on how data is being used,” said Laura Brodahl, Associate at Wilson Sonsini Goodrich & Rosati.

Under the GDPR, the data subject must be made aware of the risks, rules, safeguards and rights in relation to the processing in a transparent and understandable way. The regulation stresses the importance of communicating in simple and easily understandable language.

“If your service is going to be used by children, the threshold for ‘easily understandable’ will be lower,” said Brodahl.

This means not only using short sentences and avoiding legal jargon. “It’s also about the delivery of the information, not just the content. Children might be looking for pop-ups or video. They have shorter attention spans. Using bright colours, or something that has a playful element, is where we might see things going,” said Brodahl.

“It’s not only about informing children; it’s about informing parents about what’s going on in the games,” added Bodewits.

Brodahl pointed out that children are very different at different ages. “A child’s desire for the parent to be involved is very different at 5 than at 15,” she said.

“You need to have an age rating, determined by colours, voices, functionality – it steers you to design the game in an age-appropriate way,” said Bodewits.

 

“A broader trust and safety landscape”

Joanna Conway, Internet Regulation (Legal) and Legal Ambassador for Cyber & Strategic Risk at Deloitte, told RAID exclusively: “Privacy is just one facet of protecting children online – and that is itself part of a broader trust and safety landscape. Children raise unique challenges for platforms in terms of compliance with the breadth and nuance of the disparate global regulations that apply to children online.

“As we continue to support our clients to protect children, Deloitte’s Internet Regulation team expects regulators to prioritise child safety as broad new regulations like the EU Digital Services Act and the UK’s Online Safety Bill apply, and expects the trend towards specific and regional regulation, including in US states and Asia, to continue.”

Another session at CPDP2023, organised by the EU Agency for Fundamental Rights (FRA), addressed the role of algorithms both as a source of potential harms and as a foundation for protective measures.

Emilia Gomez of the European Commission’s Joint Research Centre shared research projects undertaken to examine AI technologies through the lens of children’s rights, including the report Artificial Intelligence and the Rights of the Child: Towards an Integrated Agenda for Research and Policy, available in the JRC Publications Repository on europa.eu. The Norwegian DPA shared the progress made at EDPB level to prepare guidelines on the processing of minors’ personal data – still under discussion due to differences of approach among the DPAs – and on CSAM, which are close to being finalised.

Cecilia Alvarez, EMEA Director of Privacy Policy at Meta, described Meta’s holistic approach, which combines safety, wellbeing and privacy measures and tools to facilitate age-appropriate experiences and responsible empowerment. These include service restrictions; education and transparency tools; safety, wellbeing and privacy safeguards; ad restrictions; parental supervision tools; and measures to understand user age, all guided by intensive research, consultation and co-design activities with experts, parents and teens, regulators and policymakers.

 

CSAM vs privacy

The most extreme violation of children’s rights online is child sexual abuse material (CSAM), against which the EU is legislating – but this legislation itself raises privacy concerns. The nascent EU regulation includes detection duties as well as a voluntary scheme under which companies can scan for and report CSAM without this being considered a privacy violation.

“There is a new proposal that courts can require services to scan for CSAM,” said Simon Mortier of McDermott Will & Emery. “Some regulators, including the European Data Protection Supervisor, are concerned that it is disproportionate, that the tech is immature, that it can detect CSAM but not grooming – and about whether it would oblige encryption to be broken.”

Speaking on a later panel on safeguarding children against online sexual abuse, Thomas van der Valk, Privacy Policy Manager EMEA at Meta, said: “We have several programmes to protect minors on our platforms. Regarding CSAM, if you followed the logic of detection orders, it would undermine the promise and the benefit of end-to-end encryption. It’s not black and white. We have a lot of measures in place to combat CSAM and grooming, but there is more that can be built around other types of data – metadata, user behavioural signals, traffic indicators and user reports in particular – for prevention purposes.

“That’s not the same as installing software that would scan everyone’s content. This is prevention, to make sure harm cannot happen in the first place.”

The fact that each country has different regulatory requirements remains a huge challenge for companies looking to provide safe services that respect the privacy of children online.

“There is no overarching document that gaming clients can use when designing services for children,” said Bodewits. “Contract laws deviate from country to country; so do advertising guidelines.

“Companies want to present to the single market in a single way. We see an evolving legal landscape, with different questions from different regulators.”

Speaking on a panel on how platforms are approaching minors’ privacy, Caroline Goulding, Data Protection Officer at TikTok, said: “There is so much market-by-market divergence. As a company, you need a floor of protection and to build on that. We want to advance children’s rights and work together.”

“This is clearly an area where we need more guidance and codes of conduct,” said de Cicco. “Companies have asked for global standards. Perhaps there is a solution that should be handled globally.”

“The best interests of the child is the best way to process this,” said Boardman.

This article was written by Ben Avison, Editorial & Conference Director of RAID (Regulation of AI, Internet & Data), a Moral Supporter of CPDP. To find out more about RAID 2023, taking place in Brussels on 26 September, visit www.raid.tech