In this wide-ranging interview, Helen Dixon, Commissioner for Data Protection in Ireland, contemplates a new architecture and governance of the internet arising from the battleground of technology regulation – and offers sound advice for data controllers
RAID: How should policy and regulation balance the public benefits of technology with the interests of data protection?
Helen Dixon: Policy and the resulting legislative framework currently provide all of us with useful foundations for examining questions of balance and trade-offs between the benefits of technology and the individual’s right to data protection. High-level principles of fairness, transparency and accountability in the collection and processing of personal data simply make sense.
However, it is the application of these high-level principles to everyday technology scenarios that poses challenges for regulation. In the broadest sense, one might argue that, as societies, we are losing the battles. Each of us carries around a smart device capable of surreptitiously video- and audio-recording anyone we meet, as well as ensuring our own location and habits are tracked. Our streets and housing estates are filled with CCTV and smart doorbells. Every supermarket and building we enter is secured with state-of-the-art surveillance devices. Vehicle cameras and dashcams, bicycle helmet cameras and drones ensure our image can be picked up even when we’re out and about. Air miles and supermarket and pharmacy loyalty schemes ensure we are accurately profiled to sell us more. Tracking of our movements across the internet and across devices ensures we can be optimally isolated and targeted to buy more or give up more of our time and attention. And children are exposed to this current construct in the same way as adults. The near future will bring the further integration of biotechnology with information technology and the increased deployment of artificial intelligence.
But if we are losing the battles, as regulators, we need to ensure we ultimately win the war. We are likely only to achieve this by arriving at a clearer articulation of the type of world in which we want to live. Perhaps the point has now been reached where consideration needs to be given to calling out specific harms or implementations that should be banned in law. If, as societies, for example, we believe automated facial recognition should never happen in public places, then we should seek to have it prohibited in law. If we believe tracking across the internet and targeted advertising should not happen because a user’s consent can never be valid in this context (as some suggest), then we should prohibit it in law.
To achieve a new order in these areas will likely involve a re-architecting of, and a new governance model for, the internet. Policy and law makers from the disciplines of competition, consumer, data protection and media law need to join up at a global level if that is ever to become a reality. It is also important that these regulatory dilemmas are not approached from a purely ‘ableist’ perspective. We know that advances in technology – particularly in the field of AI – have revolutionised the lives of many people living with physical and learning disabilities in our society, and any resolution of the dichotomies mentioned above must include representation from the widest possible range of stakeholders. This is one of the reasons why the DPC has made children and vulnerable groups a regulatory priority going forward.
In the meantime, as data protection regulators, we must use the sensible principles and rules we have to guide organisations in conducting analysis and giving thought to the balance of rights and we must enforce in the highest risk cases.
RAID: What common challenges do you think data regulators face around the world, and what regulatory initiatives do you support beyond Europe?
Helen Dixon: Common challenges faced by regulators stem from the ubiquitous nature of automated personal data processing, as well as the rate of evolution of data-consuming technologies. This leaves regulators supervising endless sectors and scenarios involving personal data processing in circumstances where there is no consensus on the outcomes we want to achieve. Do we want to seal off all communications as one hundred per cent confidential, thereby preventing platforms from detecting when child abuse images are being transmitted? Do we want a democratic, open internet that is free of charge to users, or do we want more subscription-based, quality services and less tracking? Do we want CCTV everywhere to make us believe we are more secure, or do we want more freedom from surveillance? Do we want end-to-end encryption, and do we then accept that the police will equally be hampered by it when they want to detect potential criminal activity? Do we accept that backdoors for the “good guys” can equally be exploited by the “bad guys”? Do we want more and faster ground-breaking health research, or do we want to ensure our health data is never shared outside of public health authorities?
The reconciliation of these contrary positions poses one of the biggest challenges to the regulatory landscape; not every choice is a black-or-white binary. The challenge for regulators is that data protection – as a fundamental right – lives in the nuances between these polar extremes.
RAID: What challenges does Brexit pose you as a regulator, for example issues around sharing data between Ireland and the UK as a third country?
Helen Dixon: Brexit poses many challenges for the Irish DPC as a regulator. The loss of the UK and the ICO from the EU is significant in terms of their contribution, particularly as the UK is, like Ireland, a common law jurisdiction. Data transfers have been a serious concern, but the recent granting of adequacy to the UK by the EU Commission has alleviated some of those issues. A considerable cohort of companies that were main-established for GDPR purposes in the UK have now moved their data protection decision-making HQs to Ireland, increasing our span of responsibility.
RAID: What challenges does the agglomeration of big tech companies in Ireland pose?
Helen Dixon: Regulating big technology platforms is never going to be easy. They operate at very large scale, evolve at high speed in terms of mergers and acquisitions, are complex in structure and products, and bespoke in terms of the disruptive technologies and systems they deploy. The DPC has built up considerable expertise and knowledge of the most major platforms over the last 10 years and has had successes in ensuring they deliver a more data protection-enhanced service to users. Over the last five years, the DPC has hired over 100 additional skilled lawyers, technologists and investigators to deal with the significant volume increases in regulation arising from the DPC’s role as the Lead Supervisory Authority under the GDPR.
A range of “big technology” investigations by the DPC are now underway or concluded. The first two such cases to conclude (Twitter and WhatsApp) have gone through the full “cooperation and consistency” mechanisms under the GDPR, with more scheduled to be escalated to Article 60 imminently. There is no doubt that the Irish DPC, through its investigations, will be leading the interpretation of key aspects of the GDPR over the next few years. The GDPR is still a very young piece of legislation, and its mechanisms, such as those set out in Articles 60 and 65, can only be truly tested by operationalising them.
RAID: What are the data regulation issues arising from the growth of IoT and cloud computing?
Helen Dixon: While IoT and cloud computing are not themselves “issues” with regard to the GDPR, there are a number of areas which must be given particular consideration for processing systems which are based on, or incorporate elements of, IoT technology and cloud computing.
For example, regardless of whether one is dealing with paper files or a complex cloud environment, personal data must always be processed in a lawful, fair and transparent manner, as set out in Article 5. Transparency in respect of IoT devices – such as a smart watch or virtual voice assistant – may be more difficult to achieve in some cases, but it is the responsibility of the controller to determine the most appropriate mechanism by which their transparency requirements can be met. As IoT devices and associated processing systems are complex in nature, often integrating with various pieces of hardware and software (e.g. companion apps), controllers must be able to account for each of the primary, secondary and even tertiary processing operations that may arise from the many hardware and software integrations typical of an IoT system, and ensure that each processing operation and purpose for processing has a valid legal basis.
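To make that obligation concrete, the mapping a controller needs to maintain can be sketched as a simple record of processing operations. This is purely an illustration: the device, operation names, purposes and bases below are invented, not drawn from any real product or from the GDPR itself.

```python
from dataclasses import dataclass

# Hypothetical record of processing operations for a fictional IoT smart watch.
# Each operation must be tied to a specific purpose and a legal basis.
@dataclass
class ProcessingOperation:
    name: str          # e.g. "heart-rate capture"
    purpose: str       # the specific purpose of this operation
    legal_basis: str   # an Article 6 GDPR basis, or "" if not yet determined

operations = [
    ProcessingOperation("heart-rate capture", "health tracking", "consent"),
    ProcessingOperation("companion-app sync", "service delivery", "contract"),
    ProcessingOperation("usage analytics", "product improvement", ""),
]

# Flag any operation that lacks a documented legal basis.
missing = [op.name for op in operations if not op.legal_basis]
print(missing)  # -> ['usage analytics']
```

The point of the sketch is simply that secondary operations (here, analytics) are easy to overlook, and a systematic record surfaces them before deployment.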
When implementing cloud solutions, or using cloud computing technologies and platforms, controllers need to ensure that any transfers of personal data to third countries are lawful, and document the legal mechanism under which each transfer takes place. Controllers must also ensure that appropriate supplementary measures and safeguards are in place to protect personal data transferred to third countries. Security is equally important: a controller must be satisfied – before entrusting personal data to a cloud provider – that the provider’s security standards are sufficient and appropriate for the processing of personal data it will undertake on the controller’s behalf.
From a regulator’s perspective, the challenge is of course that we must supervise all of this activity: driving compliance and enforcing the legislation in a radically non-traditional environment. With the vast majority of traditional regulation, the demarcating lines of responsibility are clearly called out by the geographical boundaries between countries. Cloud-based systems – by their very nature – exist in a space where the “borders” are harder to identify, which adds time and complexity to the business of regulation.
RAID: How should the complex mechanisms of algorithms be clearly communicated to the public, in order to achieve informed consent to use their data?
Helen Dixon: There is no one specific way in which controllers can fulfil all their transparency requirements with regard to personal data processing that takes place through complex algorithms. Articles 12-14 GDPR require controllers to provide appropriate information to data subjects relating to the processing that is taking place. Article 13(2)(f) states that, in cases where automated decision-making and/or profiling is taking place, controllers are required to provide “meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject”.
As a starting point then, controllers themselves must first understand the processing of personal data that takes place as a result of any algorithms that they employ, particularly where automated decision-making and/or profiling is taking place. They must then subsequently determine the most appropriate mechanism with which to explain the underlying logic to data subjects whose personal data is processed in this way, in a concise, intelligible and easily accessible form. Clear understanding is the key to clear communication, so controllers should be thoroughly interrogating any proposed new algorithms for their businesses before they deploy them.
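As a purely illustrative sketch of that idea (the decision rule, thresholds and field names below are invented, not taken from any real system), a controller might pair each automated decision with the plain-language account of its logic that Article 13(2)(f) envisages:

```python
# Hypothetical example: a controller documents the logic of a simple
# automated decision alongside the decision itself, so the "meaningful
# information about the logic involved" can be shown to data subjects.

def credit_decision(income: float, missed_payments: int) -> dict:
    """Return an automated decision together with a plain-language
    explanation of the logic behind it (illustrative thresholds only)."""
    approved = income >= 30000 and missed_payments <= 1
    explanation = (
        "This decision is based on two factors only: annual income "
        "(threshold 30,000) and missed payments in the last year "
        "(at most 1). No other personal data is used."
    )
    return {"approved": approved, "explanation": explanation}

result = credit_decision(income=25000, missed_payments=0)
print(result["approved"])  # -> False
```

The design choice the sketch reflects is that the explanation is authored and shipped with the algorithm, not reconstructed afterwards – which is only possible where the controller has first interrogated and understood the logic itself.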
Helen Dixon, Commissioner for Data Protection in Ireland, is speaking at RAID (Regulation of AI, Internet & Data) online on 12th October. Register here.