In technology we trust


Governments, regulators, companies or consumers – who should take responsibility for creating a safe and trustworthy digital environment?


At RAID Digital on 4 May, Emma Wright, Director, Institute of AI and Partner at Harbottle & Lewis, moderated a panel of experts comprising: Boštjan Koritnik, Minister of Public Administration, Government of Slovenia; Joanna Conway, Partner & Internet Regulation (Legal), Deloitte; Anna-Liisa Pärnalaas, Counsellor for Digital and Cyber Policy, Permanent Representation of Estonia to the EU; Arvind Gupta, President, Digital India Foundation; and Bradley Tusk, Co-founder and Managing Partner, Tusk Venture Partners. Together they tackled the question of how to ensure trust in digital technology.

“With new technologies, trust is a must,” Boštjan Koritnik, Minister of Public Administration, Government of Slovenia, said in his opening remarks. “The Digital Services Act (DSA) follows the principle that what is illegal offline must also be illegal online. It will increase trust as it will make the internet a safer place for EU citizens.”

Anna-Liisa Pärnalaas, Counsellor for Digital and Cyber Policy, shared some insights into the scope of the Act. “The DSA is an EU regulation, but its impacts are wider – the approach is quite ambitious. One of the main aims of the regulation is to create a safer digital space where the fundamental rights of users of digital services are protected.

“The regulation aims to protect people against the spread of online illegal content, but it also prohibits dark patterns, creates additional transparency requirements for recommendation systems and so on. One of the important aspects is that it avoids the fragmentation of the single market that would happen if every member state started to adopt their own rules.”


Trust through regulation

“The EU’s approach is really around creating trust through regulation,” said Joanna Conway, Partner & Internet Regulation (Legal), Deloitte. “The UK is doing something similar. It’s arguably gone further than the DSA in relation to content and regulating the internet. We’ve recently had the revised draft of our Online Safety Bill. The stated intention of the UK Government is to make the UK the safest place online.

“In other respects the UK is behind the EU. The EU has brought out the DMA, which is aimed at regulating competition in the space. We are waiting to see the extent of the UK’s proposals around that. And similarly you’ve got the AI Act and data provisions, and we are slightly behind in the UK on those. But it’s the same model of creating trust through regulation.

“A lot of these new laws provide for flexibility around codes of conduct, which is where the regulators will be engaging a lot with the tech companies – from SMEs right through to big tech – to try and devise what best practice should be, and that is going to be crystallised in the codes.”

Bradley Tusk, Co-founder and Managing Partner, Tusk Venture Partners, highlighted the comparative lack of regulation in the US.

“I wish I could say in the US we were doing a lot of what we are seeing in the EU, but we are not. The US has really dropped the ball on data privacy, on platform liability and on antitrust. I’m a venture capitalist – I’m by no means anti-tech, but for me, early-stage companies have a much harder time competing when they have these giant companies around them.”


Digital public goods

India takes a contrasting approach to the US by creating digital public goods, said Arvind Gupta, President of the Digital India Foundation. “For example, our digital identity platform which serves 1.4bn Indians is a public good. Anyone can access it; it is under public scrutiny, and it’s built as a public private partnership. From now onwards all platforms – payment, identity, document sharing, digital health – are built with this digital public goods approach. That is finding credibility and trust with most consumers.”

Similarly, Slovenia’s inclination towards transparency earned it top ranking in the open data policy category of the European Commission’s Open Data Maturity Report.

“We believe if data is not in conflict with the interests of an individual or a business entity it is a public good,” said Koritnik. “For example, we have an online presentation of the state budget where we can see what the state has spent, on what, and how much it has generated. We have set up an open data ecosystem.”


A race for regulators and platforms

It is often said that regulators struggle to keep up with technology companies. Emma Wright, Director, Institute of AI and Partner at Harbottle & Lewis, highlighted the risk of “David and Goliath scenarios where regulators are ill-equipped against the teams that are geared towards taking on these issues in the big tech companies.”

But could the reverse also be the case? Technology companies have a huge task on their hands keeping on top of all the new policy proposals in the pipeline.

“At Deloitte we are working with companies in this space and it’s really important to have a systematic approach because these laws and regulations are coming so fast from so many jurisdictions at the same time,” said Conway. “And because of the impetus and the political and societal pressure for these laws to come in, you end up with a fragmented landscape.

“It does mean you need a systematic approach – you need to be monitoring for these laws and regulations, following them from inception, at the point when you might still be able to influence them, right through the legislative journey, and making sure you are making the right assessments as they progress so you are able to comply with them when they come in.”


Self-regulation

Could self-regulation be a potential solution? Gupta thinks not. “The tech companies are definitely not the arbitrators. Local laws are supreme,” he said, going on to point out that technology will increasingly need to be deployed by the regulators themselves.

“Fake news is an information pandemic that is only going to multiply. There need to be more algorithms set up to attack and mark fake news before it starts spreading. That’s a very big concern I have because that has an impact on democracies.”

“There are industries where self-regulation can work well,” said Tusk. “There is some hope that repealing Section 230 of the Communications Decency Act, for example, would have a material effect on improving content moderation and reducing toxicity on the sites. There is legislation in Congress that has bipartisan support for new antitrust measures that limit the ability of giant platforms to force users to use only their products.”


Who is the right police(wo)man?

Wright questioned whether governments were the right agents to determine what content consumers can see, asking “who is the right police(wo)man?”

“If you think about the sheer volume of data on the internet, it is no easy thing to detect, monitor and then determine where the balance lies between freedom of speech and harmful content,” said Conway. “If you look at the UK, there are many separate laws that can apply to make something illegal online. It’s not an easy thing to work out where the balance is.

“In the UK, the EU and some other jurisdictions we are seeing a move towards regulation and away from self-regulation. But a lot of the emphasis in those laws is actually around transparency and accountability, and that’s how they are building trust. So they are not being prescriptive, but saying to platforms they need to be transparent and accountable. So it’s not purely down to the regulator; it is the platforms and tech companies who are going to be held to their own standards – although it’s not self-regulation. In a way they are going to be moderating themselves in that very difficult environment.”


Education, empowerment and inclusion

So governments, regulators and platform providers all play a role in creating trustworthy online environments – but what of the consumers themselves?

“It’s users who create content and whose data is involved; so there is a huge piece around media literacy and safety, teaching people how to be safe online and how to protect their data,” said Conway. “We are seeing lots of initiatives globally by governments, regulators and tech companies themselves to try and empower people.”

“An important aspect of trust in digital technology is inclusion,” said Koritnik. “We can have user-friendly services, apps and solutions only by creating them together with citizens.

“In Slovenia we have excellent examples of services co-created with our citizens, such as subsidised transport systems which are completely digitised. We succeeded in reducing personal contact and waiting times; it was a good case of co-creation.”

“When it comes to developing solutions, we need data, but it’s difficult without digital trust,” said Pärnalaas. “This is where digital skills and data literacy are crucial. We can’t expect people to trust something they don’t understand.”


Trust in a changing internet

This is particularly pertinent to new and rapidly developing technologies. “One of the real risks to trust is new stuff,” said Conway. “As users we don’t understand it and we don’t know what the risks are. We need to be teaching users what they should be looking out for, at an earlier stage. Empowering users will help repair trust overall.”

“There is a move generally to democratise the internet more – that’s one of the impetuses behind Web3, for instance. These tech companies do want to gain trust and safety with their users. We’ve seen a progression in the UK where, for example, in social media groups the moderator is being placed under greater legal obligations to moderate the content of that group.

“If you involve people in deciding what is or is not harmful content and what you should do about it, then you gain the trust of users. When you moderate content there can be an inflammatory reaction when you take down or demote content – users can be incensed around freedom of speech.

“It might be that through engagement with users you come to a view that the best thing to do might be to flag this information as potentially harmful and provide counterfactuals and counter views. But if you have the buy-in from users when you do that, then you know you are hitting the right notes.

“These are incredibly difficult issues but personally I don’t think they are ones for the platforms to solve on their own. Putting the obligations on the platforms to balance illegal content and freedom of expression is really difficult; engaging users is one way to try and build their trust.”

A big component of Web3 will be the metaverse. “Everything good and bad about the internet will be magnified by a factor of 10 when the metaverse comes,” said Tusk. “All the things we have failed to address in the US – data portability, interoperability, privacy, consumer protection, taxation – will be a significantly greater problem once the metaverse is here.

“It’s clear that these problems are coming and need to be addressed; it’s also clear that we don’t have the political ability or infrastructure to deal with them.”

The decentralisation of the internet is already happening. “We are already seeing different internets in different countries,” said Conway. “Sometimes that can be deeply alarming. It may be possible to come up with broad principles, but you have to accept that what’s acceptable in one country might not be in the next, so you are always going to see that splintering.”


Come together to build trust

Despite all the fragmentation, people and governments around the world are experiencing common challenges. “Taking into account all the differences, the problems we are facing are actually quite similar. I believe having more policy coordination and sharing best practices is very useful,” said Pärnalaas.

“It often comes to finding the optimal balance between ensuring customer trust, but also making sure that these measures are proportionate and don’t set up walls for our own SMEs and businesses.”

“Digital technology already has and will have an important impact on the quality of life for all of us,” said Koritnik. “It is up to governments and the private sector to ensure citizens have greater trust in digital tech.”

Gupta said: “The public and the private sector have to come together to build trust in technology.”

It was also evident from the expert contributions from Meta on other panels at the RAID Digital conference that technology companies have both the will and the skill to develop platforms that contribute to a robust regulatory environment and promote trust in the process.

This article is based on a panel discussion at RAID Digital on 4 May 2022.