Making light work of dark patterns

At a Privacy Laws & Business webinar, experts from Meta, Stanford, the Norwegian Consumer Council and Amadeus revealed the prevalence of dark patterns and what businesses, regulators and policymakers can do to prevent them. Ben Avison reports


The term “dark patterns”, which is gaining traction in both regulatory and consumer parlance, was coined in 2010 by user experience (UX) specialist Harry Brignull. He launched a website on which he explains: “Dark Patterns are tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something.

“When you use websites and apps, you don’t read every word on every page – you skim read and make assumptions. If a company wants to trick you into doing something, they can take advantage of this by making a page look like it is saying one thing when it is in fact saying another.”

The Privacy Laws & Business webinar, “Shining a Bright Light on Dark Patterns”, provided an illuminating discussion on how such practices could be prevented.

“A one-size-fits-all regulation is not the answer, as we all have different industries and contexts – but we need to figure out a way forward,” said Dr Jennifer King, Privacy and Data Policy Fellow, Stanford Institute for Human-Centered Artificial Intelligence.

Finn Myrstad, Head of Digital Services Section, Norwegian Consumer Council, called for a holistic approach to dark patterns. “You might need to do this through consumer law, supplemented by data protection law, and also the Digital Services Act in the European Union, but it will also need to be supplemented by other measures.

“Competition policy is an important additional measure. Consumers need a place to go if they are unhappy with a service. And you need to look at portability and interoperability rules, so consumers can vote with their feet if they discover a dark pattern and are unhappy with a service.”

He highlighted the negative pressure that network effects have on consumer power. “One of the problems is that a lot of companies have a de facto monopoly within their category and it’s hard to leave regardless, because all your friends or services are on that platform”.


The need to satisfy

Perhaps the biggest driver for moving away from the use of dark patterns will come from the business requirement to satisfy customers.

“Consumers get really frustrated when they are tricked,” said Myrstad. “There should be an argument made for treating your customers with respect. Consumers will give up more data as long as they know and trust that their data won’t be misused.”

Dr Dan Hayden is Director of Data Strategy at Meta and co-lead of TTC Labs – a co-creation and design lab focused on improving user experiences around personal data.

“Companies are being more reflective about data as a potential way to offer services or provide a digital experience, and also as a risk, as something to be managed carefully and protected,” he said.

“It’s about practising respect and fairness in how you engage with people about their data. The overall ethos has to shift more towards fairness and respect, and with it the core principles around data minimisation and privacy by design.

“There are a couple of consensus-driven and clear practices that shouldn’t persist: frustrating a user’s intent and deliberately making it hard for people to achieve what they want to do. Eliminating those practices is in the common interest and I’m really keen to solve some of these problems.”


20 steps to unsubscribe

One example of a dark pattern is making a subscription service easy to join, but very hard to unsubscribe from.

“It should be as easy to leave a service as to join a service. That’s easy and intuitive for consumers to understand,” said Myrstad.

He gave the example of a complaint the Norwegian Consumer Council filed against Amazon in 2021. “Basically it was two steps to sign up to Prime, but to come out of it was 10-20 steps, and they used a series of dark patterns to dissuade you from signing out, and most people thought they had signed out when they hadn’t.”

Other examples of dark patterns include experiences where consumers are forced to make a phone call to unsubscribe. “Those are areas where industry can do a lot better,” said Hayden.

Many online experiences and marketing activities are perfectly legitimate, of course. “There are ways companies can say: this is the service I’m offering you, this is what you’re trying to achieve on your digital journey, and you can make a proposition,” he said.

“I think you do that within clear bounds that eliminate bad practices that treat people unfairly. Those bounds should be consensus driven, potentially supported in regulation, and should eliminate things like nagging, double negatives, forced timing and action, and combinations of data usages that don’t make sense.

“But with that said, there are also areas where you might have to do some bundling of data use – who should decide how that works? How much language does someone want to deal with in that situation, on the way to using an app to book a bus service or buying a ticket for a concert? Reducing friction in those areas is really helpful and makes things more accessible and easier for people.

“We should achieve consistency and coherence. At the very outset we should identify and prioritise the things that are the worst experiences for people, that hit them in their pockets, that leave them frustrated and treat them with disrespect – and not ignore these in pursuit of eliminating all other practices.”


Common standards

Jane Hunt, Senior Legal Counsel, Amadeus, called for common standards in the way consent is requested from consumers. “Nobody is going to ask you to give very deep technical information, but there has to be a common standard where somebody says, I understand enough about how this works to be able to recommend it or encourage it.”

“It’s really important that there is consistency between different kinds of standards and regulations,” Hayden said. “Europe has a number of different existing laws that companies are grappling with – both positive and sometimes challenging ways of interpreting what it means to design a consent that is freely given, informed and clear.

“There is lots of space to do better, but I’d be very wary of introducing other different standards.

“I do worry about the potential negative externalities, where we have very prescriptive guidance in how an interface should be designed. It’s very difficult for that to keep up with evolving UX trends.

“And is the ultimate victim of that a worse experience for people? The history of how people engage with legal standards around interfaces isn’t covered in glory; we haven’t seen loads of massive improvements driven by distinct standards.

“What we have seen improvements in is where industry has set a common guideline to eliminate bad practices. For me there’s a very common incentive. We published a good piece of research on the TTC Labs site reflecting on how bad privacy experiences from one company reflect badly on the lot.

“That’s a compelling reason to say, we need to hold bad actors to account and eliminate the worst practices – this shouldn’t be at the expense of addressing broad, complex and difficult challenges around the breadth of experiences.”

King highlighted the risk that a certain approach to regulation could actually negatively impact consumers. “One of my biggest concerns in the dark patterns space is that regulators could take this very narrow view of how to make these things better and end up making the user experience worse: because suddenly it is, ‘I want to make sure you’ve read the privacy policy, so now I’m going to require that business make sure you’ve read through the whole thing and you’ve got to page through the whole thing and there’s a check box’, which to me is not an improvement at all.”

But policies and regulation can provide positive solutions too. “There are ways forward with privacy enhancing policies,” she said.

Hunt cited the “amazing work on child-appropriate design” by the UK Information Commissioner’s Office (ICO). “Some of their sandbox work – some of the lessons on good practice from design – can be used by people who want to develop something good.”


A balance of desires

The ultimate question for regulators is how to achieve a balance between companies’ desire to take as much data as possible, and giving customers a real choice over how much data to give.

“We need to draw the line,” said Myrstad. “We need to ban certain practices because the risks outweigh the benefits. That will create a more equitable space where there will be a better balance of power and information.

“We need to move away from the free-for-all use of data that is happening today. We can look at dark patterns holistically. We all want a better and safer internet, but we also need to look at these serious challenges.

“I hope we can create laws in Europe and the US in particular, because this is where most of these companies are based, and also in other areas of the world.”

But laws do have a shelf life. “We need a principle-based approach that is futureproof,” he said. “There is also room for specific bans of certain types of practices. But if you only ban certain practices, there will always be new things that come up and you can never futureproof it.”

And once laws are created, they need to be enforced. “It is a challenge that enforcement is pretty slow in all these areas. Under consumer law, the mechanisms for ensuring compliance are pretty weak within Europe in my experience. Companies tend to fight this out with lengthy delays in the court systems. I wish we could have swifter mechanisms for compliance.”

Striking this balance between the desires of consumers, businesses and regulators is a complex and ongoing challenge, not least for big tech companies like Meta. Hayden said: “You need a bit of nuance and flexibility. Fairness is the key consideration.”
