This post was written by Kaushalya Gupta, Policy Program Manager, and Carlos Iglesias, Senior Research Manager.
The Tech Policy Design Lab, our effort to address the biggest tech challenges facing our societies, is a collaborative space for governments, companies, and civil society to work to redesign our digital world and make our web inclusive, safe, and empowering for everyone.
Our successful pilot project brought together a diverse group of experts to take on the devastating rise of online abuse and harassment against women and girls. To identify the next challenge the Lab will face, we undertook a robust, consultative process over the past few months. We surveyed Contract for the Web endorsers across 31 countries, held external consultations, and conducted internal discussions and workshops. Through this process, we evaluated 22 broad themes covering today’s biggest tech policy challenges and over 200 relevant topics.
The result? Tackling deceptive design will be the next challenge for the Tech Policy Design Lab — vital to empowering every web user and making the web safe and secure.
Why deceptive design?
Everything we do online is influenced by how the tools we use are built. Deceptive design refers to practices built into user interfaces that obscure or impair consumer autonomy or choice, altering decision-making or tricking users into taking actions they might not otherwise take. Whether intentional or not, this harmful design phenomenon is unfortunately widespread today.
While unfair and deceptive practices are neither new nor unique to the web, the rise of digital services, combined with the lack of specific regulation, has created a space where individuals can be overwhelmed by the volume of information and easily manipulated by the many default design choices presented to them every day. The issue has drawn important attention in design and policy circles, including from the Norwegian Consumer Council and the US Federal Trade Commission, but there is still a long way to go to make sure everyone is protected.
The potential harms of deceptive design range from simple frustration and lost time as you navigate these patterns trying to make your own choices, to more serious effects on your privacy or finances. Examples include payment services that make your transactions public rather than private by default, or paid subscriptions that are all but impossible to cancel.
Many individuals, especially more vulnerable groups such as children and teens, the elderly, or disadvantaged communities, have a particularly difficult time discerning trustworthy design choices from manipulative ones in the digital sphere.
This year, deceptive design has seen a surge of momentum and interest from policymakers, researchers, and journalists, with ongoing discussions around the world and new regulatory proposals making this an ideal time to engage on the issue at a global scale. The good news is that this is a fixable problem.
Moving towards trusted design
The Contract for the Web calls on companies, governments, and civil society to develop technologies that support the best in humanity and challenge the worst. The Contract sets the model to help forge a new phase of tech policy development: where solutions are developed on the basis of sound evidence; where there is meaningful multi-stakeholder participation; where people's experiences drive policy and product design; and where solutions take into account the full diversity of those who use digital tools.
In the coming months, we’ll gather evidence of the potential harms of deceptive design, including who these practices impact the most, and how they affect the most marginalised communities in particular. We’ll bring together stakeholders who can help solve this problem and work with these actors to co-create alternatives for more ethical, empathetic, trusted design that puts people and their needs first. And we’ll follow up to ensure those alternatives are put into action to build a safe and empowering web for everyone. A web we all can trust.
Join us November 18 for the official launch of the Tech Policy Design Lab where experts will discuss why moving from deceptive design towards trusted design patterns is critical to our shared digital future.
A note on language
This Tech Policy Design Lab project was originally called "Dark Patterns – Moving Towards Trusted Design". Though "dark patterns" has been the prevailing term for the problem for several years, this language reinforces the exclusionary framing that "dark" is "bad" and "light" is "good". That's why, after engaging with our community, we have changed the project name to "Deceptive Design: Moving Towards Trusted Design Patterns".
For more updates, follow us on Twitter at @webfoundation and sign up to receive our newsletter.
To receive a weekly news brief on the most important stories in tech, subscribe to The Web This Week.
Tim Berners-Lee, our co-founder, gave the web to the world for free, but fighting for it comes at a cost. Please support our work to build a safe, empowering web for everyone.