This post was written by Kaushalya Gupta, Policy Program Manager and lead on the Tech Policy Design Lab project tackling deceptive design.
Deceptive designs, a.k.a. “dark patterns”, pose a serious threat to the protection of fundamental rights, as they may cause considerable harm to people, especially the most vulnerable. As part of our continued engagement on this issue, we joined Access Now and Simply Secure in making recommendations to the European Data Protection Board (EDPB) on its Guidelines on “dark patterns” in social media interfaces.
These Guidelines offer practical recommendations to both designers and users of social media platforms on how to recognize and avoid deceptive design, with great potential not only to serve as an essential resource but also to shape industry practices in the future.
In the time since the Guidelines were adopted in March, the political institutions of the European Union reached an agreement on many of the central issues of the Digital Services Act, including measures that ban certain types of deceptive design. We believe the Board’s Guidelines should now act as a tool to implement and enforce this legislation.
Our joint recommendations focus on how to ensure the Guidelines have the most impact for a broad audience and how to leverage the Guidelines among civil society organisations to help change the status quo.
In particular, we hope the EDPB will consider our recommendation to limit use of the term “dark patterns” and shift instead to “deceptive design”, a more culturally appropriate, user-centric, and inclusive phrase. We welcome the move by Harry Brignull, who coined the original term “dark patterns”, to adopt the phrase “deceptive design”.
We’re confident that these Guidelines will go a long way in educating users about deceptive design practices and their rights with respect to their personal data.
What’s next in our work to tackle deceptive design
It’s imperative that we further our work toward a future of trusted design. That’s why we’ll host our first Tech Policy Design Lab workshop on 1 June 2022. This workshop will focus on envisioning models for adopting “trusted design” in order to tackle deceptive design on the web at scale.
Often described as “click to subscribe, call to unsubscribe,” deceptive designs are more than a nuisance: they are too often a default practice, perhaps unwittingly adopted by platforms, that wreaks havoc on consumer choice, autonomy, and financial stability. This workshop is the first in a series that aims to create workable models for design practices that empower consumer autonomy and choice, and provide a way forward for trusted design practices.
Our policy design workshop will include representatives from civil society and governments, industry advocates, and other subject-matter experts. By the end of the workshop we hope to craft a vision of a web based on trusted design that governments, tech companies, industry partners and designers can take forward — one that is grounded in global perspectives and the lived experience of those most affected by deceptive design.
Watch this space for more information about our sessions at RightsCon 2022, which will take place from 6 to 10 June 2022.
To learn more about the Lab, please visit techlab.webfoundation.org.
For more updates, follow us on Twitter at @webfoundation and sign up to receive our newsletter.
To receive a weekly news brief on the most important stories in tech, subscribe to The Web This Week.
Tim Berners-Lee, our co-founder, gave the web to the world for free, but fighting for it comes at a cost. Please support our work to build a safe, empowering web for everyone.