This post was written by Ingrid Brudvig our Digital Inclusion Research and Advocacy Coordinator, and originally published on Medium. Follow Ingrid on Twitter at @IngridBrudvig.
Last week I had the opportunity to join IT for Change and the Advanced Centre for Women’s Studies, Tata Institute of Social Sciences in Mumbai, where they hosted the first-ever National Dialogue on Gender-based Cyber Violence. The conference was organised as part of the Women’s Rights Online project, which is led by IT for Change in India. The event convened feminist scholars and researchers, women’s rights and sexuality rights activists, digital rights groups, disability rights groups, lawyers, and students from a range of disciplines including women’s studies, media and cultural studies, social work, psychology, sociology, and law, to engage in important discussions exploring gender-based cyber violence “normatively, legally and empirically, bringing feminist conceptions of the digital…to articulate what freedom from violence means in relation to online spaces.” (See Conference Note).
I was excited to have been asked to present a feminist critique about technological architectures and the law. This is an interesting topic for me — being a student of Anthropology, I believe it’s important to critically examine not only the political economy and socio-cultural institutions, but also the micro-sociological dynamics of space, place and lived experiences to situate how technological structures, laws and policies are reconstituting our social world. This blog is a start to what I hope will become many new reflections on technology, gender, anthropology, borders and belonging in the digital age.
In addressing this topic, I’d first like to unpack the term “technological architecture” and the interlinks between online platforms and social structures, which have, indeed, led to the “encoding of social hierarchies”. Platforms are sustained by data footprints which are used to categorise us into boxes of gender, race, class, caste, income and other prescribed categories. Technology companies have created, through our data, what MIT professor Sherry Turkle calls our “second self”. Our data is harvested and sold, and we are targeted with content and ads telling us who we should be, how we should think and the choices we should make. There’s a lack of transparency and accountability around the design of algorithms and the use of our data, despite the fact that we, as users, are the product. This invites ethical concerns around several issues which I see as central to a feminist critique of technological architectures:
- The use of personal data as currency in online ‘mediascapes’ is driven by the manipulation of consent in opaque terms of service agreements.
- The architecture is also emblematic of a culture of surveillance — the default mode that governs our digital interactions.
- The combined application of big data and artificial intelligence by social media platforms fuels this notion of “data-driven indicators of belonging” — and, indeed, the social reproduction of hierarchies (including those related to gender), which allow us little room to contest how we are coded to belong in this world, and little self-determination to define our own flexible identities. Through our digital selves, we are programmed into contrived hierarchies with little choice in the matter, and limited recourse for resistance. Furthermore, as Nandini Chami of IT for Change rightly pointed out at the National Consultation in Mumbai, how can we control our identity when it’s widely distributed across data nodes throughout the digital sphere?
- We’re in a situation where instead of “gender by design”, we’re stuck with “add gender and stir”. And in many cases, the pot is boiling over.
Indeed, while popular social media platforms facilitate access to social networks, social capital, and some degree of information and communications architecture, the context in which access has played out — primarily via social media platforms — has led the majority of women around the world to be connected to the internet only through a walled garden. This ultimately limits women’s agency and, as Anita Gurumurthy of IT for Change terms it, their “equality of autonomy”. The global trend toward mobile-first and mobile-only access perpetuates this walled-garden phenomenon, and we have to ask: how is women’s agency limited, and how is violence exacerbated, by the technological architecture of connectivity?
In 2015 the Web Foundation and partners carried out a ten-country study on Women’s Rights Online, surveying over 7,500 women in poor urban areas of cities including New Delhi, Nairobi, Lagos, Manila and Jakarta. In the study, just 37% of women reported having accessed the internet — but of these women, 87% reported using mainly Facebook, while just 21% reported ever having actively searched for information online. This includes information about sexual and reproductive health, legal rights and education. While the internet is a critical architecture of public space — mediating access to information, the economy, social capital, and public life — Web Foundation research shows that it is not currently supporting women’s access to information and civic participation, and may even be exacerbating inequalities.
The pandemic of online harassment and violence against women leads to obvious violations of women’s rights — not only freedom from violence and bodily integrity, but also women’s freedom of expression, privacy and right to personal data protection. These violations are inextricably linked. For example, violence and threats to women’s online privacy and digital security may further prevent women’s access to information (including information about their rights), as many who experience harassment are driven to self-censor or are driven offline entirely, as Women’s Rights Online (WRO) research conducted with Tadwein Gender Research Centre in Cairo showed.
In the global WRO study, a majority of women who had experienced online harassment said they did not report it, mainly because “it’s not worth reporting”, “it happens all the time” and “authorities don’t care”. Looking at the policy landscape, WRO research partners found that in Kenya, for instance, the legal framework safeguarding security online is generic in nature and fails to mention online gender-based harassment. In countries like Mexico, Ghana and Indonesia, where some mechanism to address online gender-based violence exists, responses are ineffective: the authorities responsible for resolving these cases generally ignore or minimize complaints, hold the victim rather than the perpetrator responsible, or lack the training, knowledge and departmental resources to pursue cases. Web Index research covering 86 countries worldwide indicates that in 74% of countries, law enforcement agencies and the courts are failing to respond to ICT-mediated gender-based violence. This demonstrates how global the problem is, and how far states lag in delivering effective responses.
In the void of legal mechanisms and within a culture of impunity, platforms end up being the place of recourse — but what happens when our rights rest on platforms’ terms of service and reporting mechanisms instead of legal frameworks? How does platform governance pair with local jurisdictions, which may not even have sound legal frameworks in the first place? Are other intermediaries like cyber cafes and internet service providers protected from legal liability for unlawful content created, stored or disseminated by their users? What accountability mechanisms are needed to ensure that platforms comply with international human rights standards, and that they do not take broad or excessive measures which infringe on freedom of expression, or privilege a particular ideological standpoint or body of knowledge, including Western bias?
These are questions where there may not be answers yet, but where partnerships between diverse groups including women’s rights organisations are key to delivering digital equality.
So what happens next? Countries need to adopt and implement legal frameworks to protect digital rights and freedoms. Platforms need to also take responsibility and improve their response mechanisms. In December 2017 the Web Foundation and IT for Change hosted a dialogue at the Internet Governance Forum with members of the Women’s Rights Online network in Brazil, Colombia, India and Myanmar engaging with Facebook on its online safety policies. Key recommendations that were debated include:
- Companies need to ensure due process in addressing reports of violence online — people need to have the right to a timely and effective response, as well as the right to appeal content takedowns;
- Platforms must also start producing transparency reports including information on how many cases of online gender-based violence are reported, and the resulting action;
- Freedom of expression needs to be integrated into platforms’ Terms of Service, and social media companies should work with local partners for diverse perspectives.
It’s also critical that researchers, activists and the media find creative ways to introduce this discussion into the mainstream policy spaces that inform how policymakers think about these issues, without gender being the pre-, post- or side discussion. It’s important that ICT policymakers are trained in gender-responsive policymaking and, in particular, in how every aspect of ICT policy can, and must, be held accountable to women’s rights. For example, affordability is not often approached as a gender issue, but, as research by the Alliance for Affordable Internet shows, the prohibitively high cost of data around the world disproportionately affects women. High costs become a form of censorship, preventing women from accessing information on their rights and infringing on freedom of expression. In Uganda, for example, 1GB of mobile prepaid data costs 27.71% of average monthly income.
As feminist collectives, we need to continue to engage in policy spheres of influence, as well as explore how to shift away from the platform centrality we have become accustomed to. To conclude, I’m left thinking about a critical question posed by Nanjira Sambuli of the Web Foundation, who points out that, “As much as these platforms are useful nodes for capturing attention and garnering audiences, maybe the question is, how, in practice, can we ‘de-condition’ people, particularly women, and internet users in the global South from the notion that platforms and their unaccountable architectures, constitute the internet, or the vast majority of it?” In the end, the diversity of our online and social experiences will depend on how far technological architectures embrace the reality of our desires for mobility and flexibility in their worlds of code.
For updates on our work, follow us on Twitter at @webfoundation and sign up to receive our email newsletters.