Activists and tech companies met to talk about online violence against women: here are the takeaways

Web Foundation · August 10, 2020

The web is not working for women. Women and girls are less likely to have access to and use the web, and those who are online disproportionately face abuse and harassment. Research shows that women of colour, and Black women in particular, are most impacted.

As part of our commitment to tackle all forms of digital inequality — and intersectional gender inequality specifically — the Web Foundation is running a series of consultations that bring together tech companies and women from across civil society to share experiences and tackle online gender-based violence together.

The second consultation, held virtually on July 15, brought together 26 women-led civil society groups from 20 countries and five of the world’s biggest tech companies to discuss the online threats experienced by women activists, especially women of colour and Black women.

We opened with a keynote presentation from Seyi Akiwowo, Founder and Executive Director of Glitch, who spoke about the escalation of online abuse experienced by women during the Covid-19 crisis. She was followed by a Web Foundation presentation previewing our upcoming Women’s Rights Online Report.

Participants then discussed content moderation as well as privacy and safety, under a modified Chatham House Rule. We’d like to thank all of the participants, particularly those organisations and activists who have long worked tirelessly to make the web a safer, more empowering space for women. An overview of the session, along with a list of attendees, is available.

Here are seven top takeaways:

1. Online violence causes offline harms

Abuse and harassment perpetrated online have offline consequences and can ruin people’s lives. They are part of a continuum of violence against women, with abuse crossing between online and offline spaces.

Women activists are often targeted with abuse designed to silence them, including with traumatising threats of violence that often lead to emotional and physical harm. In many countries, activists who continue to speak out are at risk of arrest. Facing abuse, violence and legal risks, some women understandably self-censor, while others continue to speak out and all too often suffer consequences. 

To keep users safe and to defend democratic participation, civil society organisations urged tech platforms to take this abuse seriously and work to minimise violence on their services.

2. Content policies are not enforced equally 

Content moderation is a critical part of social media platforms’ responsibility to keep people safe, but content policies are currently not enforced equally.

According to civil society participants, tech companies tend to focus their moderation resources on the United States and Europe, resulting in a serious enforcement gap between the Global North and the Global South when it comes to abusive content.

Participants also described experiencing or witnessing companies sanctioning feminists and racial justice advocates for calling out hate speech and abuse, while the perpetrators of that hate were sanctioned less frequently.

3. Cultural context is vital

To make good decisions about content moderation, account suspension and other actions to address problematic content, companies must understand the cultural context in which they operate. Activists said that policies designed for the Global North are applied globally without sufficient consideration of unique cultural contexts.

One participant explained that in their conservative country, a picture of an unmarried woman standing next to a man could lead to dangerous consequences, even honour killings. While content like this would not violate a platform’s policies, it can be used against women. There have been cases of women being blackmailed and forced to pay to keep photos offline.

This demonstrates how important it is to understand cultural complexities and nuance, especially in contexts of racism and oppression. Companies must invest in understanding how their platforms are used and misused outside the US and EU. More training is needed for content moderators in local contexts, local languages and local cultures.

4. AI may be necessary, but not sufficient

Social platforms have quickly scaled to operate with near-global reach. To manage this scale, artificial intelligence has become a key tool in their efforts to identify and tackle problematic content. It’s critical that companies are transparent about the effectiveness of these algorithms, conduct regular reviews, and work to ensure they are sensitive to diverse cultural contexts.

But AI alone is not enough. Content moderation demands effective human involvement, particularly human review when dealing with complex cultural contexts and when verifying automated content decisions.

5. People need more control over their settings — especially privacy settings

While privacy and user control settings have improved over time, activists said they need more granular control over who can interact with them and over the content they see.

Participants were frustrated that on most platforms they aren’t able to mute specific videos or images portraying violence, which can lead to re-victimisation. They also noted there aren’t enough settings built to support control and self-care. One participant suggested a feature that would let people screen out content from accounts identified as likely trolls.

When people depend on social media for their activism and need to spend many hours a day on platforms, the ability to screen out often traumatic content and abusive users is important for their wellbeing.

6. We have power to protect each other online

When we go online we are part of a community. When we see others suffer abuse and harassment in public spaces online, we can help support them by being ‘active bystanders’:

  • report abuse using platform tools
  • support victims of abuse by sending a private message to let them know they’re not alone
  • reply to their original post constructively
  • amplify their message by sharing it with your communities.

Watch this video from keynote presenter Seyi Akiwowo to find out more about what it means to be an active bystander.

7. We need global and cross-platform collaboration

The tech companies participating in the consultation acknowledged the need for greater cross-platform collaboration and shared best practices to fight online gender-based violence. Such collaboration has already happened around child protection and needs to be prioritised for women’s safety too.

A woman whose intimate images are shared without her consent must report this abuse to each platform and go through several different processes to have them removed. That’s not good enough. Greater collaboration between companies and with civil society and governments could make platforms safer for women and provide victims of abuse with ‘universal’ reporting mechanisms and support services. 

Next steps

Evidence from this session, and from forthcoming consultations focused on the experiences of women politicians and journalists, and on girls and young women, will inform a series of policy design workshops. In these workshops, women’s rights organisations and tech companies will work together to build concrete policies and products to tackle online gender-based violence, with the needs of women at the forefront rather than addressed as an afterthought.

Violence against women is a huge threat to progress on gender equality. And unless we make sure the web is a safe place for women and girls, technology will become one more channel through which women are attacked, suppressed and marginalised, rather than the platform for voice, opportunity and positive change we know it can be. That’s why it’s urgent that companies and governments work closely with women’s rights activists and the wider tech community to tackle violence online and make the web safe and empowering for everyone.

This work also builds towards the UN Generation Equality Forum, now taking place in 2021 to mark 25 years since the Beijing Conference. While some companies and governments have expressed support for combating online gender-based violence, we now need to match words with action. We urge the tech companies who have participated in these consultations to push forward with real commitments to address violence against women in the context of the Generation Equality Forum.



Comments

WM Heath · August 12, 2020

1. It’s not just the platforms that moderate. Building on your point 6, for more on people-powered moderation see e.g. https://regulate.tech/people-powered-moderation-10th-august-2020/
2. There’s only one Chatham House Rule. #OnceaSubAlwaysaSub
