Illustration by Sonaksha Iyengar

How online gender-based violence affects the safety of young women and girls

Web Foundation · March 8, 2021

From finding and fostering online communities, to accessing education, to organising and getting involved in politics, the web can be a lifeline for young people. But despite these many potential benefits, the web is not working for young women and girls. A recent Plan International survey of over 14,000 young women and girls found that 58% of respondents have experienced online harassment, including abusive language and cyberbullying. And research by the Web Foundation and the World Association of Girl Guides and Girl Scouts found that 84% of young women think the problem of online abuse is getting worse. For LGBTQ+ youth, Black youth, and young women of colour worldwide, the impact of gender-based abuse is amplified when they also face attacks based on other aspects of their identity, such as sexual orientation, race or ethnicity.

As the world celebrates International Women’s Day, it’s clear that we need urgent action from companies and governments to address this growing threat to global progress on gender equality. As part of our work to tackle this issue, and building off existing research and our past consultations in this series, the Web Foundation recently hosted our fourth and final consultation on online gender-based violence and abuse. The multi-stakeholder session examined how different product and policy solutions impact the safety of young women on social media platforms.

Here are the top seven takeaways: 

1. Many young women are not aware of platforms’ policies and product features that can help them stay safe.

And even if young women do know a security or safety setting exists, they often find it difficult to understand or use, especially because these features differ from platform to platform. This is an even bigger challenge for young women who speak local languages that a platform's guidelines have not been translated into. In these cases, young people have to find ways to translate the policies themselves.

Both tech company representatives and other participants agreed that companies must make concerted efforts to raise awareness and to make safety tools and policies easier to find and use. For example, participants suggested that platforms could offer a safety tour when someone creates a new account, especially for young people and women. The tour could walk people through the safety features available for their account and explain how to set them up. Participants also suggested that companies should work towards standardising safety features and policies across platforms.

2. Young women and social media platforms don’t always describe or label experiences of abuse as ‘abuse’, so young women don’t always report it.

Many participants noted that when young women see abusive content online, they describe it as “embarrassing” or “hurtful” rather than violent or abusive. This can lead to problems with reporting abuse, because the categories offered by platforms do not always match the way young women would describe what has happened. Because of this, participants said that tech companies need to do more to help young people recognise that content they view as embarrassing or hurtful can be abuse or violence, and to report it as such.

3. Abuse online can lead young women to stay silent or even leave these spaces.

This happens both when a young woman is directly targeted by abuse and when she witnesses abuse against someone else. Young women are excited and curious when they begin exploring the online world, but this can quickly turn to shame and silence when they face online abuse. As a result, many will think twice about what they post, stop posting altogether or even close their accounts. Young women who see others facing abuse online may also choose to limit their participation in these spaces.

For many other young women, the number of harmful posts and comments across social media platforms can lead them to believe that abuse is normal, and that it is simply the price young women have to pay for accessing online spaces. 

4. Young women online are particularly worried about abusive videos and images.

In our previous consultation, women in public life described how abusive images and videos threatened their credibility, usually in the form of doctored images or videos, like deepfakes. For young women, these threats more commonly took the form of non-consensual intimate images (NCII). Other participants mentioned cases where perpetrators send young women sexually explicit images through direct messages. As the pandemic has driven more of our interactions online, participants and tech companies noted an increase in this kind of abuse.

5. Reporting systems are not working well for many young women and LGBTQ+ youth.

Many participants noted that young women use terms or slang that moderation systems may not pick up. This is an even bigger challenge for young women who speak different languages and dialects, or for LGBTQ+ youth, whose terms for gender identity and self-expression are constantly evolving.

Participants also noted that young women often receive no response at all from companies when they do report content, discouraging them from reporting again. This lack of a timely response when young women do take the time to report echoes what we heard in all of our previous consultations, and it results in a significant lack of trust in reporting systems.

While there are ways systems could be improved to make reporting more transparent and effective, strengthening reporting systems is only one way to ensure that young women are safe online. As one participant noted, “We won’t have a safe online space if we only have it as a place to report violence.”

6. The limits of formal safety tools lead young women and girls to find informal ways to protect themselves.

Young women use formal tools like muting, setting their accounts to private, and turning off comments or location sharing to keep themselves safe. Some also unfriend or block accounts, but because a perpetrator may be able to see that they have been blocked or unfriended, participants said these options are less useful in contexts where a young woman might know the perpetrator, such as small towns and schools.

While these formal tools can stop some of the abuse, there are still gaps in their protection. This leads young women to also find informal ways to protect themselves online. For example, participants noted that young women, and in particular those from activist communities or marginalised groups, will share lists of accounts they have blocked to help stop coordinated abuse. Other young women put emojis in their names to make it more difficult for abusers to find their accounts 🔍. 

7. Solutions to online abuse need to be structural and consider gender-by-design methodologies that tackle the root causes of inequality.

These solutions must also consider the experiences of young women, especially those from marginalised communities, including the voices of Black youth, LGBTQ+ youth, and young women of colour from the Global South. For our part, the Web Foundation is leading a multi-stakeholder initiative with both tech companies and civil society organisations through our tech policy design workshops, the pilot program for the Web Foundation Tech Policy Design Lab. In this space, we will co-create solutions to online gender-based violence to help push for a web that works for young women. As one attendee said, “Tools are not the answer to change misogyny — it will take decades of work to change the patriarchy. But while that is happening, we have to make sure the tools are helpful.”

Next steps

With the conclusion of the consultations, the Web Foundation will move forward with the next steps in our work to counter online gender-based violence: a series of policy design workshops with civil society organisations, researchers, academics and tech companies. 

As part of our commitment to tackle all forms of digital inequality, including online abuse that targets women because of their different and intersecting identities, the Web Foundation will use the insights gathered during our consultation series to apply design systems thinking and human-centred design principles in our workshops. This will enable participants to co-create policy and product solutions to online gender-based violence and abuse.

This work can’t wait. As we mark International Women’s Day, governments and companies must commit to action to make sure women can fully and freely participate in our online world.


