Tackling harmful content
In addition to creating a safety portal featuring a range of resources, Facebook announced that they were sharing the technology they use to fight abuse with others working to keep the internet safe, open-sourcing two technologies that detect identical and nearly identical photos and videos. Alongside Microsoft’s contribution of PhotoDNA to fight child exploitation 10 years ago and the Google Content Safety API, this announcement is part of an industry-wide commitment to building a safer internet.
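The announcement itself does not explain how this kind of matching works, but near-duplicate detection is commonly built on perceptual hashing: each image is reduced to a short binary fingerprint, and two images are treated as nearly identical when their fingerprints differ in only a few bits. The Python sketch below illustrates the general idea with a simple average hash; the function names, the hashing scheme, and the threshold are illustrative assumptions, not Facebook’s open-sourced implementation.

```python
from PIL import Image


def average_hash(path: str, hash_size: int = 8) -> int:
    """Reduce an image to a 64-bit perceptual fingerprint.

    Shrinks the image to hash_size x hash_size greyscale pixels and
    records, for each pixel, whether it is brighter than the mean.
    Similar images produce similar bit patterns.
    """
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits


def hamming_distance(hash_a: int, hash_b: int) -> int:
    """Count the bits on which two fingerprints disagree."""
    return bin(hash_a ^ hash_b).count("1")


# Illustrative cut-off: zero bits apart means identical fingerprints,
# while a small distance suggests a near-duplicate (e.g. a recompressed
# or lightly cropped copy of the same photo).
MATCH_THRESHOLD = 10


def is_near_duplicate(path_a: str, path_b: str) -> bool:
    return hamming_distance(average_hash(path_a), average_hash(path_b)) <= MATCH_THRESHOLD
```

In a real deployment, fingerprints of known violating content would be stored in a shared database so that new uploads can be checked against it at scale, which is what makes open-sourcing the hashing schemes useful to other platforms.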
So far, they have taken action on:
- 8.8 million pieces of bullying and harassment content, up from 6.3 million in Q4 2020, due in part to improvements in their proactive detection technology.
- 9.8 million pieces of organized hate content, up from 6.4 million in Q4 2020.
- 25.2 million pieces of hate speech content, compared to 26.9 million in Q4 2020.
- Misinformation – From the start of the pandemic to April 2021, Facebook removed more than 18 million pieces of content from Facebook and Instagram globally for violating their policies on COVID-19-related misinformation and harm.
- Transparency Centre – Facebook’s Transparency Centre explains how they remove harmful content that violates their policies.
Tackling loneliness
Facebook has been part of the Government’s Tackling Loneliness Network, co-chairing the ‘tackling loneliness in young people’ group and contributing to the Emerging Together: The Tackling Loneliness Network Action Plan, which was published in May. The report makes recommendations on supporting individuals and organizations to tackle this difficult issue.
Draft Online Safety Bill
Facebook has strict policies against harmful content on their platforms, and their transparency reports show that they are making significant strides in removing more harmful content before anyone reports it to them.
Facebook Safety Centre
Launched in 2017, the Facebook Safety Centre walks people through the tools available to control their experience on Facebook, along with a range of tips and resources. This includes step-by-step videos and resources from over 75 expert NGO partners from around the world.
Facebook Parent Portal
As part of the Safety Centre, the Parent Portal offers parents and carers insight into the basics of Facebook, tips on how to talk about online safety with children, and access to a range of expert resources created to support them.
Facebook Youth Portal
Aimed at teens, the Youth Portal offers young people information about the tools and policies they can use to stay safe on Facebook. There is also advice from other young people on topics such as how to manage negative experiences.
Online wellbeing support
Introduced to the Safety Centre in May 2018, the Online Wellbeing section provides people with more information on where to get help regarding suicide prevention. It also signposts tools on Facebook that support people posting about suicide, including reaching out to a friend, contacting helplines, using social resolution tools, and reading tips about things they can do in that moment.
Digital Literacy Library developed for educators
The Digital Literacy Library was created in August 2018 and features a collection of lesson plans to help young people think critically and share safely online. Developed by the Youth and Media researchers at the Berkman Klein Center for Internet & Society at Harvard University, the resources are aimed at educators of young people aged 11 to 18. The lessons incorporate over 10 years of academic research by the Youth and Media team, reflect the diverse voices of young people from around the world, and address topics such as reputation management, identity exploration, cybersecurity, and privacy.
Safety guides
There is also a range of safety guides, created with partners around the world, covering a variety of issues; here are links to a few examples:
- Think Before You Share: Designed for young people, this guide contains tips about thinking before you post, not sharing passwords, and how to resolve online issues.
- Help A Friend In Need: This contains information about what to look out for on social media when your friend may be feeling down and how to get help.
- Be Kind Online: A guide to support LGBTQ+ teens by encouraging kindness online.
In addition to the guides, they also offer a Help Centre with further education and advice.
Recent updates
- Defaulting teens to private accounts – Young people under 18 years old in the UK who join Instagram will have their accounts default to private. For those who already have a public account, Instagram will show a notification highlighting the benefits of a private account and explaining how to change their privacy settings.
- Stopping unwanted contact – New technology will stop suspicious accounts from being able to discover or follow teens on Instagram.
- Changing how advertisers can reach young people – Across Facebook, Instagram, and Messenger, Instagram will only allow advertisers to target ads to people under 18 in the UK based on their age, gender, and location. This means that previously available targeting options, such as those based on interests or on activity on other apps and websites, will no longer be available to advertisers (a sketch of how such a restriction could be enforced follows this list).
- Understanding users’ real age – Instagram will be sharing updates on the technology they are developing to better understand users’ real age, including additional details on using AI to find and remove underage accounts and to ensure teens receive age-appropriate experiences.
- Expanding Instagram’s Youth Advisors – Instagram has also added new experts in privacy, youth development, psychology, parenting, and youth media, such as ParentZone in the UK, to their group of global Youth Advisors, who will continue to provide research, guidance, and expertise as they develop new products and features for young people.
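To make the advertising change above concrete, a restriction of this kind can be thought of as an allowlist of targeting attributes that is enforced whenever an audience may include under-18s in the UK. The sketch below is a hypothetical Python illustration under that assumption; the attribute names and the validation function are not Facebook’s actual ads API.

```python
# Targeting attributes the policy still permits for audiences that may
# include under-18s in the UK; anything else (interests, activity on
# other apps and websites, etc.) is rejected. Names are illustrative.
ALLOWED_FOR_UK_MINORS = {"age", "gender", "location"}


def validate_targeting(options: set[str], may_include_uk_minors: bool) -> set[str]:
    """Return the targeting options unchanged if the policy permits them.

    Raises ValueError when a restricted attribute (e.g. "interests")
    is used on an audience that may include under-18s in the UK.
    """
    if not may_include_uk_minors:
        return options  # unrestricted audiences keep all options
    disallowed = options - ALLOWED_FOR_UK_MINORS
    if disallowed:
        raise ValueError(f"Not permitted for under-18 audiences: {sorted(disallowed)}")
    return options


# Example: demographic targeting passes, interest-based targeting raises.
validate_targeting({"age", "location"}, may_include_uk_minors=True)
# validate_targeting({"interests"}, may_include_uk_minors=True)  # ValueError
```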