
What are algorithms? How to prevent echo chambers and keep children safe online


Algorithms are an important part of social media feeds, but they can create echo chambers. These echo chambers can lead to online hate, misinformation and other harms.

What is an algorithm?

An algorithm is a set of instructions that a computer program follows to perform a specific task. There are different types of algorithms, but on social media, the instructions decide what content to show each user. Algorithms do this by learning from users’ interactions with content, such as likes, comments and shares.

Social media platforms use algorithms to keep users engaged by providing relevant and interesting posts. This is similar to how websites use cookies to show users advertisements relevant to them.
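
For readers who want to see the idea in code, below is a minimal, hypothetical sketch in Python (an illustration only, not any real platform’s code) of a feed-ranking rule: posts about topics the user has liked before score higher and appear first.

    # A toy feed-ranking sketch: illustrative only, not real platform code.
    # Each post is tagged with topics; the user's profile counts past likes.
    from collections import Counter

    def rank_feed(posts, liked_topics):
        """Order posts so topics the user has liked most appear first."""
        profile = Counter(liked_topics)  # e.g. {'football': 3, 'gaming': 1}

        def score(post):
            # A post scores higher the more its topics match past likes.
            return sum(profile[topic] for topic in post["topics"])

        return sorted(posts, key=score, reverse=True)

    posts = [
        {"id": 1, "topics": ["football"]},
        {"id": 2, "topics": ["cooking"]},
        {"id": 3, "topics": ["football", "gaming"]},
    ]

    # A user who has liked football and gaming posts sees those topics first;
    # the cooking post sinks to the bottom of the feed.
    print(rank_feed(posts, ["football", "football", "football", "gaming"]))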

What is an echo chamber?

An echo chamber is a situation where people only see information that supports their current beliefs and opinions.

Social media echo chambers work by ‘hiding’ content that the algorithm judges irrelevant to the user. This is the content users swipe past, don’t interact with or block on their feed.

However, the content users don’t see may help create a balanced view of the world. So, not seeing this content may create confirmation bias, where the content users do see confirms their beliefs without offering different points of view.

How do algorithms create echo chambers?

Algorithms create echo chambers by showing users content similar to what they already engage with. If that content is hateful, the suggestions will also be hateful. For example, many users who follow Andrew Tate find themselves surrounded by similar content that spreads misogyny and online hate against women and girls.

Computers and algorithms cannot assess the accuracy or balance of the information they suggest. As such, an algorithm cannot choose to show users balanced views or facts. Echo chambers can instead give users the impression that ‘everyone’ believes the same as they do.

However, these users only see content from those with similar views. Therefore, it is up to the individual to think critically about what they see and interact with.
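
The feedback loop described above can be shown with a small, hypothetical Python simulation (a sketch under simplified assumptions, not real platform behaviour): the user engages with whatever the feed ranks first, each engagement reinforces the ranking, and the feed narrows even though the user started with balanced interests.

    # A toy echo-chamber feedback loop: illustrative only.
    from collections import Counter
    import random

    TOPICS = ["football", "cooking", "politics", "gaming", "music"]
    profile = Counter({topic: 1 for topic in TOPICS})  # balanced interests

    for day in range(30):
        # The feed picks five posts, weighted towards topics already liked.
        weights = [profile[topic] for topic in TOPICS]
        feed = random.choices(TOPICS, weights=weights, k=5)
        # The user engages with the top post, reinforcing that topic.
        profile[feed[0]] += 1

    # After 30 'days', one or two topics usually dominate the profile,
    # so the feed has narrowed without the user ever choosing to narrow it.
    print(profile.most_common())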

What are the risks of echo chambers for children and young people?

Online echo chambers may lead some users to become more extreme in their views because they don’t experience opposing ones. This can lead to exposure to harmful content, conspiracy theories and radicalisation.

Children are also at greater risk of believing misinformation or being manipulated online. Because of their stage of brain development, they may not yet have the critical thinking or digital literacy skills needed to be discerning consumers of content. As such, they are more likely to believe extreme or controversial ideas.

Additionally, exposure to online hate like racism and misogyny, or other harmful world views, can take its toll on children’s wellbeing and growth. Seeing inappropriate, violent or hateful content on a regular basis can lead to desensitisation. As a result, children might not recognise that the content they see is harmful and therefore may not know when it’s right to take action.

Children and young people using social media may not yet understand how algorithms work. So, it’s important to help them learn how to manage content suggestions and take action themselves.

Online hate: Facts and advice hub

Algorithms can create echo chambers that lead to online hate. Learn how online hate works and what you can do to stop it.

How to prevent echo chambers on social media

While algorithms can offer tailored social media experiences unique to each user, it’s important to recognise potential risks and solutions. Help children learn how to recognise when they’re in an echo chamber, how to prevent it from happening and where to get help when needed.

Talk about social media

Explore the benefits and risks of social media

Like many things online, social media has both benefits and risks. To stop your child from falling into the trap of echo chambers, make sure they understand this risk.

Checking the social media platform’s minimum age can help as well. If your child does not meet the age requirement, they may not be ready to use the platform. Encourage use of age-appropriate platforms instead until they build their critical thinking and digital literacy skills.

Review social media use

Set boundaries on social media

More time spent on social media means that algorithms learn more about your child’s interests. Echo chambers also grow with time spent engaging with relevant content. Therefore, it’s important to set boundaries on social media: set time limits, use in-app safety settings and talk to them about who they interact with or follow.

Practise critical thinking

Build critical thinking and digital literacy skills

Machines cannot think critically about what they suggest to users. As such, children need the opportunity to build these skills for themselves. So, as part of the conversations you have, ask them guiding questions:

  • Why do they follow a certain content creator? What do they find interesting or entertaining?
  • Is there anyone who might not like what the content creator is saying? Why might that be?
  • If they share points of view, what are the opposite views?
  • Who else do they follow that ‘balances’ this creator’s point of view?

Remember that these conversations don’t need to focus only on controversial influencers. Talk regularly about all the content your child sees to make critical thinking a regular part of their digital life.

Set up safely together

Work through safety settings together

Most social media platforms are meant for those aged 13+. As such, simply setting up parental controls for your teen is not going to be enough. After all, many teenagers know how to turn them off (or can figure it out).

So, instead, set everything up together. Talk about why a setting is important and what it does. Discuss the points you disagree on to come to a middle ground and compromise. This way, they feel included in their online safety and can take ownership of it. Furthermore, if they need more support, they will feel more comfortable asking you for help.
