Why the solution to ‘fake news’ is education, not technology

This opinion article was written by Philip Seargeant, Senior Lecturer in Applied Linguistics at The Open University.


Since the Trump victory, and the Brexit result before that, the issue of ‘fake news’ has barely been out of the real news. Social media companies, such as Facebook, have been criticised for their role in the way these stories are distributed, and there’s been much discussion about the impact this is having on journalism, if not on democracy itself. President Obama has offered his thoughts on the topic, as has Hillary Clinton, while the Pope compared consuming fake news to eating faeces.

Are social media technologies the issue?

Most of the solutions for tackling the problem have focused on what Facebook should do to change its service. The argument goes that the personalisation algorithm, which shapes the way people experience the site, is responsible for creating ‘filter bubbles’; these shield users from views they disagree with, and allow fake news and highly partisan opinions to circulate unchecked. In December 2016, Facebook announced a set of measures to tackle this, including getting readers to flag stories for fact-checking.

Simply blaming technology risks implying that people themselves are helplessly naïve, and it ignores the fact that all media have an agenda; that each outlet is, to some degree, constructing its own particular version of events.

Are we equally responsible for the filter bubble we experience on social media?

The way people interact on Facebook can be a complicated business. Research suggests that in fact people’s own actions are a key factor in the way stories and opinions are shared – often they themselves create a filter bubble effect through these actions. They’re constructing a particular image of themselves through what they say and do. They’re speaking in front of an audience which is made up of friends and acquaintances from all parts of their lives, who often have wildly different values. They have limited control over who will see the information they write and how it will be interpreted. And as they scroll down the news feed they’re switching constantly between the relentlessly trivial and very serious.

Education as a solution to the effect of the filter bubble

Our ability to make informed decisions about how we communicate depends on an understanding, not just of how the technology works, but how it works socially. In other words, it’s not just a matter of knowing, for example, how to keep on top of your privacy settings; it’s knowing the implications of what can happen as a result of choosing one setting over another. It’s not just a matter of understanding how to flag something you consider questionable; it’s a matter of evaluating its provenance and purpose in the first place.

Education, especially in Higher Education, can help raise awareness of how the flow of information in society is managed, and what this means for how people engage with each other’s opinions and values. This should focus on the role that social media now plays in politics and the circulation of news; on the tensions that can emerge between this role and the assumption that Facebook is mostly for trivial topics and therefore of little real-world consequence; and on the often unintended consequences of people’s actions online.

How does this differ from the ‘critical thinking’ that Higher Education already offers?

The point is that an understanding of the medium, and its effect on the message, is also vital. In this situation, the medium is a mixture of the technology and the ways people communally use it, so this understanding needs to be informed by research into the way people actually use social media.

Universities are also in an excellent position to extend this type of learning to the wider community through resources that are made freely available online. If we’re anxious about tainted knowledge undermining democratic society, giving people a clear understanding of the tools they use to share and consume this knowledge is an excellent way of guarding against such a scenario.


Learn more on the free online learning platform, OpenLearn

About Author

Christine is a manager in the Media Relations team within the Marcomms Unit at the OU. She is an experienced BBC journalist, sub-editor and news editor, and has a background in regional newspapers. After moving into PR she worked as a press officer for the Zoological Society of London. She has a BSc in Social Sciences with Politics from The Open University, and focuses on STEM stories and widening access in HE. Chris swims regularly and has a pet tortoise called Lightning.
