Report finds an urgent need for independent regulation of social media

On Monday 18 February 2019, the Commons Digital, Culture, Media and Sport (DCMS) Committee published its long-awaited final report on Disinformation and ‘fake news’. The report covered an enquiry that spanned 18 months, oral evidence from 73 witnesses including The Open University, over 4,350 questions and a final ‘International Grand Committee’ meeting in November 2018. Its main recommendations focus on the ‘urgent need to establish independent regulation’ of social media and technology companies, including a compulsory ‘code of ethics’, as well as an overhaul of current electoral law. These are bold suggestions, designed to enable the government to tackle such weighty issues as electoral fraud, the abuse of user data and the spread of disinformation and false news.

Dr Caroline Tagg, Lecturer in Applied Linguistics, and Dr Philip Seargeant, Senior Lecturer in Applied Linguistics at The OU, give an overview of the report:

Still a need for digital literacy

The focus of the enquiry has evolved considerably since it was first set up, and the final report (building as it does on the Committee’s interim report released last autumn) is predominantly focused on issues around electoral malpractice and data mismanagement rather than the narrower concept of ‘fake news’ with which it began. Despite this, one of the key recommendations still concerns the need for digital literacy education. Given the wider context of the new report, ideas of what counts as digital literacy are themselves becoming broader, and now include issues such as understanding what data is generated when people communicate via social media, what happens to this data, and the provenance and purpose of political adverts.

Creating an environment with friction

One notable feature of the report is the way it challenges social media companies’ aim to achieve ‘a frictionless experience’ for their users, and argues for the need to introduce friction back ‘into the system’, by which the Committee means that ‘there should be obstacles put in their place to make the process of posting or sharing more thoughtful or slower’. The Committee cites the Center for Humane Technology’s suggestions for creating friction, including ‘the ability to share a post or a comment, only if the sharer writes about the post; the option to share a post only when it has been read in its entirety; and a way of monitoring what is about to be sent, before it is sent’. These are undoubtedly useful suggestions that can enable users to change their behaviour, whilst challenging the potentially dangerous assumptions underlying social media companies’ policies and strategies.

Why do we share fake news?

But changing technology is only one part of the equation. It is also important to understand why people share false stories, and the effect this type of disinformation actually has on people’s actions. After all, the spread of disinformation online is related to how people use sites like Facebook – and this is shaped by the fact that Facebook is, first and foremost, a social space. This was recognised in the Committee’s interim report, which in turn cited the evidence we gave to the Committee in January that ‘to many people Facebook was not seen as a news media site, but a “place where they carry out quite complex maintenance and management of their social relationships”’. As our research shows, when people post to Facebook they potentially address a range of different social ties, from close family members to colleagues and acquaintances. Managing these various relationships all at the same time, while not offending or upsetting anyone, can be a tricky process. Because of this, what someone shares or likes is often determined as much by the ties they have with their network as by a strict evaluation of the content’s credibility. Picking up on the concepts used in the final report, this might involve what we could call social friction: the way in which someone’s social ties and relationships might shape, complicate or challenge their online behaviour, potentially leading them to post something they might suspect is false (because it came from a friend). Or, alternatively, it might prompt them to ‘pause and think before generating or consuming content’, as the Committee puts it (because of the user’s concerns about how they might come across to others, for example).

Questioning what we read and write

For this reason, as we argued in our own evidence to the Committee, any solution to the problem needs to include critical digital literacy education alongside technological solutions. In line with this, the final report points out that the Committee ‘cannot stress highly enough the importance of greater public understanding of digital information – its scale, importance and influence’. The Committee also reiterates its recommendation from the interim report that digital literacy should be made a ‘fourth pillar of education alongside reading, writing and maths’, a recommendation which, unfortunately, the government has yet to pick up on; the Committee’s earlier suggestion of a social media company levy to finance a comprehensive educational framework was likewise rejected in the government’s response to the interim report. Nonetheless, digital literacy re-emerges in the final report as a crucial element of the need for friction, with the Committee’s recommendation that ‘[t]echniques for slowing down interaction online should be taught, so that people themselves question both what they write and what they read – and that they pause and think further, before they make a judgement online’. It concludes by calling for a ‘united approach’ to digital literacy which includes ‘a public discussion on how we, as individuals, are happy for our data to be used and shared’.

This ‘united approach’, we would argue, needs to incorporate a critical education programme that also includes what we call social digital literacies. Alongside traditional digital literacy skills, we need to foster greater critical awareness among the general public of how our social interactions and relationships play an important part in influencing our decisions regarding what to share or like – and how this in turn can contribute to the circulation and visibility of news in the online environment. While technological tweaks can introduce friction into the system, users are also contending online with multiple sources of social friction – that is, the various social and personal relationships that we manage online and our concerns regarding how we come across to others – all of which shape our online behaviour and may prompt ‘more pause for thought’ as much as they may encourage us to spread false content.

Find out more

Read Fake news and the need for ‘social’ digital literacy on OU News

Studying Language and Linguistics at The Open University

Dr Caroline Tagg, Lecturer in Applied Linguistics and Dr Philip Seargeant, Senior Lecturer in Applied Linguistics at The OU

About Author

Hannah is the Student Stories Copywriter in the In-house Creative Team at The Open University, having previously been a Media Relations Manager in the Press Office. With over a decade in communications, Hannah has led projects both agency-side and in-house for large companies and well-known brands, including RBS, NatWest, Travelodge, Audible, AA and the Royal Academy of Dance. She has completed a Masters in Publishing Studies and is currently studying towards an MBA. In her free time she enjoys photography, reading and going to the theatre.
