Social-media platforms and fake news
Nov 05, 2019
Written By Kerry Holmes
Recent scandals and allegations about the misuse of data, election interference and misinformation have led to a surge in reviews, policies and regulations.
The print media in the UK, particularly the tabloid press, has long been associated with partisan reporting, exaggeration and taking facts out of context. On occasion, papers have even published entirely fabricated stories.
Previously, their reach was limited by circulation and could be partly balanced by exposure to other outlets, such as television news programmes. But rising engagement with digital publications and social media increases the risk of being presented with so-called fake news, and reduces the opportunities for challenge by other sources. The shift to user-led, on-demand television has further reduced media pluralism.
The Independent Press Standards Organisation (IPSO) regulates print and online newspapers. Its scope is limited to members, who are obliged not to publish “inaccurate, misleading or distorted information”, and complaints can only be brought by those personally affected. IMPRESS is an alternative to IPSO, with regulation also restricted to its members. Broadcast media is regulated by the Office of Communications (Ofcom), whose limited remit does not extend to social-media platforms.
Industry regulators don’t currently have the power to counter the rise in fake news. But several established areas of law may apply, including defamation, human rights, electoral law, competition and data protection.
Defamation
Fake news may be classed as defamatory if it is likely to cause serious harm to a person’s reputation or serious financial loss to an organisation. To bring a claim against the operator of a website or platform, you must give notice and demonstrate that you could not identify the individual responsible for the post itself.
In England and Wales, the defence of “honest opinion” replaced “fair comment” in 2013. A claim can be defeated where an honest person could have held the opinion expressed, on the basis of facts that existed at the time of publication. The defence of public interest was also broadened, and there are proposals to bring the law in Scotland and Northern Ireland into line with these changes. Even so, defamation offers limited options for legal redress, particularly as not all fake news constitutes hate speech.
Free speech and human rights
The UK does not have a written constitution setting out rights such as freedom of speech. It has incorporated the European Convention on Human Rights into domestic law, guaranteeing freedom of expression, but retained exceptions. The Race Relations Act 1976, for example, prohibits "threatening, abusive or insulting" words likely to stir up hatred against any racial group. This covers words or behaviour likely to cause harassment, alarm or distress or a breach of the peace, and any incitement to racial hatred or terrorism. The European Convention also prohibits discrimination based on "sex, race, colour, language, religion, political or other opinion, national or social origin, association with a national minority, property, birth or other status".
International human rights law attempts to strike the same balance between freedom of expression and equality. United Nations guidance cites a “six-part threshold” for determining hate speech: context; speaker identity and influence; intent; content; extent and magnitude; and likelihood of harm. It also affirms that the rights people hold offline “must also be protected online”.
Electoral law
Misinformation for political gain may fall under UK law on election campaigning and financing, overseen by the Electoral Commission. Its 2018 report, “Digital campaigning: Increasing transparency for voters”, acknowledged the value of new ways of reaching voters and the importance of free speech. It called for the originators of online information to be made apparent and for enforcement of the existing law forbidding spending by foreign nationals on UK campaigns. It also proposed increased fines and enhanced investigative powers.
“Our electoral regulations are hopelessly out of date for the internet age,” said Damian Collins, chair of the Digital, Culture, Media and Sport (DCMS) Select Committee. In its report “Disinformation and ‘fake news’”, published in February 2019, the committee called for transparency of campaigning, greater powers for the Electoral Commission and clear liabilities for operators regarding illegal or harmful content.
Competition law
The DCMS Committee suggested that the Competition and Markets Authority (CMA) should audit the advertising market on social media. The chancellor of the exchequer, Philip Hammond, echoed this in a March 2019 letter to CMA chair Andrew Tyrie. This followed another commissioned report from the Digital Competition Expert Panel, “Unlocking digital competition”.
Data protection
The European Parliament’s Committee on the Internal Market and Consumer Protection notes that internet users exchanged 100 gigabytes of data per day in 1992; by 2016, this had risen to 26,600 gigabytes per second. Information that is not directed to the right audience becomes lost in the noise, and targeting requires personal data, which is subject to the EU’s General Data Protection Regulation (GDPR). The GDPR came into force in May 2018 and was incorporated into the UK’s Data Protection Act 2018. The Information Commissioner’s Office (ICO) has the power to conduct criminal investigations and issue fines relating to the GDPR.
Organisations must have policies to protect data that can identify an individual. This includes names and contact details, but also IP addresses. Six key principles require that data is:
- Processed lawfully, fairly and transparently;
- Collected only for specific, legitimate purposes;
- Adequate, relevant and limited to what is necessary;
- Accurate and, where necessary, kept up to date;
- Stored only as long as is necessary;
- Processed in a manner that ensures appropriate security.
Data can only be processed (i.e. used) if at least one of the following applies:
- If the subject has given consent;
- To meet contractual obligations;
- To comply with legal obligations;
- To protect the subject’s vital interests;
- For tasks in the public interest;
- For the legitimate interests of the organisation.
In addition, consent must be freely given, specific, informed and unambiguous; a request for consent must be intelligible and in clear, plain language; and silence, pre-ticked boxes and inactivity no longer suffice as consent.
The GDPR also defines “special category” (sensitive) personal data requiring extra precautions, including religion, political opinions, race and sexual orientation. Some fake news is known to have been targeted using data that was obtained illegally, and enforcing data protection law may provide one way to respond.
UK Government response
The UK Government set out its ‘Digital Charter’ in January 2018 (updated in April 2019) to limit disinformation for “political, personal and/or financial gain”. The principles are summarised as:
- people need to understand the rules that apply online;
- personal data should be used appropriately;
- protections should help keep people safe, especially children;
- rights that people have offline must be protected online;
- the social and economic benefits of new technologies should be fairly shared.
The government’s “Social Media Code of Practice”, published in April 2019, calls for efficient processes for users to report harmful content and for platform operators to deal with those notifications.
“The Cairncross Review: a sustainable future for journalism” was commissioned by the government and published in February 2019. It highlighted:
- the market failure of public-interest news;
- revenue challenges for online publishers;
- the challenge to publishers posed by Facebook and Google;
- the need to maintain high-quality journalism.
The review asserts that the government may need to intervene to protect quality online publishing and public-interest news. Among its suggestions are favourable tax treatment for online publishers and a role for the BBC in complementing commercial news. It makes eight recommendations:
- New codes of conduct to rebalance the relationship between publishers and online platforms;
- An investigation of the online advertising market to ensure fair competition;
- A news quality obligation: a regulator to oversee steps taken by operators to improve awareness of the origins and quality of news;
- Media literacy: work with Ofcom to develop a media literacy strategy, identify gaps in provision and seek opportunities for collaborative working;
- Ofcom to explore the BBC’s market impact, and the BBC to do more to help local publishers;
- Innovation funding: a government fund aimed at improving the supply of public-interest news;
- New forms of tax relief, including extending the zero VAT rating to digital newspapers;
- Direct funding for local public-interest news.
European Union response
The European Commission carried out consultations, set up a High-Level Expert Group and conducted a Eurobarometer poll. It then published a communication in April 2018 on “Tackling online disinformation: a European Approach”, presenting these actions:
- Improve transparency about origin, production, sponsorship, dissemination and targeting;
- Promote diversity of information and critical thinking by supporting high-quality journalism and media literacy;
- Use fact-checkers to indicate trustworthiness and improve traceability and authentication;
- Involve a broad range of stakeholders: public authorities, online platforms, advertisers, trusted fact-checkers, journalists and media groups.
A June 2018 report from the EU’s Committee on the Internal Market and Consumer Protection found that website and platform operators have not taken responsibility for fact-checking. User behaviour (e.g. confirmation bias) and the use of algorithms to personalise content have led to:
- Content bubbles and echo chambers, where a single point of view is bolstered by curated content and not challenged or balanced by opposing views;
- The spread of false information and unfounded opinions, which have influenced public opinion;
- Fake news shared via social media to deliberately manipulate public opinion;
- Planned operations to spread large-scale disinformation, often with political motives.
As a follow-up, in September 2018 the European Commission produced a “Code of Practice on Disinformation”, setting out commitments, best practice and key performance indicators to “address the spread of online disinformation and fake news”. This covers:
- Scrutiny of ad placements;
- Political advertising and issue-based advertising;
- Integrity of services;
- Empowering consumers.
It requests the use of tools and fact-checking organisations to promote verification and to clearly distinguish advertisements from news. It asks that users be informed of why they have been targeted, and that sponsor identity and spending be made transparent. It calls for efforts to close fake accounts and for ways to make apparent the difference between bot and human interactions. At the same time, it protects anonymous use of services and urges that access to lawful content should not be prevented. Operators are asked to dilute disinformation by improving the findability of trustworthy content and of different sources and views.
Next steps
At one end of the scale, the government of Singapore took strong action with the Protection from Online Falsehoods and Manipulation Act, which was criticised by Human Rights Watch and Amnesty International for curtailing freedom of expression. The collaborative approach adopted by the European Commission, by contrast, is taking time. In April 2019, the signatories to the Code of Practice on Disinformation (Google, Facebook and Twitter) reported on progress. The Commission acknowledged: “The voluntary actions taken by the platforms are a step forward to… better protect our democratic processes from manipulation”, but conceded that “a lot still remains to be done”.