Take 10 - 26 January 2023
Welcome to RPC's Media and Communications law update. This month's edition covers key media developments and the latest cases.
RPC's Media team are happy to announce the promotion of two of our Associates, Nadia Tymkiw and Samantha Thompson, to Senior Associate. Congratulations from all of us at RPC!
Journalists given access to report on family proceedings
Accredited journalists and legal bloggers will now be able to report on family court proceedings in a 'big cultural change' to the Family Division. The move is part of a new pilot, the 'Transparency Reporting Pilot', which will be conducted in Leeds, Carlisle and Cardiff and which aims to promote public confidence in, and the accountability of, the family justice system. Mrs Justice Lieven, liaison judge for the Midlands Circuit, said that being unable to report on family court cases 'has a major impact on the open justice principle' and that having journalists in court would 'improve standards'.

Care applications and placement applications made within court proceedings will be the first to be open to reporting and, after a six-to-eight week period, private law children cases may also be reported on. Remote hearings will also be included in the pilot. A transparency order will set out what can and cannot be reported: journalists must preserve the anonymity of the children involved in proceedings, but details of legal representatives, judges and local authorities may be reported. The transparency order will also allow skeleton arguments, case outlines and position statements to be released to the press. However, cause lists will remain as they are, due to delays in rolling out the pilot and the extensive cost of changing the lists.
"Legal gangsterism?" Bob Seely proposes anti-SLAPPS private member bill
Conservative MP Bob Seely introduced a bill on Tuesday which would make it more difficult to bring legal actions known as strategic lawsuits against public participation (SLAPPs). The government pledged in July to give courts in England and Wales new powers to dismiss SLAPPs and to cap costs, but Mr Seely's proposals, introduced via his Defamation, Privacy, Freedom of Expression, Data Protection, Legal Services and Private Investigators Bill, press the issue further. Whilst the text of the bill is yet to be published, a model law was published by Index on Censorship and other groups last November (for more information, see RPC's recent blog). Seely has urged the government to accelerate the introduction of its own rules, or to back his bill, which is unlikely to make further progress in its current form due to the lack of parliamentary time for private members' bills.

Mr Seely told the Commons that 'Firms who offer SLAPPS have made themselves wealthy, effectively attacking a free media, freedom of speech and legitimate corporate due diligence,' comparing the SLAPP 'business model' to 'legal gangsterism'. He cited the 'worst example' in recent years as the 'multiple lawsuits' against HarperCollins and Catherine Belton, the writer of Putin's People. Seely also accused the Bar Council, the Law Society and the Solicitors Regulation Authority (SRA) of 'doing very, very little regulation'. Mr Seely said his Bill would 'limit the financial and psychological costs of a meritless SLAPPS claim which can be imposed on a defendant', sanction those who 'abuse our courts', and dismiss SLAPP claims before costs have accrued. He proposed that the UK could follow in the footsteps of the US, where law firms can be fined for bringing unwarranted claims against journalists who report stories deemed to be in the public interest. Nick Vineall, chair of the Bar Council, responded: 'The Bar Council has already put on record its support for measures to tackle SLAPPS. We have submitted evidence to the Ministry of Justice and the EU Commission expressing our support for reform.'
FKJ v RVT and others - WhatsApp with that?
On 11 January 2023, the High Court dismissed an application to strike out a claim brought by the Claimant against her previous employer for misuse of her private information.
The Claimant, a solicitor, previously brought claims of sex discrimination, unfair dismissal and wrongful dismissal in the Employment Tribunal against her former employer and its managing partner (RVT) after she was dismissed for falsifying a timesheet. The Claimant alleged that whilst employed she had been sexually harassed by RVT, her supervisor at the time, citing 79 instances of sexual harassment. The Claimant lost her Tribunal claim largely on the basis of her own WhatsApp messages, 18,000 of which were exhibited by the Defendant, which the Tribunal took to undermine her credibility and to show that the attention she had received was either not unwanted or did not take place at all.
The Claimant subsequently brought a claim for misuse of private information against the Defendant for his use of her WhatsApp messages, claiming that RVT had hacked into her WhatsApp account. The WhatsApp messages exhibited in the Tribunal claim had included messages between the Claimant and her now husband, as well as messages with her best friend in which she discussed intimate details about her private life. In seeking to strike out the claim, the Defendant argued that he gained access to these messages because 1) a large number had been found on the Claimant's work laptop; and 2) he had been sent further batches of her WhatsApp messages via an anonymous letter.
The High Court dismissed the Defendant's strike out application on the basis that it struggled to see how the Claimant would not have a reasonable expectation of privacy in relation to her WhatsApp messages, and that the real question was likely to be how much the Claimant would be awarded in damages. The Court found that 1) the Defendant had failed to explain why the messages would no longer be private merely because they were on the Claimant's work laptop; 2) only 40 of the 18,000 messages were used in the Tribunal claim; and 3) upon receipt of further messages from an anonymous source, the Defendant should have told the Claimant he had received them (which he did not do until he issued his Grounds of Resistance in the Tribunal claim) and passed them on, instead of relying on them in support of his own case, which English law did not permit. The Court considered that the Defendant was merely seeking to "stifle" the Claimant by bringing a strike out application "without merit".
Online Safety Bill: senior managers at tech firms face prosecution for failure to protect young users from harmful content
The Online Safety Bill, introduced in the House of Commons on 17 March 2022, proposes better regulation for search engines and firms that host user-generated content. The bill aims to reduce the amount of online content deemed inappropriate for young users and that ministers believe causes serious harm to their safety. This includes content promoting self-harm and eating disorders, content depicting sexual violence, child sexual abuse material, revenge pornography, the sale of illegal drugs or weapons, and terrorism.
The recently proposed measures, put forward by nearly 50 Conservative MPs and backed by the Labour Party, will impose a duty on tech firms to ensure children's online safety by mitigating and managing the risks and impacts of harm to children online. Firms will be required to introduce and enforce strict age limits and to publish risk assessments detailing the risks their services pose to children from inappropriate content. The recently proposed amendment enables Ofcom to issue enforcement notices to senior managers of tech platforms who are found to have breached their child safety duties by allowing exposure to age-restricted or illegal content. Online providers must co-operate fully with an Ofcom investigation into whether their service has failed to comply with the requirements; Ofcom will only be able to prosecute senior managers if they fail to co-operate with an investigation. Failure to comply may result in fines of up to 10% of the company's global turnover, while senior managers could face a prison sentence of up to two years.
The Online Safety Bill had its first reading in the House of Lords on 18 January 2023. The second reading is due to take place in the House of Lords on 1 February 2023.
Amersi v Leslie
Mohamed Amersi, a wealthy Conservative party donor and businessman, issued proceedings back in 2021 against Charlotte Leslie, former Tory MP, and the Conservative Middle East Council (CMEC) of which she is a director. Amersi alleges that Leslie spread defamatory memos about him to various key individuals, including Ben Elliot, the Conservative party's chair, following Amersi's proposal to establish a group to improve the party's relationship with the Middle East. On 10 January 2023, Mr Justice Nicklin heard an application made by Amersi to amend his particulars of claim to show how the publication of this material had caused serious harm to his reputation.
At the hearing, Mr Justice Nicklin told Amersi's counsel that if Amersi wanted to clear his name, he could have brought proceedings against other individuals since "it was clear" what they had published. He noted that the Court was not a "playing field" on which Amersi could carry out his dispute, nor were judges simply going to "referee" without considering the merits of the case. Judgment has been reserved.
Ofcom opens enforcement programme on age assurance measures for adult video-sharing platforms
Ofcom is launching an enforcement programme into age assurance measures across Video Sharing Platforms (VSPs). Ofcom has three objectives for this programme. The first is to assess the age assurance measures to ensure they are sufficiently robust to prevent under-18s from accessing videos containing pornographic material. The second is to identify whether there are other platforms in the adult VSP sector that may fall in scope of the VSP regime but have not yet notified their service to Ofcom as required under the VSP framework. The third is to understand from providers of adult VSP services the challenges they have faced when considering implementing any age assurance measures.
Following the programme, Ofcom will decide whether any further action is needed and how best to address potential harm. If Ofcom becomes aware of a platform within the UK's jurisdiction that is not protecting its users appropriately, it could take enforcement action even if the platform has already notified its service to Ofcom. The programme will initially run for four months. The current guidance on which VSPs need to notify Ofcom can be found here.
View counts now available on Tweets - how will this affect Twitter libel claims?
Twitter has announced that view counts for Tweets are now visible to iOS, Android and web users. It will be interesting to see the effect of this on libel claims relating to Tweets, recent examples of which include Banks v Cadwalladr [2022] EWHC 1417 (QB) and Wright v McCormack [2022] EWHC 2068 (QB). The extent of publication is commonly contested in such cases, particularly in relation to the issues of serious harm and damages. In cases such as Monroe v Hopkins [2017] EWHC 433 (QB), Claimants have relied on Twitter analytics showing how many 'impressions' a Tweet received – a figure which reflects not how many people have seen the Tweet, but how many times it was displayed on the screen of an active user. Where the view count of a Tweet is low, a Claimant may now find it more difficult to rely on that publication in support of the position that the words complained of have caused, or are likely to cause, serious harm to their reputation within the meaning of section 1 of the Defamation Act 2013. RPC act for Carole Cadwalladr and Peter McCormack.
Open Justice: Court reporting in the digital age
The Government has set out its initiatives this year to strengthen open justice, after the House of Commons Justice Committee raised concerns that open justice has been negatively affected by the decline in news coverage over the last 20 years. The Justice Committee's report, Open Justice: Court Reporting in the Digital Age, called on the Government to fill the gap left by the increasing digitisation of both the media and the courts.
In its response, published earlier this month, the Government committed to taking steps to make courts and tribunals more accessible to the media and the general public. These initiatives include publishing a charter outlining the existing rules that enable public access to court and tribunal hearings, as well as providing a complete record of judgments and decisions on online platforms.
However, the response also rejected the call for new legislation to "define the proper limits of open justice", including elements of information transparency, stating that the principle is already "amply" established in common law, specific statutes and under the Human Rights Act.
Human Rights Watch warns of significantly weakened human rights protections in the UK
In its 712-page World Report 2023, its 33rd edition, Human Rights Watch reviewed human rights practices in almost 100 countries. The organisation highlighted several developments in the UK in 2022 that had the effect of 'significantly weakening human rights protections', including the proposed replacement of the Human Rights Act with the Bill of Rights and the clamping down on protest rights. The report came just before the Government tabled an amendment to the Public Order Bill which broadens the legal definition of 'serious disruption', allowing police to shut down protests before disruption occurs. The report also critiqued the proposals in the Bill of Rights, which sought to 'diminish the influence of the European Court of Human Rights on domestic courts, reduce public authorities' obligations to protect rights and limit the responsibility of the UK authorities to protect rights outside UK borders'. The UK government, however, asserts that the Bill of Rights seeks to protect the rights of freedom of expression and freedom of the press 'by introducing a stronger test for courts to consider before they can order journalists to disclose their sources'. The government has not addressed the report.
Meta has enhanced restrictions to maintain age-appropriate ads for teens
Meta is enhancing restrictions on the data available to firms advertising to teenage users of Facebook and Instagram. From February, gender will no longer be a targeting option and advertisers will only be able to use a young user's age and location. Meta's recent blog post states that using age and location ensures this demographic see age-appropriate adverts and products that are relevant to where they live. Furthermore, a teenager's engagement on the apps will no longer inform the type of adverts recommended to them. From March, young users can access enhanced Ad Topic Controls within the app, enabling them to opt to 'see less' of certain types of adverts. Topics that Meta's policies already restrict, such as adverts for weight-loss products or alcohol, will default to the 'See Less' option so that teenagers cannot view content that is inappropriate for their age.
Instagram already implemented changes in 2021 limiting advertisers' options to target young users. The new measures follow company research, feedback from child development experts, and global regulation. Meta says the measures are part of its "continued work to keep our apps age-appropriate for teens". Meta acknowledges that young users are less aware of how their data is used for advertising and, more specifically, of how products and services are recommended to them – hence the strategy of further restricting the information shown to young users and limiting advertisers' access to the information used to target them.
Quote of the fortnight:
‘Everybody behaves better if they think journalists are reporting on them.’ - Mrs Justice Lieven, during a briefing on the Transparency Reporting Pilot on 18 January 2023.