House of Lords Communications Committee: “Regulating in a digital world”
The Government's Digital Charter identified its key priorities, including protecting people from harmful content and behaviour, the legal liability of online platforms, and data and artificial intelligence.
In January 2018, the Government published its Digital Charter. In response to the Internet Safety Strategy Green Paper, it announced that new laws would be created to “make sure the UK is the safest place in the world to be online” and committed to the publication of the forthcoming Online Harms White Paper. The latter is expected to address a number of topics covered by the Committee’s report, including age verification for social media companies.
On 9 March 2019, the House of Lords’ Communications Committee published its report “Regulating in a digital world”. The report was far-reaching and included some high-level objectives, but with little thought as to how they might be implemented. It is not yet clear which, if any, of the recommendations will be incorporated into the Government’s White Paper.
The Lords’ Committee posited that existing law and regulation affecting the provision and use of digital services was inadequate and it therefore proposed the creation of an overarching super-regulator, the Orwellian-sounding “Digital Authority”, which would not only co-ordinate non-statutory organisations and existing regulators but also have overarching powers in relation to the latter.
A new joint select committee is also proposed, to cover all matters related to the digital world and specifically oversee the Digital Authority, “to create a strong role for Parliament in the regulation of the digital world”.
This and all other regulators would be governed by a commitment to 10 key principles, many of which appear to be drawn from the obligations imposed under the General Data Protection Regulation and the Data Protection Act 2018:
1. parity
2. accountability
3. transparency
4. openness
5. privacy
6. ethical design
7. recognition of childhood
8. respect for human rights and equality
9. education and awareness raising
10. democratic accountability, proportionality and evidence-based approach.
The principle of parity was illustrated with the example that social media platforms should face the same obligations in relation to the imposition of age-based access restrictions as providers of online pornography.
Liability of social media platforms
The Lords’ Committee considered that the hosting and curation of content which can be uploaded and accessed by the public meant that a notice and takedown model was no longer appropriate. The Committee recommended revising or replacing the protections under the E-Commerce Directive 2000/31/EC, but rejected the imposition of strict liability.
Obligations of social media platforms
Arguing that the moderation processes employed by social media platforms “are unacceptably opaque and slow”, the Lords’ Committee recommended that online services hosting user-generated content “should be subject to a statutory duty of care and that Ofcom should have responsibility for enforcing this duty of care, particularly in respect of children and the vulnerable in society”, which should incorporate moderation services and an obligation to achieve safety by design.
The Committee did not accept the evidence calling for external adjudication of complaints or even judicial review of online moderation. Although the Committee did not seek to articulate the scope of the duty, in February the Children’s Commissioner published a draft statutory duty of care proposed to be applicable to any online service provider. It would require providers to “take all reasonable and proportionate care to protect [anyone under the age of 18] from any reasonably foreseeable Harm”, defined as “a detrimental impact on the physical, mental, psychological, educational or emotional health, development or wellbeing” of children. Liability for the acts of third parties could be avoided only if the provider had done “all it reasonably can to prevent Harm”. The factors by which the discharge of the duty should be determined, such as the speed of responding to complaints (legitimate or otherwise), are not proposed to be limited in their application to children, and would therefore have the effect of imposing wider obligations vis-à-vis all users of the service regardless of impact.
Concerned about the creation of data monopolies and the consequences for consumer protection, and (perhaps surprisingly) comparing online service providers to utility providers, the Committee recommended that the consumer welfare test be broadened beyond its focus on consumption and price, and that a public interest test be applied to data-driven mergers.
The design and transparency of algorithms was of particular concern to the Committee.
In an example of a differentiation between acceptable conduct online and offline, the Committee disapproved of the use of technology to take advantage of psychological insights to manipulate user behaviour, for example to encourage time spent using a service. While psychological insights have long been a tool utilised by the retail sector, for example, and even by the government itself through David Cameron’s “nudge unit”, the Committee suggested that ethical design required that “individuals should not be manipulated but free to use the internet purposefully”. The Committee recommended that the ICO should produce a code of conduct on the design and use of algorithms, potentially working with the Centre for Data Ethics and Innovation to establish a kitemark scheme, and that it should also have powers of audit backed by sanctions.
The Committee also recommended that greater transparency around the use of algorithms and the data they generate be achieved by requiring service providers to publish information about that data and its use, as well as by affording users an enhanced right of subject access. The Committee proposed that the former be applicable to both data controllers and data processors.
Terms and conditions
The transparency, fairness and age appropriateness of terms and conditions was also a key focus for the Committee, which suggested that these should be subject to regulatory oversight, with any service provider that breached its terms of service being subject to enforcement. Far from encouraging service providers to offer gold-standard service commitments, this could deter them from doing so for fear of being penalised for failing to meet them, and could result in a lower common standard.
Why is it important?
While many of the Committee’s proposals are likely to be welcomed in some quarters, the practicality of designing and implementing them, and the impact they would have on the majority of users and the provision of services, mean that they warrant at least further scrutiny, if not revision or rejection, if the Government is to achieve the “right regulation”.
The proposals are intended to ensure that unlawful conduct is treated consistently whether online or offline, and there is a stated commitment not to limit free speech or lead to unjustified censorship. However, they extend far beyond the regulation of what is unlawful and trespass on what is deemed to be harmful or anti-social. The proposals would impose more stringent restrictions on the online space than other forums for public discourse, potentially threatening an undue restriction on freedom of expression. They also fail to articulate what constitutes an “online harm”. It is unacceptable to put online service providers in the position of legal adjudicators, with the threat of sanction if they are deemed not to be delivering in the desired manner.
By proposing to regulate the terms and conditions of user services, apparently without seeking to set minimum standards, the Committee risks subjecting the most responsible platforms to the greatest regulation by virtue of seeking to enforce their terms and conditions.
Any practical tips?
Online service providers do not need to make drastic alterations yet, but should be aware of what these changes will mean for their business if they are realised in the White Paper.