Online platforms - Internet Safety Strategy green paper

Published on 18 December 2017

The internet is now all but ubiquitous, and there are growing public concerns about online safety. The issues range from online trolling, to hate speech, to location-sharing within social media platforms.

The development

On 11 October 2017, the government published its Internet Safety Strategy green paper as part of its strategy to ensure that Britain is the safest place in the world to be online. The green paper is underpinned by three principles:

  • what is unacceptable offline should be unacceptable online
  • all users should be empowered to manage online risks and stay safe
  • technology companies have a responsibility to their users.

Social media code of practice

In 2018 the government will issue a new, voluntary social media code of practice, as required by the Digital Economy Act 2017. This will seek to ensure that providers offer adequate safety policies, introduce minimum standards and ensure regular review and monitoring. The code will address bullying and insulting conduct online, as well as other behaviour likely to intimidate or humiliate (under s103(3) of the Digital Economy Act 2017). The government expects this to be a key tool in tackling conduct like trolling. The code will not cover unlawful content, as that is already addressed by the existing legal framework.

The code will also include, as required by s103(5) of the Digital Economy Act 2017, guidance on:

  • maintaining arrangements to enable individuals to notify conduct to the provider
  • maintaining processes for dealing with notifications
  • ensuring relevant matters are clearly included in the terms and conditions for using platforms
  • the information providers give to the public about action they take against use of their platforms for harmful conduct.


The green paper includes questions on the possibility of working with industry to produce an annual internet safety transparency report. This would provide a facility for benchmarking reporting mechanisms, and would be used to guide any future policy interventions in the area.

The transparency report could be used to show:

  • the volume of content reported to companies, the proportion of content that has been taken down from the service, and the handling of users’ complaints
  • categories of complaints received by platforms (broken down by groups, including under-18s, women, LGBT people, and complaints on religious grounds) and the volume of content taken down
  • information about how each site approaches moderation and any changes in policy and resourcing.

Social media levy

The government plans to introduce a levy to support greater public awareness of online safety and to enable preventative measures to counter internet harms. The levy will aim to be proportionate, and not to stifle growth or innovation, particularly for smaller companies and start-ups.

The levy will (at least initially) secure contributions on a voluntary basis. The green paper compares it to the levy set out in the Gambling Act 2005, where in practice the sector provides voluntary contributions and support without the need for legislative input.

Encouraging technology firms to "think safety first"

The Department for Digital, Culture, Media & Sport (DCMS) will work with industry bodies like TechCity UK to support start-ups at the very earliest stages of their development, to help build online safety features into new products from the start.

The consultation emphasises the business benefits of getting online safety right, including simple reporting mechanisms and quick response times to complaints. It gives the example of new apps which lack reporting mechanisms being used to send inappropriate or harassing content. Including reporting from the start would greatly improve customer experience and save the business from future complaints and the need for app redevelopment.

The government will work with app store providers on the most effective way to implement new protections for minors, and to promote clearer, more uniform safety features in app descriptions. The Google Play Store and Apple’s App Store are cited as good examples of platforms promoting safety features, but the green paper welcomes more consistency across the industry.

Digital literacy

The government is keen to promote children’s digital literacy, in part to tackle the growing trend of online behaviour failing to meet the standards we expect from children in the “real world”. It also wants children to be able to critically interpret content they view online, including recognising commercial content and questioning the legitimacy of information (eg fake news). The DCMS and Department for Education expect digital literacy, including online citizenship, to form a compulsory part of the school curriculum.

Support for parents and carers

The green paper is also concerned with empowering parents, carers and teachers in talking to their children about online safety. The government wants to see consistent, easily accessible, well-publicised advice for parents. DCMS plans to work with trusted partners to raise the level of awareness around innovative technology solutions for parents.

Police response to online hate crime

As part of the Internet Safety Strategy, the Home Office will create a new national police online hate crime hub, staffed with specialist officers. All reports of online hate crime will be channelled through this hub.

Online dating and networking sites

Recent years have seen an explosion in the popularity of apps and social media services which enable users to make social, romantic, and sexual connections. The green paper notes that while some services are strictly oriented towards adults through their terms and conditions, this is not always enforced.

Primary responsibility for enforcing the law on child sexual exploitation lies with the police. However, the government considers that companies providing adult-oriented services also have a role: ensuring that their user base is over the age of consent, and preventing solicitation and contact between adults and children. There is also a role for users in identifying at-risk individuals and flagging them to review teams. The government will work with companies providing these services to review processes and procedures.

Online pornography

The Digital Economy Act 2017 introduced requirements for online pornography provided on a commercial basis to be inaccessible to under-18s. The DCMS intends that a system of robust age verification barriers will be in place during 2018. This will create a regulatory framework that will disrupt the business of non-compliant sites. It should reduce the chances of a child stumbling across pornographic content online, and allow every child to develop at the time that suits them.

Why is this important?

Will this be the end of the online Wild West? Clearly, the government is no longer happy to allow users and providers free rein online where it would interfere with the rights of others.

The green paper cites disturbing statistics. For example, in the past year almost one fifth of 12-15 year olds encountered something online that they “found worrying or nasty in some way”. Surveys show that parents are now more worried about their children sexting than they are about them drinking or smoking.

The government is keen to work with industry, and much of its strategy relies on voluntary contributions from the sector. Although participation is not compulsory, companies that are slow to act will risk serious negative publicity.

Any practical tips?

Businesses which stay ahead of the curve and implement innovative safety features will reap the rewards in customer engagement and regulatory approval.

Companies should prepare to work with government bodies and regulators, and to be visible in their support for new internet safety initiatives.

The green paper can be found here.