Across the UK in the last three years, 22 police forces have cumulatively reported an increase of 44%, or 7,000 sexual offences, involving children and young people in which social media applications were used to facilitate the contact. Some of the reports have involved children as young as three years old. A separate, but related, concern is the access to inappropriate material that children and young people can obtain through social media and the internet.
The stance of social media companies is perceived as opaque. It is not clear how many complaints the social media companies receive, how many people (if any) they employ to moderate sexual content, what the vetting and training procedures for those employees are, or what the details of the companies' moderation policies are. This approach is permitted by the existing EU directive, which only requires companies to remove illegal content expeditiously once it has been brought to their attention. Notably, concern has recently been raised in the UK about the ability or willingness of the world's largest social media organisation to comply with that obligation.
The NSPCC last year published a position paper proposing a statutory code of practice which would, subject to negotiation with all industry organisations, build on six principles:
- Managing the content on a service – the onus should be on the social media or communications provider to take increased control of the content it deems acceptable on its platform, with additional clarity on minimum age limits and stronger requirements for suitable age verification and identity authentication processes.
- Parental controls – there should be a requirement to implement suitable parental controls for the type of service offered.
- Privacy controls – providers should give users suitable choices about how their personal information is used, offer privacy settings that include privacy by default, put stricter measures in place for young children and offer support to help users understand the implications of sharing information.
- Dealing with abuse/misuse – providers should make it easy for users to report abuse and should have a triage system in place to deal with content reports.
- Dealing with child sexual abuse/illegal content – providers should offer standardised functions for reporting online child abuse images and illegal sexual contact, have specialist teams and escalation systems in place, and keep users informed about how they can report online child abuse messages.
- Education and awareness – providers should take responsibility, as part of the user experience, for educating children about safety.
Any code of conduct would, of course, have to be flexible, and reviewed and amended regularly, in order to keep pace with the rapidly changing world of social media.
There are many organisations, including those engaged in the provision of social media platforms and communications, which are working to keep children safe, but there is much to be done to improve the current situation. There is no over-arching code of conduct, and the potential for confusion and inappropriate use remains. Some progress is being made, such as the Digital Economy Act 2017, which received Royal Assent on 27 April 2017. This included provisions aimed at preventing access to pornographic material by people under the age of 18: requiring age verification for access to sites and applications, introducing civil penalties for online providers of such material who do not verify the age of customers, and granting the British Board of Film Classification powers to require ISPs to block access to non-compliant material. It remains to be seen, however, how effectively those provisions will be resourced and enforced.
Any statutory changes or a code of conduct will take time to introduce and will not be without challenge. For now, in relation to children and the applications they routinely access, education, awareness and parental controls are the most immediate and effective brakes on misuse.
Written by Sarah Firth, associate at BLM