Online safeguarding has become far more complex for schools as the intersections between education and digital environments continue to multiply. While the internet can be leveraged to the great benefit of students, their digital lives outside online educational environments can, and often do, also affect their offline lives. Exposure to harmful content and contact with others online, as well as the consequences of students’ own behaviour, can harm children and young people. Increasingly, the algorithms and predictive analytics developed by social media apps compound these problems: for example, by recommending that adult strangers connect with children whose content they have viewed, or by pushing pro-anorexia content to young people with eating disorders. EdTech apps that use trackers can contribute by sharing or selling data about the children who use them. The management of such risks and their impacts has come to pose a significant challenge to schools globally.
School staff, including teachers and administrators, are having to support their students as they navigate the world of social media, which can involve cyberbullying and sexting. Social norms around such issues are evolving as they become more prevalent online: these behaviours are becoming increasingly normalised, to the detriment of children and young people everywhere. Students and their teachers alike are confronting the complexities of an ever more digitised social life, and school communities must do so without any common understanding of best practice, or any standard approach, for handling the online harms that children face.
Existing approaches tend to rely on parental or school oversight via easily circumvented parental controls that require minimal active engagement: providing and withdrawing consent to children’s access to certain platforms is neither readily offered nor easily done when it is made available. Age-gating as it currently exists is similarly easy for children and young people to bypass: self-declaration of age is the most widely used mechanism and the simplest to cheat.
If platforms, children and responsible adults (parents and teachers alike) could participate in a common approach, whereby (1) platforms are made aware of users’ age bands in a privacy-preserving manner and (2) parents and educators can give or deny consent to their children’s access to certain features (and/or the processing of their data), many of the problems posed by content, contact and conduct online could be avoided.
Indeed, this was the central proposition of a recent Government-run programme of work, the Verification of Children Online (VoCO) project. Its guiding hypothesis was that, “If platforms could verify which of their users were children, then as a society we would be better empowered to protect children from harm as they grow up online...” The programme involved a series of technical trials with TrustElevate, BT, the Football Association and Trackd, a music app. It determined that age assurance, provided by TrustElevate, such that platforms know the ages of their users, was desirable, feasible and proportionate. The project ran in 2020, a year in which a major wave of activity in the online harms regulatory and policy spheres began.
Momentum has continued to build since then, and we are now facing a major turning point: the publication of the ICO’s Age-Appropriate Design Code, which elaborated on Safety by Design practices in relation to the risks children face online, and the recent adoption of General Comment No. 25 on the UN Convention on the Rights of the Child, which focuses on the digital environment. The comment provides both states and businesses with guidance on the legislative, policy and other measures needed to ensure full compliance with their obligations under the Convention, and addresses the opportunities, risks and challenges in promoting, respecting, protecting and fulfilling all children’s rights in the digital environment.
This marks a significant shift by legislators, policymakers and businesses toward a more child-centric approach to the duty of care owed to users. Certainly, what is deemed acceptable when it comes to children’s rights, which should supersede commercial imperatives, is changing. Alongside these developments, a number of class action lawsuits have been brought against major platforms, including TikTok and YouTube, which are set to transform how those platforms deal with underage users.
The UK is currently finalising a new online safety regulatory framework, which grew out of the Online Harms white paper.
It will be a landmark piece of regulation that drastically changes how we as a nation approach content, contact and conduct harms, moving away from self-regulation by online companies toward a consolidated, regulator-enforced strategy.
Ofcom has been designated as the regulator and will have the power to fine companies that fail in their duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher, and to block non-compliant services from being accessed in the UK. The role of Ofsted in ensuring schools’ alignment with the new legislation is still being determined.
In this new era, it is vital that those adults responsible for the wellbeing of children and young people, including, of course, teachers and school staff, are made aware of the ways in which online safeguarding is evolving.
At the upcoming Schools and Academies show, supported by the Department for Education, TrustElevate’s founder, Dr Rachel O’Connell, will address the topics outlined above from the perspectives of schools and their roles and responsibilities concerning online safeguarding in this evolving landscape.
If you are interested and would like to be involved, please follow this link: https://schoolsandacademiesshow.co.uk/. If you are already registered, here is a link to the session we are hosting: https://saashowonline.app.swapcard.com/event/the-schools-and-academies-show-and-edtech-update/planning/UGxhbm5pbmdfNDIwOTI3. And if you have any questions, please contact firstname.lastname@example.org.
TrustElevate is working in partnership with BESA and is an official LendED provider; sign up to use our technology for free here: https://www.lended.org.uk/product/keeping-children-safe-online-trustelevate/.
Dr Rachel O’Connell, founder of TrustElevate