The internet must be made safe for children

25 January 2019

Regulation — and, where necessary, legislation — are needed to address moral decay, argues Stephen Cottrell

AN UNREGULATED digital environment is causing moral decay. The various harms that are being caused are deep-seated, corrosive, and pervasive. Many young people report that their first knowledge of sex comes from the internet. They gamble online. An alarming increase in mental-health issues among children is linked to bullying, which is magnified on the internet. If most images of the human body which you see are airbrushed to so-called perfection — and if you see these images at far too young an age — then little wonder you consider your own body second-rate.

The Children’s Society tells us that 61 per cent of children first created their social-media accounts before the prescribed age limit of 13. A recent OFCOM survey found that, by the age of 12, half of all children have some sort of social-media profile. This puts children at much greater risk, because these platforms are not designed with their use in mind. Just this month, I was at a school in Essex talking to seven- to 11-year-olds about their use of the app TikTok. All of them were using it, but the lower age limit is supposed to be 13.

The fundamental issue that needs to be tackled is that the digital world assumes that all users are equal and that all users are adults — whereas, in fact, one third of users worldwide are children. Therefore, their health, well-being, and development require us to ensure that the internet, and the many ways in which children access it, are as safe as they can be. This has usually meant creating special safe places for children, or safety options that can be activated.

WOULD it not be better to turn this whole approach on its head? With any other public space — be it a cinema, a shopping mall, or a city square — the assumption is that this is a safe place for all ages to gather, and it is, therefore, safe for children.

This is made possible by regulation and, where necessary, legislation. Then we go on to create dedicated spaces for adults — not the other way round. In the cinema, this is done through film classification. In a public park or a city square, it is done through public-order legislation.

But the internet is a public space. Indeed, for children and young people, it is the public space. Nor do they separate their online and offline lives in the way that we adults do. This means that regulation and guidance to make the internet safe by design are all the more necessary.

Far from inhibiting the internet, as some vested interests claim, regulation will enable it to be the democratic, creative, and liberating space that it is meant to be. It is the lack of regulation that makes it dangerous and debilitating. Achieving a common standard does not make the internet restrictive for adults: it just means that we apply the same principles to all parts of our common life.

Let me put it another way. In the 1970s, we added fluoride to water and to toothpaste. Dental hygiene was transformed. We stopped dealing with the symptoms of tooth decay and designed a way of improving everybody’s health.

There is an important philosophical question here. What sort of world do we wish to build in this digital age? It is no good shrugging our shoulders and saying that it is all too difficult. Nor is it acceptable for Facebook, Google, and Amazon to say that they are not to blame, that they are just platforms. They carefully curate the way in which we receive the information that they gather. This at least gives them a powerful editorial voice, and any democracy should be concerned when editorial voices belong to the ones who can pay the most.

They do what they do for commercial purposes. Our data is their currency. But, increasingly, they are publishers as well — and their big bucks distort the whole eco-system of our media economy. It is hard to imagine that we would tolerate the creation of such monopolies in any other industry.

THE forthcoming report of the House of Lords Select Committee on Communications, on which I am privileged to sit, will present recommendations about how the internet can be regulated. It will say that the internet must be safe by design, and that children need to be taught how to inhabit it. Without this, we sell them short, and allow the liberating genius of the internet to be compromised and stymied.

In other walks of life, if it were your child in the pub or in the betting shop, or flicking through a pornographic magazine that some adult had happily handed over to him or her, or if your child’s world view were shaped by an increasingly narrow echo-chamber of gossip, speculation, and fake news, you would do something about it, and you would want something done. That is our job. We need to find a way of putting fluoride in the internet.

The Rt Revd Stephen Cottrell is the Bishop of Chelmsford.
