THE Government is giving the communications regulator Ofcom new powers to protect children and adults who are using the internet. An Online Harms Bill will be introduced next year to allow the watchdog to block access to online services that fail to do enough to protect children and other users.
Technology giants such as Facebook and Instagram could also be fined large sums if they fail to do enough to deal with the problem, and they will be compelled to publish an audit of how they have tackled posts that are harmful but not illegal.
Some campaigners have complained, however, that the steps do not go far enough: they make no mention of measures against online scams and other types of internet fraud, and the new law would not introduce criminal prosecutions. Secondary legislation could, though, introduce criminal sanctions for senior managers if tech companies fail to meet the new requirements.
Announcing the Bill, the Digital Secretary, Oliver Dowden, told Parliament that the legislation, which should be in force by 2022, represented “decisive action” to protect both children and adults online. He said: “A 13-year-old should no longer be able to access pornographic images on Twitter; YouTube will not be allowed to recommend videos promoting terrorist ideologies; and anti-Semitic hate crimes will need to be removed without delay.”
The Children’s Commissioner for England, Anne Longfield, said that there were signs that new laws would have “teeth”. She continued: “However, much will rest on the detail behind these announcements, which we will be looking at closely.”
Under the proposals, Ofcom would be able to fine companies up to ten per cent of their annual global turnover or £18 million, whichever was greater, if they refused to remove illegal content or failed to satisfy concerns about posts that were legal but still harmful. That would include pornography accessible to children, bullying, and disinformation, such as misleading claims about vaccinations.
The new regulations would apply to any company hosting user-generated content accessible to UK viewers, with certain exceptions, including news publishers’ comments sections and small businesses’ product-review slots. Chat apps, dating services, online marketplaces, and even video games and search engines would all be covered.
The NSPCC, which had pressed for criminal sanctions against senior managers, said that it would be “closely scrutinising” the proposals. The chief executive of the Christian pressure group CARE, Nola Leach, asked why the Government had not implemented the age-verification requirement for online commercial pornography sites included in the Digital Economy Act 2017.
It could be three years, she said, before the new measures came into effect. “In this context, and when Parliament has already passed legislation to protect children from accessing pornographic websites which could be implemented within a matter of months, the obvious way forward is for the Government to now implement Part 3 of the Digital Economy Act as an urgent interim measure.”
Action had been promised in the 2015 Conservative manifesto, she said: “Manifestos are not honoured by passing legislation, but by implementing it.”
Human-rights policy. The Church Commissioners and the Church of England Pensions Board this week supported a call for big tech companies to set out their policies on human rights. The Commissioners and the Board, which have £8.7 billion and £2.8 billion of assets under management respectively, have worked with other large international investors to support the publication by the Swedish Council on Ethics of a document outlining its long-term expectations of how global technology companies should work strategically on human rights.
It demands that tech giants reinforce measures to respect human rights, and align their work fully with the UN Guiding Principles on Business and Human Rights.