No one, not even its biggest supporters, thought that last year’s Online Safety Act was perfect. Striking a balance between free expression and digital access on the one hand, and the prevention of various kinds of harm on the other, was a hugely challenging task. The resulting legislation has been criticised as confused. But it was an important first step towards regulating some of the world’s most powerful companies, and forcing them to comply with values enshrined in laws made in parliament, not boardrooms.
Much of the detail was left to Ofcom, the communications regulator. This was a mistake by the last government, which should have adopted a broader-brush approach rooted in principle rather than process. The narrowly technocratic code issued by Ofcom on Monday has compounded the error. Rather than filling in the gaps in the act, Ofcom has left loopholes for online predators and for the businesses that profit while enabling them. Particularly dismaying is the omission of measures targeting sites that promote suicide and self-harm.
Peter Kyle, the secretary of state for science, innovation and technology, had the power to overrule Ofcom on key decisions. It is disappointing that he chose not to tighten the rules governing design features such as attention-greedy algorithms, messaging and age assurance. Currently the balance is tilted too far towards policing content rather than the design choices that amplify it. And it is unduly lenient to allow smaller providers to operate under far laxer rules.
In Australia, a new law will heavily restrict access to social media for children under 16. In the UK, by contrast, most civil society groups and politicians have taken the view that the internet must be made safer rather than placed off-limits. Schools and even nightclubs have set rules on smartphone use on their premises. Charities such as the Molly Rose Foundation, the NSPCC and the 5Rights Foundation have pressed for effective regulation rather than bans. After this week, the question they and others are rightly asking is whether Ofcom is up to the task.
The self-serving utopianism with which Meta, Alphabet (owner of Google), X and others marketed their products convinces few these days. Not many would deny that the darkest aspects of human nature and culture thrive on these platforms alongside their myriad positive social uses. Violent material, including child abuse imagery, does grave harm to the workers who must moderate it, as well as to users. And when online harms ripple out from screens to shape behaviour, teachers, police and parents are left to deal with the consequences.
Campaigners were right to argue for a general duty of care, and for a system in which businesses are accountable for measurable improvements, rather than the current framework under which they need only follow a code. Harm reduction, rather than compliance, should have been the overarching goal. Ministers should have faced down the threat of legal challenges, and insisted on their prerogative to make people, and particularly children, their priority.
A second code, due next year, will set out how social media companies should act in a crisis such as the summer’s riots, when online rumours helped to provoke racist violence. Individuals have rightly been held to account for those crimes. The government must strengthen its resolve and insist that the corporate facilitators of this and other harms are held accountable too. The alternative is cowardly complicity.