Alex Hern 

UK should not legislate to control children’s use of technology, says culture secretary

Matt Hancock refuses to rule out law to protect minors online, but would stop short of French measures
  
  

France has banned mobile phones in classrooms and is to extend the measure to school premises as a whole. Photograph: Monkey Business Images/Rex

The UK culture secretary, Matt Hancock, does not allow his own children to have mobile phones and thinks none should have access to devices overnight, but would not follow the French government’s lead in legislating on the issue.

Hancock, whose brief includes digital issues, said: “Keeping children safe online is mission-critical, and everybody has a responsibility. The parents have a responsibility to ensure that children use technology appropriately. For instance, I allow my children to do their homework online, but I don’t let them on to social media.”

Asked how he polices his own children, Hancock told the Guardian: “They don’t have access to the devices. They don’t have phones. Why do they need phones? They’re children, they’re 11.

“There’s a responsibility for parents, but there’s a responsibility for government too in ensuring that it’s as easy as possible for parents to do this properly, to keep their children safe. So for instance, making sure that internet companies properly police their own terms and conditions is important.”

He dismissed suggestions, however, that the UK government should intervene more directly to control the exposure of children to new technology as France has done. A 2010 law bans phones in classrooms, and the country’s president, Emmanuel Macron, has pushed further. From September, children under 15 will not be able to use phones on school premises at all.

Children and tech

Laws governing children's relationship with technology vary worldwide, and are rarely enforced. The de facto age for many online services is 13, set by the US Children’s Online Privacy Protection Act in 1998, which prevents sites from targeting children, or knowingly allowing children to provide information online without parental consent. The burden of garnering that consent and the low returns for building services for children have meant, however, that providers tend to turn a blind eye to under-13s on their sites, neither catering for them nor policing their presence.

That said, tech aimed more explicitly at children has blossomed recently, and legislation that aims to protect children from potential harm has been passed. Schoolchildren in France are barred by law from using their phones in school.

Such laws are countered by efforts on the part of companies such as Facebook and Google to attract new users while young. Facebook offers Messenger Kids, which lets children speak to contacts vetted by their parents, while Google’s YouTube has a Kids app that offers copious parental controls and the ability to filter videos for all but the most child-safe content – although the filters, which are run by an algorithm, haven’t always been successful, prompting the company to announce a human-curated version.

Proposed guidelines to improve child internet safety in the UK, set out in the Information Commissioner’s Office’s 'Age appropriate design code', include:

  • Disabling 'nudge' techniques designed to keep children online for longer, such as 'streaks' on Snapchat or Facebook 'likes'.
  • Limiting how children’s personal data is collected, used and shared by social media companies.
  • Making “high privacy” the default setting for children using social media platforms, including disabling geolocation tools and targeted advertising as standard, unless there is a compelling reason not to.
  • Requiring social media companies to show that all staff involved in the design and development of services likely to be used by children comply with the code of practice.
  • Introducing robust age verification checks on platforms, or treating all users as if they are children.

Hancock does not think laws are appropriate in these areas. “In some places laws are required. In other places, it’s just that as a society we have to mature to make the most out of this technology, which is amazing and brilliant, rather than use it badly.

“So, for instance, I think it’s really hard to parent in the digital age. It’s one of the hardest times to be a parent because there’s this new technology that often the children understand better than the grownups. And there aren’t norms of how we use it.”

Hancock is more outspoken, however, about regulating the internet in other ways. He reiterated his plans to implement legislation that would force social media companies to take more responsibility for the content on their platforms, and said Britain might go it alone if it was unable to establish an international coalition to support regulation.

“We don’t rule out legislation,” he said. “It is best if this approach is done internationally, but it doesn’t have to be. Because ultimately, we need political accountability over an area of life where there is a huge amount of power in the hands of a small amount of people.”

Hancock was speaking ahead of the launch of London Tech Week, which promotes digital industry in the capital.

The introduction of the European General Data Protection Regulation (GDPR) has tightened up the rules for children online across many popular social networks. The law, which specifies fines of up to €20m or 4% of annual global turnover, whichever is higher, reiterates a standard legal minimum age of 13 for children on the internet. It also sets the age at which users can be deemed to give “consent” for data processing: the default is 16, although countries can legislate to lower it to a minimum of 13. The Data Protection Act 2018 did just that, lowering the age in the UK to 13.

The prospect of higher penalties has led some social networks to tighten up their practices. As a result of the new rules, Twitter decided to delete the accounts of anyone who had joined the service as a child during the period when the company did not ask for a birthdate at sign-up, even if they are now adults.

The government has offered little detail about what form its future social media legislation might take. Speaking to the BBC in May, Hancock said it may involve fines for companies that fail to act on bullying and harassment on their platforms, with a scale equivalent to that of the GDPR.
