Dan Milmo Global technology editor 

WhatsApp criticised for plan to let messages disappear after 24 hours

Children’s charities say change creates a ‘toxic cocktail of risk’ by making detection of abuse more difficult
  
  

Mark Zuckerberg, the head of WhatsApp’s owner, Meta, announced the decision, saying how long messages last should be in users’ hands. Photograph: Francis Mascarenhas/Reuters

WhatsApp users are to be given the option to have their messages disappear after 24 hours, a change that drew immediate criticism from children’s charities.

In a blogpost announcing the change, WhatsApp, which has 2 billion users, said its mission was to “connect the world privately”.

WhatsApp introduced disappearing messages last year, with the option of deleting chats by default after seven days, but from Monday that is being offered in two new timeframes: 24 hours or 90 days. Users will also have the option to turn on disappearing messages by default for all new chats.

Mark Zuckerberg, the chief executive of WhatsApp’s parent, Meta, said on his Facebook page: “Not all messages need to stick around forever.”

The WhatsApp blogpost added: “There is a certain magic in just sitting down with someone in-person, sharing your thoughts in confidence, knowing you are both connecting in private and in that moment.

“The freedom to be honest and vulnerable, knowing that conversation isn’t being recorded and stored somewhere forever. Deciding how long a message lasts should be in your hands.”

Disappearing messages can be turned on by default for all new individual chats by going to settings, tapping “account”, then “privacy”, then “default message timer”. The setting will not affect any existing chats.

UK children’s charity the National Society for the Prevention of Cruelty to Children (NSPCC) said the move was “poorly thought out” and would create a “toxic cocktail of risk” once combined with Meta’s plans for encrypting messaging on all its services including Facebook and Instagram.

“Offenders groom children on open platforms like Instagram before moving them to WhatsApp for further abuse where there is less chance of detection,” said Andy Burrows, head of child safety online policy at the NSPCC.

“This poorly thought out design decision will enable offenders to rapidly delete evidence of child abuse, making it even harder for law enforcement to charge offenders and protect children.”

Burrows added that the combination of disappearing messages and end-to-end encryption – which ensures that only the sender and recipient can view a message’s content, preventing law enforcement and tech platforms from seeing it – would not pass the risk assessment process in the UK online safety bill, which requires platforms to set out the risks their services pose to users for the communications regulator, Ofcom.

In November Meta announced that end-to-end encryption across its services would not be rolled out until 2023 at the earliest, a year later than expected. Announcing the move, Meta’s head of safety, Antigone Davis, said the company would still be able to detect abuse under its encryption plans by using non-encrypted data, account information and reports from users. A similar approach has already enabled WhatsApp to make reports to child safety authorities.

“Our recent review of some historic[al] cases showed that we would still have been able to provide critical information to the authorities, even if those services had been end-to-end encrypted,” she said.

The home secretary, Priti Patel, has been a vocal opponent of Zuckerberg’s encryption plans, saying she “cannot allow” a situation that hampers the ability of law enforcement to tackle “abhorrent criminal acts”.

Last month Ofcom’s chief executive, Melanie Dawes, said social media companies should ban adults from directly messaging children or face criminal sanctions.

 
