Anger as WhatsApp lowers age limit to 13 in UK, EU
Child safety group says Meta ‘putting profits before protecting children’
(Web Desk) - Campaigners have reacted with anger to the social media company Meta lowering the minimum age for WhatsApp users from 16 to 13 in the UK and EU.
The change was announced in February and came into force on Wednesday. The campaign group Smartphone Free Childhood said the move “flies in the face of the growing national demand for big tech to do more to protect our children”.
It said: “Officially allowing anyone over the age of 12 to use their platform (the minimum age was 16 before today) sends a message that it’s safe for children.
“But teachers, parents and experts tell a very different story. As a community we’re fed up with the tech giants putting their shareholder profits before protecting our children.”
WhatsApp said the change brought the age limit in line with that in the majority of countries, and that protections were in place.
Ofcom’s director of online safety strategy, Mark Bunting, said the regulator would not hesitate to fine social media companies that failed to follow its directions, once it had the power to do so.
He told BBC Radio 4’s Today programme that Ofcom was writing codes of practice for enforcing online safety. “So when our powers come into force next year, we’ll be able to hold them to account for the effectiveness of what they’re doing,” he said.
“If they’re not taking those steps at that point, and they can’t demonstrate to us that they’re taking alternative steps which are effective at keeping children safe, then we will be able to investigate.
“We have powers to direct them to make changes, if we believe changes are necessary to make.
“If they don’t comply with those directions, we do have powers to levy fines – and we won’t hesitate to use those powers – if there’s no other way of driving the change that we think is needed.”
This week Meta, which also owns Facebook and Instagram, unveiled a range of safety features designed to protect users, in particular young people, from “sextortion” and intimate image abuse.
It confirmed it would begin testing a filter called Nudity Protection in Instagram direct messages, which will be switched on by default for users under 18 and will automatically blur incoming images detected as containing nudity.
Users who receive nude images will also see a message urging them not to feel pressured to respond, along with an option to block the sender and report the chat.