Instagram to alert parents over teen self-harm searches

Instagram is to start notifying parents if their teenagers search for suicide or self-harm content on the platform.
It will mark the first time Meta has informed families about this kind of activity, rather than quietly blocking such searches in the background.
In the coming weeks, Instagram will begin notifying parents who use its supervision tools if their child repeatedly attempts to search for suicide or self-harm content within a short period of time.
Parents will receive alerts via email, text, WhatsApp, or the Instagram app, along with expert advice on how to handle what’s likely to be a sensitive and challenging conversation with their child.
Meta acknowledges the system may flag searches even when there is no serious cause for concern, but says it would rather err on the side of caution.
However, suicide prevention charities are worried about the possible ramifications.
The Molly Rose Foundation, set up by Molly Russell's family after the 14-year-old took her own life in 2017 following exposure to harmful Instagram content, described the approach as "clumsy" and warned it could make matters worse.
“These flimsy notifications will leave parents panicked and ill-prepared,” the Foundation’s chief executive Andy Burrows said.
Molly's father, Ian Russell, asked how any parent is supposed to react on receiving such a message while at work.
Charities argue Meta is approaching the problem from the wrong end: the priority should be stopping harmful content from being recommended in the first place.
Recent research suggests Instagram is still surfacing depression- and suicide-related content that vulnerable teenagers can access.
The move comes as governments pile pressure on social media firms to improve child safety on their platforms, with Australia opting to ban under-16s from social media entirely.