Instagram to alert parents when teens search for info on suicide

By: CBS Technology Posted On: February 26, 2026

Meta-owned Instagram will soon alert parents if their teenage child uses the app to search for content related to suicide or self-harm, the technology company's latest effort to shore up safety features as it faces heat over how social media impacts young people. 

Meta said that, starting next week, parents who use Instagram's supervision tools will receive a message via email, text or WhatsApp, as well as an in-app notification, if a teen repeatedly searches for certain terms related to self-harm or suicide within a short time span.

The company said the message will inform parents that teens repeatedly searched for suicide or self-harm content and offer resources on how to approach sensitive conversations around mental health.

"The vast majority of teens do not try to search for suicide and self-harm content on Instagram, and when they do, our policy is to block these searches, instead directing them to resources and helplines that can offer support," the company said Thursday in a news release.

Meta did not specify how many searches will prompt a parental alert, noting only that "we chose a threshold that requires a few searches within a short period of time, while still erring on the side of caution."

The new safeguard will initially roll out in the U.S., the United Kingdom, Australia and Canada before being deployed in other regions later this year, according to Meta. 

In October of last year, Meta also introduced age-based content restrictions that block users under 18 from seeing search results for certain terms, such as "alcohol" or "gore." At the time, Meta said it already shielded teens from search results related to suicide, self-harm and eating disorders.

Meta and YouTube trial

Meta's new safety features come amid an ongoing trial in Los Angeles over whether its platforms, along with Alphabet-owned YouTube, are deliberately designed to addict young users. Meta CEO Mark Zuckerberg last week faced questioning about Instagram's young users and Meta's efforts to boost engagement.

Instagram specifies that users must be at least 13 years old to sign up for its app. At trial, however, Zuckerberg conceded that the rule is hard to enforce because users sometimes lie about their age. To verify users' age, Instagram asks them to submit details such as their birthday, photo identification and a video.


If you or someone you know is in emotional distress or a suicidal crisis, you can reach the 988 Suicide & Crisis Lifeline by calling or texting 988. You can also chat with the 988 Suicide & Crisis Lifeline here.

For more information about mental health care resources and support, The National Alliance on Mental Illness (NAMI) HelpLine can be reached Monday through Friday, 10 a.m.–10 p.m. ET, at 1-800-950-NAMI (6264) or email info@nami.org.
