NewsWorld

Instagram now alerts parents if their teen searches for suicide or self-harm content

TechCrunch · Feb 26, 2026 · Collected from RSS

Summary

Parents will be informed if their teen searches for suicide or self-harm content and offered resources.

Full Article

Instagram will start alerting parents if their teen repeatedly tries to search for terms related to suicide or self-harm within a short period of time, the company announced on Thursday. The alerts are launching in the coming weeks for parents who are enrolled in parental supervision on Instagram. The Meta-owned social platform says that while it already blocks users from searching for suicide and self-harm content, the new alerts are designed to make sure parents are aware if their teen is repeatedly trying to search for this content so that they can support their teen.

Searches that may trigger an alert include phrases encouraging suicide or self-harm, phrases indicating a teen might be at risk of harming themselves, and terms such as “suicide” or “self-harm.” Instagram says parents will receive the alert via email, text, or WhatsApp, depending on the contact information they’ve provided, along with an in-app notification. The notification will include resources designed to help parents approach conversations with their teen.

The move comes as Meta and other big tech companies face several lawsuits seeking to hold social media giants accountable for harming teens. During testimony this week in a lawsuit before the U.S. District Court for the Northern District of California, Instagram head Adam Mosseri was grilled by plaintiffs’ attorneys in an ongoing social media addiction case over the app’s delayed rollout of basic safety features, including a nudity filter for private messages to teens. Additionally, during testimony in a separate lawsuit before the Los Angeles County Superior Court, it was revealed that an internal research study at Meta found that parental supervision and controls had little impact on kids’ compulsive use of social media. The study also found that children who faced stressful life events were more likely to struggle with regulating their social media use appropriately.
Given the ongoing lawsuits accusing the company of failing to protect teens on its platforms, the timing of these new alerts isn’t exactly surprising. The company notes that it will aim to avoid sending the notifications unnecessarily, as overuse could reduce their overall effectiveness.

“In working to strike this important balance, we analyzed Instagram search behavior and consulted with experts from our Suicide and Self-Harm Advisory Group,” Instagram explained in a blog post. “We chose a threshold that requires a few searches within a short period of time, while still erring on the side of caution. While that means we may sometimes notify parents when there may not be a real cause for concern, we feel — and experts agree — that this is the right starting point, and we’ll continue to monitor and listen to feedback to make sure we’re in the right place.”

The alerts are rolling out in the U.S., U.K., Australia, and Canada next week, and will become available in other regions later this year. In the future, Instagram plans to launch these notifications when a teen tries to engage the app’s AI in conversations about suicide or self-harm.



Read Original at TechCrunch

Related Articles

The Verge · about 12 hours ago
Instagram will alert parents if their kids ‘repeatedly’ search for self-harm topics

The alerts will start rolling out to Teen accounts with parental supervision protections next week. Starting next week, Instagram will notify parents if their teen searches for terms related to self-harm or suicide, so they can check on them. Meta says a similar alert system for its AI chatbots is coming later this year. The new Instagram feature sends parents an alert when their child "repeatedly tries to search for terms clearly associated with suicide or self-harm within a short period of time." It's rolling out in the US, UK, Australia, and Canada starting next week, but only for parents and teens who opt in to supervision. It's expected to expand to other regions later this year. "The vast majority of teens do not try to search for suicide and … Read the full story at The Verge.

The Hill · about 12 hours ago
Instagram launches new tool alerting parents about suicide, self-harm searches

Instagram is launching a new tool that will alert parents if their teens repeatedly try to search for terms associated with suicide and self-harm on the platform. The tool, which will roll out in the U.S. and several other countries next week, will flag for parents if their children conduct multiple searches with phrases promoting...

Engadget · about 12 hours ago
Instagram will alert parents if teens repeatedly search for suicide or self-harm content

Instagram is adding a new alert for the parents of teen users of its social media platform. The network will alert the adult if their child repeatedly searches for terms about suicide or self-harm in a short time frame. From that notification, the parent can optionally access resources for having conversations with their teen about these topics. These alerts will begin rolling out for parental supervision users in the US, UK, Australia and Canada next week, with more regions to be added in the future.

"We chose a threshold that requires a few searches within a short period of time, while still erring on the side of caution," Instagram's blog post explains. "While that means we may sometimes notify parents when there may not be real cause for concern, we feel — and experts agree — that this is the right starting point, and we’ll continue to monitor and listen to feedback to make sure we’re in the right place."

The platform reiterated that search results for terms connected to suicide and self-harm are blocked for younger teen users, and that content about those topics is not shown to them under its current policies. Instagram also noted that a similar parental alert feature is in the works for its AI tools, but news on that isn't expected until later this year.