Instagram has announced that it will introduce a new feature to notify parents if teenage users conduct repeated searches on topics related to suicide or self-harm.
AzEdu.az, citing foreign media, reports that this step comes amid increasing global regulatory scrutiny of the impact of Meta's social media products on teenagers.
The new feature will initially be rolled out in Australia, Canada, the United Kingdom, and the United States. Families who have activated the "Parental Supervision" function will, if their children conduct such searches, receive expert advice and referrals to support services related to their children's safety.
Meta stated that the system is intended to keep parents informed about their children's search activity and to provide them with appropriate support resources. The company is also weighing whether teenagers should receive support through artificial intelligence and whether such conversations should be reported to parents.
This step comes amid ongoing legal pressure on Meta and other social media platforms accused of making teenage users addicted. In a lawsuit being heard in Los Angeles involving more than 1,600 plaintiffs, it is alleged that Instagram, YouTube, TikTok, and Snap were deliberately designed to be addictive.
In recent years, Meta has introduced a series of measures aimed at improving teenagers' safety. In 2024, a feature called "Teen Accounts" was introduced, and in 2025, a system restricting teenagers' access to certain content was launched. Despite this, the platform is still sometimes accused of taking insufficient measures in cases such as sextortion.