Meta – the parent company of Facebook – has announced new measures to restrict content related to suicide, self-harm, and eating disorders from its teenage user base.
The Hill reports that Meta, the parent company of social media giants Facebook and Instagram, has recently announced an important update to its youth safety and privacy policies. The goal is to protect young users from harmful content related to self-harm and eating disorders.
The update expands Meta’s existing policy, which banned recommending such content in teen users’ Reels and Explore pages on Instagram, though the material reportedly remained a major problem on the platform.
Now, Meta is going a step further by ensuring that content on these sensitive topics will not appear in teens’ Feeds and Stories, even when it is shared by accounts they follow. This is a significant change, given widespread concerns about the impact of social media on the mental health of young users.
If a teenage user searches for terms associated with the restricted topics, Meta will instead direct them to expert resources for help. Meta also plans to apply its strictest content control settings to all teen users on Instagram and Facebook. Previously, these settings were applied only to new teen users when they joined the platforms; now they will be extended to all existing teenage users.
The update also includes notifications prompting teen users to review their privacy settings. An option to “turn on recommended settings” will be offered, which, when enabled, automatically adjusts their settings to limit who can repost their content or tag and mention them.
These measures aim to give teenagers more control over their online presence and interactions.
Read more at The Hill here.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.