Meta Introduces New Parental Supervision Tools and Privacy Features
Social media giant Meta bolsters parental supervision tools and privacy features amid concerns over teen mental health.
Meta, the parent company of Instagram and Facebook, is taking steps to address the growing concerns surrounding the impact of social media on teenage mental health.
However, the effectiveness of these measures is being questioned because they require minors and their parents to actively opt in. In an effort to promote parental involvement, Instagram will now send notifications to teenagers after they have blocked someone, encouraging them to allow their parents to supervise their accounts. The aim is to capture the attention of young users at a moment when they may be more receptive to parental guidance.
By opting in, parents gain the ability to set time limits, view their child's followers and the accounts they follow, and monitor the amount of time spent on Instagram. However, message content remains inaccessible to parents. Last year, Instagram introduced parental supervision tools to help families navigate the platform and access resources and guidance. A key challenge is that teenagers must actively sign up if they want their parents to supervise their accounts, and the number of teen users who have opted in remains undisclosed.
This supervision allows parents to identify mutual friends between their child and the accounts they follow or are followed by. If a teenager is followed by someone unknown to their friends, it may raise concerns about the authenticity of the relationship.
Meta states that this feature "will enable parents to gauge their child's familiarity with these accounts and initiate offline discussions about these connections."
Additionally, Meta is bringing existing parental supervision tools from Instagram and its virtual reality products to Messenger. This opt-in feature allows parents to monitor their child's time spent on the messaging service and access information such as contact lists and privacy settings, but not the identities of the people with whom their child is communicating.
While these features can prove valuable for families already engaged in their child's online activities, experts highlight that this level of involvement is not the norm for many parents.
Last month, U.S. Surgeon General Vivek Murthy cautioned that there is insufficient evidence to deem social media safe for children and teenagers, urging tech companies to take immediate action to protect them. While acknowledging that social media platforms have implemented certain safety measures, Murthy emphasised their inadequacy. Notably, despite social media platforms prohibiting users under 13, many younger children access platforms like Instagram and TikTok by falsifying their ages, with or without parental consent.
Murthy further argued that expecting parents to manage rapidly evolving technology that fundamentally alters how children perceive themselves, form friendships and experience the world is unjust, asserting, "We're putting all of that on the shoulders of parents, which is simply unfair."
Starting this Tuesday, Meta will encourage, though not require, children to take breaks from Facebook, mirroring its existing practice on Instagram. After 20 minutes of usage, teenage users will receive notifications prompting them to take a break from the app; if they choose to continue scrolling, they can dismiss the notification. TikTok recently implemented a 60-minute time limit for users under 18, but it can be bypassed with a passcode, set either by the teenagers themselves or, for children under 13, by their parents.
Diana Williams, overseeing product changes for youth and families at Meta, emphasised the company's focus on providing a suite of tools to support parents and teenagers in engaging safely and appropriately online. Meta also aims to empower teenagers to manage their own online experiences with features like "take a break" and "quiet mode" in the evenings.
Meta introduces new parental supervision tools and privacy features for Instagram and Facebook.
Minors and parents must actively opt in, raising concerns about the effectiveness of the measures.
Instagram sends notifications to teens after they block someone, encouraging parental supervision.