Meta Rolls Out Teen Accounts on Facebook and Messenger Globally
Meta is expanding its Teen Accounts to Facebook and Messenger globally, bringing enhanced parental controls and built-in protections to younger users. Teens will see less sensitive content and face new limits on who can contact them.
Meta announced on Thursday that it is expanding Teen Accounts to Facebook and Messenger globally, following an initial rollout in the U.S., U.K., Australia, and Canada. Teen Accounts, first launched on Instagram last fall, come with built-in protections and parental controls designed for younger users.
The introduction of Teen Accounts came shortly after Meta and other major social networks faced criticism from U.S. lawmakers for not providing adequate protections for teens using their platforms.
With the global expansion to Facebook and Messenger, teens will automatically be placed into a controlled experience intended to reduce exposure to inappropriate content and limit unwanted contact. For users under the age of 16, parental consent is required to alter any account settings.
Additionally, teens will only be able to receive messages from people they follow or have previously messaged. Their Stories will be visible only to friends, and tags, mentions, and comments will be restricted to friends or people they follow. Teens will also be reminded to leave the apps after one hour of daily use, and they will be placed in "Quiet Mode" overnight.
This global expansion of Teen Accounts comes amid findings from a Meta whistleblower’s research, which uncovered that children and teens remain exposed to potential online harm on Instagram, even after the company introduced these new protections. The research indicated that despite Teen Accounts, young users can still encounter harmful content, including suicide and self-harm posts, as well as graphic sexual content. Meta has disputed these findings, claiming that its new measures have reduced the exposure of teens to such harmful material.
On the same day, Meta also unveiled its School Partnership Program, which lets educators report safety concerns such as bullying directly to Instagram for faster review and removal. Meta piloted the program earlier this year with positive feedback from participating schools, and all U.S. middle and high schools can now join to receive prioritized reporting and access to educational resources. Participating schools will display a banner on Instagram to let parents and students know they are official partners.
Thursday's announcement is part of Meta's ongoing efforts to address the mental health concerns related to teens' social media use. These concerns have been raised by the U.S. Surgeon General and several state governments, some of which have started implementing restrictions on teens' use of social media without parental consent.