-: FOLLOW US :- @theinsaneapp
Meta has announced new measures to restrict the harmful content that teen Instagram and Facebook accounts can see.
The company now aims to protect young users from harmful content related to self-harm, graphic violence, and eating disorders.
Content related to self-harm, suicide, and eating disorders will be automatically hidden from teens' Feed and Stories, even when it has been shared by someone they follow.
Meta will also hide search results related to self-harm and eating disorders and provide expert resources for help instead.
Additionally, all teen accounts will be placed in the most restrictive content control setting by default, making it harder for them to come across sensitive content or accounts.
The company will also send notifications encouraging teens to update their settings for a more private experience.
These changes come amid increased scrutiny and legal challenges faced by Meta regarding the impact of its platforms on young users' mental health.
Meta's actions highlight the ongoing challenge platforms face in managing harmful content that negatively affects users, particularly minors.