Amid rising global pressure to shield children from harmful content, social media giant Meta has decided to restrict more content on Facebook and Instagram.
Meta announced in a blog post that all teens will now be placed under the most restrictive content control settings on its apps, and that additional search terms will be restricted on Instagram.
The changes are expected to roll out in the coming weeks, and Meta said it hoped they would help deliver a more "age-appropriate" experience.
This will make it harder for teens to come across sensitive content, such as material related to self-harm, suicide, and eating disorders, when they use features like Search and Explore on Instagram.
Meta has faced allegations in both Europe and the US that its apps are addictive and are contributing to mental health problems among young people.
Attorneys general of 33 US states sued the social media giant in October 2023, alleging that the company repeatedly misled the public about its platforms' harmful impact.
Regulatory scrutiny of the company intensified after a former Meta employee alleged that it knew about the harmful impact of its content on teens but failed to act.
The European Commission has also asked for information on how Meta protects children from illegal and harmful content.
Young users are especially appealing prospective consumers for brands that advertise on Facebook and Instagram, which aim to influence them at an impressionable age.