Instagram parent company Meta has introduced new safety features aimed at protecting teens who use its platforms, including information about accounts that message them and an option to block and report accounts with one tap.
The company also announced Wednesday that it has removed hundreds of thousands of accounts that were leaving sexualized comments on, or requesting sexual images from, adult-run accounts featuring kids under 13. Of these, 135,000 were leaving comments and another 500,000 were linked to accounts that “interacted inappropriately,” Meta said in a blog post.
The heightened measures arrive as social media companies face increased scrutiny over how their platforms affect the mental health and well-being of younger users. That includes protecting children from predatory adults and scammers who ask them for nude images and then extort them.
Meta said teen users blocked more than a million accounts and reported another million after seeing a “safety notice” that reminds people to “be