Meta announced new safety features for Instagram accounts that are managed by adults but mainly showcase children’s photos and videos. While Instagram prohibits users under 13, adults such as parents or managers often run “kidfluencer” or family vlogging accounts featuring minors.
These accounts now face enhanced protections to prevent abuse and inappropriate contact.
Stricter message settings and comment filters protect child-focused accounts
Meta will automatically apply the platform’s strictest message settings to accounts primarily featuring children. These accounts will have tighter controls on who can send them direct messages, reducing the risk of unwanted or inappropriate contact.
In addition, Instagram’s “Hidden Words” feature will be enabled by default, filtering out offensive or sexualized comments that predators sometimes leave under posts.
To further reduce risks, Meta plans to stop recommending these child-focused accounts to suspicious adults, especially those previously blocked by teens, and will make it harder for such users to find these accounts via search.
Meta emphasizes that while most of these accounts are “overwhelmingly used in benign ways,” some bad actors exploit them, prompting the new protections.
New features also help teens stay safer in direct messages
Alongside protections for kid-focused accounts, Meta is boosting safety tools for teen users on Instagram. Teens will now receive more information about the people messaging them, such as when an account was created, helping them identify potential scammers.
Safety tips will appear in chats, allowing teens to block or report suspicious users with a single tap. These tools empower teens to protect themselves against predators and scams more easily.
Additionally, Meta disclosed it removed about 135,000 Instagram accounts posting sexualized comments or attempting to solicit explicit images from adult-managed child accounts. They also deleted around 500,000 related accounts on both Instagram and Facebook.
The platform recently made teen accounts private by default and restricted their messaging to approved contacts, part of a broader effort to tighten controls.
Through these moves announced on Wednesday, Meta is intensifying efforts to shield young Instagram users from harmful interactions, responding to growing concerns about online safety and mental health risks linked to social media.