Monday, August 18, 2025

Meta Expands Child Safety Features on Instagram


Meta has announced new child safety features for Instagram aimed at protecting children featured in adult-managed accounts. The updates prevent accounts that primarily feature children from being recommended to potentially suspicious adults, such as those who have previously been blocked by teen users. The changes also hide comments from suspicious adults on posts featuring children and make it harder for such accounts to find one another through search. The measures follow a 2023 lawsuit alleging that Meta's platforms facilitated the spread of child sexual abuse material, as well as a Wall Street Journal investigation that found Instagram's algorithm promoting networks of pedophile accounts.

The expansion builds on existing safety features for teen accounts, which are now being extended to adult-managed accounts featuring children. Meta says that while most of these accounts are benign, the additional protections are intended to further shield children from potential exploitation. Future updates will include stricter default message settings and improved reporting options within Instagram DMs.

Impact Statement: These changes aim to improve child safety on Instagram by limiting the exposure of accounts featuring children to potentially harmful individuals.