By Zohaib Shah ⏐ 4 months ago ⏐ 2 min read

Meta is enhancing child safety on Instagram by extending new protections to accounts that feature children but are managed by adults. Children under 13 cannot create Instagram accounts, but parents and managers often run accounts that share content of their children. Although most of these accounts are used safely, some have faced inappropriate interactions, and Meta is now taking stronger action to prevent this.

Soon, these accounts will receive Instagram’s strictest message settings, blocking unwanted direct messages from strangers. Meta will also enable “Hidden Words” by default, a tool that automatically filters harmful or unwanted comments on posts. To further improve safety, Meta will stop recommending these accounts to users who have been blocked by teens on Instagram.

It will also make these accounts harder to find in search and reduce their exposure to suspicious users, and comments from certain adult users will remain hidden. Earlier this year, Meta removed over 135,000 Instagram accounts for breaking safety rules related to children’s content, and deleted more than 500,000 associated accounts across Instagram and Facebook.

Meta launched teen account protections in 2023, automatically applying strict privacy settings to users aged 13–18, and extended these protections to Facebook and Messenger in April. The company is also testing AI tools to detect adults who falsely claim to be over 18, helping ensure the appropriate safety settings are applied.

Meta says it remains committed to strict enforcement and ongoing improvements in safety for young users.