Instagram Implements Private Accounts for Teens Under 18 - Strengthening Safety Measures

Meta Platforms has announced a new set of safety measures aimed at teens, addressing concerns about social media addiction. Instagram accounts for users under the age of 18 will automatically be set to private, with the change expected to roll out in South Korea by January next year. This initiative is part of a larger effort to protect the mental health of young users worldwide.


Concerns about social media addiction are growing, particularly regarding its impact on teenagers. Instagram remains a highly popular social media platform among teens as well as users in their 20s and 30s. Given this popularity, Meta Platforms has come under scrutiny for allegedly designing Instagram and Facebook in ways that encourage addiction, especially among teenagers, raising significant concerns about their mental well-being.

In fact, last October, 33 U.S. states, including California, filed a lawsuit against Meta, claiming that Instagram and Facebook’s addictive features were harming the mental health of teens. The European Union (EU) has likewise launched a formal investigation into whether these platforms foster addiction in minors. Several countries, including the U.S., France, and Italy, have also moved to legislate limits on teen social media use, reflecting growing global concern over the issue.

Amid these developments, Meta recently announced new safety measures for Instagram on September 17th. The core of these measures involves placing restrictions on accounts used by teens under 18. Specifically, Instagram accounts for teenagers will now default to private settings. This means users can only receive messages from followers or people they are already connected with, and sensitive content will be filtered out. Additionally, the platform’s algorithm will no longer recommend sexual content or material related to suicide or self-harm.

To further protect teens, users will receive a warning if they spend more than one hour on the app, and a ‘sleep mode’ will automatically activate between 10 PM and 7 AM, disabling notifications and sending auto-responses. Meta has also strengthened parental control options, requiring parental permission for users under 16 to disable certain settings. Parents can now monitor and limit their child’s Instagram usage time, though users aged 16–17 will retain the ability to turn these settings on or off independently.

Meta is also developing technology to detect when users falsely claim to be adults or create adult accounts on other devices. This system aims to predict whether a user who has set their age as over 18 is actually a teenager.


Meta's decision to implement these private accounts and safety measures for teens reflects the company’s commitment to protecting young users' mental health. While the changes have already taken effect in the U.S., UK, Canada, and Australia, other countries, including South Korea, are expected to adopt them starting in January next year. Although Instagram may see a temporary dip in teen users, Head of Instagram Adam Mosseri emphasized that earning the trust and confidence of parents will ultimately benefit the platform in the long run. It remains to be seen how teens around the world will respond to these new policies, but this is a significant step toward safer social media experiences for younger users.
