Meta (the company behind Facebook, Instagram, and Messenger) has introduced new rules for users under 16 years of age, making social media use safer for teenagers. These changes were first added to Instagram and are now being applied to Facebook and Messenger too.
Why These Changes?
Meta has been under pressure for a long time. Many people have blamed the company for not doing enough to protect young users from online dangers.
Now, Meta wants to show that it is serious about teen safety by giving parents more control and improving privacy settings for teenagers.
What’s New for Teen Users?
- No Live Videos Without Parents’ Permission
Teens under 16 cannot go “live” on Facebook or Instagram unless their parents allow it.
- Blurred Images in Messages
If someone sends a nude photo in direct messages, the system will automatically blur the image so that the teen doesn’t see it directly.
- Parents Can Monitor Activity
Parents will now have more tools to see and manage their child’s account activity, such as messages and interactions.
Why Is This Important?
Many US lawmakers are trying to pass laws like the Kids Online Safety Act (KOSA), which would require tech companies to make the internet safer for children.
Meta, TikTok, and YouTube are already facing hundreds of lawsuits from schools and families alleging that these platforms are addictive and harmful to kids.
In 2023, 33 US states, including California and New York, sued Meta, accusing it of misleading the public about the dangers of its apps.
When Will These Rules Start?
These rules will roll out over the next few months. Meanwhile, in July 2024, the US Senate approved two important bills to protect kids online:
- KOSA (Kids Online Safety Act)
- Children and Teens’ Online Privacy Protection Act
These bills are meant to hold social media companies responsible for how their platforms affect young people.