Meta, which is under a huge amount of pressure over kids’ safety on its platforms, has announced new parental controls for Instagram and Messenger—while also extending an anti-addictive feature for Facebook.
Parents and guardians in the U.S., U.K., and Canada can now check their teenagers’ privacy and safety settings in Messenger, view their contact lists and get updates when someone is added or deleted, and see who’s messaging them. They also get to see how much time their kids are spending on the messaging platform, though they still can’t read their messages.
On Instagram, parents can now see how many friends their teen has in common with the accounts they follow and are followed by. “This will help parents understand how well their teen knows these accounts, and help prompt offline conversations about those connections,” Meta said in a blog post. Insta’s parental supervision tools are also being updated, so teens who block someone are encouraged to let their parents supervise their accounts.
Instagram’s Quiet Mode, which silences notifications for a set period, is also being rolled out globally, having launched in the U.S., U.K., Ireland, Canada, Australia, and New Zealand back in January. Quiet Mode was a Facebook feature that was extended to Insta, but the reverse is also happening: Insta’s Take a Break feature is now crossing over to Facebook, so teens there get a prompt to take some time off after they’ve been surfing Facebook for 20 minutes.
“We’re also exploring a new nudge on Instagram that suggests teens close the app if they are scrolling Reels at night,” the blog post added.
This is all sensible stuff, implemented at a time when lawmakers in the U.S. are preparing to crack down on social media companies’ lackluster protections for their youngest users—there are three competing bills out there. Also, in common with some of that proposed legislation, the Federal Trade Commission wants to stop Meta from monetizing the data of its under-18 users.
Will Meta’s latest changes ward off those threats? Not all of them, certainly—Meta isn’t stopping the monetization of its young users, nor is it instituting age-verification measures. However, some of the changes do specifically preempt the rules that have been proposed.
For example, the Protecting Kids on Social Media Act and Kids Online Safety Act bills would both block Meta from using recommendation algorithms on teens, to mitigate the threat of app addiction. Now Meta’s lobbyists can tell legislators that the company is trying to reduce the addictive use of its platforms. They can also say parents are getting new controls to identify harmful online behaviors, as the latter bill demands.
It’s probably not enough to mollify a Congress where there’s bipartisan support for clipping social media’s wings, but it may help.