Instagram and Facebook are adding more parental controls. Critics say they aren’t enough


Instagram and Facebook's parent company Meta is adding new parental supervision tools and privacy features to its platforms beginning Tuesday, June 27. The changes come as social media companies face increased scrutiny over how they impact teens' mental health. — AP

But many of the features require minors – and their parents – to opt in, raising questions about how effective the measures are.

Instagram will now send a notice to teens after they block someone, encouraging them to let their parents “supervise” their account. The idea is to grab kids' attention when they might be more open to parental guidance.

If a teen opts in, the system lets parents set time limits, see who their kid follows or is followed by, and track how much time the minor spends on Instagram. It does not let parents see message content.

Instagram launched parental supervision tools last year to help families navigate the platform and find resources and guidance. A sticking point in the process is that kids need to sign up if they want parents to supervise their accounts. It's not clear how many teen users have opted in, and Meta has not disclosed any numbers.

By making the feature optional, Meta says it is trying to "balance teen safety and autonomy” as well as prompt conversations between parents and their children.

When families do opt in, supervision allows parents to see how many friends their child has in common with accounts the child follows or is followed by. If the child is followed by someone none of their friends follow, it could raise a red flag that the teen does not know the person in real life.

This, Meta says, "will help parents understand how well their teen knows these accounts, and help prompt offline conversations about those connections.”

Jim Steyer, the CEO and founder of Common Sense Media, called the news a "smoke screen.”

"None of these new features address the negative impact their business model is having on the well-being of kids, including their mental health. We need national privacy laws to protect kids,” Steyer said in a statement.

Meta is also bringing the parental supervision tools already available on Instagram and its virtual reality products to Messenger. The opt-in feature lets parents see how much time their child spends on the messaging service, along with information such as their contact lists and privacy settings – but not who they are chatting with.

Such features can be useful for families in which parents are already involved in their child's online life and activities. Experts say that's not the reality for many people.

Last month, US Surgeon General Vivek Murthy warned that there is not enough evidence to show that social media is safe for children and teens and called on tech companies to take "immediate action to protect kids now.”

Murthy told The Associated Press that while he recognizes social media companies have taken some steps to make their platforms safer, those actions are not enough. For instance, while kids under 13 are technically banned from social media, many younger children access Instagram, TikTok and other apps by lying about their age, either with or without their parents' permission.

Murthy also said it's unfair to expect parents to manage what their children do with rapidly evolving technology that "fundamentally changes how their kids think about themselves, how they build friendships, how they experience the world – and technology, by the way, that prior generations never had to manage.”

"We’re putting all of that on the shoulders of parents, which is just simply not fair,” Murthy said. His office didn't respond to a request for comment on Meta's latest actions.

Also beginning Tuesday, Meta will encourage – but not force – children to take a break from Facebook, just as it already does on Instagram. After 20 minutes, teenage users will get a notice to take time away from the app. If they want to keep scrolling, they can just close the notification. TikTok also recently introduced a 60-minute time limit for users under 18, but they can bypass it by entering a passcode, set either by the teens themselves, or if the child is under 13, by their parent.

"What we are focused on is kind of a suite of tools to support parents and teens on how they can best engage in safe and appropriate experiences online,” said Diana Williams, who oversees product changes for youth and families at Meta. "We’re also trying to build tools that teens can use themselves to learn how to manage and recognize how they’re spending their time. So things like ‘take a break’ and ‘quiet mode’ in the evenings.”

So why not just force children to take a break, rather than making it optional? Williams said the company believes in nudging teens rather than forcing them to disengage because they might be using the apps for things like researching a school paper.

"What we want to do is make sure that they’re recognizing how their time is being spent and whether or not it’s meaningful,” she said. – AP

   
