Instagram’s new safety measures include “nudging” teens away from harmful content, telling them to “take a break”
Facebook is rolling out new safety measures for Instagram to help improve the experience vulnerable teens have on the app.
In an interview with CNN, Nick Clegg, Facebook’s vice president of global affairs, said that the company will allow adults to monitor what their children are doing online if they choose to do so, “nudge” users who are looking at harmful content toward other types of content, and prompt teenagers to “take a break” from Instagram.
These measures were among the possible solutions floated by Instagram head Adam Mosseri last month, when he announced that the company was halting work on a version of the app for children under 13. Clegg’s announcement on Sunday seemed to confirm the measures as part of Facebook’s plans.
“We can’t change human nature,” he added. “You always compare yourself to others, particularly those who are more fortunate than yourself, but [what] we can do is change our product, which is exactly what we’re doing.”
Clegg also insisted that Instagram is a positive experience for the “overwhelming majority” of teenagers. When asked about those who use the app and suffer from sleeplessness, anxiety and depression, he described the problem as a challenge Facebook and society must face together. (Read: It’s time to talk about tech and mental health in the time of COVID-19)
Instagram is bad for teens’ mental health, says latest report
But Clegg’s responses stand in stark contrast to a report published by the Wall Street Journal, which presented data from Facebook’s internal research and testimony from whistleblower Frances Haugen showing how Instagram negatively impacted teen users’ mental health.
A Facebook internal document, which was reviewed by the Wall Street Journal, showed that 32% of teen girls who felt bad about their bodies said that “Instagram made them feel worse.” The findings were consistent across the last several years of internal research. The Wall Street Journal also revealed that a small percentage of Instagram users in the U.S. and U.K. began having suicidal thoughts because of the app.
“Teens blame Instagram for increases in the rate of anxiety and depression,” said the internal presentation. “This reaction was unprompted and consistent across all groups.”
Responding to the damning report, Clegg highlighted the company’s efforts to improve safety measures for Instagram and Facebook, in particular its $13 billion investment in safety and security. He also mentioned that Facebook employs around 40,000 people to work on these issues.
“As I say, we cannot with the wave of a wand make everyone’s life perfect. What we can do is improve our products so that our products are as safe and as enjoyable to use,” he added.