Teen accounts on Instagram aren't really for teenagers
Meta, the parent company of Facebook, Instagram and WhatsApp, announced on Tuesday that it will introduce measures that limit the types of content young people can access, who they can talk to, and how much time they spend on its platforms. The measures began rolling out on Instagram in the US on September 17, and will eventually be applied to Facebook and WhatsApp as well.
The new policy includes automatically converting the accounts of users 16 and under into teen accounts, limiting who can interact with teenagers' accounts or tag them in posts, muting some words associated with online bullying, defaulting to the most restrictive content settings, and encouraging young people to spend less time on the apps.
The new protocols come after years of debate about the effects of social media use on young people, with pundits and politicians arguing that social media and smartphones are responsible for a decline in teenage well-being.
Laws and lawsuits have blamed social media for everything from guilt and suicidal ideation to eating disorders, attention problems, and predatory behavior. Meta's new policies address those concerns, and may have some positive effects, particularly the moves toward greater privacy. But they also answer the rhetoric of politicians rather than the well-being of teenagers, and some experts warn that there is no established causal link between youth social media use and those poor outcomes.
Meta is trying to address a lot of criticism about its impact on teenagers
Meta and other social media companies have come under intense scrutiny for their perceived ill effects on the mental health and well-being of young people. Cyberbullying, eating disorders, anxiety, suicidal ideation, poor academic results, sexual exploitation, and addiction to social media and technology are all concerns that Meta's new Instagram protocols were designed to address.
In recent years, reporting — like the Wall Street Journal's 2021 series on the Facebook Files — has explored how Meta's leadership knew Instagram could be toxic to teenage girls' body image, yet made no effort to mitigate the risks to vulnerable users. Surgeon General Vivek Murthy also attributed rising rates of depression and anxiety to social media use; his office released a report last year warning that social media use is a major contributor to the decline in young people's mental well-being.
The report states that up to 95 percent of American children ages 13 to 17 use social media, and nearly 40 percent of children ages 8 to 12 do as well. “At this time, we do not yet have enough evidence to determine whether social media is safe enough for children and adolescents,” says the report's introduction, and cites overuse, harmful content, bullying and exploitation as key areas of concern.
Murthy also called for Surgeon General's warning labels on social media — similar to those on cigarette packs and alcohol bottles warning about the health risks of those products — in a New York Times op-ed in June. The op-ed also called for federal legislation to protect children using social media.
Such legislation is already making its way through Congress: the Kids Online Safety Act (KOSA). KOSA passed the Senate in July and heads to the House for markup Wednesday; it is unclear whether any version of the bill will pass both chambers, but President Joe Biden has indicated that he would sign one if it did.
The version of KOSA passed earlier this summer would require companies to allow users to turn off algorithmic features that target child or teen accounts, and to limit features that reward or enable sustained use of the platform or game in question. The bill would also require companies to limit who can communicate with minors; “prevent other users […] from viewing minors' personal information”; and reduce and prevent harm to the mental health of minors.
The Senate-approved version of KOSA goes further than Meta's new teen account policies, especially when it comes to youth data privacy, and it's unclear what impact, if any, Instagram teen accounts will have on laws surrounding youth social media use.
Who are the new protocols for, and will they make teens' lives better?
The language in the Meta press release is directed toward parents' concerns about their children's social media use, rather than young people's online privacy, mental health, or well-being.
The reality is that Meta's teen accounts, like KOSA, can only do so much to address cultural and political fears about what social media does to children's well-being, because we don't actually know much about it. Available data do not show that social media use has more than a negligible effect on adolescent mental health.
“Many of the things proposed to fix social media are not really questions of scientific rigor, they're not really questions of health or anxiety or depression,” Andrew Przybylski, a professor of human behavior and technology at Oxford University, told Vox. “It's basically a matter of taste.”
Stetson University psychology professor Christopher Ferguson, who studies the psychological effects of media on young people, said that in his view, the uproar over social media's impact on children's well-being has created “a moral panic” that echoes previous generations' fears that radio, television, the role-playing game Dungeons & Dragons, and other new media would corrupt children's minds and morals.
It's unclear what metrics Meta plans to use to determine whether the new rules are helping children and parents; when asked about those metrics, Meta spokeswoman Lisa Crenshaw told Vox only that the company will “iterate to ensure teen accounts work” for Instagram users. Crenshaw did not respond to follow-up questions by the time of publication.
“These all look like good-faith efforts,” Przybylski said. “But we don't know if it's working.”