Instagram head faces senators amid anger over possible harm
WASHINGTON >> The head of Instagram on Wednesday met with deep skepticism on Capitol Hill over new measures the social media platform is adopting to protect young users.
Adam Mosseri appeared before a Senate panel and faced off with lawmakers angry over revelations of how the photo-sharing platform can harm some young users. Senators are also demanding that the company commit to making changes and to increasing its transparency.
Sen. Richard Blumenthal, D-Conn., who heads the Senate Commerce subcommittee on consumer protection, dismissed as “a public relations tactic” some safety measures announced by the popular photo-sharing platform.
“I believe that the time for self-policing and self-regulation is over,” Blumenthal said. “Self-policing depends on trust. Trust is over.”
Under sharp questioning by senators of both parties, Mosseri defended the company’s conduct and the efficacy of its new safety measures. He challenged the assertion that Instagram has been shown by research to be addictive for young people. Instagram, which along with Facebook is part of Meta Platforms Inc., has an estimated 1 billion users of all ages.
On Tuesday, Instagram introduced a previously announced feature that urges teenagers to take breaks from the platform. The company also announced other tools, including parental controls due to come out early next year, that it says are aimed at protecting young users from harmful content.
Senators of both parties were united in condemnation of the social network giant and Instagram, the photo-sharing juggernaut valued at some $100 billion that Facebook acquired for $1 billion in 2012.
The hearing grew more confrontational and emotionally charged as it went on.
“Sir, I have to tell you, you did sound callous,” Sen. Marsha Blackburn of Tennessee, the panel’s senior Republican, told Mosseri near the end of the hearing.
Senators repeatedly tried to win commitments from Mosseri for Instagram to provide full results of its internal research and its computer formulas for ranking content to independent monitors and Congress. They also tried to enlist his support for legislation that would curb the ways in which Big Tech deploys social media geared toward young people.
Mosseri responded mostly with general endorsements of openness and accountability, insisting that Instagram is an industry leader in transparency.
The issue is becoming increasingly urgent. An alarming advisory issued Tuesday by U.S. Surgeon General Vivek Murthy warned about a mental health crisis among children and young adults that has been worsened by the coronavirus pandemic. He said tech companies must design social media platforms that strengthen, rather than harm, young people’s mental health.
Meta, which is based in Menlo Park, California, has been roiled by public and political outrage over the disclosures by former Facebook employee Frances Haugen. She has made the case before lawmakers in the U.S., Britain and Europe that the company’s systems amplify online hate and extremism and that the company elevates profits over the safety of users.
Haugen, a data scientist who had worked in Facebook’s civic integrity unit, buttressed her assertions with a trove of internal company documents she secretly copied and provided to federal securities regulators and Congress.
The Senate panel has examined how Facebook handled its own researchers’ findings indicating potential harm to some of its young users, especially girls, while publicly downplaying the negative impacts. For some Instagram-devoted teens, peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research detailed in the Facebook documents showed.
The revelations in a report by The Wall Street Journal, based on the documents leaked by Haugen, set off a wave of recriminations from lawmakers, critics of Big Tech, child-development experts and parents.
“As head of Instagram, I am especially focused on the safety of the youngest people who use our services,” Mosseri testified. “This work includes keeping underage users off our platform, designing age-appropriate experiences for people ages 13 to 18, and building parental controls. Instagram is built for people 13 and older. If a child is under the age of 13, they are not permitted on Instagram.”
Mosseri outlined the suite of measures he said Instagram has taken to protect young people on the platform. They include keeping kids under 13 off it, restricting direct messaging between kids and adults, and prohibiting posts that encourage suicide and self-harm.
But, as researchers both internal and external to Meta have documented, the reality is different. Kids under 13 often sign up for Instagram with or without their parents’ knowledge by lying about their age. And posts about suicide and self-harm still reach children and teens, sometimes with disastrous effects.