A former Facebook employee told members of Congress Tuesday that the company knows that its platform spreads misinformation and content that harms children but refuses to make changes that could hurt its profits.
Speaking before the Senate Commerce Subcommittee on Consumer Protection, former Facebook data scientist Frances Haugen told lawmakers that new regulations are needed to force Facebook to improve its own platforms. But she stopped short of calling for a breakup of the company, saying it wouldn’t fix existing problems and would instead turn Facebook into a “Frankenstein” that continues to cause harm around the world while a separate Instagram rakes in most advertising dollars.
Efforts to pass new regulations on social media have failed in the past, but senators said Tuesday that new revelations about Facebook show the time for inaction has ended.
Here are some key highlights from Tuesday’s hearing.
FACEBOOK KNOWS IT’S CAUSING HARM TO VULNERABLE PEOPLE
Haugen said Facebook knows that vulnerable people are harmed by its systems, from kids who are susceptible to feeling bad about their bodies because of Instagram to adults who are more exposed to misinformation after being widowed, divorced or experiencing other forms of isolation such as moving to a new city.
The platform is designed to exploit negative emotions to keep people engaged, she said.
“They are aware of the side effects of the choices they have made around amplification. They know that algorithmic-based rankings, or engagement-based rankings, keeps you on their sites longer. You have longer sessions, you show up more often, and that makes them more money.”
THE WHISTLEBLOWER TOUCHED A NERVE
During the hearing, Tennessee Sen. Marsha Blackburn, the committee’s ranking Republican, said she’d just received a text from Facebook spokesperson Andy Stone pointing out that Haugen did not work on child safety or Instagram or research these issues and has no direct knowledge on the topic from her work at Facebook.
Haugen herself made it clear several times that she did not directly work on these issues but based her testimony on the documents she had and her own experience.
But Facebook’s statement emphasized her limited role and relatively short tenure at the company, effectively questioning her expertise and credibility. That didn’t sit well with everyone.
Facebook’s tactic “demonstrates that they don’t have a good answer to all these problems that they’re attacking her on,” said Gautam Hans, a technology law and free speech expert at Vanderbilt University.
SMALL CHANGES COULD MAKE A BIG DIFFERENCE
Making changes to reduce the spread of misinformation and other harmful content wouldn’t require a wholesale reinvention of social media, Haugen said. One of the simplest changes could be to just organize posts in chronological order instead of letting computers predict what people want to see based on how much engagement — good or bad — it might attract.
Another would be to require an extra click before users can share content, a step she said Facebook knows can dramatically reduce misinformation and hate speech.
“A lot of the changes that I’m talking about are not going to make Facebook an unprofitable company, it just won’t be a ludicrously profitable company like it is today,” she said.
She said Facebook won’t make those changes on its own if it might halt growth, even though the company’s own research showed that people use the platform less when they’re exposed to more toxic content.
“One could reason a kinder, friendlier, more collaborative Facebook might actually have more users five years from now, so it’s in everyone’s interest,” she said.
A PEEK INSIDE THE COMPANY
Haugen portrayed Facebook’s corporate environment as so machine-like and driven by metrics that it was hard to hit the brakes on known harms that, if addressed, might dent growth and profits.
She described the company’s famously “flat” organizational philosophy — with few levels of management and an open-floor workplace at its California headquarters that packs nearly the entire staff into one enormous room — as an impediment to the leadership necessary to pull the plug on bad ideas.
She said the company didn’t set out to make a destructive platform, but she noted that CEO Mark Zuckerberg holds considerable power because he controls more than 50% of the voting shares of the company and that letting metrics drive decisions was itself a decision on his part.
“In the end, the buck stops with Mark,” she said.
BIPARTISAN OUTRAGE
Democrats and Republicans on the committee said Tuesday’s hearing showed the need for new regulations that would change how Facebook targets users and amplifies content. Such efforts have long failed in Washington, but several senators said Haugen’s testimony might be the catalyst for change.
“Our differences are very minor, or they seem very minor in the face of the revelations that we’ve now seen, so I’m hoping we can move forward,” said Sen. Richard Blumenthal, D-Conn., the panel’s chairman.
Still, Democratic Sen. Amy Klobuchar of Minnesota acknowledged that Facebook and other tech companies wield a lot of power in the nation’s capital, power that has blocked reforms in the past.
“There are lobbyists around every single corner of this building that have been hired by the tech industry,” Klobuchar said. “Facebook and the other tech companies are throwing a bunch of money around this town and people are listening to them.”
———————
AP Technology Writer Matt O’Brien contributed to this report.
Ex-Facebook manager criticizes company, urges more oversight
WASHINGTON (AP) — While accusing the giant social network of pursuing profits over safety, a former Facebook data scientist told Congress Tuesday she believes stricter government oversight could alleviate the dangers the company poses, from harming children to inciting political violence to fueling misinformation.
Frances Haugen, testifying to the Senate Commerce Subcommittee on Consumer Protection, presented a wide-ranging condemnation of Facebook. She accused the company of failing to make changes to Instagram after internal research showed apparent harm to some teens and being dishonest in its public fight against hate and misinformation. Haugen’s accusations were buttressed by tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company’s civic integrity unit.
But she also offered thoughtful ideas about how Facebook’s social media platforms could be made safer. Haugen laid responsibility for the company’s profits-over-safety strategy right at the top, with CEO Mark Zuckerberg, but she also expressed empathy for Facebook’s dilemma.
Haugen, who says she joined the company in 2019 because “Facebook has the potential to bring out the best in us,” said she didn’t leak internal documents to a newspaper and then come before Congress in order to destroy the company or to call for its breakup, as many consumer advocates and lawmakers of both parties have urged.
Haugen is a 37-year-old data expert from Iowa with a degree in computer engineering and a master’s degree in business from Harvard. Prior to being recruited by Facebook, she worked for 15 years at tech companies including Google, Pinterest and Yelp.
“Facebook’s products harm children, stoke division and weaken our democracy,” Haugen said. “The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people.”
“Congressional action is needed,” she said. “They won’t solve this crisis without your help.”
In a note to Facebook employees Tuesday, Zuckerberg disputed Haugen’s portrayal of the company as one that puts profit over the well-being of its users, or that pushes divisive content.
“At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted,” Zuckerberg wrote.
He did, however, appear to agree with Haugen on the need for updated internet regulations, saying that would relieve private companies from having to make decisions on social issues on their own.
“We’re committed to doing the best work we can, but at some level the right body to assess tradeoffs between social equities is our democratically elected Congress,” Zuckerberg wrote.
Democrats and Republicans have shown a rare unity around the revelations of Facebook’s handling of potential risks to teens from Instagram, and bipartisan bills have proliferated to address social media and data-privacy problems. But getting legislation through Congress is a heavy slog. The Federal Trade Commission has taken a stricter stance toward Facebook and other tech giants in recent years.
“Whenever you have Republicans and Democrats on the same page, you’re probably more likely to see something,” said Gautam Hans, a technology law and free speech expert at Vanderbilt University.
Haugen suggested, for example, that the minimum age for Facebook’s popular Instagram photo-sharing platform could be increased from the current 13 to 16 or 18.
She also acknowledged the limitations of possible remedies. Facebook, like other social media companies, uses algorithms to rank and recommend content to users’ news feeds. When the ranking is based on engagement — likes, shares and comments — as it is now with Facebook, users can be vulnerable to manipulation and misinformation. Haugen would prefer the ranking to be chronological. But, she testified, “People will choose the more addictive option even if it is leading their daughters to eating disorders.”
Haugen said a 2018 change to the content flow contributed to more divisiveness and ill will in a network ostensibly created to bring people closer together.
Despite the enmity that the new algorithms were feeding, she said Facebook found that they helped keep people coming back — a pattern that helped the social media giant sell more of the digital ads that generate the vast majority of its revenue.
Haugen said she believed Facebook didn’t set out to build a destructive platform. “I have a huge amount of empathy for Facebook,” she said. “These are really hard questions, and I think they feel a little trapped and isolated.”
But “in the end, the buck stops with Mark,” Haugen said, referring to Zuckerberg, who controls more than 50% of Facebook’s voting shares. “There is no one currently holding Mark accountable but himself.”
Haugen said she believed that Zuckerberg was familiar with some of the internal research showing concerns for potential negative impacts of Instagram.
The subcommittee is examining Facebook’s use of information its own researchers compiled about Instagram. Those findings could indicate potential harm for some of its young users, especially girls, although Facebook publicly downplayed possible negative impacts. For some of the teens devoted to the visually focused photo-sharing platform, the peer pressure it generated led to mental health and body-image problems, and in some cases, eating disorders and suicidal thoughts, the research leaked by Haugen showed.
One internal study cited 13.5% of teen girls saying Instagram makes thoughts of suicide worse and 17% of teen girls saying it makes eating disorders worse.
She also has filed complaints with federal authorities alleging that Facebook’s own research shows that it amplifies hate, misinformation and political unrest, but that the company hides what it knows.
After recent reports in The Wall Street Journal based on documents she leaked to the newspaper raised a public outcry, Haugen revealed her identity in a CBS “60 Minutes” interview aired Sunday night.
As the public relations debacle over the Instagram research grew last week, Facebook put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12.
Haugen said that Facebook prematurely turned off safeguards designed to thwart misinformation and incitement to violence after Joe Biden defeated Donald Trump in last year’s presidential election, alleging that doing so contributed to the deadly Jan. 6 assault on the U.S. Capitol.
After the November election, Facebook dissolved the civic integrity unit where Haugen had been working. That was the moment, she said, when she realized that “I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous.”
Haugen says she told Facebook executives when they recruited her that she wanted to work in an area of the company that fights misinformation, because she had lost a friend to online conspiracy theories.
Facebook maintains that Haugen’s allegations are misleading and insists there is no evidence to support the premise that it is the primary cause of social polarization.
“Today, a Senate Commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with (top) executives – and testified more than six times to not working on the subject matter in question. We don’t agree with her characterization of the many issues she testified about,” the company said in a statement.
__
Associated Press writers Matt O’Brien in Providence, Rhode Island, and Amanda Seitz in Columbus, Ohio, contributed to this report.