Frances Haugen: Key moments from Facebook whistleblower’s UK parliamentary testimony


    
Frances Haugen testifies in UK Parliament. (Screenshot/CNET)
Facebook whistleblower Frances Haugen was in London on Monday to provide evidence to a UK parliamentary committee examining the country’s incoming Online Safety Bill. The appearance came nearly a month after the former product manager revealed herself as the person behind a leak of internal research that has brought new scrutiny upon the company, in particular regarding its safety systems. Shortly after revealing her identity, she testified before Congress, detailing Facebook’s “moral bankruptcy.”
    Haugen’s latest testimony coincided with yet more reports from news outlets based on internal leaked documents referred to as the Facebook Papers. As global scrutiny of the world’s largest social network heats up, the calls to regulate Facebook and other social media platforms are gaining traction. Compared to the US, efforts to bring in such regulation are relatively advanced in Europe and the UK.
During the two and a half hours that lawmakers questioned Haugen, she repeated many of the statements she’d already voiced in other public forums, albeit tailored to apply to the UK’s draft legislation. The Online Safety Bill, previously known as the Online Harms Bill, would place UK media watchdog Ofcom in charge of regulating social media platforms in the name of keeping users safe.
When asked whether she thought the Online Safety Bill, which would hold companies accountable through fines, the blocking of services and, in extreme circumstances, criminal charges against executives, was keeping Facebook CEO Mark Zuckerberg awake at night, Haugen responded: “I can’t imagine Mark isn’t paying attention to what you’re doing.” (Zuckerberg doesn’t appear to have visited the UK since Parliament issued him a formal summons in 2018.)
    “I am incredibly excited and proud of the UK for taking such a world-leading stance with regard to thinking about regulating social platforms,” added Haugen.
Echoing the sentiments of her fellow whistleblower Sophie Zhang, who gave evidence to the parliamentary committee last week, Haugen encouraged members of Parliament to create specific guidelines for the risk assessments they expect social media platforms to carry out. “If Facebook doesn’t have standards for those risk assessments, they will give you a bad risk assessment, because Facebook has established over and over again, when asked for information, they mislead the public,” she said.
    Monika Bickert, Facebook’s vice president of content policy, said in a statement on Monday that the company was pleased to see the Online Safety Bill moving forward. “While we have rules against harmful content and publish regular transparency reports, we agree we need regulation for the whole industry so that businesses like ours aren’t making these decisions on our own,” she said.
    Along with her thoughts on the Online Safety Bill, here are some of the other key points Haugen made in her testimony on Monday:
     On the ramifications of Facebook’s safety policy outside the US
    Many of the revelations brought to light by recent leaks involve the impact of Facebook’s safety policies and its AI systems outside of the US. In the UK, it could be as simple as failing to recognize abuse by not understanding the offensiveness of local slang, said Haugen. In a country such as Ethiopia — which has six main languages, only two of which are supported by Facebook — it’s even harder for Facebook to ensure it’s keeping people safe, she added.
    Haugen also talked about the societal harm caused by Facebook’s algorithms. She said she thinks the political violence that’s occurred in Myanmar and Ethiopia following the spread of misinformation on Facebook represents just the “opening chapters” of the serious consequences we can expect from Facebook’s engagement-based ranking system.
     On the idea of Instagram Kids
Facebook recently hit pause on the development of an app, known as Instagram Kids, that it was building for younger users who didn’t meet its minimum age requirement of 13. Haugen said that if it wasn’t possible to make a version of Instagram that’s safe for 14-year-olds, she highly doubted a safe version of the app could be made for 10-year-olds.
    “I find it really telling that if you go to Silicon Valley, and you look at the most elite private schools, they often have zero social media policies. They try to establish cultures where like you don’t use phones and you don’t connect with each other on social media.”
She said she found Facebook’s argument “misleading”: that if kids were going to sneak onto the platform in violation of age restrictions anyway, it should at least make a safe version of the platform they could use. Facebook has enough data on its users to make accurate guesses about their ages, she said, and yet it doesn’t publicly say how many underage users it believes are on the platform or what it’s doing about the problem.
    Facebook should have to publish what it’s doing to detect underage users, she added, “because I guarantee you what they’re doing today is not enough.”
     On how Facebook compares to its rivals
While discussing Facebook’s lack of transparency, Haugen said that the company kept more of its internal research and decision-making behind closed doors than its competitors do. “Both Google and Twitter are radically more transparent than Facebook,” she said.
     On Facebook’s lack of investment in safety
    Haugen talked extensively about how Facebook prioritizes maximizing profit ahead of everything else and incentivizes employees to work on projects that will improve profitability, while shutting down suggestions that would involve making any kind of systemic changes to the way Facebook operates.
    She said that she was surprised to hear last week that Facebook was investing in 10,000 new jobs in Europe to build out its metaverse and felt frustrated that the same resources weren’t being allocated to improve safety. “Do you know what we could have done with safety if we had 10,000 more engineers?” she said. “It would have been amazing.”
    Bickert responded by saying that contrary to what was discussed at the hearing, Facebook had always been commercially incentivized to remove harmful content. “People don’t want to see it when they use our apps and advertisers don’t want their ads next to it,” she said. “That’s why we’ve invested $13 billion and hired 40,000 people to do one job: keep people safe on our apps. As a result we’ve almost halved the amount of hate speech people see on Facebook over the last three quarters — down to just 0.05 per cent of content views.”
     On whether Facebook is evil
Lawmakers tried to draw Haugen on whether she thought Facebook is evil, but she wouldn’t take the bait. Instead she said that the company was “overwhelmingly full of conscientious, kind, empathetic people” who were embedded in bad systems with bad incentives that led to bad actions.
    There was a “culture of positivity … so intense that it discourages people from looking at hard questions,” she said. It meant that people who looked the other way were more likely to be promoted than people raising alarms. She added she didn’t believe the company was malevolent, but that problems had been caused by “negligence” and “ignorance.”
    While Facebook didn’t invent hate, she said, the company was “unquestionably making hate worse.”