Facebook’s global head of safety hasn’t fully read UK’s Online Safety Bill

    The UK’s Online Safety Bill will directly impact Facebook.
    Omar Marques/SOPA Images/LightRocket via Getty Images
    Tech executives and lawmakers around the world all seem to agree — social media regulation is necessary and it is coming. One of the first pieces of legislation to come into play will likely be the UK’s Online Safety Bill, the draft text of which is being examined by a parliamentary committee.
    That bill will help set the tone for safety regulation around the world, as other countries also seek to ensure citizens are protected from harmful content, and the draft legislation has been available since May. It might be reasonable, then, to assume that key executives from social media companies — such as Facebook, which has been facing intense criticism about the risks it poses — would have scrutinized it in detail by now. That’s not necessarily the case, apparently.
    On Thursday, Parliament’s Draft Online Safety Bill committee took evidence from Facebook’s head of safety, Antigone Davis. Asked whether she would be the person in charge of submitting company risk assessments to the UK regulator, Davis responded: “I don’t know the details of the bill.”
    Members of Parliament expressed their concern that Davis was attending the session without having read the draft bill she was providing evidence for. “I just have to say I’m deeply, deeply shocked that you aren’t on top of the brief about what this bill is all about and what it means not just to us, but to the whole of the world as well,” said MP Suzanne Webb.
    “I actually am familiar with the bill,” responded Davis.
    When asked to clarify whether or not she had read the bill, Davis replied: “I’m familiar with parts of the bill,” implying that she had not read the bill in full.
    The 145-page Online Safety Bill, previously known as the Online Harms Bill, would place UK media watchdog Ofcom in charge of regulating tech companies in Britain. Ofcom would have the power to fine tech companies £18 million ($25.3 million) or 10% of their annual revenue, whichever is higher, if they fail to remove harmful or illegal content, as well as to block sites and services. Senior managers at tech companies could even face criminal charges if those companies consistently fall short of their obligations.
    Chris Yiu, Facebook’s director of public policy for Northern Europe, who was also present at the hearing, said he had read the bill, including the explanatory notes.
    Facebook didn’t immediately respond to a request for additional comment.
    Following years of criticism that it doesn’t do enough to protect people’s privacy or to eliminate hate speech and misinformation, Facebook has been hit with renewed allegations that it puts profits over user safety. Internal documents leaked by whistleblower Frances Haugen led to a flurry of stories in recent weeks from The Wall Street Journal and a consortium of US and international news outlets about the company’s policies, practices and decision-making.
    Last week, another Facebook whistleblower, Sophie Zhang, giving evidence to the same parliamentary committee, said she had read the bill in full.
    “It seems like basic politeness to me that if I’m asked to testify regarding an upcoming bill, I should actually read the bill in question,” said Zhang on Twitter on Thursday.