Facebook whistleblower Frances Haugen is set to give evidence to UK politicians amid fresh revelations about the company's inner workings.
Ms Haugen will face the committee fine-tuning the UK's proposed Online Safety Bill, which will put new rules in place for big social networks.
It comes as several news outlets published fresh stories based on her thousands of leaked documents.
Facebook, meanwhile, has characterised previous reporting as misleading.
Ms Haugen left Facebook earlier this year, but took thousands of documents when she did so, providing them to the Wall Street Journal.
That paper then ran a series of articles which Facebook considered to be negative - and, it contends, mischaracterised the source material.
But the allegations - that Facebook knew that Instagram was damaging to teenagers' mental health, for example - led to her being invited to testify to politicians and regulators around the world.
I’m looking forward to discussing the Online Safety Bill with the @OnlineSafetyCom in @UKParliament tomorrow 🇬🇧
Tune in ⬇️ https://t.co/AeUcWb0lc3
— Frances Haugen (@FrancesHaugen) October 24, 2021
Her appearance in London comes at a crucial time in the debate about tech regulation, as the Online Safety Committee considers additions and tweaks to the proposed new rules.
Proposed additions include whether online abuse of women and girls should become a legal offence.
The chair of the committee, MP Damian Collins, said it "will establish a new era of regulation for tech platforms which will make them accountable".
Opaque power
"The real question is around can we, as a public, change the incentives such that it makes more sense for Facebook to invest more money in safety on Instagram," Ms Haugen said in a BBC interview with Ian Russell, the father of 14-year-old Molly Russell, who killed herself after viewing disturbing content on Instagram.
"I'm sure that… the experience Molly had caused them to look at these questions more," she added.
But Ms Haugen said much quicker progress was needed.
"Facebook's own research shows that a startlingly high fraction of [under-18s] exhibit what is known as problematic use [on Instagram]," she said.
Being unable to control their use of it is "kind of like cigarettes in that way," she added.
"Unquestionably, Facebook could be investing more resources into making the platform safer," she said. "They have made a series of choices to prioritise profits over people.
"Right now there's no company in the world that has as much power as Facebook, and as little transparency."
Fresh revelations
It comes amid a fresh batch of reporting on the leaked documents, which were provided to more than a dozen news organisations to search for additional news lines after the Wall Street Journal's initial series of reports.
NBC News reported on an internal Facebook experiment in which a fictional woman's new account was recommended extreme QAnon conspiracy theory groups within two days. The recommendations were based on nothing more than basic interests, such as following Fox News and Donald Trump and expressing an interest in Christianity.
"Within one week, Smith's feed was full of groups and pages that had violated Facebook's own rules, including those against hate speech and disinformation," NBC reported. Facebook told the outlet such research helped lead to its ban on QAnon in October 2020.
Bloomberg, meanwhile, published a new piece detailing the internal staff reaction to the 6 January Capitol riots, including quotes from those who questioned why they were still working for the company.
CNN reported on Facebook's own analysis of the events around 6 January, in which the tech giant's staff said that while "hindsight is 20:20", the procedures the company had in place were inadequate to stop the "Stop the Steal" theory, which falsely alleged that the US presidential election had been subject to massive fraud.
The New York Times, meanwhile, published a story about the impact that Facebook's policies have in India, which it characterised as an "amplified version" of the issues around misinformation, hate speech and violence the company faces elsewhere. A fictional test user in India accepting Facebook's recommendations was apparently subjected to "polarising nationalist content, misinformation, and violence and gore".
Axios reports that Facebook has internally warned its staff to expect "more bad headlines in the coming days".
Facebook has previously denied many of the reports released during the Wall Street Journal's initial reporting, referring to the documents at one point as "stolen".
But at the same time, the company has admitted that in many areas it has more to do - and is taking issue primarily with what it says is misrepresentation or cherry-picking from the leaked documents.
It also points to its long-standing calls for reform of tech industry regulation - which would affect all major big tech firms, not just Facebook.