A team of researchers at the Max Planck Institute and Northeastern University may have figured out how to beat ad discrimination on Facebook once and for all.
After a string of alarming ProPublica stories, Facebook has had trouble keeping advertisers from targeting ads to specific races, potentially violating discrimination laws around housing and employment.
But the Max Planck team has a new idea for how to approach the problem: judging an ad by the overall audience it reaches rather than by the inputs used to build that audience. The researchers laid out how such a system might work in a paper last month, the first step in a project that could rewrite the way ads are targeted online.
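As a rough illustration of that outcome-based idea, the sketch below compares the demographic makeup of an ad's reached audience against a reference population and flags large divergences. The function names, the input format, and the 0.8 cutoff are illustrative assumptions for this article, not the researchers' actual method; the point is only that the check looks at who the ad reached, not at which targeting inputs the advertiser selected.

# Hypothetical sketch of an outcome-based discrimination check.
# The data format and the 0.8 threshold (loosely borrowed from the
# "four-fifths rule" used in employment law) are assumptions, not
# the method described in the researchers' paper.

def representation_ratios(audience_counts, population_counts):
    """Return each group's share of the audience divided by its share
    of the reference population (1.0 means proportional reach)."""
    audience_total = sum(audience_counts.values())
    population_total = sum(population_counts.values())
    ratios = {}
    for group, pop_count in population_counts.items():
        audience_share = audience_counts.get(group, 0) / audience_total
        population_share = pop_count / population_total
        ratios[group] = audience_share / population_share
    return ratios

def looks_discriminatory(audience_counts, population_counts, threshold=0.8):
    """Flag the ad if any group is reached at less than `threshold`
    times its proportional rate, regardless of which targeting
    inputs produced that audience."""
    ratios = representation_ratios(audience_counts, population_counts)
    return any(r < threshold for r in ratios.values())

# Example: an audience that sharply under-reaches one group gets flagged.
audience = {"group_a": 9_000, "group_b": 1_000}
population = {"group_a": 600_000, "group_b": 400_000}
print(looks_discriminatory(audience, population))  # True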
But with the basic idea laid out, the team has run into a bigger problem: Facebook itself. The next step should be to find out how many Facebook ads might end up blocked by the new anti-discrimination system, but Facebook's data policy makes that question impossible to answer. Anyone who buys an ad can see the targeting tools, but only Facebook knows how people are actually using them.
"We just don't know how the ad-targeting system is being used," says Northeastern University professor Alan Mislove, who co-authored the paper. "Only Facebook has that data, which is an unfortunate situation."
That lack of data blocks research into some of Facebook's biggest problems, going far beyond ads. Critics charge that Facebook is spreading conspiracy theories, enabling Russian influence campaigns, and actively undermining democracy.
But despite the mounting concerns, there's very little data detailing how serious those problems really are or how they arise. We just don't know how Facebook affects society at large, and the company's data lockdown makes it impossible to find out.
In some ways, this lack of public information is part of Facebook's promise to users. It's easy to research Twitter, where every post is public, but the same accessibility opens the door to more aggressive doxxing and harassment. The vast majority of Facebook, on the other hand, is hidden from public search, which makes it much harder for anyone to know what's happening. Researchers can see public groups and interest pages, but without a linked person's login token, they can't break through to private profiles or groups, making it difficult to trace influence campaigns or misinformation.
As a result, when researchers want to find out how fake news spreads or conversations get derailed, they turn to Twitter. Open-source researchers like Jonathan Albright are able to track how troll networks seize on specific outlets and stories on Twitter. But on Facebook, such analysis is simply impossible.
"Even if you can get the data, you are left without the necessary means to fully understand it because of the closed (proprietary) and constantly changing News Feed algorithm," Albright says. "The underlying platform data is simply not accessible."
When analysts do look at Facebook, it tends to be the easily accessible parts. Reporters have focused on the Trending Topics board largely because it's visible and accessible, even if it doesn't drive significant traffic or user interest. Other reporting has relied on tools like CrowdTangle, which can show share volume from specific publishers (particularly useful for profiling fake news outlets) but gives little sense of how stories are spreading across publishers. As a result, reporters can spot misinformation when it goes viral, but even the most sophisticated tools can't tell them how it got so popular.
That's particularly urgent because Facebook is facing serious questions about its impact on society, and we have no data to tell us which concerns are important. Earlier this week, UN officials said Facebook played a role in a possible genocide against Rohingya Muslims in Myanmar, a horrifying charge if true. It would be immensely valuable to track how anti-Rohingya sentiment actually spread on the platform. The results might exonerate Facebook or point toward specific changes the platform could make to address the problem. As it stands, we simply have nowhere to begin.
The same problem is in play when researchers go looking for bots or otherwise fraudulent accounts. There are more than 20 million dummy accounts on Facebook, and while the company prefers to keep bot-hunting efforts internal, a small industry has grown up around researchers who can reliably spot and report them. Facebook does accept reports from those researchers, but it doesn't make it easy for them to find connected accounts or report them, often requiring person-to-person contact with an individual support team member.
One bot-hunter, who asked not to be named because of his ongoing work with Facebook, said person-to-person reporting made his work far more difficult. "The challenge with this approach is that individual support team members may have slightly different interpretations of what constitutes abusive content," the researcher said, "and larger campaigns consisting of hundreds or thousands of abusive posts/profiles are difficult to manually report at scale."
Reached by The Verge, a Facebook representative pointed to the company's ongoing bug bounty program, which Mislove has worked with before, as an example of collaboration with outside researchers, and said the company is eager to find new ways to work with researchers provided the work doesn't compromise user privacy. "We value our work with the research community and are always exploring ways to share and learn more from them," the company said in a statement.
After a wave of ad scrutiny, Facebook is already testing ad disclosure pages that would show every post from a given advertiser, scheduled to roll out in the US before the 2018 elections. It's significantly more data than you can get from systems like AdWords, although it still wouldn't give much detail on how the ads are targeted. For researchers like Mislove, that leaves the most important questions still unanswered.
"What I need to know is the targeting parameters," he says. "Without that, we either need to trust Facebook, or we need better tools to get some visibility into how their ad system is being used."