KU Leuven: Facebook falls short in recognizing political ads

Researchers from KU Leuven (research group imec-DistriNet) and New York University (Cybersecurity for Democracy) show that, worldwide, Facebook misclassifies up to 83 percent of the advertisements that either Facebook itself or the researchers consider political. Sometimes political advertisements are not recognized as such, but non-political advertisements are also often incorrectly labeled as political.

In the US and New Zealand, Facebook does the best job of filtering ads correctly: only 1 percent slips through there, although in the US that still amounts to about ten thousand advertisements. Facebook scores worst in Malaysia, where 45 percent stays under the radar. Belgium falls in between, ranking 28th of the 58 countries surveyed, with almost nine percent.

“Facebook failed to detect undeclared political advertisements from almost every Belgian political party,” says Victor Le Pochat, doctoral researcher at imec-DistriNet (KU Leuven) and FWO fellow. “Globally, Facebook was very poor at distinguishing political from non-political ads. Ultimately, both users and advertisers on Facebook are misled: either it is unclear whether an ad is actually politically charged, or ads are erroneously removed because Facebook believes they are political.”

Political Ads
Facebook introduced its own rules in 2018 for “ads about social issues, elections or politics”. The Brexit referendum and the 2016 US election had drawn heavy criticism of the social network over how slanted information or misreporting could run rampant. From then on, political ads would carry a label indicating who pays for them. Facebook mainly relies on civic duty, expecting advertisers to declare themselves whether their advertisements concern social issues, elections or politics. But not all advertisers do, so an algorithm then tries to identify those undeclared advertisements, with varying degrees of success, as this study shows.


Facebook could take some simple measures to improve the detection of political ads, but has already indicated that it is not keen to do so.

– Computer scientist Victor Le Pochat

Impact
The shortcomings of the screening do harm on two levels. First there are the false negatives: political advertisements that are not recognized as such. These undermine the reliability of Facebook and of its own regulations, and at the same time open the door to malicious advertisers and misinformation. But Facebook also mistakenly flags many ads as political, the false positives. These likewise erode confidence in how well the platform enforces its rules, and they have caused important public-interest information, for example about COVID-19, to fail to reach the public.
