
Opening up ad-targeting data misses Facebook’s real problem

Opinion

The problem isn’t ads; it’s organic posts. 


The news that Meta will be opening access to political ad targeting data in its ad library archive certainly seems, at first glance, like a good thing.

But really, it’s the most microscopic of steps in the name of transparency.

The company behind Facebook has consistently (and, some might say, wilfully) missed the mark when it comes to how misinformation and disinformation spread on the platform, and it’s doing so again now. The problem isn’t ads; it’s organic posts.

This move to open up the ad library doesn’t even begin to tackle the real problems with political influence on the platform. Once again, Meta looks to blame advertisers, when its own moderation and misinformation problems are the real issue.

Facebook determines what posts get the most reach on the platform primarily based on virality. It incentivises outrage, whether truthful or not.

Far-right and conservative news sites that traffic in misinformation know this and optimise their content for it, which is part of why those sites and movement figures like Ben Shapiro, Dan Bongino, and Tucker Carlson routinely dominate the lists of most-engaged-with pages.

While any page owner can post anything to Facebook, ads go through content and policy reviews. Advertisers in Facebook’s broad grouping covering “social issues” – including political advertisers – go through additional screening of not only ad content but also the individuals managing those ad campaigns. Ads really are not the biggest problem here.

Advertisers consistently targeted in policies

The problem of misinformation in organic content was rampant throughout the 2020 election. One study released last year found that Facebook posts containing misinformation got six times more engagement than factual posts from reputable news organisations.

New York University, which conducted this study, was also one of the organisations whose access to Facebook data was blocked by the company.

While Facebook claimed throughout the election to be paying special attention to and addressing misinformation around voting, researchers found that nearly half of top-performing posts that mentioned voting by mail were false or misleading.

And an analysis by ProPublica and The Washington Post also found a “surge of misinformation” on Facebook leading up to the 6 January attack on the Capitol.

But, in spite of the significant problem of misinformation in content, Facebook has consistently targeted advertisers in its policy solutions.

The net result: it’s easier to post misinformation and have it spread virally than it is to get accurate information out in a targeted way.

One last example: in the final days before the 2020 election and for several weeks afterward, Facebook prohibited political and social-issue advertisers from launching any new ads, ostensibly to prevent the spread of inaccurate information while the results of the election were unclear.

So, while there could be no new ads in the week after Election Day, there was plenty of organic political content, and all ten of the most-engaged-with Facebook posts were from Donald Trump – including one falsely claiming he’d won.

News of Facebook doing anything in the name of transparency seems like something we should instinctively applaud. But in this case, the move is just another sleight-of-hand diversion, putting the focus back onto advertisers and away from the party really allowing political misinformation and unethical behaviour and content to flourish – Facebook itself.

Eric Reif is senior vice-president, paid media, at Blue State. He has more than a decade of experience in political strategy, advertising, and fundraising, having raised hundreds of millions of dollars for Democratic Party campaigns, progressive causes, and nonprofits.
