Will the EU fight for the truth on Facebook and Instagram?


Factcheckers had no doubt about the real audience for the news this week – delivered via Mark Zuckerberg’s medium of choice, the awkward video message – that, starting in the US, Meta would abandon professional, third-party factchecking across its networks in favor of the user-powered “community notes” model used on X.

“This is all intended to curry favor with Trump,” one factchecker wrote as soon as the news dropped, on the private WhatsApp channel where the community gathered to vent. Their public responses made the same point a little more diplomatically.

If the incoming US president was the audience, though, a crucial question lies across the Atlantic: how the European Union responds to Meta’s retrenchment. The answer could have consequences for factcheckers far beyond Europe’s borders.

Meta’s factchecking program spans 130 countries today and is the biggest single funding source for factchecking worldwide. It came together in a matter of weeks after the 2016 US election – with some prompting from factcheckers themselves – as Zuckerberg faced intense scrutiny over Facebook’s fake news problem. Not long ago, Meta boasted about having spent some $100m on factchecking initiatives since 2016.

Still, factcheckers have worried for years that the social media giant would pivot away again once the political winds shifted. “I know most of us have relied on these resources, but deep down, we all knew this day was coming,” wrote Mehmet Atakan Foça, the founder of Turkish factchecking site Teyit, on the WhatsApp channel. “I urge you to see this as a fresh start – an opportunity to rebuild from the ground up.”

Stress-testing new disinformation laws

What exactly the new policy will mean for the world’s factcheckers depends on how quickly and widely Meta rolls it out beyond the United States. The company has been studiously vague on that question – except to tell reporters it had “no immediate plans” to end factchecking in the EU, seen as a nod to its obligations under EU law.

The EU has led the world in erecting a sophisticated and comprehensive regulatory framework for globe-spanning digital platforms like Meta and Google, anchored by the Digital Services Act. A newly strengthened Code of Practice on Disinformation – developed with input from across civil society, and designed to interlock with the DSA – explicitly requires the largest platforms to work with researchers and factcheckers to mitigate risks from online disinformation, including “fair financial contributions for factcheckers’ work”.

But that regulatory framework is unfinished and untested. The EU case against Elon Musk’s X, the first formal charges brought under the DSA, remains unresolved even as lawmakers call for a new investigation of the billionaire’s recent meddling in European elections. Meanwhile, all of the major platforms appear to be falling far short of their commitments under the self-regulatory Code of Practice. It is still an open question how platforms will have to work with factcheckers, and what shape enforcement will take, if those commitments become a Code of Conduct as envisioned under the DSA.

So far, the EU’s only comment on Meta’s move has been that any major platform would have to “conduct a risk assessment and send it to the EU Commission” before cutting ties with European factcheckers. What the institution says and does next will be a crucial test of the principles in the DSA, and may help to shape Meta’s policies around the world.

Carlos Hernández-Echevarría, head of policy for the Spanish factchecker Maldita, argues that the DSA’s deliberately vague language, designed to be forward-looking and collaborative, is being exploited by “a US-based industry more and more reluctant to do anything meaningful against misinformation and other online ills. However, the law is on the books now and has to be enforced.”

“At the end of the day, the European Commission will have to say publicly whether these platforms have ‘effective risk mitigation’ measures in place for misinformation. As wide as that concept is, I don’t think it’s something you can say about a lot of them,” he added.

Uncertain consequences

Still, the working assumption among factcheckers is that Meta will eliminate third-party factchecking in Europe, and worldwide, after trialing the new community notes system in the United States. In his comments, Zuckerberg took aim at “an ever-increasing number of laws industrialising censorship” in Europe and promised to “work with President Trump” to push back on restrictions around the world. Brazil has already issued a legal demand for the company to clarify what it intends to do with its factchecking operation there.

It’s difficult to predict the consequences for the global factchecking movement that has grown up over the last two decades, but they would be drastic. About 40% of the factcheckers signed up to the principles of the International Fact-Checking Network, a prerequisite for joining Meta’s program, are commercial, for-profit operations, and many of those depend on Meta for all of their income. If the program disappears completely, perhaps a third of Meta’s 90 partners around the world would shut down, or shutter their factchecking arms.

Nearly all of the rest, though, would be forced to lay off staff and dramatically scale back their work. That includes dozens of nonprofit and university-based factchecking efforts, from Brazil to Bosnia to Bangladesh, that use the money they make debunking hoaxes on Facebook and Instagram to help pay for factchecking politicians, as well as initiatives like running media literacy programs, doing policy work, and developing new technologies to fight disinformation.

“What has happened in the US is just the beginning,” the Filipino site Rappler, founded by Nobel laureate Maria Ressa, concluded. “It is an ominous sign of more perilous times in the fight to preserve and protect our individual agency and shared reality.”
