Is it time we regulated against misinformation on Facebook?

As originally published on stuff.co.nz on 29 Jul 2021

Facebook’s mission is to bring the world closer together, so why does it feel like they’re doing the opposite, and what, if anything, should we be doing about it?

Joe Biden warned last week that conspiracy theories and misinformation continue to create deep divisions in the US.

“The rest of the world’s wondering about us,” he said, which is true, but we’re also wondering about the rest of the world.

A week earlier he accused Facebook of “killing people” with the “outrageous misinformation about the COVID-19 vaccine” while his Surgeon General called out the social media giant for not doing enough to stop the spread of misinformation on its platform.


Facebook hits back hard against this type of criticism, but doesn’t seem to be doing nearly enough to remove the vast amount of misinformation from its properties.

This matters. Biden’s right that this misinformation is killing people. It’s been contributing to increasing amounts of vaccine hesitancy and outright anti-vax sentiment since well before Covid-19, particularly within poor and minority communities.

Take Samoa where 83 people, most of them children, died during a recent measles outbreak due to falling vaccination rates. More than 70 kids died from a completely preventable and eradicable disease due in no small part to a steady diet of anti-vax misinformation fed to Samoan parents on Facebook.

It’s impossible to tell how many will die here and around the world as a result of Covid-19 vaccine hesitancy, but it’s clear that it’s going to be many thousands, if not millions over the next 10 years.


In the US, more than 99 per cent of those now dying from Covid-19 are not fully vaccinated, yet vaccine hesitancy is now the number one impediment to vaccination efforts there.

Despite announcing in February that it was banning misinformation on vaccines, Facebook is still awash with anti-vax propaganda.

And it’s not just vaccine misinformation that thrives on Facebook. A full 20 per cent of Americans still believe in kooky QAnon conspiracy theories and it’s common knowledge that Russian interests helped put Trump into power in 2016 with an effective misinformation campaign waged largely on, you guessed it, Facebook.

Mark Zuckerberg himself admitted earlier this year that climate change misinformation is rife on Facebook while committing to do more about it. Since then the amount of climate-related misinformation has only increased on the platform.

So why doesn’t Facebook do something about it?

Zuckerberg has argued that Facebook shouldn’t be an “arbiter of truth”.

Facebook has also complained that policing and removing such content is very difficult given the sheer scale of the task at hand.

This may be true, but look at how effective Facebook is at policing content that it actually wants to remove using its army of bots and moderators. You won’t find child porn or widespread piracy on Facebook, for example. Not because people aren’t sharing it, but because prosecutions would follow if Facebook didn’t quickly remove it.


The reason it won’t take it all down is that misinformation sells. It’s all about the Benjamins.

Rednecks and anti-vaxxers are worth much more to Facebook than the average person, as they’re so much louder and they spend significantly more time on Facebook’s properties.

Facebook’s business model doesn’t care how truthful or accurate a piece of content is, only how many times it’s clicked on, liked and shared, and those with extreme views are going to click and share a lot more than the rest of us.

Disinformation expert Paul Barrett found that social media platforms typically amplify right-wing voices, affording them far greater reach than nonpartisan content creators. The reason they do this is because it makes good economic sense.

We simply can’t expect Facebook to moderate its own content because of the financial incentives preventing it from doing so. Put simply, Facebook is conflicted and unable to “do the right thing” without hurting earnings. So it hurts society instead.

In New Zealand our mainstream media is self regulated. The Media Council (previously the Press Council) was established in 1972 explicitly to avert intended regulation of the industry by the incoming Kirk Labour government. Since then it has done a passable job of preventing excessive bias or inaccuracy in the press.

The Media Council exists in recognition that mainstream media outlets, who are subject to many of the same commercial realities as Facebook, face financial pressures that prevent them from effectively regulating themselves.

So is it time to regulate against misinformation on Facebook and other social media platforms?

Doing so could be daunting for a tiny country such as ours but, as the Aussies showed when they started forcing Facebook and Google to pay for news content they profit from, it is possible for small countries to rein in the digital giants.
