Facebook and radicalisation: how can we regulate the internet to prevent harm?
Oh, hi! Long time no see! My name is Robbie and I’m another white man behind a desk, and things are pretty bad, huh?
PART ONE: THE PROTEST
In case you never go outside and have these videos sent to your door on a USB delivered by a carrier pigeon, you might have missed a wee bit of a scuffle outside New Zealand Parliament. It turns out there were some folks there with some really interesting opinions! Would you like to meet them? I bet you wouldn’t!
It’s time to play Meet That Protestor!
[Spanish Flea starts playing. Robbie pulls out a thin old-school microphone.]
Bachelorette number one is Chantelle Baker! With nearly 100,000 Facebook followers, Chantelle well outnumbers her dad, former leader of the New Conservatives, who only has 52,000 Facebook followers. Embarrassing! Chantelle enjoys sharing live streams of peace and love that include other protestors saying, “We’re not leaving till we hang them.” Baker disavows those protestors in the strongest possible terms by joining them and supporting their protest.
Bachelor number two! It’s the Freedom and Rights Coalition with 44,000 followers on Facebook. No wonder people love them! After all, they stand for Freedom and Rights. This might be Brian Tamaki’s group, but unfortunately, he can’t make it because of all that ‘breaking the law’ business. But don’t worry! He could be your pen pal! The man writes beautiful letters warning New Zealanders that we’re heading “down the path of UN ideology of socialism”. After a walk down the path of socialism, why not go for a romantic walk down the beach with the Freedom and Rights Coalition!
Bachelor number three is Counterspin Media NZ! Why not form a throuple with Kelvyn Alp and Hannah Spierer? Together you can fight the ‘Deep State’ and the ‘transhumanist agenda’, and you’re bound to win, because those things don’t exist! Alp has lots to teach you about the moon landings, and he’d love to buy you dinner with the money he makes selling weight loss pills and fraudulent vaccine passports. Maybe he’ll even tell you about the violent coup he’s got planned. Shhh! It’s a secret, but he’ll probably yell it to you in a livestreamed video being watched by police.
Congratulations! You’ve made it to the protest! Now that you’re here, why not get to know someone new? Introducing Bachelor number four, far-right white supremacist group Action Zealandia! Parliament changed security arrangements after video appeared to show these folks gaining access to a construction site in Bowen House. When they’re not helping protestors become Nazis, Action Zealandia enjoys organising terror cells, fighting hate speech laws, and grooming teenagers to join their white supremacist organisation! Action Zealandia isn’t allowed to have a Facebook page any more, and I’ll give you 10 guesses why.
What a friendly bunch.
Obviously, that’s just a sample – there’s also Billy Te Kahika, NZ Doctors Speaking Out with Science, or, more accurately, speaking without science, and a bunch of people exhausted and angry with Covid and the world in general who didn’t know what else to do.
But! We’re not going to be talking about police decisions in response to the protest or Speaker of the House Trevor Mallard playing games with the sound system. Instead, we’re going to talk about how the protestors got there, because while a number of different factors led people to Parliament’s lawn, one thing stands out above all else: Facebook.
PART TWO: TVEC
Founder, CEO, and controlling shareholder of Facebook Mark Zuckerberg was determined to get people vaccinated. He and his wife, Priscilla Chan, have invested a lot of time and money into vaccination programmes, which makes it even sadder that his wee side-hustle, Facebook, turned so many people against them. It’s like if Ronald McDonald became obsessed with encouraging healthy eating, but refused to give up his day job.
Facebook makes money in a similar way to a newspaper: users write content, Facebook publishes that content, editorialises it with an algorithm, and then sells ad space around it. For context, these ad sales accounted for nearly all of Facebook’s US$86 billion revenue in 2020.
Unlike a newspaper, Facebook doesn’t pay its writers, fact-check its content, or spend much money on editorial oversight. The good part of this system is that you get to hear from people that newspapers wouldn’t normally publish. The bad part is that you get to hear from people that newspapers wouldn’t normally publish.
For example, last month was the third anniversary of the Christchurch shooting, which was streamed live on Facebook and seen by thousands of people.
In response to the shooting, Facebook re-established the GIFCT as an independent organisation with their frenemies Microsoft, YouTube, and Twitter. The GIFCT is focused on removing TVEC. Acronyms are fun! Unless they stand for Global Internet Forum to Counter Terrorism or Terrorist and Violent Extremist Content. Then they’re no fun at all.
The operating board of the GIFCT is made up of Facebook, Microsoft, Twitter, and YouTube (owned by Google). It’s sort of like a cigarette company setting up a council to help us deal with all the lung cancer. Thanks, corporations! Where would we be without you helping us deal with the problem you cause for profit?
When one of these companies identifies a piece of content as TVEC through vague and unknowable means, they give it a hash, pop it on the database of hashes, and it goes out to the Hash Sharing Consortium. Now, the Hash Sharing Consortium might sound like a nerdy group of stoners, but it isn’t. It’s different.
Basically, tech companies give TVEC a digital fingerprint, so all the other major online service providers can be like, ‘Oop, someone’s trying to upload a terrorist video.’ And they stop them from uploading the content.
People can then slightly edit that content and reupload it, at which point someone has to report the new version, get its hash added to the database, and the whole process starts again. It’s kind of like whack-a-mole, only instead of a mole, it’s the worst thing you’ve ever seen in your life.
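If you like your metaphors literal, here’s a minimal sketch of how that fingerprint-matching might work. This is a toy exact-hash scheme with made-up function names; GIFCT’s real system uses perceptual hashing and is considerably more sophisticated:

```python
import hashlib

# A stand-in for GIFCT's shared hash database: just a set of fingerprints.
shared_hash_db = set()

def fingerprint(content: bytes) -> str:
    """Hash the raw bytes of a piece of content.

    Real systems use *perceptual* hashes that survive re-encoding and
    small edits; a plain SHA-256 does not, which is exactly why the
    whack-a-mole problem described above exists.
    """
    return hashlib.sha256(content).hexdigest()

def flag_as_tvec(content: bytes) -> None:
    """One platform identifies TVEC and shares its hash with the others."""
    shared_hash_db.add(fingerprint(content))

def allow_upload(content: bytes) -> bool:
    """Every member platform checks new uploads against the shared hashes."""
    return fingerprint(content) not in shared_hash_db

flag_as_tvec(b"the worst video you have ever seen")
print(allow_upload(b"the worst video you have ever seen"))   # False: blocked
print(allow_upload(b"the worst video, slightly re-edited"))  # True: the whack-a-mole resumes
```

Because an exact hash changes completely when the file changes even slightly, one re-edit defeats it – hence the whack-a-mole.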
PART THREE: WHAT COUNTS AS TVEC?
The way these OSPs, or Online Service Providers, decide what counts as terrorist content is haphazard at best. When New Zealand’s Chief Censor, David Shanks, decides what content should be illegal, it’s an extremely delicate process. He has to decide whether sharing extremist imagery is an important part of exposing New Zealanders to the horrors of the world, or whether it may cause further harm. “You’ve got to protect freedom of expression,” he told the Guardian.
“You’ve got to protect this vital ability to have opinions, to spread them, to access information of any kind.”
The only reason to diverge from that principle, ever, he says, is to prevent harm – something he consults groups ranging from medical experts to high school students about.
Facebook has a different approach.
As of 2019, Facebook was paying people US$15 an hour to look at up to 400 posts a day of the worst stuff imaginable: beheadings, animal abuse, hate speech, and pictures of you with the flash on.
It was then up to these underpaid, traumatised workers to decide the cultural and political context of each post from countries all over the world, sometimes in languages they didn’t understand. This led to genuine Nazi content being left up and people getting blocked for sharing photos of Taika Waititi from Jojo Rabbit. That’s not a joke, by the way, that actually happened.
This work is supplemented by machine learning that works perfectly. According to the Wall Street Journal, in leaked documents “scientists pointed out the company’s ability to detect the content in comments was bad in English, and basically non-existent elsewhere”. So, as I said, perfect.
This work is also supplemented by a list of dangerous individuals and organisations, the terrorist section of which seems to be heavily copy-and-pasted from the US government’s list of terrorists. This is good because the US government is totally unbiased and completely trusted by every country in the world.
And while these companies often leave up content they claim to have banned, they’re also taking down content that they shouldn’t. In June 2017, YouTube announced a plan to combat terrorist content online, and it worked well. In fact, you could argue it worked too well.
To quote Wired: “The quick flagging and removal of content appears successful. Unfortunately, we know this because of devastating false-positives: the removal of content used or uploaded by journalists, investigators, and organisations curating materials from conflict zones and human rights crises for use in reporting, potential future legal proceedings, and the historical record.”
The thing is, censorship is hard. It’s complicated and political and has enormous ramifications for democracy. Facebook isn’t doing it properly, and they’re not being nearly transparent enough about how they do it.
But even worse than Facebook’s inability to effectively remove harmful content is how they editorialise all the stuff that’s left.
PART FOUR: THE ALGORITHM
To understand why Facebook is such a great platform for radicalisation, it helps to look at the changes they made to their algorithm in 2018. When I say ‘the Facebook algorithm’, I mean the code they use to decide which pieces of content to promote.
In the same way that a newspaper uses an editor to decide what goes on the front page, Facebook uses an algorithm to decide what goes at the top of your newsfeed.
What’s fun about the changes they made in 2018 is that Facebook pretended they were making these changes for the greater good. Unfortunately, internal documents suggest that instead of making these changes for the good of humankind, they were actually making the changes to increase profit, which was shocking to everyone involved.
In 2018, Facebook was worried about “declining engagement”. To quote the Wall Street Journal, “the fear was that eventually, users might stop using Facebook altogether”. A terrifying thought.
[Insert Lionel Hutz shuddering at the idea of a world without lawyers.]
So, Facebook switched things up. Now, an angry reaction was worth five times a like; a long passionate comment was worth twice a short comment saying, ‘Good job!’; an angry rant sharing the original content was worth 30 times a like and so on.
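To make that concrete, here’s a rough sketch of what that reweighting looks like in code. The weights come from the Wall Street Journal’s reporting; everything else – the function, the names, the example posts – is hypothetical, because Facebook’s actual ranking system isn’t public:

```python
# Hypothetical reconstruction of the 2018 reweighting. Only the weights are
# from reporting; the rest is illustrative guesswork - Facebook's real
# ranking system is vastly more complicated and not public.
WEIGHTS = {
    "like": 1,
    "angry_reaction": 5,   # an emoji reaction: five times a like
    "short_comment": 2,    # a quick "Good job!"
    "long_comment": 4,     # a long, passionate comment: twice a short one
    "reshare": 30,         # an angry rant resharing the post
}

def engagement_score(interactions):
    """Score a post by its interactions; higher scores rise up the feed."""
    return sum(WEIGHTS[kind] * count for kind, count in interactions.items())

calm_post = {"like": 200, "short_comment": 10}
angry_post = {"angry_reaction": 40, "long_comment": 20, "reshare": 15}

print(engagement_score(calm_post))   # 220
print(engagement_score(angry_post))  # 730 - fewer interactions, higher rank
```

Far fewer people touched the angry post, but it wins the front page anyway – which is the whole problem in five lines of arithmetic.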
In a public letter to Facebook, BuzzFeed CEO Jonah Peretti, the elder brother of Chelsea Peretti (FUN FACT!) complained that the change to the algorithm was forcing them to post increasingly controversial content to generate arguments in the comments.
In an internal report investigating the effects of the new algorithm on the politics of Poland, Facebook researchers wrote, “One party’s social media management team estimates that they have shifted the proportion of their posts from 50/50 positive/negative to 80 per cent negative, explicitly as a function of the change to the algorithm”.
Staff at Facebook tried to come up with solutions, but again, quoting the Wall Street Journal, “Mr Zuckerberg said he didn’t want to pursue it if it reduced user engagement, according to the documents.” What a piece of s….
PART FIVE: THE CONSEQUENCES
The world’s most popular newspaper, Facebook, is a lawless hellscape that chooses its top stories based on how many complaints they generate, and it turns out that has some negative consequences for planet Earth and the people who live there.
A landmark study in Germany looked at thousands of anti-refugee hate crimes and compared them to variables that might be relevant. These included “wealth, demographics, support for far-right politics, newspaper sales, the number of refugees, history of hate crime and the number of protests.” One variable stood out: “Towns where Facebook use was higher than average … reliably experienced more attacks on refugees.”
Sometimes Facebook’s ability to fuel hate crimes is even more extreme. Facebook was forced to admit that it “played a role in inciting violence during the military’s genocidal campaign against the Rohingya” in Myanmar. Just for some context here, inciting violence in a genocide is bad.
Similarly, in Ethiopia, an investigation by Vice said violence had been “supercharged by the almost-instant and widespread sharing of hate speech and incitement to violence on Facebook, which whipped up people’s anger”.
And then there’s Covid-19. Again, Facebook wanted to stop the spread of misinformation. They gave free ads to the World Health Organisation and added links to accurate information on posts about the pandemic, but it wasn’t enough to counteract the fundamental business model of Facebook: publish content without editorial oversight and promote anything that drives engagement.
So, cool. Great. Facebook, the world’s most popular newspaper, might eventually stop publishing neo-Nazis like the folks at Action Zealandia, but they will actively promote anti-vaxxers because of all the people arguing in the comments, and once you’re at the rally, I’m sure there are some friendly dudes in brown shirts ready to say hi.
PART SIX: SO, WHAT DO WE DO ABOUT IT?
The problem is big, but, surprisingly, governments seem willing to tackle it anyway.
Law-makers in the EU proposed a law that would require online service providers to remove illegal content within one hour. It turned out to be pretty controversial, but, as with most controversial acts, the French did it anyway. And their law doesn’t just cover TVEC!
The BBC writes: “Failure to remove content could attract a fine of up to €1.25m (£1.1m). France’s regulator, the Superior Council of the Audiovisual (CSA), will have the power to impose heftier fines of up to 4 per cent of global turnover for continuous and repeated violations.”
For context, 4 per cent of Facebook’s 2021 revenue is nearly US$5 billion. That’s quite a big fine.
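The back-of-the-envelope maths, assuming Facebook’s reported 2021 revenue of roughly US$118 billion:

```python
# Hypothetical worked example: the maximum fine under the 4 per cent rule.
revenue_2021 = 117.9e9          # Facebook's reported 2021 revenue, in USD
max_fine = 0.04 * revenue_2021  # 4 per cent of global turnover
print(f"US${max_fine / 1e9:.1f} billion")  # US$4.7 billion
```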
The UK has put forward a white paper on a ‘statutory duty of care’, arguing that it is “the platform that should be regulated not the content, including the design of the platform and the operation of the business. Secondly, the duty of care implies a risk assessment so that reasonably foreseeable harms are avoided where possible or mitigated”.
It’s written like that because the British are, unfortunately, British.
Here in New Zealand, we’re undergoing a review of content regulation and working on hate speech reform.
The Department of Internal Affairs has been put in charge of removing TVEC – forcing them to draw the line between radical politics and terrorism, and between important journalism about terrorism and media used to promote terrorist acts.
Maybe that’s something our Chief Censor and the Classification Office should decide, but there you go. Apparently, we’ve decided to give it to the Department of Internal Affairs, the department of government work that nobody else wanted to do.
Ultimately, we know that the GIFCT is insufficient, because social media companies are not going to voluntarily invest enough money to monitor what they share. We know that the Facebook algorithm is a worse editor than Rupert Murdoch, willing to throw anything on the front page that riles people up. And we know that this problem is not limited to Facebook: this s…show runs across multiple platforms that all work basically the same way.
But it took us a long time to figure out how to regulate television and radio and newspapers. And it’s going to take us a long time to figure out how to regulate the internet. These are enormous questions of democracy, and free speech, and protecting people from harm.
Everyone needs to be a part of this discussion.
So, if you wouldn’t mind, maybe start a long pointless argument in the comments. Do all the different kinds of reactions you can think of. Reshare this video with a long speech you’ve copy-and-pasted from the internet.
Maybe that way we can get an important story on the front page.
White Man Behind A Desk is the work of satirist Robbie Nicol and playwright Finnius Teppett. See more at Patreon.com/WhiteManBehindADesk.