So What’s The Problem With Facebook?

Facebook is probably doing us more harm than good. Here’s why.

Sean Buchan
9 min read · Jul 29, 2020
One of the world’s most influential men at a Facebook Developer Conference, 2019. Photo: Anthony Quintano, Creative Commons: https://www.flickr.com/photos/quintanomedia/46985053944

You may or may not know this, but throughout July, Facebook has been earning a bit less money than usual. No, not because of coronavirus, but because over 500 US corporates and over 1,000 organisations across the globe have stopped placing adverts on the platform. The list is impressive too, including huge brands such as Coca-Cola, Boeing, Ford, Starbucks, Puma and Disney. It’s likely giving Facebook its biggest headache yet.

As a marketer I have personally placed, or helped place, over $100,000 worth of adverts on Facebook and Instagram in my time. Every penny has made me uncomfortable. I work exclusively on projects with an environmental or social purpose, and for some time I’ve felt that using this platform is at odds with my principles.

Unsure? Read on and decide for yourself.

Issue #1 of 4: Hate Speech

I’m starting here because it’s how the July advertiser boycott began. On June 17th the Stop Hate For Profit campaign called upon US corporates to remove their ads. This was primarily due to Facebook’s poor handling of the US protests sparked by George Floyd’s death. In particular, Facebook took no action on the President of the United States’ post “when the looting starts, the shooting starts”. Zuckerberg’s response to calls to take down the post was … less than encouraging.

Personally, I have a visceral negative reaction to this kind of divisive and inflammatory rhetoric … But I’m responsible for reacting not just in my personal capacity but as the leader of an institution committed to free expression … our position is that we should enable as much expression as possible unless it will cause imminent risk of specific harms or dangers.

So Zuckerberg is saying he doesn’t agree with it, but that such a post does not cause “imminent risk of specific harms or dangers”. This shows a fundamental misunderstanding, or deliberate ignorance, of what words can do when spoken from a position of such power. And here begins a running theme of Zuckerberg’s apparent lack of leadership: he often states that he feels one way, while his company acts contrary to that.

Facebook’s own staff staged virtual walkouts as a result of Zuckerberg’s tepid response. It was all over the press. But in the end, nothing was done about the post.

This is, unfortunately, not the worst thing Facebook has failed to do when it comes to hate speech. At the time of writing, Facebook continues to permit Holocaust denial on its channels, in particular through public and private groups. While it has removed some of the worst offending groups and posts now and again, this is nothing like the scale required to stem the growth of antisemitism across the globe. People are being handed the tools to indoctrinate others and to collaborate on these destructive ideas.

It’s worth noting that adding something like Holocaust denial to Facebook’s policies isn’t necessarily the goal, either; what matters is enforcement. Facebook is already known for breaking its own rules, for example by allowing hate-spreading organisations such as The Daily Wire to flout its policies and posting rules.

You’ll want to brace yourself for the worst of them all. In 2018 a New York Times report found that the systematic genocide of Myanmar’s Rohingya population was incited and justified with the help of Facebook. I’ll say that again: genocide. The Myanmar military created dummy accounts imitating news outlets and celebrities and flooded them with incendiary, hateful posts and comments. Targeted propaganda. On this one Facebook admits it acted too slowly (see the same article), but it has not exactly made meaningful attempts to stop the same sort of thing happening again.

Issue #2 of 4: Disinformation — The Science Edition

(Disinformation is colloquially known as fake news, but I try not to use that term because of its political weaponisation.)

The most topical example of disinformation on Facebook right now is the anti-vaccination movement, which poses an immediate health risk with a Covid-19 vaccine hopefully not too far away. Yet on this issue, Zuckerberg is on record saying the following:

if someone wants to post anti-vaccination content, or … join a group … we don’t prevent them from doing that … But we don’t go out of our way to make sure our systems encourage people to join those groups.

Wow, inspiring stuff. I don’t know about you, but I had to read that last sentence four times before I understood it; and when I did, I was disappointed.

Another example, more topical than you might like to think, is how easily climate denial is spread (I would even say encouraged) by the platform. During the Australian bushfires this January (remember that?), climate denial pages used the opportunity to spread their message to great success. Because of the way the Facebook algorithm works, and the way people use the platform to socialise, this kind of content often spreads better than facts: people engage with what they want to hear, not with what they probably should hear. While the horror of bushfires touches an emotional part of us, we don’t like to simultaneously feel that we’re causing it. So we’ll tend to sympathise and donate to an appeal fund while agreeing with posts that say it’s just unlucky weather. In this way Facebook can hide behind how much “support” there has been for the tragedy, while actually undermining long-term efforts to stop a climate crisis.
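
To make that mechanism concrete, here is a minimal sketch of what engagement-weighted ranking looks like in principle. The post fields, the weights and the scoring formula are all hypothetical and chosen purely for illustration; Facebook’s real ranking system is proprietary and far more complex. The point is simply that nothing in a score like this measures whether a post is true.

```python
# A hypothetical, simplified sketch of engagement-weighted ranking.
# The Post fields, weights and formula are illustrative only; Facebook's
# real ranking system is proprietary and far more complex.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_comments: float   # model's estimate of comments it will attract
    predicted_shares: float     # model's estimate of shares
    predicted_reactions: float  # likes, angry faces, etc.

def engagement_score(post: Post) -> float:
    # Reward whatever keeps people interacting. Note that the accuracy
    # of the post's content appears nowhere in this formula.
    return (3.0 * post.predicted_comments
            + 2.0 * post.predicted_shares
            + 1.0 * post.predicted_reactions)

def rank_feed(posts: list[Post]) -> list[Post]:
    # The most "engaging" content rises to the top, true or not.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Peer-reviewed bushfire attribution study", 5, 2, 40),
    Post("Arson, not climate change, caused the fires!", 80, 60, 300),
])
print([p.text for p in feed])  # the provocative claim ranks first
```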

While disinformation such as climate denial is now being actively fact-checked by Facebook and apparently punished by the algorithm, there’s little evidence yet that this is working. Plenty of content still slips through fact checks via loopholes, like this one, where climate science was deemed opinion rather than fact and therefore immune to fact-checking.

Meanwhile, what about that independent fact-checking programme Facebook has established? It’s widely reported to be hugely under-resourced. I could cite various sources here, but the one that stuck out to me was Snopes, which left the programme back in 2019 and recently stated in a series of tweets that it felt more like a public relations effort on Facebook’s part.

Issue #3 of 4: Disinformation — The Politics Edition

Sorry folks, we’re at that point where we have to talk politics.

So you may already know that crazy rumour that Trump was elected with some help from Russia. Well, it turns out it was true. Awkward. Whether Trump’s campaign actively conspired with Russia is a separate question, but the Mueller investigation did find there was “sweeping and systematic” interference in the 2016 election by Russian actors. What this means in practice is that Russian-owned accounts produced content on the platform systematically discrediting Clinton and her campaign, buoying the Trump campaign and sowing disillusionment with the democratic process itself. In many cases this was allegedly directed by Putin himself.

Think it probably didn’t work? Well, in terms of engagement, it’s worth noting that in the lead-up to election day fake news outperformed real news on Facebook. The platform is designed in such a way that this kind of intervention doesn’t require secret agents in a bunker; it can be done pretty much in plain sight. If you have the time and/or money to dedicate to undermining an election, you can do it.

The other now-famous example is the Cambridge Analytica scandal during the Brexit vote in 2016. Users’ data was harvested without their express permission and used by the Vote Leave campaign to micro-target them. Again, what this means in practice is that an advertiser can know that Sam from Buxton dislikes their Turkish neighbours, so Sam is shown ads warning that Turkey and its 76 million people are joining the EU.

Vote Leave Ad #1 — Turkey’s Joining The EU! If you’re curious, you can see the range of ads that were made public in a Parliamentary report here.

Or Jordan from Middlesbrough wishes we could go back to an industrial Britain, and so is shown ads about British Steel being stripped of £350 million a week.

Vote Leave Ad #2 — Cake Or Death, But Easier.
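
To illustrate the mechanic rather than the real tooling, here is a sketch of how micro-targeting pairs an inferred audience with a tailored version of the message. The field names and audience segments below are invented, and the ad copy simply paraphrases the two examples above; none of it reflects Facebook’s actual ad platform.

```python
# A hypothetical illustration of micro-targeting: audience profiles matched
# to tailored messages. The field names are invented and the ad copy
# paraphrases the two examples in the article; nothing here reflects
# Facebook's real ad tools.

campaigns = [
    {
        "audience": {
            "location": "Buxton",
            "inferred_interests": ["immigration concern"],
        },
        "creative": "Turkey (population 76 million) is joining the EU!",
    },
    {
        "audience": {
            "location": "Middlesbrough",
            "inferred_interests": ["British industry", "steel"],
        },
        "creative": "British Steel stripped while £350 million a week goes to the EU.",
    },
]

for ad in campaigns:
    # In a real campaign this would be a paid ad placement; here we just show
    # how each message is routed to the profile it was written for.
    print(f"Show '{ad['creative']}' to users matching {ad['audience']}")
```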

I have to be really clear here: this stuff works. Even if you think it wouldn’t work on you (and don’t be so sure), it does work on others. Our digital literacy is astoundingly low given how much we rely on this technology.

Remember, also, that this is with data Vote Leave shouldn’t have had in the first place. Zuckerberg had to apologise for it. Facebook has since improved ad transparency quite a bit (you can see all political and social Facebook ads here) and added some minor restrictions to political and social ads. But that doesn’t mean it didn’t happen.

And, frankly, it doesn’t mean it won’t happen again. While political campaigns have rarely been entirely honest, things like fixed TV slots and leaflets through doors meant there was a natural limit to how targeted a lie could be, and how often you could put it out. We’re now at a point where this one platform could decide every election from here on out, and that’s a scary thing when you think about it. Politicians and their marketers know that this stuff works, and in general they are one step ahead of Facebook’s policies. Facebook is reacting, not being proactive. And that’s a big problem for us.

Issue #4 of 4 — The Profit Motive

Here’s the one you probably haven’t seen as much.

My personal view is that Facebook is a for-profit entity operating in a public commons, and that this is a massive problem.

There, I said it. The biggest problem with Facebook is the profit model.

Let me explain myself. Socialising is a fundamental piece of what it means to be human; without socialising there is no society. As socialising online grows, accelerated by 2020’s global pandemic, doesn’t it become increasingly strange that a large part of our social lives is governed by a computer algorithm? And that algorithm isn’t benevolent by design; it’s designed to keep your eyes and attention on the platform so that you can, essentially, be shown more adverts. If you don’t believe that, consider that over 97% of Facebook’s revenue comes from adverts.

Now I’m not saying the algorithm is malevolent by design either. After all, the happier and more satisfied you feel from using the platform, the more likely you are to return. But let’s be honest: Facebook as a corporate entity doesn’t prioritise how you feel over its bottom line. If you don’t believe me, how about this: in Q1 2020 Facebook made £63 million of profit a day. One more time so it sinks in. Daily profit, after paying for staff, buildings, data centres and all other costs, of £63 million.

When push comes to shove, Facebook maximises its profit at the expense of your wellbeing, and it does so with something I believe is fundamental to society.

I genuinely believe that herein lies the solution. Facebook honestly shouldn’t operate in the market that it does, and given the evidence and its previous record, I simply no longer trust it to change for the better. But I struggle to see the route to change. Is it hyper-regulation? Is it as extreme as declaring online networking a public commons and reclaiming it as state-run? That would have its problems too. Is it sufficient public action to threaten its bottom line?

I don’t know, but we need to develop answers fast.

Activism Corner — So What Can We Do?

All right, the corporate boycott is hopefully starting to hit Facebook’s bottom line. It’s a start. But what can we do as individuals to push this further?

  • Learn: Read more on Stop Hate For Profit and in particular their demands. If you like what you see, join the movement and stay informed.
  • Share: Pass this article to someone who would find it helpful. Honestly, you’re more of an influence than you believe and drawing attention to important issues is a big help for everyone. Thank you.
  • Share (With Me): I welcome all feedback. Did I miss something? Did I get something wrong? Has this really helped or inspired you? Do you have any ideas off the back of this? Connect with me in the comments or on Twitter / LinkedIn.
  • Do: Time for some armchair activism, literally. I wrote a short document at the start of July explaining how Facebook and Instagram users can write to advertisers and make them aware that they’re funding hate / climate denial / fake news (whatever is most important to you). The July boycott may be almost over, but pressuring your favourite brands for the next boycott will help. Use your voice — brands do listen.


Sean Buchan

Lifelong activist reporting on climate change, big tech and democracy. I have compassion for all, but little patience for those that abuse their power.