Why We Must Stop Pandering to the Algorithm

This morning, like many others, I woke up to a barrage of messages from my family WhatsApp group, arguing about coronavirus. Long story short: seven diametrically opposed views do not make for happy families.

What has this got to do with pandering to the algorithm? Everything. My wise old matriarch delights in sharing her daily curated newsfeed content, peppered with negative stories challenging the efficacy of the vaccine.

Like clockwork, one of us disputes the claim while another says we need to be open-minded. Matriarch fights back. Chaos ensues.

Nature or nurture?

As human beings, we have always been hard-wired to form social groups. Part of it is a survival instinct. As Psychology Today puts it, “In a hunter-gatherer group, being ostracised or banished could have been a death sentence.”

Another reason is social influence. We go to school with children of the same age; we attend church with those of the same faith. We’re born without prejudice but we’re put into groups from birth.

Of course, having a common interest, be that religion, sport or superheroes, is relatively harmless.

But what happens when the algorithms intervene? What if our shared beliefs are not based on the collective fandom of one particular entity, but rather, the calculated outcomes of an entirely artificial equation?

Why we rely on algorithms

The term ‘algorithm’ dates back to the ninth century – long before we needed Google for everything. It comes from Muhammad ibn Mūsā al-Khwārizmī, a mathematician whose Latinised name, Algoritmi, gave us the word; in medieval Latin, it came to denote the decimal number system he helped popularise.

As we navigate through history, we see algorithms hard at work in:

  • Ancient Greece, where mechanical devices such as the Antikythera mechanism prefigured modern computing
  • The 19th-century Analytical Engine, with algorithmic input from Ada Lovelace
  • The Second World War, when Alan Turing played his part in cracking the Enigma code, paving the way for ‘general purpose’ computers.

Today, we use them to answer questions, drive from A to B, and segment our customers. They’re also extremely cost-effective for harnessing astronomical amounts of data. This has its benefits: personalisation and time saving, for example.

It also has potential for harm. New Scientist reports that “cash-strapped authorities are turning to algorithms they don’t fully understand in an effort to cut costs”. In one case, a police force investigated half as many reported assaults on the strength of a computer’s decision.

The filter bubble

Beyond their uses for enterprise, algorithms may also influence how we think. In 2018 – the year of Mark Zuckerberg’s historic congressional hearing – Facebook changed its News Feed algorithm to prioritise “posts that spark conversations and meaningful interactions between people”.

At the time, businesses groaned as they saw their page reach diminishing, but the long-term effects were far more pernicious.

“Intellectual isolation”

Let’s say you’re not really sure of your stance on something. You log in to Facebook to see what the world has to say. The algorithm has prioritised these ‘conversations’ and, as such, you’re confronted with hundreds of posts pushing a single hardline view. Before long, you’re down the rabbit hole, seeking out more information that aligns with those views.

In the background, apps and cookies are monitoring this activity to get a better impression of you. Let’s take Apple News – a news aggregator platform which “personalises content to your interests”. The more of this content you view, the more of it you’re fed. Before long, you’ve succumbed to the filter bubble.

A filter bubble is the intellectual isolation that can occur when websites make use of algorithms to selectively assume the information a user would want to see, and then give information to the user according to this assumption.

– Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You

The problem with this bubble is that it stops us from exploring alternative views, which in turn leads to confirmation bias. Like our hard-wired desire to conform, we seek out information that supports our views and dismiss the alternatives.
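
To see how this happens mechanically, here is a minimal sketch of that feedback loop in Python. The topic names, weightings and “exploration” share are invented for illustration; real feed-ranking systems are vastly more complex.

```python
import random
from collections import Counter

# A deliberately naive recommender: it weights topics by the user's
# past clicks, so whatever gets clicked early is shown more and more.
TOPICS = ["pro_vaccine", "anti_vaccine", "cats", "dogs"]

def recommend(click_history, n=10, explore=0.1):
    """Return n posts, weighted towards the user's most-clicked topics."""
    counts = Counter(click_history)
    feed = []
    for _ in range(n):
        if not counts or random.random() < explore:
            feed.append(random.choice(TOPICS))  # small exploration share
        else:
            topics, weights = zip(*counts.items())
            feed.append(random.choices(topics, weights=weights)[0])
    return feed

history = ["anti_vaccine"]  # a single early click...
for day in range(5):
    # Assume the user clicks whatever matches the view they last saw.
    history += [post for post in recommend(history) if post == "anti_vaccine"]

print(Counter(history))  # overwhelmingly one viewpoint within a few "days"
```

Engagement-weighted ranking is a rich-get-richer process: even the small exploration share barely dents the loop.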

In practice, this means a lot of family arguments, or dramatic WhatsApp group exits. Mic drop.

The deleterious effects of algorithms

So, you like dogs; I like cats. So what? If the effects of these filter bubbles were limited to claims of hoax moon landings, we could breathe easy.

For starters, we have the ongoing issue of fake news. Again, not a problem if we’re bickering over Kimye divorcing, but quite significant when we consider that lives are at stake.

The 2020 Reuters Digital News Report revealed that 37 per cent of us feel we have been given “misleading information” about the coronavirus:

[Chart: where people encountered misleading information about the coronavirus, by platform]

Source: The Reuters Institute Digital News Report 2020

That’s on social media alone. More than a quarter of us blame video sites, while almost a fifth blame search engines…and at 32 per cent? Of course. WhatsApp.

But it goes beyond COVID. I’m not even getting into Brexit or the 2016 election results because there aren’t enough hours in the day. Just look at the dangers of algorithms in 2020 and 2021:

Patients are refusing a ‘non-English’ vaccine

At the turn of the year, Dr Paul Williams, the Labour MP for Stockton South and a former GP, tweeted that some patients were turning down the COVID-19 vaccine because it was not ‘English’.

This is quite an eye-opening insight into polarised Britain. We might even call it groupthink: the group is so united in its nationalism that it makes poor decisions, like turning down a ‘non-English’ vaccine.

Think algorithms don’t play a part? Think again. In August 2020, Professor J. Paul Goode discussed the links between algorithms, artificial intelligence and nationalism. He wrote:

“Disruptive technologies can empower the powerless, influence the movements of populations and ideas across borders, change how the nation is imagined in daily life, and facilitate revolution.”

Think of that in the context of the Capitol riots.

The 2020 A-level fiasco

In just one of many examples of algorithmic bias, the 2020 A-level fiasco saw almost 40 per cent of students receive grades lower than their teachers predicted. Worse still, the UK government had capped growth rates for universities: even if universities had wanted to be lenient, they were not allowed to grow their intake by more than 5 per cent.

In lieu of real exams, an algorithm calculated students’ grades based on three things (a simplified sketch follows the list):

  1. Historical grade distribution of schools during the years 2017-2019
  2. The individual student’s rank, based on teacher evaluations
  3. The student’s previous results for that particular subject.
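
Here is a minimal sketch of how point one bites, assuming a much-simplified model; the real Ofqual standardisation was considerably more involved. Teacher rankings go in, and grades are dealt out to match the school’s historical distribution, whatever the individual achieved.

```python
# Hypothetical, much-simplified sketch of distribution-based grading.
# This only shows why a historical distribution penalises individuals.

def assign_grades(ranked_students, historical_distribution):
    """ranked_students: best-first list of names (teacher rankings).
    historical_distribution: the school's 2017-2019 share of each
    grade, best grade first, e.g. {"A": 0.2, "B": 0.3, "C": 0.5}."""
    n = len(ranked_students)
    grades, remaining = {}, list(ranked_students)
    for grade, share in historical_distribution.items():
        quota = round(share * n)
        for student in remaining[:quota]:
            grades[student] = grade
        remaining = remaining[quota:]
    for student in remaining:  # rounding leftovers take the last grade
        grades[student] = grade
    return grades

# A school that never produced an A gets no As, however strong
# this year's top-ranked student happens to be:
print(assign_grades(
    ["Asha", "Ben", "Cal", "Dee", "Eli"],
    {"A": 0.0, "B": 0.4, "C": 0.6},
))
# {'Asha': 'B', 'Ben': 'B', 'Cal': 'C', 'Dee': 'C', 'Eli': 'C'}
```

With the A-share pinned at zero by history, the ranking step cannot rescue this year’s strongest student.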

Point one is particularly noteworthy. It essentially penalises students for the past performance of others. Further analysis concluded that those in smaller schools, where teacher assessments carried more weight, were more likely to benefit.

Then there was the appeals process, which came with a fee – once again discriminating against students from certain backgrounds. Thankfully, the algorithm’s results have since been withdrawn and replaced with grades determined by teacher assessment – or, you know, an actual human being.

Skills and employment bias

Algorithms are powered by data – in other words, by human input. As humans, we may hold inherent biases about race or sex, and those biases can affect outcomes in anything from crime prediction to job applications.

One of the issues is the sheer scale of the data. Stas Sajin argues that it is not enough to simply “remove race” as a variable from job applications; there are too many other indicators of our personal attributes. As he notes, Amazon’s experimental hiring algorithm inadvertently discriminated against women: sex was never mentioned explicitly, but references like “Women’s Chess Club Captain” gave it away.
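
To illustrate the proxy problem, here is a hedged sketch on an invented toy dataset; this is not Amazon’s actual system. Sex is deliberately excluded as a feature, yet a model trained on biased historical decisions still learns to penalise a correlated proxy.

```python
# Invented toy data -- not Amazon's real system. "Sex" is deliberately
# NOT a feature, yet a proxy (mentions of a women's club) still lets a
# model trained on biased past decisions discriminate.
from sklearn.linear_model import LogisticRegression

# Features per CV: [years_experience, mentions_womens_club]
X = [
    [5, 1], [6, 1], [7, 1], [4, 1],  # CVs mentioning a women's club
    [5, 0], [6, 0], [7, 0], [4, 0],  # otherwise-identical CVs
]
y = [0, 0, 0, 0, 1, 1, 1, 1]  # biased historical "hire" decisions

model = LogisticRegression().fit(X, y)

# Two candidates identical except for the proxy feature:
print(model.predict_proba([[6, 1]])[0][1])  # low probability of "hire"
print(model.predict_proba([[6, 0]])[0][1])  # high probability of "hire"
```

Deleting columns is not the fix; auditing outcomes is – which is exactly the human intervention called for below.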

If we extrapolate this, algorithms could be penalising applicants based on their socioeconomic background, race, sex, faith – it calls for human intervention.

The power of Twitter

With all this evidence, we could be forgiven for claiming the algorithms have brainwashed us. Facebook has 1.82 billion daily active users – all that data is bound to influence our thinking.

But we have to take responsibility for our actions. Twitter, for example, offers a mixture of “algorithmic and real-time content”. Therefore, we might see Trump inciting violence because we follow him, or because we see it in the trending section.

If political unrest weren’t enough, we also have cancel culture. Earlier this week, Rowan Atkinson featured in a controversial article discussing online mobs, saying it was important that we consider a wide spectrum of opinion.

The article also cites the case of JK Rowling, a former national treasure who was ‘cancelled’ after a series of tweets discussing trans people. Would 81,000 people have waded into that conversation had it not been on Twitter? Or did those who had no opinion either way feel compelled to form one…because it was trending?

We must take responsibility

Whatever your opinions on JK Rowling, Rowan Atkinson, Donald Trump – you cannot ignore the facts. On January 6, Trump tweeted: “Get smart Republicans. FIGHT!” Twenty-four hours later, five people were dead.

It’s up to us whose accounts we follow, whom we choose to retweet and how we interpret information that’s given to us. President of Reuters News Michael Friedenberg perhaps sums it up best:

Historically, the supply chain of content has always been about creation and distribution. Somewhere along the line it added verification. But the order should be creation – verification – distribution.

We play a part in disseminating the message now. We need to consider the consequences.

The good news

The good news is that the onus is not on us alone. Social networks are facing up to their responsibilities and tightening their grip on ‘fake news’ culture. World leaders are not exempt from this. Take this excerpt from the Reuters report:

“Facebook has stepped up funding for independent fact-checkers and a number of platforms including Facebook, Twitter, and YouTube have taken down misinformation that breached guidelines, including a video from Brazilian President Jair Bolsonaro.”

In the same vein, Trump has now been banned from most major social media platforms.

Meanwhile, in February 2019, Google published a whitepaper on how it plans to fight disinformation, including:

  • Focusing on quality content
  • Counteracting ‘malicious actors’ (those out to deceive)
  • Giving users more context with knowledge panels
  • Partnering with experts and championing quality journalism.

[Image: family squabble]

What, us? Bickering?

So, what can you do?

If you’re finding yourself locked in daily tussles, take it from someone with five siblings:

1. Ask if it’s worth your time

Is the vaccine an exercise in government control, or the first step to getting through this pandemic? Better question: can you change anything either way? It’s best to focus your energy on things you can control, like how you talk to others.

2. Curate your own feed

Seeing too much content from the same groups? Leave the group, change your cookie settings, untick the advertising boxes, unfollow the hashtags. If you want to broaden your horizons, follow both camps. Draw your own conclusions.

3. Respect other people’s right to an opinion

Note that opinion is different from fact. If somebody is broadcasting an opinion as fact, it’s right to challenge them – particularly if it’s harmful. But if somebody is merely sharing an opinion, respect it; you have every right to disagree, or even to prompt healthy debate. You may have facts that they don’t, or vice versa.

4. Be careful what you share

It’s on you, dear reader. You have no idea how much influence that share button has. Use fact-checking tools like Snopes to verify claims before you pass them on. It’s quite eye-opening what you’ll find.

5. Question the algorithm

When the outcomes of an algorithm affect your life, by all means challenge them. The A-level results are just one example; AI may also sway job interviews, loan applications and goodness knows what else. Remember that, even in 2021, we can fall back on human common sense.

Above all, remember:

Those who cannot change their minds cannot change anything.

– George Bernard Shaw

It’s OK to change your mind. It’s fine to disagree with others. What’s not fine is other people getting hurt, and the algorithm can only work based on what we tell it.

by Katie Lingo
8th January 2021