March of the Algorithms: Who’s at the wheel in the age of the machine?

by Jenny Nicholls / 16 February, 2019

Complacently relying on algorithms can lead us over a cliff – literally, in the case of self-driving car navigation systems. Photo/Getty.

Jenny Nicholls looks at how algorithms can go rogue.

On the road to the sea at Jones Beach, Long Island, New York, lies a series of strangely low bridges. Their designer, a 1920s urban planner called Robert Moses, allowed just 8-9ft (about 2.5 metres) of clearance between the bridges' arches and the surface of the road.

This was high enough for cars filled with the kinds of people Moses liked – white and wealthy – to reach his new, award-winning state park at the beach. But it wasn’t high enough for buses, filled with the kind of people he didn’t like: black and poor. A contemporary who worked with him described Moses as “the most racist human being I had really encountered”. 

This horrible story appears in the first chapter of a new book by English mathematician Hannah Fry, Hello World: How to be Human in the Age of the Machine (Penguin, $40). It might seem strange to begin a book about computer code with a story about old bridges. But, as Fry explains, “racist bridges aren’t the only inanimate objects that have had clandestine control over people. Sometimes it’s maliciously factored into their design, at other times it’s a result of thoughtless omissions.”

Meet the racist soap dispenser. It was working fine – until Chukwuemeka Afigbo tried to use it. "If you have ever had a problem grasping the importance of diversity in tech and its impact on society," he tweeted, "watch this video." Inside the device was a simple piece of computer code – an algorithm – reading a sensor that didn't recognise black skin, because it hadn't been tested on a wide enough range of skin colours.
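
The failure mode is easy to picture in code. Here is a hypothetical Python sketch of a dispenser whose trigger threshold was calibrated only on readings from light skin – every number below is invented for illustration:

```python
# Hypothetical sketch: an IR soap dispenser fires when reflected light exceeds
# a threshold. If the threshold is tuned only on readings from light skin,
# darker skin may never trigger it. All reflectance values here are invented.
light_skin_readings = [0.80, 0.85, 0.90, 0.78]   # the only data used in testing
dark_skin_readings  = [0.30, 0.35, 0.28]         # never seen during calibration

# Calibrate: trigger at 80% of the weakest reading in the test data.
threshold = 0.8 * min(light_skin_readings)

def dispenses(reading):
    """Return True if the sensor reading is strong enough to dispense soap."""
    return reading > threshold

print([dispenses(r) for r in light_skin_readings])  # [True, True, True, True]
print([dispenses(r) for r in dark_skin_readings])   # [False, False, False]
```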

Fry calls algorithms “the cogs and gears” of the modern machine age. Algorithms fly planes, diagnose disease, advise us what to read, listen to and buy, and in the US, tell judges who to send to jail. At their worst, says Fry, they can be “arrogant and dictatorial”.

Not to mention as biased as the bridges of Moses’ highway.

As Fry explains, the rule-based algorithms of classic computing – if a happens, then do b; if not, then do c – are being replaced by machine-learning algorithms, which are far more opaque. Given a goal and criteria for success, these can work out how to solve a problem for themselves; they can even self-replicate. They are good at data-heavy pattern-recognition problems, such as language translation or face recognition.

The trouble is that when you let a machine work out the solution to a complex problem by itself, the workings, unlike those of a rule-based algorithm, are often impossible to follow. And if the data sets it learns from are incomplete, or biased, the algorithm will be too – as an African-American Harvard professor discovered when ads aimed at "felons" (as people who have been imprisoned are called in the US) began to follow her online.
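
The distinction is easy to make concrete. Below is a minimal Python sketch – not from Fry's book, with invented names and figures – of a hand-written rule beside a trivially "learned" one. The rule can be read line by line; the learned threshold is simply whatever number best fits the history the machine was shown, biases included:

```python
# Rule-based: a human writes the logic explicitly, so every decision is traceable.
def rule_based_approve(income, debt):
    # "If a happens, then do b; if not, then do c"
    if income > 50_000 and debt < 10_000:
        return True
    return False

# Machine-learning flavour: the rule is derived from data rather than written down.
# Here we "learn" the single income threshold that best separates past outcomes.
def learn_threshold(examples):
    # examples: list of (income, was_good_customer) pairs -- the training data
    best_threshold, best_correct = 0, -1
    for threshold, _ in examples:
        correct = sum((income > threshold) == outcome
                      for income, outcome in examples)
        if correct > best_correct:
            best_threshold, best_correct = threshold, correct
    return best_threshold

history = [(20_000, False), (35_000, False), (60_000, True), (80_000, True)]
threshold = learn_threshold(history)
print(rule_based_approve(60_000, 5_000))  # True -- and you can see exactly why
print(60_000 > threshold)                 # True -- but why this cutoff? Ask the data
```

Scale that threshold search up to millions of learned parameters and you have the opacity problem Fry describes: the answer may be right, but the reasoning is nowhere to be read.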

At its best, algorithmic decision-making can be fast and efficient, tireless and objective. Unlike many alarmed tech commentators, Fry does not argue algorithms are inherently bad, any more than bridges are. Often, she points out, research shows they perform more reliably and more objectively than humans, even highly paid ones such as medical specialists or judges.

Algorithmic decision-making is being adopted wherever tough decisions need to be made: for the granting of loans, bail, benefits, places in US colleges and job interviews. But complacently relying on them, Fry shows us, can lead us over a cliff – literally, in the case of car navigation systems.

In a Guardian piece headlined, dramatically, “Franken-algorithms: the Deadly Consequences of Unpredictable Code”, writer Andrew Smith points out that “bias doesn’t require malice to become harm, and unlike a human being, we can’t easily ask an algorithmic gatekeeper to explain its decision”. This might even be part of the appeal.

You could almost feel Silicon Valley shudder last March, when a self-driving car killed a woman wheeling a bicycle after becoming confused by her shape – shopping bags were hanging from the handlebars. The vehicle's algorithms abruptly passed control to a "back-up driver" who hadn't even been watching the road. In a parallel tragedy in 2009, an Air France pilot forgot his flight basics after the algorithms handed him control during an emergency. Lulled into complacency by long hours of algorithmic flight, he had, fatally for him and his passengers, forgotten how planes work.

US entrepreneur Peter Thiel became rich by buying and packaging online personal data.

One of the most fascinating chapters in Fry’s book is titled, simply, “Data”. Online marketing algorithms, she explains, categorise us in sometimes bizarre ways.

A chief data officer for a company that sells insurance told Fry his company used grocery shopping information gleaned through a supermarket loyalty scheme to learn more about their customers. “They’d discovered that home cooks were less likely to claim on their insurance, and were therefore more profitable. But how did they know which shoppers were home cooks? Well, there were a few items in someone’s basket that were linked to low claim rates. The most significant, he told me, the one that gives you away as a responsible, houseproud person more than any other, was fresh fennel.”
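
The analysis the data officer describes amounts to comparing the claim rate of customers who buy an item against those who don't. A toy Python sketch, with entirely invented data and item names:

```python
# Toy sketch of basket analysis: which grocery items correlate with low claims?
# (items in a customer's shopping basket, did they later claim on insurance?)
customers = [
    ({"fennel", "olive oil", "flour"}, False),
    ({"fennel", "stock", "onions"}, False),
    ({"energy drink", "frozen pizza"}, True),
    ({"frozen pizza", "crisps"}, True),
    ({"fennel", "butter"}, False),
]

def claim_rate(group):
    """Fraction of customers in the group who made a claim."""
    return sum(claimed for _, claimed in group) / len(group)

for item in ("fennel", "frozen pizza"):
    buyers = [c for c in customers if item in c[0]]
    others = [c for c in customers if item not in c[0]]
    print(f"{item}: buyers claim at {claim_rate(buyers):.0%}, "
          f"non-buyers at {claim_rate(others):.0%}")
```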

Imagine what you could infer about a person if you had more data – such as the stuff they “liked” and “shared” and “posted” online.

US entrepreneur Peter Thiel’s company, Palantir Technologies, is a “data broker” worth around $20 billion – about the same market value as Twitter. Thiel became rich by buying and packaging online personal data into cross-referenced profiles and on-selling the information to companies looking for categories of people to advertise to.

“Your name, your date of birth, your religious affiliation, where you like to holiday, your credit-card usage, your net worth, your weight, your height, your political affiliation, your gambling habits, your disabilities, the medication you use, whether you have had an abortion, whether your parents are divorced, whether you are prone to addiction, whether you are a rape victim, your projected sexual orientation, your real sexual orientation, your gullibility” – these are the kinds of deeply personal details and inferences that may be filed under an ID number you will never be given. Your secrets, says Fry, are a commodity.

Brokers put “cookies” on your computer to act as a signal to the companies they represent. Sometimes this is helpful: it spares a non-driver endless car ads, for instance, or helps stop fraudsters impersonating other users. But in other cases Fry describes, categorising a consumer with an algorithm can have traumatic, even malignant, effects. The jury is still out on what all this is doing to society.
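
The cookie mechanism Fry mentions is simple enough to sketch. In this hypothetical Python example – the broker and cookie names are invented – a broker's server mints an ID the first time it sees a browser, then recognises that same ID on every site that embeds the broker's content, stitching the visits into one profile:

```python
# Hypothetical sketch of a broker's tracking cookie (names invented;
# real brokers are far more elaborate).
import uuid
from http.cookies import SimpleCookie

def broker_response(request_cookie_header):
    """Return (profile_id, Set-Cookie header) for an incoming request."""
    cookie = SimpleCookie(request_cookie_header or "")
    if "broker_id" in cookie:
        profile_id = cookie["broker_id"].value      # seen this browser before
    else:
        profile_id = uuid.uuid4().hex               # first visit: mint a new ID
    out = SimpleCookie()
    out["broker_id"] = profile_id
    out["broker_id"]["max-age"] = 60 * 60 * 24 * 365   # persist for a year
    return profile_id, out.output(header="Set-Cookie:")

# The same browser visiting two different sites that embed the broker's tracker:
pid, header = broker_response(None)                  # site A: new ID issued
pid2, _     = broker_response(f"broker_id={pid}")    # site B: same ID recognised
print(pid == pid2)  # True -- one profile, stitched across sites
```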

When Heidi Waterhouse miscarried a longed-for pregnancy, she told a conference of software developers in 2018, it proved impossible to stop a flood of pregnancy ads, followed, in due course, by baby ads, which appeared on every site she visited. “Nobody who built that system thought of that consequence,” she said.

Notoriously, a firm called Cambridge Analytica used personality profiles to target voters with fake news stories during Donald Trump’s election campaign in 2016. It is hard not to feel nauseated by a sentence like this: “[Undercover footage appeared to show] Cambridge Analytica finding single mothers who score highly on neuroticism, and preying on their fear of being attacked in their own home, to persuade them into supporting a pro-gun-lobby message.”

Commercial advertisers use methods like these, says Fry, “extensively”.

And so does at least one state, in the name of “social order”. China is bringing in a system of “social credit” in which everyone is tracked and rated by an algorithm using data from everything from social media to ride-hailing apps. Those with high ratings receive access to a wide range of dazzling benefits, such as no-deposit apartments. Those with low ratings will find travelling or getting loans difficult. Last June, Wired magazine noted that a Chinese journalist-blogger was stopped from buying plane tickets by the system. He had racked up court fees defending himself in a defamation case, and the court shared his details with the nascent rating agency.

In 2016, former math prodigy Cathy O’Neil published a book on algorithms called Weapons of Math Destruction, which argues that algorithms can polarise communities and entrench prejudice. She calls for “algorithmic audits” of any systems that impact the public.


Back in 2004, a 19-year-old future tech magnate messaged his friend.

Zuck: Yeah so if you ever need info about anyone at Harvard, just ask.

Zuck: I have over 4000 emails, pictures, addresses…

[Redacted friend’s name]: What? How’d you manage that one?

Zuck: People just submitted it.

Zuck: I don’t know why.

Zuck: They “trust me”.

Zuck: Dumb fucks.


The billionaires who own tech companies will recoil at even the faintest, most molecular whiff of “algorithmic audits”. But who’s in charge here? In her final chapter, Fry concludes with a powerful call to stop seeing machines as our “objective masters” and to start treating them “as we would any other sort of power. By questioning their decisions; scrutinising their motives [and] demanding to know who stands to benefit.”

This article was first published in the March 2019 issue of North & South.
