I just did what the computer told me to

Alistair Croll
2 min read · Apr 10, 2017


Computers set our credit limits, insurance premiums, welfare assistance, taxation, and soon, every other facet of our lives. But decisions made by code can be horribly unjust. And if we abdicate responsibility to computers — and act like we’re just following orders — we make things worse. As we move towards an algorithmically governed society, we need to remember our humanity more than ever.

Take today’s news of United Airlines forcibly removing a passenger from an overbooked aircraft so its employees could travel. Confusion abounds: The passenger claimed to be a doctor, a detail that seems to have vanished from news reports; the airline, the police, airport security, and the TSA are all pointing fingers at one another.

Whatever the case, it’s a PR nightmare. What we do know is that when passengers didn’t volunteer for an $800 voucher in return for giving up their seat, the airline made an arbitrary decision about whom to remove.

Or rather, they asked a computer to do so. This is important.

Reports of the incident mention that the flight crew said “a computer system was used to select passengers for removal.” They were abdicating responsibility to an algorithm, and using that as an excuse for their otherwise inhumane behaviour.

The situation escalated from there, of course: When the passenger refused, the crew called in security, which they can do very easily for a wide range of reasons.

Psychologists have long known that humans will do terrible things if they feel they’re authorized to do so. Milgram’s obedience experiments and the Stanford Prison Experiment are two famous examples of this behaviour, which many have used to explain the willingness of war criminals to commit atrocities: “I was only doing my job.”

Computer says no.

Humans will soon be able to hide behind an algorithm to justify their actions. I suspect one of the jobs we’ll create, when robots and machines take the repetitive ones, is that of compassion and understanding. Humanity functions on gray areas—the cop who doesn’t pull you over for speeding a little bit; the welfare office that overlooks a single mother’s income from moonlighting, and gives her food stamps; the landlord who gives a tenant a break, rather than letting an algorithm decide rent hikes.

When machines remove the gray areas, it’s up to humans to put them back in. Hiding behind a computer’s decision is a dangerous, slippery slope.

