First, we had suspects

Fifty years ago, if you worked for law enforcement, you started with a crime.

First, you had a suspect; then you collected data

Homer’s got a point: do the easy stuff first.

The point is, you did the easy stuff first. Nobody wanted to wade through bags of garbage just in case there was something incriminating hiding between the rancid salmon and last week’s stale teabags. And wiretaps were hard, requiring warrants and cooperation from the phone company.

Today, you have data, and find suspects

The field of Predictive Policing (which is full of software startups and controversies) is based on the notion of using data to guess where crimes are most likely to happen. It’s a concept that was well explored in Minority Report and other films.

It has serious problems. Two of them are:

  • Its algorithms shit where they eat. Software that predicts where crime will occur can easily become a self-fulfilling prophecy. When more police patrol a predicted hotspot, they make more arrests there; those arrests go back into the data the software learns from, confirming the prediction and sending still more patrols. Meanwhile the arrests lower property values and tax revenue, which makes it harder for a healthy community to thrive. (There’s a toy simulation of the loop right after this list.)
  • It suffers from reporting bias. If you leave a mattress out on the street in the rich Oakland Heights area, everyone calls the authorities. If you leave one out in the poor parts of Oakland, someone takes a nap. So the source data is questionable in the first place.
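
Here’s that feedback loop as a toy simulation—my own sketch, not any vendor’s actual algorithm. Two areas have identical underlying crime rates; patrols go wherever the most incidents have been recorded, and recorded incidents pile up wherever the patrols are. All the constants are illustrative.

```python
import random

# Toy model: two areas with identical true crime rates. Patrols follow the data;
# the data follows the patrols. All numbers are illustrative.
TRUE_RATE = 0.3            # chance of an incident per area per day
DETECT_PATROLLED = 0.9     # chance it's recorded when police are there
DETECT_UNPATROLLED = 0.2   # chance it's recorded when they're not

random.seed(42)
recorded = {"A": 1, "B": 0}   # a single early arrest in A seeds the loop

for day in range(1000):
    # "Prediction": send patrols wherever the most crime has been recorded so far.
    hotspot = max(recorded, key=recorded.get)
    for area in recorded:
        if random.random() < TRUE_RATE:   # a crime actually happens
            detect = DETECT_PATROLLED if area == hotspot else DETECT_UNPATROLLED
            if random.random() < detect:  # ...and gets recorded
                recorded[area] += 1

# One area ends up with several times the recorded crime of the other,
# even though the underlying rates were identical.
print(recorded)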

Collecting data is easy

But content is not always intent. The possibility of misinterpretation is huge.

  • Searches aren’t crimes. When breast cancer patients searched for medical information, their searches were treated as adult content, because the word “breast” was being broadly interpreted as smut (there’s a toy version of that kind of filter right after this list). Even the decisions about what to block are weirdly subjective.
  • All talk, no action. Those who discuss terrorism are probably more open about doing so than the actual bad guys, who (presumably) take steps to hide their actions.
  • Political winds change quickly. A decade after the Russians were America’s allies in World War II, US citizens were being blackballed from jobs and society for consorting with people from the Soviet Union.
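
For what it’s worth, the “breast” problem is what naive keyword filtering looks like in code. This is entirely my own illustration—the block list and queries are made up—but it shows how context-free matching flags patients and misses everything it never thought to list:

```python
# Toy keyword filter of the kind that turned medical searches into "adult content".
# The block list and queries below are illustrative only.
BLOCKLIST = {"breast", "xxx"}

def is_flagged(query: str) -> bool:
    """Flag a query if it contains any blocked term, regardless of context."""
    return any(term in query.lower() for term in BLOCKLIST)

print(is_flagged("breast cancer treatment options"))  # True: a patient gets filtered
print(is_flagged("how to hide a body"))               # False: alarming, but not on the list
```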

Collect-and-find-suspects is the upending of “innocent until proven guilty,” a fundamental tenet of our society. But if you think the current state of affairs is bad, you should consider what tomorrow will be like.

A computer is better than you at recognizing things

Google’s new Photos app is creepy-good at finding data in your photos. I synchronized pictures with it recently, and it extracted every face from my daughter’s class photo (though it didn’t know who the faces were, yet). But it doesn’t recognize those faces, because it’s not allowed to.

For example:

Have you seen this man?

Google knows perfectly well that’s a picture of Barack Obama. But it’s chosen not to look him up, because the photo of him is in my private photo collection. This is a policy decision, not a capability decision. Lawmakers and law enforcement likely have entirely different policies.
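 
To make the distinction concrete, here’s a rough sketch of detection-without-recognition using the open-source OpenCV library. It’s my illustration, and says nothing about what Google actually runs; the file names are hypothetical. Finding the faces takes a few lines. Naming them is a separate step that a service can simply decline to perform:

```python
import cv2  # OpenCV; pip install opencv-python

# A stock Haar-cascade face detector that ships with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("class_photo.jpg")            # hypothetical input file
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Detection: where are the faces?
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Found {len(faces)} faces")

# Recognition (putting a name to each face) would need a separate model and a
# gallery of known people. Skipping that step is a policy choice, not a technical limit.
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("faces_marked.jpg", image)
```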

When computers watch movies for us

Most tourists in Amsterdam look like this anyway.

The video’s a bit funny—the AI is obsessed with bicycles for some reason—but it’s also great foreshadowing. McDonald can return home, type in a search string, and instantly find that moment in his journey around the city. It might look rudimentary today, but this stuff is getting better, faster than you think.
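
Here’s a minimal sketch of how that kind of search could work: run an off-the-shelf image classifier over sampled frames, store the labels with timestamps, then look a word up to jump to the moment it appears. This is my assumption about the general approach, not the actual tool behind the video; the model choice, sampling rate, and file name are all mine.

```python
import cv2
import torch
from torchvision.models import resnet50, ResNet50_Weights

# Pretrained ImageNet classifier; the model and sampling rate are arbitrary choices.
weights = ResNet50_Weights.DEFAULT
model = resnet50(weights=weights).eval()
preprocess = weights.transforms()
labels = weights.meta["categories"]

def index_video(path, every_n_frames=30):
    """Label roughly one frame per second; return (seconds, label) pairs."""
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30
    index, frame_no = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_no % every_n_frames == 0:
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            tensor = preprocess(torch.from_numpy(rgb).permute(2, 0, 1))
            with torch.no_grad():
                best = model(tensor.unsqueeze(0)).argmax().item()
            index.append((frame_no / fps, labels[best]))
        frame_no += 1
    cap.release()
    return index

def search(index, query):
    """Return every timestamp whose label mentions the query word."""
    return [t for t, label in index if query.lower() in label.lower()]

# Hypothetical usage: jump to every moment the classifier saw a bicycle.
# moments = search(index_video("amsterdam_walk.mp4"), "bicycle")
```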

Now consider the number of video sources in the world. Imagery from every traffic camera, bank machine, airport drop-off zone, taxi dashcam, restaurant security camera, and more. There are millions in the UK alone.

The other British Broadcasting Corporation: It’s people (or its people).

Consider, too, the past. Because video footage is a time machine. A video search isn’t just about what’s being recorded today, but what’s been recorded and stored in the past. Every frame on Facebook, YouTube, and elsewhere. Every digital recording system.

You know you’re in there.

Today, scouring through video footage is the equivalent of dumpster-diving in the fifties. It’s smelly, and tiresome, and nobody really wants to do it. There’s a coefficient of friction that makes it hard to do, so it’s not worth doing.

Yet.

We’re about to clear another milestone—parsing video content better than humans. And once we can do it in parallel, we can search years of footage instantaneously.
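
And the parallel part is the easy part. Once each recording has been boiled down to a label index like the one sketched earlier, searching an archive of them is an embarrassingly parallel lookup. A sketch, under the assumption that every camera’s footage has already been indexed into (timestamp, label) pairs stored as JSON files—the layout and format are mine:

```python
import json
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def search_index(index_path, query):
    """Return (source, timestamp) hits from one camera's pre-built label index."""
    entries = json.loads(Path(index_path).read_text())   # assumed format: [[seconds, label], ...]
    return [(str(index_path), t) for t, label in entries if query.lower() in label.lower()]

def search_archive(index_paths, query, workers=8):
    """Fan the same query out across every index at once."""
    index_paths = list(index_paths)
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = pool.map(search_index, index_paths, [query] * len(index_paths))
    return [hit for hits in results for hit in hits]

# Hypothetical usage, one JSON index per camera or per stored video:
# hits = search_archive(Path("indexes").glob("*.json"), "bicycle")
```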

Tomorrow, you’ll have history, and find data targets

When parsing and searching video gets easy, it’ll change how we approach homeland security, background checks, employment interviews, and even documentation of our daily lives. This technology’s already making great strides in policing, and has promising consequences. If it’s available to the general public, it might lend us an air of civility—knowing the great Other is watching, perhaps we’ll be more careful.

At the same time, a life constantly watched is one in which we’ve lost our right to be forgotten, and where, in the absence of a recording, we’re presumed guilty. Tomorrow’s panopticon casts a wide gaze, seeing through our own cameras, annotated effortlessly by the code with which we gird ourselves in search of safety.

CCTV graffiti, via Wikipedia
