Every time a mass murder or terrorist attack occurs, we begin working backwards. Did the perpetrators offer early telltale signs? Could the tragedy have been stopped beforehand? In the hours and days that follow, we quickly discover it isn’t any single action, but a pattern of many behaviors, transactions and activities that foreshadow carnage. But what if we could get our arms around those patterns and predict an attack was going to occur with 80%, 90%, or even 100% certainty? What if we had the ability to intercept a tragedy such as the shooting that recently occurred in Las Vegas?
That is the power of today’s predictive analytics technology.
Today, big data and predictive analytics are hard at work stringing together billions of data points and pinpointing future outcomes with unprecedented accuracy. And while you might accept that this is occurring on Wall Street and in the US Defense Department, you may be surprised to learn how accurately we can predict human behavior. For example, we can now say, with 86% accuracy, whether a person is going to trip and fall within the next three weeks. It turns out a decline of 5 centimeters per second in a person's normal walking gait is the precursor to a fall. Who knew?
But that’s just for openers. We can also predict whether an individual is predisposed to become an opioid addict long before a doctor writes that first legal prescription which sets them on a dangerous path. We know that children who torture and kill small animals are likely to become dangerous sociopaths. And we can identify which teenagers are susceptible to binge drinking.
We know where and when hurricanes will make landfall, how much energy will be generated tomorrow by wind farms across the country, and whether a newborn is genetically prone to thousands of cancers, baldness, soft ear wax, and antisocial behavior. Never mind the collapse of a nation’s currency, or the effect oil shortages will have on banana prices in Tokyo.
Every day, computer algorithms are at work gathering data in real time and spitting out inevitable conclusions about what lies ahead. And as experts comb through Las Vegas shooter Stephen Paddock’s computer, social media accounts, cell phone records, credit card transactions, medical history, childhood, and other data, we will uncover a pattern of behavior building toward a dangerous breaking point.
In truth, we have the scientific ability to forecast when a violent attack is likely. It may not be 100%, but it is accurate enough to identify individuals who are prone to carrying out mass attacks, as well as when they may be preparing to act.
But this ability raises a slew of difficult ethical and legal questions. If a computer model shows you are 90% likely to commit a violent act in the next few days, what should and can we do? Even more to the point, what if a fully accurate assessment requires monitoring private information such as medical records? In Paddock’s case, his father was a diagnosed violent psychopath, and psychopathy is considered a heritable trait. Is that information something law enforcement should have access to?
Should law enforcement know that a person predisposed to psychopathology (Paddock) was prescribed and began taking Diazepam in June, a drug known to produce violent behavior in individuals with preexisting aggressive tendencies? Or that he lived in 27 residences in four states and was considered an antisocial loner in each of those communities? Or that Paddock sent his live-in girlfriend away so that she would be out of the country when he planned his attack? Should they have access to the fact that he reserved a room at the Las Vegas Ogden, located across the street from the Life is Beautiful concert, as well as the Blackstone Hotel in Chicago, overlooking the Lollapalooza music festival? How about his internet searches for Fenway Park and the Boston Center for the Arts, which also recently held open-air events? How about his attempt to purchase tracer ammunition, used to improve the accuracy of nighttime shooting, just a few weeks beforehand?
And if law enforcement did have access to all this data, and predictive models showed Paddock was poised to act, what then? Do we send in the pre-cog thought police to arrest him, à la the movie Minority Report?
These are frightening questions. They represent lines our society is not prepared to cross.
Computer analytics are one thing, but what about free will? Just because a person makes detailed plans and intends to commit a crime doesn’t mean they won’t come to their senses at the last second. Even if a computer model achieves 100% certainty, are we prepared to take preventive measures based on intent alone?
On the other hand, we have to admit it is just as dangerous to ignore the evidence and deny that analytics can intercept many attacks. If we know there is a very high probability of danger, don’t we have an ethical obligation to act on that knowledge and save lives?
These are the legal and ethical questions we will have to come to terms with in the near future as predictive models race toward 100% certainty. And we must start that conversation today. Because progress will not stop. Technology and science will not be slowed. And at the rate at which technology is accelerating, there should be no doubt that we will soon be able to avert mass tragedies in the same way we evacuate entire cities in advance of a deadly storm.
The time has come to work forwards, not backwards.