We’re going to have to clean up our language when talking about the future.
To clarify that ambiguous statement: we’re wading into a future in which we will need more precise definitions to discuss increasingly complex, finely nuanced objects, situations, and roles people hold in the world. However it unfolds, it’s a good bet the future will involve things for which we don’t yet have good names (see “vape,” “entopreneur,” and “card clash” if you have any questions). Catch-all terms, particularly when applied to emerging phenomena, do us more harm than good, and we need to find better options for communicating about them if we’re going to understand what comes next.
The sphere of emerging technology is probably where the definition gaps currently yawn widest. Take the term “hacker” — not new per se, but one that has seen a lot of action in the past year as the pace of attacks on networks, databases, and infrastructure has appeared to accelerate. It was certainly well abused during the recent Sony debacle. In “A Brief History of Hackerdom,” Eric Raymond reminds us that “hacker” was originally used in the early 1960s to refer to amateur computer enthusiasts, people who tinkered with and built hardware and software. It applied to a fast-changing and diversifying set of subcultures around programming and computing. As such, it would probably cover many more dedicated internet users today — people who have taken it upon themselves to learn the tools of the now-networked world.
Under current definitions, only a small minority of us would comfortably claim the title, because “hacker” has been largely criminalized in its most common usage. It is now bandied about to refer to anyone carrying out activities on or around computers or networks that go against the interests of businesses, governments, or powerful individuals — not simply clearly criminal attacks or nefarious activity. (Not to mention that the term “hack” has also been co-opted to refer to tips for productivity and efficiency in all areas of life.)
We don’t distinguish among “hacking” behaviors now — everything done in any way to harm, compromise, gain unauthorized access to, probe, or monitor a system without its owner’s knowledge is considered a hack or hacking. And media outlets obligingly stretch the definition as wide as possible for short headlines and shallow stories. We don’t ask about function, motive, provenance, authority, or any other detail. Hacking is coding is doxing is theft is an intelligence operation is a malware insertion is a leak is a practical workaround. File it all over there, in the menacing box with the skull and crossbones on it.
Without knowing the nature or context of an act, it’s far too easy to sweep everything that seems similar under one very large rug and leave it there. But in doing so, we stand to learn absolutely nothing about it or from it, and are no better prepared to deal with similar issues the next time they affect us. Security expert Bruce Schneier just made this point with regard to Sony’s recent troubles. We don’t really know anything firm about the incident, or who was behind it, so simply running around shouting “hackers!” doesn’t tell us much. Who executed the attack, why, and how are all important data points for making future strategic adjustments.
They’re everywhere. (Who’s “they”?)
As a fellow observer of near futures pointed out, similar things are happening with words like “robots,” “algorithms,” and “drones.” We casually use them as shorthand, but (increasingly) there are worlds of difference between, say, an industrial robot on a production line and a telepresence unit on wheels, of the kind we’ve seen Edward Snowden’s face teetering on recently. “Robot” used to mean a humanoid machine capable of executing commands. Yet, advances in engineering mean the machines we task to do things for us take many shapes, and only a minority look anything like us. So when a headline shouts “Are Robots Stealing Our Jobs?” one has to ask, “What job, performed how and by whom?” to get closer to a meaningful understanding of what a robot could be here.
“Algorithm” is currently the hot term among tech folk and in the mainstream press alike, used to refer to any black-box, computerized formula that makes a decision, whether it’s used to sell you socks instead of panty hose, or to deny you the ability to board a flight. The term now shows up in major newspapers, above the fold, but few people on the street can tell you what an algorithm is. But, man, are they responsible for a lot of critical decisions.
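To ground the term: an algorithm is nothing more mysterious than an explicit, step-by-step decision procedure. A minimal sketch in Python — the rules, fields, and threshold below are invented purely for illustration, and do not reflect how any real screening or recommendation system works:

```python
# A toy "algorithm": an explicit rule that turns inputs into a decision.
# Every field name, weight, and threshold here is hypothetical.

def flag_for_review(passenger: dict) -> bool:
    """Return True if this invented scoring rule flags a passenger."""
    score = 0
    if passenger.get("paid_cash"):
        score += 2  # arbitrary weight, for illustration only
    if passenger.get("one_way_ticket"):
        score += 1
    if passenger.get("booked_same_day"):
        score += 1
    return score >= 3  # threshold chosen arbitrarily for the sketch

print(flag_for_review({"paid_cash": True}))                            # False
print(flag_for_review({"paid_cash": True, "one_way_ticket": True}))    # True
```

The point of the sketch is that the “black box” is just a list of rules someone wrote down — which is exactly why it matters who wrote them and what they weigh.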
Likewise with the word “drone.” I think you’d know the difference between a fully armed Reaper drone locked on your location and a cheap palm-sized toy buzzing around you, at least for a few meaningful seconds. Even the “drone” industry is having a hard time settling on terminology. Part of this search for a better label is about marketing clarity; part of it is defense against negative attention. The term has already become quite sticky, as has the stigma around it, so differentiating names by function, or adding qualifiers (toy drone, military drone, farming drone), is tough. Yet, as time goes on, we’ll probably see more fine-tuned language around drones, because unlike with “hackers,” we can stratify a good deal of what’s going on with drones in our daily lives, and we’ll need names for different activities so we don’t accidentally call in a Hellfire missile strike when we just want an orchard irrigated or a package delivered.
In the dark
But “hackers,” “algorithms,” and to some extent “robots” sit behind metaphorical — or actual — closed doors, where obscurity can benefit those who would like to use these terms, or exploit the realities behind them, to their own benefit, though perhaps not to ours. We need better definitions, and more exact words, to talk about these things because, frankly, these particular examples are part of a larger landscape of “actors” that will define how we live in coming years, alongside other ambiguous terms like “terrorist” or “immigrant,” about which clear discourse will only become more important.
Language is power: power that can open up, or close down, knowledge and understanding, both of which we need to make informed decisions about individual and collective futures. Not everyone needs to become a technical expert, or keep a field guide to drones and robots handy (though it might be useful sooner rather than later), but, as I’ve pointed out in the case of complex systems and supply chains, we might all benefit from a clearer understanding of how the world is changing around us, and of what new creatures we’ll encounter out there. Perhaps it’s time we all start wielding language with greater clarity. I’m sure the robots will.