Why payments in the US have been stuck in the Dark Ages

Might as well just use this stuff.
Image: AP Photo/David Duprey

Payment systems and user behaviors have evolved over the past three decades. In this first of a two-part Monday Note, I offer a look at the obstacles and developments that preceded the Apple Pay launch.

When I landed in Cupertino in 1985, I was shocked, shocked to find that so much gambling was going on in here. But it wasn’t the Rick’s Café Américain kind of gambling. It was the just-as-chancy use of plastic: Colleagues would heedlessly offer their credit card numbers to merchants over the phone; serious, disciplined executives would hand their AmEx Platinums to their assistants without a second thought.

This insouciant way of doing business was unheard of in my Gallic homeland. The French (and most Europeans) think that trust is something that must be earned, that it has a value that is debased when it’s handed out too freely. They think an American’s trusting optimism is naïve, even infantile.

After I got over my shock, I came to see that my new countrymates weren’t such greenhorns. They understood that if you want to lubricate the wheels of commerce, you have to risk an occasional loss, and that the rare, easily remedied abuses are more than compensated for by a vibrant business. It wasn’t long before I, too, was asking my assistant to run to the store with my Visa to make last-minute purchases before a trip.

(On the importance of Trust and its contribution to The Wealth of Nations—or their poverty—see Alain Peyrefitte’s La Société de Confiance [The Society of Trust]. Unfortunately the work hasn’t been translated into English, unlike two of Peyrefitte’s other books, The Trouble with France and the prophetic 1972 best-seller The Immobile Empire. The title of the latter is a deplorable translation of Quand la Chine s’éveillera… Le monde tremblera, “When China Awakes, The World Will Shake,” a foreboding attributed to Napoleon.)

The respective attitudes towards trust point to a profound cultural difference between my two countries. But I also noticed other differences that made my new environment feel a little antiquated.

For example, direct deposit and direct deduction weren’t nearly as prevalent in America as in France. In Cupertino, I received a direct-deposit paycheck, but checks to cover expenses were still “cut,” and I had to write checks for utilities and taxes and drop them in the mailbox.

Back in Paris, everything had been directly wired into and out of my bank account. Utilities were automatically deducted 10 days after the bill was sent, as mandated by law (the delay allowed for protests and stop-payments if warranted). Paying taxes was ingeniously simple: Every month through October, a tenth of last year’s total tax was deducted from your bank account. In November and December, you got a reprieve for holiday spending fun (or, if your income had gone up, additional tax payments to Uncle François—Mitterrand at the time, not Hollande.)

Like a true Frenchman, I once mocked these “primitive” American ways in a conversation with a Bank of America exec in California. A true Californian, she smiled, treated me to a well-rehearsed Feel-Felt-Found comeback, and then, dropping the professional mask, she told me that the distrust of electronic commerce that so astonished me here in Silicon Valley (of all places) was nothing compared to Florida. There, she said, it was common for retirees to cash their Social Security checks at the bank, count the physical banknotes and coins, and then deposit the money into their accounts.

Perhaps this was the heart of the “Trust Gap” between Europe and the US: Europeans have no problem trusting electronic commerce as long as it doesn’t involve people; Americans trust people, not machines.

My fascination with electronic payment modes preceded my new life in Silicon Valley. In 1981, shortly after starting Apple France, I met Roland Moreno, the colorful Apple ][ hardware and software developer who invented the carte à puce (literally “chip card,” but better known as a “smart card”) that’s found in a growing number of credit cards, and in mobile phones where it’s used as a Subscriber Identity Module (SIM).

A SIM card.
Image: Reuters/Chaiwat Subprasom

The key to Moreno’s device was that it could securely store a small amount of information, hence its applicability to payment cards and mobile phones.

I carried memories of my conversations with Moreno with me to Cupertino. In 1986, we briefly considered adding a smart-card reader to the new ADB Mac keyboard, but nothing came of it. A decade later, Apple made a feeble effort to promote the smart card for medical applications such as a patient ID, but nothing came of that, either.

The results of the credit card industry’s foray into smart-card technology were just as tepid. In 2002, American Express introduced its Blue smart card in the US with little success:

“But even if you have Blue (and Blue accounts for nearly 10% of AmEx’s 50 million cards), you may still have a question: What the hell does that chip (and smart cards in general) do? The answer: Mostly, nothing. So few stores have smart-card readers that Blue relies on its magnetic strip for routine charges.”

In the meantime, the secure smart chip found its way into a number of payment cards in Europe, thus broadening the Trust Gap between the Old and New Worlds, and heightening Moreno’s virtuous and vehement indignation.

(Moreno, who passed away in 2012, was a true polymath; he was an author, gourmand, inventor of curious musical instruments, and, I add without judgment, an ardent connoisseur of a wide range of earthly delights.)

Next came the “Chip and PIN” model. Despite its better security—the customer had to enter a PIN after the smart card was recognized—Chip and PIN never made it to the US, partly because there were no terminals into which customers could type their PINs (let alone terminals that could read the smart cards in the first place), but also, just as important, because the credit card companies were reluctant to disturb ingrained customer behavior.

It appeared that smart cards in the US were destined to butt up against these two insurmountable obstacles: The need for a new infrastructure of payment terminals and a skepticism that American customers would change their ingrained behavior to accept them.

In 2003, I made a bad investment in the payment system field on behalf of the venture company I had just joined. The entrepreneur that came to us had extensive “domain knowledge” and proposed an elegant way to jump over both the infrastructure and the customer behavior obstacles by foregoing the smart card altogether. Instead, he would secure the credit card’s magnetic stripe.

During the due-diligence process, a French bank executive illustrated the problem using a real-life example. A garçon at a chic restaurant makes dozens of magnetic copies of all the credit cards that pass through his hands. He hands the copies to his comrades who spread through the city and run up a couple million euros in charges in just a few minutes—well before the card issuers’ security systems have time to notice the unusual purchases. (The waiter was eventually caught after a careful collation of the duplicated cards revealed they had one thing in common: The chic restaurant near Place Vendôme.)

The trouble with a conventional credit card is that the information that’s encoded in the magnetic stripe never changes, thus making it impossible to tell the difference between an authentic card and a copy. Our entrepreneur solved this problem by hiding a battery-driven circuit under the magnetic stripe that would write a new, one-time cryptographic token each time the card was swiped. You use your credit card at a shop, the token is sent to and decoded by the card issuer, and, if that token hasn’t been seen before, a Thumbs Up signal is sent back to the merchant. If someone tries to make a magnetic copy of your card, they get a copy of the token that you used most recently—a token that’s no longer any good. Thumbs Down and confiscate the card.
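The one-time-token idea described above can be sketched in a few lines of code. This is an illustrative reconstruction, not the startup’s actual design: I assume the card derives each fresh token from a shared per-card secret and an incrementing swipe counter (the class names, the HMAC scheme, and the counter window are all my own assumptions), and the issuer gives a Thumbs Down to any token it has seen before.

```python
import hmac
import hashlib
import secrets

class Card:
    """Hypothetical smart stripe: rewrites a fresh token on every swipe."""

    def __init__(self, secret: bytes):
        self.secret = secret
        self.counter = 0

    def swipe(self) -> bytes:
        # Each swipe bumps the counter and writes a new one-time token.
        self.counter += 1
        return hmac.new(self.secret, self.counter.to_bytes(8, "big"),
                        hashlib.sha256).digest()

class Issuer:
    """Hypothetical card issuer: accepts each valid token exactly once."""

    def __init__(self, secret: bytes):
        self.secret = secret
        self.seen = set()

    def authorize(self, token: bytes) -> bool:
        # A valid token must match some plausible counter value (a small
        # window, an assumption here) and must not have been used before.
        for counter in range(1, 1000):
            expected = hmac.new(self.secret, counter.to_bytes(8, "big"),
                                hashlib.sha256).digest()
            if hmac.compare_digest(expected, token):
                if token in self.seen:
                    return False       # replayed magnetic copy: Thumbs Down
                self.seen.add(token)
                return True            # fresh token: Thumbs Up
        return False                   # token matches no counter: reject

secret = secrets.token_bytes(32)
card, issuer = Card(secret), Issuer(secret)

t1 = card.swipe()
assert issuer.authorize(t1)        # genuine swipe accepted
assert not issuer.authorize(t1)    # the garçon's copy replays t1: rejected
t2 = card.swipe()
assert issuer.authorize(t2)        # next genuine swipe accepted
```

The crucial property is the last pair of assertions: a magnetic copy can only replay the most recently written token, which the issuer has already consumed, so the clone fails even though the physical stripe format is unchanged.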

The elegant security, coupled with compatibility with existing terminals and habits, was met with great enthusiasm…and money. It was big, bold, and more interesting than yet another e-commerce deal. But the technology never really worked. No hard feelings—this is the nature of the venture business.

Then the iPhone appeared. With its SIM module, decently secure operating system, and nice user interface, Apple’s new smartphone re-ignited my hopes for a saner way of making payments.

In addition, Apple was a pioneer in the micro-payment field. In 2003, iTunes had introduced a clever, music-by-the-slice system that was quickly embraced by hundreds of millions of users—and their credit cards. iTunes begat the iOS App Store, and the credit cards came with it. (We’ll note here that Apple never intended to make money from the micro-payment system; its purpose was to sell iPods and, later, iPhones and iPads.)

For a while, nothing came of this virtuous combination of hardware, software, and App Store. Worse, Google got “there” first in 2011 when it offered a Google Wallet app on Android devices equipped with NFC (Near Field Communication) chips. The devices only worked in stores equipped with terminals that accepted MasterCard PayPass—but the system worked (at least technically).

An NFC payment system.
Image: AP Photo/Manu Fernandez

Apple’s lack of innovation and apparent lack of concern was widely criticized: How could the Cupertino company let Google take the lead in such a crucial sector of economic activity? Observers “ordered” Apple to adopt NFC on the double and come up with its mobile payment solution—now!

Ultimately, Google Wallet failed to take the world by storm and, in 2013, its creators paused to reflect on its failures [emphasis and insert mine]:

“Jonathan Wall, the founding engineer of Google Wallet, puts it bluntly: ‘With Google Wallet, we had one point of failure—the carriers’.”
“Sprint remains the only major U.S. carrier to support the service; AT&T, T-Mobile, and Verizon have instead decided to support Isis [now renamed SoftCard], a competing mobile payments service, effectively denying their customers access to Google Wallet (or vice versa).”

How did Google not foresee this? It’s on page one of Our Friendly Carriers’ playbook: Never Become a Dumb Pipe.

Finally, three years “late,” on Sept. 9, 2014, Apple Pay was announced. Six weeks later, it was activated in stores. It Just Works—where accepted.

Next week, we’ll look at the reactions to Apple’s new payment system. Some are sane and reassuring, others hilarious and unfortunate. As usual, opponents and incumbents contend that Apple shouldn’t barge into a world where it has no credibility. And, just as usual, the reaction among iDevice users is mostly positive.

You can read more of Monday Note’s coverage of technology and media here.