Corporate corruption is everywhere. Can data catch the criminals? 

A scene from “Enron,” the stage play. Image: Reuters/Lucas Jackson

Executives keep damaging their companies with ethical lapses and inflicting billions of dollars in losses on the public. Almost all of it is preventable with data companies already hold about themselves. That, at least, is the claim behind a new breed of ethics and compliance software emerging thanks to machine learning and the enormous vaults of data held by the world’s biggest companies.

Lapses like Caterpillar’s alleged $2 billion tax fraud exposed by a whistleblower in 2013. Or Volkswagen’s decision to cheat on diesel emissions tests for 11 million cars (its settlement tab is $14.7 billion and growing). And Wells Fargo’s ploy to open as many as 3.5 million potentially fake bank and credit card accounts in customers’ names since 2002, incurring a $142 million national class action settlement.

The antidote, argues Patrick Quinlan, the founder of an ethics software company called Convercent, is handing companies a self-portrait, fair or foul, painted with their own data. Quinlan’s software plugs into corporate digital infrastructure, from email to contract management, ingesting gigabytes of data about employees and company transactions. Machine learning algorithms are turned loose to look for suspicious patterns of behavior.

Companies are forced to clean up their act, or double down on any malfeasance that’s revealed. “We cannot make you better,” Quinlan said in an interview. “We can only give you a mirror to look at who you really are and prove it to you.”

Can data prevent the next Enron?

Quinlan, a serial entrepreneur and former US Army infantryman, said he had no intention of building “ethics software” when he started Convercent in 2012. But after learning more about corporate and regulatory compliance, he realized the bigger opportunity lay in cultivating ethics. Compliance is about “walking away from the bad decision,” he says. “Ethics is walking toward something, doing the right thing.”

The promise, or hype, is that data will stop corporate crimes almost before they begin by transforming the culture of accountability.

More than 600 companies have signed up for Convercent’s service, including Uber, Airbnb, Microsoft, Tesla, Under Armour, and paper company Kimberly-Clark. Ultimately, Convercent plans to replace today’s often perfunctory, check-the-box compliance efforts with an aggressive accounting of companies’ own culture. The Denver-based company—which has so far raised $78 million, according to Pitchbook—is joined by other startups, such as the AI-powered chatbot Spot, aimed at pointing powerful algorithms at companies’ data to clean up their corporate messes.

For decades, compliance has dominated corporate culture. That’s now ending. Market pressure is building to create (or at least market) ethical company cultures that attract talent and shade out rivals. Studies suggest (pdf) that more ethical companies outperform the market, with higher returns and lower volatility on average. “I think there is an ethical transformation happening around the world,” says Quinlan. “It’s a replacement of regulators with the court of public opinion. This ethical transformation is every bit as real as the digital transformation.”

Forrester reports (paywall) that big enterprises are already applying tougher standards than regulators do. Meanwhile, US government agencies like the Securities and Exchange Commission have pulled ahead of corporate laggards by acquiring sophisticated fraud detection technology (only 2% of enterprises surveyed by EY in 2014 were using the most advanced automated fraud detection tools).

But Maurice Schweitzer, a management professor at the Wharton School of the University of Pennsylvania, says the most immediate driver of change is not new technology; it’s how society punishes the bad actors. For executives, it’s become personal.

Punishment has been “surprisingly light” for decades, says Schweitzer. Corporate malfeasance has historically had only an ephemeral impact on stock prices and balance sheets, usually amounting to far less than any ill-gotten profits. Take Goldman Sachs’ role in the 2008 financial crisis: its $5 billion fine for deceptively hawking subprime mortgage-backed securities barely dented its roughly $30 billion in annual revenue (the fine may even drop to $0 after some financial chicanery).

But Schweitzer says more than company balance sheets are on the line—executive pay and careers as well. Companies’ management can be eviscerated in front of millions on Twitter, Facebook, and in the press. “For CEOs,” says Schweitzer, “the risk of an ethical lapse is very, very high.”

Once-pliant corporate boards are taking a harder line. In the last two years, executives at Wells Fargo, VW, and United were jettisoned or sidelined after scandals. Wells Fargo’s board clawed back $75 million in compensation after pushing its CEO and head of community banking out of the company. At least when crimes go public, the message is clear. “There’s an opportunity for a company like this [Convercent] to exert an enormous amount of change for the better,” says Schweitzer.

How the technology works 

Many companies attempting to catch bribery, corruption, and fraud rely on rules-based tests run against spreadsheets. But these are easily evaded, says Vincent Walden at the accounting and professional services firm EY. In response, Convercent, along with competitors such as NAVEX and Metricstream, has moved its reporting systems online. But the real payoff comes once the process is automated.
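To make the contrast concrete, here is a minimal sketch in Python of the difference between a fixed, rules-based test and the kind of anomaly detection these vendors describe. The file name, column names, and threshold are hypothetical, chosen purely for illustration; this is not Convercent’s actual data model or pipeline.

```python
# Minimal sketch contrasting the two approaches described above. The file
# name, column names, and threshold are hypothetical illustrations, not
# Convercent's data model.
import pandas as pd
from sklearn.ensemble import IsolationForest

expenses = pd.read_csv("expenses.csv")  # hypothetical expense-ledger export

# 1) Rules-based test: flag any single payment over a fixed threshold.
#    Easy to evade by splitting one bribe into several smaller payments.
rule_flags = expenses[expenses["amount"] > 10_000]

# 2) Anomaly detection: learn what "normal" spending looks like across the
#    ledger and flag statistical outliers, even ones below the threshold.
features = expenses[["amount", "payments_per_week", "vendor_age_days"]]
model = IsolationForest(contamination=0.01, random_state=0)
expenses["anomaly"] = model.fit_predict(features)  # -1 marks an outlier
ml_flags = expenses[expenses["anomaly"] == -1]

print(f"rule-based flags: {len(rule_flags)}, anomaly flags: {len(ml_flags)}")
```

The point of the second pass is the one Walden makes: splitting a payment to slip under a rule still leaves an unusual statistical footprint, which an outlier model can surface for a human investigator.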

That was the challenge facing restaurant franchise Ruby Tuesday, which used monthly Excel spreadsheets to manage issues and keep track of thousands of employee documents. “The specific insights we were getting before were none,” said James Vitrano, Ruby Tuesday’s general counsel in charge of risk management. Because chains like Ruby Tuesday suffer 120% employee turnover each year, managers were often in the dark about what was happening across the organization.

Vitrano said he wanted a “360 degree understanding” of what was going on for employees. He used new data from Convercent to handle complaints, elevate high-performing restaurant managers, and reduce the risk of falling afoul of the law. “If I can enhance and improve [the employee] experience, guess what’s going to stop happening? Unethical shit,” he said. “People do things because they’re unhappy or there is a problem. I need to prevent that.”

Eventually, says Convercent’s chief product officer Phillip Winterburn, the company will be able to use natural language processing and predictive analytics to surface patterns that head off ethical issues before they arise. It’s already ingesting data from human resource departments’ helplines, surveys, and enterprise accounting, contracting, supply chain management, and travel applications (Winterburn says it anonymizes data while looking for patterns, “without breaking down privacy and personal protections”).
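One common way to reconcile pattern-finding with privacy is pseudonymization: direct identifiers are replaced with opaque, consistent tokens before analysis, so repeated reports about the same person still cluster together without exposing names. The sketch below illustrates that general technique in Python; the key, field names, and report format are assumptions for illustration, not a description of Convercent’s system.

```python
# Rough illustration of pseudonymization: replace direct identifiers with a
# keyed hash before pattern analysis. An assumed approach for illustration,
# not Convercent's actual design.
import hmac
import hashlib

SECRET_KEY = b"store-this-in-a-secrets-vault"  # hypothetical key

def pseudonymize(identifier: str) -> str:
    """Deterministically map an identifier to an opaque token."""
    digest = hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

# Hypothetical helpline report before anonymization.
helpline_report = {
    "reporter_email": "jane.doe@example.com",
    "subject_employee": "E-10387",
    "category": "sales pressure",
    "text": "Manager asked us to open accounts without customer consent.",
}

# The same person always maps to the same token, so repeated reports about
# one manager still group together in later analysis.
anonymized = {
    **helpline_report,
    "reporter_email": pseudonymize(helpline_report["reporter_email"]),
    "subject_employee": pseudonymize(helpline_report["subject_employee"]),
}
print(anonymized)
```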

Data alerts can go directly to board members, ensuring that suspicious reports are not hidden. Quinlan argues that Wells Fargo, for example, would have clearly seen a pattern of sales managers pressuring employees to break the law (data that was reported but never properly analyzed in the years before the scandal broke).

Building corporate Big Brother

If this all sounds like a corporate dystopia in which unwitting employees are monitored at all times, you’re not alone. Companies are beginning to be able to easily infer employees’ sexual orientation, political party, and other personal information they would have no right to in other contexts. “It’s a tool that could be used for good and a tool that could be used like Big Brother,” says Wharton’s Schweitzer. “We need oversight … somebody to watch the watchers.”

But tech has raced ahead of regulators. Stanford University researcher Michal Kosinski has claimed (controversially) that algorithms can already deduce sexual orientation, intelligence, political leanings, and even criminal inclinations from faces alone. Anti-discrimination laws on the books may prohibit retaliation, of course, but the jurisprudence on how to handle discrimination by algorithm is thin. Bright lines may be needed on what companies can, and cannot, do as they gain access to almost all communications and employee actions, and then analyze text, voice, or visual communication.

Winterburn, who is leading Convercent’s new data integrations, says he sticks to using data that customers legally collect, and treads cautiously in new territory to ensure privacy while still allowing companies to root out malfeasance. Defining what that will look like as new data streams come online is less certain. “I don’t have a good answer,” he said. “It’s something that will evolve over time.”

Convercent faces its own ethical dilemmas. It’s working with the poster children for companies not doing the right thing: Philip Morris (lying about tobacco’s addictiveness), Zenefits (insurance violations), and Uber (it’s a long list) are all customers. Quinlan argues these are, in fact, precisely the kinds of companies Convercent wants to work with.

To help companies improve, you need to work with the problem children committed to reform, not just model students. “We can either say: we won’t get involved or we will help you get there,” said Quinlan. “I think we can help them get there.”