CEOs, whether they run financial firms or sell mattresses, seem to want the world to believe they’re really and truly running tech companies. When it comes to banking, at least, new research shows that technology can make a difference when times get tough.
Banks that adopted more IT before the global financial crisis had fewer defaults when it hit and provided more credit after it began, economists Nicola Pierri and Yannick Timmer wrote in a working paper for the International Monetary Fund. The IT-intensive banks’ advantage wasn’t explained by their geography, nor by unloading souring loans faster once the crisis hit, according to the researchers. Instead, it appears the tech-focused firms made loans that were more resilient in the first place.
Pierri and Timmer measured IT adoption by looking at the ratio of personal computers to employees at bank branches, which they say correlates strongly with overall IT spending. To study non-performing loans, they analyzed mortgages sold to Freddie Mac, the US government-sponsored enterprise.
In 2010, at the height of the credit crunch, banks in the top quartile of IT adoption had about half the share of problem loans of those in the bottom quartile.
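To give a rough sense of the kind of comparison involved, here is a minimal sketch of how one might split banks into IT-adoption quartiles and compare loan performance. This is not the authors’ code or data; the column names and figures below are hypothetical, purely for illustration.

```python
# Hypothetical illustration: rank banks by IT adoption (PCs per employee)
# and compare average non-performing-loan (NPL) rates across quartiles.
# All numbers are made up for demonstration purposes.
import pandas as pd

banks = pd.DataFrame({
    "pcs_per_employee": [0.3, 0.5, 0.6, 0.8, 0.9, 1.1, 1.2, 1.4],
    "npl_rate_2010":    [0.080, 0.075, 0.060, 0.055, 0.050, 0.042, 0.038, 0.035],
})

# Label each bank with its IT-adoption quartile (Q1 = lowest, Q4 = highest).
banks["it_quartile"] = pd.qcut(
    banks["pcs_per_employee"], 4, labels=["Q1", "Q2", "Q3", "Q4"]
)

# Average non-performing-loan rate per quartile: in the paper's finding,
# the top quartile's rate would be roughly half the bottom quartile's.
print(banks.groupby("it_quartile", observed=True)["npl_rate_2010"].mean())
```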
In the years before the 2008 financial shock, loan performance was similar across banks regardless of how much technology they adopted; the tech-savvy crowd’s advantage only showed up once the financial system went into a tailspin.
“Our results suggest that technology adoption in lending can enhance financial stability through better monitoring and screening,” Pierri and Timmer wrote. “The IT adoption of banks seems to give banks an informational advantage regarding the mortgage.”
The researchers say other factors, such as funding sources, asset composition, and employee wages, aren’t particularly correlated with IT adoption. They also found that banks led by executives with tech credentials invested more in IT and made better loans. “These findings support the hypothesis that IT adoption in banking, which can be partly caused by executives’ personal experience and inclinations, led to more resilience during the crisis,” they wrote.
The research could matter to a range of stakeholders: investors may want to use this type of analysis when picking stocks and bonds, regulators could use it to gauge financial robustness, and boards may need to rethink their IT budgets.
The study also raises other issues. Banks with less profit to spare may have less to spend on IT, putting them at a competitive disadvantage and carrying ramifications for financial stability. Executives running some of Europe’s largest banks, for example, have complained that they are falling behind, and making less money than, their cousins across the Atlantic because they have to cope with lower interest rates.
The study’s authors acknowledge that there are still questions about what this research means for banks today, a decade after the last big crisis. The technologies used before 2008 could be substantially different from those now in use at big lenders, fintech startups, and even the Big Tech companies that offer financial services. There’s no way to test whizzy new cloud-based AI and machine-learning systems under severe duress until the next recession arrives.
Even so, Pierri and Timmer think financial technology has likely made the industry better off rather than more fragile: “The ‘fintech era’ is likely to be beneficial to financial stability,” they wrote.