Perhaps the biggest development in information technology in the 21st century has been the rise of big data. But our capacity for tracking and storing information has so far generally outpaced our ability to analyze and make use of it effectively. We tend to assume that more data is better, yet when we're tasked with answering a simple question (for example, which kind of product is purchased most often on which day of the week), the answer can be obscured rather than clarified by an excess of information about the time of day of purchase, the temperature, the weather conditions, broader economic conditions, and so forth. Moreover, the human labor required to build the algorithms and systems architecture capable of crunching such vast data sets has usually proved time-consuming and costly.
The relatively new field of predictive analytics has offered some of the most promising approaches to finding actionable insights in big data. Businesses, governments, and healthcare companies have used predictive models to spot statistical patterns in their data, allowing them to identify both risks and opportunities. In one recent successful case study, the Tennessee Highway Patrol used predictive analytics to estimate with a high degree of precision the conditions in which accidents would occur, allowing the agency in one year to cut traffic fatalities to their lowest level since 1963.
Heterogeneous mixture learning (HML) has emerged as one of the most innovative solutions in the sphere of predictive analytics. Created by the IT company NEC, which began research into the technology in 2011, HML's industry-changing potential lies in its ability to let the software itself generate analytic algorithms tailored to a specific set of data. By automatically identifying frequently occurring patterns hidden in vast amounts of information and writing algorithms to capture and predict those patterns in the future, HML accomplishes in far less time what human analysts might need weeks to achieve.
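To give a rough sense of the idea, the sketch below is a deliberately simplified toy in Python, not NEC's actual HML implementation: it partitions synthetic data into segments and fits a simple, interpretable model within each one. The features, thresholds, and segment logic are invented purely for illustration.

```python
# Toy sketch of the core idea behind heterogeneous mixture modeling:
# split the data into segments, then fit a separate simple model per segment.
# This is an illustrative simplification, not NEC's HML algorithm.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic data whose behavior differs by segment (e.g., weekday vs. weekend sales).
X = rng.uniform(0, 10, size=(1000, 2))
is_weekend = (X[:, 0] > 5).astype(int)
y = np.where(is_weekend, 3.0 * X[:, 1] + 2.0, -1.5 * X[:, 1] + 10.0) + rng.normal(0, 0.5, 1000)

# Step 1: a shallow tree discovers the segments hidden in the data.
segmenter = DecisionTreeRegressor(max_depth=2).fit(X, y)
segments = segmenter.apply(X)  # leaf id for each training sample

# Step 2: fit a simple, interpretable model within each segment.
models = {}
for leaf in np.unique(segments):
    mask = segments == leaf
    models[leaf] = LinearRegression().fit(X[mask], y[mask])

# Prediction routes each new observation to its segment's model.
def predict(x_new):
    leaf = segmenter.apply(x_new.reshape(1, -1))[0]
    return models[leaf].predict(x_new.reshape(1, -1))[0]

print(round(predict(np.array([7.0, 3.0])), 2))
```

The appeal of this kind of structure is that each segment's model stays small and readable, which is one reason segment-plus-simple-model approaches are often described as more transparent than a single monolithic model.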
The applications for such technology are almost countless. With an astonishing level of accuracy, HML can forecast the deterioration of infrastructure like bridges and tunnels, enabling agencies to shore up faulty facilities in advance and prevent serious accidents. Thus, HML has the potential not just to save money and time, but even lives.
An illustrative example of the effective application of HML can be found in the telecommunications industry, where one of the biggest concerns for subscription-based companies is customer retention. The reasons why customers drop service or neglect to renew might seem so variable that they are hard to forecast accurately, but HML offers a big-data analytics solution capable of carrying out "churn prediction": identifying which customers are most at risk of canceling subscription services. NEC has worked with a number of telecom service providers in the Asia-Pacific region and employed churn management to great effect, reducing the churn rate by 10% in some campaigns.
The information yielded by HML analysis can also help companies identify which customers are almost certain to leave regardless of intervention, allowing them to shift focus, effort, and resources to customer bases where they can do the most good. In an added boon to businesses, HML is highly accurate and transparent, making it an essential tool for 21st-century commerce. With the help of such technology, corporate leaders can begin to regard the future less with apprehension and more with data-supported confidence about how best to act.
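For readers curious what churn prediction looks like in practice, the sketch below is a minimal, hypothetical example: it trains a generic classifier on synthetic subscriber data and ranks customers by predicted churn risk. The features, labels, and model choice are assumptions made for the demonstration, not NEC's production approach.

```python
# A minimal churn-prediction sketch on synthetic subscriber data.
# Illustrative only; not NEC's HML models or a real telecom data set.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 5000

# Hypothetical subscriber features: tenure (months), support calls, monthly spend.
tenure = rng.integers(1, 60, n)
support_calls = rng.poisson(2, n)
spend = rng.normal(50, 15, n)
X = np.column_stack([tenure, support_calls, spend])

# Synthetic churn labels: short tenure and frequent support calls raise churn risk.
logit = -1.5 - 0.05 * tenure + 0.6 * support_calls + 0.01 * (spend - 50)
churn = rng.random(n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(X, churn, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Rank customers by predicted churn risk so retention offers target the right people.
risk = model.predict_proba(X_test)[:, 1]
highest_risk = np.argsort(risk)[::-1][:10]
print("Top-10 at-risk customer indices:", highest_risk)
```

In a real deployment the ranked risk scores, rather than the raw model, are what a retention team would act on, which is why transparency about which factors drive a customer's score matters as much as raw accuracy.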
Read more about the ways that NEC’s HML technology is uniquely equipped to leverage big data.
This article was produced on behalf of NEC by the Quartz marketing team and not by the Quartz editorial staff.