The biggest mistakes in MBA rankings


This originally appeared on LinkedIn, where you can follow John A. Byrne.

After debates over the value of an MBA, the most controversial topic in graduate business education is the ranking of business schools and MBA programs. As Matt Turner writes at Poets&Quants, rankings of academic institutions are inherently tough. “It’s difficult to put metrics around a nuanced educational environment,” explains Turner, who covers business school rankings for McCombs Today, published by the McCombs School of Business at The University of Texas at Austin.

And yet the rankings published by the Financial Times, Bloomberg Businessweek, U.S. News & World Report, The Economist, and Forbes carry a lot of weight. So we asked him to identify what he believes are the most common mistakes made by the organizations that routinely rank business schools. Here’s his list, which is bound to create a controversy of its own:

Imprecise questions: Media outlets need to take the time to educate themselves on the appropriate terms and lingo for higher education, admissions, salary earnings, and other data. Failure to do so leads to howling discrepancies.

Examples: An admitted student is not the same as an enrolled student. What is meant by “full-time” vs. “part-time” faculty (percentage of time hired, or tenure-track status)? Are emeritus professors included in the faculty count, and what about adjuncts, joint appointments, and faculty from other departments who routinely teach B-school classes? Different schools use different methods to calculate standardized (SAT, GMAT) exam scores. And is the school following MBA Career Services & Employer Alliance guidelines for reporting salary data?

Failure to show scores and clustering: Ranks are inherently misleading because you don’t know by how much one school outranks another. A couple of the major MBA rankings now include index scores, but many others do not. The same issue occurs within ranking sub-categories, where you get a rank instead of actual data. One doesn’t know if there is essentially a tie between schools, or if a yawning abyss separates them.
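The difference between a rank-only table and one with index scores can be seen in a few lines of Python. The schools and scores below are invented for illustration, not real ranking data:

```python
# Hypothetical index scores (0-100) for five schools; the numbers are
# invented to show how ranks hide clustering.
scores = {"A": 94.8, "B": 94.6, "C": 94.5, "D": 81.2, "E": 80.9}

# Sort by score, highest first, the way a published list would.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

for rank, (school, score) in enumerate(ranked, start=1):
    print(f"#{rank}  School {school}  score={score}")
```

The rank gap between #3 and #4 is the same “one place” as the gap between #1 and #2, but the underlying score gaps are 13.3 points versus 0.2: schools A, B, and C are essentially tied, while D trails far behind. A rank-only table throws that information away.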

Failure to adjust salary for cost of living or for industry sectors: This is perhaps less a “mistake” and more a “critique,” because strong biases are at play that are likely to be overlooked. Among MBA rankings, only Forbes attempts to adjust salary for cost of living, and only the Financial Times adjusts for variation between industry sectors. Since roughly 45% of all ranking weight, averaged across the five main MBA rankings (U.S. News, Businessweek, Forbes, the Financial Times, and The Economist), relies on salary and placement (by far the most heavily weighted area), failure to make these adjustments creates a strong hidden bias. The cost of living in New York City and San Francisco is not the same as in Denver or Minneapolis, and bicoastal salaries often run 5% to 10% higher than those elsewhere. As for industry sectors, consulting and finance tend to pay (sometimes considerably) more than marketing, general management, or non-profit jobs. In short, schools on the coasts, in large metro areas, or heavily focused on consulting and finance have a large advantage. Foreign schools have also made huge inroads into global rankings on the strength of salaries higher than those in the US.

Failure to separate apples from oranges: As Malcolm Gladwell pointed out, you can rank all things on a single dimension (miles per gallon, GMAT score, percentage employed at graduation, etc.), but if you want to attempt a holistic or comprehensive ranking, you need to limit the scope to a class, type, or specialty. We rank SUVs, trucks, and sports cars separately for a reason. Rankings often combine one-year MBA programs with two-year ones, 800-student programs with 40-student ones, foreign programs with domestic ones, and programs whose students are senior managers with programs whose students are 21-year-olds. While it’s tempting to combine them because they offer the same basic degree, the results quickly become meaningless.

Recruiter surveys: These pose a variety of problems. Companies have differing hierarchies (national vs. regional recruiters) with differing levels of autonomy, so it is difficult to know whether the right people are being surveyed. What a senior HR person says about a school at the national level can vary radically from what a local recruiter says about the same school. And what recruiters say about students may reflect a given year’s hiring quota more than the overall quality of the students. Media outlets usually keep schools in the dark about the details, and the results are often quite at odds with what recruiters report to the schools about their own satisfaction with a school and its graduates.

Failure to audit data: Only the Financial Times officially audits MBA ranking data. All other rankings assume the data is good and clean. Rarely do media outlets even spot-check data against what schools publish on their own websites (which sometimes differs radically).

Failure to gather data consistently: Sometimes ranking outlets will attempt to gather data from many schools, but when those schools decline to participate, they gather the data anyway from “various public sources” such as websites or by emailing enrolled students. This inevitably leads to inconsistent data.

Failure to check volatility: Academic institutions tend to change slowly. Barring a catastrophe, it is difficult to believe that any school could rise or drop 10 ranks in a single year on any comprehensive ranking, and yet this happens all the time because of poor methodology. Older, more established rankings have learned to check volatility somewhat by conducting biennial (rather than annual) rankings or by weighting past surveys in the current-year rank, or both, but many rankings fail to do either, which produces roller-coaster effects.
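The smoothing that the more established rankings apply can be sketched in a few lines. The function name, weights, and scores below are all invented for illustration; the point is only that blending the current year with prior years keeps one noisy survey from moving a school ten places:

```python
def smoothed_score(history, weights=(0.5, 0.3, 0.2)):
    """Weighted average of recent scores, newest first.

    history: scores ordered newest first, e.g. [this_year, last_year, ...]
    weights: how much each year counts; the current year dominates but
             past years damp sudden swings. Values here are invented.
    """
    return sum(w * s for w, s in zip(weights, history))

# A school whose raw score jumps from 70 to 90 in a single year
# moves only part of the way under smoothing:
print(smoothed_score([90, 70, 70]))  # 0.5*90 + 0.3*70 + 0.2*70 = 80.0
```

A ranking that uses only the current year's survey would report the full jump to 90; the weighted version reports 80, and the school climbs the rest of the way only if the improvement persists.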

Self-promotion: Several surveys contain criteria that are tied to outcomes of their own prior surveys, creating a self-fulfilling prophecy of sorts. For instance, a school gets extra points if its grads take up another degree or a faculty position at a top-ranked business school, with “top-ranked” defined by the media outlet’s own ranking of top schools.

To see composite rankings of the best business schools from Poets&Quants, check out:

The Top 100 MBA Programs in the U.S.

The Top 50 MBA Programs Outside the U.S.