On Aug. 10, Democratic candidate for US president Hillary Rodham Clinton unveiled a $350 billion plan to eliminate college debt and allow young Americans to complete four-year degrees without taking out loans.
Some see Clinton’s plan as a crucial step in the right direction. These days, it’s virtually impossible to self-finance an American college education. For those not getting help from mom and dad, loans and/or federal grants are a matter of course. “In 2014-2015, the school year just ended, the total of tuition, fees and room and board for in-state students at four-year public universities was $18,943,” reports Anya Kamenetz for NPR. “The maximum Pell Grant didn’t keep pace with that: It was $5,730.” That leaves the average grantee a gap of roughly $13,200 to cover annually. (Pell Grants are funded by the US federal government and awarded based on financial need, as determined by the FAFSA.)
To make ends meet without taking out substantial loans, “a student would now have to work 35 hours a week, every week of the year,” Kamenetz calculates. “To cover today’s costs with a low-skilled, minimum wage summer job? Over 90 days, a student would need to work 20.24 hours a day.”
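Kamenetz’s numbers hold up under one assumption the article doesn’t state: wages at the federal minimum of $7.25 an hour (my assumption here, not a figure from NPR). A quick back-of-the-envelope sketch:

```python
# Rough check of the work-hours figures above.
# Assumption (not stated in the article): the federal minimum
# wage of $7.25/hour.

TUITION_ROOM_BOARD = 18_943   # in-state, four-year public, 2014-15
MAX_PELL_GRANT = 5_730
MIN_WAGE = 7.25               # assumed federal minimum wage, USD/hour

gap = TUITION_ROOM_BOARD - MAX_PELL_GRANT   # $13,213 left to cover
hours_needed = gap / MIN_WAGE               # ~1,822 hours of work/year

per_week_year_round = hours_needed / 52     # ~35 hours/week, all year
per_day_summer = hours_needed / 90          # ~20.2 hours/day, 90 days

print(f"Gap after Pell Grant: ${gap:,}")
print(f"Year-round job: {per_week_year_round:.1f} hours/week")
print(f"90-day summer job: {per_day_summer:.1f} hours/day")
```

The arithmetic lands almost exactly on the quoted figures: about 35 hours a week year-round, or just over 20 hours a day for a 90-day summer.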
American colleges and universities surely know this. And yet they seem unwilling to shift away from a degree structure that is both bloated and exorbitantly priced.
American higher education could learn much about providing affordable, quality education from its UK cousin across the Atlantic.
British students are generally required to declare an intended concentration before gaining admission to a university. This, proponents say, fosters an atmosphere of focus: students commit to a path of study from the get-go, instead of the more casual dabbling a typical American liberal arts curriculum encourages. And a typical British bachelor’s degree takes three years to complete, not four. At current US prices, that translates to a lot of money saved on unnecessary credits.
Though it’s difficult to generalize the curricula of American colleges and universities, general education requirements are fairly ubiquitous, and they are usually completed within the first two years of undergraduate coursework. (At least, that’s the intention.) For comparison, consider the general education requirements of seven schools, representative of the varying kinds of institutions in the American higher-education landscape:
At the University of Texas at Austin, a large, public university in an urban setting, students are required to take a total of 13 general education courses before graduating.
At New York University, a large, private, urban university, undergraduates in the College of Arts and Sciences must complete seven courses in the “Morse Academic Plan” before degree conferral.
At Maine’s Bowdoin College, a small liberal arts college, students are required to complete five distribution requirements in everything from math to the visual and performing arts.
At Harvard University, an Ivy League institution, students must complete:
1 “aesthetic and interpretive understanding” course
1 “culture and belief” course
1 “empirical and mathematical reasoning” course
1 “ethical reasoning” course
1 “science of living systems” course
1 “science of the physical universe” course
1 “societies of the world” course
1 “United States and the world” course
(For a grand total of eight distribution requirements.)
At Wellesley College, a women’s college in Massachusetts, students are required to take a whopping 16 courses outside their major discipline.
At Howard University, a historically black college/university in Washington, DC, undergraduates must complete a freshman seminar and nine distribution requirements, demonstrate upper-level mastery of a foreign language, and fulfill a “cluster requirement” in African-American studies.
And at the California Institute of Technology, one of the nation’s premier engineering schools, students must complete 108 units in the humanities and social sciences before graduating. Typical non-STEM courses are nine units each, meaning roughly 12 courses must be completed in the liberal arts.
Admittedly, there are instances when general-education requirements are useful, even crucial, to a student’s eventual success. There is certainly an argument to be made for English majors taking the odd statistical reasoning and/or economics course. Likewise, when an area of study is central to a school’s identity, as African-American studies is at Howard, it’s understandable that administrators would want students to develop a stake in that identity.
That being said, in every instance detailed above, students appear to be spending at least a full year paying for coursework that may never prove useful in whatever professional path they ultimately pursue.
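To put that claim in rough numbers: assuming a typical full-time load of eight courses per year (my assumption; the article doesn’t specify a course load), the per-school counts quoted above translate into years of required coursework as follows:

```python
# Rough estimate of how much of a four-year degree each school's
# gen-ed/distribution requirements consume.
# Assumption (not from the article): a full-time load of 8 courses/year.

COURSES_PER_YEAR = 8  # assumed typical full-time load

gen_ed_counts = {       # figures quoted earlier in the piece
    "UT Austin": 13,
    "NYU (CAS)": 7,
    "Bowdoin": 5,
    "Harvard": 8,
    "Wellesley": 16,
    "Caltech": 108 // 9,  # 108 units at ~9 units/course = 12 courses
}

for school, n in gen_ed_counts.items():
    years = n / COURSES_PER_YEAR
    print(f"{school}: {n} courses, ~{years:.1f} years of coursework")
```

Under that assumption, even the lightest requirement (Bowdoin’s five) eats most of a year, and Wellesley’s 16 approaches two full years.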
Some say the flexibility of the American curriculum is what makes it special. That may be the case. The fact remains, however, that colleges and universities have a financial incentive to keep curricula general and “flexible.” The longer a student takes to decide on a path of study, the likelier they are to extend their enrollment. That means more credit hours before graduating, and more tuition dollars in the school’s pocket.
At the end of the day, the current model of degree financing in America isn’t sustainable. And while it’s nice to think of the college years as a period of “finding oneself,” dipping toes in a number of academic disciplines before finally settling on a course of study, it’s not a luxury everyone can afford. Nor should anyone be forced to pay for it.