October 24, 2008

In defense of the U.S. News rankings

They're not perfect, but they're a useful compass for families navigating the college search.

It is now received wisdom that the U.S. News & World Report college rankings are at best meaningless and at worst deliberately manipulative. No more accurate than astrological predictions, these rankings can never hope to compete with the learned judgments of counselors or friends. But is the standard narrative really true? I doubt it.

You see, to better examine the question, you’ve got to look at what the magazine actually uses to rank schools. (You didn’t think it threw darts, did you?) According to its website, the factors used are peer assessment (25 percent), retention (20 percent), faculty resources (20 percent), student selectivity (15 percent), financial resources (10 percent), graduation rate performance (5 percent), and alumni giving rate (5 percent). These certainly give the lie to the claim that the rankings are “arbitrary”—each is, at the very least, tangentially related to the difficult-to-define “academic quality.” And, although the weights are subjective, they seem reasonable—any other weighting would be just as subjective. Indeed, the magazine does not claim that the weights are objective—it states that each “reflects [its] judgment about how much a measure matters.” Tweaking may produce a different ranking, but the factors as assembled do not create entirely meaningless results.
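To see how a weighting scheme like this works mechanically, here is a minimal sketch in Python. It is not U.S. News's actual formula—the magazine's normalization of each factor is not public in this detail—but it illustrates how seven factor scores and the listed percentage weights combine into one composite number. The factor names and the 0–100 scale are assumptions for the example.

```python
# Illustrative only: combine the seven factors the magazine lists
# into a single weighted score. Each factor value is assumed to be
# pre-normalized to a 0-100 scale; the weights are the published
# percentages, which sum to 1.0.

WEIGHTS = {
    "peer_assessment": 0.25,
    "retention": 0.20,
    "faculty_resources": 0.20,
    "student_selectivity": 0.15,
    "financial_resources": 0.10,
    "graduation_rate_performance": 0.05,
    "alumni_giving_rate": 0.05,
}

def composite_score(factors: dict) -> float:
    """Weighted sum of the normalized factor scores."""
    return sum(WEIGHTS[name] * factors[name] for name in WEIGHTS)

# A hypothetical school with perfect marks on every factor
# scores (approximately) 100, since the weights sum to 1.
perfect = {name: 100.0 for name in WEIGHTS}
print(round(composite_score(perfect), 6))
```

Note that because every factor feeds the same sum, a school strong on heavily weighted factors (peer assessment, retention, faculty resources) can outrank one that excels only on the 5-percent factors—which is exactly the kind of judgment call the magazine admits the weights encode.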

It is also often claimed that because the relative positions of the schools change from year to year, the system has no reliability. How can the University of Chicago be worse than Duke and then, quite suddenly, not? But, in fact, that the positions change implies precisely the opposite: The system is working. If a list of schools in order of quality were to remain constant for years, there would be grounds for suspicion. As matters stand, however, that the academic quality of schools is constantly in flux is no cause for alarm. That the University of Chicago was placed below Duke in one year and matched with it the next is no caprice: It merely reflects how U.S. News perceives academic quality. The change is admittedly barely perceptible, but it is there.

But what of the charge that there are certain factors that are immeasurable? What of Chicago’s reputed intellectualism? How can anyone possibly pin that down? I think the answer is that in many cases these “immeasurable” factors are exaggerated. No offense to the school, but I don’t think that Chicago’s students are significantly more inquisitive, intelligent, and intellectual than, say, Columbia’s or Harvard’s. Perhaps in some ways, but not in others. Chicago’s student body may be a little more distinctive, but I would guess that most Chicago students would happily attend Columbia or Harvard.

Are there legitimate criticisms to be made of the rankings? Of course. There are serious concerns that “peer assessment” has devolved into a prestige-fest and is far too subjective. Some colleges (notably Sarah Lawrence and Reed) have even refused to play ball. But I think that the methodology remains valid on a macro level—i.e., colleges within a certain bracket (5 or 10 slots in size) can together be fairly judged against other brackets but not against each other.

Can U.S. News ever replace visiting colleges, speaking with current and former students, and simply getting a feel for a place? It patently cannot: Choosing one’s college is an immensely personal experience. Nor, however, does U.S. News claim that it can:

“Of course, many factors other than those we measure will figure in your decision, including the feel of campus life, activities, sports, academic offerings, location, cost, and availability of financial aid. But if you combine the information in this book with college visits, interviews, and your own intuition, our rankings can be a powerful tool in your quest for college.”

The purpose of the U.S. News rankings is to give rationally ignorant parents (hopefully not students!) who do not have the time to study colleges a reasonable list of high-quality schools to investigate. They should be used as a starting point, like Wikipedia—not as a research paper.

Consider: If your local community college (whose academics, according to U.S. News & World Report, are lousy) happened to feel inherently great, would you prefer it to the University of Chicago? Would you attend? Before you dismiss the rankings as meaningless, ask yourself how many colleges not present on the list you applied to (that you would seriously consider attending for the next four years). Did you investigate the academic prowess of every single college in the United States? Why are those at the top of the list almost all the schools to which a Chicago student would likely apply? I suspect that there is a real correlation between ranking and academic quality, even if the differences between neighboring schools are marginal.