Grinnell’s 2019 U.S. News ranking

Topics/tags: Grinnell, academia, college rankings

Each year, U.S. News and World Report releases rankings of American colleges and universities, grouped into a wide array of categories [1]. The U.S. News rankings are fascinating creatures. Many people criticize them for inappropriate priorities [2], some less-than-reliable methodologies [3], and, for many years, a not-very-transparent system. As far as I know, U.S. News has addressed the last issue: I’ve seen reports in newsmagazines on how much each factor is weighted; many factors are also taken directly from, say, IPEDS data [4]. Institutions are supposed to be careful in the data they report to IPEDS [5].

Because of these and other related issues, many institutions, including Grinnell [6], have pledged to avoid using U.S. News rankings in their marketing. I don’t know whether or not these schools also refuse to respond to the U.S. News opinion surveys.

But most people also acknowledge that there’s a wide array of people who rely on the U.S. News rankings in assessing institutions. And so, even though the College doesn’t advertise its ranking or focus on changing particular characteristics in order to raise its ranking [7], we do pay attention. I’ll admit that there have been times that I’ve worried about the potential effects were we to drop out of the top twenty.

For most of my time at Grinnell, our ranking has been in the mid-teens in the list of National Liberal Arts Colleges. If I recall correctly, we were #13 in my first or second year. Last year, we were #18 or so. I also seem to remember that we dropped a bit when schools that were not traditionally in that list got added to the list [8]. I also have a vague memory that Carleton (now #5) was ranked right next to Grinnell when I started.

In any case, U.S. News and World Report just released their 2019 ranking of National Liberal Arts Colleges. And we’re number 11 [9]! I’m pretty sure that’s the highest ranking we’ve had in my time at Grinnell.

So, what changed from last year to this year that brought us up seven slots? I’d like to say that someone paid attention to the commitment to the humanities evident in the HSSC or that it reflects Grinnell’s planned improvements in support for student wellness. But that’s doubtful. Perhaps we’ve made huge changes to the curriculum. No, we haven’t. The CS major has grown enormously over the past few years, but it’s not likely to impact our U.S. News ranking. There is one thing, though: we have worked hard to improve our retention rate [10].

In the end, other issues are at play. We’ve traditionally been one of the top teaching institutions, often ranked #2 under Best Undergraduate Teaching. But we dropped to #5 or so in that ranking last year. Why? No one knows. As far as I can tell, that factor is based entirely on opinion rather than hard numbers [11]. This year, people apparently think better of us once again. Why? I don’t know [12].

More importantly, U.S. News has changed the criteria it uses and the weights it assigns to them. Inside Higher Ed tells us about some of the new criteria:

“New this year in the outcomes section are two social mobility factors that together make up 5 percent of the total ranking. One looks at the graduation rates of Pell Grant recipients, and the other compares Pell-recipient graduation rates to those of all students.”

I would expect that those criteria have a large impact on Grinnell’s rating. Our need-blind admissions policies [14] mean that we admit and enroll a large number of students who could not otherwise afford a top-tier liberal arts education [16]. Beyond the Pell-eligible students, we also enroll significantly more students who cannot afford to pay full fare than do the other schools in our need-blind cohort, at least if the charts I’ve seen are to be believed. While that’s not necessarily the best thing for the long-term financial health of the College, it does suggest that Grinnell can make more of a difference in social mobility.

U.S. News also tinkered with a few other factors. And, well, the scores are pretty close. We scored 88 points. The #10 school scored 89. The #16 school scored 87. A little shift can make a big difference. I would venture to guess that the increased emphasis on social mobility [17], the tinkering with other factors, and a renewed teaching reputation all produced some little shifts. The combination of those little shifts let us jump up the scale.
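To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The only number taken from the reporting above is the 5 percent weight on the social mobility factors; all of the component scores and the other weights are hypothetical placeholders, not U.S. News’s actual formula. The point is simply that when the #10, #11, and #16 schools are separated by a point or two, a decent showing on a new 5-percent factor can plausibly account for the whole move.

```python
# Back-of-the-envelope illustration: NOT the actual U.S. News formula.
# All component scores and most weights are made up; only the 5% weight
# for the new social-mobility factors comes from the Inside Higher Ed quote.

def composite(components):
    """Weighted sum of (score out of 100, weight) pairs."""
    return sum(score * weight for score, weight in components)

# A hypothetical school whose composite lands near 88.
before = [
    (90, 0.35),   # other outcome measures (placeholder weight)
    (70, 0.05),   # social mobility (Pell graduation rates), 5% per U.S. News
    (88, 0.60),   # everything else, lumped together (placeholder)
]

# The same school, but scoring well on the new social-mobility factors.
after = [
    (90, 0.35),
    (90, 0.05),
    (88, 0.60),
]

print(composite(before))  # 87.8
print(composite(after))   # 88.8 -- a one-point jump, enough to pass a rival
```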

Should it matter that we’re number 11? No. Am I nonetheless happy to hear that we’re number 11? Yes. Grinnell is an amazing institution, and I like to see its quality acknowledged.

Would I like to see more emphasis on social mobility? Yes. But I want something broader than that. In the end, I’d like to see a design-your-own-scale system. However, that may be harder for U.S. News to monetize. And, if everyone has their own scale, it feels less like a national ranking.


Postscript: I couldn’t find a way to fit this note in the main body, but I’ve heard that the rankings have become the tail that wags the dog, as it were. U.S. News was a newsmagazine. Now it’s primarily a college ranking service.


Postscript: I said that enormous growth in the number of CS majors is unlikely to have impacted Grinnell’s rating. However, if social mobility is a large factor in the rating, I could be wrong: a CS major does seem to provide upward social mobility for a lot of students. I’m still trying to figure out how I feel about the underlying issues. It’s not that I’m not thrilled that our major provides that benefit to many students; I am. Rather, I worry about the discipline being over-valued by society (or, conversely, about other disciplines being under-valued). I’d prefer to see our students paid well to do whatever they love.


[1] E.g., national research universities, regional colleges, and national liberal arts colleges.

[2] E.g., they rely heavily on SAT/ACT scores, which can serve as a proxy for the wealth of incoming students.

[3] E.g., some values of the ranking criteria are determined by surveying college and university presidents.

[4] Integrated Postsecondary Education Data System, which is run by the U.S. Department of Education.

[5] Of course, “supposed to” does not always correspond to actual practice. There have been numerous examples of colleges accidentally or purposefully submitting incorrect data to U.S. News and even to IPEDS.

[6] President Osgood made that pledge many years ago. I assume we have continued to follow that pledge.

[7] At least I don’t think it does.

[8] E.g., the United States Military Academy.

[9] Admittedly, five schools are tied for #11.

[10] Didn’t I just say that we weren’t trying to increase our rating by pursuing particular U.S. News criteria? Let’s assume that Grinnell had other reasons to improve our retention rate, such as thinking more broadly about student success.

[11] I wish there were clearer and agreed-upon mechanisms to evaluate the quality of undergraduate teaching.

[12] Perhaps the popularity of a certain disjoint ’blog has affected views on teaching at Grinnell. No, I think not.

[14] We do not pay attention to domestic students’ financial status in making admission decisions, and we meet full demonstrated financial need of domestic students [15].

[15] I am less sure about the policies for international students.

[16] I realize that some people argue that I should not claim that a top-tier liberal arts education is necessarily better. I plan to reflect and muse about that issue in the future.

[17] Which many insist is still not enough.


Version 1.0 of 2018-09-11.