How the U.S. News Ranking Skews University Behavior

Data source: U.S. News & World Report 2018 Best Colleges

And here they are, the rankings that everybody loves to hate… the U.S. News & World Report 2018 Best Colleges ranking.

There are numerous other rankings, but the U.S. News publication seems to carry the most clout. I list the rankings here not so much as an objective indicator of the quality of Virginia’s 15 public four-year institutions of higher education but as a gauge of their relative prestige. Prestige matters because the endless quest for status is one of the primary drivers of college and university priorities and spending.

The aspiration to higher rankings, hence greater prestige, is an endless treadmill. While Virginia’s public institutions strive to climb the ladder, so does every other college and university, both public and private. It’s difficult to rise in the rankings when every other institution in the country is trying to do the same.

Many institutions game the system by applying scarce funds to line items that influence the ranking metrics. Accordingly, it is especially useful to see what U.S. News counts and how institutions might invest resources to improve their scores.

Graduation and retention rates (22.5 percent): U.S. News gives 80% of this measure to the six-year graduation rate and 20% to the first-year retention rate. One can predict that institutions will invest resources in programs that influence both of these metrics, and that a disproportionate share of those resources will be devoted to improving the first-year retention rate.
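For readers who want to see the arithmetic, here is a quick back-of-the-envelope sketch (my illustration, not a reproduction of the U.S. News formula) of what that 80/20 split means for the overall score:

```python
# Back-of-the-envelope arithmetic (my illustration, not U.S. News's formula).
# The graduation-and-retention category carries 22.5% of the total score,
# split 80/20 between the six-year graduation rate and first-year retention.
category_weight = 0.225

grad_rate_share = 0.80 * category_weight   # 0.18  -> 18.0% of the overall score
retention_share = 0.20 * category_weight   # 0.045 ->  4.5% of the overall score

print(f"Six-year graduation rate: {grad_rate_share:.1%} of the total score")
print(f"First-year retention:     {retention_share:.1%} of the total score")
```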

Undergraduate academic reputation (22.5 percent): U.S. News uses two measures here: academic peer ratings and high school counselor ratings. Both are purely subjective, of course. One cannot help but wonder to what degree the high school counselor ratings are influenced by… previous U.S. News & World Report rankings. I would hypothesize that institutions intent upon improving their rankings make efforts to increase their visibility among high school counselors. Likewise, I would expect colleges to invest in recruiting star faculty who might bring renown to the institution.

Faculty resources (20 percent): Class size accounts for 40% of this measure. The most points are given to classes with fewer than 20 students; progressively fewer points are given to classes of 20-29, 30-39, and 40-49 students; and no points are awarded for classes of 50 or more. I would hypothesize that institutions respond to this incentive by structuring class sizes to admit the maximum number of students within one of U.S. News's brackets. Thus, we would expect to see many more classes enrolling, say, 19 students than 20 students, because 19-student classes earn more points under the U.S. News methodology than 20-student classes.
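To see why a single student can matter, here is a minimal sketch of the bracket structure; the point values are my own placeholders, not U.S. News's actual scoring:

```python
# Illustrative only: the point values below are hypothetical placeholders,
# but the brackets mirror the methodology described above (most credit for
# classes under 20 students, less for 20-29, 30-39, 40-49, none at 50+).
def class_size_points(enrollment: int) -> int:
    if enrollment < 20:
        return 4
    elif enrollment < 30:
        return 3
    elif enrollment < 40:
        return 2
    elif enrollment < 50:
        return 1
    return 0

# The incentive at the bracket edge: dropping one student changes the credit.
print(class_size_points(20))  # 3
print(class_size_points(19))  # 4 -> capping a section at 19 earns more credit
```

Under a step function like this, a registrar who caps sections at 19, 29, 39, and 49 students squeezes the most credit out of every seat.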

Student selectivity (12.5 percent): Two of the three metrics used in this category are average SAT score and acceptance rate. I would hypothesize that colleges and universities dedicate considerable resources to recruiting high-SAT students, and that they also dedicate resources to ginning up lots of applications in order to generate the best possible acceptance rate and foster the image of popularity and selectivity. Also, one would expect institutions to dedicate resources to the kinds of assets — newer buildings, cushier dormitories, better food choices — that provide a quick, visceral appeal to high school students visiting campus.

Financial resources (10 percent): U.S. News rewards average spending per student on instruction, research, student services, and related educational expenditures. It does not count spending on sports, dorms, and hospitals. One would expect universities to adjust their accounting classification of expenses to maximize spending in the favored buckets. Among wealthier institutions, I would predict, there is no practical limit to the money spent on student “enrichment” programs such as semesters abroad.

Graduation rate performance (7.5 percent): Adjusting for SAT scores, high school standing, and Pell Grants, U.S. News measures the difference between “expected” and actual graduation rates. If the school’s actual graduation rate is higher than the predicted rate, the college is deemed to be enhancing achievement and over-performing. This strikes me as a useful measure, and one that is not easily gamed. I would love to see the data.

Alumni giving rate (5 percent): The percentage of alumni who donate to the school is used as an indirect measure of student satisfaction. Of course, this is easily gamed. I would hypothesize that we will see greater resources and creativity expended over time to solicit donations. Even small donations will enhance an institution’s ranking.
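Putting it all together, the overall ranking score is essentially a weighted sum of the seven categories above, and the weights total 100 percent. The sketch below uses the published 2018 category weights with invented category scores; U.S. News normalizes and rescales its inputs differently, so treat this as an illustration of the structure, not the actual formula:

```python
# A sketch of a weighted composite using the 2018 category weights described
# above. The example category scores are invented for illustration only.
WEIGHTS = {
    "graduation_and_retention": 0.225,
    "academic_reputation":      0.225,
    "faculty_resources":        0.200,
    "student_selectivity":      0.125,
    "financial_resources":      0.100,
    "graduation_performance":   0.075,
    "alumni_giving":            0.050,
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # the weights cover 100%

def composite(scores):
    """Weighted sum of category scores (each assumed to be on a 0-100 scale)."""
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

# A hypothetical institution: strong on reputation, weak on alumni giving.
example = {
    "graduation_and_retention": 85,
    "academic_reputation":      78,
    "faculty_resources":        70,
    "student_selectivity":      80,
    "financial_resources":      65,
    "graduation_performance":   72,
    "alumni_giving":            40,
}
print(f"Composite score: {composite(example):.1f} out of 100")
```

The weighting means a point gained in academic reputation moves the composite roughly four and a half times as much as a point gained in alumni giving, which helps explain where the marketing dollars flow.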



Comments

9 responses to “How the U.S. News Ranking Skews University Behavior”

  1. djrippert:

    “It’s difficult to rise in the rankings when every other institution in the country is trying to do the same.”

    Ha ha. Yeah, let’s just give everybody a participation ribbon and be done with it.

    After 38 years of working in one of the most competitive industries on Earth I can attest to the fact that competition is a pain in the ass! The bastards in those other companies are always coming up with some new thing or some new way of doing the old things. Ugh! Why can’t they just let well enough be.

    Why oh why didn’t I have the common sense to work for a nice regulated monopoly like Dominion? I could have had a Dr Pepper lifestyle. In by 10, out by 2, hitting off the first tee by 4.

The US News & World Report rating system seems pretty good to me. It’s been instructive for those who adhere to the philosophy of “The University” to watch UVA slide in the rankings. Sometimes a cold slap in the face by your competition is what you need to stay sharp.

    Too bad there isn’t an equivalent to the US News & World Report for state legislatures, state departments of transportation, etc.

  2. There are a lot of ways to “game” the system and I think Jim just touched the surface. Northeastern University would be a prime example (although I’m sure they would object to “game”) and there was an interesting article from a few years ago here:

    https://www.bostonmagazine.com/news/2014/08/26/how-northeastern-gamed-the-college-rankings/

Northeastern went from something like 200 in the rankings to about 40 over a 10-year period or so. Based on the metrics they report, they may even appear more selective than UVA or W&M.

Some of the metrics have little to do with the quality of undergraduate education. Indeed, some may hurt undergraduate education. Financial resources, for instance, includes research, which, as we have discussed here, may actually suck money away from undergraduate education.

  3. Steve Haner:

    Worthless marketing nonsense. Fifty percent of their graduates still finish in the bottom half of their class, and that’s after subtracting drops and transfers….

  4. LarrytheG:

USNWR is a media company for GAWD sake. ANYONE can “rate” ..and they do… so what exactly gives USNWR more credibility in the first place? Folks on the right these days ..routinely label most traditional Media as “Lame stream” and “fake news”.

    Wait for it… the lower rated institutions are going to start using the “fake news” come-back!

This is a lot like claiming that Consumer Reports gives “prestige” to some products and brands over others. CR doesn’t “give” prestige any more or less than other media companies like USNWR do…

No one has appointed USNWR to be the arbiter of “good”, nor is anyone required to believe them any more or less than any organization or media that also “rates”.

In this day and time – anyone can go onto the internet and type in some product or service and append the word “review” to it and get a lot of opinions from various rating organizations.

Hey – give it a go right now: go to GOOGLE and key in “best college reviews” and see how many you get – beyond the USNWR.

    You have to ask yourself – what exactly makes the USNWR more credible than the others?

    Blaming USNWR for their “influence” seems a stretch.

    come on folks.. get a grip.

  5. “Worthless marketing nonsense. Fifty percent of their graduates still finish in the bottom half of their class, and that’s after subtracting drops and transfers….”

But when the average GPA is around 3.5 or so at highly ranked institutions (and there has pretty much been 50 years of across-the-board grade inflation), even the bottom half can look pretty respectable, perhaps perpetuating a Lake Wobegon effect of sorts.

  6. LarrytheG:

The bottom half will, on balance, earn twice as much in their lifetimes as the top half of high school grads.

that’s what makes the degree from a “highly rated” institution so valuable.

however, there are a dozen or more rating organizations besides USNWR… and one of the best raters does not use subjective ratings, just straight-up objective metrics, and actually leaves it up to you to decide what things are important to you… it is this one:

    https://texasoncourse.org/uploads/images/resources/college-scorecard1.jpg

and actually, one’s research into which institutions to consider should use several different raters… to validate which things they agree on across the board and which are somewhat unique to each rater.

    but blaming the rating organizations for causing colleges to modify their offerings and marketing … isn’t that like blaming Consumer Reports for messing up companies that make cars or refrigerators?

DJ got it right – it’s called competition, and competition, at least good competition, focuses on what the customer wants… including value.

  7. Why of course there are alternatives. But certain ones have the cachet, and the popular credibility. Stars in the Michelin Guide are worth a lot more than a high rating in Better Homes and Gardens. USN&WR got in there early and built its reputation carefully into what it is today. No higher-ed institution can afford to ignore that. So, perhaps there’s a responsibility also.

    1. LarrytheG:

Maybe – but no more or less than ANY organization that “rates”, like Michelin for restaurants or Consumer Reports for consumer items, and the hundreds of online “raters” of everything from computers to dog food.

      The point here is that these are all non-govt, private-sector organizations and their “influence” is not something they can provide themselves – it comes from people who use their ratings – and it’s up to people to research and pursue information that informs them and helps them make decisions.

      Blaming any rating organization as having “too much influence” is just silly. What would you do about it anyhow?

      People make choices. They make choices about what they “believe” or not… GAWD.. take a look at people who don’t believe the Main Stream media, or even climate scientists… or DO believe in conspiracy theories and worse!

so now… apparently “something” must be done about the “excess influence” USNWR has on College Ratings, because it causes Colleges to configure their offerings and marketing? Geezy Peezy!

It actually smells a little like demanding govt regulation!! LORD!

      People need to take responsibility for their own choices and actions and stop blaming others for their “influence”.

  8. Larry,

    You are tilting at windmills. Jim never said regulate or control USNWR. He just described the mechanism by which those ratings influence institutional behavior. You have now somehow responded to that straightforward post with an indignant response containing “GAWD”, “LORD!”, and “Geezy Peezy”.
