NRC (not really correct?) Graduate School Rankings

The NRC graduate school ranks are due out tomorrow, September 29. For those who don’t know, the last NRC ranking was in 1995 and the latest is much delayed (i.e., the “data,” such as it is, is already out of date). Departments have been given access to the data for a week now but have been under embargo. As a blogger, it is a moral imperative to search the intertubes for leaks of this data. Surprisingly, little has leaked, but today I’m proud to say that my own UW, while not technically breaking the embargo (okay, maybe they have :)), has some info out about its forthcoming rankings. Now I’m probably definitely biased, but I can pretty safely say that the UW CS ranking is off by a bit:

The NRC assessment of UW Computer Science & Engineering is based on clearly erroneous data. The assessment is meaningless, and in no way representative of the accomplishments of UW CSE. Errors in the data affect (at least) UW CSE, many other computer science programs nationally, and many programs in other fields at the University of Washington.
During the week of September 19th, NRC provided pre-release access to its long-delayed “Data-Based Assessment of Research-Doctorate Programs in the United States,” scheduled for public release during the week of September 26th.
We, along with colleagues in other computer science programs nationally and colleagues in programs in other fields at the University of Washington, quickly discovered significant flaws of three types in NRC’s data:

  • Instances in which the data reported by NRC is demonstrably incorrect, sometimes by very substantial margins.
  • Instances in which the accuracy of the data cannot easily be checked, but it does not pass even a rudimentary sanity check.
  • Instances in which institutions interpreted NRC’s data reporting guidelines differently, yielding major inconsistencies.

Here are three specific examples affecting UW CSE:

  • Due to difficulty in interpreting NRC’s instructions, NRC was provided with an incorrect faculty list for our program – essentially, a list that included anyone who had served as a member of a Ph.D. committee. In 2006 (the reporting year), UW CSE had roughly 40 faculty members by any reasonable definition. In the NRC study, our “total faculty” size is listed as 91 and our “allocated faculty size” (roughly, full time equivalent) as 62.5. A large number of these “additional faculty” were industrial colleagues – whose “academic records” (including grants, publications, and awards) were quantitatively evaluated by NRC as if these individuals were full members of our faculty. Since faculty size is the denominator in many measures computed by NRC, you can imagine the result – clearly erroneous.
  • NRC reports UW CSE with 0% of graduate students “having academic plans” for 2001-05 (the reporting period for this measure). In fact, 40% of our graduating Ph.D. students took full-time faculty positions during this period. We are one of the top programs nationally in producing faculty members for major departments; in recent years our graduates have taken faculty positions at Berkeley, CMU, MIT, Princeton, Cornell, Wisconsin, Illinois, Michigan, Penn, Waterloo, Toronto, WashU, UCSD, Northwestern, UCLA, UBC, Maryland, Georgia Tech, UMass-Amherst, and many other outstanding programs. NRC obtained this number from an outside data provider; it’s clearly erroneous.
  • NRC reports UW CSE as having 0.09 “awards per allocated faculty member.” The erroneous faculty count is not sufficient to explain this, given that our faculty includes 14 ACM Fellows, 10 IEEE Fellows, 3 AAAI Fellows, 14 Sloan Research Fellowship recipients, a MacArthur Award winner, two NAE members, 27 NSF CAREER Award winners, etc. We don’t know where NRC obtained this data, but it’s clearly erroneous.
    The University of Washington reported these issues to NRC when the pre-release data was made available, and asked NRC to make corrections prior to public release. NRC declined to do so. We and others have detected and reported many other anomalies and inaccuracies in the data during the pre-release week.

The widespread availability of the badly flawed pre-release data within the academic community, and NRC’s apparent resolve to move forward with the public release of this badly flawed data, have caused us and others to take action – hence this statement. Garbage In, Garbage Out – this assessment is based on clearly erroneous data. For our program – and surely for many others – the results are meaningless.
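To see just how far off that last number is, here is a quick back-of-the-envelope check in Python, using only the figures quoted in the statement above. Treating each honor listed there as one countable “award” is my assumption; NRC’s actual counting rules aren’t public here, so this is an illustrative sanity check, not a reconstruction of their method.

```python
# Sanity check of NRC's reported 0.09 "awards per allocated faculty member"
# for UW CSE, using the award tallies quoted in the UW statement.
# Assumption: each honor counts as one "award" (NRC's rules may differ).

awards = 14 + 10 + 3 + 14 + 1 + 2 + 27  # ACM/IEEE/AAAI Fellows, Sloan,
                                        # MacArthur, NAE, NSF CAREER
nrc_reported_rate = 0.09
nrc_allocated_faculty = 62.5  # NRC's inflated "allocated faculty size"
actual_faculty = 40           # UW CSE's own count for 2006

rate_inflated = awards / nrc_allocated_faculty   # roughly 1.1
rate_corrected = awards / actual_faculty         # roughly 1.8

# Even with the inflated denominator, the rate comes out more than ten
# times NRC's figure -- the bad faculty count alone can't explain 0.09.
print(f"inflated: {rate_inflated:.1f}, corrected: {rate_corrected:.1f}, "
      f"NRC reported: {nrc_reported_rate}")
```

In other words, whatever went wrong with this measure, it is not just the denominator.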

13 Replies to “NRC (not really correct?) Graduate School Rankings”

  1. If they aren’t happy with the NRC data, then they should publish corrected data, and keep it up to date (i.e., not just for 2001-2005). Since they don’t seem willing to do that, I have little sympathy for them.

  2. Where have they published their own data? Why have they waited so long? The NRC has an obligation to be accurate, but CS departments are also responsible for being open about their data. Not to single out UW. Across the field departments are hiding this kind of data. Then they sit back and complain.

  3. So where is the university going with this, and how many other universities are involved? (There is mention of “colleagues in other computer science programs nationally” having similar concerns.)

  4. Er, the numbers ought to have been mostly self-reported.
    The errors are likely due to a higher admin layer at UW, either your college or VP for Research office, or equivalent.
    I agree that there are sanity check errors in the NRC data, and some will lead to significant rank shift, but most appear to be random errors with no significance to outcome.

  5. Steinn: (1) not all numbers are self-reported, (2) a process that considers itself “data driven” but refuses to correct obviously wrong data is… well, I’ll let you choose an appropriate adjective.
    The NRC, by the way, at one point wanted to use journal counts as publications in CS. Knuckleheads 🙂

  6. Oh, I know where (at least one) of the errors was, and sure, it’s the UW grad program’s fault. That’s NOT the issue. The issue is a process that uses single sources that are not verified in any fashion. This is the equivalent of relying on a single experiment to claim accuracy. The design of a ranking should probably include methods for error detection, no?

  7. The paradigm of the “elite university” is itself politically incorrect.
    http://www.nytimes.com/2010/09/30/opinion/30kahlenberg.html
    Elite Colleges, or Colleges for the Elite?
    By RICHARD D. KAHLENBERG
    Published: September 29, 2010
    Washington
    TODAY’S populist moment, with a growing anger directed at the elites who manipulate the system to their advantage, is an opportune time to examine higher education’s biggest affirmative action program — for the children of alumni.
    At our top universities, so-called legacy preferences affect larger numbers of students than traditional affirmative action programs for minority students, yet they have received a small fraction of the attention. Unlike the issue of racial preferences, advantages for alumni children — who are overwhelmingly white and wealthy — have been the subject of little scholarship, no state voter initiatives and no Supreme Court decisions….
