Scandals involving institutions of higher education lying to enhance their U.S. News rankings seem to be surfacing more frequently. The most recent confession came from Claremont McKenna College, whose false numbers helped make it the ninth-best liberal arts college in the country. As usual, the school’s top leader blamed a rogue player instead of acknowledging a pervasive problem: deference to idiotic metrics has displaced reasoned judgment, and the resulting institutional culture promotes predictable behavior.

Some difficulties flowing from U.S. News rankings methodology make the news. Like other recent instances of misreported data, the focus on Claremont relates to false admissions statistics — namely, SAT scores. At the University of Illinois College of Law, it was LSATs and GPAs.

Of course, such behavior is reprehensible. But do the rogue villains differ in kind, or only in degree, from deans who game the system? Some deans solicit transfer students whose low LSATs led to their rejection as entering 1Ls, but whose scores don’t count when they arrive as tuition-paying 2Ls. Like the rogues, they seek to boost selectivity scores, as measured by the LSATs and undergraduate GPAs that together account for more than 20 percent of a law school’s total U.S. News ranking.

Similarly, employment rates at graduation and nine months later account for 18 percent of a law school’s ranking. That encourages deans to hire their own graduates for short-term projects and — until recent ABA revisions become fully effective — permits them to count every part-time, non-legal job as employment.

Expenditures per student account for about 10 percent of a law school’s score. That encourages deans to spend more money and increase tuition to cover the resulting costs while students incur more debt. The resulting vicious circle exacerbates intergenerational antagonisms that are rapidly becoming the legal profession’s — and society’s — next big crisis.

All of the recent attention to bogus admissions and placement numbers shines an important light on some dirty little corners of academia. But more profound rankings methodology problems have gone unnoticed. Specifically, the selectivity and placement factors combined barely equal the weight that the ranking system gives to “Quality Assessment,” which accounts for 40 percent of a school’s overall score.

How does U.S. News perform its “Quality Assessment”? Two ways.

First, it sends surveys to four individuals at every accredited law school in the country: the dean, the dean of academic affairs, the chair of faculty appointments, and the most recently tenured faculty member. The survey asks each recipient to rate all other schools on a scale from marginal (1) to outstanding (5). It doesn’t require that any respondent have any knowledge about any of the 190 schools that he or she rates. (Respondents have a “don’t know” option, but U.S. News doesn’t disclose how many used it. After all, that information would taint its misleading 66 percent response rate.)

A second assessment score comes from lawyers and judges. They, too, get the U.S. News survey asking for (1) to (5) responses about every school. Apart from 750 hiring partners and recruiters at law firms who made the newly developed U.S. News-Best Lawyers list of “Best Law Firms,” information about the “legal professionals, including hiring partners of law firms, state attorneys general, and selected state and federal judges” receiving the survey isn’t disclosed. But the anemic response rate is: 14 percent. One can reasonably ask why such flawed attempts at “quality assessment” should count at all.

One answer is that eliminating them would magnify the importance of the other factors, including test scores. In that respect, there’s a curious aspect of the recent NY Times article about Claremont’s false SATs. It quoted Robert Franek at length. Franek is senior vice president of The Princeton Review, a test-preparation business that has flourished as a principal beneficiary of the U.S. News rankings mania.

The Princeton Review does rankings, too. Anyone who regards its list of law schools with the “Best Career Prospects” as meaningful should take a look at the top five for 2012 and ask, “Where are Harvard, Yale, and Stanford?”

And then there’s The Princeton Review’s original October 12, 2010 press release (subsequently revised) that announced the 2011 winner in the “Best Law School Professors” category: Brown.

Brown, of course, doesn’t have a law school.