LAW & FOOTBALL: RANKINGS DOUBLETHINK

For many people, the holiday season means an intense focus on college football. This year, a 12-person committee develops weekly team rankings that will culminate in a playoff producing head-to-head competition for the national championship in January.

A recent comment from the chairman of that committee, Jeff Long, is reminiscent of something U.S. News rankings czar Robert Morse said about his ranking system last year. Both remarks reveal how those responsible for a rankings methodology rationalize the distance between themselves and the behavior their rankings incentivize.

Nobody Wants Credit?

Explaining why undefeated Florida State dropped from second to third in the November 11 rankings, Long told ESPN that making distinctions among the top teams was difficult. He explained that the relevant factors include a team’s “body of work, their strength of schedule.” Teams that defeat other strong teams get a higher rank than those beating weaker opponents. So even though Oregon has suffered a loss this year, its three victories against top-25 opponents jumped it ahead of undefeated FSU, which had only two such wins. Long repeated his explanation on November 19: “Strength of schedule is an important factor….”

Whether Oregon should be ahead of FSU isn’t the point. Long’s response to a follow-up question on November 11 is the eye-catcher: Was the committee sending a message to teams that they should schedule games against tougher opponents?

“We don’t think it’s our job to send messages,” he said. “We believe the rankings will do that.”

But who develops the criteria underlying the rankings? Long’s committee. The logic circle is complete.

Agency Moment Lost: Students

In his November 14 column for the New York Times, David Brooks writes more broadly about “The Agency Moment.” It occurs when an individual accepts complete responsibility for his or her decisions. Some people never experience it.

Rankings can provide opportunities for agency moments. For example, some prelaw students avoid serious inquiry into an important question: which law school might be the best fit for their individual circumstances? Instead, I’ve heard undergraduates say that they’ll attend the best law school that accepts them and that the U.S. News rankings will make that determination.

If they were talking about choosing from law schools in different groups, that would make some sense. There’s a reason that Harvard doesn’t lose students to Boston University. But too many students take the rankings too far. If the choice is between school number 22 and the one ranked number 23, they’re picking number 22, period. That’s idiotic.

In abandoning independent judgment, such students (and their parents) cede one of life’s most important decisions to Robert Morse, the non-lawyer master of the rankings methodology. It’s also an agency moment lost.

Agency Moment Lost: Deans, Administrators, and Alumni

Likewise, deans who let U.S. News dictate their management decisions say they’re just responding to incentives. As long as university administrators, alumni, and prospective students view the rankings as meaningful, the argument goes, deans have to act accordingly. Any complaint — and there are many — should go to the person who develops the rankings methodology.

All roads of responsibility lead back to U.S. News’ Robert Morse, they say. But following that trail leads to another lost agency moment. In March 2013, Lee Pacchia of Bloomberg asked Morse if he took any responsibility for what’s ailing legal education today:

“No…U.S. News isn’t the ABA. U.S. News doesn’t regulate the reporting requirements. No….”

Agency Moment Lost: Methodology Masters

Morse went on to say that U.S. News was not responsible for the cost of law school, either. Pacchia didn’t ask him why the methodology rewards a school that increases expenditures without regard to the beneficial impact on student experiences or employment outcomes. Or how schools game the system by aggressively recruiting transfer students whose tuition adds revenue at minimal cost and whose lower LSAT scores don’t count in the school’s ranking methodology. (Vivia Chen recently reported on the dramatic increase in incoming transfer students at some schools.)

Cassius was only half-right. The fault lies not in our stars, but it doesn’t lie anywhere else, either!

The many ways that the U.S. News rankings methodology has distorted law school deans’ decision-making are the subject of Part I of my book, The Lawyer Bubble – A Profession in Crisis. Part II investigates the analogous behavior of law firm leaders who run their businesses by metrics that maximize short-term Am Law rankings (e.g., billings, billable hours, hourly rates, and leverage ratios).

Aggregate Rankings v. Individual Outcomes

In the end, “sending a message” through a rankings methodology is only one part of an agency equation. The message itself doesn’t require the recipient to engage in any particular behavior. That’s still a choice, although incentive structures can limit perceived options and create first-mover dilemmas.

Importantly, individual outcomes don’t always conform to rankings-based predictions. Successful participants still have to play — and win — each game. That doesn’t always happen. Just ask Mississippi State — ranked number one in the college football playoff sweepstakes after week 12, but then losing to Alabama on November 15. Or even better, look at 18th-ranked Notre Dame, which lost on the same day to unranked Northwestern.

Maybe that’s the real lesson for college coaches, prelaw students, law school deans, and law firm leaders. Rather than rely on rankings and pander to the methodology behind them, focus on winning the game.

“GAMING THE REPORTING”?

In a recent interview with Lee Pacchia of Bloomberg News, U.S. News & World Report’s director of data research Robert Morse explained this year’s only revision to his law school rankings methodology. Morse gave different weights to various employment outcomes for class of 2011 graduates. But he didn’t disclose precisely what those different weights were.

Morse said that such transparency worried him. Full-time, long-term jobs requiring a legal degree got 100 percent credit. But he didn’t reveal the weight he gave other employment categories (part-time, short-term, non-J.D.-required) because he didn’t want deans “gaming the reporting of their results.” It was an interesting choice of words.
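
To make the mechanism concrete, here is a minimal sketch of how credit-weighted employment outcomes might be computed. Only the 100 percent credit for full-time, long-term jobs requiring a law degree comes from Morse’s comments; the discounts for the other categories and the hypothetical class counts are placeholders of my own, since U.S. News never disclosed the actual weights.

```python
# Hypothetical illustration of credit-weighted employment outcomes. Only the
# 1.0 credit for full-time, long-term jobs requiring a law degree comes from
# Morse's comments; the other weights and the class counts below are invented
# placeholders, because U.S. News did not disclose the real discounts.
CREDIT = {
    "ft_lt_jd_required": 1.0,        # full credit (disclosed)
    "part_time_or_short_term": 0.5,  # placeholder
    "non_jd_required": 0.25,         # placeholder
    "unemployed": 0.0,
}

def weighted_employment_score(counts: dict[str, int]) -> float:
    """Credit-weighted employment score for a graduating class."""
    total = sum(counts.values())
    credited = sum(CREDIT[category] * n for category, n in counts.items())
    return credited / total if total else 0.0

# A hypothetical class of 200 graduates.
class_counts = {
    "ft_lt_jd_required": 110,
    "part_time_or_short_term": 40,
    "non_jd_required": 30,
    "unemployed": 20,
}
print(f"weighted score: {weighted_employment_score(class_counts):.1%}")
print(f"raw 'employed' rate: "
      f"{1 - class_counts['unemployed'] / sum(class_counts.values()):.1%}")
```

The gap between the weighted figure and the raw “employed” figure is exactly what the more detailed reporting categories are meant to expose — and exactly what a dean tempted to “game the reporting” would want to manage.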

Teapot tempests

In some ways, all of the attention to the changes in this year’s rankings methodology is remarkable. Certainly, a school’s employment success for graduates is important. But the nine-month data point for which the ABA now requires more detailed information accounts for only 14 percent of a school’s total U.S. News ranking score. To put that in context, consider some of the more consequential rankings criteria.

Fifteen percent of every school’s U.S. News score is based on a non-scientific survey of practicing lawyers and judges. The survey response rate this year was only nine percent.

Likewise, the “peer assessment” survey that goes to four faculty members at every accredited law school — dean, dean of academic affairs, chair of faculty appointments, and most recently tenured faculty member — accounts for 25 percent of a school’s ranking score. It asks those four individuals to rate all ABA-accredited law schools on a scale of 1 to 5, without requiring that any respondent know anything about the schools he or she assesses.

Taken together, the two so-called “quality assessment” surveys comprising 40 percent of every school’s ranking are a self-reinforcing contest for brand recognition. As measures of substantive educational value, well, you decide.
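
To see how the arithmetic works, here is a minimal sketch of a weighted composite using only the component weights cited in this piece: peer assessment (25 percent), the lawyer/judge survey (15 percent), faculty resources (15 percent), nine-month employment (14 percent), and median LSAT (12.5 percent). The “other” bucket and the 0-to-1 component scores are placeholders of my own, and the real U.S. News formula standardizes and rescales its inputs, so treat this as an illustration of how the weights combine rather than the actual methodology.

```python
# Simplified weighted composite using only the component weights cited in
# this piece. The "other" bucket and the 0-to-1 component scores are
# placeholders; the real U.S. News formula standardizes and rescales its
# inputs, so this illustrates the arithmetic only.
WEIGHTS = {
    "peer_assessment": 0.25,          # academic reputation survey
    "lawyer_judge_assessment": 0.15,  # practitioner/judge survey
    "faculty_resources": 0.15,
    "employment_nine_months": 0.14,
    "median_lsat": 0.125,
    "other": 0.185,                   # placeholder for the remaining criteria
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9

def composite(scores: dict[str, float]) -> float:
    """Weighted sum of normalized (0-to-1) component scores."""
    return sum(WEIGHTS[name] * scores.get(name, 0.0) for name in WEIGHTS)

# Hypothetical school: strong survey reputation, middling everything else.
example = {
    "peer_assessment": 0.9,
    "lawyer_judge_assessment": 0.9,
    "faculty_resources": 0.5,
    "employment_nine_months": 0.5,
    "median_lsat": 0.5,
    "other": 0.5,
}
print(f"composite score: {composite(example):.3f}")
# The two reputation surveys alone control 40% of the weight.
```

Swap in a school with weaker surveys but better employment outcomes and the composite barely moves; that asymmetry is the point.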

Game of moans

But if, as Morse suggests, his concern is “gaming the reporting,” he must be worried that some deans would either: 1) self-report inaccurate data; or 2) otherwise change their behavior in an effort to raise their school’s ranking. He’s a bit late to both parties.

Scandals engulfed prominent law schools that submitted false LSAT and GPA statistics for their entering classes. But how many others haven’t been caught cheating? No one knows. As for permissible behavior that accomplishes similar objectives, examples abound.

For years, deans seeking to enhance the median-LSAT component for J.D. entrants (12.5 percent of every school’s ranking) have been “buying” higher LSATs through “merit” scholarships. Need-based financial aid has suffered. Ironically, those merit scholarships often disappear after the first year of law school.

Likewise, the faculty resources component is 15 percent of every school’s ranking. But it encourages expenditures — and skyrocketing tuition — without regard to whether they benefit a student’s educational experience.

Whom to blame

Morse establishes the criteria and methodology that incentivize behavior producing these and many other perverse outcomes. But he doesn’t think that any of the current problems confronting the profession are his fault.

“U.S. News isn’t the ABA,” he told Pacchia. “U.S. News doesn’t regulate the reporting requirements…[W]e’re not responsible for the cost of law school, the state of legal employment, the impact that recession has had on hiring, or the fact that 10 or 20 new law schools have opened over the last couple decades. We’re not responsible for the imbalance of jobs to graduates. No, I think we’re not responsible. I think we’ve helped prospective students [better] understand what they are getting into than they were previously.”

Of course, the problem isn’t just the flawed rankings methodology itself. Also culpable are the decision-makers who regard a single overall ranking as meaningful — students, deans, university administrators, and trustees. Without their blind deference to a superficially appealing metric, the U.S. News rankings would disappear — just as the U.S. News & World Report print news magazine did years ago.

Cultural obsession

Pervasive throughout society, rankings may be a permanent feature of the legal profession. But it’s worth remembering that they’re relatively new. Before the first U.S. News list of only the top 20 law schools in 1987, prospective students and law schools somehow found each other.

Today, rankings facilitate laziness. The illusory comfort of an unambiguous numerical solution is easier than engaging in critical thought and exercising independent judgment. Forgotten along the way is the computer science maxim “garbage in, garbage out.”

UNFORTUNATE COMMENT AWARD

Today’s “Unfortunate Comment Award” winner is ABA President William (“Bill”) Robinson III, who thinks he has found those responsible for the glut of unemployed, debt-ridden young lawyers: the lawyers themselves.

“It’s inconceivable to me that someone with a college education, or a graduate-level education, would not know before deciding to go to law school that the economy has declined over the last several years and that the job market out there is not as opportune as it might have been five, six, seven, eight years ago,” he told Reuters during a January 4 interview.

Which year we talkin’ ’bout, Willis?

Recent graduates made the decision to attend law school in the mid-2000s, when the economy was booming. Even most students now in their third year decided to apply by spring 2008 — before the crash — when they registered for the LSAT. Some of those current 3-Ls were undergraduates in the first-ever offering of a course on the legal profession that I still teach at Northwestern. What were they thinking? I’ll tell you.

I’ve written that colleges and law schools still make little effort to bridge a pervasive expectations-reality gap. Anyone investigating law schools in early 2008 saw slick promotional materials that reinforced the pervasive media image of a glamorous legal career.

Jobs? No problem. Prospective students read that for all recent graduates of all law schools, the overall average employment rate was 93 percent. They had no reason to assume that schools self-reported misleading statistics to the ABA, NALP, and the all-powerful U.S. News ranking machine.

But unlike most of their law school-bound peers, my students scrutinized the flawed U.S. News approach. Among other things, they discovered that employment rates based on the ABA’s annual law school questionnaire were cruel jokes. That questionnaire allowed deans to report graduates as employed, even if they were flipping burgers or working for faculty members as temporary research assistants.

Law school websites followed that lead because the U.S. News rankings methodology penalized greater transparency and candor. In his Reuters interview, Robinson suggested that problematic employment statistics afflicted “no more than four” out of 200 accredited institutions, but he’s just plain wrong. Like their prospective students, most deans still obsess over U.S. News rankings as essential elements of their business models.

The beat goes on

With the ABA’s assistance, such law school deception continues today. Only last month — December 2011 — did the Section of Legal Education and Admissions to the Bar finally approve changes in collecting and publishing law graduate placement data: Full- or part-time jobs? Bar passage required? Law school-funded? Some might consider that information relevant to a prospective law student trying to make an informed decision. Until this year, the ABA didn’t. The U.S. News rankings guru, Robert Morse, deferred to the ABA.

The ABA is accelerating the new reporting process so that “the placement data for the class of 2011 will be published during the summer of 2012, not the summer of 2013.” That’s right, even now, a pre-law student looking at ABA-sanctioned employment information won’t find the whole ugly truth. (Notable exceptions include the University of Chicago and Yale.) Consequently, any law school still looks like a decent investment of time and money, but as Professor William Henderson and Rachel Zahorsky note in the January 2012 issue of the ABA Journal, it often isn’t.

Students haven’t been blind to the economy. But bragging about 90+ percent employment rates didn’t (and doesn’t) deter prospective lawyers. Quite the contrary. Law school has long been the last bastion of the liberal arts major who can’t decide what’s next. The promise of a near-certain job in tough times makes that default solution more appealing.

Even the relatively few undergraduates (including those in my class) paying close attention to big firm layoffs in 2009 were hopeful. They thought that by the time they came out of law school, the economy and the market for attorneys would improve. So did many smart, informed people. Youthful optimism isn’t a sin.

Which takes me to ABA President Robinson’s most telling comment in the Reuters interview: “We’re not talking about kids who are making these decisions.”

Perhaps we’re not talking about his 20-something offspring, but they’re somebody’s kids. The ABA and most law school deans owed them a better shake than they’ve received.

It’s ironic and unfortunate: one of the most visible spokesmen in a noble profession blames the victims.

TRUTHINESS IN NUMBERS

Two recent developments here and across the pond share a common theme: ongoing confusion about young attorneys’ prospects. But the big picture seems clear to me.

Last month, I doubted predictions that the UK might be on the verge of a lawyer shortage. I expressed even greater skepticism that it presaged a similar shortfall in the United States. In particular, the College of Law had issued a report suggesting that an attorney shortage could emerge as early as late 2011 and “may jump considerably in 2011-2012.”

This came as a surprise because the UK’s Law Society has warned repeatedly about the oversupply of lawyers in that country. Why such dramatically different views of the future?

Some commenters on an article about the College of Law report suggested that perhaps the study hadn’t taken into account the existing backlog of earlier graduates who, along with young solicitors laid off in 2008 and 2009, were still looking for work.

Another explanation may be that the College of Law and its private competitors, including Kaplan Education’s British arm, want to recruit students to their legal training programs. Sound familiar?

The following is from the College of Law website:

“84% of our LPC graduates were in legal work just months after graduation.*”

But mind the asterisk: “*Based on known records of students successfully completing their studies in 2010.”

I wonder who among their students isn’t “known.” As for “legal work,” a former chairman of the UK bar recently observed that the oversupply of attorneys in that country has driven many recent LPC graduates into the ranks of the paralegals. Digging deeper into the College of Law’s 84 percent figure yields the following: 62 percent lawyers; 22 percent paralegals “or other law related.” At least the College appears to be more straightforward than American law schools compiling employment stats for their U.S. News rankings.

That takes me to the recent ABA committee recommendation concerning employment data here. U.S. News rankings guru Robert Morse has joined the ABA in assuring us that help is on the way for those who never dreamed that law schools reporting employment after graduation might include working as a greeter at Wal-Mart. Morse insists that if the schools give him better data, he’ll use it.

It’s too little, too late. Employment rate deception is the tip of an ugly iceberg of methodological flaws in the rankings. For example, employment at nine months accounts for 14 percent of a school’s score, while the absurd peer and lawyer/judge assessment criteria count for 40 percent. Res ipsa loquitur, as we lawyers say.

Frankly, I’m skeptical about the prospects for progress even on the employment data front. Until an independent third party audits the numbers that law schools submit in the first place, their self-reporting remains suspect. No one in a position of real professional power is pushing that solution.

Meanwhile, back in the UK, Allen & Overy — a very large firm — announced its “second round of cuts on number of entry level lawyers hired” — from the current 105 London training contracts down to 90 for those applying this November. The article concluded:

“The news comes after the latest statistical report from the Law Society highlighted the oversupply of legal education places compared with the number of training contracts in the UK legal market. The number of training contract places available fell by 16% last year to 4,874 and by 23% from a 2007-08 peak of 6,303.”

So much for the College of Law’s predictive powers. Prospective lawyers in the UK are probably as confused as their American counterparts when it comes to getting reliable information about their professional prospects. Most students everywhere assume that educational institutions have their best interests at heart.

If only wishing could make it so.

LAW SCHOOL DECEPTION — PART III

Money talks, especially to prospective law students concerned about educational debt. Tuition reduction programs promise some relief. Surely, scholarships conditioned on minimum GPAs are better.

Recently, the NY Times profiled a Golden Gate University School of Law student needing a 3.0 to keep her scholarship. By the end of her first year, she’d “curved out” at 2.967. Her Teamsters dad drove a tractor before he was laid off, but she and her parents came up with $60,000 in tuition to complete her degree.

Maybe that’s reasonable. A “B” average doesn’t seem difficult. Is this just whining from what some article comments called “the gimme generation”?

Only if the victims had known the truth going in. The profiled student has no paying job, legal or otherwise. That’s the real victimization, hers and many others’.

— Statistically possible v. doesn’t happen v. fully disclosed

Golden Gate imposes mandatory first-year curves limiting the number of As and Bs. In second- and third-year courses, the curves loosen or disappear. The profiled student graduated with a 3.14 GPA — a nice recovery, but too late for the lost scholarship.

According to the article, more than half of the current GGU first-year class has merit scholarships, and Dean Drucilla Stender Ramey said it’s statistically possible for 70 percent of 1Ls to maintain a 3.0 GPA — also the threshold for the Dean’s List. Even if she meant “theoretically” rather than “statistically” possible, I’m skeptical. The school’s handbook reports the mandatory range for those receiving a “B- and above” in first-year required courses: 45 percent (minimum) to 70 percent (maximum). And a B- is 2.67.
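
To test that arithmetic, here is a minimal simulation sketch. The only disclosed inputs are the handbook’s 70 percent ceiling on “B- and above” grades in first-year required courses and the 2.67 value of a B-; the split of that band, the grade-point value assigned to each band, and the six-course load are assumptions of my own, offered only for illustration.

```python
import random

# Sanity-check sketch for the "70 percent can hold a 3.0" claim. Disclosed
# inputs: at most 70% of grades in a first-year required course are B- or
# above, and a B- is 2.67. Everything else (the split of that band, the
# grade points per band, six courses, 300 students) is a hypothetical
# assumption for illustration, not GGU's actual curve.
GRADE_POINTS = {"B_or_better": 3.3, "B_minus": 2.67, "below": 2.0}  # assumed

def share_at_or_above_3(n_students=300, n_courses=6,
                        share_b_or_better=0.45, share_b_minus=0.25,
                        trials=200):
    """Fraction of students whose first-year GPA reaches 3.0 when every
    course independently curves grades to the assumed band shares."""
    hits = 0
    for _ in range(trials):
        totals = [0.0] * n_students
        for _ in range(n_courses):
            order = list(range(n_students))
            random.shuffle(order)  # re-rank the class for each course
            cut1 = int(n_students * share_b_or_better)
            cut2 = int(n_students * (share_b_or_better + share_b_minus))
            for rank, student in enumerate(order):
                band = ("B_or_better" if rank < cut1
                        else "B_minus" if rank < cut2 else "below")
                totals[student] += GRADE_POINTS[band]
        hits += sum(1 for t in totals if t / n_courses >= 3.0)
    return hits / (trials * n_students)

print(f"share with GPA >= 3.0: {share_at_or_above_3():.0%}")
```

Under these particular assumptions, well under half of the class finishes at or above 3.0. The exact figure moves with the assumed splits and with how consistently the same students land at the top of every curve, but it takes an implausibly top-heavy curve to get anywhere near 70 percent.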

“[I]n recent years,” the article continued, “only the top third of students at Golden Gate wound up with a 3.0 or better, according to the dean…. She also maintains that Golden Gate 1Ls are well-informed about the odds they face in keeping scholarships.”

This sounds like the lawyer who tells the jury: 1) my client was out of town at the time of the murder; 2) if he was in town, he didn’t do it; and 3) whatever he did was in self-defense.

— Playing with fire

Why offer merit scholarships? U.S. News‘s rankings, says University of St. Thomas School of Law Professor Jerry Organ:

“Law schools are buying…higher GPAs and LSATs.”

Albany Law School Dean Thomas F. Guernsey notes that such catering to the rankings has “strange and unintended consequences,” such as reducing need-based financial aid by redirecting it to those who otherwise “will go somewhere else.”

U.S. News doesn’t collect merit scholarship retention data because, according to rankings guru Robert Morse, “[W]e haven’t thought about it…[T]hese students are going to law school and they need to learn to read the fine print.”

That’s among the least of many profound flaws in the U.S. News methodology. Law school deans know them all, yet pandering to the rankings persists while students and the profession pay the price.

Somewhere in the cumulative behavior of certain schools lies an interesting class action. Particularly vulnerable are recruiters operating at the outer limits of candor to attract students who accumulate staggering loans and no jobs.

Imagine forcing some deans to answer these questions — under oath:

— Where did you go to law school? (That’s foundational — to show they’re smart; for example, GGU’s Dean Ramey graduated from Yale.)

— How many graduates did you put on your school’s temporary payroll solely to boost your U.S. News “nine months after graduation” employment rate? (I don’t know about GGU, but others have.)

— How many have full-time paying jobs requiring a JD? (GGU’s nine-month employment rate is 87.2% of 143 “reporting” 2009 graduates, but the “number with salary” is only 41 (or 29%). Two-thirds of “reporting graduates” had jobs requiring bar passage; only half held permanent positions. And who’s not “reporting”?)

— How many merit recipients lose scholarships? What did you tell those hot prospects when you enticed them with first-year money? Ultimately, how much did they pay for their degrees?

Ironically, even bold typeface disclosure might not change some prospective students’ minds because facts yield to confirmation bias. Each is convinced of overcoming daunting odds to become a winner, but they can’t all be right.

Still, the potential class of law student plaintiffs grows by the thousands every year. If they ever file their lawsuit, the defendant(s) had better get good lawyers.