Inside the Rankings: Behind the Metrics That Shape Legal Education

Session Summary

Donna

Good afternoon. We are The National Jurist, the magazine for law students. Thank you for joining our session today.

Joining me on the panel are Haylee Esplin, our Marketing and Circulation Coordinator; Michelle Weinberg, our Vice President and Editorial Director; and Jack Crittenden, our CEO and Publisher. I’m Donna Campbell, Managing Editor.

To get us started, Jack will walk through how we develop our rankings and honor rolls.

Jack

Thank you, Donna. And thank you all for being here — we really appreciate it. We know there’s a lot happening at this conference, so thank you for spending your time with us.

Please feel free to raise your hand and ask questions throughout the session. There’s no mic in the audience, but we can bring you up if needed.

The National Jurist began in 1991, when I graduated from American University Law School. Our first ranking was published in the spring of 1993. That ranking was based on brand-new data from The Princeton Review, which had just begun surveying law students about their experiences at their schools.

At the time, the data had been collected over about an 18-month period. We were able to analyze it and publish a ranking focused entirely on student perspectives. That was exciting for us.

Seton Hall ranked number one because students loved their experience there. Harvard, on the other hand, ranked near the bottom — second or third from last. That was controversial, and Princeton Review received significant criticism. Since then, Harvard has made major improvements, and we like to think that student feedback helped drive those changes.

The goal of that early ranking was to empower students — to give them information they could use to advocate for improvements at their schools. We ran that ranking a few times, but Princeton Review was receiving a lot of pushback, and it eventually moved to surveying only one-third of schools each year. At that point, we mutually decided to discontinue the ranking.

In 1998, we launched preLaw magazine. That gave us a more compelling reason to do rankings — not just to help law students improve their schools, but to help prospective students make informed decisions about where to attend.

We started with one ranking and expanded over time as we realized our readers had different goals. Not every student is looking for the same thing, so different rankings highlight different strengths.

Our methodology has evolved over the years, with most changes happening early on. Today, we use what we call a GPA-style format. The idea was simple: if law schools can grade students using GPAs, we can grade law schools the same way.

For each ranking, we first identify the primary criteria that matter most. For example, in Best Value Law Schools, we look at employment rates, bar passage rates, and cost — including tuition and overall cost of education.

We then decide how much weight each category should carry. Those decisions are made by our editorial team after conversations with students, readers, law school deans, and other experts in legal education.

We don’t publicly name those contributors anymore. Early on, I credited someone who later disagreed with the final ranking and didn’t want their name associated with it. Since then, we consult broadly, thank contributors collectively, and keep names private.

Once the weights are finalized, we publish them in our Dean’s Guide, which is available at our booth and digitally. For example, in Best Value, bar passage counts for 15%, employment for 35%, and tuition, cost of living, and debt for 50%.

Each school receives a grade in each category. Those grades are weighted and combined to calculate a cumulative GPA. The math is straightforward — assigning the grades is where it gets complex.
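The weighted-GPA arithmetic described above can be sketched in a few lines. In this illustration, the weights are the published Best Value weights quoted earlier (bar passage 15%, employment 35%, cost and debt 50%), but the letter grades and the 4.0-scale point values are assumptions for demonstration, not actual National Jurist data:

```python
# Illustrative sketch of a GPA-style score: weighted average of category grades.
# Point values below are assumed (a common 4.0-style scale), not official.
GRADE_POINTS = {"A+": 4.33, "A": 4.00, "A-": 3.67, "B+": 3.33,
                "B": 3.00, "B-": 2.67, "C+": 2.33, "C": 2.00}

# Best Value weights as stated in the Dean's Guide example above.
WEIGHTS = {"bar_passage": 0.15, "employment": 0.35, "cost_and_debt": 0.50}

def cumulative_gpa(grades: dict) -> float:
    """Combine a school's category grades into one weighted GPA."""
    return sum(GRADE_POINTS[grades[cat]] * w for cat, w in WEIGHTS.items())

# Hypothetical school: strong outcomes, middling cost profile.
example = {"bar_passage": "A", "employment": "A+", "cost_and_debt": "B+"}
print(round(cumulative_gpa(example), 2))  # 0.15*4.00 + 0.35*4.33 + 0.50*3.33
```

The weighting step is simple multiplication; as Jack notes, the hard part is assigning the category grades in the first place.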

We use a modified grading curve. Ideally, the top 10% of schools receive A or A+ grades. Sometimes the data aligns cleanly; other times it doesn’t.

One example is CUNY, which historically has an exceptionally high public interest placement rate. If we followed a strict curve, multiple schools would receive A+ grades despite large gaps in outcomes. In cases like that, we modify the curve so only the truly exceptional school receives an A+.

After all data is tabulated — without knowing school names — we conduct an extensive accuracy and fairness review. Only then do we match schools to results and compare year-over-year changes. If there’s a significant shift, we investigate and often contact the school to verify the data.

We now have a full-time data editor, Jamie Rand, who oversees data collection and validation. She’s doing a great job, and we’re thrilled to have her on board.

We’re always open to feedback. Conversations with law schools — even difficult ones — help us improve our methodology. Ultimately, our goal is to help students understand the landscape and assess which schools best align with their goals.

Before I move on, I’d like to open it up for questions about methodology.

Audience

How do you research what prospective students are actually interested in right now? Trends change quickly, and applications seem to reflect what schools are promoting. How do you get beyond that to understand broader trends?

Jack

That’s a great question. I don’t want to steal Haylee’s thunder — she’ll talk about the data we monitor — but from the editorial side, we talk regularly with students, readers, and professionals in legal education. Conferences like this are part of that.

We also conduct surveys. This year, we ran two major surveys — one internally with law students and prospective students, and another through an outside research firm.

One interesting finding was that, for the first time, health law ranked as the top area of interest among law students. At the same time, interest in racial justice declined significantly compared to five years ago. You don’t want to overreact to one survey, but trends like that matter.

Between surveys, editorial outreach, and readership data, we’re constantly adjusting.

Michelle

Now we’ll look at what our data reveals about performance and how schools can turn that information into actionable improvements.

We analyzed three to five years of data across four of our major rankings and honor rolls. The first is Best Schools for Law Firm Employment, which runs in our back-to-school issue of preLaw.

This ranking evaluates schools based on employment rate (40%), employment type and salary outcomes (40%), and alumni prominence via Super Lawyers (20%). Employment rates are calculated as a three-year average using ABA data.

Looking at trends, schools with strong salary and employment outcomes consistently perform well. Alumni visibility also helps differentiate schools at the top.

Consistent leaders include Cornell, Columbia, Duke, Northwestern, USC, Harvard, Michigan, Virginia, and Vanderbilt.

Next is Best Value Law Schools, which runs annually in our fall issue. Public schools dominate the top 20, with BYU as the only private school consistently in the top 10. Schools in lower-cost markets with strong employment pipelines have structural advantages.

Jack

One clarification — for honor rolls, the grade matters more than the numerical rank. An A+ is an A+. Beyond the top tier, schools are listed alphabetically, not numerically.

Michelle

Reducing average debt through scholarships and financial coaching is one of the biggest drivers of ranking movement. Consistent leaders include BYU, Georgia, Alabama, Florida, Kansas, and Florida State.

Our Best Schools for Practical Training ranking relies heavily on supplemental data schools provide, including clinics, externships, simulations, and pro bono opportunities.

Jack

We include an “other” category to account for specialty programs that don’t show up clearly in standard ratios. That ensures schools receive proper credit.

Michelle

Field placements and externships are major differentiators. Simulation-heavy programs and expanded pro bono requirements correlate with higher rankings.

Clinic guarantees and mandatory pro bono programs have increased significantly. Consistent leaders include CUNY, Northeastern, University of St. Thomas (Minnesota), and UC Irvine.

The final ranking we analyzed is Best Schools for Public Service, which evaluates employment outcomes, curricular depth, and debt support across public interest, government, and criminal law.

Jack

Public service is one of our most challenging rankings, particularly when evaluating LRAP and financial support. We encourage schools to share detailed information — it makes a real difference.

We’re currently collecting data for practical training and will soon begin data collection for Best Law School Buildings. If you’ve made upgrades — even small ones — please let us know.

Haylee

Choosing a law school is one of the biggest decisions students make. Across our platforms — search, email, social — we consistently see four primary drivers: rankings, cost, debt, and career outcomes.

Rankings help students organize and weigh those concerns. Looking at Google search data over the past year, ranking-related stories dominate our top-performing pages and search queries.

Over five years, interest in rankings has remained steady or increased. Students aren’t tired of them — they want more context and clarity.

When we look at our top-performing stories across platforms, rankings consistently outperform non-ranking content by a wide margin. Students revisit them, compare schools, and use them as research tools.

When schools understand what students are searching for, they can position themselves more effectively around what matters most.

Jack

Rankings help students understand the unique strengths of schools they might not otherwise consider. For example, Lewis & Clark has nationally recognized environmental and animal law programs — rankings help communicate that beyond the Pacific Northwest.

Most students don’t attend top-10 schools. Our goal is to help the other 90% understand which schools align best with their goals.

If you receive an A or A+, that’s a strong differentiator. We encourage schools to promote grades — not just ordinal rankings — across websites, emails, and marketing materials.

Audience

Would you consider partnering with schools to feature real student voices alongside rankings?

Donna

We absolutely value student perspectives in editorial content and are always looking for ways to include them.

Jack

As long as editorial integrity isn’t compromised, we’re very open to partnerships. Haylee has been doing great work connecting with students through video and on-campus visits.

Readers are far more receptive to hearing from current students than from administrators — and we recognize that.

Audience

How should schools pitch student or alumni sources?

Donna

Email us directly. We’re always looking for student and recent alumni voices, especially younger attorneys. Alumni can be harder to secure, so school connections are incredibly helpful.

Jack

The closer our relationships with schools, the better the final product for readers. Please be responsive, and we’ll treat you with full respect.

Donna

Thank you all for attending. Please visit our booth — we’ll be announcing our Most Influential People in Legal Education at 5:45 and will have our latest issues available.