AI enters the classroom as law schools prep students for a tech-driven practice

When it comes to using artificial intelligence in legal education and beyond, the key is thoughtful integration.

“Think of it like a sandwich,” said Dyane O’Leary, professor at Suffolk University Law School. “The student must be the bread on both sides. What the student puts in, and how the output is assessed, matters more than the tool in the middle.”

Suffolk Law is taking a forward-thinking approach to integrating generative AI into legal education, starting with a required AI course for all first-year students designed to equip them to use, understand and critique AI as future lawyers.

O’Leary, a long-time advocate for legal technology, said there is a need to balance foundational skills with exposure to cutting-edge tools.


“Some schools are ignoring both ends of the AI sandwich,” she said. “Others don’t have the resources to do much at the upper level.”

Professor Dyane O’Leary, director of Suffolk University Law School’s Legal Innovation & Technology Center, teaches a generative AI course in which students assess the ethics of AI in the legal context and, after hands-on experimentation, evaluate the strengths and weaknesses of various AI tools for a range of legal tasks.

One major initiative at Suffolk Law is the partnership with Hotshot, a video-based learning platform used by top law firms, corporate lawyers and litigators.

“The Hotshot content is a series of asynchronous modules tailored for 1Ls,” O’Leary said. “The goal is not for our students to become tech experts but to understand the uses and implications of AI in the legal profession.”

The Hotshot material provides a practical introduction to large language models, explains why generative AI differs from tools students are used to, and uses real-world examples from industry professionals to build credibility and interest.


This structured introduction lays the groundwork for more interactive classroom work when students begin editing and analyzing AI-generated legal content. Students will explore where the tool succeeded, where it failed and why.

“We teach students to think critically,” O’Leary said. “There needs to be an understanding of why AI missed a counterargument or produced a junk rule paragraph.”

These exercises help students learn that AI can support brainstorming and outlining but isn’t yet reliable for final drafting or legal analysis.

Suffolk Law is one of several law schools finding creative ways to bring AI into the classroom — without losing sight of the basics. Whether it’s through required 1L courses, hands-on tools or new certificate programs, the goal is to help students think critically and stay ready for what’s next.


Proactive online learning

Case Western Reserve University School of Law has also taken a proactive step to ensure that all its students are equipped to meet the challenge. In partnership with Wickard.ai, the school recently launched a comprehensive AI training program, making it a mandatory component for the entire first-year class.

“We knew AI was going to change things in legal education and in lawyering,” said Jennifer Cupar, professor of lawyering skills and director of the school’s Legal Writing, Leadership, Experiential Learning, Advocacy, and Professionalism program. “By working with Wickard.ai, we were able to offer training to the entire 1L class and extend the opportunity to the rest of the law school community.”

The program included pre-class assignments, live instruction, guest speakers and hands-on exercises. Students practiced crafting prompts and experimenting with various AI platforms. The goal was to familiarize students with tools such as ChatGPT and encourage a thoughtful, critical approach to their use in legal settings.

Oliver Roberts, CEO and co-founder of Wickard.ai, led the sessions and emphasized the importance of responsible use.

While CWRU Law, like many law schools, has general prohibitions against AI use in drafting assignments, faculty are encouraged to allow exceptions and to guide students in exploring AI’s capabilities responsibly.

“This is a practice-readiness issue,” Cupar said. “Just like Westlaw and Lexis changed legal research, AI is going to be part of legal work going forward. Our students need to understand it now.”

Balanced approach

Starting with the Class of 2025, Washington University School of Law is embedding generative AI instruction into its first-year Legal Research curriculum. The goal is to ensure that every 1L student gains fluency in both traditional legal research methods and emerging AI tools.

Delivered as a yearlong, one-credit course, the revamped curriculum maintains a strong emphasis on core legal research fundamentals, including court hierarchy, the distinction between binding and persuasive authority, primary and secondary sources and effective strategies for researching legislative and regulatory history.

WashU Law is integrating AI as a tool to be used critically and effectively, not as a replacement for human legal reasoning.

Students receive hands-on training in legal-specific generative AI platforms and develop the skills needed to evaluate AI-generated results, detect hallucinated or inaccurate content, and compare outcomes with traditional research methods.

“WashU Law incorporates AI while maintaining the basics of legal research,” said Peter Hook, associate dean. “By teaching the basics, we teach the skills necessary to evaluate whether AI-produced legal research results are any good.”

Stefanie Lindquist, dean of WashU Law, said this balanced approach preserves the rigor and depth that legal employers value.

“The addition of AI instruction further sharpens that edge by equipping students with the ability to responsibly and strategically apply new technologies in a professional context,” Lindquist said.

Forward-thinking vision

Drake University Law School has launched a new AI Law Certificate Program for J.D. students.

The program is a response to the growing need for legal professionals who understand both the promise and complexity of AI.

Designed for completion during a student’s second and third years, the certificate program emphasizes interdisciplinary collaboration, drawing on expertise from across Drake Law School’s campus, including computer science, art and the Institute for Justice Reform & Innovation.

Students will engage with advanced topics such as machine vision and trademark law, quantum computing and cybersecurity, and the broader ethical and regulatory challenges posed by AI.

Roscoe Jones, Jr., dean of Drake Law School, said the AI Law Certificate empowers students to lead at the intersection of law and technology, whether in private practice, government, nonprofit, policymaking or academia.

“Artificial Intelligence is not just changing industries; it’s reshaping governance, ethics and the very framework of legal systems,” he said. 

Simulated, but realistic

Suffolk Law has also launched an online platform that allows students to practice negotiation skills with AI bots programmed to simulate the behavior of seasoned attorneys.

“They’re not scripted. They’re human-like,” O’Leary said. “Sometimes polite, sometimes bananas. It mimics real negotiation.”

These interactive experiences, in either text or voice mode, allow students to practice handling the messiness of legal dialogue, an experience that is hard to replicate with static casebooks or classroom hypotheticals.

Unlike overly accommodating AI assistants, these bots shift tactics and strategies, mirroring the adaptive nature of real-world legal negotiators.

Another tool on the platform supports oral argument prep. Created by Suffolk Law’s legal writing team in partnership with the school’s litigation lab, the AI mock judge engages students in real-time argument rehearsals, asking follow-up questions and testing their case theories.

“It’s especially helpful for students who don’t get much out of reading their outline alone,” O’Leary said. “It makes the lights go on.”

O’Leary also emphasizes the importance of academic integrity. Suffolk Law has a default policy that prohibits use of generative AI on assignments unless a professor explicitly allows it. Still, she said the policy is evolving.

“You can’t ignore the equity issues,” she said, pointing to how students often get help from lawyers in the family or paid tutors. “To prohibit [AI] entirely is starting to feel unrealistic.”
