Some law professors are banning AI in their courses. Others are cautiously adding it.
At Mitchell Hamline, Gregory Duhl is doing something much more ambitious. He redesigned his Contracts course to embed AI throughout rather than ignoring it or treating it as a side issue. Given Mitchell Hamline’s history of pedagogical innovation, it is not surprising that this kind of redesign would emerge there.
His recent article, All In: Embedding AI in the Law School Classroom, describes his experiment in detail. It’s a thoughtful account of a colleague trying to prepare students for both the bar exam and legal practice.
The experiment is particularly significant because it challenges the assumption that students should not use AI in 1L courses for fear of undermining the development of legal analysis skills.
The Core Argument
Duhl argues that it is not realistic to treat AI as peripheral to legal education and practice. AI tools can already perform many teaching and learning tasks as well as – or better than – humans.
His answer is to embed AI in courses rather than treat it as something to prohibit and police. He integrates it into the course design to:
- Prepare students to pass the bar exam
- Orient students to a profession increasingly transformed by AI
- Shift from policing technology to using it for learning
- Promote equity for diverse learners
Major Techniques in the Course Redesign
The article describes a comprehensive implementation strategy. Several techniques stand out.
Using AI-Generated Instructional Materials
Duhl worked with an instructional designer to convert lecture notes into AI-generated videos with visuals and hypotheticals. He notes that these videos, once refined, may surpass some aspects of his traditional lectures, especially for visual learners. Because AI can generate clear explanations of doctrine, he can use class time to engage in higher-level analysis and deepen students’ understanding. He shifts from being primarily a transmitter of information to a designer of learning.
Teaching Legal Doctrine and Analysis Using AI
Rather than leaving students to experiment with AI privately, the course includes structured exercises requiring students to:
- Prompt AI to analyze doctrinal problems
- Critique AI-generated answers
- Compare AI output to their own reasoning
- Revise prompts to improve quality
In other words, students learn contracts doctrine while supervising, testing, and refining their use of AI tools. AI becomes a valuable pedagogical tool, not just a shortcut.
Integrating AI Throughout the Course
Students use AI in preparing for class, drafting assignments, and reflecting on feedback. The goal is to normalize responsible, critical engagement rather than treat AI as either a forbidden activity or a superficial add-on.
The Human Role in an AI-Embedded Classroom
If AI can do a good job of promoting doctrinal and analytical competence, then faculty can devote more time to dialogue, modeling judgment, mentoring, and cultivating professional identity.
This should feel familiar to us in the dispute resolution world. We have long argued that effective practice requires more than doctrinal knowledge. It requires judgment, reflection, and awareness of intangible interests.
In a related article, Teaching Contracts: My Journey with Spellbook and AI Pedagogy, Duhl focuses on one dimension of this redesign – integrating Spellbook, a professional contract-drafting tool, into his Contracts course. He explains that the educational value of such tools depends on how faculty use them. He emphasizes supervising students’ drafting, reviewing, and negotiating of contracts and assessing their judgment about when to accept, modify, or reject AI tools’ suggestions.
Implications for Dispute Resolution Education and Practice
There are at least three implications worth considering.
First, AI-assisted drafting and analysis are likely to become routine in negotiation and mediation practice. Students who don’t learn to critically supervise AI are likely to be at a disadvantage.
Second, as AI becomes increasingly capable of instantly generating plausible legal analysis, the premium shifts to skills that are central to dispute resolution, including perspective-taking, strategic thinking, and relational judgment.
Third, Duhl’s willingness to redesign a required first-year course should prompt similar experimentation in dispute resolution curricula. Merely adding a brief AI discussion to an existing syllabus does not take full advantage of the pedagogical opportunities.
A Serious Invitation
Duhl’s Spellbook article explains why teaching AI to law students is so important:
The legal profession has progressed beyond debating whether to adopt AI for contract work; the focus now is on selecting appropriate tools and employing them effectively and efficiently. Because students will inevitably practice in a profession transformed by AI, they must learn to work with AI technologies in law school, a controlled setting where they can safely experiment, make mistakes, and refine their oversight and judgment.
All In offers a detailed roadmap for experimentation. Duhl is trying these ideas during the current semester, and we don’t yet know the results.
He promises to publish a sequel describing what worked well and what could be improved. As with other experiments testing new techniques, learning from the less successful elements will likely lead to improvements.
He concludes All In:
This is an invitation to all my colleagues to look at the tools of the future not as a threat to your expertise, but as a path to reclaim the joy of creation in your own professional work. We know that AI is already transforming legal education. Let us shape that change intentionally, guided by our institutional missions and our values, to create more effective and equitable classrooms to educate future lawyers.
Take a look.