Law School Will Allow Applicants to Use Artificial Intelligence
Arizona State University’s law school will allow students to use AI on their applications.
Last Thursday, the Sandra Day O’Connor College of Law announced that prospective students may use generative AI tools to draft their applications, so long as applicants disclose that they used such tools and certify that the information submitted is truthful, Reuters reports. Applicants will be allowed to use AI in their applications starting the next admissions cycle on August 10.
“This is just one more of the tools that is in their toolbox when they think about how to present their admissions package,” Arizona State law school dean Stacy Leeds says.
OTHER LAW SCHOOLS’ POLICIES ON AI
In April, the University of California, Berkeley School of Law became the first law school to adopt a formal policy regarding the use of artificial intelligence in the classroom.
According to the policy, students are allowed to leverage AI technology to conduct research or correct grammar, but are prohibited from using it on exams or to compose submitted assignments.
“We felt that the requirement to attest to the fact that ‘all essays and statements are my original work’ covers the use of generative AI such as ChatGPT in a way we are comfortable with for the time being,” Berkeley Law assistant dean of admissions Kristin Theis-Alvarez says.
Just last month, the University of Michigan Law School banned prospective students from using ChatGPT and similar AI tools on applications. The law school even revised its application, requiring applicants to certify that they did not use AI tools to draft their materials.
Michigan’s senior assistant dean Sarah Zearfoss says the AI ban “seemed like the right approach.”
“Will I be able to enforce it? No,” she says. “But in general, I’m relying on the honor of the people who apply in a million different ways, so this is no different.”