Use it or lose it: requiring AI in classrooms and the workplace


I thought my political science course would be a normal online class to take over the summer. When I was looking through the assignments, everything seemed normal. That is, until I checked the chapter worksheets.

I found that all of the chapter worksheets had at least one problem that required me to ask ChatGPT to answer the question, typically to compare and contrast with my human-written answers. 

I have ethical concerns regarding the effect of generative AI and the large language models behind it on human labor and the environment, and did not want to support it myself even if the questions were meant to warn of AI’s inaccuracies. 

I emailed the instructor about my concerns over the assignments requiring AI, even in the context of forming comparisons, and asked for potential alternatives. The instructor repeated the reasons for requiring AI that I already addressed, stating that “AI is part of our future” and it was important to learn how to use it responsibly. 

I asked for alternative ways of completing the assignment, such as referencing classmates’ ChatGPT answers, but the instructor replied that they could not accommodate my request.

I told them I would be willing to skip the questions requiring AI if, depending on how those assignments were weighted, I could still pass with an A; otherwise, I would drop the class. The instructor went ahead and dropped me anyway. 

I didn’t bother reporting the incident to anyone at Delta until now, figuring it wasn’t severe enough to count as a violation of conduct.

I was correct in that assumption. Daryl Arroyo, dean of Social Science, Education and Public Service, referred questions to Marketing and Communications without answering what policies allow an instructor to drop a student for refusing to use AI.

There is no state or federal law prohibiting the use of AI in curriculum that Delta must comply with. Delta’s only policy regarding AI is the Scholarship Essay Policy that was introduced last academic year, which prohibits the use of AI tools to write or edit scholarship essays. 

Otherwise, there is no strict mandate or restriction on integrating AI into curriculum. The twelve required syllabus elements outlined in the Class Information Sheet Checklist do not mention AI, much to my inconvenience when signing up for a class whose professor never had to disclose that assignments would require AI use. 

“Faculty are split on utilizing AI in their classrooms. The beautiful aspect of having academic freedom is that faculty can determine whether or not it is used,” said Kathleen Bruce, chair of the Curriculum Committee, in a Sept. 3 email to The Collegian.

AI has been in discussion at Delta since Fall 2022, beginning with one-on-one consultations about AI productivity tools. Since then, the Professional Development Center has offered several workshops and webinars for faculty to learn about AI and to discuss the ethics of its use.  

One such series of workshops offered by the Professional Development Center is Community of Practice: Designing AI-Resilient Assignments, which explores various approaches to AI integration and how to design AI-resilient assignments through process-based learning.

“There have been things that have been released recently through the Academic Senate; there was some information released for faculty to consider for AI usage,” said Joseph Harrington, associate professor of Spanish and facilitator of Community of Practice: Designing AI-Resilient Assignments. “But the reality is, our institutions are large and it takes a long time for us as we digest all the different needs about what’s appropriate and what institutionally are the things we are going to promote and the barriers that we might want to be putting in.” 

After a Presidents’ Council meeting on Feb. 15, 2024, Delta created an AI Taskforce. Augustine Chavez, Vice President of Administrative Services and chair of the AI Taskforce, was unavailable for comment. 

As Delta continues to discuss how to handle AI matters, AI continues to advance throughout the world of education. 

Last year, Gov. Gavin Newsom signed AB 2876 into law on Sept. 29. The bill requires California’s Instructional Quality Commission to consider AI literacy in its 2025 revision of the K-12 math, science and social science curriculum frameworks and materials.

Since President Donald J. Trump took office for his second term, this year saw various federal initiatives to integrate AI into K-12 education. 

On April 23, Trump signed the executive order “Advancing Artificial Intelligence Education for American Youth,” which provided online AI literacy resources to K-12 youth and established a White House Task Force on AI Education to plan a Presidential AI Challenge, which launched on Aug. 26.

It’s not only in K-12 that AI is being pushed into curriculum. Case Western Reserve University School of Law became the first law school in the U.S. to require first-year students to earn a certificate in AI legal education, starting this February. Ohio State University’s AI Fluency Initiative will require students of all disciplines to learn how to use AI in order to graduate, starting this fall semester with the class of 2029.

The push towards AI in education obviously corresponds with the rise of AI in the workforce, which classrooms are preparing students for. 

A survey conducted by the American Bar Association’s Legal Technology Resource Center found that 30 percent of law firms adopted AI tools in 2024, rising to 46 percent among firms with more than 100 employees.

Businesses like Microsoft and Shopify now require employees to be proficient with AI and to use it in company operations. According to an April 7 article from The Verge, Shopify even prioritizes AI over new human hires for handling tasks. 

An alarming sentiment, considering outplacement firm Challenger, Gray & Christmas reported that technological updates, including AI integration, were the fourth-largest cause of 2025 job losses through the end of July, responsible for 20,219 job cuts. AI was explicitly identified as the cause of another 10,375 cuts. 

The push towards AI is undeniable. The proven benefits of AI, as it currently exists, are much less so. The wave of support for and reliance on AI across classrooms and workplaces is concerning to say the least, given that AI’s actual impact on critical thinking and productivity is not yet well understood. 

This wide-scale adoption of AI comes at a cost. The jobs of living people are being supplanted by AI out of a purported need to keep up technologically. The data centers recently built to support these new AI systems also consume large amounts of energy and increase emissions as a result, such as the Memphis data center run by Elon Musk’s xAI for services like Grok, which poisons the air of the predominantly Black neighborhoods around it.  

Automation that we thought was meant to help us has instead replaced the work of a significant number of honest people who are just trying to make a living. Hopefully, my professor’s example will remain a one-off incident, and that same drive towards AI integration won’t make mandatory AI use a policy at Delta.