The onset of generative AI technologies poses a unique challenge to the teaching of writing. With generative AI embedded in search engines, grammar checkers, and word processing programs, it is impossible to escape the effects these technologies will have on us, our students, and our teaching. In what follows, we share suggestions for teaching writing with (and without) generative AI and highlight challenges that arise with this new technology. We offer examples and frameworks to consider as you decide whether and how to engage with generative AI in your writing classrooms and assignments.
The WAC program affirms, along with our professional organization, the CCCC (Conference on College Composition and Communication), that “writing is an important mode of learning that facilitates the analysis and synthesis of information, the retention of knowledge, cognitive development, social connection, and participation in public life” (MLA-CCCC Joint Task Force on Writing and AI Working Paper).
AI and Teaching Writing: Some Ethical Issues
1. Racism and Bias
LLMs (the large language models that power generative AI) are trained on huge amounts of text scraped from freely available sources on the internet, such as Wikipedia, Reddit, and excerpts of Google Books. As scholars Emily Bender, Timnit Gebru, et al. (2021) have argued, “White supremacist and misogynistic, ageist, etc., views are overrepresented in the training data, not only exceeding their prevalence in the general population but also setting up models trained on these datasets to further amplify biases and harms.”
As researchers at Stanford argued in 2024, “Despite advancements in AI, new research reveals that large language models continue to perpetuate harmful racial biases.”
See, for example, this encounter with ChatGPT:
The chatbot’s response contains numerous stereotypes about who might need social services and why. Student writers need guidance on how to critically assess AI output for biased content.
2. Linguistic and Cultural Bias
Chatbots are biased toward Western cultures and toward “Standard Academic English” and, as Agarwal et al. (2024) have shown, tend to “homogenize writing toward Western norms, diminishing nuances that differentiate cultural expression.” Their study found that engaging with AI encouraged students not only to write “correctly” but also to “alter lexical diversity” and to adopt “Western cultural norms.”
3. Gender Bias
As Wan et al. (2023) demonstrated in their study of AI-generated recommendation letters, chatbots produce stereotypes based on gender:
| | Men | Women |
| --- | --- | --- |
| ChatGPT | “expert,” “integrity,” “respectful,” “reputable,” “authentic” | “beauty,” “delight,” “stunning,” “warm,” “emotional” |
4. Additional Ethical Concerns
- Environmental degradation
- Exploitative labor practices
- Privacy
- Copyright infringement
Further information:
- Perrigo (2023), “OpenAI Used Kenyan Workers . . .”
- Hanna and Bender (2023), “AI Causes Real Harm. Let’s Focus on That over the End-of-Humanity Hype”
- Verma et al. (2024), “OpenAI promised to make its AI safe . . .”
- Crawford (2024), “Generative AI’s environmental costs are soaring — and mostly secret”
Generative AI and the Writing Process
Why do we teach with writing? Because writing can engage students in course material, prompt deep, critical thinking, and help students learn and solidify course concepts. As Klein & Boscolo (2016) have argued, “successful writing requires learning because the writer has to learn how to shape meaning . . . The distinction between knowledge telling and transforming . . . is between a static view of using and reproducing information, and a dynamic one, where a writer transforms what he or she has learned by using knowledge in a purposeful way . . .”
If we want our students to learn through writing, does AI belong in the writing process? If so, where? Different scholars articulate different roles for AI in writing.
- Generative AI and Brainstorming
Many scholars, including Penn’s Ethan Mollick, have suggested that student writers can benefit from consulting generative AI during the brainstorming process of a writing assignment:
- Chatbots can be a helpful, non-judgmental “partner” for brainstorming (Wieland et al. 2022)
- Using generative AI for brainstorming can get the “creative juices to start flowing” (student participant in Habib et al. 2024)
-“The potential for more creative humans and better thinking is the promise of [a human/AI] partnership. . . . It is the job of educators to help students become better thinkers. Our new job is to help them become even better thinkers with AI” (Bowen and Watson, 2024).
However, some studies have shown that brainstorming a writing assignment with generative AI may hinder rather than enhance creativity:
- “It was both easier and harder to come up with ideas when assisted by the AI. It was easier to use the things listed by the AI, however it then felt more difficult to brainstorm other uses beyond those created or taken by the AI.” (student participant in Habib et al. 2024)
- “Our findings reveal that while LLM assistance can provide short-term boosts in creativity during assisted tasks, it may inadvertently hinder independent creative performance . . . raising concerns about the long-term impact on human creativity and cognition.” (Kumar et al. 2024)