I embraced AI in my English class at Community College – and my students loved it



My 12-year-old twins can prompt ChatGPT with impressive fluency. They have generated AI music, transformed family photos into faux Van Gogh-style portraits, and built chatbots that mimic their favorite anime characters. As their mother, I'd like to say it's because they're brilliant.

My kids are AI literate because of a weighted combination of luck and privilege. My husband and I both hold graduate degrees in fields that require computer fluency. Their Pennsylvania school district, Haverford, consistently ranks among the top districts in our state. Their schools benefit from stable funding, high-quality teachers, and a strong IT department, all of which make room for discussions about AI in sixth-grade classrooms.

Susan E. Ray

A 20-minute drive from their school is Delaware County Community College, where I have taught for more than 10 years. Many of our students come from under-resourced high schools. My classroom is full of recent graduates who have been taught that AI is nothing more than academic misconduct. One of my returning adult learners said she had heard of AI but had no idea what it was. After class, I briefly demonstrated ChatGPT on the overhead projector. She sighed, "Well, now I know why my daughter suddenly gets through her homework so quickly."

This knowledge gap is not just technological. It is generational, socioeconomic, and institutional. And it grows wider by the day. As a first-year writing professor at a community college, I knew that if I did not meet this moment with intention, I would leave behind the most vulnerable students.

I felt this realization as a call to action, and I didn't just jump in; I shot myself out of a cannon. Over the past six months I have logged more than 150 hours building fluency across multiple large language models. I studied the terminology, immersed myself in the ethics and mechanics of generative tools, and leaned on my family's IT mindset. I read books, listened to podcasts, and had long conversations with colleagues about what fair and ethical AI use should look like in our courses.

In May I received a grant to provide ChatGPT subscriptions to my fall Composition I students. These classes will meet in computer labs, giving students space to explore the tools in a collaborative environment. Paid OpenAI access gives students faster responses, audio-to-text, custom learning tools, and Sora, OpenAI's image and video generator, letting them engage more deeply with the reading material. Throughout the semester I will collect data, conduct research, and assess how this access shapes learning and digital literacy.

I also used the grant this summer to integrate the AI detection tool Pangram into my Composition II course. Rather than playing Sherlock Holmes and scrutinizing student prose for cheating, I use Pangram's findings to provide transparency to both students and instructors. Unlike detectors I have used in the past, Pangram identifies AI-generated sentences even when they have been subtly humanized, removing a familiar crutch that many students have reached for to avoid the sometimes uncomfortable process of developing as writers.

The most effective tool I have adopted is the AI Transparency Journal, a shared Google Doc in which students track all of their AI interactions throughout the semester. They record each prompt, how the AI responded, what surprised them, and where they struggled, documenting their process, experiments, and growth.

In my current summer Composition II course, I started with an experiment. Students uploaded the syllabus to ChatGPT, introduced their background, goals, and past writing experience using a custom prompt, and asked the AI to identify what they might enjoy and how the course could help them grow.

The results were eye-opening. Students reported feeling more prepared and reflective before reading a single assigned text. Even those skeptical of AI at first were amazed at how personalized and surprisingly insightful the responses were. Several students shared their reflections with me:

  • "The response felt like it understood both my strengths and my struggles. It connected my love of reading the Quran to the diverse literature we explore."
  • "I never expected AI to suggest keeping a personal phrase list to build my vocabulary. That idea alone has changed the way I'm approaching this class."
  • "Honestly, it was like getting my horoscope read, but more useful. The AI's clarity helped me understand the syllabus better than reading it on my own."

Even students who didn't connect with the AI's response appreciated how effectively it captured their learning style and offered a game plan for tackling an accelerated course. Most importantly, it sparked metacognition, reflection, and writing before they cracked open the first literary text.

ChatGPT generated this image based on lines from Langston Hughes' "Let America Be America Again."

I am writing this while grading submissions from the midpoint of the poetry unit, which is also the midpoint of the six-week course. My students selected favorite passages from either Langston Hughes' "Let America Be America Again" or Dunya Mikhail's "The War Works Hard" and used free AI image generators to create images capturing the theme. They then posted their images and evaluated how well they felt the AI had captured what they had imagined.

Many students were fascinated by the generated images, and their journal responses averaged twice the required length. The few who were disappointed were eager to explain why. For part two of the assignment, I asked them to respond to at least one other image; most chose to respond to two or three.

Having passed the midpoint of the current class, I compared my current students' progress with that of students in the same ENG 112 course a year ago, before I integrated Pangram or any formal AI tools. This summer I started with 37 students, and 29 are still actively submitting work. Of those, 24 have earned an A or B and have consistently completed assignments. In contrast, last summer I started with 38 students, but by the fourth week only 21 were still engaged, and only 17 completed the course with a C or higher.

Craiyon generated this image based on lines from Dunya Mikhail's "The War Works Hard." The garbled lines show the difficulty of prompting AI with highly metaphorical poetry.

That said, I have struggled with the sheer scale of my AI integration. I have made more Zoom calls with students than in any previous semester, walking technically inexperienced students through the many steps required to navigate the AI interfaces.

But I'm not complaining. One student in her 50s had only ever used a computer for email and Facebook. After one of our long video calls, she emailed me: "Dr. Ray, I'm grateful for your time today. I'm so glad you showed us all this. I didn't understand what all of this AI was before."

And beneath all our trial and error, something else is emerging: engagement, community, new energy, a positively charged learning space, and a palpable undercurrent of excitement, even in a virtual classroom.

So I'll leave you with this: students need guidance in navigating these new technologies. If we don't teach them how to engage ethically with AI, we won't just widen the skills gap; we'll deepen the equity gap.

It's time to shift the conversation from fear to responsibility. Our students are ready. We need to meet them there.



