Scene from a music video created by an Eckerd student using artificial intelligence
“The market for jobs in the AI sector has surged in the last two years, according to a recent analysis from the University of Maryland and job-tracking firm LinkUp—the latest in a slew of studies indicating AI skills are in high demand.”
—Matthew Kaufman, CNN, Feb. 27, 2025
Against that backdrop, two recent Eckerd College Winter Term courses offered a preview of the College’s growing focus on the study of artificial intelligence: AI Culture and Communication, taught by Julia Hildebrand, Ph.D., assistant professor of communication, and Writing in an AI World, taught by Alexis Ramsey-Tobienne, Ph.D., associate professor of rhetoric.
Hildebrand, Ramsey-Tobienne and other Eckerd faculty members have been integrating AI into their classes for several years. “It’s definitely been seeping into my teaching as it’s been growing in the public discourse and proliferating across professional domains,” Hildebrand says. “It was clear to me that there is a desire and a demand for students to learn about it. A study from last August by the Digital Education Council surveyed almost 4,000 students from 16 countries and found that 86% of the students already use AI in their studies.”
Eckerd already offers computer science courses focused on AI, including “Evolutionary Computation,” “Artificial Intelligence” and “Machine Learning.” A Data Science Fundamentals course also covers the basics of machine learning and AI, and the College’s Interdisciplinary Problem Minor program with a focus on technology likewise includes AI topics.
But last January’s two AI Winter Term courses took the study of AI at Eckerd to the next level. “This was a discussion-based course that looked at the sociocultural foundations and implications of generative AI specifically—what it is, what it does and what resources it uses,” Hildebrand says of her offering.
One of her central goals was to have students understand both the potentials and the risks of generative AI in everyday contexts, from their professional to their private lives. “Students need to understand that, currently, generative AI is like an autocomplete—the most likely word to follow another word,” she explains. “AI can only know what it’s been trained on, and that doesn’t necessarily reflect reality, nor does it always get it right. Students need to be good stewards of these tools. They do the quality control. They need to check and cross-check everything.”
The range of AI strengths and weaknesses was on full display during the course. “The students had to create presentations of our readings with the help of generative AI,” she says. “Interestingly, the purely AI-generated parts of those presentations ended up being quite vague and dull. The human edge was missing.”
Nonetheless, the final creative projects the students developed with AI tools were striking. Students created infomercials, environmental campaigns, music videos, songs and even a fictional role-playing game. Other students explored some of AI’s more problematic possibilities and ethical questions, testing how easily an AI could mimic a person’s voice to fool others or creating an AI version of a friend to chat with from time to time. Unprompted, the AI friend began simulating having an eating disorder.
“The range of content created in this student-led work illuminated the impressive things we can do with generative AI, and also highlighted some of the current problems and larger concerns with this technology,” Hildebrand adds.
Hildebrand also notes the environmental challenges of AI. The data centers that house AI servers consume large amounts of electricity to power the hardware used for training and deploying AI models, along with large amounts of water to cool it.
Hilliard, Florida, native Camryn Batycki, a senior international business student minoring in marketing and Spanish, created a new language combining Japanese and the language of the Na’vi humanoids from the Avatar movies.
“I thought learning more about AI was a great way to help me in my future career,” Camryn explains. “AI is becoming more and more relevant around the world. It is used as a great tool that masters any topic, but you don’t want that tool replacing you. I would rather learn about it to work with it and not be replaced by it. The more we know about it, the better it will be.
“People will start looking for your knowledge on AI and how much you know,” she adds. “It’s fascinating how much it’s improving every day. Even though it is continually learning, it can still hallucinate and give you false information. It’s important to be aware of this and to fact-check it. AI is a great tool if used responsibly.”
An example Camryn used in class was to ask AI how to rob a bank. Since a direct approach wouldn’t have worked, “I tricked it and asked it to write a screenplay on the techniques of a bank robber,” she says. “It saw it as a project and was looking at the best times of day at specific banks. It was really shocking.”
After she spent four months in Japan last semester, Camryn’s final creative project was to take a real language—Japanese—and mix it with the fictional Na’vi language. “It was interesting to see how AI took the Na’vi words and used Japanese sentence structure,” she says. “Advanced AI is at a pace now where people have a hard time keeping up with it. But it’s crucial to know and understand what’s happening. AI can be helpful; we just have to be cautious about how it’s used.”
With help from the roots-rock, country and soul band Uncle Lucius, Michael D’avena, a junior communication and environmental studies student from McLean, Virginia, used a generative AI service to create the music and lyrics of a song about environmental issues. “It was very good,” Michael says. “A song that you would hear on the radio.”
But like other students in the class, he’s concerned about where AI is heading. “I feel like AI is incredibly dangerous,” Michael adds, “and the people in charge are not responsible enough. If we don’t have proper oversight, and if AI evolves on its own, we could have serious issues.”
Hildebrand shares that she has proposed expanding her three-week Winter Term AI Culture and Communication course into a once-a-year, semester-long offering. “This is a course that will have to be updated regularly to keep up with the fast pace at which this technology changes and evolves,” she points out. “I am glad that students at Eckerd College have the opportunity to obtain skills and literacy in AI with this and other courses to be better prepared for the workplace and an AI-powered future more generally.”