Students in data science classes this spring faced a difficult dilemma: Should they launch a facial recognition application that lets the unemployed access benefits, even though women and people of color are often shut out? And what is it like to be one of the users the system fails?
As society ponders the dangers and unknowns of generative AI, Cornell lecturer Liz Karns is giving statistics students a first-hand look at the implications of their decisions.
She created an immersive video, “Nobody’s Fault: An Interactive Experience in Data Science Practice,” designed to build viewers’ empathy and awareness of unintended consequences. Karns introduced it in her Integrated Ethics in Data Science course at the end of the spring semester.
“Statistics students who build large-scale predictive models have a poor understanding of what happens, and who is affected, if they use a bad model,” said Karns, a senior lecturer in the ILR School and the Cornell Ann S. Bowers College of Computing and Information Science. “I always had a problem with that disconnect.”
In “Nobody’s Fault,” students experience what it’s like to be a data scientist facing a moral conflict. The video pauses from time to time to ask viewers how they would handle the difficult situations depicted. As they make decisions, the plot changes and they watch the consequences unfold, including how an unemployed woman is affected when the facial recognition application fails to work for her.
After a series of unfortunate consequences, the scene rewinds, better options are presented, and the students see how things could have turned out differently for the woman seeking her benefits.
“This video gave us a real-life experience of an ethical dilemma,” said Britt Snyder, MILR ’24. “It reinforced our learning by showing us the consequences of our decisions in real time, and how a seemingly harmless few percent can have such a big impact on society as a whole.”
Karns added: “There are no regulators, no certification exams, no degree requirements, no single discipline in data science.”
Karns developed the script based on three real cases. She then collaborated with Martin Percy, a Webby and Clio award-winning, London-based interactive video director, to adapt it. Percy participated in each stage of production via Zoom and co-directed with Karns.
The video was shot by the Center for Teaching Innovation (CTI) under the direction of instructional designers Ksenia Ionova and Amy Cheetl. The project was supported by an Innovative Teaching and Learning Award from CTI.
Karns hopes to produce several more videos. She also plans to make the immersive experience, along with accompanying materials, available to classes and instructors across Cornell University.
Karns said she teaches “what individual decision-making looks like and why personal ethics ultimately matter. It’s the humans, the data scientists, who decide how these systems are deployed,” she said. “At no point can you say, ‘It’s just a system and the technology is doing its job.’ It’s always a human choice, and I want students to feel a little more responsible for that.”
For Joyce Gorospe, MILR ’24, one of the biggest lessons from the course was that responsibility for ethics lies with every individual, not just with those at the top.
“It’s one thing to read about ethical issues, and another to see real people acting them out and the emotions behind their reactions,” Gorospe said. “It made it easy to put ourselves in their shoes and imagine being in a situation that may or may not have a clear answer, depending on how aware we are of these issues.”
Sandy Malconley is a freelance writer and editor.