USC Computer Science Students Step Forward with Queer in AI Movement



Nathan Dennler is one of 50 international co-authors of the Queer in AI paper that won Best Paper at the ACM Conference on Fairness, Accountability, and Transparency (ACM FAccT). Photo/Levi Zheng

Nathan Dennler is many things: a Viterbi PhD student studying human-robot interaction, a proud Massachusetts native, a figure skater, and a textile artist. He is also a member of Queer in AI, a global, volunteer-run grassroots organization that aims to build an inclusive and equitable space for queer people working in artificial intelligence (AI).

Launched in 2017, Queer in AI aims to promote diversity and inclusiveness in AI research, ensuring that the experiences and needs of LGBTQ+ people are incorporated into AI research and systems. Members include not only undergraduate and PhD students but also professors, researchers, and people from industry. With about 870 members across 47 countries, much of the group's coordination happens online via Slack.

Dennler, who is co-advised by Professor Maja Matarić and Assistant Professor Stefanos Nikolaidis, first discovered Queer in AI at the Neural Information Processing Systems conference (NeurIPS) in 2019, where he was demonstrating a hair-combing robot developed in Nikolaidis' lab to help people with disabilities.

Dennler first learned about Queer in AI at a conference where he was demonstrating a hair-combing robot. Photo/Peter Howard.

Soon after, he volunteered with the group to host a workshop exploring the potential harms of AI systems that specifically affect queer people. The resulting affinity workshop, held at AIES (the Conference on Artificial Intelligence, Ethics and Society) in August, has become a key pillar of Queer in AI's efforts to make people feel more comfortable discussing queer-specific issues related to AI.

“Generally, I think the problem with how those in power try to solve problems with AI is that they either assume that everything can be reduced to something quantitative, or that the categories are fixed,” Dennler said. “For many queer people, things like their understanding of gender identity are constantly evolving and cannot be accurately described by fixed categories or numbers, and many AI systems are unable to adapt to changes in their users.”

From algorithms to advocacy

In June 2023, Dennler and the 50 international co-authors from Queer in AI won Best Paper at the ACM Conference on Fairness, Accountability, and Transparency (ACM FAccT) for "Queer In AI: A Case Study in Community-Led Participatory AI."

According to the case study, queer people are underrepresented in US STEM fields by at least 20% relative to the national population, and they experience higher rates of career limitation, harassment, and professional devaluation. The authors write that queer people also face bias and exclusion in AI, from medical discrimination and misgendering to the censorship of queer content. That doesn't have to be the case, says Dennler.

Dennler has been figure skating since he was eight years old. Photo/Willow Cai.

“Involving queer people in the development of these systems helps identify use cases that are actively harmful to queer people, and helps shape the way we conceptualize these issues,” Dennler said. “For example, there has been prior work predicting gender and sexuality from faces in order to recommend products to people, but this carries a fairly large risk of misgendering people.”

Dennler has explored some of these ideas in his own work. Now in his fourth year, his research in Matarić's Interaction Lab explores the role of voice, appearance, and task in how people perceive a robot's gender, and uses design metaphors to understand user expectations for robot behavior, including robot gender expression.

For example, in Dennler's lead-authored paper, "Design Metaphors for Understanding User Expectations of Socially Interactive Robot Embodiments," the researchers found that body shape is related to perceived femininity in robots. This replicated a previous study that similarly correlated a robot's waist-to-hip ratio with its perceived gender expression. Dennler is also a co-author of a paper on giving robots gender-neutral voices to reduce gender stereotyping based on appearance.

This summer, Dennler interned at Uber in San Francisco, working on the company's promotions team to personalize the deals users receive. “At USC, I research personalization for human-robot interaction, adapting robot behavior to different end users. In this role, I'm adapting app behavior rather than robot behavior,” Dennler said.

Looking to the future, Dennler hopes that AI will celebrate the unique perspectives of marginalized communities.

“Everyone has a unique perspective based on personal experience, and making sure these perspectives are diverse ultimately makes all technological pursuits better,” Dennler said.

“Setbacks happen, but I hope the future will be better overall. I believe it will be more inclusive.”

Published June 29, 2023

