An AI-supported recommender system should provide users with the best possible suggestions for their queries. Yet these systems often serve several target groups at once and must also take into account other stakeholders, such as service providers, municipalities, and tourism associations, whose interests shape the system's responses. So how can recommendations be made fair and transparent? Researchers from Graz University of Technology (TU Graz), the University of Graz and the Know Center investigated this question using the cycling tour app of the Graz-based startup Cyclebee, studying how AI can accommodate the diversity of human needs. The research received the Mind the Gap Research Award on Gender and Diversity from Graz University of Technology and was funded by the Styrian Future Fund.
Impact on large numbers of groups
“AI-based recommender systems have the potential to significantly influence purchasing decisions and trends in guest and visitor numbers,” said Bernhard Wieser of the Institute for Human-Centered Computing at Graz University of Technology. “They provide information about services and places worth visiting, and ideally they should take the needs of individuals into account. However, there is a risk that certain groups and aspects will be underrepresented.” Against this background, a key finding of the study is that achieving fairness is a multi-stakeholder issue: not only end users but many other parties play a role.
These include service providers along the route, such as hotels and restaurants, as well as third parties such as municipalities and tourism organizations. And then there are stakeholders who may never use the app at all, such as local residents who feel the effects of overtourism. According to the study, balancing the interests of all these stakeholders cannot be solved by technology alone. “If an app is to produce an outcome that is as fair as possible for everyone, it needs to define its fairness goals clearly up front. And that is a very human process that starts with deciding which target groups to serve,” says Bernhard Wieser.
Involve all stakeholders in the design
This choice of target groups influences the selection of AI training data, its weighting, and subsequent steps in algorithm design. To involve the other stakeholders, the researchers propose participatory design, which lets all stakeholders contribute and reconcile their ideas as far as possible. “But ultimately it comes down to individual decisions about whether to support something or not,” says Dominik Kowald of the Fair AI Group at the Know Center research center and the Institute for Digital Humanities at the University of Graz. “You can't use AI models to optimize everything at the same time. There are always tradeoffs.”
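The tradeoff Kowald describes can be made concrete with a small sketch. The code below is purely illustrative and not from the study or the Cyclebee app: it scores hypothetical route candidates against three made-up stakeholder objectives (end-user relevance, exposure for local providers, and burden on residents) and shows that changing the weights changes which route is recommended — no single weighting can maximize all objectives at once.

```python
# Illustrative sketch only: objective names, routes, and weights are invented.

def score_route(route, weights):
    """Combine per-stakeholder objectives (each normalized to [0, 1])
    into one score. 'resident_impact' is a cost, so it is subtracted."""
    return (weights["user_relevance"] * route["user_relevance"]
            + weights["provider_exposure"] * route["provider_exposure"]
            - weights["resident_impact"] * route["resident_impact"])

candidates = [
    {"name": "riverside", "user_relevance": 0.9,
     "provider_exposure": 0.3, "resident_impact": 0.8},
    {"name": "hillside", "user_relevance": 0.6,
     "provider_exposure": 0.7, "resident_impact": 0.2},
]

# A purely user-centric weighting favors the riverside route ...
w_user = {"user_relevance": 1.0, "provider_exposure": 0.0,
          "resident_impact": 0.0}
best_user = max(candidates, key=lambda r: score_route(r, w_user))

# ... while also weighting provider exposure and resident impact
# flips the ranking toward the hillside route.
w_fair = {"user_relevance": 0.4, "provider_exposure": 0.3,
          "resident_impact": 0.5}
best_fair = max(candidates, key=lambda r: score_route(r, w_fair))

print(best_user["name"])  # riverside
print(best_fair["name"])  # hillside
```

Which weighting is "right" is exactly the human design decision the researchers point to: it encodes whose interests the system prioritizes.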
Ultimately, it is up to developers to decide how this tradeoff is struck, but the researchers say it is important to be transparent toward end users and providers. Users want to be able to adapt and influence recommendations, and providers want to know the rules by which routes are assembled and providers are ranked. “Our research results aim to support the work of software developers in the form of design guidelines. We also want to provide guidance to political decision makers,” says Bernhard Wieser. “Technological progress is making recommendation systems increasingly accessible to small businesses in the region. This enables the development of fair solutions, creates a counter-model to multinational corporations, and sustainably strengthens regional value creation.”
