Artificial intelligence (AI) has great potential to be a force for good. However, it also carries risks, including ethical and environmental considerations. In this article, we consider steps charities can take to use AI for the common good.
What is Generative AI?
Generative AI is a type of AI that creates content based on patterns learned from existing data. That content can include text, images, music, and video. Examples of generative AI platforms include ChatGPT, Sora, and MusicLM.
How Charities Use Generative AI
According to the Charity Digital Skills Report 2024, 61% of the more than 630 charities surveyed say they use AI in their daily operations. In addition, 33% use AI to help create content, 32% for admin tasks, and 28% to draft reports and documents.
Charity communicators use generative AI for copywriting (39%), event transcription (21%), design (10%), and image creation, according to the CharityComms Salary and Organisational Culture Survey 2024.
The use of generative AI by charities has been rising since 2023, and it will undoubtedly increase year on year as more people learn how to get the most out of it.
The same CharityComms survey shows that the use of AI for copywriting rose by 14 percentage points in a single year, from 25% in 2023 to 39% in 2024.
Five Steps to Using AI for the Common Good
Are you ready to start using AI, develop your AI skills, and build a culture of learning in your charity? Below are five steps to using AI responsibly and ethically.
Understand the risks and how to mitigate them
Like other technologies, AI comes with risks. It is important to understand what those risks are and to put processes in place to mitigate them. For example, generative AI can generate false, misleading, or nonsensical information.
This is called a “hallucination.” One way to mitigate this risk is to ask for sources in your prompt, so you can check whether the information comes from a legitimate source.
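As a minimal sketch of this tip, a small helper could append a source request to every prompt before it is sent to a generative AI tool. The function name and wording below are purely illustrative, not part of any specific platform:

```python
def add_source_request(prompt: str) -> str:
    """Append an instruction asking the AI to cite its sources.

    The wording is only an example; adapt it to your own tools
    and the kind of content you are asking for.
    """
    instruction = (
        "\n\nPlease list the sources for every factual claim, "
        "with links where possible."
    )
    return prompt + instruction


# Example: preparing a prompt before pasting it into an AI chat tool
prompt = add_source_request("Summarise recent trends in UK charity fundraising.")
print(prompt)
```

The sources the AI returns should still be checked by a human, since hallucinated citations are themselves a known failure mode.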
Another risk is that AI learns from human-generated data, which can reflect real-world inequalities and stereotypes. One way to mitigate this bias is to make sure the content you create represents a broad and diverse range of demographics and cultures.
Apply human oversight
AI has helped charities save time and money and become more efficient. However, AI is not perfect, so human oversight is essential. A great example of this is Caddy, an AI-powered assistant used by Citizens Advice advisers in delivering their services. Caddy draws on trusted sources and a dedicated knowledge base to help advisers find information quickly.
The team that built Caddy quickly agreed that it should have a “human in the loop” verification system, so every answer is approved by a supervisor before being shared with a client.
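Caddy's internal design is not public, but the “human in the loop” pattern described here can be sketched in a few lines: AI-drafted answers sit in a queue and can only be released once a supervisor approves them. All class and method names below are illustrative, not Caddy's actual implementation:

```python
from dataclasses import dataclass, field


@dataclass
class DraftAnswer:
    question: str
    ai_draft: str
    approved: bool = False  # a supervisor must flip this before release


@dataclass
class ReviewQueue:
    """Holds AI drafts until a human supervisor signs them off."""
    pending: list = field(default_factory=list)

    def submit(self, draft: DraftAnswer) -> None:
        self.pending.append(draft)

    def approve(self, draft: DraftAnswer) -> None:
        draft.approved = True
        self.pending.remove(draft)

    def release(self, draft: DraftAnswer) -> str:
        # Refuse to share anything a supervisor has not approved
        if not draft.approved:
            raise PermissionError("Draft not yet approved by a supervisor")
        return draft.ai_draft


queue = ReviewQueue()
draft = DraftAnswer("How do I appeal a benefits decision?", "You can appeal by...")
queue.submit(draft)
queue.approve(draft)  # human sign-off happens here
print(queue.release(draft))
```

The key design choice is that the release step fails closed: unless a human has explicitly approved the draft, nothing reaches the client.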
It is always best to use photos of real people in real situations in your comms. But if that is not possible, AI can create an image for you. However, it is important to check the accuracy of what the AI has created. Amnesty International was criticised for using AI-generated images of the 2021 protests in Colombia. One image showed a woman with a Colombian flag draped over her shoulders, but the flag's colours were in the wrong order.
Using inaccurate or misleading AI-generated images can be a reputational risk.
Beware of digital exclusion
A lack of digital skills is a concern, both within charities and in the communities they serve, and AI may be widening that gap.
In a 2023 report by Connect Humanity, a global study of more than 7,500 civil society organisations, 39% of respondents said a lack of digital skills is an organisational barrier.
And 50% said that a lack of digital skills is the biggest challenge for service users and the wider community. Of those surveyed, 60% said their organisations did not offer digital literacy training.
As charities use AI more and more, we must stay alert to the risk of digital exclusion both within and outside the organisation.
Advocate for the use of ethical AI platforms
Not all AI platforms are created equal. As a charity, it is important to advocate for the use of ethical AI platforms: those that prioritise responsible use, fairness, and transparency.
One such ethical AI platform is Hugging Face, a machine learning and data science platform and community. It is open source, meaning anyone can access the code, and it gives developers and researchers tools to discover and share AI models, collaborate with others, and build and experiment with AI.
Advocate for responsible use of AI
The charity sector has a role to play in ensuring that AI is used responsibly, inclusively, and collaboratively. So far, 50 UK charities and social-good organisations have signed up to the Charity AI Taskforce, which promotes the responsible and ethical use of AI both within and beyond the sector.
